Iranian Journal of Public Health

The Analysis of Internet Addiction Scale Using Multivariate Adaptive Regression Splines


Abstract

Background: Determining the real effects on Internet dependency with an unbiased and robust statistical method is crucial. Multivariate Adaptive Regression Splines (MARS) is a relatively new non-parametric method used in the literature for parameter estimation in cause-and-effect research. MARS can both produce legible model curves and make unbiased parameter predictions.

Methods: To examine the performance of MARS, its findings were compared with those of the Classification and Regression Tree (C&RT) method, which is considered in the literature to be efficient in revealing correlations between variables. The data set was obtained with "The Internet Addiction Scale" (IAS), which attempts to reveal individuals' addiction levels. The study population consisted of 754 secondary school students (301 female and 443 male students, with 10 missing records). The MARS analysis was carried out with the MARS 2.0 trial version, and the C&RT analysis was performed in SPSS.

Results: MARS produced six basis functions for the model, from which the regression equation of the model was derived. MARS showed that average daily Internet-use time, the purpose of Internet use, students' grade level and mothers' occupation had significant effects on the predicted variable (P<0.05). In this comparative study, MARS yielded findings different from those of C&RT in predicting dependency level.

Conclusion: This study observed that MARS revealed the extent to which a variable considered significant changes the character of the model.

Keywords: MARS, Piecewise function, Internet addiction, Linear correlation


Introduction

Whether the correlations between variables in a research design are linear determines the preferable regression method. Accurate modeling of the cause-effect relationship becomes harder as the number of predicted and predictive variables increases. In other words, as the number of variables in the model and the interactions between them grow, parameter estimates may become biased (1, 2). In this context, regression methods applied in accordance with the structure of the variables in the research design can serve different functions. A regression model may be linear, non-linear or mixed, depending on how it represents the correlations between variables. A linear correlation between variables is an important assumption for applying parametric regression methods; when the correlation between variables is non-linear, the model is fitted with non-parametric methods. However, the distribution or regression curve produced by non-parametric methods is occasionally too rough, and the curve then becomes difficult to interpret (3). Smoothing functions or additive algorithm processes are therefore used; their aim is to obtain readable curves and a lower mean square error (MSE) for unbiased parameter estimation (4).
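As a brief illustration of this point, the following sketch (in Python; not part of the original study) simulates a non-linear relationship and compares the MSE of an ordinary linear fit with that of a simple piecewise fit. The simulated data, the noise level and the knot placed at x = 5 are illustrative assumptions only.

```python
# Illustrative sketch (not from the article): compare a linear fit with a
# simple piecewise (hinge) fit on simulated non-linear data to show how the
# mean square error (MSE) drops when the model matches the curve's shape.
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 200))
# Simulated response whose slope changes at x = 5, plus noise (assumed values).
y = np.where(x < 5, 1.0 * x, 5.0 + 0.1 * (x - 5)) + rng.normal(0, 0.5, x.size)

# Ordinary linear fit: y = b0 + b1 * x
X_lin = np.column_stack([np.ones_like(x), x])
beta_lin, *_ = np.linalg.lstsq(X_lin, y, rcond=None)
mse_lin = np.mean((y - X_lin @ beta_lin) ** 2)

# Piecewise fit built from hinge functions with a single knot at x = 5.
X_pw = np.column_stack([np.ones_like(x),
                        np.maximum(0, x - 5),
                        np.maximum(0, 5 - x)])
beta_pw, *_ = np.linalg.lstsq(X_pw, y, rcond=None)
mse_pw = np.mean((y - X_pw @ beta_pw) ** 2)

print(f"linear MSE: {mse_lin:.3f}, piecewise MSE: {mse_pw:.3f}")
```

When the relationship truly bends, the piecewise fit reports a clearly lower MSE on these data, which is the motivation for spline-based methods such as MARS.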

One of the most recent non-parametric methods is Multivariate Adaptive Regression Splines (MARS), which includes variables in the model either independently or in interaction and enables unbiased parameter estimation with strong algorithms. MARS can be viewed as a generalization of recursive partitioning and stepwise linear regression that improves the performance of a given regression set (5). MARS creates a new regression equation for each linear region of the model; the point that bounds each linear region is called a "knot". The method first divides the data space into regions and then forms a regression equation for each of them, which makes MARS an applicable solution to multivariate regression problems that would cause dimensionality difficulties for other methods. MARS uses both forward and backward passes for robust and unbiased parameter estimation: the forward pass first includes all possible effects of the predictive variables, and the backward pass then removes the least effective basis functions using the Ordinary Least Squares method. …
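To make the forward and backward passes described above concrete, the following minimal sketch builds hinge basis functions max(0, x − t) and max(0, t − x) at candidate knots, greedily adds the most helpful pair in a forward pass, and prunes the least useful terms in a backward pass refit by Ordinary Least Squares. It illustrates the general idea only and is not the MARS 2.0 software used in the article; the knot candidates (deciles), the term limit, the pruning tolerance and the simulated data are simplifying assumptions.

```python
# Minimal sketch of the MARS idea: hinge basis functions at knots, a greedy
# forward pass, and a backward pruning pass refit by Ordinary Least Squares.
import numpy as np

def hinge(x, knot, sign):
    """One half of a knot pair: the piecewise-linear basis max(0, sign*(x - knot))."""
    return np.maximum(0.0, sign * (x - knot))

def ols(X, y):
    """Fit by Ordinary Least Squares and return (coefficients, training MSE)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, np.mean((y - X @ beta) ** 2)

def mars_sketch(x, y, max_terms=7, prune_tol=1.05):
    # Forward pass: start from an intercept and greedily add the hinge pair
    # (one knot) that lowers the training MSE the most.
    cols = [np.ones_like(x)]
    while len(cols) + 2 <= max_terms:
        best = None
        for knot in np.quantile(x, np.linspace(0.1, 0.9, 9)):  # candidate knots
            trial = cols + [hinge(x, knot, +1), hinge(x, knot, -1)]
            _, mse = ols(np.column_stack(trial), y)
            if best is None or mse < best[0]:
                best = (mse, knot)
        cols += [hinge(x, best[1], +1), hinge(x, best[1], -1)]
    # Backward pass: repeatedly drop the term whose removal hurts MSE least,
    # as long as the refit MSE stays within prune_tol of the full model.
    _, full_mse = ols(np.column_stack(cols), y)
    while len(cols) > 1:
        drops = [(ols(np.column_stack(cols[:i] + cols[i + 1:]), y)[1], i)
                 for i in range(1, len(cols))]  # never drop the intercept
        mse, i = min(drops)
        if mse > prune_tol * full_mse:
            break
        cols.pop(i)
    beta, mse = ols(np.column_stack(cols), y)
    return beta, mse, len(cols)

# Toy usage on simulated data whose slope changes at x = 4 (assumed values).
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 300))
y = np.where(x < 4, 2.0 * x, 8.0 - 0.5 * (x - 4)) + rng.normal(0, 0.4, x.size)
beta, mse, n_terms = mars_sketch(x, y)
print(f"terms kept: {n_terms}, training MSE: {mse:.3f}")
```

A full MARS implementation would also search over several predictors and their interactions and would prune with generalized cross-validation rather than a fixed MSE tolerance, but the forward/backward structure is the same as described above.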
