Difference between pages "Chi square" and "Multiple Linear Regression"

From Practical Statistics for Educators
A chi square analysis is used with nominal data to determine how frequency counts are distributed across different samples. The method compares expected frequencies with observed frequencies. To conduct a chi square analysis, first calculate the expected frequencies. After conducting the study and gathering the observed nominal data, apply the chi square formula. Calculate the degrees of freedom and use a chi square table to find the critical value. Next, compare the critical value to the obtained chi square value. If χ2 cv > χ2, then p > .05, and there is no statistically significant difference. If χ2 cv < χ2, then p < .05, and there is a statistically significant difference. Lastly, calculate the standardized residual (R) to determine which factors are the major contributors toward significance. When R > 2, that factor is a major contributor toward the chi square value.
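
As a rough sketch of those steps in Python (all counts are made up; a 2x2 table has df = 1, and the .05 critical value from a chi square table is 3.841):

```python
# Hypothetical 2x2 table of observed frequency counts (made-up data):
# rows = two samples, columns = two nominal categories (e.g., pass/fail).
observed = [[30, 20],
            [18, 32]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Expected frequency for each cell: (row total x column total) / grand total
expected = [[r * c / n for c in col_totals] for r in row_totals]

# Chi square statistic: sum of (O - E)^2 / E over all cells
chi_sq = sum((o - e) ** 2 / e
             for obs_row, exp_row in zip(observed, expected)
             for o, e in zip(obs_row, exp_row))

df = (len(observed) - 1) * (len(observed[0]) - 1)  # (rows - 1)(cols - 1) = 1
critical = 3.841                                   # df = 1, alpha = .05

# If the obtained chi square exceeds the critical value, p < .05
print(round(chi_sq, 2), chi_sq > critical)  # 5.77 True
```

Here the obtained value (5.77) exceeds the critical value, so this made-up result would be statistically significant.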

Multiple linear regression (multiple regression) is a type of correlational test in which the researcher is interested in finding the strength of the correlation between multiple variables. In multiple linear regression, multiple variables are used as ''predictors''. Here, the researcher is interested in the relationship between the predicted variable (the dependent variable) and the predictor variables (also known as the independent variables).

Independent variables in multiple regression are usually quantitatively measured variables using summative response, interval, or ratio scales (Meyers, Gamst, & Guarino, 2017).
  
''contributed by Chris Longo''

Multiple linear regression uses the same general equation as simple linear regression, but accommodates multiple IVs.
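
In symbols, that general equation is Y' = b0 + b1X1 + b2X2 + ... + bkXk. A minimal sketch with simulated data (the coefficients 2.0, 0.5, and -1.0 are made up) showing that ordinary least squares recovers them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data (all numbers made up): two IVs and a DV built from
# the equation y = 2.0 + 0.5*x1 - 1.0*x2 + small error
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
y = 2.0 + 0.5 * x1 - 1.0 * x2 + rng.normal(scale=0.1, size=200)

# Ordinary least squares: the column of ones estimates the intercept b0
X = np.column_stack([np.ones(200), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(b, 2))  # roughly [2.0, 0.5, -1.0]
```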
  
----
 
  
The last sentence of the chi square entry above should read: when the absolute value of R is greater than 2 (|R| > 2), the factor is a major contributor toward the chi square value. If R is negative, there is a significant decrease in the observed frequency; when R is positive, there is a significant increase.

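
A short sketch of that residual check, using R = (O - E) / √E for each cell and flagging cells with |R| > 2 (observed and expected counts are made up):

```python
import math

# Hypothetical observed/expected counts for four cells (made-up data)
cells = {
    "cell A": (40, 25),   # observed well above expected
    "cell B": (22, 25),
    "cell C": (10, 25),   # observed well below expected
    "cell D": (28, 25),
}

for label, (obs, exp) in cells.items():
    r = (obs - exp) / math.sqrt(exp)   # standardized residual
    flag = "major contributor" if abs(r) > 2 else "not a major contributor"
    print(f"{label}: R = {r:+.2f} ({flag})")
```

Cell A (R = +3.00) shows a significant increase and cell C (R = -3.00) a significant decrease; the other two cells are not major contributors.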
''contributed by Thomas Fox, WCSU Cohort 8''
  
Reference
  
''contributed by Margie Aldrich''

Meyers, L. S., Gamst, G., & Guarino, A. J. (2017). ''Applied multivariate research: Design and interpretation''. Thousand Oaks, CA: Sage Publications.
  
==Multiple Linear Regression interpreting results example==

Research Question: To what extent and in what manner can variation in college readiness be explained by self regulation, engagement in reading, household income, and population density?
  
----

Independent Variables: Self regulation, Engagement in reading, Household income, Population density

Dependent Variable: College readiness
  
Thank you Margie. I typed that entry a while ago and didn't realize that my wording was off. To add: when analyzing chi square data, look at the "R" column (or calculate it yourself using the formula) to determine which factors are significant. For example, in a study measuring reading achievement scores for 4th, 5th, 6th, and 7th grade teachers, also broken down by gender, you would have to look at each factor (level) individually to determine whether it is a major contributor toward significance. If, say, 4th grade male teachers and 7th grade female teachers have |R| > 2, those levels are the major contributors to the chi square value.

Sample report interpreting results:

Multiple linear regression was conducted with college readiness as the criterion variable and self regulation, engagement in reading, household income, and population density as predictor variables. The model was significant, F(4, 45) = 17.88, p < .001. Together, the variables in the model explained 61.4% of the variation in college readiness, f2 = .614 / .386 = 1.59. Household income contributed significantly to the prediction of college readiness (p < .001), while self regulation, engagement in reading, and population density did not.
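
The effect size in that report can be checked by hand: Cohen's f2 = R2 / (1 - R2). A quick sketch using the R2 reported above:

```python
# Cohen's f^2 effect size from the model R^2 (61.4% of variation explained)
r_squared = 0.614
f_squared = r_squared / (1 - r_squared)
print(round(f_squared, 2))  # 1.59
```

By Cohen's guidelines (f2 of .02, .15, and .35 for small, medium, and large), 1.59 is a very large effect.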
  
''contributed by Scott Trungadi, WCSU Cohort 8''
  
''contributed by Chris Longo''
==Collinearity==

When conducting a multiple linear regression, you need to check whether the data meet the assumption of collinearity. To do so, locate the Coefficients table in your results and find the heading Collinearity Statistics, under which are two subheadings, Tolerance and VIF.

If the VIF value is greater than 10, or the Tolerance is less than 0.1, then there are concerns about multicollinearity. Otherwise, the data have met the assumption of collinearity, and this can be reported in the write-up (see Dart, 2013, for an APA-style example).
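
Outside of SPSS, the same check can be sketched directly. VIF for predictor j is 1 / (1 - R2j), where R2j comes from regressing that predictor on the others, and Tolerance is 1 / VIF. In this made-up example, x3 is deliberately built as a near-copy of x1 so that both fail the check:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated predictors (made-up data); x3 is nearly a copy of x1,
# so x1 and x3 should show high VIF / low Tolerance.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + rng.normal(scale=0.1, size=n)
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF for column j: regress X[:, j] on the other columns,
    then VIF = 1 / (1 - R^2). Tolerance is simply 1 / VIF."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])  # add intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    v = vif(X, j)
    concern = "multicollinearity concern" if v > 10 or 1 / v < 0.1 else "ok"
    print(f"x{j + 1}: VIF = {v:.1f}, Tolerance = {1 / v:.3f} ({concern})")
```

Only x2, which is independent of the other predictors, passes the check in this sketch.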
 
''contributed by Sheri Prendergast, WCSU Cohort 8''
 
Dart, A. (2013). Reporting multiple regressions in APA format: Part one. Retrieved from http://www.adart.myzen.co.uk/reporting-multiple-regressions-in-apa-format-part-one/
