seeb1999 (Posts: 480) wrote 4 years ago:
List the five assumptions made in linear regressions and select one to discuss in depth.
Textbook: Analytics, Data Science, & Artificial Intelligence: Systems for Decision Support, 11th Edition
Reply (4 years ago):
1. Linearity. This assumption states that the relationship between the response variable and the explanatory variables is linear: the expected value of the response variable is a straight-line function of each explanatory variable while holding all other explanatory variables fixed, and the slope of that line does not depend on the values of the other variables. It also implies that the effects of the different explanatory variables on the expected value of the response variable are additive.
2. Independence (of errors). This assumption states that the errors of the response variable are uncorrelated with each other. Uncorrelatedness of the errors is a weaker condition than full statistical independence; the stronger condition is often not needed for linear regression analysis.
3. Normality (of errors). This assumption states that the errors of the response variable are normally distributed. That is, they should be purely random and should not exhibit any nonrandom patterns.
4. Constant variance (of errors). This assumption, also called homoscedasticity, states that the response variable has the same error variance regardless of the values of the explanatory variables. In practice, this assumption is often violated when the response variable varies over a wide enough range or scale.
5. Multicollinearity (absence of). This assumption states that the explanatory variables are not highly correlated with one another (i.e., each provides a distinct perspective on the information needed by the model rather than replicating it). Multicollinearity can be triggered by having two or more perfectly correlated explanatory variables in the model (e.g., if the same explanatory variable is mistakenly included twice, once in its original form and once as a slight transformation of it). A correlation-based data assessment usually catches this error.
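A few of these assumptions can be checked with a quick residual analysis. The sketch below is only illustrative: the data is synthetic, and the 0.9 correlation cutoff and the half-split spread check are assumed thresholds, not standard ones. It uses plain NumPy, touching assumption 5 (correlation assessment of the explanatory variables), assumption 2 (lag-1 autocorrelation of residuals), and assumption 4 (residual spread across fitted values).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: x2 is a near-copy of x1, deliberately creating multicollinearity
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
x3 = rng.normal(size=n)
y = 2.0 + 1.5 * x1 - 0.5 * x3 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

# Assumption 5: correlation-based assessment of the explanatory variables
corr = np.corrcoef(X, rowvar=False)
high = [(i, j, corr[i, j])
        for i in range(corr.shape[0])
        for j in range(i + 1, corr.shape[1])
        if abs(corr[i, j]) > 0.9]          # 0.9 is an illustrative cutoff
print("Highly correlated pairs (|r| > 0.9):", high)

# Fit ordinary least squares to obtain residuals for the error assumptions
A = np.column_stack([np.ones(n), X])       # add intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
fitted = A @ beta
resid = y - fitted

# Assumption 2: lag-1 autocorrelation of residuals should be near zero
lag1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]

# Assumption 4: residual spread should be similar in the low and high
# halves of the fitted values (a crude homoscedasticity check)
order = np.argsort(fitted)
s_low = resid[order[: n // 2]].std()
s_high = resid[order[n // 2:]].std()
print(f"lag-1 autocorrelation: {lag1:.3f}, spread ratio: {s_high / s_low:.2f}")
```

On this synthetic data the correlation check flags the x1/x2 pair, while the residual checks come out close to their ideal values (autocorrelation near 0, spread ratio near 1), since the errors were generated as independent and homoscedastic.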