What is assumption 1 for Multiple Linear Regression (MLR1)
The (true) population model is a linear function of the explanatory variables
y = β0 + β1x1 + β2x2 + β3x3 + … + βkxk + u
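The linear population model above can be illustrated with a small simulation; the coefficients and the two-regressor setup here are made up for illustration, not taken from the cards.

```python
import numpy as np

# Hypothetical population model with k = 2 regressors (coefficients are illustrative):
# y = 1.0 + 2.0*x1 - 0.5*x2 + u
rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)                      # disturbance term
y = 1.0 + 2.0 * x1 - 0.5 * x2 + u

# OLS: regress y on a constant, x1 and x2
X = np.column_stack([np.ones(n), x1, x2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                             # estimates should be close to 1.0, 2.0, -0.5
```

With a random sample of this size, the OLS estimates land near the true population coefficients.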
Assumption 2 for MLR (MLR2)
The sample of size n, {(yi, xi1, xi2, xi3, …, xik): i = 1, 2, …, n}, taken from the population and used to generate the estimates, is random and n > k + 1
Assumption MLR3- No perfect collinearity
MLR4- Zero conditional mean assumption
The disturbance term, u, has an expected value of zero given any values of the independent variables
E(u | x1, x2, x3, …, xk) = 0
Why is the zero conditional mean assumption often not valid
Because factors that affect y and are correlated with the x's have been omitted from the model
Why is zero conditional mean assumption more likely to be valid in the MLR model
Because more of the factors that affect y are explicitly included in the model, leaving fewer omitted variables in the error term
If two variables are perfectly collinear, what is the solution for MLR
Exclude one of the explanatory variables from the model
Assumption MLR5: Homoscedasticity
The errors have the same variance irrespective of the explanatory variables
Var(u | x) = σ²
Under the 5 assumptions, what are the variances of the OLS estimators (β̂1, β̂2, β̂3, …, β̂k)
Var(β̂j) = σ² / [SSTj(1 − Rj²)], j = 1, 2, …, k
where SSTj is the total sample variation in xj and Rj² is a measure of the correlation between xj and all of the other explanatory variables, i.e., all the other x's
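The Var(β̂j) = σ² / [SSTj(1 − Rj²)] formula can be checked numerically against the equivalent matrix expression σ²(X′X)⁻¹; the data-generating process below is made up for the check, and the error variance is assumed known for simplicity.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)    # correlated regressors (illustrative DGP)
X = np.column_stack([np.ones(n), x1, x2])
sigma2 = 1.0                          # assume a known error variance

# Textbook formula: Var(beta_hat_1) = sigma^2 / (SST_1 * (1 - R_1^2))
sst1 = np.sum((x1 - x1.mean()) ** 2)
# R_1^2 comes from regressing x1 on the other regressors (constant and x2)
Z = np.column_stack([np.ones(n), x2])
fitted = Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r2_1 = 1 - np.sum((x1 - fitted) ** 2) / sst1
var_formula = sigma2 / (sst1 * (1 - r2_1))

# Matrix expression: sigma^2 * [(X'X)^{-1}]_{11} for the x1 coefficient
var_matrix = sigma2 * np.linalg.inv(X.T @ X)[1, 1]
print(var_formula, var_matrix)        # the two expressions agree
```

The agreement is exact (up to floating-point error): it is the partitioned-regression identity behind the formula.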
How do the variance of the error, the sample size and the variation in the explanatory variables (x) affect the variances of the OLS estimators
- larger, the larger σ², the variance of the error (disturbance) terms
- smaller, the larger the total sample variation in xj (SSTj), which tends to grow with the sample size n
How does the correlation between the explanatory variables (Rj²) impact the variances of the OLS estimators
larger, the greater the correlation between xj and the other explanatory variables
In the model y = β0 + β1x1 + β2x2 + u, how does a greater correlation between x1 and x2 impact Var(β̂1)
It increases Var(β̂1): a higher correlation raises R1², which shrinks the denominator in σ² / [SST1(1 − R1²)]
Which assumption is violated if R1² = 1
MLR3 would be violated as that means x1 and x2 are perfectly collinear and the assumption is that there is no perfect collinearity
Diagram showing how multicollinearity increases the variance of the estimators
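The effect in the diagram can be reproduced by Monte Carlo simulation: as the correlation between x1 and x2 rises, the sampling spread of β̂1 widens. The data-generating process and sample sizes below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 500

def beta1_sd(rho):
    """Monte Carlo s.d. of beta_hat_1 when corr(x1, x2) = rho (illustrative DGP)."""
    estimates = []
    for _ in range(reps):
        x1 = rng.normal(size=n)
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
        y = 1.0 + 2.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        estimates.append(np.linalg.lstsq(X, y, rcond=None)[0][1])
    return np.std(estimates)

for rho in (0.0, 0.5, 0.9):
    print(rho, beta1_sd(rho))   # the s.d. of beta_hat_1 grows as rho approaches 1
```

At rho = 0.9 the spread of β̂1 is roughly double that at rho = 0, matching the 1/√(1 − R1²) inflation in the variance formula.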
How much of a problem is multicollinearity
Unless the collinearity is perfect, no assumption is violated: OLS remains unbiased and BLUE, but the variances of the affected estimators are inflated; a larger sample reduces the problem
What is used, like in the SLR, as an estimator for π^2
the (degrees-of-freedom adjusted) variance of the residuals, σ̂² = SSR / (n − k − 1) = Σ ûi² / (n − k − 1)
What are the measures of the dispersion of the OLS estimators β̂1 to β̂k, in the MLR
The estimated variances, Var(β̂j) with σ̂² in place of σ², and their square roots, the standard errors se(β̂j)
What is the standard error of the regression
σ̂ = √(σ̂²)
What are the standard errors of the OLS estimators β̂1 to β̂k, in the MLR
se(β̂j) = σ̂ / √[SSTj(1 − Rj²)]
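The estimator σ̂² = SSR/(n − k − 1), the standard error of the regression σ̂, and the standard errors of the coefficients can all be computed from the residuals; the data below are simulated for illustration with a true error variance of 1.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 150
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)   # true sigma = 1

X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1] - 1                    # number of regressors (excluding the constant)
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat

# sigma_hat^2 = SSR / (n - k - 1): unbiased estimator of the error variance
sigma2_hat = resid @ resid / (n - k - 1)
sigma_hat = np.sqrt(sigma2_hat)       # standard error of the regression

# se(beta_hat_j) = sqrt(sigma_hat^2 * [(X'X)^{-1}]_jj)
se = np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))
print(sigma_hat, se)                  # sigma_hat should be near the true value of 1
```

The matrix form used for `se` is algebraically identical to σ̂ / √[SSTj(1 − Rj²)] for each slope coefficient.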
Acronym for the Gauss-Markov Theorem (BLUE)
Best (smallest variance)
Linear (linear function of the data)
Unbiased (E(β̂j) = βj)
Estimators
As long as the Gauss-Markov assumptions are valid, is there any better estimator
No