What does the multiple regression model allow us to obtain
Estimates of the relationships between Y and many factors, rather than just between Y and X
Example of a multiple regression model for effect of education and experience on wages
wage = β0 + β1 educ + β2 exper + u
In the education/experience example, what can we investigate using multiple regression
We can investigate the ceteris paribus effect of education on wages (holding experience fixed) and, likewise, the effect of experience on wages (holding education fixed)
wage-hat = −3.39 + 0.64 educ + 0.07 exper
For this example, what is the predicted wage rate when educ = 0 and exper = 0
−$3.39 per hour (the intercept β̂0)
wage-hat = −3.39 + 0.64 educ + 0.07 exper
For this example, what is the marginal effect of an additional year of education on wages
Holding experience fixed (Δexper = 0), an additional year of education is predicted to increase wages by $0.64 per hour: ∂(wage-hat)/∂educ = 0.64
wage-hat = −3.39 + 0.64 educ + 0.07 exper
For this example, what is the marginal effect of an additional year of workplace experience on wages
Holding education fixed (Δeduc = 0), an additional year of workplace experience is predicted to increase wages by $0.07 per hour: ∂(wage-hat)/∂exper = 0.07
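Both marginal effects can be checked by evaluating the fitted equation directly. A minimal Python sketch (the function name is ours; the coefficients are the estimates from the example):

```python
# Fitted equation from the example: wage-hat = -3.39 + 0.64*educ + 0.07*exper
def predicted_wage(educ, exper):
    """Predicted hourly wage from the estimated SRF (coefficients from the example)."""
    return -3.39 + 0.64 * educ + 0.07 * exper

# Intercept: the prediction when educ = 0 and exper = 0
print(predicted_wage(0, 0))                           # -3.39

# One more year of education, holding experience fixed at 5
print(predicted_wage(13, 5) - predicted_wage(12, 5))  # approx 0.64

# One more year of experience, holding education fixed at 12
print(predicted_wage(12, 6) - predicted_wage(12, 5))  # approx 0.07
```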
For the estimators to be unbiased in the multiple regression model, what needs to hold, using the education/experience example
For our estimators to be unbiased, the zero conditional mean assumption,
E[u | x1, x2] = 0, must hold
i.e., for any values of education and experience in the population, the average value of u is zero
What is the general multiple regression model
y = β0 + β1x1 + β2x2 + β3x3 + … + βkxk + u
For this general model, what assumption is still necessary
The zero conditional mean assumption is still necessary
For this general model with k regressors the assumption is: E(u | x1, x2, x3, …, xk) = 0
If any one of the explanatory variables is correlated with u, then no matter how large k is (no matter how many regressors we include), what will this mean for the OLS estimators
They will be biased
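This bias can be illustrated with a small simulation. A sketch in Python with numpy, under an assumed data-generating process (all coefficients here are hypothetical): leaving out a regressor that is correlated with an included one pushes it into the error term and biases the slope estimate.

```python
import numpy as np

# Hypothetical DGP: true model is y = 1 + 2*x1 + 3*x2 + u, with x1 and x2 correlated
rng = np.random.default_rng(0)
n = 100_000
x2 = rng.normal(size=n)
x1 = 0.8 * x2 + rng.normal(size=n)   # corr(x1, x2) != 0
u = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + u    # true beta1 = 2

# Short regression of y on x1 alone: x2 is pushed into the error term,
# which is then correlated with x1, so the slope estimate is biased.
beta1_short = np.cov(x1, y, ddof=1)[0, 1] / np.var(x1, ddof=1)
print(beta1_short)                   # well above the true value 2

# Including x2 restores the zero conditional mean condition for u.
X = np.column_stack([np.ones(n), x1, x2])
beta_long = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta_long[1])                  # close to the true value 2
```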
What is the sample regression function (SRF) for multiple regression
ŷ = β̂0 + β̂1x1 + β̂2x2 + β̂3x3 + … + β̂kxk
What does OLS do for multiple regression
OLS finds the values for β̂0, β̂1, β̂2, β̂3, …, β̂k that minimise the sum of squared residuals (SSR)
When OLS does the minimisation, what will be the result
We get k + 1 first-order conditions, called ‘normal equations’, that we need to solve simultaneously…
Solving these ‘normal equations’ simultaneously, what is the result
We get the OLS estimators β̂0, β̂1, β̂2, β̂3, …, β̂k
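In matrix form the k + 1 normal equations are (X′X)β̂ = X′y, and they can be solved in one step. A numpy sketch on hypothetical simulated data:

```python
import numpy as np

# Hypothetical data: an intercept plus k = 3 regressors with known coefficients
rng = np.random.default_rng(1)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 0.5, -2.0, 3.0])
y = X @ beta_true + rng.normal(size=n)

# Solve the k + 1 normal equations (X'X) beta = X'y simultaneously
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)                        # close to beta_true

# The first-order conditions say the residuals are orthogonal
# to every regressor (including the constant)
residuals = y - X @ beta_hat
print(np.abs(X.T @ residuals).max())   # numerically ~0
```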
Visual representation of the simple regression model: y = β0 + β1x1 + u, SRF: ŷ = β̂0 + β̂1x1,
β̂1 = Cov(x1, y)/Var(x1)
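The covariance formula for the simple-regression slope can be verified numerically. A sketch with numpy on data simulated purely for illustration:

```python
import numpy as np

# Hypothetical simple-regression data with true slope 1.5
rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = 4.0 + 1.5 * x + rng.normal(size=500)

# Slope via the covariance formula (valid for simple regression only)
beta1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# It matches the slope of a least-squares line fit
slope_ls = np.polyfit(x, y, 1)[0]
print(beta1, slope_ls)
```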
Labelled visual representation of the multiple regression model: y = β0 + β1x1 + β2x2 + u
What are the three important implications of moving from simple to multiple regression analysis
- A ceteris paribus (causal) interpretation of the estimators (and the estimates they generate) is more likely to be correct
In what cases are the simple and multiple regression estimates of β1 not different
They coincide when the estimated coefficient on x2 is zero (β̂2 = 0), or when x1 and x2 are uncorrelated in the sample
Why is a causal interpretation of the multiple linear regression estimators more likely to be correct than for simple linear regression
SLR: ŷ = β̂0 + β̂1x1
MLR: ŷ = β̂0 + β̂1x1 + β̂2x2
In SLR, x2 is left in the error term; if x2 is correlated with x1, then β̂1 also picks up the effect of x2 (omitted variable bias). In MLR, x2 is held fixed explicitly, so β̂1 estimates the effect of x1 on y ceteris paribus
Explain the ‘partialling out’ process for MLR
First regress x1 on the other regressors (e.g. x2) and save the residuals r̂1, the part of x1 that is uncorrelated with x2. Then a simple regression of y on r̂1 yields exactly the multiple regression estimate β̂1: the effect of x1 after the influence of x2 has been ‘partialled out’
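The partialling-out result (also known as Frisch–Waugh) can be demonstrated numerically: regressing y on the residuals from a regression of x1 on x2 reproduces the multiple-regression slope. A numpy sketch on hypothetical data (the ols helper is ours):

```python
import numpy as np

# Hypothetical data: y = 1 + 2*x1 - 1*x2 + noise, with x1 and x2 correlated
rng = np.random.default_rng(3)
n = 1000
x2 = rng.normal(size=n)
x1 = 0.5 * x2 + rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

def ols(X, y):
    """OLS coefficients from regressing y on X (intercept added automatically)."""
    X = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Full multiple regression of y on x1 and x2: take the coefficient on x1
beta_hat1 = ols(np.column_stack([x1, x2]), y)[1]

# Step 1: regress x1 on x2 and keep the residuals r1
g = ols(x2.reshape(-1, 1), x1)
r1 = x1 - (g[0] + g[1] * x2)

# Step 2: simple regression of y on r1 gives the same slope
beta_partial = np.cov(r1, y, ddof=1)[0, 1] / np.var(r1, ddof=1)
print(beta_hat1, beta_partial)   # identical up to rounding
```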