Lecture 5 Flashcards

(20 cards)

1
Q

What does the multiple regression model allow us to obtain

A

Estimates of the relationships between Y and many factors, rather than just between Y and X

2
Q

Example of a multiple regression model for the effect of education and experience on wages

A

π‘€π‘Žπ‘”π‘’π‘–=𝛽0+𝛽1𝑒𝑑𝑒𝑐𝑖+𝛽2𝑒π‘₯π‘π‘’π‘Ÿπ‘–+𝑒𝑖

3
Q

In the education/experience example, what can we investigate using multiple regression

A
  • investigate the relationship between experience and wages
  • investigate the relationship between education and wages, while controlling for the effect of experience on wages
4
Q

π‘€π‘Žπ‘”π‘’π‘–=βˆ’3.39+0.64𝑒𝑑𝑒𝑐𝑖+0.07 𝑒π‘₯𝑝𝑒r𝑖

for this example,what is the predicted wage rate for when educ=0 and exper=0

A

-3.39
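Plugging the values into the fitted equation confirms this; a minimal check in Python (the function name is my own):

```python
# Fitted equation from the card: wage-hat = -3.39 + 0.64*educ + 0.07*exper
def predicted_wage(educ, exper):
    return -3.39 + 0.64 * educ + 0.07 * exper

# With educ = 0 and exper = 0, only the intercept remains.
print(predicted_wage(0, 0))  # -3.39
```

The same function reproduces the marginal effects on the following cards: raising educ by one year, holding exper fixed, raises the prediction by 0.64.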

5
Q

π‘€π‘Žπ‘”π‘’π‘–=βˆ’3.39+0.64𝑒𝑑𝑒𝑐𝑖+0.07 𝑒π‘₯𝑝𝑒r𝑖
for this example, what is the marginal effect of an additional year of education on wages

A

Holding experience fixed (Δexper = 0), an additional year of education is predicted to increase wages by $0.64 per hour: ∂ŵage/∂educ = 0.64

6
Q

π‘€π‘Žπ‘”π‘’π‘–=βˆ’3.39+0.64𝑒𝑑𝑒𝑐𝑖+0.07 𝑒π‘₯𝑝𝑒r𝑖
for this example, what is the marginal effect of an additional year of workplace experience on wages

A

Holding education fixed (Δeduc = 0), an additional year of workplace experience is predicted to increase wages by $0.07 per hour: ∂ŵage/∂exper = 0.07

7
Q

For the estimators to be unbiased in the multiple regression model, what needs to hold using the education/experience example

A

For our estimators to be unbiased, the zero conditional mean assumption
E[u | x₁, x₂] = 0 must be valid,
i.e., for any values of education and experience in the population, the average value of u is zero

8
Q

What is the general multiple regression model

A

y = β₀ + β₁x₁ + β₂x₂ + β₃x₃ + … + βₖxₖ + u

9
Q

For this general model, what assumption is still necessary

A

The zero conditional mean assumption is still necessary.
For this general model with k regressors the assumption is: E(u | x₁, x₂, x₃, …, xₖ) = 0

10
Q

If any one of the explanatory variables is correlated with u, then no matter how large k is, what will this mean for the OLS estimators

A

The OLS estimators will be biased

11
Q

What is the sample regression function (SRF) for multiple regression

A

ŷ = β̂₀ + β̂₁x₁ + β̂₂x₂ + β̂₃x₃ + … + β̂ₖxₖ

12
Q

What does OLS do for multiple regression

A

OLS finds the values for β̂₀, β̂₁, β̂₂, β̂₃, …, β̂ₖ that minimise the sum of squared residuals (SSR)

13
Q

When OLS does the minimisation, what will be the result

A

we get π‘˜+1 first order conditions called ‘normal equations’ that we need to solve simultaneously…

14
Q

Solving these ‘normal equations’ simultaneously what is the result

A

We get the OLS estimators β̂₀, β̂₁, β̂₂, β̂₃, …, β̂ₖ
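As a concrete sketch of this step, the code below builds and solves the k + 1 = 3 normal equations (X′X)β̂ = X′y for a model with two regressors, using plain Gaussian elimination. The data and the function name are invented for illustration; the toy y is exactly linear in x₁ and x₂, so OLS recovers the generating coefficients.

```python
# Sketch: OLS for y = b0 + b1*x1 + b2*x2 by solving the three
# normal equations (X'X) b = X'y directly. Toy data are made up.

def solve_normal_equations(x1, x2, y):
    n = len(y)
    X = [[1.0, a, b] for a, b in zip(x1, x2)]  # design matrix with intercept
    # Build X'X (3x3) and X'y (3x1)
    XtX = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
           for r in range(3)]
    Xty = [sum(X[i][r] * y[i] for i in range(n)) for r in range(3)]
    # Gaussian elimination with partial pivoting on the augmented matrix
    A = [row[:] + [v] for row, v in zip(XtX, Xty)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    # Back substitution
    beta = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        beta[r] = (A[r][3] - sum(A[r][c] * beta[c]
                                 for c in range(r + 1, 3))) / A[r][r]
    return beta  # [b0_hat, b1_hat, b2_hat]

# y is constructed as exactly 1 + 2*x1 + 3*x2, so OLS recovers 1, 2, 3.
x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y = [1 + 2 * a + 3 * b for a, b in zip(x1, x2)]
print(solve_normal_equations(x1, x2, y))
```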

15
Q

Visual representation of the simple regression model: y = β₀ + β₁x₁ + u; SRF: ŷ = β̂₀ + β̂₁x₁,
β̂₁ = Cov(x, y)/Var(x)

A
16
Q

Labelled visual representation of the multiple regression model: y = β₀ + β₁x₁ + β₂x₂ + u

17
Q

What are the three important implications of moving from simple to multiple regression analysis

A
  • The estimators change: in general, the estimators from simple and multiple regressions are not equal
  • A ceteris paribus (causal) interpretation of the estimators (and the estimates they generate) is more likely to be correct
  • The estimators (and the estimates they generate) have a “partialling out” interpretation
18
Q

In what cases are the estimators not different between the simple and multiple regression models

A
  • π‘₯1 and π‘₯2 are uncorrelated, i.e., πΆπ‘œπ‘£(π‘₯1π‘₯2)=0, then the SLR and MLR estimators are the same
  • the ceteris paribus effect of a change in π‘₯2 on 𝑦̂ is zero, i.e., 𝛽̂2=0
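A small numerical check of the first case: the data below are constructed so that the sample covariance of x₁ and x₂ is exactly zero, and the SLR slope Cov(x₁, y)/Var(x₁) then matches the closed-form MLR slope on x₁. The data and helper names are my own.

```python
# Sketch: when Cov(x1, x2) = 0 in the sample, the SLR and MLR slope
# estimators for x1 coincide. Toy data are made up.

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

def slr_slope(x, y):
    return cov(x, y) / cov(x, x)

def mlr_slope_x1(x1, x2, y):
    # Closed-form OLS slope on x1 in y = b0 + b1*x1 + b2*x2
    d = cov(x1, x1) * cov(x2, x2) - cov(x1, x2) ** 2
    return (cov(x1, y) * cov(x2, x2) - cov(x2, y) * cov(x1, x2)) / d

x1 = [1, 2, 3, 4]
x2 = [1, -1, -1, 1]  # chosen so that Cov(x1, x2) = 0 exactly
y = [5 + 2 * a + 7 * b for a, b in zip(x1, x2)]
print(cov(x1, x2))                                 # 0.0
print(slr_slope(x1, y), mlr_slope_x1(x1, x2, y))   # both 2.0
```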
19
Q

Why is a causal interpretation of the multiple linear regression estimators more likely to be correct than simple linear regression

A

SLR: ŷ = β̂₀ + β̂₁x₁
MLR: ŷ = β̂₀ + β̂₁x₁ + β̂₂x₂

  • In the MLR, we have controlled for the effect of x₂ on y. In the SLR we have not.
20
Q

Explain the ‘partialling out’ process for MLR

A
  • In the MLR we “partial out” that part of the variation in y that could be explained by either x₁ or x₂, or a mixture of both
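One common way to make the partialling-out interpretation concrete is the Frisch-Waugh result: regress x₁ on x₂, keep the residuals (the part of x₁ not explained by x₂), then a simple regression of y on those residuals reproduces the MLR coefficient on x₁. The sketch below uses invented data where that coefficient is 2 by construction.

```python
# Sketch of "partialling out" (Frisch-Waugh) with made-up data.

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

def slr(x, y):
    # Simple regression of y on x with an intercept
    b1 = cov(x, y) / cov(x, x)
    b0 = sum(y) / len(y) - b1 * sum(x) / len(x)
    return b0, b1

x1 = [1, 2, 3, 4, 5, 6]
x2 = [2, 1, 4, 3, 6, 5]
y = [1 + 2 * a + 3 * b for a, b in zip(x1, x2)]  # true b1 = 2

# Step 1: partial x2 out of x1, keeping the residuals r1
a0, a1 = slr(x2, x1)
r1 = [v - (a0 + a1 * u) for u, v in zip(x2, x1)]

# Step 2: regress y on the residuals; the slope is the MLR b1_hat
_, b1_hat = slr(r1, y)
print(b1_hat)  # ~2.0, the coefficient on x1 in the generating model
```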