Saturday, 29 January 2011

Reading 12.j Multicollinearity

The LOS reads:
"Describe multicollinearity, and discuss its causes and effects in regression analysis"


There are 4 areas that appear to be important:
1. What is multicollinearity?
2. What are the effects of multicollinearity?
3. Detecting it
4. How do you correct for it?



1. What is multicollinearity?
In Reading 11 we noted the underlying assumptions of the linear regression model. Assumption 2 reads as follows:

"The independent variable, X, is not random."
For multiple regressions we updated these assumptions, noting that the independent variables are not random and that no exact linear relation exists between two or more of the independent variables. Multicollinearity is the violation of this second condition: it arises when two or more of the independent variables are highly correlated with each other.
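A quick numpy sketch of the idea (simulated data, not from the curriculum): we build a second "independent" variable that is really just the first one plus a little noise, and the correlation between them comes out close to 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two "independent" variables that are in fact highly correlated:
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # x2 is almost a copy of x1

corr = np.corrcoef(x1, x2)[0, 1]
print(f"correlation between x1 and x2: {corr:.3f}")  # close to 1
```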


2. What are the effects of multicollinearity?
Remember that heteroskedasticity and serial correlation result in standard errors that are too small?
Multicollinearity results in standard errors that are too big.
The effect is therefore the opposite: where the first two produce t-statistics that are too big, multicollinearity produces t-statistics that are too small!
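This inflation of standard errors is easy to see with simulated data. The sketch below (my own illustration, not a curriculum example) computes classical OLS standard errors with plain numpy, first regressing y on x1 alone and then adding a nearly collinear x2: the standard error on x1's coefficient blows up.

```python
import numpy as np

def ols_se(X, y):
    """Return OLS coefficients and their classical standard errors."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)          # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # var-cov matrix of beta-hat
    return beta, np.sqrt(np.diag(cov))

rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)    # y truly depends only on x1

ones = np.ones(n)
# Regression with x1 alone:
_, se_single = ols_se(np.column_stack([ones, x1]), y)
# Regression with both collinear variables:
_, se_both = ols_se(np.column_stack([ones, x1, x2]), y)

print(f"SE of x1 coefficient, x1 alone : {se_single[1]:.3f}")
print(f"SE of x1 coefficient, with x2  : {se_both[1]:.3f}")  # much larger
```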


3. Detecting it
No formal statistical test is required for the CFA exam.
However, the following are strong indications:
- A significant F-statistic but insignificant t-statistics.
- High correlation between the independent variables. (Usually, when you apply your mind to the independent variables, it is clear whether there is high correlation.)
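Beyond the exam, a standard diagnostic (not required by the curriculum) is the variance inflation factor: regress each independent variable on the others and compute 1 / (1 - R²). A sketch with simulated data, where x1 and x2 are highly correlated and x3 is unrelated:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j: 1 / (1 - R^2) from
    regressing column j on the remaining columns (intercept included)."""
    n = X.shape[0]
    others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
    target = X[:, j]
    beta, *_ = np.linalg.lstsq(others, target, rcond=None)
    resid = target - others @ beta
    tss = (target - target.mean()) @ (target - target.mean())
    r2 = 1 - (resid @ resid) / tss
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)  # highly collinear pair
x3 = rng.normal(size=n)                  # unrelated variable
X = np.column_stack([x1, x2, x3])

for j in range(3):
    print(f"VIF of x{j + 1}: {vif(X, j):.1f}")  # x1, x2 large; x3 near 1
```

A common rule of thumb flags VIF values above about 10 as a sign of serious multicollinearity.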


4. Correcting it
The most obvious remedy is to exclude one of the correlated independent variables from the regression.
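Continuing the simulated example (my own illustration): with both collinear variables in the regression the t-statistic on x1 is depressed, and dropping x2 restores it to a clearly significant level.

```python
import numpy as np

def t_stats(X, y):
    """OLS t-statistics (coefficient / classical standard error)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta / se

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # nearly collinear with x1
y = 1.0 + 2.0 * x1 + rng.normal(size=n)
ones = np.ones(n)

t_full = t_stats(np.column_stack([ones, x1, x2]), y)  # both variables
t_drop = t_stats(np.column_stack([ones, x1]), y)      # x2 excluded

print(f"t-stat on x1, x2 included: {t_full[1]:.1f}")  # depressed by collinearity
print(f"t-stat on x1, x2 dropped : {t_drop[1]:.1f}")  # clearly significant
```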
