In conducting multiple regression analyses (MRAs), a major technical concern is high correlation among the predictor (regressor) variables included in the model. The term for this is…?

The term for high correlation among the predictor variables in a multiple regression model is multicollinearity. Multicollinearity occurs when two or more predictors are highly correlated with one another, making it difficult to distinguish their separate effects on the dependent variable: the model may fit well overall, yet the individual coefficient estimates become unstable and their standard errors inflated.
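As a quick illustration, here is a minimal simulation (assuming Python with numpy and statsmodels installed; the variables x1, x2, and y are hypothetical) showing how a nearly duplicated predictor inflates the standard errors of the affected coefficients:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)  # x2 is nearly a copy of x1
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# Fit once with both collinear predictors, then with x1 alone.
fit_both = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
fit_one = sm.OLS(y, sm.add_constant(x1)).fit()

print(fit_both.bse)  # standard errors for [const, x1, x2] -- inflated for x1 and x2
print(fit_one.bse)   # without x2, the x1 standard error is far smaller
```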

To identify multicollinearity in multiple regression analyses, you can use various techniques:

1. Correlation Matrix: Calculate the correlation coefficients between each pair of predictor variables. Any coefficient close to +1 or -1 signals strong pairwise collinearity, though this check can miss multicollinearity that involves a linear combination of three or more predictors (a sketch of this check appears after the list).

2. Variance Inflation Factor (VIF): Calculate the VIF for each predictor. The VIF for a given predictor equals 1 / (1 - R²), where R² comes from regressing that predictor on all the others, so it measures how much the variance of the estimated coefficient is inflated by multicollinearity. A VIF above 5 or 10 is a common rule-of-thumb threshold.

3. Tolerance: Tolerance is the reciprocal of the VIF (equivalently, the 1 - R² from the same auxiliary regression) and measures the proportion of variation in a predictor that is not explained by the other predictors. A tolerance close to 0 indicates high multicollinearity; both diagnostics are computed in the second sketch below.
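A minimal sketch of the correlation-matrix check (Python with numpy; the simulated predictors x1, x2, and x3 are hypothetical stand-ins for real data):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=100)  # built to correlate strongly with x1
x3 = rng.normal(size=100)                   # unrelated predictor

X = np.column_stack([x1, x2, x3])
corr = np.corrcoef(X, rowvar=False)  # columns are treated as variables
print(np.round(corr, 2))  # off-diagonal entries near +/-1 flag suspect pairs
```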
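And a sketch of the VIF and tolerance computation on the same simulated data, using statsmodels' variance_inflation_factor helper:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.9 * x1 + 0.1 * rng.normal(size=100)  # deliberately collinear with x1
x3 = rng.normal(size=100)

# VIFs should be computed on the actual design matrix, intercept included.
X = sm.add_constant(np.column_stack([x1, x2, x3]))

for i, name in enumerate(["x1", "x2", "x3"], start=1):  # index 0 is the constant
    vif = variance_inflation_factor(X, i)
    print(f"{name}: VIF = {vif:.1f}, tolerance = {1.0 / vif:.3f}")
```

Here x1 and x2 should show large VIFs (small tolerances), while the independent x3 stays near the ideal value of 1.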

By examining the correlation matrix, VIF values, and tolerances together, you can assess both the presence and the severity of multicollinearity among the predictors in your model. If you find evidence of it, address the issue before interpreting individual coefficients: multicollinearity inflates their standard errors even though it does not necessarily harm the model's overall predictive fit. Remedies include removing one of a pair of highly correlated variables, combining or transforming variables, or using estimators designed to cope with multicollinearity, such as ridge regression or principal components regression (a short ridge sketch follows).
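As one illustration of the ridge remedy, here is a minimal sketch assuming scikit-learn is installed (the simulated data and the alpha value are hypothetical choices, not a recommendation):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(2)
x1 = rng.normal(size=100)
x2 = x1 + 0.05 * rng.normal(size=100)  # nearly collinear with x1
X = np.column_stack([x1, x2])
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=100)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)  # alpha sets the L2 penalty strength

print("OLS coefficients:  ", ols.coef_)    # often erratic under collinearity
print("Ridge coefficients:", ridge.coef_)  # shrunken and more stable
```

The trade-off is deliberate: ridge accepts a small amount of bias in exchange for much lower variance in the coefficient estimates, which is exactly what multicollinearity destabilizes. In practice the penalty strength would be chosen by cross-validation rather than fixed at 1.0.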