In conducting multiple regression analyses (MRAs), a major technical concern is high correlation among the predictor (regressor) variables included in the model. What is the term for this?

A. Redundant predictors
B. Close "knit" IVs
C. Multicollinearity
D. Correlative dysfunction

C. Multicollinearity: two or more highly correlated predictor variables.


To check for multicollinearity, you can calculate the correlation matrix of the predictor variables, which shows the pairwise correlations between all predictors in the analysis. When two or more predictors are highly correlated with each other, multicollinearity is present. Note that a correlation matrix only detects pairwise correlation; a predictor that is nearly a linear combination of several others can slip past it, which is why diagnostics such as the variance inflation factor (VIF) are also used.
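As a sketch of this check, the snippet below builds a small simulated dataset (the variables `x1`, `x2`, `x3` and the 0.8 cutoff are illustrative assumptions, not from the original post), computes the correlation matrix with NumPy, and flags any predictor pair whose absolute correlation exceeds the cutoff:

```python
import numpy as np

# Hypothetical data: x2 is nearly a copy of x1, x3 is unrelated.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # highly correlated with x1
x3 = rng.normal(size=n)

X = np.column_stack([x1, x2, x3])

# Pairwise correlation matrix of the predictors (columns of X).
corr = np.corrcoef(X, rowvar=False)
print(np.round(corr, 2))

# Flag predictor pairs whose absolute correlation exceeds a threshold.
threshold = 0.8  # illustrative cutoff; conventions vary
high = [(i, j)
        for i in range(corr.shape[0])
        for j in range(i + 1, corr.shape[1])
        if abs(corr[i, j]) > threshold]
print(high)  # only the (x1, x2) pair, i.e. indices (0, 1), is flagged
```

The flagged pair tells you which predictors are candidates for the remedies discussed below (dropping one, or combining them).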

Multicollinearity is problematic because multiple regression requires that no predictor be an exact linear combination of the others; even severe but imperfect multicollinearity inflates the standard errors of the coefficient estimates. When multicollinearity is present, it becomes difficult to determine the unique contribution of each predictor to the dependent variable, because the effects of the correlated predictors overlap.
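The standard way to quantify this overlap is the variance inflation factor (VIF): regress each predictor on all the others, and set VIF_j = 1 / (1 - R²_j). A common rule of thumb treats VIF above roughly 10 as a warning sign. The sketch below implements this in plain NumPy on illustrative simulated data (the `vif` helper and the variables are assumptions for the example):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X.
    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns (with an intercept)."""
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - resid.var() / y.var()
        out[j] = 1.0 / (1.0 - r2)
    return out

# Hypothetical data: x1 and x2 are collinear, x3 is not.
rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.2, size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

print(np.round(vif(X), 1))  # x1 and x2 well above 10; x3 near 1
```

A large VIF for a predictor means most of its variance is already carried by the other predictors, which is exactly why its unique coefficient is hard to pin down.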

To address multicollinearity, one possible solution is to remove one or more highly correlated predictors from the model. Another solution is to combine the highly correlated predictors into a single composite variable, such as creating a mean or sum score. This can help reduce multicollinearity and simplify the model without losing important information.

It is important to identify and address multicollinearity to ensure the validity and interpretability of the results in multiple regression analyses.