The Analysis Factor

Statistical Consulting, Resources, and Statistics Workshops for Researchers

Multicollinearity

Linear Regression Analysis – 3 Common Causes of Multicollinearity and What to Do About Them

by Karen Grace-Martin  1 Comment

Multicollinearity in regression is one of those issues that strikes fear into the hearts of researchers. You’ve heard about its dangers in statistics classes, and colleagues and journal reviewers question your results because of it. But there are really only a few causes of multicollinearity. Let’s explore them.

Multicollinearity is simply redundancy in the information contained in predictor variables. If the redundancy is moderate, [Read more…] about Linear Regression Analysis – 3 Common Causes of Multicollinearity and What to Do About Them
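Here is a quick sketch (not from the original post) of what that redundancy looks like in practice, in Python with invented variable names and made-up data: when two predictors carry nearly the same information, the coefficient standard errors balloon even though the overall fit barely changes.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 200

    # Two nearly redundant predictors: x2 is x1 plus a little noise
    x1 = rng.normal(size=n)
    x2 = x1 + rng.normal(scale=0.05, size=n)   # carries ~the same information as x1
    y = 3 + 2 * x1 + rng.normal(size=n)        # only x1 truly drives y

    # Model with both redundant predictors vs. a model with just one
    fit_both = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
    fit_one = sm.OLS(y, sm.add_constant(x1)).fit()

    print("SEs with both predictors:", fit_both.bse.round(2))
    print("SE with one predictor:   ", fit_one.bse.round(2))
    # The standard errors for x1 and x2 are far larger in the first model,
    # even though R-squared is essentially unchanged.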

Tagged With: dummy coding, interpreting regression coefficients, Multicollinearity, principal component analysis

Related Posts

  • Making Dummy Codes Easy to Keep Track of
  • Simplifying a Categorical Predictor in Regression Models
  • What is Multicollinearity? A Visual Description
  • Your Questions Answered from the Interpreting Regression Coefficients Webinar

What is Multicollinearity? A Visual Description

by Karen Grace-Martin  7 Comments

Multicollinearity is one of those terms in statistics that is often defined in one of two ways:

1. In very mathematical terms that make no sense — I mean, what is a linear combination anyway?

2. In completely oversimplified terms that avoid the math — it’s just a high correlation, right?

So what is it really? In English?
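One quick way to see it (a hypothetical Python sketch, not from the post): a “linear combination” just means one predictor can be almost reproduced by adding up multiples of the others. If you regress that predictor on the rest and get an R-squared near 1, it contains essentially no information the others don’t already have.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)

    # x3 is (almost) a linear combination of x1 and x2:
    # 2*x1 - x2 plus only a tiny bit of independent noise
    x3 = 2 * x1 - x2 + rng.normal(scale=0.01, size=n)

    # Regress x3 on the other predictors; R-squared near 1 means x3
    # adds essentially nothing beyond what x1 and x2 already contain
    fit = sm.OLS(x3, sm.add_constant(np.column_stack([x1, x2]))).fit()
    print("R-squared of x3 on the other predictors:", round(fit.rsquared, 4))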

[Read more…] about What is Multicollinearity? A Visual Description

Tagged With: confounding variable, correlations, linear combination, linear regression, logistic regression, Multicollinearity, predictor variable, Regression, regression coefficients, variance, Variance inflation factor

Related Posts

  • Eight Ways to Detect Multicollinearity
  • The Impact of Removing the Constant from a Regression Model: The Categorical Case
  • Centering for Multicollinearity Between Main effects and Quadratic terms
  • Member Training: Centering

Eight Ways to Detect Multicollinearity

by Karen Grace-Martin  9 Comments

Multicollinearity can affect any regression model with more than one predictor. It occurs when two or more predictor variables overlap so much in what they measure that their effects are indistinguishable.

When the model tries to estimate their unique effects, it goes wonky (yes, that’s a technical term).

So for example, you may be interested in understanding the separate effects of altitude and temperature on the growth of a certain species of mountain tree.
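As an illustration of just two of those diagnostics (a pairwise correlation check and the variance inflation factor), here is a short Python sketch. The altitude and temperature variables and their relationship are invented for the example, not taken from the post.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    rng = np.random.default_rng(7)
    n = 150

    # Hypothetical predictors: temperature falls as altitude rises,
    # so the two are strongly (negatively) correlated
    altitude = rng.uniform(500, 3000, size=n)
    temperature = 25 - 0.006 * altitude + rng.normal(scale=1.0, size=n)
    X = pd.DataFrame({"altitude": altitude, "temperature": temperature})

    # Diagnostic 1: pairwise correlations among the predictors
    print(X.corr().round(2))

    # Diagnostic 2: variance inflation factors, computed on the design
    # matrix with an intercept (VIFs well above ~10 are a common red flag)
    design = sm.add_constant(X)
    for i, name in enumerate(design.columns):
        if name != "const":
            print(name, round(variance_inflation_factor(design.values, i), 1))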

[Read more…] about Eight Ways to Detect Multicollinearity

Tagged With: Bivariate Statistics, Correlated Predictors, linear regression, logistic regression, Multicollinearity, p-value, predictor variable, regression models

Related Posts

  • What is Multicollinearity? A Visual Description
  • Member Training: Using Excel to Graph Predicted Values from Regression Models
  • Steps to Take When Your Regression (or Other Statistical) Results Just Look…Wrong
  • Is Multicollinearity the Bogeyman?

Member Training: Model Building Approaches

by TAF Support 

There is a bit of art and experience to model building. You need to build a model to answer your research question, but how do you build a statistical model when there are no instructions in the box?

Should you start with all your predictors or look at each one separately? Do you always take out non-significant variables and do you always leave in significant ones?

[Read more…] about Member Training: Model Building Approaches

Tagged With: centering, interaction, lasso, Missing Data, Model Building, Model Fit, Multicollinearity, overfitting, Research Question, sample size, specification error, statistical model, Stepwise

Related Posts

  • What Is Specification Error in Statistical Models?
  • Member Training: The LASSO Regression Model
  • Steps to Take When Your Regression (or Other Statistical) Results Just Look…Wrong
  • Member Training: Centering

Can We Use PCA for Reducing Both Predictors and Response Variables?

by Karen Grace-Martin  5 Comments

I recently gave a free webinar on Principal Component Analysis. We had almost 300 researchers attend and didn’t get through all the questions. This is part of a series of answers to those questions.

If you missed it, you can get the webinar recording here.

Question: Can we use PCA for reducing both predictors and response variables?

In fact, there were a few related but separate questions about using and interpreting the resulting component scores, so I’ll answer them together here.

How could you use the component scores?

A lot of times PCAs are used for further analysis — say, regression. How can we interpret the results of regression?

Let’s say I would like to interpret my regression results in terms of original data, but they are hiding under PCAs. What is the best interpretation that we can do in this case?

Answer:

So yes, the point of PCA is to reduce variables — create an index score variable that is an optimally weighted combination of a group of correlated variables.

And yes, you can use this index variable as either a predictor or response variable.

It is often used as a solution for multicollinearity among predictor variables in a regression model. Rather than including multiple correlated predictors, none of which is significant, you can combine them using PCA and use the resulting component score instead.
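A minimal sketch of that predictor-side workflow (not from the Q&A; it assumes scikit-learn for the PCA step and uses invented data): three correlated predictors are standardized, reduced to a single component score, and that score enters the regression in place of the originals.

    import numpy as np
    import statsmodels.api as sm
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    n = 300

    # Three correlated predictors, e.g. overlapping measures of one construct
    base = rng.normal(size=n)
    X = np.column_stack([base + rng.normal(scale=0.3, size=n) for _ in range(3)])
    y = 1.5 * base + rng.normal(size=n)

    # Standardize, then keep the first principal component as an index score
    scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(X))

    # Regress the outcome on the single component score instead of the
    # three correlated originals
    fit = sm.OLS(y, sm.add_constant(scores)).fit()
    print(fit.params, fit.bse)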

It’s also used as a solution to avoid inflated familywise Type I error caused by running the same analysis on multiple correlated outcome variables. Combine the correlated outcomes using PCA, then use that as the single outcome variable. (This is, incidentally, what MANOVA does).

In both cases, you can no longer interpret the individual variables.

You may want to, but you can’t. [Read more…] about Can We Use PCA for Reducing Both Predictors and Response Variables?

Tagged With: Component Score, index variable, MANOVA, Multicollinearity, principal component analysis, regression models, Type I error

Related Posts

  • How To Calculate an Index Score from a Factor Analysis
  • How to Reduce the Number of Variables to Analyze
  • Eight Ways to Detect Multicollinearity
  • Four Common Misconceptions in Exploratory Factor Analysis

Member Training: Multicollinearity

by Karen Grace-Martin  Leave a Comment

Multicollinearity isn’t an assumption of regression models; it’s a data issue.

And while it can be seriously problematic, more often it’s just a nuisance.

In this webinar, we’ll discuss:

  • What multicollinearity is and isn’t
  • What it does to your model and estimates
  • How to detect it
  • What to do about it, depending on how serious it is

Note: This training is an exclusive benefit to members of the Statistically Speaking Membership Program and part of the Stat’s Amore Trainings Series. Each Stat’s Amore Training is approximately 90 minutes long.

[Read more…] about Member Training: Multicollinearity

Tagged With: Multicollinearity, regression models

Related Posts

  • Member Training: Centering
  • Member Training: Difference in Differences
  • Member Training: Missing Data
  • What is Multicollinearity? A Visual Description

