The Analysis Factor

Statistical Consulting, Resources, and Statistics Workshops for Researchers

regression models

Measures of Model Fit for Linear Regression Models

by Karen Grace-Martin

A well-fitting regression model results in predicted values close to the observed data values.

The mean model, which uses the mean of the outcome for every predicted value, would generally be used if there were no useful predictor variables. The fit of a proposed regression model should therefore be better than the fit of the mean model. [Read more…] about Measures of Model Fit for Linear Regression Models
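The comparison can be made concrete with a small numpy sketch on simulated data (all numbers and variable names below are illustrative, not from the post): fit the mean model and a simple regression to the same data, then compare RMSE and R-squared.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y depends linearly on x, plus noise (illustrative only)
x = rng.uniform(0, 10, 200)
y = 3.0 + 1.5 * x + rng.normal(0, 2, 200)

# Mean model: predict the mean of y for every observation
mean_pred = np.full_like(y, y.mean())

# Proposed model: simple linear regression fit by least squares
slope, intercept = np.polyfit(x, y, 1)
reg_pred = intercept + slope * x

def rmse(obs, pred):
    """Root mean squared error of predictions."""
    return np.sqrt(np.mean((obs - pred) ** 2))

# R-squared is defined relative to the mean model
ss_res = np.sum((y - reg_pred) ** 2)
ss_tot = np.sum((y - mean_pred) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"RMSE, mean model:       {rmse(y, mean_pred):.2f}")
print(f"RMSE, regression model: {rmse(y, reg_pred):.2f}")
print(f"R-squared:              {r_squared:.2f}")
```

Because R-squared is one minus the ratio of the two residual sums of squares, a least-squares regression can never fit worse than the mean model on the data it was fit to; the question is how much better it fits.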

Tagged With: F test, Model Fit, R-squared, regression models, RMSE

Related Posts

  • The Difference Between R-squared and Adjusted R-squared
  • Simplifying a Categorical Predictor in Regression Models
  • Eight Ways to Detect Multicollinearity
  • Why ANOVA is Really a Linear Regression, Despite the Difference in Notation

Member Training: Difference in Differences

by TAF Support

The great majority of regression modeling explores and tests associations between independent and dependent variables. From such models alone, we cannot claim that the independent variable(s) have a causal relationship with the dependent variable. There are five specific model types that allow us to test for causality, and difference in differences models are one of the five.

[Read more…] about Member Training: Difference in Differences
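The core of the technique can be sketched in a few lines of numpy on simulated two-group, two-period data (the effect size, sample size, and variable names here are assumptions for illustration): the coefficient on the group-by-period interaction is the difference-in-differences estimate of the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400

# Group and period indicators
treated = rng.integers(0, 2, n)   # 1 = treatment group
post = rng.integers(0, 2, n)      # 1 = after the intervention

# Simulate an outcome with a true treatment effect of 2.0
# carried entirely by the interaction term
y = (5.0 + 1.0 * treated + 0.5 * post
     + 2.0 * treated * post + rng.normal(0, 1, n))

# Design matrix: intercept, group, period, group x period
X = np.column_stack([np.ones(n), treated, post, treated * post])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# coef[3] is the difference-in-differences estimate
print(f"DiD estimate: {coef[3]:.2f}")  # should recover roughly 2.0
```

The interaction coefficient answers: how much more did the treated group change from pre to post than the control group did over the same period?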

Tagged With: causal models, causality, difference in differences, regression models

Related Posts

  • Member Training: Interrupted Time Series
  • Member Training: The LASSO Regression Model
  • Member Training: Multicollinearity
  • Member Training: Introduction to SPSS Software Tutorial

Eight Ways to Detect Multicollinearity

by Karen Grace-Martin

Multicollinearity can affect any regression model with more than one predictor. It occurs when two or more predictor variables overlap so much in what they measure that their effects are indistinguishable.

When the model tries to estimate their unique effects, it goes wonky (yes, that’s a technical term).

So for example, you may be interested in understanding the separate effects of altitude and temperature on the growth of a certain species of mountain tree.

[Read more…] about Eight Ways to Detect Multicollinearity
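One of the detection tools discussed is the variance inflation factor (VIF). Here is a numpy-only sketch using simulated altitude and temperature data in the spirit of the mountain-tree example (all numbers are made up); a common rule of thumb flags predictors with VIF above 10.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300

# Altitude and temperature are strongly (negatively) correlated
altitude = rng.uniform(0, 3000, n)
temperature = 25 - 0.006 * altitude + rng.normal(0, 1, n)

X = np.column_stack([altitude, temperature])

def vif(X, j):
    """Variance inflation factor for column j: regress that
    predictor on all the others and compute 1 / (1 - R^2)."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(X)), others])
    coef, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ coef
    r2 = 1 - resid.var() / X[:, j].var()
    return 1 / (1 - r2)

for name, j in [("altitude", 0), ("temperature", 1)]:
    print(f"VIF {name}: {vif(X, j):.1f}")
```

When two predictors overlap this heavily, both VIFs blow up together, which is exactly the "indistinguishable effects" problem described above.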

Tagged With: Bivariate Statistics, Correlated Predictors, linear regression, logistic regression, Multicollinearity, p-value, predictor variable, regression models

Related Posts

  • A Visual Description of Multicollinearity
  • Steps to Take When Your Regression (or Other Statistical) Results Just Look…Wrong
  • Is Multicollinearity the Bogeyman?
  • The Impact of Removing the Constant from a Regression Model: The Categorical Case

Parametric or Semi-Parametric Models in Survival Analysis?

by guest contributor

It was Casey Stengel who offered the sage advice, “If you come to a fork in the road, take it.”

When you need to fit a regression model to survival data, you have to take a fork in the road. One road asks you to make a distributional assumption about your data and the other does not. [Read more…] about Parametric or Semi-Parametric Models in Survival Analysis?
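To illustrate the parametric road, here is a small simulated sketch (all numbers are assumptions for illustration) that assumes an exponential survival distribution with administrative right censoring; under that assumption, the maximum likelihood estimate of the hazard rate is simply the number of events divided by the total observed time. The semi-parametric road, such as Cox regression, avoids this distributional assumption entirely.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
true_rate = 0.2

# Simulated event times from an exponential distribution
event_times = rng.exponential(1 / true_rate, n)

# Administrative censoring at t = 10: we only observe min(time, 10)
censor_time = 10.0
observed = np.minimum(event_times, censor_time)
event = event_times <= censor_time  # True if the event was observed

# Parametric road: assuming an exponential hazard, the MLE of the
# rate with right censoring is (number of events) / (total time)
rate_hat = event.sum() / observed.sum()
print(f"Estimated hazard rate: {rate_hat:.3f}")  # near the true 0.2
```

The payoff of the parametric road is efficiency and a fully specified survival curve; the cost is that the estimate can be badly biased if the distributional assumption is wrong.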

Tagged With: cox, distributions, exponential, gamma, hazard function, lognormal, parametric models, regression models, semi-parametric, survival data, Weibull

Related Posts

  • Interpreting the Shapes of Hazard Functions in Survival Analysis
  • What Is a Hazard Function in Survival Analysis?
  • Six Types of Survival Analysis and Challenges in Learning Them
  • The Proportional Hazard Assumption in Cox Regression

Why ANOVA is Really a Linear Regression, Despite the Difference in Notation

by Karen Grace-Martin

When I was in graduate school, stat professors would say, "ANOVA is just a special case of linear regression." But they never explained why.

And I couldn’t figure it out.

The model notation is different.

The output looks different.

The vocabulary is different.

The focus of what we’re testing is completely different. How can they be the same model?

[Read more…] about Why ANOVA is Really a Linear Regression, Despite the Difference in Notation
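The equivalence is easy to verify numerically. This sketch (simulated groups; scipy assumed available) computes the one-way ANOVA F statistic and the overall F from a least-squares regression on dummy-coded group indicators; despite the different notation, they are the same number.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Three groups with different means: a classic one-way ANOVA setup
g1 = rng.normal(10, 2, 30)
g2 = rng.normal(12, 2, 30)
g3 = rng.normal(15, 2, 30)
y = np.concatenate([g1, g2, g3])

# ANOVA road
f_anova, _ = stats.f_oneway(g1, g2, g3)

# Regression road: dummy-code the groups, fit by least squares,
# and compute the overall model F test
group = np.repeat([0, 1, 2], 30)
X = np.column_stack([np.ones(90), group == 1, group == 2]).astype(float)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef

ss_res = np.sum(resid ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
df_model, df_resid = 2, 90 - 3
f_reg = ((ss_tot - ss_res) / df_model) / (ss_res / df_resid)

print(f"ANOVA F:      {f_anova:.4f}")
print(f"Regression F: {f_reg:.4f}")  # matches the ANOVA F
```

The dummy coding is the bridge: the regression intercept is the reference group's mean, and each dummy coefficient is a group's difference from that reference, so the overall model test and the ANOVA test of equal means are the same hypothesis.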

Tagged With: ANOVA, linear regression, notation, regression models

Related Posts

  • 7 Practical Guidelines for Accurate Statistical Model Building
  • The Steps for Running any Statistical Model
  • Beyond Median Splits: Meaningful Cut Points
  • Why ANOVA and Linear Regression are the Same Analysis

Can We Use PCA for Reducing Both Predictors and Response Variables?

by Karen Grace-Martin

I recently gave a free webinar on Principal Component Analysis. We had almost 300 researchers attend and didn’t get through all the questions. This is part of a series of answers to those questions.

If you missed it, you can get the webinar recording here.

Question: Can we use PCA for reducing both predictors and response variables?

In fact, there were a few related but separate questions about using and interpreting the resulting component scores, so I’ll answer them together here.

How could you use the component scores?

A lot of times PCAs are used for further analysis — say, regression. How can we interpret the results of regression?

Let’s say I would like to interpret my regression results in terms of original data, but they are hiding under PCAs. What is the best interpretation that we can do in this case?

Answer:

So yes, the point of PCA is to reduce variables — create an index score variable that is an optimally weighted combination of a group of correlated variables.

And yes, you can use this index variable as either a predictor or response variable.

It is often used as a solution for multicollinearity among predictor variables in a regression model. Rather than include multiple correlated predictors, none of which is significant, you can combine them into a single index with PCA and use that instead.

It’s also used as a solution to avoid inflated familywise Type I error caused by running the same analysis on multiple correlated outcome variables. Combine the correlated outcomes using PCA, then use that as the single outcome variable. (This is, incidentally, what MANOVA does).

In both cases, you can no longer interpret the individual variables.

You may want to, but you can’t. [Read more…] about Can We Use PCA for Reducing Both Predictors and Response Variables?
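A numpy-only sketch of the idea (simulated data; all names and sizes are illustrative): extract the first principal component of three correlated variables and use the resulting component score as a single index variable, on either side of a regression.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200

# Three correlated variables driven by one latent factor
latent = rng.normal(0, 1, n)
X = np.column_stack([latent + rng.normal(0, 0.4, n) for _ in range(3)])

# Standardize, then eigendecompose the correlation matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)  # eigenvalues in ascending order

# First principal component = eigenvector with the largest eigenvalue;
# the component score is the optimally weighted combination
w = eigvecs[:, -1]
index_score = Z @ w

# Share of the total variance captured by this single index
print(f"Variance explained: {eigvals[-1] / eigvals.sum():.0%}")
```

As noted above, the trade-off is interpretability: a regression coefficient on `index_score` describes the weighted composite, not any one of the original variables.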

Tagged With: Component Score, index variable, MANOVA, Multicollinearity, principal component analysis, regression models, Type I error

Related Posts

  • How To Calculate an Index Score from a Factor Analysis
  • How to Reduce the Number of Variables to Analyze
  • Eight Ways to Detect Multicollinearity
  • Four Common Misconceptions in Exploratory Factor Analysis


Copyright © 2008–2022 The Analysis Factor, LLC. All rights reserved.