
Pros and Cons of Treating Ordinal Variables as Nominal or Continuous

July 3rd, 2025 by

There are not a lot of statistical methods designed just for ordinal variables. (There are a few, though.)

But that doesn’t mean you’re stuck with only a few options. There are more than you’d think. (more…)


Assumptions of Linear Models are about Errors, not the Response Variable

March 19th, 2024 by

I recently received a great question in a comment about whether the assumptions of normality, constant variance, and independence in linear models are about the errors, εi, or the response variable, Yi.

The asker had a situation where Y, the response, was not normally distributed, but the residuals were.

Quick Answer:  It’s just the errors.

In fact, if you look at any (good) statistics textbook on linear models, you’ll see the assumptions stated right below the model: (more…)
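A quick simulated sketch of the asker’s situation, with hypothetical data in pure Python: Y itself is clearly non-normal (bimodal), yet the residuals from the fitted line are approximately normal, because the normality assumption is on the errors, not on Y.

```python
import random
import statistics

random.seed(42)

# Hypothetical data: X falls in two clusters, so Y = 2 + 3X + e is bimodal
# (clearly non-normal), even though the errors e are standard normal.
n = 1000
x = [random.choice([0.0, 10.0]) + random.gauss(0, 0.5) for _ in range(n)]
e = [random.gauss(0, 1) for _ in range(n)]
y = [2 + 3 * xi + ei for xi, ei in zip(x, e)]

# Fit simple OLS by hand and inspect the residuals.
xbar, ybar = statistics.fmean(x), statistics.fmean(y)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]

# Y clumps around 2 and 32, but the residuals look standard normal:
# mean essentially 0, standard deviation near 1.
print(statistics.fmean(resid))   # mean ~ 0
print(statistics.stdev(resid))   # sd ~ 1
```

Checking a histogram or Q-Q plot of `resid` (not of `y`) is the right diagnostic here.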


Beyond R-squared: Assessing the Fit of Regression Models

February 20th, 2024 by

A well-fitting regression model results in predicted values close to the observed data values. The mean model, which uses the mean for every predicted value, generally would be used if there were no useful predictor variables. The fit of a proposed regression model should therefore be better than the fit of the mean model. But how do you measure that model fit?

(more…)
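One common way to make that comparison concrete is R-squared, which is the proportional reduction in squared error relative to the mean model. A minimal sketch with hypothetical data, in pure Python:

```python
import random
import statistics

random.seed(1)

# Hypothetical data with one useful predictor.
n = 200
x = [random.uniform(0, 10) for _ in range(n)]
y = [5 + 2 * xi + random.gauss(0, 2) for xi in x]

# Mean model: predict ybar for every observation.
ybar = statistics.fmean(y)
sse_mean = sum((yi - ybar) ** 2 for yi in y)   # a.k.a. SST

# Simple OLS fit.
xbar = statistics.fmean(x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
     sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
sse_fit = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# R-squared: how much better the fitted model does than the mean model.
r2 = 1 - sse_fit / sse_mean
print(r2)   # well above 0, since x is genuinely predictive
```

If the predictor were useless, `sse_fit` would be close to `sse_mean` and R-squared would be near zero.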


The Difference Between R-squared and Adjusted R-squared

August 22nd, 2022 by

When is it important to use adjusted R-squared instead of R-squared?

R², the Coefficient of Determination, is one of the most useful and intuitive statistics we have in linear regression.

It tells you how well the model predicts the outcome and has some nice properties. But it also has one big drawback.

(more…)
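The adjustment itself is a one-line formula: adjusted R² = 1 − (1 − R²)(n − 1)/(n − p − 1), where n is the sample size and p the number of predictors. A small sketch with hypothetical numbers shows the penalty at work when a useless predictor is added:

```python
def adjusted_r2(r2, n, p):
    """Adjusted R-squared for n observations and p predictors."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Hypothetical example: suppose R-squared rises trivially (0.500 -> 0.502)
# when a useless second predictor is added to a model fit on n = 30 cases.
n = 30
print(round(adjusted_r2(0.500, n, 1), 3))  # 0.482
print(round(adjusted_r2(0.502, n, 2), 3))  # 0.465: adjusted R2 went DOWN
```

Plain R² can never decrease when a predictor is added; adjusted R² falls unless the new predictor improves fit more than chance alone would.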


Member Training: Assumptions of Linear Models

June 30th, 2022 by

What are the assumptions of linear models? If you compare two lists of assumptions, most of the time they’re not the same.
(more…)


Linear Regression Analysis – 3 Common Causes of Multicollinearity and What to Do About Them

February 11th, 2022 by

Multicollinearity in regression is one of those issues that strikes fear into the hearts of researchers. You’ve heard about its dangers in statistics classes, and colleagues and journal reviewers question your results because of it. But there are really only a few causes of multicollinearity. Let’s explore them.

Multicollinearity is simply redundancy in the information contained in predictor variables. If the redundancy is moderate, (more…)