When we run a statistical model, we are in a sense creating a mathematical equation. The simplest regression model looks like this:
Yi = β0 + β1Xi + εi
The left side of the equation is the sum of two parts on the right: the fixed component, β0 + β1X, and the random component, εi.
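As a quick illustration (a minimal sketch with simulated data and Python's statsmodels, not the post's own example), here is how that decomposition shows up when you fit the model: the fitted values are the fixed component and the residuals are the random component.

```python
# Minimal sketch with simulated data (assumed, not from the post):
# fit Yi = b0 + b1*Xi + ei and pull out the fixed and random components.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 + 0.5 * x + rng.normal(size=100)   # true b0 = 2, b1 = 0.5

X = sm.add_constant(x)                     # adds the column of 1s for b0
fit = sm.OLS(y, X).fit()

fixed_part = fit.fittedvalues              # b0 + b1*Xi, the fixed component
random_part = fit.resid                    # ei, the random component
print(fit.params)                          # estimated b0 and b1
```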
You’ll also sometimes see the equation written (more…)
A great tool to have in your statistical tool belt is logistic regression.
It comes in many varieties and many of us are familiar with the variety for binary outcomes.
But multinomial and ordinal varieties of logistic regression are also incredibly useful and worth knowing.
They can be tricky to decide between in practice, however. In some — but not all — situations you (more…)
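For readers who want to see the three varieties side by side, here is a rough sketch in Python's statsmodels (the data and variable names are made up for illustration): binary outcomes use Logit, unordered categories use MNLogit, and ordered categories use OrderedModel.

```python
# Rough sketch in statsmodels (data and names are made up for illustration):
# binary, nominal (multinomial), and ordinal logistic regression side by side.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 300
x = rng.normal(size=n)
X = sm.add_constant(x)

# Binary outcome (0/1): ordinary binary logistic regression
y_binary = (rng.random(n) < 1 / (1 + np.exp(-(0.3 + 0.8 * x)))).astype(int)
binary_fit = sm.Logit(y_binary, X).fit(disp=0)

# Nominal outcome (3 unordered categories): multinomial logistic regression
y_nominal = rng.integers(0, 3, size=n)
nominal_fit = sm.MNLogit(y_nominal, X).fit(disp=0)

# Ordinal outcome (3 ordered categories): proportional-odds logistic regression;
# OrderedModel estimates its own thresholds, so no intercept column is passed
y_ordinal = pd.Categorical(rng.integers(0, 3, size=n), ordered=True)
ordinal_fit = OrderedModel(y_ordinal, x.reshape(-1, 1), distr="logit").fit(
    method="bfgs", disp=0)
```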
Generalized linear models—and generalized linear mixed models—are called generalized linear because they connect a model’s outcome to its predictors in a linear way. The function used to make this connection is called a link function. Link function sounds like an exotic term, but link functions are actually much simpler than they sound.
For example, Poisson regression (commonly used for outcomes that are counts) makes use of a natural log link function as follows:
(more…)
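As a sketch of what that looks like in practice (simulated count data, not the post's own example), here is a Poisson GLM in Python's statsmodels. The Poisson family's default link is the natural log, so the fitted model is ln(μi) = β0 + β1Xi on the link scale.

```python
# Small sketch with simulated counts (assumed data): Poisson regression in
# statsmodels; the Poisson family's default link is the natural log, so the
# fitted model is ln(mu_i) = b0 + b1*X_i on the link scale.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.normal(size=200)
counts = rng.poisson(lam=np.exp(0.2 + 0.6 * x))   # counts built on the log scale

X = sm.add_constant(x)
poisson_fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(poisson_fit.params)   # coefficients are on the log-of-the-mean scale
```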
I recently held a free webinar on Binary, Ordinal, and Nominal Logistic Regression as part of The Craft of Statistical Analysis program.
It was a record crowd and we didn’t get through everyone’s questions, so I’m answering some here on the site. They’re grouped by topic, and you will probably get more out of it if you watch the webinar recording. It’s free.
The following questions refer to this logistic regression model: (more…)
One question that seems to come up pretty often is:
What is the difference between logistic and probit regression?
Well, let’s start with how they’re the same:
Both are types of generalized linear models. This means they have this form:

(more…)
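As a hedged sketch of that shared structure (simulated data, not from the post): both models use the same linear predictor, β0 + β1Xi. Logistic regression passes it through the logit (log-odds) link, while probit regression passes it through the inverse of the standard normal CDF.

```python
# Hedged sketch with simulated data (not from the post): the same binary outcome
# and the same linear predictor b0 + b1*Xi, fit once with a logit link
# (logistic regression) and once with an inverse-normal-CDF link (probit).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = (rng.random(500) < 1 / (1 + np.exp(-(0.2 + 1.0 * x)))).astype(int)

X = sm.add_constant(x)
logit_fit = sm.Logit(y, X).fit(disp=0)     # logistic: link is the log-odds
probit_fit = sm.Probit(y, X).fit(disp=0)   # probit: link is the normal quantile

# The two give very similar predicted probabilities; probit coefficients are
# roughly the logistic ones divided by about 1.6-1.8 because the links differ
# in scale.
print(logit_fit.params, probit_fit.params)
```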