logistic regression

Why Generalized Linear Models Have No Error Term

June 22nd, 2021

Even if you’ve never heard the term Generalized Linear Model, you may have run one. It’s a term for a family of models that includes logistic and Poisson regression, among others.

If you already understand linear models, generalized linear models are a small leap. Many, many concepts are the same in both types of models.

But one thing that perplexes many people is why generalized linear models have no error term the way linear models do.
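To see the contrast on paper (a generic textbook sketch, not taken from the full post), compare how the two models are usually written. The linear model carries an explicit error term; a GLM such as logistic regression is instead written as a distribution for the outcome plus a link function for its mean.

```latex
% Linear model: explicit, normally distributed errors
y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2)

% Generalized linear model (logistic regression): no error term appears;
% the randomness lives in the outcome's distribution itself
y_i \sim \mathrm{Bernoulli}(\mu_i), \qquad \mathrm{logit}(\mu_i) = \beta_0 + \beta_1 x_i
```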


Types of Study Designs in Health Research: The Evidence Hierarchy

March 24th, 2021

by Danielle Bodicoat

Statistics can tell us a lot about our data, but it’s also important to consider where the underlying data came from when interpreting results, whether they’re our own or somebody else’s.

Not all evidence is created equal, and we should place more trust in some types of evidence than in others.



Member Training: Goodness of Fit Statistics

March 4th, 2021


What are goodness of fit statistics? Is the definition the same for all types of statistical models? Do we run the same tests for every type of model?



Member Training: Explaining Logistic Regression Results to Non-Researchers

August 1st, 2020

Interpreting the results of logistic regression can be tricky, even for people who are familiar with performing other kinds of statistical analyses. How, then, do we share these results with non-researchers in a way that makes sense?

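One approach that often helps (a minimal sketch with made-up numbers, not taken from the training itself) is to translate coefficients off the log-odds scale into odds ratios and predicted probabilities before presenting them:

```python
# A minimal sketch with hypothetical numbers: translating logistic regression
# output from the log-odds scale into an odds ratio and predicted probabilities.
import numpy as np

intercept = -1.2     # hypothetical fitted intercept (log-odds when treatment = 0)
b_treatment = 0.8    # hypothetical coefficient for a 0/1 treatment indicator

# Odds ratio: "the odds of the outcome are about 2.2 times higher with treatment"
odds_ratio = np.exp(b_treatment)

# Predicted probabilities are often the easiest summary for non-researchers
def predicted_prob(log_odds):
    return 1.0 / (1.0 + np.exp(-log_odds))

p_control = predicted_prob(intercept)                # treatment = 0
p_treated = predicted_prob(intercept + b_treatment)  # treatment = 1

print(f"Odds ratio: {odds_ratio:.2f}")            # about 2.23
print(f"P(outcome | control): {p_control:.2f}")   # about 0.23
print(f"P(outcome | treated): {p_treated:.2f}")   # about 0.40
```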


What is Multicollinearity? A Visual Description

November 20th, 2019

Multicollinearity is one of those terms in statistics that is often defined in one of two ways:

1. In very mathematical terms that make no sense. I mean, what is a linear combination anyway?

2. In completely oversimplified terms that avoid the math. It’s just a high correlation, right?

So what is it really? In English?

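As a preview of the idea (a small simulated sketch with made-up variable names, not taken from the post): a predictor built almost exactly as a linear combination of two others shows only moderate pairwise correlations, yet its variance inflation factor gives the problem away.

```python
# A small simulated sketch: "budget" is built almost exactly as a linear
# combination of "age" and "income", creating multicollinearity.
import numpy as np
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 500
age = rng.normal(40, 10, n)
income = rng.normal(50, 10, n)
budget = age + income + rng.normal(0, 2, n)   # nearly a linear combination of the other two

# Pairwise correlations look only moderately high (about 0.7 at most)...
print(np.corrcoef([age, income, budget]).round(2))

# ...but the variance inflation factors reveal the near-exact linear combination
X = np.column_stack([np.ones(n), age, income, budget])   # design matrix with intercept
for idx, name in zip([1, 2, 3], ["age", "income", "budget"]):
    print(name, round(variance_inflation_factor(X, idx), 1))
```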


Linear Regression for an Outcome Variable with Boundaries

July 22nd, 2019

The following statement might surprise you, but it’s true.

To run a linear model, you don’t need an outcome variable Y that’s normally distributed. Instead, you need a dependent variable that is:

The normality assumption is about the errors in the model, which have the same distribution as Y|X (just centered at zero). It’s absolutely possible to have a skewed distribution of Y and a normal distribution of errors, because of the effect of X.
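A quick simulation makes the point concrete (generic code, not taken from the post): the outcome Y below is strongly skewed overall, yet the errors around the regression line are exactly normal.

```python
# A quick simulation sketch: a skewed predictor produces a skewed outcome Y,
# even though the model's errors are perfectly normal.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

x = rng.exponential(scale=2.0, size=n)    # skewed predictor
errors = rng.normal(0.0, 1.0, size=n)     # normal errors, exactly as the model assumes
y = 1.0 + 3.0 * x + errors                # linear model: Y | X is normal

def skewness(v):
    z = (v - v.mean()) / v.std()
    return (z ** 3).mean()

print("skewness of Y:     ", round(skewness(y), 2))       # close to 2: strongly right-skewed
print("skewness of errors:", round(skewness(errors), 2))  # close to 0: symmetric, normal
```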