Interpreting Regression Coefficients

Clarifications on Interpreting Interactions in Regression

May 17th, 2010 by

In a previous post, Interpreting Interactions in Regression, I said the following:

In our example, once we add the interaction term, our model looks like:

Height = 35 + 4.2*Bacteria + 9*Sun + 3.2*Bacteria*Sun

Adding the interaction term changed the values of B1 and B2. The effect of Bacteria on Height is now 4.2 + 3.2*Sun. For plants in partial sun, Sun = 0, so the effect of Bacteria is 4.2 + 3.2*0 = 4.2. So for two plants in partial sun, a plant with 1000 more bacteria/ml in the soil would be expected to be 4.2 cm taller than a (more…)
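
As a quick check on that arithmetic, here is a minimal Python sketch (not part of the original post) that plugs the example coefficients into the fitted equation and recovers the Bacteria slope at each level of Sun:

```python
# Coefficient values (35, 4.2, 9, 3.2) come from the example above;
# everything else here is purely illustrative.

def predicted_height(bacteria, sun):
    """Height = 35 + 4.2*Bacteria + 9*Sun + 3.2*Bacteria*Sun."""
    return 35 + 4.2 * bacteria + 9 * sun + 3.2 * bacteria * sun

def bacteria_slope(sun):
    """Effect of a 1-unit (1000 bacteria/ml) increase in Bacteria at a given Sun value."""
    return 4.2 + 3.2 * sun

print(bacteria_slope(0))  # 4.2 -> partial sun (Sun = 0)
print(bacteria_slope(1))  # 7.4 -> full sun (Sun = 1)

# Same result by differencing predictions for two plants one unit apart in Bacteria:
print(predicted_height(6, 0) - predicted_height(5, 0))  # ≈ 4.2
```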


Steps to Take When Your Regression (or Other Statistical) Results Just Look…Wrong

April 19th, 2010 by

You’ve probably experienced this before. You’ve done a statistical analysis, you’ve figured out all the steps, you finally get results and are able to interpret them. But they just look…wrong. Backwards, or even impossible—theoretically or logically.

This happened a few times recently to a couple of my consulting clients, and once to me. So I know that feeling of panic well. There are so many possible causes of incorrect results, but there are a few steps you can take that will help you figure out which one you’ve got and how (and whether) to correct it.

Errors in Data Coding and Entry

In both of my clients’ cases, the problem was that they had coded missing data with an impossible and extreme value, like 99. But they failed to define that code as missing in SPSS. So SPSS took 99 as a real data point, which (more…)
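
The original example is in SPSS, where the fix is to define 99 as a missing value. As an illustration of the same pitfall outside SPSS, here is a hypothetical pandas sketch (data and variable name made up) showing how an undeclared sentinel distorts results:

```python
import numpy as np
import pandas as pd

# Hypothetical data where 99 was used as the missing-data code for age.
df = pd.DataFrame({"age": [23, 31, 99, 45, 99, 38]})

print(df["age"].mean())  # about 55.8 -- inflated, because 99 is treated as a real age

# Declare the sentinel as missing before any analysis
# (the equivalent of defining 99 as a missing value in SPSS).
df["age"] = df["age"].replace(99, np.nan)

print(df["age"].mean())  # 34.25 -- computed on the four real values only
```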


Interpreting (Even Tricky) Regression Coefficients – A Quiz

January 15th, 2010 by

Here’s a little quiz:

True or False?

1. When you add an interaction to a regression model, you can still evaluate the main effects of the terms that make up the interaction, just like in ANOVA.

2. The intercept is usually meaningless in a regression model. (more…)


Making Dummy Codes Easy to Keep Track of

January 14th, 2010 by

Here’s a little tip.

When you construct Dummy Variables, make it easy on yourself to remember which code is which. Heck, if you want to be really nice, make it easy for anyone else who will analyze the data or read the results.

Make the codes inherent in the Dummy variable name.

So instead of a variable named Gender with values of 1=Female and 0=Male, call the variable Female.

Instead of a set of dummy variables named MaritalStatus1 with values of 1=Married and 0=Single, along with MaritalStatus2 with values 1=Divorced and 0=Single, name the same variables Married and Divorced.

And if you’re new to dummy coding, this has the extra bonus of making the dummy coding intuitive.  It’s just a set of yes/no variables about all but one of your categories.
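
As a small illustration (pandas, not from the original post; the data and column names are hypothetical), naming the dummies after the categories might look like this:

```python
import pandas as pd

# Hypothetical data; Single is the reference category.
df = pd.DataFrame({"marital_status": ["Married", "Single", "Divorced", "Married"]})

# Instead of MaritalStatus1 / MaritalStatus2, name each dummy after its category:
df["Married"] = (df["marital_status"] == "Married").astype(int)
df["Divorced"] = (df["marital_status"] == "Divorced").astype(int)

print(df)
# A 1 means "yes, this category"; a row with 0 on both dummies is Single.
```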



Interpreting Regression Coefficients in Models Other than Ordinary Linear Regression

January 5th, 2010 by

Someone who registered for my upcoming Interpreting (Even Tricky) Regression Models workshop asked if the content applies to logistic regression as well.

The short answer: Yes

The long-winded detailed explanation of why this is true and the one caveat:

One of the greatest things about regression models is that they all have the same set up: (more…)
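
To illustrate that shared setup, here is a hedged Python sketch using statsmodels; the data and the variable names (x1, x2, y_cont, y_bin) are made up for illustration. Both models are built from the same linear predictor; what changes is how that predictor is linked to the outcome.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data just to show the shared structure of the two models.
rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y_cont"] = 3 + 2 * df["x1"] - 1 * df["x2"] + rng.normal(size=200)
df["y_bin"] = (df["y_cont"] > df["y_cont"].median()).astype(int)

# Same right-hand side (B0 + B1*X1 + B2*X2) in both models.
ols = smf.ols("y_cont ~ x1 + x2", data=df).fit()
logit = smf.logit("y_bin ~ x1 + x2", data=df).fit()

print(ols.params)    # coefficients on the scale of Y itself
print(logit.params)  # coefficients on the log-odds scale
```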


Interpreting Lower Order Coefficients When the Model Contains an Interaction

February 23rd, 2009 by

A Linear Regression Model with an interaction between two predictors (X1 and X2) has the form: 

Y = B0 + B1*X1 + B2*X2 + B3*X1*X2

It doesn’t really matter if X1 and X2 are categorical or continuous, but let’s assume they are continuous for simplicity.

One important concept is that B1 and B2 are not main effects, the way they would be if (more…)
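
In the same spirit as the Bacteria/Sun example above, here is a minimal Python sketch (coefficient values made up for illustration) of the point that B1 is the slope of X1 only where X2 = 0:

```python
# Hypothetical coefficient values for Y = B0 + B1*X1 + B2*X2 + B3*X1*X2.
B0, B1, B2, B3 = 1.0, 2.0, 0.5, 1.5

def x1_slope(x2):
    """dY/dX1 = B1 + B3*X2: the effect of X1 depends on the value of X2."""
    return B1 + B3 * x2

print(x1_slope(0.0))  # 2.0 -> equals B1 only when X2 = 0
print(x1_slope(2.0))  # 5.0 -> a different "effect of X1" at X2 = 2
```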