ANOVA

Beyond Median Splits: Meaningful Cut Points

June 26th, 2009

I’ve talked a bit about the arbitrary nature of median splits and all the information they just throw away.

But I have found that as a data analyst, it is incredibly freeing to be able to choose whether to make a variable continuous or categorical and to make the switch easily.  Essentially, this means you need to be (more…)
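As a concrete illustration of what that switch can look like in practice, here is a minimal sketch in Python. The variable, the cut points, and the labels are my own invented example, not from the post:

# A minimal sketch (not from the post) of recoding a continuous variable
# into categories using meaningful cut points rather than a median split.
import pandas as pd

bmi = pd.Series([17.5, 22.0, 24.8, 27.3, 31.6, 36.2])

# Substantively meaningful categories instead of an arbitrary median split.
bmi_cat = pd.cut(
    bmi,
    bins=[0, 18.5, 25, 30, float("inf")],
    labels=["underweight", "normal", "overweight", "obese"],
)
print(bmi_cat)

The point is simply that the cut points are chosen for substantive reasons rather than falling wherever the sample median happens to land.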


Interpreting Interactions: When the F Test and the Simple Effects Disagree

May 11th, 2009

The way to follow up on a significant two-way interaction between two categorical variables is to check the simple effects.  Most of the time, the simple effects tests give a very clear picture of the interaction.  Every so often, however, you have a significant interaction but no significant simple effects.  That is not a logical impossibility: the two kinds of tests address different, though related, hypotheses.

Assume your two independent variables are A and B, each with two levels (1 and 2), giving four cell means: A1B1, A1B2, A2B1, and A2B2.  The interaction tests whether the effect of B is the same at both levels of A; its null hypothesis is A1B1 – A1B2 = A2B1 – A2B2.  The simple effects test whether each of those differences is itself zero: A1B1 – A1B2 = 0 and A2B1 – A2B2 = 0.

With a crossover interaction, A1B1 – A1B2 can be slightly positive while A2B1 – A2B2 is slightly negative. Neither difference is significantly different from 0, yet the two differences are significantly different from each other.
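To make that concrete, here is a small numeric illustration. The numbers are invented, and for simplicity it uses normal-approximation z tests rather than the t tests ANOVA software would actually report:

# Invented crossover pattern: the effect of B is +2 at A1 and -2 at A2,
# each estimated with a standard error of 1.2.
import numpy as np
from scipy import stats

d1, d2, se = 2.0, -2.0, 1.2

# Simple effects: neither difference is significantly different from 0.
for label, d in [("effect of B at A1", d1), ("effect of B at A2", d2)]:
    z = d / se
    print(label, "z =", round(z, 2), "p =", round(2 * stats.norm.sf(abs(z)), 3))

# Interaction contrast: the two differences compared with each other.
z_int = (d1 - d2) / (np.sqrt(2) * se)
print("interaction", "z =", round(z_int, 2), "p =", round(2 * stats.norm.sf(abs(z_int)), 3))

Each difference of 2 is only about 1.7 standard errors from zero, but the difference between the two differences is 4, which is about 2.4 standard errors from zero.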

And for many research questions, it is highly useful to know whether the difference in means under one condition equals the difference in means under the other. The interaction may not always test a hypothesis you’re interested in, but in many studies, all the interesting effects are in the interactions.

 


Checking Assumptions in ANOVA and Linear Regression Models: The Distribution of Dependent Variables

April 10th, 2009

Here’s a little reminder for those of you checking assumptions in regression and ANOVA:

The assumptions of normality and homogeneity of variance for linear models are not about Y, the dependent variable.  (If you think I’m either stupid, crazy, or just plain nit-picking, read on.  This distinction really is important.) (more…)
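The excerpt cuts off here, but the distinction it points to is that these assumptions concern the model’s errors (the distribution of Y conditional on the predictors), not the raw distribution of Y. Here is a minimal sketch of checking the residuals rather than the outcome; the data and variable names are invented:

# A minimal sketch (not from the post) of checking normality on the residuals
# rather than on the raw dependent variable.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
group = rng.integers(0, 2, size=100)           # a two-group factor
y = 3 + 3 * group + rng.normal(size=100)       # y itself is clearly bimodal

X = sm.add_constant(group)
model = sm.OLS(y, X).fit()

# The normality assumption applies to these residuals, not to y.
stat_y, p_y = stats.shapiro(y)
stat_r, p_r = stats.shapiro(model.resid)
print("Shapiro-Wilk on raw y:     p =", round(p_y, 4))
print("Shapiro-Wilk on residuals: p =", round(p_r, 4))

The raw dependent variable can look decidedly non-normal (here, two groups with different means) while the residuals behave perfectly well.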


Why ANOVA and Linear Regression are the Same Analysis

March 11th, 2009

If your graduate statistical training was anything like mine, you learned ANOVA in one class and Linear Regression in another.  My professors would often say things like “ANOVA is just a special case of Regression,” but give vague answers when pressed.

It was not until I started consulting that I realized how closely related ANOVA and regression are.  They’re not only related; they’re the same thing.  They’re not a quarter and a nickel; they’re two sides of the same coin.

So here is a very simple example that shows why.  When someone showed me this, a light bulb went on, even though I already knew both ANOVA and multiple linear (more…)
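The post’s own example is cut off here, but a minimal sketch of the same idea in Python, using statsmodels with invented data and group labels, looks like this:

# A minimal sketch (my own, not the post's example) showing that a one-way
# ANOVA and a regression on a dummy-coded grouping variable are the same model.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "score": [4, 5, 6, 7, 9, 8, 12, 11, 13],
    "group": ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
})

# Fit a regression with dummy-coded group, then ask for the ANOVA table.
fit = ols("score ~ C(group)", data=df).fit()
print(sm.stats.anova_lm(fit, typ=2))   # the one-way ANOVA F test
print(fit.summary())                   # the same model, shown as a regression

The F statistic and p-value in the ANOVA table are identical to the regression’s overall test of the dummy-coded group indicators, because the two analyses fit the same model.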


Testing and Dropping Interaction Terms in Regression and ANOVA models

February 26th, 2009

In a Regression model, should you drop interaction terms if they’re not significant?

In an ANOVA, adding interaction terms still leaves the main effects as main effects.  That is, as long as the data are balanced, the main effects and the interactions are independent.  The main effect is still telling (more…)
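One common way to test whether an interaction term earns its place is to compare the model with and without it. Here is a minimal sketch with invented data and variable names:

# A minimal sketch (not from the post) of testing an interaction term by
# comparing nested models with and without it.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "a": np.repeat(["a1", "a2"], 40),
    "b": np.tile(np.repeat(["b1", "b2"], 20), 2),   # balanced design
})
df["y"] = rng.normal(size=80) + (df["a"] == "a2") * 1.0

main_only = ols("y ~ C(a) + C(b)", data=df).fit()
with_int = ols("y ~ C(a) * C(b)", data=df).fit()

# The F test for the interaction: does adding a:b improve the model?
print(sm.stats.anova_lm(main_only, with_int))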


3 Reasons Psychology Researchers should Learn Regression

February 17th, 2009

Back when I was doing psychology research, I knew ANOVA pretty well.  I’d taken a number of courses on it and could run it backward and forward.  I kept hearing about ANCOVA, but in every ANOVA class that was the last topic on the syllabus, and we always ran out of time.

The other thing that drove me crazy was that stats professors kept saying “ANOVA is just a special case of Regression.”  I could not for the life of me figure out why or how.

It was only when I switched over to statistics that I finally took a regression class and figured out what ANOVA was all about. And it was only when I started consulting, and saw hundreds of different ANOVA and regression models, that I finally made the connection.

But if you don’t have that driving curiosity about ANOVA and regression, why should you, as a researcher in Psychology, Education, or Agriculture who is trained in ANOVA, want to learn regression?  There are three main reasons.

1. There are many, many continuous independent variables and covariates that need to be included in models.  Without the tools to analyze them as continuous, you are left forcing them into ANOVA using an arbitrary technique like a median split.  At best, you’re losing power (see the short simulation sketch after this list).  At worst, you’re not publishing your article because you’re missing real effects.

2. Having a solid understanding of the General Linear Model in its various forms equips you to really understand your variables and their relationships.  It allows you to try a model different ways–not for data fishing, but for discovering the true nature of the relationships.  Having the capacity to add an interaction term or a squared term  allows you to listen to your data and makes you a better researcher.

3. The multiple linear regression model is the basis for many other statistical techniques–logistic regression, multilevel and mixed models, Poisson regression, Survival Analysis, and so on.  Each of these is a step (or small leap) beyond multiple regression.  If you’re still struggling with what it means to center variables or interpret interactions, learning one of these other techniques becomes arduous, if not painful.
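Here is the simulation sketch referred to in reason 1. It is my own toy example, not from the original post: it generates data with a modest linear effect and compares how often the continuous analysis and the median-split analysis detect it. The effect size, sample size, and number of simulated datasets are arbitrary.

# A minimal simulation sketch of the power cost of a median split.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n, n_sims, alpha = 50, 2000, 0.05
hits_continuous = hits_split = 0

for _ in range(n_sims):
    x = rng.normal(size=n)
    y = 0.4 * x + rng.normal(size=n)          # true linear relationship

    # Treat x as continuous: test the correlation (regression slope).
    _, p_cont = stats.pearsonr(x, y)

    # Median-split x and compare the two groups with a t test.
    high = x > np.median(x)
    _, p_split = stats.ttest_ind(y[high], y[~high])

    hits_continuous += p_cont < alpha
    hits_split += p_split < alpha

print("power, continuous predictor:", hits_continuous / n_sims)
print("power, median split:        ", hits_split / n_sims)

In runs like this one, the continuous analysis typically detects the effect noticeably more often than the median-split version, which is exactly the power loss the first reason refers to.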

Having guided thousands of researchers through their statistical analysis over the past 10 years, I am convinced that having a strong, intuitive understanding of the general linear model in its variety of forms is the key to being an effective and confident statistical analyst.  You are then free to learn and explore other methodologies as needed.