Centering variables is common practice in some areas and rarely seen in others. That inconsistency makes it unclear why you would center variables at all. Is it only a matter of preference, or does centering variables help with analysis and interpretation? [Read more…] about Member Training: Centering
Centering a covariate (a continuous predictor variable) can make regression coefficients much more interpretable. That’s a big advantage, particularly when you have many coefficients to interpret, or when you’ve included terms that are tricky to interpret, like interactions or quadratic terms.
For example, say you had one categorical predictor with 4 categories and one continuous covariate, plus an interaction between them.
First, you’ll notice that if you center your covariate at the mean, there is [Read more…] about Centering a Covariate to Improve Interpretability
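To see the idea concretely, here is a minimal sketch of that setup in plain numpy. For brevity it uses a binary group instead of four categories, and the data, coefficients, and variable names are all made up for illustration. Algebraically, centering the covariate shifts the group coefficient from "group difference when x = 0" to "group difference at the mean of x", while the interaction coefficient is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)          # binary group for brevity (the post uses 4 categories)
x = rng.normal(50, 10, n)              # covariate whose values are nowhere near 0
y = 2 + 3 * group + 0.5 * x + 0.2 * group * x + rng.normal(0, 1, n)

def fit(xvar):
    """OLS fit of y ~ group + xvar + group:xvar via least squares."""
    X = np.column_stack([np.ones(n), group, xvar, group * xvar])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

b_raw = fit(x)                  # group coefficient: group difference when x == 0
b_centered = fit(x - x.mean())  # group coefficient: group difference at the mean of x

print("raw:     ", np.round(b_raw, 2))
print("centered:", np.round(b_centered, 2))
```

Because centering is just a linear reparameterization, the centered group coefficient equals the raw group coefficient plus the interaction coefficient times the mean of x, which is why it becomes interpretable as a difference at a typical value of the covariate.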
There is a bit of art and experience to model building. You need to build a model to answer your research question, but how do you build a statistical model when there are no instructions in the box?
Centering predictor variables is one of those simple but extremely useful practices that is easily overlooked.
It’s almost too simple.
Centering simply means subtracting a constant from every value of a variable. What it does is redefine the 0 point for that predictor to be whatever value you subtracted. It shifts the scale over, but retains the units.
The effect is that the slope between that predictor and the response variable doesn’t [Read more…] about Should You Always Center a Predictor on the Mean?
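A quick sketch of that shift, using simulated data (the variable names and values here are purely illustrative): centering at the mean changes the intercept but leaves the slope, and the units, alone. One nice exact consequence of mean-centering in simple regression is that the intercept becomes the mean of Y.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(170, 10, 100)            # e.g. a predictor whose values are far from 0
y = 20 + 0.5 * x + rng.normal(0, 2, 100)

def slope_intercept(xvar):
    """Simple OLS fit of y on xvar with an intercept."""
    X = np.column_stack([np.ones_like(xvar), xvar])
    (b0, b1), *_ = np.linalg.lstsq(X, y, rcond=None)
    return b0, b1

b0_raw, b1_raw = slope_intercept(x)
b0_c, b1_c = slope_intercept(x - x.mean())  # same variable, 0 point moved to the mean

# The slope is unchanged; only the intercept moves.
print("slopes:    ", round(b1_raw, 3), round(b1_c, 3))
print("intercept (centered) vs mean of y:", round(b0_c, 3), round(np.mean(y), 3))
```

Note the intercept from the centered fit equals the sample mean of y exactly, since the centered predictor has mean zero.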
Yesterday I gave a little quiz about interpreting regression coefficients. Today I’m giving you the answers.
If you want to try it yourself before you see the answers, go here. (It’s truly little, but if you’re like me, you just cannot resist testing yourself).
True or False?
1. When you add an interaction to a regression model, you can still evaluate the main effects of the terms that make up the interaction, just like in ANOVA. [Read more…] about Answers to the Interpreting Regression Coefficients Quiz
Interpreting the intercept in a regression model isn’t always as straightforward as it looks. Here’s the definition: the intercept (often labeled the constant) is the expected mean value of Y when all X=0.
Start with a regression equation with one predictor, X.
If X sometimes equals 0, the intercept is simply the expected mean value of Y when X=0. That’s meaningful.
If X never equals 0, then the intercept has no intrinsic meaning. Both these scenarios are common in real data. [Read more…] about Interpreting the Intercept in a Regression Model
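A short illustration of the definition, with made-up data where X does take values near 0: the fitted intercept is exactly the model’s prediction at X=0.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)              # X ranges down to 0, so the intercept is meaningful
y = 4 + 1.5 * x + rng.normal(0, 1, 50)

# OLS fit of y on x with an intercept
X = np.column_stack([np.ones_like(x), x])
(intercept, slope), *_ = np.linalg.lstsq(X, y, rcond=None)

# The intercept is the model's expected value of Y when X = 0
pred_at_zero = intercept + slope * 0
print(round(intercept, 3), round(slope, 3))
```

If X could never be 0 (say, adult heights), that same number would be a pure extrapolation with no intrinsic meaning, which is exactly the situation where centering X rescues the interpretation.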