Need to dummy code in a Cox regression model?

Interpret interactions in a logistic regression?

Add a quadratic term to a multilevel model?

This is where statistical analysis starts to feel really hard. You’re combining two difficult issues into one.

You’re dealing with both a complicated modeling technique (survival analysis, logistic regression, multilevel modeling) and tricky effects in the model (dummy coding, interactions, and quadratic terms).

The only way to figure it all out in a situation like that is to break it down into parts. Trying to understand all those complicated parts together is a recipe for disaster.

But if you can do linear regression, each part is just one step up in complexity. Take one step at a time.

### The Type of Model Reflects the Distribution of the Response Variable

Most statistical models are extensions of linear models (a.k.a. the General Linear Model). The extensions are there to deal with assumptions that aren’t being met.

Since most assumptions are about the residuals, whose distribution reflects the measurement of the response variable, these extensions are all for *response* variables that don’t fit the general linear model. Our examples above are necessary for censored time-to-event, categorical, and clustered response variables.

The other regression models mostly use maximum likelihood estimation, which requires different measures of model fit.

They also often use link functions, making interpretation of coefficients harder. (Remember, coefficients measure the relationship between the predictor and response, so the link function on the response always affects the scaling of the coefficients).
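As a minimal sketch of that scaling issue: in a logistic model the link is the log-odds (logit), so a raw coefficient is a change in log-odds, not in the probability itself, and you back-transform it through the link to get an odds ratio. The number below is hypothetical, not from any real data set.

```python
import math

# A logistic regression coefficient lives on the log-odds scale,
# so it must be exponentiated to become an odds ratio.
b_treatment = 0.693  # hypothetical logistic coefficient for one predictor

odds_ratio = math.exp(b_treatment)  # back-transform through the logit link
print(round(odds_ratio, 2))  # -> 2.0: a one-unit change doubles the odds
```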

So the first thing you need to work on is how to implement the model you’re using, interpret its coefficients, and assess its fit. Don’t worry about the tricky effects until you’ve mastered that model with simple, continuous predictors.

### The Tricky Effects are About the Predictor Variables

The nice thing about all these models is that the fixed part of the right-hand side of the equation–the part with the coefficients and X variables that actually tests the effects you’re interested in–is the same in every one of them.

A dummy variable is a dummy variable in every kind of model.
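To make that concrete, here is a sketch of reference-level dummy coding for a three-level categorical predictor, fitted in an ordinary linear model with plain least squares. The data and group labels are made up for illustration; the coding step itself would be identical if these columns fed a Cox, logistic, or multilevel model instead.

```python
import numpy as np

# Hypothetical data: a 3-level categorical predictor and a continuous response.
groups = np.array(["a", "a", "b", "b", "c", "c"])
y = np.array([1.0, 3.0, 4.0, 6.0, 9.0, 11.0])

# Reference-level (treatment) coding: group "a" is the reference,
# so it gets no column of its own.
d_b = (groups == "b").astype(float)
d_c = (groups == "c").astype(float)
X = np.column_stack([np.ones(len(y)), d_b, d_c])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# In a linear model the dummy coefficients are mean differences:
# intercept = mean of group a; d_b = mean(b) - mean(a); d_c = mean(c) - mean(a).
print(np.round(coef, 2))  # -> [2. 3. 8.]
```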

The random parts (in most models, just the distribution of residuals) will differ, but they’re not directly involved with the effects of predictor variables.

All those tricky effects occur when the predictors aren’t the textbook-perfect numerical variables you used in regression class–dummy variables for categorical predictors, interactions and quadratic terms that are the products of two predictors, centered or rescaled predictor variables.

So once you learn how and when to use, and how to interpret any tricky effect in a linear model, you can use it in any type of model.

I generally recommend really mastering these effects in the context of linear models, even if that’s not your immediate need, because learning how to work with and interpret the effects of dummy variables is a lot harder in terms of odds ratios than in terms of comparing means.
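Here is a sketch of why the odds-ratio version is harder. With one dummy predictor and a binary outcome, the saturated logistic model’s dummy coefficient is just the log of the odds ratio from the 2×2 table, so we can compute it by hand (the counts below are hypothetical). Compare how much less direct this is than “difference in group means.”

```python
import math

# Hypothetical 2x2 table: event counts and group sizes.
events_a, n_a = 20, 100   # reference group
events_b, n_b = 40, 100   # dummy-coded group

odds_a = events_a / (n_a - events_a)   # 20/80 = 0.25
odds_b = events_b / (n_b - events_b)   # 40/60 ~ 0.667

# The logistic dummy coefficient is the log odds ratio...
b_dummy = math.log(odds_b / odds_a)
# ...so interpreting it means exponentiating back to an odds ratio.
print(round(math.exp(b_dummy), 2))  # -> 2.67
```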

So the next time you need to figure out a combination of a tricky effect and a complicated model, break it down first. Simplify your model as much as you can as you master each part.

Even if it means running some simple preliminary models you could never publish, taking the time to understand each part will pay off.


This short article on tricky effects was very useful for me and clarified my interpretation of a Cox regression with a quadratic term. Thanks!

Hello,

Part of the problem is that there are not many publications regarding “tricky” problems. I wanted to fit an interaction in a Cox regression and was not able to find any good advice on its interpretation.

Agreed.

When you can find an example of exactly what you need to do, that’s the clearest and best way to learn.

So in the absence of that, you just have to break down the problem into different pieces.

Karen