Regression models without intercepts

by Karen


A recent question on the Talkstats forum asked about dropping the intercept in a linear regression model, since doing so makes the predictor’s coefficient larger and more statistically significant.  Dropping the intercept forces the regression line to pass through the origin: the y-intercept must be 0.

The problem with dropping the intercept is that the slope may become steeper only because the line is being forced through the origin, not because it fits the data better.  If the true intercept is something other than zero, that extra steepness is an artifact: the slope has to absorb the intercept the model is no longer allowed to estimate.  A more significant coefficient isn’t better if the model is inaccurate.
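You can see the effect with a quick simulation. The sketch below (Python with statsmodels, using made-up data with a true intercept of 5) fits the same data with and without an intercept; the no-intercept slope comes out inflated because it has to compensate for the intercept the model can no longer estimate.

```python
import numpy as np
import statsmodels.api as sm

# Simulated data with a true nonzero intercept: y = 5 + 2*x + noise
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 5 + 2 * x + rng.normal(0, 2, size=100)

# Model 1: with an intercept (the usual default)
fit_with = sm.OLS(y, sm.add_constant(x)).fit()

# Model 2: intercept dropped, so the line is forced through the origin
fit_without = sm.OLS(y, x).fit()

print("With intercept:    slope =", round(fit_with.params[1], 3))
print("Without intercept: slope =", round(fit_without.params[0], 3))
# The no-intercept slope is steeper than the true value of 2 because it
# must absorb the intercept of 5; its t-statistic looks "stronger,"
# but the fitted line misrepresents the data.
```

The point isn’t that the no-intercept model is always wrong; it’s that a larger t-statistic on the slope tells you nothing if the line is being bent away from where the data actually sit.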
