A recent question on the Talkstats forum asked about dropping the intercept in a linear regression model, since doing so made the predictor's coefficient larger and more significant. Dropping the intercept forces the regression line to go through the origin: the y-intercept must be 0.
The problem with dropping the intercept is that the slope may be steeper only because you're forcing the line through the origin, not because it fits the data better. If the intercept really should be something other than 0, you're creating that steepness artificially. A more significant model isn't a better model if it's inaccurate.
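A quick simulation makes the distortion concrete. The sketch below (plain NumPy least squares, with made-up data where the true intercept is 5 and the true slope is 2) fits the same data with and without an intercept; the no-intercept fit has to absorb the intercept into the slope, inflating it:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(1, 10, 50)
y = 5.0 + 2.0 * x + rng.normal(0, 1, size=50)  # true intercept = 5, slope = 2

# With intercept: design matrix has a column of ones
X_full = np.column_stack([x, np.ones_like(x)])
slope_full, intercept = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Without intercept: the line is forced through the origin
slope_origin = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]

print(f"with intercept:  slope = {slope_full:.2f}, intercept = {intercept:.2f}")
print(f"through origin:  slope = {slope_origin:.2f}")
```

The origin-constrained slope comes out well above 2: the omitted intercept doesn't disappear, it leaks into the slope estimate, which is exactly the artificial steepness described above.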