The intercept (often labeled the constant) is the expected mean value of Y when all X=0.
Start with a regression equation with one predictor, X.
If X sometimes equals 0, the intercept is simply the expected mean value of Y at that value.
If X never equals 0, then the intercept has no intrinsic meaning. In scientific research, the purpose of a regression model is usually to understand the relationship between predictors and the response. If that is the goal and X never equals 0, the intercept is of no interest: it doesn't tell you anything about the relationship between X and Y.
You do need it to calculate predicted values, though. In market research, there is usually more interest in prediction, so the intercept is more important here.
The fact that X never equals 0 is one reason for centering X. If you rescale X so that the mean, or some other meaningful value, equals 0 (just subtract a constant from X), the intercept now has a meaning. It's the mean value of Y at the chosen value of X.
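Here is a minimal sketch of that idea, using NumPy's least squares on made-up data (the variable names and simulated values are illustrative, not from any real study). The slope is unchanged by centering; only the intercept moves, and after centering at the mean it equals the mean of Y:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(50, 80, 200)              # X never equals 0 here
y = 2.0 + 0.5 * x + rng.normal(0, 1, 200)

def fit(x, y):
    # Ordinary least squares with an intercept column
    X = np.column_stack([np.ones_like(x), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b  # [intercept, slope]

b_raw = fit(x, y)                # intercept is an extrapolation to X = 0
b_centered = fit(x - x.mean(), y)  # intercept is the mean of Y at the mean of X

print(b_raw)
print(b_centered)
print(y.mean())  # matches the centered intercept
```

Because the OLS line always passes through the point of means, the centered intercept equals the sample mean of Y exactly, while the raw intercept is a prediction at an X value that never occurs in the data.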
If you have dummy variables in your model, though, the intercept has more meaning. Dummy coded variables have values of 0 for the reference group and 1 for the comparison group. Since the intercept is the expected mean value when X=0, it is the mean value only for the reference group (when all other X=0).
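A quick sketch of the dummy-variable case, again with simulated data (the group means here are invented for illustration). With a single 0/1 predictor, the fitted intercept is exactly the mean of Y in the reference group, and intercept plus coefficient is the mean in the comparison group:

```python
import numpy as np

rng = np.random.default_rng(1)
group = rng.integers(0, 2, 100)          # dummy: 0 = reference, 1 = comparison
y = 10 + 3 * group + rng.normal(0, 1, 100)

X = np.column_stack([np.ones_like(group), group])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

print(b[0], y[group == 0].mean())        # intercept = reference-group mean
print(b[0] + b[1], y[group == 1].mean()) # sum = comparison-group mean
```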
This is especially important to consider when the dummy coded predictor is included in an interaction term. Say for example that X1 is a continuous variable centered at its mean. X2 is a dummy coded predictor, and the model contains an interaction term for X1*X2.
The B value for the intercept is the mean value of Y at the mean of X1, but only for the reference group. The corresponding mean for the comparison group is the intercept plus the coefficient for X2.
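To make the interaction case concrete, here is a sketch with simulated data (all coefficient values are made up for illustration). X1 is centered at its mean, X2 is a dummy, and the model includes their product. The intercept is the predicted Y at the mean of X1 for the reference group; adding the X2 coefficient gives the same quantity for the comparison group:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
x1 = rng.normal(30, 5, n)       # continuous predictor
x2 = rng.integers(0, 2, n)      # dummy: 0 = reference, 1 = comparison
x1c = x1 - x1.mean()            # center X1 at its mean

# Simulated truth: Y = 5 + 0.4*X1c + 2*X2 + 0.7*X1c*X2 + noise
y = 5 + 0.4 * x1c + 2 * x2 + 0.7 * x1c * x2 + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x1c, x2, x1c * x2])
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]

print(b0)       # predicted Y at the mean of X1, reference group (approx. 5)
print(b0 + b2)  # predicted Y at the mean of X1, comparison group (approx. 7)
```

Note that b1 is the slope of X1 for the reference group only; the comparison group's slope is b1 + b3, which is another reason the coding of X1 and X2 matters so much for interpretation.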
It’s hard to give a single example because the interpretation really depends on how X1 and X2 are coded. So I put together 6 situations in this follow-up: