Final answer:
Using one fewer dummy variable than the number of categories in a linear regression model avoids the dummy variable trap: it keeps the model's design matrix full rank and invertible, so the regression equations have a unique solution.
Step-by-step explanation:
When a linear regression model includes an intercept, the number of dummy variables representing a categorical variable should be one less than the number of categories to avoid the dummy variable trap. The dummy variable trap occurs when the included dummy variables are perfectly collinear with the intercept (the dummy columns sum to the intercept column), so the design matrix carries redundant information and the coefficients cannot be estimated. By excluding one dummy variable (the reference category), we ensure that the matrix of predictors has full rank and is therefore invertible, yielding a unique solution to the normal equations. The coefficients on the remaining dummies then measure each category's effect relative to the reference category.
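The rank argument above can be checked numerically. The sketch below (hypothetical data, using NumPy) builds a design matrix for a three-category variable: with the intercept plus all three dummies the matrix is rank deficient, and dropping one dummy restores full column rank.

```python
import numpy as np

# Hypothetical categorical variable with three categories, six observations.
categories = np.array([0, 1, 2, 0, 1, 2])
n = len(categories)

# Full one-hot encoding: one column per category.
dummies_all = np.eye(3)[categories]          # shape (6, 3)
intercept = np.ones((n, 1))

# Intercept plus ALL three dummies: the dummy columns sum to the
# intercept column, so the 4-column matrix has rank only 3.
X_trap = np.hstack([intercept, dummies_all])
print(np.linalg.matrix_rank(X_trap))         # 3, not 4 -> X'X is singular

# Drop the first dummy (reference category): 3 columns, rank 3.
X_ok = np.hstack([intercept, dummies_all[:, 1:]])
print(np.linalg.matrix_rank(X_ok))           # 3 = number of columns
```

Because `X_trap` is rank deficient, `X_trap.T @ X_trap` is singular and the ordinary least squares normal equations have no unique solution, which is exactly the dummy variable trap described above.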