Final answer:
In regression analysis, as model complexity increases from linear to cubic, the training residual sum of squares typically decreases, because the more flexible model can fit the training data at least as well: every linear fit is a special case of a cubic fit. Thus the training residual sum of squares decreases under both linear and cubic regression.
Step-by-step explanation:
The question concerns the behavior of the training residual sum of squares (SSE) under linear regression compared to cubic regression. The least-squares criterion chooses the model parameters that minimize the SSE over the training data; for linear regression, this means fitting the straight line that minimizes the sum of squared vertical distances to the data points.
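For reference, the quantity being minimized can be written as (using standard notation, where ŷᵢ denotes the model's prediction for the i-th training point):

SSE = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2

The fitting procedure picks the coefficients (slope and intercept for a line; four coefficients for a cubic) that make this sum as small as possible on the training set.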
As the complexity of the model increases, such as moving from linear to cubic regression, the training residual sum of squares typically decreases. The key reason is that every degree-1 polynomial is also a degree-3 polynomial (with zero quadratic and cubic coefficients), so the best cubic fit can never have a larger training SSE than the best linear fit; in practice it is usually strictly smaller, since the extra flexibility lets the model adapt more closely to the data points. The correct choice is therefore (d): decreases in both linear and cubic.
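The argument above can be checked numerically. The sketch below (using NumPy's `polyfit`, with made-up noisy data) fits a linear and a cubic polynomial to the same training set and compares their training SSEs; the cubic SSE is never larger, since the cubic model class contains every linear model.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 50)
# Hypothetical noisy data with some curvature
y = x**3 - 2 * x + rng.normal(scale=2.0, size=x.size)

def training_sse(x, y, degree):
    # Fit a least-squares polynomial of the given degree and
    # return the residual sum of squares on the training data.
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    return float(np.sum(residuals**2))

sse_linear = training_sse(x, y, 1)
sse_cubic = training_sse(x, y, 3)

print(f"linear SSE: {sse_linear:.2f}, cubic SSE: {sse_cubic:.2f}")
# The cubic fit's training SSE is at most the linear fit's,
# because least squares over the larger model class can only do better.
assert sse_cubic <= sse_linear
```

The same comparison holds for any nested pair of model classes: enlarging the hypothesis space can only lower (or leave unchanged) the training SSE, even though test error may behave very differently.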