Final answer:
The true statement is that we use variable selection techniques to remove unnecessary variables and reduce overfitting.
Step-by-step explanation:
The true statement from the given options is: 2. We use variable selection techniques to help remove unnecessary variables in our model and reduce overfitting.
Variable selection techniques are used in regression analysis to identify the variables that contribute significantly to the model. Dropping the remaining variables simplifies the model and helps prevent overfitting.
Examples of variable selection techniques include stepwise regression, which can be either forward (starting with no variables and adding them one by one) or backward (starting with all variables and removing them one at a time), and best subset selection, which evaluates all possible combinations of variables to find the best one. These techniques aim to find the subset of variables that gives the best model performance.
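A minimal sketch of forward and backward stepwise selection, assuming scikit-learn and synthetic data (the original answer names the techniques, not any particular tool, so the library, dataset, and parameter choices here are illustrative assumptions):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

# Synthetic data: 10 candidate predictors, only 4 of which are informative.
X, y = make_regression(n_samples=200, n_features=10, n_informative=4,
                       noise=10.0, random_state=0)

model = LinearRegression()

# Forward selection: start with no variables and add them one at a time.
forward = SequentialFeatureSelector(model, n_features_to_select=4,
                                    direction="forward", cv=5)
forward.fit(X, y)
print("Forward selection kept columns:", forward.get_support(indices=True))

# Backward elimination: start with all variables and remove them one at a time.
backward = SequentialFeatureSelector(model, n_features_to_select=4,
                                     direction="backward", cv=5)
backward.fit(X, y)
print("Backward elimination kept columns:", backward.get_support(indices=True))
```

The selector scores each candidate subset by cross-validation, so the variables it keeps are the ones that actually improve predictive performance rather than just fitting noise, which is the overfitting-reduction benefit described above.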