Final answer:
Overfitting results in decision trees that are more complex than necessary, and it affects how well the model performs on data beyond the training set.
Step-by-step explanation:
Statement c is NOT true; both statements a and b are true.
Overfitting refers to a situation in machine learning where a model fits the training data too closely, to the point that it memorizes the data rather than learning patterns that generalize. In a decision tree this produces a tree that is more complex than necessary, because the model keeps adding splits to fit every single training point perfectly.
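As a small illustration of this (a sketch only; it assumes scikit-learn is installed, and the synthetic dataset and its parameters are my own choices, not part of the original question):

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic, somewhat noisy classification data (illustrative only).
X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# With no depth limit, the tree grows until it fits the training data
# almost perfectly, ending up deeper and more complex than necessary.
overfit_tree = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("depth:", overfit_tree.get_depth())
print("train accuracy:", overfit_tree.score(X_train, y_train))
print("test accuracy: ", overfit_tree.score(X_test, y_test))

On data like this you would typically see training accuracy near 1.0 with noticeably lower test accuracy, which is the signature of overfitting; limiting max_depth or pruning the tree reduces the unnecessary complexity.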
Underfitting, on the other hand, occurs when a model is too simple to capture the complexity of the underlying data, usually because it was not trained enough or because the model is not expressive enough. An underfit model performs poorly on both the training and the test data.
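To sketch the underfitting side with the same assumed setup (again, the data and depth limit are hypothetical choices for illustration):

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A depth-1 "stump" is too simple to capture the structure in the data,
# so accuracy stays low on the training set AND the test set.
stump = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X_train, y_train)
print("train accuracy:", stump.score(X_train, y_train))
print("test accuracy: ", stump.score(X_test, y_test))

Here both scores should come out similarly low, which is how underfitting differs from overfitting: there is no gap between training and test performance, just poor performance on both.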