Which of the following is true concerning the application of ensemble methods to classification decision trees?

A) Bagging and boosting methods both involve creating models sequentially
B) Bagging and random forest methods reduce the variance of predictions at the expense of increasing bias
C) Random forest methods for creating decision trees use the entire set of attributes available for each node when calculating information gain
D) Bagging methods do not involve bootstrapping and splitting the training data set while random forests do

1 Answer


Answer:

The correct statement concerning the application of ensemble methods to classification decision trees is: B) Bagging and random forest methods reduce the variance of predictions at the expense of increasing bias.

Step-by-step explanation:

Both bagging and random forests train many decision trees on bootstrap samples of the training data and combine their predictions by voting or averaging. Averaging many high-variance, low-bias trees substantially reduces the variance of the final prediction, while the bias stays roughly the same or increases only slightly, so statement B is true.

The other statements are false:
A) Bagging builds its models independently and in parallel; only boosting builds models sequentially, with each new model focusing on the errors of the previous ones.
C) Random forests do the opposite of what is stated: at each node they consider only a random subset of the attributes (for example, roughly the square root of the total number of features) when choosing the best split.
D) Bagging does involve bootstrapping the training data; random forests are essentially bagging plus the random attribute subset at each split.
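As a quick illustration, here is a minimal sketch (assuming scikit-learn and a synthetic dataset, not any particular data from the question) that trains a single tree, a bagged ensemble of trees, and a random forest. The random forest restricts each split to a random subset of attributes via max_features, which is exactly why statement C is false.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic data stands in for any classification training set.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Bagging: each tree sees a bootstrap sample, but every split may use all 20 attributes.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0)

# Random forest: bootstrap samples AND only a random subset of attributes per split
# (max_features="sqrt" means roughly 4 of the 20 features are candidates at each node).
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt", random_state=0)

# A single deep tree is the high-variance baseline the ensembles improve on.
single_tree = DecisionTreeClassifier(random_state=0)

for name, model in [("single tree", single_tree),
                    ("bagging", bagging),
                    ("random forest", forest)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}, std = {scores.std():.3f}")

Typically the two ensembles give noticeably more stable cross-validation scores than the single tree, which is the variance reduction described in statement B.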

