Weights of each sample remain the same for each subsequent weak learner in an Adaboost model.

a. True
b. False

by User Romuleald (8.4k points)

1 Answer


Final answer:

The statement is false; in an AdaBoost model, weights of samples are updated after each iteration to focus subsequent learners on misclassified samples.

Step-by-step explanation:

The statement 'Weights of each sample remain the same for each subsequent weak learner in an AdaBoost model' is false. In the AdaBoost algorithm, the weights of samples are updated after each iteration based on whether the weak learner has correctly classified them. If a sample is misclassified, its weight is increased so that subsequent weak learners focus more on those difficult cases. This iterative process is designed to improve the model by focusing on the most challenging samples.
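The update described above can be sketched with a small, self-contained example. This is not a full AdaBoost implementation, just one round of the standard sample-weight update, with made-up labels and predictions for a single weak learner:

```python
import numpy as np

# One round of the AdaBoost sample-weight update (illustrative data).
y_true = np.array([1, 1, -1, -1, 1])        # true labels
y_pred = np.array([1, -1, -1, 1, 1])        # weak learner's predictions
w = np.full(len(y_true), 1 / len(y_true))   # uniform initial weights

# Weighted error of the weak learner
miss = y_pred != y_true
eps = w[miss].sum()                         # here: 0.4

# Learner's vote weight (alpha), per the standard AdaBoost formula
alpha = 0.5 * np.log((1 - eps) / eps)

# Increase weights of misclassified samples, decrease correct ones,
# then renormalize so the weights sum to 1
w = w * np.exp(-alpha * y_true * y_pred)
w /= w.sum()

# Misclassified samples (indices 1 and 3) now carry weight 0.25 each;
# correctly classified samples drop to 1/6 each.
print(np.round(w, 3))
```

After the update, the next weak learner is trained against these new weights, so the two misclassified samples count more in its weighted error, which is exactly why the statement in the question is false.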

by User Finisinfinitatis (8.3k points)