Which of the following statements is/are correct regarding AdaBoost?

A) It builds weak learners (decision trees) with restricted depth
B) It builds weak learners (decision trees) until each tree is fully grown
C) Weights of incorrectly classified points are decreased
D) Weights of incorrectly classified points are increased

a. A only
b. B only
c. C only
d. D only


1 Answer


Final answer:

AdaBoost is a machine learning algorithm that builds weak learners with restricted depth, typically shallow decision trees, and increases the weights of incorrectly classified points so that later learners focus on them, combining the weak learners into a strong overall classifier. The correct statements are A) It builds weak learners (decision trees) with restricted depth, and D) Weights of incorrectly classified points are increased.

Step-by-step explanation:

The AdaBoost algorithm, short for Adaptive Boosting, builds a strong classifier by sequentially training and combining a set of weak learners, typically shallow decision trees. Let's address each statement in turn:

  • A) It builds weak learners (decision trees) with restricted depth: Correct. AdaBoost typically uses short decision trees, often depth-one trees known as decision stumps, as its weak learners.
  • B) It builds weak learners (decision trees) until each tree is fully grown: Incorrect. AdaBoost's usual weak learners are stumps with a single split, which are far from fully grown trees.
  • C) Weights of incorrectly classified points are decreased: Incorrect. The weights of incorrectly classified points are increased, not decreased, so that subsequent weak learners focus more on them.
  • D) Weights of incorrectly classified points are increased: Correct. Increasing the weight of misclassified points after each round is central to AdaBoost; it forces subsequent classifiers to concentrate on the harder, more challenging cases.
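The weight-update rule behind statements C and D can be sketched in a few lines of NumPy. This is a minimal illustration of one round of discrete AdaBoost with made-up labels and predictions, not a full implementation: misclassified points are multiplied by a factor greater than one, correctly classified points by a factor less than one.

```python
import numpy as np

# One round of the AdaBoost weight update, assuming labels in {-1, +1}
# and a hypothetical weak learner's predictions (toy data).
y      = np.array([ 1, -1,  1,  1, -1])   # true labels
y_pred = np.array([ 1,  1,  1, -1, -1])   # weak learner's predictions (2 mistakes)
w      = np.full(len(y), 1 / len(y))      # start from uniform sample weights

miss  = (y_pred != y)                     # incorrectly classified points
err   = np.sum(w[miss])                   # weighted error of the weak learner
alpha = 0.5 * np.log((1 - err) / err)     # this learner's weight in the ensemble

# Misclassified points are scaled by e^alpha (> 1), correct ones by e^-alpha (< 1):
w = w * np.exp(alpha * np.where(miss, 1.0, -1.0))
w = w / w.sum()                           # renormalize to a distribution
```

After renormalization the misclassified points carry exactly half of the total weight, which is why the next weak learner is pushed toward them.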

So, the correct answer to your question is that statements A and D are correct regarding AdaBoost. The technique's emphasis on iteratively correcting mistakes by increasing the focus on harder-to-classify examples is critical to its functioning.
