Final answer:
Statements A, C, and D about bagging and boosting are true.
Bagging builds weak learners in parallel and gives each an equal vote, while boosting builds them sequentially and adjusts weights based on performance. Random Forest and AdaBoost are standard examples of each technique, respectively.
Step-by-step explanation:
Statement A is true: Bagging (Bootstrap Aggregating) builds weak learners in parallel and independently, while boosting builds weak learners in sequence, where each learner tries to correct the errors of its predecessor.
Statement B is false: bagging gives all weak learners an equal vote, whereas boosting assigns larger weights to the weak learners that perform better, so the final prediction is a performance-weighted combination rather than an equal one.
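The contrast in how the two methods weight their learners can be sketched in a few lines. This is a minimal illustration, not tied to any one library; the error rates are made-up values, and the boosting formula shown is AdaBoost's learner weight, alpha = 0.5 * ln((1 - err) / err).

```python
import math

# Hypothetical error rates for three weak learners (made-up values).
errors = [0.10, 0.25, 0.45]

# Bagging: every learner gets the same vote.
bagging_weights = [1.0 / len(errors)] * len(errors)

# Boosting (AdaBoost-style): alpha = 0.5 * ln((1 - err) / err),
# so the more accurate a learner is, the larger its vote weight.
boosting_weights = [0.5 * math.log((1 - e) / e) for e in errors]

print(bagging_weights)   # equal weights for all learners
print(boosting_weights)  # weights shrink as error grows
```

Note how the learner with 10% error receives roughly ten times the vote weight of the near-random learner with 45% error, while bagging ignores the difference entirely.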
Statement C is true: Bagging trains each weak learner on a sample drawn from the original dataset with replacement (a bootstrap sample), while boosting keeps one shared set of sample weights and, after each round, increases the weight of the observations that the previous learner misclassified.
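The two sampling schemes above can be sketched side by side. This is a toy illustration under stated assumptions: the dataset is just ten row indices, and the set of misclassified rows and the learner weight alpha are made-up values standing in for a previous boosting round.

```python
import math
import random

random.seed(0)
n = 10  # a hypothetical dataset of 10 rows, indexed 0..9

# Bagging: each learner gets its own bootstrap sample,
# drawn with replacement, so some rows repeat and some are left out.
bootstrap = [random.randrange(n) for _ in range(n)]

# Boosting: one shared set of sample weights, starting uniform.
weights = [1.0 / n] * n
misclassified = {2, 7}  # made-up mistakes of the previous learner
alpha = 0.8             # made-up weight of that learner

# AdaBoost-style update: multiply misclassified rows by exp(alpha),
# then renormalize so the weights again sum to 1.
for i in misclassified:
    weights[i] *= math.exp(alpha)
total = sum(weights)
weights = [w / total for w in weights]

print(sorted(bootstrap))  # duplicates show sampling with replacement
print(weights)            # rows 2 and 7 now carry more weight
```

The next boosting learner trains against these updated weights, which is what makes it focus on the previously misclassified observations.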
Finally, statement D is also true: Random Forest is a bagging technique (bagged decision trees with an added random feature subset at each split), and AdaBoost (Adaptive Boosting) is a boosting technique.