In class, we discussed how AdaBoost builds a strong classifier by sequentially combining weak learners. Answer the following clearly:

(a) Explain why the first weak learner in AdaBoost is typically only slightly better than random guessing. (1 point)
(b) After the first learner is built, the observation weights are updated. Explain precisely how and why the weights are changed. (2 points)
(c) Suppose a weak learner has an error rate greater than 50%. What would this imply in AdaBoost, and why would such a learner not be useful? (1 point)
(d) In plain language, explain why combining many weak learners sequentially can outperform a single strong decision tree. (1 point)
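
For reference while thinking through (b)-(d), below is a minimal, illustrative AdaBoost sketch in Python (not part of the assignment, and not the only correct formulation). It assumes binary labels y in {-1, +1} and scikit-learn decision stumps as the weak learners; the function names adaboost_fit and adaboost_predict are our own. Note how misclassified points gain weight after each round (part b), how a weighted error of 50% or more halts boosting (part c), and how the final prediction is a weighted vote over all weak learners (part d).

    # Illustrative sketch only: classic binary AdaBoost with decision stumps.
    # Assumes labels y are in {-1, +1}.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def adaboost_fit(X, y, n_rounds=50):
        n = len(y)
        w = np.full(n, 1.0 / n)        # start with uniform observation weights
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.sum(w * (pred != y))   # weighted error rate
            if err >= 0.5:                  # no better than chance: stop (part c)
                break
            err = max(err, 1e-12)           # guard against log(0) when err = 0
            alpha = 0.5 * np.log((1 - err) / err)   # this learner's vote weight
            # Up-weight misclassified points, down-weight correct ones (part b)
            w *= np.exp(-alpha * y * pred)
            w /= w.sum()                    # renormalize to a distribution
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, alphas

    def adaboost_predict(X, stumps, alphas):
        # Weighted majority vote of all weak learners (part d)
        agg = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
        return np.sign(agg)

One detail worth noticing for part (c): with this standard update, an error rate above 50% would make alpha negative, meaning the ensemble would do better by inverting that learner's predictions; hence such a learner, taken as-is, adds nothing beyond chance.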