Ensemble ML Algorithm - AdaBoost

Hi CloudX Lab Team,

I need a clarification regarding the AdaBoost ensemble ML algorithm:

Once the predictors (i.e., the base classifiers in the ensemble, not the independent/x variables) are trained, they are assigned different weights based on their accuracy on the weighted training set. This technique is specific to AdaBoost.

Now there is another technique that I have come across, the Weighted Average Method. Is this also applicable to ensemble tree ML algorithms?

If the answer is yes, what are the indications for adopting the Weighted Average Method? Under what circumstances is this technique applicable?
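For context, here is a minimal sketch of what the Weighted Average Method usually means in an ensemble setting: each model's predicted class probabilities are combined with weights proportional to some quality score, commonly a validation accuracy. The model outputs and accuracies below are made-up illustrative numbers, not results from real models:

```python
# Hypothetical sketch of the Weighted Average Method: combine each
# model's predicted class probabilities using weights proportional
# to its (assumed) validation accuracy.

def weighted_average_predict(probas, weights):
    """Combine per-model class-probability lists with the given weights."""
    total = sum(weights)
    n_classes = len(probas[0])
    combined = [0.0] * n_classes
    for p, w in zip(probas, weights):
        for c in range(n_classes):
            combined[c] += (w / total) * p[c]
    return combined

# Three models' probabilities for classes [A, B] on one sample,
# weighted by their hypothetical validation accuracies.
model_probas = [[0.9, 0.1], [0.6, 0.4], [0.3, 0.7]]
val_accuracies = [0.85, 0.75, 0.60]

avg = weighted_average_predict(model_probas, val_accuracies)
predicted_class = max(range(len(avg)), key=lambda c: avg[c])
```

Note that the weights here are fixed after training and applied uniformly to every sample, which is the key difference from AdaBoost's sequential reweighting.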

The main philosophy behind AdaBoost is that it trains a sequence of classifiers, where each new classifier pays more attention to the training instances that its predecessors underfitted: the weights of those instances are increased, the next classifier is trained on the reweighted set, and so on. However, you could also train a set of weak classifiers yourself and combine their results. So what does AdaBoost do that this do-it-yourself method does not?

  1. It helps choose the training set for each new classifier based on the results of the previous classifiers.
  2. It also determines the weight that should be given to each classifier’s proposed answer when combining the results.
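The two points above can be sketched together in a toy AdaBoost run on 1-D data with threshold stumps. The dataset, the exhaustive stump search, and the fixed number of rounds are simplified assumptions for illustration, not the full algorithm as used in practice:

```python
import math

# Toy AdaBoost sketch with decision stumps on 1-D data, illustrating:
# (1) instance weights steer what each new stump is trained on, and
# (2) each stump gets a say (alpha) used when combining the votes.

X = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [1, 1, -1, -1, 1, 1]            # labels in {-1, +1}

def stump_predict(thresh, sign, x):
    return sign if x < thresh else -sign

def best_stump(weights):
    # Point 1: the *weighted* training set decides which stump is best,
    # so instances the previous stumps got wrong matter more now.
    best = None
    for thresh in [x + 0.5 for x in X]:
        for sign in (1, -1):
            err = sum(w for x, t, w in zip(X, y, weights)
                      if stump_predict(thresh, sign, x) != t)
            if best is None or err < best[0]:
                best = (err, thresh, sign)
    return best

weights = [1.0 / len(X)] * len(X)
ensemble = []
for _ in range(5):
    err, thresh, sign = best_stump(weights)
    err = max(err, 1e-10)
    # Point 2: a more accurate stump earns a larger say in the vote.
    alpha = 0.5 * math.log((1 - err) / err)
    ensemble.append((alpha, thresh, sign))
    # Increase weights of misclassified instances, then renormalise.
    weights = [w * math.exp(-alpha * t * stump_predict(thresh, sign, x))
               for x, t, w in zip(X, y, weights)]
    s = sum(weights)
    weights = [w / s for w in weights]

def predict(x):
    # Combine the stumps' votes, weighted by each stump's alpha.
    score = sum(a * stump_predict(th, sg, x) for a, th, sg in ensemble)
    return 1 if score >= 0 else -1
```

No single stump can fit this label pattern, but the weighted combination of stumps classifies all six points correctly, which is exactly what the two mechanisms above buy you over hand-combining independently trained weak classifiers.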