Extra Trees - ensemble learning

I am not able to clearly understand the difference between the Extra Trees and Random Forest classifiers.

Hi @SOUMYADEEP_BANIK_CHO,

The main difference between Random Forests and Extra Trees (short for Extremely Randomized Trees) lies in how splits are chosen: a Random Forest searches for the locally optimal feature/split combination at each node, whereas Extra Trees draw a random threshold for each feature under consideration and pick the best of those random splits.

When you are growing a tree in a Random Forest, only a random subset of the features is considered for splitting at each node. It is possible to make trees even more random by also using random thresholds for each feature, rather than searching for the best possible thresholds (as regular Decision Trees do).
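To make that concrete, here is a minimal sketch of the two split-selection strategies at a single node, using plain NumPy and Gini impurity on toy one-feature data (all names here are made up for illustration, not taken from any library):

```python
import numpy as np

rng = np.random.default_rng(42)

def gini(y):
    # Gini impurity of a label array (0.0 for an empty or pure node)
    if len(y) == 0:
        return 0.0
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_impurity(x, y, threshold):
    # Weighted Gini impurity of the two children produced by a threshold
    left, right = y[x <= threshold], y[x > threshold]
    n = len(y)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

# Toy data: one feature, noisy binary labels
x = rng.normal(size=200)
y = (x + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Random Forest style: exhaustively search candidate thresholds for the best one
best_t = min(np.unique(x), key=lambda t: split_impurity(x, y, t))

# Extra-Trees style: draw a single threshold uniformly within the feature's range
random_t = rng.uniform(x.min(), x.max())

print(f"best threshold:   {best_t:+.3f} -> impurity {split_impurity(x, y, best_t):.3f}")
print(f"random threshold: {random_t:+.3f} -> impurity {split_impurity(x, y, random_t):.3f}")
```

The random threshold usually yields a worse split than the exhaustive search, but it is chosen in constant time, which is where the training speed-up described below comes from.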

A forest of such extremely random trees is simply called an Extremely Randomized Trees ensemble (or Extra-Trees for short). Once again, this trades more bias for lower variance. It also makes Extra-Trees much faster to train than regular Random Forests, since finding the best possible threshold for each feature at every node is one of the most time-consuming tasks of growing a tree.
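In scikit-learn the two ensembles expose essentially the same API, so comparing them is a one-line swap. A quick sketch on synthetic data (the exact scores will of course vary with your dataset):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic classification problem, just for the side-by-side comparison
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

for Model in (RandomForestClassifier, ExtraTreesClassifier):
    clf = Model(n_estimators=100, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{Model.__name__}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```

On many datasets the two score comparably; the practical advantage of ExtraTreesClassifier is the faster training, for the reason given above.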

I hope it helps.

Regards