Decision trees: Gini and entropy

Q. How and why does entropy produce slightly more balanced trees?
Q. If entropy produces slightly more balanced trees, why does it take more time than Gini?

Hi @SOUMYADEEP_BANIK_CHO,

Entropy is more computationally expensive because of the logarithm in its equation. Like Gini, the basic idea is to gauge the disorder of a grouping by the target variable, but instead of working with the plain probabilities, entropy weights each class probability by its log base 2 (you can use any log base, as long as you're consistent). Because that log term penalizes skewed class mixtures more heavily than Gini's simple quadratic, entropy tends to favor splits whose branches are slightly more balanced, while Gini is more inclined to isolate the majority class in its own branch.
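
To make that concrete, here is a minimal sketch of both impurity measures for a single node (plain Python with NumPy; the `gini`/`entropy` helpers and the example class counts are just illustrative, not from any particular library):

```python
import numpy as np

def gini(counts):
    """Gini impurity: 1 - sum(p_i^2) over the class probabilities."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(counts):
    """Entropy: -sum(p_i * log2(p_i)); the log call is the extra cost."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]          # 0 * log(0) is taken as 0, so drop empty classes
    return -np.sum(p * np.log2(p))

# Example node with 40 samples of class A and 10 of class B
print(gini([40, 10]))     # 0.32
print(entropy([40, 10]))  # ~0.722 bits
```

The squares in Gini are cheap arithmetic, while `np.log2` is a transcendental function, and that is where the extra time per candidate split comes from.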

The entropy equation uses logarithms because they have several advantageous properties. The main one is additivity: since log(ab) = log(a) + log(b), the information from independent events simply adds up.
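
As a quick numerical check of that additive property (again just an illustrative sketch, assuming two independent distributions of my own choosing), the entropy of the joint distribution equals the sum of the individual entropies, which is exactly what the logarithm buys you:

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two independent distributions
p_x = np.array([0.5, 0.5])      # e.g. a fair coin
p_y = np.array([0.25, 0.75])    # an unfair coin

# Joint distribution of the two independent variables: outer product
p_joint = np.outer(p_x, p_y).ravel()

# log(ab) = log(a) + log(b)  =>  H(X, Y) = H(X) + H(Y)
print(entropy(p_joint))             # ~1.811 bits
print(entropy(p_x) + entropy(p_y))  # ~1.811 bits
```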

I hope this answers both your questions.
Regards.

Crisp and precise explanation…!!! :+1: :+1: :+1:

Hi @ss7dec,

Doing my best.

Regards