In this paper, we find that these three attribute splitting criteria (Shannon entropy, Gain Ratio and Gini index) can be unified in a Tsallis entropy framework.
In summary, Tsallis entropy unifies three kinds of split criteria, namely Shannon entropy, Gain Ratio and Gini index, and generalizes the split criterion of ...
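A brief sketch of why this unification holds, using the standard definition of Tsallis entropy (the exact parameterization used in the paper may differ): for a class distribution p_1, ..., p_k, the Tsallis entropy with index q is
\[
S_q = \frac{1}{q-1}\left(1 - \sum_{i=1}^{k} p_i^{q}\right).
\]
In the limit q \to 1 it recovers the Shannon entropy -\sum_i p_i \log p_i (the basis of information gain and, after normalization, Gain Ratio), and at q = 2 it equals the Gini index 1 - \sum_i p_i^{2}, so a single parameter q interpolates among the three classical criteria.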
ABSTRACT. Owing to its simplicity and flexibility, the decision tree remains an important analysis tool in many real-world learning tasks.
The proposed method relies on two stages. First, a two-step classification algorithm is implemented; in its first step, a decision classification tree is built ...
Nov 25, 2015 · In this paper, a Tsallis Entropy Criterion (TEC) algorithm is proposed to unify Shannon entropy, Gain Ratio and Gini index, which generalizes the split ...
Mar 5, 2017 · Experimental evidence demonstrates that UTCDT achieves statistically significant improvement over the classical decision tree algorithms, even ...
"Unifying attribute splitting criteria of decision trees by Tsallis entropy." 2017 IEEE International Conference on Acoustics, Speech and Signal Processing ...
Apr 19, 2022 · The classical C4.5 decision tree, based on Shannon entropy, is a simple algorithm that calculates the gain ratio and then splits on attributes ...
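For reference, the gain ratio mentioned here is the standard C4.5 criterion: for a dataset D and a candidate attribute A with values v,
\[
\mathrm{Gain}(D,A) = H(D) - \sum_{v} \frac{|D_v|}{|D|}\, H(D_v),
\qquad
\mathrm{SplitInfo}(D,A) = -\sum_{v} \frac{|D_v|}{|D|} \log_2 \frac{|D_v|}{|D|},
\]
\[
\mathrm{GainRatio}(D,A) = \frac{\mathrm{Gain}(D,A)}{\mathrm{SplitInfo}(D,A)},
\]
where H is the Shannon entropy of the class labels and D_v is the subset of D for which A takes value v.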
Mar 15, 2017 · Firstly, the new split criterion is based on two-term Tsallis conditional entropy, which is better than conventional one-term split criteria.
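As an illustration of how such a tunable criterion can be plugged into a tree learner, here is a minimal sketch in Python. The function names (tsallis_entropy, split_gain) are hypothetical, and the sketch uses the simpler one-term impurity-reduction recipe rather than the two-term conditional criterion described in the paper.

import numpy as np

def tsallis_entropy(p, q):
    # Tsallis entropy of a probability vector p with index q.
    # q -> 1 converges to Shannon entropy; q = 2 equals the Gini index.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop empty classes to avoid log(0)
    if abs(q - 1.0) < 1e-9:           # Shannon limit
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def class_probs(y):
    # Empirical class distribution of a label array.
    _, counts = np.unique(y, return_counts=True)
    return counts / counts.sum()

def split_gain(y_parent, y_left, y_right, q):
    # Impurity reduction of a binary split, weighting each child by its size,
    # i.e. the usual information-gain recipe with Tsallis entropy plugged in.
    n = len(y_parent)
    parent = tsallis_entropy(class_probs(y_parent), q)
    children = (len(y_left) / n) * tsallis_entropy(class_probs(y_left), q) \
        + (len(y_right) / n) * tsallis_entropy(class_probs(y_right), q)
    return parent - children

# q = 2 reproduces the Gini-based gain of CART-style trees; q -> 1 approaches
# the Shannon-entropy information gain used by ID3/C4.5 (before the Gain Ratio
# normalization).
y = np.array([0, 0, 0, 1, 1, 1])
print(split_gain(y, y[:3], y[3:], q=2.0))   # 0.5: a perfect split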