Learning decision trees in continuous space
Main Authors: | |
---|---|
Corporate Author: | |
Format: | Article |
Published: | 2001 |
Series: | Acta cybernetica 15 No. 2 |
Keywords: | Computer science, Cybernetics |
Subjects: | |
Online Access: | http://acta.bibl.u-szeged.hu/12674 |
Summary: | Two problems of the ID3 and C4.5 decision tree building methods will be mentioned and solutions will be suggested for them. First, in both methods a Gain-type criterion, which derives from the entropy function, is used to compare the applicability of possible tests. We propose a new measure instead of the entropy function, which comes from the measure of fuzziness using a monotone fuzzy operator. It is more natural and much simpler to compute in the case of concept learning (when elements belong to only two classes: positive and negative). Second, the well-known extension of the ID3 method for handling continuous attributes (C4.5) is based on the discretization of attribute values, and in it the decision space is separated with axis-parallel hyperplanes. In our proposed new method (CDT), continuous attributes are handled without discretization, and arbitrary geometric figures are used for separating the decision space, such as hyperplanes in general position, spheres and ellipsoids. The power of our new method is demonstrated on a few examples. |
Physical Description: | 213-224 |
ISSN: | 0324-721X |
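
The two ideas in the summary above can be illustrated with a small, hypothetical sketch; it is not the paper's CDT algorithm. In the snippet below, the names `entropy`, `fuzziness` and `split_gain` are made up for illustration, `min(p, 1 - p)` is only a logarithm-free stand-in for the abstract's measure built from a monotone fuzzy operator, and the spherical test uses a hand-picked centre and radius, whereas an actual learner would search for them. It merely shows how a Gain-type (impurity-decrease) criterion can be evaluated for a C4.5-style axis-parallel test versus a spherical region test on two-class data.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a two-class node with positive-class fraction p."""
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def fuzziness(p):
    """Illustrative fuzziness-style impurity min(p, 1 - p): zero for pure
    nodes, maximal at p = 0.5, and computable without logarithms.  This is
    only a stand-in for the paper's monotone-fuzzy-operator-based measure."""
    return min(p, 1 - p)

def split_gain(y, mask, impurity):
    """Impurity decrease from splitting 0/1 labels y with a boolean mask."""
    w_left = mask.mean()
    w_right = 1.0 - w_left
    def node(part):
        return impurity(part.mean()) if len(part) else 0.0
    return node(y) - w_left * node(y[mask]) - w_right * node(y[~mask])

# Synthetic two-class data: positives lie inside a disc of radius 0.6.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (np.linalg.norm(X, axis=1) <= 0.6).astype(int)

# C4.5-style axis-parallel test versus a spherical region test.
axis_mask = X[:, 0] <= 0.0
sphere_mask = np.linalg.norm(X - np.array([0.0, 0.0]), axis=1) <= 0.6

for name, mask in [("axis-parallel x0 <= 0", axis_mask),
                   ("sphere ||x|| <= 0.6", sphere_mask)]:
    print(f"{name:22s}  entropy gain = {split_gain(y, mask, entropy):.3f}"
          f"  fuzziness gain = {split_gain(y, mask, fuzziness):.3f}")
```

On this data the spherical test matches the class boundary, so both criteria give it their maximal gain, while any axis-parallel threshold yields a near-zero gain; this is the kind of situation the abstract's non-axis-parallel separating figures are meant to handle.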