Conditional inference trees and forests

Unlike the previous algorithms, conditional inference trees rely on statistical significance to select the attributes on which to partition the data. In conditional inference trees, the class attribute is modeled, iteratively, as a function of the other attributes. In short, the algorithm first searches for attributes that significantly predict the class, using a null-hypothesis test that can be specified in the function call. The strongest predictor (if any) is then selected for the first partition. Child nodes are created by splitting on the selected attribute; if the attribute is numeric, the split point is chosen to maximize the goodness of the split (we do not detail the required computations here). The algorithm then repeats the operation in each of the resulting nodes and continues until no attribute remains, or none is significantly related to the class. More information is available in the documentation of the partykit package.
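The attribute-selection step described above can be illustrated with a small sketch. The following Python code is not the partykit implementation (which uses a conditional permutation-test framework with its own test statistics); it is a simplified illustration, using a chi-square statistic with a permutation-based p-value, of the idea that the attribute most significantly associated with the class is chosen for the split, and that splitting stops when no attribute is significant. The function names (`select_split_attribute`, `perm_p_value`) and the significance level `alpha` are illustrative choices, not part of any library API.

```python
import random
from collections import Counter

def chi_sq_stat(xs, ys):
    """Chi-square statistic for association between two categorical vectors."""
    n = len(xs)
    cx, cy = Counter(xs), Counter(ys)
    joint = Counter(zip(xs, ys))
    stat = 0.0
    for a in cx:
        for b in cy:
            expected = cx[a] * cy[b] / n
            observed = joint.get((a, b), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

def perm_p_value(xs, ys, n_perm=2000, seed=0):
    """Permutation p-value: how often does a shuffled class vector
    produce an association at least as strong as the observed one?"""
    rng = random.Random(seed)
    observed = chi_sq_stat(xs, ys)
    ys_shuffled = list(ys)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(ys_shuffled)
        if chi_sq_stat(xs, ys_shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

def select_split_attribute(rows, attrs, target, alpha=0.05):
    """Return (attribute, p-value) for the most significant predictor
    of the class, or None if no attribute is significant (stop splitting)."""
    ys = [r[target] for r in rows]
    best = None
    for a in attrs:
        p = perm_p_value([r[a] for r in rows], ys)
        if p < alpha and (best is None or p < best[1]):
            best = (a, p)
    return best

# Toy data: x1 perfectly predicts the class y, x2 is unrelated to it.
rows = [{"x1": "a" if i % 2 == 0 else "b",
         "x2": "u" if i % 4 < 2 else "v",
         "y": "pos" if i % 2 == 0 else "neg"} for i in range(40)]
print(select_split_attribute(rows, ["x1", "x2"], "y"))  # selects "x1"
```

Returning `None` when no attribute reaches significance is what gives this family of trees its built-in stopping criterion: unlike gain-based trees, no pruning step is needed afterwards.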
