Decision Tree Advantages
Compared with other data mining methods, decision trees have several advantages:
- Simple to understand and interpret. People are able to understand decision tree models after a brief explanation.
- Requires little data preparation. Other techniques often require data normalisation, the creation of dummy variables, and the removal of blank values.
- Able to handle both numerical and categorical data. Other techniques are usually specialised in analysing datasets that have only one type of variable. For example, relation rules can be used only with nominal variables, while neural networks can be used only with numerical variables.
- Uses a white box model. If a given situation is observable in a model, the condition is easily explained by Boolean logic (see the sketch after this list). By contrast, an artificial neural network is a black box model, since the explanation for its results is difficult to understand.
- Possible to validate a model using statistical tests. That makes it possible to account for the reliability of the model.
- Robust. Performs well even if its assumptions are somewhat violated by the true model from which the data were generated.
- Performs well on large datasets. Large amounts of data can be analysed in a short time using standard computing resources.
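To illustrate the white-box property and the minimal data preparation noted above, the following is a minimal sketch assuming scikit-learn is available; the Iris dataset, the max_depth setting, and the 70/30 split are illustrative choices, not part of the original text.

```python
# Sketch: train a small decision tree and print its learned rules,
# illustrating that each prediction reduces to explicit threshold tests.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.3, random_state=0
)

# No normalisation or other preparation of the numeric features is required.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

# The fitted model can be read directly as a set of if/else rules.
print(export_text(clf, feature_names=list(iris.feature_names)))
print("held-out accuracy:", clf.score(X_test, y_test))
```

The printed rules trace every prediction through a chain of Boolean conditions on the input features, which is the sense in which a decision tree is a white box model.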