Merge pull request #41 from ahadc/patch-1
Update README.md
igrigorik authored Mar 25, 2018
2 parents 7481ea9 + f4eef94 commit 5ba86e0
Showing 1 changed file (README.md) with 6 additions and 6 deletions.
@@ -1,22 +1,22 @@
# Decision Tree

-A Ruby library which implements ID3 (information gain) algorithm for decision tree learning. Currently, continuous and discrete datasets can be learned.
+A Ruby library which implements the [ID3 (information gain)](https://en.wikipedia.org/wiki/ID3_algorithm) algorithm for decision tree learning. Both continuous and discrete datasets can currently be learned.

- Discrete model assumes unique labels and can be graphed and converted into a PNG for visual analysis
- Continuous model looks at all possible values for a variable and iteratively chooses the best threshold between them. This results in a binary tree which is partitioned by the threshold at every step (e.g. temperature > 20C)
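At each split, ID3 picks the attribute whose partition of the data yields the highest information gain, i.e. the largest reduction in label entropy. A minimal, self-contained sketch of that computation (illustrative only, not the library's internals; function names are assumptions):

```ruby
# Entropy of a list of class labels: -sum(p * log2(p)) over label frequencies.
def entropy(labels)
  n = labels.length.to_f
  labels.tally.values.sum { |count| p = count / n; -p * Math.log2(p) }
end

# Information gain of splitting the rows on the attribute at attr_index.
# Each row is an array whose last element is the class label.
def information_gain(rows, attr_index)
  total = entropy(rows.map(&:last))
  n = rows.length.to_f
  groups = rows.group_by { |row| row[attr_index] }
  total - groups.values.sum { |sub| (sub.length / n) * entropy(sub.map(&:last)) }
end

data = [
  ['sunny', 'no'], ['sunny', 'no'],
  ['rainy', 'yes'], ['rainy', 'yes'],
]
puts information_gain(data, 0)  # => 1.0 (the attribute fully determines the label)
```

For the continuous case, the same gain computation is evaluated at each candidate threshold between observed values, and the best-scoring threshold becomes the binary split.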

## Features
- ID3 algorithms for continuous and discrete cases, with support for inconsistent datasets.
-- Graphviz component to visualize the learned tree (http://rockit.sourceforge.net/subprojects/graphr/)
-- Support for multiple, and symbolic outputs and graphing of continuos trees.
+- [Graphviz component](http://rockit.sourceforge.net/subprojects/graphr/) to visualize the learned tree
+- Support for multiple and symbolic outputs, and graphing of continuous trees.
- Returns default value when no branches are suitable for input

## Implementation

-- Ruleset is a class that trains an ID3Tree with 2/3 of the training data, converts it into a set of rules and prunes the rules with the remaining 1/3 of the training data (in a C4.5 way).
+- Ruleset is a class that trains an ID3Tree with 2/3 of the training data, converts it into a set of rules, and prunes the rules with the remaining 1/3 of the training data (in the style of [C4.5](https://en.wikipedia.org/wiki/C4.5_algorithm)).
- Bagging is a bagging-based trainer which trains 10 Ruleset trainers and, when predicting, chooses the best output by voting.
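The two bullets above can be sketched as follows. This is an illustrative toy, not the gem's internals; all names here are assumptions:

```ruby
# Split rows into the 2/3 used to train the tree and the 1/3 used to
# prune the extracted rules (the Ruleset idea described above).
def train_prune_split(rows)
  cut = (rows.length * 2) / 3
  [rows.take(cut), rows.drop(cut)]
end

# The voting step a Bagging-style trainer performs: each trained Ruleset
# predicts, and the most common answer wins.
def majority_vote(predictions)
  predictions.tally.max_by { |_, count| count }.first
end

rows = (1..9).to_a
train, prune = train_prune_split(rows)
puts train.length  # => 6
puts prune.length  # => 3

puts majority_vote(['sick', 'healthy', 'sick'])  # => sick
```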

-Blog post with explanation & examples: http://www.igvita.com/2007/04/16/decision-tree-learning-in-ruby/
+[Blog post with explanation & examples](http://www.igvita.com/2007/04/16/decision-tree-learning-in-ruby/)

## Example

@@ -68,4 +68,4 @@ puts "Predicted: #{decision} ... True decision: #{test.last}"

## License

-The MIT License - Copyright (c) 2006 Ilya Grigorik
+The [MIT License](https://opensource.org/licenses/MIT) - Copyright (c) 2006 Ilya Grigorik
