Decision Tree from Nodes
- Query the data - either map separate training and testing data, or split a single set of nodes into training and testing sets by defining the ratio used for training (a Cypher sketch follows this list)
- Run a decision tree procedure - specify the class label (target attribute)
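A minimal sketch of the split step in plain Cypher, assuming training/testing membership is recorded as a node property; the Record label and the split property are placeholders, not names required by the plugin:
// Randomly assign roughly 70% of the nodes to training and the rest to testing
MATCH (n:Record)
SET n.split = CASE WHEN rand() < 0.7 THEN 'train' ELSE 'test' END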
RETURN main.createTreeIG("targetAttribute","prune","max_depth")
Procedure to create the Information Gain decision tree from the dataset in the graph database. "targetAttribute" is the target attribute (class label). After running, the procedure creates the decision tree in Neo4j and displays the prediction time, generation time, confusion matrix and accuracy.
"prune": "True" to prune the tree, "False" otherwise.
"max_depth": the depth level at which to prune (for example, "3" for a depth of 3), or "0" when not pruning.
RETURN main.createTreeGI("targetAttribute","prune","max_depth")
Procedure to create the Gini Index decision tree from the dataset in the graph database. "targetAttribute" is the target attribute (class label). After running, the procedure creates the decision tree in Neo4j and displays the prediction time, generation time, confusion matrix and accuracy.
"prune": "True" to prune the tree, "False" otherwise.
"max_depth": the depth level at which to prune (for example, "3" for a depth of 3), or "0" when not pruning.
RETURN main.createTreeGR("targetAttribute","prune","max_depth")
Procedure to create the Gain Ratio decision tree from the dataset in the graph database. "targetAttribute" is the target attribute (class label). After running, the procedure creates the decision tree in Neo4j and displays the prediction time, generation time, confusion matrix and accuracy.
"prune": "True" to prune the tree, "False" otherwise.
"max_depth": the depth level at which to prune (for example, "3" for a depth of 3), or "0" when not pruning.