# Decision Tree Pruning Questions and Answers

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Decision Tree Pruning”.

1. In the ID3 algorithm the returned tree will usually be very large.
a) True
b) False

Answer: a
Explanation: In the ID3 algorithm the returned tree will usually be very large. Such trees may have low empirical risk, but their true risk will tend to be high. One solution is to limit the number of iterations of ID3, leading to a tree with a bounded number of nodes.
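A minimal sketch of this size-bounding idea, using scikit-learn's `DecisionTreeClassifier` (an assumption for illustration: scikit-learn implements an optimised CART variant rather than ID3 itself, but capping the depth shows the same trade-off):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Unbounded tree: grown until the leaves are pure, so it is usually large
# (low empirical risk, but the true risk tends to be high).
full = DecisionTreeClassifier(random_state=0).fit(X, y)

# Bounded tree: capping the depth bounds the number of nodes,
# trading a little training error for better generalisation.
bounded = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

print(full.tree_.node_count, bounded.tree_.node_count)
```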

2. Pruning a tree reduces it to a much smaller tree.
a) True
b) False

Answer: a
Explanation: Pruning a tree will reduce it to a much smaller tree, but one with a similar empirical error. So pruning is the process of adjusting the decision tree to minimize the misclassification error.

3. Pruning can only be performed by a bottom up walk on the decision tree.
a) True
b) False

Answer: b
Explanation: Pruning can occur in a top-down or bottom-up fashion. Usually, pruning is performed by a bottom-up walk on the tree, in which each node may be replaced with one of its subtrees or with a leaf; but there are situations where top-down pruning is also used.
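The bottom-up walk can be sketched as a post-order traversal, so that every node is visited only after its subtrees. The `Node` class below is an illustrative assumption, not from any particular library:

```python
# Hedged sketch: a bottom-up (post-order) pruning walk over a toy tree.
class Node:
    def __init__(self, label, left=None, right=None):
        self.label = label
        self.left = left
        self.right = right

def bottom_up_walk(node, visit):
    """Visit children before their parent, so each node can be
    considered for replacement after its subtrees have been pruned."""
    if node is None:
        return
    bottom_up_walk(node.left, visit)
    bottom_up_walk(node.right, visit)
    visit(node)

tree = Node("root",
            Node("a", Node("a1"), Node("a2")),
            Node("b"))
order = []
bottom_up_walk(tree, lambda n: order.append(n.label))
print(order)  # ['a1', 'a2', 'a', 'b', 'root'] - leaves first, root last
```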

4. Which of the following statements is not true about Pruning?
a) It removes the sections of the tree that provide little power to classify instances
b) It is a technique in machine learning and search algorithms to reduce the size of the decision trees
c) It increases the complexity of the final classifier
d) It improves the predictive accuracy by the reduction of overfitting

Answer: c
Explanation: Pruning reduces the complexity of the final classifier and improves the predictive accuracy by reducing overfitting. It is a technique in machine learning and search algorithms to reduce the size of decision trees, and it removes the sections of the tree that provide little power to classify instances.

5. Which of the following statements is not true about Pruning?
a) It reduces the size of learning tree without reducing predictive accuracy
b) It will not optimise the performance of the tree
c) Top down pruning will traverse nodes and trim subtrees starting at the root
d) Bottom up pruning will traverse nodes and trim subtrees starting at the leaf nodes

Answer: b
Explanation: Pruning optimises the performance of the tree and reduces the size of the learning tree without reducing predictive accuracy. Bottom-up pruning traverses nodes and trims subtrees starting at the leaf nodes, while top-down pruning starts at the root.

6. Which of the following is not a Pruning technique?
a) Cost based pruning
b) Cost complexity pruning
c) Minimum error pruning
d) Maximum error pruning

Answer: d
Explanation: Maximum error pruning is not a pruning technique. Cost based pruning, cost complexity pruning, and minimum error pruning are three popular pruning techniques for decision trees.
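Cost complexity pruning is available in scikit-learn (assuming it is installed) via the `ccp_alpha` parameter; a sketch of how the complexity penalty shrinks the tree:

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Effective alphas at which successive subtrees would be pruned away.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
ccp_alphas = path.ccp_alphas

# A larger ccp_alpha penalises tree complexity more heavily,
# yielding a smaller pruned tree.
full = DecisionTreeClassifier(random_state=0, ccp_alpha=0.0).fit(X, y)
pruned = DecisionTreeClassifier(random_state=0,
                                ccp_alpha=ccp_alphas[-2]).fit(X, y)
print(full.tree_.node_count, pruned.tree_.node_count)
```

In practice the candidate alphas from `cost_complexity_pruning_path` are evaluated on held-out data, and the alpha with the best validation accuracy is kept.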

7. Which of the following statements is not true about the pruning in the decision tree?
a) When the decision tree is created, many of the branches will reflect anomalies in the training data due to noise
b) Overfitting happens when the learning algorithm continues to develop hypotheses that reduce training set error at the cost of an increased test set error
c) It optimises the computational efficiency
d) It reduces the classification accuracy

Answer: d
Explanation: Pruning in decision trees improves the classification accuracy and optimises computational efficiency. When the decision tree is created, many of the branches will reflect anomalies in the training data due to noise, and overfitting happens when the learning algorithm continues to develop hypotheses that reduce training set error at the cost of an increased test set error.

8. Post pruning is also known as backward pruning.
a) True
b) False

Answer: a
Explanation: Post-pruning is also known as backward pruning. Here the decision tree is generated first and non-significant branches are then removed: the tree is allowed to perfectly classify the training set, and is then post-pruned.

9. Which of the following statements is not true about Post pruning?
a) It begins by generating the (complete) tree and then adjust it with the aim of improving the classification accuracy on unseen instances
b) It begins by converting the tree to an equivalent set of rules
c) It would not overfit trees
d) It converts a complete tree to a smaller pruned one which predicts the classification of unseen instances at least as accurately

Answer: c
Explanation: Post-pruning handles overfitted trees more successfully than early stopping because it is not easy to precisely estimate when to stop growing the tree. It begins by generating the (complete) tree and then adjusting it with the aim of improving the classification accuracy on unseen instances. The other two statements describe the two principal methods of doing this.

10. Which of the following statements is not true about Reduced error pruning?
a) It is the simplest and most understandable method in decision tree pruning
b) It considers each of the decision nodes in the tree to be a candidate for pruning, which consists of removing the subtree rooted at that node and making it a leaf node
c) If the error rate of the new tree would be equal to or smaller than that of the original tree, and that subtree contains no subtree with the same property, then the subtree is replaced by a leaf node
d) If the error rate of the new tree would be greater than that of the original tree, and that subtree contains no subtree with the same property, then the subtree is replaced by a leaf node, meaning pruning is done

Answer: d
Explanation: If the error rate of the new tree would be greater than that of the original tree, the subtree is kept rather than replaced by a leaf node, meaning no pruning is done. The other three statements are true of reduced error pruning.
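A self-contained sketch of reduced error pruning on a toy tree; the `Node` class, the `majority` helper, and the validation data are illustrative assumptions, not from any particular library:

```python
class Node:
    def __init__(self, feature=None, left=None, right=None, label=None):
        self.feature = feature      # feature index tested at internal nodes
        self.left = left            # branch taken when the feature is 0
        self.right = right          # branch taken when the feature is 1
        self.label = label          # predicted class at leaves

    def is_leaf(self):
        return self.label is not None

def predict(node, x):
    while not node.is_leaf():
        node = node.left if x[node.feature] == 0 else node.right
    return node.label

def error(node, data):
    return sum(predict(node, x) != y for x, y in data)

def majority(data):
    labels = [y for _, y in data]
    return max(set(labels), key=labels.count)

def reduced_error_prune(node, val_data):
    """Bottom-up: prune the children first, then replace this subtree with
    a majority leaf if that does not increase validation error."""
    if node.is_leaf() or not val_data:
        return node
    left_data = [(x, y) for x, y in val_data if x[node.feature] == 0]
    right_data = [(x, y) for x, y in val_data if x[node.feature] == 1]
    node.left = reduced_error_prune(node.left, left_data)
    node.right = reduced_error_prune(node.right, right_data)
    leaf = Node(label=majority(val_data))
    # Replace with a leaf only if its error is equal to or smaller.
    if error(leaf, val_data) <= error(node, val_data):
        return leaf
    return node

# Toy tree whose left subtree splits on a noisy feature.
tree = Node(feature=0,
            left=Node(feature=1, left=Node(label=0), right=Node(label=1)),
            right=Node(label=1))
val = [((0, 0), 0), ((0, 1), 0), ((1, 0), 1)]
pruned = reduced_error_prune(tree, val)
print(pruned.left.is_leaf(), error(pruned, val))  # True 0
```

Here the left subtree is collapsed to a leaf because the majority leaf makes fewer validation errors than the subtree, while the root split is kept because removing it would increase the error.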

Sanfoundry Global Education & Learning Series – Machine Learning.

