Decision Tree Pruning Questions and Answers

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Decision Tree Pruning”.

1. In the ID3 algorithm the returned tree will usually be very large.
a) True
b) False

Answer: a
Explanation: In the ID3 algorithm the returned tree will usually be very large. Such trees may have low empirical risk, but their true risk will tend to be high. One solution is to limit the number of iterations of ID3, leading to a tree with a bounded number of nodes.
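As a concrete illustration of bounding the size of the returned tree, here is a minimal sketch assuming scikit-learn is installed; note that scikit-learn's DecisionTreeClassifier implements CART rather than ID3, so it is used here only as a stand-in for a size-limited tree learner.

```python
# Sketch: bounding tree growth so the returned tree has a bounded number of nodes.
# scikit-learn's DecisionTreeClassifier (CART) is used here as a stand-in for ID3.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

unbounded = DecisionTreeClassifier(random_state=0).fit(X, y)
bounded = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# A binary tree of depth 3 has at most 2**4 - 1 = 15 nodes, trading a little
# empirical risk for (typically) lower true risk.
print("unbounded nodes:", unbounded.tree_.node_count)
print("bounded nodes:  ", bounded.tree_.node_count)
```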

2. Pruning a tree reduces it to a much smaller tree.
a) True
b) False

Answer: a
Explanation: Pruning a tree will reduce it to a much smaller tree, but still with a similar empirical error. So pruning is the process of adjusting the decision tree to minimize the misclassification error.

3. Pruning can only be performed by a bottom up walk on the decision tree.
a) True
b) False

Answer: b
Explanation: Pruning can occur in either a top-down or a bottom-up fashion. Usually it is performed by a bottom-up walk on the tree, in which each node might be replaced with one of its subtrees or with a leaf, but there are situations where top-down pruning is also used.
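A rough sketch of how a bottom-up pruning walk proceeds is given below; the Node class and the should_collapse criterion are illustrative assumptions, not a specific library's API. Children are pruned first, and only then is the node itself considered for replacement with a leaf.

```python
# Sketch of a bottom-up pruning walk over a hypothetical binary decision tree.
# Node and should_collapse are illustrative assumptions, not a library API.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Node:
    prediction: int                     # majority class stored at this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

    def is_leaf(self) -> bool:
        return self.left is None and self.right is None

def prune_bottom_up(node: Node, should_collapse: Callable[[Node], bool]) -> Node:
    """Prune the subtrees first, then decide whether to replace this node with a leaf."""
    if node.is_leaf():
        return node
    if node.left is not None:
        node.left = prune_bottom_up(node.left, should_collapse)
    if node.right is not None:
        node.right = prune_bottom_up(node.right, should_collapse)
    # With the children already pruned, consider collapsing this node into a leaf.
    if should_collapse(node):
        return Node(prediction=node.prediction)
    return node
```

The should_collapse criterion is where a concrete rule plugs in, for example the held-out error comparison used by reduced error pruning in question 10 below.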

4. Which of the following statements is not true about Pruning?
a) It removes the sections of the tree that provide little power to classify instances
b) It is a technique in machine learning and search algorithms to reduce the size of the decision trees
c) It increases the complexity of the final classifier
d) It improves the predictive accuracy by the reduction of overfitting

Answer: c
Explanation: Pruning reduces the complexity of the final classifier and improves the predictive accuracy by reducing overfitting. It is a technique in machine learning and search algorithms to reduce the size of decision trees, and it removes the sections of the tree that provide little power to classify instances.

5. Which of the following statements is not true about Pruning?
a) It reduces the size of the learning tree without reducing predictive accuracy
b) It will not optimise the performance of the tree
c) Top down pruning will traverse nodes and trim subtrees starting at the root
d) Bottom up pruning will traverse nodes and trim subtrees starting at the leaf nodes

Answer: b
Explanation: Pruning optimises the performance of the tree and reduces the size of the learning tree without reducing predictive accuracy. Bottom-up pruning traverses nodes and trims subtrees starting at the leaf nodes, while top-down pruning starts at the root.

6. Which of the following is not a Pruning technique?
a) Cost based pruning
b) Cost complexity pruning
c) Minimum error pruning
d) Maximum error pruning

Answer: d
Explanation: Maximum error pruning is not a pruning technique. Cost based pruning, cost complexity pruning, and minimum error pruning are three popular pruning techniques for decision trees.
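Of these, cost complexity pruning has a direct implementation in scikit-learn through the ccp_alpha parameter. A minimal sketch, assuming scikit-learn is available; the dataset and the simple held-out selection of alpha are illustrative choices:

```python
# Sketch: cost complexity pruning with scikit-learn's ccp_alpha parameter.
# The dataset and the held-out selection of alpha are illustrative choices.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Candidate alpha values come from the pruning path of the fully grown tree.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)

def held_out_score(alpha: float) -> float:
    tree = DecisionTreeClassifier(ccp_alpha=alpha, random_state=0).fit(X_tr, y_tr)
    return tree.score(X_te, y_te)

best_alpha = max(path.ccp_alphas, key=held_out_score)
pruned = DecisionTreeClassifier(ccp_alpha=best_alpha, random_state=0).fit(X_tr, y_tr)
print("chosen alpha:", best_alpha, "held-out accuracy:", pruned.score(X_te, y_te))
```

In a real workflow the alpha would be chosen on a separate validation set (or by cross-validation) and the test set kept only for the final evaluation.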

7. Which of the following statements is not true about the pruning in the decision tree?
a) When the decision tree is created, many of the branches will reflect anomalies in the training data due to noise
b) Overfitting happens when the learning algorithm continues to develop hypotheses that reduce the training set error at the cost of an increased test set error
c) It optimises the computational efficiency
d) It reduces the classification accuracy

Answer: d
Explanation: Pruning in decision trees improves the classification accuracy and optimises computational efficiency. When the decision tree is created, many of the branches will reflect anomalies in the training data due to noise, and overfitting happens when the learning algorithm continues to develop hypotheses that reduce training set error at the cost of an increased test set error.
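To make the pattern concrete, here is a short sketch assuming scikit-learn; the dataset, split, and alpha value are arbitrary illustrative choices. An unrestricted tree typically drives its training error to zero, while a pruned tree gives up a little training accuracy and often does at least as well on held-out data.

```python
# Sketch: training vs. held-out accuracy for an unrestricted and a pruned tree.
# The alpha value below is an arbitrary illustrative choice.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
pruned = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0).fit(X_tr, y_tr)

for name, clf in [("unrestricted", full), ("pruned (alpha=0.01)", pruned)]:
    print(f"{name}: train={clf.score(X_tr, y_tr):.3f}  test={clf.score(X_te, y_te):.3f}")
```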

8. Post pruning is also known as backward pruning.
a) True
b) False

Answer: a
Explanation: Post-pruning is also known as backward pruning. Here the decision tree is generated first and the non-significant branches are then removed. The tree is allowed to perfectly classify the training set and is then pruned.

9. Which of the following statements is not true about Post pruning?
a) It begins by generating the (complete) tree and then adjusting it with the aim of improving the classification accuracy on unseen instances
b) It begins by converting the tree to an equivalent set of rules
c) It would not overfit trees
d) It converts a complete tree to a smaller pruned one which predicts the classification of unseen instances at least as accurately

Answer: c
Explanation: Post-pruning deals with overfitted trees more successfully because it is not easy to precisely estimate when to stop growing the tree. It begins by generating the (complete) tree and then adjusting it with the aim of improving the classification accuracy on unseen instances. The other two statements describe the two principal methods of doing this.

10. Which of the following statements is not true about Reduced error pruning?
a) It is the simplest and most understandable method in decision tree pruning
b) It considers each of the decision nodes in the tree to be a candidate for pruning, which consists of removing the subtree rooted at that node and making it a leaf node
c) If the error rate of the new tree would be equal to or smaller than that of the original tree and that subtree contains no subtree with the same property, then the subtree is replaced by a leaf node
d) If the error rate of the new tree would be greater than that of the original tree and that subtree contains no subtree with the same property, then the subtree is replaced by a leaf node, meaning pruning is done

Answer: d
Explanation: If the error rate of the new tree would be greater than that of the original tree, the subtree is not replaced by a leaf node, meaning no pruning is done. The other three statements are true about Reduced error pruning.
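A minimal sketch of the reduced error pruning rule described above, over a hypothetical tree representation (the Node class and the predict/error helpers are illustrative assumptions, not a library API): a decision node is collapsed to a leaf only if the error of the pruned tree on a held-out validation set is no larger than that of the original tree.

```python
# Sketch: reduced error pruning on a hypothetical tree representation.
# Node, predict and error are illustrative assumptions, not a library API.
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class Node:
    prediction: int                     # majority class at this node
    split: Optional[Callable] = None    # x -> True (go left) / False (go right); None for a leaf
    left: Optional["Node"] = None       # internal nodes have both children
    right: Optional["Node"] = None

def predict(node: Node, x) -> int:
    while node.split is not None:
        node = node.left if node.split(x) else node.right
    return node.prediction

def error(tree: Node, val: List[Tuple[object, int]]) -> float:
    return sum(predict(tree, x) != y for x, y in val) / len(val)

def reduced_error_prune(node: Node, root: Node, val: List[Tuple[object, int]]) -> None:
    """Bottom-up walk: prune the children first, then try collapsing this node."""
    if node.split is None:
        return
    reduced_error_prune(node.left, root, val)
    reduced_error_prune(node.right, root, val)
    before = error(root, val)
    saved = (node.split, node.left, node.right)
    node.split, node.left, node.right = None, None, None   # tentatively collapse to a leaf
    if error(root, val) > before:                           # the error got worse:
        node.split, node.left, node.right = saved           # undo, i.e. no pruning is done
```

Because the children are pruned before their parent, every inner subtree has already been handled by the time a node is examined, which is the bottom-up behaviour referred to in questions 3 and 5.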
