# Decision Trees Questions and Answers – Threshold Based Splitting Rules

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Decision Trees – Threshold Based Splitting Rules”.

1. Which of the following statements is not true about Decision trees?
a) It builds classification models in the form of a tree structure
b) It builds regression models in the form of a tree structure
c) The final result is a tree with decision nodes and leaf nodes
d) It never breaks down a dataset into smaller subsets as the depth of the tree increases

Explanation: A decision tree breaks a dataset down into smaller and smaller subsets as the depth of the tree increases, and it builds both classification and regression models in the form of a tree structure with decision nodes and leaf nodes.
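As a concrete illustration, the tree structure above can be sketched in a few lines of Python. The dictionary-based node layout and the field names (`feature`, `threshold`, `leaf`) are purely illustrative assumptions, not part of any standard API:

```python
# Hypothetical sketch: a decision tree as nested dictionaries, with
# decision nodes (a feature test plus threshold) and leaf nodes (a label).

def predict(node, x):
    """Walk from the root to a leaf, following threshold tests."""
    while "leaf" not in node:
        branch = "left" if x[node["feature"]] <= node["threshold"] else "right"
        node = node[branch]
    return node["leaf"]

# A toy tree: a root decision node, one internal decision node, three leaves.
tree = {
    "feature": 0, "threshold": 5.0,
    "left": {"leaf": "A"},
    "right": {
        "feature": 1, "threshold": 2.0,
        "left": {"leaf": "B"},
        "right": {"leaf": "C"},
    },
}

print(predict(tree, [3.0, 0.0]))  # prints A, since x[0] <= 5.0
```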

2. Splitting is the process of dividing a node into two or more sub-nodes.
a) True
b) False

Explanation: Splitting in a decision tree is the process of dividing a node into two or more sub-nodes. When a sub-node splits into further sub-nodes, it is called a decision node; nodes that do not split are called leaf or terminal nodes.
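The splitting step itself can be sketched as follows. This is a hypothetical helper, assuming rows are stored as lists and the split rule is a simple threshold test on one feature:

```python
# Hypothetical sketch: divide the rows at a node into two sub-nodes
# according to a threshold test on a single feature.

def split(rows, feature, threshold):
    """Return the (left, right) sub-nodes induced by the test."""
    left = [r for r in rows if r[feature] <= threshold]
    right = [r for r in rows if r[feature] > threshold]
    return left, right

rows = [[1.0], [6.0], [3.0]]
left, right = split(rows, feature=0, threshold=3.0)
# left = [[1.0], [3.0]], right = [[6.0]]
```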

3. Real-valued feature problems in decision trees cannot be solved using the ID3 algorithm.
a) True
b) False

Explanation: A real-valued feature problem in a decision tree cannot be solved by ID3 directly. But it can be reduced to a binary feature problem using threshold-based splitting rules, and then ID3 can be applied to the constructed binary features.
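The reduction can be sketched as follows. One illustrative (assumed) choice is to place candidate thresholds at midpoints between consecutive distinct values of the feature, each threshold yielding one binary feature:

```python
# Hypothetical sketch: reduce one real-valued feature to a family of
# binary features, one per candidate threshold.

def candidate_thresholds(values):
    """Midpoints between consecutive distinct sorted values."""
    distinct = sorted(set(values))
    return [(a + b) / 2 for a, b in zip(distinct, distinct[1:])]

def binarize(values, threshold):
    """Binary feature: 1 if the value is <= threshold, else 0."""
    return [1 if v <= threshold else 0 for v in values]

feature = [2.0, 4.0, 4.0, 7.0]
thresholds = candidate_thresholds(feature)              # [3.0, 5.5]
binary_features = [binarize(feature, t) for t in thresholds]
```

ID3 can then treat each of these 0/1 columns as an ordinary discrete feature.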

4. Which of the following statements is not true about reducing a real valued feature problem into binary feature?
a) It utilizes the threshold based splitting rules
b) Once the binary features are constructed the ID3 algorithm can be easily applied
c) After ID3 is applied, it is easy to verify that there exists a decision tree with a different training error
d) After ID3 is applied, it is easy to verify that there exists a decision tree with the same number of nodes

Explanation: Once the real-valued features are reduced to binary features, ID3 can be applied. It is then easy to verify that, for any decision tree with threshold-based splitting rules over the original real-valued features, there exists a decision tree over the constructed binary features with the same training error and the same number of nodes.
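Once the binary features are constructed, ID3's usual information-gain criterion applies to them unchanged. A minimal sketch of that gain computation (the function names and entropy-based criterion are illustrative assumptions):

```python
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels (0.0 for an empty list)."""
    n = len(labels)
    if n == 0:
        return 0.0
    return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                for c in set(labels))

def gain(binary_feature, labels):
    """Information gain of splitting on a 0/1 feature."""
    n = len(labels)
    left = [y for b, y in zip(binary_feature, labels) if b == 1]
    right = [y for b, y in zip(binary_feature, labels) if b == 0]
    conditional = ((len(left) / n) * entropy(left)
                   + (len(right) / n) * entropy(right))
    return entropy(labels) - conditional

# A perfectly separating binary feature has gain equal to the full entropy.
print(gain([1, 1, 0, 0], [0, 0, 1, 1]))  # 1.0
```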

5. If the original number of real valued features is d and the number of examples is m, then which of the following statements is not true?
a) The number of constructed binary features becomes d^m
b) Calculating the Gain of each feature might take O(dm²) operations
c) With a more improved implementation the run time can be reduced to O(dm log m)
d) The constructed binary features are dm in number

Explanation: If the original number of real-valued features is d and the number of examples is m, then the number of constructed binary features is dm, not d^m. Calculating the Gain of each feature naively might take O(dm²) operations, but with a more careful implementation the run time can be reduced to O(dm log m).
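The speed-up comes from sorting each feature once and then sweeping all candidate thresholds while updating class counts incrementally, so each feature costs O(m log m) rather than O(m²). A hypothetical sketch of that single-feature sweep (function names are illustrative):

```python
from math import log2

def entropy_from_counts(counts, total):
    """Entropy computed from a class -> count mapping."""
    if total == 0:
        return 0.0
    return -sum((c / total) * log2(c / total) for c in counts.values() if c)

def best_threshold(values, labels):
    """Best threshold for one real-valued feature in O(m log m) time.

    Sort once, then sweep candidate thresholds, moving one example at a
    time from the right-hand class counts to the left-hand class counts.
    """
    pairs = sorted(zip(values, labels))
    m = len(pairs)
    right = {}
    for _, y in pairs:
        right[y] = right.get(y, 0) + 1
    base = entropy_from_counts(right, m)
    left = {}
    best_gain, best_t = 0.0, None
    for i in range(m - 1):
        v, y = pairs[i]
        left[y] = left.get(y, 0) + 1
        right[y] -= 1
        if pairs[i + 1][0] == v:
            continue  # no threshold can fall inside a run of equal values
        n_left = i + 1
        conditional = ((n_left / m) * entropy_from_counts(left, n_left)
                       + ((m - n_left) / m) * entropy_from_counts(right, m - n_left))
        g = base - conditional
        if g > best_gain:
            best_gain = g
            best_t = (v + pairs[i + 1][0]) / 2
    return best_t, best_gain
```

Running this over all d features gives the O(dm log m) total mentioned above.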

Sanfoundry Global Education & Learning Series – Machine Learning.

To practice all areas of Machine Learning, here is the complete set of 1000+ Multiple Choice Questions and Answers.