Decision Trees Questions and Answers – Inductive Bias

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Decision Trees – Inductive Bias”.

1. Inductive bias is also known as learning bias.
a) True
b) False

Answer: a
Explanation: Inductive bias is also known as learning bias and is associated with learning algorithms. It is the set of assumptions that the learner uses to predict outputs for inputs it has not encountered.

2. Which of the following statements is not true about inductive bias in decision trees?
a) It is harder to define because of heuristic search
b) Trees that place high information gain attributes close to the root are preferred
c) Trees that place high information gain attributes far away from the root are preferred
d) Shorter trees are preferred over longer ones

Answer: c
Explanation: Trees that place high information gain attributes close to the root are preferred over those that do not, and shorter trees are preferred over longer ones. The bias is harder to define precisely because ID3 performs a heuristic search.
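The preference for placing high information gain attributes near the root can be made concrete with a short sketch. The Python code below is illustrative only (the toy dataset, attribute names, and helper functions are hypothetical, not part of the question set): it computes entropy and information gain, and the attribute ID3 would place at the root is simply the one with the largest gain.

from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, target="play"):
    """Entropy reduction obtained by splitting rows on attr."""
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        remainder += len(subset) / len(rows) * entropy(subset)
    return base - remainder

# Hypothetical weather-style data:
data = [
    {"outlook": "sunny",    "wind": "weak",   "play": "no"},
    {"outlook": "sunny",    "wind": "strong", "play": "no"},
    {"outlook": "rain",     "wind": "weak",   "play": "yes"},
    {"outlook": "rain",     "wind": "strong", "play": "no"},
    {"outlook": "overcast", "wind": "weak",   "play": "yes"},
]
for a in ("outlook", "wind"):
    print(a, round(information_gain(data, a), 3))
# The attribute with the larger gain is the one ID3 would place at the root.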

3. According to Occam’s Razor, which of the following statements does not favor short hypotheses?
a) There are fewer short hypotheses than long hypotheses
b) A short hypothesis that fits the data is unlikely to be a coincidence
c) A long hypothesis that fits the data might be a coincidence
d) There are many ways to define a small set of hypotheses

Answer: d
Explanation: Occam’s Razor is the problem-solving principle that prefers the simplest hypothesis that fits the data. The argument against it is that there are many ways to define a small set of hypotheses. The other three statements are arguments in favor of short hypotheses.

4. Which of the following statements is not true about inductive bias in ID3?
a) It is the set of assumptions that, along with the training data, justify the classifications assigned by the learner to future instances
b) ID3 has a preference for short trees with high information gain attributes near the root
c) ID3 has a preference for certain hypotheses over others, with no hard restriction on the hypothesis space
d) ID3 has a preference for long trees with high information gain attributes far away from the root

Answer: d
Explanation: ID3 prefers short trees, not long ones, with high information gain attributes near the root. It prefers certain hypotheses over others, with no hard restriction on the hypothesis space. Its inductive bias is the set of assumptions that, along with the training data, justify the classifications the learner assigns to future instances.

5. Which of the following statements is not true about ID3?
a) ID3 searches incompletely through the hypothesis space, from simple to complex hypotheses, until its termination condition is met
b) Its inductive bias is solely a consequence of the ordering of hypotheses by its search strategy
c) Its hypothesis space introduces additional bias in each iteration
d) Its hypothesis space introduces no additional bias

Answer: c
Explanation: In ID3, the hypothesis space introduces no additional bias; the inductive bias is solely a consequence of the ordering of hypotheses by its search strategy. ID3 searches incompletely through this space, from simple to complex hypotheses, until its termination condition is met.

6. Which of the following statements is not true about Candidate elimination?
a) Candidate elimination searches the hypothesis space completely, finding every hypothesis consistent with the training data
b) Its inductive bias is solely a consequence of the ordering of hypotheses by its search strategy
c) Its inductive bias is solely a consequence of the expressive power of its hypothesis representation
d) Its search strategy introduces no additional bias

Answer: b
Explanation: Its inductive bias is not a consequence of the ordering of hypotheses by its search strategy; it is solely a consequence of the expressive power of its hypothesis representation. All the other statements are true of Candidate elimination.

7. Preference bias is more desirable than a restriction bias.
a) True
b) False

Answer: a
Explanation: Preference bias is more desirable than a restriction bias (language bias) because it allows the learner to work within a complete hypothesis space that is assured to contain the unknown target function.

8. Preference bias is also known as search bias.
a) True
b) False

Answer: a
Explanation: Preference bias is also known as search bias. It arises when a learning algorithm incompletely searches a complete hypothesis space, choosing which parts of that space to explore. Decision tree learning with ID3 is an example.
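As a rough illustration of search bias, the sketch below is hypothetical code (not taken from any textbook or library): an ID3-style learner that greedily commits to the single best attribute at each node and never backtracks. It therefore searches the complete space of decision trees only incompletely, which is a preference (search) bias rather than a restriction bias.

from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain(rows, attr, target):
    base = entropy([r[target] for r in rows])
    rem = 0.0
    for v in {r[attr] for r in rows}:
        part = [r[target] for r in rows if r[attr] == v]
        rem += len(part) / len(rows) * entropy(part)
    return base - rem

def id3(rows, attrs, target):
    labels = [r[target] for r in rows]
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]  # leaf: majority class
    # Greedy step: commit to the highest-gain attribute; no backtracking,
    # so most of the complete tree space is never examined.
    best = max(attrs, key=lambda a: gain(rows, a, target))
    return {best: {v: id3([r for r in rows if r[best] == v],
                          [a for a in attrs if a != best], target)
                   for v in {r[best] for r in rows}}}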
