This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Fundamental Theorem of PAC Learning”.
1. Any ERM rule is a successful PAC learner for hypothesis space H.
a) True
b) False
Answer: b
Explanation: An ERM rule only minimizes the hypothesis's error on the training sample, so it may overfit. A PAC learner must, with high probability, output a hypothesis whose true error over the distribution is small; by the fundamental theorem of PAC learning, ERM achieves this for hypothesis classes of finite VC dimension, not for an arbitrary H.
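The overfitting risk above can be sketched with a toy "memorizer" ERM rule (a hypothetical example, not from the original text): it stores the training sample exactly, so its training error is zero, yet on fresh instances it predicts a constant and gives no PAC guarantee.

```python
import random

random.seed(0)
# Hypothetical sketch: an ERM "memorizer" drives training error to zero
# by storing every (x, y) pair, yet generalizes no better than a constant
# predictor on unseen instances -- zero empirical risk, poor true risk.
train = [(random.random(), random.choice([0, 1])) for _ in range(20)]
memo = dict(train)                       # random floats, so keys are unique
h = lambda x: memo.get(x, 0)             # recall seen points, default to 0
train_error = sum(h(x) != y for x, y in train) / len(train)
print(train_error)                       # -> 0.0 on the training set
```

The point of the sketch is that empirical risk minimization alone says nothing about the error on instances outside the sample.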
2. If distribution D assigns zero probability to the instances where h is not equal to c, then the error will be ______
a) 1
b) 0.5
c) 0
d) infinite
Answer: c
Explanation: Every instance on which h and c disagree is drawn with probability zero, so any sampled instance satisfies h(x) = c(x) and no error occurs. Hence the probability of drawing a misclassified instance is 0.
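The explanation above can be sketched numerically (a hypothetical finite example): write the error as a D-weighted sum over the instance space; if D puts zero mass on every disagreement point, the sum is zero.

```python
# Hypothetical sketch: error_D(h, c) = sum of D(x) over all x with
# h(x) != c(x). If D assigns probability 0 to every such x, the error is 0.
X = [0, 1, 2, 3]
c = lambda x: x % 2 == 0                 # target concept
h = lambda x: x in (0, 2, 3)             # h and c disagree only on x = 3
D = {0: 0.5, 1: 0.3, 2: 0.2, 3: 0.0}    # D puts zero mass on x = 3
error = sum(p for x, p in D.items() if h(x) != c(x))
print(error)                             # -> 0.0
```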
3. If distribution D assigns zero probability to the instances where h = c, then the error will be ______
a) Cannot be determined
b) 0.5
c) 1
d) 0
Answer: c
Explanation: D assigns zero probability to the instances where h = c, so every instance drawn satisfies h(x) ≠ c(x), i.e. it is misclassified. Since every drawn instance is an error, the probability of error is 1.
4. Error strongly depends on distribution D.
a) True
b) False
Answer: a
Explanation: If D is a uniform probability distribution that assigns the same probability to every instance in X, then the error for the hypothesis will be the fraction of the total instance space that falls into the region where h and c disagree.
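The uniform-distribution case in the explanation can be sketched directly (a hypothetical toy instance space, not from the original text): with uniform D, the true error is just the fraction of X on which h and c disagree.

```python
# Hypothetical sketch: error_D(h, c) = P_{x ~ D}[h(x) != c(x)].
# With uniform D over a finite X, this is the fraction of X on which
# h and c disagree.
X = list(range(10))                      # toy instance space
c = lambda x: x >= 5                     # target concept
h = lambda x: x >= 7                     # learned hypothesis
error = sum(h(x) != c(x) for x in X) / len(X)
print(error)                             # h and c disagree on {5, 6} -> 0.2
```

Changing D (e.g. concentrating mass on the disagreement region) changes the error even though h and c are fixed, which is why the error "strongly depends on D".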
5. PAC learning was introduced by ____________
a) Vapnik
b) Leslie Valiant
c) Chervonenkis
d) Reverend Thomas Bayes
Answer: b
Explanation: Leslie Valiant introduced PAC learning in 1984. Vapnik and Chervonenkis introduced the idea of the VC dimension. Thomas Bayes published Bayes’ theorem.
6. Error is defined over the _____________
a) training set
b) test Set
c) domain set
d) cross-validation set
Answer: c
Explanation: Error is defined over the entire distribution of instances, not simply over the training examples, because this is the true error one expects to encounter when actually using the learned hypothesis h on subsequent instances drawn from D.
7. The error of h with respect to c is the probability that a randomly drawn instance will fall into the region where _________
a) h and c disagree
b) h and c agree
c) h is greater than c but not less
d) h is less than c but not greater
Answer: a
Explanation: The concepts c and h can be depicted as the sets of instances within X that each labels positive. An error occurs whenever the two disagree, i.e. when one labels an instance positive and the other labels it negative; the error of h is the probability of drawing such an instance.
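Viewing c and h as positive sets, the disagreement region in the explanation is their symmetric difference. A minimal sketch with hypothetical positive sets:

```python
# Hypothetical sketch: the error region is the symmetric difference of the
# positive sets of c and h -- instances that exactly one of them labels "+".
X = set(range(8))                        # toy instance space
c_pos = {0, 1, 2, 3}                     # instances c labels positive
h_pos = {2, 3, 4, 5}                     # instances h labels positive
disagree = c_pos ^ h_pos                 # symmetric difference
error_uniform = len(disagree) / len(X)   # error under a uniform D
print(sorted(disagree), error_uniform)   # -> [0, 1, 4, 5] 0.5
```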
8. When was PAC learning invented?
a) 1954
b) 1964
c) 1974
d) 1984
Answer: d
Explanation: PAC learning belongs to the field of computational learning theory. Leslie Valiant introduced the concept in 1984.
Sanfoundry Global Education & Learning Series – Machine Learning.