# Machine Learning Questions and Answers – Fundamental Theorem of PAC Learning

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Fundamental Theorem of PAC Learning”.

1. Any ERM rule is a successful PAC learner for hypothesis space H.
a) True
b) False

Explanation: An ERM rule minimizes the hypothesis's error on the training dataset, which can overfit. A PAC learner must guarantee, with high probability, a small true error over the whole distribution, i.e. a small probability of drawing an instance the hypothesis labels incorrectly. By the fundamental theorem of PAC learning, an ERM rule is a successful PAC learner only when H has finite VC dimension, so the claim does not hold for every hypothesis space H.

2. If distribution D assigns zero probability to instances where h is not equal to c, then the error will be ______
a) 1
b) 0.5
c) 0
d) infinite

Explanation: Every instance drawn from D satisfies h = c, so no misclassification can occur. Hence the error, the probability of drawing an instance on which h and c disagree, is zero.
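This can be sketched numerically. In the toy example below the instance space, target concept c, hypothesis h, and distribution D are all invented for illustration: h and c disagree only on the instance 9, and D assigns that instance zero probability.

```python
import random

# Hypothetical instance space: the integers 0..9.
# Target concept c labels x positive when x < 5;
# hypothesis h disagrees with c only on x = 9.
def c(x):
    return x < 5

def h(x):
    return x < 5 or x == 9

# A distribution D that assigns zero probability to the
# disagreement region {9}: sample uniformly from 0..8 only.
def draw():
    return random.randint(0, 8)

# Estimate error(h) = P_{x ~ D}[h(x) != c(x)] by sampling.
samples = [draw() for _ in range(10_000)]
error = sum(h(x) != c(x) for x in samples) / len(samples)
print(error)  # 0.0 -- D never produces an instance where h and c disagree
```

Because the only instance on which h errs has probability zero under D, the estimated (and true) error is exactly 0.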

3. If distribution D assigns zero probability to instances where h = c, then the error will be ______
a) Cannot be determined
b) 0.5
c) 1
d) 0

Explanation: D assigns zero probability to instances where h = c, so every instance drawn satisfies h is not equal to c, i.e. it is misclassified. Since every drawn instance is an error, the probability of error is 1.

4. Error strongly depends on distribution D.
a) True
b) False

Explanation: The error is defined with respect to D, so changing D changes the error. For example, if D is a uniform probability distribution that assigns the same probability to every instance in X, then the error of the hypothesis is exactly the fraction of the instance space that falls into the region where h and c disagree; a non-uniform D weights that disagreement region differently.
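As a minimal sketch of the uniform case (the instance space, concept, and hypothesis below are illustrative choices, not taken from the question), the error under a uniform D reduces to counting the disagreement region:

```python
# Hypothetical instance space X: the integers 0..99.
X = range(100)

# Illustrative concept and hypothesis:
# c labels x positive when x < 50, h when x < 60,
# so they disagree on the 10 instances 50..59.
def c(x):
    return x < 50

def h(x):
    return x < 60

# Under a uniform D over X, error(h) is exactly the fraction
# of X falling in the region where h and c disagree.
disagree = sum(h(x) != c(x) for x in X)
error = disagree / len(X)
print(error)  # 0.1 -- 10 of 100 instances fall in the disagreement region
```

With a different D the same disagreement region could carry any probability between 0 and 1, which is why the error strongly depends on D.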

5. PAC learning was introduced by ____________
a) Vapnik
b) Leslie Valiant
c) Chervonenkis
d) Reverend Thomas Bayes

Explanation: Leslie Valiant introduced PAC learning in 1984. Vapnik and Chervonenkis introduced the idea of VC dimension. Thomas Bayes published Bayes’ Theorem.

6. Error is defined over the _____________
a) training set
b) test set
c) domain set
d) cross-validation set

Explanation: Error is defined over the entire distribution of instances, not simply over the training examples, because this is the true error one expects to encounter when actually using the learned hypothesis h on subsequent instances drawn from D.

7. The error of h with respect to c is the probability that a randomly drawn instance will fall into the region where _________
a) h and c disagree
b) h and c agree
c) h is greater than c but not less
d) h is lesser than c but not greater

Explanation: The concepts c and h are depicted by the sets of instances within X that they label as positive. An error occurs on any instance where the two labels differ, e.g. c labels it positive while h labels it negative; the error of h is the probability of drawing such an instance.

8. When was PAC learning invented?
a) 1954
b) 1964
c) 1974
d) 1984

Explanation: PAC learning belongs to the field of computational learning theory. Leslie Valiant introduced the concept of PAC learning in 1984.

Sanfoundry Global Education & Learning Series – Machine Learning.