Machine Learning Questions and Answers – PAC Learning

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “PAC Learning”.

1. Who introduced the concept of PAC learning?
a) Francis Galton
b) Reverend Thomas Bayes
c) J.Ross Quinlan
d) Leslie Valiant
View Answer

Answer: d
Explanation: Valiant introduced PAC learning. Galton introduced linear regression. Quinlan introduced the decision tree algorithm (ID3). Bayes introduced Bayes’ theorem, which underlies the Naïve Bayes classifier.

2. When was PAC learning invented?
a) 1874
b) 1974
c) 1984
d) 1884
View Answer

Answer: c
Explanation: Leslie Valiant invented PAC learning and described it in 1984. It was introduced as part of computational learning theory.

3. The full form of PAC is ______
a) Partly Approximation Computation
b) Probability Approximation Curve
c) Probably Approximately Correct
d) Partly Approximately Correct
View Answer

Answer: c
Explanation: Probably Approximately Correct tries to build a hypothesis that can predict with low generalization error (approximately correct), with high probability (probably).

4. What can be explained by PAC learning?
a) Sample Complexity
b) Overfitting
c) Underfitting
d) Label Function
View Answer

Answer: a
Explanation: PAC learning can give a rough estimate of the number of training examples required by the learning algorithm to develop a hypothesis with the desired accuracy.

5. What is the significance of epsilon in PAC learning?
a) Probability of approximation < epsilon
b) Maximum error < epsilon
c) Minimum error > epsilon
d) Probability of approximation = delta – epsilon
View Answer

Answer: b
Explanation: A concept is PAC learnable by L if L can output a hypothesis with error < epsilon. Hence the maximum error made by the hypothesis must be less than epsilon. Epsilon is typically small, e.g. 0.05 (5%) or 0.01 (1%).

6. What is the significance of delta in PAC learning?
a) Probability of approximation < delta
b) Error < delta
c) Probability = 1 – delta
d) Probability of approximation = delta – epsilon
View Answer

Answer: c
Explanation: A concept C is PAC learnable by L if L can output a hypothesis with error at most epsilon with probability at least 1 – delta. Delta is usually very small.

7. One of the goals of PAC learning is to give __________
a) maximum accuracy
b) cross-validation complexity
c) error of classifier
d) computational complexity
View Answer

Answer: d
Explanation: PAC learning tells us about the amount of effort required for computation so that a learner can come up with a successful hypothesis with high probability.

8. A learner can be deemed consistent if it produces a hypothesis that perfectly fits the __________
a) cross-validation data
b) overall dataset
c) test data
d) training data
View Answer

Answer: d
Explanation: PAC learning is concerned with the behavior of the learning algorithm. The learner has access to only the training data set.

9. Number of hypotheses |H| = 973, probability = 95%, error < 0.1. Find the minimum number of training examples, m, required.
a) 98.8
b) 99.8
c) 99
d) 98
View Answer

Answer: c
Explanation: Probability = 1 – delta = 0.95, so delta = 0.05. Error bound epsilon = 0.1.
m >= (1/epsilon)(ln |H| + ln (1/delta)) = (1/0.1)(ln 973 + ln 20), i.e. m >= 98.8.
Since the number of training examples m must be an integer, the answer is 99.
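The calculation above can be checked with a short script (a minimal sketch; the function name is illustrative, not from the original):

```python
import math

def pac_sample_complexity(h_size, epsilon, delta):
    # PAC bound: m >= (1/epsilon) * (ln|H| + ln(1/delta)), rounded up to an integer
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / epsilon)

# Question 9: |H| = 973, probability 95% (delta = 0.05), error bound epsilon = 0.1
print(pac_sample_complexity(973, 0.1, 0.05))  # 99
```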

10. In PAC learning, sample complexity grows as the logarithm of the number of hypotheses.
a) False
b) True
View Answer

Answer: b
Explanation: Sample complexity is the number of training examples required to converge to a successful hypothesis. It is given by m >= (1/epsilon)(ln |H| + ln (1/delta)), where m is the number of training examples, |H| is the size of the hypothesis space, epsilon is the error bound, and delta is the failure probability. Since |H| appears only inside the logarithm, m grows logarithmically with the number of hypotheses.
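The logarithmic growth can be seen numerically: doubling |H| adds only ln(2)/epsilon extra examples to the bound, regardless of how large |H| already is (a small sketch; the function name is illustrative):

```python
import math

def sample_bound(h_size, epsilon=0.1, delta=0.05):
    # Raw (non-integer) PAC bound: (1/epsilon) * (ln|H| + ln(1/delta))
    return (math.log(h_size) + math.log(1.0 / delta)) / epsilon

# Doubling the hypothesis space adds only ln(2)/epsilon ~= 6.93 examples each time
for h in (1000, 2000, 4000):
    print(h, round(sample_bound(h), 1))
```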

Sanfoundry Global Education & Learning Series – Machine Learning.

To practice all areas of Machine Learning, here is the complete set of 1000+ Multiple Choice Questions and Answers.
