This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “VC-Dimension”.
1. What does VC dimension do?
a) Reduces complexity of hypothesis space
b) Removes noise from dataset
c) Measures complexity of training dataset
d) Measures the complexity of hypothesis space H
Answer: d
Explanation: The VC dimension measures the complexity of the hypothesis space H, not by the number of distinct hypotheses |H|, but by the number of distinct instances from X that can be completely discriminated using H.
2. An instance set S is given. How many dichotomies are possible?
a) 2*|S|
b) 2/|S|
c) 2^|S|
d) |S|
Answer: c
Explanation: Given some instance set S, there are 2^|S| possible dichotomies. H shatters S if every possible dichotomy of S can be represented by some hypothesis from H.
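The count of dichotomies can be checked directly by enumeration. A minimal sketch (the set S and its element names are illustrative, not from the source):

```python
from itertools import product

# Each dichotomy is one way of labeling every instance in S as + or -.
S = ['x1', 'x2', 'x3']  # an illustrative instance set, |S| = 3
dichotomies = list(product(['+', '-'], repeat=len(S)))

print(len(dichotomies) == 2 ** len(S))  # → True: 2^|S| dichotomies in total
```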
3. If h is a straight line, what is the maximum number of points that can be shattered?
a) 4
b) 2
c) 3
d) 5
Answer: c
Explanation: The main idea is to separate the positive data points from the negative data points using a straight line. A line can shatter at most 3 points (in general position): with 4 points, e.g. the XOR configuration, some labeling cannot be separated. So the maximum number of points that can be shattered is 3.
4. What is the VC dimension of a straight line?
a) 3
b) 2
c) 4
d) 0
Answer: a
Explanation: The maximum number of points shattered by a straight line is 3. Since the VC dimension is the maximum number of points that can be shattered, the VC dimension of a straight line is 3.
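The shattering claims above can be checked by brute force. The sketch below searches a small grid of candidate lines; the points, labels, and grid are illustrative, and a grid search can demonstrate separability but only suggest (not prove) non-separability:

```python
from itertools import product

def separable(points, labels, candidates):
    """True if some line w1*x + w2*y + b = 0 from the candidate grid puts
    every +1 point strictly on the positive side and every -1 point
    strictly on the negative side."""
    for w1, w2, b in product(candidates, repeat=3):
        if all(l * (w1*x + w2*y + b) > 0 for (x, y), l in zip(points, labels)):
            return True
    return False

grid = [-2, -1, -0.5, 0, 0.5, 1, 2]

# 3 non-collinear points: every one of the 2^3 = 8 labelings is separable.
tri = [(0, 0), (1, 0), (0, 1)]
print(all(separable(tri, labs, grid)
          for labs in product([-1, 1], repeat=3)))   # → True

# 4 points in the XOR configuration: the labeling (+, -, -, +) is not
# linearly separable, so no line realizes it.
xor = [(0, 0), (1, 0), (0, 1), (1, 1)]
print(separable(xor, [1, -1, -1, 1], grid))          # → False
```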
5. A set of 3 instances is shattered by _____ hypotheses.
a) 4
b) 8
c) 3
d) 2
Answer: b
Explanation: A set S of instances requires 2^|S| distinct hypotheses (one per dichotomy) to be shattered. Here the number of instances is 3, so the number of hypotheses is 2^3, i.e. 8.
6. What is the relation between VC dimension and hypothesis space H?
a) VC(H) <= |H|
b) VC(H) != log2|H|
c) VC(H) <= log2|H|
d) VC(H) > log2|H|
Answer: c
Explanation: Suppose that VC(H) = d. Then H must contain at least 2^d distinct hypotheses to shatter d instances. Hence 2^d <= |H|, and d = VC(H) <= log2|H|.
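For a finite hypothesis space the bound can be verified exhaustively. The sketch below uses an illustrative class of threshold classifiers on a small domain (the class, domain, and helper names are assumptions, not from the source):

```python
from itertools import combinations, product
from math import log2

# Illustrative finite class: threshold classifiers h_t(x) = +1 iff x >= t,
# over the domain {1, ..., 8}. Each hypothesis is stored as its label vector.
domain = list(range(1, 9))
thresholds = [0.5 + k for k in range(9)]            # 9 distinct hypotheses
H = [tuple(1 if x >= t else -1 for x in domain) for t in thresholds]

def shatters(H, idx):
    """True if H realizes all 2^|idx| labelings on the points at indices idx."""
    realized = {tuple(h[i] for i in idx) for h in H}
    return len(realized) == 2 ** len(idx)

def vc_dim(H, n):
    """Largest subset size of the n-point domain that H shatters."""
    d = 0
    for k in range(1, n + 1):
        if any(shatters(H, idx) for idx in combinations(range(n), k)):
            d = k
    return d

d = vc_dim(H, len(domain))
print(d)                     # → 1 (thresholds shatter any 1 point, no 2 points)
print(d <= log2(len(H)))     # → True: VC(H) = 1 <= log2(9) ≈ 3.17
```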
7. VC Dimension can be infinite.
a) True
b) False
Answer: a
Explanation: The VC dimension is defined as the size of the largest set of points from X that can be shattered (separated under every labeling) by the chosen hypothesis space. If the hypothesis space H can shatter arbitrarily large sets of data points, then VC(H) = ∞.
8. Who invented VC dimension?
a) Francis Galton
b) J. Ross Quinlan
c) Leslie Valiant
d) Vapnik and Chervonenkis
Answer: d
Explanation: Vapnik and Chervonenkis introduced the VC dimension. Valiant introduced the concept of PAC learning, Galton introduced linear regression, and Quinlan introduced decision tree learning (ID3).
9. What is the advantage of VC dimension over PAC learning?
a) VC dimension reduces complexity of training data
b) VC dimension outputs more accurate predictors
c) VC dimension can work for infinite hypothesis space
d) There is no advantage
Answer: c
Explanation: In the case of an infinite hypothesis space we cannot apply the PAC learning bound m >= (1/ε)(ln|H| + ln(1/δ)), since ln|H| is infinite. Hence we use the VC dimension of H instead, which can remain finite even when |H| is infinite.
10. If VC(H) increases, the number of training examples required (m) increases.
a) False
b) True
Answer: b
Explanation: m >= (1/ε)(4 log2(2/δ) + 8 VC(H) log2(13/ε))
Thus, m grows linearly with VC(H). Hence, if VC(H) increases, m also increases.
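Plugging numbers into the bound makes the growth with VC(H) concrete. A small sketch (the ε, δ, and VC values are illustrative choices, not from the source):

```python
from math import ceil, log2

def sample_bound(vc, eps, delta):
    """Sufficient number of training examples from the bound
    m >= (1/eps) * (4*log2(2/delta) + 8*vc*log2(13/eps))."""
    return ceil((1 / eps) * (4 * log2(2 / delta) + 8 * vc * log2(13 / eps)))

# All else fixed, the bound grows linearly with VC(H).
for vc in (1, 2, 4):
    print(vc, sample_bound(vc, eps=0.1, delta=0.05))
```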
Sanfoundry Global Education & Learning Series – Machine Learning.