This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Logistic Regression – Advanced Optimization”.
1. Which of the following is a better optimization algorithm than gradient descent?
a) Conjugate gradient
b) Cost Function
c) ERM rule
d) PAC Learning
Answer: a
Explanation: Conjugate gradient is an optimization algorithm, and it typically gives better results than gradient descent. A cost function is not an optimizer; it only measures the average difference between predicted and actual output. The ERM rule tries to minimize the cost on the training data, but on its own it often leads to overfitting, and PAC learning is a theoretical framework for learnability rather than an optimization algorithm.
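For a concrete illustration (not part of the quiz itself), the sketch below fits a logistic regression cost with plain gradient descent and with SciPy's nonlinear conjugate gradient routine; the toy data, step size, and iteration count are assumptions chosen for the example.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                                   # toy features (assumed)
noise = rng.normal(scale=0.5, size=200)
y = (X @ np.array([1.5, -2.0, 0.5]) + noise > 0).astype(float)  # noisy toy labels

def cost(w):
    z = X @ w
    return np.mean(np.logaddexp(0.0, z) - y * z)                # stable logistic loss

def grad(w):
    p = expit(X @ w)                                            # predicted probabilities
    return X.T @ (p - y) / len(y)

# Plain gradient descent: fixed step size, many iterations.
w = np.zeros(3)
for _ in range(500):
    w -= 0.5 * grad(w)
print("gradient descent cost:", cost(w))

# Nonlinear conjugate gradient: no step size to tune, usually fewer iterations.
res = minimize(cost, np.zeros(3), jac=grad, method="CG")
print("conjugate gradient cost:", cost(res.x), "in", res.nit, "iterations")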
2. Who invented BFGS?
a) Quinlan
b) Bayes
c) Broyden, Fletcher, Goldfarb and Shanno
d) Cauchy
Answer: c
Explanation: Broyden, Fletcher, Goldfarb and Shanno are credited with the invention of the BFGS method. Quinlan introduced the ID3 decision-tree algorithm. The naive Bayes algorithm is named after Thomas Bayes, and Cauchy is credited with originating gradient descent.
3. Ax = b => [4 2, 2 3][x1, x2] = [2, 2]. Let x0, the initial guess be [1, 1]. What is the residual vector?
a) [4, -3]
b) [-4, 3]
c) [-4, -3]
d) [4, 3]
Answer: c
Explanation: The residual vector is r0 = b - Ax0.
r0 = [2, 2] - [4 2; 2 3][1, 1]
= [2, 2] - [6, 5]
= [-4, -3].
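As a quick check (an added illustration, not from the quiz), the same residual can be computed with NumPy; the array values come straight from the question.

import numpy as np

A = np.array([[4, 2], [2, 3]])
b = np.array([2, 2])
x0 = np.array([1, 1])

r0 = b - A @ x0          # residual r0 = b - Ax0
print(r0)                # prints [-4 -3]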
4. Ax = b => [2 2, 3 3][x1, x2] = [1, 2]. Let x0, the initial guess be [1, 1]. What is the residual vector?
a) [3, -4]
b) [-4, 3]
c) [-4, -3]
d) [-3, -4]
Answer: d
Explanation: The residual vector is r0 = b - Ax0.
r0 = [1, 2] - [2 2; 3 3][1, 1]
= [1, 2] - [4, 6]
= [-3, -4].
5. In the L-BFGS algorithm, what does the letter L stand for?
a) Lengthy
b) Limited-memory
c) Linear
d) Logistic
Answer: b
Explanation: L-BFGS stands for limited-memory BFGS. It approximates the Broyden-Fletcher-Goldfarb-Shanno algorithm using only a limited amount of stored curvature information, so it suits problems where memory is constrained. Like BFGS, it generally works better than gradient descent.
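As a minimal sketch (not part of the quiz), SciPy exposes L-BFGS through scipy.optimize.minimize; the quadratic objective below is an arbitrary stand-in chosen for the example.

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Simple convex quadratic with its minimum at (1, -2).
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

res = minimize(f, x0=np.zeros(2), method="L-BFGS-B")
print(res.x)             # approximately [1, -2]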
6. Ax = b => [3 2, 2 3][x1, x2] = [8, 6]. Let x0, the initial guess be [2, 1]. What is the residual vector?
a) [-1, 0]
b) [0, -1]
c) [1, 0]
d) [0, 1]
Answer: b
Explanation: The residual vector is r0 = b - Ax0.
r0 = [8, 6] - [3 2; 2 3][2, 1]
= [8, 6] - [8, 7]
= [0, -1].
7. Who developed the conjugate gradient method?
a) Hestenes and Stiefel
b) Broyden, Fletcher, Goldfarb and Shanno
c) Valiant
d) Vapnik and Chervonenkis
Answer: a
Explanation: Magnus Hestenes and Eduard Stiefel introduced the conjugate gradient algorithm, which is used for advanced optimization. Broyden, Fletcher, Goldfarb and Shanno invented the BFGS algorithm. Leslie Valiant introduced the idea of PAC learning, and Vapnik and Chervonenkis formulated the VC dimension.
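For the linear-system form of the method used in Questions 3, 4 and 6, here is a small sketch (an added illustration, not from the quiz) that solves the system from Question 6 with SciPy's conjugate gradient solver; that matrix is symmetric positive definite, as conjugate gradient requires.

import numpy as np
from scipy.sparse.linalg import cg

A = np.array([[3.0, 2.0], [2.0, 3.0]])   # matrix from Question 6
b = np.array([8.0, 6.0])

x, info = cg(A, b, x0=np.array([2.0, 1.0]))
print(x, info)           # x is approximately [2.4, 0.4]; info == 0 means converged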
8. When was BFGS invented?
a) 1960
b) 1965
c) 1975
d) 1970
Answer: d
Explanation: Broyden, Fletcher, Goldfarb and Shanno are credited with the invention of the BFGS method, which was published in 1970. BFGS is an advanced optimization technique and performs better than gradient descent.
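As a final sketch (an added illustration, not from the quiz), BFGS is also available through scipy.optimize.minimize, shown here on SciPy's built-in Rosenbrock test function; the starting point is an arbitrary assumption.

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

res = minimize(rosen, x0=np.array([-1.0, 1.0]), method="BFGS", jac=rosen_der)
print(res.x)             # approximately [1, 1], the Rosenbrock minimum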
Sanfoundry Global Education & Learning Series – Machine Learning.
To practice all areas of Machine Learning, here is the complete set of 1000+ Multiple Choice Questions and Answers.