# Machine Learning Questions and Answers – Logistic Regression – Advanced Optimization

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Logistic Regression – Advanced Optimization”.

1. Which of the following is a better optimization algorithm than gradient descent?
a) Conjugate gradient
b) Cost Function
c) ERM rule
d) PAC Learning

Answer: a
Explanation: Conjugate gradient is an optimization algorithm that generally converges faster than gradient descent. A cost function only measures the average difference between predicted and actual output; it is not an optimization algorithm. The ERM rule minimizes the cost on the training data, but on its own it often leads to overfitting. PAC learning is a framework for analyzing learnability, not an optimizer.

2. Who invented BFGS?
a) Quinlan
b) Bayes
c) Broyden, Fletcher, Goldfarb and Shanno
d) Cauchy

Answer: c
Explanation: Broyden, Fletcher, Goldfarb and Shanno are credited with the invention of the BFGS method. Quinlan introduced the ID3 decision tree algorithm. The Naïve Bayes algorithm is based on Bayes' theorem, named after Thomas Bayes. Cauchy is credited with the gradient descent method.

3. Ax = b, where A = [[4, 2], [2, 3]] and b = [2, 2]. Let x0, the initial guess, be [1, 1]. What is the residual vector?
a) [4, -3]
b) [-4, 3]
c) [-4, -3]
d) [4, 3]

Answer: c
Explanation: Residual vector r0 = b - Ax0
r0 = [2, 2] - [[4, 2], [2, 3]][1, 1]
= [2, 2] - [6, 5]
= [-4, -3].
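As a quick check, the residual from this question can be computed directly with NumPy (a minimal sketch; NumPy is assumed to be available):

```python
import numpy as np

# Residual r0 = b - A @ x0 for the system in this question
A = np.array([[4.0, 2.0], [2.0, 3.0]])
b = np.array([2.0, 2.0])
x0 = np.array([1.0, 1.0])

r0 = b - A @ x0
print(r0)  # [-4. -3.]
```

The same two lines verify the residuals in the other questions below by swapping in their A, b, and x0.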

4. Ax = b, where A = [[2, 2], [3, 3]] and b = [1, 2]. Let x0, the initial guess, be [1, 1]. What is the residual vector?
a) [3, -4]
b) [-4, 3]
c) [-4, -3]
d) [-3, -4]

Answer: d
Explanation: Residual vector r0 = b - Ax0
r0 = [1, 2] - [[2, 2], [3, 3]][1, 1]
= [1, 2] - [4, 6]
= [-3, -4].

5. In the L-BFGS algorithm, what does the letter L stand for?
a) Lengthy
b) Limited-memory
c) Linear
d) Logistic

Answer: b
Explanation: L-BFGS is a limited-memory approximation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, used when memory is limited. Like BFGS, it generally converges faster than gradient descent.

6. Ax = b, where A = [[3, 2], [2, 3]] and b = [8, 6]. Let x0, the initial guess, be [2, 1]. What is the residual vector?
a) [-1, 0]
b) [0, -1]
c) [1, 0]
d) [0, 1]

Answer: b
Explanation: Residual vector r0 = b - Ax0
r0 = [8, 6] - [[3, 2], [2, 3]][2, 1]
= [8, 6] - [8, 7]
= [0, -1].

7. Who developed conjugate gradient method?
a) Hestenes and Stiefel
b) Broyden, Fletcher, Goldfarb and Shanno
c) Valiant
d) Vapnik and Chervonenkis

Answer: a
Explanation: Magnus Hestenes and Eduard Stiefel introduced the conjugate gradient method, which is used for advanced optimization. Broyden, Fletcher, Goldfarb and Shanno invented the BFGS algorithm. Leslie Valiant introduced the idea of PAC learning. Vapnik and Chervonenkis developed the concept of VC dimension.
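The conjugate gradient iteration for a symmetric positive-definite system can be sketched in a few lines. The following minimal implementation (assuming NumPy) solves the system from Question 6, starting from the same initial guess, and uses the same residual r = b - Ax computed in the earlier questions:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    """Conjugate gradient for a symmetric positive-definite A."""
    x = x0.astype(float)
    r = b - A @ x          # residual, as in the questions above
    p = r.copy()           # first search direction is the residual
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)      # step length along p
        x = x + alpha * p
        r_new = r - alpha * Ap          # updated residual
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p            # next A-conjugate direction
        r = r_new
    return x

# System from Question 6, starting at x0 = [2, 1]
A = np.array([[3.0, 2.0], [2.0, 3.0]])
b = np.array([8.0, 6.0])
x = conjugate_gradient(A, b, np.array([2.0, 1.0]))
print(x)  # solution of Ax = b
```

For an n x n symmetric positive-definite system, the method converges in at most n iterations in exact arithmetic, which is why it is preferred over gradient descent for such problems.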

8. When was BFGS invented?
a) 1960
b) 1965
c) 1975
d) 1970

Answer: d
Explanation: The BFGS method, credited to Broyden, Fletcher, Goldfarb and Shanno, was published in 1970. BFGS is an advanced optimization technique that generally converges faster than gradient descent.

Sanfoundry Global Education & Learning Series – Machine Learning.
