Machine Learning Questions and Answers – Subgradient Descent

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Subgradient”.

1. The Subgradient method is an algorithm for maximizing a non-differentiable convex function.
a) True
b) False

Answer: b
Explanation: The Subgradient method is an algorithm for minimizing, not maximizing, a non-differentiable convex function. Convex optimization is the problem of minimizing convex functions over convex sets, and Subgradient methods are used when the objective function is non-differentiable.
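
For reference, a vector g is a subgradient of a convex function f at a point x if f(y) ≥ f(x) + gᵀ(y − x) for all y. At points where f is differentiable the only subgradient is the gradient itself; at kinks there can be many, and the method may use any one of them.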

2. Which of the following statements is not true about the Subgradient method?
a) The step lengths are chosen via a line search
b) It can be directly applied to non-differentiable functions
c) It is an iterative method
d) The step lengths are fixed ahead of time

Answer: a
Explanation: In the Subgradient method the step lengths are not chosen via a line search; they are often fixed ahead of time. It is an iterative method, which uses an initial guess to generate a sequence of improving approximate solutions, and it can be applied directly to non-differentiable functions.
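
As a rough illustration of the iteration, here is a minimal Python sketch with a fixed step size and no line search; the subgradient oracle subgrad(x) and the constants are illustrative assumptions.

    import numpy as np

    def subgradient_method(subgrad, x0, alpha=0.01, iters=1000):
        # x_{k+1} = x_k - alpha * g_k, with the step size fixed ahead of time
        # (no line search), where g_k is any subgradient of f at x_k.
        x = x0
        for _ in range(iters):
            g = subgrad(x)    # subgradient oracle, assumed supplied by the caller
            x = x - alpha * g
        return x

    # Example: minimize f(x) = |x|; np.sign(x) is a valid subgradient (0 at x = 0).
    x_star = subgradient_method(np.sign, np.array([5.0]))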

3. Subgradient methods can be much slower than interior-point methods.
a) True
b) False

Answer: a
Explanation: Subgradient methods can be much slower than interior-point methods. Interior-point methods, which are used to solve linear and nonlinear convex optimization problems, are second-order methods and are not much affected by problem scaling. Subgradient methods are first-order methods, and their performance depends very much on the scaling and conditioning of the problem.

4. Which of the following statements is not true about the Subgradient method?
a) It has smaller memory requirements than interior-point methods
b) It can be used for extremely large problems
c) A simple distributed algorithm can be obtained by combining the subgradient method with primal or dual decomposition techniques
d) It is much faster than Newton’s method in the unconstrained case

Answer: d
Explanation: Subgradient methods are not faster than Newton’s method; they are slower. Their advantages are that they have smaller memory requirements than interior-point methods, can be used for extremely large problems, and can be combined with primal or dual decomposition techniques to form simple distributed algorithms.

5. In the constant step size rule, the step size αk = α is a positive constant, independent of k.
a) True
b) False

Answer: a
Explanation: The constant step size rule defines the step size as αk = α, a positive constant independent of k. Constant step length, square summable but not summable, non-summable diminishing, and non-summable diminishing step lengths are the other step size rules in the Subgradient method, each defining the step size in a different way.
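
As a sketch, the common rules can be written as schedules fixed in advance; the constants below are arbitrary illustrative choices.

    import numpy as np

    # Step size schedules, all determined before the algorithm runs (k = 1, 2, ...);
    # g is the current subgradient.
    constant_step_size   = lambda k, g: 0.01                      # alpha_k = alpha
    constant_step_length = lambda k, g: 0.01 / np.linalg.norm(g)  # ||alpha_k * g|| = gamma
    diminishing          = lambda k, g: 1.0 / np.sqrt(k)          # non-summable diminishing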

6. In SVM problems, we cannot directly apply gradient descent but we can apply Subgradient descent.
a) True
b) False

Answer: a
Explanation: In SVM problems we cannot directly apply gradient descent, but we can apply Subgradient descent. The SVM objective (hinge loss) is not continuously differentiable, so gradient descent cannot be used directly; Subgradient descent can handle this non-differentiable objective.
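
To make this concrete, here is a sketch of one valid subgradient of a regularized hinge-loss SVM objective; the function name, the array shapes, and the constant lam are illustrative assumptions.

    import numpy as np

    def hinge_subgradient(w, X, y, lam=0.1):
        # One subgradient of f(w) = (lam/2)*||w||^2 + mean(max(0, 1 - y_i * <x_i, w>)).
        # The hinge is non-differentiable where a margin equals exactly 1;
        # choosing the zero branch there still yields a valid subgradient.
        margins = y * (X @ w)             # X: (n, d), y: (n,) with labels +/- 1
        active = margins < 1              # samples violating the margin
        g = -(y[active, None] * X[active]).sum(axis=0) / len(y)
        return lam * w + g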

7. Which of the following objective functions is not solved by Subgradient?
a) Hinge loss
b) L1 norm
c) Perceptron loss
d) TanH function

Answer: d
Explanation: The TanH function is not handled by the Subgradient method: it is a differentiable objective function, and Subgradient methods are used to solve non-differentiable convex problems. The other three, hinge loss, the L1 norm, and perceptron loss, are non-differentiable functions.
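
As a concrete contrast: for f(x) = |x|, every g in [−1, 1] is a subgradient at x = 0, so the subgradient method still has something to work with at the kink; tanh, on the other hand, is differentiable everywhere (its derivative is 1 − tanh²(x)), so ordinary gradient-based methods apply to it directly.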

8. The step size rules in Subgradient are determined before the algorithm is run.
a) False
b) True

Answer: b
Explanation: The step size rules in the Subgradient method are determined before the algorithm is run; that is, they do not depend on any data computed during the algorithm. In standard descent methods, by contrast, the step size depends very much on the current point and search direction.

9. The Subgradient method is a descent method.
a) True
b) False

Answer: b
Explanation: Unlike the ordinary gradient method, the Subgradient method is not a descent method, because the function value often increases from one iteration to the next. The method looks very much like the ordinary gradient method for differentiable functions, but with a few notable exceptions.
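
Because the iterates need not improve monotonically, implementations typically keep track of the best point found so far. A minimal sketch, assuming f and subgrad are supplied by the caller:

    def subgradient_with_best(f, subgrad, x0, alpha=0.01, iters=1000):
        # f(x_k) may increase from one iteration to the next, so record
        # f_best = min over iterations of f(x_k) and the point achieving it.
        x, x_best, f_best = x0, x0, f(x0)
        for _ in range(iters):
            x = x - alpha * subgrad(x)
            fx = f(x)
            if fx < f_best:
                x_best, f_best = x, fx
        return x_best, f_best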

10. Subgradient descent can be used at points where the derivative is not defined.
a) True
b) False

Answer: a
Explanation: Subgradient descent can be used at points where the derivative is not defined, since it minimizes non-differentiable convex functions. It works like gradient descent, but with gradients replaced by subgradients.

11. Which of the following statements is not true about the Subgradient method?
a) Its convergence can be very fast
b) It handles general non-differentiable convex problem
c) It has no good stopping criterion
d) It often leads to very simple algorithms

Answer: d
Explanation: The Subgradient method’s convergence can be very slow, not very fast; the iterative process typically needs many iterations to make progress. The other three statements are key features of the Subgradient method.
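
For intuition on the slow convergence: with a constant step size α, the standard bound (see, e.g., Boyd’s notes on subgradient methods) is f_best − f* ≤ (R² + G²α²k) / (2αk) after k steps, where R bounds the initial distance to an optimum and G bounds the subgradient norms. The bound approaches G²α/2 rather than 0, and reaching accuracy ε typically takes on the order of 1/ε² iterations, far more than a second-order method would need.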
