Neural Networks Questions and Answers – Gradient Checking

This set of Neural Networks Multiple Choice Questions & Answers (MCQs) focuses on “Neural Networks – Gradient Checking”.

1. What do you mean by gradient checking?
a) It is a test that makes sure that the execution of backpropagation is bug free
b) It is a test which can be referred to as external software testing
c) It is a behaviour testing of a software
d) It is a type of software testing in which the system is tested against the functional requirements and specification

Answer: a
Explanation: When backpropagation is implemented, gradient checking is a test that helps one make sure that the implementation of backpropagation is bug free.
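The idea above can be sketched in a few lines. This is a minimal illustration (not from the source), assuming NumPy and a toy quadratic cost whose true gradient is known, standing in for what backpropagation would compute:

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-7):
    """Approximate dJ/dtheta component-wise with the two-sided difference."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        t_plus, t_minus = theta.copy(), theta.copy()
        t_plus[i] += eps
        t_minus[i] -= eps
        grad[i] = (J(t_plus) - J(t_minus)) / (2 * eps)
    return grad

# Toy cost J(theta) = sum(theta^2); backpropagation would return 2*theta.
J = lambda th: np.sum(th ** 2)
theta = np.array([1.0, -2.0, 3.0])
backprop_grad = 2 * theta                    # stands in for the backprop result
numeric_grad = numerical_gradient(J, theta)

# Relative difference; a tiny value suggests the backprop code is bug free.
diff = (np.linalg.norm(backprop_grad - numeric_grad)
        / (np.linalg.norm(backprop_grad) + np.linalg.norm(numeric_grad)))
print(diff < 1e-6)  # True
```

If the backpropagation code had a bug (say, a wrong sign in one term), `diff` would be large and the check would fail.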

2. What is the purpose of gradient checking?
a) To train the network in the incorrect direction and to increase the accuracy of the output
b) To make the network function normally and to increase the accuracy of its output
c) To train the network in such a way that the prediction is same with or without error
d) To train the network in such a way that it works normally for some cases and abnormally for some cases

Answer: b
Explanation: If an error occurs during backpropagation, the network will get trained in the incorrect direction. In this case the network will appear to function normally, but its predictions will not increase in accuracy. So, before training the network we must make sure that the calculations performed during backpropagation are correct.

3. Which of the following statements is incorrect about backpropagation?
a) It is an algorithm commonly used to train the neural networks
b) It helps to adjust the weights of the neurons so that the accuracy of the output increases
c) It is a method of training the neural networks to perform tasks more accurately
d) The idea behind backpropagation is not to test how wrong the neural network is

Answer: d
Explanation: Backpropagation is an algorithm commonly used to train neural networks so that they can perform tasks more accurately. The idea behind it is to test how wrong the neural network is and then to correct it.

4. Is gradient checking intended for training the networks?
a) True
b) False

Answer: b
Explanation: Gradient checking is not intended for training networks because it is very slow; rather, it helps identify errors in the backpropagation implementation and thereby helps increase the accuracy of the predictions.

5. Which of the following statements is not true about gradient checking?
a) It is used during the training of the neural network
b) It doesn’t work with dropout
c) It is used for only debugging purpose
d) Runs at random initialization

Answer: a
Explanation: Gradient checking doesn’t work with dropout, because with dropout the cost function J is very difficult to compute. It runs at random initialization, and it is not used during the training of the neural network but only for debugging.
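The dropout point can be made concrete. In this hypothetical sketch (names and the toy cost are assumptions, not from the source), each forward pass draws a fresh random mask, so J(θ) is no longer a deterministic function of θ and finite differences become noisy:

```python
import numpy as np

rng = np.random.default_rng(0)

def J_dropout(theta, keep_prob=0.5):
    # Inverted dropout: a fresh random mask on every call to the cost.
    mask = (rng.random(theta.size) < keep_prob) / keep_prob
    return np.sum((theta * mask) ** 2)

theta = np.arange(1.0, 9.0)
costs = [J_dropout(theta) for _ in range(5)]
print(costs)  # several different values for the very same theta
```

Because repeated evaluations of J at the same θ disagree, the finite difference (J(θ+ϵ) − J(θ−ϵ)) / 2ϵ mixes mask noise with the true gradient signal, which is why one turns dropout off (keep_prob = 1) while gradient checking.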

6. Why should one turn off gradient checking, once it is done, before running the network for the entire set of training iterations?
a) Because it would increase the speed of training process
b) Because it would change the output of the training process
c) Because it would slow down the speed of training process
d) Because it would nullify the output

Answer: c
Explanation: One should turn off gradient checking once it is completed, before running the network for the entire set of training epochs, because in practice the numerical calculation is much slower than backpropagation and would slow down the training process.
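The cost difference is easy to see: the numerical check needs two cost evaluations per parameter, while the analytic gradient is one pass. A rough sketch (toy cost and function names are assumptions) comparing the two on a few thousand parameters:

```python
import time
import numpy as np

def J(theta):                        # toy cost
    return np.sum(theta ** 2)

def backprop_grad(theta):            # analytic gradient: one vectorised pass
    return 2 * theta

def check_grad(theta, eps=1e-7):     # numerical check: 2 cost evals per parameter
    g = np.empty_like(theta)
    for i in range(theta.size):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        g[i] = (J(tp) - J(tm)) / (2 * eps)
    return g

theta = np.random.randn(5000)
t0 = time.perf_counter(); backprop_grad(theta); t_backprop = time.perf_counter() - t0
t0 = time.perf_counter(); check_grad(theta);    t_check = time.perf_counter() - t0
print(t_check > t_backprop)  # True: the check is far slower, so run it once, then turn it off
```

For a real network with millions of parameters the gap is even larger, which is why the check is run once at the start and then disabled for training.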

7. Gradient Checking is based on 2-sided derivative.
a) True
b) False

Answer: a
Explanation: Gradient checking is based on the 2-sided derivative because its error is of order O(ϵ²), compared to O(ϵ) for the 1-sided derivative. Hence gradient checking uses the 2-sided derivative as follows:
\(g(\theta) = \lim_{\epsilon \to 0} \frac{J(\theta+\epsilon)-J(\theta-\epsilon)}{2\epsilon}\)
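The O(ϵ²) versus O(ϵ) difference can be checked numerically. Here is a small illustration (the choice of f(x) = sin x is an assumption for demonstration) comparing the one-sided and two-sided approximations against the known derivative cos x:

```python
import numpy as np

# f(x) = sin(x) at x = 1; the true derivative is cos(1).
f, x = np.sin, 1.0
true_deriv = np.cos(x)

for eps in (1e-2, 1e-3, 1e-4):
    one_sided = (f(x + eps) - f(x)) / eps              # error ~ O(eps)
    two_sided = (f(x + eps) - f(x - eps)) / (2 * eps)  # error ~ O(eps**2)
    print(f"eps={eps:g}  one-sided error={abs(one_sided - true_deriv):.1e}  "
          f"two-sided error={abs(two_sided - true_deriv):.1e}")
```

Shrinking ϵ by 10× cuts the one-sided error by roughly 10× but the two-sided error by roughly 100×, which is exactly the O(ϵ) versus O(ϵ²) behaviour the explanation describes.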

Sanfoundry Global Education & Learning Series – Neural Networks.

To practice all areas of Neural Networks, here is the complete set of 1000+ Multiple Choice Questions and Answers.

Manish Bhojasia - Founder & CTO at Sanfoundry