This set of Neural Networks Multiple Choice Questions & Answers (MCQs) focuses on “Neural Networks – Gradient Checking”.

1. What do you mean by gradient checking?
a) It is a test that makes sure that the execution of backpropagation is bug free
b) It is a test which can be referred to as external software testing
c) It is a behaviour testing of a software
d) It is a type of software testing in which the system is tested against the functional requirements and specification

Explanation: When backpropagation is implemented, gradient checking is the test that helps one make sure that the implementation of backpropagation is bug free.
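The idea above can be sketched numerically: compare the analytic gradient (what backpropagation would return) against a finite-difference estimate of the cost. This is a minimal sketch using NumPy and a toy quadratic cost in place of a real network's loss; the function names are illustrative, not from any particular library.

```python
import numpy as np

def cost(theta):
    # Toy cost J(theta) = sum(theta^2), standing in for a network's loss.
    return np.sum(theta ** 2)

def analytic_grad(theta):
    # Hand-derived gradient dJ/dtheta = 2*theta (what backprop would compute).
    return 2 * theta

def numerical_grad(J, theta, eps=1e-4):
    # Central-difference estimate, perturbing one component at a time.
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad

theta = np.array([1.0, -2.0, 3.0])
num = numerical_grad(cost, theta)
ana = analytic_grad(theta)
# Relative difference should be tiny if the analytic gradient is bug free.
rel_err = np.linalg.norm(num - ana) / (np.linalg.norm(num) + np.linalg.norm(ana))
print(rel_err < 1e-7)  # True
```

If the two gradients disagree beyond a small tolerance, the backpropagation implementation is suspect.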

2. What is the purpose of gradient checking?
a) To train the network in incorrect direction and to increase the accuracy of the output
b) To train the network function normally and to increase its accuracy of the output
c) To train the network in such a way that the prediction is same with or without error
d) To train the network in such a way that it works normally for some cases and abnormally for some cases

Explanation: When an error occurs during backpropagation, the network gets trained in the incorrect direction. In this case the network appears to function normally, but its predictions do not increase in accuracy. So, before training the network, we must make sure that the calculations performed during backpropagation are correct.

3. Which of the following statements is incorrect about backpropagation?
a) It is an algorithm commonly used to train the neural networks
b) It helps to adjust the weights of the neurons so that the accuracy of the output increases
c) It is a method of training the neural networks to perform tasks more accurately
d) The idea behind backpropagation is not to test how wrong the neural network is

Explanation: Backpropagation is an algorithm commonly used to train neural networks so that they can perform tasks more accurately. The idea behind it is to test how wrong the neural network is and then to correct it.

4. Is gradient checking intended for training the networks?
a) True
b) False

Explanation: Gradient checking is not intended for training networks because it is very slow, but it can help identify errors in the backpropagation implementation and thereby help increase the accuracy of the predictions.
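To illustrate how gradient checking exposes a bug rather than trains the network, here is a hedged sketch with a deliberately wrong analytic gradient (a sign error). The cost and helper names are made up for illustration; a real check would run this once against a network's backprop output.

```python
import numpy as np

def cost(theta):
    # Toy cost J(theta) = sum(theta^2).
    return np.sum(theta ** 2)

def buggy_grad(theta):
    # Deliberate sign bug: returns -2*theta instead of the correct 2*theta.
    return -2 * theta

def numerical_grad(J, theta, eps=1e-4):
    # Central-difference gradient estimate.
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (J(theta + e) - J(theta - e)) / (2 * eps)
    return grad

theta = np.array([0.5, -1.5])
num = numerical_grad(cost, theta)
bad = buggy_grad(theta)
rel_err = np.linalg.norm(num - bad) / (np.linalg.norm(num) + np.linalg.norm(bad))
print(rel_err)  # ~1.0, far above any reasonable tolerance, flagging the bug
```

A large relative error like this tells the implementer to fix backpropagation before any training is attempted.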

5. Which of the following statements is not true about gradient checking?
a) It is used during the training of the neural network
b) It doesn’t work with dropout
c) It is used for only debugging purpose
d) Runs at random initialization

Explanation: Gradient checking doesn’t work with dropout, since dropout makes the cost function J very difficult to compute. It runs at random initialization, and it is not used during the training of the neural network; it is used for debugging purposes only.

6. Why should one stop gradient checking once it is done, before running the network for the entire set of training iterations?
a) Because it would increase the speed of training process
b) Because it would change the output of the training process
c) Because it would slow down the speed of training process
d) Because it would nullify the output

Explanation: One should turn off gradient checking once it is completed, before running the network for the entire set of training epochs, because in practice the numerical calculation is much slower than backpropagation and would slow down the training process.

7. Gradient Checking is based on 2-sided derivative.
a) True
b) False

Explanation: Gradient checking is based on the 2-sided derivative because its error is of order O(ε²), compared with O(ε) for the 1-sided derivative. Hence gradient checking uses the 2-sided derivative as follows:
$$g(θ) = \lim_{ε \to 0} \frac{J(θ+ε)-J(θ-ε)}{2ε}$$
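The O(ε²) versus O(ε) claim can be checked numerically. Below is a small sketch using a toy cost J(θ) = θ³, whose derivative 3θ² is known exactly; the variable names are illustrative only.

```python
def J(theta):
    # Toy cost with known derivative dJ/dtheta = 3*theta^2.
    return theta ** 3

theta, eps = 1.0, 1e-3
true = 3 * theta ** 2  # exact derivative = 3

# 1-sided (forward) difference: error of order eps.
one_sided = (J(theta + eps) - J(theta)) / eps
# 2-sided (central) difference: error of order eps**2.
two_sided = (J(theta + eps) - J(theta - eps)) / (2 * eps)

print(abs(one_sided - true))  # ~3e-3, on the order of eps
print(abs(two_sided - true))  # ~1e-6, on the order of eps**2
```

Shrinking ε by 10× shrinks the 1-sided error roughly 10× but the 2-sided error roughly 100×, which is why gradient checking prefers the 2-sided form.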

Sanfoundry Global Education & Learning Series – Neural Networks.
