# Neural Network Questions and Answers – Determination of Weights

This set of Neural Networks Multiple Choice Questions & Answers (MCQs) focuses on “Determination of Weights”.

1. In determination of weights by learning, which learning law should be employed for orthogonal input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Explanation: For orthogonal input vectors, the Hebb learning law is best suited.
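
The Hebb rule above can be sketched in a few lines: the weight matrix is formed as a sum of outer products of each target output with its input, and recall is exact when the inputs are orthonormal. This is a minimal illustration, not the only formulation; the vectors and targets are made up for the example.

```python
# Minimal sketch of Hebbian weight determination for orthonormal inputs.
def outer(u, v):
    # Outer product: rows indexed by u, columns by v.
    return [[ui * vj for vj in v] for ui in u]

def matvec(W, x):
    # Matrix-vector product for the recall step.
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

# Two orthonormal input vectors and their (arbitrary) target outputs.
inputs  = [[1.0, 0.0], [0.0, 1.0]]
targets = [[2.0, 3.0], [4.0, 5.0]]

# Hebb rule: W is the sum of outer products target x input.
W = [[0.0, 0.0], [0.0, 0.0]]
for a, s in zip(inputs, targets):
    O = outer(s, a)
    W = [[W[i][j] + O[i][j] for j in range(2)] for i in range(2)]

# Because the inputs are orthonormal, recall reproduces each target exactly.
for a, s in zip(inputs, targets):
    print(matvec(W, a))  # matches the corresponding target
```

With non-orthogonal inputs the outer-product sum introduces cross-talk between patterns, which is why the Hebb law is singled out for the orthogonal case.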

2. In determination of weights by learning, which learning law should be employed for linearly independent input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Explanation: For linearly independent input vectors, the Widrow learning law (least-mean-square error learning) is best suited.
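
The Widrow-style least-mean-square update can be sketched as an iterative rule for a single linear unit: each presentation nudges the weights by the prediction error times the input, Δw = η(d − w·a)a. The inputs, targets, learning rate, and epoch count below are illustrative choices, not prescribed values.

```python
# Minimal sketch of LMS (Widrow-Hoff style) training for one linear unit.
def lms_train(samples, eta=0.2, epochs=200):
    # samples: list of (input_vector, scalar_target) pairs.
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for a, d in samples:
            y = sum(wi * ai for wi, ai in zip(w, a))   # linear output
            err = d - y                                 # error signal
            w = [wi + eta * err * ai for wi, ai in zip(w, a)]
    return w

# Linearly independent (but not orthogonal) inputs with consistent targets.
samples = [([1.0, 0.0], 1.0), ([1.0, 1.0], 3.0)]
w = lms_train(samples)
print(w)  # converges toward [1.0, 2.0], which fits both samples exactly
```

Unlike the one-shot Hebb rule, LMS handles non-orthogonal inputs because it keeps correcting the residual error until both associations are satisfied.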

3. In determination of weights by learning, which learning law should be employed for noisy input vectors?
a) hebb learning law
b) widrow learning law
c) hoff learning law
d) no learning law

Explanation: For noisy input vectors, no specific learning law applies.

4. Which features can be accomplished using affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Explanation: Affine transformations can be used to perform arbitrary rotation, scaling, and translation.

5. Which features cannot be accomplished by a linear transformation alone, without an affine transformation?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned

Explanation: A purely linear transformation of the input pattern cannot realize all of these effects at once; the affine transformation, obtained by adding the bias term, makes arbitrary rotation, scaling, and translation possible.

6. What are affine transformations?
a) addition of bias term (-1) which results in arbitrary rotation, scaling, translation of input pattern
b) addition of bias term (+1) which results in arbitrary rotation, scaling, translation of input pattern
c) addition of bias term (-1) or (+1) which results in arbitrary rotation, scaling, translation of input pattern
d) none of the mentioned

Explanation: It follows from the basic definition of an affine transformation: appending a fixed bias input lets the weighted sum realize translation in addition to rotation and scaling of the input pattern.
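
The effect of the bias term can be made concrete: augment the input with a fixed bias component, and a single weight matrix then performs scaled rotation and translation in one linear operation. This sketch uses a bias of +1 as one common convention (the question shows that the sign is a matter of convention); the angle, scale, and shift values are illustrative.

```python
import math

# Sketch: appending a fixed bias input turns a linear map into an affine one.
def affine_via_bias(point, angle, scale, shift):
    # Weight matrix acting on the augmented vector [x, y, 1]:
    # the first two columns implement scaled rotation,
    # the last column carries the translation.
    c, s = math.cos(angle) * scale, math.sin(angle) * scale
    W = [[c, -s, shift[0]],
         [s,  c, shift[1]]]
    x, y = point
    aug = [x, y, 1.0]  # bias component (+1 by convention here)
    return [sum(wij * aj for wij, aj in zip(row, aug)) for row in W]

out = affine_via_bias((1.0, 0.0), angle=math.pi / 2, scale=2.0, shift=(5.0, -1.0))
print(out)  # rotate-and-scale maps (1, 0) to (0, 2); the shift then gives (5.0, 1.0)
```

Without the bias column, the same weight matrix could rotate and scale the pattern but never translate it, which is the point of question 5.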

7. Can an artificial neural network capture an association if the number of input patterns is greater than the dimensionality of the input vectors?
a) yes
b) no

Explanation: Yes, by using nonlinear processing units in the output layer.

8. Using only linear processing units in the output layer, can an artificial neural network capture an association if the number of input patterns is greater than the dimensionality of the input vectors?
a) yes
b) no

Explanation: No; nonlinear processing units are needed, because a linear output layer cannot store more independent associations than the input dimensionality allows.
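
The limitation can be shown with simple arithmetic: any third 2-D input vector is a linear combination of the first two, so under a purely linear map its output is completely determined by the first two outputs and cannot encode an independent association. The matrix and vectors below are arbitrary choices for illustration.

```python
# Sketch: with a purely linear map y = W a, a third 2-D input is always a
# linear combination of the first two, so its output is forced.
def matvec(W, x):
    return [sum(wij * xj for wij, xj in zip(row, x)) for row in W]

W = [[1.0, 2.0], [3.0, 4.0]]       # any linear output layer
a1, a2 = [1.0, 0.0], [0.0, 1.0]
a3 = [2.0, 5.0]                    # a3 = 2*a1 + 5*a2 -- unavoidable in 2-D

y1, y2, y3 = matvec(W, a1), matvec(W, a2), matvec(W, a3)
forced = [2 * u + 5 * v for u, v in zip(y1, y2)]
print(y3 == forced)  # True: y3 cannot be chosen independently of y1 and y2
```

A nonlinearity applied after the weighted sum breaks this proportionality, which is why nonlinear output units can capture more associations than the input dimensionality.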

9. The number of output cases depends on what factor?
a) number of inputs
b) number of distinct classes
c) total number of classes
d) none of the mentioned

Explanation: The number of output cases depends on the number of distinct classes.

10. Can the Hebb methodology of learning be employed for noisy input vectors?
a) yes
b) no

Explanation: No; for noisy input vectors, no specific learning method exists.

Sanfoundry Global Education & Learning Series – Neural Networks.

To practice all areas of Neural Networks, here is the complete set of 1000+ Multiple Choice Questions and Answers.