Neural Network Questions and Answers – Determination of Weights


This set of Neural Networks Multiple Choice Questions & Answers (MCQs) focuses on “Determination of Weights”.

1. In determination of weights by learning, for orthogonal input vectors what kind of learning should be employed?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law
View Answer

Answer: a
Explanation: For orthogonal input vectors, the Hebb learning law is best suited.
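For intuition, here is a minimal sketch (not part of the original quiz; the data is illustrative) of Hebbian weight determination for associating input vectors with output vectors via the outer-product rule W = Σ b_l a_lᵀ. Recall is exact precisely when the stored inputs are orthonormal, since all cross-talk terms vanish.

```python
import numpy as np

# Hebbian weight determination: W = sum_l outer(b_l, a_l).
# For orthonormal inputs, recall W @ a_k = b_k is exact.

A = np.eye(3)                      # three orthonormal input vectors (columns)
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 1.0]])    # desired output vectors (columns)

W = sum(np.outer(B[:, k], A[:, k]) for k in range(A.shape[1]))

# Exact recall of every stored association
for k in range(A.shape[1]):
    assert np.allclose(W @ A[:, k], B[:, k])
```

The assertion holds because each cross-term b_l (a_lᵀ a_k) is zero for l ≠ k when the inputs are orthogonal.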

2. In determination of weights by learning, for linearly independent input vectors what kind of learning should be employed?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law
View Answer

Answer: b
Explanation: For linearly independent input vectors, the Widrow learning law is best suited.
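As a hedged illustration (the data and learning rate here are made up), the Widrow–Hoff (LMS) update w ← w + η(d − wᵀx)x can drive the recall error to zero for linearly independent inputs, where a one-shot Hebb rule would leave cross-talk:

```python
import numpy as np

# Widrow-Hoff (LMS) update: w <- w + eta * (d - w.x) * x.
# Repeated presentation converges for linearly independent inputs.

X = np.array([[1.0, 1.0],
              [1.0, 2.0]])     # linearly independent, non-orthogonal inputs (rows)
d = np.array([1.0, -1.0])      # desired scalar outputs
w = np.zeros(2)
eta = 0.1                      # illustrative learning rate

for _ in range(2000):
    for x, t in zip(X, d):
        w += eta * (t - w @ x) * x

# Both associations are recalled almost exactly after training
assert np.allclose(X @ w, d, atol=1e-3)
```

Because the inputs are linearly independent, the exact solution of X w = d is the unique fixed point of both per-pattern updates, so cycling through the patterns contracts toward it.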

3. In determination of weights by learning, for noisy input vectors what kind of learning should be employed?
a) Hebb learning law
b) Widrow learning law
c) Hoff learning law
d) no learning law
View Answer

Answer: d
Explanation: For noisy input vectors, no specific learning law exists.

4. What are the features that can be accomplished using affine transformations?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned
View Answer

Answer: d
Explanation: Affine transformations can be used to perform arbitrary rotation, scaling, and translation.

5. Which of these features cannot be accomplished without an affine transformation?
a) arbitrary rotation
b) scaling
c) translation
d) all of the mentioned
View Answer

Answer: c
Explanation: Rotation and scaling can be achieved by a purely linear transformation, but translation of the input pattern requires the bias term that an affine transformation adds.

6. What are affine transformations?
a) addition of a bias term (-1), which results in arbitrary rotation, scaling, and translation of the input pattern
b) addition of a bias term (+1), which results in arbitrary rotation, scaling, and translation of the input pattern
c) addition of a bias term (-1) or (+1), which results in arbitrary rotation, scaling, and translation of the input pattern
d) none of the mentioned
View Answer

Answer: a
Explanation: It follows from the basic definition of an affine transformation.
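The mechanism behind questions 4–6 can be sketched as follows (an illustrative example, not from the quiz): augmenting the input with a constant bias term lets a single weight matrix implement an affine map, i.e. rotation/scaling plus translation, rather than a purely linear one. Following the quiz, the bias input is -1, so the translation vector t enters the weight column as -t to compensate the sign.

```python
import numpy as np

# Affine map via bias augmentation: y = W @ [x; -1] = R @ x + t.

theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation (linear part)
t = np.array([2.0, -1.0])                          # desired translation

x = np.array([1.0, 0.0])
x_aug = np.append(x, -1.0)                         # bias term of -1 appended

W = np.hstack([R, (-t).reshape(2, 1)])             # weights absorb the translation
y = W @ x_aug                                      # affine output from one matrix

assert np.allclose(y, R @ x + t)                   # rotation + translation achieved
```

Without the bias column, W @ x always maps the origin to the origin, which is why translation is exactly the feature a purely linear transformation cannot provide.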

7. Can an artificial neural network capture association if the number of input patterns is greater than the dimensionality of the input vectors?
a) yes
b) no
View Answer

Answer: a
Explanation: Yes, by using nonlinear processing units in the output layer.

8. By using only linear processing units in the output layer, can an artificial neural network capture association if the number of input patterns is greater than the dimensionality of the input vectors?
a) yes
b) no
View Answer

Answer: b
Explanation: Nonlinear processing units are needed in the output layer.
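A small sketch of why linear units fail here (the patterns below are illustrative, XOR-like data, not from the quiz): with purely linear output units the weights must satisfy B = W A exactly, and when the number of stored patterns exceeds the input dimensionality, A has more columns than rows, so the system is generally inconsistent and only a least-squares approximation exists.

```python
import numpy as np

# Three 2-D patterns with inconsistent linear targets: no linear W
# can satisfy B = W @ A, only a least-squares fit.

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])   # three 2-D input patterns (columns)
B = np.array([[1.0, 1.0, 0.0]])   # desired outputs (XOR-like, inconsistent)

W = B @ np.linalg.pinv(A)         # best linear weights in the least-squares sense
residual = np.linalg.norm(W @ A - B)

# The residual is far from zero: linear recall of all three patterns fails
assert residual > 0.1
```

A nonlinear processing unit in the output layer can still separate such patterns, which is why the answer to question 7 is yes while the answer here is no.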

9. The number of output cases depends on what factor?
a) number of inputs
b) number of distinct classes
c) total number of classes
d) none of the mentioned
View Answer

Answer: b
Explanation: The number of output cases depends on the number of distinct classes.

10. For noisy input vectors, can the Hebb methodology of learning be employed?
a) yes
b) no
View Answer

Answer: b
Explanation: For noisy input vectors, no specific learning method exists.

Sanfoundry Global Education & Learning Series – Neural Networks.

To practice all areas of Neural Networks, here is a complete set of 1000+ Multiple Choice Questions and Answers.
