# Machine Learning Questions and Answers – Logistic Regression – Multiple Classification

This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Logistic Regression – Multiple Classification”.

1. The output is whether a person will vote or not, based on several features. It is an example of multiclass classification.
a) True
b) False

Explanation: In multiclass classification, the output y should take more than two values (classes). In this example, the output can only be yes or no, so it is a binary classification problem, not a multiclass one.

2. The output is whether a person will surely vote or surely not vote or may cast a vote, based on one feature. It is an example of multiclass classification.
a) True
b) False

Explanation: In multiclass classification, the output y should take more than two values (classes). Here there are three classes – i) surely vote, ii) surely not vote, and iii) may cast a vote. Thus, it is an example of multiclass classification.

3. y = {0, 1, …, n}. This problem is divided into ______ binary classification problems.
a) n
b) 1/n
c) n + 1
d) 1/(n+1)

Explanation: The indexing starts at 0, so there are n + 1 output classes. Hence, in the one-vs-all approach, we divide the problem into n + 1 classification problems with binary outputs (0 or 1), one per class. An instance belongs to the class whose binary classifier outputs 1.
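The one-vs-all splitting described above can be sketched as follows. This is a minimal NumPy sketch, not a reference implementation: the toy data, learning rate, and function names are all illustrative assumptions, not from the question set.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_one_vs_rest(X, y, n_classes, lr=0.1, steps=2000):
    """Fit one binary logistic classifier per class (bias folded in)."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend bias column
    weights = np.zeros((n_classes, Xb.shape[1]))
    for c in range(n_classes):
        target = (y == c).astype(float)  # binary problem: class c vs the rest
        w = np.zeros(Xb.shape[1])
        for _ in range(steps):
            p = sigmoid(Xb @ w)
            w -= lr * Xb.T @ (p - target) / len(y)  # gradient descent step
        weights[c] = w
    return weights

def predict(X, weights):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    scores = sigmoid(Xb @ weights.T)  # one probability per class
    return scores.argmax(axis=1)      # pick the most confident classifier

# Toy data: three well-separated 2-D clusters, classes y = {0, 1, 2},
# so the problem splits into 3 binary classification problems.
X = np.array([[0.0, 0.0], [0.2, 0.0], [2.0, 0.0], [2.2, 0.0], [1.0, 2.0], [1.2, 2.0]])
y = np.array([0, 0, 1, 1, 2, 2])
W = fit_one_vs_rest(X, y, n_classes=3)
print(predict(X, W))
```

At prediction time the input is scored by every binary classifier and assigned to the class with the highest probability, which is why one classifier per class is needed.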

4. y = {0, 1, …, 8}. This problem is divided into ______ binary classification problems.
a) 1/9
b) 9
c) 8
d) 1/8

Explanation: Since indexing starts at 0, there are 9 classes: 0, 1, 2, 3, 4, 5, 6, 7, and 8. To solve this 9-class problem, we divide it into 9 binary classification problems.

5. y = {0, 1, 2, 3, 4, 5, 6, 8}. This problem is divided into ______ binary classification problems.
a) 9
b) 1/9
c) 8
d) 1/8

Explanation: In this example, there are 8 distinct classes, not 9: 0 is one of the classes, but there is no class 7. Since the number of classes is 8, the problem is divided into 8 binary classification problems.
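The counting argument above can be checked directly; the number of binary problems is just the number of distinct labels, not the largest label plus one. A tiny Python sketch (variable names are illustrative):

```python
# Distinct output labels from the question; note that 7 is absent.
y = {0, 1, 2, 3, 4, 5, 6, 8}

# One-vs-all creates one binary problem per distinct class.
n_binary_problems = len(y)
print(n_binary_problems)  # 8
```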

6. The output of an image recognition system is {0, 0, 1, 0}. The classes are dog, cat, elephant, and lion. What is the image of, according to our algorithm?
a) Dog
b) Cat
c) Elephant
d) Lion

Explanation: The output vector gives the probability of the image belonging to each class, in order. According to the algorithm, the probability of the image being a dog is zero, a cat is zero, an elephant is one, and a lion is zero. Thus, the image is of an elephant.
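Selecting the class from such an output vector is an argmax over the per-class scores. A short Python sketch (the class names and vector come from the question; the variable names are illustrative):

```python
# Per-class scores from the question, in the same order as the class list.
classes = ["dog", "cat", "elephant", "lion"]
output = [0, 0, 1, 0]

# The predicted class is the one with the highest score (argmax).
predicted = classes[output.index(max(output))]
print(predicted)  # elephant
```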

7. Who invented logistic regression?
a) Valiant
b) Ross Quinlan
c) DR Cox
d) Bayes

Explanation: Statistician D. R. Cox invented logistic regression in 1958. Ross Quinlan developed the decision tree algorithms ID3 and C4.5, Leslie Valiant introduced PAC learning, and Bayes is known for the Naïve Bayes algorithm.

8. When was logistic regression invented?
a) 1957
b) 1959
c) 1960
d) 1958

Explanation: Logistic regression was invented by statistician D. R. Cox in the year 1958, predating much of modern machine learning. It was introduced as part of a direct probability model.

Sanfoundry Global Education & Learning Series – Machine Learning.