This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Multivariate Linear Regression”.

1. Multivariate linear regression belongs to which category?

a) Neither supervised nor unsupervised learning

b) Both supervised and unsupervised learning

c) Supervised learning

d) Unsupervised learning

Answer: c

Explanation: In multivariate linear regression, the learner is given a dataset containing the correct values of the target variable. Since the correct answers are already provided for the learner to learn from, it is supervised learning.

2. The learner is trying to predict housing prices based on the size of each house and number of bedrooms. What type of regression is this?

a) Multivariate Logistic Regression

b) Logistic Regression

c) Linear Regression

d) Multivariate Linear Regression

Answer: d

Explanation: The learner is trying to output a value that can be any number, rather than a binary output (yes or no, 0 or 1, etc.). Thus it is linear regression and not logistic regression. Since there are two independent variables, “size” and “number of bedrooms”, it is multivariate linear regression.
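As a sketch of this setup, the housing-price model can be fit by ordinary least squares. The sizes, bedroom counts, and prices below are made-up illustrative values, not real data:

```python
import numpy as np

# Hypothetical housing data: each row is [size (sq ft), number of bedrooms]
X = np.array([[2104, 3],
              [1600, 3],
              [2400, 4],
              [1416, 2]], dtype=float)
y = np.array([400.0, 330.0, 369.0, 232.0])  # prices in $1000s (made up)

# Prepend a column of ones so the model has an intercept term t0
X_aug = np.hstack([np.ones((X.shape[0], 1)), X])

# Fit t = [t0, t1, t2] by ordinary least squares
t, *_ = np.linalg.lstsq(X_aug, y, rcond=None)

# Predict the price of a new (hypothetical) house: 1800 sq ft, 3 bedrooms
new_house = np.array([1.0, 1800.0, 3.0])
predicted_price = new_house @ t
```

Because there are two independent variables, the fitted coefficient vector has three entries: the intercept plus one weight per feature.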

3. What does x_{n}^{(i)} represent?

a) Value of n^{th} variable is i

b) Value of i^{th} variable in the n^{th} training example

c) Value of n^{th} variable in the i^{th} training example

d) Value of i^{th} variable is n

Answer: c

Explanation: In multivariate linear regression, while representing an independent variable, the subscript denotes the feature number and the superscript denotes the number of the training example.
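This indexing convention maps directly onto a design matrix, where rows are training examples and columns are features. A small made-up matrix (note NumPy indexing is 0-based, while the notation is 1-based):

```python
import numpy as np

# Hypothetical 3-example, 2-feature dataset as a design matrix:
# row index = training example, column index = feature, so (with
# 0-based NumPy indexing) X[i - 1, n - 1] plays the role of x_n^{(i)}.
X = np.array([[2104, 3],
              [1600, 3],
              [2400, 4]])

# x_2^{(1)}: value of the 2nd feature in the 1st training example
x_2_1 = X[0, 1]  # -> 3
```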

4. How is the hypothesis represented in multivariate regression? The transpose of a matrix a is represented as a^{T}.

a) h(X) = t^{T}X

b) h(X) = tX

c) h(X) = tX^{T}

d) h(X) = t^{T}X^{T}

Answer: a

Explanation: In multivariate regression, t = [t_{0} t_{1} t_{2} … t_{n}]^{T} and X = [1 x_{1} x_{2} … x_{n}]^{T}. t is the vector of coefficients of the features, and X is the feature vector denoting all the features.
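Since t and X are both column vectors of the same length, h(X) = t^{T}X is just their inner product. A minimal sketch with made-up coefficient values:

```python
import numpy as np

# Made-up coefficient vector t = [t0 t1 t2]^T and feature vector
# X = [1 x1 x2]^T; the leading 1 multiplies the intercept t0.
t = np.array([50.0, 0.1, 20.0])
X = np.array([1.0, 2104.0, 3.0])

# h(X) = t^T X is the inner product of the two vectors
h = t @ X

# Equivalent expanded form: t0 + t1*x1 + t2*x2
h_expanded = t[0] + t[1] * X[1] + t[2] * X[2]
```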

5. Let there be n features. What is the dimension of the X vector in hypothesis h(X) = t^{T}X?

a) n x 1

b) (n + 1) x 1

c) n x n

d) (n – 1) x 1

Answer: b

Explanation: The feature vector contains the n features plus a leading 1, which represents x_{0}. It is not a feature; it is included to match the dimension of the vector t, since t contains an extra element t_{0}, the intercept. Hence X has dimension (n + 1) x 1.
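The dimension bookkeeping can be sketched as follows; the feature values here are placeholders:

```python
import numpy as np

# With n features, prepending x0 = 1 makes X an (n + 1)-vector,
# matching t = [t0, t1, ..., tn].
n = 4
features = np.arange(1.0, n + 1.0)     # placeholder values for x1..xn
X = np.concatenate(([1.0], features))  # X = [1, x1, ..., xn]
t = np.ones(n + 1)                     # t = [t0, t1, ..., tn]

# The inner product t^T X is only defined because the dimensions match
h = t @ X
```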

6. What does X^{(i)} represent?

a) A vector denoting all the values of the i^{th} feature

b) i^{th} feature in the i^{th} example

c) A feature vector denoting the independent variables in the i^{th} example

d) Depends on the dataset

Answer: c

Explanation: The feature vector is of dimension n x 1, where n is the number of features. Its entries are the values of all the features in the i^{th} row of training examples, with the value of feature a stored in row a of the feature vector.

7. What is the minimum number of variables required to represent a multivariate linear regression model?

a) 3

b) 2

c) 1

d) 4

Answer: d

Explanation: At least four variables are required: one to represent the number of training examples, one to represent the target variable, and, since multivariate linear regression requires at least two independent variables, two more to represent them.

8. What does (x_{1}^{(4)}, x_{2}^{(4)}, y^{(4)}) represent or imply?

a) There are 4 training examples

b) The values of x_{1}, x_{2}, and y are 4

c) The fourth training example and there are two independent variables

d) The second training example and there are two independent variables

Answer: c

Explanation: In a linear regression model, the set (x_{1}^{(i)}, …, x_{n}^{(i)}, y^{(i)}) represents the i^{th} example in the training set: x_{n}^{(i)} gives the value of x_{n} in the i^{th} example, y^{(i)} gives the i^{th} value of y, and the n x terms imply that there are n independent variables. Hence (x_{1}^{(4)}, x_{2}^{(4)}, y^{(4)}) denotes the fourth training example, with two independent variables.

9. There is no upper bound on the number of independent variables.

a) True

b) False

Answer: a

Explanation: In multivariate linear regression, there is no fixed upper bound on the number of independent variables: it can be any finite number of two or more. The independent variables can be represented as a vector, and the target variable is represented as a function of them.

10. There is no upper bound on the number of target variables.

a) True

b) False

Answer: b

Explanation: The maximum number of target variables that a learner can predict is 1. The target variable depends on the independent variables, and its value is predicted as a function of them.


**Sanfoundry Global Education & Learning Series – Machine Learning**.

To practice all areas of Machine Learning, here is the complete set of 1000+ Multiple Choice Questions and Answers.