This set of Machine Learning Multiple Choice Questions & Answers (MCQs) focuses on “Linear Regression – Cost Function”.

1. The hypothesis is given by h(x) = t_{0} + t_{1}x. What are t_{0} and t_{1}?

a) Value of h(x) when x is 0, intercept along y-axis

b) Value of h(x) when x is 0, the rate at which h(x) changes with respect to x

c) The rate at which h(x) changes with respect to x, intercept along the y-axis

d) Intercept along the y-axis, the rate at which h(x) changes with respect to x

Answer: d

Explanation: Since t_{1} is the coefficient of x, it is the rate at which h(x) changes with respect to x. t_{0} is the intercept along the y-axis, but practically, it may not be the value of h(x) when x = 0.

2. The hypothesis is given by h(x) = t_{0} + t_{1}x. t_{0} gives the value of h(x) when x is 0.

a) True

b) False

Answer: b

Explanation: Although t_{0} is the intercept along the y-axis, it is not always the value of h(x) when x is 0. For example, suppose a learner predicts a hypothesis that gives the price of a house based on its size. The hypothesis may have a y-intercept, but that does not mean it equals the price of a house whose size is 0; a non-existent house cannot have a price.

3. The hypothesis is given by h(x) = t_{0} + t_{1}x. What is the goal of t_{0} and t_{1}?

a) Give negative h(x)

b) Give h(x) as close to 0 as possible, without themselves being 0

c) Give h(x) as close to y, in training data, as possible

d) Give h(x) closer to x than y

Answer: c

Explanation: t_{0} and t_{1} try to minimize the prediction error on the training set. Since y is the target variable, h(x) should ideally equal y, or be as close to y as possible. This is what t_{0} and t_{1} try to achieve.

4. The hypothesis is given by h(x) = t_{0} + t_{1}x. What does t_{1} = 0 after several iterations imply?

a) The target variable is independent of x

b) Hypothesis is wrong

c) t_{0} is 0

d) x is the target variable

Answer: a

Explanation: t_{1} = 0 implies that h(x) does not change when x changes, so the value of h(x) does not depend on x. Thus, the target variable is independent of x.

5. In a linear regression problem, h(x) is the predicted value of the target variable, y is the actual value of the target variable, m is the number of training examples. What do we try to minimize?

a) (h(x) – y) / m

b) (h(x) – y)^{2} / 2*m

c) (h(x) – y) / 2*m

d) (y – h(x))

Answer: b

Explanation: The objective is to measure the difference between the predicted and actual values of the target variable and to minimize this error. The difference is squared so that positive and negative errors do not cancel each other out. To average over the dataset, the sum of squared errors is divided by twice the number of training examples; the extra factor of 2 simplifies the derivative used during minimization.
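The quantity in option b is the squared-error cost. A minimal sketch of this computation in Python (the function name `cost` is illustrative, not from the original):

```python
def cost(t0, t1, xs, ys):
    """Squared-error cost: J(t0, t1) = sum_i (h(x_i) - y_i)^2 / (2m)."""
    m = len(xs)
    preds = [t0 + t1 * x for x in xs]  # h(x) = t0 + t1*x
    return sum((p - y) ** 2 for p, y in zip(preds, ys)) / (2 * m)

# Squaring keeps every term non-negative, so errors cannot cancel:
print(cost(0.0, 1.0, [1, 2], [2, 4]))  # errors -1 and -2 -> (1 + 4) / 4 = 1.25
```

Note that dividing by 2m (instead of m) does not change which parameters minimize the cost; it only rescales it.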

6. The cost function contains a summation expression.

a) True

b) False

Answer: a

Explanation: The cost function measures the total error over the training set, so it computes the error for each example and sums these values. The error for each example is the (squared) difference between the predicted value and the actual value of the target variable.

7. What is the simplified hypothesis?

a) h(x) = t_{1}x

b) h(x) = t_{0} + t_{1}x

c) h(x) = t_{0}

d) h(x) = t_{0}x

Answer: a

Explanation: In the simplified hypothesis, we assume that t_{0} = 0. This is often a reasonable assumption because in many problems the value of h(x) is 0 when x is 0, especially in problems where we try to predict the cost or price of something.

8. The simplified hypothesis reduces the complexity of the cost function.

a) True

b) False

Answer: a

Explanation: When we ignore the intercept term in the hypothesis equation, we are left with only the t_{1}x term. The cost function then has to minimize over a single parameter, t_{1}, the coefficient of x. This simplifies the calculation of the cost function to a certain degree.

9. In the simplified hypothesis, what does hypothesis H and cost function J depend on?

a) Both are functions of x

b) J is a function of x, H is a function of t_{1}

c) H is a function of x, J is a function of t_{1}

d) Both are functions of t_{1}

Answer: c

Explanation: The simplified hypothesis is h(x) = t_{1}x; thus h depends only on the value of x, since t_{1} is held constant during a single pass through the dataset. After one pass through the dataset, the cost function J, which is a function of t_{1}, measures the error, and t_{1} is then adjusted to reduce that error.

10. (x^{(1)}, y^{(1)}) = 1, 1.5, (x^{(2)}, y^{(2)}) = 2, 3, (x^{(3)}, y^{(3)}) = 3, 4.5. Hypothesis: h(x) = t_{1}x, where t_{1} = 1.5. How much error is obtained?

a) 4.5

b) 0

c) 22.5

d) 1.5

Answer: b

Explanation: Cost function: J(t_{1}) = [(t_{1}x^{(1)} – y^{(1)})^{2} + (t_{1}x^{(2)} – y^{(2)})^{2} + (t_{1}x^{(3)} – y^{(3)})^{2}] / 2m
= [0 + 0 + 0] / (2 × 3)
= 0/6
= 0.
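The computation above can be checked with a few lines of Python, assuming the simplified hypothesis h(x) = t_{1}x (a quick verification, not part of the original question set):

```python
xs, ys = [1, 2, 3], [1.5, 3, 4.5]
t1 = 1.5
m = len(xs)

# J(t1) = sum of squared errors over the training set, divided by 2m.
J = sum((t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)
print(J)  # 0.0 -- the hypothesis fits every example exactly
```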

11. (x^{(1)}, y^{(1)}) = 1, 1.5, (x^{(2)}, y^{(2)}) = 2, 3, (x^{(3)}, y^{(3)}) = 3, 4.5. Hypothesis: h(x) = t_{1}x, where t_{1} = 2. How much error is obtained?

a) 0.3

b) 0

c) 0.58

d) 0.5

Answer: c

Explanation: Cost function: J(t_{1}) = [(t_{1}x^{(1)} – y^{(1)})^{2} + (t_{1}x^{(2)} – y^{(2)})^{2} + (t_{1}x^{(3)} – y^{(3)})^{2}] / 2m
= [0.5^{2} + 1^{2} + 1.5^{2}] / (2 × 3)
= 3.5/6
≈ 0.58.
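Recomputing with t_{1} = 2: the errors are 0.5, 1, and 1.5, so the squared errors sum to 0.25 + 1 + 2.25 = 3.5, giving J = 3.5/6 ≈ 0.58. A short check in Python:

```python
xs, ys = [1, 2, 3], [1.5, 3, 4.5]
t1 = 2.0
m = len(xs)

# Predictions are 2, 4, 6; errors are 0.5, 1, 1.5.
J = sum((t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)
print(round(J, 2))  # 0.58
```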

12. How to graphically find t_{1} for which cost function is minimized?

a) Plot J(t_{1}) against t_{1} and find minima

b) Plot t_{1} against J(t_{1}) and find minima

c) Plot J(t_{1}) against t_{1} and find maxima

d) Plot t_{1} against J(t_{1}) and find maxima

Answer: a

Explanation: At the minimum of the curve obtained by plotting J(t_{1}) against t_{1}, we have the smallest value of J(t_{1}) for the given dataset under linear regression. The value of t_{1} at this minimum is the one we want, so we take it and use it in the final hypothesis.
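Reading the minimum off a plot corresponds, numerically, to evaluating J(t_{1}) over a range of candidate values and picking the smallest, for example by grid search (a sketch of the idea, not a method prescribed by the text):

```python
xs, ys = [1, 2, 3], [1.5, 3, 4.5]
m = len(xs)

def J(t1):
    """Cost for the simplified hypothesis h(x) = t1 * x."""
    return sum((t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

# Evaluate J on a grid of candidate t1 values and pick the minimizer,
# just as one would visually locate the minimum on a plot of J vs. t1.
grid = [i / 100 for i in range(-100, 301)]  # t1 from -1.0 to 3.0
best = min(grid, key=J)
print(best)  # 1.5 -- where J(t1) = 0 for this dataset
```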

13. What is the ideal value of t_{1}?

a) 0

b) Depends on the dataset

c) 1

d) 0.5

Answer: b

Explanation: There is no way that t_{1} can be determined before observing the dataset. It can take any value, depending on the rate of change of the target variable with respect to the independent variable. It can even take a negative value if the target variable is inversely proportional to the independent variable.

14. Hypothesis is: h(x) = t_{0} + t_{1}x. How do we graphically find the desired cost function?

a) Plot J(t_{0}, t_{1}) against t_{0} and find minima

b) Plot J(t_{0}, t_{1}) against t_{1} and find minima

c) Plot J(t_{0}, t_{1}) against either t_{1} or t_{0} and find minima

d) Make a 3-d plot with J(t_{0}, t_{1}) against t_{1} and t_{0} and find minima

Answer: d

Explanation: J(t_{0}, t_{1}) depends on both parameters, t_{0} and t_{1}. Thus we need to treat J(t_{0}, t_{1}) as a function of both t_{0} and t_{1}, which means plotting in 3 dimensions: the cost surface over the (t_{0}, t_{1}) plane.
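With both parameters free, J(t_{0}, t_{1}) is a surface, and finding its lowest point over a grid is the brute-force analogue of reading the minimum off the 3-D plot (an illustrative sketch; the dataset here is made up for the example, generated from y = 1 + x):

```python
xs, ys = [0, 1, 2], [1.0, 2.0, 3.0]  # data generated by y = 1 + x
m = len(xs)

def J(t0, t1):
    """Cost surface J(t0, t1) for the full hypothesis h(x) = t0 + t1*x."""
    return sum((t0 + t1 * x - y) ** 2 for x, y in zip(xs, ys)) / (2 * m)

# Scan a grid of (t0, t1) pairs; the minimum of the surface sits at (1, 1).
pairs = [(a / 10, b / 10) for a in range(0, 21) for b in range(0, 21)]
t0_best, t1_best = min(pairs, key=lambda p: J(*p))
print(t0_best, t1_best)  # 1.0 1.0
```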

**Sanfoundry Global Education & Learning Series – Machine Learning**.

To practice all areas of Machine Learning, here is the complete set of 1000+ Multiple Choice Questions and Answers.
