# R Programming Questions and Answers – Linear Regression – 3


This set of R Programming Language Multiple Choice Questions & Answers (MCQs) focuses on “Linear Regression – 3”.

1. ________ is an incredibly powerful tool for analyzing data.
a) Linear regression
b) Logistic regression
c) Greedy algorithms

Explanation: Linear regression is an incredibly powerful tool for analyzing data. We will focus on finding one of the simplest types of relationship: linear. This process is, unsurprisingly, called linear regression, and it has many applications.
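As a quick illustration, here is a minimal sketch of fitting a simple linear model in R with `lm()`; the data below is made up for the example:

```r
# Hypothetical example data: y depends linearly on x, plus noise.
set.seed(42)
x <- 1:20
y <- 3 + 2 * x + rnorm(20)   # true intercept 3, true slope 2

fit <- lm(y ~ x)   # fit the linear regression y = b0 + b1*x
summary(fit)       # coefficients, R-squared, and an overall F-test
```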

2. The square of the correlation coefficient r² will always be positive and is called the ________
a) Regression
b) Coefficient of determination
c) KNN
d) Algorithm

Explanation: The square of the correlation coefficient, r², will always be positive and is called the coefficient of determination. It is also equal to the proportion of the total variability that is explained by a linear model.
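This relationship is easy to check in R: squaring `cor(x, y)` gives the same value that `summary()` of a fitted model reports as R-squared (illustrative data below):

```r
# Illustrative data with an approximately linear trend.
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 8.1, 9.8)

r  <- cor(x, y)   # correlation coefficient
r2 <- r^2         # coefficient of determination

fit <- lm(y ~ x)
# r^2 matches the R-squared reported by the fitted model
same <- isTRUE(all.equal(r2, summary(fit)$r.squared))
```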

3. Predicting y for a value of x that’s outside the range of values we actually saw for x in the original data is called ___________
a) Regression
b) Extrapolation
c) Interpolation
d) Polation

Explanation: Predicting y for a value of x that is within the interval of points that we saw in the original data is called interpolation. Predicting y for a value of x that’s outside the range of values we actually saw for x in the original data is called extrapolation.

4. What is predicting y for a value of x that is within the interval of points that we saw in the original data called?
a) Regression
b) Extrapolation
c) Interpolation
d) Polation

Explanation: Predicting y for a value of x that is within the interval of points that we saw in the original data is called interpolation. Predicting y for a value of x that’s outside the range of values we actually saw for x in the original data is called extrapolation.
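In R, both amount to calling `predict()` on a fitted model; what distinguishes them is whether the new x falls inside the observed range (the values below are made up):

```r
# Observed x values span 1..10.
x <- 1:10
y <- 5 + 1.5 * x
fit <- lm(y ~ x)

# Interpolation: x = 5.5 lies inside the observed range [1, 10].
p_in  <- predict(fit, newdata = data.frame(x = 5.5))
# Extrapolation: x = 25 lies outside the observed range, so this
# prediction rests on the assumption that the line continues to hold.
p_out <- predict(fit, newdata = data.frame(x = 25))
```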

5. Analysis of variance in short form is?
a) ANOV
b) AVA
c) ANOVA
d) ANVA

Explanation: If the ANOVA test determines that the model explains a significant portion of the variability in the data, then we can consider testing each of the hypotheses and correcting for multiple comparisons.
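A sketch of running an ANOVA on a fitted linear model in R with `anova()`, using simulated data:

```r
set.seed(1)
x <- 1:30
y <- 2 + 0.8 * x + rnorm(30)   # simulated data with a real linear effect

fit <- lm(y ~ x)
anova(fit)   # ANOVA table: F statistic and p-value for the model term
```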

6. ________ is a simple approach to supervised learning. It assumes that the dependence of Y on X1, X2, . . . Xp is linear.
a) Linear regression
b) Logistic regression
c) Greedy algorithms

Explanation: Linear regression is a simple approach to supervised learning. It assumes that the dependence of Y on X1, X2, . . ., Xp is linear. Linear regression is an incredibly powerful tool for analyzing data.

7. Although it may seem overly simplistic, _______ is extremely useful both conceptually and practically.
a) Linear regression
b) Logistic regression
c) Greedy algorithms

Explanation: Linear regression is a simple approach to supervised learning. It assumes that the dependence of Y on X1, X2, . . ., Xp is linear. Although it may seem overly simplistic, linear regression is extremely useful both conceptually and practically.

8. When there is more than one independent variable in the model, the linear model is termed the _______
a) Unimodal
b) Multiple model
c) Multiple Linear model
d) Multiple Logistic model

Explanation: When there is more than one independent variable in the model, the linear model is termed the multiple linear regression model.
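In R, a multiple linear regression is written by joining predictors with `+` in the model formula; the data and coefficients below are simulated for illustration:

```r
set.seed(7)
n  <- 50
x1 <- rnorm(n)
x2 <- rnorm(n)
y  <- 1 + 2 * x1 - 3 * x2 + rnorm(n)   # two independent variables

fit <- lm(y ~ x1 + x2)   # multiple linear regression model
coef(fit)                # estimates for the intercept, x1, and x2
```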

9. The parameter β0 is termed the intercept term and the parameter β1 is termed the slope parameter. These parameters are usually called _________
a) Regressionists
b) Coefficients
c) Regressive
d) Regression coefficients

Explanation: The parameter β0 is termed the intercept term and the parameter β1 the slope parameter. Together, these parameters are usually called regression coefficients.
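In R, the fitted regression coefficients are extracted with `coef()`; the built-in `cars` dataset makes a convenient example:

```r
# Fit stopping distance against speed using R's built-in cars data.
fit <- lm(dist ~ speed, data = cars)

coef(fit)                          # both regression coefficients at once
b0 <- coef(fit)[["(Intercept)"]]   # beta0, the intercept term
b1 <- coef(fit)[["speed"]]         # beta1, the slope parameter
```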

10. Minimizing the sum of squares of the differences between the observations and the line in the horizontal direction of the scatter diagram to obtain the estimates is generally called the ________
a) reverse regression method
b) formal regression
c) logistic regression
d) simple regression

Explanation: The sum of squares of the differences between the observations and the line in the horizontal direction of the scatter diagram can be minimized to obtain the estimates of β0 and β1. This is generally called the reverse or inverse regression method.

11. ______ regression method is also known as the ordinary least squares estimation.
a) Simple
b) Direct
c) Indirect
d) Mutual

Explanation: The direct regression method is also known as ordinary least squares (OLS) estimation. It assumes that a set of n paired observations is available which satisfies the linear regression model.
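The OLS estimates can also be computed directly from their closed-form formulas and checked against `lm()` (toy data below):

```r
# Toy data for the comparison.
x <- c(1, 2, 3, 4, 5)
y <- c(1.2, 1.9, 3.2, 3.8, 5.1)

# Closed-form OLS estimates of the slope and intercept.
b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)

fit <- lm(y ~ x)   # lm() minimizes the same sum of squared residuals
same <- isTRUE(all.equal(unname(coef(fit)), c(b0, b1)))
```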

12. __________ refers to a group of techniques for fitting and studying the straight-line relationship between two variables.
a) Linear regression
b) Logistic regression
c) Greedy algorithms

Explanation: Linear regression refers to a group of techniques for fitting and studying the straight-line relationship between two variables. It is an incredibly powerful tool for analyzing data and has many applications.

13. In order to calculate confidence intervals and hypothesis tests, it is assumed that the errors are independent and normally distributed with mean zero and _______
a) Mean
b) Variance
c) SD
d) KNN

Explanation: In order to calculate confidence intervals and hypothesis tests, it is assumed that the errors are independent and normally distributed with mean zero and constant variance.

14. What do we do with any curvilinear relationship in linear regression?
a) consider
b) ignore
c) may be considered
d) sometimes consider

Explanation: Linear regression models the straight-line relationship between Y and X. Any curvilinear relationship is ignored. This assumption is most easily evaluated by using a scatter plot.

15. When hypothesis tests and confidence limits are to be used, the residuals are assumed to follow the __________distribution.
a) Formal
b) Mutual
c) Normal
d) Abnormal

Explanation: When hypothesis tests and confidence limits are to be used, the residuals are assumed to follow the normal distribution.
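A sketch of checking this assumption in R, using a Q-Q plot and the Shapiro-Wilk test on the residuals of a fitted model (simulated data):

```r
set.seed(3)
x <- 1:40
y <- 4 + 0.5 * x + rnorm(40)   # simulated data with normal errors

fit <- lm(y ~ x)
res <- residuals(fit)

qqnorm(res); qqline(res)   # points close to the line suggest normality
shapiro.test(res)          # formal test: a large p-value is consistent
                           # with normally distributed residuals
```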