Final answer:
Ordinary Least Squares (OLS) estimates a linear regression model by minimizing the sum of squared differences between observed and predicted values, while Maximum Likelihood Estimation (MLE) estimates a logistic regression model, which predicts binary outcomes, by finding the parameters that make the observed data most probable.
Step-by-step explanation:
The statement 'OLS is to linear regression as maximum likelihood is to logistic regression' means that Ordinary Least Squares (OLS) is the standard method for estimating the parameters of a linear regression model, while Maximum Likelihood Estimation (MLE) is the standard method for estimating a logistic regression model.

In linear regression, OLS chooses the coefficients that minimize the sum of the squared differences between the observed values and the values predicted by the linear model. This is known as fitting a least-squares regression line, and it produces a model that predicts a continuous outcome.

Logistic regression, by contrast, is used when the outcome is categorical, typically binary, and it models the probability that one of the categories occurs. MLE finds the set of parameters under which the observed data are most probable according to the logistic model.
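To make the contrast concrete, here is a minimal NumPy sketch (the simulated data, variable names, and the simple gradient-ascent loop are illustrative assumptions, not part of the original answer): the linear model's coefficients come from minimizing the sum of squared residuals in closed form, while the logistic model's coefficients come from iteratively maximizing the Bernoulli log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- OLS for linear regression: minimize the sum of squared residuals ---
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one feature
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)        # continuous outcome

# Closed-form OLS solution: beta_hat = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print("OLS estimates:", beta_ols)

# --- MLE for logistic regression: maximize the log-likelihood ---
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta_true = np.array([-0.5, 1.5])                       # illustrative parameters
y_binary = rng.binomial(1, sigmoid(X @ theta_true))      # binary outcome

theta = np.zeros(2)
learning_rate = 0.1
for _ in range(5000):
    # Gradient of the Bernoulli log-likelihood: X' (y - p)
    grad = X.T @ (y_binary - sigmoid(X @ theta))
    theta += learning_rate * grad / n                     # simple gradient ascent
print("MLE estimates:", theta)
```

In practice a statistics package would handle both fits (and would use a faster solver than plain gradient ascent for the logistic model), but the sketch shows the two different objectives being optimized: squared error for OLS, likelihood for MLE.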