
Publications

From: Bruce Ratner
Date: Mon, 20 Jul 2009
Subject: Linear Probability, Logit, and Probit Models: How Do They Differ?

At the beginning of every day, the regression modeler, whose tasks are to predict a continuous dependent variable (e.g., profit) and a binary dependent variable (e.g., yes-no response), is likely to put to use the ordinary least squares (OLS) regression model and the logistic regression model, respectively, giving promise of another workday of successful models. The essence of any prediction model is the fitness function, which quantifies the optimality (goodness or accuracy) of a solution (a set of predictions). The fitness function of the OLS regression model is mean squared error (MSE), which is minimized by calculus. Historians generally trace the roots of calculus back to the ancient Greeks, circa 400 BC. Calculus started making great strides in Europe towards the end of the 17th century, when Leibniz and Newton each pulled their own "to-be-calculus" ideas together; they are credited with the independent invention of calculus. The OLS regression model is celebrating 204 years of popularity, as the method of least squares was introduced on March 6, 1805.
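
For concreteness, here is a minimal sketch of what "minimized by calculus" means in practice: setting the gradient of the MSE to zero yields the closed-form normal equations, beta_hat = (X'X)^(-1) X'y. The sketch uses NumPy and simulated data; the variable names are illustrative, not from the original piece.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100
    # Design matrix: intercept plus one predictor
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    # Continuous dependent variable (e.g., profit), with noise
    y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

    # Calculus step: the gradient of MSE is zero at the solution of
    # the normal equations (X'X) beta = X'y
    beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

    mse = np.mean((y - X @ beta_hat) ** 2)
    print(beta_hat, mse)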

The first use of OLS regression with a binary dependent variable has an intractable past: who first used it, when, and why are not known. The pent-up need for a binary dependent-variable linear regression model was quite apparent, as once it was employed there was no turning back the clock. The enthusiasm of users of the new probability regression model led to its renaming as the Linear Probability Model. The problems of the linear probability model are well known today, but its usage came to a quick halt when the probit model was invented.
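
As a rough illustration of how the three models in the title differ, the sketch below fits the linear probability model (OLS on a binary dependent), the logit model, and the probit model to the same simulated yes-no response. The use of the statsmodels package and the simulated data are assumptions for illustration, not part of the original piece; the key point is that only the linear probability model can predict "probabilities" outside [0, 1], one of its well-known problems.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    # True response probabilities via a logistic curve
    p = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x)))
    # Binary dependent variable (e.g., yes-no response)
    y = rng.binomial(1, p)

    lpm = sm.OLS(y, X).fit()              # linear probability model
    logit = sm.Logit(y, X).fit(disp=0)    # logistic CDF link
    probit = sm.Probit(y, X).fit(disp=0)  # normal CDF link

    # All three estimate P(y = 1); the LPM's fitted values can
    # stray below 0 or above 1, while logit and probit cannot.
    print(lpm.predict(X).min(), lpm.predict(X).max())
    print(logit.predict(X).min(), logit.predict(X).max())
    print(probit.predict(X).min(), probit.predict(X).max())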

Read more.

