Logistic Regression: Definition
Bruce Ratner, Ph.D.
Logistic Regression Model
Let Y be a binary dependent variable that assumes one of two outcomes or classes (typically labeled 0 and 1). The logistic regression model (LRM) classifies an individual into one of the classes based on the values of the predictor (independent) variables X1, X2, ..., Xn for that individual.
LRM estimates the logit of Y, the log of the odds of an individual belonging to class 1; the logit is defined in equation (3.1). The logit is an abstract measure for all but the experienced data analyst. (The logit theoretically assumes values between minus and plus infinity; in practice, it rarely falls outside the range of -7 to +7.) Fortunately, the logit can easily be converted into the probability of an individual belonging to class 1, Prob(Y = 1), which is defined in equation (3.2).
logit Y = b0 + b1*X1 + b2*X2 + ... + bn*Xn (3.1)
Prob(Y = 1) = exp(logit Y) / (1 + exp(logit Y)) (3.2)
An individual’s estimated (predicted) probability of belonging to class 1 is calculated by plugging the individual’s predictor-variable values into equations (3.1) and (3.2). The bs are the logistic regression coefficients, which are determined by the calculus-based method of maximum likelihood. Note: unlike the other coefficients, b0 (referred to as the intercept) has no predictor variable with which it is multiplied. Needless to say, the probability of an individual belonging to class 0 is Prob(Y = 0) = 1 - Prob(Y = 1).
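A minimal sketch of equations (3.1) and (3.2) in Python. The coefficient values b0 and bs, and the individual's predictor values x, are hypothetical numbers chosen purely for illustration; in practice, the coefficients would come from maximum likelihood estimation on data.

```python
import math

def predict_prob(b0, bs, xs):
    """Estimated Prob(Y = 1) for one individual via equations (3.1) and (3.2)."""
    # Equation (3.1): the logit is a linear combination of the predictors.
    logit = b0 + sum(b * x for b, x in zip(bs, xs))
    # Equation (3.2): convert the logit to a probability in (0, 1).
    return math.exp(logit) / (1 + math.exp(logit))

# Hypothetical fitted coefficients (intercept b0 and slopes bs)
b0 = -1.5
bs = [0.8, -0.3]

# Hypothetical predictor values for one individual
x = [2.0, 1.0]

p1 = predict_prob(b0, bs, x)  # Prob(Y = 1), approximately 0.450 here
p0 = 1 - p1                   # Prob(Y = 0)
```

Note that a logit of 0 corresponds to a probability of exactly 0.5, and the two class probabilities always sum to 1, consistent with Prob(Y = 0) = 1 - Prob(Y = 1).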