Parameter estimates (also called coefficients) give the log of the odds ratio associated with a
one-unit change in the predictor, with all other predictors held constant.
The unknown model parameters are estimated using maximum-likelihood estimation.
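Maximum-likelihood estimation for logistic regression is typically carried out iteratively. The following sketch fits the coefficients by Newton-Raphson on the log-likelihood (the data, function name, and iteration count are illustrative assumptions, not part of the source; production software uses more careful, converged solvers):

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Illustrative maximum-likelihood fit of logistic regression
    coefficients via Newton-Raphson iterations."""
    X = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))       # predicted probabilities
        w = p * (1.0 - p)                         # Newton (IRLS) weights
        grad = X.T @ (y - p)                      # gradient of the log-likelihood
        hess = X.T @ (X * w[:, None])             # observed information matrix
        beta = beta + np.linalg.solve(hess, grad) # Newton update
    return beta

# Synthetic data generated with known coefficients (intercept -1, slope 2),
# so the fitted estimates should land close to those values.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
p_true = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * x)))
y = rng.binomial(1, p_true)
beta_hat = fit_logistic(x, y)
```

On this simulated sample, `beta_hat` recovers the generating intercept and slope to within sampling error.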
A coefficient describes the size of that predictor's contribution: a large coefficient indicates that the variable strongly influences the probability of the outcome, while a near-zero coefficient indicates that the variable has little influence on it. A positive sign indicates that the explanatory variable increases the probability of the outcome; a negative sign indicates that it decreases the probability. A confidence
interval for each parameter quantifies the uncertainty in the estimate.
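Because a coefficient is a log odds ratio, exponentiating it (and the endpoints of its confidence interval) puts the interpretation on the more familiar odds-ratio scale. A minimal sketch, using made-up values for the coefficient and its standard error:

```python
import math

# Hypothetical fitted coefficient (log odds ratio) and standard error;
# these numbers are illustrative, not from any real model.
beta = 0.693
se = 0.150

# exp(beta) is the multiplicative change in the odds per one-unit
# increase in the predictor; here the odds roughly double.
odds_ratio = math.exp(beta)

# A 95% Wald confidence interval, computed on the log-odds scale
# and then transformed to the odds-ratio scale.
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
```

If the transformed interval excludes 1 (equivalently, the log-odds interval excludes 0), the predictor's association with the outcome is statistically distinguishable from no effect at that confidence level.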
When the model contains categorical variables, the interpretation of the coefficients is more complex. For each term involving a categorical variable, a set of dummy predictor variables is created to capture the effect of each level. There are different ways to code the predictors for a categorical variable; the most common method in logistic regression is called reference cell coding, or dummy coding. In reference cell coding, the first category acts as a
baseline, and each remaining coefficient is interpreted as the increase or decrease in the log odds ratio relative to that baseline category.
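Reference cell coding can be sketched as follows for a three-level categorical predictor (the level names and data here are invented for illustration):

```python
import numpy as np

# A categorical predictor with three levels; "A" will be the baseline.
levels = ["A", "B", "C"]
color = np.array(["A", "C", "B", "A", "B"])

# One indicator (dummy) column per non-baseline level.  A row belonging
# to the baseline category "A" is all zeros.
dummies = np.column_stack([(color == lvl).astype(int) for lvl in levels[1:]])

# The coefficient fitted to each dummy column is then the change in
# log odds for that level relative to the baseline category "A".
```

With k levels, only k - 1 dummy columns enter the model alongside the intercept; including all k would make the design matrix collinear.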