
coefficient of logistic regression

8 messages · orkun, John Fox, Thomas W Blackwell

#
Hello

In logistic regression, I want to know whether it is possible to get a
probability value for each predictor by applying the following formula to
each predictor one by one (keeping the others constant):

  exp(coef) / (1 + exp(coef))

thanks in advance

Ahmet Temiz


______________________________________



______________________________________
The views and opinions expressed in this e-mail message are the ... {{dropped}}
#
At 11:54 AM 6/3/2003 +0300, orkun wrote:

Dear Ahmet,

This will almost surely give you nonsense, since it produces a fitted 
probability ignoring the constant in the model (assuming that there is 
one), setting other predictors to 0 and the predictor in question to 1. 
What is it that you want to do?
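
A tiny numeric illustration of this point, with made-up coefficients (the
predictor names and values here are invented, not from Ahmet's model):

```r
# Hypothetical coefficients for a fitted logistic regression (names invented)
b <- c("(Intercept)" = -3.2, slope = 0.8, geology = 1.5)

# The proposed formula, applied to the slope coefficient alone:
exp(b[["slope"]]) / (1 + exp(b[["slope"]]))      # about 0.69

# That number is the fitted probability only for the implicit case
# intercept = 0, slope = 1, geology = 0.  With the intercept included,
# the fitted probability at slope = 1, geology = 0 is quite different:
eta <- b[["(Intercept)"]] + b[["slope"]] * 1 + b[["geology"]] * 0
exp(eta) / (1 + exp(eta))                        # about 0.083
```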

I hope that this helps,
  John

-----------------------------------------------------
John Fox
Department of Sociology
McMaster University
Hamilton, Ontario, Canada L8S 4M4
email: jfox at mcmaster.ca
phone: 905-525-9140x23604
web: www.socsci.mcmaster.ca/jfox
#
John Fox wrote:

thank you

Say I just want to find each predictor's particular effect on the
dependent variable. The actual model is for preparing a landslide
susceptibility map in a GIS, so I want to know what effect, as a
probability value, comes from each predictor. For instance, what is the
effect of slope on landslide susceptibility? Should I keep the others
constant?

kind regards


#
Ahmet  -

In a logistic regression model, fitted probabilities make
sense for individual cases (rows in the data set), as well
as for future cases (predictions) for which no outcome
(success or failure) has been observed yet.  Fitted
probabilities are calculated from the matrix formula:

  Pr[success]  =  exp( X %*% beta ) / (1 + exp( X %*% beta ))

where  X  is an [n x (p+1)] matrix, containing all p predictor
variables as columns, preceded by a column of 1s for the
intercept, and  beta  is the [(p+1) x 1] vector of logistic
regression coefficients.
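
This matrix formula can be sketched directly in R on a toy data set (the
data here are simulated purely for illustration):

```r
set.seed(1)
# Toy data: two made-up predictors and a binary outcome
d <- data.frame(x1 = rnorm(50), x2 = rnorm(50))
d$y <- rbinom(50, 1, plogis(-0.5 + d$x1))

fit  <- glm(y ~ x1 + x2, family = binomial, data = d)
X    <- model.matrix(fit)   # n x (p+1): a column of 1s, then the predictors
beta <- coef(fit)           # the (p+1)-vector of coefficients

p.hat <- exp(X %*% beta) / (1 + exp(X %*% beta))
# agrees, up to rounding, with the built-in fitted values:
all.equal(as.vector(p.hat), unname(fitted(fit)))
```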

One can interpret the sign and the magnitude of an individual
regression coefficient by saying that an increase of 1 unit in
predictor variable [i] will increase or decrease the odds of
success by a multiplier of  exp(beta[i]).  When  beta[i] > 0
the odds increase, because  exp(beta[i]) > 1,  and when
beta[i] < 0  the odds decrease, because  exp(beta[i]) < 1.
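
For instance, taking a hypothetical coefficient of 0.8 (the intercept
here is also made up):

```r
b.i <- 0.8                       # hypothetical coefficient beta[i]
exp(b.i)                         # odds multiplier per 1-unit increase, about 2.23

# Check against the definition: odds at x[i] = s + 1 versus odds at x[i] = s,
# with a made-up intercept and all other predictors held fixed
eta.at <- function(s) -3.2 + b.i * s
exp(eta.at(6)) / exp(eta.at(5))  # the same multiplier, exp(0.8)
```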

I hope this explanation helps.

-  tom blackwell  -  u michigan medical school  -  ann arbor  -
On Tue, 3 Jun 2003, orkun wrote:

#
Dear Ahmet,

Sorry for the slow response, but I've been busy all day today, coincidentally 
teaching a workshop on logistic regression.

Tom Blackwell sent you a useful suggestion for interpreting coefficients on 
the odds scale. If you want to trace out the partial relationship of the 
fitted probability of response to a particular predictor holding others 
constant, you can set the other predictors to typical values and let the 
predictor in question vary over its range, transforming the fitted log-odds 
to the probability scale.
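
That procedure can be sketched as follows (the data and the predictor names
here are invented for illustration, not Ahmet's actual variables):

```r
set.seed(2)
# Hypothetical data: 'slope' and 'geology' are made-up predictors
d <- data.frame(slope = runif(50, 0, 45), geology = rnorm(50))
d$y <- rbinom(50, 1, plogis(-2 + 0.1 * d$slope))
fit <- glm(y ~ slope + geology, family = binomial, data = d)

# Hold geology at its mean; let slope vary over its observed range
nd <- data.frame(slope   = seq(min(d$slope), max(d$slope), length.out = 100),
                 geology = mean(d$geology))
eta <- predict(fit, newdata = nd, type = "link")   # fitted log-odds
p   <- plogis(eta)                                 # back to the probability scale
plot(nd$slope, p, type = "l",
     xlab = "slope", ylab = "fitted probability")
```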

You may be interested in my effects package (on CRAN or at 
<http://socserv.socsci.mcmaster.ca/jfox/Misc/effects/index.html>), which 
makes these kinds of displays for linear and generalized-linear models, 
including those with interactions.

Regards,
  John
At 03:06 PM 6/3/2003 +0300, orkun wrote:
#
John Fox wrote:

Dear Mr. Fox

thank you very much all.

Because it relates to your answer, I am asking you directly, if you don't mind.
I tried several approaches after my e-mail.
I wonder whether pgeo <- predict.glm(glm.ob, type = "terms")
gives the same result as the probability value I asked about before.
I tried it, but it gives "Error in rep(1/n,n) %*%
model.matrix(object): non-conformable arguments".

By the way, are your teaching notes available on the internet?


cordially

can y



#
Thomas W Blackwell wrote:

Dear Mr. Fox

thank you very much, all.

So, using the formula exp(coef)/(1+exp(coef)) to get the probability
of each predictor is correct.

Because it relates to your answer, I am asking you directly, if you don't mind.
I tried several approaches after my e-mail.
I wonder whether pgeo <- predict.glm(glm.ob, type = "terms")
gives the same result as the probability value I asked about before.
I tried it, but it gives "Error in rep(1/n,n) %*%
model.matrix(object): non-conformable arguments".

Could you tell me why?


cordially



1 day later
#
Dear can y,
At 03:04 PM 6/4/2003 +0300, orkun wrote:
[previous messages deleted]
I don't know why this doesn't work for you -- it works for me. I don't 
think that this will give you what you want, however: setting type="terms" 
produces the (centred) term-wise components of the fitted values on the 
scale of the linear predictor (i.e., the logit scale).
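
The relationship among the prediction types can be seen on a toy fit (the
data here are simulated for illustration):

```r
set.seed(3)
d <- data.frame(x1 = rnorm(30), x2 = rnorm(30))
d$y <- rbinom(30, 1, plogis(d$x1))
fit <- glm(y ~ x1 + x2, family = binomial, data = d)

tt <- predict(fit, type = "terms")     # centred per-term components, logit scale
lp <- predict(fit, type = "link")      # the full linear predictor (logit)
pr <- predict(fit, type = "response")  # fitted probabilities

# The term components plus the stored constant reconstruct the linear predictor:
all.equal(unname(rowSums(tt) + attr(tt, "constant")), unname(lp))  # TRUE
# and probabilities come from the inverse logit of the linear predictor:
all.equal(unname(pr), unname(plogis(lp)))                          # TRUE
```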

I think that the responses that you got previously from Tom Blackwell and 
from me answer your question.
They are, along with other course materials, at 
<http://www.math.yorku.ca/SCS/spida/glm/>. Unfortunately, these workshops 
were taught using SAS rather than R (not my choice).

John