I used R and the function polr from the MASS package to perform an ordered logistic regression. The model is simple: there is only one dichotomous predictor (levels "normal" and "modified"). The question was whether the frequency distribution of the response is shifted under the "modified" condition compared to the "normal" condition.

This is the script:

library(MASS)  # provides polr

# ordered response with five levels
y = ordered(gl(5, 1, labels = c("cal++", "cal", "even", "hk", "hk++")))
yy = rep(y, 2)  # one copy of the levels per condition

# dichotomous predictor: first 5 rows "normal", last 5 "modified"
condition = gl(2, 5, labels = c("normal", "modified"))

# observed frequencies per response level and condition
f.norm = c(2, 0, 4, 13, 5)
f.mod = c(0, 0, 1, 11, 12)
ff = c(f.norm, f.mod)

m = polr(yy ~ condition, weights = ff, Hess = TRUE)  # Hess = TRUE keeps the Hessian for summary()
summary(m)

This is the result:

Coefficients:
                  Value Std. Error t value
conditionmodified 1.513     0.6034   2.508

Intercepts:
          Value   Std. Error t value
cal++|cal -2.6137 0.7456     -3.5053
cal|even  -2.6135 0.7456     -3.5053
even|hk   -1.1836 0.4607     -2.5691
hk|hk++    1.4620 0.4864      3.0059

Residual Deviance: 97.08686
AIC: 107.0869
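For reference, the last two numbers can also be pulled from the fitted object directly:

deviance(m)  # 97.08686
AIC(m)       # 107.0869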

I also attached a plot with the predicted probabilities and empirical frequencies.
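The predicted probabilities in the plot can be reproduced with predict() (a minimal sketch; nd is just a helper data frame holding the two conditions):

nd = data.frame(condition = gl(2, 1, labels = c("normal", "modified")))
predict(m, newdata = nd, type = "probs")  # one row of category probabilities per condition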

My question is: what is the practical meaning of the coefficient 1.513? From the textbook I know that exp(-1.513) = 0.22; this is the ratio of the odds for a lower versus a higher outcome per unit increase in the predictor (here, from "normal" to "modified"). So I read it as: the odds for a higher outcome in "modified" are 0.22 times as high as in "normal". But then the odds for higher outcomes should be lower in the "modified" group, from which I would expect the "modified" frequencies to peak at lower response levels. In fact, however, they peak at higher levels ("normal" peaks at hk, "modified" at hk++). Where am I wrong? How can I understand this?
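For concreteness, the calculation I refer to is simply:

exp(-coef(m))  # conditionmodified: exp(-1.513) = 0.22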

Thanks for any help!
