
simple mixture

3 messages · Emmanuel Paradis, Paulo Justiniano Ribeiro Jr, Bill Simpson

#
Dear All,

I am trying to do some simple mixture analyses. For instance, I have a
sample of n observations and I suspect they come from two different
exponential distributions, with rates rate1 and rate2. So, I want to
estimate rate1, rate2, and the proportion of each kind of individual in
the sample. I had a look at the packages mda and mclust, but they do not
seem to do this kind of analysis.

Has anybody tried to program such a simple mixture analysis in R? Also, I
am interested in testing whether the suspected heterogeneity is
significant: is there any method in R for such a test?
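
For concreteness, here is a rough EM sketch of the kind of analysis I mean
(the function and argument names, em.exp2, rate1, rate2, are my own
invention, and I am not certain this is the standard approach; corrections
welcome):

```r
# Rough EM sketch for a mixture of two exponentials.
# All names here (em.exp2, rate1, rate2) are made up for illustration.
em.exp2 <- function(x, rate1 = 1, rate2 = 2, p = 0.5,
                    tol = 1e-8, maxit = 1000)
{
  for (i in 1:maxit) {
    # E-step: posterior probability that each observation
    # belongs to the first component
    d1 <- p * dexp(x, rate1)
    d2 <- (1 - p) * dexp(x, rate2)
    w <- d1 / (d1 + d2)
    # M-step: weighted maximum-likelihood updates
    # (exponential MLE: rate = 1 / weighted mean)
    p.new <- mean(w)
    rate1.new <- sum(w) / sum(w * x)
    rate2.new <- sum(1 - w) / sum((1 - w) * x)
    done <- abs(p.new - p) + abs(rate1.new - rate1) +
            abs(rate2.new - rate2) < tol
    p <- p.new; rate1 <- rate1.new; rate2 <- rate2.new
    if (done) break
  }
  list(p = p, rate1 = rate1, rate2 = rate2,
       loglik = sum(log(p * dexp(x, rate1) + (1 - p) * dexp(x, rate2))))
}
```

For the significance question, I gather that twice the difference between
this log-likelihood and that of a single fitted exponential is the
likelihood-ratio statistic, but that its null distribution is not the
usual chi-square, since the mixing proportion lies on the boundary of the
parameter space under the null.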

Thanks in advance for any help.

Emmanuel Paradis
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
#
Hi,

I'm trying to create a DLL file for the Windows distribution of a package.

I've tried mingw32 and cygwin with the commands given below, and there
seems to be a problem related to the C random() function. (-mno-cygwin is
only included when using cygwin.)

Can anybody tell me how to work around this?
Thanks
P.J.

$ gcc -O2 -mno-cygwin -c function.c
$ dllwrap -o function.dll -mno-cygwin --export-all-symbols function.o 
dllwrap: no export definition file provided
dllwrap: creating one, but that may not be what you want
/usr/bin/ld: warning: cannot find entry symbol __cygwin_dll_entry@12; defaulting to 64f81000
function.o(.text+0x164a):function.c: undefined reference to `random'
function.o(.text+0x168f):function.c: undefined reference to `random'
collect2: ld returned 1 exit status
dllwrap: gcc exited with status 1

--------------------------------------------------------------

Paulo Justiniano Ribeiro Jr
Dept Maths & Stats  -  Fylde College
Lancaster University
Lancaster LA1 4YF   -  U.K.

e-mail: paulojus at est.ufpr.br
http://www.maths.lancs.ac.uk/~ribeiro


#
On Thu, 9 Nov 2000, Emmanuel Paradis wrote:
This is an MLE problem; use nlm().

I recently had to do this with the Gumbel distribution. Experts, please
let me know if I've screwed up.

Bill

==================================================
# Gumbel CDF, density, and random-number generator (location a, scale b)
pgumbel <- function(x, a, b) exp(-exp(-(x - a)/b))

dgumbel <- function(x, a, b) (1/b) * exp(-(x - a)/b) * exp(-exp(-(x - a)/b))

rgumbel <- function(n, a, b) a - b * log(-log(runif(n)))

# density of a two-component Gumbel mixture with mixing proportion p
dmixgumbel <- function(x, p, a1, b1, a2, b2)
{
p * dgumbel(x, a1, b1) + (1 - p) * dgumbel(x, a2, b2)
}

# mixture CDF: the weighted sum of the component CDFs
pmixgumbel <- function(x, p, a1, b1, a2, b2)
{
p * pgumbel(x, a1, b1) + (1 - p) * pgumbel(x, a2, b2)
}

fitmixture.sim <- function(p = 0.9, a1 = 200, b1 = 50, a2 = 600, b2 = 20)
{
# simulate 150 observations from the two-component mixture
n1 <- round(p * 150)
n2 <- 150 - n1
x <- c(rgumbel(n = n1, a = a1, b = b1), rgumbel(n = n2, a = a2, b = b2))
plot(sort(x), ppoints(x))  # empirical CDF

# negative log-likelihood; theta = (p, a1, b1, a2, b2)
fn <- function(theta)
        {
        -sum(log(dmixgumbel(x, p = theta[1], a1 = theta[2], b1 = theta[3],
                            a2 = theta[4], b2 = theta[5])))
        }
out <- nlm(fn, p = c(.9, 200, 50, 600, 20), hessian = TRUE)

# overlay the fitted mixture CDF on the empirical CDF
xfit <- seq(min(x), max(x), length = 40)
yfit <- pmixgumbel(xfit, p = out$estimate[1], a1 = out$estimate[2],
b1 = out$estimate[3], a2 = out$estimate[4], b2 = out$estimate[5])
lines(xfit, yfit)

list(out = out)
}
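
Since nlm() is called with hessian=TRUE, one can also pull approximate
standard errors out of the result. Here is a self-contained toy
illustration with a single exponential rate (this is not part of the
Gumbel code above, just the same idea in miniature):

```r
# Toy example: standard errors from nlm's Hessian.
# Fit a single exponential rate by maximum likelihood.
set.seed(1)
x <- rexp(500, rate = 2)

# negative log-likelihood for the rate
nll <- function(theta) -sum(log(dexp(x, rate = theta)))

out <- nlm(nll, p = 1, hessian = TRUE)

# the inverse Hessian of the negative log-likelihood approximates
# the covariance matrix of the estimates
se <- sqrt(diag(solve(out$hessian)))
c(estimate = out$estimate, se = se)
```

The same solve(out$hessian) step applies to the 5-parameter mixture fit,
provided the Hessian is well-conditioned at the optimum.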

