
include Time in as.ltraj()

5 messages · Struve, Juliane, Erika Mudrak, Glen A Sargeant +1 more

#
Hello everyone-

I have several sets of data that I fit (using MLE) to several uncommon distributions (beta-binomial, zero-inflated negative binomial, zero-inflated beta-binomial, zero-inflated binomial, etc.).

I used dzinbinom from the emdbook package, the companion package to Ben Bolker's book Ecological Models and Data in R, and dzibinom and dzibb as developed on pages 285-286 of that book.

I have compared them using AIC values (with AICtab from the bbmle package), but I would still like to run a goodness-of-fit test on the "winner" to see whether it is a reasonable distribution.

goodfit() from vcd only supports the poisson, binomial, and nbinomial distributions.

I would like to use chisq.test, but I am having trouble coming up with the reference distribution.

I followed an example on page 287 of the book, where the reference distribution is calculated with dzibb and then passed as p, the vector of probabilities:


ZIBBprob <- dzibb(1:size, prob = blurf1, theta = blurf2, size = blurf3, zprob = blurf4)
chisq.test(tabulate(obs), p = ZIBBprob)

My problem is that the ZIBBprob vector does not add up to 1...  Is this because I am mis-using dzibb?  
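A sketch of what I suspect is going on, using a hypothetical stand-in density in place of dzibb (which is defined in the book): a zero-inflated distribution places extra mass on 0, so the density sums to 1 only over the full support 0:size, not over 1:size.

```r
## Hypothetical stand-in for dzibb: a zero-inflated *binomial* density
## (dzibb in the book uses a beta-binomial kernel instead).
dzib_sketch <- function(x, size, prob, zprob) {
  ifelse(x == 0,
         zprob + (1 - zprob) * dbinom(0, size, prob),
         (1 - zprob) * dbinom(x, size, prob))
}

size <- 10
sum(dzib_sketch(1:size, size = size, prob = 0.3, zprob = 0.2))  # < 1: omits x = 0
sum(dzib_sketch(0:size, size = size, prob = 0.3, zprob = 0.2))  # 1: full support
```

chisq.test() then needs observed counts over the same categories; note that tabulate() ignores zeros, so something like table(factor(obs, levels = 0:size)) is needed to keep the zero category.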

Does anyone have any suggestions on how I can perform GOF tests on these weirdo distributions?

Thanks

Erika Mudrak



-------------------------------------------
Erika Mudrak
Graduate Student
Department of Botany
University of Wisconsin-Madison
430 Lincoln Dr
Madison WI, 53706
608-265-2191
mudrak at wisc.edu
#
Struve, Juliane wrote:
Juliane,

In general, you can learn about any error message by inspecting the code that
generated it.  To view the code for any function, enter the function name
(without trailing parentheses) at the command prompt.

In this specific case:
If you inspect the code for as.ltraj, you will find the following
expressions:

    rr <- any(unlist(lapply(res, function(x) (length(unique(x$date)) != 
        length(x$date)))))
    if (rr) 
        stop("non unique dates for a given burst")

'as.ltraj' compares the length of 'date' with the number of unique values in
'date' and stops with this error message when they differ (i.e., when 'date'
contains duplicates).
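The check is easy to reproduce on a toy burst (a sketch; the column names are made up):

```r
## Two relocations share the same timestamp, so the condition tested by
## as.ltraj -- length(unique(date)) != length(date) -- is TRUE.
d <- data.frame(id   = c("a", "a", "a"),
                date = as.POSIXct(c("2010-01-01 12:00:00",
                                    "2010-01-01 12:00:00",
                                    "2010-01-01 13:00:00"), tz = "UTC"))
length(unique(d$date)) != length(d$date)  # TRUE: duplicate dates present
```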

Glen Sargeant
#
> id=All_1646$"Fish_ID", typeII = TRUE)
>
> I get an error message
> "Error in as.ltraj(All_1646, date = All_1646$Datime, id =
> All_1646$Fish_ID,  : non unique dates for a given burst".
>
> However, the Datime column does include the time of observation as
> shown below. What am I doing wrong?


Hard to say without a reproducible example...
However, it seems that there are several problems in your code and data:
first, note that the first argument of the function (xy) should be a
data frame containing only the coordinates of the relocations of the
animal (at present it also contains the date, the time, a column named
Datime, etc.). Please check that your arguments are correct.

Now, concerning your message, it is likely that several relocations
have been collected at exactly the same moment (i.e. for a given
Fish_ID, there are probably at least two relocations with the same
value of Datime). You can verify it:

ta <- table(All_1646$Fish_ID, All_1646$Datime)
any(ta>1)

Look at this table: if any value is >1, there are probably two
relocations collected at the same date for a given animal.
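If any cell is >1, duplicated() on the (id, date) pair will show the offending rows directly (a sketch with made-up values):

```r
## Flag rows that repeat a (Fish_ID, Datime) pair already seen above them.
All <- data.frame(Fish_ID = c(1646, 1646, 1646),
                  Datime  = as.POSIXct(c("2009-05-01 10:00:00",
                                         "2009-05-01 10:00:00",
                                         "2009-05-01 10:00:05"), tz = "UTC"))
All[duplicated(All[, c("Fish_ID", "Datime")]), ]  # the duplicated relocation(s)
```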
HTH,


Clément Calenge.
3 days later
#
Dear Clement,

thank you very much for replying.

All_1646_ltrj <- as.ltraj(All_1646_test[, 1:2], date = All_1646_test$Date, id = All_1646_test$Fish_ID, typeII = TRUE)

works fine; All_1646_test[, 1:2] now contains only the coordinates that as.ltraj() is looking for.


However, I still get the error message about non-unique observations. The observations actually differ by a few seconds, but these small differences do not seem to be recognized. If I change the times manually so that the observations differ by an hour, the statement above works fine. Short time intervals of a few seconds are characteristic of my data. How can I deal with such data?

Many thanks for your advice and best wishes,

Juliane
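One possible explanation, offered only as a guess since the data are not shown: if the date argument has class Date (day resolution) rather than POSIXct, relocations taken seconds apart on the same day collapse into identical dates. A sketch of the difference:

```r
## Date truncates to whole days; POSIXct keeps the time of day.
stamps <- c("2009-05-01 10:00:00", "2009-05-01 10:00:05")
anyDuplicated(as.Date(stamps)) > 0                 # TRUE: the seconds are lost
anyDuplicated(as.POSIXct(stamps, tz = "UTC")) > 0  # FALSE: timestamps stay unique
```

Any truly simultaneous fixes that remain after the conversion would still have to be dropped or jittered before as.ltraj() accepts the burst.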