Has anyone considered the possible advantages of Hamiltonian Monte Carlo (http://www.cs.toronto.edu/~radford/) or Riemann manifold Langevin Monte Carlo (http://www.dcs.gla.ac.uk/inference/rmhmc/) in the context of mixed modeling? The latter paper refers to early work by Bates and Watts (1980), and the study of geometric aspects of regression goes back (at least) to the Ph.D. thesis of D.M. Bates, so perhaps readers of this list can provide some historical perspective.

A key advantage of HMC is that it avoids getting stuck in a Metropolis random walk by taking longer, more directed steps that still have a high probability of acceptance. The downside is that considerable tuning is required, and the new Riemann manifold method is advertised to address this by using the intrinsic geometry of the model to automate some of the manual fine tuning (at the considerable expense of computing Hessians). But it is never really established why an intrinsic geometric approach is practically relevant (and worth the expense). A similar concern was expressed in some of the comments following Bates and Watts (1980): to a geometer a squashed beer can is the same as a full one, and a donut is the same as a coffee cup. It may be very helpful conceptually to understand the underlying intrinsic geometric structure, but if this were all that mattered there would be no advantage to eigenvalue decompositions, QR factorizations, and other changes of variables that obviously have very practical consequences.

Thus another question is: what is the current applied view on the geometric ideas discussed in Bates and Watts (1980)? The same journal issue included a number of papers on the geometry of statistical inference, including one by Efron. My (possibly incorrect) impression is that well-designed adaptive coordinate transformations have overshadowed the use of intrinsic methods.
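To make the contrast with a random-walk Metropolis step concrete, here is a minimal sketch of an HMC transition (plain leapfrog integrator, identity mass matrix) for a toy standard-normal target. The step size and path length are illustrative values, not tuned ones, and the example is in Python only for self-containedness; the point is that the simulated Hamiltonian dynamics produce distant proposals that are still accepted with high probability:

```python
import math
import random

def U(q):
    # Negative log density of the toy target (standard normal), up to a constant
    return 0.5 * q * q

def grad_U(q):
    return q

def hmc_step(q, eps, L, rng):
    """One HMC transition: draw an auxiliary momentum, run L leapfrog
    steps of the Hamiltonian dynamics, then accept/reject with a
    Metropolis correction for the discretization error."""
    p = rng.gauss(0.0, 1.0)                 # auxiliary momentum variable
    current_H = U(q) + 0.5 * p * p

    q_new, p_new = q, p
    p_new -= 0.5 * eps * grad_U(q_new)      # initial half step for momentum
    for _ in range(L - 1):
        q_new += eps * p_new                # full step for position
        p_new -= eps * grad_U(q_new)        # full step for momentum
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)      # final half step for momentum

    proposed_H = U(q_new) + 0.5 * p_new * p_new
    if rng.random() < math.exp(current_H - proposed_H):
        return q_new, True                  # distant proposal accepted
    return q, False

rng = random.Random(0)
q, accepted, draws = 0.0, 0, []
for _ in range(5000):
    q, ok = hmc_step(q, eps=0.2, L=10, rng=rng)
    accepted += ok
    draws.append(q)

# Acceptance stays high even though each proposal moves roughly eps*L away,
# which is the "directed steps" advantage over a random walk.
print(round(accepted / 5000, 2))
```

In real mixed-model posteriors the gradient of U comes from the joint log density, and the tuning of eps and L (or the mass matrix) is exactly the burden that the Riemann manifold variants try to automate.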
Finally, it seems there is a link between HMC and mixed modeling, because the introduction of the momentum variables looks a lot like the introduction of random effects. Thanks, Dominick
Hamiltonian Monte Carlo and Riemann manifold Langevin Monte Carlo and Mixed Effects
4 messages · dave fournier, Dominick Samperi
1 day later
I implemented hybrid mcmc (Hamiltonian) in AD Model Builder's random effects module. So far as I know it has not been used much. It should be possible to access it for testing from Ben Bolker's glmmadmb package for R without too much trouble.
1 day later
On Sat, May 14, 2011 at 1:12 PM, dave fournier <davef at otter-rsch.com> wrote:
I implemented hybrid mcmc (Hamiltonian) in AD Model Builder's random effects module. So far as I know it has not been used much. It should be possible to access it for testing from Ben Bolker's glmmadmb package for R without too much trouble.
Thanks Dave, I just installed glmmADMB from R-Forge (Version 0.5-2) but found no information about Hybrid or Hamiltonian MCMC there. Dominick
On 11-05-15 07:43 PM, Dominick Samperi wrote:
Yes, well, that is more difficult. Now that we are open source you are dealing with a committee: I do the ADMB side, and Ben and others do the R side (I guess). I have been trying to get the R-side people to make the R scripts more flexible so that command-line options can be passed to the ADMB part. You can request this here, or, perhaps better so as not to clutter up the R list with unwelcome ADMB stuff, on the ADMB list.
On Sat, May 14, 2011 at 1:12 PM, dave fournier <davef at otter-rsch.com> wrote:
I implemented hybrid mcmc (Hamiltonian) in AD Model Builder's random effects module. So far as I know it has not been used much. It should be possible to access it for testing from Ben Bolker's glmmadmb package for R without too much trouble.
Thanks Dave, I just installed glmmADMB from R-Forge (Version 0.5-2) but found no information about Hybrid or Hamiltonian MCMC there. Dominick