Maximum nAGQ=25?
5 messages · Rafael Sauter, Ben Bolker, Ross Boylan
Rafael Sauter <rafael.sauter at ...> writes:
Dear R-sig-ME,

Since the beginning of this year the new lme4 version has been available on CRAN, with some major changes compared to older versions. I am still running the old lme4 version '0.999999.2'. I am surprised by one of the changes in the current new version '1.0-4': the GH approximation now allows a maximum of 25 quadrature points (nAGQ=25), whereas in the old version I did not encounter any such restriction on the number of quadrature points. As I did not find any discussion of this change in the new lme4 version, allow me to ask:

1) Why is 25 a reasonable upper bound for nAGQ? What were the reasons for implementing this upper bound? Is the increasing complexity mentioned in the details of '?glmer' the main reason for this?

2) Is this somehow related to the fact that, at the moment, nAGQ>1 is only available for models with a single, scalar random-effects term (as discussed here: https://stat.ethz.ch/pipermail/r-sig-mixed-models/2013q3/020573.html)? And will the maximum of nAGQ=25 stay in place in the future, when non-scalar random effects are implemented again?

I'd be glad for any hints and explanations about this issue. Thanks,
I will only speak for myself: other lme4-authors (especially Doug Bates) may chime in on this one.

I believe there isn't a rigorous argument for why >25 quadrature points is too many: ?glmer says "A model with a single, scalar random-effects term could reasonably use up to 25 quadrature points per scalar integral." For example, Figure 1 of Breslow, "Whither PQL?" (2003) shows trace plots of non-adaptive and adaptive GHQ (glmer uses adaptive GHQ) for one example -- the plots level off well before 20, which is the maximum shown in the plot. I think we would certainly be willing to reconsider this limit if you can show that there is some sensible case where it matters ...

This is unrelated to the issue of non-scalar random effects, which as I might have previously stated on this list (or not) is really a matter of finding time and energy to implement it within the new framework (patches welcome).

Ben Bolker
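[Editor's note: the leveling-off Ben describes can be illustrated numerically. The sketch below is an added Python illustration using numpy's standard Gauss-Hermite rule -- not R/lme4 -- applied to a hypothetical one-observation logit-normal integral; the parameter values are made up.]

```python
# Illustration only (numpy, not lme4): non-adaptive Gauss-Hermite
# approximation of a GLMM-style scalar integral,
#     E[expit(beta + sigma*U)],  U ~ N(0,1),
# showing it stabilizes well before nAGQ = 25 for a smooth integrand.
import numpy as np

def ghq_marginal(beta, sigma, n):
    """Approximate E[1/(1+exp(-(beta+sigma*U)))] with n Gauss-Hermite nodes."""
    t, w = np.polynomial.hermite.hermgauss(n)  # physicists' nodes/weights
    x = np.sqrt(2.0) * t                       # change of variable to N(0,1)
    vals = 1.0 / (1.0 + np.exp(-(beta + sigma * x)))
    return np.sum(w * vals) / np.sqrt(np.pi)

for n in (1, 2, 5, 10, 15, 20, 25):
    print(n, ghq_marginal(beta=0.5, sigma=1.0, n=n))
```

(Adaptive GHQ, which glmer uses, additionally recenters and rescales the nodes around the conditional mode, so it typically needs even fewer points than this non-adaptive version.)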
On Thu, Sep 26, 2013 at 04:23:47PM +0000, Ben Bolker wrote:
Rafael Sauter <rafael.sauter at ...> writes:
[snip]
1) Why is 25 a reasonable upper bound for nAGQ? What were the reasons to implement this upper bound?
[snip]
[snip]
I think we would certainly be willing to reconsider this limit if you can show that there is some sensible case where it matters ...
If the limit is hard-coded to 25, it will be hard to discover whether using 25 matters; that seems to me an argument for not hard-coding it. I suppose if the results had not stabilized by 25, that would be an indication. OTOH, 25 is a lot of quadrature points.

One problem I've encountered with a high number of quadrature points -- not in lmer, but I think it's a general issue -- is that as the number of quadrature points goes up, the extreme abscissa (x) values grow, and numerical problems such as underflow become more likely. Usually one can compensate by coding the likelihood defensively.

Ross Boylan
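[Editor's note: Ross's two numerical points can be made concrete. The added Python sketch below (numpy, not lmer) shows the largest Gauss-Hermite abscissa growing with the number of nodes, and one generic form of the defensive log-scale coding he alludes to -- a log-sum-exp over made-up per-observation log-likelihood values.]

```python
# Illustration only: (1) the largest Gauss-Hermite node grows with n,
# roughly like sqrt(2n), so integrands get evaluated ever deeper in the
# tails; (2) summing tiny likelihood terms on the log scale avoids the
# underflow that a naive sum of exp() values suffers.
import numpy as np

def max_abscissa(n):
    t, _ = np.polynomial.hermite.hermgauss(n)
    return t.max()

for n in (5, 25, 100):
    print(n, max_abscissa(n))       # largest node grows with n

def logsumexp(a):
    """Numerically stable log(sum(exp(a)))."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))

# Hypothetical per-observation log-likelihoods at an extreme node:
log_terms = np.array([-800.0, -801.0, -802.0])
naive = np.exp(log_terms).sum()     # exp(-800) underflows to 0.0
safe = logsumexp(log_terms)         # finite log of the same sum
print(naive, safe)
```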
Ross Boylan <ross at ...> writes:
On Thu, Sep 26, 2013 at 04:23:47PM +0000, Ben Bolker wrote:
Rafael Sauter <rafael.sauter <at> ...> writes:
[snip]
As I did not find any discussion about this change in the new lme4-version let me allow to ask: 1) Why is 25 a reasonable upper bound for nAGQ? What were the reasons to implement this upper bound? Is the increasing complexity as mentioned in the details of '?glmer' the the main reason for this?
[snip]
I will only speak for myself: other lme4-authors (especially Doug Bates) may chime in on this one. I believe there isn't a rigorous argument for why >25 quadrature points is too many: ?glmer says " A model with a single, scalar random-effects term could reasonably use up to 25 quadrature points per scalar integral."
[snip]
I think we would certainly be willing to reconsider this limit if you can show that there is some sensible case where it matters ...
If the limit is hard-coded to 25, it will be hard to discover if using 25 matters. That seems to me an argument for not hard coding it.
[snip]
As may have been reflected in discussions on the list, the lme4 authors have been having lots of internal discussion about how much flexibility to allow, when to give users helpful advice in the form of warnings, etc.

Not completely tongue-in-cheek, I could say that if you're capable of compiling a package from source, it's not very complicated to search for "nAGQ <= 25L" in R/modular.R and modify or remove this limitation for yourself ...

Ben Bolker
3 days later
Ben Bolker <bbolker at ...> writes:
Ross Boylan <ross <at> ...> writes:
On Thu, Sep 26, 2013 at 04:23:47PM +0000, Ben Bolker wrote:
Rafael Sauter <rafael.sauter <at> ...> writes:
[snip]
As I did not find any discussion about this change in the new lme4-version, allow me to ask: 1) Why is 25 a reasonable upper bound for nAGQ? What were the reasons to implement this upper bound? Is the increasing complexity mentioned in the details of '?glmer' the main reason for this?
[snip]
If the limit is hard-coded to 25, it will be hard to discover if using 25 matters. That seems to me an argument for not hard coding it.
[snip]
OK, Doug Bates has chimed in at https://github.com/lme4/lme4/issues/136 to point out that the current implementation of AGHQ is table-driven (see https://github.com/lme4/lme4/blob/master/R/GHrule.R); thus, the decision to limit the number of quadrature points is *not* arbitrary, and extending it is not just a matter of removing the test for nAGQ>25.

The table could be extended, or a new implementation could compute the table on the fly -- but for now this will probably go back down the priority list a bit unless someone demonstrates a really pressing need or sends us a pull request ...

Ben Bolker
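[Editor's note: "computing the table on the fly" is standard numerical analysis -- the Golub-Welsch algorithm recovers Gauss-Hermite nodes and weights from the eigen-decomposition of a tridiagonal Jacobi matrix. The Python sketch below is an added illustration of that idea only; it is not lme4's GHrule implementation.]

```python
# Illustration of the Golub-Welsch construction of a Gauss-Hermite rule:
# nodes are the eigenvalues of the symmetric tridiagonal Jacobi matrix
# (zero diagonal, off-diagonal sqrt(k/2)); weights are mu_0 times the
# squared first components of the eigenvectors, mu_0 = integral of
# exp(-t^2) = sqrt(pi).
import numpy as np

def gauss_hermite(n):
    """Nodes/weights for integrating f(t)*exp(-t^2) dt (physicists' form)."""
    k = np.arange(1, n)
    J = np.diag(np.sqrt(k / 2.0), 1)          # superdiagonal of Jacobi matrix
    J = J + J.T                               # symmetric tridiagonal
    nodes, vecs = np.linalg.eigh(J)
    weights = np.sqrt(np.pi) * vecs[0] ** 2   # mu_0 * (first eigvec comp)^2
    return nodes, weights

t, w = gauss_hermite(25)
print(w.sum())    # total weight equals sqrt(pi)
```

This reproduces numpy's tabulation-free `hermgauss` result, so a rule of any order could in principle be generated on demand rather than read from a fixed table.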