
Optimization of large nonlinear models

In my experience, statistical fitting problems are typically
compute-bound (CPU) rather than memory-bound; again, speaking only from my
experience, having too *little* memory will cause severe problems, but
having more memory than necessary doesn't help.

Usually the work has to go into speeding up the objective function:

- Providing gradients of the objective function (either analytically or by
  autodiff) can make a huge difference (e.g. see the RTMB package; [R]TMB
  is heavily used in fisheries, FWIW).
- You might be able to parallelize the objective-function computations.
  Parallelized optimization algorithms do exist (e.g. Klein and Neira 2014),
  but I don't know whether anyone has implemented them in R.
- Translating the objective function into C++ etc. (possibly with threaded
  computation using OpenMP) can also help.
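To illustrate the first point, here is a minimal sketch of supplying an
analytic gradient to optim() in base R; the data, model, and starting
values are made up for illustration (RTMB would instead compute the
gradient for you by autodiff):

```r
## Simulated data: y = a * exp(-b * x) + noise, with a = 2, b = 0.3
set.seed(1)
x <- seq(0, 10, length.out = 200)
y <- 2 * exp(-0.3 * x) + rnorm(200, sd = 0.05)

## Sum-of-squares objective
nll <- function(p) {
  mu <- p[1] * exp(-p[2] * x)
  sum((y - mu)^2)
}

## Analytic gradient of the objective w.r.t. (a, b)
grad <- function(p) {
  e <- exp(-p[2] * x)
  r <- y - p[1] * e
  c(-2 * sum(r * e),              # d/da
    2 * sum(r * p[1] * x * e))    # d/db
}

## Passing gr= lets BFGS skip finite-difference gradient evaluations
fit <- optim(c(1, 0.1), nll, gr = grad, method = "BFGS")
fit$par  # estimates should land near c(2, 0.3)
```

For an expensive objective, skipping the extra function evaluations that
finite differencing requires is where the speedup comes from.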
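And a sketch of the second point: even without a parallel optimizer, the
objective itself can be split across cores with the base parallel package.
The block structure and core count here are arbitrary; mclapply() uses
forking, which is unavailable on Windows, hence the fallback:

```r
library(parallel)

set.seed(1)
x <- seq(0, 10, length.out = 2000)
y <- 2 * exp(-0.3 * x) + rnorm(2000, sd = 0.05)

## Split the observations into independent blocks whose contributions sum
blocks <- split(seq_along(x), cut(seq_along(x), 4))

## Forking is not supported on Windows, so fall back to serial there
n_cores <- if (.Platform$OS.type == "windows") 1L else 2L

nll_parallel <- function(p) {
  pieces <- mclapply(blocks, function(idx) {
    mu <- p[1] * exp(-p[2] * x[idx])
    sum((y[idx] - mu)^2)
  }, mc.cores = n_cores)
  Reduce(`+`, pieces)
}

nll_serial <- function(p) sum((y - p[1] * exp(-p[2] * x))^2)
## nll_parallel can then be handed to optim() in place of nll_serial
```

This only pays off when each block's computation is expensive relative to
the forking overhead.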
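For the third point, a minimal sketch of moving the hot loop to C++ via
Rcpp (this assumes the Rcpp package and a compiler toolchain are
available; the model is the same illustrative one as above):

```r
## Compile a C++ version of the sum-of-squares objective
Rcpp::cppFunction('
double nll_cpp(NumericVector p, NumericVector x, NumericVector y) {
  double a = p[0], b = p[1], s = 0.0;
  for (R_xlen_t i = 0; i < x.size(); ++i) {
    double r = y[i] - a * std::exp(-b * x[i]);
    s += r * r;
  }
  return s;
}')

set.seed(1)
x <- seq(0, 10, length.out = 200)
y <- 2 * exp(-0.3 * x) + rnorm(200, sd = 0.05)
nll_cpp(c(2, 0.3), x, y)  # same value as the pure-R sum of squares
```

Adding OpenMP pragmas to such a loop is then a small further step.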

Klein, Kyle, and Julian Neira. 2014. "Nelder-Mead Simplex Optimization
Routine for Large-Scale Problems: A Distributed Memory
Implementation." *Computational
Economics* 43 (4): 447-61. https://doi.org/10.1007/s10614-013-9377-8.

I'm not sure those suggestions address your problem, but that's my best
guess based on what you've told us.

On Fri, Dec 26, 2025 at 5:01 AM Ruben Roa Ureta via R-help <
r-help at r-project.org> wrote:
