
Basic Mean Variance Optimization

I have not seen a reply to this, so I will offer a comment. 


      First, I recommend you not worry about computer time until you 
actually know you have a problem with that.  Focus first on getting 
something to do what you want.  Then if it takes too long to compute, 
think about how to get it to run faster.  Often in R, a little thought 
and experimentation can yield big improvements in performance without 
coding anything in a compiled language like C.  If that fails, before 
you run to C, it might be wise to use tools like Rprof or system.time to 
understand which parts of your code take the most time.  For example, 
if most of the time is spent waiting for a database query to complete, 
rewriting that code in C will only give you something that is harder to 
maintain, with no substantive improvement in performance.  Beyond that, 
there is the "QuantLib" project (http://quantlib.org/index.shtml) and 
the RQuantLib package, which interfaces to some of the QuantLib code. 
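      To illustrate the profiling step, here is a minimal sketch (the 
function f below is a made-up stand-in for your own computation, not 
code from your post):

```r
## Toy computation to profile: build a covariance matrix and solve a
## linear system, as a stand-in for whatever your real bottleneck is.
f <- function(n) {
  S <- crossprod(matrix(rnorm(n * n), n, n)) / n  # positive-definite toy matrix
  r <- rnorm(n)
  solve(S, r)
}

print(system.time(f(500)))           # wall-clock and CPU time for one call

Rprof("profile.out")                 # start the sampling profiler
invisible(f(500))
Rprof(NULL)                          # stop profiling
summaryRprof("profile.out")$by.self  # which functions dominate the time
```

If summaryRprof shows the time going into something like a database 
call rather than numerical work, C will not help.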


      Have you seen Wuertz et al. (2009), Portfolio Optimization with 
R/Rmetrics (www.rmetrics.org)?  


      The computation of a variance-optimal portfolio might involve 
computing something like solve(S, r), where r is a vector of estimated 
log returns on the different assets and S is the corresponding 
covariance matrix.  However, with hundreds of assets, you need a stable 
way to estimate S to avoid problems with degeneracy.  You could do that 
with a singular factor analysis algorithm.  The estimates could be 
updated regularly using the binomial inverse theorem 
(http://en.wikipedia.org/wiki/Binomial_inverse_theorem), which is a 
standard part of a traditional Kalman filter algorithm and could be 
used to update r, S, and S-inverse as the latest information becomes 
available. 
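      As a rough sketch of the two pieces of that paragraph (this is 
illustrative only, with toy data, not a production estimator): the 
weights proportional to solve(S, r), and a rank-one update of S-inverse 
via the binomial inverse (Sherman-Morrison) identity,
(S + u u')^{-1} = S^{-1} - S^{-1} u u' S^{-1} / (1 + u' S^{-1} u):

```r
set.seed(1)
n <- 5                                            # small for illustration
X <- matrix(rnorm(200 * n, sd = 0.01), ncol = n)  # toy log-return history
r <- colMeans(X)                                  # estimated mean log returns
S <- cov(X)                                       # sample covariance

w <- solve(S, r)    # proportional to the unconstrained mean-variance weights
w <- w / sum(w)     # normalize so the weights sum to 1

## Rank-one update of S-inverse for a new (centered) return vector u,
## using the Sherman-Morrison form of the binomial inverse theorem.
Sinv <- solve(S)
u    <- rnorm(n, sd = 0.01)
Su   <- Sinv %*% u
Sinv_new <- Sinv - (Su %*% t(Su)) / drop(1 + crossprod(u, Su))
```

Recomputing solve(S + tcrossprod(u)) from scratch gives the same matrix 
up to rounding, but the update costs O(n^2) instead of O(n^3), which 
matters with hundreds of assets.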


      Hope this helps. 
      Spencer
burke nersesian wrote: