
improve efficiency of a loop

Dear All:

I need advice about efficient looping/vectorization. I am trying to
bootstrap a regression model with one lag of the dependent variable on
the RHS. Specifically, let error^b_(t) be the bootstrapped error at
time (t) of the regression y_(t) = gamma y_(t-1) + beta x_(t) + error_(t),
where y_(t) is the original dependent variable and y^b_(t) is the
bootstrapped y_(t) built from the parameter estimates gamma and beta.
My basic procedure is like this:
1. Get the first y^b value using the original y_(1):
y^b_(2) = gamma y_(1) + beta x_(2) + error^b_(2)

2. Get the other y^bs:

y^b_(3) = gamma y^b_(2) + beta x_(3) + error^b_(3)
...
y^b_(T) = gamma y^b_(T-1) + beta x_(T) + error^b_(T)
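To make the two steps concrete, here is the recursion with made-up
numbers (gamma, beta, the x's, the errors, and y_(1) are all invented,
purely for illustration):

```r
# Toy example of the recursion above.  gamma, beta, x, the errors,
# and y_(1) are all made-up numbers, purely for illustration.
gamma <- 0.5
beta  <- 2
x   <- c(NA, 1.0, 1.5, 2.0, 2.5)   # x_(2)..x_(T); x_(1) is never used
e.b <- c(NA, 0.1, -0.2, 0.3, 0.0)  # bootstrapped errors error^b_(t)
y1  <- 1                           # the original y_(1)

nT  <- length(x)
y.b <- numeric(nT)
y.b[2] <- gamma * y1 + beta * x[2] + e.b[2]    # step 1: uses original y_(1)
for (t in 3:nT)                                # step 2: uses previous y^b
  y.b[t] <- gamma * y.b[t - 1] + beta * x[t] + e.b[t]
y.b[-1]   # y^b_(2) .. y^b_(T)
```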

However, my approach, which uses a loop similar to the one below, is
extremely slow. In my actual application the observations are indexed
over both time and cross-sections, but I thought it simpler to ask my
question considering only one source of variation.
Let's suppose that the dataset looks like this:
The parameters are:
Please, for my question to make any sense, imagine that my bootstrapped
errors are identical to the 'res' above, although of course, in
practice they will not be; I just want to keep things simple here:

I first get the first value, y^b_(2), using y_(1):
And then fill in the rest of the values using this loop:
My problem is that this becomes very slow (painfully slow when I am
actually bootstrapping) once I have many observations or more than
one index (hence more looping levels) in the dependent variable. I
tried to use lapply, but it does not seem adequate for a recursive
situation such as the one above. Any hints?
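For reference, in the one-index toy case the recursion is a first-order
recursive filter, so stats::filter() with method = "recursive" can
compute the whole series without an explicit loop. A minimal sketch
with made-up numbers (I am not sure how to extend this cleanly to the
multi-index case, which is the situation I am really in):

```r
# The recursion y^b_(t) = gamma*y^b_(t-1) + (beta*x_(t) + error^b_(t))
# is a first-order recursive filter, so stats::filter() can compute
# the whole series at once.  All numbers are made up for illustration.
gamma <- 0.5
beta  <- 2
x   <- c(1.0, 1.5, 2.0, 2.5)   # x_(2)..x_(T)
e.b <- c(0.1, -0.2, 0.3, 0.0)  # bootstrapped errors error^b_(2)..error^b_(T)
y1  <- 1                       # original y_(1), supplied as the initial value

drive <- beta * x + e.b
y.b   <- stats::filter(drive, filter = gamma,
                       method = "recursive", init = y1)
as.numeric(y.b)   # y^b_(2)..y^b_(T), same values the loop produces
```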

I appreciate any help,

Nelson Villoria
Message-ID: <9379458d0905301207y58041ba9kd65b296b92e1769b@mail.gmail.com>
In-Reply-To: <9379458d0905301203p7df7a9e4u99de53970875dee7@mail.gmail.com>