Memory issue with svm modeling in R
Well, I'm no expert on these topics, but if it's 2.7 GB and R can use at most 2 GB, then the easiest solution would be to give R more memory. Did you read through help(memory.size), as the error suggested? Try calling memory.size(TRUE) or memory.limit(3000) and see whether that works. I don't have any experience with either RStudio or Amazon's services. The local system seems to be Windows, so the above might work there; on the other setup you might need to change the memory limit when starting the console.
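A minimal sketch of what that looks like on a Windows R console (memory.size and memory.limit are Windows-only; sizes are in MB — the 3000 here is just an illustrative value, not a recommendation):

```r
# Inspect current memory usage and limits (Windows-only functions)
memory.size()            # MB currently in use by R
memory.size(max = TRUE)  # maximum MB obtained from the OS so far
memory.limit()           # current limit in MB

# Raise the limit to ~3 GB (requires a 64-bit OS / enough RAM)
memory.limit(size = 3000)
```

Alternatively, the limit can be set at startup, e.g. `Rgui.exe --max-mem-size=3000M`.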
On 22.10.2012, at 10:18, Vignesh Prajapati wrote:
Hello Jessica,

Thanks for informing me of this, and sorry for the inconvenience. I have attached two files:
1. crash.png - for the issue with the Amazon instance
2. localmachine_error.bmp - for the issue with the local machine

Thanks

On Mon, Oct 22, 2012 at 1:42 PM, Jessica Streicher <j.streicher at micromata.de> wrote:

Hello Vignesh, we did not get any attachments; maybe you could upload them somewhere?

On 19.10.2012, at 09:46, Vignesh Prajapati wrote:
After running into memory problems on both my local machine and an Amazon micro instance while
building an SVM model in R on a large dataset (201,478 rows with 11 variables),
I migrated our micro instance to a large instance at Amazon. I still
hit a memory error on the large Amazon instance when building the R model on
this dataset because of its size. I have attached screenshots of the error on the
local machine (localmachine_error.bmp) and the Amazon instance (crash.png) with
this post.
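For reference, a common workaround at this scale is to fit on a random subsample first, since a kernel SVM's memory use grows roughly quadratically with the number of rows. A minimal sketch, assuming the e1071 package and a hypothetical data frame `dat` with a factor response column `y` (neither named in the original post):

```r
library(e1071)  # assumed SVM package; kernlab is an alternative

# 'dat' is a hypothetical data frame with ~200k rows and factor response 'y'
set.seed(42)
idx <- sample(nrow(dat), 20000)              # subsample to keep the kernel matrix small
fit <- svm(y ~ ., data = dat[idx, ])         # fit on the subsample only
pred <- predict(fit, newdata = dat[-idx, ])  # evaluate on the held-out rows
```

If the full dataset must be used, a linear SVM (e.g. via the LiblineaR package) is usually far less memory-hungry than a kernel SVM.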
Issue on local Machine ::
[image: localmachine_error.bmp]
Issue on Amazon large Instance ::
[image: crash.png]
Can anyone suggest a solution to this issue?
Thanks
Vignesh
______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.