Message-ID: <D315E966-EC2F-4AB5-B818-0D7F8B9B7AF2@t-online.de>
Date: 2013-11-04T22:57:10Z
From: Simon Pickert
Subject: speed issue: gsub on large data frame
Hi R'lers,
I'm running into speed issues performing a bunch of
"gsub(patternvector, [token], dataframe$text_column)"
calls on a data frame containing >4 million entries.
(The 'patternvectors' contain up to 500 elements each.)
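For concreteness, here's a minimal sketch of the kind of thing I'm doing (the data and pattern names are made up, and I collapse each pattern vector into one alternation since gsub only uses a single pattern per call):

```r
# Toy stand-in for my real data frame (>4 million rows)
df <- data.frame(text_column = c("foo bar text", "more baz text"),
                 stringsAsFactors = FALSE)

# One of ~20 pattern vectors; the real ones have up to 500 elements
patternvector <- c("foo", "bar", "baz")

# Collapse the vector into a single alternation and replace in one gsub;
# this whole step is repeated for each of the ~20 pattern vectors
df$text_column <- gsub(paste(patternvector, collapse = "|"),
                       "[token]", df$text_column)
```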
Is there any better/faster way than running around 20 gsub commands in a row?
Thanks!
Simon