speed issue: gsub on large data frame
Hi R-helpers, I'm running into speed issues performing a bunch of "gsub(patternvector, [token], dataframe$text_column)" calls on a data frame containing >4 million entries. (The pattern vectors contain up to 500 elements each.) Is there any better/faster way than performing about 20 gsub commands in a row? Thanks! Simon
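For context, here is a minimal sketch of the kind of setup described above. All names (patternvectors, tokens, dataframe) are illustrative, not from the original post. Each pattern vector is collapsed with "|" into a single alternation regex, so each of the ~20 replacement passes is one gsub() call rather than a loop over 500 individual patterns:

```r
## Hypothetical example data; the real vectors hold up to 500 patterns
## and the real data frame has >4 million rows.
patternvectors <- list(
  c("colours", "colour"),   # longer alternatives first, so they match first
  c("centres", "centre")
)
tokens <- c("[colour]", "[centre]")

dataframe <- data.frame(
  text_column = c("the colour of the centre"),
  stringsAsFactors = FALSE
)

## One gsub() per token: collapse each pattern vector into a single
## alternation regex instead of calling gsub() once per pattern.
for (i in seq_along(patternvectors)) {
  pattern <- paste(patternvectors[[i]], collapse = "|")
  dataframe$text_column <- gsub(pattern, tokens[[i]],
                                dataframe$text_column, perl = TRUE)
}
```

Note that gsub() is not vectorized over its pattern argument (a vector of patterns only uses the first element, with a warning), which is why the collapse step is needed. perl = TRUE (PCRE) is often faster than the default regex engine for long alternations, and fixed = TRUE is faster still when the patterns are literal strings rather than regular expressions.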