Message-ID: <1386684025.92326.YahooMailNeo@web142603.mail.bf1.yahoo.com>
Date: 2013-12-10T14:00:25Z
From: arun
Subject: Datatable manipulation
In-Reply-To: <CAOdnBQcGVAeD_7S_6VmDZicHj2s_iuPPLJPdjUWaES=7vttmgw@mail.gmail.com>
Hi,
Check these links:
http://stackoverflow.com/questions/19626534/r-issue-error-in-sqliteexecstatementcon-statement-bind-data-no-such-tabl
http://r.789695.n4.nabble.com/Problem-with-SQLDF-Error-in-sqliteExecStatement-con-statement-bind-data-RS-DBI-driver-error-in-state-td4621931.html
A.K.
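[Editor's note: the gist of those links, as a hedged base-R sketch. sqldf() looks up table names in the environment of its caller, so a data frame visible at the console can be invisible when the sqldf call runs inside a nested function; passing the data frame in as an argument is the usual fix. The names f, g, and table_name below are illustrative, not from the thread.]

```r
library(sqldf)  # assumes the sqldf package is installed

# 'dat' exists in f()'s own environment, so sqldf() can find it
# no matter how deeply f() is nested inside other functions.
f <- function(dat) {
  sqldf("select * from dat where a > 1")
}

g <- function() {
  table_name <- data.frame(a = 1:3, b = 4:6)
  f(table_name)   # works even though f() is called from inside g()
}
g()
```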
On Tuesday, December 10, 2013 6:54 AM, Nitisha jha <nitisha999 at gmail.com> wrote:
Hi,
I have a doubt after a long time :). I have a function containing sqldf statements. It works fine when I call it from the console, but when I call that function from within another function, it gives me this error:

Error in sqliteExecStatement(con, statement, bind.data) :
  RS-DBI driver: (error in statement: no such table: table_name)
What is wrong here?
On Fri, Nov 22, 2013 at 7:27 PM, arun <smartpink111 at yahoo.com> wrote:
>
>Hi,
>Assuming that this is the case:
>
>dat1 <- read.table(text="a b c d e
>1 2 3 4 5
>10 9 8 7 6", sep="", header=TRUE)
>
>Names1 <- read.table(text="Original New
>e ee
>g gg
>a aa
>c cc
>f ff", sep="", header=TRUE, stringsAsFactors=FALSE)
>
>indx <- match(names(dat1), Names1[,1])
>names(dat1)[names(dat1) %in% Names1[,1]] <- Names1[,2][indx[!is.na(indx)]]
>dat1
>#  aa b cc d ee
>#1  1 2  3 4  5
>#2 10 9  8 7  6
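[Editor's note: a usage sketch, base R only, confirming that the partial rename in the quoted code keeps unmatched column names unchanged. Same dat1/Names1 as the quoted example; the ifelse() variant is an equivalent alternative, not the author's code.]

```r
# Same example data as in the quoted code
dat1 <- read.table(text="a b c d e
1 2 3 4 5
10 9 8 7 6", header = TRUE)

Names1 <- data.frame(Original = c("e", "g", "a", "c", "f"),
                     New      = c("ee", "gg", "aa", "cc", "ff"),
                     stringsAsFactors = FALSE)

# Look up each current name in the rename table; keep the old
# name wherever there is no match (idx is NA there)
idx <- match(names(dat1), Names1$Original)
names(dat1) <- ifelse(is.na(idx), names(dat1), Names1$New[idx])
names(dat1)   # "aa" "b" "cc" "d" "ee"
```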
>
>
>A.K.
>
>
>On Friday, November 22, 2013 4:46 AM, Nitisha jha <nitisha999 at gmail.com> wrote:
>
>Hey! I got this one. :)
>For the match function, actually I just want the names that match to be replaced; the rest should stay the same. How do I do that? When I tried your command, if there is no match, it writes var2 or something.
>
>>>>On Fri, Nov 22, 2013 at 12:38 AM, arun <smartpink111 at yahoo.com> wrote:
>>>>
>>>>
>>>>>
>>>>>Hi,
>>>>>Try:
>>>>>
>>>>>dat1 <- read.table(text="a b c d e
>>>>>1 2 3 4 5
>>>>>10 9 8 7 6", sep="", header=TRUE)
>>>>>
>>>>>Names1 <- read.table(text="Original New
>>>>>e ee
>>>>>b bb
>>>>>a aa
>>>>>c cc
>>>>>d dd", sep="", header=TRUE, stringsAsFactors=FALSE)
>>>>>
>>>>>It is better to dput() your dataset. For example:
>>>>>dput(Names1)
>>>>>structure(list(Original = c("e", "b", "a", "c", "d"), New = c("ee",
>>>>>"bb", "aa", "cc", "dd")), .Names = c("Original", "New"), class = "data.frame", row.names = c(NA,
>>>>>-5L))
>>>>>
>>>>>names(dat1) <- Names1[,2][match(names(dat1), Names1[,1])]
>>>>>dat1
>>>>>#  aa bb cc dd ee
>>>>>#1  1  2  3  4  5
>>>>>#2 10  9  8  7  6
>>>>>A.K.
>>>>>
>>>>>On Thursday, November 21, 2013 1:45 PM, Nitisha jha <nitisha999 at gmail.com> wrote:
>>>>>
>>>>>Hi,
>>>>>
>>>>>Thanks. I used as.character() and got the right strings.
>>>>>
>>>>>Btw, I still have a lot of gaps in my R knowledge.
>>>>>
>>>>>I have to rename the columns (I have 22 columns here). I have the new names along with the original names in another dataset. Right now I am hardcoding all 19 name changes (tedious and not optimal); the first 3 names remain the same. I will give you a sample dataset. Let me know if there is an easy way of doing this. Pardon the displaced column labels.
>>>>>
>>>>>Original dataset:
>>>>>
>>>>>   a  b  c  d  e
>>>>>   1  2  3  4  5
>>>>>  10  9  8  7  6
>>>>>
>>>>>Dataset for name change:
>>>>>
>>>>>Original  New
>>>>>e         ee
>>>>>b         bb
>>>>>a         aa
>>>>>c         cc
>>>>>d         dd
>>>>>
>>>>>I want my final dataset to be like this:
>>>>>
>>>>>aa bb cc dd ee
>>>>>1 2 3 4 5
>>>>>10 9 8 7 6
>>>>>
>>>>>Could you tell me an optimal way to do it? My method is tedious and not good.
>>>>>
>>>>>Also, is there a way to import .xls without perl (windows)?
>>>>>
>>>>>
>>>>>Thanks for being patient. :)
>>>>>
>>>>
>>>
>>
>