
Help to download data from multiple URLs with API key

3 messages · Eric Berger, Bhaskar Mitra

Hello Everyone,

I am trying to download data from multiple websites using an API key.
The code to download from one URL is given below.

I have a list of multiple URLs where only the suffix 'c' changes.

I would appreciate any help on how I can modify the code below so that
it reads multiple URLs and saves the data from each URL as a separate
CSV file.

thanks,
bhaskar


#-----------------------------------------------------------------------
library(rjson)
setwd(Input)               # 'Input' should hold the path to the working directory

base_url <- "abcd"         # This remains constant

b <-  "api_key"            # the api key - this remains constant

c <-  "series_id=1"        # Only this suffix changes; I have a list of
                           # such URLs with different series IDs.

full_url <- paste0(base_url, b, c)


d3 <- lapply(fromJSON(file=full_url)[[2]], function(x) c(x["data"]))
d3 <- do.call(rbind, d3)

b <- as.data.frame(unlist(d3))
write.csv(b)

#-----------------------------------------------------------------------
Hi Bhaskar,
Why not just create a function that does the repetitive work, such as

doOne <- function( suffix ) {
   base_url <- "abcd"         # This remains constant
   b <- "api_key"             # the api key - this remains constant
   c <- paste0("series_id=", suffix)
   full_url <- paste0(base_url, b, c)
   d3 <- lapply(fromJSON(file=full_url)[[2]], function(x) c(x["data"]))
   d3 <- do.call(rbind, d3)
   b <- as.data.frame(unlist(d3))
   write.csv(b)
}

Then,
suffixes <- ... (whatever)
for ( s in suffixes )
    doOne( s )

You might also want to think about the filenames to use in the
write.csv() call inside doOne.
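One way to handle the filenames is to derive each one from the series id itself. The sketch below is only an illustration: "abcd", "api_key", and the series ids are placeholders standing in for the real base URL, key, and id list.

```r
library(rjson)

# Download one series and save it to its own CSV file.
# 'base_url' and 'api_key' are placeholders for the real values.
doOne <- function(suffix,
                  base_url = "abcd",
                  api_key  = "api_key") {
  full_url <- paste0(base_url, api_key, "series_id=", suffix)
  d3 <- lapply(fromJSON(file = full_url)[[2]], function(x) c(x["data"]))
  d3 <- do.call(rbind, d3)
  out <- as.data.frame(unlist(d3))
  # Name each file after its series id, e.g. "series_1.csv"
  write.csv(out, file = paste0("series_", suffix, ".csv"), row.names = FALSE)
}

series_ids <- c(1, 2, 3)   # replace with your actual list of series ids
for (s in series_ids) doOne(s)
```

Passing the constants as default arguments keeps the loop body to a single call while still letting you override the base URL or key if they ever change.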

HTH,
Eric


On Tue, Apr 21, 2020 at 9:30 AM Bhaskar Mitra <bhaskar.kolkata at gmail.com>
wrote:

  
  
Hi Eric,

Thanks for your help. This is really helpful.
I have also adjusted the code so that the filename includes the
base URL and suffix as strings.

thanks,
bhaskar
On Tue, Apr 21, 2020 at 12:16 AM Eric Berger <ericjberger at gmail.com> wrote: