I'm making some API calls to download weather data from an online provider. The site itself is private, so I cannot share it. I'm using jsonlite::fromJSON to get the data as a table:
> head(fromJSON(web_url)$hourly$data)
        time       summary                icon precipIntensity precipProbability temperature
1 1515711600         Clear         clear-night               0                 0       -0.42
2 1515715200 Partly Cloudy partly-cloudy-night               0                 0       -0.18
3 1515718800 Partly Cloudy partly-cloudy-night               0                 0       -0.22
4 1515722400 Partly Cloudy partly-cloudy-night               0                 0       -0.08
5 1515726000 Partly Cloudy partly-cloudy-night               0                 0       -0.28
6 1515729600 Partly Cloudy partly-cloudy-night               0                 0       -0.10
  apparentTemperature dewPoint humidity pressure windSpeed windBearing cloudCover visibility
1               -0.42    -2.73     0.84  1016.00      0.48         346       0.00      12.34
2               -0.18    -3.52     0.78  1016.32      0.34         308       0.31      12.34
3               -0.22    -3.52     0.78  1015.90      1.03         338       0.31      12.34
4               -0.08    -3.49     0.78  1016.20      0.95          48       0.31      12.62
5               -0.28    -3.43     0.79  1016.56      0.19         129       0.36      12.97
6               -0.10    -3.60     0.77  1016.40      0.81          40       0.31      12.34
When I make standalone calls it works fine, but when I try to use lapply to get the data for a longer period of time, it does not work.
lapply(c("data.table", "lubridate", "httr", "jsonlite"),
library, character.only = T, quietly = T)
lastyr <- as.POSIXct("2018-01-01 00:00", format = "%Y-%m-%d %H:%M", tz = "UTC")
ndays <- seq.POSIXt(from = lastyr, to = lastyr + days(366), by = "day")
history_list <- lapply(ndays, function(tstamp){
web_url <- urlCreator(tstamp)
return(fromJSON(web_url)$hourly$data)
})
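For context, urlCreator just builds the request URL for a given timestamp. The real endpoint is private, so the host, path, key, and coordinates in this sketch are placeholders, not the actual values:

urlCreator <- function(tstamp){
  # NOTE: placeholder host, key and coordinates; the real service is private
  paste0("https://api.example.com/history/SOME_API_KEY/",
         "45.50,-73.57,", as.integer(tstamp),
         "?units=si")
}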
The real urlCreator outputs a string along the lines of the sketch above, which is the custom URL used to access the API. When I run the above code, I get an error message (generally around 40 iterations in, but it is not consistent):
Error in open.connection(con, "rb") : schannel: next InitializeSecurityContext failed: SEC_E_BUFFER_TOO_SMALL (0x80090321) - Le cache fourni à une fonction était insuffisant.
(The French part translates to "The buffer supplied to a function was too small.")
Am I running out of memory? There aren't any other objects saved in my workspace, so I'm not sure why that would happen. In the past I've used httr::GET() and httr::content() to get all of the JSON data from the URL and then parsed it to get what I need. fromJSON seemed like an easier way until I ran into this.
EDIT:
Older code (using httr only):
json <- GET(web_url)
if(json$status_code == 200){
  cont <- content(json)   # parsed JSON as a nested list
  cd <- cont$hourly$data
  # cd is a list; listConvert is a function to go through the list
  # and convert it to a data.table
  return(rbindlist(lapply(seq_along(cd), listConvert),
                   use.names = TRUE, fill = TRUE))
}
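The return() there sits inside a wrapper function; calling that wrapper getHourly purely for illustration, it can be dropped into the same lapply loop as the fromJSON version:

history_list <- lapply(ndays, function(tstamp){
  getHourly(urlCreator(tstamp))  # getHourly = the GET()/content()/rbindlist() snippet above
})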