Download time-series data for a given coordinates #8
Comments
No, and this isn't the goal of the APIs, I fear. They are meant to query gridded data explicitly. Although you could do this by looping over a custom query you adjust each time, you will run into performance issues quickly. In general, queries are quite slow. This isn't a problem if you end up downloading a lot of data, but from my experience with the unit checks of the code I noticed that query size (mainly spatial extent) isn't the limiting factor in terms of speed; i.e. querying, say, all of Germany or a single location in Germany may take the same amount of time. Subsequent queries aren't necessarily faster either. So it is better to download larger gridded data and subset the locations from it. If you cover multiple areas separated by larger regions of no interest, divide the queries into meaningful sections: for example, the EU and the conterminous US, or Spain and Germany. Hope this helps.
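As a hedged sketch of the "one large gridded download" approach (assuming the ecmwfr package with a CDS key already registered via `wf_set_key()`; the dataset, variable, bounding box, and target file name here are illustrative, not from the original thread):

```r
# Sketch only: assumes ecmwfr is installed and a CDS API key has been
# registered with wf_set_key(). Dataset/variable/area values are examples.
library(ecmwfr)

request <- list(
  dataset_short_name = "reanalysis-era5-pressure-levels",
  product_type   = "reanalysis",
  variable       = "temperature",
  pressure_level = "850",
  year  = "2019",
  month = "01",
  day   = sprintf("%02d", 1:31),    # all days of the month
  time  = sprintf("%02d:00", 0:23), # all hours
  area  = c(55, 5, 47, 15),         # N, W, S, E: roughly Germany
  format = "netcdf",
  target = "era5_germany.nc"
)

# One large gridded download; subset point locations locally afterwards.
# wf_request(request = request, user = "your_cds_user_id", path = "~")
```

The point of the single large `area` box is exactly what the comment above describes: one slow queued request instead of many equally slow per-point requests.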
What about downloading a gridded time-series dataset as NetCDF? I tried, without success, to download data for two days by adding `"day" = "04:05"` to the request list: `request <- list("dataset" = "reanalysis-era5-pressure-levels",`
Your syntax isn't correct: use proper R list elements and syntax.
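A hedged guess at the intended correction: in R, multiple days are passed as a character vector, not a colon-separated string (the other request values below are illustrative placeholders):

```r
# "day" = "04:05" is not a valid value; an R character vector is the
# idiomatic way to request several days. Other fields are examples only.
request <- list(
  dataset_short_name = "reanalysis-era5-pressure-levels",
  product_type   = "reanalysis",
  variable       = "geopotential",
  pressure_level = "500",
  year  = "2019",
  month = "04",
  day   = c("04", "05"),  # two days as a vector, not "04:05"
  time  = "12:00",
  format = "netcdf",
  target = "two_days.nc"
)
```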
To recap: if you want a time series, create a query covering the correct months, days, and years, then subset using the raster or stars library in R, as mentioned above. This should give you the spatial and temporal coverage you need.
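A minimal sketch of that subsetting step with the raster package, assuming a gridded NetCDF file has already been downloaded (the file name and coordinates are hypothetical):

```r
# Sketch: extract a per-time-step series at point coordinates from a
# previously downloaded NetCDF file. File name and points are placeholders.
library(raster)

b <- brick("era5_germany.nc")            # one raster layer per time step

pts <- data.frame(lon = c(10.0, 13.4),   # e.g. two sites of interest
                  lat = c(51.0, 52.5))

# Matrix of values: one row per point, one column per time step.
ts <- raster::extract(b, pts)
```

Each row of `ts` is then the time series for one coordinate pair, pulled locally from the single gridded download rather than from repeated API queries.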
Is it possible to use wf_request() function to download time-series data for a given list of coordinates?
Thanks,
Alexandru