
climater's Introduction

Hey! Nice to see you.

I'm Mike, a Spatial Data Scientist living in Fort Collins, Colorado, working for Lynker and the NOAA Office of Water Prediction.


Tools I work with

R GDAL git whitebox AWS QGIS

My latest CV (made with vitae)


Open source projects


  • mikejohnson51/AOI
  • mikejohnson51/climateR
  • mikejohnson51/climateR-catalogs
  • mikejohnson51/zonal
  • mikejohnson51/nwmTools
  • DOI-USGS/nhdplusTools
  • DOI-USGS/hyRefactor
  • DOI-USGS/dataRetrieval
  • NOAA-OWP/hydrofabric
  • mikejohnson51/AHGestimation

Where to find me

GitHub · Twitter · LinkedIn · Gmail


climater's People

Contributors

aariq · anguswg-ucsb · arashmodrad · jamestsakalos · mbjoseph · mikejohnson51 · nayhur · program--



climater's Issues

Problem using the "getPRISM" command.

Hello, I need help with an error.

I had a problem with the "getPRISM" command: when I request the download of TerraClimate variables using a polygon for Argentina, I get the following error:

AOI = aoi_get(country = "Argentina")
AOI

p = getPRISM(AOI, param = c('tmax', 'tmin', 'prcp'), startDate = "2014-01-01", endDate = "2019-12-30")

Error in define.grid3(AOI, id) : Requested AOI not in model domain

Can anybody help me, please?

best regards (:
Marília

'aoiProj' is not an exported object from 'namespace:AOI'

Hi Mike,

Thanks for that useful package.
When running the example code chunk:

ts = geocode('Colorado Springs') %>% 
  getLOCA(param = 'tmax', model = 10, startDate = "2030-01-01", endDate = "2040-12-31")

I get:
'aoiProj' is not an exported object from 'namespace:AOI'

Do you have a fix?

Thanks
Antoine

climateR upgrades

This is an issue to document current work and needed improvements towards a package that is (1) easier to maintain, (2) has fewer dependencies, and (3) supports more data products.

Current work has supported:

  • add terra support #44

    • remove sf dependency
    • drop methods dependency
    • drop raster support directly
  • automate grid and parameter definitions from TDS landing pages

  • generalize documentation using inheritParams

  • deal with commonly used but differently named parameters (e.g. pr, prcp, ppt)

  • provide more concrete parameter discovery tools
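One lightweight way to handle the differently named parameters mentioned above is a synonym table mapped to a canonical name. The sketch below is illustrative only (the function and table are hypothetical, not part of climateR):

```r
# Hypothetical synonym lookup -- not the actual climateR implementation.
canonical_param <- function(x) {
  synonyms <- list(
    prcp = c("pr", "prcp", "ppt"),
    tmax = c("tmax", "tasmax"),
    tmin = c("tmin", "tasmin")
  )
  hit <- names(synonyms)[vapply(synonyms, function(s) x %in% s, logical(1))]
  if (length(hit) == 0) x else hit[1]
}

canonical_param("ppt")    # "prcp"
canonical_param("tasmax") # "tmax"
```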

Many Sites from TerraClim Normals

From an email: I'm trying to use some combination of 3 packages (ncdf4, climateR, raster -- or others if you recommend them!) to extract data for ~40k US sites, for a number of terraclimate normal variables. 


I will soon add the TerraClim normals to climateR. Until then, I would suggest using the raw URLs as follows. Depending on your PROJ/GDAL versions you may get warnings about CRS elements; these are OK.

library(raster)
library(AOI)
library(sf)
library(dplyr)

random_pts = AOI::aoi_get(state = "conus") %>% 
            st_sample(40000) %>% 
            st_cast("POINT") %>% 
            st_as_sf() %>% 
            mutate(ID = 1:n())

# Assuming you already have your points, start here:

# Bounding Box 
bb = st_as_sf(st_as_sfc(st_bbox(random_pts)))

rb <- brick("http://thredds.northwestknowledge.net:8080/thredds/dodsC/TERRACLIMATE_ALL/summaries/TerraClimate19611990_tmin.nc")

us_terra = crop(rb, bb)

ext = data.frame(ID = random_pts$ID, extract(us_terra, st_as_sf(random_pts)))

ext[1:2, 1:4]
#>   ID X1961.01.01 X1961.02.01 X1961.03.01
#> 1  1       -12.9       -10.1        -6.2
#> 2  2       -20.1       -16.7        -9.5

Created on 2021-03-09 by the reprex package (v0.3.0)

no getTerraClim function?

Any idea why the getTerraClim function isn't loading with the installation and loading of the package?

Additional Error with getTerraClim

Running example code.

w = aoi_get("world") %>% getTerraClim(param = 'palmer', startDate = '2017-10-10')
Error in if (length(dim(v)[3]) == 0 | is.na(dim(v)[3])) { :
argument is of length zero

Consider adding NEX-Gridded Daily Meteorology (NEX-GDM) dataset?

NetCDF files: https://data.nas.nasa.gov/geonex/geonexdata/NEX-GDM/. Article describing dataset: Hashimoto et al. 2019 (https://doi.org/10.1002/joc.5995). Excerpt from abstract: "NEX-GDM employs the random forest algorithm for estimation, which allows us to find the best estimate from the spatially continuous data sets. We used the NEX-GDM model to produce historical 1-km daily spatial data for the conterminous United States from 1979 to 2017, including precipitation, minimum temperature, maximum temperature, dew point temperature, wind speed, and solar radiation. In this study, NEX-GDM ingested a total of 30 spatial variables from 13 different data sets, including satellite, reanalysis, radar, and topography data."

I just recently found your climateR package and am looking forward to trying it out (for PRISM, GridMET, and Daymet), looks excellent!

Explicit scoping for param_meta in define.param

Hey @mikejohnson51! I'm incorporating climateR into a downstream package, but running into scoping issues.

When my package calls getMACA, the following error is raised:

Error in eval(parse(text = paste0("param_meta$", service))) : 
  object 'param_meta' not found

The traceback indicates that this is raised when climateR::define.param is called by climateR::getMACA:

(traceback screenshot omitted)

Proposed solution

Modifying this line to use climateR::param_meta instead of param_meta solves the issue: https://github.com/mikejohnson51/climateR/blob/master/R/utility_define_param.R#L12

I noticed that you've explicitly specified the scope like this elsewhere where eval() is used, e.g.,

m = eval(parse(text = paste0('climateR::model_meta$', dataset, "$model")))

and

meta = eval(parse(text = paste0("climateR::model_meta$", dataset)))
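The underlying issue can be reproduced without climateR: eval(parse()) resolves unqualified names lexically from where the evaluation happens, so a lazy-data object like param_meta is only found when the package namespace (or the global environment) makes it visible. A stand-in sketch, using made-up data in place of the real climateR::param_meta:

```r
# Stand-in data; the real object is climateR::param_meta.
param_meta <- list(gridmet = c("prcp", "tmax"))

lookup <- function(service) {
  # Unqualified name: resolved from the function's environment chain,
  # which is what breaks when this runs inside a downstream package
  # where 'param_meta' is not visible.
  eval(parse(text = paste0("param_meta$", service)))
}
lookup("gridmet")  # works here because param_meta is in the global env

# More robust: skip eval(parse()) entirely and index by name.
lookup2 <- function(meta, service) meta[[service]]
lookup2(param_meta, "gridmet")
```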

Bringing in NASA Earth Data

Will require the netrc work from wrfhydrosubset to be moved over
Will allow access to TRMM and NLDAS data

getGridMET returning inconsistent data

Running getGridMET seems to be returning faulty data, particularly towards the end of long time series. The data returned is also inconsistent between identical calls.

For example, the following call

start_date <- "2010-01-01"
stop_date <- "2021-03-01"
site_bbox <- st_bbox(sites)

weather_rast <- getGridMET(AOI = site_bbox, 
                             param = c("prcp", "tmin", "tmax"), 
                             startDate = start_date, 
                             endDate = stop_date)

Runs without any errors (though it does return the warning In CPL_crs_from_input(x) : GDAL Message 1: +init=epsg:XXXX syntax is deprecated. It might return a CRS with a non-EPSG compliant axis order.).

But, when I try examining the data from a particular date using
rasterVis::levelplot(weather_rast$tmax$X2019.01.01)

I get a different result every time I rerun the getGridMET call. Some examples of the plots (all generated using the exact code above) are here:

(plot screenshots omitted)

Earlier dates in the time series do show believable weather data.

Any insights are welcome!

Extract climate values to point locations

Hello-- I am looking for a way to extract daily climate data (from the MACA climate dataset) from 5 GCMs to ten latitude/longitude point locations, instead of an AOI extent. Thanks!

GridMET Errors for "fmoist_100", "fmoist_1000", and "palmer" parameters

Hi Mike! I was playing around with aggregating climate data via climateR and ran into some issues with the following parameters for GridMET:

  • fmoist_100
  • fmoist_1000
  • palmer

I don't know if these are known, but it looks like there are recurring issues with the palmer index due to the THREDDS server so feel free to ignore that one! Also, the fmoist_* parameters can be handled individually, since this looks to be an issue only with using them in combination.

> climateR::getGridMET(aoi, c("fmoist_100", "fmoist_1000"), "2012-01-01", "2013-01-01")
Error in `names<-`(`*tmp*`, value = unique(date.names)) : incorrect number of layer names

This seems to be an issue at this line, but I'm not 100% sure. This error is only thrown when trying to get both parameters at once, and isn't thrown when trying to retrieve each parameter individually.
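The length mismatch can be seen without touching THREDDS: with two parameters requested over the same window, the per-layer date names repeat, so unique() returns too few names for the combined stack and raster's `names<-` method (which requires one name per layer) errors. A base-R sketch of the arithmetic:

```r
# Two parameters x two days -> four layers, but only two unique dates.
date.names <- rep(c("2021-01-01", "2021-01-02"), times = 2)

length(date.names)          # 4 (one per layer)
length(unique(date.names))  # 2 (too few for a 4-layer stack)
```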

> climateR::getGridMET(aoi, "palmer", "2021-01-01", "2021-02-01")
Error in var[[i]] * scale_factor : non-numeric argument to binary operator

This looks to be an issue at this line. I imagine it is related to #7 and #15, and that the var variable was given some URL values due to exception handling.

reprex

aoi1 <- AOI::aoi_get("UCSB")
aoi2 <- AOI::aoi_get("Colorado Springs")

# Doesn't Work
climateR::getGridMET(
    AOI       = aoi1,
    param     = "palmer",
    startDate = "2021-01-01",
    endDate   = "2021-02-01"
)
#> Error in var[[i]] * scale_factor: non-numeric argument to binary operator

# Doesn't Work
climateR::getGridMET(
    AOI       = aoi2,
    param     = "palmer",
    startDate = "2021-01-01",
    endDate   = "2021-02-01"
)
#> Error in var[[i]] * scale_factor: non-numeric argument to binary operator

Created on 2021-04-07 by the reprex package (v2.0.0)


aoi <- AOI::aoi_get("UCSB")

# Works
climateR::getGridMET(
    AOI       = aoi,
    param     = "fmoist_100",
    startDate = "2021-01-01",
    endDate   = "2021-02-01"
)
#> $fmoist_100
#> class      : RasterStack
#> dimensions : 2, 2, 4, 32  (nrow, ncol, ncell, nlayers)
#> resolution : 0.04166667, 0.04166667  (x, y)
#> extent     : -119.9125, -119.8292, 34.37917, 34.4625  (xmin, xmax, ymin, ymax)
#> crs        : +proj=longlat +datum=WGS84 +no_defs
#> names      : X2021.01.01, X2021.01.02, X2021.01.03, X2021.01.04, X2021.01.05, X2021.01.06, X2021.01.07, X2021.01.08, X2021.01.09, X2021.01.10, X2021.01.11, X2021.01.12, X2021.01.13, X2021.01.14, X2021.01.15, ...
#> min values :        16.6,        17.5,        18.1,        18.5,        19.9,        17.3,        14.8,        12.9,        12.4,        11.8,        11.0,        10.7,        11.4,        10.8,        10.0, ...
#> max values :        19.7,        19.7,        19.6,        19.6,        22.2,        21.3,        18.9,        17.0,        17.4,        17.3,        15.5,        15.5,        16.7,        14.9,        13.2, ...

# Works
climateR::getGridMET(
    AOI       = aoi,
    param     = "fmoist_1000",
    startDate = "2021-01-01",
    endDate   = "2021-02-01"
)
#> $fmoist_1000
#> class      : RasterStack
#> dimensions : 2, 2, 4, 32  (nrow, ncol, ncell, nlayers)
#> resolution : 0.04166667, 0.04166667  (x, y)
#> extent     : -119.9125, -119.8292, 34.37917, 34.4625  (xmin, xmax, ymin, ymax)
#> crs        : +proj=longlat +datum=WGS84 +no_defs
#> names      : X2021.01.01, X2021.01.02, X2021.01.03, X2021.01.04, X2021.01.05, X2021.01.06, X2021.01.07, X2021.01.08, X2021.01.09, X2021.01.10, X2021.01.11, X2021.01.12, X2021.01.13, X2021.01.14, X2021.01.15, ...
#> min values :        14.3,        14.6,        14.6,        14.6,        15.9,        15.9,        15.8,        15.6,        15.4,        15.0,        14.6,        14.1,        14.2,        14.1,        14.0, ...
#> max values :        20.1,        20.1,        19.9,        19.6,        20.9,        21.0,        20.8,        20.5,        20.5,        20.2,        19.7,        19.2,        19.3,        19.0,        18.7, ...

# Doesn't Work
climateR::getGridMET(
    AOI       = aoi,
    param     = c("fmoist_100", "fmoist_1000"),
    startDate = "2021-01-01",
    endDate   = "2021-02-01"
)
#> Error in `names<-`(`*tmp*`, value = unique(date.names)): incorrect number of layer names

Created on 2021-04-07 by the reprex package (v2.0.0)

issue with getTerraClim readme-example

Just got started with AOI and climateR. I tried to replicate the global data example from the climateR readme:

kenya = aoi_get(country = "Kenya")
tc = getTerraClim(kenya, param = "prcp", startDate = "2018-01-01")

The first line worked, but the second one gave me the following error.

Error in var[[i]] * scale_factor : 
non-numeric argument to binary operator

I'm uncertain to what operator it is referring...can you help me to resolve this?

(I run R version 4.0.4 (2021-02-15) on Win10)

Thanks in advance!

'palmer' param throwing this error though 'prcp' works for gridmet

AOI = getAOI(state = 'CO') %>% AOI::bbox_st()

Generate Random Points

n = 10
pts = data.frame(lat = runif(n, AOI$ymin, AOI$ymax),
lon = runif(n, AOI$xmin, AOI$xmax))

test <- getGridMET(AOI = pts[1,], param = c('prcp'), startDate = "1980-01-01", endDate = "2018-12-31")

test <- getGridMET(AOI = pts[1,], param = c('palmer'), startDate = "1980-01-01", endDate = "2018-12-31")

" Error in { : task 1 failed - "NetCDF: Access failure" "

support for {terra}'s objects

Dear @mikejohnson51 -- thanks for the great package.

I am now thinking about adding climateR to the data packages section of the Geocomputation with R book. As the book uses {terra} for raster data, I have a quick question -- do you have any plans to add terra support for this package?

getTerraClim: Requested AOI not in model domain

When trying to use getTerraClim, I'm running into the error 'Requested AOI not in model domain'. It happens whether I use bounding box or point data. Is this an issue with the server?

example <- aoi_get(country = "Argentina") %>%
getTerraClim(param = c('tmax', 'tmin', 'prcp'), startDate = "2014-01-01")

Returns:
"Error in withCallingHandlers(expr, warning = function(w) if (inherits(w, :
Requested AOI not in model domain

getPRISM and getDaymet errors

PRISM:

prism_prcp_1mo <- getPRISM(AOI = aoi_get(state = "AZ"),
param = "prcp",
startDate = "1990-01-01",
endDate = "1990-02-01")

'Error in dim(v) <- round(c(g$rows, g$cols, time)) :
dims [product 613760] do not match the length of object [19180]'

getPRISM() was working fine for me until I recently updated R and all packages. It still works when pulling data for single days.

Daymet:

daymet_prcp_1day <- getDaymet(AOI = aoi_get(state = "AZ"),
param = "prcp",
startDate = "1990-01-01")

'Error in var[[i]] * scale_factor :
non-numeric argument to binary operator'

TerraClimate data before 1980 for African AOIs

According to the TerraClimate webpage, this dataset goes back to 1958:
http://www.climatologylab.org/terraclimate.html

However, if getTerraClim is called with a date before 1980 (at least for AOIs in Africa) I get an error.

Is this an issue with climateR, or are some areas just not covered for the time before 1980?

Thanks, Urs

library(AOI)
library(climateR)
library(sf)
#> Linking to GEOS 3.8.1, GDAL 3.2.1, PROJ 7.2.1
sf_use_s2(F) # Otherwise issue with getTerraClim function and aoi
#> Spherical geometry (s2) switched off

kenya = aoi_get(country = "Kenya")

# Works with 1985
tc = getTerraClim(kenya, param = "prcp", startDate = "1985-01-01")
rasterVis::levelplot(tc$terraclim_prcp)

# But not with 1975
tc = getTerraClim(kenya, param = "prcp", startDate = "1975-01-01")
#> Error in seq.int(0, to0 - from, by): wrong sign in 'by' argument

sf_use_s2(T)
#> Spherical geometry (s2) switched on

Created on 2021-07-04 by the reprex package (v2.0.0.9000)

TOPOWX

Could a function be added to pull data from TOPOWX?

Dealing with GridMet Palmer Index

Following up on some requests and issues (#33, #32, #7) with the Palmer Index (gridMET), we will try to solve it once and for all.

The confusion is with the appropriate timestep of the stored data. While most gridMET variables are daily, palmer is not. The gridMET webpage suggests it is a 10-day product, the THREDDS server suggests it is a five-day value (pentad), and the variable name within the CDM is "daily_mean_palmer_drought_severity_index".

It appears that the data is pentad, so the function needs to adapt in such a way that requests for palmer return pentad data.
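If palmer really is a pentad product, the date index used to subset the request would need a 5-day step rather than a daily one. A base-R sketch, assuming the stored pentads align with the request start date:

```r
startDate <- as.Date("2021-01-01")
endDate   <- as.Date("2021-02-01")

daily  <- seq(startDate, endDate, by = "1 day")   # what a daily request assumes
pentad <- seq(startDate, endDate, by = "5 days")  # what a pentad product stores

length(daily)   # 32
length(pentad)  # 7
```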

Error in { : task 2 failed - "NetCDF: Access failure"

Trying to pull gridmet data with the following code:

AOI = aoi_get(c(45.3988,-95.7885,100,100),km=TRUE)

gm_hist = getGridMET(AOI,
param=c('pet_grass','palmer','prcp'),
startDate = "2009-01-01",
endDate = "2019-12-31")

resulting in error: Error in { : task 2 failed - "NetCDF: Access failure"

getBCCA and getLOCA date errors

Hey @mikejohnson51 -- awesome package! I am running into some issues extracting BCCA and LOCA data. It looks like the date column from the dates object does not exist -- maybe because dates is a vector of dates, and not a data.frame?

library(climateR)
#> Loading required package: AOI
#> Loading required package: leaflet

getBCCA(getAOI(state = 'CO'),
        param = 'tmax',
        model = 'inmcm4',
        scenario = 'rcp45',
        startDate = '2030-10-29',
        endDate = '2030-12-29')
#> Error in dates$date: $ operator is invalid for atomic vectors


getLOCA(getAOI(state = 'CO'),
        param = 'tmax',
        model = 'ACCESS1-0',
        scenario = 'rcp45',
        startDate = '2030-10-29',
        endDate = '2030-12-29')
#> Error in dates$date: $ operator is invalid for atomic vectors

Created on 2019-08-12 by the reprex package (v0.3.0)

Possibly related, I'm noticing that the function calls to define.versions differ for MACA vs. BCCA and LOCA. MACA passes a data frame (https://github.com/mikejohnson51/climateR/blob/master/R/getMACA.R#L23), but BCCA and LOCA both pass vectors of dates (https://github.com/mikejohnson51/climateR/blob/master/R/getBCCA.R#L24 https://github.com/mikejohnson51/climateR/blob/master/R/getLOCA.R#L23).
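The error itself is reproducible in plain R: `$` works on a data.frame but errors on an atomic vector, which is exactly what a bare vector of Dates is.

```r
dates_df  <- data.frame(date = as.Date(c("2030-10-29", "2030-10-30")))
dates_vec <- dates_df$date  # a Date vector, i.e. atomic

dates_df$date        # fine
try(dates_vec$date)  # Error: $ operator is invalid for atomic vectors
```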

Any help would be much appreciated!

GridMet Error with palmer (different from issue #32)

Example below. Error persists with any year combination but runs fine with call to other gridMet layers.

start.date = "2015-01-01"
end.date = "2016-12-31"

#--Extract rasters
dat <- getGridMET(aoi_get(country = c("US")), param = "palmer", startDate = start.date, endDate = end.date)

Requested AOI not completly in model domain...AOI is being clipped
although coordinates are longitude/latitude, st_intersection assumes that they are planar
Error in { : task 1 failed - "NetCDF: Access failure"
In addition: Warning message:
attribute variables are assumed to be spatially constant throughout all geometries

Extracting daily (summertime) data from 1000 points over 10 GCMs

Hi Mike,
Great package! I am hoping to use your new update to extract a mountain of point data. My ultimate goal is to extract daily summertime data (from May 1 - August 15) for the years 2020 - 2099 from the MACA climate data set. I would like to extract data from 10 GCMs and 5 variables to 1000 points. I know this is a TON of data, and was hoping you might have a solution. Thanks so much!

Compute anomalies from monthly data and historic normals

Question from email:


What I'm trying to do is calculate anomalies between a "current" period  (mean values for 2004:2015) and a "baseline" period (1961-1990 normals), for Tmin coldest month, Tmax warmest month, climate water deficit (perhaps summed over the year), etc.

So for example, to calculate the anomaly for Tmin coldest month, for each site I need to calculate:

  1. mean(Tmin.Jan), mean(Tmin.Feb), etc., across 2004-2015
  2. the min. value from 1 (since we don't know which month is the coldest)
  3. The min value of (Tmin.Jan., Tmin.Feb, etc.) from 1961-1990 normals
  4. difference between 2 and 3

I think there are two approaches for this - a raster and a point based solution. The raster approach is simpler I find but will provide both for comparison:

library(raster)
library(AOI)
library(sf)
library(climateR)
library(dplyr)

Get Data for 100 random pts in Colorado ...

random_pts = AOI::aoi_get(state = "co") %>% 
  st_sample(100) %>% 
  st_cast("POINT") %>% 
  st_as_sf() %>% 
  mutate(ID = 1:n())

co = getTerraClim(random_pts, 
                  param = "tmin", 
                  startDate = "2004-01-01", 
                  endDate = "2014-12-01")

coNorm = getTerraClimNormals(random_pts, 
                             param = "tmin", 
                             period = "19611990")

Site based approach

ext = extract_sites(r = co, pts = random_pts, "ID")

ext_norm = extract_sites(r = coNorm, pts = random_pts, "ID")[[1]] %>% 
  tidyr::pivot_longer(-date, values_to = "norms") %>% 
  mutate(month = as.numeric(date), date = NULL) %>% 
  group_by(name) %>% 
  summarize(minNorm = min(norms)) %>% 
  ungroup()
  
out = ext$tmin %>% 
  tidyr::pivot_longer(-date) %>% 
  mutate(month = lubridate::month(date)) %>% 
  group_by(month, name) %>% 
  summarise(meanTmin = mean(value)) %>% 
  ungroup() %>% 
  group_by(name) %>% 
  summarise(minTmin = min(meanTmin)) %>%
  ungroup() %>% 
  left_join(ext_norm, by = c("name")) %>% 
  mutate(site_anom = minTmin - minNorm, ID = as.numeric(gsub("site_", "", name))) %>% 
  left_join(random_pts, by = "ID") %>% 
  st_as_sf() %>% 
  arrange(ID)
#> `summarise()` has grouped output by 'month'. You can override using the `.groups` argument.

plot(out['site_anom'], pch = 16)

Raster Based approach

indices      = rep(1:12, times = nlayers(co$tmin)/12)
co_tmin      = stackApply(co$tmin, indices = indices, mean)
diff_rast = min(co_tmin) - min(coNorm$`19611990_tmin`)
random_pts$raster_anom =  extract(diff_rast, random_pts)
plot(diff_rast)

Check for agreement

plot(random_pts$raster_anom, out$site_anom)
abline(0,1)

Created on 2021-03-10 by the reprex package (v0.3.0)

+init=epsg:XXXX syntax deprecation

Currently some of the function calls in climateR will raise a warning about +init=epsg syntax deprecation, e.g.,

library(AOI)
library(climateR)

climateR::getMACA(
  AOI::aoi_get(state = "FL"), 
  model = "CCSM4", 
  param = 'prcp', 
  scenario = c('rcp45', 'rcp85'), 
  startDate = "2080-06-29", endDate = "2080-06-30")
#> Warning in CPL_crs_from_input(x): GDAL Message 1: +init=epsg:XXXX syntax is
#> deprecated. It might return a CRS with a non-EPSG compliant axis order.
#> $ccsm4_prcp_rcp45_mm
#> class      : RasterStack 
#> dimensions : 144, 183, 26352, 2  (nrow, ncol, ncell, nlayers)
#> resolution : 0.04142886, 0.04137668  (x, y)
#> extent     : -87.61456, -80.03308, 25.06308, 31.02132  (xmin, xmax, ymin, ymax)
#> crs        : +proj=longlat +datum=WGS84 +no_defs 
#> names      : X2080.06.29, X2080.06.30 
#> min values :           0,           0 
#> max values :    66.55993,    17.49149 
#> 
#> 
#> $ccsm4_prcp_rcp85_mm
#> class      : RasterStack 
#> dimensions : 144, 183, 26352, 2  (nrow, ncol, ncell, nlayers)
#> resolution : 0.04142886, 0.04137668  (x, y)
#> extent     : -87.61456, -80.03308, 25.06308, 31.02132  (xmin, xmax, ymin, ymax)
#> crs        : +proj=longlat +datum=WGS84 +no_defs 
#> names      : X2080.06.29, X2080.06.30 
#> min values :           0,           0 
#> max values :    33.06903,    23.11818

Created on 2021-01-15 by the reprex package (v0.3.0)

For MACA, this appears to happen because of this line:

if(is.null(proj)){proj = "+init=epsg:4326"}

Suggested solution

Changing this to the integer value of the EPSG code solves the issue for MACA:

if(is.null(proj)){proj = 4326}

This integer would then get used here: https://github.com/mikejohnson51/climateR/blob/master/R/utility_define_grid.R#L29

The deprecated syntax also shows up for EDDI:

raster::crs(t) = "+init=epsg:4326"
eddi = setNames(data.frame("eddi", "+init=epsg:4326", baseURL,

I think a fix there might be to use the proj4string like this:

raster::crs(t) = "+proj=longlat +datum=WGS84 +no_defs"
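As a quick check (assuming sf is installed), the integer EPSG code and the plain proj4 string both define the CRS without triggering the deprecation warning:

```r
library(sf)

st_crs(4326)                                   # preferred: EPSG integer
st_crs("+proj=longlat +datum=WGS84 +no_defs")  # proj4 form, also warning-free
# st_crs("+init=epsg:4326")                    # the deprecated syntax that warns
```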

Incorrect Rounding on getTerraClim Palmer Results?

Hi Mike,

I've noticed that the results from getTerraClim seem to be truncated to 1 decimal place, but do not have the correct rounding when compared with TerraClim data retrieved from the Climate Engine. Here's an example:

#Setting up the lat/long point to extract data from
pts <- data.frame(ID = 1,
                  lat = 38.31250,
                  lon = -75.14583)

#Extracting the data for a few months in 2001
getTerraClim(AOI=pts[1,2:3],param=c('palmer'),startDate = "2001-06-01",endDate = "2001-10-31")
 
#Output 
    source     lat       lon    date palmer
1 terraclim 38.3125 -75.14583 2001-06    1.2
2 terraclim 38.3125 -75.14583 2001-07    1.5
3 terraclim 38.3125 -75.14583 2001-08    1.9
4 terraclim 38.3125 -75.14583 2001-09    1.4
5 terraclim 38.3125 -75.14583 2001-10   -1.1

The palmer results have 2 significant figures.

When I go to The Climate Engine and retrieve the same dataset using the "Make a Graph" function, I get the following results:

#Variable: PDSI
#Data Source: TERRACLIMATE 4000 m (1/24-deg) monthly dataset (University of Idaho)
#Missing Value: -9999
#Time Period: 2001-6-01 to 2001-10-01
#Point (Lat,Lon): 38.3125N,75.1458W
#Date(yyyy-mm-dd) PDSI
2001-06-01 1.2200
2001-07-01 1.5200
2001-08-01 1.9700
2001-09-01 1.4400
2001-10-01 -1.1600

It seems like for August and October 2001 the rounding was incorrectly applied. A few other spot checks show that the data is truncated rather than rounded. I have only looked through the palmer results, and haven't checked any of the other TerraClim variables.
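Comparing the two outputs supports the truncation reading: truncating the Climate Engine values to one decimal reproduces the getTerraClim numbers, while proper rounding does not. A quick base-R check:

```r
# Climate Engine (PDSI) values quoted above, Jun-Oct 2001.
ce <- c(1.22, 1.52, 1.97, 1.44, -1.16)

trunc(ce * 10) / 10  # 1.2 1.5 1.9 1.4 -1.1  -> matches getTerraClim
round(ce, 1)         # 1.2 1.5 2.0 1.4 -1.2  -> what rounding would give
```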

I was looking through the "fast download" code, but couldn't trace the issue.

Thanks!

Phillip

getTerraClim: "Error in matrix(var, nrow = l, byrow = F) : data is too long"

Dear ALL,
I am trying to download TerraClimate monthly time series data using the getTerraClim function.
This is my code:
#Load packages
library(chirps)
library(sf)
library(AOI)
library(climateR)
library(terra)
library(raster)
library(rasterVis)
library(shapefiles)
library(sp)

If you get this Error: Requested AOI not in model domain, then deactivate s2 in sf library

library(sf)
sf_use_s2(FALSE)

##Load babati stn coordinate shapefile as AOI
df<- read.csv('~/Documents/MoW/Year2_2021_2022/Task 2_Capacity_Building/Training_1_RS/Day2/Data/Babati_stn.csv')#read/import station location data
show(df)

##Set coordinates
coordinates(df)=~ Lon + Lat

#Create a shape file and set the coordinate reference system(CRS) as Arc 1960 EPSG 4210
df.shp = df
#crs(df.shp)<-'+proj=longlat +datum=WGS84 +no_defs'
crs(df.shp)<-'+proj=longlat +ellps=clrk80 +towgs84=-160,-6,-302,0,0,0,0 +no_defs' #Arc 1960 EPSG 4210

#Create as sf object
#Create sf extends data.frame-like objects with a simple feature list column
df.shp.sf = as(df.shp, "sf")

#plot the point
plot(df.shp.sf)

Downloading Terraclimate monthly timeseries data using getTerraClim function

Babati.ppt=getTerraClim(df.shp.sf, param = 'prcp',startDate = "1918-01-01",endDate = "1918-07-31")

After running I am getting this;
"Error in matrix(var, nrow = l, byrow = F) : data is too long
In addition: Warning messages:
1: In max(d$year) : no non-missing arguments to max; returning -Inf
2: In min(d$year) : no non-missing arguments to min; returning Inf
3: In min(d$month.index) : no non-missing arguments to min; returning Inf
4: In max(d$month.index) : no non-missing arguments to max; returning -Inf"

Any suggestion will be appreciated

getTerraClim problem

Hi Mike,
it seems that this section of code:
w = AOI::world %>% getTerraClim(param = 'prcp', startDate = '2017-10-10')
does not work...

When I try to implement it, i obtain the following error:
Error: 'world' is not an exported object from 'namespace:AOI'

Any idea to fix this?
thanks!

Consider adding National Solar Radiation Database (NSRDB) as a dataset?

Mike, another gridded climate dataset I like but have trouble downloading in large quantities is National Solar Radiation Database (NSRDB). It's the best (highest quality and best resolution) gridded solar radiation dataset in North America. I previously had some R code that would download pixel time series by feeding the API a list of URLs, but NSRDB changed its system and my code doesn't work anymore. API instructions at https://nsrdb.nrel.gov/data-sets/download-instructions.html. I don't know if it'd be possible to integrate this dataset into climateR (there's no OpenDAP service that I'm aware of), but wanted to mention it just in case.

getPRISM() returns links instead of values for some dates

Hi,

I am using getPRISM() for the first time, but for half of the dates it returns a URL instead of a value. I would really appreciate some help solving this issue. Thanks in advance!

library(climateR)
library(sf)
library(AOI)

pt<- st_point(c(-122.9765,47.07159)) %>% st_sfc(crs=4326) %>% st_sf()

Evergreen_prism <- getPRISM(pt,param=c('tmax','tmin'),startDate="2021-1-1",endDate="2021-6-1")

152
prism 47.08333 -122.9583 2021-06-01 26.9139995574951
http://convection.meas.ncsu.edu:8080/thredds/dodsC/prism/daily/combo/2021/PRISM_combo_20210601.nc?tmin[0:1:0][68:1:68][49:1:49]

My R session info is:

R version 4.1.0 (2021-05-18)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 19043)

Matrix products: default

locale:
[1] LC_COLLATE=English_United States.1252 LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252 LC_NUMERIC=C
[5] LC_TIME=English_United States.1252
system code page: 65001

attached base packages:
[1] stats graphics grDevices utils datasets methods base

other attached packages:
[1] AOI_0.2.0.9000 sf_1.0-5 climateR_0.1.0

loaded via a namespace (and not attached):
[1] leaflet.extras_1.0.0 tidyselect_1.1.1 terra_1.5-13 purrr_0.3.4
[5] lattice_0.20-44 rnaturalearth_0.1.0 vctrs_0.3.8 generics_0.1.1
[9] USAboundaries_0.4.0 htmltools_0.5.2 s2_1.0.7 utf8_1.2.2
[13] rlang_0.4.12 e1071_1.7-9 pillar_1.6.4 later_1.3.0
[17] glue_1.6.0 DBI_1.1.2 sp_1.4-6 RNetCDF_2.5-2
[21] wk_0.6.0 foreach_1.5.1 lifecycle_1.0.1 rvest_1.0.2
[25] raster_3.5-12 htmlwidgets_1.5.4 codetools_0.2-18 fastmap_1.1.0
[29] doParallel_1.0.16 httpuv_1.6.5 crosstalk_1.2.0 parallel_4.1.0
[33] class_7.3-19 fansi_0.5.0 Rcpp_1.0.8 xtable_1.8-4
[37] KernSmooth_2.23-20 promises_1.2.0.1 classInt_0.4-3 leaflet_2.0.4.1
[41] jsonlite_1.7.3 mime_0.12 digest_0.6.29 dplyr_1.0.7
[45] shiny_1.7.1 grid_4.1.0 tools_4.1.0 magrittr_2.0.1
[49] proxy_0.4-26 tibble_3.1.6 crayon_1.4.2 pkgconfig_2.0.3
[53] ellipsis_0.3.2 xml2_1.3.3 httr_1.4.2 iterators_1.0.13
[57] R6_2.5.1 units_0.7-2 compiler_4.1.0

Bring intersectr netcdf traversal for unstructured grids into climateR.

Based on some slack chat, I think it's going to be productive to contribute some of what's currently in intersectr over here.

I'll be moving a few things into https://github.com/USGS-R/ncdfgeom but the core execute intersection functions from intersectr need a good home. That stuff will be well used here.

Specifically, utils.R https://github.com/USGS-R/intersectr/blob/master/R/utils.R

And execute_intersection.R https://github.com/USGS-R/intersectr/blob/master/R/execute_intersection.R

TerraClim Data Available through 2018

It looks like the ending date of the TerraClim dataset is hardcoded to be 2017. However, data is now available through 2018. It looks like a simple update in getTerraClim will do the trick.

getEDDI broken?

The getEDDI function may be suffering from a broken or outdated FTP link. I get the following error:

Error in curl::curl_fetch_disk(url, x$path, handle = handle) : 
  Given file does not exist
Timing stopped at: 0.99 0.15 1.74

NOAA may have recently changed their FTP address, according to this

Access Parameter Values from getTerraClim Output for 900 Counties

Hi Mike,

I am trying to use the function 'getTerraClim', but I am not familiar with the output data structure. What commands would I use to access the parameter values only (something like 'tc$prcp@something')? I would then like to save them to .csv in R.

Here is the Google Drive link to all the counties (.shp file) for which I would like to access parameters from the TerraClimate data source: https://drive.google.com/file/d/1y_NN771ch-YS4nSNH6pXes40iELAnR-i/view?usp=sharing

Thank you!
Mengya
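A minimal sketch of one way this could work, assuming getTerraClim() returns a list with one raster stack per parameter (as other examples in this thread suggest); the AOI, object, and file names below are hypothetical and untested:

```r
library(climateR)
library(AOI)
library(raster)

# Hypothetical example: one county AOI, precipitation for 2018.
tc <- getTerraClim(aoi_get(state = "CO", county = "Larimer"),
                   param = "prcp",
                   startDate = "2018-01-01", endDate = "2018-12-01")

# tc$prcp is assumed to be a RasterStack; flatten it to a table of
# cell coordinates plus one value column per time step, then save.
prcp_df <- as.data.frame(tc$prcp, xy = TRUE, na.rm = TRUE)
write.csv(prcp_df, "terraclim_prcp_2018.csv", row.names = FALSE)
```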

getPRISM() doesn't return data for specific day

getPRISM() has been working perfectly for me, but recently I've noticed it returns a weird URL instead of data when I try to get temperature data for the date May 12th 2021. All points I've tried seem to have this issue. Only data for May 12th 2021 is affected; data for all other dates is returned as expected. For example:

library(AOI)
library(climateR)
library(sf)

ptest = st_point(c(-72.42642220, 41.59313330)) %>% 
  st_sfc(crs = 4326) %>% 
  st_sf()

prismTest = getPRISM(ptest, 
                 param = c('tmax', 'tmin'), 
                 startDate = "2021-05-12",
                 endDate = "2021-05-13")

prismTest

the object prismTest looks like:

source lat lon date
1 prism 41.58333 -72.41667 2021-05-12
2 prism 41.58333 -72.41667 2021-05-13
tmax
1 http://convection.meas.ncsu.edu:8080/thredds/dodsC/prism/daily/combo/2021/PRISM_combo_20210512.nc?tmax[0:1:0][200:1:200][1262:1:1262]
2 18.9360008239746
tmin
1 http://convection.meas.ncsu.edu:8080/thredds/dodsC/prism/daily/combo/2021/PRISM_combo_20210512.nc?tmin[0:1:0][200:1:200][1262:1:1262]
2 4.01399993896484

As you can see, data for May 13th 2021 is normal, but May 12th is a weird link.

I would appreciate any help figuring out the cause of (and solution to) this. My session info is above.
Thanks!!!

`st_bbox()` doesn't work with `getCHIRPS()`

The example in the README using st_bbox() with getGridMET() doesn't work when it's swapped out for getCHIRPS(). The documentation doesn't seem to indicate that it should behave any differently.

library(climateR)
library(sf)
#> Linking to GEOS 3.8.1, GDAL 3.1.1, PROJ 6.3.1
AOI = st_bbox(c(xmin = -112, xmax = -105, ymax = 39, ymin = 34), crs = 4326) %>% 
  getCHIRPS(startDate = "2018-09-01")
#> Error in UseMethod("st_geometry"): no applicable method for 'st_geometry' applied to an object of class "bbox"

Created on 2020-11-23 by the reprex package (v0.3.0)
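As a possible workaround (an untested sketch): wrapping the bbox in st_as_sfc() produces an object carrying an sf geometry, which st_geometry() can dispatch on:

```r
library(climateR)
library(sf)

# Convert the bare bbox to an sfc polygon before passing it on;
# getCHIRPS() should then be able to find a geometry via st_geometry().
AOI <- st_as_sfc(st_bbox(c(xmin = -112, xmax = -105,
                           ymax = 39, ymin = 34), crs = 4326))
p <- getCHIRPS(AOI, startDate = "2018-09-01")
```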

Day calculation incorrect in getGridMET and others

In getGridMET, the base date used in

d = define.dates(startDate, endDate, baseDate = "1979-01-01")
followed by subtracting 1 from the date.index in

    urls = paste0(g$base, p$call, "_1979_CurrentYear_CONUS.nc?",
        p$description, "[", min(d$date.index) - 1, ":1:", max(d$date.index) -
            1, "]", g$lat.call, g$lon.call, "#fillmismatch")

results in a malformed URL when trying to obtain the beginning of the dataset, and returns dates that are consistently off by one day in later time series.

This is not observed with getPRISM, which doesn't specify a base date, but does appear with other get functions.

Here is a reproducible example comparing the results of getGridMET() and data extracted from a downloaded NetCDF file in the working directory.

library(AOI)
library(climateR)
library(sf)
library(raster)

# create an example point
idpt <- data.frame(lon = -96.89999276, lat = 31.51999451)
idpt <- st_as_sf(x = idpt, coords = c("lon", "lat"), crs = 4269)

# extract maximum temperature from a downloaded file
gridmet1979tmax <- stack("tmmx_1979.nc")
gridmet1979tmax.pt <- extract(gridmet1979tmax, idpt)

# try to download the first day of the available time series
getGridMET(idpt, "tmax", startDate = "1979-01-01", endDate = NULL)

# creates malformed URL with -1 in time fields
# http://thredds.northwestknowledge.net:8080/thredds/dodsC/agg_met_tmmx_1979_CurrentYear_CONUS.nc?daily_maximum_temperature[-1:1:-1][429:1:429][669:1:669]#fillmismatch

# does not return data
#   source    lat       lon       date
# 1 gridmet 31.525 -96.89167 1979-01-01


# try to download the second day of the available time series
getGridMET(idpt, "tmax", startDate = "1979-01-02", endDate = NULL)

# creates a functional URL that downloads the previous day
# http://thredds.northwestknowledge.net:8080/thredds/dodsC/agg_met_tmmx_1979_CurrentYear_CONUS.nc?daily_maximum_temperature[0:1:0][429:1:429][669:1:669]#fillmismatch

# this says it is the date requested, but the tmax for 1979-01-01
#   source    lat       lon       date  tmax
# 1 gridmet 31.525 -96.89167 1979-01-02 268.8

# jan 1 and 2 from the downloaded file
gridmet1979tmax.pt[1, 1:2]

# X28854 X28855
# 268.8  272.4

# all dates are off by one, because of the extra subtraction
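The off-by-one can be confirmed with plain date arithmetic, independent of climateR (a minimal sketch of the index math):

```r
base <- as.Date("1979-01-01")

# Index for the first two days of the record, relative to the base date:
idx_jan1 <- as.numeric(as.Date("1979-01-01") - base)  # 0
idx_jan2 <- as.numeric(as.Date("1979-01-02") - base)  # 1

# Subtracting 1, as the URL builder does, yields -1 (invalid) for
# Jan 1 and 0 (the Jan 1 slice) for Jan 2 -- the reported off-by-one.
idx_jan1 - 1  # -1
idx_jan2 - 1  #  0
```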

Add CRU TS4 data?

Is there any chance to include the CRU TS4 data set in the package?

Like TerraClimate, this is a global data set (and TerraClimate is partly based on it). However, for the area where I work (Kibale National Park, Uganda), the CRU TS4 temperature data match our ground data a bit better, and this may be true for other areas as well.

The most recent CRU TS data set can be found here, with links to the most recent publication:
https://crudata.uea.ac.uk/cru/data/hrg/cru_ts_4.05/

Thanks for considering the addition.

MACA monthly Returns are not correctly indexed

library(climateR)
library(sf)

box_sfc <- st_as_sfc(st_bbox(c(xmin = -76., xmax = -74., 
                               ymax =  43., ymin = 41.), 
                             crs=4326)) 

# does not work with end date???
t3 <- climateR::getMACA(box_sfc, 
                        param="prcp", 
                        model = "CCSM4", 
                        scenario="rcp45", 
                        startDate = as.Date("1950-01-01"), endDate=as.Date("1950-03-01"),
                        timeRes="monthly")

Error in socketAccept

I get a socket error when I attempt to run getTerraClim. The same error occurs with the package examples.

tmp <- getTerraClim(AOI = aoi, param = c(var),
                    startDate = str.date,
                    endDate = end.date)

Error in socketAccept(socket = socket, blocking = TRUE, open = "a+b", :
all connections are in use
In addition: Warning messages:
1: In .Internal(socketAccept(socket, blocking, open, encoding, timeout)) :
closing unused connection 123 (<-view-localhost:11875)
2: In .Internal(socketAccept(socket, blocking, open, encoding, timeout)) :
closing unused connection 125 (<-view-localhost:11875)
3: In .Internal(socketAccept(socket, blocking, open, encoding, timeout)) :
closing unused connection 124 (<-view-localhost:11875)

Point based extraction

This was sent as an email:


I am working on modeling recharge relative to our state groundwater level network, and it is nice to have the ability
to pull gridded data such as the sources that you have used in this package.

Can you possibly send me an example for pulling time series data for a point location?
Alternatively, could I retrieve data for the whole state as a raster time series and then extract from that?

Here's what I currently have:

library(climateR)
library(AOI)
precip = aoi_get("41.59313330, -72.42642220") %>%  getPRISM(param = 'prcp', startDate = "2019-10-01","2020-09-30")

returns this
Error in { : task 353 failed - "NetCDF: file not found"

If I retrieve for GridMET, I get
Error in { : task 1 failed - "NetCDF: Access failure"

Am I missing a package dependency for NetCDF?
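For reference, a sketch of a point-based pull, assuming the get* functions accept an sf point geometry for time-series extraction (as in other examples in this thread); untested here:

```r
library(climateR)
library(sf)

# Build an sf point (lon/lat) instead of a polygon AOI; the get*
# functions appear to accept point geometries for time-series pulls.
pt <- st_as_sf(data.frame(lon = -72.42642220, lat = 41.59313330),
               coords = c("lon", "lat"), crs = 4326)

precip <- getPRISM(pt, param = "prcp",
                   startDate = "2019-10-01", endDate = "2020-09-30")
```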

Using getGridMET

I am trying to use the getGridMET function, but the coordinates/location identifier seems to be off. Here is simple example code:

AOI = aoi_get(state = "MN")
p = getGridMET(AOI, param = c('tmax','tmin'), startDate = "2018-11-20")

r = raster::stack(p$tmax, p$tmin)
names(r) = c('tmax', 'tmin')
rasterVis::levelplot(r, main=AOI$state_abbr)

For Minnesota, I've attached the output plot (which looks like the bottom part of Texas).
[attached plot: climateR_test_MNtemp]

Meanwhile, for Florida, the output looks like it's of the top part of Michigan.
[attached plot: climateR_test_FLtemp]

The temperature values are also another indicator of the problem.

When I use getPRISM instead of getGridMET, the location is accurate and the output is fine. Additionally, using getGridMET for data from California seems to work properly as well. Could this be an issue with how the coordinate system used by GridMET is read into either the aoi_get or getGridMET functions? Do you have any suggestions for getting the state-name input to match up with the physical location of the state?

Thank you for your help!
