
terrainr's Introduction

terrainr: Landscape Visualization in R and Unity


Overview

terrainr makes it easy to retrieve elevation and base map image tiles for areas of interest within the United States from the National Map family of APIs, and then process that data into larger, joined images or crop it into tiles that can be imported into the Unity 3D rendering engine.

There are three main utilities provided by terrainr. First, users can download data from the National Map via the get_tiles function, which retrieves data tiles for the area represented by an sf or Raster object:

library(terrainr)
library(sf)

location_of_interest <- tmaptools::geocode_OSM("Hyampom California")$coords

location_of_interest <- data.frame(
  x = location_of_interest[["x"]],
  y = location_of_interest[["y"]]
)

location_of_interest <- st_as_sf(
  location_of_interest, 
  coords = c("x", "y"), 
  crs = 4326
)

location_of_interest <- set_bbox_side_length(location_of_interest, 8000)

output_tiles <- get_tiles(location_of_interest,
                          services = c("elevation", "ortho"),
                          resolution = 30 # pixel side length in meters
                          )

Once downloaded, these images are in standard GeoTIFF or PNG formats and can be used as expected with other utilities:

raster::plot(raster::raster(output_tiles[["elevation"]][[1]]))

raster::plotRGB(raster::brick(output_tiles[["ortho"]][[1]]), scale = 1)

Finally, terrainr helps you visualize this data, both natively in R via the new geom_spatial_rgb geom:

library(ggplot2)
ggplot() + 
  geom_spatial_rgb(data = output_tiles[["ortho"]],
                   aes(x = x, y = y, r = red, g = green, b = blue)) + 
  coord_sf(crs = 4326) + 
  theme_void()

As well as with the Unity 3D rendering engine, allowing you to fly or walk through your downloaded data sets in 3D and VR:

library(progressr) # with_progress() is provided by the progressr package

with_progress( # When not specifying resolution, default is 1m pixels
  output_tiles <- get_tiles(location_of_interest,
                            services = c("elevation", "ortho"))
)

merged_dem <- merge_rasters(output_tiles[["elevation"]], 
                            tempfile(fileext = ".tif"))
merged_ortho <- merge_rasters(output_tiles[["ortho"]], 
                              tempfile(fileext = ".tif"))

make_manifest(output_tiles$elevation,
              output_tiles$ortho)

We can then import these tiles to Unity (following the Import Vignette) to create a 3D rendering of the area that you can fly or walk through.

The more time intensive processing steps can all be monitored via the progressr package, so you’ll be more confident that your computer is still churning along and not just stalled out. For more information, check out the introductory vignette and the guide to importing your data into Unity!
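For example, progress reporting can be switched on globally before running the slower download and merge steps; this is the same progressr pattern that appears in the issue reports further down this page:

library(progressr)
handlers("progress")    # pick a progress bar style
handlers(global = TRUE) # report progress without wrapping calls in with_progress()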

Citing terrainr

The United States Geological Survey provides guidelines for citing USGS data products (as downloaded from get_tiles) at https://www.usgs.gov/faqs/how-should-i-cite-datasets-and-services-national-map.

To cite terrainr in publications please use:

Mahoney, M. J., Beier, C. M., and Ackerman, A. C., (2022). terrainr: An R package for creating immersive virtual environments. Journal of Open Source Software, 7(69), 4060, https://doi.org/10.21105/joss.04060

A BibTeX entry for LaTeX users is:

  @Article{,
    year = {2022},
    publisher = {The Open Journal},
    volume = {7},
    number = {69},
    pages = {4060},
    author = {Michael J. Mahoney and Colin M. Beier and Aidan C. Ackerman},
    title = {{terrainr}: An R package for creating immersive virtual environments},
    journal = {Journal of Open Source Software},
    doi = {10.21105/joss.04060},
    url = {https://doi.org/10.21105/joss.04060},
  }

Available Datasets

The following datasets can currently be downloaded using get_tiles or hit_national_map_api:

  • 3DEPElevation: The USGS 3D Elevation Program (3DEP) Bare Earth DEM.
  • USGSNAIPPlus: National Agriculture Imagery Program (NAIP) and high resolution orthoimagery (HRO).
  • nhd: A comprehensive set of digital spatial data that encodes information about naturally occurring and constructed bodies of surface water (lakes, ponds, and reservoirs), paths through which water flows (canals, ditches, streams, and rivers), and related entities such as point features (springs, wells, stream gauges, and dams).
  • govunits: Major civil areas for the Nation, including States or Territories, counties (or equivalents), Federal and Native American areas, congressional districts, minor civil divisions, incorporated places (such as cities and towns), and unincorporated places.
  • contours: The USGS Elevation Contours service.
  • geonames: Information about physical and cultural geographic features, geographic areas, and locational entities that are generally recognizable and locatable by name.
  • NHDPlus_HR: A comprehensive set of digital spatial data comprising a nationally seamless network of stream reaches, elevation-based catchment areas, flow surfaces, and value-added attributes.
  • structures: The name, function, location, and other core information and characteristics of selected manmade facilities.
  • transportation: Roads, railroads, trails, airports, and other features associated with the transport of people or commerce.
  • wbd: Hydrologic Unit (HU) polygon boundaries for the United States, Puerto Rico, and the U.S. Virgin Islands.

(All descriptions above taken from the National Map API descriptions.)

Note that sometimes these resources go offline, for reasons unrelated to anything in this package. You can see the current status of these resources at this link.

Installation

You can install terrainr from CRAN via:

install.packages("terrainr")

Or, if you want the newest patches and features, you can install the development version of terrainr from GitHub with:

# install.packages("devtools")
devtools::install_github("ropensci/terrainr")

Be aware that the development version is not stable, and features that haven’t been published on CRAN may change at any time!

Code of Conduct

Please note that this package is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.


terrainr's People

Contributors

actions-user, maelle, mikemahoney218


terrainr's Issues

merge_rasters is not tested on 3/4 band rasters

#22 introduced a regression in how merge_rasters handles mixed-band rasters (that is, the function went from handling them to not), which was missed because there are no tests to confirm mixed band rasters are successfully merged. I should add this not-uncommon outcome as a test case.
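A rough sketch of what such a test could look like; the fixture file names here are hypothetical and the tiles would need to be generated first:

library(testthat)

test_that("merge_rasters handles mixed 3- and 4-band rasters", {
  # Hypothetical fixtures: one RGB tile and one RGBA tile covering adjacent areas
  tiles <- c("testdata/tile_3band.tif", "testdata/tile_4band.tif")
  output <- tempfile(fileext = ".tif")
  merge_rasters(tiles, output)
  expect_true(file.exists(output))
})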

Release terrainr 0.7.2

Prepare for release:

  • git pull
  • Check current CRAN check results
  • Polish NEWS
  • devtools::build_readme()
  • urlchecker::url_check()
  • devtools::check(remote = TRUE, manual = TRUE)
  • devtools::check_win_devel()
  • rhub::check_for_cran()
  • revdepcheck::revdep_check(num_workers = 4)
  • Update cran-comments.md
  • git push

Submit to CRAN:

  • usethis::use_version('patch')
  • devtools::submit_cran()
  • Approve email

Wait for CRAN...

  • Accepted 🎉
  • git push
  • usethis::use_github_release()
  • usethis::use_dev_version()
  • git push

Documentation

It should be more explicit that the key goal of this package is simplifying the process for importing data into Unity, with API wrappers a secondary benefit for getting a very limited subset of data into Unity.

I believe the actionable steps here are probably:

  • Talk up image manipulation and raster tiling
  • Emphasize Unity connection in README

Additionally, from the review:

* I think there is some nice utility here but there is also some crossover with other packages like `FedData` and `elevatr`.

* Following this, other than elevation, this is not a "spatial data" retrieval package, as it returns PNGs that are more similar to map tiles than to spatial data. Thus the functionality is also similar to `rosm`.

* I cannot think of many cases for wanting the PNG files in R. That **certainly** does not mean they are not out there, but I think more clarity about what the package is returning (map tiles) and their use would help it gain traction.

* It's a weird point but one that I think is important. `terrainr` as a name sells your package short while also being misleading. Terrain is one of 9 endpoints offered but is also anomalous in that it is the only "spatial" data available in your package (in the traditional vector/raster sense).

* Arguably terrain is also the best served of the resources in the R ecosystem, with `elevatr` and `FedData`. `elevatr` on the surface is a better choice for getting elevation data simply because it offers multiple resolutions and the Mapzen server is quicker than the ArcGIS server.

* My expectation of the package coming in given the description of "retrieve geospatial data" was that for resources like the NHD, NHDHighRes, WBD, and transportation, I would be getting the vector data. Particularly since these are offered by the National Map. Therefore, it is very important to be upfront that the package is offering endpoints to (effectively) mosaicked web tiles (with one spatial data source).

* All of this is not to discount the utility of your package because I think it is one of the few to offer access to these resources which makes it very unique! But the functionality needs to be explicit because, as is, someone coming in with the expectation that they will be able to query spatial data will be disappointed, while those looking for map tile data will likely pass your package by.

Expected output files from make_manifest and instructions for rendering in Unity

I'm wondering what the expected output from the README example for Unity is:

make_manifest(output_tiles$elevation, output_tiles$ortho)

Note: The instructions for Unity create an output_tiles object using a simulated_data object, which was never created in the README.

with_progress( # When not specifying resolution, default is 1m pixels
  output_tiles <- get_tiles(simulated_data,
                            services = c("elevation", "ortho"))
)

Therefore, I used output_tiles from the beginning of the README:

library(terrainr)
library(sf)

# Optional way to display a progress bar while your tiles download:
library(progressr)
handlers("progress")
handlers(global = TRUE)

location_of_interest <- tmaptools::geocode_OSM("Hyampom California")$coords

location_of_interest <- data.frame(
  x = location_of_interest[["x"]],
  y = location_of_interest[["y"]]
)

location_of_interest <- st_as_sf(
  location_of_interest, 
  coords = c("x", "y"), 
  crs = 4326
)

location_of_interest <- set_bbox_side_length(location_of_interest, 8000)

output_tiles <- get_tiles(location_of_interest,
                          services = c("elevation", "ortho"),
                          resolution = 30 # pixel side length in meters
                          )
                          
merged_dem <- merge_rasters(output_tiles[["elevation"]], 
                            tempfile(fileext = ".tif"))
merged_ortho <- merge_rasters(output_tiles[["ortho"]], 
                              tempfile(fileext = ".tif"))

make_manifest(output_tiles$elevation,
              output_tiles$ortho)

I see the following in my working directory:

import_1_1.png
import_1_1.png.aux.xml
import_1_1.raw
import_terrain.cs
terrainr.manifest

Is this the expected output?

Next, how do we load these tiles (which files specifically) into Unity and render a scene?

Migrate off {raster}

The raster package will be deprecated in 2024, and it's time to start thinking about moving off of one of the core dependencies of the project.

The clear alternative is {terra}, given that it's by the same developer and has a similar API. However, I think it would be worthwhile to look into using {stars} for internal computation, so that the entire package (with its dependencies on units, sf, and stars) fits into one of the main silos of geocomputation.

Functions accepting raster inputs should quietly cast to the new format before continuing. I don't think any function returns raster objects directly -- usually we return file paths to a raster -- but if so, changing this will require a major version update.
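A minimal sketch of that quiet cast, assuming {terra} ends up being the replacement (this is an illustration, not the package's actual implementation):

# Accept file paths, legacy Raster* objects, or SpatRasters, and always hand
# back a SpatRaster for internal computation:
to_spatraster <- function(x) {
  if (inherits(x, "SpatRaster")) return(x)
  if (inherits(x, c("RasterLayer", "RasterStack", "RasterBrick"))) {
    return(terra::rast(x)) # terra converts raster objects directly
  }
  if (is.character(x)) return(terra::rast(x)) # treat strings as file paths
  stop("Unsupported raster input: ", paste(class(x), collapse = "/"))
}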

Release terrainr 0.7.3

Prepare for release:

  • git pull
  • Check current CRAN check results
  • Polish NEWS
  • devtools::build_readme()
  • urlchecker::url_check()
  • devtools::check(remote = TRUE, manual = TRUE)
  • devtools::check_win_devel()
  • rhub::check_for_cran()
  • revdepcheck::revdep_check(num_workers = 4)
  • Update cran-comments.md
  • git push

Submit to CRAN:

  • usethis::use_version('patch')
  • devtools::submit_cran()
  • Approve email

Wait for CRAN...

  • Accepted 🎉
  • git push
  • usethis::use_github_release()
  • usethis::use_dev_version()
  • git push

Edit to Vignette "Gentle Introduction"

Hello, I added a step converting the raster to a data frame:

elevation_raster <- raster::raster(output_files[[1]])
elevation_df_matrix <- raster::rasterToPoints(elevation_raster) # added this intermediate step
elevation_df <- as.data.frame(elevation_df_matrix, xy = TRUE)
elevation_df <- setNames(elevation_df, c("x", "y", "elevation"))

Hope this is helpful!

Dependency Minimization

This package relies both upon gdalUtils and gdalUtilities. gdalUtils is used twice for mosaic_rasters:

https://github.com/mikemahoney218/terrainr/blob/6b6e46256634356bdd751eadb3e89cb90094ca86/R/merge_rasters.R#L124

and once for gdal_translate:

https://github.com/mikemahoney218/terrainr/blob/6b6e46256634356bdd751eadb3e89cb90094ca86/R/raster_to_raw_tiles.R#L78

gdalUtilities is used once for gdal_translate:

https://github.com/mikemahoney218/terrainr/blob/6b6e46256634356bdd751eadb3e89cb90094ca86/R/raster_to_raw_tiles.R#L112

I believe -- but am not 100% sure -- that I used gdalUtilities back when I was trying to run things through JPG instead of PNG, meaning this could be painlessly swapped over to gdalUtils and we could safely get rid of the gdalUtilities package, which means we'd get rid of the hefty sf dependency.

Additionally, this package has a dependency on png for a single line of code: https://github.com/mikemahoney218/terrainr/blob/6b6e46256634356bdd751eadb3e89cb90094ca86/R/merge_rasters.R#L97

If it turns out we can pull down orthos as TIFFs, we can cut that dependency.

deepdep::plot_dependencies(deepdep::deepdep("terrainr", depth = 99, local = TRUE))


get_bbox_centroid

Same thing. st_centroid() does what you need. No need for specific classes and logic. For example:

get_bbox_centroid(list(
     c(lat = 44.04905, lng = -74.01188),
     c(lat = 44.17609, lng = -73.83493)))

# Equals 

bb = st_as_sfc(st_bbox(c(ymin = 44.04905, ymax = 44.17609,
                         xmin = -74.01188, xmax = -73.83493), crs = 4326))

st_centroid(bb)

hit_api

hit_national_map_api.R

* I suggest using `httr::RETRY("GET", ...)` rather than a tryCatch and counters; this will greatly simplify your logic. For example:
#Set UP
url = httr::parse_url("https://elevation.nationalmap.gov/arcgis/rest/services/3DEPElevation/ImageServer/exportImage")

query_arg <- list(
      bboxSR = 4326,
      imageSR = 4326,
      size = paste(8000, 8000, sep = ","),
      format = "tiff",
      pixelType = "F32",
      noDataInterpretation = "esriNoDataMatchAny",
      interpolation = "+RSP_BilinearInterpolation",
      f = "json"
    )

## 
first_corner <- bbox@bl
second_corner <- bbox@tr

bbox_arg <- list(bbox = paste(
    min(first_corner@lng, second_corner@lng),
    min(first_corner@lat, second_corner@lat),
    max(second_corner@lng, first_corner@lng),
    max(second_corner@lat, first_corner@lat),
    sep = ","
  ))

####======
#^ all the above could be replaced with paste(st_bbox(...), collapse = ",")

You have a lot of logic wrapping your httr calls which is essentially:

res <- httr::GET(url, query = c(bbox_arg, query_arg))
content = httr::content(res, type = "application/json")

An identical call, but one that builds in some of that error catching and retrying, would be:

res2 <- httr::RETRY("GET", url, 
                   query = c(bbox_arg, query_arg), 
                   times = 10, pause_cap = 10)

get_tiles hangs when National Map API non-responsive

Sometimes get_tiles hangs in an infinite loop trying for a correct API response to a request for an ortho image. Maybe 50% or more of the time it works fine. Maybe 50% or less of the time it hangs. With verbose=TRUE, I get an infinite sequence of messages "API call 2 attempt 1". When I ESC out, R itself sometimes hangs and I need to restart.

Looking near Line 220 of hit_api.R, I get the sense that "counter" is never incremented, so as long as the API is not behaving, the while loop never ends.

Suggestions / questions:

  1. Short term. A graceful exit from the loop would be good, i.e. counter = counter + 1 (a sketch of this appears below).
  2. Any ideas why the API sometimes seems to be non-responsive?

Thanks!
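A minimal sketch of the short-term fix from point 1, with hypothetical variable names standing in for the ones in hit_api.R; the real function also checks the response content, but the key change is that the loop is now bounded:

max_tries <- 10
counter <- 0
response <- NULL
while (is.null(response) && counter < max_tries) {
  counter <- counter + 1 # the missing increment
  response <- tryCatch(
    httr::GET(url, query = query),
    error = function(e) NULL
  )
}
if (is.null(response)) {
  stop("National Map API did not respond after ", max_tries, " attempts")
}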

utils (1)

Check out the units package to handle all these unit conversions, but really, the unit given should match the CRS of the input data. Because of this, I might consider being more strict about the units you allow for a buffering call.
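For instance, a small sketch of what leaning on units for that conversion could look like (the values here are made up):

library(units)

distance <- set_units(1.5, "km") # user-supplied buffering distance
set_units(distance, "m")         # convert to the meters most projected CRSs expect
#> 1500 [m]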

add_bbox_buffer.R

There are a lot of functions in this package that recreate sf functionality. These could be simplified, making the code much easier to maintain. While the implementation of these methods is obviously well considered, it is fragile compared to the GDAL/GEOS/PROJ backend supporting sf. For this package to attract wide use, I think the package-specific classes/structures must be removed. For example, add_bbox_buffer:

Could be simplified to something like:

library(sf)
bb = st_as_sfc(st_bbox(c(ymin = 44.04905, ymax = 44.17609,
                         xmin = -74.01188, xmax = -73.83493), crs = 4326))
tmp = st_transform(bb, 5070) # EPSG ensures meters
tmp = st_buffer(tmp, 10, 
                joinStyle = 'MITRE', 
                endCapStyle = "SQUARE", 
                mitreLimit = 2)

out = st_transform(tmp, 4326) # Put back to Lat Long
* It is certainly a shameless self-plug, but check out the AOI/climateR workflows. AOI allows users to (1) flexibly generate AOIs, and (2) climateR simply takes a spatial object and can identify the underlying climate grids:
  
  * Build AOIs: https://github.com/mikejohnson51/AOI
  * Use AOIs to get climate grids: https://github.com/mikejohnson51/climateR

* The new nhdplusTools 'get' family (get_nhdplus(), get_huc8(), get_waterbodies()) also uses this logic:
  
  * Example: https://github.com/USGS-R/nhdplusTools/blob/master/R/get_nhdplus.R

* Think about this kind of workflow for more generic queries.

* `elevatr` also offers a nice way that is more generic:

https://github.com/jhollist/elevatr/blob/4c45d548231400284f72b92b42dff6d9fd9a0928/R/get_elev_raster.R#L9

merge_raster

The GDAL stuff you have here is (reasonably!!) way more than needed. One big reason to avoid gdalUtils in a package is that it requires an instance of GDAL that is different from the version that comes with sf. For many users this can be an issue.
Here is a brief solution to these issues, with much less code and much more utility:

The example you have only gives one raster which I assume wouldn't need to be merged? This should probably be changed, but I pulled an example from one of your other functions:

simulated_data <- data.frame(
  id = seq(1, 100, 1),
  lat = runif(100, 44.04905, 44.17609),
  lng = runif(100, -74.01188, -73.83493)
)

bbox <- get_coord_bbox(lat = simulated_data$lat, lng = simulated_data$lng)
bbox <- add_bbox_buffer(bbox, 100)
img_files = get_tiles(bbox, tempfile())

So the goal is to get this to a single raster right? If so try:

target_prj = st_crs(5070)$proj4string
method = "near"
destfile1 = tempfile()

sf::gdal_utils(util = "warp", 
                 source = unlist(img_files), 
                 destination = destfile1,
                 options = c("-t_srs", as.character(target_prj),
                             "-r", method))

raster::raster(destfile1)

# class      : RasterLayer 
# dimensions : 16782, 14984, 251461488  (nrow, ncol, ncell)
# resolution : 1.151324, 1.151324  (x, y)
# extent     : 1736879, 1754131, 2540862, 2560184  (xmin, xmax, ymin, ymax)
# crs        : +proj=aea +lat_0=23 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs 

The cool thing about the above is that it is short and sweet, relies only on sf, and offers you a direct path to letting users specify the output CRS and resampling method (by exposing target_prj, method, and destfile1 as function parameters).

Another reason to keep your bbox objects in standard classes is that you can use them to quickly crop the merged raster in the same call, by first writing the bbox object to a shapefile. So instead of the above you could do:

# From the same coordinates make a 'sf' bbox of your points:
bbox = st_as_sf(simulated_data, coords = c("lng", "lat"), crs = 4326) %>% 
  st_bbox() %>% 
  st_as_sfc() %>% 
  st_as_sf()

# Write temporary shp
tmp_shp = tempfile(fileext = ".shp")
write_sf(bbox, dsn = tmp_shp)

destfile2 = tempfile()

sf::gdal_utils(util = "warp", 
                 source = unlist(img_files), 
                 destination = destfile2,
                 options = c("-t_srs", as.character(target_prj),
                             "-r", method,
                             '-cutline', tmp_shp,
                             '-crop_to_cutline'))

raster::raster(destfile2)

# class      : RasterLayer 
# dimensions : 14095, 14505, 204447975  (nrow, ncol, ncell)
# resolution : 1.151359, 1.151313  (x, y)
# extent     : 1737200, 1753901, 2542188, 2558415  (xmin, xmax, ymin, ymax)
# crs        : +proj=aea +lat_0=23 +lon_0=-96 +lat_1=29.5 +lat_2=45.5 +x_0=0 +y_0=0 +datum=NAD83 +units=m +no_defs 
# source     : /private/var/folders/_d/jkzhpcss17v0sy0zjr7xhrxr0000gn/T/RtmpyEg8av/file220c7958114b 
# names      : file220c7958114b

hit_api (2)

You could use this approach again to get the images and write them to disk, instead of httr::content(..., "raw"). For example:

tmpfile <- tempfile()
content <- httr::content(res2, type = "application/json")

httr::RETRY("GET", content$href, 
                   httr::progress(), 
                   httr::write_disk(tmpfile, overwrite = TRUE),
                   times = 10, pause_cap = 10)

raster::raster(tmpfile) %>% raster::plot()

georeference (1)

The given example does not run because the arguments are in the wrong order. Also, it doesn't seem to need alignment: the input and output are identical.

Improve test coverage

I've been rapidly iterating on this version, removing obsolete unit tests without replacement. I should ensure test coverage is above 95% before resubmission.

classes.R

I think your toolset has a lot of value outside the terrainr environment, but it becomes very limited with the imposed classes for things like coordinate sets and bounding boxes which already have well defined and common class structures that play nicely in the R spatial environment. I would like to see all the specific classes removed from the package or else a strong justification for what they add. I am aware this will take some considerable rewrite in the package but think it is well worth it.

gdalwarp runs into system-level open file limits when merging many tiles

When attempting to use gdalwarp to merge many files, gdalwarp begins by opening a connection to each input file. On some systems (including Linux) this causes an error when there are more input files than may be opened by a single process at once (default 1024). This causes merging to fall back to merge_rasters_deprecated, which often, but not always, works as a replacement.

One potential fix is to use gdalbuildvrt to "combine" the tiles into a virtual raster, and then gdalwarp to transform that virtual raster to an on-disk file. I'm not sure about the implications of this for performance -- possibly we only do this when merging > 100 files?
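A sketch of that two-step approach through sf, assuming tile_files is a character vector of input paths and output_file is the destination GeoTIFF:

vrt_file <- tempfile(fileext = ".vrt")

# Build a lightweight virtual raster that only references the tiles, so the
# warp step opens one file instead of thousands:
sf::gdal_utils(
  util = "buildvrt",
  source = tile_files,
  destination = vrt_file
)

# Then warp the virtual raster to a real on-disk file:
sf::gdal_utils(
  util = "warp",
  source = vrt_file,
  destination = output_file
)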

georeference (2)

No need for the tiff package (I think), since raster reads TIFFs already (raster will also read PNG and JPEG, just into RGBA bands).

merge_rasters cannot handle mixed 3/4 band rasters

The NAIP endpoint has a tendency, when transparent = TRUE, to return a mix of 3 and 4 band images (I imagine 3 when there are no transparent pixels and 4 when there are, but I haven't validated this). This causes problems with at least my naive use of gdalwarp, where the output raster is assumed to have the same number of bands as the first input file. The older implementation of merge_rasters is much slower in this case (by virtue of reading every file into memory) but doesn't error out.

In the short term, I plan to fix this by reintroducing the older code as a fallback mechanism. In the mid term, I think I can handle this particular case by using the -dstnodata flag, and make it less common by changing the default transparency argument for the NAIP endpoint.
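A rough sketch of that mid-term -dstnodata idea (untested, as noted above; tile_files and output_file are placeholders):

sf::gdal_utils(
  util = "warp",
  source = tile_files,
  destination = output_file,
  # Write 255 wherever an input has no data, rather than relying on an alpha
  # band that only some of the NAIP tiles carry:
  options = c("-dstnodata", "255")
)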

Tests don't check names for get_tiles return

As a result of #12, I'm working on changing the names of the list returned by get_tiles. This was maybe alarmingly easy to do -- not a single test failed as a result of the new names! It would probably be good to establish tests with the new behavior (names in == names out).
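Something like the following sketch would pin that behavior down; it assumes network access (so it would be skipped on CRAN) and a point object built the same way as location_of_interest in the README:

library(testthat)

test_that("get_tiles names its output after the requested services", {
  skip_on_cran()
  services <- c("elevation", "ortho")
  tiles <- get_tiles(location_of_interest, services = services)
  expect_named(tiles, services)
})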

Things To Do Before "Completion"


Documentation

    • Validate roxygen documentation for each of the following:
      • add_bbox_buffer
      • calc_haversine_distance
      • classes
      • get_bbox
      • get_bbox_centroid
      • get_tiles
      • hit_api
      • merge_rasters
      • point_from_distance
      • raster_to_raw_tiles
      • utils
    • To be complete, each function needs a title, description, parameters, return, examples, and family tags.
    • Write README
    • Write intro vignette
    • Write Unity vignette
    • Add a NEWS file
    • Finish DESCRIPTION
    • Clean website
    • Add paper.md

Testing

    • Get each of the following to sufficient coverage
      • add_bbox_buffer
      • calc_haversine_distance
      • classes
      • get_bbox
      • get_bbox_centroid
      • get_tiles
      • hit_api
      • merge_rasters
      • point_from_distance
      • raster_to_raw_tiles
      • utils
    • Investigate setting up CI prior to magick release

Change readme images

This is small, but the README images are highlighting forested NAIP imagery, which is fine but not the intent of the NAIP program and not particularly stunning at 1m resolution. Should look into areas out West for replacement.

Release terrainr 0.7.1

Prepare for release:

  • git pull
  • Check current CRAN check results
  • Polish NEWS
  • devtools::build_readme()
  • urlchecker::url_check()
  • devtools::check(remote = TRUE, manual = TRUE)
  • devtools::check_win_devel()
  • rhub::check_for_cran()
  • revdepcheck::revdep_check(num_workers = 4)
  • Update cran-comments.md
  • git push

Submit to CRAN:

  • usethis::use_version('patch')
  • devtools::submit_cran()
  • Approve email

Wait for CRAN...

  • Accepted 🎉
  • git push
  • usethis::use_github_release()
  • usethis::use_dev_version()
  • git push

Add section on single-point workflow to overview vignette

Removing get_bbox (see #14) and pushing towards raster/sf generics means that the basic workflow for single point-of-interest is entirely different now. This original section has been excised from the overview vignette:

We can also calculate a bounding box for our single point:

peak_bbox <- get_bbox(lat = mt_elbert[[1]], 
                      lng = mt_elbert[[2]])
peak_bbox

But of course, this bounding box doesn't contain anything other than our single
point!

If we want, we can extend the bounding box by a set distance using
add_bbox_buffer():

add_bbox_buffer(bbox = peak_bbox, 
                distance = 1000,
                distance_unit = "meters")

Which, if we only have a single point, can be equivalent to simulating data as
we did above!

all.equal(add_bbox_buffer(bbox = peak_bbox, 
                distance = 1000,
                distance_unit = "meters"),
          mt_elbert_bbox,
          tolerance = 0.00001)

It'll make sense to add in a section with a replacement workflow to the end.
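A sketch of what that replacement section could show, using the same sf-based pattern as the current README (the Mount Elbert coordinates are approximate):

library(sf)
library(terrainr)

mt_elbert <- st_as_sf(
  data.frame(x = -106.4454, y = 39.1178),
  coords = c("x", "y"),
  crs = 4326
)

# Expand the single point into a square area of interest, 2000 m on a side
mt_elbert_aoi <- set_bbox_side_length(mt_elbert, 2000)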

Remove get_bbox

get_bbox.R

Again sf::st_bbox does everything needed. No need to write unique methods to extract bbox's from common spatial classes. This removes your dependency of rlang too.

For example:

st_bbox(c(ymin = 44.04905, ymax = 44.17609,
                         xmin = -74.01188, xmax = -73.83493), crs = 4326)

st_bbox(raster::raster(ncol=100, nrow=100))

Branding


* One **huge** opportunity for your package, that I think you already have worked out, is how to integrate these resources with ggplot as a base map. It is a constant frustration of mine (and many I know) to get nice base maps for ggplot.

* Yes, `tmap` has some utilities for this and `ggmap` is out there (I have always struggled with it), but there is room here for growth. If you could add a `geom_terrainr()` that played nicely with `ggplot`, I think that would be immediately useful to many people.

add_bbox_buffer does not return objects with a CRS

library(sf)
#> Linking to GEOS 3.8.0, GDAL 3.0.4, PROJ 6.3.1
library(terrainr)
campsites <- read_sf("https://opendata.arcgis.com/datasets/06993a2f20bf42d382c8ce6bd54027c4_0.geojson")
campsite_bbox <- add_bbox_buffer(campsites, 250)
st_crs(campsite_bbox)
#> Coordinate Reference System: NA

Created on 2021-05-05 by the reprex package (v1.0.0)

terrainr 0.5.0

Targeting an August release, I want the following to get done before release:

  • get_tiles handles projected coordinates
  • make_manifest is implemented for automated merging
  • The "Import to Unity" vignette is completely reworked to use make_manifest instead
  • The README is rewritten to use a more attractive surface (Zion?)
  • Edit A Gentle Introduction to terrainr to improve accuracy with the current version

With stretch goals:

  • Remove custom classes from hit_national_map_api
  • Remove custom classes from get_tiles
  • Remove custom classes from split_bbox

That last one is a lot less likely to get done by August, but I think I can expect to have all exported code expect sf objects in this release (with removal in 2022).

show.legend()

Thank you for developing this package! I tried to use the geom_spatial_rgb() function to make an RGB plot from a raster object, which worked great, except for the legend. I ran the code with show.legend = TRUE, and also tried modifying the source code in the function, but I always get the plot without the legend.

ggplot() + geom_spatial_rgb(data = raster_object, aes(x = x, y = y, r = loc_scaled, g = for_scaled, b = rev_scaled))

Thanks!

Release terrainr 0.7.4

Prepare for release:

  • git pull
  • Check current CRAN check results
  • Polish NEWS
  • devtools::build_readme()
  • urlchecker::url_check()
  • devtools::check(remote = TRUE, manual = TRUE)
  • devtools::check_win_devel()
  • rhub::check_for_cran()
  • revdepcheck::revdep_check(num_workers = 4)
  • Update cran-comments.md
  • git push

Submit to CRAN:

  • usethis::use_version('patch')
  • devtools::submit_cran()
  • Approve email

Wait for CRAN...

  • Accepted 🎉
  • git push
  • usethis::use_github_release()
  • usethis::use_dev_version()
  • git push

get_tiles

* This function, even for the example, takes a while to run; it would be nice to have some messaging along the way to give confidence it is still going if `progressr` is not installed. Consider using `httr::progress()` in `hit_national_map_api`. More on that below.

geom_terrainr

One huge opportunity for your package, that I think you already have worked out, is how to integrate these resources with ggplot as a base map. It is a constant frustration of mine (and many I know) to get nice base maps for ggplot.
Yes, tmap has some utilities for this and ggmap is out there (I have always struggled with it), but there is room here for growth. If you could add a geom_terrainr() that played nicely with ggplot, I think that would be immediately useful to many people.

Please remove dependencies on **rgdal**, **rgeos**, and/or **maptools**

This package depends on (depends, imports or suggests) raster and one or more of the retiring packages rgdal, rgeos or maptools (https://r-spatial.org/r/2022/04/12/evolution.html, https://r-spatial.org/r/2022/12/14/evolution2.html). Since raster 3.6.3, all use of external FOSS library functionality has been transferred to terra, making the retiring packages very likely redundant. It would help greatly if you could remove dependencies on the retiring packages as soon as possible.

Example not reproducible: img_width not found

I suspect this is a change that came with the newest version of {sf}. The examples in get_tiles() do not work out of the box.

library(terrainr)
simulated_data <- data.frame(
  id = seq(1, 100, 1),
  lat = runif(100, 44.04905, 44.17609),
  lng = runif(100, -74.01188, -73.83493)
)

simulated_data <- sf::st_as_sf(simulated_data, coords = c("lng", "lat"))

get_tiles(simulated_data, tempfile())
Error in hit_national_map_api(current_bbox, current_box[["img_width"]],  : 
  object '"img_width"' not found
In addition: Warning messages:
1: Assuming geographic CRS.
ℹ Set 'projected' to TRUE if projected. 
2: Assuming CRS of EPSG 4326
ℹ Set bboxSR explicity to override 

README

Readme - you have an 'elev' hanging at the end of the 2nd 'plot' line

Validate documentation

0.3.0 is a major update! I need to make sure to do a final documentation pass before resubmission. Below are a few thoughts about issues likely to have cropped up:

  • @param bbox either no longer exists or has been updated to data
  • @return has been updated to non-terrainr S4 classes (#24)

Debug why the `govunits` endpoint is failing on CI

The scheduled CI job is currently failing with the following:

══ Failed tests ════════════════════════════════════════════════════════════════
── Error ('test-2-get_tiles_govunits.R:19'): get_tiles gets the same govunits tiles twice ──
Error in `names(lines) <- format(c("", row_idx), align = "right")`: 'names' attribute [15] must be the same length as the vector [3]
Error: Error: R CMD check found ERRORs
Execution halted
Backtrace:
 1. └─testthat::expect_equal(png::readPNG(output_tif[[1]]), png::readPNG("testdata/govunits.png")) at test-2-get_tiles_govunits.R:19:2
 2.   └─testthat:::expect_waldo_equal("equal", act, exp, info, ..., tolerance = tolerance)
 3.     └─testthat:::waldo_compare(...)
 4.       └─waldo::compare(x, y, ..., x_arg = x_arg, y_arg = y_arg)
 5.         └─waldo:::compare_structure(x, y, paths = c(x_arg, y_arg), opts = opts)
 6.           └─waldo:::compare_vector(x, y, paths = paths, opts = opts)
 7.             └─waldo:::compare_numeric(...)
 8.               └─waldo:::printed_rows(x, y, paths = paths)

[ FAIL 1 | WARN 2 | SKIP 0 | PASS 103 ]
Error: Error: Test failures
Execution halted

1 error | 0 warnings | 1 note
Error: Process completed with exit code 1.

I'm not really sure what's going on here and haven't had time to check.

Bump sf version and remove overwrite logic

Right now, merge_rasters does a bit of a funny dance to get around issues with sf::gdal_utils("warp", ...) in the release version of sf:

# see https://github.com/r-spatial/sf/issues/1834 for why we don't pass this
# as an option
if (any(options == "-overwrite")) overwrite <- TRUE
initial_file <- output_raster
if (overwrite) initial_file <- tempfile(fileext = ".tif")

This is fixed in the next version of sf (see the same issue). I don't want to release a version with that dance to CRAN, so following sf's next release I should remove the logic, require the new version of sf, and do a patch release.
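Once the fixed sf release can be required, the dance could collapse to passing the flag straight through; a sketch with illustrative variable names:

sf::gdal_utils(
  util = "warp",
  source = input_rasters,
  destination = output_raster,
  options = c(options, "-overwrite") # safe once r-spatial/sf#1834 is in a release
)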

Release terrainr 0.7.5

Prepare for release:

  • git pull
  • Check current CRAN check results
  • Polish NEWS
  • usethis::use_github_links()
  • urlchecker::url_check()
  • devtools::build_readme()
  • devtools::check(remote = TRUE, manual = TRUE)
  • devtools::check_win_devel()
  • revdepcheck::revdep_check(num_workers = 4)
  • Update cran-comments.md
  • git push

Submit to CRAN:

  • usethis::use_version('patch')
  • devtools::submit_cran()
  • Approve email

Wait for CRAN...

  • Accepted 🎉
  • usethis::use_github_release()
  • usethis::use_dev_version(push = TRUE)

Consider setting transparent = FALSE by default for NAIP endpoint

The image endpoints have the option to include alpha bands in the downloaded files, a very useful time-saving step for mostly empty layers (like NHD and others, which are mostly vector data acquired as map tiles).

NAIP, however, does not have nearly as many of these pixels, and as a result appears to return 3-band rasters when no pixels in a tile are transparent. Map tiles with mixed bands have caused issues for merging rasters (#30, fixed by #32), so it might make sense to look at any impact associated with setting transparent to FALSE by default.

My assumption is that setting transparent to FALSE will resolve this issue, but this is untested; that should be the first step in fixing this issue.
