
geocompr's People

Contributors

appelmar, babayoshihiko, cuixueqin, darrellcarvalho, dcooley, defuneste, eblondel, erstearns, eyesofbambi, florentbedecarratsnm, florisvdh, iod-ine, jannes-m, katygregg, kiranmayiv, krystof236, lvulis, marcosci, mikejohnpage, nickbearman, nowosad, pat-s, prosoitos, robinlovelace, rsbivand, sdesabbata, smkerr, tibbles-and-tribbles, tylerlittlefield, zmbc


geocompr's Issues

Name

I think Geocomputation with R is a better working title and that the repo would benefit from being called geocompR but let's decide after April 11th.

Section on projections and datum

People working with geospatial data rarely work in a single projection; however, this topic is often neglected in books.

A brief intro and explanation of how to transform existing maps between various projections would be great. More importantly, it would be extremely interesting to read about what can go badly wrong when projections are mismatched.
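A minimal sketch of such a section, using sf and its bundled `nc` dataset (the EPSG code below is just one plausible choice for North Carolina):

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
st_crs(nc)                        # inspect the current CRS (geographic, NAD27)
nc_proj = st_transform(nc, 32119) # reproject to NC State Plane (EPSG:32119)
# Mixing layers with different CRSs is where things go badly wrong;
# recent sf versions refuse outright, e.g.:
# st_intersection(nc, nc_proj)    # error: st_crs(x) == st_crs(y) is not TRUE
```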

Michael Sumner's vectors tips

https://twitter.com/mdsumner/status/860676991040733184

  • rmapshaper::ms_simplify for reducing resolution while maintaining topology (shared edges) and object data /1
    - raster::shapefile for easy read/write /2 (but say no to shp)
  • rgeos::gBuffer(xsdf, width = 0, byid = TRUE) and similar w sf::st_buffer will resolve many topology problems, geometry errors /3
  • ggpolypath will draw polygon holes correctly but is limited with some aesthetics because identity is implicit in gg not absolute /4
    - ggplot2-like fortify to df decomposition is rogue but helpful for some tasks, see spbabel for examples including recomposition /5
    - use df[c("x", "y")] <- as.data.frame(rgdal::project(as.matrix(df[c("lon", "lat")]), prj)) rogue diy transforms /6
  • don't go rogue until you know what you're doing with from/to proj crs ... /7
  • ultimate auto-graticule is probably unicorn, see graticule pkg for manual control over getting exactly what's needed /7
  • your map is a unique creation, do whatever rogue or otherwise to get what you need /8
  • use mapview and leaflet and plotly to interactively explore your map data /9
  • use fasterize for rasterizing /10
  • use sf::st_crs for EPSG look up /11
  • use Geopackage for data you care about, shapefile do a lot of dumb down, though sometimes it's the only option /12
  • sf uses planar assumptions for projected but ellipsoidal for longlat /13
  • set crs NA for planar in longlat, transform back then forward for ellipsoidal in projected (need tell much longer story here) /14
  • sf::st_segmentize in longlat? You probably wanted vertical orthodromes and horizontal loxodromes, i.e planar assumption /15
  • sf poly constructors do heavy checking on ring closure, diy/rogue for fast creation when checking not needed /16
  • use sf st_read/read_sf and write counterparts for wicked fast I/o, use as(x, "Spatial") st_as_sf rather than rgdal for sp i/o /17
  • try geom_sf in dev version of tidyverse/ggplot2 share your real world examples! /18
  • use spex pkg to get an extent polygon (rather than just a raw extent with no crs) /19
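Tips /1 and /3 above can be sketched in a few lines (assuming rmapshaper and sf are installed; the `keep` value is arbitrary):

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
# /1: reduce resolution while keeping shared edges (topology) intact
nc_simple = rmapshaper::ms_simplify(nc, keep = 0.05)
# /3: a zero-width buffer often repairs self-intersecting polygons
nc_fixed = st_buffer(nc_simple, dist = 0)
all(st_is_valid(nc_fixed))
```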

Write chapter 5 on transforming data

Will contain information on CRS transformations (raster and vector) as well as affine transforms. Could also cover transformations of raster datasets.
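The affine part could lean on sf's arithmetic on geometry columns — a sketch (the shift and angle are arbitrary; note these operations drop the CRS):

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
g = st_geometry(nc)
rot = function(a) matrix(c(cos(a), sin(a), -sin(a), cos(a)), 2, 2)
g_shifted = g + c(1, 0)      # translation
g_scaled  = g * 0.5          # scaling about the origin
g_rotated = g * rot(pi / 4)  # rotation by 45 degrees
# CRS transformations, by contrast, go through st_transform() (vector)
# and projectRaster() (raster)
```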

Ideas for additional topics to include

List of suggestions for the second edition. See below for various ideas (see #12 (comment) for an older version of this list that includes ideas already implemented).

Part I

  • Additional operations on vector data: st_convex_hull, st_triangulate, st_polygonize, st_segmentize, st_boundary and st_node - see c1d1c0e
  • S2 geometry
  • geos package by @paleolimbot
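A quick sketch of the vector operations listed above (assuming sf; `st_triangulate` needs GEOS >= 3.4, and the coordinates are made up):

```r
library(sf)
pts = st_sfc(st_multipoint(rbind(c(0, 0), c(1, 0), c(1, 1), c(0.2, 0.8))))
st_convex_hull(pts)     # smallest convex polygon containing the points
st_triangulate(pts)     # Delaunay triangulation
ln = st_sfc(st_linestring(rbind(c(0, 0), c(1, 1))))
st_segmentize(ln, 0.1)  # add vertices so no segment exceeds the given length
st_boundary(st_convex_hull(pts))  # boundary (ring) of the hull polygon
```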

Part II

  • 3D mapping section/chapter

  • ? A new part on visualizing geographic data with static/interactive/other chapters. Just thinking it could be usefully split-up given its popularity (RL)

Part III

  • Content in the algorithms chapter building on work by Stan Openshaw, father of 'geocomputation', e.g. a function that implements GAM, with reference to the series of conferences called GeoComputation. [aka how to implement spatial ideas in R]
  • Include something on facets ordered by location

Other

Reference issue in the second chapter

@Robinlovelace the second chapter mentions three packages - sp, rgdal and rgeos - but references for them are missing. It looks like they are not in the geocompr package description, so there are no valid bib entries for them.
The question is: should we add them to the description, or should we add their bib entries to the ref.bib file?

make build and make html errors

@Robinlovelace can you check if make build or make html work for you? I've got an error for lines 119-128 in the 03-attr.Rmd:

Error in as(st_geometry(x), "Spatial") :
  no method or default for coercing "sfc_MULTIPOLYGON" to "Spatial"
Calls: <Anonymous> ... eval -> eval -> set_units -> st_area ->
 <Anonymous> -> as
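A hedged guess at a workaround while the version mismatch is sorted out: coerce the whole sf object rather than the bare sfc geometry column (newer sf releases also added the missing sfc coercion methods):

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
nc_sp = as(nc, "Spatial")  # coerce the full sf data.frame, not st_geometry(nc)
class(nc_sp)               # SpatialPolygonsDataFrame
```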

Michael Sumner's rasters tips

https://twitter.com/mdsumner/status/860482390702960641

  • get the thing into memory for processing: readAll(x) /1
  • doing multiple extractions on same vector layer, use cellFromXY, cellFromPolygons etc. to build index /2
  • don't use projectRaster(raster, crs = projection(shp)), use spTransform(shp, projection(raster)) /3
  • extract on a one-layer RasterLayer may be slow, use extract(brick(raster()), ...) one weird trick /4
  • extract tiled file is VERY SLOW, readAll into memory, or rewrite untiled writeRaster(raster(), "file.tif") if too big for mem /5
  • read the performance vignette for large files https://cran.r-project.org/web/packages/raster/vignettes/functions.pdf /6
  • (read the introduction vignette too) /7
  • a brick on file in .grd format is slow for crop(brick, ex) for every layer, it's better if brick is in memory /8
  • re /8 this is not true for a brick on file in NetCDF format, internally optimized for crop() on all layers /9
  • crop, extract, other ops will be slow for a stack composed of multiple single-layer files, try conversion to a bulk format /10
  • do try coarse grained parallel::parLapply(cluster, list-of-rasters-or-files, rasterOps) it can be effective /11
  • explore velox, fasterize, and unrelated packages like spatstat, oce / 12
  • sometimes raster(readGDAL(file)) is faster than raster(file) /13
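Tip /2 above in miniature — compute the cell index once and reuse it, instead of re-running a spatial lookup per extraction (toy raster and points, assuming the raster package):

```r
library(raster)
r = raster(nrows = 10, ncols = 10, vals = runif(100))     # default global extent
xy = cbind(runif(25, -180, 180), runif(25, -90, 90))      # query points
cells = cellFromXY(r, xy)  # the expensive lookup, done once
r[cells]                   # extraction by index - cheap to repeat per layer
```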

The idea from https://twitter.com/clavitolo/status/865671631368105984

Make chapter beginnings consistent

Currently some have this:

---
knit: "bookdown::preview_chapter"
---

They should all have prerequisites...

@Nowosad note my use of pacman's p_load() for package loading - you happy with that? I think it's the best solution and that the benefits justify the costs of having another (minor) dependency for the book.
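The prerequisite chunk each chapter could open with might look like this (package list illustrative):

```r
# p_load() installs any missing packages, then loads them all
if (!require("pacman")) install.packages("pacman")
pacman::p_load(sf, raster, spData)
```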

Installing geocompr fails

Installation of each package fails with the message: 'Don't know how to decompress files with extension - '. Where - appears to be the version number of the particular package, e.g. 3-23 or 94-5 etc.

I'm using RStudio 1.0.136 (R v3.2.0) on Mac OS X 10.10.5

Thanks for your help - looking forward to reading (contributing?)

Stu

Definition of "Multiple Features"

Section 2.1.4 says: "So far, our geometry types have just included one feature. To represent multiple features in one object, we can use the “multi”-version of each geometry type:"

But I think (and a careful reading of the standard would confirm this) these things are still "single features". They're just single features with multiple (i.e. compound) geometries. You only get multiple features when you create an sfc object.
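The distinction in code (sketch, assuming sf):

```r
library(sf)
mp = st_multipoint(rbind(c(0, 0), c(1, 1)))  # one sfg: still a single feature
one = st_sfc(mp)
length(one)  # 1 - one feature with a compound geometry
two = st_sfc(st_point(c(0, 0)), st_point(c(1, 1)))
length(two)  # 2 - an sfc is what actually holds multiple features
```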

gpkg instead of shp

My idea is to mostly use the .gpkg format for vector examples. We could show one example with .shp at the beginning, and shortly inform that we will use .gpkg later in the book. We could also add an extended explanation why .gpkg is better in an appendix to the book. What do you think about it @Robinlovelace ?
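The switch is a one-liner with sf; the advantages in the comments are the usual arguments for GeoPackage over shapefile:

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
st_write(nc, "nc.gpkg")  # one file (no .shp/.dbf/.prj/.shx sidecars),
                         # no 10-character field-name limit, no 2 GB cap
nc2 = st_read("nc.gpkg")
```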

Section on best practices for visualisation

Hi there, your book already looks amazing!

It would be great to add a section dedicated to best practices for visualisation, e.g. use of color-blind friendly palettes. A package that provides some nice palettes is viridis. There are also tools to select good color schemes for maps and other graphics (e.g. colorbrewer2).

The use of these palettes throughout the book would also help strengthen the message.
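For instance, sf's plot method accepts a palette function directly (sketch, assuming viridis is installed; the `BIR74` column comes with the `nc` demo data):

```r
library(sf)
nc = st_read(system.file("shape/nc.shp", package = "sf"))
plot(nc["BIR74"], pal = viridis::viridis)  # color-blind friendly palette
```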

load spData from github not cran

Current (25 June):
2 Geographic data in R
Prerequisites
This chapter requires the packages sf, and spData to be installed and loaded:
library(sf)
library(spData)

Should be:
devtools::install_github("nowosad/spData")

Additional datasets to use

Finding interesting datasets to showcase algorithms can be very difficult.

One option could be to use ECMWF and Copernicus data. If you are not familiar with them, ECMWF stands for the European Centre for Medium-Range Weather Forecasts; it is an inter-governmental organisation that stores the largest meteorological data archive in the world (global coverage). Copernicus is a European programme that provides satellite and in-situ observations for a number of domains (e.g. biodiversity, environmental protection, climate, public health, tourism, etc.).

There are countless uses for these datasets!

issue with dplyr and shiny

I am trying to use dplyr verbs in reactive functions in a shiny app (and I am pretty new to shiny). The app is quite long, so I am not attaching it. Basically the issue is with the instruction within the server function which subsets one of the datasets I am loading:

  # some code here
  # load files (from the data folder) ---------------------------------------

  studies <- readRDS("data/studies.rds")
  samples <- readRDS("data/samples.rds")
  edges <- readRDS("data/edges.rds")
  taxa <- readRDS("data/taxa.rds")
  version <- readRDS("data/version.rds")
 
 # some more code here defining the ui

server <- function(input, output) {
  # some code irrelevant to the issue (defines other outputs)
  
  # samples table, view tab -------------------------------------------------
  # Render selectInput for study, View tab 
    
    output$choose_study <- renderUI({
    study_names <- studies$studyId
    selectInput("view_study",
                label = h5("Select one or more studies to explore samples"), 
                choices = study_names, 
                selected = "ST1",
                multiple = TRUE)    
  })
  
  # Reactively subset so that only the selected study is viewed

  f_vsamples <- reactive({
    subset(samples, studyId %in% input$view_study, 
           select = c(label_2, s_type, n_reads2, foodId, llabel, L1, description))
  })

  output$vsamples_table <- DT::renderDataTable(
    f_vsamples(),
    rownames = F,
    escape = F,
    colnames = c("label" = "label_2", 
                 "sample type" = "s_type", 
                 "reads" = "n_reads2",
                 "code" = "foodId", 
                 "ext. code" = "llabel",
                 "food group" = "L1",
                 "description" = "description"),
    options = list(pageLength = 5,
                   lengthMenu = c(5, 10, 25, 20),
                   columnDefs = list(
                     list(
                       targets = 5,
                       render = JS(
                         "function(data, type, row, meta) {",
                         "return type === 'display' && data.length > 30 ?",
                         "'<span title=\"' + data + '\">' + data.substr(0, 30) + '...</span>' : data;","}"
                       )
                     )
                   )
                   
    )
  )
}

This works fine and the table is rendered correctly. However, when I replace it with

  f_vsamples <- reactive({
    dplyr::filter(samples, studyId %in% input$view_study) %>% 
           dplyr::select(label_2, s_type, n_reads2, foodId, llabel, L1, description)
  })

the table is still generated correctly, but I get a warning in the R console

Listening on http://127.0.0.1:4980
Warning: Error in filter_impl: Result must have length 1723, not 0
Stack trace (innermost first):
    99: <Anonymous>
    98: stop
    97: filter_impl
    96: filter.tbl_df
    95: dplyr::filter
    94: <reactive:f_vsamples> [/Users/eugenio/Dropbox/incomune/Progetti attivi/FoodMicrobioNet/attività2017/filterApp/appv2.R#240]
    83: f_vsamples
    82: exprFunc
    81: widgetFunc
    80: func
    79: origRenderFunc
    78: renderFunc
    77: origRenderFunc
    76: output$vsamples_table
     1: runApp

Am I doing something wrong?

Everything goes fine if I try this

require(shiny)
require(dplyr)
n <- 10
model.data0 <- 
  data.frame( "COURSE" = sample(LETTERS[1:3], n, replace=TRUE),
              "VALUE"  = sample(1:10, n, replace=TRUE),
              "TODROP" = sample(1:10, n, replace=TRUE)) 

ui <- fluidPage( 
  sidebarLayout(
    sidebarPanel(
      uiOutput('choose_course')
    ),
    mainPanel(
      tableOutput('courseTable')
    )
  )
)

server <- function(input, output, session) {
  # Build data, would be replaced by the csv loading in your case
  model.data0 <- 
    data.frame( "COURSE" = sample(LETTERS[1:3], n, replace=TRUE),
                "VALUE"  = sample(1:10, n, replace=TRUE),
                "TODROP" = sample(1:10, n, replace=TRUE)) 
  
  # Render selectInput 
  output$choose_course <- renderUI({
    course.names <- as.vector( unique(model.data0$COURSE) )
    selectInput("courses","Choose courses", choices=course.names, multiple=TRUE)    
  })
  
  # Subset so that only the selected rows are in model.data
  model.data <- reactive({
    # subset(model.data0(), COURSE %in% input$courses)
    dplyr::filter(model.data0, COURSE %in% input$courses) %>%
      dplyr::select(COURSE, VALUE)
  })
  
  output$courseTable <- renderTable({ model.data() })
}
runApp(shinyApp(ui,server))

which is taken from here https://stackoverflow.com/questions/37887482/filtering-from-selectinput-in-r-shiny. In this example, using dplyr::filter or subset does not really matter, and neither does the position of the code building model.data0. So the culprit is likely the way I am using dplyr verbs in the server part of my app. Please advise.
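One hedged guess: before the `renderUI` select input initializes, `input$view_study` is NULL, and some dplyr versions error on the resulting empty filter. `shiny::req()` halts the reactive until the input exists, and is cheap to try:

```r
f_vsamples <- reactive({
  req(input$view_study)  # do nothing until a study is actually selected
  dplyr::filter(samples, studyId %in% input$view_study) %>%
    dplyr::select(label_2, s_type, n_reads2, foodId, llabel, L1, description)
})
```

Whether this explains the length-0 condition in your app is a guess; it is the standard fix for startup warnings from reactives that read not-yet-rendered inputs.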

README.Rmd issue

There is a minor issue with README.Rmd: if you use the Knit button, it doesn't give a proper .md file. This is probably because we use "Project build tools: Website". I found a temporary solution:

rmarkdown::render('README.Rmd', output_format = 'md_document', output_file = 'README.md')

Make point-pattern.Rmd a chapter

I think this should be the new chapter 10, after raster-vector.

Sound good? Please give it a bash if you find some spare time (p.s. there are more typos in there - try to spot them!) @Nowosad.

Issues with dplyr verbs

From Chapter 3

library(tidyverse)
library(sf) #0.4-3
library(spData) # inc spatial data and datasets
f = system.file("shapes/wrld.shp", package = "spData")
world = st_read(f)


# this works
world_few_rows = world[world$population > 1e9,]

#OR
world_few_rows = world %>% 
        filter(population > 1e9)
Error in filter_impl(.data, quo) : Result must have length 177, not 12180

and

world_orig = world # create copy of world dataset for future reference
world = select(world_orig, name_long, continent, population = pop)
 #Error in select.sf(world_orig, name_long, continent, population = pop) : requires dplyr > 0.5.0: install that first, then reinstall sf


> sessionInfo()
R version 3.3.2 (2016-10-31)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows >= 8 x64 (build 9200)

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252   
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C                   
[5] LC_TIME=English_Canada.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] bindrcpp_0.1       spData_0.1-18      sf_0.4-3           dplyr_0.6.0       
 [5] purrr_0.2.2.2      readr_1.1.1        tidyr_0.6.3        tibble_1.3.1      
 [9] ggplot2_2.2.1.9000 tidyverse_1.1.1   

loaded via a namespace (and not attached):
 [1] Rcpp_0.12.10.1   bindr_0.1        cellranger_1.1.0 plyr_1.8.4      
 [5] forcats_0.2.0    tools_3.3.2      jsonlite_1.4     lubridate_1.6.0 
 [9] gtable_0.2.0     nlme_3.1-131     lattice_0.20-34  rlang_0.1.1     
[13] psych_1.7.5      DBI_0.6-1        parallel_3.3.2   haven_1.0.0     
[17] stringr_1.2.0    httr_1.2.1       knitr_1.16       xml2_1.1.1      
[21] hms_0.3          grid_3.3.2       glue_1.0.0       R6_2.2.1        
[25] readxl_1.0.0     foreign_0.8-68   udunits2_0.13    reshape2_1.4.2  
[29] modelr_0.1.0     magrittr_1.5     units_0.4-4      scales_0.4.1    
[33] assertthat_0.2.0 mnormt_1.5-5     rvest_0.3.2      colorspace_1.3-2
[37] stringi_1.1.5    lazyeval_0.2.0   munsell_0.4.3    broom_0.4.2  
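A hedged reading of the second error: the installed sf 0.4-3 binary was built against an older dplyr, so after upgrading dplyr, sf needs to be reinstalled so its dplyr methods are rebuilt:

```r
install.packages("dplyr")  # upgrade dplyr first, as the message says
install.packages("sf")     # then reinstall sf against the new dplyr
```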

Section on web services and OGC standards

OGC standards such as WMS/WFS/WCS/... are becoming very common (hurray!) and many data providers serve layers via web services.

A section on how to assemble a typical request and visualise the response would be fantastic!
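Such a section could show a request assembled by hand and read straight into sf — a sketch only, with a placeholder endpoint and layer name, not a real service:

```r
library(sf)
# WFS GetFeature request, built up as a query string
wfs = paste0("https://example.com/geoserver/ows?",
             "service=WFS&version=2.0.0&request=GetFeature",
             "&typeNames=ns:layer&outputFormat=application/json")
layer = st_read(wfs)       # GDAL's drivers handle the GeoJSON response
plot(st_geometry(layer))   # visualise what came back
```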
