
glcmtextures's People

Contributors: ailich

glcmtextures's Issues

Add examples

The help files do not have examples. ?glcm_textures could have something like this:

library(terra)
library(GLCMTextures)

# SpatRaster from the built-in volcano matrix, georeferenced in NZGD49
r <- rast(volcano,
          extent = ext(2667400, 2667400 + ncol(volcano) * 10,
                       6478700, 6478700 + nrow(volcano) * 10),
          crs = "EPSG:27200")
rq  <- quantize_raster(r = r, n_levels = 16, method = "equal prob")
txt <- glcm_textures(rq, w = c(3, 5), n_levels = 16,
                     quantization = "none", shift = c(1, 0))
plot(txt)

add ... or wopt

With glcm_textures and quantize_raster, one cannot pass options to terra. That would be useful for debugging, for writing less common file types, and for managing memory use without setting global options.

As these functions do not use the ellipsis for other purposes, you could perhaps use it for this and pass the arguments on as wopt = list(...).
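A minimal sketch of the idea, with terra::focal() as a stand-in for the package's internals (the wrapper name and body are illustrative, not the package's actual code):

library(terra)

# Illustrative only: forward the ellipsis to terra as write options
# (e.g. datatype, filetype, gdal, memfrac).
focal_with_wopt <- function(r, w = 3, fun = "mean", filename = "", ...) {
  terra::focal(r, w = w, fun = fun, filename = filename, wopt = list(...))
}

r   <- rast(volcano)
out <- focal_with_wopt(r, w = 3, fun = "mean",
                       filename = tempfile(fileext = ".tif"),
                       datatype = "FLT4S")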

GLCMTextures in Python

Hi Ailich,

Is there a Python version of this GLCM code? I've tried to develop one, but it seems to need some corrections.

Cheers,
Hendra

Add option to write output to file

Many raster functions have a filename argument that lets you write the raster to file while also keeping it available in the R environment. Add this feature to the package's functions.
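Hypothetical usage once such an argument exists, following terra's filename/overwrite convention (the argument names are assumptions, not the current API):

# Proposed, not yet implemented:
# txt <- glcm_textures(rq, w = c(3, 3), n_levels = 16, quantization = "none",
#                      shift = c(1, 0), filename = "textures.tif",
#                      overwrite = TRUE)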

Only calculate the requested subset of metrics

Add a metrics argument to C_glcm_textures_helper and C_glcm_metrics so that they only calculate the requested subset. This will likely improve performance when only some measures are needed.
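For reference, glcm_textures() already exposes a metrics argument at the R level (see the benchmark further down this page); the request is to push the subsetting into the C++ helpers so unrequested metrics are never computed:

library(terra)
library(GLCMTextures)

r  <- rast(volcano)
rq <- quantize_raster(r, n_levels = 16, method = "equal prob")
# Only two metrics are requested here; internally, all may still be computed.
txt <- glcm_textures(rq, w = c(3, 3), n_levels = 16, quantization = "none",
                     shift = c(1, 0),
                     metrics = c("glcm_contrast", "glcm_entropy"))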

One single GLCM (and set of metrics) for all window positions

I have a set of small images (200 x 200) of "uniform" textures and would like to calculate a single GLCM and set of metrics for each image, instead of a raster with values for each window position. Is it possible to set the parameters so that only one GLCM is built for the entire image (and thus only one set of metrics), rather than one GLCM and one set of metrics per window position?
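A possible approach, assuming make_glcm() and glcm_metrics() (both referenced elsewhere on this page) accept a quantized integer matrix; check the help files for the exact signatures:

library(GLCMTextures)

# One GLCM over the whole 200 x 200 image instead of one per window.
img  <- matrix(sample(0:15, 200 * 200, replace = TRUE), nrow = 200)
glcm <- make_glcm(img, n_levels = 16, shift = c(1, 0), na.rm = TRUE)
glcm_metrics(glcm)  # a single set of metrics for the entire image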

Tidy up vignette

The sizing of plots is off in the HTML vignette contained within the package.

Speed of glcm_textures

I did a speed comparison between GLCMTextures::glcm_textures() and the older package glcm::glcm(). It seems that glcm_textures is about 6 times slower (!). Is this expected? (I ran the quantize_raster() step before starting microbenchmark.)

Microbenchmark results:

expr                                                  min         lq       mean     median         uq       max  neval  cld
textures_old <- f1_glcm(red_raster)              25.28074   25.81914   26.23389    26.1424   26.70709   27.1866     10    a
textures_new <- f2_GLCMTextures(red_quantized)  142.20192  142.26140  142.73769   142.3798  142.76346  144.9905     10    b

Here's the code that I ran:

library(terra)
library(GLCMTextures)
library(glcm)
library(microbenchmark)

Data_dir <- "Namibia/Output/"
stk_file <- file.path(Data_dir, "stk_full.tif")
stk <- rast(stk_file)
# Clip to small area for testing
stk_crop <- crop(stk, ext(322400, 323000, 7885600, 7886200))
red <- stk_crop$red
red_raster <- raster::raster(red)

red_quantized <- quantize_raster(r = red, n_levels = 16,
                                 method = "equal prob")

grey_levels <- 16
wind <- c(5, 5)
shift <- list(c(0,1), c(1,1), c(1,0), c(-1,1))
f1_glcm <- function(x) {
  textures_old <- glcm::glcm(x = x,
                             statistics = c("variance", "homogeneity",
                                            "contrast", "dissimilarity",
                                            "entropy"),
                             shift = shift,
                             n_grey = grey_levels, window = wind)
  names(textures_old) <- c("variance", "homogeneity", "contrast",
                           "dissimilarity", "entropy")
  return(textures_old)
}

f2_GLCMTextures <- function(x) {
  textures_new <- glcm_textures(r = x,
                                w = wind,
                                shift = shift,
                                metrics = c("glcm_variance", "glcm_homogeneity",
                                            "glcm_contrast", "glcm_dissimilarity",
                                            "glcm_entropy"),
                                quantization = "none",
                                n_levels = grey_levels, na.rm = TRUE)
  names(textures_new) <- c("variance", "homogeneity", "contrast",
                           "dissimilarity", "entropy")
  return(textures_new)
}

microbenchmark(
  textures_old <- f1_glcm(red_raster),
  textures_new <- f2_GLCMTextures(red_quantized),
  times = 10
)

Thanks,
Micha

parallel processing?

Hi, and thank you for this wonderful package. Do you know how to parallelize or otherwise speed up both the quantization and the GLCM creation? I am working with large raster tiles, and creating the GLCMs takes a long time per tile.
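One possible workaround (a tiling sketch, not a package feature): quantize once globally, split the quantized raster into tiles with terra::makeTiles(), process the tiles in parallel, and merge. The path, tile grid, and core count below are placeholders, and the buffer argument to makeTiles() requires a recent terra version:

library(terra)
library(GLCMTextures)

r  <- rast("big_raster.tif")   # placeholder path
rq <- quantize_raster(r, n_levels = 16, method = "equal prob")

# Tiles need an overlap of at least floor(w/2) cells so window edges match.
tmpl  <- rast(ncols = 4, nrows = 4, extent = ext(rq), crs = crs(rq))
tiles <- makeTiles(rq, tmpl, filename = "tile_.tif",
                   buffer = 2, overwrite = TRUE)

# Forking via mclapply (Unix-alikes); on Windows use a PSOCK cluster instead.
out_files <- parallel::mclapply(tiles, function(f) {
  txt <- glcm_textures(rast(f), w = c(5, 5), n_levels = 16,
                       quantization = "none", shift = c(1, 0))
  out <- sub("tile_", "txt_", f)
  writeRaster(txt, out, overwrite = TRUE)
  out
}, mc.cores = 4)

txt <- merge(sprc(lapply(out_files, rast)))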

Add option to get non-normalized GLCM

Add a normalize = FALSE option to make_glcm to get raw counts rather than probabilities. This would be useful for demonstrating how the GLCM is constructed.
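For illustration (the normalize argument below is the proposal, not an existing parameter):

library(GLCMTextures)

m <- matrix(c(0, 1, 1, 2, 2, 0, 1, 0, 2), nrow = 3)
make_glcm(m, n_levels = 3, shift = c(1, 0))  # current: probabilities summing to 1
# Proposed: raw co-occurrence counts instead of probabilities
# make_glcm(m, n_levels = 3, shift = c(1, 0), normalize = FALSE)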

Small improvements

Hi Alex, congratulations on releasing the package on CRAN! I think you could consider these things to improve the user experience:

  1. Add a BugReports field to DESCRIPTION: BugReports: https://github.com/ailich/GLCMTextures/
  2. Create a pkgdown website
  3. Add a CRAN badge to the README: [![CRAN](https://www.r-pkg.org/badges/version/GLCMTextures)](https://cran.r-project.org/package=GLCMTextures). And maybe a license badge?

Add test cases

Add tests to make sure values evaluate as expected, based on examples from the texture tutorial.
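A minimal testthat sketch of the idea; the assertions below check generic GLCM properties rather than specific values from the tutorial:

library(testthat)
library(GLCMTextures)

test_that("make_glcm returns a valid probability matrix", {
  m <- matrix(c(0, 0, 1, 1), nrow = 2)
  g <- make_glcm(m, n_levels = 2, shift = c(1, 0))
  expect_equal(sum(g), 1)   # a normalized GLCM sums to 1
  expect_true(all(g >= 0))  # probabilities are non-negative
})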

glcm_textures "clips" a large raster

We are using GLCMTextures to produce several texture layers for a large study area; the extent is a bit more than 12,000 x 16,000 pixels. Running quantize_raster() in advance produces the correct 16-level raster for our whole study area. But after running glcm_textures(), the result is clipped somehow. See the images below.

I reran the function a few times on my home computer (16 GB RAM) with identical results. Then I tried on our department server (128 GB RAM) and the result looked fine; the texture bands all covered the whole study area.

Is there some memory limit that would cause glcm_textures to do this clipping? (Just to be clear: on the low-memory computer, quantize_raster worked fine, producing a raster covering the whole region. Only glcm_textures failed.)

Full study area:
[image: full study area]

Texture bands, after running glcm_textures:
[image: GLCM textures result, clipped]
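One thing worth checking (a guess at the cause, not a confirmed diagnosis) is terra's memory settings, which cap how much RAM it will use during processing:

library(terra)

r <- rast("quantized.tif")   # placeholder for the quantized study-area raster
terraOptions()               # inspect current settings, including memfrac
mem_info(r)                  # how terra expects to process this raster
terraOptions(memfrac = 0.8)  # allow terra a larger fraction of available RAM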

Thanks,
Micha

as.int fails in "quantize_raster"

Hello,
While trying the code on WorldView-2 images (11-bit), we got the following error:
Error: 'as.int' is not an exported object from 'namespace:terra'

Using the data included in the package, the same error appears:

library(terra)
library(GLCMTextures)

r <- rast(volcano,
          extent = ext(2667400, 2667400 + ncol(volcano) * 10,
                       6478700, 6478700 + nrow(volcano) * 10),
          crs = "EPSG:27200")
quantized <- quantize_raster(r, n_levels = 16, method = "equal prob")
#> Error: 'as.int' is not an exported object from 'namespace:terra'

Thanks in advance,
Micha and Klil
