bragagnololu / unet-defmapping

This repository presents the product of my master's thesis, which uses UNet to map deforestation using Sentinel-2 Level 2A images.

License: GNU General Public License v3.0

Language: Python (100%)
Topics: cnn, cnn-keras, cnn-classification, unet, unet-image-segmentation, semantic-segmentation, forest, deforestation, machine-learning, machine-learning-algorithms

unet-defmapping's Introduction

Deforestation mapping using UNets

This repository contains the scripts of a methodology that maps deforestation using UNets and Sentinel-2 satellite images, developed as part of my master's thesis in Environmental Science and Technology. The methodology was tested for mapping deforestation spots in the Amazon and Atlantic Rainforest biomes, located in Brazil. Accordingly, the files in the "Files" folder refer to UNets trained with images from both regions. The results of these applications are under peer review, and the access links will be made available as soon as the articles are published.

1 Usage

To identify deforestation in areas where the UNet has already been trained (Amazon and Atlantic Rainforest), you can directly use the scripts in the Deforestation-mapping folder together with the files available in "Files". For other areas, a new training run is required, using the UNet training files in the folder of the same name. To do so, you must have the training images and their respective masks at hand.

1.1 Training a UNet

The UNet training procedures are described in the README.md file found in the UNet folder.
For a new training to be used in the deforestation mapping algorithm, be sure to use Level-2A Sentinel-2 images in an RGB + near-infrared composition (bands 4-3-2-8). In the masks, non-forest regions are represented by the value 0 and forest areas by the value 1, as in the sketch below.
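A minimal example, assuming hypothetical single-band GeoTIFF file names and using rasterio (already a dependency), of stacking the four bands into the (H, W, 4) input and binarizing a mask:

import numpy as np
import rasterio

# Hypothetical single-band GeoTIFFs for bands 4, 3, 2 and 8 (R, G, B, NIR)
band_files = ['B04.tif', 'B03.tif', 'B02.tif', 'B08.tif']

bands = []
for f in band_files:
    with rasterio.open(f) as src:
        bands.append(src.read(1))
image = np.stack(bands, axis=-1)  # shape (H, W, 4)

# Masks are single-band rasters: 0 = non-forest, 1 = forest
with rasterio.open('mask.tif') as src:
    mask = (src.read(1) > 0).astype(np.uint8)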

1.2 Using the deforestation mapping script

The following figure shows the workflow of the proposed method (extracted from Bragagnolo et al., 2021):

[Figure: workflow of the proposed deforestation mapping method]

The scripts for this functionality are in the "Deforestation-mapping" folder.
To execute the algorithm, use the file deforestation_main.py, where the following information must be added:

# scripts that must be in the same path that this one
from deforestation_mapping import *

# GeoJSON file of the area to be monitored
geojson_file = '/rondonia_square3.geojson'

# path to save the downloaded images
save_imgs = '/Downloaded'

# save RGB files
save_rgb = '/rgb_files'

# save tiles
save_tiles = '/tiles_imgs'

# UNet weights file
unet_weights = '/weights_file_of_trained_UNet.hdf5'

# UNet cloud-detection weights file
unet_clouds = '/weights_file_of_clouds_trained_UNet.hdf5'

# path to save the classified images
class_path = '/predicted'

# path to save the classified cloud images
class_clouds = '/predicted_clouds'

# path to save the polygons
poly_path = '/polygons'

# percentile files saved during UNet training
percentiles_forest = ['/bands_third.npy',
                      '/bands_nin.npy']

percentiles_clouds = ['/bands_third_clouds.npy',
                      '/bands_nin_clouds.npy']

def_main(save_imgs, save_rgb, save_tiles, unet_weights, unet_clouds,
         class_path, class_clouds, poly_path, 
         percentiles_forest, percentiles_clouds, geojson_file)
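The percentile files hold the per-band 3rd and 97th percentiles computed from the training array (see the issue below for how they are generated). As a hedged sketch of how such files are commonly applied, and not necessarily the exact code inside deforestation_mapping.py, each band is clipped to its percentile range and rescaled to [0, 1]:

import numpy as np

bands_third = np.load('/bands_third.npy')  # 3rd percentile of each band
bands_nin = np.load('/bands_nin.npy')      # 97th percentile of each band

def normalize(image):
    # Clip each band to its percentile range and rescale to [0, 1];
    # a common normalization pattern, assumed here for illustration.
    out = image.astype(float)
    for i in range(out.shape[-1]):
        out[..., i] = np.clip(out[..., i], bands_third[i], bands_nin[i])
        out[..., i] = (out[..., i] - bands_third[i]) / (bands_nin[i] - bands_third[i])
    return out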

Some settings must also be made in the file deforestation_mapping.py, such as the credentials for accessing the Copernicus Open Access Hub (user and password) and the time period to be covered by the analysis (parameter date):

# connect to the API
user = 'USERNAME'
password = 'PASSWORD' 

api = SentinelAPI(user, password, 'https://scihub.copernicus.eu/dhus')

# search by polygon
footprint = geojson_to_wkt(read_geojson(boundsdata))

# search for the images
products = api.query(footprint,
                     date=('NOW-30DAYS', 'NOW'),
                     area_relation='IsWithin',
                     platformname='Sentinel-2',
                     processinglevel='Level-2A',
                     # cloudcoverpercentage=(0, 20)
                     )
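After the query, the matching products can be downloaded with sentinelsat's standard call; the repository's scripts handle this internally, but a minimal equivalent, pointing directory_path at the save_imgs folder defined earlier, is:

# Download every product returned by the query to the chosen folder
api.download_all(products, directory_path=save_imgs)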

2 Results

At the end of the run, the algorithm produces raster images indicating the deforestation spots for the given image, as well as vector files in shapefile (.shp) format delimiting the deforested areas.
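For example, the resulting polygons can be inspected with fiona, which is already a dependency (the output file name below is hypothetical):

import fiona

# Count the deforestation polygons written by the algorithm
with fiona.open('/polygons/deforestation.shp') as src:
    print(f'{len(src)} polygons, CRS: {src.crs}')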

References

Bragagnolo, L., R. V. da Silva, and J. M. V. Grzybowski. "Towards the automatic monitoring of deforestation in Brazilian rainforest." Ecological Informatics (2021): 101454. https://doi.org/10.1016/j.ecoinf.2021.101454

Bragagnolo, L., R. V. da Silva, and J. M. V. Grzybowski. "Amazon forest cover change mapping based on semantic segmentation by U-Nets." Ecological Informatics 62 (2021): 101279. https://doi.org/10.1016/j.ecoinf.2021.101279

System requirements

Python 3
Keras and Tensorflow
sklearn
rasterio
rkimage
fiona
cv2
numpy_indexed
sentinelsat
zipfile
glob
matplotlib


unet-defmapping's Issues

Models and Dimensions

Hi @bragagnololu !

I'm trying to run your code and I've been having some issues.
What confused me a bit was the fact that you propose two different models, one for forest and one for cloud detection, and they have different input dimensions.
Can you give me some details on why the cloud has input_size of (512, 512, 3) while the forest uses (512, 512, 4)?

To give more details on my current issue: I've been trying to load the forest model and test it on some images, but I'm having trouble loading the arrays in the following code:

# loading arrays
image_array = np.load("image_array_og.npy") # array of training images
image_array[image_array > 10000] = 10000
image_array = image_array.astype(float)/10000
mask_array = np.load("mask_array_og.npy") # array of training masks

channels_imgs = 4 # number of channels of one image

bands_third = np.zeros(channels_imgs)
bands_nin = np.zeros(channels_imgs)

# getting the percentiles of the training array for normalization
for i in range(channels_imgs):
    bands_third[i] = np.percentile(image_array[:,:,:,i],3)
    bands_nin[i] = np.percentile(image_array[:,:,:,i],97)

np.save('bands_third_og.npy', bands_third)
np.save('bands_nin_og.npy', bands_nin)

PS: I did not change the gen_npy_files.py code, except by adding lines to save the numpy arrays into files.
My error message is the following:

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
/tmp/ipykernel_40875/54951181.py in <module>
     12 # getting the percentiles of the training array for normalization
     13 for i in range(channels_imgs):
---> 14     bands_third[i] = np.percentile(image_array[:,:,:,i],3)
     15     bands_nin[i] = np.percentile(image_array[:,:,:,i],97)
     16 

IndexError: index 3 is out of bounds for axis 3 with size 3

I've experimented with reducing the dimension of the input, but then I start having problems loading the weights of the model (as expected).

I think I can make it work if you give us more details on how to run the code at the /UNet folder.

System requirements clarification

Hi,

I recently tried out the awesome UNET-defmapping with mostly up-to-date packages (see below), but with no luck. I wonder whether a little clarification on the originally used package names and versions would solve the issues I encountered while tinkering around and trying to make this amazing code work in my environment.

Python 3.7.10
? Keras 2.4.3
? Tensorflow 2.3.0
? sklearn: scikit-learn 0.24.2
rasterio 1.1.5
? rkimage ?
fiona 1.8.9.post2
? cv2: opencv-python
numpy-indexed 0.3.5
sentinelsat 1.0.0
zipfile.py
? glob2 0.7
matplotlib 3.4.2

I'd be very grateful for any kind of assistance.

Geojson file does not exist

Dear Lucimara,

I am trying to download the images, but it seems that rondonia_square3.geojson does not exist. Could you please tell me where I can find this file so that I can run the code?

I am looking forward to your answer

Best regards
