
lidar's People

Contributors

dependabot[bot] · giswqs · pre-commit-ci[bot] · rafelafrance · stephenwaltersnv5 · trellixvulnteam


lidar's Issues

Love the GUI!

  • lidar version:
  • Python version:
  • Operating System:

Description

Just wanted to comment on your use of PySimpleGUI. Thanks for giving it a try and mentioning it in your docs. It is a nice-looking GUI. I'm going to post it in the user screenshots section of my GitHub.

🥇 One of the best-looking layouts I've seen.


Newest lidar version breaks install (pygdal)

  • lidar version: 0.6.2
  • Python version: 3.8
  • Operating System: osgeo/gdal:ubuntu-small-latest (Docker)

Description

When rebuilding my Python image based on the osgeo/gdal image, I have been facing an installation issue since lidar version 0.6.2: the pip install fails with the following error:

#6 189.9 Collecting pygdal==1.10.0.0
#6 189.9   Downloading pygdal-1.10.0.0.tar.gz (313 kB)
#6 190.5     ERROR: Command errored out with exit status 1:
#6 190.5      command: /usr/bin/python3 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-oibtxg54/pygdal/setup.py'"'"'; __file__='"'"'/tmp/pip-install-oibtxg54/pygdal/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-install-oibtxg54/pygdal/pip-egg-info
#6 190.5          cwd: /tmp/pip-install-oibtxg54/pygdal/
#6 190.5     Complete output (42 lines):
#6 190.5     running egg_info
#6 190.5     creating /tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info
#6 190.5     writing /tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info/PKG-INFO
#6 190.5     writing dependency_links to /tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info/dependency_links.txt
#6 190.5     writing requirements to /tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info/requires.txt
#6 190.5     writing top-level names to /tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info/top_level.txt
#6 190.5     writing manifest file '/tmp/pip-install-oibtxg54/pygdal/pip-egg-info/pygdal.egg-info/SOURCES.txt'
#6 190.5     Traceback (most recent call last):
#6 190.5       File "<string>", line 1, in <module>
#6 190.5       File "/tmp/pip-install-oibtxg54/pygdal/setup.py", line 142, in <module>
#6 190.5         setup(
#6 190.5       File "/usr/lib/python3/dist-packages/setuptools/__init__.py", line 144, in setup
#6 190.5         return distutils.core.setup(**attrs)
#6 190.5       File "/usr/lib/python3.8/distutils/core.py", line 148, in setup
#6 190.5         dist.run_commands()
#6 190.5       File "/usr/lib/python3.8/distutils/dist.py", line 966, in run_commands
#6 190.5         self.run_command(cmd)
#6 190.5       File "/usr/lib/python3.8/distutils/dist.py", line 985, in run_command
#6 190.5         cmd_obj.run()
#6 190.5       File "/usr/lib/python3/dist-packages/setuptools/command/egg_info.py", line 297, in run
#6 190.5         self.find_sources()
#6 190.5       File "/usr/lib/python3/dist-packages/setuptools/command/egg_info.py", line 304, in find_sources
#6 190.5         mm.run()
#6 190.5       File "/usr/lib/python3/dist-packages/setuptools/command/egg_info.py", line 535, in run
#6 190.5         self.add_defaults()
#6 190.5       File "/usr/lib/python3/dist-packages/setuptools/command/egg_info.py", line 571, in add_defaults
#6 190.5         sdist.add_defaults(self)
#6 190.5       File "/usr/lib/python3.8/distutils/command/sdist.py", line 228, in add_defaults
#6 190.5         self._add_defaults_ext()
#6 190.5       File "/usr/lib/python3.8/distutils/command/sdist.py", line 311, in _add_defaults_ext
#6 190.5         build_ext = self.get_finalized_command('build_ext')
#6 190.5       File "/usr/lib/python3.8/distutils/cmd.py", line 299, in get_finalized_command
#6 190.5         cmd_obj.ensure_finalized()
#6 190.5       File "/usr/lib/python3.8/distutils/cmd.py", line 107, in ensure_finalized
#6 190.5         self.finalize_options()
#6 190.5       File "/tmp/pip-install-oibtxg54/pygdal/setup.py", line 61, in finalize_options
#6 190.5         self.library_dirs.append(os.path.join(self.gdaldir, 'lib'))
#6 190.5       File "/usr/lib/python3.8/posixpath.py", line 90, in join
#6 190.5         genericpath._check_arg_types('join', a, *p)
#6 190.5       File "/usr/lib/python3.8/genericpath.py", line 155, in _check_arg_types
#6 190.5         raise TypeError("Can't mix strings and bytes in path components") from None
#6 190.5     TypeError: Can't mix strings and bytes in path components
#6 190.5     ----------------------------------------
#6 190.6 ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
#6 ERROR: executor failed running [/bin/sh -c apt-get update     && apt install -y --no-install-recommends         python3-tk         python3-dev         python3-pip         python3-gdal         python3-distutils         g++     && pip install --no-cache-dir pipenv_to_requirements     && pipenv run pipenv_to_requirements     && pip install --no-cache-dir -r requirements.txt     && pip uninstall -y pipenv_to_requirements     && apt-get remove -y         python3-pip         python3-dev         g++     && apt-get autoremove -y     && rm -rf /var/lib/apt/lists/*     && echo "from whitebox.whitebox_tools import WhiteboxTools" > wb.py     && echo "wbt = WhiteboxTools()" >> wb.py     && python3 wb.py     && chmod -R 0777 `find / -type d -name "whitebox"`     && rm wb.py     && rm Pipfile     && rm requirements.txt]: exit code: 1
------
 > [3/3] RUN apt-get update     && apt install -y --no-install-recommends         python3-tk         python3-dev         python3-pip         python3-gdal         python3-distutils         g++     && pip install --no-cache-dir pipenv_to_requirements     && pipenv run pipenv_to_requirements     && pip install --no-cache-dir -r requirements.txt     && pip uninstall -y pipenv_to_requirements     && apt-get remove -y         python3-pip         python3-dev         g++     && apt-get autoremove -y     && rm -rf /var/lib/apt/lists/*     && echo "from whitebox.whitebox_tools import WhiteboxTools" > wb.py     && echo "wbt = WhiteboxTools()" >> wb.py     && python3 wb.py     && chmod -R 0777 `find / -type d -name "whitebox"`     && rm wb.py     && rm Pipfile     && rm requirements.txt:
------
executor failed running [/bin/sh -c apt-get update     && apt install -y --no-install-recommends         python3-tk         python3-dev         python3-pip         python3-gdal         python3-distutils         g++     && pip install --no-cache-dir pipenv_to_requirements     && pipenv run pipenv_to_requirements     && pip install --no-cache-dir -r requirements.txt     && pip uninstall -y pipenv_to_requirements     && apt-get remove -y         python3-pip         python3-dev         g++     && apt-get autoremove -y     && rm -rf /var/lib/apt/lists/*     && echo "from whitebox.whitebox_tools import WhiteboxTools" > wb.py     && echo "wbt = WhiteboxTools()" >> wb.py     && python3 wb.py     && chmod -R 0777 `find / -type d -name "whitebox"`     && rm wb.py     && rm Pipfile     && rm requirements.txt]: exit code: 1
Service 'python-env' failed to build : Build failed

Process finished with exit code 1

What I Did

Based on the Docker image osgeo/gdal:ubuntu-small-latest, I install:

  • python3-tk
  • python3-dev
  • python3-gdal
  • python3-pip

and then install lidar among my other requirements:

  • rasterio
  • fiona
  • whitebox
  • numpy

all listed in a requirements.txt file, with:

pip install --no-cache-dir -r requirements.txt

The pip install fails with the message above.
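
One possible workaround, suggested only by what appears later in this tracker (a subsequent report pins lidar 0.6.1 explicitly "because of #21"), is to hold lidar back to the release before the pygdal pin was introduced. A sketch of the adjusted requirements file:

```text
# requirements.txt — workaround sketch, not an upstream fix:
# pin lidar to 0.6.1, the last release before the pygdal==1.10.0.0
# dependency that fails to build against the osgeo/gdal image
lidar==0.6.1
rasterio
fiona
whitebox
numpy
```

This trades the pygdal build failure for being stuck on an older lidar release, so it only makes sense until the pin is removed upstream.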

Simulate Inundation- RecursionError: maximum recursion depth exceeded

  • lidar version: 0.7.3
  • Python version: 3.11
  • Operating System: Windows 11

Description

I have followed your tutorial for delineating inundation areas, and all steps worked well until Simulate Inundation, which failed with RecursionError: maximum recursion depth exceeded.

What I Did

Please see the attached screenshot:
Screenshot_simulate_inundation
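
One hedged workaround to try first: CPython's default recursion limit is roughly 1000 frames, and the inundation simulation recurses through the depression hierarchy, so raising the limit before running the tool may get past the error (though another report in this tracker found it was not sufficient for their DEM). A minimal sketch:

```python
import sys

# Workaround sketch, not a fix for the underlying recursion: raise
# CPython's recursion limit (default ~1000) before the Simulate
# Inundation step. The value 10000 is an assumption; deeply nested
# depressions may need more, and very large values risk exhausting the
# C stack instead of raising a clean RecursionError.
sys.setrecursionlimit(10000)
```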

Numpy version 1.20.0 or greater throws AttributeError when running ExtractSinks

Thank you for this fantastic package!

  • lidar version: 0.7.1
  • numpy version: 1.24.4
  • Python version: 3.8
  • Operating System: Windows 10 Pro

Description

Attempted to run the ExtractSinks function but Numpy threw an AttributeError.

What I Did

Commands:

lidar.ExtractSinks(in_dem=DEM,  min_size=min_area, out_dir=out_dir)

Traceback:

Loading data ...
Traceback (most recent call last):
  File "C:\Users\user\Miniconda3\envs\hydro\lib\site-packages\lidar\filling.py", line 290, in ExtractSinks
    max_elev = np.float(np.max(dem[dem != no_data]))
  File "C:\Users\user\Miniconda3\envs\hydro\lib\site-packages\numpy\__init__.py", line 305, in __getattr__
    raise AttributeError(__former_attrs__[attr])
AttributeError: module 'numpy' has no attribute 'float'.
`np.float` was a deprecated alias for the builtin `float`. To avoid this error in existing code, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
    https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations

Offending lines of code:
image

The issue was resolved after replacing all references in the codebase to np.float with the Python built-in float.
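
The change is mechanical; here is a minimal sketch of the offending pattern from filling.py and its fix (the array values are made up for illustration):

```python
import numpy as np

no_data = -3.402823e38
dem = np.array([[379.70, 400.10], [no_data, 410.72]])

# Fails on NumPy >= 1.24, where the deprecated np.float alias was removed:
#   max_elev = np.float(np.max(dem[dem != no_data]))
# Works on all NumPy versions (np.float64 would also be acceptable):
max_elev = float(np.max(dem[dem != no_data]))
```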

Wrong depression ID in outputted depressions.shp and level shapefiles

  • lidar version: 0.6.1 (Forced to use this one because of #21)
  • Python version: 3.8
  • Operating System: Ubuntu (Docker)

Description

When delineating depressions I use the following

sink_path = ExtractSinks(
    './test/dem.tif',
    1,
    './output'
)
DelineateDepressions(
    sink_path,
    1,
    0.5,
    0.2,
    './output',
    True
)

This runs through and produces the general sinks (regions.shp) and the nested depressions (depressions.shp, depressions_info.csv). I understand the relation between the two and want to assign to each region all the associated depressions and their geometries by looping through all the region IDs and attaching every depression (using the info from depressions_info.csv). However, I cannot associate depressions by their IDs with the corresponding polygon geometry, because the IDs in depressions.shp are wrong. While all depressions are enumerated correctly in the CSV file, the depressions shapefile has a very strange (and, I assume, wrong) numbering. Most IDs match, but for the level 1 depressions I suddenly have hundreds of depressions that all share the depression ID 1. The rest of the polygons have the correct depression ID, which relates to the same ID in the depressions_info.csv file.

image

The same can be seen when looking at the individual level shapefiles. For level 1, I have a lot of polygons all with the same depression ID = 1

image

This makes it impossible to associate a geometry with a depression via its ID. I guess 1/50 or 1/100 of my depressions (and only those at level 1) have the depression ID wrongly set to 1.

image

Do I misunderstand something, or is this a bug in lidar? I assume the polygons at level 1 are written to the shapefiles with an incorrect ID of 1 instead of the depression ID from the CSV file.

Wrong Delineate Flow Path

  • lidar version:
  • Python version: 3.6.9
  • Operating System: Windows x64-Conda

Description

I am running the example you provide in the Lidar folder.
When I run script 3 I get the following error:

Traceback (most recent call last):
  File "C:\lidar\lidar\toolbox\scripts\3_Flow_Path.py", line 425, in
    FlowPath(in_dem, in_sink, rain_intensity, out_flowpath)
  File "C:\lidar\lidar\toolbox\scripts\3_Flow_Path.py", line 303, in FlowPath
    code_block="",
  File "c:\program files\arcgis\pro\Resources\arcpy\arcpy\management.py", line 4530, in CalculateField
    raise e
  File "c:\program files\arcgis\pro\Resources\arcpy\arcpy\management.py", line 4527, in CalculateField
    retval = convertArcObjectToPythonObject(gp.CalculateField_management(*gp_fixargs((in_table, field, expression, expression_type, code_block, field_type), True)))
  File "c:\program files\arcgis\pro\Resources\arcpy\arcpy\geoprocessing_base.py", line 511, in
    return lambda *args: val(*gp_fixargs(args, True))
arcgisscripting.ExecuteError: ERROR 000539: File "", line 1
0,05
^
SyntaxError: invalid token

Failed to execute (CalculateField).
Failed to execute (DelineateFlowPath).

What I Did


Captura de pantalla (69)
Captura de pantalla (68)

Issue on the Scripts and Hydro Analyst tool

  • lidar version:0.6.1
  • Python version: 3.7
  • Operating System: 64 bit operating system

Description

Hi Dr. Wu,

I was trying to do flow path delineation in a place with a high number of depressions. I tried both the tools in ArcGIS Pro and the scripts in the lidar folder of the package, using a conda environment. Both the script and the tool extracted sinks without any error, but I am facing the following issues with the other tools:

  1. While using the flow path delineation tool, I get a no field name "volume" error. Looking at the script in the lidar folder, it tries to do a calculation using a volume field on the sinks, but no volume field is created by the sink extraction tool. I tried to fix that error and run it in the conda environment, but I get an incomplete result:
    the flow path is generated not for the whole DEM but only for its lower part. The issue can be seen in the figure at the link below:
    https://drive.google.com/file/d/1WZzvtzs8-XhATwjGC7B86iMjNOAMzIrT/view?usp=sharing

  2. For the Simulate Inundation tool and script I get a maximum recursion depth exceeded error. I tried decreasing the size of the area I am working on to only 12 regions, but I still get the error. I tried to solve it by raising the recursion limit, but that did not work; instead, the code crashed the tool.

File "C:\Users\na\Anaconda3\envs\py37\lib\site-packages\skimage\measure\_regionprops.py", line 323, in __getattr__
    if attr in self._extra_properties:
RecursionError: maximum recursion depth exceeded
Data preparation time: 0.09323954582214355
Total number of regions: 12
Iteration: 1 ( H = 2 )

Please help me solve this error. I think this script and tool will be very useful for my work.

The DEM I used in this work is attached:
DEM.zip

ValueError: Values other than "rc" for the "coordinates" argument to skimage.measure.regionprops are no longer supported.

  • lidar version:
  • Python version: 3.7
  • Operating System: Windows 10

Description

I tried running the example.py with the dem.tif but got the following error:

# identify the sample data directory of the package
#package_name = 'lidar'
#data_dir = pkg_resources.resource_filename(package_name, 'data/')
data_dir = 'lidar'

# use the sample dem. Change it to your own dem if needed
in_dem = os.path.join(data_dir, 'dem.tif')
# set output directory. By default, use the temp directory under user's home directory
out_dir = os.path.join(os.path.expanduser("~"), "temp")

# parameters for identifying sinks and delineating nested depressions
min_size = 1000      # minimum number of pixels as a depression
min_depth = 0.5      # minimum depth as a depression
interval = 0.3       # slicing interval for the level-set method
bool_shp = True     # output shapefiles for each individual level

# extracting sinks based on user-defined minimum depression size
out_dem = os.path.join(out_dir, "median.tif")
in_dem = lidar.MedianFilter(in_dem, kernel_size=3, out_file=out_dem)
sink_path = lidar.ExtractSinks(in_dem, min_size, out_dir)
dep_id_path, dep_level_path = lidar.DelineateDepressions(sink_path, min_size, min_depth, interval, out_dir, bool_shp)

print('Results are saved in: {}'.format(out_dir))

Median filtering ...
Run time: 0.0290 seconds
Saving dem ...
Loading data ...
min = 379.70, max = 410.72, no_data = -3.402823e+38, cell_size = 1.0
Depression filling ...
Saving filled dem ...
Region grouping ...
Computing properties ...

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-13-d90bdedee5d6> in <module>
     18 out_dem = os.path.join(out_dir, "median.tif")
     19 in_dem = lidar.MedianFilter(in_dem, kernel_size=3, out_file=out_dem)
---> 20 sink_path = lidar.ExtractSinks(in_dem, min_size, out_dir)
     21 dep_id_path, dep_level_path = lidar.DelineateDepressions(sink_path, min_size, min_depth, interval, out_dir, bool_shp)
     22 

~\Anaconda3\envs\py37\lib\site-packages\lidar\filling.py in ExtractSinks(in_dem, min_size, out_dir)
    178 
    179     print("Computing properties ...")
--> 180     objects = measure.regionprops(label_objects, dem, coordinates='xy')
    181     dep_list = get_dep_props(objects, cell_size)
    182     write_dep_csv(dep_list, out_csv_file)

~\Anaconda3\envs\py37\lib\site-packages\skimage\measure\_regionprops.py in regionprops(label_image, intensity_image, cache, coordinates)
    855                    'stop using the "coordinates" argument, or use skimage '
    856                    'version 0.15.x or earlier.')
--> 857             raise ValueError(msg)
    858 
    859     regions = []

ValueError: Values other than "rc" for the "coordinates" argument to skimage.measure.regionprops are no longer supported. You should update your code to use "rc" coordinates and stop using the "coordinates" argument, or use skimage version 0.15.x or earlier.
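
Assuming scikit-image >= 0.16 (where the coordinates keyword was removed and "rc" ordering became the only behavior), the fix on the lidar side is simply to drop the argument. A minimal sketch, with a toy raster standing in for lidar's label_objects and dem arrays:

```python
import numpy as np
from skimage import measure

# Toy labeled raster and matching intensity image.
label_objects = np.array([[0, 1], [1, 1]])
dem = np.array([[0.0, 2.0], [3.0, 4.0]])

# Old call (raises ValueError on skimage >= 0.16):
#   measure.regionprops(label_objects, dem, coordinates='xy')
# Fixed call — properties are reported in row/column ("rc") order:
objects = measure.regionprops(label_objects, dem)
```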

ImportError: DLL load failed: The specified module could not be found.

  • lidar version:
  • Python version:3.6
  • Operating System: windows 10

Description

Thank you so much for creating this wonderful tool. I am testing with ArcGIS Pro 2.5 and your dataset, but I am getting errors. Is it related to the Python environment or version? Could you check it, please?

What I Did

I am testing with your provided dataset.

ERROR 032659 updateParameters Syntax Error: Traceback (most recent call last):
  File "C:\Users\hahmad\Downloads\lidar-master\lidar\toolbox\ArcGIS Pro Hydrology Analyst.tbx#ExtrackSink.UpdateParameters.py", line 1, in <module>
  File "c:\program files\arcgis\pro\Resources\arcpy\arcpy\__init__.py", line 21, in <module>
    import numpy
  File "C:\Users\hahmad\AppData\Local\ESRI\conda\envs\arcgispro-py3-clone1\lib\site-packages\numpy\__init__.py", line 140, in <module>
    from . import _distributor_init
  File "C:\Users\hahmad\AppData\Local\ESRI\conda\envs\arcgispro-py3-clone1\lib\site-packages\numpy\_distributor_init.py", line 34, in <module>
    from . import _mklinit
ImportError: DLL load failed: The specified module could not be found.

Environment issue

  • lidar version: newest
  • Python version: 3.7
  • Operating System: Windows 10

Hi. First of all, I'm quite a novice at Python, so this may be a very simple issue, but I'm stuck despite all the forums I have already checked!

So, I installed the lidar package according to the instructions using the Anaconda Prompt, and it was successful. However, when I try to install Spyder inside that new environment, it does not work. I thought I had installed Spyder successfully in the new environment, but I cannot access it (see screenshots below). I can open Spyder directly, but then it is still in another environment and won't let me access the lidar package. Any thoughts?

Thanks a lot!

python_code

Minor bug/typo in lidar.ipynb

  • lidar version: 0.7.2
  • Python version: 3.8.18
  • Operating System: Pop!_OS 22.04

Description

While working through the example notebook lidar.ipynb, I could not open the zip file in cell 4.

What I Did

To fix, change:
with zipfile.ZipFile(zip_name, "r") as zip_ref: to
with zipfile.ZipFile(zip_path, "r") as zip_ref:

Presumably, also change:
with tarfile.open(zip_name, "r") as tar_ref: to
with tarfile.open(zip_path, "r") as tar_ref:
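
A minimal self-contained reproduction of the fix (file names here are made up; the point is that zip_path is a full joined path while zip_name is only the bare file name, so opening by zip_name fails whenever the archive is not in the current working directory):

```python
import os
import tempfile
import zipfile

out_dir = tempfile.mkdtemp()   # stand-in for the notebook's data directory
zip_name = "dem.zip"           # bare name only — not openable from elsewhere
zip_path = os.path.join(out_dir, zip_name)

# Create a tiny archive so there is something to open.
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("dem.tif", b"placeholder")

# The fix: open by the full path, not the bare name.
with zipfile.ZipFile(zip_path, "r") as zip_ref:
    names = zip_ref.namelist()
```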

Error 001683 - Py2 to 3, Delineate Hierarchy, Flow Path, Animation and Simulate

  • lidar version: 0.7.3
  • Python version: 3.11
  • Operating System: Windows 64 bit

  • ArcGIS Pro version: 3.2.2


I was following your tutorial and the Delineate Flow Path tool is continuously failing. I've tried updating the packages, updating Pro, and all of the other generic checks. Below is the message I got when running Analyze Tools for Pro, which is the same error message I got when the tool failed. Everything was done exactly as in the tutorial. I assume this might have originally come from a batch update for Pro?

WARNING 001683: Found Python 2 to 3 errors: Line 348: if key in parent_ids.keys(): -> if key in list(parent_ids.keys()): within script tool DelineateDepressionHierarchy(C:\Users\Thompson\Documents\ArcGIS\lidar_code\lidar-master\lidar\toolbox\scripts\4_Slicing.py)
WARNING 001683: Found Python 2 to 3 errors: Line 399: if row[0] not in inDict.keys(): -> if row[0] not in list(inDict.keys()):
Line 403: for key, value in od.items(): -> for key, value in list(od.items()): within script tool DelineateFlowPath(C:\Users\Thompson\Documents\ArcGIS\lidar_code\lidar-master\lidar\toolbox\scripts\3_Flow_Path.py)
WARNING 001683: Found Python 2 to 3 errors: Line 1: from future import division -> within script tool PlayAnimation(C:\Users\Thompson\Documents\ArcGIS\lidar_code\lidar-master\lidar\toolbox\scripts\7_Play_Animation.py)
WARNING 001683: Found Python 2 to 3 errors: Line 142: print("creating {}...".format(shp)) -> print(("creating {}...".format(shp)))
Line 318: if key in parent_ids.keys(): -> if key in list(parent_ids.keys()): within script tool SimulateInundation(C:\Users\Thompson\Documents\ArcGIS\lidar_code\lidar-master\lidar\toolbox\scripts\6_Simulate Inundation.py)
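
For what it's worth, the specific patterns flagged above already run fine under Python 3; the analyzer's suggested list() wrappers change style, not behavior, as a quick sketch shows (names below are made up for illustration):

```python
# The analyzer flags membership tests on dict.keys(); in Python 3,
# keys() returns a view, so both spellings give the same result.
parent_ids = {348: None, 399: 348}   # hypothetical id -> parent-id map
key = 399
assert (key in parent_ids.keys()) == (key in list(parent_ids.keys()))

# Iterating items() likewise needs no list() wrapper in Python 3.
pairs = [(k, v) for k, v in parent_ids.items()]
```

So the warnings themselves are unlikely to be the root cause; the actual tool failure may lie elsewhere.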

Thank you for your help!

Support newer versions of tcl

  • lidar version: 0.7.1
  • Python version: 3.7
  • Operating System: Windows 10

Description

When trying to import lidar, I get an error about needing an older version of tcl than required by the Python interpreter.

What I Did

import lidar
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\...\lib\site-packages\lidar\__init__.py", line 18, in <module>
    from .gui import gui
  File "C:\...\lib\site-packages\lidar\gui.py", line 9, in <module>
    import PySimpleGUI as sg
  File "C:\...\lib\site-packages\PySimpleGUI\__init__.py", line 2, in <module>
    from .PySimpleGUI import *
  File "C:\...\lib\site-packages\PySimpleGUI\PySimpleGUI.py", line 150, in <module>
    tclversion_detailed = tkinter.Tcl().eval('info patchlevel')
  File "C:\...\lib\tkinter\__init__.py", line 2119, in Tcl
    return Tk(screenName, baseName, className, useTk)
  File "C:\...\lib\tkinter\__init__.py", line 2023, in __init__
    self.tk = _tkinter.create(screenName, baseName, className, interactive, wantobjects, useTk, sync, use)
_tkinter.TclError: Can't find a usable init.tcl in the following directories:
    {C:\...\tcl\tcl8.6}

C:/.../tcl/tcl8.6/init.tcl: version conflict for package "Tcl": have 8.6.12, need exactly 8.6.9
version conflict for package "Tcl": have 8.6.12, need exactly 8.6.9
    while executing
"package require -exact Tcl 8.6.9"
    (file "C:/.../tcl/tcl8.6/init.tcl" line 19)
    invoked from within
"source C:/.../tcl/tcl8.6/init.tcl"
    ("uplevel" body line 1)
    invoked from within
"uplevel #0 [list source $tclfile]"


This probably means that Tcl wasn't installed properly.

ID mismatch between depressions / regions in depressions_info

  • lidar version: 0.6.1
  • Python version: 3.8
  • Operating System: Ubuntu

Description

The following image shows the mismatch in the depressions_info.csv after executing

sink_path = ExtractSinks(
    dem_raster,
    2,
    out_dir
)
DelineateDepressions(
    sink_path,
    2,
    0,
    0.2,
    out_dir,
    True
)

mismatch

The depressions_info.csv references region_id 143 for depression 163, while a look at the map shows that region 173 actually spatially matches depression 163. I tested another region and there everything was correct: the region/depression IDs matched as they should. I suspect this is a bug. Any other suggestions?

PS: I attached the SRTM image for testing: srtm_dem.zip

region-id inconsistent between depressions.shp and regions.shp

Firstly, thank you for providing an excellent package/resource.

  • lidar version: 0.7.1
  • Python version: 3.11.0
  • Operating System: Windows 10

Description

  • Short-term goal, and where the problem is happening: aligning depression regions with the individual nested depressions.
  • The region-id values in the regions.shp/regions_info.csv generated using ExtractSinks() differ from those in the depressions.shp/depressions_info.csv generated using DelineateDepressions().
  • I would expect the region-id to be consistent between these.

What I Did

outdir = 'tmp/'
min_size = 50        # minimum number of pixels as a depression
min_depth = 30       # minimum depth as a depression
interval = 10        # slicing interval for the level-set method
bool_shp = False     # output shapefiles for each individual level

sink_path = ExtractSinks('sample.tif', min_size, outdir)

dep_id_path, dep_level_path = DelineateDepressions(sink_path, min_size, min_depth, interval, outdir, bool_shp)

# read in output and combine info csv with geometry from shapefile
# depressions
depressions = gpd.read_file('tmp/depressions.shp')
depressions = depressions.dissolve('id').reset_index() # makes for a tidier merge with depressions_info.csv (creates multipolygons)
dep_info = pd.read_csv('tmp/depressions_info.csv')
gdf = depressions.merge(dep_info, on='id')

# regions
regions = gpd.read_file('tmp/regions.shp').sort_values(by='id')
reg_info = pd.read_csv('tmp/regions_info.csv')
regions = regions.merge(reg_info, left_on='id', right_on='region-id').sort_values(by='id').reset_index(drop=True)

At this stage I would expect the region-id in regions to match that in gdf; however, they do not. Below is an illustration, which I think also provides a way to handle the discrepancy (a spatial join):

# read in raster (for plotting)
region_raster = rio.open_rasterio('tmp/region.tif')

# to enable comparison of region-ids intersection of depressions and regions shapefiles
overlaid = (gdf
            .dissolve('region-id')   # aggregate by region-id
            .reset_index() # to ensure result gdf has both region-id_1 and region-id_2 (where _2 is the value that correctly corresponds with those in regions.shp)
            .overlay(regions, keep_geom_type=False)
)

###### plotting

# random region number
R = 310

fig, axs = plt.subplots(figsize=[15,6],ncols=5)

# plot raster 
(region_raster==R).plot(ax=axs[0], add_colorbar=False)

# plot from regions shapefile
regions.loc[regions['region-id']==R].plot(ax=axs[1])

# plot from depressions shapefile
gdf.loc[gdf['region-id']==R].plot(column='level', ax=axs[2])

# plot from intersection of depressions and regions
overlaid.loc[overlaid['region-id_1']==R].plot(ax=axs[3])

# plot from intersection of depressions and regions
overlaid.loc[overlaid['region-id_2']==R].plot(ax=axs[4])

# tidy up axes limits, and labels etc...
axs[0].set_xlim(axs[1].get_xlim())
axs[0].set_ylim(axs[1].get_ylim())
axs[0].set_aspect('equal')

axs[0].set_title(f'region:{R}\nfrom region.tif')
axs[1].set_title(f'region:{R}\nfrom regions.shp')
axs[2].set_title(f'region:{R}\nfrom depressions.shp')
axs[3].set_title(f'region:{overlaid.loc[overlaid["region-id_1"]==R,"region-id_2"].values[0]}\nfrom regions.shp')
axs[4].set_title(f'region:{overlaid.loc[overlaid["region-id_2"]==R,"region-id_1"].values[0]}\nfrom depressions.shp')

plt.subplots_adjust(wspace=0.35)

print(f"region: {R} in the depressions.shp file corresponds to region: {overlaid.loc[overlaid['region-id_1']==R,'region-id_2'].values[0]} in regions.shp")
print(f"region: {R} in the regions.shp file corresponds to region: {overlaid.loc[overlaid['region-id_2']==R,'region-id_1'].values[0]} in depressions.shp")

region: 310 in the depressions.shp file corresponds to region: 394 in regions.shp
region: 310 in the regions.shp file corresponds to region: 245 in depressions.shp

image

the question

So, I think the question is: is the discrepancy between the two region-ids expected/normal?
If yes, is the use of .overlay() the best way to handle it and reconcile regions.shp/regions_info.csv with depressions.shp/depressions_info.csv, or is there an even more straightforward way?
If no, have I done something wrong?

thank you

Enable full features in the python version of the lidar package

  • lidar version: 0.6.1
  • Python version: 3.8
  • Operating System: Windows x64- Conda

Description

Dear Qiusheng,
First of all, I would like to thank you for your amazing efforts in proposing this package. It is very useful for delineating land depressions from DEMs.

I can only run the Python version of the package, as I do not have access to ArcGIS Pro. The ArcGIS Pro toolbox has more features than the Python version of the package: with the Python scripts I can only get the depressions, their properties (depth, volume, etc.), and their cascading order. I am very interested in getting the flow directions, the depression catchments, and, more importantly, the ability to simulate inundation over the different depressions. I can see that these features are available in the ArcGIS toolbox, but I was not able to find Python scripts (in lidar-master/lidar) to run them. I can also see these functions as scripts under the toolbox (in lidar-master/lidar/toolbox/scripts), but converting them to run outside ArcGIS might be problematic for me.

So, my request: could you please add these functions (especially the inundation mapping) to lidar-master/lidar so that I and others can run them outside ArcGIS?
I would also appreciate it if you could share a conceptual workflow or the methodology behind how the inundation mapping works, because I could not find it in any of the related publications listed on the repo page.

Thank you so much

Memory Issue

  • lidar version:
  • Python version: 3.7
  • Operating System: Windows 10

RuntimeError: Free disk space available is 49152 bytes, whereas 2705360268 are at least necessary. You can disable this check by defining the CHECK_DISK_FREE_SPACE configuration option to FALSE.

How can I solve this? Thanks a lot!
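
The error text itself names the escape hatch: GDAL's free-space check can be disabled via the CHECK_DISK_FREE_SPACE configuration option, which GDAL also reads from the environment. Note this only helps if the free-space probe is wrong (as can happen on some network or container mounts); otherwise the real fix is freeing the ~2.7 GB the output needs or writing it to a different drive. A sketch:

```python
import os

# GDAL reads configuration options from environment variables; setting
# this before any GDAL I/O disables the free-disk-space check named in
# the RuntimeError. Use with care: it does not create disk space, it
# only skips the probe.
os.environ['CHECK_DISK_FREE_SPACE'] = 'FALSE'
```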
