
mintpy's Introduction


MintPy

The Miami INsar Time-series software in PYthon (MintPy, pronounced /mɪnt paɪ/) is an open-source package for Interferometric Synthetic Aperture Radar (InSAR) time series analysis. It reads a stack of coregistered, unwrapped interferograms in ISCE, ARIA, FRInGE, HyP3, GMTSAR, SNAP, GAMMA or ROI_PAC format, and produces three-dimensional (2D in space and 1D in time) ground surface displacement in the line-of-sight direction. It includes a routine time series analysis workflow (smallbaselineApp.py) and a set of independent toolbox scripts.

This package was called PySAR before version 1.1.1. From version 1.1.2 onward, it is named MintPy.

This is research code provided to you "as is" with NO WARRANTIES OF CORRECTNESS. Use at your own risk.

2. Running MintPy

2.1 Routine workflow smallbaselineApp.py

MintPy reads a stack of interferograms (unwrapped interferograms, coherence and connected components from SNAPHU if available) and the geometry files (DEM, lookup table, incidence angle, etc.). You need to give the path to where the files are and MintPy takes care of the rest!

smallbaselineApp.py                         # run with default template 'smallbaselineApp.cfg'
smallbaselineApp.py <custom_template>       # run with default and custom templates
smallbaselineApp.py -h / --help             # help
smallbaselineApp.py -H                      # print    default template options
smallbaselineApp.py -g                      # generate default template if it does not exist
smallbaselineApp.py -g <custom_template>    # generate/update default template based on custom template

# Run with --start/--end/--dostep options
smallbaselineApp.py GalapagosSenDT128.txt --dostep velocity  # run step 'velocity' only
smallbaselineApp.py GalapagosSenDT128.txt --end load_data    # end run after step 'load_data'

Inside smallbaselineApp.py, MintPy reads the unwrapped interferograms, references all of them to the same coherent pixel (reference point), calculates the phase closure and estimates the unwrapping errors (if requested), inverts the network of interferograms into a time-series, calculates the temporal coherence to evaluate the quality of the inversion, corrects local oscillator drift (Envisat only), corrects stratified tropospheric delay (using global atmospheric models or the phase-elevation-ratio approach), removes phase ramps (if requested), corrects DEM error, ... and finally estimates the velocity.
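The core network-inversion step above can be sketched as a least-squares problem: a design matrix maps phase at each acquisition date (relative to a reference date) to the observed interferogram phases. A toy illustration with synthetic values, not MintPy's actual implementation:

```python
import numpy as np

# toy example: 3 acquisitions, 3 interferograms given as (date1, date2) index pairs
pairs = [(0, 1), (1, 2), (0, 2)]
num_date = 3

# design matrix A: phase at dates 1..N-1 (date 0 is the reference) -> ifgram phase
A = np.zeros((len(pairs), num_date - 1))
for i, (d1, d2) in enumerate(pairs):
    if d2 > 0:
        A[i, d2 - 1] = 1
    if d1 > 0:
        A[i, d1 - 1] = -1

# synthetic true phase history and the interferograms it would produce
phi_true = np.array([0.0, 1.5, 2.0])
dphi = np.array([phi_true[d2] - phi_true[d1] for d1, d2 in pairs])

# least-squares inversion recovers phase at dates 1 and 2 relative to date 0
phi_est, *_ = np.linalg.lstsq(A, dphi, rcond=None)
```

The real inversion additionally handles weighting, dropped interferograms, and block-by-block processing; this sketch only shows the linear-algebra core.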

Configuration parameters for each step are initiated with default values in a customizable text file smallbaselineApp.cfg.
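A custom template only needs the options you want to override. The option names below follow smallbaselineApp.cfg (several appear later on this page); the values are purely illustrative:

```cfg
# illustrative custom template excerpt
mintpy.load.processor          = isce
mintpy.network.coherenceBased  = yes
mintpy.network.minCoherence    = 0.5
mintpy.deramp                  = linear
```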

Example on Fernandina volcano, Galápagos with Sentinel-1 data

wget https://zenodo.org/record/3952953/files/FernandinaSenDT128.tar.xz
tar -xvJf FernandinaSenDT128.tar.xz
cd FernandinaSenDT128/mintpy
smallbaselineApp.py ${MINTPY_HOME}/docs/templates/FernandinaSenDT128.txt

Results are plotted in the ./pic folder. To explore more of the data and its visualization, try the following scripts:

info.py                    # check HDF5 file structure and metadata
view.py                    # 2D map view
tsview.py                  # 1D point time-series (interactive)
plot_coherence_matrix.py   # plot coherence matrix for one pixel (interactive)
plot_network.py            # plot network configuration of the dataset
plot_transection.py        # plot 1D profile along a line of a 2D matrix (interactive)
save_kmz.py                # generate Google Earth KMZ file in points or raster image
save_kmz_timeseries.py     # generate Google Earth KMZ file in points for time-series (interactive)

2.2 Customized processing recipe

MintPy is a toolbox of individual utility scripts. Run any script with -h / --help to see its usage; from these building blocks you can assemble your own customized processing recipe. Here is an example that compares the velocities estimated from displacement time-series with different tropospheric delay corrections.
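Such a recipe might look like the following. This is a hedged sketch: the file names follow the conventions used elsewhere on this page, and the exact script options may differ by MintPy version.

```shell
# estimate velocity from the uncorrected and the ERA5-corrected time-series
timeseries2velocity.py timeseries.h5                  -o velocity_noTropo.h5
timeseries2velocity.py timeseries_ERA5_ramp_demErr.h5 -o velocity_ERA5.h5

# compare the two velocity maps
view.py velocity_noTropo.h5
view.py velocity_ERA5.h5
```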

2.3 Build on top of mintpy module

MintPy is modularized in Python with utility classes and functions that are well commented at the code level. Users who are familiar with Python can build their own functions and modules on top of mintpy.objects and mintpy.utils. However, we don't have a complete API documentation website yet (maybe you can contribute this!). Below is an example of reading the 3D matrix of a displacement time-series from an HDF5 file.

from mintpy.utils import readfile
ts_data, meta = readfile.read('timeseries_ERA5_ramp_demErr.h5')
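The returned ts_data is a 3D numpy array of shape (num_date, length, width). A synthetic stand-in illustrates the array layout (the dimensions here are made up for the example):

```python
import numpy as np

# synthetic stand-in for the array returned by readfile.read()
num_date, length, width = 5, 100, 200
ts_data = np.zeros((num_date, length, width), dtype=np.float32)

# displacement time-series at one pixel (row 10, column 20)
d_ts = ts_data[:, 10, 20]     # shape: (num_date,)

# displacement map at the last acquisition
d_map = ts_data[-1, :, :]     # shape: (length, width)
```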

Algorithms implemented in the software are described in detail in Yunjun et al. (2019).

4. Contact us

5. Contributing

Imposter syndrome disclaimer: We want your help. No, really.

There may be a little voice inside your head that is telling you that you're not ready to be an open source contributor; that your skills aren't nearly good enough to contribute. What could you possibly offer?

We assure you - the little voice in your head is wrong. If you can write code at all, you can contribute code to open source. Contributing to open source projects is a fantastic way to advance one's coding skills. Writing perfect code isn't the measure of a good developer (that would disqualify all of us!); it's trying to create something, making mistakes, and learning from those mistakes. That's how we all improve, and we are happy to help others learn.

Being an open source contributor doesn't just mean writing code. You can help out by writing or proofreading documentation, suggesting or implementing tests, or even giving feedback about the project (and yes - that includes giving feedback about the contribution process). Some of these contributions may be the most valuable to the project as a whole, because you're coming to the project with fresh eyes, so you can see the errors and assumptions that seasoned contributors have glossed over.

For more information, please read our contributing guide.

This disclaimer was adapted from the MetPy project.

6. Citing this work

Yunjun, Z., Fattahi, H., and Amelung, F. (2019), Small baseline InSAR time series analysis: Unwrapping error correction and noise reduction, Computers & Geosciences, 133, 104331. [ doi | arxiv | data | notebook ]

In addition to the above, we recommend that you cite the original publications that describe the algorithms used in your specific analysis. They are noted briefly in the default template file and listed in the references.md file.


mintpy's Issues

need ability to specify walltime for Dask

Hi @2gotgrossman : Pysar with Dask works great. Thank you!

Issues:

  • for longer jobs we need a longer walltime than the 30 min specified in dask_pysar.yaml
  • its location in ~/.config/dask is not good if it needs to be updated for different runs.

Suggestion: For each Dask invocation we copy the /PYSAR/defaults/dask_pysar.yaml into the projectname/PYSAR dir and modify it for the specific case. The walltime could be adjusted as a PySAR option:

pysar.networkInversion.parallel            = yes      # [yes / no], auto for no, parallel processing using dask
pysar.networkInversion.parallel.walltime   = 3:00     # walltime in HH:MM, auto for 0:30

Are there any other options we may want to modify (e.g. number of workers? - I don't see it).

Any thoughts?
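A per-project override might look like the following. This is only a sketch of dask-jobqueue-style YAML; the scheduler section and key names are assumptions and should be adjusted to match the actual dask_pysar.yaml:

```yaml
# hypothetical projectname/PYSAR/dask_pysar.yaml excerpt
jobqueue:
  lsf:                  # or whichever scheduler dask_pysar.yaml targets
    walltime: '3:00'    # HH:MM, overriding the 30 min default
    cores: 2
```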

Others:

  • Some documentation about it on the PySAR page would be great, including how to assess progress.
  • I did not see messages such as 'inversion split into 200 chunks' or information when chunks are finished.
  • there are two instances of walltime. Which one is used: jobqueue or ifgram_inversion?
  • Who can/would like to do this?

ifgram_inversion: need command that waits until all dask workers are completed.

Hi David,
After the dask task I have added a function to move the 40 worker*.e and worker*.o files into separate directories as below. However, after pysar is completed there are still some worker*.o files in the pysar directory. It appears that no *.e files exist yet when ut.move_dask_stdout_stderr_files() is run. Is there a command to wait until all dask workers are completed? I tried future.result() and similar but that did not solve the problem.

            for future, result in as_completed(futures, with_results=True):
                i_future += 1
                print("FUTURE #" + str(i_future), "complete in", time.time() - start_time_subboxes,
                      "seconds. Box:", subbox, "Time:", time.time())
                tsi, temp_cohi, ts_stdi, ifg_numi, subbox = result

                ts[:, subbox[1]:subbox[3], subbox[0]:subbox[2]] = tsi
                ts_std[:, subbox[1]:subbox[3], subbox[0]:subbox[2]] = ts_stdi
                temp_coh[subbox[1]:subbox[3], subbox[0]:subbox[2]] = temp_cohi
                num_inv_ifg[subbox[1]:subbox[3], subbox[0]:subbox[2]] = ifg_numi

            # Shut down Dask workers gracefully
            cluster.close()
            client.close()

        ut.move_dask_stdout_stderr_files()
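For reference, the general "block until every future is done" pattern looks like this, illustrated with the standard-library concurrent.futures; dask.distributed offers an analogous wait(), though whether it resolves the stdout/stderr file timing above is an open question:

```python
from concurrent.futures import ThreadPoolExecutor, wait

def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=4) as executor:
    futures = [executor.submit(square, i) for i in range(10)]
    done, not_done = wait(futures)        # blocks until all futures complete
    results = sorted(f.result() for f in done)
```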

asc_desc2horz_vert: examples


  1. asc_desc2horz_vert.py
    In the example use geo_* for geocoded
example:
  #geocode asc/desc data into the same spatial resolution and coverage
  geocode.py AlosAT424/velocity -x 0.00027778 -y -0.00027778 --bbox 32.0 32.5 130.1 130.5 -o vel_AlosAT424.h5
  geocode.py AlosDT73/velocity  -x 0.00027778 -y -0.00027778 --bbox 32.0 32.5 130.1 130.5 -o vel_AlosDT73.h5

  asc_desc2horz_vert.py  vel_AlosAT424_masked.h5  vel_AlosDT73_masked.h5
  asc_desc2horz_vert.py  vel_EnvAT134_masked.h5   vel_EnvAT256_masked.h5  16

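Under the hood, asc_desc2horz_vert.py solves a small linear system per pixel, combining the two LOS observations into horizontal and vertical components. A toy numpy sketch (the projection coefficients below are illustrative; in practice they are derived from each track's incidence and heading angles):

```python
import numpy as np

# rows: (ascending, descending); columns: (horizontal, vertical) coefficients
G = np.array([[-0.55, 0.78],
              [ 0.51, 0.83]])

d_true = np.array([0.02, -0.05])    # true (horizontal, vertical) displacement, m
d_los = G @ d_true                  # the two observed LOS displacements
d_est = np.linalg.solve(G, d_los)   # recover horizontal + vertical components
```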

Memory Error for very large dataset

Hello,

I am processing a large data set: A ~15 degrees latitude swath consisting of > 1000 interferograms for something like 100 dates. When I run smallbaselineApp, an avgPhaseVelocity map is produced, but the inversion step fails with "MemoryError". I briefly raised this issue in the past, and I know Heresh had started working on a block-by-block processing feature to break the data into smaller chunks. Was the block-by-block method finished and fully implemented for the rest of the process?

I have been using the MintPy version with block-by-block processing enabled (committed 13 Aug 2019), and it produces the file timeseries.h5. However, when I try tsview.py timeseries.h5 I get a similar "MemoryError" message.

Is a workaround still in the works? Or do I need to process my data in smaller chunks?

Thanks,
Robert

geocode.py fails with bilinear interpolation

Description of the problem

Running geocoding script with the bilinear flag faults with an error.

Full script that generated the error

geocode.py timeseriesStepModel.h5 timeseries -i bilinear

Full error message

resampling file: timeseriesStepModel.h5
reading timeseries from timeseriesStepModel.h5 ...
bilinear resampling using 1 processor cores ...
Traceback (most recent call last):
  File "/home/fielding/python/MintPy/mintpy/geocode.py", line 340, in <module>
    main()
  File "/home/fielding/python/MintPy/mintpy/geocode.py", line 334, in main
    run_geocode(inps)
  File "/home/fielding/python/MintPy/mintpy/geocode.py", line 311, in run_geocode
    print_msg=True)
  File "/home/fielding/python/MintPy/mintpy/objects/resample.py", line 88, in run_resample
    print_msg=True)
  File "/home/fielding/python/MintPy/mintpy/objects/resample.py", line 466, in run_pyresample
    epsilon=0)
  File "/home/fielding/anaconda3/envs/mintpy/lib/python3.7/site-packages/pyresample/bilinear/__init__.py", line 86, in resample_bilinear
    epsilon=epsilon)
  File "/home/fielding/anaconda3/envs/mintpy/lib/python3.7/site-packages/pyresample/bilinear/__init__.py", line 238, in get_bil_info
    proj = Proj(target_area_def.proj_str)
AttributeError: 'GridDefinition' object has no attribute 'proj_str'

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable Python 3.7.3
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-32, release date 2019-09-09

Processing parameters for similarity to ARIA GUNW-derived time series

Description of the desired feature

Hi @dbekaert and @yunjunz
To facilitate the transition to ARIA GUNW products (once they are more widely available), I would like to use topsStack + MintPy processing parameters that end up with products as close as possible to GUNW-derived MintPy products.

On https://aria.jpl.nasa.gov/node/97 I see that 19 range looks and 7 azimuth looks are used. What spacing does ARIA use for geocoding? It says "~90 m" for the data group, but this is not very precise. Is the spacing specified in meters or degrees? If the former, it would be good to add a MintPy option for spacing in meters, as it currently supports only mintpy.geocode.latStep and mintpy.geocode.lonStep.

@hfattahi , topsStack products may differ from ARIA standard products because of the Spectral Diversity Threshold. On the ARIA page I found the blurb below. Is there an option in topsStack to do the same thing? If not, could you point me to an ISCE function that does this for possible implementation into topsStack?

Thank you!

Default is 0.8. If coherence is too low and ESD fails, threshold is relaxed in steps of 0.05 until 0.5 is reached, after which processing is done using orbit information only.
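For reference, the meters-to-degrees conversion at a given latitude is roughly the following (a back-of-the-envelope sketch, not an ARIA specification):

```python
import math

meters = 90.0                 # nominal posting in meters
lat = 32.0                    # example latitude in degrees
lat_step = meters / 111320.0  # one degree of latitude is ~111.32 km
lon_step = meters / (111320.0 * math.cos(math.radians(lat)))
# lon_step is larger than lat_step everywhere away from the equator
```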

Linux pykml problem: GE shows a red cross

Description of the problem
Sorry if this is a stupid post. Could you please confirm that the Linux installation as described in installation.md is working? I followed (almost all of) the instructions and got a red cross in Google Earth. I see lots of relative paths like ../../../../../ which could be the cause of the problem (weird paths appear at unzipping; see below).

//login3/projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo[1060] unzip geo_velocity.kmz
Archive:  geo_velocity.kmz
warning:  skipped "../" path component(s) in ../../../../../../../projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity.kml
replace projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity.kml? [y]es, [n]o, [A]ll, [N]one, [r]ename: A
 extracting: projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity.kml  
warning:  skipped "../" path component(s) in ../../../../../../../projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity.png
 extracting: projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity.png  
warning:  skipped "../" path component(s) in ../../../../../../../projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity_cbar.png
 extracting: projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo/geo_velocity_cbar.png  
//login3/projects/scratch/insarlab/famelung/unittestGalapagosSenDT128/mintpy/geo[1061] lst
total 375712


asc_desc2horz_vert.py does not work with timeseries file type

Description of the problem

I have two timeseries Step Model files from ascending and descending tracks. When I tried to run asc_desc2horz_vert.py on them it fails to write out the result because writefile.py requires a reference file.

Full script that generated the error

asc_desc2horz_vert.py A135_geo_timeseriesStepModel_msk.h5 D098_geo_timeseriesStepModel_msk.h5

Full error message

Using default MintPy Path: /home/fielding/tools/MintPy
---------------------
reading A135_geo_timeseriesStepModel_msk.h5
reading D098_geo_timeseriesStepModel_msk.h5
---------------------
get design matrix
incidence angle: 38.95743031204843
heading angle: 349.125244140625
incidence angle: 33.72858887716663
heading angle: 190.9101104736328
project asc/desc into horz/vert direction
---------------------
writing horizontal component to file: hz.h5
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 271, in <module>
    main(sys.argv[1:])
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 260, in main
    writefile.write(dH, out_file=inps.outfile[0], metadata=atr)
  File "/home/fielding/tools/MintPy/mintpy/utils/writefile.py", line 56, in write
    raise Exception('Can not write {} file without reference file!'.format(k))
Exception: Can not write timeseries file without reference file!

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2.0-14, release date 2020-02-10

Scene average coherence includes areas of nodata from ISCE topsStack output

Description of the problem
My ISCE topsStack output includes some scenes with limited coverage of the area due to partial acquisitions. It seems that the scene average coherence of pairs that include those scenes is calculated using at least some of the area with no data. Here are the coherence plots:

coherence_1

coherence_2
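One way to exclude nodata from the spatial average is to mask zero-coherence pixels before averaging, e.g. (a toy numpy sketch, not MintPy's actual implementation):

```python
import numpy as np

coh = np.array([[0.8, 0.0],
                [0.6, 0.9]], dtype=np.float32)   # 0 marks nodata

naive_avg = coh.mean()             # biased low by the nodata pixel
masked_avg = coh[coh > 0].mean()   # average over valid pixels only
```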

Full script that generated the error
Ran the regular input_data step and then plot_smallbaselineApp.sh.

Full error message

Here is the coherence plot:
CoherenceMatrix

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: Python 3.7
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-18, release date 2019-08-24

save_gmt.py is confused about input parameters

Description of the problem

The save_gmt.py script is having trouble with its input parameter parsing.

Full script that generated the error

save_gmt.py geo_velocity.h5 -o geo_velocity.grd

or save_gmt.py -h.

Full error message

Using default MintPy Path: /home/fielding/tools/MintPy
usage: save_gmt.py [-h] [-o OUTFILE] file [dset]
save_gmt.py: error: unrecognized arguments: o m e / f i e l d i n g / t o o l s / M i n t P y / m i n t p y / s a v e _ g m t . p y
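The spaced-out characters in the error message are the classic symptom of a string being iterated where a list of argument tokens was expected. A minimal reproduction sketch (not the actual save_gmt.py code):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-o', dest='outfile')
parser.add_argument('file')

# correct: a list of argument tokens
ok = parser.parse_args(['geo_velocity.h5', '-o', 'geo_velocity.grd'])

# buggy: a plain string is iterated character by character, producing
# "unrecognized arguments: g e o ..." style errors:
# parser.parse_args('geo_velocity.h5 -o geo_velocity.grd')
```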

System information

  • Operating system:MacOS
  • Version of Python and relevant dependencies if applicable: 3.7
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-27, release date 2019-09-02

Problem generating baseline plot with pseudo-roi_pac mintpy run

I don't know if it is because all the baselines are 0 or the Btemp values are negative (I'm not sure how robust the code is). If all Bperp are zero, the code should just alternate their "Bperp" locations by date in order to produce a visible plot.

(MintPy) tiger mintpy_1) smallbaselineApp.py DOMUYO_ALOS2_A34.txt

_________________________________________________
 ____    ____   _            _   _______          
|_   \  /   _| (_)          / |_|_   __ \         
  |   \/   |   __   _ .--. `| |-' | |__) |_   __  
  | |\  /| |  [  | [ `.-. | | |   |  ___/[ \ [  ] 
 _| |_\/_| |_  | |  | | | | | |, _| |_    \ '/ /  
|_____||_____|[___][___||__]\__/|_____| [\_:  /   
                                         \__.'    

   Miami InSAR Time-series software in Python  
          MintPy v1.2.0-19, 2020-02-12
_________________________________________________

--RUN-at-2020-02-14 10:29:50.720435--
Run routine processing with smallbaselineApp.py on steps: ['load_data', 'modify_network', 'reference_point', 'correct_unwrap_error', 'stack_interferograms', 'invert_network', 'correct_LOD', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5']
--------------------------------------------------
Project name: DOMUYO_ALOS2_A34
Go to work directory: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1
copy default template file /u/pez-z1/paul/python/MintPy/mintpy/defaults/smallbaselineApp.cfg to work directory
read custom template file: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/DOMUYO_ALOS2_A34.txt
update default template based on input custom template
    mintpy.load.processor: auto --> roipac
    mintpy.load.unwFile: auto --> /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_msk-ramp_cut.unw
    mintpy.load.corFile: auto --> /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_cut.cor
    mintpy.load.demFile: auto --> /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo_190331-180401_4rlks_4alks_cut.hgt
    mintpy.network.coherenceBased: auto --> yes
    mintpy.network.minCoherence: auto --> 0.1
    mintpy.network.aoiLALO: auto --> no
    mintpy.troposphericDelay.method: auto --> no
    mintpy.deramp: auto --> no
    mintpy.topographicResidual: auto --> no
create directory: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs
copy DOMUYO_ALOS2_A34.txt to inputs directory for backup.
copy smallbaselineApp.cfg to inputs directory for backup.
create directory: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/pic
copy DOMUYO_ALOS2_A34.txt to pic directory for backup.
copy smallbaselineApp.cfg to pic directory for backup.
read default template file: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg


******************** step - load_data ********************
load_data.py --template /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/DOMUYO_ALOS2_A34.txt --project DOMUYO_ALOS2_A34
SAR platform/sensor : Alos2
processor: roipac
--------------------------------------------------
prepare metadata files for roipac products
prep_roipac.py /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_msk-ramp_cut.unw
prep_roipac.py /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_cut.cor
prep_roipac.py /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo_190331-180401_4rlks_4alks_cut.hgt
--------------------------------------------------
searching interferometric pairs info
input data files:
unwrapPhase     : /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_msk-ramp_cut.unw
coherence       : /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_cut.cor
number of unwrapPhase     : 17
number of coherence       : 17
--------------------------------------------------
searching geometry files info
input data files:
height          : /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo_190331-180401_4rlks_4alks_cut.hgt
--------------------------------------------------
updateMode : True
compression: None
--------------------------------------------------
create HDF5 file /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5 with w mode
create dataset /unwrapPhase of <class 'numpy.float32'>   in size of (17, 1439, 1620) with compression = None
[==================================================] 20200119_20190331    0s /     0s
create dataset /coherence   of <class 'numpy.float32'>   in size of (17, 1439, 1620) with compression = None
[==================================================] 20200119_20190331    0s /     0s
create dataset /date        of <class 'numpy.bytes_'>    in size of (17, 2)
create dataset /bperp       of <class 'numpy.float32'>   in size of (17,)
create dataset /dropIfgram  of <class 'numpy.bool_'>     in size of (17,)
add extra metadata: {'PROJECT_NAME': 'DOMUYO_ALOS2_A34', 'PLATFORM': 'Alos2'}
Finished writing to /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5
--------------------------------------------------
create HDF5 file /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/geometryGeo.h5 with w mode
create dataset /height             of <class 'numpy.float32'>   in size of (1439, 1620) with compression = lzf
Finished writing to /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/geometryGeo.h5
No lookup table info range/lat found in files.
Input data seems to be geocoded. Lookup file not needed.
Loaded dataset are processed by InSAR software: roipac
Loaded dataset is in GEO coordinates
Interferograms Stack: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5
Geometry File       : /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/geometryGeo.h5
Lookup Table File   : None
--------------------------------------------------
All data needed found/loaded/copied. Processed 2-pass InSAR data can be removed.
--------------------------------------------------
updating ifgramStack.h5, geometryGeo.h5 metadata based on custom template file: DOMUYO_ALOS2_A34.txt


******************** step - modify_network ********************
Input data seems to be geocoded. Lookup file not needed.
modify_network.py /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5 -t /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg
No lookup table info range/lat found in files.
read options from template file: smallbaselineApp.cfg
open ifgramStack file: ifgramStack.h5
number of interferograms: 17
--------------------------------------------------
use coherence-based network modification
calculating spatial average of coherence in file /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5 ...
reading interferogram 17/17 ...
write average value in space into text file: ifgramStack_coherence_spatialAvg.txt
Get minimum spanning tree (MST) of interferograms with inverse of coherence.
Drop ifgrams with 1) average coherence < 0.1 AND 2) not in MST network: (0)
[]
--------------------------------------------------
number of interferograms to remove: 0
number of interferograms to keep  : 17
Calculated date12 to drop is the same as exsiting marked input file, skip updating file.

plot_network.py /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5 -t /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg --nodisplay
read options from template file: smallbaselineApp.cfg
read temporal/spatial baseline info from file: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5
['ifgramStack_coherence_spatialAvg.txt'] exists and is newer than ['/u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/inputs/ifgramStack.h5'] --> skip.
ifgramStack_coherence_spatialAvg.txt already exists, read it directly
number of acquisitions: 11
number of interferograms: 17
--------------------------------------------------
number of interferograms marked as drop: 0
number of interferograms marked as keep: 17
number of acquisitions marked as drop: 0
Traceback (most recent call last):
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1090, in <module>
    main()
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1080, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1012, in run
    self.run_network_modification(sname)
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 457, in run_network_modification
    mintpy.plot_network.main(scp_args.split())
  File "/u/pez-z1/paul/python/MintPy/mintpy/plot_network.py", line 218, in main
    inps = cmd_line_parse(iargs)
  File "/u/pez-z1/paul/python/MintPy/mintpy/plot_network.py", line 142, in cmd_line_parse
    inps.colormap = pp.ColormapExt(inps.cmap_name, vlist=inps.cmap_vlist).colormap
  File "/u/pez-z1/paul/python/MintPy/mintpy/objects/colors.py", line 101, in __init__
    self.get_colormap()
  File "/u/pez-z1/paul/python/MintPy/mintpy/objects/colors.py", line 163, in get_colormap
    colors1 = self.colormap(np.linspace(0.0, 0.3, n1))
  File "<__array_function__ internals>", line 6, in linspace
  File "/u/pez-z1/paul/python/miniconda3/envs/MintPy/lib/python3.6/site-packages/numpy/core/function_base.py", line 124, in linspace
    raise ValueError("Number of samples, %s, must be non-negative." % num)
ValueError: Number of samples, -320, must be non-negative.


(MintPy) tiger mintpy_1) more ifgramStack_coherence_spatialAvg.txt
# Data file: ifgramStack.h5
# Mask file: None
# AOI box: (0, 0, 1620, 1439)
#	DATE12		Mean	Btemp/days	Bperp/m		Num
20161127_20160207	0.5190	    -294	     0.0	0
20170205_20160207	0.4960	    -364	     0.0	1
20171112_20150208	0.3000	   -1008	     0.0	2
20171112_20160207	0.3782	    -644	     0.0	3
20171112_20170205	0.3429	    -280	     0.0	4
20180121_20150208	0.3307	   -1078	     0.0	5
20180121_20170205	0.4570	    -350	     0.0	6
20180401_20150208	0.4026	   -1148	     0.0	7
20180401_20160207	0.4100	    -784	     0.0	8
20180401_20170205	0.4793	    -420	     0.0	9
20190217_20150208	0.3736	   -1470	     0.0	10
20190217_20160207	0.4007	   -1106	     0.0	11
20190217_20180401	0.4734	    -322	     0.0	12
20190331_20190217	0.5160	     -42	     0.0	13
20200105_20190217	0.4968	    -322	     0.0	14
20200105_20190331	0.4731	    -280	     0.0	15
20200119_20190331	0.5310	    -294	     0.0	16
(MintPy) tiger mintpy_1) 

Applying mask.py to timeseriesStepModel.h5 results in file that can't be geocoded

Description of the problem

I used the mask.py script to apply a mask to the timeseriesStepModel.h5 file, but the output does not work for geocoding.

Full script that generated the error

mask.py timeseriesStepModel.h5 -m maskTempCoh.h5
geocode.py timeseriesStepModel_msk.h5 -d timeseries

The original file timeseriesStepModel.h5 has this structure:

HDF5 dataset "/date                ": shape (1,)                , dtype <|S8>
HDF5 dataset "/timeseries          ": shape (1, 830, 1319)      , dtype <float32>

After masking, the timeseriesStepModel_msk.h5 file has this structure:

HDF5 dataset "/date                ": shape (1,)                , dtype <|S8>
HDF5 dataset "/timeseries          ": shape (830, 1319)         , dtype <float32>
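
The failure below stems from mask.py dropping the singleton date axis, so the stack reader can no longer unpack (numDate, length, width). A possible workaround sketch (illustrative only, not a MintPy utility; writing the fixed array back into the HDF5 file would use h5py, which is not shown) restores the leading axis with numpy:

```python
import numpy as np

# The masked dataset came out 2-D: (830, 1319) instead of (1, 830, 1319).
data_2d = np.zeros((830, 1319), dtype=np.float32)  # stands in for /timeseries

# Restore the singleton "date" axis so the timeseries reader can unpack
# (numDate, length, width) again.
data_3d = data_2d[np.newaxis, :, :]
print(data_3d.shape)
```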

Full error message

(mintpy) [fielding@sar MintPy_v1]$ geocode.py timeseriesStepModel_msk.h5 -d timeseries
Using default MintPy Path: /home/fielding/tools/MintPy
geocode.py timeseriesStepModel_msk.h5 -d timeseries
number of processor to be used: 1
output pixel size in (lat, lon) in degree: (-0.001060076300571566, 0.0008407384383298559)
output area extent in (S N W E) in degree: (17.76697540283203, 18.645779, -67.36084, -66.25274658203125)
--------------------------------------------------
resampling file: timeseriesStepModel_msk.h5
reading timeseries from timeseriesStepModel_msk.h5 ...
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/geocode.py", line 341, in <module>
    main(sys.argv[1:])
  File "/home/fielding/tools/MintPy/mintpy/geocode.py", line 335, in main
    run_geocode(inps)
  File "/home/fielding/tools/MintPy/mintpy/geocode.py", line 299, in run_geocode
    data = readfile.read(infile, datasetName=dsName, print_msg=False)[0]
  File "/home/fielding/tools/MintPy/mintpy/utils/readfile.py", line 203, in read
    data = read_hdf5_file(fname, datasetName=datasetName, box=box)
  File "/home/fielding/tools/MintPy/mintpy/utils/readfile.py", line 231, in read_hdf5_file
    slice_list = get_slice_list(fname)
  File "/home/fielding/tools/MintPy/mintpy/utils/readfile.py", line 462, in get_slice_list
    obj.open(print_msg=False)
  File "/home/fielding/tools/MintPy/mintpy/objects/stack.py", line 165, in open
    self.get_size()
  File "/home/fielding/tools/MintPy/mintpy/objects/stack.py", line 204, in get_size
    self.numDate, self.length, self.width = f[self.name].shape
ValueError: not enough values to unpack (expected 3, got 2)

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2.0-13, release date 2020-02-10

Error loading stackSentinel output processed twice

Description of the problem

I used the stackSentinel processing system in ISCE to process a stack twice. The first time, the first date was in December 2019. Then I went back and reprocessed the stack starting in July 2019. When I tried to load the data into MintPy, it seems it got confused because the baselines directory contained baselines with two different reference dates.

Full script that generated the error

Here is the listing of the baselines directory

(mintpy) [fielding@sar baselines]$ ls
20190718_20190730  20190718_20190916  20190718_20191115  20190718_20200114  20191221_20200126
20190718_20190811  20190718_20190928  20190718_20191127  20190718_20200126
20190718_20190823  20190718_20191010  20190718_20191209  20190718_20200207
20190718_20190829  20190718_20191022  20190718_20191221  20191221_20200102
20190718_20190904  20190718_20191103  20190718_20200102  20191221_20200114
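
One way to detect this situation before loading (a sketch, not a MintPy utility) is to group the baseline folder names by their first, i.e. reference, date; more than one group means leftover folders from a previous run should be removed:

```python
from collections import defaultdict

# Hypothetical pre-flight check: baseline folders are named REF_SEC (YYYYMMDD).
# Grouping by the reference date reveals stale folders from an earlier run.
names = [
    "20190718_20190730", "20190718_20190811",
    "20191221_20200102", "20191221_20200114", "20191221_20200126",
]
by_ref = defaultdict(list)
for name in names:
    ref_date = name.split("_")[0]
    by_ref[ref_date].append(name)

if len(by_ref) > 1:
    print("WARNING: multiple reference dates found:", sorted(by_ref))
```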

Full error message

******************** step - load_data ********************
load_data.py --template /u/sar-r2/fielding/Puerto_Rico/S1AB/D098/seismic/MintPy_v1/smallbaselineApp.cfg /u/sar-r2/fielding/Puerto_Rico/S1AB/D098/seismic/MintPy_v1/PuertoRicoSenD098.txt --project PuertoRicoSenD098
SAR platform/sensor : Sen
processor: isce
--------------------------------------------------
prepare metadata files for isce products
prep_isce.py -m ../master/IW1.xml -g ../merged/geom_master -b ../baselines  -i ../merged/interferograms -f filt_*.unw
['../master/data.rsc'] exists and is newer than ['../master/IW1.xml'] --> skip.
prepare .rsc file for geometry files
read perp baseline time-series from ../baselines
prepare .rsc file for  filt_*.unw
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/prep_isce.py", line 493, in <module>
    main()
  File "/home/fielding/tools/MintPy/mintpy/prep_isce.py", line 485, in main
    update_mode=inps.update_mode)
  File "/home/fielding/tools/MintPy/mintpy/prep_isce.py", line 444, in prepare_stack
    ifg_metadata = add_ifgram_metadata(ifg_metadata, dates, baseline_dict)
  File "/home/fielding/tools/MintPy/mintpy/prep_isce.py", line 346, in add_ifgram_metadata
    bperp_top = baseline_dict[dates[1]][0] - baseline_dict[dates[0]][0]
KeyError: '20190718'

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: Python 3.7.6
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy v1.2.0-9, 2020-02-05

error executing example of Fernandina volcano with Sentinel-1 data

Hi,

I am trying to run the Fernandina volcano example with Sentinel-1 data, but I am getting the following error. Can anyone help me solve this issue?

downloading weather model data using PyAPS ...
number of grib files to download: 98
------------------------------------------------------------------------------
INFO: You are using the latest ECMWF platform for downloading datasets: https://cds.climate.copernicus.eu/api/v2
INFO: You are using the latest ECMWF platform for downloading datasets: https://cds.climate.copernicus.eu/api/v2
INFO: You are using the latest ECMWF platform for downloading datasets: https://cds.climate.copernicus.eu/api/v2
open geometry file: geometryRadar.h5
reading height          data from file: /Users/python/FernandinaSenDT128/inputs/inputs/geometryRadar.h5 ...
reading incidenceAngle  data from file: /Users/python/FernandinaSenDT128/inputs/inputs/geometryRadar.h5 ...
reading latitude        data from file: /Users/python/FernandinaSenDT128/inputs/inputs/geometryRadar.h5 ...
reading longitude       data from file: /Users/python/FernandinaSenDT128/inputs/inputs/geometryRadar.h5 ...
calcualting delay for each date using PyAPS (Jolivet et al., 2011; 2014) ...
number of grib files used: 0
Traceback (most recent call last):

  File "/Users/python/MintPy/mintpy/utils/ptime.py", line 296, in update_amount
    percentDone = (diffFromMin / np.float(self.span)) * 100.0
ZeroDivisionError: float division by zero
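
The ZeroDivisionError happens because no grib files were actually downloaded ("number of grib files used: 0"), so the progress bar's span is zero. A defensive sketch of that division (hypothetical, not the actual ptime.py code; the real fix is to figure out why zero grib files were downloaded, e.g. CDS credentials or network issues):

```python
# Guarded percent-done computation for a progress bar: when the span of items
# is zero there is nothing to iterate over, so report 100% instead of dividing.
def percent_done(amount, min_val, max_val):
    span = max_val - min_val
    if span == 0:  # empty range: avoid ZeroDivisionError
        return 100.0
    return (amount - min_val) / float(span) * 100.0
```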

Simple TS inversion for roi_pac like case

I am a new MintPy user (I have it working for ARIA S1 standard products), and I have some ALOS-2 data processed and dumped into a single directory of geocoded unwrapped interferograms and coherence files, each with a ROI_PAC-like *.rsc file. The file names include the dates. I have used GIAnT to process these, but I want to switch to MintPy. When I try running with a simple input *.txt file:

smallbaselineApp.py DOMUYO_ALOS2_A34.txt

I get the following error (it doesn't recognize the sensor):

--RUN-at-2020-02-11 14:20:37.315811--
Run routine processing with smallbaselineApp.py on steps: ['load_data', 'modify_network', 'reference_point', 'correct_unwrap_error', 'stack_interferograms', 'invert_network', 'correct_LOD', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5']
--------------------------------------------------
Project name: DOMUYO_ALOS2_A34
Go to work directory: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1
read custom template file: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/DOMUYO_ALOS2_A34.txt
update default template based on input custom template
read default template file: /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg


******************** step - load_data ********************
load_data.py --template /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/smallbaselineApp.cfg /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/mintpy_1/DOMUYO_ALOS2_A34.txt --project DOMUYO_ALOS2_A34
Traceback (most recent call last):
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1090, in <module>
    main()
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1080, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 1009, in run
    self.run_load_data(sname)
  File "/u/pez-z1/paul/python/MintPy/mintpy/smallbaselineApp.py", line 368, in run_load_data
    mintpy.load_data.main(scp_args.split())
  File "/u/pez-z1/paul/python/MintPy/mintpy/load_data.py", line 624, in main
    inpsDict = read_inps2dict(inps)
  File "/u/pez-z1/paul/python/MintPy/mintpy/load_data.py", line 190, in read_inps2dict
    inpsDict['PLATFORM'] = str(sensor.project_name2sensor_name(str(inpsDict['PROJECT_NAME']))[0])
  File "/u/pez-z1/paul/python/MintPy/mintpy/objects/sensor.py", line 162, in project_name2sensor_name
    sensor = [s for s in sensor if s in proj_name][0]
IndexError: list index out of range
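
A simplified, hypothetical sketch of the substring lookup that raised the IndexError above (the keyword list here is illustrative, not MintPy's actual sensor table) shows how case-insensitive matching plus an explicit error message would make the failure easier to diagnose:

```python
# Longest keywords first so "Alos2" wins over "Alos"; list is hypothetical.
SENSOR_KEYWORDS = ["Alos2", "Alos", "Sen", "Csk", "Tsx", "Env"]

def project_name2sensor(proj_name):
    """Guess the sensor from a project name, case-insensitively."""
    matches = [s for s in SENSOR_KEYWORDS if s.lower() in proj_name.lower()]
    if not matches:
        # Clear diagnostic instead of a bare IndexError on [0].
        raise ValueError(
            f"no known sensor keyword found in project name {proj_name!r}; "
            f"expected one of {SENSOR_KEYWORDS}")
    return matches[0]
```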

Here is the input file:

> more DOMUYO_ALOS2_A34.txt
# vim: set filetype=cfg:
##-------------------------------- MintPy -----------------------------##
mintpy.reference.lalo        = auto #14.100,120.938  #PTGY station

mintpy.deramp                = no  #[no / linear / quadratic], auto for no - no ramp will be removed
mintpy.topographicResidual   = no  #[yes / no], auto for yes

mintpy.troposphericDelay.method = no  #[pyaps / height_correlation / no], auto for pyaps

mintpy.network.coherenceBased            = yes

mintpy.network.minCoherence    = 0.7

mintpy.network.aoiLALO         = -36.659:-36.560,-70.490:-70.370  #[lat0:lat1,lon0:lon1 / no], auto for no - use the whole area

mintpy.load.processor      = roipac  #[isce,roipac,gamma,], auto for isce
mintpy.load.updateMode     = auto  #[yes / no], auto for yes, skip re-loading if HDF5 files are complete
mintpy.load.compression    = auto  #[gzip / lzf / no], auto for no [recommended].
##---------interferogram datasets:
mintpy.load.unwFile        = /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_msk_cut.unw
mintpy.load.corFile        = /u/puma-r0/paul/CHILLAN/ALOS2/A034/TS/DATA/geo*_cut.cor

Python 3

Hello! Are there any plans to support Python 3?

move notebooks out of mintpy repo

This issue is to track the process of moving the Jupyter notebooks out of the MintPy repository, because they are 1) generally much larger than the scripts and 2) likely to see frequent detail changes, leading to unnecessary bloat of the commit history.

To do list:

  • move the following notebooks to new repository insarlab/MintPy-tutorial
    • docs/tutorials
    • docs/examples/applications
    • docs/examples/simulations
  • move docs/paper to new repository geodesymiami/Yunjun_et_al-2019-MintPy.

Links to these notebooks will still be kept in the MintPy repo.

Dask Throws Exception Once in a While due to `gamma` import

This Exception used to be thrown once in a while when running with Dask, but now it is thrown consistently (the last 4 times I have run the code with Dask).

@yunjunz , you mentioned that we could just add the gamma function to our code and that would solve the problem. Is that still possible?

Traceback (most recent call last):
  File "pysar/ifgram_inversion.py", line 1220, in <module>
    main()
  File "pysar/ifgram_inversion.py", line 1211, in main
    ifgram_inversion(inps.ifgramStackFile, inps)
  File "pysar/ifgram_inversion.py", line 1117, in ifgram_inversion
    for future, result in as_completed(futures, with_results=True):
  File "/nethome/dwg11/anaconda2/envs/pysar_new/lib/python3.6/site-packages/distributed/client.py", line 3858, in __next__
    return self._get_and_raise()
  File "/nethome/dwg11/anaconda2/envs/pysar_new/lib/python3.6/site-packages/distributed/client.py", line 3849, in _get_and_raise
    six.reraise(*result)
  File "/nethome/dwg11/anaconda2/envs/pysar_new/lib/python3.6/site-packages/six.py", line 692, in reraise
    raise value.with_traceback(tb)
  File "/nethome/dwg11/anaconda2/envs/pysar_new/lib/python3.6/site-packages/distributed/protocol/pickle.py", line 59, in loads
    return pickle.loads(x)
  File "/nethome/dwg11/anaconda2/envs/pysar_new/lib/python3.6/site-packages/numpy/core/__init__.py", line 149, in _ufunc_reconstruct
    return getattr(mod, name)
AttributeError: module '__mp_main__' has no attribute 'gamma'

new features for save_gbis.py

@ranneylxr, @falkamelung, let's discuss here regarding new features for save_gbis.py #174, geodesymiami/mimtpy#1. I can think of the following:

The input requirement for GBIS is simple: displacement measurements in point format, in units of radians.

  1. Support for more file types: it currently supports velocity and *.unw files (the latter generated with save_roipac.py). Support for HDF-EOS5, timeseries and ifgramStack files is desired.

  2. Regarding multiple tracks of datasets, I don't think it's a good idea to include them in save_gbis.py: GBIS handles each track in a separate mat file, so for multiple mat files one can simply run save_gbis.py multiple times. Merging multiple tracks into one command line would bring unnecessary complexity.

What do you think?
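
The radians requirement above is the usual two-way phase conversion; a minimal sketch (the Sentinel-1 C-band wavelength is assumed here purely for illustration, and the sign follows the ROI_PAC convention of phase = -4*pi/wavelength * range change):

```python
import math

# Assumed wavelength: Sentinel-1 C-band, ~5.55 cm. Adjust per sensor.
WAVELENGTH = 0.0555  # meters

def los_m_to_radian(displacement_m, wavelength=WAVELENGTH):
    """Convert line-of-sight displacement in meters to two-way phase in radians."""
    return -4.0 * math.pi / wavelength * displacement_m
```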

Possibly add python>=3.6 to docs/conda.txt

This is related to #76

When combining multiple requirements.txt files (conda.txt for MintPy) to use GDAL 3.0 features with MintPy, conda's automatic dependency resolution downgrades Python to 2.7 to satisfy all the requirements. Adding python>=3.6 to conda.txt guarantees that a Python 3 environment is set up and that Python doesn't get accidentally downgraded.

It would help to include this in the requirements file, so that environments where multiple tools operate in the same env can be set up by simply combining the requirements.txt files.

For reference, adding these 3 lines to conda.txt allows us to interoperate with GDAL 3.0:

gdal>=3.0
libgdal
python>=3.6
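
The combination step itself can be automated; a stdlib sketch of the merge idea (a hypothetical helper, not part of MintPy or conda), where the last-seen spec for each package wins so that pins like python>=3.6 survive:

```python
def merge_requirements(*file_contents):
    """Merge several requirements-style texts; later specs override earlier ones."""
    merged = {}
    for content in file_contents:
        for line in content.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Package name is everything before the first version operator.
            name = line.split(">=")[0].split("==")[0].split("<")[0].strip()
            merged[name] = line
    return sorted(merged.values())
```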

Support different GPS solution providers

Currently MintPy supports only the UNR GPS solutions. It would be nice to have an option to select the GPS data source; for example, people from Scripps, MIT and JPL would like to use the SOPAC, PBO and JPL GPS solutions, respectively. One way to do this would be to support the UNAVCO web services, where some of these GPS solutions are available.


geocode.py usage example not correct

Not a big deal, but there is some confusion with the geocode usage example. geocode.py --help prints out at the end:

example:
  geocode.py velocity.h5
  geocode.py velocity.h5 -b -0.5 -0.25 -91.3 -91.1
  geocode.py velocity.h5 timeseries.h5 -t pysarApp_template.txt -o ./GEOCODE --update

Although -o is described as the option for an output file, the user expects that
geocode.py velocity.h5 timeseries.h5 -t pysarApp_template.txt -o ./GEOCODE --update
will write into the GEOCODE directory (as pysarApp does). This is confusing but not really a bug.

geocode.py velocity.h5 -t pysarApp_template.txt -o qqq --update works as expected; however, with two input files the qqq is ignored and I get geo_velocity.h5 and geo_timeseries.h5.

Adding topsStack, ssara support to tropo_pyaps3.py ?

Hi @yunjunz @hfattahi @bhuvankumaru @mirzaees ,
I generally skip atmospheric corrections because downloading the atmospheric models can take a very long time. We should start tropo_pyaps3.py at the beginning of the SSARA download or the topsStack processing. We (Xiaoran @ranneylxr) are considering adding options to read the dates and times from SAFE_file_list.txt or from the ssara_federated_query.py --print output. Given that we already have --date-list, we would do:

tropo_pyaps3.py  --SAFE-list SAFE_file_list.txt
tropo_pyaps3.py  --ssara_list ssara_listing.txt

Any thoughts?

Here are the frames that would be supported:

cat SAFE_files.txt 
/projects/scratch/insarlab/famelung/BolnaySenDT150/SLC/S1B_IW_SLC__1SDV_20170619T234755_20170619T234823_006126_00AC37_040E.zip
/projects/scratch/insarlab/famelung/BolnaySenDT150/SLC/S1A_IW_SLC__1SDV_20160712T234822_20160712T234852_012122_012C64_F3BA.zip
/projects/scratch/insarlab/famelung/BolnaySenDT150/SLC/S1A_IW_SLC__1SDV_20161203T234810_20161203T234836_014222_016FD8_3BAB.zip
/projects/scratch/insarlab/famelung/BolnaySenDT150/SLC/S1A_IW_SLC__1SDV_20170120T234807_20170120T234834_014922_0185A3_2BCC.zip


cat ./ssara_listing.txt
//login4/projects/scratch/insarlab/famelung/WenchuanSenDT62/SLC[1086] cat ./ssara_listing.txt
Running SSARA API Query:  https://web-services.unavco.org/brokered/ssara/api/sar/search?platform=SENTINEL-1A%2CSENTINEL-1B&relativeOrbit=62&start=%3D2014-01-01&processingLevel=L0%2CL1.0%2CSLC&intersectsWith=Polygon%28%28102.10+30.40%2C+102.10+31.60%2C+105.40+31.60%2C+105.40+30.40%2C+102.10+30.40%29%29
SSARA API query: 34.263800 seconds
Found 292 scenes
wget -O dem.tif "http://ot-data1.sdsc.edu:9090/otr/getdem?north=33.824873&south=28.222164&east=105.754675&west=101.805635&demtype=SRTMGL1_E"
gdal_translate -of GMT -ot Int16 -projwin 101.805635 33.824873 105.754675 28.222164 /vsicurl/https://cloud.sdsc.edu/v1/AUTH_opentopography/Raster/SRTM_GL1_Ellip/SRTM_GL1_Ellip_srtm.vrt dem.grd
ASF,Sentinel-1A,2759,2014-10-09T23:03:54.000000,2014-10-09T23:04:21.000000,62,2952,2952,IW,NA,DESCENDING,R,VV,https://datapool.asf.alaska.edu/SLC/SA/S1A_IW_SLC__1SSV_20141009T230354_20141009T230421_002759_003198_8E1D.zip
ASF,Sentinel-1A,2759,2014-10-09T23:04:19.000000,2014-10-09T23:04:47.000000,62,2981,2981,IW,NA,DESCENDING,R,VV,https://datapool.asf.alaska.edu/SLC/SA/S1A_IW_SLC__1SSV_20141009T230419_20141009T230447_002759_003198_7263.zip
ASF,Sentinel-1A,3109,2014-11-02T23:03:54.000000,2014-11-02T23:04:21.000000,62,2952,2952,IW,NA,DESCENDING,R,VV,https://datapool.asf.alaska.edu/SLC/SA/S1A_IW_SLC__1SSV_20141102T230354_20141102T230421_003109_00390D_8B4F.zip


asc_desc2horz_vert.py is too strict about reference point matching

Description of the problem

asc_desc2horz_vert.py checks that the reference point latitude and longitude of the two input datasets are the same to 4 decimal digits. The rounding of the reference point location can differ even when the reference point is set to the same value. Maybe the reference point is rounded to the nearest pixel of each dataset and the two datasets are not geocoded on the same grid?

Full script that generated the error

add_attribute.py A135_geo_timeseriesStepModel_msk.h5 ref_lat=18.2 ref_lon=-67.15
add_attribute.py D098_geo_timeseriesStepModel_msk.h5 ref_lat=18.2 ref_lon=-67.15
asc_desc2horz_vert.py A135_geo_timeseriesStepModel_msk.h5 D098_geo_timeseriesStepModel_msk.h5

Full error message

Note: I added an extra line to the script to report the reference point values when it finds they are different. I was able to get the script to work by changing the number of digits used for checking the reference point to 2 decimal places. I am not sure whether that is the best solution or whether the way the reference point is stored in the HDF5 files should be changed.

Using default MintPy Path: /home/fielding/tools/MintPy
---------------------
reading A135_geo_timeseriesStepModel_msk.h5
reading D098_geo_timeseriesStepModel_msk.h5
---------------------
get design matrix
incidence angle: 38.95743031204843
heading angle: 349.125244140625
incidence angle: 33.72858887716663
heading angle: 190.9101104736328
project asc/desc into horz/vert direction
---------------------
writing horizontal component to file: hz.h5
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 271, in <module>
    main(sys.argv[1:])
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 260, in main
    writefile.write(dH, out_file=inps.outfile[0], metadata=atr)
  File "/home/fielding/tools/MintPy/mintpy/utils/writefile.py", line 56, in write
    raise Exception('Can not write {} file without reference file!'.format(k))
Exception: Can not write timeseries file without reference file!
(mintpy) [fielding@sar combo]$ asc_desc2horz_vert.py A135_geo_timeseriesStepModel_msk.h5 D098_geo_timeseriesStepModel_msk.h5
Using default MintPy Path: /home/fielding/tools/MintPy
data1 reference point ['18.1999', '-67.1501'] data2 reference point ['18.2006', '-67.1495']
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 271, in <module>
    main(sys.argv[1:])
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 251, in main
    inps = cmd_line_parse(iargs)
  File "/home/fielding/tools/MintPy/mintpy/asc_desc2horz_vert.py", line 99, in cmd_line_parse
    raise ValueError('input files do not have the same reference point from REF_LAT/LON values')
ValueError: input files do not have the same reference point from REF_LAT/LON values
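
The two reference points reported above differ by less than one pixel (~0.001 degrees), so a tolerance tied to the pixel spacing, rather than a 4-digit string comparison, would accept them. A sketch of such a check (hypothetical, not the MintPy implementation):

```python
def same_ref_point(ref1, ref2, pixel_size_deg):
    """Compare two (lat, lon) reference points within half a pixel spacing."""
    tol = pixel_size_deg / 2.0
    return all(abs(a - b) <= tol for a, b in zip(ref1, ref2))
```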

System information

  • Operating system:
  • Version of Python and relevant dependencies if applicable
  • Version of MintPy (output of smallbaselineApp.py -v):

Problem loading ISCE topsStack output with inconsistent size

Description of the problem

Loading data failed for Sentinel-1 data processed with the topsStack processor.

smallbaselineApp.py RidgecrestSenA064.txt --dostep load_data

This is my config file:

# vim: set filetype=cfg:
##----------------------------- SentinelStack/ISCE ---------------------##
cleanopt                        = 1    # [ 0 / 1 / 2 / 3 4]   0: no cleaning, 1: largest files, 2: merged/etc, PROCESS dirs, 3: SLC,RAW, 4: everything
ssaraopt                        = --platform=SENTINEL-1A,SENTINEL-1B -r 128 -f 587,588,589,590,591,592,593  -e 2018-07-01
processor                       = isce
sentinelStack.demDir            = /nethome/famelung/Sentinel/GalapagosT128SenVVD/DEM
sentinelStack.boundingBox       = '-1 0.15 -91.6 -90.9'
sentinelStack.subswath          = 1 2  # comment 
sentinelStack.numConnections    = 5   # comment
sentinelStack.azimuthLooks      = 5   # comment
sentinelStack.rangeLooks        = 15  # comment
sentinelStack.filtStrength      = 0.2  # comment
sentinelStack.unwMethod         = snaphu  # comment
sentinelStack.coregistration    = auto  # comment



##-------------------------------- MintPy -----------------------------##
mintpy.load.processor        = isce
##---------for ISCE only:
mintpy.load.metaFile         = ../master/IW*.xml
mintpy.load.baselineDir      = ../baselines
##---------interferogram datasets:
mintpy.load.unwFile          = ../merged/interferograms/*/filt_*.unw
mintpy.load.corFile          = ../merged/interferograms/*/filt_*.cor
mintpy.load.connCompFile     = ../merged/interferograms/*/filt_*.unw.conncomp
##---------geometry datasets:
mintpy.load.demFile          = ../merged/geom_master/hgt.rdr
mintpy.load.lookupYFile      = ../merged/geom_master/lat.rdr
mintpy.load.lookupXFile      = ../merged/geom_master/lon.rdr
mintpy.load.incAngleFile     = ../merged/geom_master/los.rdr
mintpy.load.azAngleFile      = ../merged/geom_master/los.rdr
mintpy.load.shadowMaskFile   = ../merged/geom_master/shadowMask.rdr
mintpy.load.waterMaskFile    = ../merged/geom_master/waterMask.rdr
mintpy.load.bperpFile        = ../merged/baseline_grid/*/bperp.rdr

#mintpy.subset.yx                         = 400:2400,0:1700
#mintpy.reference.lalo                    = -0.31,-91.22
mintpy.network.endDate                   = 20190704
mintpy.unwrapError.method                = phase_closure   #[bridging / phase_closure / no], auto for no
#mintpy.topographicResidual.stepFuncDate  = 20150524,20150616,20170321,20170910,20180613  #Wolf,Wolf,CerroAzul,Fernandina,Fernandina
#mintpy.deramp                            = linear

Full error message

searching interferometric pairs info
input data files:
unwrapPhase     : ../merged/interferograms/*/filt_*.unw
coherence       : ../merged/interferograms/*/filt_*.cor
connectComponent: ../merged/interferograms/*/filt_*.unw.conncomp
number of unwrapPhase     : 255
number of coherence       : 255
number of connectComponent: 255
--------------------------------------------------
searching geometry files info
input data files:
height          : ../merged/geom_master/hgt.rdr
latitude        : ../merged/geom_master/lat.rdr
longitude       : ../merged/geom_master/lon.rdr
incidenceAngle  : ../merged/geom_master/los.rdr
azimuthAngle    : ../merged/geom_master/los.rdr
shadowMask      : ../merged/geom_master/shadowMask.rdr
--------------------------------------------------
updateMode : True
compression: None
--------------------------------------------------
create HDF5 file /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/inputs/ifgramStack.h5 with w mode
create dataset /unwrapPhase      of <class 'numpy.float32'>   in size of (255, 3988, 9791) with compression = None
[=======>                15%                       ] 20151017_20151029   29s /   167sTraceback (most recent call last):
  File "/home/fielding/python/MintPy/mintpy/smallbaselineApp.py", line 1077, in <module>
    main()
  File "/home/fielding/python/MintPy/mintpy/smallbaselineApp.py", line 1067, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/home/fielding/python/MintPy/mintpy/smallbaselineApp.py", line 996, in run
    self.run_load_data(sname)
  File "/home/fielding/python/MintPy/mintpy/smallbaselineApp.py", line 364, in run_load_data
    mintpy.load_data.main(scp_args.split())
  File "/home/fielding/python/MintPy/mintpy/load_data.py", line 589, in main
    extra_metadata=extraDict)
  File "/home/fielding/python/MintPy/mintpy/objects/stackDict.py", line 153, in write2hdf5
    ds[i, :, :] = data
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/home/fielding/anaconda3/envs/mintpy/lib/python3.7/site-packages/h5py/_hl/dataset.py", line 707, in __setitem__
    for fspace in selection.broadcast(mshape):
  File "/home/fielding/anaconda3/envs/mintpy/lib/python3.7/site-packages/h5py/_hl/selections.py", line 299, in broadcast
    raise TypeError("Can't broadcast %s -> %s" % (target_shape, self.mshape))
TypeError: Can't broadcast (3988, 6945) -> (3988, 9791)
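
The broadcast error means one interferogram was multilooked to a different width (6945 vs 9791 columns), likely from a partially reprocessed stack. A pre-flight consistency check could catch this before the HDF5 write starts (an illustrative sketch, not a MintPy function; the sizes would come from each pair's .xml/.rsc metadata):

```python
from collections import Counter

def find_odd_sizes(sizes):
    """sizes: {pair_name: (length, width)} -> sorted list of outlier pairs."""
    # The most common size is taken as the expected stack size.
    common, _ = Counter(sizes.values()).most_common(1)[0]
    return sorted(pair for pair, size in sizes.items() if size != common)
```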

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: Python 3.7 from Anaconda
  • Version of MintPy (output of smallbaselineApp.py -v):

No default reference_date

Description of the problem

Running the full smallbaselineApp.py with the tropospheric correction and topographic residual correction turned off fails when it reaches the "reference_date" step. I thought it would automatically set the reference date to the first date.

Full script that generated the error

Contents of my configuration file.

mintpy.reference.yx           = auto
#mintpy.reference.lalo        = -35.5,-117.7
#mintpy.topographicResidual.stepFuncDate  = 20170910,20180613  #eruption dates
#mintpy.deramp                = linear

mintpy.troposphericDelay.method         = no  #[pyaps / height_correlation / no], auto for pyaps
mintpy.troposphericDelay.weatherModel   = ERA5  #[ECMWF / MERRA / NARR], auto for ECMWF, for pyaps method
mintpy.troposphericDelay.weatherDir     = ./  #[path2directory], auto for WEATHER_DIR or "./"
mintpy.troposphericDelay.polyOrder      = auto  #[1 / 2 / 3], auto for 1, for height_correlation method
mintpy.troposphericDelay.looks          = auto  #[1-inf], auto for 8, for height_correlation, number of looks
mintpy.troposphericDelay.minCorrelation = auto  #[0.0-1.0], auto for 0, for height_correlation

mintpy.topographicResidual  = no
mintpy.residualRMS.maskFile = auto  #[file name / no], auto for maskTempCoh.h5, mask for ramp estimation
mintpy.residualRMS.deramp   = auto  #[quadratic / linear / no], auto for quadratic
mintpy.residualRMS.cutoff   = auto  #[0.0-inf], auto for 3

Full error message

...
generate_mask.py temporalCoherence.h5 -m 0.7 -o maskTempCoh.h5 --base /u/pez-z2/fielding/Calif/Ridgecrest/S1AB/D071/postseismic/MintPy_v1/inputs/geometryRadar.h5 --base-dataset shadowMask --base-value 1
update mode: ON
run or skip: run
input temporalCoherence file: temporalCoherence.h5
read temporalCoherence.h5
create initial mask with the same size as the input file and all = 1
all pixels with nan value = 0
exclude pixels with value < 0.7
exclude pixels in base file geometryRadar.h5 dataset shadowMask with value == 1.0
create HDF5 file: maskTempCoh.h5 with w mode
create dataset /mask of bool       in size of (3343, 9747)         with compression=None
finished writing to maskTempCoh.h5
time used: 00 mins 0.7 secs.
number of reliable pixels: 29101856


******************** step - correct_LOD ********************
No local oscillator drift correction is needed for Sen.


******************** step - correct_troposphere ********************
No tropospheric delay correction.


******************** step - deramp ********************
No phase ramp removal.


******************** step - correct_topography ********************
No topographic residual correction.


******************** step - residual_RMS ********************
No residual phase file found! Skip residual RMS analysis.


******************** step - reference_date ********************
reference_date.py -t /u/pez-z2/fielding/Calif/Ridgecrest/S1AB/D071/postseismic/MintPy_v1/smallbaselineApp.cfg  timeseries.h5
input reference date: reference_date.txt
Traceback (most recent call last):
  File "/home/fielding/tools/MintPy/mintpy/smallbaselineApp.py", line 1077, in <module>
    main()
  File "/home/fielding/tools/MintPy/mintpy/smallbaselineApp.py", line 1067, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/home/fielding/tools/MintPy/mintpy/smallbaselineApp.py", line 1029, in run
    self.run_reference_date(sname)
  File "/home/fielding/tools/MintPy/mintpy/smallbaselineApp.py", line 824, in run_reference_date
    mintpy.reference_date.main(scp_args.split())
  File "/home/fielding/tools/MintPy/mintpy/reference_date.py", line 142, in main
    inps.refDate = read_ref_date(inps)
  File "/home/fielding/tools/MintPy/mintpy/reference_date.py", line 90, in read_ref_date
    raise Exception(msg)
Exception: input reference date: reference_date.txt is not found.
All available dates:
['20190716', '20190728', '20190809', '20190815', '20190821', '20190827']
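
A simple workaround, assuming the intent is to reference to the earliest acquisition (the default behavior the poster expected), is to write reference_date.txt by hand:

```python
# Dates listed in the error message above, in YYYYMMDD format, so the
# lexicographic minimum is also the earliest acquisition.
dates = ['20190716', '20190728', '20190809', '20190815', '20190821', '20190827']
ref_date = min(dates)

with open('reference_date.txt', 'w') as f:
    f.write(ref_date + '\n')
```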

System information

  • Operating system:MacOS
  • Version of Python and relevant dependencies if applicable 3.7
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-23, release date 2019-08-28

Core dumped when using view.py

Description of the problem
After generating a time series using the latest '49d89fb' commit, I am unable to use view.py to visualize any outputs without it crashing. I should mention that I am also unable to view existing outputs from older runs, so I don't think the outputs from the latest commits are corrupted.

Full error message

(mintpy) [ssangha@leffe run1_12312019]$ view.py /u/leffe-data/dbekaert/ARIA_CA/CA_144/DWR/run1_01022020/mintpy/velocity.h5 velocity
run view.py in MintPy release version v1.2.0-12, release date 2020-02-09
input file is velocity file: /u/leffe-data/dbekaert/ARIA_CA/CA_144/DWR/run1_01022020/mintpy/velocity.h5 in float32 format
file size in y/x: (7528, 5027)
turning glob search OFF for velocity file
num of datasets in file velocity.h5: 2
datasets to exclude (0):
[]
datasets to display (1):
['velocity']
data   coverage in y/x: (0, 0, 5027, 7528)
subset coverage in y/x: (0, 0, 5027, 7528)
data   coverage in lat/lon: (-121.193758034, 40.049519538, -117.00420596400001, 33.776083322)
subset coverage in lat/lon: (-121.193758034, 40.049519538, -117.00420596400001, 33.776083322)
------------------------------------------------------------------------
colormap: jet
figure title: velocity
figure size : [6.0, 7.1875210842766935]
read mask from file: maskTempCoh.h5
reading data ...
masking data
data    range: [-12.647075, 3.8018768] cm/year
display range: [-12.647075, 3.8018768] cm/year
This application failed to start because it could not find or load the Qt platform plugin "xcb"
in "".

Reinstalling the application may fix this problem.
Abort (core dumped)

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: Python 3.7.6
  • Version of MintPy (output of smallbaselineApp.py -v): v1.2.0-12, release date 2020-02-09

dask under LSF broken with new conda environment (plus Miami fix)

NOTE: It turns out to be a conda environment issue. An old environment (from May 7, 2019) works fine, but I was not able to recreate it (I tried pip install -r requirement_good.txt but got too many failures; I also tried reinstalling from the May 7 conda.txt but some packages did not install). There are also differences in module version numbers.

If you work on pegasus in Miami, the temporary fix is: cd $PARENTDIR/3rdparty; ln -s /nethome/famelung/MINICONDA3_GOOD miniconda3

The issue is that dask workers don't start when ifgram_inversion is called via bsub. Below is my job file. When I copy-paste the commands into the shell (starting with cd /nethome/famelung/test/development/rsmas_insar), it works fine and I get the output below. However, when I run it as bsub < ifgram_inversion.job, it starts the job but does not start the workers.

//login4/nethome/famelung/test_mintpy[1249] cat ifgram_inversion.job 
#! /bin/bash
#BSUB -J run_13_smallbaseline_0
#BSUB -P insarlab
#BSUB -n 1
#BSUB -R span[hosts=1]
#BSUB -o run_13_smallbaseline_0_%J.o
#BSUB -e run_13_smallbaseline_0_%J.e
#BSUB -q general
#BSUB -W 5:00
#BSUB -R rusage[mem=2000]
free
cd /nethome/famelung/test/development/rsmas_insar
source default_isce22.bash
cd /nethome/famelung/test_mintpy
export PYTHON3DIR=$PWD/miniconda3
export MINTPY_HOME=$PWD/MintPy
export PATH=${PYTHON3DIR}/bin:${PATH}
export PYTHONPATH=${MINTPY_HOME}
export SCRATCHDIR=$PWD
echo "PYTHONPATH: $PYTHONPATH"

cd /nethome/famelung/test_mintpy/unittestGalapagosSenDT128/mintpy
$MINTPY_HOME/mintpy/ifgram_inversion.py /nethome/famelung/test_mintpy/unittestGalapagosSenDT128/mintpy/inputs/ifgramStack.h5 -t /nethome/famelung/test_mintpy/unittestGalapagosSenDT128/mintpy/smallbaselineApp.cfg --update

Good output (when started from shell):

$MINTPY_HOME/mintpy/ifgram_inversion.py /nethome/famelung/test_mintpy/unittestGalapagosSenDT128/mintpy/inputs/ifgramStack.h5 -t /nethome/famelung/test_mintpy/unittestGalapagosSenDT128/mintpy/smallbaselineApp.cfg --update
--------------------------------------------------
update mode: ON
1) NOT ALL output files found: ['timeseries.h5', 'temporalCoherence.h5'].
run or skip: run.
-------------------------------------------------------------------------------
least-squares solution with L2 min-norm on: deformation velocity
minimum redundancy: 1.0
weight function: var
mask: no
-------------------------------------------------------------------------------
number of interferograms: 20
number of acquisitions  : 7
number of lines   : 833
number of columns : 1364
reference pixel in y/x: (455, 842) from dataset: unwrapPhase
JOB COMMAND CALLED FROM PYTHON: #!/usr/bin/env bash

#BSUB -J mintpy_bee
#BSUB -q general
#BSUB -P insarlab
#BSUB -n 2
#BSUB -R "span[hosts=1]"
#BSUB -M 4000
#BSUB -W 00:15
#BSUB -R "rusage[mem=2500]"
#BSUB -o worker_mintpy.%J.o
#BSUB -e worker_mintpy.%J.e
JOB_ID=${LSB_JOBID%.*}



/nethome/famelung/test_mintpy/miniconda3/bin/python3 -m distributed.cli.dask_worker tcp://10.11.1.13:47567 --nthreads 2 --memory-limit 4.00GB --name mintpy_bee--${JOB_ID}-- --death-timeout 60 --interface ib0

0 [0, 0, 34, 833]
1 [34, 0, 68, 833]
2 [68, 0, 102, 833]
3 [102, 0, 136, 833]
4 [136, 0, 170, 833]
5 [170, 0, 204, 833]
6 [204, 0, 238, 833]
7 [238, 0, 272, 833]
8 [272, 0, 306, 833]
9 [306, 0, 341, 833]
10 [341, 0, 375, 833]
11 [375, 0, 409, 833]
12 [409, 0, 443, 833]
13 [443, 0, 477, 833]
14 [477, 0, 511, 833]
15 [511, 0, 545, 833]
16 [545, 0, 579, 833]
17 [579, 0, 613, 833]
18 [613, 0, 647, 833]
19 [647, 0, 682, 833]
20 [682, 0, 716, 833]
21 [716, 0, 750, 833]
22 [750, 0, 784, 833]
23 [784, 0, 818, 833]
24 [818, 0, 852, 833]
25 [852, 0, 886, 833]
26 [886, 0, 920, 833]
27 [920, 0, 954, 833]
28 [954, 0, 988, 833]
29 [988, 0, 1023, 833]
30 [1023, 0, 1057, 833]
31 [1057, 0, 1091, 833]
32 [1091, 0, 1125, 833]
33 [1125, 0, 1159, 833]
34 [1159, 0, 1193, 833]
35 [1193, 0, 1227, 833]
36 [1227, 0, 1261, 833]
37 [1261, 0, 1295, 833]
38 [1295, 0, 1329, 833]
39 [1329, 0, 1364, 833]
FUTURE #1 complete in 24.31698513031006 seconds. Box: [1329, 0, 1364, 833] Time: 1564664234.3381693
FUTURE #2 complete in 30.23323893547058 seconds. Box: [272, 0, 306, 833] Time: 1564664240.2544231
FUTURE #3 complete in 44.892457723617554 seconds. Box: [818, 0, 852, 833] Time: 1564664254.9136415
FUTURE #4 complete in 51.77775502204895 seconds. Box: [0, 0, 34, 833] Time: 1564664261.798939
FUTURE #5 complete in 69.37102770805359 seconds. Box: [579, 0, 613, 833] Time: 1564664279.3922114
FUTURE #6 complete in 75.06305527687073 seconds. Box: [1125, 0, 1159, 833] Time: 1564664285.0842392
FUTURE #7 complete in 88.02767276763916 seconds. Box: [306, 0, 341, 833] Time: 1564664298.0488567
FUTURE #8 complete in 96.89408779144287 seconds. Box: [852, 0, 886, 833] Time: 1564664306.915272
FUTURE #9 complete in 112.87290334701538 seconds. Box: [613, 0, 647, 833] Time: 1564664322.8940873
FUTURE #10 complete in 119.6050398349762 seconds. Box: [68, 0, 102, 833] Time: 1564664329.6262236
FUTURE #11 complete in 134.82669591903687 seconds. Box: [1159, 0, 1193, 833] Time: 1564664344.84788
FUTURE #12 complete in 142.95256853103638 seconds. Box: [341, 0, 375, 833] Time: 1564664352.9737759
FUTURE #13 complete in 157.9442274570465 seconds. Box: [886, 0, 920, 833] Time: 1564664367.9654114
FUTURE #14 complete in 169.349839925766 seconds. Box: [102, 0, 136, 833] Time: 1564664379.371024
FUTURE #15 complete in 183.52355813980103 seconds. Box: [647, 0, 682, 833] Time: 1564664393.544742
FUTURE #16 complete in 190.4077000617981 seconds. Box: [1193, 0, 1227, 833] Time: 1564664400.4288838
FUTURE #17 complete in 209.62204456329346 seconds. Box: [34, 0, 68, 833] Time: 1564664419.6432285
FUTURE #18 complete in 212.5091257095337 seconds. Box: [375, 0, 409, 833] Time: 1564664422.53031
FUTURE #19 complete in 231.66367363929749 seconds. Box: [920, 0, 954, 833] Time: 1564664441.6848578
FUTURE #20 complete in 235.89809203147888 seconds. Box: [136, 0, 170, 833] Time: 1564664445.919276
FUTURE #21 complete in 257.9646050930023 seconds. Box: [682, 0, 716, 833] Time: 1564664467.9857888
FUTURE #22 complete in 258.6260116100311 seconds. Box: [409, 0, 443, 833] Time: 1564664468.6471953
FUTURE #23 complete in 279.2651083469391 seconds. Box: [1227, 0, 1261, 833] Time: 1564664489.2862923
FUTURE #24 complete in 282.21773505210876 seconds. Box: [170, 0, 204, 833] Time: 1564664492.238918
FUTURE #25 complete in 305.6829824447632 seconds. Box: [954, 0, 988, 833] Time: 1564664515.7041667
FUTURE #26 complete in 307.676016330719 seconds. Box: [1261, 0, 1295, 833] Time: 1564664517.6972003
FUTURE #27 complete in 328.68252635002136 seconds. Box: [716, 0, 750, 833] Time: 1564664538.7037106
FUTURE #28 complete in 332.4500389099121 seconds. Box: [988, 0, 1023, 833] Time: 1564664542.4712229
FUTURE #29 complete in 350.7683844566345 seconds. Box: [443, 0, 477, 833] Time: 1564664560.7895684
FUTURE #30 complete in 354.5441644191742 seconds. Box: [204, 0, 238, 833] Time: 1564664564.5653486
FUTURE #31 complete in 376.24752926826477 seconds. Box: [750, 0, 784, 833] Time: 1564664586.2687135
FUTURE #32 complete in 378.82397508621216 seconds. Box: [1091, 0, 1125, 833] Time: 1564664588.8451588
FUTURE #33 complete in 399.39640045166016 seconds. Box: [1295, 0, 1329, 833] Time: 1564664609.4175844
FUTURE #34 complete in 401.1583776473999 seconds. Box: [477, 0, 511, 833] Time: 1564664611.1795607
FUTURE #35 complete in 423.41696190834045 seconds. Box: [1023, 0, 1057, 833] Time: 1564664633.438146
FUTURE #36 complete in 425.92578983306885 seconds. Box: [784, 0, 818, 833] Time: 1564664635.9469733
FUTURE #37 complete in 445.3375630378723 seconds. Box: [238, 0, 272, 833] Time: 1564664655.3587465
FUTURE #38 complete in 449.5472912788391 seconds. Box: [1329, 0, 1364, 833] Time: 1564664659.5684752
FUTURE #39 complete in 471.0708589553833 seconds. Box: [511, 0, 545, 833] Time: 1564664681.092043
FUTURE #40 complete in 471.2488474845886 seconds. Box: [1057, 0, 1091, 833] Time: 1564664681.2700312
--------------------------------------------------
converting phase to range
calculating perpendicular baseline timeseries

Output when it stalls (started with bsub < ifgram_inversion.job):

//login4/nethome/famelung/test_mintpy[1259] bjobs
JOBID     USER    STAT  QUEUE      FROM_HOST   EXEC_HOST   JOB_NAME   SUBMIT_TIME
22735449  famelun RUN   general    login3      n284        *aseline_0 Aug  1 09:33
//login4/nethome/famelung/test_mintpy[1260] bpeek 22735449
<< output from stdout >>
             total       used       free     shared    buffers     cached
Mem:      32990384   30121180    2869204          0          0    1963360
-/+ buffers/cache:   28157820    4832564
Swap:            0          0          0
sourcing /nethome/famelung/test/development/rsmas_insar/default_isce22.bash ...
PYTHONPATH: /nethome/famelung/test_mintpy/MintPy

<< output from stderr >>

For individual workers, the output should be (indicating it works well):

bpeek 22739252
<< output from stdout >>
distributed.nanny - INFO -         Start Nanny at: 'tcp://10.11.108.18:44454'
distributed.worker - INFO -       Start worker at:   tcp://10.11.108.18:40307
distributed.worker - INFO -          Listening to:   tcp://10.11.108.18:40307
distributed.worker - INFO -              nanny at:         10.11.108.18:44454
distributed.worker - INFO -              bokeh at:         10.11.108.18:50397
distributed.worker - INFO - Waiting to connect to:     tcp://10.11.1.14:59394
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -               Threads:                          2
distributed.worker - INFO -                Memory:                    2.00 GB
distributed.worker - INFO -       Local Directory: /scratch/projects/insarlab/famelung/unittestGalapagosSenDT128/PYSAR/worker-fwzwhah5
distributed.worker - INFO - -------------------------------------------------
distributed.worker - INFO -         Registered to:     tcp://10.11.1.14:59394
distributed.worker - INFO - -------------------------------------------------
distributed.core - INFO - Starting established connection
BOX DIMS: [34, 0, 68, 833]
BOX DIMS: [1261, 0, 1295, 833]
reading unwrapPhase in [34, 0, 68, 833] * 20 ...
use input reference phase
reading unwrapPhase in [1261, 0, 1295, 833] * 20 ...
use input reference phase
skip pixels with zero/nan value in all interferograms
skip pixels with zero/nan value in all interferograms
number of pixels to invert: 28322 out of 28322 (100.0%)
reading coherence in [34, 0, 68, 833] * 20 ...
number of pixels to invert: 28322 out of 28322 (100.0%)
reading coherence in [1261, 0, 1295, 833] * 20 ...
convert coherence to weight using Fisher Information Index (Seymour & Cumming, 1994)
convert coherence to weight using Fisher Information Index (Seymour & Cumming, 1994)
inverting network of interferograms into time-series ...
inverting network of interferograms into time-series 

save_kmz.py not working if 'zip' is not available

Description of the problem
save_kmz.py expects the Unix command 'zip' to be available. If it is not (as on the compute nodes of our HPC system), you get sh: zip: command not found and no *.kmz is produced. It would be preferable to replace the Unix command with a Python one.

Full script that generated the error

Here is the offending line:

cmd = 'cd {d1}; zip {fz} {fl} {fd} {fc}; cd {d2}'.format(
    d1=os.path.abspath(os.path.dirname(kmz_file)),
    fz=os.path.basename(kmz_file),
    fl=os.path.basename(kml_file),
    fd=os.path.basename(data_png_file),
    fc=os.path.basename(cbar_file),
    d2=os.getcwd(),
)
print(cmd)
os.system(cmd)
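
A portable replacement using Python's standard zipfile module might look like the sketch below; the function name is illustrative, and the file-name variables follow the snippet above:

```python
import os
import zipfile

def write_kmz(kmz_file, kml_file, data_png_file, cbar_file):
    """Package the KML and image files into a KMZ archive without
    relying on the external 'zip' command."""
    with zipfile.ZipFile(kmz_file, 'w', zipfile.ZIP_DEFLATED) as fz:
        for fname in [kml_file, data_png_file, cbar_file]:
            # store each file under its basename, as 'zip' would do
            fz.write(fname, arcname=os.path.basename(fname))
```

This also removes the cd back and forth, since arcname controls the paths stored inside the archive.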


tracking no-data for GDAL datasets in ARIA products

@hfattahi I think we might need to account for the no-data values in the connected component data for ARIA, and perhaps in general for the unwrapped data etc. @bbuzz31 noticed that the unwrap correction using connected components complains about negative values.

For example, zero is a valid connected component label, and therefore we set the no-data value to -1 in the ARIA product. I suggest using something like the following to find the no-data value, and then setting it to a value MintPy knows to ignore. I am not sure whether this would need to be 0 or a NaN.

Loop over the bands at line 155 in prep_aria.py, each time a band is loaded:

bnd = dsUnw.GetRasterBand(ii + 1)
try:
    NoData = bnd.GetNoDataValue()
except Exception:
    NoData = None
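
Once the no-data value is known, the band could be masked before use. A minimal sketch with NumPy, where the -1 no-data value and the 0 fill value are the assumptions discussed above, not settled choices:

```python
import numpy as np

def mask_nodata(data, no_data, fill_value=0):
    """Replace the no-data value (e.g. -1 in the ARIA connected
    components) with a fill value that MintPy knows to ignore."""
    if no_data is not None:
        data = np.where(data == no_data, fill_value, data)
    return data
```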

progress reporting / logging

As MintPy uses print() statements to display progress to the screen (STDOUT), this information gets scrambled on clusters. For example, under LSF the bpeek command does not properly relay where MintPy is in the processing. I see two ways to resolve this.

  1. Proper logging using the Python logging module
  2. Adding sys.stdout.flush() at the end of each print statement

Obviously, (2) is much easier than (1). I will do this unless somebody yells. I will likely not do the entire code base, only those scripts that occasionally give trouble. I could look for a student to do the entire code base, though having a student add proper logging may not be a good idea, as it may need too much oversight.
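
Both options can be sketched in a few lines; in Python 3, option (2) does not even need a separate flush() call, and for option (1) the logger name is arbitrary:

```python
import logging
import sys

# option 2: flush each print immediately so bpeek sees it right away
print('inverting network of interferograms ...', flush=True)

# option 1: proper logging, writing to stdout with timestamps
logging.basicConfig(stream=sys.stdout, level=logging.INFO,
                    format='%(asctime)s %(levelname)s %(message)s')
log = logging.getLogger('mintpy')
log.info('inverting network of interferograms ...')
```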

Catch and report PyAPS Copernicus Climate Store credential error

Not sure if this is a bug report or a feature request.

The ERA5 PyAPS tropospheric delay correction currently fails with a hard-to-diagnose ZeroDivisionError in the progress bar class when the user's login credentials for https://cds.climate.copernicus.eu/ are not correct (or the user has not accepted the T&Cs). This makes a simple problem (user registration) very hard to figure out.

Going through the code and the error stack below reveals that the division error is a result of the date_list2grib_file function not returning a list of grib files, since it does not have access to the service. The progress bar then uses the length of this empty list to set its endpoint, resulting in the error.

I think one solution would be to catch the specific error in the try clause (line 398) of the dload_grib_files function in tropo_pyaps3.py.

calculating delay for each date using PyAPS (Jolivet et al., 2011; 2014) ...
number of grib files used: 0
Traceback (most recent call last):
  File "/home/app/app.py", line 10, in <module>
    smallbaselineApp.main()
  File "/home/python/MintPy/mintpy/smallbaselineApp.py", line 1069, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/home/python/MintPy/mintpy/smallbaselineApp.py", line 1019, in run
    self.run_tropospheric_delay_correction(sname)
  File "/home/python/MintPy/mintpy/smallbaselineApp.py", line 751, in run_tropospheric_delay_correction
    tropo_pyaps3.main(scp_args.split())
  File "/home/python/MintPy/mintpy/tropo_pyaps3.py", line 570, in main
    get_delay_timeseries(inps, atr)
  File "/home/python/MintPy/mintpy/tropo_pyaps3.py", line 523, in get_delay_timeseries
    prog_bar = ptime.progressBar(maxValue=num_date)
  File "/home/python/MintPy/mintpy/utils/ptime.py", line 277, in __init__
    self.reset()
  File "/home/python/MintPy/mintpy/utils/ptime.py", line 282, in reset
    self.update_amount(0)  # Build progress bar string
  File "/home/python/MintPy/mintpy/utils/ptime.py", line 296, in update_amount
    percentDone = (diffFromMin / np.float(self.span)) * 100.0
ZeroDivisionError: float division by zero
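
A sketch of the suggested catch; the function and message are illustrative, not the actual MintPy code:

```python
def check_grib_files(grib_files):
    """Fail early with a readable message instead of a ZeroDivisionError
    in the progress bar when no grib files could be downloaded."""
    if len(grib_files) == 0:
        raise RuntimeError(
            'No grib files downloaded. Check your login credentials for '
            'https://cds.climate.copernicus.eu/ and that you have '
            'accepted the Terms and Conditions of the dataset.')
    return grib_files
```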

enforced steps in smallbaselineApp.py

Currently smallbaselineApp.py expects timeseries_ramp_demErr.h5 to estimate "velocity". This means the 'deramp' and 'correct_topography' steps are mandatory before velocity estimation. I'm wondering whether enforcing these two steps before velocity estimation is intentional.

azimuthAngle from ariaTSsetup.py

The azimuthAngle from the current ariaTSsetup.py, for the SanFran Sen descending track, has an average value of -8.8, while the azimuthAngle from ISCE/topsStack (band 2 of los.rdr), for the Galapagos Sen descending track, has an average value of -102. They are very different! #139 (comment)

Below is the description from isce/topsStack merged/geom_master/los.rdr.xml:

Two channel Line-Of-Sight geometry image (all angles in degrees).
Represents vector drawn from target to platform.
Channel 1: Incidence angle measured from vertical at target (always +ve).
Channel 2: Azimuth angle measured from North in Anti-clockwise direction.

@ehavazli and @dbekaert, what is the corresponding definition of the azimuthAngle from ARIA-tools? The current value looks unreasonable to me under topsStack's definition, and this will cause problems when applying the enu2los conversion or the other way around.
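
For reference, under topsStack's convention quoted above (incidence measured from vertical, azimuth measured anti-clockwise from north), the ENU-to-LOS projection can be sketched as below. This is my reading of the convention, assumed for illustration, not ARIA-tools' definition:

```python
import numpy as np

def enu2los(v_e, v_n, v_u, inc_angle, az_angle):
    """Project east/north/up motion onto the radar line of sight.
    inc_angle: incidence angle from vertical at target [deg].
    az_angle : azimuth angle of the LOS direction, measured
               anti-clockwise from north [deg]."""
    inc = np.deg2rad(inc_angle)
    az = np.deg2rad(az_angle)
    return (-v_e * np.sin(inc) * np.sin(az)
            + v_n * np.sin(inc) * np.cos(az)
            + v_u * np.cos(inc))
```

Getting the az_angle convention wrong flips the sign of the horizontal contributions, which is why the -8.8 vs -102 discrepancy matters.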

regularly updated docker image

For convenience I have created a Docker image for MintPy on Docker Hub: https://hub.docker.com/r/andretheronsa/mintpy
If this would be useful for the project, could it be added as an installation option?

It provides an easy way to have a configured MintPy environment ready.
It is built on a base debian:stretch Linux image using the installation instructions (setting up conda, the environment, MintPy, PyAPS, pykml, etc.).

All you need is Docker installed (on Windows/Linux/macOS).
Pull the image from Docker Hub with: docker pull andretheronsa/mintpy:latest
Run an interactive shell session in the container with: docker run -it andretheronsa/mintpy:latest bash
To map your data folder on the host into the container, add -v /path/to/data/dir:/home/work/ to the docker run command.

This is the first version of the image and has not been tested much. It can also be made smaller.

data type for connected components dataset

  1. MintPy currently enforces the np.byte datatype for the connected components dataset. This means only values between -128 and 127 can be stored.
  2. The ISCE2 snaphu wrapper outputs at most 20 connected components and zeros out the other components by default (this can be controlled). But the largest label can be up to 255 in SNAPHU, as its data type is unsigned byte (needs to be verified). Therefore np.byte in MintPy is fine for regular ISCE outputs.
  3. ARIA tools stitches different unwrapped interferograms and relabels the connected components, so it is possible to have a large number of connected components over large areas.
  4. Other unwrappers (currently outside ISCE2) allow a much larger number of connected components.

With all this, I'm wondering if we should change the data type of the connected components in the MintPy HDF5 dataset to float?
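
The wrap-around of the signed byte type is easy to demonstrate; labels above 127 silently become negative:

```python
import numpy as np

# SNAPHU-style labels stored as unsigned byte (0..255)
labels = np.array([0, 20, 255], dtype=np.uint8)

# cast to MintPy's enforced np.byte (int8): values above 127 wrap around
wrapped = labels.astype(np.int8)
print(wrapped.tolist())  # [0, 20, -1]
```

Note that -1 here collides with the ARIA no-data convention discussed above, so a wider integer type (e.g. np.uint16) might be worth considering alongside float.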

Pixel weighting option

Hello,

I was wondering if there was any interest in creating a weighting scheme that weighted each pixel differently based on values from, e.g., connected components, filtered coherence, etc.

The reason for requesting this is that the connected components mask is too conservative for our purposes. It nullifies pixel values for "all time", even if the connected components are valid for most dates. Instead, it would be helpful for us to down-weight the phase at epochs when the connected component mask is 0, and keep the values when the mask is >= 1.

For instance, for each pixel, the weighting matrix would contain the variance values for each interferogram when the connected component is 1. But when the connected component is 0, the weighting value for that interferogram would be trivially small, e.g., 10^-5.
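
The proposed scheme could be sketched per pixel as follows; the 1e-5 floor and the variance input come from the description above, and this is an illustration, not an implemented MintPy option:

```python
import numpy as np

def conncomp_weight(variance, conncomp, floor=1e-5):
    """Per-interferogram weights for one pixel: inverse variance where
    the connected component is valid, a trivially small weight where
    the connected component mask is 0."""
    weight = 1.0 / variance
    weight[conncomp == 0] = floor
    return weight
```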

I would be happy to help contribute or explain further.

Is that something you are interested in?

Best regards,
Rob Zinke

MemoryError for geocoding with GAMMA lookup table

When processing, I have divided the data into 12 blocks, but the memory error is still raised. Dividing the data into more blocks reports the same error: "MemoryError: Unable to allocate array with shape (112, 3591, 4528) and data type float32"

******************** step - geocode ********************
geocode.py -l /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryGeo.h5 -t /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/smallbaselineApp.cfg --outdir /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/geo --update  /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryRadar.h5 temporalCoherence.h5 timeseries_tropHgt_demErr.h5 velocity.h5
read input option from template file: /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/smallbaselineApp.cfg
number of processor to be used: 1
--------------------------------------------------
resampling file: /media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryRadar.h5
['/media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/geo/geo_geometryRadar.h5'] exists and is newer than ['/media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryRadar.h5', '/media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryGeo.h5'] --> skip.
update mode is ON, skip geocoding.
--------------------------------------------------
resampling file: temporalCoherence.h5
['/media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/geo/geo_temporalCoherence.h5'] exists and is newer than ['temporalCoherence.h5', '/media/wu/文档/kunlun/dinsar/gamma_minpy/burst2/mintpy/inputs/geometryGeo.h5'] --> skip.
update mode is ON, skip geocoding.
--------------------------------------------------
resampling file: timeseries_tropHgt_demErr.h5
reading timeseries from timeseries_tropHgt_demErr.h5 ...
resampling using scipy.interpolate.RegularGridInterpolator ...
[==================================================]  101s /     2s
update REF_LAT/LON/Y/X
Traceback (most recent call last):
  File "/home/wu/python/MintPy/mintpy/smallbaselineApp.py", line 1090, in <module>
    main()
  File "/home/wu/python/MintPy/mintpy/smallbaselineApp.py", line 1080, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/home/wu/python/MintPy/mintpy/smallbaselineApp.py", line 1048, in run
    self.run_geocode(sname)
  File "/home/wu/python/MintPy/mintpy/smallbaselineApp.py", line 887, in run_geocode
    mintpy.geocode.main(scp_args.split())
  File "/home/wu/python/MintPy/mintpy/geocode.py", line 335, in main
    run_geocode(inps)
  File "/home/wu/python/MintPy/mintpy/geocode.py", line 324, in run_geocode
    writefile.write(dsResDict, out_file=outfile, metadata=atr, ref_file=infile)
  File "/home/wu/python/MintPy/mintpy/utils/writefile.py", line 61, in write
    compression=compression)
  File "/home/wu/python/MintPy/mintpy/objects/stack.py", line 367, in write2hdf5
    data = np.array(data, dtype=np.float32)
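
The failing allocation size can be checked by hand; the array named in the MemoryError above needs about 7.3 GB in one piece (the shape presumably being num_date x length x width):

```python
# size of the array named in the MemoryError above
shape = (112, 3591, 4528)                      # presumably (num_date, length, width)
nbytes = shape[0] * shape[1] * shape[2] * 4    # float32 = 4 bytes per element
print(f'{nbytes / 1e9:.1f} GB')                # 7.3 GB
```

This suggests the full time-series cube is assembled in memory at write time, regardless of how the resampling itself is blocked.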

Geocode fails with large timeseries file

Description of the problem
Geocoding step failed when it got to the timeseries.h5 file, with a "killed" message. I am guessing that it ran out of memory. My timeseries file has shape (87, 3988, 9791).

Full script that generated the error

smallbaselineApp.py RidgecrestSenA064.txt --dostep geocode

Full error message

******************** step - geocode ********************
create directory: /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo
geocode.py -l /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/inputs/geometryRadar.h5 -t /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/smallbaselineApp.cfg --outdir /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo --update  /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/inputs/geometryRadar.h5 temporalCoherence.h5 timeseries.h5 velocity.h5
read input option from template file: /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/smallbaselineApp.cfg
number of processor to be used: 1
output pixel size in (lat, lon) in degree: (-0.00038907328792942296, 0.00031656142518274386)
output area extent in (S N W E) in degree: (35.002445220947266, 36.55368, -119.366615, -116.2674789428711)
--------------------------------------------------
resampling file: /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/inputs/geometryRadar.h5
reading azimuthAngle       from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading height             from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading incidenceAngle     from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading latitude           from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading longitude          from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading shadowMask         from geometryRadar.h5 ...
restrict fill value to False for bool type source data
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
reading slantRangeDistance from geometryRadar.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
create HDF5 file: /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo/geo_geometryRadar.h5 with w mode
create dataset /azimuthAngle       of float32    in size of (3988, 9791)         with compression=gzip
create dataset /height             of float32    in size of (3988, 9791)         with compression=gzip
create dataset /incidenceAngle     of float32    in size of (3988, 9791)         with compression=gzip
create dataset /latitude           of float32    in size of (3988, 9791)         with compression=gzip
create dataset /longitude          of float32    in size of (3988, 9791)         with compression=gzip
create dataset /shadowMask         of bool       in size of (3988, 9791)         with compression=gzip
create dataset /slantRangeDistance of float32    in size of (3988, 9791)         with compression=gzip
finished writing to /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo/geo_geometryRadar.h5
--------------------------------------------------
resampling file: temporalCoherence.h5
reading temporalCoherence from temporalCoherence.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
update REF_LAT/LON/Y/X
create HDF5 file: /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo/geo_temporalCoherence.h5 with w mode
create dataset /temporalCoherence of float32    in size of (3988, 9791)         with compression=None
finished writing to /net/kraken/nobak/fielding/Ridgecrest/S1AB/A064/stack2/MintPy_v1/geo/geo_temporalCoherence.h5
--------------------------------------------------
resampling file: timeseries.h5
reading timeseries from timeseries.h5 ...
nearest resampling with kd_tree using 1 processor cores in 39 segments ...
Killed

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: 3.7
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-21, release date 2019-08-25

problem adding shaded relief from DEM using view.py (or tsview.py)

(MintPy) tiger D83) view.py inputs/ifgramStack.h5 unwrapPhase-20190124_20200125 --wrap -d ../DEM/demLat_S38_S36_Lon_W072_W070.dem.wgs84
run view.py in MintPy release version v1.2.0-22, release date 2020-02-18
input file is ifgramStack file: /u/tiger-z5/paul/CHILLAN/S1/ARIA/D83/inputs/ifgramStack.h5 in float32 format
file size in y/x: (480, 540)
num of datasets in file ifgramStack.h5: 1476
num of datasets to exclude: 0
num of datasets to display: 1
data   coverage in y/x: (0, 0, 540, 480)
subset coverage in y/x: (0, 0, 540, 480)
data   coverage in lat/lon: (-71.65, -36.7, -71.20000018, -37.09999984)
subset coverage in lat/lon: (-71.65, -36.7, -71.20000018, -37.09999984)
------------------------------------------------------------------------
colormap: jet
figure title: unwrapPhase-20190124_20200125_wrap
figure size : [8.43750000000005, 6.0]
reading data ...
re-wrapping data to [-3.141592653589793, 3.141592653589793]
data    range: [-3.1415646, 3.1415904] radian
display range: [-3.141592653589793, 3.141592653589793] radian
reading DEM: demLat_S38_S36_Lon_W072_W070.dem.wgs84 ...
display data in transparency: 0.8
plot in Lat/Lon coordinate
map projection: PlateCarree
plotting DEM background ...
show shaded relief DEM
show contour in step of 200.0 m with smoothing factor of 3.0
plotting image ...
plot scale bar: [0.2, 0.2, 0.1]
plot reference point
showing ...
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast

Confusing, non-fatal `basemap` installation errors under linux

I am getting the build errors below when installing MintPy as described in installation.md. It looks very scary, as the errors show up in red (not reproduced here), but they are not fatal: it says failed to install basemap and a bit later Successfully installed basemap-1.2.0. Very confusing! Can't we go back to conda install basemap, which did not throw any errors (I forgot why this had to be changed)? If not, we should add a note that these errors are nothing to worry about.

Here is what I did. I installed with:

./Miniconda3-4.5.12-Linux-x86_64.sh -b -p ./miniconda3
./miniconda3/bin/conda install --yes --file MintPy/docs/conda.txt 
./miniconda3/bin/pip install git+https://github.com/matplotlib/basemap.git#egg=mpl_toolkits
./miniconda3/bin/pip install git+https://github.com/tylere/pykml.git

I am getting the following errors, but at the end it says 'Successfully':

[famelung@pegasus test_mintpy]$ ./miniconda3/bin/pip install git+https://github.com/matplotlib/basemap.git#egg=mpl_toolkits
Collecting mpl_toolkits from git+https://github.com/matplotlib/basemap.git#egg=mpl_toolkits
  Cloning https://github.com/matplotlib/basemap.git to /tmp/pip-install-ag112_58/mpl-toolkits
  Running command git clone -q https://github.com/matplotlib/basemap.git /tmp/pip-install-ag112_58/mpl-toolkits
  WARNING: Generating metadata for package mpl-toolkits produced metadata for project name basemap. Fix your #egg=mpl-toolkits fragments.
Requirement already satisfied: matplotlib!=3.0.1,>=1.0.0 in ./miniconda3/lib/python3.7/site-packages (from basemap) (3.1.1)
Requirement already satisfied: numpy>=1.2.1 in ./miniconda3/lib/python3.7/site-packages (from basemap) (1.16.4)
Requirement already satisfied: pyproj>=1.9.3 in ./miniconda3/lib/python3.7/site-packages (from basemap) (2.2.1)
Collecting pyshp>=1.2.0 (from basemap)
Requirement already satisfied: six in ./miniconda3/lib/python3.7/site-packages (from basemap) (1.12.0)
Requirement already satisfied: python-dateutil>=2.1 in ./miniconda3/lib/python3.7/site-packages (from matplotlib!=3.0.1,>=1.0.0->basemap) (2.8.0)
Requirement already satisfied: kiwisolver>=1.0.1 in ./miniconda3/lib/python3.7/site-packages (from matplotlib!=3.0.1,>=1.0.0->basemap) (1.1.0)
Requirement already satisfied: cycler>=0.10 in ./miniconda3/lib/python3.7/site-packages (from matplotlib!=3.0.1,>=1.0.0->basemap) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in ./miniconda3/lib/python3.7/site-packages (from matplotlib!=3.0.1,>=1.0.0->basemap) (2.4.2)
Requirement already satisfied: setuptools in ./miniconda3/lib/python3.7/site-packages (from kiwisolver>=1.0.1->matplotlib!=3.0.1,>=1.0.0->basemap) (41.0.1)
Building wheels for collected packages: basemap, basemap
  Building wheel for basemap (setup.py) ... done
  Created wheel for basemap: filename=basemap-1.2.0-cp37-cp37m-linux_x86_64.whl size=121736686 sha256=0571727c5a5c82c48570b5d2a99941bee4444e929033d3b54a2e911b048e487e
  Stored in directory: /tmp/pip-ephem-wheel-cache-uhzgp5tw/wheels/3f/f3/56/de548b66967d0d661612b7618022e2c0d4b86b9a638cf6ccf3
  Building wheel for basemap (setup.py) ... error
  ERROR: Command errored out with exit status 1:
   command: /nethome/famelung/test_mintpy/miniconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-ag112_58/basemap/setup.py'"'"'; __file__='"'"'/tmp/pip-install-ag112_58/basemap/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-niw69w9i --python-tag cp37
       cwd: /tmp/pip-install-ag112_58/basemap/
  Complete output (5 lines):
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/nethome/famelung/test_mintpy/miniconda3/lib/python3.7/tokenize.py", line 447, in open
      buffer = _builtin_open(filename, 'rb')
  FileNotFoundError: [Errno 2] No such file or directory: '/tmp/pip-install-ag112_58/basemap/setup.py'
  ----------------------------------------
  ERROR: Failed building wheel for basemap
  Running setup.py clean for basemap
  ERROR: Command errored out with exit status 1:
   command: /nethome/famelung/test_mintpy/miniconda3/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-ag112_58/basemap/setup.py'"'"'; __file__='"'"'/tmp/pip-install-ag112_58/basemap/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' clean --all
       cwd: /tmp/pip-install-ag112_58/basemap
  Complete output (5 lines):
  Traceback (most recent call last):
    File "<string>", line 1, in <module>
    File "/nethome/famelung/test_mintpy/miniconda3/lib/python3.7/tokenize.py", line 447, in open
      buffer = _builtin_open(filename, 'rb')
  FileNotFoundError: [Errno 2] No such file or directory: '/tmp/pip-install-ag112_58/basemap/setup.py'
  ----------------------------------------
  ERROR: Failed cleaning build dir for basemap
Successfully built basemap
Failed to build basemap
Installing collected packages: pyshp, basemap
Successfully installed basemap-1.2.0 pyshp-2.1.0
[famelung@pegasus test_mintpy]$

And basemap is indeed there:

 ./miniconda3/bin/pip freeze | grep basemap
basemap==1.2.0
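Besides `pip freeze`, the install can be verified programmatically with a small stdlib check (a sketch; the module name `mpl_toolkits` is the namespace basemap installs under):

```python
import importlib.util

def is_installed(module_name):
    """Return True if the module can be located on the current Python path."""
    return importlib.util.find_spec(module_name) is not None

print(is_installed("mpl_toolkits"))  # basemap lives under the mpl_toolkits namespace
```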

Copy smallbaselineApp.cfg into mintpy/pic folder

Description of the desired feature

It would be nice to save a copy of the config file in this folder, so that all required information is still available when only the pic folder is kept.

At the same time, it would be good to also save the inputs/*.template file there, for those who use MinSAR.

Mask of Connected Components not updated after modify_network

Description of the problem

The maskConnComp.h5 file is not updated when interferograms are dropped by the modify_network step with new parameters.

Full script that generated the error

Ran modify_network with a new mintpy.network.startDate to remove interferograms with partial coverage. The mask file remained the same.

Here is the view of maskConnComp.h5 after modification:
[screenshot: maskConnComp.h5]

Note that only a narrow line of 1 values remains where the partial scenes (now removed) overlap.
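For illustration, the expected behavior is that the common mask be recomputed from the kept interferograms only; a toy numpy sketch of that idea (an assumption about the mask semantics, not MintPy's actual implementation):

```python
import numpy as np

def common_conn_comp_mask(conn_comp, keep_idx):
    """Mask of pixels with a non-zero connected component in every kept interferogram."""
    return np.all(conn_comp[keep_idx] > 0, axis=0)

# toy stack: 3 interferograms of 2x2 pixels; index 2 is the partial-coverage pair
cc = np.array([[[1, 1], [1, 0]],
               [[1, 1], [1, 0]],
               [[0, 1], [0, 0]]])
mask = common_conn_comp_mask(cc, [0, 1])   # recompute from the kept pairs only
```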

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-18, release date 2019-08-24

SNAP geocoded products support

Hi @andretheronsa, thank you for adding prep_snap.py #101! Below are things that I can think of to make the SNAP+MintPy workflow as smooth as the ISCE+MintPy workflow.

  • 1. Is it possible to use interferograms as the root directory of all interferograms? An example is shown below. This keeps the root directory simpler, so that the interferograms folder can sit at the same level as the mintpy folder.
/interferograms
    /20190427_20190602
        /20190427_20190602_coh_tc.data
            /coh.img                                    #mintpy.load.corFile
            /coh.hdr
        20190427_20190602_coh_tc.dim
        /20190427_20190602_filt_int_sub_tc.data
            /filt_int.img                                #mintpy.load.intFile
            /filt_int.hdr
        20190427_20190602_filt_int_sub_tc.dim
        /20190427_20190602_unw_tc.data
            /filt_unw.img                             #mintpy.load.unwFile
            /filt_unw.hdr
        20190427_20190602_unw_tc.dim
    ...
/dem_tc.data 
    /dem.img                                         #mintpy.load.demFile
    /dem.hdr
dem_tc.dim
/mintpy

Once the question above is settled, prep_snap can be adjusted to run the complex metadata-extraction function (utm_dim_to_rsc()) once and save the result to a simple text file, like the data.rsc from prep_isce.py; for the remaining files, the code can then look for data.rsc and use it directly instead of calling utm_dim_to_rsc() again.

  • 2. Could you write a document of your SNAP workflow, if there are more details than your comment inside prep_snap, on the mintpy's wiki or somewhere else you prefer? Other users who are interested can go there for detailed instruction to prepare SNAP products.

correct_unwrap_error step fails without X display

Description of the problem
I was running the correct_unwrap_error step of smallbaselineApp.py in a process that had no X display attached. It failed at the end of the bridge-creation step. I reran it from another terminal (with a display) and it finished successfully, saving the common regions and sample pixels to common_region_sample.png. It would be better if this step did not require an X display, so it can run in batch jobs, etc.
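Until the plotting code switches backends itself, a common workaround is to force matplotlib's non-interactive Agg backend before anything imports pyplot (a sketch, assuming the figure is produced via matplotlib):

```python
import os

# Force a non-interactive matplotlib backend before pyplot is imported,
# so figures can be saved to file without any X display (e.g. in batch jobs):
os.environ["MPLBACKEND"] = "Agg"
```

Equivalently, `matplotlib.use("Agg")` can be called before importing `matplotlib.pyplot`, or `MPLBACKEND=Agg` can be exported in the batch-job script.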

Full script that generated the error

smallbaselineApp.py AzoresSenA002.txt --start correct_unwrap_error

Full error message

--RUN-at-2019-09-17 21:15:52.397606--
Run routine processing with smallbaselineApp.py on steps: ['correct_unwrap_error', 'stack_interferograms', 'invert_network', 'correct_LOD', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5']
--------------------------------------------------
Project name: AzoresSenA002
Go to work directory: /u/pez-z2/fielding/Azores/S1AB/A002/stack/MintPy_v1
copy AzoresSenA002.txt to inputs directory for backup.
read custom template file: /u/pez-z2/fielding/Azores/S1AB/A002/stack/MintPy_v1/AzoresSenA002.txt
update default template based on input custom template
    mintpy.unwrapError.method: auto --> bridging+phase_closure
read default template file: /u/pez-z2/fielding/Azores/S1AB/A002/stack/MintPy_v1/smallbaselineApp.cfg
…
calculating the integer ambiguity for the common regions defined in maskConnComp.h5
open ifgramStack file: ifgramStack.h5
reference pixel in y/x: (378, 2207) from dataset: unwrapPhase_bridging
read common mask from maskConnComp.h5
refine common mask based on water mask file waterMask.h5
remove regions with area < 2500
number of common regions: 5
number of samples per region: 100
solving the phase-unwrapping integer ambiguity for unwrapPhase_bridging
        based on the closure phase of interferograms triplets (Yunjun et al., 2019)
        using the L1-norm regularzed least squares approximation (LASSO) ...
1/5[==================================================]    9s /     0s
2/5 skip calculation for the reference region
3/5[==================================================]    7s /     0s
4/5[==================================================]    5s /     0s
5/5[==================================================]    3s /     0s
qt.qpa.screen: QXcbConnection: Could not connect to display localhost:15.0
Could not connect to any X display.

System information

  • Operating system: Linux
  • Version of Python and relevant dependencies if applicable: Python 3.7
  • Version of MintPy (output of smallbaselineApp.py -v): MintPy release version v1.2beta-28, release date 2019-09-02

404 when downloading GPS for certain study area

Description of the problem
When trying to validate the InSAR velocity over a Californian track with GPS (SNWE bounding box = 34.9855018, 41.9953313, -124.8713548, -120.4743248), the view.py script errors out with a 404 error. Generation of GPSSitesVel.csv ends after appending the velocity for station "CASR", so my impression is that there is a problem with the next station in the network, "CASS".

Full script that generated the error

view.py velocity.h5 velocity --show-gps --gps-comp enu2los --gps-label

Full error message

calculating GPS velocity with respect to None in enu2los direction ...
start date: 20150524
end   date: 20191111
use incidenceAngle/azimuthAngle from file: geometryGeo.h5
[===>                     9%                       ] CASR    4s /    48s
Traceback (most recent call last):
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/view.py", line 1412, in <module>
    main(sys.argv[1:])
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/view.py", line 1406, in main
    obj.plot()
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/view.py", line 1368, in plot
    ax, self, im, cbar = plot_slice(ax, data, self.atr, self)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/view.py", line 498, in plot_slice
    ax = pp.plot_gps(ax, SNWE, inps, metadata, print_msg=inps.print_msg)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/utils/plot.py", line 1268, in plot_gps
    gps_comp=inps.gps_component) * unit_fac
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/objects/gps.py", line 392, in get_gps_los_velocity
    gps_comp=gps_comp)[0:2]
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/objects/gps.py", line 359, in read_gps_los_displacement
    inc_angle, head_angle = self.get_los_geometry(geom_obj)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/objects/gps.py", line 320, in get_los_geometry
    lat, lon = self.get_stat_lat_lon(print_msg=print_msg)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/objects/gps.py", line 213, in get_stat_lat_lon
    self.dload_site(print_msg=print_msg)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/MintPy/mintpy/objects/gps.py", line 196, in dload_site
    urlretrieve(url, self.file)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 247, in urlretrieve
    with contextlib.closing(urlopen(url, data)) as fp:
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 222, in urlopen
    return opener.open(url, data, timeout)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 531, in open
    response = meth(req, response)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 641, in http_response
    'http', request, response, code, msg, hdrs)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 569, in error
    return self._call_chain(*args)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 503, in _call_chain
    result = func(*args)
  File "/u/leffe0/ssangha/tools/conda_installation/stable_feb9_2020/envs/mintpy/lib/python3.7/urllib/request.py", line 649, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs

System information

  • Operating system:
  • Version of Python and relevant dependencies if applicable
  • Version of MintPy (output of smallbaselineApp.py -v):

Timeseries from ARIA standard products

Hello,

I am trying to run the MintPy time series for a very basic test case using ARIA standard product data over Tibet (ascending track 041).

I went through the setup steps required in the ARIA tools time series setup workflow:
ariaDownload.py -t 041 -b '37.5 40 87 91' -s 20180601 -e 20181001
ariaTSsetup.py -f .products/'S1*.nc' -d Download -nt 8
python3 prep_aria.py -s stack -d DEM/SRTM_3arcsec.dem.vrt -i incidenceAngle/20180613_20180601.vrt
smallbaselineApp.py --start modify_network

MintPy goes through the first several steps (modify network, reference point, correct unwrap error [off], stack interferograms, invert network, correct LOD [off], troposphere [off], deramp). Then it gets 22% of the way through the "correct topography" step before throwing an error:
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1070, in <module>
    main()
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1060, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 1016, in run
    self.run_topographic_residual_correction(sname)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/smallbaselineApp.py", line 796, in run_topographic_residual_correction
    mintpy.dem_error.main(scp_args.split())
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/dem_error.py", line 489, in main
    inps = correct_dem_error(inps, A_def)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/dem_error.py", line 425, in correct_dem_error
    num_step=num_step)
  File "/u/sarh0/rzinke/tools/MintPy/mintpy/dem_error.py", line 308, in estimate_dem_error
    X = linalg.lstsq(A, ts, cond=1e-15)[0]
  File "/u/sar-r0/rzinke/python/miniconda3/envs/MintPy/lib/python3.7/site-packages/scipy/linalg/basic.py", line 1152, in lstsq
    a1 = _asarray_validated(a, check_finite=check_finite)
  File "/u/sar-r0/rzinke/python/miniconda3/envs/MintPy/lib/python3.7/site-packages/scipy/_lib/_util.py", line 239, in _asarray_validated
    a = toarray(a)
  File "/u/sar-r0/rzinke/python/miniconda3/envs/MintPy/lib/python3.7/site-packages/numpy/lib/function_base.py", line 498, in asarray_chkfinite
    "array must not contain infs or NaNs")
ValueError: array must not contain infs or NaNs
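The failing call is scipy's lstsq, which with its default `check_finite` rejects any NaN/inf in the input. One workaround sketch is to restrict the inversion to finite pixels (illustrated with numpy's analogous lstsq and toy values; this is not MintPy's code):

```python
import numpy as np

# Toy design matrix A (n_obs x n_param) and time series ts (n_obs x n_pixel);
# the second pixel contains NaNs, as a no-data pixel would.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
ts = np.array([[0.0, np.nan],
               [1.0, np.nan],
               [2.0, np.nan]])

valid = np.isfinite(ts).all(axis=0)            # pixels with no NaN/inf anywhere
X = np.full((A.shape[1], ts.shape[1]), np.nan)
X[:, valid] = np.linalg.lstsq(A, ts[:, valid], rcond=None)[0]
```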

Has anyone run into anything similar?

Thanks,
Rob

conda_env.yml failing to install

When running the install command conda env create -f /MintPy/docs/conda_env.yml, it completes at the conda level but fails at the pip level. I'm getting the error below:

Ran pip subprocess with arguments:
['/opt/conda/envs/mintpy/bin/python', '-m', 'pip', 'install', '-U', '-r', '/MintPy/docs/condaenv.xs9yxqx0.requirements.txt']
Pip subprocess output:
Collecting git+https://github.com/tylere/pykml.git (from -r /MintPy/docs/condaenv.xs9yxqx0.requirements.txt (line 2))
  Cloning https://github.com/tylere/pykml.git to /tmp/pip-req-build-u2km5mfy
Collecting mpl_toolkits from git+https://github.com/matplotlib/basemap.git#egg=mpl_toolkits (from -r /MintPy/docs/condaenv.xs9yxqx0.requirements.txt (line 1))
  Cloning https://github.com/matplotlib/basemap.git to /tmp/pip-install-ssja7bok/mpl-toolkits

Pip subprocess error:
  Running command git clone -q https://github.com/tylere/pykml.git /tmp/pip-req-build-u2km5mfy
  Running command git clone -q https://github.com/matplotlib/basemap.git /tmp/pip-install-ssja7bok/mpl-toolkits
    ERROR: Command errored out with exit status 1:
     command: /opt/conda/envs/mintpy/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-ssja7bok/mpl-toolkits/setup.py'"'"'; __file__='"'"'/tmp/pip-install-ssja7bok/mpl-toolkits/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
         cwd: /tmp/pip-install-ssja7bok/mpl-toolkits/
    Complete output (18 lines):
    checking for GEOS lib in /root ....
    checking for GEOS lib in /root/local ....
    checking for GEOS lib in /usr ....
    checking for GEOS lib in /usr/local ....
    checking for GEOS lib in /sw ....
    checking for GEOS lib in /opt ....
    checking for GEOS lib in /opt/local ....
    
    Can't find geos library in standard locations ('/root', '/root/local', '/usr', '/usr/local', '/sw', '/opt', '/opt/local').
    Please install the corresponding packages using your
    systems software management system (e.g. for Debian Linux do:
    'apt-get install libgeos-3.3.3 libgeos-c1 libgeos-dev' and/or
    set the environment variable GEOS_DIR to point to the location
    where geos is installed (for example, if geos_c.h
    is in /usr/local/include, and libgeos_c is in /usr/local/lib,
    set GEOS_DIR to /usr/local), or edit the setup.py script
    manually and set the variable GEOS_dir (right after the line
    that says "set GEOS_dir manually here".
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.


CondaEnvException: Pip failed

How do I proceed with this error? Thanks!
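basemap compiles against the GEOS C library, so the fix suggested by the error message itself is to install it, or to point GEOS_DIR at an existing copy, before re-creating the environment (package names below are for Debian/Ubuntu and may vary by distro/version):

```shell
# Debian/Ubuntu:  sudo apt-get install libgeos-dev
# conda option:   conda install -c conda-forge geos

# If GEOS is already installed in a non-standard prefix, point basemap at it:
export GEOS_DIR=/usr/local   # prefix containing include/geos_c.h and lib/libgeos_c.*

# then retry:
# conda env create -f /MintPy/docs/conda_env.yml
```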
