Author: Zhang Yunjun, Heresh Fattahi, August 5-9, 2024 EarthScope InSAR Short Course (ISCE+).
The Miami INsar Time-series software in PYthon (MintPy) is an open-source package for InSAR time-series analysis. MintPy currently starts from stacks of unwrapped interferograms (in either geo- or radar-coordinates) and estimates the ground displacement time-series. MintPy is designed primarily around stacks of interferograms processed with ISCE. However, the software also supports interferograms processed with other InSAR processors such as GAMMA, GMTSAR, SNAP, and ROI_PAC.
MintPy is available on GitHub at: https://github.com/insarlab/MintPy
References: The detailed algorithms implemented in MintPy are described in the following paper: Yunjun, Z., Fattahi, H., and Amelung, F. (2019), Small baseline InSAR time series analysis: Unwrapping error correction and noise reduction, Computers & Geosciences, 133, 104331, doi:10.1016/j.cageo.2019.104331.
The cell below performs the initial setup of the notebook and must be run every time the notebook (re)starts. It defines the processing location and checks the example dataset.
%matplotlib inline
import os
import matplotlib.pyplot as plt
import numpy as np
import shutil
from cartopy import crs as ccrs
from mintpy.utils import readfile, utils as ut, plot as pp
from mintpy.cli import view, tsview, plot_network, plot_transection
from mintpy.view import prep_slice, plot_slice
import utils
plt.rcParams.update({'font.size': 12})
# utils function
def write_config_file(out_file, CONFIG_TXT, mode='a'):
    """Write configuration files for MintPy to process ARIA sample products."""
    if not os.path.isfile(out_file) or mode == 'w':
        # write a new file (or overwrite the existing one)
        with open(out_file, "w") as fid:
            fid.write(CONFIG_TXT)
        print('write configuration to file: {}'.format(out_file))
    else:
        # append to the existing file
        with open(out_file, "a") as fid:
            fid.write("\n" + CONFIG_TXT)
        print('add the following to file: \n{}'.format(CONFIG_TXT))
# define and go to the work directory
work_dir = os.path.expanduser('~/data/test/SanFranSenDT42/mintpy')
os.makedirs(work_dir, exist_ok=True)
os.chdir(work_dir)
print('Go to work directory:', work_dir)
# define the custom config file
config_file = os.path.join(work_dir, "SanFranSenDT42.txt")
Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy
Run the cell below to 1) download the staged stack of interferograms, or 2) download and prepare the stack of interferograms using ARIA-tools. We have pre-processed an example ARIA dataset over the San Francisco Bay area and uploaded it to AWS and Zenodo; one could also run the ARIA-tools commands to download and pre-process the data directly, by setting use_staged_data to False below.
# download/prepare the interferogram stack from ARIA products and load into mintpy
# aws - download pre-processed data from AWS S3 bucket [recommended, requires awscli]
# zenodo - download pre-processed data from zenodo using wget
# False - download & pre-process from ARIA using ARIA-tools
use_staged_data = 'aws' #['aws' / 'zenodo' / False]
if all(os.path.isfile(os.path.join(work_dir, 'inputs', x)) for x in ['ifgramStack.h5', 'geometryGeo.h5']):
    print('ARIA products are already loaded into MintPy. Skip re-loading.')

elif use_staged_data in ['aws', 'zenodo']:
    # option 1: download the staged data from AWS or Zenodo
    os.chdir(os.path.dirname(os.path.dirname(work_dir)))
    tar_file = os.path.join(os.path.dirname(os.path.dirname(work_dir)), 'SanFranSenDT42.tar.gz')
    if os.path.isfile(tar_file):
        print('Staged ARIA product exists at: {}'.format(tar_file))
    elif use_staged_data == 'aws':
        !aws --region us-west-2 --no-sign-request s3 cp s3://asf-jupyter-data-west/unavco2022/SanFranSenDT42.tar.gz ./
    elif use_staged_data == 'zenodo':
        !wget https://zenodo.org/record/6990323/files/SanFranSenDT42.tar.gz

    # decompress the tar file [it takes ~1.5 min]
    print('decompressing the downloaded dataset...')
    !pv SanFranSenDT42.tar.gz | tar -xz
    os.chdir(work_dir)

elif use_staged_data is False:
    # option 2: download / prepare ARIA products using ARIA-tools, and load into MintPy
    aria_stack_files = [os.path.join(work_dir, f'../stack/{x}Stack.vrt') for x in ['unwrap', 'coh', 'connComp']]
    if all(os.path.isfile(x) for x in aria_stack_files):
        print('ARIA products already exist at: {}'.format(os.path.dirname(work_dir)))
    else:
        print("Using ARIA-tools to download and prepare the input data for MintPy")
        os.chdir(os.path.dirname(work_dir))
        !ariaDownload.py -b '37.25 38.1 -122.6 -121.75' --track 42
        !ariaTSsetup.py -f 'products/*.nc' -b '37.25 38.1 -122.6 -121.75' --mask Download --num_threads 4 --verbose
        os.chdir(work_dir)

    # load ARIA products into MintPy
    !prep_aria.py -s ../stack/ -d ../DEM/SRTM_3arcsec.dem -i ../incidenceAngle/*.vrt -a ../azimuthAngle/*.vrt -w ../mask/watermask.msk --update

    # compress for staging
    # tar cvzf SanFranSenDT42.tar.gz SanFranSenDT42

else:
    raise ValueError(f'un-recognized "use_staged_data" setting: {use_staged_data}\navailable settings: "aws", "zenodo", False.')
Staged ARIA product exists at: /Users/yunjunz/data/test/SanFranSenDT42.tar.xz decompressing the downloaded dataset... 2.47GiB 0:02:16 [18.5MiB/s] [================================>] 100%
This application provides a workflow with several steps to invert a stack of unwrapped interferograms and apply different corrections to obtain the ground displacement time-series. The workflow consists of two main blocks, illustrated in the figure below.
Some steps are optional and are switched off by default (marked by dashed boundaries in the figure). Configuration parameters for each step are initiated with default values in a customizable text file: smallbaselineApp.cfg. In this notebook, we will walk through the various steps.
(Figure from Yunjun et al., 2019)
The smallbaselineApp.py workflow can be called with a single command-line call. By default it will run all the required processing steps with options pulled from the template files, as shown in section 5.2. However, in this notebook, we will use the "step" processing, which allows one to (re)start the processing from a given step, as in the examples below. More detailed usage can be found in the help message.
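For example, the following calls (mirroring the usage examples printed in the help message below) run the full workflow, a single step, or a range of steps; the flags are only those listed by --help:

!smallbaselineApp.py SanFranSenDT42.txt                            # run the full workflow with the custom template
!smallbaselineApp.py SanFranSenDT42.txt --dostep invert_network    # run a single step only
!smallbaselineApp.py SanFranSenDT42.txt --end reference_point      # run from load_data through reference_point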
!smallbaselineApp.py --help
usage: smallbaselineApp.py [-h] [--dir WORKDIR] [-g] [-H] [-v] [--plot]
                           [--start STEP] [--end STEP] [--dostep STEP]
                           [customTemplateFile]

Routine Time Series Analysis for Small Baseline InSAR Stack

positional arguments:
  customTemplateFile    custom template with option settings. ignored if the default smallbaselineApp.cfg is input.

options:
  -h, --help            show this help message and exit
  --dir WORKDIR, --work-dir WORKDIR
                        work directory, (default: ./).
  -g                    generate default template (if it does not exist) and exit.
  -H                    print the default template file and exit.
  -v, --version         print software version and exit
  --plot                plot results [only] without running smallbaselineApp.

steps processing (start/end/dostep):
  Command line options for steps processing with names chosen from the following list:

  ['load_data', 'modify_network', 'reference_point', 'quick_overview', 'correct_unwrap_error']
  ['invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere']
  ['deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode']
  ['google_earth', 'hdfeos5']

  In order to use either --start or --dostep, it is necessary that a previous run was done using one of the
  steps options to process at least through the step immediately preceding the starting step of the current run.

  --start STEP          start processing at the named step (default: load_data).
  --end STEP, --stop STEP
                        end processing at the named step (default: hdfeos5)
  --dostep STEP         run processing at the named step only

reference:
  Yunjun, Z., Fattahi, H., and Amelung, F. (2019), Small baseline InSAR time series analysis:
  Unwrapping error correction and noise reduction, Computers & Geosciences, 133, 104331,
  doi:10.1016/j.cageo.2019.104331.

example:
  smallbaselineApp.py                        # run with default template 'smallbaselineApp.cfg'
  smallbaselineApp.py <custom_template>      # run with default and custom templates
  smallbaselineApp.py -h / --help            # help
  smallbaselineApp.py -H                     # print default template options
  smallbaselineApp.py -g                     # generate default template if it does not exist
  smallbaselineApp.py -g <custom_template>   # generate/update default template based on custom template
  smallbaselineApp.py --plot                 # plot results w/o run [to populate the 'pic' folder after failed runs]

  # step processing with --start/stop/dostep options
  smallbaselineApp.py GalapagosSenDT128.txt --dostep velocity   # run step 'velocity' only
  smallbaselineApp.py GalapagosSenDT128.txt --end load_data     # end run after step 'load_data'
The app includes the following processing steps:

load_data: loads the stack of interferograms (unwrapped phase, coherence, and/or connected components) and geometry files (height, incidence/azimuth angle, lookup tables, etc.) into HDF5 files with multiple datasets and attributes.

modify_network: this step (if requested) modifies the network of interferograms, e.g., based on average coherence, temporal and spatial baseline thresholds, or by removing specific pairs.

reference_point: the unwrapped interferograms may be relative to different reference pixels. This step introduces a common reference pixel to all interferograms. For intuitive interpretation, one may choose a stable, coherent, non-deforming pixel. However, since the estimated InSAR displacement time-series is relative in both time and space, choosing a deforming pixel does not change the results.

quick_overview: this step provides a quick assessment of the stack of interferograms.

correct_unwrap_error: the input unwrapped interferograms may be affected by phase unwrapping errors (a wrong integer number of $2\pi$ added during phase unwrapping). This step (if requested) offers three methods to possibly correct unwrapping errors.

invert_network: inverts the stack of unwrapped interferograms to form the InSAR phase time-series. This is equivalent to transforming the network of small-baseline interferograms into a single-reference network of interferograms (i.e., the unwrapped phase time-series).

correct_LOD: this step is specific to Envisat data and applies an empirical correction to account for possible local oscillator drift of the radar.

correct_SET: corrects (if requested) the solid Earth tides due to the gravitational pull of the Sun and the Moon.

correct_ionosphere: corrects ionospheric delay using the split-spectrum results (from ISCE-2 stack processors only).

correct_troposphere: corrects tropospheric delay 1) using atmospheric models or 2) with the empirical phase-elevation approach estimated from InSAR data.

deramp: this step (if requested) removes a ramp from each acquisition. Note that deramping removes residual long-wavelength interferometric phase components, which may be due to noise (geometrical residual, atmospheric delay) or signal (tectonic deformation).

correct_topography: estimates residual topographic effects (due to DEM errors), which are correlated with the temporal variation of the perpendicular baseline.

residual_RMS: estimates the average noise level for each acquisition by calculating the RMS of the residual phase.

reference_date: changes the reference date.

velocity: estimates a suite of time functions, such as a linear velocity.

geocode: if the original stack is in radar-coordinates, converts it to geo-coordinates in lat/lon.

google_earth: outputs the average velocity into a Google Earth KMZ file.

hdfeos5: outputs the displacement time-series with geometry info into one file in HDF-EOS5 format.

The processing parameters for smallbaselineApp.py are controlled via configuration files. At least one file is required to run smallbaselineApp.py.
default configuration: smallbaselineApp.cfg. It contains all configuration parameters, grouped by steps, with default auto values (which are defined in smallbaselineApp_auto.cfg). This file is copied over to the current working directory and read every time smallbaselineApp.py runs.

custom configuration (optional but recommended): SanFranSenDT42.txt in the example dataset. It contains selective, manually modified configuration parameters. The custom template file name is arbitrary. The custom template has higher priority than the default template; if a custom template is specified, smallbaselineApp.py will update the default smallbaselineApp.cfg file accordingly.

Run the following to create a text file named SanFranSenDT42.txt with the following few lines in it:
CONFIG_TXT = '''# vim: set filetype=cfg:
mintpy.load.processor = aria #[isce, aria, hyp3, gmtsar, snap, gamma, roipac], auto for isce
#---------interferogram datasets:
mintpy.load.unwFile = ../stack/unwrapStack.vrt
mintpy.load.corFile = ../stack/cohStack.vrt
mintpy.load.connCompFile = ../stack/connCompStack.vrt
#---------geometry datasets:
mintpy.load.demFile = ../DEM/SRTM_3arcsec.dem
mintpy.load.incAngleFile = ../incidenceAngle/*.vrt
mintpy.load.azAngleFile = ../azimuthAngle/*.vrt
mintpy.load.waterMaskFile = ../mask/watermask.msk
mintpy.reference.lalo = 37.69, -122.07
mintpy.troposphericDelay.method = no
mintpy.deramp = no
mintpy.topographicResidual = no
# options to speedup the processing (fast but not the best)
mintpy.networkInversion.weightFunc = no
mintpy.topographicResidual.pixelwiseGeometry = no
'''
write_config_file(config_file, CONFIG_TXT, mode='w')
write configuration to file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt
More example datasets, file structures, and template setups for various InSAR processors can be found in the MintPy documentation.
MintPy is most consistent with the direct ISCE outputs. However, it also supports interferograms processed with other InSAR software, including GAMMA and SNAP. In this tutorial we are not using the direct ISCE outputs; rather, we use the ISCE outputs packaged by ARIA and pre-processed using ARIA-tools.
!smallbaselineApp.py SanFranSenDT42.txt --dostep load_data
# manually add the mintpy.load.processor attribute, as the pre-staged dataset is already loaded into MintPy, which causes the load_data step to fail
ut.add_attribute('inputs/ifgramStack.h5', {'mintpy.load.processor':'aria'})
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:02:55.432636-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['load_data'] Remaining steps: ['modify_network', 'reference_point', 'quick_overview', 'correct_unwrap_error', 'invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy copy default template file /Users/yunjunz/tools/MintPy/src/mintpy/defaults/smallbaselineApp.cfg to work directory read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template mintpy.load.processor: auto --> aria mintpy.load.unwFile: auto --> ../stack/unwrapStack.vrt mintpy.load.corFile: auto --> ../stack/cohStack.vrt mintpy.load.connCompFile: auto --> ../stack/connCompStack.vrt mintpy.load.demFile: auto --> ../DEM/SRTM_3arcsec.dem mintpy.load.incAngleFile: auto --> ../incidenceAngle/*.vrt mintpy.load.azAngleFile: auto --> ../azimuthAngle/*.vrt mintpy.load.waterMaskFile: auto --> ../mask/watermask.msk mintpy.reference.lalo: auto --> 37.69, -122.07 mintpy.networkInversion.weightFunc: auto --> no mintpy.troposphericDelay.method: auto --> no mintpy.deramp: auto --> no mintpy.topographicResidual: auto --> no mintpy.topographicResidual.pixelwiseGeometry: auto --> no copy SanFranSenDT42.txt to inputs directory for backup. copy smallbaselineApp.cfg to inputs directory for backup. copy SanFranSenDT42.txt to pic directory for backup. copy smallbaselineApp.cfg to pic directory for backup. 
read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - load_data ******************** load_data.py --template /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt --project SanFranSenDT42 processor : aria SAR platform/sensor : Sen -------------------------------------------------- prepare metadata files for aria products prep_aria.py --template /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg --stack-dir ../stack --unwrap-stack-name unwrapStack.vrt --coherence-stack-name cohStack.vrt --conn-comp-stack-name connCompStack.vrt --dem ../DEM/SRTM_3arcsec.dem --incidence-angle "../incidenceAngle/*.vrt" --azimuth-angle "../azimuthAngle/*.vrt" --water-mask ../mask/watermask.msk --update read options from template file: smallbaselineApp.cfg multilook x/ystep: 1/1 multilook method : nearest search input data file info: unwFile : /Users/yunjunz/data/test/SanFranSenDT42/stack/unwrapStack.vrt corFile : /Users/yunjunz/data/test/SanFranSenDT42/stack/cohStack.vrt connCompFile : /Users/yunjunz/data/test/SanFranSenDT42/stack/connCompStack.vrt demFile : ../DEM/SRTM_3arcsec.dem incAngleFile : ../incidenceAngle/20150605_20150512.vrt azAngleFile : ../azimuthAngle/20150605_20150512.vrt waterMaskFile : ../mask/watermask.msk update mode: True extract metadata from /Users/yunjunz/data/test/SanFranSenDT42/stack/unwrapStack.vrt -------------------------------------------------- create HDF5 file: ./inputs/ifgramStack.h5 with w mode create dataset : date of |S8 in size of (505, 2) with compression = False create dataset : dropIfgram of <class 'numpy.bool_'> in size of (505,) with compression = False create dataset : bperp of <class 'numpy.float32'> in size of (505,) with compression = False create dataset : unwrapPhase of <class 'numpy.float32'> in size of (505, 1021, 1021) with compression = False create dataset : coherence of <class 'numpy.float32'> in size of (505, 1021, 1021) with compression = False create dataset : connectComponent of <class 'numpy.int16'> in size of (505, 1021, 1021) with compression = lzf close HDF5 file: ./inputs/ifgramStack.h5 -------------------------------------------------- open unwrapStack.vrt with gdal ... open cohStack.vrt with gdal ... open connCompStack.vrt with gdal ... grab NoDataValue for unwrapPhase : 0.0 and convert to 0. grab NoDataValue for coherence : 0.0 and convert to 0. grab NoDataValue for connectComponent: -1.0 and convert to 0. number of interferograms: 505 writing data to HDF5 file ./inputs/ifgramStack.h5 with a mode ... 
[==================================================] 20200227_20200310 505/505 112s / 2s finished writing to HD5 file: ./inputs/ifgramStack.h5 -------------------------------------------------- create HDF5 file: ./inputs/geometryGeo.h5 with w mode create dataset : height of <class 'numpy.float32'> in size of (1021, 1021) with compression = False create dataset : incidenceAngle of <class 'numpy.float32'> in size of (1021, 1021) with compression = False create dataset : slantRangeDistance of <class 'numpy.float32'> in size of (1021, 1021) with compression = False create dataset : azimuthAngle of <class 'numpy.float32'> in size of (1021, 1021) with compression = False create dataset : waterMask of <class 'numpy.bool_'> in size of (1021, 1021) with compression = False close HDF5 file: ./inputs/geometryGeo.h5 -------------------------------------------------- writing data to HDF5 file ./inputs/geometryGeo.h5 with a mode ... finished writing to HD5 file: ./inputs/geometryGeo.h5 -------------------------------------------------- time used: 01 mins 54.4 secs. No lookup table (longitude or rangeCoord) found in files. Input data seems to be geocoded. Lookup file not needed. Loaded dataset are processed by InSAR software: isce Loaded dataset are in GEO coordinates Interferogram Stack: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 Geometry File : /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/geometryGeo.h5 Lookup Table File : None -------------------------------------------------- updating metadata based on custom template file SanFranSenDT42.txt for file: ifgramStack.h5 Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 01 mins 54.7 secs
'inputs/ifgramStack.h5'
By running this command, the "inputs" directory inside the working directory is created and two HDF5 files are produced, as shown below:
!ls -l inputs
total 9506976 -rw-r--r-- 1 yunjunz staff 503532872 Nov 9 2020 ERA5.h5 -rw-r--r-- 1 yunjunz staff 878 Aug 26 11:02 SanFranSenDT42.txt -rw-r--r-- 1 yunjunz staff 17886520 Aug 26 11:04 geometryGeo.h5 -rw-r--r-- 1 yunjunz staff 4339395990 Aug 26 11:04 ifgramStack.h5 -rw-r--r-- 1 yunjunz staff 24761 Aug 26 11:02 smallbaselineApp.cfg
ifgramStack.h5: this file contains six datasets and multiple metadata attributes:

unwrapPhase - 3D array in size of (m, l, w) for the unwrapped interferometric phase data cube, in radians
coherence - 3D array in size of (m, l, w) for the spatial coherence data cube
connectComponent - 3D array in size of (m, l, w) for the connected components data cube
date - 2D array in size of (m, 2) in YYYYMMDD format for the reference / secondary dates
bperp - 1D array in size of (m,) in meters for the perpendicular baselines (average value)
dropIfgram - 1D array in size of (m,) in boolean to indicate whether an interferogram is used in the inversion or ignored

where m is the number of interferograms, l is the number of lines, and w is the number of columns.
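As a quick sketch (not part of the official workflow), individual slices of these datasets can be read with the readfile module imported at the top of the notebook; the pair name below is the first interferogram listed later by info.py:

# read the unwrapped phase of a single interferogram (reference_secondary pair)
unw, atr = readfile.read('inputs/ifgramStack.h5', datasetName='unwrapPhase-20150512_20150605')
print(unw.shape, atr['FILE_TYPE'])   # (l, w) array in radians, plus the metadata dict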
geometryGeo.h5: this file contains geometrical datasets, including height, incidence angle, azimuth angle, shadow/layover mask, slant range distance, and/or water mask. A more detailed description of the data structure can be found in the MintPy documentation.
# !gdalinfo inputs/geometryGeo.h5
!info.py inputs/geometryGeo.h5
******************** Basic File Info ************************ file name: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/geometryGeo.h5 file type: geometry coordinates : GEO ******************** HDF5 File Structure ******************** Attributes in / level: ALOOKS 7 ANTENNA_SIDE -1 CENTER_LINE_UTC 50854.0 EARTH_RADIUS 6337286.638938101 FILE_LENGTH 1021 FILE_TYPE geometry HEADING -168 HEIGHT 693000.0 LAT_REF1 38.1 LAT_REF2 38.1 LAT_REF3 37.2491666666667 LAT_REF4 37.2491666666667 LENGTH 1021 LON_REF1 -121.75 LON_REF2 -122.600833333333 LON_REF3 -121.75 LON_REF4 -122.600833333333 NCORRLOOKS 61.29302259485575 NUMBER_OF_PAIRS 505 ORBIT_DIRECTION DESCENDING PLATFORM Sen PROCESSOR isce RANGE_PIXEL_SIZE 44.26168155670166 RLOOKS 19 STARTING_RANGE 798980.125 UTCTime (HH:MM:SS.ss) 14:07:34.000000 WAVELENGTH 0.05546576 WIDTH 1021 Wavelength (m) 0.05546576 X_FIRST -122.600833333 X_STEP 0.000833333 X_UNIT degrees Y_FIRST 38.100000000 Y_STEP -0.000833333 Y_UNIT degrees azimuthAngle -8.77992802893209 endRange 956307.125 incidenceAngle 37.84586800841093 lookAngle 33.50304291634067 orbitDirection DESCENDING slantRangeSpacing 2.329562187194824 startRange 798980.125 HDF5 dataset "/azimuthAngle ": shape=(1021, 1021) , dtype=float32 , compression=gzip HDF5 dataset "/height ": shape=(1021, 1021) , dtype=float32 , compression=gzip HDF5 dataset "/incidenceAngle ": shape=(1021, 1021) , dtype=float32 , compression=gzip HDF5 dataset "/slantRangeDistance ": shape=(1021, 1021) , dtype=float32 , compression=gzip HDF5 dataset "/waterMask ": shape=(1021, 1021) , dtype=bool , compression=gzip
!info.py inputs/ifgramStack.h5
******************** Basic File Info ************************ file name: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 file type: ifgramStack coordinates : GEO ******************** HDF5 File Structure ******************** Attributes in / level: ALOOKS 7 ANTENNA_SIDE -1 CENTER_LINE_UTC 50854.0 EARTH_RADIUS 6337286.638938101 FILE_LENGTH 1021 FILE_TYPE ifgramStack HEADING -168 HEIGHT 693000.0 LAT_REF1 38.1 LAT_REF2 38.1 LAT_REF3 37.2491666666667 LAT_REF4 37.2491666666667 LENGTH 1021 LON_REF1 -121.75 LON_REF2 -122.600833333333 LON_REF3 -121.75 LON_REF4 -122.600833333333 NCORRLOOKS 61.29302259485575 NUMBER_OF_PAIRS 505 ORBIT_DIRECTION DESCENDING PLATFORM Sen PROCESSOR isce RANGE_PIXEL_SIZE 44.26168155670166 RLOOKS 19 STARTING_RANGE 798980.125 UTCTime (HH:MM:SS.ss) 14:07:34.000000 WAVELENGTH 0.05546576 WIDTH 1021 Wavelength (m) 0.05546576 X_FIRST -122.600833333 X_STEP 0.000833333 X_UNIT degrees Y_FIRST 38.100000000 Y_STEP -0.000833333 Y_UNIT degrees azimuthAngle -8.77992802893209 endRange 956307.125 incidenceAngle 37.84586800841093 lookAngle 33.50304291634067 mintpy.deramp no mintpy.load.azAngleFile ../azimuthAngle/*.vrt mintpy.load.connCompFile ../stack/connCompStack.vrt mintpy.load.corFile ../stack/cohStack.vrt mintpy.load.demFile ../DEM/SRTM_3arcsec.dem mintpy.load.incAngleFile ../incidenceAngle/*.vrt mintpy.load.processor aria mintpy.load.unwFile ../stack/unwrapStack.vrt mintpy.load.waterMaskFile ../mask/watermask.msk mintpy.networkInversion.weightFunc no mintpy.reference.lalo 37.69, -122.07 mintpy.topographicResidual no mintpy.topographicResidual.pixelwiseGeometry no mintpy.troposphericDelay.method no orbitDirection DESCENDING slantRangeSpacing 2.329562187194824 startRange 798980.125 HDF5 dataset "/bperp ": shape=(505,) , dtype=float32 , compression=gzip HDF5 dataset "/coherence ": shape=(505, 1021, 1021) , dtype=float32 , compression=gzip MODIFICATION_TIME 1724641490.362794 HDF5 dataset "/connectComponent ": shape=(505, 1021, 1021) , dtype=int16 , compression=lzf MODIFICATION_TIME 1724641490.362993 HDF5 dataset "/date ": shape=(505, 2) , dtype=|S8 , compression=gzip HDF5 dataset "/dropIfgram ": shape=(505,) , dtype=bool , compression=gzip HDF5 dataset "/unwrapPhase ": shape=(505, 1021, 1021) , dtype=float32 , compression=gzip MODIFICATION_TIME 1724641490.362367
!info.py inputs/ifgramStack.h5 --date --num --compact
20150512_20150605 0 20150512_20150629 1 20150512_20150723 2 20150512_20160530 3 20150605_20150629 4 20150605_20150723 5 20150605_20150816 6 20150629_20150723 7 20150629_20150816 8 20150629_20150909 9 20150629_20160810 10 20150723_20150816 11 20150723_20150909 12 20150723_20151003 13 20150723_20160810 14 20150723_20160903 15 20150816_20150909 16 20150816_20151003 17 20150816_20151027 18 20150816_20160903 19 ...
Before the inversion, it is useful to take a look at the network of interferograms. Running plot_network.py gives an overview of the network and the average coherence of the stack.
plot_network.main('inputs/ifgramStack.h5 -t smallbaselineApp.cfg --figsize 12 4'.split())
read options from template file: smallbaselineApp.cfg read temporal/spatial baseline info from file: inputs/ifgramStack.h5 open ifgramStack file: ifgramStack.h5 calculating spatial mean of coherence in file inputs/ifgramStack.h5 ... [==================================================] 505/505 17s / 0s write average value in space into text file: coherenceSpatialAvg.txt number of acquisitions: 114 number of interferograms: 505 shift all perp baseline by -56.63898468017578 to zero mean for plotting -------------------------------------------------- number of interferograms marked as drop: 0 number of interferograms marked as keep: 505 number of acquisitions marked as drop: 0 max perpendicular baseline: 219.65 m max temporal baseline: 456.0 days showing coherence data range: [0.4564, 0.9394] display range: (0.2, 1.0) showing ...
Note that with the --nodisplay argument, the plots won't be displayed but will be saved as PDF files in the current directory. Running this command creates multiple files as follows:

coherenceSpatialAvg.txt: a simple text file that provides an overview of the stack, containing the interferogram dates, average coherence, and temporal and spatial baseline separations.
coherenceMatrix.pdf: shows the average coherence between all available pairs in the stack.
coherenceHistory.pdf: shows, for each acquisition, the minimum and maximum average coherence among all pairs involving this acquisition. This is useful for identifying bad acquisitions, which usually have a very low value for the maximum coherence. Those acquisitions should be dropped.
network.pdf: displays the network of interferograms in time-baseline coordinates, color-coded by the average coherence of the interferograms. Circles represent the acquisition dates and lines represent the interferograms. Solid lines are the interferograms used for the time-series analysis and dashed lines are the interferograms ignored in the time-series analysis.

Before running the time-series inversion, one may want to look at the average coherence in the stack. To create a map of average spatial coherence, use temporal_average.py:
!temporal_average.py ./inputs/ifgramStack.h5 -d coherence -o avgSpatialCoh.h5
view.main('avgSpatialCoh.h5 --noverbose'.split())
# equivalent command in terminal: view.py avgSpatialCoh.h5 --noverbose
calculate the temporal average of coherence in file ./inputs/ifgramStack.h5 ... [==================================================] lines 1021/1021 1s / 0s create HDF5 file: avgSpatialCoh.h5 with w mode create dataset /coherence of float32 in size of (1021, 1021) with compression=None finished writing to avgSpatialCoh.h5 time used: 00 mins 3.3 secs view.py avgSpatialCoh.h5 --noverbose
One may also optionally extract the water mask from the geometry file, to be used later for masking the time-series results:
!generate_mask.py inputs/geometryGeo.h5 waterMask --nonzero -o waterMask.h5
view.main('waterMask.h5 -c gray --noverbose'.split())
input geometry file: inputs/geometryGeo.h5 read inputs/geometryGeo.h5 waterMask create initial mask with the same size as the input file and all = 1 all pixels with nan value = 0 exclude pixels with zero value create HDF5 file: waterMask.h5 with w mode create dataset /mask of bool in size of (1021, 1021) with compression=None finished writing to waterMask.h5 time used: 00 mins 0.0 secs. view.py waterMask.h5 -c gray --noverbose
The common connected component mask indicates pixels with valid connected component value in all kept interferograms. This is also used to guide the reference point selection.
!generate_mask.py inputs/ifgramStack.h5 --nonzero -o maskConnComp.h5 --update
view.main('maskConnComp.h5 -c gray --noverbose'.split())
input ifgramStack file: inputs/ifgramStack.h5 -------------------------------------------------- update mode: ON 1) output file maskConnComp.h5 NOT exist. run or skip: run. calculate the common mask of pixels with non-zero connectComponent value [==================================================] 505/505 10s / 0s create HDF5 file: maskConnComp.h5 with w mode create dataset /mask of bool in size of (1021, 1021) with compression=None finished writing to maskConnComp.h5 time used: 00 mins 10.9 secs. view.py maskConnComp.h5 -c gray --noverbose
The interferometric phase is a relative observation by nature. The phase values of each unwrapped interferogram are relative to an arbitrary pixel. Therefore we need to reference all interferograms to a common reference pixel. The "reference_point" step selects a common reference pixel for the stack of interferograms. The default approach of MintPy is to choose the pixel with the highest spatial coherence in the stack. Other options include specifying the longitude and latitude of the desired reference pixel, or its line and column numbers.
There are some guidelines for selecting a reference point based on prior knowledge of the study area in the mintpy.reference.* section of the template file.
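For example, the reference pixel can be set in the custom template in either coordinate system; mintpy.reference.yx is assumed here to be the row/column counterpart of the lalo option used in this tutorial (the values below match the reference_point log shown later):

mintpy.reference.lalo = 37.69, -122.07   # latitude, longitude in degrees
mintpy.reference.yx   = 492, 637         # row, column in image coordinates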
!smallbaselineApp.py SanFranSenDT42.txt --dostep reference_point
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:05:28.693148-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['reference_point'] Remaining steps: ['quick_overview', 'correct_unwrap_error', 'invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - reference_point ******************** Input data seems to be geocoded. Lookup file not needed. generate_mask.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 --nonzero -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskConnComp.h5 --update input ifgramStack file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 -------------------------------------------------- update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskConnComp.h5 already exists. 2) output file is newer than input dataset: connectComponent. run or skip: skip. temporal_average.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 --dataset coherence -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgSpatialCoh.h5 --update -------------------------------------------------- update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgSpatialCoh.h5 already exists. 2) output file is newer than input dataset: coherence. run or skip: skip. Input data seems to be geocoded. Lookup file not needed. reference_point.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -c /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgSpatialCoh.h5 -------------------------------------------------- reading reference info from template: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg input reference point in lat/lon: (37.69, -122.07) input reference point in y/x: (492, 637) mask: maskConnComp.h5 -------------------------------------------------- calculate the temporal average of unwrapPhase in file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 ... [==================================================] lines 1021/1021 1s / 0s Add/update ref_x/y attribute to file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 {'REF_Y': '492', 'REF_X': '637', 'REF_LAT': '37.6895834975', 'REF_LON': '-122.0695835455'} touch avgSpatialCoh.h5 touch maskConnComp.h5 Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 3.7 secs
Running the "reference_point" step adds the attributes "REF_X, REF_Y" and "REF_LON, REF_LAT" to the ifgramStack.h5 file. To see the attributes of the file, run info.py:
!info.py inputs/ifgramStack.h5 | egrep 'REF_'
REF_LAT 37.6895834975 REF_LON -122.0695835455 REF_X 637 REF_Y 492
Note that reference_point does not change the actual values of the unwrapped phase dataset. However, MintPy takes into account the phase at the reference point while performing the time-series inversion.
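A minimal sketch of checking these attributes in Python, using the readfile module imported above (an alternative to the info.py call):

# read the metadata dict and print the reference-point attributes
atr = readfile.read_attribute('inputs/ifgramStack.h5')
print({key: atr[key] for key in ['REF_Y', 'REF_X', 'REF_LAT', 'REF_LON']})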
In the next step we invert the network of differential unwrapped interferograms to estimate the time-series of unwrapped phase with respect to a reference acquisition date, which by default is the first acquisition. The estimated time-series is converted to distance change from radar to target and is provided in meters.
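Conceptually, the inversion solves a linear least-squares system whose design matrix connects each interferogram to the two acquisitions it spans. A minimal sketch with a toy network (hypothetical dates and phase values, not the tutorial data):

import numpy as np
# toy example: 3 acquisitions (the first one is the reference) and 3 interferograms
pairs = [(0, 1), (0, 2), (1, 2)]           # (reference, secondary) acquisition indices
phase = np.array([1.0, 3.0, 2.0])          # unwrapped phase of each interferogram [rad]
# design matrix: -1/+1 at the two acquisitions of each pair; drop the first
# (reference) acquisition so the system is full rank
A = np.zeros((len(pairs), 3))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1, 1
ts = np.linalg.lstsq(A[:, 1:], phase, rcond=None)[0]
print(ts)   # phase time-series at acquisitions 2 and 3, relative to the first one

MintPy additionally converts the estimated phase to range change in meters using the radar wavelength (the "radian to meter" conversion visible in the logs below).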
!smallbaselineApp.py SanFranSenDT42.txt --dostep invert_network
# copy the output temporal coherence file for later-on comparison
if not os.path.isfile('temporalCoherenceRaw.h5'):
shutil.copy2('temporalCoherence.h5', 'temporalCoherenceRaw.h5')
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:05:33.609577-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['invert_network'] Remaining steps: ['correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - invert_network ******************** Input data seems to be geocoded. Lookup file not needed. ifgram_inversion.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg --update read input option from template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg use dataset "unwrapPhase" by default -------------------------------------------------- update mode: ON 1) NOT ALL output files found: ['timeseries.h5', 'temporalCoherence.h5', 'numInvIfgram.h5']. run or skip: run. save the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 1 set OPENBLAS_NUM_THREADS = 1 set MKL_NUM_THREADS = 1 set NUMEXPR_NUM_THREADS = 1 set VECLIB_MAXIMUM_THREADS = 1 reference pixel in y/x: (492, 637) from dataset: unwrapPhase ------------------------------------------------------------------------------- least-squares solution with L2 min-norm on: deformation velocity minimum redundancy: 1.0 weight function: no calculate covariance: False mask: no ------------------------------------------------------------------------------- number of interferograms: 505 number of acquisitions : 114 number of lines : 1021 number of columns : 1021 -------------------------------------------------- create HDF5 file: timeseries.h5 with w mode create dataset : date of |S8 in size of (114,) with compression = None create dataset : bperp of <class 'numpy.float32'> in size of (114,) with compression = None create dataset : timeseries of <class 'numpy.float32'> in size of (114, 1021, 1021) with compression = None close HDF5 file: timeseries.h5 -------------------------------------------------- create HDF5 file: temporalCoherence.h5 with w mode create dataset : temporalCoherence of <class 'numpy.float32'> in size of (1021, 1021) with compression = None close HDF5 file: temporalCoherence.h5 -------------------------------------------------- create HDF5 file: numInvIfgram.h5 with w mode create dataset : mask of <class 'numpy.float32'> in size of (1021, 1021) with compression = None close HDF5 file: numInvIfgram.h5 maximum memory size: 4.0E+00 GB split 1021 lines into 2 patches for processing with each patch up to 520 lines ------- processing patch 1 out of 2 -------------- box width: 1021 box length: 520 reading unwrapPhase in (0, 0, 1021, 520) * 505 ... 
use input reference value convert zero value in unwrapPhase to NaN (no-data value) skip pixels (on the water) with zero value in file: waterMask.h5 skip pixels with unwrapPhase = NaN in all interferograms skip pixels with zero value in file: avgSpatialCoh.h5 number of pixels to invert: 375442 out of 530920 (70.7%) estimating time-series for pixels with valid unwrapPhase in all ifgrams (374257 pixels; 99.7%) ... calculating temporalCoherence in chunks of 400 pixels: 936 chunks in total ... chunk 200 / 936 chunk 400 / 936 chunk 600 / 936 chunk 800 / 936 estimating time-series for pixels with valid unwrapPhase in some ifgrams (1185 pixels; 0.3%) ... [==================================================] 1185/1185 pixels 2s / 0s converting LOS phase unit from radian to meter -------------------------------------------------- open HDF5 file timeseries.h5 in a mode writing dataset /timeseries block: [0, 114, 0, 520, 0, 1021] close HDF5 file timeseries.h5. -------------------------------------------------- open HDF5 file temporalCoherence.h5 in a mode writing dataset /temporalCoherence block: [0, 520, 0, 1021] close HDF5 file temporalCoherence.h5. -------------------------------------------------- open HDF5 file numInvIfgram.h5 in a mode writing dataset /mask block: [0, 520, 0, 1021] close HDF5 file numInvIfgram.h5. time used: 00 mins 18.1 secs. ------- processing patch 2 out of 2 -------------- box width: 1021 box length: 501 reading unwrapPhase in (0, 520, 1021, 1021) * 505 ... use input reference value convert zero value in unwrapPhase to NaN (no-data value) skip pixels (on the water) with zero value in file: waterMask.h5 skip pixels with unwrapPhase = NaN in all interferograms skip pixels with zero value in file: avgSpatialCoh.h5 number of pixels to invert: 382614 out of 511521 (74.8%) estimating time-series for pixels with valid unwrapPhase in all ifgrams (381275 pixels; 99.7%) ... calculating temporalCoherence in chunks of 400 pixels: 954 chunks in total ... chunk 200 / 954 chunk 400 / 954 chunk 600 / 954 chunk 800 / 954 estimating time-series for pixels with valid unwrapPhase in some ifgrams (1339 pixels; 0.3%) ... [==================================================] 1339/1339 pixels 0s / 0s converting LOS phase unit from radian to meter -------------------------------------------------- open HDF5 file timeseries.h5 in a mode writing dataset /timeseries block: [0, 114, 520, 1021, 0, 1021] close HDF5 file timeseries.h5. -------------------------------------------------- open HDF5 file temporalCoherence.h5 in a mode writing dataset /temporalCoherence block: [520, 1021, 0, 1021] close HDF5 file temporalCoherence.h5. -------------------------------------------------- open HDF5 file numInvIfgram.h5 in a mode writing dataset /mask block: [520, 1021, 0, 1021] close HDF5 file numInvIfgram.h5. time used: 00 mins 34.3 secs. -------------------------------------------------- update values on the reference pixel: (492, 637) set temporalCoherence on the reference pixel to 1. set # of observations on the reference pixel as 505 roll back to the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 16 remove env variable OPENBLAS_NUM_THREADS remove env variable MKL_NUM_THREADS remove env variable NUMEXPR_NUM_THREADS remove env variable VECLIB_MAXIMUM_THREADS time used: 00 mins 34.3 secs. Input data seems to be geocoded. Lookup file not needed. 
generate_mask.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 -m 0.7 -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 update mode: ON run or skip: run input temporalCoherence file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 read /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 create initial mask with the same size as the input file and all = 1 all pixels with nan value = 0 exclude pixels with value < 0.7 create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 with w mode create dataset /mask of bool in size of (1021, 1021) with compression=None finished writing to /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 time used: 00 mins 0.0 secs. number of reliable pixels: 440797 Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 34.8 secs
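The inversion also writes temporalCoherence.h5 and the thresholded maskTempCoh.h5 (cutoff 0.7, per the log above), which can be inspected with the same view calls used earlier:

view.main('temporalCoherence.h5 -c gray --noverbose'.split())
view.main('maskTempCoh.h5 -c gray --noverbose'.split())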
The main product generated by the invert_network step is timeseries.h5. To see the general content of the file, run info.py:
!info.py timeseries.h5 --compact
******************** Basic File Info ************************ file name: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 file type: timeseries coordinates : GEO ******************** Date Stat Info ************************* Start Date: 20150512 End Date: 20200310 Number of dates : 114 STD of datetimes : 1.32 years ******************** HDF5 File Structure ******************** Attributes in / level: ALOOKS 7 ANTENNA_SIDE -1 CENTER_LINE_UTC 50854.0 EARTH_RADIUS 6337286.638938101 END_DATE 20200310 FILE_LENGTH 1021 FILE_TYPE timeseries HEADING -168 HEIGHT 693000.0 LAT_REF1 38.1 LAT_REF2 38.1 LAT_REF3 37.2491666666667 LAT_REF4 37.2491666666667 LENGTH 1021 LON_REF1 -121.75 LON_REF2 -122.600833333333 LON_REF3 -121.75 LON_REF4 -122.600833333333 NCORRLOOKS 61.29302259485575 NUMBER_OF_PAIRS 505 ... HDF5 dataset "/bperp ": shape=(114,) , dtype=float32 , compression=None HDF5 dataset "/date ": shape=(114,) , dtype=|S8 , compression=None HDF5 dataset "/timeseries ": shape=(114, 1021, 1021) , dtype=float32 , compression=None
The timeseries file contains three datasets: "timeseries", the interferometric range change for each acquisition relative to the reference acquisition; "date", the acquisition date of each acquisition; and "bperp", the time-series of the perpendicular baseline.
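A minimal sketch of accessing these datasets directly with readfile and plt (both imported above); the pixel below is arbitrary and only illustrative, while tsview provides the full interactive version:

# read the full displacement cube (in meters) and plot one pixel through time
ts_data, atr = readfile.read('timeseries.h5', datasetName='timeseries')
y, x = 300, 300                          # an arbitrary pixel; replace with a location of interest
plt.figure()
plt.plot(ts_data[:, y, x] * 100, '.-')   # convert m to cm
plt.xlabel('acquisition index')
plt.ylabel('LOS range change [cm]')
plt.show()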
view.main('timeseries.h5 -v -5 5 --noaxis'.split())
# equivalent command in terminal: view.py timeseries.h5 -v -5 5 --noaxis
run view.py in MintPy version 1.6.1.post3, date 2024-08-14 input file is timeseries file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 in float32 format file size in y/x: (1021, 1021) num of datasets in file timeseries.h5: 114 datasets to exclude (0): [] datasets to display (114): ['timeseries-20150512', 'timeseries-20150605', 'timeseries-20150629', 'timeseries-20150723', 'timeseries-20150816', 'timeseries-20150909', 'timeseries-20151003', 'timeseries-20151027', 'timeseries-20151120', 'timeseries-20151214', 'timeseries-20160107', 'timeseries-20160131', 'timeseries-20160224', 'timeseries-20160319', 'timeseries-20160412', 'timeseries-20160506', 'timeseries-20160530', 'timeseries-20160810', 'timeseries-20160903', 'timeseries-20160927', 'timeseries-20161021', 'timeseries-20161114', 'timeseries-20161208', 'timeseries-20170101', 'timeseries-20170119', 'timeseries-20170125', 'timeseries-20170218', 'timeseries-20170302', 'timeseries-20170326', 'timeseries-20170407', 'timeseries-20170419', 'timeseries-20170501', 'timeseries-20170513', 'timeseries-20170606', 'timeseries-20170618', 'timeseries-20170630', 'timeseries-20170712', 'timeseries-20170724', 'timeseries-20170805', 'timeseries-20170817', 'timeseries-20170910', 'timeseries-20171004', 'timeseries-20171016', 'timeseries-20171028', 'timeseries-20171109', 'timeseries-20171121', 'timeseries-20171203', 'timeseries-20171215', 'timeseries-20171227', 'timeseries-20180108', 'timeseries-20180120', 'timeseries-20180201', 'timeseries-20180213', 'timeseries-20180225', 'timeseries-20180321', 'timeseries-20180402', 'timeseries-20180414', 'timeseries-20180426', 'timeseries-20180508', 'timeseries-20180520', 'timeseries-20180601', 'timeseries-20180613', 'timeseries-20180625', 'timeseries-20180707', 'timeseries-20180719', 'timeseries-20180731', 'timeseries-20180812', 'timeseries-20180824', 'timeseries-20180905', 'timeseries-20180917', 'timeseries-20180929', 'timeseries-20181011', 'timeseries-20181023', 'timeseries-20181104', 'timeseries-20181116', 'timeseries-20181128', 'timeseries-20181210', 'timeseries-20181222', 'timeseries-20190103', 'timeseries-20190115', 'timeseries-20190127', 'timeseries-20190208', 'timeseries-20190220', 'timeseries-20190304', 'timeseries-20190316', 'timeseries-20190328', 'timeseries-20190409', 'timeseries-20190421', 'timeseries-20190503', 'timeseries-20190515', 'timeseries-20190527', 'timeseries-20190608', 'timeseries-20190620', 'timeseries-20190702', 'timeseries-20190714', 'timeseries-20190726', 'timeseries-20190807', 'timeseries-20190819', 'timeseries-20190831', 'timeseries-20190912', 'timeseries-20190924', 'timeseries-20191006', 'timeseries-20191018', 'timeseries-20191030', 'timeseries-20191111', 'timeseries-20191123', 'timeseries-20191205', 'timeseries-20191217', 'timeseries-20191229', 'timeseries-20200110', 'timeseries-20200122', 'timeseries-20200203', 'timeseries-20200227', 'timeseries-20200310'] data coverage in y/x: (0, 0, 1021, 1021) subset coverage in y/x: (0, 0, 1021, 1021) data coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) subset coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) ------------------------------------------------------------------------ colormap: jet figure title: timeseries figure size : [15.00, 8.00] dataset number: 114 row number: 8 column number: 15 figure number: 1 read mask from file: maskTempCoh.h5 total number of pixels: 1.3E+08 * multilook 2 by 2 with nearest interpolation for display to save memory 
---------------------------------------- Figure 1 - timeseries.png reading data as a 3D matrix ... reading 2D slices 114/114... data range: [-10.132898, 10.528969] cm display range: [-5.0, 5.0] cm masking data plotting ... [==================================================] timeseries-20200310 2s / 0s data range: [-8.170067, 9.41587] cm display range: [-5.0, 5.0] cm show colorbar showing ...
The ground deformation caused by many geophysical or anthropogenic processes is linear at first-order approximation. Therefore it is common to estimate the rate of the ground deformation, which is the slope of the linear fit to the time-series. The "velocity" step estimates this rate of displacement.
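The underlying estimate is simply the slope of a least-squares line fit of displacement against time in years; a toy sketch with hypothetical values (not the tutorial data):

import numpy as np
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # acquisition times [years]
d = np.array([0.0, 0.6, 0.9, 1.6, 2.1]) / 100    # LOS displacement [m]
vel, intercept = np.polyfit(t, d, deg=1)         # slope of the line fit = velocity [m/year]
print(f'velocity: {vel * 100:.2f} cm/year')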
!smallbaselineApp.py SanFranSenDT42.txt --dostep velocity
# copy the output velocity file for later-on comparison
if not os.path.isfile('velocityRaw.h5'):
shutil.copy2('velocity.h5', 'velocityRaw.h5')
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:06:15.698659-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['velocity'] Remaining steps: ['geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - velocity ******************** timeseries2velocity.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 --update read options from template file: smallbaselineApp.cfg update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 NOT found. run or skip: run. open timeseries file: timeseries.h5 exclude date: [] -------------------------------------------------- dates from input file: 114 ['20150512', '20150605', '20150629', '20150723', '20150816', '20150909', '20151003', '20151027', '20151120', '20151214', '20160107', '20160131', '20160224', '20160319', '20160412', '20160506', '20160530', '20160810', '20160903', '20160927', '20161021', '20161114', '20161208', '20170101', '20170119', '20170125', '20170218', '20170302', '20170326', '20170407', '20170419', '20170501', '20170513', '20170606', '20170618', '20170630', '20170712', '20170724', '20170805', '20170817', '20170910', '20171004', '20171016', '20171028', '20171109', '20171121', '20171203', '20171215', '20171227', '20180108', '20180120', '20180201', '20180213', '20180225', '20180321', '20180402', '20180414', '20180426', '20180508', '20180520', '20180601', '20180613', '20180625', '20180707', '20180719', '20180731', '20180812', '20180824', '20180905', '20180917', '20180929', '20181011', '20181023', '20181104', '20181116', '20181128', '20181210', '20181222', '20190103', '20190115', '20190127', '20190208', '20190220', '20190304', '20190316', '20190328', '20190409', '20190421', '20190503', '20190515', '20190527', '20190608', '20190620', '20190702', '20190714', '20190726', '20190807', '20190819', '20190831', '20190912', '20190924', '20191006', '20191018', '20191030', '20191111', '20191123', '20191205', '20191217', '20191229', '20200110', '20200122', '20200203', '20200227', '20200310'] -------------------------------------------------- using all dates to calculate the time function -------------------------------------------------- estimate deformation model with the following assumed time functions: polynomial : 1 periodic : [] stepDate : [] polyline : [] exp : {} log : {} add/update the following configuration metadata: ['startDate', 'endDate', 'excludeDate', 'polynomial', 'periodic', 'stepDate', 'exp', 'log', 'uncertaintyQuantification', 'timeSeriesCovFile', 'bootstrapCount'] -------------------------------------------------- create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 with w mode create dataset : intercept of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : interceptStd of <class 'numpy.float32'> in size of (1021, 
1021) with compression = None create dataset : velocity of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocityStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : residue of <class 'numpy.float32'> in size of (1021, 1021) with compression = None add /intercept attribute: UNIT = m add /interceptStd attribute: UNIT = m add /velocity attribute: UNIT = m/year add /velocityStd attribute: UNIT = m/year add /residue attribute: UNIT = m close HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 reading data from file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 ... skip pixels with zero/nan value in all acquisitions number of pixels to invert: 755741 out of 1042441 (72.5%) estimating time functions via linalg.lstsq ... estimating time functions STD from time-series fitting residual ... -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /intercept block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /interceptStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /velocity block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /velocityStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /residue block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. time used: 00 mins 1.0 secs. timeseries2velocity.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ERA5.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 --update --ref-date 20150512 --ref-yx 492 637 read options from template file: smallbaselineApp.cfg update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 NOT found. run or skip: run. 
open timeseries file: ERA5.h5 exclude date: [] -------------------------------------------------- dates from input file: 114 ['20150512', '20150605', '20150629', '20150723', '20150816', '20150909', '20151003', '20151027', '20151120', '20151214', '20160107', '20160131', '20160224', '20160319', '20160412', '20160506', '20160530', '20160810', '20160903', '20160927', '20161021', '20161114', '20161208', '20170101', '20170119', '20170125', '20170218', '20170302', '20170326', '20170407', '20170419', '20170501', '20170513', '20170606', '20170618', '20170630', '20170712', '20170724', '20170805', '20170817', '20170910', '20171004', '20171016', '20171028', '20171109', '20171121', '20171203', '20171215', '20171227', '20180108', '20180120', '20180201', '20180213', '20180225', '20180321', '20180402', '20180414', '20180426', '20180508', '20180520', '20180601', '20180613', '20180625', '20180707', '20180719', '20180731', '20180812', '20180824', '20180905', '20180917', '20180929', '20181011', '20181023', '20181104', '20181116', '20181128', '20181210', '20181222', '20190103', '20190115', '20190127', '20190208', '20190220', '20190304', '20190316', '20190328', '20190409', '20190421', '20190503', '20190515', '20190527', '20190608', '20190620', '20190702', '20190714', '20190726', '20190807', '20190819', '20190831', '20190912', '20190924', '20191006', '20191018', '20191030', '20191111', '20191123', '20191205', '20191217', '20191229', '20200110', '20200122', '20200203', '20200227', '20200310'] -------------------------------------------------- using all dates to calculate the time function -------------------------------------------------- estimate deformation model with the following assumed time functions: polynomial : 1 periodic : [] stepDate : [] polyline : [] exp : {} log : {} add/update the following configuration metadata: ['startDate', 'endDate', 'excludeDate', 'polynomial', 'periodic', 'stepDate', 'exp', 'log', 'uncertaintyQuantification', 'timeSeriesCovFile', 'bootstrapCount'] -------------------------------------------------- create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 with w mode create dataset : intercept of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : interceptStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocity of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocityStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : residue of <class 'numpy.float32'> in size of (1021, 1021) with compression = None add /intercept attribute: UNIT = m add /interceptStd attribute: UNIT = m add /velocity attribute: UNIT = m/year add /velocityStd attribute: UNIT = m/year add /residue attribute: UNIT = m close HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 reading data from file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ERA5.h5 ... referecing to date: 20150512 referencing to point (y, x): (492, 637) skip pixels with zero/nan value in all acquisitions /Users/yunjunz/tools/MintPy/src/mintpy/timeseries2velocity.py:276: RuntimeWarning: Mean of empty slice ts_stack = np.nanmean(ts_data, axis=0) number of pixels to invert: 758681 out of 1042441 (72.8%) estimating time functions via linalg.lstsq ... estimating time functions STD from time-series fitting residual ... 
-------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 in a mode writing dataset /intercept block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 in a mode writing dataset /interceptStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 in a mode writing dataset /velocity block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 in a mode writing dataset /velocityStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 in a mode writing dataset /residue block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5. time used: 00 mins 1.5 secs. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 2.6 secs
%matplotlib widget
# faults used in UCERF3 in GMT lonlat format. Use gmt to convert any KML file into GMT lonlat file, e.g. `gmt kml2gmt UCERF3_Fault.kml > UCERF3_Fault.lonlat`
fault_file = os.path.join(utils.get_local_path(), 'UCERF3_Fault.lonlat')
opt = f'--dem ./inputs/geometryGeo.h5 --shade-exag 0.05 --dem-nocontour --faultline {fault_file} --faultline-min-dist 10 -v -1 1 --lalo-label --ylabel-rot 90 --figsize 10 8'
view.main(f'velocity.h5 velocity {opt}'.split())
run view.py in MintPy version 1.6.1.post3, date 2024-08-14 input file is velocity file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in float32 format file size in y/x: (1021, 1021) input dataset: "['velocity']" turning glob search OFF for velocity file num of datasets in file velocity.h5: 5 datasets to exclude (0): [] datasets to display (1): ['velocity'] data coverage in y/x: (0, 0, 1021, 1021) subset coverage in y/x: (0, 0, 1021, 1021) data coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) subset coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) ------------------------------------------------------------------------ colormap: jet initiate cartopy map projection: PlateCarree figure title: velocity read mask from file: maskTempCoh.h5 reading data ... masking data data range: [-1.4076022, 1.1951466] cm/year display range: [-1.0, 1.0] cm/year reading DEM: geometryGeo.h5 ... display data in transparency: 0.8 plot in geo-coordinate plotting DEM background ... show shaded relief DEM (min/max=-4000/3126 m; exag=0.05; az/alt=315.0/45.0 deg) plotting data as image via matplotlib.pyplot.imshow ... plot fault lines from GMT lonlat file: /Users/yunjunz/tools/utils/MintPy-tutorial/workflows/UCERF3_Fault.lonlat [==================================================] line 9527 / 9527 0s / 0s plot scale bar: [0.2, 0.2, 0.1] plot lat/lon label in step of 0.30000000000000004 and location of [1, 0, 0, 1] plot reference point rotate Y-axis tick labels by 90.0 deg showing ...
/Users/yunjunz/tools/mambaforge/envs/insar/lib/python3.10/site-packages/matplotlib/cm.py:496: RuntimeWarning: invalid value encountered in cast xx = (xx * 255).astype(np.uint8)
Obvious features in the estimated velocity map:
The general pattern of displacement is consistent with the tectonic setting of the region: the Pacific plate is moving northwest with respect to the North American plate. The satellite is on a descending track, and the estimated displacement shows the blue region moving away from the red region.
The magnitude of the relative movement (blue region relative to red region) is ~13 mm/yr in the radar Line-Of-Sight (LOS) direction, which is consistent with the ~40 mm/yr horizontal motion of the Pacific plate relative to North America.
The estimated velocity shows a linear feature almost aligned with the southeast-northwest diagonal of the map. This linear feature shows the aseismic fault creep on the Hayward fault.
Around latitude 38.0N, another linear feature represents the Concord fault, parallel to the Hayward fault.
Around latitude 37.9N, an east-west linear discontinuity is evident. This is most likely caused by unwrapping errors due to imperfect phase stitching in the frame-by-frame processing.
Around latitude 37.4N in the Santa Clara Valley, a linear feature represents the Silver Creek fault, mixed with a hydrological signal showing ground deformation caused by aquifer recharge (Schmidt and Bürgmann, 2003; Chaussard et al., 2014).
The black box at [37.69N, 122.07W] is the reference pixel for this map.
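To look more closely at the fault creep signal, one could plot a velocity profile across the Hayward fault with plot_transection (imported earlier); the start/end coordinates below are hypothetical and only illustrate the usage:
# plot a LOS velocity transect across the Hayward fault
# (coordinates are approximate; adjust them to the segment of interest)
plot_transection.main('velocity.h5 --start-lalo 37.75 -122.25 --end-lalo 37.60 -121.95'.split())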
The estimated velocity also comes with an expression of uncertainty, which is simply based on the goodness of fit of the linear model to the time-series. This quantity is saved in "velocity.h5" under the velocityStd dataset.
%matplotlib inline
view.main('velocity.h5 velocityStd -u mm -v 0 0.8'.split())
run view.py in MintPy version 1.6.1.post3, date 2024-08-14 input file is velocity file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in float32 format file size in y/x: (1021, 1021) input dataset: "['velocityStd']" turning glob search OFF for velocity file num of datasets in file velocity.h5: 5 datasets to exclude (0): [] datasets to display (1): ['velocityStd'] data coverage in y/x: (0, 0, 1021, 1021) subset coverage in y/x: (0, 0, 1021, 1021) data coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) subset coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) ------------------------------------------------------------------------ colormap: jet figure title: velocityStd figure size : [7.5, 6.0] read mask from file: maskTempCoh.h5 reading data ... masking data data range: [0.0, 1.3705224] mm/year display range: [0.0, 0.8] mm/year display data in transparency: 1.0 plot in geo-coordinate plotting data as image via matplotlib.pyplot.imshow ... plot scale bar: [0.2, 0.2, 0.1] plot reference point showing ...
The estimated standard deviation only represents the goodness of fit; it can be biased and may underestimate the actual uncertainty of the product. However, the spatial pattern of the estimated standard deviation is interesting and clearly shows the spatial correlation of noise in the time-series: the uncertainty is distance dependent and increases with increasing distance between pixels. This map shows the uncertainty of each pixel relative to the reference pixel.
Anatomy of interferometric phases:
$$ \large \Delta \phi = \color{green}{ \Delta\phi_{defo} + \Delta\phi_{atm} + \Delta\phi_{geom} + \Delta\phi_{tidal}} + \color{royalblue}{ \Delta\phi_{x}} $$
where $\color{green}{\Delta\phi_{atm}}$ represents the atmospheric propagation delay, including ionosphere and troposphere; $\color{green}{\Delta\phi_{geom}}$ includes the topographic residual (DEM error), orbital errors, etc.; $\color{green}{\Delta\phi_{tidal}}$ includes solid Earth tides, ocean tidal loading, etc.; and $\color{royalblue}{\Delta\phi_{x}}$ includes all phase contributions that do not fulfill the zero phase closure of interferogram triplets, i.e. $\color{royalblue}{\Delta\phi^{ab}_x} + \color{royalblue}{\Delta\phi^{bc}_x} - \color{royalblue}{\Delta\phi^{ac}_x} \neq 0$, such as decorrelation noise, phase unwrapping errors and other temporally inconsistent scattering or propagation processes.
The uncertainty of the ground displacement products derived from InSAR time-series depends on 1) the quality of the inversion of the stack of interferograms, and 2) the accuracy in separating the ground displacement from the other components of the InSAR data. Therefore, the definition of signal vs. noise differs between the two main steps in MintPy:
During the inversion: all phase components that fulfill the zero phase closure ($\color{green}{ \Delta\phi_{defo}, \Delta\phi_{atm}, \Delta\phi_{geom}, \Delta\phi_{tidal}}$) are considered signal, while the rest ($\color{royalblue}{\Delta\phi_{x}}$) is considered noise
After inversion: the ground displacement is signal, and everything else (including the propagation delay and geometrical residuals) is considered noise
Therefore, we first discuss the possible sources of error during the inversion and the existing ways in MintPy to evaluate the quality of the inversion and to reduce its uncertainty. Afterwards, we explain the different components of the time-series and the processing steps in MintPy to separate them from the ground displacement signal.
The main sources of noise during the time-series inversion ($\color{royalblue}{\Delta\phi_{x}}$) include decorrelation, phase unwrapping errors and the inconsistency among triplets of interferograms. Here we mainly focus on decorrelation and unwrapping errors. We first show the existing quantities in MintPy to evaluate these noise sources, then discuss approaches to reduce their impacts.
MintPy computes the temporal average of the spatial coherence of the entire stack as a potential ancillary measure to choose reliable pixels after the time-series inversion.
In addition to timeseries.h5, which contains the time-series dataset, the "invert_network" step produces other quantities with metrics to evaluate the quality of the inversion, e.g. the temporalCoherence.h5 file. Temporal coherence represents the consistency between the time-series and the network of interferograms (Pepe and Lanari, 2006):
$$ \large \gamma_{temp} = \frac{1}{M} \left| H^T e^{j \left( \Delta\phi - A\hat{\phi} \right)} \right| $$
where $\Delta\phi$ is the interferometric unwrapped phase, $A$ is the design matrix, $\hat{\phi}$ is the estimated time-series, $H$ is an $M\times1$ all-ones column vector, $M$ is the number of interferograms, and $j$ is the imaginary unit.
Temporal coherence varies from 0 to 1. Pixels with values closer to 1 are considered reliable and pixels with values closer to 0 are considered unreliable. For a dense network of interferograms, a threshold of 0.7 may be used (Yunjun et al., 2019).
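As a toy illustration (synthetic network and values, not MintPy's implementation), the snippet below builds the design matrix $A$ for a small hypothetical network, inverts it via least squares, and evaluates the temporal coherence defined above:
# toy example: design matrix A, least-squares inversion, temporal coherence
import numpy as np
rng = np.random.default_rng(0)
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]  # hypothetical (ref, sec) pairs
num_ifg, num_date = len(pairs), 4
A = np.zeros((num_ifg, num_date - 1))  # first date is fixed to zero and dropped
for i, (ref, sec) in enumerate(pairs):
    if ref > 0:
        A[i, ref - 1] = -1
    A[i, sec - 1] = 1
phi_true = np.array([0.5, 1.2, 2.0])  # synthetic phases at dates 2..4 [rad]
dphi = A @ phi_true + 0.3 * rng.normal(size=num_ifg)  # noisy unwrapped phases
phi_hat = np.linalg.lstsq(A, dphi, rcond=None)[0]  # estimated time-series
gamma_temp = np.abs(np.exp(1j * (dphi - A @ phi_hat)).sum()) / num_ifg
print(f'temporal coherence: {gamma_temp:.3f}')  # close to 1 for small residuals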
view.main('avgSpatialCoh.h5 --noverbose'.split())
view.main('temporalCoherence.h5 --noverbose'.split())
view.py avgSpatialCoh.h5 --noverbose
view.py temporalCoherence.h5 --noverbose
# The line below generates a multi-choice selection box for the question above. It requires `ipywidgets`. If not working, just ignore this and continue.
utils.tcoh_vs_scoh
The interferometric phases are wrapped (modulo 2$\pi$). Integration of the wrapped phase, commonly called phase unwrapping, is required to obtain a field of relative phase with respect to a given pixel. Phase unwrapping algorithms add integer numbers of 2$\pi$ phase jumps to recover the unwrapped phase. Decorrelation and discontinuities among different coherent regions may lead to wrong 2$\pi$ jumps being added to the phase field, which is known as unwrapping error. Unwrapping errors can bias the estimated time-series.
Looking at the temporal coherence, we suspect that some of the interferograms may have missing frames or wrong stitching. Let's first plot the interferograms and visually inspect some pairs for problems.
view.main('inputs/ifgramStack.h5 unwrapPhase-201604* -v -10 10 --zero-mask --noaxis --noverbose'.split())
view.py inputs/ifgramStack.h5 unwrapPhase-201604* -v -10 10 --zero-mask --noaxis --noverbose
It's obvious that the interferograms with indices 100 and 101 have clear jumps at a burst boundary, most likely caused by missing bursts in some acquisitions or by problems in stitching.
For large stacks of interferograms, it's sometimes impractical to check every single interferogram for phase unwrapping errors. Below is a way to use one map to indicate potential phase unwrapping errors for the entire stack.
For an interferogram triplet ($\Delta\phi^{ij}$, $\Delta\phi^{jk}$ and $\Delta\phi^{ik}$), unwrapping errors will introduce a non-zero integer component $C_{int}^{ijk}$ in the closure phase $C^{ijk}$. Therefore, the number of interferogram triplets with non-zero integer ambiguity $T_{int}$ can be used to detect unwrapping errors (Yunjun et al., 2019):
$$ \large C^{ijk}=\Delta\phi^{ij}+\Delta\phi^{jk}-\Delta\phi^{ik} $$
$$ \large C_{int}^{ijk}=\frac{C^{ijk}-wrap(C^{ijk})}{2\pi} $$
$$ \large T_{int}=\sum_{i=1}^{T}\left(C_{int}^{ijk} \neq 0\right) $$
where $wrap$ is an operator that wraps the input into $[-\pi, \pi)$; $T$ is the number of interferogram triplets.
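For illustration, here is a small synthetic example (not MintPy's code) of the closure phase and its integer ambiguity: a 2$\pi$ error injected into one interferogram of an otherwise consistent triplet shows up as $C_{int}=1$:
# toy example: closure phase integer ambiguity flags an unwrapping error
import numpy as np
rng = np.random.default_rng(0)
def wrap(x):
    return (x + np.pi) % (2 * np.pi) - np.pi  # wrap into [-pi, pi)
p_ij = rng.normal(size=(50, 50))
p_jk = rng.normal(size=(50, 50))
p_ik = p_ij + p_jk  # consistent triplet: closure phase is zero
p_ij[10:20, 10:20] += 2 * np.pi  # inject a 2*pi unwrapping error
C = p_ij + p_jk - p_ik  # closure phase
C_int = (C - wrap(C)) / (2 * np.pi)  # integer ambiguity
print(np.unique(C_int.round()))  # [0. 1.] -> nonzero where the error was injected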
!smallbaselineApp.py SanFranSenDT42.txt --dostep quick_overview
pp.plot_num_triplet_with_nonzero_integer_ambiguity('numTriNonzeroIntAmbiguity.h5', disp_fig=True, fig_size=[14, 4])
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:06:21.782311-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['quick_overview'] Remaining steps: ['correct_unwrap_error', 'invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - quick_overview ******************** Input data seems to be geocoded. Lookup file not needed. temporal_average.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 --dataset unwrapPhase -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgPhaseVelocity.h5 --update -------------------------------------------------- update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgPhaseVelocity.h5 NOT exist. run or skip: run. calculate the temporal average of unwrapPhase in file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 ... [==================================================] lines 1021/1021 3s / 1s create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgPhaseVelocity.h5 with w mode create dataset /velocity of float32 in size of (1021, 1021) with compression=None finished writing to /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgPhaseVelocity.h5 time used: 00 mins 6.3 secs unwrap_error_phase_closure.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 --water-mask /Users/yunjunz/data/test/SanFranSenDT42/mintpy/waterMask.h5 --action calculate --update open ifgramStack file: ifgramStack.h5 number of interferograms: 505 number of triplets: 541 calculating the number of triplets with non-zero integer ambiguity of closure phase ... block by block with size up to (260, 1021), 4 blocks in total reference pixel in y/x: (492, 637) from dataset: unwrapPhase [==================================================] line 780 / 1021 9s / 3s mask out pixels with zero in file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/waterMask.h5 mask out pixels with zero in file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/avgSpatialCoh.h5 write to file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/numTriNonzeroIntAmbiguity.h5 create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/numTriNonzeroIntAmbiguity.h5 with w mode create dataset /mask of float32 in size of (1021, 1021) with compression=None finished writing to /Users/yunjunz/data/test/SanFranSenDT42/mintpy/numTriNonzeroIntAmbiguity.h5 plot and save figure to file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/numTriNonzeroIntAmbiguity.png time used: 00 mins 12.5 secs Done. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 23.8 secs plot and save figure to file numTriNonzeroIntAmbiguity.png
Take home messages from $T_{int}$ map and histogram:
# The line below generates a multi-choice selection box for the question above. It requires `ipywidgets`. If not working, just ignore this and continue.
utils.inv_quality
MintPy provides three methods to correct phase unwrapping errors (Yunjun et al., 2019, section 3):
bridging: automates the traditional manual bridging method, in which coherent components with the smallest distance from each other are assumed connected, so a smooth phase variation across them is enforced.
phase_closure: based on the phase closure of triplets of interferograms.
bridging+phase_closure: a hybrid approach that simply applies both methods above.
Note that the phase closure approach requires a dense network of interferograms. To use the phase unwrapping error correction methods, a common mask is usually generated, showing the pixels with valid unwrapped phase in all interferograms.
The phase unwrapping error correction can be turned ON using the mintpy.unwrapError.method option, and further customized using the template options:
write_config_file(config_file, "mintpy.unwrapError.method = bridging")
!smallbaselineApp.py SanFranSenDT42.txt --dostep correct_unwrap_error
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:06:47.950428-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['correct_unwrap_error'] Remaining steps: ['invert_network', 'correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template mintpy.unwrapError.method: auto --> bridging copy SanFranSenDT42.txt to inputs directory for backup. copy smallbaselineApp.cfg to inputs directory for backup. copy SanFranSenDT42.txt to pic directory for backup. copy smallbaselineApp.cfg to pic directory for backup. read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - correct_unwrap_error ******************** Input data seems to be geocoded. Lookup file not needed. unwrap_error_bridging.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 --template /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg --update read options from template file: smallbaselineApp.cfg -------------------------------------------------- update mode: ON 1) output dataset: unwrapPhase_bridging NOT found. run or skip: run. -------------------------------------------------- correct unwrapping error in /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 with bridging ... read water mask from file: waterMask.h5 open /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 with r+ mode input dataset: unwrapPhase output dataset: unwrapPhase_bridging create /unwrapPhase_bridging of np.float32 in size of (505, 1021, 1021) [==================================================] 20200227_20200310 74s / 1s close /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 file. add/update the following configuration metadata to file: add/update mintpy.unwrapError.waterMaskFile = waterMask.h5 add/update mintpy.unwrapError.connCompMinArea = 2500.0 add/update mintpy.unwrapError.bridgePtsRadius = 50 time used: 01 mins 16.3 secs Done. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 01 mins 16.7 secs
Plotting the interferogram after unwrapping error correction shows how well the correction worked.
view.main('inputs/ifgramStack.h5 unwrap*20160131_20170302 conn*-20160131_20170302 --zero-mask --figsize 12 4 --noverbose'.split())
view.py inputs/ifgramStack.h5 unwrap*20160131_20170302 conn*-20160131_20170302 --zero-mask --figsize 12 4 --noverbose
After correcting for unwrapping errors, we need to re-run the inversion to see the impact of the unwrapping error correction.
This step enables modifying the network of interferograms before the network inversion.
Several options exist to modify a network.
All network modification options can be configured in the mintpy.network.* section of the configuration file.
(Figure from Yunjun et al., 2019)
The following weighted least squares (WLS) inversion methods are supported:
## Invert network of interferograms into time-series using weighted least squares (WLS) estimator.
## weighting options for least square inversion [fast option available but not best]:
## a. var - use inverse of covariance as weight (Tough et al., 1995; Guarnieri & Tebaldini, 2008) [recommended]
## b. fim - use Fisher Information Matrix as weight (Seymour & Cumming, 1994; Samiei-Esfahany et al., 2016).
## c. coh - use coherence as weight (Perissin & Wang, 2012)
## d. no - uniform weight (Berardino et al., 2002) [fast]
## SBAS (Berardino et al., 2002) = minNormVelocity (yes) + weightFunc (no)
mintpy.networkInversion.weightFunc = auto #[var / fim / coh / no], auto for var
By default, MintPy uses the same network of interferograms for all pixels. However, it is possible to use variable networks for different pixels by specifying a mask dataset, a threshold value for the mask, and a minimum redundancy value:
## mask options for unwrapPhase of each interferogram before inversion (recommended if weightFunc=no):
## a. coherence - mask out pixels with spatial coherence < maskThreshold
## b. connectComponent - mask out pixels with False/0 value
## c. no - no masking [recommended].
## d. offsetSNR - mask out pixels with offset SNR < maskThreshold [for offset]
mintpy.networkInversion.maskDataset = auto #[coherence / connectComponent / offsetSNR / no], auto for no
mintpy.networkInversion.maskThreshold = auto #[0-inf], auto for 0.4
mintpy.networkInversion.minRedundancy = auto #[1-inf], auto for 1.0, min num_ifgram for every SAR acquisition
MintPy by default masks the estimated time-series using temporal coherence with a threshold:
## Temporal coherence is calculated and used to generate the mask as the reliability measure
## reference: Pepe & Lanari (2006, IEEE-TGRS)
mintpy.networkInversion.minTempCoh = auto #[0.0-1.0], auto for 0.7, min temporal coherence for mask
mintpy.networkInversion.minNumPixel = auto #[int > 1], auto for 100, min number of pixels in mask above
mintpy.networkInversion.shadowMask = auto #[yes / no], auto for yes [if shadowMask is in geometry file] or no.
The least squares inversion can be performed using phase (matrix $A$ in Berardino et al., 2002) or phase velocity (matrix $B$ in Berardino et al., 2002). The latter allows inverting a disconnected network; see the toy sketch after the option below. For a connected network, the design matrix is full rank and the inversion using either method is the same.
mintpy.networkInversion.minNormVelocity = auto #[yes / no], auto for yes, min-norm deformation velocity or phase
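To make the phase vs. phase-velocity distinction concrete, below is a toy sketch (synthetic values, not MintPy's code) of the velocity design matrix $B$ applied to a disconnected network; the minimum-norm solution assigns zero velocity to the unconstrained interval:
# toy example: phase-velocity design matrix B for a disconnected network
import numpy as np
t = np.array([0.0, 0.1, 0.2, 0.8, 0.9])  # acquisition times [years]
pairs = [(0, 1), (1, 2), (3, 4)]  # no pair spans the gap between dates 3 and 4
dt = np.diff(t)
B = np.zeros((len(pairs), t.size - 1))
for i, (ref, sec) in enumerate(pairs):
    B[i, ref:sec] = dt[ref:sec]  # each row sums the interval velocities it spans
dphi = np.array([0.2, 0.2, 0.2])  # toy unwrapped phases [rad]
v = np.linalg.pinv(B) @ dphi  # min-norm velocities; unconstrained interval -> 0
phi_hat = np.concatenate(([0], np.cumsum(v * dt)))  # integrate back to phases
print(v, phi_hat)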
!smallbaselineApp.py SanFranSenDT42.txt --dostep invert_network
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:08:06.255017-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['invert_network'] Remaining steps: ['correct_LOD', 'correct_SET', 'correct_ionosphere', 'correct_troposphere', 'deramp', 'correct_topography', 'residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - invert_network ******************** Input data seems to be geocoded. Lookup file not needed. ifgram_inversion.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ifgramStack.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg --update read input option from template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg phase unwrapping error correction "bridging" is turned ON use dataset "unwrapPhase_bridging" by default -------------------------------------------------- update mode: ON 1) output files already exist: ['timeseries.h5', 'temporalCoherence.h5', 'numInvIfgram.h5']. 2) output files are NOT newer than input dataset: unwrapPhase_bridging. run or skip: run. save the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 1 set OPENBLAS_NUM_THREADS = 1 set MKL_NUM_THREADS = 1 set NUMEXPR_NUM_THREADS = 1 set VECLIB_MAXIMUM_THREADS = 1 reference pixel in y/x: (492, 637) from dataset: unwrapPhase_bridging ------------------------------------------------------------------------------- least-squares solution with L2 min-norm on: deformation velocity minimum redundancy: 1.0 weight function: no calculate covariance: False mask: no ------------------------------------------------------------------------------- number of interferograms: 505 number of acquisitions : 114 number of lines : 1021 number of columns : 1021 -------------------------------------------------- create HDF5 file: timeseries.h5 with w mode create dataset : date of |S8 in size of (114,) with compression = None create dataset : bperp of <class 'numpy.float32'> in size of (114,) with compression = None create dataset : timeseries of <class 'numpy.float32'> in size of (114, 1021, 1021) with compression = None close HDF5 file: timeseries.h5 -------------------------------------------------- create HDF5 file: temporalCoherence.h5 with w mode create dataset : temporalCoherence of <class 'numpy.float32'> in size of (1021, 1021) with compression = None close HDF5 file: temporalCoherence.h5 -------------------------------------------------- create HDF5 file: numInvIfgram.h5 with w mode create dataset : mask of <class 'numpy.float32'> in size of (1021, 1021) with compression = None close HDF5 file: numInvIfgram.h5 maximum memory size: 4.0E+00 GB split 1021 lines into 2 patches for processing with each patch up to 520 lines ------- processing patch 1 out of 2 -------------- box width: 1021 box length: 520 reading unwrapPhase_bridging in (0, 0, 
1021, 520) * 505 ... use input reference value convert zero value in unwrapPhase_bridging to NaN (no-data value) skip pixels (on the water) with zero value in file: waterMask.h5 skip pixels with unwrapPhase_bridging = NaN in all interferograms skip pixels with zero value in file: avgSpatialCoh.h5 number of pixels to invert: 375442 out of 530920 (70.7%) estimating time-series for pixels with valid unwrapPhase_bridging in all ifgrams (374257 pixels; 99.7%) ... calculating temporalCoherence in chunks of 400 pixels: 936 chunks in total ... chunk 200 / 936 chunk 400 / 936 chunk 600 / 936 chunk 800 / 936 estimating time-series for pixels with valid unwrapPhase_bridging in some ifgrams (1185 pixels; 0.3%) ... [==================================================] 1185/1185 pixels 1s / 0s converting LOS phase unit from radian to meter -------------------------------------------------- open HDF5 file timeseries.h5 in a mode writing dataset /timeseries block: [0, 114, 0, 520, 0, 1021] close HDF5 file timeseries.h5. -------------------------------------------------- open HDF5 file temporalCoherence.h5 in a mode writing dataset /temporalCoherence block: [0, 520, 0, 1021] close HDF5 file temporalCoherence.h5. -------------------------------------------------- open HDF5 file numInvIfgram.h5 in a mode writing dataset /mask block: [0, 520, 0, 1021] close HDF5 file numInvIfgram.h5. time used: 00 mins 17.0 secs. ------- processing patch 2 out of 2 -------------- box width: 1021 box length: 501 reading unwrapPhase_bridging in (0, 520, 1021, 1021) * 505 ... use input reference value convert zero value in unwrapPhase_bridging to NaN (no-data value) skip pixels (on the water) with zero value in file: waterMask.h5 skip pixels with unwrapPhase_bridging = NaN in all interferograms skip pixels with zero value in file: avgSpatialCoh.h5 number of pixels to invert: 382614 out of 511521 (74.8%) estimating time-series for pixels with valid unwrapPhase_bridging in all ifgrams (381275 pixels; 99.7%) ... calculating temporalCoherence in chunks of 400 pixels: 954 chunks in total ... chunk 200 / 954 chunk 400 / 954 chunk 600 / 954 chunk 800 / 954 estimating time-series for pixels with valid unwrapPhase_bridging in some ifgrams (1339 pixels; 0.3%) ... [==================================================] 1339/1339 pixels 0s / 0s converting LOS phase unit from radian to meter -------------------------------------------------- open HDF5 file timeseries.h5 in a mode writing dataset /timeseries block: [0, 114, 520, 1021, 0, 1021] close HDF5 file timeseries.h5. -------------------------------------------------- open HDF5 file temporalCoherence.h5 in a mode writing dataset /temporalCoherence block: [520, 1021, 0, 1021] close HDF5 file temporalCoherence.h5. -------------------------------------------------- open HDF5 file numInvIfgram.h5 in a mode writing dataset /mask block: [520, 1021, 0, 1021] close HDF5 file numInvIfgram.h5. time used: 00 mins 32.5 secs. -------------------------------------------------- update values on the reference pixel: (492, 637) set temporalCoherence on the reference pixel to 1. set # of observations on the reference pixel as 505 roll back to the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 16 remove env variable OPENBLAS_NUM_THREADS remove env variable MKL_NUM_THREADS remove env variable NUMEXPR_NUM_THREADS remove env variable VECLIB_MAXIMUM_THREADS time used: 00 mins 32.5 secs. 
Input data seems to be geocoded. Lookup file not needed. generate_mask.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 -m 0.7 -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 update mode: ON run or skip: run input temporalCoherence file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 read /Users/yunjunz/data/test/SanFranSenDT42/mintpy/temporalCoherence.h5 create initial mask with the same size as the input file and all = 1 all pixels with nan value = 0 exclude pixels with value < 0.7 delete exsited file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 with w mode create dataset /mask of bool in size of (1021, 1021) with compression=None finished writing to /Users/yunjunz/data/test/SanFranSenDT42/mintpy/maskTempCoh.h5 time used: 00 mins 0.0 secs. number of reliable pixels: 450950 Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 33.1 secs
Let's check how the temporal coherence looks after the phase unwrapping error correction.
# view.py options
opt = ' --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar '
cmd_list = [
f'view.py temporalCoherenceRaw.h5 --title "temporal coherence w/o PU error corr." {opt} --nocbar',
f'view.py temporalCoherence.h5 --title "temporal coherence w/ PU error corr." {opt}',
]
# plot using matplotlib & mintpy.view
fig, axs = plt.subplots(nrows=1, ncols=2, figsize=[10, 5], subplot_kw=dict(projection=ccrs.PlateCarree()))
for ax, cmd in zip(axs, cmd_list):
    data, atr, inps = prep_slice(cmd)
    plot_slice(ax, data, atr, inps)
fig.tight_layout()
plt.show()
view.py temporalCoherenceRaw.h5 --title temporal coherence w/o PU error corr. --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --nocbar view.py temporalCoherence.h5 --title temporal coherence w/ PU error corr. --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar
Note that the temporal coherence at the top has increased and the discontinuity has disappeared.
After the inversion of the network of interferograms, the estimated time-series contains different components, including tropospheric delay, ionospheric delay, topographic residuals, ground displacement, and other possible geophysical components (e.g., tides) or instrumental effects (e.g., the local oscillator drift of Envisat). Given ground displacement as our signal of interest, the following processing steps attempt to separate the signal from the noise and provide a ground displacement time-series for the selected coherent pixels.
This step corrects the tropospheric phase delay. Three methods are supported:
The corresponding template options are:
## correct tropospheric delay using the following methods:
## a. pyaps - use Global Atmospheric Models (GAMs) data (Jolivet et al., 2011; 2014)
## supports ERA5 from ECMWF [recommended]
## b. gacos - use GACOS with the iterative tropospheric decomposition model (Yu et al., 2018a, RSE; 2018b, JGR)
## need to manually download GACOS products at http://www.gacos.net for all acquisitions before running this step
## c. height_correlation - correct stratified tropospheric delay (Doin et al., 2009, J Applied Geop)
mintpy.troposphericDelay.method = auto #[pyaps / gacos / height_correlation / no], auto for pyaps
It outputs the estimated tropospheric delay time-series (inputs/ERA5.h5 in this example) and the delay-corrected displacement time-series.
This step corrects the ionospheric phase delay. It currently supports only the range split-spectrum results from the ISCE-2 stack processors.
The corresponding template options are:
########## 1. load_data
##---------ionosphere stack (optional):
mintpy.load.ionUnwFile = auto #[path pattern of unwrapped interferogram files]
mintpy.load.ionCorFile = auto #[path pattern of spatial coherence files]
mintpy.load.ionConnCompFile = auto #[path pattern of connected components files], optional but recommended
########## 7. correct_ionosphere (optional but recommended)
## correct ionospheric delay [need split spectrum results from ISCE-2 stack processors]
mintpy.ionosphericDelay.method = auto #[split_spectrum / no], auto for no
mintpy.ionosphericDelay.excludeDate = auto #[20080520,20090817 / no], auto for no
mintpy.ionosphericDelay.excludeDate12 = auto #[20080520_20090817 / no], auto for no
Example mintpy.load.ion* values for:
This step corrects the solid Earth tides, caused by the gravitational pull of the Sun and the Moon, using PySolid (Yunjun et al., 2022), which implements the IERS (International Earth Rotation and Reference Systems Service) 2010 Conventions (Petit & Luzum, 2010).
The corresponding template options are:
mintpy.solidEarthTides = auto #[yes / no], auto for no
The rigid motion of tectonic plates (up to several cm/year) can introduce a long-wavelength spatial gradient at the mm/year level over scales of several hundred km (Stephenson et al., 2022, GRL) when projected into the line-of-sight (LOS) direction with its range-dependent incidence angle (and, to a lesser extent, azimuth angle), as shown below. This impact can be accounted for using existing plate motion models, such as the ITRF2014-PMM (Altamimi et al., 2017).
(Figure from Stephenson et al., 2022, GRL)
The method is implemented in MintPy as plate_motion.py (not yet integrated into smallbaselineApp). Run plate_motion.py --help for more detailed usage.
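As a rough numerical illustration (toy ENU velocity and look azimuth; sign conventions vary between processors), the snippet below projects a uniform horizontal plate motion onto the radar LOS using the ground-to-satellite unit vector and the near/far incidence angles of this dataset, yielding a mm/yr-level gradient across the swath:
# toy example: uniform plate motion -> range-dependent LOS velocity gradient
import numpy as np
v_e, v_n, v_u = -0.030, 0.020, 0.0  # hypothetical plate motion [m/yr] in ENU
inc = np.deg2rad(np.linspace(31.6, 36.9, 100))  # near-to-far incidence angles
az = np.deg2rad(260)  # hypothetical azimuth of the ground-to-satellite direction
v_los = (v_e * np.sin(inc) * np.sin(az)
         + v_n * np.sin(inc) * np.cos(az)
         + v_u * np.cos(inc))  # dot product with the LOS unit vector
print(f'LOS gradient across the swath: {(v_los[-1] - v_los[0]) * 1e3:.1f} mm/yr')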
Recent studies show that using small temporal baseline interferograms may result in biased displacement time-series (Ansari et al., 2021, TGRS). This bias has been confirmed in several studies and linked to the non-zero closure phase, i.e. $\Delta\phi^{ij}+\Delta\phi^{jk}-\Delta\phi^{ik} \neq 0$. Zheng et al. (2022, TGRS) developed a simple model explaining both the non-zero closure phase and the observed systematic "bias", showing that the non-zero closure phase can be an indicator of temporally inconsistent physical processes that alter both the phase and amplitude of interferometric measurements. The figure below shows the average velocity maps from small baseline approaches using bandwidth-1, bandwidth-5, and bandwidth-10, and how these discrepancies can be estimated and removed.
The model/algorithm from Zheng et al. (2022) has been implemented in MintPy as closure_phase_bias.py (not yet integrated into smallbaselineApp). In this notebook, we demonstrate only the detection of potential closure phase bias using closure_phase_bias.py --action mask. The actual bias estimation and correction can be found in a dedicated notebook tutorial with an example dataset at insarlab/MintPy-tutorial/closure_phase_bias.ipynb.
Following on Zheng et al. (2022), we can calculate the average sequential closure phase $\bar{C_n}$ as:
$$ \large C_n^k = \Delta\phi^{k,k+1} + \Delta\phi^{k+1,k+2} + ... + \Delta\phi^{k+n-1,k+n} - \Delta\phi^{k,k+n} $$
$$ \large \bar{C_n} = \frac{1}{K} \sum_{k=1}^{K} e^{j C_n^k} $$
where $C_n^k$ is the sequential closure phase with connection level $n$ starting at the $k$-th acquisition, e.g., $C_4^1 = \Delta\phi^{1,2} + \Delta\phi^{2,3} + \Delta\phi^{3,4} + \Delta\phi^{4,5} - \Delta\phi^{1,5}$; $K$ is the number of available sequential closure phases; and $n$ is the assumed bias-free connection level, beyond which the closure phase bias is negligible. $\bar{C_n}$ is a complex number: by setting thresholds on both its amplitude and phase, we can obtain a proxy map for regions susceptible to closure phase bias.
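The snippet below is a synthetic sketch (hypothetical arrays, not closure_phase_bias.py itself) of the average sequential closure phase defined above and of the amplitude/phase thresholding used to flag susceptible pixels:
# toy example: average sequential closure phase and the susceptibility flag
import numpy as np
rng = np.random.default_rng(0)
num_date, n = 12, 3  # acquisitions; assumed bias-free connection level
shape = (40, 40)
phi = rng.normal(size=(num_date, *shape))  # toy phase history per acquisition

def ifg(i, j):
    # hypothetical unwrapped interferogram with its own noise, so that
    # the sequential closure phase is not identically zero
    return phi[j] - phi[i] + 0.3 * rng.normal(size=shape)

K = num_date - n  # number of available sequential closure phases
cpx_sum = np.zeros(shape, dtype=complex)
for k in range(K):
    conn1 = sum(ifg(i, i + 1) for i in range(k, k + n))  # chain of conn-1 ifgs
    cpx_sum += np.exp(1j * (conn1 - ifg(k, k + n)))  # minus the conn-n ifg
avg_cp = cpx_sum / K  # complex average sequential closure phase
# flag coherent pixels (high amplitude) with a large average closure phase
susceptible = (np.abs(avg_cp) >= 0.4) & (np.abs(np.angle(avg_cp)) > 1.4)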
Based on the network plot in section 2.2 above, we know there are near-complete sets of interferograms for connection levels 1, 2, 3 and 28. Therefore, we choose connection level 28 as the assumed bias-free connection level, and use it for the detection calculation below.
!closure_phase_bias.py -i inputs/ifgramStack.h5 --action mask --conn-level 28 --epsilon 0.4
-------------------------------------------------------------------------------- calculating the mask to flag areas susceptible to non-closure-phase related biases (as zero) ... number of valid acquisitions: 114 (20150512 - 20200310) average complex closure phase threshold in amplitude/correlation: 0.4 average complex closure phase threshold in phase: 3 sigma (1.4 rad) calculating the average complex closure phase length / width: 1021 / 1021 number of closure measurements expected: 86 number of closure measurements found : 16 reading unwrapPhase to compute closure phases apply spatial referencing to aria products reference pixel in y/x: (492, 637) from dataset: unwrapPhase mask out pixels with no-data-value (zero incidenceAngle from file: geometryGeo.h5) create mask for areas susceptible to non-closure phase biases set pixels with average complex closure phase angle > 3 sigma (1.4 rad) to 0. set pixels with average complex closure phase amplitude (correlation) < 0.4 to 1. create HDF5 file: ./maskClosurePhase.h5 with w mode create dataset /mask of bool in size of (1021, 1021) with compression=None finished writing to ./maskClosurePhase.h5 create HDF5 file: ./avgCpxClosurePhase.h5 with w mode create dataset /amplitude of float32 in size of (1021, 1021) with compression=None create dataset /phase of float32 in size of (1021, 1021) with compression=None finished writing to ./avgCpxClosurePhase.h5 time used: 00 mins 3.6 secs.
# view.py options
opt = ' --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar '
cmd_list = [f'view.py avgCpxClosurePhase.h5 amplitude -c gray {opt}',
f'view.py avgCpxClosurePhase.h5 phase -c RdBu {opt}', #--lalo-loc 0 0 0 1 ',
f'view.py maskClosurePhase.h5 -c gray_r {opt}',] # --lalo-loc 0 0 0 1 ',]
# plot using matplotlib & mintpy.view
fig, axs = plt.subplots(nrows=1, ncols=3, figsize=[18, 6], subplot_kw=dict(projection=ccrs.PlateCarree()))
for ax, cmd in zip(axs, cmd_list):
    data, atr, inps = prep_slice(cmd)
    plot_slice(ax, data, atr, inps)
fig.tight_layout()
plt.show()
view.py avgCpxClosurePhase.h5 amplitude -c gray --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar view.py avgCpxClosurePhase.h5 phase -c RdBu --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar view.py maskClosurePhase.h5 -c gray_r --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar
Based on the result above, we know that:
This step estimates and removes a linear or quadratic ramp from each acquisition based on the phase of the reliable pixels. It is recommended for localized deformation signals, such as volcanic deformation, landslides and city subsidence, but not recommended for long-spatial-wavelength deformation signals, such as interseismic deformation.
The corresponding template options are:
## Estimate and remove a phase ramp for each acquisition based on the reliable pixels.
## Recommended for localized deformation signals, i.e. volcanic deformation, landslide and land subsidence, etc.
## NOT recommended for long spatial wavelength deformation signals, i.e. co-, post- and inter-seismic deformation.
mintpy.deramp = auto #[no / linear / quadratic], auto for no - no ramp will be removed
mintpy.deramp.maskFile = auto #[filename / no], auto for maskTempCoh.h5, mask file for ramp estimation
It outputs a new time-series HDF5 file with a ramp suffix: timeseries_ramp.h5 in this example.
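For intuition, here is a minimal synthetic sketch (not MintPy's deramp code) of how a linear ramp can be estimated from the reliable pixels of one acquisition and removed:
# toy example: estimate and remove a linear ramp using reliable pixels only
import numpy as np
rng = np.random.default_rng(0)
y, x = np.mgrid[0:100, 0:100]
data = 1e-4 * x + 2e-4 * y + 1e-3 * rng.normal(size=(100, 100))  # ramp + noise
mask = rng.uniform(size=(100, 100)) > 0.3  # hypothetical reliable-pixel mask
G = np.column_stack([x[mask], y[mask], np.ones(mask.sum())])  # linear ramp model
coeff = np.linalg.lstsq(G, data[mask], rcond=None)[0]
ramp = coeff[0] * x + coeff[1] * y + coeff[2]  # evaluate over the full grid
data_deramped = data - ramp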
This step corrects the phase residual caused by the inaccuracy of the DEM (DEM error) using its relationship with the perpendicular baseline time-series (Fattahi & Amelung, 2013); a toy numerical sketch follows the options and outputs below. The corresponding template options are:
## stepFuncDate - Specify stepFuncDate option if you know there are sudden displacement jump in your area,
## i.e. volcanic eruption, or earthquake, and check timeseriesStepModel.h5 afterward for their estimation.
## excludeDate - Dates excluded for error estimation only
## pixelwiseGeometry - Use pixel-wise geometry info, i.e. incidence angle and slant range distance
## yes - use pixel-wise geometry when they are available [slow; used by default]
## no - use mean geometry [fast]
mintpy.topographicResidual = auto #[yes / no], auto for yes
mintpy.topographicResidual.polyOrder = auto #[1-inf], auto for 2, poly order of temporal deformation model
mintpy.topographicResidual.phaseVelocity = auto #[yes / no], auto for no - phase, use phase velocity for minimization
mintpy.topographicResidual.stepFuncDate = auto #[20080529,20100611 / no], auto for no, date of step jump
mintpy.topographicResidual.excludeDate = auto #[20070321 / txtFile / no], auto for exclude_date.txt
mintpy.topographicResidual.pixelwiseGeometry = auto #[yes / no], auto for yes, use pixel-wise geometry info
It outputs: demErr.h5 (the estimated DEM error), timeseries_demErr.h5 (the DEM-error-corrected time-series) and timeseriesResidual.h5 (the residual phase time-series, used in the next step).
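The sketch below (synthetic values, not dem_error.py) illustrates the principle: the topographic residual scales with the perpendicular baseline as $B_\perp z_\varepsilon / (r \sin\theta)$, so the DEM error $z_\varepsilon$ can be estimated jointly with a polynomial deformation model by least squares:
# toy example: joint estimation of DEM error and linear velocity
import numpy as np
rng = np.random.default_rng(0)
num_date = 50
t = np.sort(rng.uniform(0, 5, num_date))  # years since the first acquisition
bperp = rng.uniform(-150, 150, num_date)  # perpendicular baselines [m]
r, inc = 850e3, np.deg2rad(34)  # toy slant range [m] and incidence angle
z_err, v = 20.0, 0.01  # true DEM error [m] and LOS velocity [m/yr]
ts = v * t + bperp * z_err / (r * np.sin(inc)) + 1e-3 * rng.normal(size=num_date)
G = np.column_stack([bperp / (r * np.sin(inc)), t, np.ones(num_date)])
z_est, v_est, _ = np.linalg.lstsq(G, ts, rcond=None)[0]
print(f'estimated DEM error: {z_est:.1f} m, velocity: {v_est * 1e3:.1f} mm/yr')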
write_config_file(config_file, "mintpy.topographicResidual = yes")
!smallbaselineApp.py SanFranSenDT42.txt --dostep correct_topography
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:08:46.595346-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['correct_topography'] Remaining steps: ['residual_RMS', 'reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template mintpy.topographicResidual: no --> yes copy SanFranSenDT42.txt to inputs directory for backup. copy smallbaselineApp.cfg to inputs directory for backup. copy SanFranSenDT42.txt to pic directory for backup. copy smallbaselineApp.cfg to pic directory for backup. read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - correct_topography ******************** Input data seems to be geocoded. Lookup file not needed. dem_error.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 --update read options from template file: smallbaselineApp.cfg -------------------------------------------------- update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 NOT found. run or skip: run. save the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 1 set OPENBLAS_NUM_THREADS = 1 set MKL_NUM_THREADS = 1 set NUMEXPR_NUM_THREADS = 1 set VECLIB_MAXIMUM_THREADS = 1 open timeseries file: timeseries.h5 -------------------------------------------------------------------------------- correct topographic phase residual (DEM error) (Fattahi & Amelung, 2013, IEEE-TGRS) ordinal least squares (OLS) inversion with L2-norm minimization on: phase temporal deformation model: polynomial order = 2 -------------------------------------------------------------------------------- add/update the following configuration metadata to file: ['polyOrder', 'phaseVelocity', 'stepDate', 'excludeDate'] -------------------------------------------------- create HDF5 file: demErr.h5 with w mode create dataset : dem of <class 'numpy.float32'> in size of (1021, 1021) with compression = None close HDF5 file: demErr.h5 -------------------------------------------------- grab dataset structure from ref_file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 with w mode create dataset : bperp of float32 in size of (114,) with compression = None create dataset : date of |S8 in size of (114,) with compression = None create dataset : timeseries of float32 in size of (114, 1021, 1021) with compression = None close HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 -------------------------------------------------- grab dataset structure from ref_file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseriesResidual.h5 with w mode create dataset : bperp of float32 in size of (114,) with compression = None create dataset : date of |S8 in size of (114,) with compression = None create 
dataset : timeseries of float32 in size of (114, 1021, 1021) with compression = None close HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseriesResidual.h5 read mean incidenceAngle, slantRangeDistance, bperp value from timeseries file near incidence angle : 31.5984 degree center incidence angle : 34.2721 degree far incidence angle : 36.9457 degree near range : 798980.12 m center range : 821553.58 m far range : 844127.04 m skip pixels with ZERO in ALL acquisitions skip pixels with NaN in ANY acquisitions skip pixels with ZERO temporal coherence number of pixels to invert: 755740 out of 1042441 (72.5%) estimating DEM error ... -------------------------------------------------- open HDF5 file demErr.h5 in a mode writing dataset /dem block: [0, 1021, 0, 1021] close HDF5 file demErr.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 in a mode writing dataset /timeseries block: [0, 114, 0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseriesResidual.h5 in a mode writing dataset /timeseries block: [0, 114, 0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseriesResidual.h5. roll back to the original settings of ['OMP_NUM_THREADS', 'OPENBLAS_NUM_THREADS', 'MKL_NUM_THREADS', 'NUMEXPR_NUM_THREADS', 'VECLIB_MAXIMUM_THREADS'] set OMP_NUM_THREADS = 16 remove env variable OPENBLAS_NUM_THREADS remove env variable MKL_NUM_THREADS remove env variable NUMEXPR_NUM_THREADS remove env variable VECLIB_MAXIMUM_THREADS time used: 00 mins 3.6 secs. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 3.7 secs
To view the estimated DEM error:
view.main(['demErr.h5', '--zero-mask'])
run view.py in MintPy version 1.6.1.post3, date 2024-08-14 input file is dem file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/demErr.h5 in float32 format file size in y/x: (1021, 1021) num of datasets in file demErr.h5: 1 datasets to exclude (0): [] datasets to display (1): ['dem'] data coverage in y/x: (0, 0, 1021, 1021) subset coverage in y/x: (0, 0, 1021, 1021) data coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) subset coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) ------------------------------------------------------------------------ colormap: jet figure title: demErr figure size : [7.5, 6.0] reading data ... masking pixels with zero value data range: [-42.726265, 59.88446] m display range: [-42.726265, 59.88446] m display data in transparency: 1.0 plot in geo-coordinate plotting data as image via matplotlib.pyplot.imshow ... plot scale bar: [0.2, 0.2, 0.1] plot reference point showing ...
This step calculates the Root Mean Square (RMS) of the residual phase time-series for each acquisition; it then selects the date with the minimum RMS as the optimal reference date, and flags dates with RMS > cutoff * median RMS as noisy acquisitions to exclude.
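For illustration, a minimal synthetic version of that logic (not the actual timeseries_rms.py) could look like:
# toy example: per-date residual RMS, optimal reference date, outlier dates
import numpy as np
rng = np.random.default_rng(0)
res = 1e-2 * rng.normal(size=(114, 200, 200))  # toy residual time-series [m]
res[42] *= 10  # make one acquisition noisy
rms = np.sqrt(np.nanmean(res**2, axis=(1, 2)))  # RMS per acquisition
ref_idx = int(np.argmin(rms))  # optimal reference date: minimum RMS
noisy = np.flatnonzero(rms > 3.0 * np.median(rms))  # RMS > cutoff * median RMS
print(f'reference date index: {ref_idx}, noisy date indices: {noisy}')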
The corresponding template options are:
## 1) Residual Phase Root Mean Square
## calculate the Root Mean Square (RMS) of residual phase time-series for each acquisition
## To get rid of long wavelength component in space, a ramp is removed for each acquisition
## Set optimal reference date to date with min RMS
## Set exclude dates (outliers) to dates with RMS > cutoff * median RMS (Median Absolute Deviation)
mintpy.residualRMS.maskFile = auto #[file name / no], auto for maskTempCoh.h5, mask for ramp estimation
mintpy.residualRMS.deramp = auto #[quadratic / linear / no], auto for quadratic
mintpy.residualRMS.cutoff = auto #[0.0-inf], auto for 3
It outputs:
rms_timeseriesResidual_ramp.txt: RMS value of each acquisition
rms_timeseriesResidual_ramp.pdf: plot of rms_timeseriesResidual_ramp.txt
reference_date.txt: date in YYYYMMDD format for the optimal reference date
exclude_date.txt: date(s) in YYYYMMDD format for the noisy acquisitions (if at least one is detected)
!smallbaselineApp.py SanFranSenDT42.txt --dostep residual_RMS
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:08:51.657195-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['residual_RMS'] Remaining steps: ['reference_date', 'velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - residual_RMS ******************** timeseries_rms.py timeseriesResidual.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read options from template file: smallbaselineApp.cfg remove quadratic ramp from file: timeseriesResidual.h5 read mask file: maskTempCoh.h5 -------------------------------------------------- grab metadata from ref_file: timeseriesResidual.h5 grab dataset structure from ref_file: timeseriesResidual.h5 create HDF5 file: timeseriesResidual_ramp.h5 with w mode create dataset : bperp of float32 in size of (114,) with compression = None create dataset : date of |S8 in size of (114,) with compression = None create dataset : timeseries of float32 in size of (114, 1021, 1021) with compression = None close HDF5 file: timeseriesResidual_ramp.h5 estimating phase ramp one date at a time ... [==================================================] 114/114 27s / 0s finished writing to file: timeseriesResidual_ramp.h5 time used: 00 mins 28.5 secs. calculating residual RMS for each epoch from file: timeseriesResidual_ramp.h5 read mask from file: maskTempCoh.h5 reading timeseries data from file: timeseriesResidual_ramp.h5 ... [==================================================] 114/114 1s / 0s save timeseries RMS to text file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/rms_timeseriesResidual_ramp.txt read timeseries residual RMS from file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/rms_timeseriesResidual_ramp.txt -------------------------------------------------- date with min RMS: 20190527 - 0.0023 save date to file: reference_date.txt -------------------------------------------------- date(s) with RMS > 3.0 * median RMS (0.0183) None. save figure to file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/rms_timeseriesResidual_ramp.pdf Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 31.2 secs
pp.plot_timeseries_rms('./rms_timeseriesResidual_ramp.txt', fig_size=[12, 3])
!cat reference_date.txt
20190527
This step changes the reference date of all phase time-series files, based on the input template option:
## reference all time-series to one date in time
## no - do not change the default reference date (1st date)
mintpy.reference.date = auto #[reference_date.txt / 20090214 / no], auto for reference_date.txt
This step operates on the existing time-series files and does not output new files.
!smallbaselineApp.py SanFranSenDT42.txt --dostep reference_date
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:09:24.229176-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['reference_date'] Remaining steps: ['velocity', 'geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - reference_date ******************** reference_date.py -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 read reference date from file: reference_date.txt input reference date: 20190527 -------------------------------------------------- change reference date for file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 reading data ... referencing in time ... -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5 in r+ mode writing dataset /timeseries block: (0, 114, 0, 1021, 0, 1021) close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries.h5. update "REF_DATE" attribute value to 20190527 -------------------------------------------------- change reference date for file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 reading data ... referencing in time ... -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 in r+ mode writing dataset /timeseries block: (0, 114, 0, 1021, 0, 1021) close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5. update "REF_DATE" attribute value to 20190527 time used: 00 mins 3.0 secs. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 3.0 secs
Noisy acquisitions (identified in the "residual_RMS" step and listed in the exclude_date.txt file) are excluded by default during the velocity estimation.
!smallbaselineApp.py SanFranSenDT42.txt --dostep velocity
MintPy version 1.6.1.post3, date 2024-08-14 --RUN-at-2024-08-26 11:09:28.142596-- Current directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy Run routine processing with smallbaselineApp.py on steps: ['velocity'] Remaining steps: ['geocode', 'google_earth', 'hdfeos5'] -------------------------------------------------- Project name: SanFranSenDT42 Go to work directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy read custom template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/SanFranSenDT42.txt update default template based on input custom template No new option value found, skip updating /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg read default template file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg ******************** step - velocity ******************** timeseries2velocity.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 --update read options from template file: smallbaselineApp.cfg update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 already exists. 2) output file is NOT newer than input file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5. run or skip: run. open timeseries file: timeseries_demErr.h5 exclude date: [] -------------------------------------------------- dates from input file: 114 ['20150512', '20150605', '20150629', '20150723', '20150816', '20150909', '20151003', '20151027', '20151120', '20151214', '20160107', '20160131', '20160224', '20160319', '20160412', '20160506', '20160530', '20160810', '20160903', '20160927', '20161021', '20161114', '20161208', '20170101', '20170119', '20170125', '20170218', '20170302', '20170326', '20170407', '20170419', '20170501', '20170513', '20170606', '20170618', '20170630', '20170712', '20170724', '20170805', '20170817', '20170910', '20171004', '20171016', '20171028', '20171109', '20171121', '20171203', '20171215', '20171227', '20180108', '20180120', '20180201', '20180213', '20180225', '20180321', '20180402', '20180414', '20180426', '20180508', '20180520', '20180601', '20180613', '20180625', '20180707', '20180719', '20180731', '20180812', '20180824', '20180905', '20180917', '20180929', '20181011', '20181023', '20181104', '20181116', '20181128', '20181210', '20181222', '20190103', '20190115', '20190127', '20190208', '20190220', '20190304', '20190316', '20190328', '20190409', '20190421', '20190503', '20190515', '20190527', '20190608', '20190620', '20190702', '20190714', '20190726', '20190807', '20190819', '20190831', '20190912', '20190924', '20191006', '20191018', '20191030', '20191111', '20191123', '20191205', '20191217', '20191229', '20200110', '20200122', '20200203', '20200227', '20200310'] -------------------------------------------------- using all dates to calculate the time function -------------------------------------------------- estimate deformation model with the following assumed time functions: polynomial : 1 periodic : [] stepDate : [] polyline : [] exp : {} log : {} add/update the following configuration metadata: ['startDate', 'endDate', 'excludeDate', 'polynomial', 'periodic', 'stepDate', 'exp', 'log', 'uncertaintyQuantification', 'timeSeriesCovFile', 'bootstrapCount'] -------------------------------------------------- create HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 with w mode create dataset : intercept of <class 
'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : interceptStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocity of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocityStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : residue of <class 'numpy.float32'> in size of (1021, 1021) with compression = None add /intercept attribute: UNIT = m add /interceptStd attribute: UNIT = m add /velocity attribute: UNIT = m/year add /velocityStd attribute: UNIT = m/year add /residue attribute: UNIT = m close HDF5 file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 reading data from file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/timeseries_demErr.h5 ... skip pixels with zero/nan value in all acquisitions number of pixels to invert: 755741 out of 1042441 (72.5%) estimating time functions via linalg.lstsq ... estimating time functions STD from time-series fitting residual ... -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /intercept block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /interceptStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /velocity block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /velocityStd block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. -------------------------------------------------- open HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in a mode writing dataset /residue block: [0, 1021, 0, 1021] close HDF5 file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5. time used: 00 mins 1.1 secs. timeseries2velocity.py /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ERA5.h5 -t /Users/yunjunz/data/test/SanFranSenDT42/mintpy/smallbaselineApp.cfg -o /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 --update --ref-date 20190527 --ref-yx 492 637 read options from template file: smallbaselineApp.cfg update mode: ON 1) output file /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocityERA5.h5 already exists. 2) output file is newer than input file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/inputs/ERA5.h5. 3) all key configuration parameters are the same: ['startDate', 'endDate', 'excludeDate', 'polynomial', 'periodic', 'stepDate', 'exp', 'log', 'uncertaintyQuantification', 'timeSeriesCovFile', 'bootstrapCount']. run or skip: skip. Go back to directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy ################################################ Normal end of smallbaselineApp processing! ################################################ Time used: 00 mins 1.1 secs
Compare the linear velocity before and after the corrections for: 1) phase unwrapping errors and 2) topographic residuals.
# view.py options
opt = ' -v -1.5 1 --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar'
cmd_list = [
f'view.py velocityRaw.h5 velocity {opt} --title velocity-before-corrections',
f'view.py velocity.h5 velocity {opt} --title velocity-after-corrections --lalo-loc 0 0 0 1',
]
# plot using matplotlib & mintpy.view
fig, axs = plt.subplots(nrows=1, ncols=2, figsize=[14, 6], subplot_kw=dict(projection=ccrs.PlateCarree()))
for ax, cmd in zip(axs, cmd_list):
    data, atr, inps = prep_slice(cmd)
    plot_slice(ax, data, atr, inps)
plt.show()
view.py velocityRaw.h5 velocity -v -1.5 1 --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --title velocity-before-corrections view.py velocity.h5 velocity -v -1.5 1 --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --title velocity-after-corrections --lalo-loc 0 0 0 1
%matplotlib inline
#scp_args = 'velocity.h5 --start-lalo 37.7629 -122.4929 --end-lalo 37.9504 -121.9296 --noverbose '
scp_args = 'velocity.h5 --start-lalo 37.6523 -122.1163 --end-lalo 37.7047 -122.0497 --noverbose '
plot_transection.main(scp_args.split())
plot_transection.py velocity.h5 --start-lalo 37.6523 -122.1163 --end-lalo 37.7047 -122.0497 --noverbose view.py velocity.h5 velocity --noverbose figure size : [7.5, 6.0]
The transect across the Hayward fault shows ~3.5 mm/yr of fault creep in the LOS direction.
MintPy's analysis is independent of GNSS observations, which allows validating the InSAR products against GNSS data when they are available. MintPy can automatically download GNSS data over the region of interest from several sources (--gnss-source), then project the GNSS observations into the InSAR LOS direction (--gnss-comp). To display the GNSS station names on the plot, add --gnss-label.
opt = '--show-gnss --ref-gnss P225 --gnss-comp enu2los --gnss-label -v -1 1 --lalo-label --lalo-step 0.5 --ylabel-rot 90 --figsize 10 8 '
view.main(f'velocity.h5 velocity {opt}'.split())
run view.py in MintPy version 1.6.1.post3, date 2024-08-14 input file is velocity file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/velocity.h5 in float32 format file size in y/x: (1021, 1021) input dataset: "['velocity']" turning glob search OFF for velocity file num of datasets in file velocity.h5: 5 datasets to exclude (0): [] datasets to display (1): ['velocity'] data coverage in y/x: (0, 0, 1021, 1021) subset coverage in y/x: (0, 0, 1021, 1021) data coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) subset coverage in lat/lon: (-122.600833333, 38.1, -121.75000034, 37.249167007000004) ------------------------------------------------------------------------ colormap: jet initiate cartopy map projection: PlateCarree figure title: velocity read mask from file: maskTempCoh.h5 reading data ... masking data data range: [-1.3947906, 1.1969569] cm/year display range: [-1.0, 1.0] cm/year display data in transparency: 1.0 plot in geo-coordinate create directory: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/GNSS-UNR referencing InSAR data to the pixel nearest to GNSS station: P225 at [37.713865, -122.058327] by substrating 0.028 cm/year plotting data as image via matplotlib.pyplot.imshow ... plot scale bar: [0.2, 0.2, 0.1] plot lat/lon label in step of 0.5 and location of [1, 0, 0, 1] downloading site list from UNR: http://geodesy.unr.edu/NGLStationPages/DataHoldings.txt to DataHoldings.txt load 22184 GNSS sites with fields: site lat lon start_date end_date num_solution keep sites within SNWE of (37.249167007000004, 38.1, -122.600833333, -121.75000034): [88] keep sites with end_date >= 20150512: [73] keep sites with start_date <= 20200310: [67] keep sites with # of solutions >= 50: [67] ['BRI2' 'BRIB' 'BRIC' 'CAC3' 'CACD' 'CADY' 'CAFT' 'CAHB' 'CAP1' 'CAP6' 'CAPL' 'CCSF' 'CSJB' 'DIAB' 'DUBP' 'EBMD' 'JRSC' 'LRA1' 'LRA2' 'LRA3' 'LRA4' 'LRA5' 'LRA6' 'LUTZ' 'MHDL' 'MILP' 'MONB' 'MSHP' 'MTVW' 'OHLN' 'ORE2' 'ORE3' 'OREO' 'OXMT' 'P176' 'P177' 'P178' 'P181' 'P219' 'P220' 'P221' 'P222' 'P223' 'P224' 'P225' 'P226' 'P227' 'P229' 'P230' 'P248' 'P262' 'ROCP' 'SBRB' 'SCCP' 'SFCC' 'SLAC' 'SRB1' 'STFU' 'SVIN' 'SWEP' 'T3RP' 'TIBB' 'UCSF' 'WIN2' 'WINT' 'ZOA1' 'ZOA2'] nearest GNSS site (potential --ref-gnss choice): P225 at [37.71390151977539, -122.05830383300781] ------------------------------ plotting GNSS velocity in IGS14 reference frame in enu2los direction with respect to P225 ... number of available GNSS stations: 67 start date: 20150512 end date: 20200310 components projection: enu2los default GNSS observation file name: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/gnss_enu2los_UNR.csv calculating GNSS observation ... use incidence / azimuth angle from file: geometryGeo.h5 [> ] Velocity calculation failed for site BRI2 [=> 4% ] 3/67 BRIC 6s / 163s Velocity calculation failed for site BRIC [==> 6% ] 4/67 CAC3 9s / 146s Velocity calculation failed for site CAC3 [=====================> 43% ] 29/67 MTVW 144s / 191s Velocity calculation failed for site MTVW [======================= 82% ============> ] 55/67 SFCC 290s / 63s Velocity calculation failed for site SFCC [==================================================] 67/67 ZOA2 338s / 10s write GNSS observations to file: /Users/yunjunz/data/test/SanFranSenDT42/mintpy/gnss_enu2los_UNR.csv ignore the following 7 stations due to limited overlap/observations in time ['BRI2' 'BRIC' 'CAC3' 'MTVW' 'SFCC' 'SVIN' 'T3RP'] rotate Y-axis tick labels by 90.0 deg showing ...
kwargs = dict(ref_gnss_site='P225', csv_file='gnss_enu2los_UNR.csv', msk_file='maskTempCoh.h5', cutoff=5, fig_size=[6, 6])
kwargs['ex_gnss_sites'] = ['LRA4', 'ORE2', 'OREO']
sites, insar_obs, gnss_obs = pp.plot_insar_vs_gnss_scatter(vel_file='velocity.h5', **kwargs)
read GNSS velocity from file: gnss_enu2los_UNR.csv read InSAR velocity from file: velocity.h5 [==================================================] 64/64 ZOA2 0s / 0s median offset between InSAR and GNSS [before common referencing]: 2.11 cm/year referencing both InSAR and GNSS data to site: P225 removing sites with NaN values in GNSS or InSAR GNSS min/max: -0.85 / 0.58 InSAR min/max: -0.94 / 0.61 RMSE = 0.16 cm/yr R^2 = 0.92 Preliminary outliers detection: abs(InSAR - GNSS) > med abs dev (0.13) * 5 Site: InSAR GNSS save figure to file insar_vs_gnss_scatter.pdf
Check the GNSS stations on the Nevada Geodetic Laboratory website for detailed info and further investigation.
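For a quick sanity check of the GNSS LOS velocities written in the previous step, one could load the CSV with pandas (a minimal sketch; the column names are read from the file itself rather than assumed):

import pandas as pd

# peek at the GNSS comparison table written by the previous step
df = pd.read_csv('gnss_enu2los_UNR.csv')
print(df.columns.tolist())   # confirm the actual column names
print(df.head())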
After correcting for potential errors and noise, we can estimate a suite of time functions from the final displacement time-series for our signal of interest, e.g. as in Hetland et al. (2012), treating the InSAR time-series in the same way as a GNSS time-series. This step is highly subjective and depends on our "geophysical sense of smell".
(Figure from Hetland et al., 2012)
This step is implemented as timeseries2velocity.py in MintPy, which is called within smallbaselineApp.py --dostep velocity. The currently supported time functions are:
polynomial - defined by its degree as an integer: 1 for linear, 2 for quadratic, etc.
periodic - defined by a list of periods in decimal years: 1 for annual, 0.5 for semi-annual, etc.
step - defined by a list of onset times as strings in YYYYMMDD(THHMM) format
exp - defined by an onset time followed by a characteristic time in integer days
log - defined by an onset time followed by a characteristic time in integer days
The complete time function estimation options, including uncertainty quantification, can be configured through the configuration file in the mintpy.timeFunc.* section. A minimal sketch of the underlying least-squares fit follows below.
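To make the fitting concrete, here is a minimal sketch (synthetic data, not MintPy's internal implementation) of estimating a degree-1 polynomial plus an annual periodic term with np.linalg.lstsq, the same solver reported in the logs above:

import numpy as np

# synthetic example: 114 acquisitions over ~4.8 years, similar to this dataset
t = np.linspace(0, 4.8, 114)                    # time in decimal years since the first date
d = 0.005 * t + 0.003 * np.sin(2 * np.pi * t)   # synthetic LOS displacement [m]

# design matrix for: polynomial degree 1 + annual periodic term
G = np.column_stack([
    np.ones_like(t),          # intercept [m]
    t,                        # velocity [m/yr]
    np.cos(2 * np.pi * t),    # annual cosine
    np.sin(2 * np.pi * t),    # annual sine
])
m = np.linalg.lstsq(G, d, rcond=None)[0]
print(f'velocity: {m[1]:.4f} m/yr, annual amplitude: {np.hypot(m[2], m[3]):.4f} m')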
First, we examine the time-series at a few selected pixels to decide which time functions to use. Then we apply these time functions to all pixels to obtain the map view of the estimated time function parameters.
%matplotlib widget
tsview.main('timeseries_demErr.h5 --ref-date 20150909 --figsize 9 3 --figsize-img 4 4 --ylim -8 8 --noverbose '.split())
tsview.py timeseries_demErr.h5 --ref-date 20150909 --figsize 9 3 --figsize-img 4 4 --ylim -8 8 --noverbose No lookup table (longitude or rangeCoord) found in files. reading 2D slices 114/114...
From the examination with tsview.py, the time-dependent displacement behavior seems to include: 1) a higher-order polynomial trend and 2) a seasonal (annual) oscillation.
!timeseries2velocity.py timeseries_demErr.h5 --poly 2 --periodic 1 -o velocityTimeFunc.h5
open timeseries file: timeseries_demErr.h5 -------------------------------------------------- dates from input file: 114 ['20150512', '20150605', '20150629', '20150723', '20150816', '20150909', '20151003', '20151027', '20151120', '20151214', '20160107', '20160131', '20160224', '20160319', '20160412', '20160506', '20160530', '20160810', '20160903', '20160927', '20161021', '20161114', '20161208', '20170101', '20170119', '20170125', '20170218', '20170302', '20170326', '20170407', '20170419', '20170501', '20170513', '20170606', '20170618', '20170630', '20170712', '20170724', '20170805', '20170817', '20170910', '20171004', '20171016', '20171028', '20171109', '20171121', '20171203', '20171215', '20171227', '20180108', '20180120', '20180201', '20180213', '20180225', '20180321', '20180402', '20180414', '20180426', '20180508', '20180520', '20180601', '20180613', '20180625', '20180707', '20180719', '20180731', '20180812', '20180824', '20180905', '20180917', '20180929', '20181011', '20181023', '20181104', '20181116', '20181128', '20181210', '20181222', '20190103', '20190115', '20190127', '20190208', '20190220', '20190304', '20190316', '20190328', '20190409', '20190421', '20190503', '20190515', '20190527', '20190608', '20190620', '20190702', '20190714', '20190726', '20190807', '20190819', '20190831', '20190912', '20190924', '20191006', '20191018', '20191030', '20191111', '20191123', '20191205', '20191217', '20191229', '20200110', '20200122', '20200203', '20200227', '20200310'] -------------------------------------------------- using all dates to calculate the time function -------------------------------------------------- estimate deformation model with the following assumed time functions: polynomial : 2 periodic : [1.0] stepDate : [] polyline : [] exp : {} log : {} add/update the following configuration metadata: ['startDate', 'endDate', 'excludeDate', 'polynomial', 'periodic', 'stepDate', 'exp', 'log', 'uncertaintyQuantification', 'timeSeriesCovFile', 'bootstrapCount'] -------------------------------------------------- create HDF5 file: velocityTimeFunc.h5 with w mode create dataset : intercept of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : interceptStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocity of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : velocityStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : acceleration of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : accelerationStd of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : annualAmplitude of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : annualPhase of <class 'numpy.float32'> in size of (1021, 1021) with compression = None create dataset : residue of <class 'numpy.float32'> in size of (1021, 1021) with compression = None add /intercept attribute: UNIT = m add /interceptStd attribute: UNIT = m add /velocity attribute: UNIT = m/year add /velocityStd attribute: UNIT = m/year add /acceleration attribute: UNIT = m/year^2 add /accelerationStd attribute: UNIT = m/year^2 add /annualAmplitude attribute: UNIT = m add /annualPhase attribute: UNIT = radian add /residue attribute: UNIT = m close HDF5 file: velocityTimeFunc.h5 reading data from file timeseries_demErr.h5 ... 
skip pixels with zero/nan value in all acquisitions number of pixels to invert: 755741 out of 1042441 (72.5%) estimating time functions via linalg.lstsq ... estimating time functions STD from time-series fitting residual ... -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /intercept block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /interceptStd block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /velocity block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /velocityStd block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /acceleration block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /accelerationStd block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /annualAmplitude block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /annualPhase block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. -------------------------------------------------- open HDF5 file velocityTimeFunc.h5 in a mode writing dataset /residue block: [0, 1021, 0, 1021] close HDF5 file velocityTimeFunc.h5. time used: 00 mins 1.2 secs.
%matplotlib inline
# view.py options
opt = ' --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar '
cmd_list = [
f'view.py velocityTimeFunc.h5 velocity {opt} --lalo-loc 1 0 0 0 ',
f'view.py velocityTimeFunc.h5 acceleration {opt} --lalo-loc 0 0 0 0 ',
f'view.py velocityTimeFunc.h5 annualAmplitude {opt} --lalo-loc 1 0 0 1 ',
f'view.py velocityTimeFunc.h5 annualPhase {opt} --lalo-loc 0 0 0 1 -c cmy ',
]
# plot using matplotlib & mintpy.view
fig, axs = plt.subplots(nrows=2, ncols=2, figsize=[11, 10], subplot_kw=dict(projection=ccrs.PlateCarree()))
for ax, cmd in zip(axs.flatten(), cmd_list):
    data, atr, inps = prep_slice(cmd)
    plot_slice(ax, data, atr, inps)
plt.show()
view.py velocityTimeFunc.h5 velocity --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --lalo-loc 1 0 0 0 view.py velocityTimeFunc.h5 acceleration --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --lalo-loc 0 0 0 0 view.py velocityTimeFunc.h5 annualAmplitude --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --lalo-loc 1 0 0 1 view.py velocityTimeFunc.h5 annualPhase --lalo-label --lalo-step 0.5 --ylabel-rot 90 --noverbose --noscalebar --lalo-loc 0 0 0 1 -c cmy
smallbaselineApp.py non-stop processing
After getting familiar with the template settings, one can set up all the custom configurations (in the custom template file - recommended) and run smallbaselineApp.py with a single command-line call to process all steps:
smallbaselineApp.py SanFranSenDT42.txt
To facilitate re-running, an auto-skipping strategy is implemented for all steps in smallbaselineApp.py; a step is skipped if:
1) its output file already exists;
2) the output file is newer than its input file;
3) all key configuration parameters are unchanged (the "run or skip" checks visible in the logs above).
Therefore, one can modify a template option and re-run smallbaselineApp.py without specifying the start/end step; the processing will restart from the first step affected by the change and continue from there, e.g. as sketched below.
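For example (a sketch: mintpy.deramp is a real template option, but the value change here is purely illustrative), one could append a modified option to the custom template and re-run; only the affected steps are re-executed:

# append a modified option to the custom template (helper defined at the top of this notebook)
write_config_file(config_file, 'mintpy.deramp = linear\n')
# re-run; steps unaffected by the change are skipped automatically
!smallbaselineApp.py SanFranSenDT42.txt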
./inputs directory
The ./inputs folder contains everything smallbaselineApp.py needs. One can move the ./inputs folder to any machine where MintPy is installed and re-start the whole analysis. This is well-suited for users who want to experiment with the analysis on their local laptop after the massive InSAR stack processing on a High Performance Computing (HPC) cluster or in the cloud, e.g. as sketched below.
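For example (hypothetical paths and archive name), packing the minimal inputs on the HPC and resuming on a laptop could look like:

# on the HPC: pack the minimal inputs needed to re-run the analysis
!tar -czf SanFranSenDT42_inputs.tar.gz inputs
# on the laptop: unpack next to the custom template and re-run from scratch
!tar -xzf SanFranSenDT42_inputs.tar.gz
!smallbaselineApp.py SanFranSenDT42.txt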
Almost all processing steps are callable by running individual command-line scripts. For example, one can invert the interferograms using ifgram_inversion.py inputs/ifgramStack.h5 -t smallbaselineApp.cfg, perform the atmospheric delay correction using tropo_pyaps3.py or tropo_phase_elevation.py, or apply the topographic residual correction using dem_error.py.
MintPy leverages the Dask library for parallel processing. To turn this feature on, set mintpy.compute.cluster = local in the configuration file. This capability is currently implemented for the two most time-consuming steps: "invert_network" and "correct_topography". More details can be found in the MintPy documentation. The relevant template options are:
########## computing resource configuration
mintpy.compute.maxMemory = auto #[float > 0.0], auto for 4, max memory to allocate in GB
## parallel processing with dask
## currently apply to steps: invert_network, correct_topography
## cluster = none to turn off the parallel computing
## numWorker = all to use all locally available cores (for cluster = local only)
## config = none to rollback to the default name (same as the cluster type; for cluster != local)
mintpy.compute.cluster = auto #[local / slurm / pbs / lsf / none], auto for none, cluster type
mintpy.compute.numWorker = auto #[int > 1 / all], auto for 4 (local) or 40 (slurm / pbs / lsf), num of workers
mintpy.compute.config = auto #[none / slurm / pbs / lsf ], auto for none (same as cluster), config name
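For example, to turn on the local Dask cluster in this notebook's custom template, one could reuse the write_config_file() helper defined at the top of the notebook (a sketch; the worker count is illustrative):

# enable a local Dask cluster with 4 workers for the heavy steps
DASK_TXT = '''mintpy.compute.cluster   = local
mintpy.compute.numWorker = 4
'''
write_config_file(config_file, DASK_TXT)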
All HDF5 files can be plotted using view.py and/or tsview.py:
view.py : 2D plot(s)
tsview.py : 1D time-series plot
plot_transection.py : 1D profile along a line of a 2D matrix
plot_coherence_matrix.py : coherence matrix for one pixel
save_kmz.py : Google Earth points/raster for 2D displacement (a minimal call is sketched below)
save_kmz_timeseries.py : Google Earth points for 3D displacement time-series
Check out the University of Miami online time-series viewer: https://insarmaps.miami.edu/
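For example, a minimal save_kmz.py call to export the velocity map from this tutorial to Google Earth (the dataset name follows the velocity.h5 file structure shown in the logs above):

# export the geocoded velocity map to a Google Earth KMZ file
!save_kmz.py velocity.h5 velocity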
If geocoded displacement products from both ascending and descending paths are available, the asc_desc2horz_vert.py script can be called to decompose the LOS measurements into horizontal and vertical components, e.g. as sketched below.
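For example (hypothetical file names for the two tracks):

# hypothetical geocoded velocity files from one ascending and one descending track
!asc_desc2horz_vert.py vel_SenAT64.h5 vel_SenDT42.h5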
image_stitch.py
merges two or more geocoded datasets sharing a common area into one. It finds the overlap area and calculates the average offset between the datasets. For example:
image_stitch.py vel_AlosAT422.h5 vel_AlosAT423.h5 vel_AlosAT424.h5 vel_AlosAT425.h5 -o vel_AlosA.h5