The OpenSARLab MintPy workflow has been moved into the OpenSARLab_MintPy_Recipe_Book, which can be found on OpenSARLab at the path: ~/Data_Recipe_Jupyter_Books/opensarlab_MintPy_Recipe_Book
We will no longer support this workflow or the osl_mintpy environment that it uses.
Hint: Once you have navigated to the Recipe Book, click the "JB" tab on the left hand JupyterLab sidebar to access its table of contents.
Author: Alex Lewandowski; University of Alaska Fairbanks
Based on prep_hyp3_for_mintpy.ipynb by Jiang Zhu; University of Alaska Fairbanks
Important Note about JupyterHub
Your JupyterHub server will automatically shut down when left idle for more than 1 hour. Your notebooks will not be lost, but you will have to restart their kernels and re-run them from the beginning. You will not be able to seamlessly continue running a partially run notebook.
import url_widget as url_w
notebookUrl = url_w.URLWidget()
display(notebookUrl)
from IPython.display import Markdown
from IPython.display import display
notebookUrl = notebookUrl.value
user = !echo $JUPYTERHUB_USER
env = !echo $CONDA_PREFIX
if env[0] == '':
    env[0] = 'Python 3 (base)'

if env[0] != '/home/jovyan/.local/envs/osl_mintpy':
    display(Markdown('<text style=color:red><strong>WARNING:</strong></text>'))
    display(Markdown('<text style=color:red>This notebook should be run using the "osl_mintpy" conda environment.</text>'))
    display(Markdown(f'<text style=color:red>It is currently using the "{env[0].split("/")[-1]}" environment.</text>'))
    display(Markdown('<text style=color:red>Select "osl_mintpy" from the "Change Kernel" submenu of the "Kernel" menu.</text>'))
    display(Markdown(f'<text style=color:red>If the "osl_mintpy" environment is not present, use <a href="{notebookUrl.split("/user")[0]}/user/{user[0]}/notebooks/conda_environments/Create_OSL_Conda_Environments.ipynb"> Create_OSL_Conda_Environments.ipynb </a> to create it.</text>'))
    display(Markdown('<text style=color:red>Note that you must restart your server after creating a new environment before it is usable by notebooks.</text>'))
In this notebook we will use the following scientific libraries:
Our first step is to import gdal and the other packages we need:
import copy
from datetime import datetime, timedelta
import ipywidgets as widgets
from itertools import chain
import json
from pathlib import Path
import re
import requests
import shutil
from tqdm.auto import tqdm
from typing import Union
import numpy as np
from osgeo import gdal
gdal.UseExceptions()
import pandas
from rasterio.warp import transform_bounds
import opensarlab_lib as asfn
from hyp3_sdk import Batch, HyP3
from IPython.display import clear_output
%matplotlib widget
This notebook assumes that you are accessing an InSAR time series created with the Alaska Satellite Facility's value-added product system, HyP3, available via ASF Data Search (Vertex). HyP3 is an ASF service used to prototype value-added products and provide them to users to collect feedback.
You can access HyP3 on-demand products from your HyP3 account or from publicly available, pre-processed SARVIEWS Event data: https://sarviews-hazards.alaska.edu/
Before downloading anything, create an analysis directory to hold your data.
Select or create a working directory for the analysis:
while True:
    print(f"Current working directory: {Path.cwd()}")
    data_dir = Path(input("\nPlease enter the name of a directory in which to store your data for this analysis."))
    if data_dir == Path('.'):
        continue
    if data_dir.is_dir():
        contents = data_dir.glob('*')
        if len(list(contents)) > 0:
            choice = asfn.handle_old_data(data_dir)
            if choice == 1:
                if data_dir.exists():
                    shutil.rmtree(data_dir)
                data_dir.mkdir()
                break
            elif choice == 2:
                break
            else:
                clear_output()
                continue
        else:
            break
    else:
        data_dir.mkdir()
        break
Define absolute path to analysis directory:
analysis_directory = Path.cwd()/data_dir
print(f"analysis_directory: {analysis_directory}")
def dates_from_product_name(product_name):
    regex = "[0-9]{8}T[0-9]{6}_[0-9]{8}T[0-9]{6}"
    results = re.search(regex, product_name)
    if results:
        return results.group(0)
    else:
        return None
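As a quick sanity check, the regex above can be exercised against a hypothetical HyP3 product name (the name below is illustrative, not a real product):

```python
import re

# Hypothetical HyP3 InSAR product name; the reference/secondary date pair is embedded in it
name = "S1AA_20230101T120000_20230113T120000_VVP012_INT80_G_ueF_1A2B"
match = re.search("[0-9]{8}T[0-9]{6}_[0-9]{8}T[0-9]{6}", name)
print(match.group(0))  # 20230101T120000_20230113T120000
```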
Create a HyP3 object and authenticate:
hyp3 = HyP3(prompt=True)
List projects containing active InSAR products and select one:
Your HyP3 InSAR project should include DEMs, which are available as an option when submitting a HyP3 project.
my_hyp3_info = hyp3.my_info()
active_projects = dict()
print("Checking all HyP3 projects for current INSAR_GAMMA jobs")
for project in tqdm(my_hyp3_info['job_names']):
    batch = hyp3.find_jobs(name=project, job_type='INSAR_GAMMA').filter_jobs(running=False, include_expired=False)
    if len(batch) > 0:
        active_projects.update({batch.jobs[0].name: batch})

if len(active_projects) > 0:
    display(Markdown("<text style='color:darkred;'>Note: After selecting a project, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
    display(Markdown("<text style='color:darkred;'>Otherwise, you will rerun this code cell.</text>"))
    print('\nSelect a Project:')
    project_select = asfn.select_parameter(active_projects.keys())
    display(project_select)
else:
    print("Found no active projects containing InSAR products")
Select a date range of products to download:
jobs = active_projects[project_select.value]
display(Markdown("<text style='color:darkred;'>Note: After selecting a date range, you should select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
display(Markdown("<text style='color:darkred;'>Otherwise, you may simply rerun this code cell.</text>"))
print('\nSelect a Date Range:')
dates = asfn.get_job_dates(jobs)
date_picker = asfn.gui_date_picker(dates)
display(date_picker)
Save the selected date range and remove products falling outside of it:
date_range = asfn.get_slider_vals(date_picker)
date_range[0] = date_range[0].date()
date_range[1] = date_range[1].date()
print(f"Date Range: {str(date_range[0])} to {str(date_range[1])}")
jobs = asfn.filter_jobs_by_date(jobs, date_range)
Gather the available paths and orbit directions for the remaining products:
display(Markdown("<text style='color:darkred;'><text style='font-size:150%;'>This may take some time for projects containing many jobs...</text></text>"))
asfn.set_paths_orbits(jobs)
paths = set()
orbit_directions = set()
for p in jobs:
    paths.add(p.path)
    orbit_directions.add(p.orbit_direction)
display(Markdown(f"<text style=color:blue><text style='font-size:175%;'>Done.</text></text>"))
Select a path:
This notebook does not currently support merging InSAR products from multiple paths.
display(Markdown("<text style='color:darkred;'>Note: After selecting a path, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
display(Markdown("<text style='color:darkred;'>Otherwise, you will simply rerun this code cell.</text>"))
print('\nSelect a Path:')
path_choice = asfn.select_parameter(paths)
display(path_choice)
Save the selected flight path/s:
flight_path = path_choice.value
if flight_path:
    print(f"Flight Path: {flight_path}")
else:
    print("WARNING: You must select a flight path in the previous cell, then rerun this cell.")
Select an orbit direction:
if len(orbit_directions) > 1:
    display(Markdown("<text style='color:red;'>Note: After selecting a flight direction, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
    display(Markdown("<text style='color:red;'>Otherwise, you will simply rerun this code cell.</text>"))

print('\nSelect a Flight Direction:')
direction_choice = asfn.select_parameter(orbit_directions, 'Direction:')
display(direction_choice)
Save the selected orbit direction:
direction = direction_choice.value
print(f"Orbit Direction: {direction}")
Filter jobs by path and orbit direction:
jobs = asfn.filter_jobs_by_path(jobs, [flight_path])
jobs = asfn.filter_jobs_by_orbit(jobs, direction)
print(f"There are {len(jobs)} products to download.")
Download the products, unzip them into a directory named after the product type, and delete the zip files:
print(f"\nProject: {jobs.jobs[0].name}")
project_zips = jobs.download_files(analysis_directory)
for z in project_zips:
    asfn.asf_unzip(str(analysis_directory), str(z))
    z.unlink()
Verify the presence of the files required for a MintPy time series:
- azimuth angle maps (lv_phi): included if the "Include Look Vectors" option is selected
- DEMs: included if the "Include DEM" option is selected
- incidence angle maps (lv_theta): included if the "Include Look Vectors" option is selected
All of the above mentioned files will be included in an InSAR project if "Set MintPy Options" is selected when adding InSAR jobs to a project in ASF Data Search (Vertex).
dems = list(analysis_directory.glob('*/*dem*.tif'))
az_angle_maps = list(analysis_directory.glob('*/*lv_phi*.tif'))
inc_angle_maps = list(analysis_directory.glob('*/*lv_theta*.tif'))
if len(dems) > 0:
    print("Success: Found at least 1 DEM.")
else:
    raise FileNotFoundError("Failed to find at least 1 DEM."
                            "\nYou will not be able to successfully run a MintPy time series unless you reorder your HyP3 project "
                            "with DEMs or provide one from another source.")

if len(az_angle_maps) > 0:
    print("Success: Found at least 1 Azimuth Angle Map.")
else:
    raise FileNotFoundError("Failed to find at least 1 Azimuth Angle Map."
                            "\nYou will not be able to successfully run a MintPy time series unless you reorder your HyP3 project "
                            "with the 'Include Look Vectors' option selected.")

if len(inc_angle_maps) > 0:
    print("Success: Found at least 1 Incidence Angle Map.")
else:
    raise FileNotFoundError("Failed to find at least 1 Incidence Angle Map."
                            "\nYou will not be able to successfully run a MintPy time series unless you reorder your HyP3 project "
                            "with the 'Include Inc. Angle Map' option selected.")
Delete unneeded files:
for pattern in ["xml", "png", "kmz", "md.txt"]:
    unneeded_files = analysis_directory.glob(f"*/*.{pattern}")
    for file in unneeded_files:
        file.unlink()
Project all GeoTiffs to the predominant UTM projection:
from typing import List, Union, Dict
from collections import Counter
def get_projection(img_path: Union[Path, str]) -> Union[str, None]:
    """
    Takes: a string or posix path to a product in a UTM projection

    Returns: the EPSG code (as a string) or None if none found
    """
    img_path = str(img_path)
    try:
        info = gdal.Info(img_path, format='json')['coordinateSystem']['wkt']
    except KeyError:
        return None
    except TypeError:
        raise FileNotFoundError
    # Match the trailing EPSG identifier of the WKT string, e.g. ID["EPSG",32606]]
    regex = r'ID\["EPSG",[0-9]{4,5}\]\]$'
    results = re.search(regex, info)
    if results:
        return results.group(0).split(',')[1][:-2]
    else:
        return None


def get_projections(tiff_paths: List[Union[Path, str]]) -> Dict:
    """
    Takes: List of string or posix paths to geotiffs

    Returns: Dictionary key: epsg, value: number of tiffs in that epsg
    """
    epsgs = [get_projection(p) for p in tiff_paths]
    return dict(Counter(epsgs))


def get_res(tiff):
    """Returns the pixel resolution of a GeoTiff."""
    f = gdal.Open(str(tiff))
    return f.GetGeoTransform()[1]


def get_no_data_val(pth):
    """Returns a raster's no-data value (np.nan for float rasters lacking one, 0 for integer types)."""
    f = gdal.Open(str(pth))
    band = f.GetRasterBand(1)
    if band.DataType > 5:  # GDAL data types above 5 are floating point or complex
        no_data_val = band.GetNoDataValue()
        return np.nan if no_data_val is None else no_data_val
    else:
        return 0
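The EPSG extraction in get_projection can be illustrated in isolation. This sketch assumes a hypothetical WKT2 tail for UTM zone 6N (EPSG:32606); only the final identifier matters to the regex:

```python
import re

# Hypothetical tail of a WKT2 coordinate-system string
wkt = 'ID["EPSG",32606]]'
match = re.search(r'ID\["EPSG",[0-9]{4,5}\]\]$', wkt)
# group(0) is 'ID["EPSG",32606]]'; split on ',' and trim the trailing ']]'
epsg = match.group(0).split(',')[1][:-2]
print(epsg)  # 32606
```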
fnames = list(analysis_directory.glob('*/*.tif'))
fnames.sort()
epsgs = get_projections(fnames)
predominant_epsg = None if len(epsgs) == 1 else max(epsgs, key=epsgs.get)
if predominant_epsg:
    for pth in fnames:
        src_SRS = get_projection(str(pth))
        if src_SRS != predominant_epsg:
            res = get_res(pth)
            no_data_val = get_no_data_val(pth)
            temp = pth.parent/f"temp_{pth.stem}.tif"
            pth.rename(temp)
            warp_options = {
                "dstSRS": f"EPSG:{predominant_epsg}", "srcSRS": f"EPSG:{src_SRS}",
                "targetAlignedPixels": True,
                "xRes": res, "yRes": res,
                "dstNodata": no_data_val
            }
            gdal.Warp(str(pth), str(temp), **warp_options)
            temp.unlink()
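The "predominant EPSG" selection used above can be seen on a toy example. The EPSG codes here are hypothetical, standing in for a stack that straddles a UTM zone boundary:

```python
from collections import Counter

# Hypothetical EPSG codes gathered from a stack of GeoTiffs
epsgs = dict(Counter(['32606', '32606', '32607']))

# If every tiff shares one projection, no reprojection is needed (None);
# otherwise pick the projection held by the most tiffs
predominant_epsg = None if len(epsgs) == 1 else max(epsgs, key=epsgs.get)
print(predominant_epsg)  # 32606
```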
Determine the maximum and common extents of the stack and plot an Area-of-Interest Selector:
amp = list(analysis_directory.glob(f'*/*_amp*.tif'))
max_extents = asfn.get_max_extents(amp)
xmin, ymin, xmax, ymax = transform_bounds(int(asfn.get_projection(str(amp[0]))), 3857, *max_extents)
max_extents = [xmin, ymin, xmax, ymax]
common_extents = asfn.get_common_coverage_extents(amp)
xmin, ymin, xmax, ymax = transform_bounds(int(asfn.get_projection(str(amp[0]))), 3857, *common_extents)
common_extents = [xmin, ymin, xmax, ymax]
print(max_extents)
print(common_extents)
aoi = asfn.AOI_Selector(max_extents, common_extents, figsize=(10, 8))
Convert the subset corner coordinates from Web-Mercator back to the input data's EPSG:
try:
    xmin, ymin, xmax, ymax = transform_bounds(3857,
                                              int(asfn.get_projection(str(amp[0]))),
                                              *[aoi.x1, aoi.y1, aoi.x2, aoi.y2])
    ul = [xmin, ymax]
    lr = [xmax, ymin]
    print("AOI Corner Coordinates:")
    print(f"upper left corner: {ul}")
    print(f"lower right corner: {lr}")
except TypeError:
    print('TypeError')
    display(Markdown('<text style=color:red>This error may occur if an AOI was not selected.</text>'))
    display(Markdown('<text style=color:red>Note that the square tool icon in the AOI selector menu is <b>NOT</b> the selection tool. It is the zoom tool.</text>'))
Crop the stack to the AOI and reproject to lat-lon:
fnames = list(analysis_directory.glob('*/*.tif'))
fnames.sort()
for fname in fnames:
    clip = fname.parent/f"{fname.stem}_clip.tif"
    gdal.Translate(destName=str(clip), srcDS=str(fname), projWin=[ul[0], ul[1], lr[0], lr[1]])
    gdal.Warp(str(clip), str(clip), dstSRS='EPSG:4326', dstNodata=0)
    fname.unlink()
Remove any subset scenes containing no data:
fnames = list(analysis_directory.glob('*/*.tif'))
fnames = [str(f) for f in fnames]
fnames.sort()
removed = []
for f in fnames:
    if "dem" not in str(f):
        raster = gdal.Open(f)
        if raster:
            band = raster.ReadAsArray()
            if np.count_nonzero(band) < 1:
                Path(f).unlink()
                removed.append(f)

if len(removed) == 0:
    print("No GeoTiffs were removed")
else:
    print(f"{len(removed)} GeoTiffs removed:")
    for f in removed:
        print(f)
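The empty-scene test above hinges on np.count_nonzero. A minimal sketch with made-up arrays shows the decision it makes for a fully empty subset versus one containing data:

```python
import numpy as np

# An all-zero subset (no data survived clipping) vs. one containing real data
empty = np.zeros((3, 3))
partial = np.array([[0, 0], [0, 7]])
print(np.count_nonzero(empty) < 1)    # True: scene would be removed
print(np.count_nonzero(partial) < 1)  # False: scene is kept
```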
Run the code cell below for a link to the MintPy_Time_Series_From_Prepared_Data_Stack notebook:
# from pathlib import Path
from IPython.display import display, HTML
current = Path.cwd()
abs_path = current/"MintPy_Time_Series_From_Prepared_Data_Stack.ipynb"
relative_path = abs_path.relative_to(current)
link_t = f"Open <a href='{relative_path}'>MintPy_Time_Series_From_Prepared_Data_Stack.ipynb</a> to run an InSAR time-series analysis on your data using MintPy"
html = HTML(link_t)
display(html)
Prepare_HyP3_InSAR_Stack_for_MintPy.ipynb - Version 1.2.5 - April 2024
Version Changes