Preparing a HyP3 InSAR Stack for MintPy

Author: Alex Lewandowski; University of Alaska Fairbanks

Based on prep_hyp3_for_mintpy.ipynb by Jiang Zhu; University of Alaska Fairbanks

Important Note about JupyterHub: Your JupyterHub server will automatically shut down when left idle for more than 1 hour. Your notebooks will not be lost, but you will have to restart their kernels and re-run them from the beginning. You will not be able to seamlessly continue running a partially-run notebook.

In [ ]:
import url_widget as url_w
notebookUrl = url_w.URLWidget()
display(notebookUrl)
In [ ]:
from IPython.display import Markdown
from IPython.display import display

notebookUrl = notebookUrl.value
user = !echo $JUPYTERHUB_USER
env = !echo $CONDA_PREFIX
if env[0] == '':
    env[0] = 'Python 3 (base)'
if env[0] != '/home/jovyan/.local/envs/insar_analysis':
    display(Markdown(f'<text style=color:red><strong>WARNING:</strong></text>'))
    display(Markdown(f'<text style=color:red>This notebook should be run using the "insar_analysis" conda environment.</text>'))
    display(Markdown(f'<text style=color:red>It is currently using the "{env[0].split("/")[-1]}" environment.</text>'))
    display(Markdown(f'<text style=color:red>Select the "insar_analysis" from the "Change Kernel" submenu of the "Kernel" menu.</text>'))
    display(Markdown(f'<text style=color:red>If the "insar_analysis" environment is not present, use <a href="{notebookUrl.split("/user")[0]}/user/{user[0]}/notebooks/conda_environments/Create_OSL_Conda_Environments.ipynb"> Create_OSL_Conda_Environments.ipynb </a> to create it.</text>'))
    display(Markdown(f'<text style=color:red>Note that you must restart your server after creating a new environment before it is usable by notebooks.</text>'))

0. Importing Relevant Python Packages

In this notebook we will use the following scientific libraries:

  • GDAL is a software library for reading and writing raster and vector geospatial data formats. It includes a collection of programs tailored for geospatial data processing. Most modern GIS systems (such as ArcGIS or QGIS) use GDAL in the background.

Our first step is to import gdal and the other packages we need:

In [ ]:
import copy
from datetime import datetime, timedelta
import ipywidgets as widgets
from itertools import chain
import json
from pathlib import Path
import re
import requests
import shutil
import sys
from tqdm import tqdm
from typing import Union

from IPython.display import clear_output

import numpy as np
from osgeo import gdal
import pyproj
from pyproj import Transformer
import pandas

import opensarlab_lib as asfn

from hyp3_sdk import Batch, HyP3

%matplotlib widget
In [ ]:
# The code in this cell is taken directly from the develop branch of asf_tools, 
# https://github.com/ASFHyP3/asf-tools

# Delete this cell, install, and import asf-tools when v0.3.0 is released:
# https://github.com/ASFHyP3/asf-tools/pull/90


from osgeo import gdal

gdal.UseExceptions()


class GDALConfigManager:
    """Context manager for setting GDAL config options temporarily"""
    def __init__(self, **options):
        """
        Args:
            **options: GDAL Config `option=value` keyword arguments.
        """
        self.options = options.copy()
        self._previous_options = {}

    def __enter__(self):
        for key in self.options:
            self._previous_options[key] = gdal.GetConfigOption(key)

        for key, value in self.options.items():
            gdal.SetConfigOption(key, value)

    def __exit__(self, exc_type, exc_val, exc_tb):
        for key, value in self._previous_options.items():
            gdal.SetConfigOption(key, value)
            
            
from pathlib import Path
from typing import Iterator, List, Union

from osgeo import ogr

ogr.UseExceptions()


def get_features(vector_path: Union[str, Path]) -> List[ogr.Feature]:
    ds = ogr.Open(str(vector_path))
    layer = ds.GetLayer()
    return [feature for feature in layer]


def get_property_values_for_intersecting_features(geometry: ogr.Geometry, features: Iterator) -> bool:
    for feature in features:
        if feature.GetGeometryRef().Intersects(geometry):
            return True
    return False


def intersecting_feature_properties(geometry: ogr.Geometry, features: Iterator,
                                    feature_property: str) -> List[str]:
    property_values = []
    for feature in features:
        if feature.GetGeometryRef().Intersects(geometry):
            property_values.append(feature.GetField(feature_property))
    return property_values


"""Prepare a Copernicus GLO-30 DEM virtual raster (VRT) covering a given geometry"""
from pathlib import Path
from typing import Union

from osgeo import gdal, ogr
from shapely.geometry.base import BaseGeometry

DEM_GEOJSON = '/vsicurl/https://asf-dem-west.s3.amazonaws.com/v2/cop30.geojson'

gdal.UseExceptions()
ogr.UseExceptions()


def prepare_dem_vrt(vrt: Union[str, Path], geometry: Union[ogr.Geometry, BaseGeometry]):
    """Create a DEM mosaic VRT covering a given geometry
    The DEM mosaic is assembled from the Copernicus GLO-30 DEM tiles that intersect the geometry.
    Note: `asf_tools` does not currently support geometries that cross the antimeridian.
    Args:
        vrt: Path for the output VRT file
        geometry: Geometry in EPSG:4326 (lon/lat) projection for which to prepare a DEM mosaic
    """
    with GDALConfigManager(GDAL_DISABLE_READDIR_ON_OPEN='EMPTY_DIR'):
        if isinstance(geometry, BaseGeometry):
            geometry = ogr.CreateGeometryFromWkb(geometry.wkb)

        min_lon, max_lon, _, _ = geometry.GetEnvelope()
        if min_lon < -160. and max_lon > 160.:
            raise ValueError(f'asf_tools does not currently support geometries that cross the antimeridian: {geometry}')

        tile_features = get_features(DEM_GEOJSON)
        if not get_property_values_for_intersecting_features(geometry, tile_features):
            raise ValueError(f'Copernicus GLO-30 DEM does not intersect this geometry: {geometry}')

        dem_file_paths = intersecting_feature_properties(geometry, tile_features, 'file_path')

        gdal.BuildVRT(str(vrt), dem_file_paths)
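`GDALConfigManager` follows the standard save/restore context-manager pattern: record the current values on entry, set the overrides, and put the old values back on exit. The same idea, sketched with environment variables instead of GDAL config options (a hypothetical analogue for illustration, not part of asf_tools):

```python
import os
from contextlib import contextmanager

@contextmanager
def temp_environ(**overrides):
    """Temporarily set environment variables, restoring prior values on exit."""
    previous = {key: os.environ.get(key) for key in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for key, value in previous.items():
            if value is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = value

with temp_environ(GDAL_DISABLE_READDIR_ON_OPEN='EMPTY_DIR'):
    print(os.environ['GDAL_DISABLE_READDIR_ON_OPEN'])  # EMPTY_DIR
# outside the block the variable is restored to whatever it was before
```

The `try`/`finally` (mirrored by `__exit__` in the class version) guarantees the restore happens even if the body raises.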

1. Load Your Own Data Stack Into the Notebook

This notebook assumes that you are accessing an InSAR time series created using the Alaska Satellite Facility's value-added product system, HyP3, available via ASF Data Search (Vertex). HyP3 is an ASF service used to prototype value-added products and provide them to users to collect feedback.

You can access HyP3 on-demand products from your HyP3 account or from publicly available, pre-processed SARVIEWS event data (https://sarviews-hazards.alaska.edu/).

Before downloading anything, create an analysis directory to hold your data.

Select or create a working directory for the analysis:

In [ ]:
while True:
    print(f"Current working directory: {Path.cwd()}")
    data_dir = Path(input(f"\nPlease enter the name of a directory in which to store your data for this analysis."))
    if data_dir == Path('.'):
        continue
    if data_dir.is_dir():
        contents = data_dir.glob('*')
        if len(list(contents)) > 0:
            choice = asfn.handle_old_data(data_dir)
            if choice == 1:
                if data_dir.exists():
                    shutil.rmtree(data_dir)
                data_dir.mkdir()
                break
            elif choice == 2:
                break
            else:
                clear_output()
                continue
        else:
            break
    else:
        data_dir.mkdir()
        break
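The loop above leans on `opensarlab_lib` widgets for the reuse-or-delete prompt; its core create-or-reuse directory logic can be sketched standalone (`fresh_dir` is a hypothetical helper, not part of the notebook's libraries):

```python
import shutil
import tempfile
from pathlib import Path

def fresh_dir(path: Path, overwrite: bool = False) -> Path:
    """Create `path`, optionally clearing any existing contents first."""
    if path.is_dir() and overwrite:
        shutil.rmtree(path)
    path.mkdir(parents=True, exist_ok=True)
    return path

with tempfile.TemporaryDirectory() as tmp:
    d = fresh_dir(Path(tmp) / 'analysis', overwrite=True)
    (d / 'old.txt').write_text('stale')
    d = fresh_dir(d, overwrite=True)  # wipes and recreates
    print(list(d.iterdir()))          # []
```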

Define absolute path to analysis directory:

In [ ]:
analysis_directory = Path.cwd() / data_dir
print(f"analysis_directory: {analysis_directory}")

Decide whether to pull data from HyP3 or SARVIEWS:

In [ ]:
print("Are you downloading a HyP3 project or SARVIEWS event?")
data_source = asfn.select_parameter(['HyP3', 'SARVIEWS'], '')
display(data_source)

Create some data source flags for use in the notebook:

In [ ]:
hyp3_data = data_source.value == 'HyP3'
sarviews_data = data_source.value == 'SARVIEWS'

Select a SARVIEWS event type (or skip ahead to the HyP3 section):

In [ ]:
if sarviews_data:
    event_type = asfn.select_parameter(['quake', 'volcano'], '')
    print("Select an event type")
    display(event_type)

Select a SARVIEWS Event:

In [ ]:
if sarviews_data:
    sarviews_api = 'https://gm3385dq6j.execute-api.us-west-2.amazonaws.com/events'
    try:
        response = requests.get(
            sarviews_api,
            params=[('output', 'json')]
        )
    except requests.exceptions.RequestException as e:
        print(e)
        sys.exit(1)
    else:
        if len(response.json()) > 0:
            json_response = response.json()
        else:
            print("SARVIEWS returned no events")
            
    events = [i for i in json_response if i['event_type'] == event_type.value]
    if event_type.value == 'quake':
        event_names = asfn.select_parameter([f"{e['description']} (depth: {e['depth']}, magnitude: {e['magnitude']})" for e in events], '')
    else:
        event_names = asfn.select_parameter([e['description'] for e in events], '')
    print(f"Select a {event_type.value} event")
    display(event_names)

Gather the paths to your event's InSAR products from the SARVIEWS s3 bucket:

In [ ]:
if sarviews_data:
    event = [e for e in json_response if e['description'] == event_names.value.split(' (')[0]][0]
    print(f"Event Details: \n{event}\n")
    event_id = event['event_id']
    
    s3_path = "s3://hyp3-event-monitoring-productbucket-1t80jdtrfscje/"
    s3_event_path = f"{s3_path}{event_id}/"
    job_dirs = !aws --no-sign-request --region us-west-2 s3 ls $s3_event_path
    jobs = !aws --no-sign-request --region us-west-2 s3 ls $s3_event_path --recursive | grep .zip
    jobs = [job.split(' ')[-1] for job in jobs if 'INT' in job]

Filter products by date range:

Some SARVIEWS events span several years and are too large to download to OpenSARlab in their entirety.

In [ ]:
def dates_from_product_name(product_name):
    regex = "[0-9]{8}T[0-9]{6}_[0-9]{8}T[0-9]{6}"
    results = re.search(regex, product_name)
    if results:
        return results.group(0)
    else:
        return None

if sarviews_data:
    dates = set()
    for j in jobs:
        job_dates = dates_from_product_name(j).split('_')
        dates.add(datetime.strptime(job_dates[0], '%Y%m%dT%H%M%S').date())
        dates.add(datetime.strptime(job_dates[1], '%Y%m%dT%H%M%S').date()) 
    dates = list(dates)
    dates.sort()
    date_picker = widgets.SelectionRangeSlider(
        options=dates,
        index=(0, len(dates) - 1),
        description='Dates',
        orientation='horizontal',
        layout={'width': '500px'},
    )

    display(date_picker)
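For reference, the regex in `dates_from_product_name` pulls the `reference_secondary` timestamp pair out of a product name. With a hypothetical name following the HyP3 INSAR_GAMMA pattern:

```python
import re
from datetime import datetime

def dates_from_product_name(product_name):
    # same regex as the cell above: two YYYYMMDDTHHMMSS timestamps joined by '_'
    match = re.search(r"[0-9]{8}T[0-9]{6}_[0-9]{8}T[0-9]{6}", product_name)
    return match.group(0) if match else None

# hypothetical product name illustrating the expected pattern
name = "S1AA_20210101T120000_20210113T120000_VVP012_INT80_G_ueF_ABCD.zip"
pair = dates_from_product_name(name)
print(pair)  # 20210101T120000_20210113T120000
ref, sec = (datetime.strptime(d, '%Y%m%dT%H%M%S').date() for d in pair.split('_'))
print(ref, sec)  # 2021-01-01 2021-01-13
```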

Check the number of jobs found in your date range:

The number may be high, which is okay.

In [ ]:
if sarviews_data:
    def in_date_range(job):
        date = datetime.strptime(asfn.date_from_product_name(job).split('T')[0], "%Y%m%d").date()
        return date_picker.value[0] <= date <= date_picker.value[1]

    filtered_jobs = [job.split(' ')[-1] for job in jobs if in_date_range(job)]
    
    print(f"Found {len(filtered_jobs)} jobs, some of which may belong to temporally disconnected stacks.\n")
    print(f"We will now sort the jobs into connected stacks and select one to download.")

Identify all temporally connected stacks in your date range:

In [ ]:
if sarviews_data:
    
    def get_start_end_stack_dates(stack):
        dates = list(chain.from_iterable((dates_from_product_name(str(s)).split('_')[0], dates_from_product_name(str(s)).split('_')[1]) for s in stack))
        dates.sort()
        return datetime.strptime(dates[0], '%Y%m%dT%H%M%S').date(), datetime.strptime(dates[-1], '%Y%m%dT%H%M%S').date()


    def get_temporal_stacks(jobs):
        stacks = dict()
        job_dates = {k: dates_from_product_name(k).split('_') for k in jobs}
        stack_name = 0
        for j in job_dates:
            if len(stacks) == 0:
                stacks.update({stack_name: [j]})
                stack_name += 1
                continue
            stacks_copy = copy.copy(stacks)
            added = False
            for s in stacks_copy:
                stack_dates = list(chain.from_iterable((dates_from_product_name(i).split('_')[0],
                                                       dates_from_product_name(i).split('_')[1]) for i in stacks[s]))
                if job_dates[j][0] in stack_dates or job_dates[j][1] in stack_dates:
                    stacks[s].append(j)
                    added = True
            if not added:
                stacks.update({stack_name: [j]})
                stack_name += 1
        return stacks
    
    stacks = get_temporal_stacks(filtered_jobs)
    print(f"Found {len(stacks)} stacks.\n")
    for s in stacks:
        start, end = get_start_end_stack_dates(stacks[s])
        print(f"TEMPORAL STACK #{s}, {len(stacks[s])} Scenes")
        print(f"Timespan: {start} - {end}")
        for j in stacks[s]:
            print(j.split('/')[-1])
        print("\n")

Select a temporally connected stack to download:

In [ ]:
if sarviews_data:
    stack = asfn.select_parameter(stacks, '')
    print("Select a temporally connected stack to download:")
    display(stack)

Confirm you have space to download your data:

In [ ]:
if sarviews_data:
    print(f"Found {len(stack.value)} jobs\n")
    print(f"Confirm that you have space to store {len(stack.value)} jobs or you will")
    print("risk overrunning your volume capacity and be locked out of OpenSARlab.\n")
    print(f"If you become locked out of OpenSARlab, please contact an administrator for help.\n\n")
    !df -h

Download and unzip the interferograms:

In [ ]:
if sarviews_data:
    for j in tqdm(stack.value):
        print(j)
        source = f"{s3_path}{j}"
        dest = Path(f"{analysis_directory}/{j.split('/')[-1]}")
        !aws --no-sign-request --region us-west-2 s3 cp $source $dest
     
        asfn.asf_unzip(str(analysis_directory), str(dest))
        dest.unlink()

Download a DEM and align it with the time series:

In [ ]:
if sarviews_data:
    fnames = list(analysis_directory.glob(f'*/*.tif'))
    fnames.sort()

    # Find the largest AOI intersecting any image
    corners = [gdal.Info(str(f), format='json')['cornerCoordinates'] for f in fnames]
    ulx = min(corner['upperLeft'][0] for corner in corners)
    uly = max(corner['upperLeft'][1] for corner in corners)
    lrx = max(corner['lowerRight'][0] for corner in corners)
    lry = min(corner['lowerRight'][1] for corner in corners)

    # corner coords to lat/lon for prepare_dem_vrt()
    resolution = int(gdal.Info(str(fnames[0]), format='json')['geoTransform'][1])
    wkt = gdal.Info(str(fnames[0]), format='json')['coordinateSystem']['wkt']
    utm = wkt.split('ID')[-1].split(',')[1][0:-2]  # parse the EPSG code from the end of the WKT
    transformer = Transformer.from_crs(f"epsg:{utm}", "epsg:4326")
    ul = transformer.transform(ulx, uly)
    lr = transformer.transform(lrx, lry)

    # create osgeo Geometry object of AOI 
    geojson = {
        'type': 'Polygon',
        'coordinates': [[
            [ul[1], ul[0]],
            [lr[1], ul[0]],
            [lr[1], lr[0]],
            [ul[1], lr[0]],
            [ul[1], ul[0]],
        ]],
    }
    geometry = ogr.CreateGeometryFromJson(json.dumps(geojson))
    
    # download a dem for the AOI
    prepare_dem_vrt(analysis_directory/"dem.vrt", geometry)

    # convert the vrt to a geotiff
    latlon_name = f"{fnames[0].parent}/latlon_dem.tif"
    utm_name = f"{fnames[0].parent/fnames[0].parent.name}_dem.tif"
    gdal.Translate(destName=latlon_name, 
                   srcDS=str(analysis_directory/"dem.vrt"), 
                   format='GTiff')
    
    # set AREA_OR_POINT to Point
    dem = gdal.Open(latlon_name, gdal.GA_Update)
    dem.SetMetadataItem('AREA_OR_POINT', 'Point')
    dem = None
    
    # Align dem pixels to an integer resolution value
    gdal.Warp(utm_name, latlon_name, 
              dstSRS=f'EPSG:{utm}', srcSRS='EPSG:4326', 
              xRes=resolution, yRes=resolution, targetAlignedPixels=True)
    Path(latlon_name).unlink()

    # Align geotiffs to an integer resolution value
    fnames = list(analysis_directory.glob('*/*.tif'))
    fnames.sort()
    for fname in fnames:
        gdal.Warp(str(fname), str(fname), 
                  dstSRS=f'EPSG:{utm}', srcSRS=f'EPSG:{utm}', 
                  xRes=resolution, yRes=resolution, targetAlignedPixels=True)

Check the InSAR metadata files and add Earth radius at nadir and Spacecraft height, if missing:

In [ ]:
if sarviews_data:
    md_txt_files = analysis_directory.glob(f"*/*.md.txt")
    for file in md_txt_files:
        file.unlink()

    metadata_files = list(analysis_directory.glob('*/*.txt'))
    for file in metadata_files:
        parameter_found = False
        with open(file, 'r') as f:
            lines = f.readlines()
            for line in lines:
                if 'Earth radius at nadir' in line:
                    parameter_found = True
                    break
        if not parameter_found:   
            with open(file, 'a') as f:
                lines = ['Spacecraft height: 693000.0\n', 'Earth radius at nadir: 6337286.638938101\n']
                f.writelines(lines)
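The check-and-append logic above can be exercised on its own with a throwaway file; the parameter values are the same constants used in the cell, while `ensure_parameters` is a hypothetical helper written for this sketch:

```python
import tempfile
from pathlib import Path

def ensure_parameters(file: Path, required: dict):
    """Append 'key: value' lines for any keys not already present in the file."""
    text = file.read_text()
    missing = [f"{k}: {v}\n" for k, v in required.items() if k not in text]
    if missing:
        with open(file, 'a') as f:
            f.writelines(missing)

with tempfile.TemporaryDirectory() as tmp:
    md = Path(tmp) / 'example.txt'
    md.write_text('Baseline: 100.0\n')
    ensure_parameters(md, {'Spacecraft height': 693000.0,
                           'Earth radius at nadir': 6337286.638938101})
    print('Earth radius at nadir' in md.read_text())  # True
```

Because the check runs before the append, re-running the cell never duplicates the lines.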

Create a HyP3 object and authenticate:

In [ ]:
if hyp3_data:
    hyp3 = HyP3(prompt=True)

List projects containing active InSAR products and select one:

Your HyP3 InSAR project should include DEMs, which are available as an option when submitting a HyP3 project.

In [ ]:
if hyp3_data:
    my_hyp3_info = hyp3.my_info()
    active_projects = dict()

    for project in my_hyp3_info['job_names']:
        batch = hyp3.find_jobs(name=project, job_type='INSAR_GAMMA').filter_jobs(running=False, include_expired=False)
        if len(batch) > 0:
            active_projects.update({batch.jobs[0].name: batch})

    if len(active_projects) > 0:
        display(Markdown("<text style='color:darkred;'>Note: After selecting a project, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
        display(Markdown("<text style='color:darkred;'>Otherwise, you will rerun this code cell.</text>"))
        print('\nSelect a Project:')
        project_select = asfn.select_parameter(active_projects)
        display(project_select)
    else:
        print("Found no active projects containing InSAR products")

Select a date range of products to download:

In [ ]:
if hyp3_data:
    jobs = project_select.value

    display(Markdown("<text style='color:darkred;'>Note: After selecting a date range, you should select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
    display(Markdown("<text style='color:darkred;'>Otherwise, you may simply rerun this code cell.</text>"))
    print('\nSelect a Date Range:')
    dates = asfn.get_job_dates(jobs)
    date_picker = asfn.gui_date_picker(dates)
    display(date_picker)

Save the selected date range and remove products falling outside of it:

In [ ]:
if hyp3_data:
    date_range = asfn.get_slider_vals(date_picker)
    date_range[0] = date_range[0].date()
    date_range[1] = date_range[1].date()
    print(f"Date Range: {str(date_range[0])} to {str(date_range[1])}")
    jobs = asfn.filter_jobs_by_date(jobs, date_range)

Gather the available paths and orbit directions for the remaining products:

In [ ]:
if hyp3_data:
    display(Markdown("<text style='color:darkred;'><text style='font-size:150%;'>This may take some time for projects containing many jobs...</text></text>"))
    asfn.set_paths_orbits(jobs)
    paths = set()
    orbit_directions = set()
    for p in jobs:
        paths.add(p.path)
        orbit_directions.add(p.orbit_direction)
    display(Markdown(f"<text style=color:blue><text style='font-size:175%;'>Done.</text></text>"))

Select a path:

This notebook does not currently support merging InSAR products in multiple paths

In [ ]:
if hyp3_data:
    display(Markdown("<text style='color:darkred;'>Note: After selecting a path, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
    display(Markdown("<text style='color:darkred;'>Otherwise, you will simply rerun this code cell.</text>"))
    print('\nSelect a Path:')
    path_choice = asfn.select_parameter(paths)
    display(path_choice)

Save the selected flight path:

In [ ]:
if hyp3_data:
    flight_path = path_choice.value
    if flight_path:
        print(f"Flight Path: {flight_path}")
    else:
        print("WARNING: You must select a flight path in the previous cell, then rerun this cell.")

Select an orbit direction:

In [ ]:
if hyp3_data:
    if len(orbit_directions) > 1:
        display(Markdown("<text style='color:red;'>Note: After selecting a flight direction, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.</text>"))
        display(Markdown("<text style='color:red;'>Otherwise, you will simply rerun this code cell.</text>"))
    print('\nSelect a Flight Direction:')
    direction_choice = asfn.select_parameter(orbit_directions, 'Direction:')
    display(direction_choice)

Save the selected orbit direction:

In [ ]:
if hyp3_data:
    direction = direction_choice.value
    print(f"Orbit Direction: {direction}")

Filter jobs by path and orbit direction:

In [ ]:
if hyp3_data:
    jobs = asfn.filter_jobs_by_path(jobs, [flight_path])
    jobs = asfn.filter_jobs_by_orbit(jobs, direction)
    print(f"There are {len(jobs)} products to download.")

Download the products, unzip them into a directory named after the product type, and delete the zip files:

In [ ]:
if hyp3_data:
    print(f"\nProject: {jobs.jobs[0].name}")
    project_zips = jobs.download_files(analysis_directory)
    for z in project_zips:
        asfn.asf_unzip(str(analysis_directory), str(z))
        z.unlink()

2. Confirm Presence of a DEM, Azimuth Angle Map, and Incidence Angle Map

  • These optional HyP3 add-on products are required by MintPy:
    • Incidence angle maps are included with HyP3 jobs when the Include Look Vectors option is selected.
    • DEMs are included with HyP3 jobs when the Include DEM option is selected.
  • This optional HyP3 add-on product is required by MintPy only if running the correct_SET (Solid Earth Tides) step:
    • Azimuth angle maps are included with HyP3 jobs when the Include Look Vectors option is selected.

All of the files mentioned above will be included in an InSAR project if Set MintPy Options is selected when adding InSAR jobs to a project in ASF Data Search (Vertex).

In [ ]:
dems = list(analysis_directory.glob('*/*dem.tif'))
az_angle_maps = list(analysis_directory.glob('*/*lv_phi.tif'))
inc_angle_maps = list(analysis_directory.glob('*/*lv_theta.tif'))

if len(dems) > 0:
    print("Success: Found at least 1 DEM.")
else:
    raise FileNotFoundError("Failed to find at least 1 DEM. \
    \nYou will not be able to successfully run a MintPy time-series unless you reorder your HyP3 project \
with DEMS or provide one from another source.")
    
if len(az_angle_maps) > 0:
    print("Success: Found at least 1 Azimuth Angle Map.")
else:
    raise FileNotFoundError("Failed to find at least 1 Azimuth Angle Map. \
    \nYou will not be able to successfully run a MintPy time-series unless your reorder your HyP3 project \
with 'Include Look Vectors' option selected.")
    
if len(inc_angle_maps) > 0:
    print("Success: Found at least 1 Incidence Angle Map.")
else:
    raise FileNotFoundError("Failed to find at least 1 Incidence Angle Map. \
    \nYou will not be able to successfully run a MintPy time-series unless your reorder your HyP3 project \
with 'Include Inc. Angle Map' option selected.")

3. Subset the Stack and Cleanup Unused Files

Delete unneeded files:

In [ ]:
for pattern in ["xml", "png", "kmz", "md.txt"]:
    unneeded_files = analysis_directory.glob(f"*/*.{pattern}")
    for file in unneeded_files:
        file.unlink()

Subset the timeseries to your AOI:

In [ ]:
amp = list(analysis_directory.glob('*/*_amp.tif'))
merge_paths = " ".join(str(pth) for pth in amp)

full_scene = analysis_directory/"full_scene.tif"
if full_scene.exists():
    full_scene.unlink()
gdal_command = f"gdal_merge.py -o {full_scene} {merge_paths}"
!{gdal_command}

image_file = f"{analysis_directory}/raster_stack.vrt"
!gdalbuildvrt -separate $image_file -overwrite $full_scene
img = gdal.Open(image_file)
rasterstack = img.ReadAsArray()

aoi = asfn.AOI_Selector(rasterstack, 7.5, 7.5)

Convert the AOI plot's x,y values to geographic coordinates:

In [ ]:
geotrans = img.GetGeoTransform()

def geolocation(x, y, geotrans):
    return [geotrans[0]+x*geotrans[1], geotrans[3]+y*geotrans[5]]

try:
    ul = geolocation(aoi.x1, aoi.y1, geotrans)
    lr = geolocation(aoi.x2, aoi.y2, geotrans)
    print(f"AOI Corner Coordinates:")
    print(f"upper left corner: {ul}")
    print(f"lower right corner: {lr}")
except TypeError:
    print('TypeError')
    display(Markdown(f'<text style=color:red>This error may occur if an AOI was not selected.</text>'))
    display(Markdown(f'<text style=color:red>Note that the square tool icon in the AOI selector menu is <b>NOT</b> the selection tool. It is the zoom tool.</text>'))
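`geolocation` maps array indices through the six-element GDAL geotransform: map_x = origin_x + x * pixel_width and map_y = origin_y + y * pixel_height (negative for north-up rasters). With a hypothetical 80 m UTM geotransform:

```python
def geolocation(x, y, geotrans):
    """Convert pixel (x, y) to map coordinates via a GDAL geotransform."""
    return [geotrans[0] + x * geotrans[1], geotrans[3] + y * geotrans[5]]

# (origin_x, pixel_width, row_rotation, origin_y, col_rotation, -pixel_height)
# hypothetical values for a north-up 80 m raster
geotrans = (400000.0, 80.0, 0.0, 6700000.0, 0.0, -80.0)
print(geolocation(0, 0, geotrans))      # [400000.0, 6700000.0]
print(geolocation(100, 250, geotrans))  # [408000.0, 6680000.0]
```

Note this simple form ignores the two rotation terms, which are zero for north-up rasters like these.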

Crop the stack to the AOI and reproject to lat-lon:

In [ ]:
fnames = list(analysis_directory.glob('*/*.tif'))
fnames.sort()

for fname in fnames:
    clip = fname.parent/f"{fname.stem}_clip.tif"
    gdal.Translate(destName=str(clip), srcDS=str(fname), projWin=[ul[0], ul[1], lr[0], lr[1]])
    gdal.Warp(str(clip), str(clip), dstSRS='EPSG:4326', dstNodata=0)
    fname.unlink()

Remove any subset scenes containing no data:

In [ ]:
fnames = list(analysis_directory.glob('*/*.tif*'))
fnames = [str(f) for f in fnames]
fnames.sort()

removed = []
for f in fnames:
    if "dem" not in f:
        raster = gdal.Open(f)
        if raster:
            band = raster.ReadAsArray()
            if np.count_nonzero(band) < 1:
                Path(f).unlink()
                removed.append(f)

if len(removed) == 0:
    print("No GeoTIFFs were removed")
else:
    print(f"{len(removed)} GeoTIFFs removed:")
    for f in removed:
        print(f)

4. Proceed to the MintPy Time-Series Notebook

Run the code cell below for a link to the MintPy_Time_Series_From_Prepared_Data_Stack notebook:


In [ ]:
from IPython.display import display, HTML

current = Path.cwd()
abs_path = Path('/home/jovyan/notebooks/SAR_Training/English/Master/MintPy_Time_Series_From_Prepared_Data_Stack.ipynb')
relative_path = abs_path.relative_to(current)    

link_t = f"Open <a href='{relative_path}'>MintPy_Time_Series_From_Prepared_Data_Stack.ipynb</a> to run an InSAR time-series analysis on your data using MintPy"
html = HTML(link_t)
display(html)

Prepare_HyP3_InSAR_Stack_for_MintPy.ipynb - Version 1.1.2 - November 2021

Version Changes

  • asf_notebook -> opensarlab_lib
  • url-widget
  • update link to MintPy_Time_Series_From_Prepared_Data_Stack.ipynb