The Planetary Computer's /data API provides an easy way to visualize and perform basic analytics on data hosted by the Planetary Computer, without having to deploy your own compute in Azure.
One of the core principles of cloud-native geospatial is putting the compute next to the data. The Planetary Computer stores its data in Azure Blob Storage in the West Europe region, so that would mean using one of Azure's many compute services to set up your own compute in West Europe. That's why we set up the Planetary Computer Hub: a very convenient way to get started with cloud-native geospatial from your own browser.
For some use cases, however, logging into the Hub and starting a Python kernel isn't appropriate (displaying images on a webpage, for example). The Hub is essentially a manual and interactive form of compute, and involves the (costly) process of starting a Jupyter server on a Virtual Machine in Azure. Even if there were a hot virtual machine or Azure Function ready and waiting, eliminating the startup cost, the hassle of deployment might not be worth it for the outcome (displaying an image, again).
That's why the Planetary Computer provides a /data API: to efficiently and conveniently serve these kinds of "simple" use cases. The /data API, along with our STAC API, is what powers our Explorer.
The reference documentation for the data API is at https://planetarycomputer.microsoft.com/api/data/v1/docs. This notebook gives a brief introduction and some examples.
import requests
import pystac
import folium
import shapely.geometry
from IPython.display import Image
The simplest use of the /data API looks similar to accessing a raw asset from Blob Storage. Many of our STAC items have a rendered_preview asset that's actually dynamically served by our /data API.
r = requests.get(
"https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items/S2B_MSIL2A_20220606T080609_R078_T36PUR_20220606T193343" # noqa: E501
)
item = pystac.Item.from_dict(r.json())
asset = item.assets["rendered_preview"]
print(asset.href)
https://planetarycomputer.microsoft.com/api/data/v1/item/preview.png?collection=sentinel-2-l2a&item=S2B_MSIL2A_20220606T080609_R078_T36PUR_20220606T193343&assets=visual&asset_bidx=visual%7C1%2C2%2C3&nodata=0&format=png
Notice the /api/data/v1 in the asset HREF, which indicates that this targets the /data API rather than something like .blob.core.windows.net, which targets Azure Blob Storage. A request to that URL will trigger a TiTiler server to read raw data from Blob Storage, combine and transform it (according to the parameters in the URL, which you could customize), and return you the PNG for display.
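Because the rendering parameters live in the query string, you can tweak them locally before requesting the image. Here's a sketch that rewrites the preview URL with Python's standard library (no request is made; the jpeg format value is an assumption about what the tiler accepts — check the /data reference docs for the full parameter list):

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

preview_href = (
    "https://planetarycomputer.microsoft.com/api/data/v1/item/preview.png"
    "?collection=sentinel-2-l2a"
    "&item=S2B_MSIL2A_20220606T080609_R078_T36PUR_20220606T193343"
    "&assets=visual&asset_bidx=visual%7C1%2C2%2C3&nodata=0&format=png"
)

# Split the href and parse its query string into a dict of lists.
parts = urlsplit(preview_href)
params = parse_qs(parts.query)

# Ask the tiler for a JPEG instead of a PNG (assumed to be supported).
params["format"] = ["jpeg"]

custom_href = urlunsplit(parts._replace(query=urlencode(params, doseq=True)))
print(custom_href)
```

Requesting custom_href would then return the same scene re-rendered with the new parameters.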
Image(url=item.assets["rendered_preview"].href)
So we're able to display an asset using a client that only understands HTTP and JSON.
The tilejson asset is similar to rendered_preview, but is useful for putting the asset on a map.
print(item.assets["tilejson"].href)
https://planetarycomputer.microsoft.com/api/data/v1/item/tilejson.json?collection=sentinel-2-l2a&item=S2B_MSIL2A_20220606T080609_R078_T36PUR_20220606T193343&assets=visual&asset_bidx=visual%7C1%2C2%2C3&nodata=0&format=png
Making a request to that endpoint returns an object with a tiles URL, which has everything filled in for this specific item.
r = requests.get(item.assets["tilejson"].href).json()
r
{'tilejson': '2.2.0', 'version': '1.0.0', 'scheme': 'xyz', 'tiles': ['https://planetarycomputer.microsoft.com/api/data/v1/item/tiles/WebMercatorQuad/{z}/{x}/{y}@1x?collection=sentinel-2-l2a&item=S2B_MSIL2A_20220606T080609_R078_T36PUR_20220606T193343&assets=visual&asset_bidx=visual%7C1%2C2%2C3&nodata=0&format=png'], 'minzoom': 0, 'maxzoom': 24, 'bounds': [31.17569761, 8.95381176, 32.17948101, 9.95039568], 'center': [31.677589310000002, 9.45210372, 0]}
tiles = r["tiles"][0]
That can be handed to any system that understands TileJSON URLs, like folium or ipyleaflet. Panning and zooming around the map will trigger requests to load new data.
center = ((item.bbox[1] + item.bbox[3]) / 2, (item.bbox[0] + item.bbox[2]) / 2)
m = folium.Map(
location=center,
zoom_start=11,
tiles="https://server.arcgisonline.com/ArcGIS/rest/services/World_Shaded_Relief/MapServer/tile/{z}/{y}/{x}", # noqa: E501
attr="Basemap © Esri — Source: Esri",
)
folium.TileLayer(tiles=tiles, attr="Planetary Computer").add_to(m)
m
Thus far, we've worked with just a single asset. The /data API also supports combining multiple assets into a single mosaic by registering a STAC API search. You provide a search defining the space, time, and other properties to include in the results, and the /data API will combine the results.
We'll define the area of interest as a GeoJSON polygon, and we'll use pystac-client to construct the search parameters.
import pystac_client
catalog = pystac_client.Client.open(
"https://planetarycomputer.microsoft.com/api/stac/v1"
)
aoi = {
"type": "Polygon",
"coordinates": [
[
[29.036865234375, 7.857940257224196],
[31.4813232421875, 7.857940257224196],
[31.4813232421875, 10.055402736564236],
[29.036865234375, 10.055402736564236],
[29.036865234375, 7.857940257224196],
]
],
}
collection = "sentinel-2-l2a"
query = {"eo:cloud_cover": {"lt": 10}}  # numeric comparison, not a string
search = catalog.search(intersects=aoi, collections=collection, query=query)
We can register this search with a POST request to the /data/v1/mosaic/register endpoint.
r_register = requests.post(
"https://planetarycomputer.microsoft.com/api/data/v1/mosaic/register",
json=search.get_parameters(),
)
registered = r_register.json()
registered
{'searchid': '3c377cdaf6bddee4f5379ee6707505fd', 'links': [{'rel': 'metadata', 'type': 'application/json', 'href': 'https://planetarycomputer.microsoft.com/api/data/v1/mosaic/3c377cdaf6bddee4f5379ee6707505fd/info'}, {'rel': 'tilejson', 'type': 'application/json', 'href': 'https://planetarycomputer.microsoft.com/api/data/v1/mosaic/3c377cdaf6bddee4f5379ee6707505fd/tilejson.json'}]}
That returns an object with a couple of links. We're interested in the /tilejson.json link, to visualize the results on a map.
# Pick the tilejson link by its "rel" rather than relying on its position
tilejson_url = next(
    link["href"] for link in registered["links"] if link["rel"] == "tilejson"
)
In addition to that tilejson_url, we need to provide a couple of other things. First, the collection ID, which we already have. Second, we need to tell the tiler how to convert the raw data to an image. Several libraries are involved here, including TiTiler, rio-tiler, and rio-color. There's a ton of flexibility here, but to keep things as simple as possible, we'll use the /data/v1/mosaic/info endpoint to get some good defaults that were set by the Planetary Computer team.
mosaic_info = requests.get(
"https://planetarycomputer.microsoft.com/api/data/v1/mosaic/info",
params=dict(collection=item.collection_id),
).json()
render_config = mosaic_info["renderOptions"][0]
render_config
{'name': 'Natural color', 'description': 'True color composite of visible bands (B04, B03, B02)', 'type': 'raster-tile', 'options': 'assets=B04&assets=B03&assets=B02&nodata=0&color_formula=Gamma RGB 3.2 Saturation 0.8 Sigmoidal RGB 25 0.35', 'vectorOptions': None, 'minZoom': 9, 'legend': None, 'conditions': None}
We need to modify that slightly before it's ready to go to the tilejson endpoint.
import itertools


def key(s):
    return s.split("=")[0]


# Group repeated keys (assets appears three times) into lists of values.
# itertools.groupby only groups *adjacent* entries, which works here because
# repeated keys are adjacent in the options string.
params = {
    k: [x.split("=", 1)[1] for x in v]
    for k, v in itertools.groupby(render_config["options"].split("&"), key=key)
}
params["collection"] = item.collection_id
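Since the options value is itself a query string, the standard library's urllib.parse.parse_qs does the same grouping with less code. An equivalent sketch, using the options string shown above:

```python
from urllib.parse import parse_qs

options = (
    "assets=B04&assets=B03&assets=B02&nodata=0"
    "&color_formula=Gamma RGB 3.2 Saturation 0.8 Sigmoidal RGB 25 0.35"
)

# parse_qs groups repeated keys (like assets) into lists automatically.
params = parse_qs(options)
params["collection"] = "sentinel-2-l2a"
print(params["assets"])  # → ['B04', 'B03', 'B02']
```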
Finally, we can get our full tilejson URL.
tiles = requests.get(tilejson_url, params=params).json()["tiles"][0]
tiles
'https://planetarycomputer.microsoft.com/api/data/v1/mosaic/tiles/3c377cdaf6bddee4f5379ee6707505fd/WebMercatorQuad/{z}/{x}/{y}@1x?assets=B04&assets=B03&assets=B02&nodata=0&color_formula=Gamma+RGB+3.2+Saturation+0.8+Sigmoidal+RGB+25+0.35&collection=sentinel-2-l2a'
This URL can be provided to folium or ipyleaflet.
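The tiles value is a URL template: a map library fills in {z}/{x}/{y} for each tile it needs as you pan and zoom. If you want to fetch a single tile yourself, the standard XYZ/Web Mercator tiling math gives the indices for any longitude/latitude. A self-contained sketch (the sample coordinates are near the center of our area of interest):

```python
import math


def lonlat_to_tile(lon, lat, z):
    """Return the standard XYZ (Web Mercator) tile index containing lon/lat."""
    n = 2**z
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y


x, y = lonlat_to_tile(30.26, 8.96, 9)
print(x, y)  # → 299 243
# Substituting into the template from above would fetch one PNG tile:
# requests.get(tiles.format(z=9, x=x, y=y))
```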
center = shapely.geometry.shape(aoi).centroid
m = folium.Map(
location=(center.y, center.x),
zoom_start=9,
tiles="https://server.arcgisonline.com/ArcGIS/rest/services/World_Shaded_Relief/MapServer/tile/{z}/{y}/{x}", # noqa: E501
attr="Basemap © Esri — Source: Esri",
)
folium.TileLayer(
tiles=tiles, attr="Planetary Computer", min_zoom=render_config["minZoom"]
).add_to(m)
folium.GeoJson(data=aoi).add_to(m)
m
This is essentially how the Planetary Computer Explorer works. The filter is generated based on your browser's window and whatever filters you've toggled. Based on that user input, it generates the CQL2-JSON query, registers a search, builds a TileJSON request (using any visualization options you've set), and displays the result on the map.
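As a rough illustration, a CQL2-JSON body for the cloud-cover search above might look like the following. The exact shape the Explorer sends is an assumption on our part; see the /data API reference for the accepted search parameters.

```python
# Hypothetical CQL2-JSON search body, analogous to the query we registered
# earlier with pystac-client. Registering it would be the same POST:
# requests.post(".../api/data/v1/mosaic/register", json=cql2_search)
cql2_search = {
    "filter-lang": "cql2-json",
    "filter": {
        "op": "and",
        "args": [
            {"op": "=", "args": [{"property": "collection"}, "sentinel-2-l2a"]},
            {"op": "<", "args": [{"property": "eo:cloud_cover"}, 10]},
        ],
    },
}
```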
This was a brief introduction to the /data API. For more, see the reference documentation. Feel free to share your creations using the /data API on the Planetary Computer discussions board.