In order to lower the uncertainty of calculations done with `py4dgeo`, point cloud registration should be performed to tightly align the point clouds. `py4dgeo` supports this in two ways:

* Calculating the transformation with one of `py4dgeo`'s own registration algorithms
* Applying a transformation that has been computed with an external tool

This notebook showcases both ways of usage.
Note: Be aware that an initial transformation is required that roughly aligns the two point clouds.
import py4dgeo
import numpy as np
We load two epochs and register them to each other:
reference_epoch = py4dgeo.read_from_xyz("plane_horizontal_t1.xyz")
epoch = py4dgeo.read_from_xyz("plane_horizontal_t2.xyz")
Registration algorithms use a simple function call interface. Each call returns an affine transformation as a 3x4 matrix that contains the rotation matrix $\mathbf{R}$ and the translation vector $\mathbf{t}$:

$$\left[\mathbf{R} \mid \mathbf{t}\right]$$
When calculating and applying such transformations in `py4dgeo`, it is always possible to specify the reduction point $\mathbf{x_0}$. This allows shifting coordinates towards the origin before applying rotations, leading to a numerically more robust operation. The algorithm for applying a transformation is:

$$\mathbf{x}' = \mathbf{R} \left(\mathbf{x} - \mathbf{x_0}\right) + \mathbf{t} + \mathbf{x_0}$$
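To make the above formula concrete, here is a minimal `numpy` sketch that applies a 3x4 matrix with a reduction point by hand. The matrix, points, and reduction point are placeholder values for illustration only; in practice you simply call `epoch.transform` as shown below.

# Placeholder [R | t] matrix, points and reduction point, for illustration only
affine = np.hstack([np.eye(3), np.zeros((3, 1))])
R, t = affine[:, :3], affine[:, 3]
x0 = np.array([100.0, 200.0, 50.0])  # reduction point
points = np.array([[101.0, 201.0, 51.0], [102.0, 199.5, 50.5]])

# x' = R (x - x0) + t + x0
transformed = (points - x0) @ R.T + t + x0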
We demonstrate the use of registration algorithms within `py4dgeo` using our implementation of the standard Iterative Closest Point (ICP) algorithm. Registration algorithms are provided as simple Python functions taking two epochs and the reduction point as arguments. Some algorithms might require additional, algorithm-specific parameters.
trafo = py4dgeo.iterative_closest_point(
reference_epoch, epoch, reduction_point=np.array([0, 0, 0])
)
The returned `trafo` object stores both the computed affine transformation and the reduction point. It is important to keep these together, as only the correct combination of the two quantities yields the correct transformation result. The transformation `trafo` can now be applied to `epoch` by passing it to `epoch.transform`. Note that, by design, this operation is under explicit user control and is not done implicitly while calculating the transformation:
epoch.transform(trafo)
The `Epoch` class records applied transformations and makes them available through the `transformation` property:
epoch.transformation
Finally, we can export the transformed epoch to inspect the registration quality, e.g., in CloudCompare.
epoch.save("plane_horizontal_t2_transformed.xyz")
To learn more about the parameters of our ICP implementations, have a look at the documentation of `py4dgeo.iterative_closest_point` or `py4dgeo.point_to_plane_icp`:
?py4dgeo.point_to_plane_icp
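For example, assuming `py4dgeo.point_to_plane_icp` accepts the same basic arguments as `py4dgeo.iterative_closest_point` (an assumption; check the documentation above for the exact signature), a point-to-plane registration could be computed like this:

# Sketch only: assumes the call signature mirrors iterative_closest_point
trafo_p2p = py4dgeo.point_to_plane_icp(
    reference_epoch, epoch, reduction_point=np.array([0, 0, 0])
)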
If you have already computed a transformation using the tool of your choice, you can easily import it into `py4dgeo` by loading it as `numpy` arrays and passing it directly to `Epoch.transform`:
epoch.transform(
rotation=np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]]),
translation=np.array([0, 0, 0]),
reduction_point=np.array([0, 0, 0]),
)
It is possible to either define `rotation` and `translation` individually or to provide a 3x4 matrix via the parameter `affine_transformation`, as sketched below.
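As a minimal sketch of the second variant (the rotation, translation, and reduction point are placeholder identity values, not results of a real registration), a 3x4 matrix can be assembled from $\mathbf{R}$ and $\mathbf{t}$ and passed via `affine_transformation`:

# Placeholder values for illustration only
rotation = np.eye(3)
translation = np.zeros((3, 1))

# Stack [R | t] into a single 3x4 matrix
trafo_matrix = np.hstack([rotation, translation])

epoch.transform(
    affine_transformation=trafo_matrix,
    reduction_point=np.array([0, 0, 0]),
)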
`py4dgeo` implements not only standard ICP algorithms, but also an ICP variant that is better suited for geomorphological applications. It identifies stable areas in the point cloud and restricts the application of the ICP algorithm to these areas. It is based on the article "Supervoxel-based targetless registration and identification of stable areas for deformed point clouds" by Yuhui Yang and Volker Schwieger.
The algorithm requires normals. You can either have `py4dgeo` calculate these normals or load precomputed ones:
reference_epoch = py4dgeo.read_from_las(
"plane_horizontal_t1_w_normals.laz",
normal_columns=["NormalX", "NormalY", "NormalZ"],
)
epoch = py4dgeo.read_from_las(
"plane_horizontal_t2_w_normals.laz",
normal_columns=["NormalX", "NormalY", "NormalZ"],
)
As seen with the standard ICP algorithm, the API for the registration algorithm is a simple function call:
trafo = py4dgeo.icp_with_stable_areas(
reference_epoch,
epoch,
initial_distance_threshold=10,
level_of_detection=10,
reference_supervoxel_resolution=0.2,
supervoxel_resolution=0.2,
min_svp_num=1,
reduction_point=np.array([0, 0, 0]),
)
Apart from the usual arguments `reference_epoch`, `epoch`, and `reduction_point`, the algorithm allows the following customizations. To achieve best performance, you should familiarize yourself with these parameters and tune them for your application:
* `initial_distance_threshold`: The upper boundary (maximum) of the distance threshold in the iteration. It can be (1) an empirical value manually set by the user according to the approximate accuracy of the coarse registration, or (2) calculated from the mean and standard deviation of the nearest neighbor distances of all points (see the sketch after this list).
* `level_of_detection`: The lower boundary (minimum) of the distance threshold in the iteration. It can be (1) an empirical value manually set by the user according to the approximate uncertainty of laser scanning measurements in different scanning configurations and scenarios (e.g., 1 cm for TLS point clouds at short range and 4 cm at long range, 8 cm for ALS point clouds, etc.), or (2) calculated by estimating the standard deviation from local modeling (e.g., using the level of detection in M3C2 or M3C2-EP calculations).
* `reference_supervoxel_resolution`: The approximate size of the generated supervoxels for the reference epoch. It can be (1) an empirical value manually set by the user according to the surface geometry and scanning distance (e.g., 2-10 cm for indoor scenes, 1-3 m for a landslide surface), or (2) calculated as 10-20 times the average point spacing (the original resolution of the point clouds). In both cases, the number of points in each supervoxel should be at least 10 (i.e., `min_svp_num = 10`).
* `supervoxel_resolution`: The same as `reference_supervoxel_resolution`, but for the other epoch.
* `min_svp_num`: The minimum number of points a supervoxel must contain to be taken into account in further calculations.
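As a rough sketch of the data-driven options above (this is not py4dgeo functionality: the neighbor query uses `scipy`, assumes the raw coordinates are available via `Epoch.cloud`, and the factor 15 is an arbitrary choice from the suggested 10-20 range), nearest neighbor spacings can be used to derive candidate parameter values:

from scipy.spatial import cKDTree

# Nearest neighbor distances within the reference cloud (k=2: the closest
# neighbor that is not the query point itself)
tree = cKDTree(reference_epoch.cloud)
nn_dist, _ = tree.query(reference_epoch.cloud, k=2)
nn_dist = nn_dist[:, 1]

# Option (2) for initial_distance_threshold: mean plus standard deviation
# of the nearest neighbor distances (one possible reading of the rule above)
initial_distance_threshold = nn_dist.mean() + nn_dist.std()

# Option (2) for reference_supervoxel_resolution: 10-20 times the average
# point spacing
reference_supervoxel_resolution = 15 * nn_dist.mean()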