This notebook is part of the kikuchipy
documentation https://kikuchipy.org.
Links to the documentation won't work from the notebook.
In this tutorial we will combine Hough indexing (HI), dictionary indexing (DI), and refinement in a hybrid indexing approach. Hough indexing is generally much faster than dictionary indexing, but less robust against noisy patterns. To make good use of both indexing approaches, we can quickly index all patterns with Hough indexing, identify badly indexed map points, and re-index these with dictionary indexing. Before combining the Hough-indexed and dictionary-indexed points into a single map in the final step, we must refine them. As always, it is important to evaluate intermediate results after each step by inspecting quality metrics, geometrical simulations, etc.
We will demonstrate this workflow on a single-phase recrystallized nickel sample. The dataset is available from Zenodo at Ånes et al. (2019). It is dataset number 10 (24 dB) out of the series of 10 datasets taken with increasing gain (0-24 dB). It is very noisy, so we will average each pattern with its nearest neighbors before indexing.
The complete workflow then becomes:
1. Hough index all patterns
2. Identify badly indexed map points from quality metrics
3. Re-index these points with dictionary indexing
4. Refine both sets of solutions with pattern matching
5. Merge the refined solutions into a single map
Let's start by importing the necessary libraries
# Exchange inline for notebook or qt5 (from pyqt) for interactive plotting
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
from diffsims.crystallography import ReciprocalLatticeVector
import hyperspy.api as hs
import kikuchipy as kp
from orix import io, plot, sampling
from orix.crystal_map import PhaseList
from orix.vector import Vector3d
plt.rcParams.update(
{"figure.facecolor": "w", "font.size": 15, "figure.dpi": 75}
)
s = kp.data.ni_gain(10, allow_download=True) # ~100 MB into memory
s
<EBSD, title: Pattern, dimensions: (200, 149|60, 60)>
Enhance signal-to-noise ratio (background correction)
s.remove_static_background()
s.remove_dynamic_background()
[########################################] | 100% Completed | 809.63 ms [########################################] | 100% Completed | 3.13 ss
Average each pattern to its eight nearest neighbors in a Gaussian kernel with a standard deviation of 1
window = kp.filters.Window("gaussian", std=1)
s.average_neighbour_patterns(window)
[########################################] | 100% Completed | 1.11 sms
Inspect an image quality map (pattern sharpness, not to be confused with image quality determined from height of peaks in the Hough transform)
iq = s.get_image_quality()
[########################################] | 100% Completed | 1.62 ss
Inspect patterns in the image quality map
# Move the pointer programmatically to inside a large grain
s.axes_manager.indices = (156, 80)
s.plot(hs.signals.Signal2D(iq))
The image quality map is very noisy, but we might be able to convince ourselves that the darker lines are grain boundaries (they are). The map is noisy because the patterns are noisy. The pattern shown, taken from the position of the small red square (the pointer), is from inside one of the larger grains on the right side of the map. Even though the material is well recrystallized with appreciably large grains, the pattern is very noisy. But again, we might be able to convince ourselves that there is some "correlated noise", e.g. a couple of zone axes (darker regions in the pattern) and some bands delineated by darker lines on each side of the band...
Seven calibration patterns of high quality were acquired prior to acquiring the full dataset above, in order to calibrate the sample-detector geometry. The detector was mounted with a tilt of 0$^{\circ}$, and we will assume that the nominal sample tilt of 70$^{\circ}$ is correct. What remains then is to determine a plane of projection centers (PCs), one for each point in the full dataset. Since we know the detector pixel size (~70 $\mu$m on the NORDIF UF-1100 detector), we can extrapolate this plane of PCs from a mean PC. The workflow is the following:
1. Load and process the calibration patterns
2. Estimate PCs and orientations with Hough indexing
3. Refine the orientations and PCs with pattern matching
4. Extrapolate a plane of PCs from the refined PCs
5. Validate the extrapolated PCs by refining orientations with fixed PCs
After each step, we will evaluate our results.
Load calibration patterns
s_cal = kp.data.ni_gain_calibration(10)
s_cal
<EBSD, title: Calibration patterns, dimensions: (7|480, 480)>
Remove static and dynamic background
s_cal.remove_static_background()
s_cal.remove_dynamic_background()
[########################################] | 100% Completed | 101.52 ms [########################################] | 100% Completed | 101.93 ms
Extract positions of calibration patterns on the sample surface (possibly outside the region of interest [ROI]) and the shape and position of the ROI relative to the area imaged in an overview secondary electron image. This information is read with the calibration patterns from the NORDIF settings file.
omd = s_cal.original_metadata
Plot calibration pattern map locations
kp.draw.plot_pattern_positions_in_map(
rc=omd.calibration_patterns.indices_scaled,
roi_shape=omd.roi.shape_scaled,
roi_origin=omd.roi.origin_scaled,
roi_image=iq,
area_shape=omd.area.shape_scaled,
area_image=omd.area_image,
color="w",
)
Hough indexing requires a phase list in order to make a look-up table of interplanar angles to compare the detected angles (from combinations of bands) to. So far, Hough indexing only supports face-centered and body-centered cubic (FCC and BCC) materials (dictionary indexing and pattern matching support all crystallographic point groups). See the Hough indexing tutorial for more details. Since we will later need a dynamically simulated master pattern of nickel (simulated with EMsoft), we load it here and use the phase attached to this master pattern in the phase list.
Note
PyEBSDIndex is an optional dependency of kikuchipy, and can be installed with both pip and conda (from conda-forge). To install PyEBSDIndex, see their installation instructions.
PyEBSDIndex supports indexing face-centered and body-centered cubic (FCC and BCC) materials.
# kikuchipy.data.nickel_ebsd_master_pattern_small() is an alternative
mp = kp.data.ebsd_master_pattern(
"ni", allow_download=True, projection="lambert", energy=20
)
mp
<EBSDMasterPattern, title: ni_mc_mp_20kv, dimensions: (|1001, 1001)>
phase = mp.phase
phase
<name: ni. space group: Fm-3m. point group: m-3m. proper point group: 432. color: tab:blue>
Create a PyEBSDIndex indexer to use with Hough indexing. We get this from our EBSD detector instance attached to the calibration pattern signal
det_cal = s_cal.detector
phase_list = PhaseList(phase)
indexer = det_cal.get_indexer(phase_list)
indexer.phaselist
['FCC']
Estimate PCs from an initial guess based on previous experiments, and print the mean and standard deviation
det_cal = s_cal.hough_indexing_optimize_pc(
pc0=[0.4, 0.2, 0.5], indexer=indexer, batch=True
)
print(det_cal.pc_flattened.mean(axis=0))
print(det_cal.pc_flattened.std(0))
[0.41600247 0.22051028 0.50324232] [0.0055121 0.00561148 0.00422592]
Compare the distribution of PCs to the above plotted map locations (especially PCx vs. PCy)
det_cal.plot_pc("scatter", annotate=True)
We see no direct correlation between the sample positions and the PCs. Let's index the calibration patterns using these PCs and compare the solutions' band positions to the experimental band positions. We update our indexer instance with the estimated PCs.
indexer.PC = det_cal.pc
xmap_cal = s_cal.hough_indexing(
phase_list=phase_list, indexer=indexer, verbose=2
)
Hough indexing with PyEBSDIndex information:
  PyOpenCL: True
  Projection center (Bruker, mean): (0.416, 0.2205, 0.5032)
  Indexing 7 pattern(s) in 1 chunk(s)
Radon Time: 0.03576446600027339
Convolution Time: 0.003266480000092997
Peak ID Time: 0.0015165800000431773
Band Label Time: 0.04264826100006758
Total Band Find Time: 0.08321748199978174
Band Vote Time: 0.020705252999960067
  Indexing speed: 25.25290 patterns/s
Check indexed orientations by plotting geometrical simulations on top of the patterns. See the tutorial on geometrical EBSD simulations for details.
rlv = ReciprocalLatticeVector.from_min_dspacing(phase.deepcopy(), 0.07)
rlv.sanitise_phase() # "Fill atoms in unit cell"
rlv.calculate_structure_factor()
structure_factor = abs(rlv.structure_factor)
rlv = rlv[structure_factor > 0.12 * structure_factor.max()]
rlv.print_table()
 h k l      d     |F|_hkl   |F|^2   |F|^2_rel   Mult
 1 1 1    0.203     0.4      0.2      100.0      8
 2 0 0    0.176     0.3      0.1       60.9      6
 2 2 0    0.125     0.1      0.0        8.7     12
 3 1 1    0.106     0.1      0.0        2.0     24
simulator = kp.simulations.KikuchiPatternSimulator(rlv)
sim_cal = simulator.on_detector(det_cal, xmap_cal.rotations)
Finding bands that are in some pattern: [########################################] | 100% Completed | 107.15 ms Finding zone axes that are in some pattern: [########################################] | 100% Completed | 101.81 ms Calculating detector coordinates for bands and zone axes: [########################################] | 100% Completed | 101.92 ms
fig, axes = plt.subplots(ncols=3, nrows=3, figsize=(12, 12))
axes = axes.ravel()
for i in range(xmap_cal.size):
axes[i].imshow(s_cal.inav[i].data, cmap="gray")
lines = sim_cal.as_collections(i, lines_kwargs=dict(color="w"))[0]
axes[i].add_collection(lines)
axes[i].text(5, 10, i, c="w", va="top", ha="left")
_ = [a.axis("off") for a in axes]
fig.subplots_adjust(wspace=0.01, hspace=0.01)
Most lines align quite well with experimental bands, but some of them, especially in the lower parts of the patterns and on wide bands, do not follow the band middle. Let's refine these solutions using pattern matching
xmap_cal_ref, det_cal_ref = s_cal.refine_orientation_projection_center(
xmap=xmap_cal,
detector=det_cal,
master_pattern=mp,
energy=20,
method="LN_NELDERMEAD",
trust_region=[5, 5, 5, 0.05, 0.05, 0.05], # Sufficiently wide
# One pattern per iteration to utilize all CPUs
chunk_kwargs=dict(chunk_shape=1),
)
Refinement information:
  Method: LN_NELDERMEAD (local) from NLopt
  Trust region (+/-): [5. 5. 5. 0.05 0.05 0.05]
  Relative tolerance: 0.0001
Refining 7 orientation(s) and projection center(s):
[########################################] | 100% Completed | 23.94 s
Refinement speed: 0.29231 patterns/s
Check quality metrics
print(xmap_cal_ref.scores.mean())
print(xmap_cal_ref.num_evals.mean())
0.502202991928373 213.57142857142858
Check deviations from Hough indexed solutions (it is important that these deviations are well within our trust region above)
angles_cal = xmap_cal.orientations.angle_with(
xmap_cal_ref.orientations, degrees=True
)
pc_dev_cal = det_cal.pc_flattened - det_cal_ref.pc_flattened
print(angles_cal)
print(abs(pc_dev_cal).max(0))
[1.13962903 0.8635618 1.35129915 1.29976489 2.46056115 0.59547646 1.28795102] [0.01114928 0.01696463 0.01033452]
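As a small optional check (a sketch using the arrays computed above), we can assert that the largest deviations indeed lie well inside the trust region of +/- 5$^{\circ}$ for the orientations and +/- 0.05 for the PC parameters:
# Optional sanity check of the deviations against the trust region used above
assert angles_cal.max() < 5
assert abs(pc_dev_cal).max() < 0.05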
Get geometrical simulations from refined orientations and PCs and add lines from these simulations (in red) to the existing figure
sim_cal_ref = simulator.on_detector(det_cal_ref, xmap_cal_ref.rotations)
Finding bands that are in some pattern: [########################################] | 100% Completed | 101.68 ms Finding zone axes that are in some pattern: [########################################] | 100% Completed | 102.20 ms Calculating detector coordinates for bands and zone axes: [########################################] | 100% Completed | 102.81 ms
for i in range(xmap_cal_ref.size):
lines = sim_cal_ref.as_collections(i)[0]
axes[i].add_collection(lines)
fig
We see that the red lines align better with wider bands and bands in the lower part of the patterns (where the deviations are also greater).
Check the refined PCs
det_cal_ref.plot_pc("scatter", annotate=True)
We now see that the PCs of patterns 0, 3, 4, and 6 seem to align quite well with the sample positions. However, the other PCs do not align as expected. It is safest in this case to compute a mean PC and extrapolate a plane of PCs from it.
Get sample positions of our patterns
pc_indices = omd.calibration_patterns.indices_scaled
pc_indices -= omd.roi.origin_scaled
pc_indices = pc_indices.T
Specify the outliers
is_outlier = np.zeros(det_cal_ref.navigation_size, dtype=bool)
is_outlier[[1, 2, 5]] = True
Extrapolate a plane of PCs
det_cal_fit = det_cal_ref.extrapolate_pc(
pc_indices=pc_indices,
navigation_shape=s.xmap.shape,
step_sizes=(1.5, 1.5),
px_size=70,
binning=8,
is_outlier=is_outlier,
)
print(det_cal_fit.pc_average)
print(det_cal_fit.pc_flattened.std(0))
print(det_cal_fit.sample_tilt)
[0.42129343 0.21341257 0.50080491] [3.22178043e-04 2.25545245e-04 8.20917556e-05] 70.0
Check the plane of PCs
det_cal_fit.plot_pc()
As a final evaluation of this plane of PCs, we will refine the (refined) orientations using a fixed PC for each pattern taken from the plane of PCs
det_cal_fit2 = det_cal_fit.deepcopy()
det_cal_fit2.pc = det_cal_fit2.pc[tuple(pc_indices)]
xmap_cal_ref2 = s_cal.refine_orientation(
xmap=xmap_cal_ref,
detector=det_cal_fit2,
master_pattern=mp,
energy=20,
method="LN_NELDERMEAD",
trust_region=[5, 5, 5],
chunk_kwargs=dict(chunk_shape=1),
)
Refinement information:
  Method: LN_NELDERMEAD (local) from NLopt
  Trust region (+/-): [5 5 5]
  Relative tolerance: 0.0001
Refining 7 orientation(s):
[########################################] | 100% Completed | 4.75 s
Refinement speed: 1.47350 patterns/s
print(xmap_cal_ref2.scores.mean())
print(xmap_cal_ref2.num_evals.mean())
0.4991740882396698 58.285714285714285
angles_cal2 = xmap_cal_ref.orientations.angle_with(
xmap_cal_ref2.orientations, degrees=True
)
print(angles_cal2)
[0.1066364 0.24294515 0.27140636 0.3090115 0.48565123 0.3495071 0.44546699]
Get geometrical simulations and add a third set of lines (in blue) to the existing figure
sim_cal_ref2 = simulator.on_detector(det_cal_fit2, xmap_cal_ref2.rotations)
Finding bands that are in some pattern: [########################################] | 100% Completed | 101.51 ms Finding zone axes that are in some pattern: [########################################] | 100% Completed | 102.72 ms Calculating detector coordinates for bands and zone axes: [########################################] | 100% Completed | 102.12 ms
for i in range(xmap_cal_ref2.size):
lines = sim_cal_ref2.as_collections(i, lines_kwargs=dict(color="b"))[0]
axes[i].add_collection(lines)
fig
Now that we are confident of our geometry calibration, we can index all patterns in our noisy experimental dataset.
Copy the detector with the calibrated PCs and update the detector shape to match our experimental patterns
det = det_cal_fit.deepcopy()
det.shape = s.detector.shape
det
EBSDDetector (60, 60), px_size 70 um, binning 8, tilt 0.0, azimuthal 0, pc (0.421, 0.213, 0.501)
Get a new indexer
indexer = det.get_indexer(phase_list)
Perform Hough indexing with PyEBSDIndex (using the GPU via PyOpenCL, but only a single CPU)
xmap_hi = s.hough_indexing(phase_list=phase_list, indexer=indexer, verbose=2)
Hough indexing with PyEBSDIndex information:
  PyOpenCL: True
  Projection center (Bruker, mean): (0.4213, 0.2134, 0.5008)
  Indexing 29800 pattern(s) in 57 chunk(s)
Radon Time: 1.7695503400018424
Convolution Time: 5.326358579001408
Peak ID Time: 2.036565604000316
Band Label Time: 4.461307794000277
Total Band Find Time: 13.594405164999898
Band Vote Time: 19.560826325000107
  Indexing speed: 891.52337 patterns/s
Our use of PyEBSDIndex here gave an indexing speed of about 900 patterns/s.
Note
Note that PyEBSDIndex can index a lot faster than this by using more CPUs and by passing a file directly (not via NumPy or Dask arrays, as done here) to its indexing functions. See its documentation for details: https://pyebsdindex.readthedocs.io/en/latest.
xmap_hi
Phase   Orientations         Name  Space group  Point group  Proper point group     Color
   -1     210 (0.7%)  not_indexed         None         None                None         w
    0  29590 (99.3%)           ni        Fm-3m         m-3m                 432  tab:blue
Properties: fit, cm, pq, nmatch
Scan unit: um
# Save HI map
# io.save("xmap_hi.h5", xmap_hi)
PyEBSDIndex could not index about 210 points (pattern fit above 3$^{\circ}$). Let's check the quality metrics (pattern fit, confidence metric, pattern quality, and the number of bands labeled)
aspect_ratio = xmap_hi.shape[1] / xmap_hi.shape[0]
figsize = (8 * aspect_ratio, 4.5 * aspect_ratio)
fig, ax = plt.subplots(nrows=2, ncols=2, figsize=figsize)
for a, to_plot in zip(ax.ravel(), ["pq", "cm", "fit", "nmatch"]):
if to_plot == "fit":
im = a.imshow(xmap_hi.get_map_data(to_plot), vmin=0, vmax=3)
else:
im = a.imshow(xmap_hi.get_map_data(to_plot))
fig.colorbar(im, ax=a, label=to_plot, pad=0.02)
a.axis("off")
fig.subplots_adjust(wspace=0, hspace=0.05)
The bright points in the lower right pattern fit map are the roughly 210 points considered not indexed. The confidence metric and the number of successfully labeled bands (out of nine) seem to be highest within grains and lowest at grain boundaries. Let's inspect the spatial variation of "successfully" indexed orientations in an inverse pole figure map (IPF-X)
sym = xmap_hi.phases[0].point_group
ckey = plot.IPFColorKeyTSL(sym, Vector3d([1, 0, 0]))
ckey
IPFColorKeyTSL, symmetry: m-3m, direction: [1 0 0]
rgb_hi = ckey.orientation2color(xmap_hi["indexed"].rotations)
fig = xmap_hi["indexed"].plot(
rgb_hi, remove_padding=True, return_figure=True
)
# Place color key in bottom right corner, coordinates are
# [left, bottom, width, height]
ax_ckey = fig.add_axes(
[0.75, 0.07, 0.2, 0.2], projection="ipf", symmetry=sym
)
ax_ckey.plot_ipf_color_key(show_title=False)
ax_ckey.patch.set_facecolor("None")
Many points appear as single-point color deviations from the otherwise smooth colors within the recrystallized grains; these deviating points are located mostly at grain boundaries.
Let's see if we can easily separate the good from bad points using any of the quality metrics
fig, ax = plt.subplots(ncols=4, figsize=(10, 3))
for a, to_plot in zip(ax.ravel(), ["pq", "cm", "fit", "nmatch"]):
_ = a.hist(xmap_hi["indexed"].prop[to_plot], bins=100)
a.set(xlabel=to_plot, ylabel="Frequency")
fig.tight_layout()
... hm, there is no clear bimodal distribution in any of the histograms, so we have to separate the "good" from the "bad" points by trial and error until a desired separation is achieved. Here, we have combined thresholds on several metrics to get a plausible separation, but note that other combinations might work better for other datasets.
mask_reindex = np.logical_or.reduce(
(
~xmap_hi.is_indexed,
xmap_hi.fit > 1.4,
xmap_hi.nmatch < 5,
)
)
frac_reindex = mask_reindex.sum() / mask_reindex.size
print(f"Fraction to re-index: {100 * frac_reindex:.2f}%")
# Get colors for all points, even the ones considered not-indexed
rgb_hi_all = ckey.orientation2color(xmap_hi.rotations)
# Get separate arrays for the points to keep and the points to re-index, with
# the points in the other array shown as black
rgb_hi_reindex = np.zeros((xmap_hi.size, 3))
rgb_hi_keep = np.zeros_like(rgb_hi_reindex)
rgb_hi_reindex[mask_reindex] = rgb_hi_all[mask_reindex]
rgb_hi_keep[~mask_reindex] = rgb_hi_all[~mask_reindex]
nav_shape = xmap_hi.shape + (3,)
fig, (ax0, ax1) = plt.subplots(ncols=2, figsize=(12, 5))
ax0.imshow(rgb_hi_keep.reshape(nav_shape))
ax1.imshow(rgb_hi_reindex.reshape(nav_shape))
for a, title in zip([ax0, ax1], ["Keep", "Re-index with DI"]):
a.axis("off")
a.set_title(title)
fig.subplots_adjust(wspace=0.02)
Fraction to re-index: 43.20%
There are some spurious points still left among the points to keep, otherwise the orientation map looks fairly convincing.
Make the navigation mask 2D. Points to re-index are set to False.
nav_mask = np.ones(xmap_hi.size, dtype=bool)
nav_mask[mask_reindex] = False
nav_mask = nav_mask.reshape(xmap_hi.shape)
To generate the dictionary of Ni patterns, we need to sample the orientation space at a sufficiently high resolution (here 2$^{\circ}$) with a fixed calibration geometry (PC). See the pattern matching tutorial for details.
rot = sampling.get_sample_fundamental(resolution=2, point_group=sym)
rot
Rotation (100347,)
[[ 0.8541 -0.3536 -0.3536 -0.1435]
 [ 0.8541 -0.3536 -0.3536  0.1435]
 [ 0.8541 -0.3536 -0.1435 -0.3536]
 ...
 [ 0.8541  0.3536  0.1435  0.3536]
 [ 0.8541  0.3536  0.3536 -0.1435]
 [ 0.8541  0.3536  0.3536  0.1435]]
det_pc1 = det.deepcopy()
det_pc1.pc = det_pc1.pc_average
det_pc1.pc
array([[0.42129343, 0.21341257, 0.50080491]])
sim = mp.get_patterns(
rotations=rot,
detector=det_pc1,
energy=20,
chunk_shape=rot.size // 15,
)
sim
We only match the intensities within a circular mask (note the inversion!)
signal_mask = kp.filters.Window("circular", det.shape).astype(bool)
signal_mask = ~signal_mask
Perform dictionary indexing of the patterns and intensities marked as False in the navigation and signal masks
xmap_di = s.dictionary_indexing(
sim,
keep_n=1,
navigation_mask=nav_mask,
signal_mask=signal_mask,
)
Dictionary indexing information:
  Phase name: ni
  Matching 12873/29800 experimental pattern(s) to 100347 dictionary pattern(s)
  NormalizedCrossCorrelationMetric: float32, greater is better, rechunk: False, navigation mask: True, signal mask: True
100%|██████████| 16/16 [01:30<00:00, 5.63s/it]
Indexing speed: 142.91829 patterns/s, 14341421.71372 comparisons/s
# Save DI map
# io.save("xmap_di.h5", xmap_di)
We see that HI is about 6-7x faster than DI.
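As a rough check (using the indexing speeds printed above, which of course vary between runs and machines), the ratio is simply:
# Rough HI/DI speed ratio from the speeds printed above (this particular run)
print(f"HI/DI speed ratio: {891.52337 / 142.91829:.1f}")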
xmap_di.scores.mean()
0.15499452
rgb_di = ckey.orientation2color(xmap_di.rotations)
fig = xmap_di.plot(rgb_di, remove_padding=True, return_figure=True)
An average correlation score of 0.15 is low, but acceptable here since we will refine these solutions and expect higher scores afterwards. The orientation map of the re-indexed points looks plausible, though, which it did not for these patterns after HI.
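If we want more than the mean score, a quick histogram of the DI scores gives a fuller picture (a minimal sketch with plain matplotlib):
# Distribution of DI normalized cross-correlation scores
fig, ax = plt.subplots()
ax.hist(xmap_di.scores.ravel(), bins=100)
ax.set(xlabel="NCC score", ylabel="Frequency")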
First we specify common refinement parameters, so that the scores obtained can be compared. This is very important!
ref_kw = dict(
detector=det,
master_pattern=mp,
energy=20,
signal_mask=signal_mask,
method="LN_NELDERMEAD",
trust_region=[5, 5, 5],
)
Of the Hough-indexed solutions, we only want to refine those that are not re-indexed with dictionary indexing. We therefore pass the navigation mask, taking care to set the points we want to refine to False.
xmap_hi_ref = s.refine_orientation(
xmap=xmap_hi, navigation_mask=~nav_mask, **ref_kw
)
Refinement information:
  Method: LN_NELDERMEAD (local) from NLopt
  Trust region (+/-): [5 5 5]
  Relative tolerance: 0.0001
Refining 16927 orientation(s):
[########################################] | 100% Completed | 117.81 s
Refinement speed: 143.62058 patterns/s
print(xmap_hi_ref.scores.mean())
print(xmap_hi_ref.num_evals.max())
0.2521151989052916 165
An average correlation score of 0.25 is OK. We now refine the re-indexed points. No navigation mask is necessary, since the crystal map returned from DI has a mask keeping track of which points are "in the data" via CrystalMap.is_in_data.
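A quick optional check confirms how many points are in the data of the DI map compared to the full map size:
# Number of points in the data of the DI map vs. the full map size
print(xmap_di.is_in_data.sum(), xmap_di.is_in_data.size)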
xmap_di_ref = s.refine_orientation(xmap=xmap_di, **ref_kw)
Refinement information:
  Method: LN_NELDERMEAD (local) from NLopt
  Trust region (+/-): [5 5 5]
  Relative tolerance: 0.0001
Refining 12873 orientation(s):
[########################################] | 100% Completed | 88.87 s
Refinement speed: 144.78633 patterns/s
print(xmap_di_ref.scores.mean())
print(xmap_di_ref.num_evals.max())
0.1973858830665668 167
An average correlation score of 0.20 is still OK. We see that refinement of these points needed slightly more function evaluations, presumably because they were harder to index, being mostly located at grain boundaries where the bands are less clear.
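We can also compare the mean number of function evaluations of the two refinements directly (a small optional check using the num_evals property already used above):
# Mean number of NLopt function evaluations per refined pattern
print(xmap_hi_ref.num_evals.mean())
print(xmap_di_ref.num_evals.mean())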
We can now merge the refined results, taking care to pass navigation masks specifying where each refined map should be considered. Since only the points not kept from the refined HI map were indexed with DI, the same mask can be used for both maps (inverted for one of them).
xmap_ref = kp.indexing.merge_crystal_maps(
[xmap_hi_ref, xmap_di_ref], navigation_masks=[~nav_mask, nav_mask]
)
xmap_ref
Phase    Orientations  Name  Space group  Point group  Proper point group     Color
    0  29800 (100.0%)    ni        Fm-3m         m-3m                 432  tab:blue
Properties: scores, merged_scores
Scan unit: um
We see that we have a complete map for all our points!
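As a final optional check, the merged map should contain a solution for every point in the original Hough-indexed map:
# The merged map should cover all map points
print(xmap_ref.size == xmap_hi.size)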
# Save final refined combined map
# io.save("xmap_ref.h5", xmap_ref)
Finally, we can compare the IPF-X maps with HI only and after re-indexing, refinement and combination
rgb_ref = ckey.orientation2color(xmap_ref.orientations)
rgb_shape = xmap_ref.shape + (3,)
fig, (ax0, ax1) = plt.subplots(ncols=2, figsize=(12, 5))
ax0.imshow(rgb_hi_all.reshape(rgb_shape))
ax1.imshow(rgb_ref.reshape(rgb_shape))
for a, title in zip([ax0, ax1], ["HI", "HI + DI + ref"]):
a.axis("off")
a.set_title(title)
ax_ckey = fig.add_axes(
[0.805, 0.2, 0.1, 0.1], projection="ipf", symmetry=sym
)
ax_ckey.plot_ipf_color_key(show_title=False)
ax_ckey.patch.set_facecolor("None")
_ = [t.set_fontsize(15) for t in ax_ckey.texts]
fig.subplots_adjust(wspace=0.02)
We extract a grid of patterns and plot the geometrical simulations on top of these patterns
s.xmap = xmap_ref
s.detector = det
grid_shape = (4, 4)
s_grid, idx = s.extract_grid(grid_shape, return_indices=True)
kp.draw.plot_pattern_positions_in_map(
rc=idx.T.reshape((-1, 2)),
roi_shape=xmap_ref.shape,
roi_image=rgb_ref.reshape(rgb_shape),
)
sim_grid = simulator.on_detector(
s_grid.detector, s_grid.xmap.rotations.reshape(*grid_shape)
)
Finding bands that are in some pattern: [########################################] | 100% Completed | 101.48 ms Finding zone axes that are in some pattern: [########################################] | 100% Completed | 101.67 ms Calculating detector coordinates for bands and zone axes: [########################################] | 100% Completed | 102.37 ms
fig, axes = plt.subplots(
nrows=grid_shape[0], ncols=grid_shape[1], figsize=(12, 12)
)
for idx in np.ndindex(grid_shape):
ax = axes[idx]
ax.imshow(s_grid.data[idx] * ~signal_mask, cmap="gray")
lines = sim_grid.as_collections(idx, lines_kwargs=dict(color="k"))[0]
ax.add_collection(lines)
ax.axis("off")
ax.text(
0,
1,
np.ravel_multi_index(idx, grid_shape),
va="top",
ha="left",
c="w",
)
fig.subplots_adjust(wspace=0.02, hspace=0.02)
It's difficult to see any bands in these very noisy patterns... There at least seems to be a correlation between darker regions in the patterns (not the corners) and zone axes, which is expected.
In conclusion, by combining the speed of Hough indexing (about 6-7x faster than DI) with the robustness towards noise of dictionary indexing, a dataset can be indexed in a shorter time and achieve about the same results as with DI (and refinement) only.