This example introduces the pybop.BaseTransformation
class and its instances. This class enables the cost and likelihood functions to be evaluated in a transformed search space, which is used by the optimiser and sampler classes during inference. These transformations can be either linear (e.g. pybop.ScaledTransformation) or non-linear (e.g. pybop.LogTransformation). By default, if transformations are applied, the optimisers and samplers will search in the transformed space.
Transformations can be helpful when the difference in parameter magnitudes is large, or to create a search space that is better posed for the optimisation algorithm. Before we begin, we need to ensure that we have all the necessary tools. We will install and import PyBOP alongside any other package dependencies.
%pip install --upgrade pip ipywidgets -q
%pip install pybop -q
import numpy as np
import pybop
pybop.plot.PlotlyManager().pio.renderers.default = "notebook_connected"
First, to showcase the transformation functionality, we need to construct a cost. This class is typically built from the following objects: a model, a set of parameters, a dataset, and a problem.
We will first construct the model, then the parameters and corresponding dataset. Once that is complete, the problem will be created. With the cost class created, we will showcase the different interactions users can have with transformations.
model = pybop.lithium_ion.SPM()
Now that we have the model constructed, let's define the parameters for identification. At this point, we define the transformations applied to each parameter. PyBOP allows transformations to be applied at the individual parameter level, which are then combined for application during the optimisation. Below we apply a linear transformation using the pybop.ScaledTransformation class. This class takes a coefficient argument, which defines the linear stretch or scaling of the search space, and an intercept argument, which defines the translation or shift. The equation for this transformation is:

$$y_{search} = m (x_{model} + b)$$

where $m$ is the linear scale coefficient, $b$ is the intercept, $x_{model}$ is the model parameter space, and $y_{search}$ is the transformed search space.
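As a quick sanity check, this mapping can be sketched in plain NumPy. This is a minimal illustration, not PyBOP's implementation; the helper names to_search and to_model below are hypothetical:

```python
import numpy as np

# Minimal sketch of y = m * (x + b), assuming the coefficients used
# below for the negative electrode parameter: m = 1/0.35, b = -0.35.
m, b = 1 / 0.35, -0.35

def to_search(x):
    # model space -> search space
    return m * (np.asarray(x) + b)

def to_model(y):
    # search space -> model space (inverse map)
    return np.asarray(y) / m - b

# The bounds [0.35, 0.7] map onto approximately [0, 1]
# (up to floating point), and the round trip recovers the model value.
print(to_search([0.35, 0.7]))
print(to_model(to_search(0.6)))
```

With this choice of coefficient and intercept, the search space for this parameter is roughly the unit interval, regardless of the parameter's physical magnitude.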
parameters = pybop.Parameters(
pybop.Parameter(
"Negative electrode active material volume fraction",
initial_value=0.6,
bounds=[0.35, 0.7],
transformation=pybop.ScaledTransformation(
coefficient=1 / 0.35, intercept=-0.35
),
),
pybop.Parameter(
"Positive electrode active material volume fraction",
initial_value=0.6,
bounds=[0.45, 0.625],
transformation=pybop.ScaledTransformation(
coefficient=1 / 0.175, intercept=-0.45
),
),
)
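Both parameters above use a coefficient of 1/(upper - lower) and an intercept of -lower. Assuming the scaled map y = m(x + b), a quick check confirms that each parameter's bounds then land on the unit interval:

```python
# Check (assuming y = m * (x + b)) that both sets of bounds map onto
# approximately [0, 1] under the coefficients chosen above.
configs = {
    "negative": (1 / 0.35, -0.35, (0.35, 0.7)),
    "positive": (1 / 0.175, -0.45, (0.45, 0.625)),
}
for name, (m, b, (lo, hi)) in configs.items():
    print(name, m * (lo + b), m * (hi + b))
```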
Next, to create the pybop.Dataset, we generate some synthetic data from the model using the model.predict method.
t_eval = np.linspace(0, 10, 100)
values = model.predict(t_eval=t_eval)
dataset = pybop.Dataset(
{
"Time [s]": t_eval,
"Current function [A]": values["Current [A]"].data,
"Voltage [V]": values["Voltage [V]"].data,
}
)
Now that we have the model, parameters, and dataset, we can combine them and construct the problem and cost classes.
problem = pybop.FittingProblem(model, parameters, dataset)
cost = pybop.SumOfPower(problem)
The conventional way to use the cost class is through the cost.__call__ method, which is called below without transformations applied.
cost([0.6, 0.6])
0.006904000484441733
The optimiser and sampler classes call the cost via the pybop.CostInterface, which maps the search parameters proposed by the optimiser back to the model space (using to_model) and, where gradients are requested, returns the gradient with respect to the search parameters (using the jacobian). We can check that the transformation is applied during optimisation by creating an optimisation class.
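To make the mechanics concrete, here is a hypothetical sketch of such an interface. The class and function names below are illustrative, not PyBOP's actual implementation: the optimiser proposes a point in the search space, and the interface maps it to the model space before evaluating the cost.

```python
import numpy as np

class ToyScaledTransformation:
    """Illustrative linear transformation y = m * (x + b)."""
    def __init__(self, m, b):
        self.m, self.b = m, b

    def to_model(self, y):
        # search space -> model space
        return np.asarray(y) / self.m - self.b

    def jacobian(self, y):
        # dx/dy is a constant diagonal (1/m) for a linear map
        return np.eye(len(y)) / self.m

def call_cost(y_search, cost, transformation):
    # Evaluate the cost at the model-space point behind y_search
    x_model = transformation.to_model(y_search)
    return cost(x_model)

transformation = ToyScaledTransformation(m=1 / 0.35, b=-0.35)
toy_cost = lambda x: float(np.sum((np.asarray(x) - 0.5) ** 2))

# y = 0 in the search space corresponds to x = 0.35 in the model space
print(call_cost([0.0, 0.0], toy_cost, transformation))
```

For gradient-based optimisers, the jacobian would be applied via the chain rule to convert a model-space gradient into a search-space gradient.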
optim = pybop.Optimisation(cost=cost)
x0 = parameters.initial_value(apply_transform=True)
optim.call_cost(x0, cost=cost)
0.006904000484441733
We can confirm that the two cost evaluations (with and without transformations) give the same result by comparing their outputs.
optim.call_cost(parameters.initial_value(apply_transform=True), cost=cost) == cost(
parameters.initial_value()
)
True
We can also compare the cost landscapes plotted in the model and search spaces. Let's first plot the cost in the model space through the conventional method:
pybop.plot.contour(cost);
Next, we can use the apply_transform argument when constructing the cost landscape to plot in the transformed space.
pybop.plot.contour(cost, steps=15, apply_transform=True);
Note the difference in axis scale compared to the non-transformed landscape. Next, let's change the transformation on the 'Positive electrode active material volume fraction' to a non-linear log space.
parameters[
"Positive electrode active material volume fraction"
].transformation = pybop.LogTransformation()
cost.transformation = parameters.construct_transformation()
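As a quick sketch of why the axis goes negative, a log transformation maps model values below 1 to negative search values. This is a minimal illustration assuming the map y = ln(x):

```python
import numpy as np

# Log transformation sketch, assuming y = ln(x): values of the
# positive electrode fraction (all below 1) map to negative y.
x_model = np.array([0.45, 0.55, 0.625])
y_search = np.log(x_model)
x_back = np.exp(y_search)  # inverse map recovers the model values

print(y_search)  # all entries are negative
print(np.allclose(x_back, x_model))
```

A side benefit of the log map is that the search space is unbounded below, while the inverse map guarantees the model parameter stays positive.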
Let's plot the cost landscape again. This time, the values on the y-axis are negative as they correspond to the log of the model values.
pybop.plot.contour(cost, steps=15, apply_transform=True);
In this notebook, we've introduced the transformation classes and their interaction with the pybop.Parameters and pybop.CostInterface classes. Transformations allow the optimisation or sampling search space to be reshaped, which can improve convergence when the optimisation hyperparameters are poorly tuned or when the parameter magnitudes vary widely.