#!/usr/bin/env python
# coding: utf-8

# # Introduction to Transformations

# This example introduces the `pybop.BaseTransformation` class and its instances. This class adds functionality for the cost and likelihood functions to be transformed into a separate search space, which the optimiser and sampler classes use during inference. These transformations can be linear (`pybop.ScaledTransformation`) or non-linear (`pybop.LogTransformation`). By default, if transformations are applied, the samplers and optimisers will search in the transformed space.
#
# Transformations can be helpful when parameter magnitudes differ greatly, or to create a search space that is better posed for the optimisation algorithm. Before we begin, we need to ensure that we have all the necessary tools. We will install and import PyBOP alongside any other package dependencies.

# In[ ]:

get_ipython().run_line_magic('pip', 'install --upgrade pip ipywidgets -q')
get_ipython().run_line_magic('pip', 'install pybop -q')

import numpy as np

import pybop

pybop.plot.PlotlyManager().pio.renderers.default = "notebook_connected"

# First, to showcase the transformation functionality, we need to construct the `pybop.Cost` class. This class needs the following objects:
# - Model
# - Dataset
# - Parameters to identify
# - Problem
#
# We will first construct the model, then the parameters and corresponding dataset. Once that is complete, the problem will be created. With the cost class created, we will showcase the different interactions users can have with it. A small example with evaluation as well as computation is presented.

# In[ ]:

model = pybop.lithium_ion.SPM()

# Now that we have the model constructed, let's define the parameters for identification. At this point, we also define the transformation applied to each parameter.
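# To see the motivation numerically before defining the PyBOP parameters, consider two parameters of very different magnitudes. The values below are made up for illustration; this is a toy sketch, not part of the PyBOP workflow.

# In[ ]:

```python
import numpy as np

# Hypothetical parameters of very different magnitudes (illustrative values only)
diffusivity = 3.0e-15  # e.g. a solid-phase diffusivity [m^2/s]
volume_frac = 0.6      # an active material volume fraction

# A fixed optimiser step that is reasonable for one parameter is
# meaningless for the other when applied in the raw model space.
step = 0.01
print(step / volume_frac)  # under 2% relative change: reasonable
print(step / diffusivity)  # over 1e12 relative change: absurd

# Linearly rescaling each parameter to order one makes a shared step meaningful.
scaled = np.array([diffusivity / 1.0e-15, volume_frac / 0.1])
print(scaled)
```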
# PyBOP allows transformations to be applied at the individual parameter level; these are then combined and applied on each cost call. Below we apply a linear transformation using the pybop `ScaledTransformation` class. This class takes a `coefficient` argument, which defines the linear stretch or scaling of the search space, and an `intercept` argument, which defines the shift. The transformation is:
#
# $$
# y_{search} = m(x_{model} + b)
# $$
#
# where $m$ is the linear scale coefficient, $b$ is the intercept, $x_{model}$ is the model parameter space, and $y_{search}$ is the transformed search space.

# In[ ]:

parameters = pybop.Parameters(
    pybop.Parameter(
        "Negative electrode active material volume fraction",
        initial_value=0.6,
        bounds=[0.35, 0.7],
        transformation=pybop.ScaledTransformation(coefficient=2.0, intercept=-0.6),
    ),
    pybop.Parameter(
        "Positive electrode active material volume fraction",
        initial_value=0.6,
        bounds=[0.45, 0.625],
        transformation=pybop.ScaledTransformation(coefficient=2.0, intercept=-0.6),
    ),
)

# Next, to create the `pybop.Dataset`, we generate some synthetic data from the model using the `model.predict` method.

# In[ ]:

t_eval = np.linspace(0, 10, 100)
values = model.predict(t_eval=t_eval)

dataset = pybop.Dataset(
    {
        "Time [s]": t_eval,
        "Current function [A]": values["Current [A]"].data,
        "Voltage [V]": values["Voltage [V]"].data,
    }
)

# Now that we have the model, parameters, and dataset, we can combine them to construct the problem class. This is the last step needed before investigating how the transformation functionality works.

# In[ ]:

problem = pybop.FittingProblem(model, parameters, dataset)
cost = pybop.SumofPower(problem)

# The conventional way to use the cost class is through the `cost.__call__` method, which is used below without transformations applied.

# In[ ]:

cost([0.6, 0.6])

# However, we can also interact with the cost function with transformations applied, via the optional `apply_transform` argument.
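# As an aside, the linear mapping itself can be checked by hand. The sketch below re-implements $y_{search} = m(x_{model} + b)$ and its inverse in plain NumPy, using the coefficients defined above; it illustrates the arithmetic only and is not PyBOP's internal `ScaledTransformation` code.

# In[ ]:

```python
import numpy as np

# Coefficients from the ScaledTransformation defined above
m, b = 2.0, -0.6  # coefficient and intercept


def to_search(x):
    """Map a model-space value into the search space: y = m * (x + b)."""
    return m * (np.asarray(x, dtype=float) + b)


def to_model(y):
    """Invert the mapping back to the model space: x = y / m - b."""
    return np.asarray(y, dtype=float) / m - b


print(to_search(0.6))   # the initial value 0.6 maps to 0.0
print(to_model(0.05))   # 0.05 in the search space maps back to 0.625
print(np.isclose(to_model(to_search(0.55)), 0.55))  # round trip recovers the input
```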
# The `apply_transform` argument defaults to `False`.

# In[ ]:

cost([0.0, 0.0], apply_transform=True)

# Given the transformations applied in the parameter class above, we can confirm the alignment between the search space and the model space by passing values that coincide:

# In[ ]:

cost([0.0, 0.0], apply_transform=True) == cost([0.6, 0.6])

# We can test the transformation more thoroughly by checking that the search space is scaled by a factor of two:

# In[ ]:

cost([0.05, 0.05], apply_transform=True) == cost([0.625, 0.625])

# Next, we can plot the cost landscapes of these two spaces. First, we plot the model space through the conventional method:

# In[ ]:

pybop.plot.contour(cost);

# Next, we can use the `apply_transform` argument when constructing the cost landscape to view the search space. The bounds used above are mapped into the search space using each parameter's transformation instance and its `to_search` method.

# In[ ]:

pybop.plot.contour(cost, steps=15, apply_transform=True);

# Note the difference in axis scale compared to the non-transformed landscape. Next, let's change the transformation on the 'Positive electrode active material volume fraction' to a non-linear log space.

# In[ ]:

parameters[
    "Positive electrode active material volume fraction"
].transformation = pybop.LogTransformation()
cost.transformation = parameters.construct_transformation()

# With the transformation updated, let's plot the cost landscape again:

# In[ ]:

pybop.plot.contour(cost, steps=15, apply_transform=True);

# ## Concluding Thoughts
#
# In this notebook we've introduced the transformation class and its interaction with the `pybop.Parameters` and `pybop.BaseCost` classes. Transformations allow the optimisation or sampling search space to be rescaled for improved convergence in situations where the optimisation hyperparameters are poorly tuned, or in tasks with large differences in parameter magnitudes.
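# As a short appendix, the axis ranges of the transformed landscapes above can be reproduced by mapping the parameter bounds into the search space by hand. This is a sketch of the arithmetic only; in practice each parameter's transformation instance provides a `to_search` method for this purpose.

# In[ ]:

```python
import numpy as np

# Bounds from the parameter definitions above
neg_bounds = np.array([0.35, 0.7])    # ScaledTransformation: y = 2 * (x - 0.6)
pos_bounds = np.array([0.45, 0.625])  # LogTransformation:    y = ln(x)

neg_search = 2.0 * (neg_bounds - 0.6)
pos_search = np.log(pos_bounds)

print(neg_search)  # [-0.5  0.2]
print(pos_search)  # roughly [-0.799 -0.470]
```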