#!/usr/bin/env python
# coding: utf-8

# ## Parameter Estimation with AdamW in PyBOP
#
# In this notebook, we demonstrate parameter estimation for a single-particle model using the AdamW optimiser [1][2]. AdamW is a gradient-based optimisation algorithm that combines the adaptive, per-parameter step sizes of the Adaptive Gradient Algorithm (AdaGrad) and Root Mean Square Propagation (RMSProp), as in Adam [1], with decoupled weight decay [2].
#
# [[1]: Adam: A Method for Stochastic Optimization](https://arxiv.org/abs/1412.6980)
#
# [[2]: Decoupled Weight Decay Regularization](https://doi.org/10.48550/arXiv.1711.05101)
#
# ### Setting up the Environment
#
# Before we begin, we need to ensure that we have all the necessary tools. We will install PyBOP and upgrade its dependencies:

# In[ ]:


get_ipython().run_line_magic('pip', 'install --upgrade pip ipywidgets -q')
get_ipython().run_line_magic('pip', 'install pybop -q')


# ### Importing Libraries
#
# With the environment set up, we can now import PyBOP alongside the other libraries we will need:

# In[ ]:


import numpy as np

import pybop


# We fix the random seed to generate consistent output during development; this is not required in practice.

# In[ ]:


np.random.seed(8)


# ### Generating Synthetic Data
#
# To demonstrate parameter estimation, we first need some data. We will generate synthetic data using the PyBOP forward model, which requires defining a parameter set and the model itself.
#
# #### Defining Parameters and Model
#
# We start by creating an example parameter set and then instantiate the single-particle model (SPM):

# In[ ]:


parameter_set = pybop.ParameterSet.pybamm("Chen2020")
model = pybop.lithium_ion.SPM(parameter_set=parameter_set)


# ### Simulating the Forward Model
#
# We can then simulate the model using the `predict` method, with a default constant current, to generate voltage data.

# In[ ]:


t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)


# ### Adding Noise to the Voltage Data
#
# To make the parameter estimation more realistic, we add Gaussian noise to the data.

# In[ ]:


sigma = 0.001
corrupt_values = values["Voltage [V]"].data + np.random.normal(0, sigma, len(t_eval))


# ## Identifying the Parameters
#
# We will now set up the parameter estimation process by defining the dataset for optimisation and selecting the model parameters we wish to estimate.
#
# ### Creating the Optimisation Dataset
#
# The dataset for optimisation is composed of time, current, and the noisy voltage data:

# In[ ]:


dataset = pybop.Dataset(
    {
        "Time [s]": t_eval,
        "Current function [A]": values["Current [A]"].data,
        "Voltage [V]": corrupt_values,
    }
)


# ### Defining the Parameters to Estimate
#
# We select the parameters for estimation and set up their prior distributions and bounds:

# In[ ]:


parameters = [
    pybop.Parameter(
        "Negative electrode active material volume fraction",
        prior=pybop.Gaussian(0.6, 0.02),
        bounds=[0.5, 0.8],
    ),
    pybop.Parameter(
        "Positive electrode active material volume fraction",
        prior=pybop.Gaussian(0.48, 0.02),
        bounds=[0.4, 0.7],
    ),
]


# ### Setting up the Optimisation Problem
#
# With the dataset and parameters defined, we can set up the optimisation problem, its cost function, and the optimiser.
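#
# Before wiring these objects together, it may help to see what the AdamW update itself does. The cell below is a minimal, self-contained NumPy sketch of the AdamW update rule [1][2] applied to a toy quadratic cost. It is purely illustrative: it is not PyBOP's internal implementation, and the hyperparameter values (learning rate, betas, epsilon, weight decay) and the toy target are arbitrary choices made for this sketch.

# In[ ]:


# Illustrative AdamW sketch on a toy quadratic cost f(theta) = ||theta - target||^2
target = np.array([0.7, 0.6])  # arbitrary minimiser of the toy cost
theta = np.array([0.5, 0.4])  # starting guess
lr, beta1, beta2, eps, weight_decay = 0.02, 0.9, 0.999, 1e-8, 0.01
m = np.zeros_like(theta)  # first moment (running mean of gradients)
v = np.zeros_like(theta)  # second moment (running mean of squared gradients)

for t in range(1, 1001):
    grad = 2 * (theta - target)  # gradient of the toy cost
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)  # bias-corrected first moment
    v_hat = v / (1 - beta2**t)  # bias-corrected second moment
    # Decoupled weight decay is applied directly to the parameters [2]
    theta = theta - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * theta)

print(theta)  # close to the target (weight decay pulls it slightly towards zero)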
# In[ ]:


problem = pybop.FittingProblem(model, parameters, dataset)
cost = pybop.SumSquaredError(problem)
optim = pybop.Optimisation(cost, optimiser=pybop.AdamW)
optim.set_max_unchanged_iterations(40)
optim.set_max_iterations(150)


# ### Running the Optimisation
#
# We proceed to run the AdamW optimisation algorithm to estimate the parameters:

# In[ ]:


x, final_cost = optim.run()


# ### Viewing the Estimated Parameters
#
# After the optimisation, we can examine the estimated parameter values:

# In[ ]:


x  # This will output the estimated parameters


# ## Plotting and Visualisation
#
# PyBOP provides various plotting utilities to visualise the results of the optimisation.
#
# ### Comparing System Response
#
# We can quickly plot the system's response using the estimated parameters compared to the target:

# In[ ]:


pybop.quick_plot(problem, problem_inputs=x, title="Optimised Comparison");


# ### Convergence and Parameter Trajectories
#
# To assess the optimisation process, we can plot the convergence of the cost function and the trajectories of the parameters:

# In[ ]:


pybop.plot_convergence(optim)
pybop.plot_parameters(optim);


# ### Cost Landscape
#
# Finally, we can visualise the cost landscape and the path taken by the optimiser:

# In[ ]:


# Plot the cost landscape
pybop.plot2d(cost, steps=15)

# Plot the cost landscape with the optimisation path and updated bounds
bounds = np.asarray([[0.6, 0.9], [0.5, 0.8]])
pybop.plot2d(optim, bounds=bounds, steps=15);


# ### Conclusion
#
# This notebook illustrates how to perform parameter estimation using AdamW in PyBOP, providing insights into the optimisation process through various visualisations.
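#
# As an optional final check (not part of the workflow above), we can compare the identified values against the ground-truth Chen2020 values used to generate the synthetic data. This sketch assumes the parameter set supports dictionary-style access to individual parameters, as pybamm's `ParameterValues` does, and that `x` holds the identified values in the order the parameters were defined.

# In[ ]:


# Optional sanity check: compare identified values with the ground-truth
# Chen2020 values that generated the synthetic data (assumes dictionary-style
# access on the parameter set).
true_values = np.array(
    [
        parameter_set["Negative electrode active material volume fraction"],
        parameter_set["Positive electrode active material volume fraction"],
    ]
)
print("Identified:  ", x)
print("Ground truth:", true_values)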