In this notebook, we discuss how to change the PyBaMM solver used by a PyBOP model and the performance trade-offs of each option. For further reading on the different solvers, see the PyBaMM solver documentation.
Before we begin, we need to ensure that we have all the necessary tools. We will install PyBOP and upgrade dependencies:
%pip install --upgrade pip ipywidgets -q
%pip install pybop -q
import time
import numpy as np
import pybamm
import pybop
Note: you may need to restart the kernel to use updated packages.
Let's fix the random seed in order to generate consistent output during development, although this does not need to be done in practice.
np.random.seed(8)
We start by constructing a pybop model and the synthetic dataset needed for the pybop problem that we will use for solver benchmarking.
# Model
parameter_set = pybop.ParameterSet.pybamm("Chen2020")
model = pybop.lithium_ion.SPM(parameter_set=parameter_set)
# Synthetic data
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)
# Dataset
dataset = pybop.Dataset(
{
"Time [s]": t_eval,
"Current function [A]": values["Current [A]"].data,
"Voltage [V]": values["Voltage [V]"].data,
}
)
# Parameters
parameters = pybop.Parameters(
pybop.Parameter(
"Negative electrode active material volume fraction",
prior=pybop.Gaussian(0.6, 0.02),
bounds=[0.5, 0.8],
),
pybop.Parameter(
"Positive electrode active material volume fraction",
prior=pybop.Gaussian(0.48, 0.02),
bounds=[0.4, 0.7],
),
)
Now that we have set up the majority of the pybop objects, we construct the solvers we want to benchmark on the given model and applied current.
solvers = [
pybamm.IDAKLUSolver(atol=1e-6, rtol=1e-6),
pybamm.CasadiSolver(atol=1e-6, rtol=1e-6, mode="safe"),
pybamm.CasadiSolver(atol=1e-6, rtol=1e-6, mode="fast"),
pybamm.CasadiSolver(atol=1e-6, rtol=1e-6, mode="fast with events"),
]
Next, we construct a range of inputs for the parameters defined above and select the number of instances in that range to benchmark. For more statistically repeatable results, increase the variable n below.
n = 50 # Number of solves
inputs = list(zip(np.linspace(0.45, 0.6, n), np.linspace(0.45, 0.6, n)))
Next, let's benchmark the solvers without sensitivities. This provides a reference for the non-gradient-based pybop optimisers and samplers.
for solver in solvers:
model.solver = solver
problem = pybop.FittingProblem(model, parameters, dataset)
start_time = time.time()
for input_values in inputs:
problem.evaluate(inputs=input_values)
print(f"Time Evaluate {solver.name}: {time.time() - start_time:.3f}")
Time Evaluate IDA KLU solver: 0.477
Time Evaluate CasADi solver with 'safe' mode: 0.464
Time Evaluate CasADi solver with 'fast' mode: 0.446
Time Evaluate CasADi solver with 'fast with events' mode: 0.443
Excellent. Given the above results, we know which solver we should select for optimisation on this machine, i.e. the one with the smallest evaluation time.
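If you prefer to pick the fastest solver programmatically rather than by reading the printed times, a minimal sketch is shown below. It simply re-uses the evaluation loop from above; the `timings` dictionary and `best_solver` variable are illustrative names, not part of the PyBOP API.

# A minimal sketch: time each solver and keep the one with the smallest total
# evaluation time. `timings` and `best_solver` are illustrative names only.
timings = {}
for solver in solvers:
    model.solver = solver
    problem = pybop.FittingProblem(model, parameters, dataset)
    start_time = time.time()
    for input_values in inputs:
        problem.evaluate(inputs=input_values)
    timings[solver.name] = time.time() - start_time

best_solver = min(timings, key=timings.get)
print(f"Fastest solver for evaluate: {best_solver}")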
Next, let's repeat the same toy problem, but for the gradient-based cost evaluation:
for solver in solvers:
model.solver = solver
problem = pybop.FittingProblem(model, parameters, dataset)
start_time = time.time()
for input_values in inputs:
problem.evaluateS1(inputs=input_values)
print(f"Time EvaluateS1 {solver.name}: {time.time() - start_time:.3f}")
Time EvaluateS1 IDA KLU solver: 0.997
Time EvaluateS1 CasADi solver with 'safe' mode: 2.743
Time EvaluateS1 CasADi solver with 'fast' mode: 2.689
Time EvaluateS1 CasADi solver with 'fast with events' mode: 2.688
Now we have the relevant information for the gradient-based optimisers. As with the results above, we should select the solver with the smallest time.
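To carry the chosen solver forward, assign it to the model before constructing the fitting problem, exactly as in the benchmarking loops above. A minimal sketch follows, assuming the IDAKLU solver was the fastest for gradient-based evaluation on this machine; swap in whichever solver performed best for you.

# A minimal sketch: assign the chosen solver before building the problem used
# for optimisation. We assume here that IDAKLU won the gradient-based benchmark.
model.solver = pybamm.IDAKLUSolver(atol=1e-6, rtol=1e-6)
problem = pybop.FittingProblem(model, parameters, dataset)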