Approximate Bayesian computation (ABC) relies on the efficient comparison of relevant features in simulated and observed data, via distance metrics and potentially summary statistics. Separately, methods have been developed to adaptively scale-normalize the distance metric, and to semi-automatically derive informative, low-dimensional summary statistics.
In the notebook on "Adaptive distances", we demonstrated how distances that adapt weights to normalize scales are beneficial for heterogeneous, including outlier-corrupted, data. However, when parts of the data are uninformative, it is desirable to further concentrate the analysis on informative data points. Various methods have been developed to capture the information of data on parameters in a low-dimensional summary statistics representation; see e.g. Blum et al. 2013 for a review. A particular approach constructs summary statistics as outputs of regression models of parameters on data; see the seminal work by Fearnhead and Prangle 2012. In this notebook, we illustrate the use of regression methods to construct informative summary statistics and sensitivity distance weights in pyABC.
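The core idea of such regression-based statistics can be sketched in a few lines of plain NumPy (a minimal illustration on made-up toy data, not pyABC's implementation): fit a regression model of parameters on data from a prior-predictive sample, and use its prediction $s(y) \approx \mathbb{E}[\theta|y]$ as a low-dimensional summary statistic.

```python
import numpy as np

rng = np.random.default_rng(0)

# prior-predictive sample: parameter p ~ U(0, 10) and a 4-dimensional
# data vector y, of which only the first entry depends on p
n = 500
p = rng.uniform(0, 10, size=n)
y = np.column_stack([
    p + 1 + 0.1 * rng.normal(size=n),   # informative output
    2 + 0.1 * rng.normal(size=(n, 3)),  # uninformative noise outputs
])

# least-squares fit of p on y; the prediction s(y) ~ E[p | y]
# serves as a 1-dimensional summary statistic
X = np.column_stack([np.ones(n), y])
coef, *_ = np.linalg.lstsq(X, p, rcond=None)


def s(y_single):
    """Regression-based summary statistic for a single data vector."""
    return coef[0] + coef[1:] @ y_single
```

The fitted coefficients concentrate on the informative output, so the statistic effectively ignores the noise dimensions.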
# install if not done yet
!pip install pyabc[plotly] --quiet
import numpy as np
import scipy as sp
import tempfile
import matplotlib.pyplot as plt
from functools import partial
import logging
from IPython.display import SVG, display
import pyabc
from pyabc.distance import *
from pyabc.predictor import *
from pyabc.sumstat import *
from pyabc.util import EventIxs, ParTrafo, dict2arrlabels
pyabc.settings.set_figure_params("pyabc") # for beautified plots
# for debugging
for logger in ["ABC.Distance", "ABC.Predictor", "ABC.Sumstat"]:
    logging.getLogger(logger).setLevel(logging.DEBUG)
To illustrate the informativeness of data points, we consider a simple test problem. It consists of a single model output $y_1$ that is informative about the parameter $p_1$, and uninformative model outputs $y_2$.
# problem definition
sigmas = {"p1": 0.1}
def model(p):
    return {
        "y1": p["p1"] + 1 + sigmas["p1"] * np.random.normal(),
        "y2": 2 + 0.1 * np.random.normal(size=3),
    }
gt_par = {"p1": 3}
data = {"y1": gt_par["p1"] + 1, "y2": 2 * np.ones(shape=3)}
prior_bounds = {"p1": (0, 10)}
prior = pyabc.Distribution(
    **{
        key: pyabc.RV("uniform", lb, ub - lb)
        for key, (lb, ub) in prior_bounds.items()
    },
)
We employ three approaches to perform inference on this problem.
Firstly, we use a distance adaptively scale-normalizing all statistics by their respective in-sample median absolute deviations (MAD), as introduced in the "Adaptive distances" notebook ("L1+Ada.+MAD").
Secondly, we employ an approach similar to Fearnhead and Prangle 2012, using as summary statistic a linear regression model trained after 40% of the total simulation budget, combined with a simple L1 distance ("L1+StatLR").
Thirdly, we complement the MAD scale-normalizing weights by sensitivity weights, derived via normalized sensitivities of a linear regression model trained similarly to the second approach. This method thus accounts for informativeness by re-weighting of model outputs, without explicitly employing a low-dimensional summary statistics representation.
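For intuition, the MAD-based scale normalization used by the first and third approach can be sketched as follows (a standalone toy example mirroring the test problem, with assumed sample sizes; not the pyabc.distance internals):

```python
import numpy as np


def mad_scale_weights(sims):
    """Inverse median-absolute-deviation weights for a population.

    sims: array of shape (n_sample, n_y) of simulated model outputs.
    Low-variability outputs get large weights, so that all outputs
    contribute on comparable scales to the distance.
    """
    mad = np.median(np.abs(sims - np.median(sims, axis=0)), axis=0)
    return 1.0 / mad


rng = np.random.default_rng(1)
sims = np.column_stack([
    rng.uniform(1, 11, size=200),         # y1: spread ~ 10
    2 + 0.1 * rng.normal(size=(200, 3)),  # y2: spread ~ 0.1
])
weights = mad_scale_weights(sims)
# the three y2 components receive much larger weights than y1
```

Note that pure scale normalization assigns the small-variability (but uninformative) $y_2$ components large weights, which is exactly what the sensitivity weights of the third approach counteract.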
# analysis definition
pop_size = 100
total_sims = 3000
fit_sims = 0.4 * total_sims
YPredictor = LinearPredictor
#YPredictor = MLPPredictor
distances = {
    "L1+Ada.+MAD": AdaptivePNormDistance(
        p=1,
        # adaptive scale normalization
        scale_function=mad,
    ),
    "L1+StatLR": PNormDistance(
        p=1,
        # regression-based summary statistics
        sumstat=PredictorSumstat(
            # regression model used
            predictor=YPredictor(),
            # when to fit the regression model
            fit_ixs=EventIxs(sims=fit_sims),
        ),
    ),
    "L1+Ada.+MAD+SensiLR": InfoWeightedPNormDistance(
        p=1,
        # adaptive scale normalization
        scale_function=mad,
        # regression model used to define sensitivity weights
        predictor=YPredictor(),
        # when to fit the regression model
        fit_info_ixs=EventIxs(sims=fit_sims),
    ),
}
colors = {
    distance_id: f"C{i}" for i, distance_id in enumerate(distances)
}
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Sumstat DEBUG: Fit model ixs: <EventIxs, sims=[1200.0]>
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Distance DEBUG: Fit info ixs: <EventIxs, sims=[1200.0]>
We perform the analysis using all of the above distance functions and summary statistics. Additionally, we specify various log files below, to capture relevant information for further analysis.
# runs
db_file = tempfile.mkstemp(suffix=".db")[1]
scale_log_file = tempfile.mkstemp()[1]
info_log_file = tempfile.mkstemp()[1]
info_sample_log_file = tempfile.mkstemp()[1]
hs = []
for distance_id, distance in distances.items():
    print(distance_id)
    if isinstance(distance, AdaptivePNormDistance):
        distance.scale_log_file = f"{scale_log_file}_{distance_id}.json"
    if isinstance(distance, InfoWeightedPNormDistance):
        distance.info_log_file = f"{info_log_file}_{distance_id}.json"
        distance.info_sample_log_file = f"{info_sample_log_file}_{distance_id}"
    abc = pyabc.ABCSMC(model, prior, distance, population_size=pop_size)
    h = abc.new(db="sqlite:///" + db_file, observed_sum_stat=data)
    abc.run(max_total_nr_simulations=total_sims)
    hs.append(h)
ABC.Sampler INFO: Parallelize sampling on 4 processes.
ABC.History INFO: Start <ABCSMC id=1, start_time=2021-10-05 16:41:13>
L1+Ada.+MAD
[log output abbreviated: 6 generations (t = 0…5); MAD scale weights stay at ~14-16 for the y2 components while the y1 weight grows from ~0.40 to ~2.11; eps 4.61 -> 2.15, acceptance rate 45% -> 7.8%; stopped on total simulations budget]
L1+StatLR
[log output abbreviated: 9 generations (t = 0…8); linear regression summary statistic fitted after t = 5, Pearson correlation 0.855, with the dominant coefficient on y1; eps 2.88 -> 0.080; stopped on total simulations budget]
L1+Ada.+MAD+SensiLR
[log output abbreviated: 9 generations (t = 0…8); regression model fitted after t = 2, Pearson correlation 0.999; the resulting sensitivity weights concentrate on y1 (info weight ~0.99 vs <= 0.003 for the y2 components); stopped on total simulations budget]
The comparison of the obtained posterior approximations with the true posterior reveals that L1+Ada.+MAD gives a worse fit than the other approaches. This is because it only applies scale normalization, but does not account for the informativeness of data, and thus spends many simulations on fitting $y_2$.
# plot ABC posterior approximations
fig, axes = plt.subplots(ncols=len(prior_bounds), figsize=(6, 4))
if len(prior_bounds) == 1:
    axes = [axes]


# plot ground truth
def unnorm_1d_normal_pdf(p, y_obs, sigma, p_to_y=None):
    """Unnormalized 1-d normal density.

    Parameters
    ----------
    p: Parameter to evaluate at.
    y_obs: Observed data.
    sigma: Noise standard deviation.
    p_to_y: Function to deterministically transform p to simulated data.

    Returns
    -------
    pd: Probability density/densities at p.
    """
    if p_to_y is None:
        p_to_y = lambda p: p
    y = p_to_y(p)
    pd = np.exp(-((y - y_obs) ** 2) / (2 * sigma**2))
    return pd


for i_par, par in enumerate(gt_par.keys()):
    # define parameter-simulation transformation
    p_to_y = lambda p: p + 1
    # observed data corresponding to parameter
    y_obs = p_to_y(gt_par[par])
    # bounds
    xmin, xmax = prior_bounds[par]
    # standard deviation
    sigma = sigmas[par]
    # pdf as function of only p
    pdf = partial(
        unnorm_1d_normal_pdf, y_obs=y_obs, sigma=sigma, p_to_y=p_to_y,
    )
    # normalization constant via numerical integration
    norm = sp.integrate.quad(pdf, xmin, xmax)[0]
    # plot density
    xs = np.linspace(xmin, xmax, 300)
    axes[i_par].plot(
        xs, pdf(xs) / norm, linestyle="dashed",
        color="grey", label="ground truth",
    )

# plot ABC approximations
for i_par, par in enumerate(prior_bounds.keys()):
    for distance_id, h in zip(distances.keys(), hs):
        pyabc.visualization.plot_kde_1d_highlevel(
            h, x=par, xname=par,
            xmin=prior_bounds[par][0], xmax=prior_bounds[par][1],
            ax=axes[i_par], label=distance_id,
            numx=500,
        )

# prettify
for ax in axes[1:]:
    ax.set_ylabel(None)
fig.tight_layout(rect=(0, 0.1, 1, 1))
axes[-1].legend()
Via the log files, we can further examine the employed weights: firstly, scale-normalizing weights based on MAD; secondly, sensitivity weights quantifying informativeness. Indeed, while both L1+Ada.+MAD and L1+Ada.+MAD+SensiLR assign large scale weights to $y_2$, the additional sensitivity weights employed by L1+Ada.+MAD+SensiLR counteract this by assigning a large weight to $y_1$.
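The sensitivity weights can be understood as normalized finite-difference sensitivities of the fitted regression model with respect to the model outputs. A minimal sketch (the helper function and the coefficient matrix `C` are hypothetical, loosely modeled on the regression fitted in the run above; pyABC's actual computation differs in details):

```python
import numpy as np


def sensitivity_weights(predict, y0, delta=0.1):
    """Normalized finite-difference sensitivities of a regression model
    theta = predict(y) with respect to the model outputs, at point y0."""
    sens = np.zeros(y0.size)
    for j in range(y0.size):
        y_plus, y_minus = y0.copy(), y0.copy()
        y_plus[j] += delta
        y_minus[j] -= delta
        # aggregate absolute sensitivities over all predicted parameters
        sens[j] = np.sum(np.abs(predict(y_plus) - predict(y_minus))) / (2 * delta)
    return sens / sens.sum()


# hypothetical linear predictor p ~ C @ y, with coefficients loosely
# resembling the fitted regression model
C = np.array([[1.0, -0.004, 0.003, 0.0003]])
w_info = sensitivity_weights(lambda y: C @ y, y0=np.array([4.0, 2.0, 2.0, 2.0]))
```

For such a predictor, nearly all sensitivity weight lands on the first (informative) output, which then dominates the weighted distance.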
# plot weights
fig, axes = plt.subplots(nrows=2, ncols=len(gt_par), figsize=(4, 8))

# scale weights
scale_distance_ids = [
    distance_id for distance_id in distances.keys()
    if "Ada." in distance_id and "Stat" not in distance_id
]
scale_log_files = [
    f"{scale_log_file}_{distance_id}.json"
    for distance_id in scale_distance_ids
]
pyabc.visualization.plot_distance_weights(
    scale_log_files,
    labels=scale_distance_ids,
    colors=[colors[distance_id] for distance_id in scale_distance_ids],
    xlabel="Model output",
    title="Scale weights",
    ax=axes[0],
    keys=dict2arrlabels(data, keys=data.keys()),
)

# info weights
info_distance_ids = [
    distance_id for distance_id in distances.keys()
    if "Sensi" in distance_id
]
info_log_files = [
    f"{info_log_file}_{distance_id}.json"
    for distance_id in info_distance_ids
]
pyabc.visualization.plot_distance_weights(
    info_log_files,
    labels=info_distance_ids,
    colors=[colors[distance_id] for distance_id in info_distance_ids],
    xlabel="Model output",
    title="Sensitivity weights",
    ax=axes[1],
    keys=dict2arrlabels(data, keys=data.keys()),
)
fig.tight_layout()
To further understand the employed sensitivities, we can visualize the connections between model outputs and parameters (here a single one) via a "Sankey" flow diagram. For this problem, it gives no further information beyond the above weight plots.
# plot flow diagram
fig = pyabc.visualization.plot_sensitivity_sankey(
    info_sample_log_file=f"{info_sample_log_file}_L1+Ada.+MAD+SensiLR",
    t=f"{info_log_file}_L1+Ada.+MAD+SensiLR.json",
    h=hs[-1],
    predictor=LinearPredictor(),
)
# here just showing a non-interactive plot to reduce storage
img_file = tempfile.mkstemp(suffix=".svg")[1]
fig.write_image(img_file)
display(SVG(img_file))
[log output abbreviated: LinearPredictor refitted for the Sankey plot, Pearson correlation 0.999, with the dominant regression coefficient on y1]
Now, we turn to a slightly more challenging problem, which encompasses multiple features that established methods struggle with: heterogeneous output scales, uninformative outputs, and a parameter that is only identifiable up to sign.
# problem definition
sigmas = {"p1": 1e-1, "p2": 1e2, "p3": 1e2, "p4": 1e-1}
def model(p):
    return {
        "y1": p["p1"] + sigmas["p1"] * np.random.normal(),
        "y2": p["p2"] + sigmas["p2"] * np.random.normal(),
        "y3": p["p3"] + np.sqrt(4 * sigmas["p3"]**2) * np.random.normal(size=4),
        "y4": p["p4"] ** 2 + sigmas["p4"] * np.random.normal(),
        "y5": 1e1 * np.random.normal(size=10),
    }
prior_bounds = {
    "p1": (-7e0, 7e0),
    "p2": (-7e2, 7e2),
    "p3": (-7e2, 7e2),
    "p4": (-1e0, 1e0),
}
prior = pyabc.Distribution(
    **{
        key: pyabc.RV("uniform", lb, ub - lb)
        for key, (lb, ub) in prior_bounds.items()
    },
)
gt_par = {"p1": 0, "p2": 0, "p3": 0, "p4": 0.5}
data = {"y1": 0, "y2": 0, "y3": 0 * np.ones(4), "y4": 0.5**2, "y5": 0 * np.ones(10)}
To tackle these problems, we suggest, firstly, to consistently employ scale normalization, both on the raw model outputs and on the level of summary statistics. Secondly, we suggest to not only infer a mapping $s: y \mapsto \theta$, but to target augmented parameter vectors, $s: y \mapsto \lambda(\theta)$, with e.g. $\lambda(\theta) = (\theta^1,\ldots,\theta^4)$. Practically, this allows to break symmetries, e.g. if only $\theta^2$ can be expressed as a function of the data. Conceptually, it further allows a more accurate description of the posterior distribution, as the summary statistics may be regarded as approximations to $s(y) = \mathbb{E}[\lambda(\theta)|y]$, using which as summary statistics preserves the corresponding posterior moments, i.e.

$$\lim_{\varepsilon\rightarrow 0}\mathbb{E}_{\pi_{\text{ABC},\varepsilon}}[\lambda(\Theta)|s(y_\text{obs})] = \mathbb{E}[\lambda(\Theta)|Y=y_\text{obs}].$$

Methods employing scale normalization, accounting for informativeness, and using augmented regression targets are L1+Ada.+MAD+StatLR+P4, which uses regression-based summary statistics, and L1+Ada.+MAD+SensiLR+P4, which uses sensitivity weights. For comparison, we consider L1+Ada.+MAD, only normalizing scales, and L1+StatLR, using non-scale-normalized summary statistics, as well as L1+Ada.+MAD+StatLR and L1+Ada.+MAD+SensiLR, each using only a subset of these techniques.
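The symmetry-breaking effect of augmented targets can be seen in a minimal toy sketch (assumed numbers, independent of pyABC): if the data only inform $\theta^2$, as for $y_4 = p_4^2$ above, then $\theta$ itself is useless as a regression target, while $\theta^2$ is well predictable.

```python
import numpy as np

rng = np.random.default_rng(2)

# toy analogue of y4 = p4**2: the data determine the parameter
# only up to sign
theta = rng.uniform(-1, 1, size=1000)
y = theta**2 + 0.05 * rng.normal(size=1000)

corr_theta = np.corrcoef(y, theta)[0, 1]      # near 0: symmetric in theta
corr_theta2 = np.corrcoef(y, theta**2)[0, 1]  # near 1: identifiable
```

Hence including $\theta^2$ among the regression targets recovers information that a regression on $\theta$ alone would miss.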
# analysis definition
pop_size = 1000
total_sims = 50000
par_trafos = [lambda x: x, lambda x: x**2, lambda x: x**3, lambda x: x**4]
fit_sims = 0.4 * total_sims
YPredictor = LinearPredictor
#YPredictor = MLPPredictor
distances = {
    "L1+Ada.+MAD": AdaptivePNormDistance(
        p=1,
        scale_function=mad,
    ),
    "L1+StatLR": PNormDistance(
        p=1,
        sumstat=PredictorSumstat(
            predictor=YPredictor(
                normalize_features=False, normalize_labels=False,
            ),
            fit_ixs=EventIxs(sims=fit_sims),
        ),
    ),
    "L1+Ada.+MAD+StatLR": AdaptivePNormDistance(
        p=1,
        scale_function=mad,
        sumstat=PredictorSumstat(
            predictor=YPredictor(),
            fit_ixs=EventIxs(sims=fit_sims),
        ),
    ),
    "L1+Ada.+MAD+StatLR+P4": AdaptivePNormDistance(
        p=1,
        scale_function=mad,
        sumstat=PredictorSumstat(
            predictor=YPredictor(),
            fit_ixs=EventIxs(sims=fit_sims),
            par_trafo=ParTrafo(trafos=par_trafos),
        ),
    ),
    "L1+Ada.+MAD+SensiLR": InfoWeightedPNormDistance(
        p=1,
        scale_function=mad,
        predictor=YPredictor(),
        fit_info_ixs=EventIxs(sims=fit_sims),
        feature_normalization="mad",
    ),
    "L1+Ada.+MAD+SensiLR+P4": InfoWeightedPNormDistance(
        p=1,
        scale_function=mad,
        predictor=YPredictor(),
        fit_info_ixs=EventIxs(sims=fit_sims),
        feature_normalization="mad",
        par_trafo=ParTrafo(trafos=par_trafos),
    ),
}
colors = {
    distance_id: f"C{i}" for i, distance_id in enumerate(distances)
}
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Sumstat DEBUG: Fit model ixs: <EventIxs, sims=[20000.0]>
ABC.Sumstat DEBUG: Fit model ixs: <EventIxs, sims=[20000.0]>
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Sumstat DEBUG: Fit model ixs: <EventIxs, sims=[20000.0]>
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Distance DEBUG: Fit info ixs: <EventIxs, sims=[20000.0]>
ABC.Distance DEBUG: Fit scale ixs: <EventIxs, ts={inf}>
ABC.Distance DEBUG: Fit info ixs: <EventIxs, sims=[20000.0]>
For the analysis, we suggest the use of sufficiently large population sizes, as the process model is more complex.
%%time
# runs
db_file = tempfile.mkstemp(suffix=".db")[1]
scale_log_file = tempfile.mkstemp()[1]
info_log_file = tempfile.mkstemp()[1]
info_sample_log_file = tempfile.mkstemp()[1]
hs = []
for distance_id, distance in distances.items():
    print(distance_id)
    if isinstance(distance, AdaptivePNormDistance):
        distance.scale_log_file = f"{scale_log_file}_{distance_id}.json"
    if isinstance(distance, InfoWeightedPNormDistance):
        distance.info_log_file = f"{info_log_file}_{distance_id}.json"
        distance.info_sample_log_file = f"{info_sample_log_file}_{distance_id}"
    abc = pyabc.ABCSMC(model, prior, distance, population_size=pop_size)
    h = abc.new(db="sqlite:///" + db_file, observed_sum_stat=data)
    abc.run(max_total_nr_simulations=total_sims)
    hs.append(h)
ABC.Sampler INFO: Parallelize sampling on 4 processes.
ABC.History INFO: Start <ABCSMC id=1, start_time=2021-10-05 16:41:28>
ABC INFO: Calibration sample t = -1.
L1+Ada.+MAD
[log output abbreviated: 7 generations (t = 0…6); MAD scale weights follow the inverse output scales (~0.003 for the y2 and y3 components, ~0.15 for y5, ~0.3-0.5 for y1, ~4-6 for y4); acceptance rate drops from 50% to 3.6%; stopped on total simulations budget]
L1+StatLR
[log output abbreviated: 9 generations (t = 0…8); linear regression fitted after t = 5 on the raw (non-normalized) outputs, Pearson correlations 1.000, 0.886, 0.815, 0.097 for p1-p4; eps 1815 -> 55.6; stopped on total simulations budget]
L1+Ada.+MAD+StatLR
[log output abbreviated: 11 generations (t = 0…10); regression fitted after t = 4, Pearson correlations 0.999, 0.956, 0.897, 0.079 for p1-p4; from t = 5 on, scale weights are computed on the summary statistics s_p1…s_p4; stopped on total simulations budget]
L1+Ada.+MAD+StatLR+P4
ABC.Population INFO: Recording also rejected particles: True
ABC INFO: t: 0, eps: 1.90970210e+01.
ABC INFO: Accepted: 1000 / 2014 = 4.9652e-01, ESS: 1.0000e+03.
ABC.Predictor INFO: Fitted <LinearPredictor predictor=LinearRegression(normalize=True)> in 0.03s
ABC.Predictor INFO: Pearson correlations: 1.000 0.958 0.882 0.112 0.108 0.052 0.134 0.923 0.881 0.846 0.722 0.130 0.102 0.052 0.164 0.869
[... per-generation scale-weight updates on the 16 augmented summary statistics s_p1_0..s_p4_3 and the regression coefficient matrix elided; eps decreases from 1.90970210e+01 at t = 0 to 1.05113788e+01 at t = 9 ...]
ABC INFO: Stop: Total simulations budget.
ABC.History INFO: Done <ABCSMC id=4, duration=0:01:48.005309, end_time=2021-10-05 16:47:31>
ABC.Sampler INFO: Parallelize sampling on 4 processes.
ABC.History INFO: Start <ABCSMC id=5, start_time=2021-10-05 16:47:31>
ABC INFO: Calibration sample t = -1.
L1+Ada.+MAD+SensiLR
ABC.Population INFO: Recording also rejected particles: True
ABC INFO: t: 0, eps: 1.88845213e+01.
ABC INFO: Accepted: 1000 / 2075 = 4.8193e-01, ESS: 1.0000e+03.
ABC.Predictor INFO: Fitted <LinearPredictor predictor=LinearRegression(normalize=True)> in 0.01s
ABC.Predictor INFO: Pearson correlations: 1.000 0.960 0.892 0.072
ABC.Distance DEBUG: Info weights[5] = {'y1': 1.2262e+00, 'y2': 1.0339e+00, 'y3:0': 3.3462e-01, 'y3:1': 3.5974e-01, 'y3:2': 2.5367e-01, 'y3:3': 2.8884e-01, 'y4': 1.7384e-01, 'y5:0': 3.2926e-02, ...}
[... per-generation scale-weight updates and the regression coefficient matrix elided; eps decreases to 2.98269522e+00 at t = 10 ...]
ABC INFO: Stop: Total simulations budget.
ABC.History INFO: Done <ABCSMC id=5, duration=0:01:20.943401, end_time=2021-10-05 16:48:52>
ABC.Sampler INFO: Parallelize sampling on 4 processes.
ABC.History INFO: Start <ABCSMC id=6, start_time=2021-10-05 16:48:52>
ABC INFO: Calibration sample t = -1.
L1+Ada.+MAD+SensiLR+P4
ABC.Population INFO: Recording also rejected particles: True
ABC INFO: t: 0, eps: 1.90370823e+01.
ABC INFO: Accepted: 1000 / 2062 = 4.8497e-01, ESS: 1.0000e+03.
ABC.Predictor INFO: Fitted <LinearPredictor predictor=LinearRegression(normalize=True)> in 0.01s
ABC.Predictor INFO: Pearson correlations: 1.000 0.957 0.894 0.103 0.080 0.057 0.077 0.922 0.874 0.844 0.702 0.118 0.086 0.048 0.075 0.871
ABC.Distance DEBUG: Info weights[5] = {'y1': 3.1271e+00, 'y2': 3.2786e+00, 'y3:0': 1.2268e+00, 'y3:1': 8.0065e-01, 'y3:2': 7.5554e-01, 'y3:3': 1.0245e+00, 'y4': 2.6664e+00, 'y5:0': 3.5140e-01, ...}
[... per-generation scale-weight updates and the regression coefficient matrix elided; eps decreases to 1.25985309e+01 at t = 9 ...]
ABC INFO: Stop: Total simulations budget.
ABC.History INFO: Done <ABCSMC id=6, duration=0:01:20.459454, end_time=2021-10-05 16:50:12>
CPU times: user 3min 38s, sys: 15.6 s, total: 3min 53s
Wall time: 8min 43s
While overall all approaches would benefit from a continued analysis, the approaches L1+Ada.+MAD+StatLR+P4 and L1+Ada.+MAD+SensiLR+P4, which combine scale normalization, accounting for informativeness, and augmented regression targets, approximate the true posterior distribution best. Using only scale normalization captures the overall dynamics, but gives large uncertainties, as unnecessary emphasis is put on $y_5$. Approaches using only $\theta$ as regression targets in turn fail to capture the dynamics of $\theta_4$, as the regression model cannot unravel a meaningful relationship between data and parameters.
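Why plain-parameter regression targets fail on $\theta_4$ can be seen in a small toy calculation, independent of pyABC: when a model output depends on a parameter only through its square (as $y_4$ does on $p_4$ here), the linear correlation with the plain target essentially vanishes, while the augmented target $p_4^2$ is almost perfectly linearly predictable. This is a minimal sketch with simulated numbers, not part of the analysis above:

```python
import numpy as np

rng = np.random.default_rng(0)
# parameter sample from a prior symmetric around zero
p4 = rng.uniform(-0.5, 0.5, size=1000)
# model output depends on p4 only via its square, plus small noise
y4 = p4**2 + 0.01 * rng.normal(size=1000)

# linear correlation with the plain target p4 is near zero ...
corr_plain = np.corrcoef(y4, p4)[0, 1]
# ... while the augmented target p4**2 correlates almost perfectly
corr_aug = np.corrcoef(y4, p4**2)[0, 1]
print(f"plain target: {corr_plain:.3f}, augmented target: {corr_aug:.3f}")
```

A linear regression of $\theta$ on data thus assigns $y_4$ a near-zero coefficient for the plain target, whereas adding $\theta^2$ (and higher powers, as in the "P4" variants) lets the model pick up the quadratic relationship.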
fig, axes = plt.subplots(ncols=len(prior_bounds), figsize=(16, 4))

# plot ground truth
for i_par, par in enumerate(gt_par.keys()):
    # define parameter-simulation transformation
    p_to_y = lambda p: p
    if par == "p4":
        p_to_y = lambda p: p**2
    # observed data corresponding to parameter
    y_obs = p_to_y(gt_par[par])
    # bounds
    xmin, xmax = prior_bounds[par]
    # standard deviation
    sigma = sigmas[par]
    # pdf as function of only p
    pdf = partial(
        unnorm_1d_normal_pdf, y_obs=y_obs, sigma=sigma, p_to_y=p_to_y,
    )
    # integrate density
    norm = sp.integrate.quad(pdf, xmin, xmax)[0]
    # plot density
    xs = np.linspace(xmin, xmax, 300)
    axes[i_par].plot(
        xs, pdf(xs) / norm, linestyle="dashed",
        color="grey", label="ground truth",
    )

# plot ABC approximations
for i_par, par in enumerate(prior_bounds.keys()):
    for distance_id, h in zip(distances.keys(), hs):
        pyabc.visualization.plot_kde_1d_highlevel(
            h, x=par, xname=par,
            xmin=prior_bounds[par][0], xmax=prior_bounds[par][1],
            ax=axes[i_par], label=distance_id,
            kde=pyabc.GridSearchCV() if par == "p4" else None,
            numx=500,
        )

# prettify
for ax in axes[1:]:
    ax.set_ylabel(None)
fig.tight_layout(rect=(0, 0.1, 1, 1))
axes[-1].legend(
    bbox_to_anchor=(1, -0.2), loc="upper right", ncol=len(distances) + 1,
)
While the scale weights accurately reflect the scales on which the various model output types vary, the sensitivity weights are high for $y_1$ through $y_4$, with low weights assigned to $y_5$. Very roughly, the sum of the sensitivity weights over the four model outputs constituting $y_3$ equals e.g. the sensitivity weight assigned to $y_2$, as desirable. However, the assigned weights are not completely homogeneous, indicating that a larger training sample or a more complex regression model may be preferable.
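Conceptually, a sensitivity weight for a data coordinate aggregates the absolute sensitivities of all regression targets with respect to that coordinate, after normalizing per target so that each parameter contributes equally. The following toy sketch illustrates this idea on a hypothetical coefficient matrix; it mirrors the spirit of pyABC's info-weighted distances, while pyABC's actual implementation (e.g. finite-difference sensitivities of the fitted predictor) may differ in detail:

```python
import numpy as np

# hypothetical regression coefficient matrix (n_target=2 parameters,
# n_feature=3 data coordinates), standing in for a fitted linear predictor
coefs = np.array([
    [0.9, 0.05, 0.0],   # parameter 1 depends mostly on data coordinate 0
    [0.1, 0.8, 0.0],    # parameter 2 depends mostly on data coordinate 1
])

# normalize absolute sensitivities per parameter (row), then sum over
# parameters to obtain one weight per data coordinate
row_sums = np.sum(np.abs(coefs), axis=1, keepdims=True)
info_weights = np.sum(np.abs(coefs) / row_sums, axis=0)
print(info_weights)
```

The uninformative third coordinate receives weight zero, while the two informative coordinates share the total weight roughly equally — the behavior observed for $y_1$ through $y_4$ versus $y_5$ above.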
# plot weights
fig, axes = plt.subplots(nrows=2, ncols=1, figsize=(8, 8))

# scale weights
scale_distance_ids = [
    distance_id for distance_id in distances.keys()
    if "Ada." in distance_id and "Stat" not in distance_id
]
scale_log_files = []
for distance_id in scale_distance_ids:
    scale_log_files.append(f"{scale_log_file}_{distance_id}.json")
pyabc.visualization.plot_distance_weights(
    scale_log_files,
    labels=scale_distance_ids,
    colors=[colors[distance_id] for distance_id in scale_distance_ids],
    xlabel="Model output",
    title="Scale weights",
    ax=axes[0],
    keys=dict2arrlabels(data, keys=data.keys()),
)

# info weights
info_distance_ids = [
    distance_id for distance_id in distances.keys()
    if "Sensi" in distance_id
]
info_log_files = []
for distance_id in info_distance_ids:
    info_log_files.append(f"{info_log_file}_{distance_id}.json")
pyabc.visualization.plot_distance_weights(
    info_log_files,
    labels=info_distance_ids,
    colors=[colors[distance_id] for distance_id in info_distance_ids],
    xlabel="Model output",
    title="Sensitivity weights",
    ax=axes[1],
    keys=dict2arrlabels(data, keys=data.keys()),
)
fig.tight_layout()
# plot flow diagram
fig = pyabc.visualization.plot_sensitivity_sankey(
info_sample_log_file=f"{info_sample_log_file}_L1+Ada.+MAD+SensiLR+P4",
t=f"{info_log_file}_L1+Ada.+MAD+SensiLR+P4.json",
h=hs[-1],
predictor=LinearPredictor(),
par_trafo=ParTrafo(trafos=par_trafos),
height=900,
)
# here just showing a non-interactive plot to reduce storage
img_file = tempfile.mkstemp(suffix=".svg")[1]
fig.write_image(img_file)
display(SVG(img_file))
ABC.Predictor INFO: Fitted <LinearPredictor predictor=LinearRegression(normalize=True)> in 0.01s
ABC.Predictor INFO: Pearson correlations: 1.000 0.957 0.894 0.103 0.080 0.057 0.077 0.922 0.874 0.844 0.702 0.118 0.086 0.048 0.075 0.871
ABC.Predictor DEBUG: Linear regression coefficients (n_target, n_feature): [16x17 matrix, omitted here for brevity]
ABC.Distance DEBUG: Optimal FD delta: [0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1]
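The "Optimal FD delta" log line refers to the finite-difference step used to approximate the sensitivities of the fitted predictor at the observed data. A minimal sketch of such a central finite-difference scheme (with a hypothetical helper name, not pyABC's internal routine):

```python
import numpy as np

def fd_sensitivities(predict, y, delta=0.1):
    """Central finite-difference sensitivities of predict at point y.

    predict maps a data vector of shape (n_out,) to a parameter vector
    of shape (n_par,); returns an array of shape (n_out, n_par)."""
    y = np.asarray(y, dtype=float)
    n_out = y.size
    sens = np.empty((n_out, np.asarray(predict(y)).size))
    for j in range(n_out):
        e = np.zeros(n_out)
        e[j] = delta
        # central difference along output coordinate j
        sens[j] = (predict(y + e) - predict(y - e)) / (2 * delta)
    return sens

# for a linear predictor p = W.T @ y, the sensitivities recover W exactly
W = np.array([[1.0, 0.0], [0.0, 2.0], [0.0, 0.0]])
sens = fd_sensitivities(lambda y: W.T @ y, np.zeros(3))
```

In practice the step size `delta` matters for non-linear predictors, which is why a per-coordinate step is tuned; here all coordinates ended up at 0.1.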