This notebook is part of a notebook collection available at the NEOPRENE Project Site for illustration, reproducibility and reusability purposes. This notebook is licensed under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
In this script we calibrate the NSRPM (Neyman Scott Rectangular Pulse Model) using two rainfall series with different rainfall regimes: one with daily data from the north of Spain, in a temperate climate (Cfb), and another with hourly data from New Mexico (USA), in a semi-arid climate (BSh-BSk). The calibrated parameters are used to simulate several decades of synthetic rainfall data, which can be very useful for rainfall extreme analysis or disaggregation purposes, among other applications.
The script also contains:
a validation section where the observed series are compared with the simulated ones in terms of their statistics and exceedance probabilities.
a disaggregation function to disaggregate daily rainfall data to hourly resolution.
Please be advised that some of the processes may take up to 5 minutes on a modern computer.
%load_ext autoreload
%autoreload 2
import os
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from NEOPRENE.NSRP.HiperParams import Calibration as Calhps, Simulation as Simhps
from NEOPRENE.NSRP.Calibration import Calibration
from NEOPRENE.NSRP.Statistics import Statistics
from NEOPRENE.NSRP.Simulation import Simulation
from NEOPRENE.NSRP.Analysis import Analysis
import warnings
warnings.filterwarnings('ignore')
Serie_PD = pd.read_csv('auxiliary-materials/Point_Daily.csv', sep=";", decimal=".", index_col=0, parse_dates=True)
Serie_PD[Serie_PD.values<0] = np.nan
Input_Serie = pd.DataFrame(index=Serie_PD.index)
Input_Serie['Rain'] = Serie_PD.values
f, (ax0, ax1) = plt.subplots(1, 2, gridspec_kw={'width_ratios': [3, 1]}, figsize=(20, 5))
t1=str(Input_Serie.index.year[0])
t2=str(Input_Serie.index.year[-1])
Input_Serie.plot(xlim=(t1, t2), ylim=(0, 300), ax = ax0)
ax0.grid(True)
ax0.set_ylabel('mm/day')
grouped_m = Input_Serie.groupby(lambda x: x.month)
Month_sum = grouped_m.sum()*24/(len(Input_Serie>=0)/30)
Month_sum.plot(ax = ax1)
ax1.grid(True)
ax1.set_ylabel('mm/month');
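The mean rainfall per calendar month can also be computed directly with pandas as a cross-check of the monthly aggregation above; the lines below are a minimal sketch using plain pandas, not part of the NEOPRENE API.
# Mean total rainfall per calendar month, computed from the sums of each individual month
monthly_totals = Input_Serie['Rain'].resample('M').sum()
mean_monthly = monthly_totals.groupby(monthly_totals.index.month).mean()
mean_monthly.plot(marker='o', grid=True, ylabel='mm/month');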
The calibration input file (Input_Cal_PD.yml) contains the hyperparameters used to calibrate the model. These hyperparameters are loaded by the Calibration class of the NEOPRENE.NSRP.HiperParams module, which is imported in this notebook as Calhps.
Input rainfall statistics can be calculated directly by the software from a time series, or they can be read from a file containing the statistics (both options are shown in this notebook).
A description of the hyperparameters available in the calibration file (Input_Cal_PD.yml) can be found within the doc folder of the project repository.
# Reading hyperparameters for the calibration
hiper_params_cal = Calhps('./Input_Cal_PD.yml')
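To check which hyperparameters were actually read from the file, the returned object can be inspected with plain Python. This is only a sketch: the attribute names depend on the NEOPRENE version, and it assumes the class stores the hyperparameters as regular instance attributes.
# Listing the hyperparameters loaded from Input_Cal_PD.yml (assumes regular instance attributes)
for name, value in vars(hiper_params_cal).items():
    print(f'{name}: {value}')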
The original time series statistics are computed first, as the model calibrates against those statistics. Once the statistics have been computed, the library no longer needs the complete time series.
# Input statistics calculated from a rainfall series
statistics_model_1 = Statistics(hiper_params_cal, time_series = Input_Serie)
# Input statistics from file
#statistics_model_2 = Statistics(hiper_params_cal, file = 'auxiliary-materials/statististics_PD.csv')
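In both cases the resulting object stores the statistics in a dataframe (the statististics_dataframe attribute, used again later in this notebook), which can be inspected before calibrating.
# Quick look at the statistics the model will be calibrated against
statistics_model_1.statististics_dataframe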
The hyperparameters are passed to the Calibration class, which returns a calibrator object. This object can be called as a function that receives the statistics the model has to reproduce. The object can also save the calibrated parameters to disk for later use.
CAL = Calibration(hiper_params_cal)
CAL1 = CAL(statistics_model_1, verbose=False)
#CAL2 = CAL(statistics_model_2, verbose=False)
os.makedirs('./POINT_DAILY/CAL1/',exist_ok=True)
CAL1.save_files('./POINT_DAILY/CAL1/')
#CAL2.save_files('./CAL2/')
Adjustment of parameters using the Particle Swarm Optimization (PSO)
The input simulation file (Input_Sim_PD.yml) contains the hyperparameters for rainfall simulation.
A description of the hyperparameters available in the simulation file (Input_Sim_PD.yml) can be found within the doc folder of the project repository.
The first thing we do is load the hyperparameters:
# Reading hyperparameters for the simulation
hiper_params_sim = Simhps('./Input_Sim_PD.yml')
Those hyperparameters are then used to configure the Simulation class, which returns a Simulation object that is used in much the same way as the Calibration object.
SIM = Simulation(hiper_params_sim)
# Input parameters from the model
SIM1 = SIM(params_cal = CAL1)
os.makedirs('./POINT_DAILY/SIM1/',exist_ok=True)
SIM1.save_files('./POINT_DAILY/SIM1/')
# Input parameters from a file with calibrated parameters (alternative)
#SIM2 = SIM(params_cal = './CAL2/Calibrated_parameters.csv')
#SIM2.save_files('./SIM2/')
Synthetic simulation
Total cumulative rainfall - Analytical estimation = 99299.62
Total cumulative rainfall - Simulated = 98859.84
In this section, we analyse the results.
Analysis_results = Analysis(CAL1,SIM1)
The exceedance probability curves are shown to evaluate how well the simulations fit the observations.
# Comparing exceedance probability between observed and simulated series
Analysis_results.exceedence_probability_fig(Input_Serie, SIM1.Daily_Simulation)
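The same comparison can be reproduced outside the NEOPRENE plotting helper. The lines below are a minimal sketch of an empirical exceedance probability curve using a Weibull plotting position; they assume SIM1.Daily_Simulation holds the simulated rainfall as a single-column dataframe or series.
# Empirical exceedance probability: P(X > x) estimated with a Weibull plotting position
def exceedance_curve(data):
    values = np.sort(np.asarray(data.dropna()).ravel())             # sorted rainfall values
    prob = 1.0 - np.arange(1, len(values) + 1) / (len(values) + 1)  # P(X > x)
    return values, prob

obs_val, obs_p = exceedance_curve(Input_Serie)
sim_val, sim_p = exceedance_curve(SIM1.Daily_Simulation)
plt.figure(figsize=(6, 4))
plt.semilogy(obs_val, obs_p, label='Observed')
plt.semilogy(sim_val, sim_p, label='Simulated')
plt.xlabel('mm/day'); plt.ylabel('Exceedance probability')
plt.legend(); plt.grid(True);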
Another set of verification plots is generated in which the values of the observed statistics (dashed line) are compared against the fitted statistics (blue squares), i.e. the best values of the statistics that the optimal parameters were able to reproduce, and against the simulated statistics (red triangles), i.e. the values of the statistics obtained when simulating synthetic time series with the calibrated parameters.
The model correctly reproduces the statistics it is calibrated against. Statistics that do not participate in the calibration procedure may exhibit a more erratic behaviour, hence the differences between the calibrated and simulated values.
# Comparing the values of the statistics (observed, fitted and simulated).
Analysis_results.compare_statistics_fig()
The disaggregation process may take several minutes for several decades of data. Here we select only 5 years as an example.
# Period selected from the input (observed) series
year1 = 2000; year2 = 2005
x_series = Input_Serie[(Input_Serie.index.year >= year1) & (Input_Serie.index.year < year2)]
# Defining the hourly synthetic series
y_series = SIM1.Hourly_Simulation.copy()
# Daily-to-hourly disaggregation
Analysis_results.disaggregate_rainfall(x_series, y_series)
hourly_disaggregation = Analysis_results.hourly_disaggregation
# Resampling the hourly disaggregated series to a daily one.
# The series is resampled through groupby to avoid pandas Timestamp limitations for dates after 2262 or before 1677.
daily_disaggregation = pd.DataFrame(hourly_disaggregation.groupby([(hourly_disaggregation.index.year),
(hourly_disaggregation.index.month),
(hourly_disaggregation.index.day)]).sum().values,
index=pd.period_range(start=hourly_disaggregation.index[0],end=hourly_disaggregation.index[-1],freq='D'),columns=['Rain'])
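For series whose dates fall within the pandas Timestamp range (years 1677 to 2262), the same daily aggregation can be obtained more directly with resample; a minimal sketch, assuming hourly_disaggregation carries a DatetimeIndex:
# Equivalent daily aggregation when the index is a DatetimeIndex within the Timestamp range
daily_alt = hourly_disaggregation.resample('D').sum()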
# Calculating statistics for the disaggregated daily series and for the period selected from the input series
statistics_model_3 = Statistics(hiper_params_cal, time_series = x_series)
statistics_disaggregated = Statistics(hiper_params_cal, time_series = daily_disaggregation)
# Comparing daily observed statistics with daily disaggregated ones
statistics_model_3.statististics_dataframe.compare(statistics_disaggregated.statististics_dataframe)
statistic | 1 self | 1 other | 2 self | 2 other | 3 self | 3 other | 4 self | 4 other | 5 self | 5 other | ... | 8 self | 8 other | 9 self | 9 other | 10 self | 10 other | 11 self | 11 other | 12 self | 12 other
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
mean | 3.507258 | 2.799354 | 2.702817 | 2.698224 | 2.089677 | 2.081935 | 2.91 | 2.914867 | 2.270323 | 2.264621 | ... | 1.465161 | 1.465842 | 1.595333 | 1.589176 | 3.713548 | 3.711987 | 4.966187 | 4.588271 | 5.054032 | 3.948681 |
var_1 | 41.278254 | 34.736253 | 26.169006 | 26.079205 | 24.566345 | 24.369237 | 31.787167 | 31.814725 | 25.8257 | 25.663644 | ... | 14.645625 | 14.628236 | 24.285778 | 24.268923 | 48.780268 | 48.568544 | 64.706555 | 61.040213 | 101.595064 | 76.751037 |
var_2 | 119.565434 | 93.178984 | 66.536588 | 66.467611 | 63.601981 | 63.25332 | 82.352267 | 82.305725 | 77.780251 | 76.915392 | ... | 40.120764 | 40.176382 | 49.032313 | 48.879519 | 116.462913 | 116.088647 | 180.721227 | 117.238622 | 300.214935 | 236.360579 |
var_3 | 215.945334 | 194.339293 | 131.377066 | 129.905686 | 94.772822 | 93.796437 | 114.2329 | 114.195465 | 148.364508 | 146.917671 | ... | 69.314508 | 69.073412 | 78.142004 | 77.057658 | 192.214822 | 190.975687 | 216.536007 | 204.436874 | 496.819303 | 430.490138 |
var_4 | 298.109802 | 285.731876 | 145.907932 | 145.965362 | 153.684076 | 152.620861 | 196.617458 | 196.364875 | 251.422288 | 248.342357 | ... | 95.344339 | 94.893352 | 83.931309 | 82.569719 | 266.877291 | 265.260172 | 331.735478 | 219.456351 | 896.459417 | 550.912086 |
autocorr_1_1 | 0.39951 | 0.431729 | 0.405202 | 0.402361 | 0.259598 | 0.259651 | 0.245841 | 0.244219 | 0.431689 | 0.42731 | ... | 0.250708 | 0.249333 | -0.001251 | -0.008748 | 0.207701 | 0.20849 | 0.172506 | 0.193309 | 0.508178 | 0.498675 |
autocorr_2_1 | 0.19412 | 0.238424 | 0.173018 | 0.177214 | 0.092012 | 0.09022 | 0.046585 | 0.046296 | 0.216869 | 0.215802 | ... | 0.234899 | 0.230817 | -0.036542 | -0.036248 | 0.042307 | 0.040583 | -0.099715 | -0.069953 | 0.320199 | 0.351899 |
autocorr_3_1 | 0.140389 | 0.190398 | 0.029453 | 0.030262 | -0.007084 | -0.006561 | -0.010978 | -0.009902 | -0.036449 | -0.037439 | ... | 0.174173 | 0.171109 | -0.03413 | -0.032763 | -0.026483 | -0.027486 | -0.096403 | -0.066312 | 0.207887 | 0.266495 |
fih_1 | 0.516129 | 0.612903 | 0.521127 | 0.507042 | 0.6 | 0.593548 | 0.433333 | 0.42 | 0.587097 | 0.593548 | ... | 0.63871 | 0.619355 | 0.673333 | 0.653333 | 0.496774 | 0.503226 | 0.460432 | 0.5 | 0.443548 | 0.567742 |
fiWW_1 | 0.75 | 0.75 | 0.632353 | 0.714286 | 0.629032 | 0.68254 | 0.776471 | 0.793103 | 0.609375 | 0.587302 | ... | 0.696429 | 0.762712 | 0.55102 | 0.634615 | 0.705128 | 0.714286 | 0.68 | 0.693333 | 0.884058 | 0.850746 |
fiDD_1 | 0.765625 | 0.842105 | 0.662162 | 0.722222 | 0.752688 | 0.782609 | 0.707692 | 0.714286 | 0.725275 | 0.717391 | ... | 0.828283 | 0.854167 | 0.782178 | 0.806122 | 0.701299 | 0.717949 | 0.625 | 0.693333 | 0.854545 | 0.886364 |
M3_1 | 632.177191 | 561.087011 | 331.922783 | 329.59892 | 417.915617 | 412.105625 | 552.230324 | 553.721389 | 409.973966 | 403.512372 | ... | 242.032888 | 241.174446 | 648.894994 | 647.7617 | 841.705477 | 833.999763 | 1014.796293 | 970.169039 | 3242.221812 | 2215.17669 |
12 rows × 24 columns
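The agreement can also be summarised numerically with plain pandas; the sketch below computes element-wise relative differences between the two statistics dataframes compared above.
# Relative difference (%) between observed and disaggregated statistics
obs_stats = statistics_model_3.statististics_dataframe
dis_stats = statistics_disaggregated.statististics_dataframe
rel_diff = 100 * (dis_stats - obs_stats).abs() / obs_stats.abs()
rel_diff.max(axis=1)   # worst-case relative difference per statistic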
Analysis_results.disaggregation_fig(hourly_disaggregation,daily_disaggregation,x_series,year1,year1)
To save the figures in PNG format, we execute the following line:
Analysis_results.save_figures('./POINT_DAILY/Figures/')
Serie_PH=pd.read_csv('auxiliary-materials/Point_Hourly.csv', sep=",", decimal=".", index_col=0, parse_dates = True)
Serie_PH[Serie_PH.values<0]=np.nan
Input_Serie=pd.DataFrame(index=Serie_PH.index)
Input_Serie['Rain']=Serie_PH.values
f, (ax0, ax1) = plt.subplots(1, 2, gridspec_kw={'width_ratios': [3, 1]}, figsize=(20, 5))
t1=str(Input_Serie.index.year[0]); t2=str(Input_Serie.index.year[-1])
Input_Serie.plot(xlim=(t1, t2), ylim=(0, 20), ax = ax0)
ax0.grid(True)
ax0.set_ylabel('mm/hour')
grouped_m = Input_Serie.groupby(lambda x: x.month)
Month_sum=grouped_m.sum()*24/(len(Input_Serie>=0)/30)
Month_sum.plot(ax = ax1)
ax1.grid(True)
ax1.set_ylabel('mm/month');
hiper_params_cal = Calhps('./Input_Cal_PH.yml')
statistics_model_1 = Statistics(hiper_params_cal, time_series = Input_Serie)
CAL = Calibration(hiper_params_cal)
CAL1 = CAL(statistics_model_1, verbose=False)
os.makedirs('./POINT_HOURLY/CAL1/', exist_ok=True)
CAL1.save_files('./POINT_HOURLY/CAL1/')
Adjustment of parameters using the Particle Swarm Optimization (PSO)
hiper_params_sim = Simhps('./Input_Sim_PH.yml')
SIM = Simulation(hiper_params_sim)
SIM1 = SIM(params_cal = CAL1)
#SIM2 = SIM(params_cal = './CAL2/Calibrated_parameters.csv')
Synthetic simulation
Total cumulative rainfall - Analytical estimation - Storm 1 = 1116.16
Total cumulative rainfall - Analytical estimation - Storm 2 = 990.31
Total cumulative rainfall - Analytical estimation = 2106.47
Total cumulative rainfall - Simulated = 1867.26
os.makedirs('./POINT_HOURLY/SIM1/',exist_ok=True)
SIM1.save_files('./POINT_HOURLY/SIM1/')
In this section, we analyse the results.
Analysis_results = Analysis(CAL1,SIM1)
# Comparing the values of the statistics (observed, fitted and simulated).
Analysis_results.compare_statistics_fig()
# Comparing exceedance probability between observed and simulated series
Analysis_results.exceedence_probability_fig(Input_Serie, SIM1.Hourly_Simulation)
To save the figures in PNG format, we execute the following line:
Analysis_results.save_figures('./POINT_HOURLY/Figures/')
!conda list
# packages in environment at C:\Anaconda3\envs\NEOPRENE_2024: # # Name Version Build Channel anyio 4.2.0 pyhd8ed1ab_0 conda-forge argon2-cffi 23.1.0 pyhd8ed1ab_0 conda-forge argon2-cffi-bindings 21.2.0 py311ha68e1ae_4 conda-forge arrow 1.3.0 pyhd8ed1ab_0 conda-forge asttokens 2.4.1 pyhd8ed1ab_0 conda-forge async-lru 2.0.4 pyhd8ed1ab_0 conda-forge attrs 23.2.0 pyh71513ae_0 conda-forge babel 2.14.0 pyhd8ed1ab_0 conda-forge beautifulsoup4 4.12.3 pyha770c72_0 conda-forge bleach 6.1.0 pyhd8ed1ab_0 conda-forge brotli-python 1.1.0 py311h12c1d0e_1 conda-forge bzip2 1.0.8 hcfcfb64_5 conda-forge ca-certificates 2023.11.17 h56e8100_0 conda-forge cached-property 1.5.2 hd8ed1ab_1 conda-forge cached_property 1.5.2 pyha770c72_1 conda-forge certifi 2023.11.17 pyhd8ed1ab_0 conda-forge cffi 1.16.0 py311ha68e1ae_0 conda-forge charset-normalizer 3.3.2 pyhd8ed1ab_0 conda-forge colorama 0.4.6 pyhd8ed1ab_0 conda-forge comm 0.2.1 pyhd8ed1ab_0 conda-forge contourpy 1.2.0 pypi_0 pypi cycler 0.12.1 pypi_0 pypi datetime 5.4 pypi_0 pypi debugpy 1.8.0 py311h12c1d0e_1 conda-forge decorator 5.1.1 pyhd8ed1ab_0 conda-forge defusedxml 0.7.1 pyhd8ed1ab_0 conda-forge entrypoints 0.4 pyhd8ed1ab_0 conda-forge exceptiongroup 1.2.0 pyhd8ed1ab_2 conda-forge executing 2.0.1 pyhd8ed1ab_0 conda-forge fonttools 4.47.2 pypi_0 pypi fqdn 1.5.1 pyhd8ed1ab_0 conda-forge haversine 2.8.1 pypi_0 pypi idna 3.6 pyhd8ed1ab_0 conda-forge importlib-metadata 7.0.1 pyha770c72_0 conda-forge importlib_metadata 7.0.1 hd8ed1ab_0 conda-forge importlib_resources 6.1.1 pyhd8ed1ab_0 conda-forge ipykernel 6.29.0 pyha63f2e9_0 conda-forge ipython 8.20.0 pyh7428d3b_0 conda-forge isoduration 20.11.0 pyhd8ed1ab_0 conda-forge jedi 0.19.1 pyhd8ed1ab_0 conda-forge jinja2 3.1.3 pyhd8ed1ab_0 conda-forge json5 0.9.14 pyhd8ed1ab_0 conda-forge jsonpointer 2.4 py311h1ea47a8_3 conda-forge jsonschema 4.21.0 pyhd8ed1ab_0 conda-forge jsonschema-specifications 2023.12.1 pyhd8ed1ab_0 conda-forge jsonschema-with-format-nongpl 4.21.0 pyhd8ed1ab_0 conda-forge jupyter-lsp 2.2.2 pyhd8ed1ab_0 conda-forge jupyter_client 8.6.0 pyhd8ed1ab_0 conda-forge jupyter_core 5.7.1 py311h1ea47a8_0 conda-forge jupyter_events 0.9.0 pyhd8ed1ab_0 conda-forge jupyter_server 2.12.5 pyhd8ed1ab_0 conda-forge jupyter_server_terminals 0.5.1 pyhd8ed1ab_0 conda-forge jupyterlab 4.0.10 pyhd8ed1ab_0 conda-forge jupyterlab_pygments 0.3.0 pyhd8ed1ab_0 conda-forge jupyterlab_server 2.25.2 pyhd8ed1ab_0 conda-forge kiwisolver 1.4.5 pypi_0 pypi libffi 3.4.2 h8ffe710_5 conda-forge libsodium 1.0.18 h8d14728_1 conda-forge libsqlite 3.44.2 hcfcfb64_0 conda-forge libzlib 1.2.13 hcfcfb64_5 conda-forge markupsafe 2.1.3 py311ha68e1ae_1 conda-forge matplotlib 3.8.2 pypi_0 pypi matplotlib-inline 0.1.6 pyhd8ed1ab_0 conda-forge mistune 3.0.2 pyhd8ed1ab_0 conda-forge nbclient 0.8.0 pyhd8ed1ab_0 conda-forge nbconvert-core 7.14.2 pyhd8ed1ab_0 conda-forge nbformat 5.9.2 pyhd8ed1ab_0 conda-forge neoprene 1.0.1 dev_0 <develop> nest-asyncio 1.5.9 pyhd8ed1ab_0 conda-forge notebook-shim 0.2.3 pyhd8ed1ab_0 conda-forge numpy 1.26.3 pypi_0 pypi openssl 3.2.0 hcfcfb64_1 conda-forge overrides 7.4.0 pyhd8ed1ab_0 conda-forge packaging 23.2 pyhd8ed1ab_0 conda-forge pandas 2.1.4 pypi_0 pypi pandocfilters 1.5.0 pyhd8ed1ab_0 conda-forge parso 0.8.3 pyhd8ed1ab_0 conda-forge pickleshare 0.7.5 py_1003 conda-forge pillow 10.2.0 pypi_0 pypi pip 23.3.2 pyhd8ed1ab_0 conda-forge pkgutil-resolve-name 1.3.10 pyhd8ed1ab_1 conda-forge platformdirs 4.1.0 pyhd8ed1ab_0 conda-forge prometheus_client 0.19.0 pyhd8ed1ab_0 conda-forge prompt-toolkit 3.0.42 
pyha770c72_0 conda-forge psutil 5.9.7 py311ha68e1ae_0 conda-forge pure_eval 0.2.2 pyhd8ed1ab_0 conda-forge pycparser 2.21 pyhd8ed1ab_0 conda-forge pygments 2.17.2 pyhd8ed1ab_0 conda-forge pyparsing 3.1.1 pypi_0 pypi pyshp 2.3.1 pypi_0 pypi pysocks 1.7.1 pyh0701188_6 conda-forge python 3.11.0 hcf16a7b_0_cpython conda-forge python-dateutil 2.8.2 pyhd8ed1ab_0 conda-forge python-fastjsonschema 2.19.1 pyhd8ed1ab_0 conda-forge python-json-logger 2.0.7 pyhd8ed1ab_0 conda-forge python_abi 3.11 4_cp311 conda-forge pytz 2023.3.post1 pyhd8ed1ab_0 conda-forge pywin32 306 py311h12c1d0e_2 conda-forge pywinpty 2.0.12 py311h12c1d0e_0 conda-forge pyyaml 6.0.1 py311ha68e1ae_1 conda-forge pyzmq 25.1.2 py311h9250fbb_0 conda-forge referencing 0.32.1 pyhd8ed1ab_0 conda-forge requests 2.31.0 pyhd8ed1ab_0 conda-forge rfc3339-validator 0.1.4 pyhd8ed1ab_0 conda-forge rfc3986-validator 0.1.1 pyh9f0ad1d_0 conda-forge rpds-py 0.17.1 py311hc37eb10_0 conda-forge scipy 1.11.4 pypi_0 pypi send2trash 1.8.2 pyh08f2357_0 conda-forge setuptools 69.0.3 pyhd8ed1ab_0 conda-forge shapely 2.0.2 pypi_0 pypi six 1.16.0 pyh6c4a22f_0 conda-forge sniffio 1.3.0 pyhd8ed1ab_0 conda-forge soupsieve 2.5 pyhd8ed1ab_1 conda-forge stack_data 0.6.2 pyhd8ed1ab_0 conda-forge terminado 0.18.0 pyh5737063_0 conda-forge tinycss2 1.2.1 pyhd8ed1ab_0 conda-forge tk 8.6.13 h5226925_1 conda-forge tomli 2.0.1 pyhd8ed1ab_0 conda-forge tornado 6.3.3 py311ha68e1ae_1 conda-forge tqdm 4.66.1 pypi_0 pypi traitlets 5.14.1 pyhd8ed1ab_0 conda-forge types-python-dateutil 2.8.19.20240106 pyhd8ed1ab_0 conda-forge typing-extensions 4.9.0 hd8ed1ab_0 conda-forge typing_extensions 4.9.0 pyha770c72_0 conda-forge typing_utils 0.1.0 pyhd8ed1ab_0 conda-forge tzdata 2023.4 pypi_0 pypi ucrt 10.0.22621.0 h57928b3_0 conda-forge uri-template 1.3.0 pyhd8ed1ab_0 conda-forge urllib3 2.1.0 pyhd8ed1ab_0 conda-forge vc 14.3 hcf57466_18 conda-forge vc14_runtime 14.38.33130 h82b7239_18 conda-forge vs2015_runtime 14.38.33130 hcb4865c_18 conda-forge wcwidth 0.2.13 pyhd8ed1ab_0 conda-forge webcolors 1.13 pyhd8ed1ab_0 conda-forge webencodings 0.5.1 pyhd8ed1ab_2 conda-forge websocket-client 1.7.0 pyhd8ed1ab_0 conda-forge wheel 0.42.0 pyhd8ed1ab_0 conda-forge win_inet_pton 1.1.0 pyhd8ed1ab_6 conda-forge winpty 0.4.3 4 conda-forge xz 5.2.6 h8d14728_0 conda-forge yaml 0.2.5 h8ffe710_2 conda-forge zeromq 4.3.5 h63175ca_0 conda-forge zipp 3.17.0 pyhd8ed1ab_0 conda-forge zope-interface 6.1 pypi_0 pypi