This tutorial describes how to run a storage optimization over multiple time steps with a PandaModels.jl multinetwork together with pandapower.
To run a storage optimization over multiple time steps, the power system data is internally copied n_timestep times. This is done efficiently in a Julia script. Each network in the multinetwork dict represents a single time step. The input time series must be written to the loads and generators of each network accordingly. This is currently done by converting the input time series to pandapower controllers, saving them together with the grid data as a JSON file and loading the data back in Julia. This "hack" is probably just a temporary solution.
Some notes:
For more details on PowerModels (PandaModels) storage model see:
https://lanl-ansi.github.io/PowerModels.jl/stable/storage/ and https://github.com/e2nIEE/PandaModels.jl/blob/develop/src/models/call_powermodels.jl
For more details on PowerModels multinetworks see:
https://lanl-ansi.github.io/PowerModels.jl/stable/multi-networks/
You need the standard Julia, PowerModels, Ipopt and JuMP installation (see opf_powermodels.ipynb).
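Conceptually, the multinetwork replication described above can be sketched in plain Python. This is a simplified illustration of the idea, not the actual PandaModels.jl code: the base network data is deep-copied once per time step, each copy gets the load values of its time step, and the copies are collected in a dict keyed by the time step index.

```python
from copy import deepcopy

def build_multinetwork(base_net, load_profile):
    """Sketch: replicate the base network once per time step and
    write the per-step load values into each copy."""
    mn = {"multinetwork": True, "nw": {}}
    for t, scale in enumerate(load_profile):
        nw = deepcopy(base_net)
        # scale every load by the per-unit profile value of this time step
        for load in nw["load"].values():
            load["pd"] = load["pd"] * scale
        mn["nw"][str(t)] = nw
    return mn

# toy base network with two loads (illustrative values only)
base = {"load": {"1": {"pd": 1.0}, "2": {"pd": 0.5}}}
mn = build_multinetwork(base, [0.8, 1.0, 1.2])
print(len(mn["nw"]))  # → 3, one sub-network per time step
```

Each sub-network in `mn["nw"]` is an independent copy, so the optimizer can couple them only through the storage state-of-charge constraints.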
In order to start the optimization and visualize results, we follow four steps:
We load the CIGRE medium voltage grid with "pv" and "wind" generators, set some voltage and loading limits, and add a storage unit with controllable=True.
import pandapower as pp
import pandapower.networks as nw

def cigre_grid():
    net = nw.create_cigre_network_mv("pv_wind")

    # set some voltage and loading limits
    min_vm_pu = 0.95
    max_vm_pu = 1.05
    net["bus"].loc[:, "min_vm_pu"] = min_vm_pu
    net["bus"].loc[:, "max_vm_pu"] = max_vm_pu
    net["line"].loc[:, "max_loading_percent"] = 100.
    # close all switches
    net.switch.loc[:, "closed"] = True
    # add a controllable storage unit to bus 10
    pp.create_storage(net, 10, p_mw=0.5, max_e_mwh=.2, soc_percent=0., q_mvar=0., controllable=True)
    return net
The following function loads the example time series from the input file and scales the power accordingly. It then adds the time series data to the grid model by creating ConstControl controllers.
import pandas as pd
from pandapower.control import ConstControl
from pandapower.timeseries import DFData

def convert_timeseries_to_controller(net, input_file):
    # set the load type in the cigre grid, since it is not specified
    net["load"].loc[:, "type"] = "residential"
    # set the sgen type in the cigre grid
    net.sgen.loc[:, "type"] = "pv"
    net.sgen.loc[8, "type"] = "wind"

    # read the example time series
    time_series = pd.read_json(input_file)
    time_series.sort_index(inplace=True)
    # this example time series has a 15 min resolution with 96 time steps for one day
    n_timesteps = time_series.shape[0]

    # get the rated power of the loads and static generators
    load_p = net["load"].loc[:, "p_mw"].values
    sgen_p = net["sgen"].loc[:7, "p_mw"].values
    wind_p = net["sgen"].loc[8, "p_mw"]

    # scale the rated power with the per-unit profiles for every time step
    load_ts = pd.DataFrame(index=time_series.index.tolist(), columns=net.load.index.tolist())
    sgen_ts = pd.DataFrame(index=time_series.index.tolist(), columns=net.sgen.index.tolist())
    for t in range(n_timesteps):
        load_ts.loc[t] = load_p * time_series.at[t, "residential"]
        sgen_ts.loc[t, sgen_ts.columns[:8]] = sgen_p * time_series.at[t, "pv"]
        sgen_ts.loc[t, sgen_ts.columns[8]] = wind_p * time_series.at[t, "wind"]

    # create time series controllers for load and sgen
    ConstControl(net, element="load", variable="p_mw",
                 element_index=net.load.index.tolist(), profile_name=net.load.index.tolist(),
                 data_source=DFData(load_ts))
    ConstControl(net, element="sgen", variable="p_mw",
                 element_index=net.sgen.index.tolist(), profile_name=net.sgen.index.tolist(),
                 data_source=DFData(sgen_ts))
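The core of this conversion is a simple outer product: each element's rated power is multiplied by the per-unit profile value of every time step. With a small synthetic profile (the values below are made up for illustration, not taken from cigre_timeseries_15min.json) this looks like:

```python
import pandas as pd

# rated active power of three hypothetical loads (MW)
load_p = pd.Series([3.8, 0.07, 0.11])
# per-unit residential profile for four time steps (synthetic example values)
profile = pd.Series([0.5, 0.8, 1.0, 0.6], name="residential")

# one row per time step, one column per load: rated power * profile value
load_ts = pd.DataFrame(profile.values[:, None] * load_p.values[None, :],
                       index=profile.index, columns=load_p.index)
print(load_ts.shape)  # → (4, 3)
```

The resulting DataFrame has the same shape as the `load_ts` built in the loop above: time steps as rows, element indices as columns, which is exactly what DFData expects.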
Before we start the optimization, we create the grid and the controllers, which add the time series in 15 min resolution.
# create the cigre mv grid
net = cigre_grid()
# convert the time series to pandapower controller
input_file = "cigre_timeseries_15min.json"
convert_timeseries_to_controller(net, input_file)
The time series are now attached through ConstControl controllers, and you can inspect them:
# print controller
print("--- time series controller:", net.controller)
# print time series data in controller
print("--- considered element of controller 0:", net.controller.object[0].__dict__["matching_params"]["element"])
print("--- considered element index of controller 0:",net.controller.object[0].__dict__["matching_params"]["element_index"])
print("--- time series data:",net.controller.object[0].data_source.df)
--- time series controller: two ConstControl objects, one for load.p_mw and one for sgen.p_mw
--- considered element of controller 0: load
--- considered element index of controller 0: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17]
--- time series data: DataFrame with the scaled load profiles [96 rows x 18 columns]
We start the optimization for time steps 0 to 10.
# run the optimization for the first ten time steps (the first run can be slow)
try:
    pp.runpm_storage_opf(net, from_time_step=0, to_time_step=10)
except Exception as err:
    print(err)
hp.pandapower.opf.make_objective - WARNING: no costs are given - overall generated power is minimized
hp.pandapower.opf.run_powermodels - INFO: Optimization ('run_powermodels_multi_storage') is finished in 63.47 seconds
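read_pm_storage_results returns a dict with one DataFrame per storage unit, indexed by time step. A mock of this structure makes the plotting code easier to follow; the column names match those used in the plotting function, while the numbers are purely illustrative, not solver output:

```python
import pandas as pd

# mock result for a single storage unit (index 0); values are made up
storage_results = {
    0: pd.DataFrame({
        "p_mw": [0.5, -0.5, 0.0],         # charging (+) / discharging (-) power
        "q_mvar": [0.0, 0.0, 0.0],        # reactive power
        "soc_mwh": [0.125, 0.0, 0.0],     # absolute state of charge
        "soc_percent": [62.5, 0.0, 0.0],  # soc relative to max_e_mwh = 0.2
    })
}
print(list(storage_results[0].columns))
```

With this structure, `storage_results[key].loc[:, ["p_mw", "q_mvar", "soc_mwh"]]` selects the power and energy columns for one storage unit, which is exactly what the plotting function does.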
Get and plot the optimization results for the storage.
from pandapower.opf.pm_storage import read_pm_storage_results
import matplotlib.pyplot as plt

def plot_storage_results(storage_results):
    n_res = len(storage_results.keys())
    fig, axes = plt.subplots(n_res, 2)
    if n_res == 1:
        axes = [axes]
    for i, (key, res) in enumerate(storage_results.items()):
        # plot power and absolute state of charge on the first axis
        axes[i][0].set_title("Storage {}".format(key))
        el = res.loc[:, ["p_mw", "q_mvar", "soc_mwh"]]
        el.plot(ax=axes[i][0])
        axes[i][0].set_xlabel("time step")
        axes[i][0].legend(loc=4)
        axes[i][0].grid()
        # plot the relative state of charge on the second axis
        ax2 = axes[i][1]
        patch = plt.plot([], [], ms=8, ls="--", mec=None, color="grey", label="soc_percent")
        ax2.legend(handles=patch)
        ax2.set_ylabel("SOC [%]")
        res.loc[:, "soc_percent"].plot(ax=ax2, linestyle="--", color="grey")
        ax2.grid()
    plt.show()
# get the results
# storage_results = read_pm_storage_results(net)
# plot the results
# plot_storage_results(storage_results)