In this notebook we simulate returns using Geometric Brownian Motion, build the resulting distribution, and then estimate the Value at Risk (VaR); a parametric VaR based on historical volatility is computed first for comparison.
Value at Risk (VaR): the loss that will be incurred in the event of an extreme adverse price change with some given, usually low, probability; loosely speaking, a near-worst-case outcome. For example, a 1% VaR means that 99% of returns are expected to be better than the VaR and 1% of returns are expected to be worse.
Assuming that portfolio returns are normally distributed, the VaR is fully determined by the mean and standard deviation of the distribution.
For example, VaR(1%, normal) = Mean - 2.33 × SD.
To obtain a sample estimate of the 1% VaR, we sort the observed returns and take the return at the 1st percentile of the sample distribution, i.e., the cutoff below which only 1% of outcomes fall.
For example, with 95% confidence, we expect that our worst daily loss will not exceed 4%. If we invest 100 dollars, we are 95% confident that our worst daily loss will not exceed 4 dollars (100 dollars × 4%).
Confidence Level | Two sided CV | One sided CV |
---|---|---|
90% | 1.64 | 1.28 |
95% | 1.96 | 1.64 |
99% | 2.58 | 2.33 |
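To make the two estimates concrete, here is a minimal sketch (with made-up return data; the sample size, seed, and moments are illustrative assumptions, not figures from this notebook) that computes the parametric 1% VaR from the mean and SD, and the sample estimate from the 1st percentile:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)                      # illustrative data only
returns = rng.normal(loc=0.0005, scale=0.01, size=1000)

mean, sd = returns.mean(), returns.std()
var_parametric = mean + stats.norm.ppf(0.01) * sd   # Mean - 2.33 * SD
var_empirical = np.percentile(returns, 1)           # return at the 1st percentile

print("parametric 1% VaR: {:.4f}".format(var_parametric))
print("empirical  1% VaR: {:.4f}".format(var_empirical))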
!pip install pandas-datareader
!pip install --upgrade pandas-datareader
!pip install yfinance
# Disable the warnings (uncomment to silence them)
#import warnings
#warnings.filterwarnings('ignore')
import math
import numpy as np
import numpy.random as npr
import pandas as pd
import pandas_datareader as pdr
import yfinance as yf
import scipy as sp
from scipy import stats
from pylab import plt, mpl
plt.style.use('fivethirtyeight')
#plt.style.use('seaborn')
mpl.rcParams['font.family'] = 'DejaVu Sans'
pd.set_option('precision', 3)
pd.set_option('display.max_colwidth', 100)
%matplotlib inline
print("At 95% confidence level {:.2f}".format(sp.stats.norm.ppf(1-.10/2)))
print("At 97.5% confidence level {:.2f}".format(sp.stats.norm.ppf(1-.05/2)))
print("At 90% confidence level {:.2f}".format(sp.stats.norm.ppf(1-.10)))
print("At 99% confidence level {:.2f}".format(sp.stats.norm.ppf(1-.01)))
At 95% confidence level 1.64
At 97.5% confidence level 1.96
At 90% confidence level 1.28
At 99% confidence level 2.33
def get_prices(tickers, freq_p, st_day, end_day):
    # download adjusted close prices for each ticker at the requested interval
    mystock = pd.DataFrame()
    for t in tickers:
        mystock[t] = yf.download(t, start=st_day, end=end_day, interval=freq_p)['Adj Close']
    return mystock
# For short time horizons the expected return will be small, and therefore the
# VaR estimate will not be much influenced by it, so we use only the volatility.
# Assuming that the portfolio return is normally distributed:
def var_calc(CL, days, p_val, vol, t):
    # CL:    confidence level (e.g. 0.95)
    # days:  trading days per year, used as the annualization basis (e.g. 252)
    # p_val: current value of the position
    # vol:   annualized volatility
    # t:     VaR horizon in trading days
    VaR = p_val * vol * np.sqrt(t/days) * sp.stats.norm.ppf(CL)   # parametric VaR over t trading days
    percent_loss = VaR/p_val * 100
    print("Assuming that we invest {:.2f}, for the next {:.0f} trading days".format(p_val, t))
    print("At {:.3f} confidence level, loss will not exceed {:,.2f}".format(CL, VaR))
    print("This represents a move of {:.2f} standard deviations below the expected return, "
          "or a loss of {:.2f}%.".format(sp.stats.norm.ppf(CL), percent_loss))
    return
tic=['SPY', 'TLT', 'TSLA', 'AAPL', 'VNQ', 'BAC', 'WMT', 'AMD', 'JNJ', 'GM', 'CMG', 'SHV']
prices = get_prices(tic, freq_p='1wk', st_day="2011-01-01", end_day="2022-05-31")  # interval options: 1d, 1wk, 1mo
prices.info()
prices.tail(3)
<class 'pandas.core.frame.DataFrame'>
DatetimeIndex: 640 entries, 2011-01-01 to 2022-05-21
Data columns (total 12 columns):
 #   Column  Non-Null Count  Dtype
---  ------  --------------  -----
 0   SPY     595 non-null    float64
 1   TLT     595 non-null    float64
 2   TSLA    595 non-null    float64
 3   AAPL    595 non-null    float64
 4   VNQ     595 non-null    float64
 5   BAC     595 non-null    float64
 6   WMT     595 non-null    float64
 7   AMD     595 non-null    float64
 8   JNJ     595 non-null    float64
 9   GM      595 non-null    float64
 10  CMG     595 non-null    float64
 11  SHV     595 non-null    float64
dtypes: float64(12)
memory usage: 65.0 KB
Date | SPY | TLT | TSLA | AAPL | VNQ | BAC | WMT | AMD | JNJ | GM | CMG | SHV
---|---|---|---|---|---|---|---|---|---|---|---|---
2022-05-07 | 401.72 | 115.98 | 769.59 | 147.11 | 96.78 | 35.17 | 148.05 | 95.12 | 175.721 | 38.21 | 1318.28 | 110.22
2022-05-14 | 389.63 | 118.51 | 663.90 | 137.59 | 94.83 | 33.86 | 119.20 | 93.50 | 175.850 | 35.40 | 1294.11 | 110.23
2022-05-21 | 415.26 | 119.08 | 759.63 | 149.64 | 100.30 | 37.02 | 128.48 | 102.26 | 179.934 | 38.57 | 1402.42 | 110.27
for t in tic:
    confidence_l = .95
    # annualized volatility of the period returns
    # (note: sqrt(252) assumes daily observations; sqrt(52) would match the weekly prices used here)
    annual_volatility = np.std(prices[t].pct_change()*np.sqrt(252))
    most_recent_p = prices[t][-1]   # latest price = value of a one-share position
    holding_period = 252            # trading days per year (annualization basis)
    frequency = 21                  # VaR horizon in trading days (roughly one month)
    print("Historical annual volatility of {} = {:.4f}".format(t, annual_volatility))
    var_calc(confidence_l, holding_period, most_recent_p, annual_volatility, frequency)
    print(50 * "-")
Historical annual volatility of SPY = 0.3427
Assuming that we invest 415.26, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 67.56
This represents a move of 1.64 standard deviations below the expected return, or a loss of 16.27%.
--------------------------------------------------
Historical annual volatility of TLT = 0.2965
Assuming that we invest 119.08, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 16.77
This represents a move of 1.64 standard deviations below the expected return, or a loss of 14.08%.
--------------------------------------------------
Historical annual volatility of TSLA = 1.1661
Assuming that we invest 759.63, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 420.60
This represents a move of 1.64 standard deviations below the expected return, or a loss of 55.37%.
--------------------------------------------------
Historical annual volatility of AAPL = 0.5918
Assuming that we invest 149.64, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 42.05
This represents a move of 1.64 standard deviations below the expected return, or a loss of 28.10%.
--------------------------------------------------
Historical annual volatility of VNQ = 0.4469
Assuming that we invest 100.30, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 21.28
This represents a move of 1.64 standard deviations below the expected return, or a loss of 21.22%.
--------------------------------------------------
Historical annual volatility of BAC = 0.6979
Assuming that we invest 37.02, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 12.27
This represents a move of 1.64 standard deviations below the expected return, or a loss of 33.14%.
--------------------------------------------------
Historical annual volatility of WMT = 0.4125
Assuming that we invest 128.48, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 25.17
This represents a move of 1.64 standard deviations below the expected return, or a loss of 19.59%.
--------------------------------------------------
Historical annual volatility of AMD = 1.2085
Assuming that we invest 102.26, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 58.68
This represents a move of 1.64 standard deviations below the expected return, or a loss of 57.38%.
--------------------------------------------------
Historical annual volatility of JNJ = 0.3385
Assuming that we invest 179.93, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 28.92
This represents a move of 1.64 standard deviations below the expected return, or a loss of 16.08%.
--------------------------------------------------
Historical annual volatility of GM = 0.7413
Assuming that we invest 38.57, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 13.58
This represents a move of 1.64 standard deviations below the expected return, or a loss of 35.20%.
--------------------------------------------------
Historical annual volatility of CMG = 0.7466
Assuming that we invest 1402.42, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 497.18
This represents a move of 1.64 standard deviations below the expected return, or a loss of 35.45%.
--------------------------------------------------
Historical annual volatility of SHV = 0.0086
Assuming that we invest 110.27, for the next 21 trading days
At 0.950 confidence level, loss will not exceed 0.45
This represents a move of 1.64 standard deviations below the expected return, or a loss of 0.41%.
--------------------------------------------------
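As a quick sanity check on the SPY block above, the same figure can be reproduced directly from the parametric formula. This is a minimal sketch reusing the np and sp imports from the top of the notebook and only the numbers already printed:

# figures taken from the SPY output above
p_val, vol, t, days, CL = 415.26, 0.3427, 21, 252, 0.95
var_spy = p_val * vol * np.sqrt(t / days) * sp.stats.norm.ppf(CL)
print(round(var_spy, 2))   # ~67.6, matching the SPY VaR above (small difference from the rounded volatility)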
Now for the Monte Carlo approach using Geometric Brownian Motion. Under GBM the terminal price over a horizon T (in years) is S_T = S0 * exp((r - 0.5*sigma^2)*T + sigma*sqrt(T)*z), where z is a standard normal draw. Below we simulate this directly and, equivalently, as lognormal draws with the same parameters.
S0 = prices['AAPL'][-1]   # most recent AAPL price
print(S0)
r = 0.05                  # annualized drift (risk-free rate)
sigma = np.std(prices['AAPL'].pct_change()*np.sqrt(252))   # annualized volatility, as above
T = 21.0                  # horizon (note: with annualized r and sigma, a horizon in years, 21/252, would be consistent; that convention is used further below)
I = 10000                 # number of simulated end-of-horizon prices
# terminal prices straight from the GBM solution
ST1 = S0 * np.exp((r - 0.5 * sigma ** 2) * T +
                  sigma * math.sqrt(T) * np.random.standard_normal(I))
# equivalent draws directly from the lognormal distribution with the same parameters
ST2 = S0 * np.random.lognormal((r - 0.5 * sigma ** 2) * T,
                               sigma * math.sqrt(T), size=I)
149.63999938964844
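For reference, here is a minimal sketch (my own illustration, using roughly the AAPL figures from above and a horizon expressed in years as noted in the comment on T; the _x names are new and do not overwrite anything in the notebook) comparing the simulated 1% quantile of the terminal price with its closed-form lognormal counterpart:

# roughly the AAPL inputs computed above; assumed values for illustration
S0_x, r_x, sigma_x = 149.64, 0.05, 0.59
T_x = 21 / 252                        # one-month horizon in years
z = np.random.standard_normal(10000)
ST_x = S0_x * np.exp((r_x - 0.5 * sigma_x ** 2) * T_x + sigma_x * np.sqrt(T_x) * z)
sim_q = np.percentile(ST_x, 1)        # simulated 1% quantile
exact_q = S0_x * np.exp((r_x - 0.5 * sigma_x ** 2) * T_x
                        + sigma_x * np.sqrt(T_x) * sp.stats.norm.ppf(0.01))
print("1% quantile: simulated {:.2f}, exact {:.2f}".format(sim_q, exact_q))
print("1% VaR over the horizon: {:.2f}".format(S0_x - exact_q))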
def print_statistics(a1, a2):
    '''
    Parameters
    ==========
    a1, a2: ndarray objects
        results objects from simulation
    '''
    sta1 = sp.stats.describe(a1)
    sta2 = sp.stats.describe(a2)
    print('%14s %14s %14s' % ('statistic', 'data set 1', 'data set 2'))
    print(45 * "-")
    print('%14s %14.3f %14.3f' % ('size', sta1[0], sta2[0]))
    print('%14s %14.3f %14.3f' % ('min', sta1[1][0], sta2[1][0]))
    print('%14s %14.3f %14.3f' % ('max', sta1[1][1], sta2[1][1]))
    print('%14s %14.3f %14.3f' % ('mean', sta1[2], sta2[2]))
    print('%14s %14.3f %14.3f' % ('std', np.sqrt(sta1[3]), np.sqrt(sta2[3])))
    print('%14s %14.3f %14.3f' % ('skew', sta1[4], sta2[4]))
    print('%14s %14.3f %14.3f' % ('kurtosis', sta1[5], sta2[5]))
print_statistics(ST1, ST2)
     statistic     data set 1     data set 2
---------------------------------------------
          size      10000.000      10000.000
           min          0.000          0.001
           max     300993.143     114507.025
          mean        445.331        313.031
           std       5473.487       2648.420
          skew         36.528         28.691
      kurtosis       1653.676       1042.670
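The two columns differ only through sampling noise, since ST1 and ST2 are draws from the same lognormal distribution. A minimal check of that claim (my addition, reusing the sp import and SciPy's two-sample Kolmogorov-Smirnov test) could look like this:

# ST1 and ST2 follow the same law, so the KS test should typically not reject
ks_stat, p_value = sp.stats.ks_2samp(ST1, ST2)
print("KS statistic: {:.4f}, p-value: {:.4f}".format(ks_stat, p_value))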
aapl = 1                                           # number of AAPL shares held
aapl_price = pdr.get_quote_yahoo('AAPL')['price']  # the most recent price
aapl_value = aapl * aapl_price                     # position value (a one-entry Series)
aapl_value = aapl_value.at['AAPL']                 # extract the scalar value
t = 21/252                                         # one-month horizon in years
mu = .05                                           # assumed annual drift
volatility = .307                                  # assumed annual volatility
iterations = 10000
type(aapl_price)
pandas.core.series.Series
aapl_price.describe()
count      1.00
mean     148.84
std         NaN
min      148.84
25%      148.84
50%      148.84
75%      148.84
max      148.84
Name: price, dtype: float64
# checking to see what get_quote_yahoo is retrieving
pdr.get_quote_yahoo('AAPL')
 | language | region | quoteType | typeDisp | quoteSourceName | triggerable | customPriceAlertConfidence | currency | marketState | gmtOffSetMilliseconds | ... | earningsTimestamp | earningsTimestampStart | earningsTimestampEnd | trailingAnnualDividendRate | trailingPE | trailingAnnualDividendYield | epsTrailingTwelveMonths | epsForward | displayName | price
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
AAPL | en-US | US | EQUITY | Equity | Nasdaq Real Time Price | True | HIGH | USD | PREPRE | -14400000 | ... | 1651163400 | 1658779200 | 1659124800 | 0.88 | 24.253 | 0.006 | 6.137 | 6.56 | Apple | 148.84

1 rows × 78 columns
type(aapl_price)
pandas.core.series.Series
type(aapl_value)
numpy.float64
aapl_price
AAPL 148.84 Name: price, dtype: float64
# aapl_value = # of shares times price
aapl_value
148.84
Below, let's simulate the future value of the position with a Geometric Brownian Motion process and estimate the VaR from the resulting distribution of profits and losses.
def VaR(pv, mu, vol, T, iterations):
    # simulate terminal position values under GBM and return the distribution of profit/loss
    end = pv * np.exp((mu - .5 * vol ** 2) * T +
                      vol * np.sqrt(T) * np.random.standard_normal(iterations))
    ending_values = end - pv   # gain/loss relative to today's position value
    return ending_values
at_risk = VaR(aapl_value, mu, volatility, t, iterations)
at_risk
array([ 12.27010694, -3.17392878, -1.72863707, ..., -13.28831742, -6.68840767, 17.63879795])
type(at_risk)
numpy.ndarray
np.shape(at_risk)
(10000,)
np.ndim(at_risk)
1
at_risk.mean()
0.6267317841989819
at_risk.std()
13.313012258045937
plt.hist(at_risk, bins=100)
[Histogram of the simulated one-month P&L distribution (at_risk), 100 bins, spanning roughly -41 to +53 dollars.]
percentiles = [1,5,10]
np.percentile(at_risk, percentiles)
array([-27.7934676 , -20.33712168, -16.12524849])
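These percentiles are the Monte Carlo VaR estimates for the position: with 99%, 95%, and 90% confidence, the one-month loss on one share of AAPL should not exceed roughly 27.8, 20.3, and 16.1 dollars. A short summary loop (my own wrapper around the at_risk and aapl_value objects defined above) prints them next to the normal approximation built from the simulated mean and standard deviation:

for p in [1, 5, 10]:
    mc_var = -np.percentile(at_risk, p)                       # Monte Carlo VaR (loss, positive number)
    norm_var = -(at_risk.mean() +
                 sp.stats.norm.ppf(p / 100) * at_risk.std())  # normal approximation
    print("{}% VaR: Monte Carlo {:.2f}, normal approx {:.2f} ({:.1f}% of the position)".format(
        p, mc_var, norm_var, 100 * mc_var / aapl_value))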