The purpose of this notebook is to show how to use entanglish to calculate the squashed entanglement of a density matrix, either pure or mixed.
Consider a bipartite system consisting of two parts labelled by the random variables $\rvx$ and $\rvy$, and described by a density matrix $\rho_{\rvx, \rvy}$. The squashed entanglement of such a system is defined as
$$ E_{\rvx, \rvy}(\rho_{\rvx, \rvy}) = \frac{1}{2} \min S(\rvx : \rvy|\rvalp) \;. $$The min (or, to be more mathematically precise, the infimum) is over all density matrices $\rho_{\rvx, \rvy,\rvalp}$ such that ${\rm tr}_\rvalp \; \rho_{\rvx, \rvy,\rvalp}= \rho_{\rvx, \rvy}$, with $\rho_{\rvx, \rvy}$ held fixed. If $\rho_{\rvx, \rvy}$ is a pure state, then $E_{\rvx, \rvy} = S(\rvx) = S(\rvy)$. Entanglish-Original-Ref discusses other interesting properties of squashed entanglement.
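Here $S(\rvx : \rvy|\rvalp)$ denotes the (quantum) conditional mutual information, which expands into von Neumann entropies of reduced density matrices of $\rho_{\rvx, \rvy, \rvalp}$ (a standard identity, stated here for convenience):

$$ S(\rvx : \rvy|\rvalp) = S(\rvx, \rvalp) + S(\rvy, \rvalp) - S(\rvx, \rvy, \rvalp) - S(\rvalp) \;. $$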
Entanglish-Original-Ref also describes the algorithm used by Entanglish to calculate squashed entanglement. The algorithm is recursive. The number of recursive steps is chosen by the user and is called num_ab_steps (ab stands for Arimoto-Blahut). Another parameter of the algorithm is num_hidden_states, the number of possible $\rvalp$ values.
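To make the quantity being minimized concrete, here is a minimal numpy sketch (independent of entanglish; all function names below are ours, not the library's API) that evaluates the conditional mutual information $S(\rvx:\rvy|\rvalp) = S(\rvx,\rvalp) + S(\rvy,\rvalp) - S(\rvx,\rvy,\rvalp) - S(\rvalp)$ for a 3-qubit density matrix:

```python
import numpy as np

def entropy(rho):
    # von Neumann entropy, natural log
    evas = np.linalg.eigvalsh(rho)
    evas = evas[evas > 1e-12]
    return float(-np.sum(evas * np.log(evas)))

def partial_trace(rho, dims, keep):
    # trace out every subsystem whose index is not in `keep`
    n = len(dims)
    t = rho.reshape(dims + dims)
    cur_n = n
    for i in sorted((j for j in range(n) if j not in keep), reverse=True):
        t = np.trace(t, axis1=i, axis2=i + cur_n)
        cur_n -= 1
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

def cond_mut_info(rho_xya, dims=(2, 2, 2)):
    # S(x:y|a) = S(x,a) + S(y,a) - S(x,y,a) - S(a)
    s_xa = entropy(partial_trace(rho_xya, dims, (0, 2)))
    s_ya = entropy(partial_trace(rho_xya, dims, (1, 2)))
    s_a = entropy(partial_trace(rho_xya, dims, (2,)))
    return s_xa + s_ya - entropy(rho_xya) - s_a

# x, y classically correlated, alpha independent of both
rho_xy = np.diag([.5, 0., 0., .5])
rho = np.kron(rho_xy, np.eye(2) / 2)
print(cond_mut_info(rho))  # ln 2 ≈ 0.6931
```

The squashed-entanglement minimization searches over all extensions $\rho_{\rvx,\rvy,\rvalp}$ whose partial trace over $\rvalp$ reproduces the given $\rho_{\rvx,\rvy}$; the sketch above only evaluates the CMI of one fixed extension.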
Entanglish-Original-Ref
"Squashed Entanglement and a Python Implementation Thereof", by R.R. Tucci
First change your working directory to the entanglish directory on your computer, and add its path to the path environment variable.
import os
import sys
print(os.getcwd())
os.chdir('../../')
print(os.getcwd())
sys.path.insert(0,os.getcwd())
/home/rrtucci/PycharmProjects/Entanglish/entanglish/jupyter_notebooks
/home/rrtucci/PycharmProjects/Entanglish
from entanglish.SymNupState import *
from entanglish.SquashedEnt import *
Next we construct a symmetrized n-up pure state. Then, for various possible bi-partitions of the set of row axes, we compare the entanglement value given by the Arimoto-Blahut algorithm with the known entanglement value. As expected, they are equal.
recursion_init is a str that specifies how to initialize the recursion for Kxy_a. There are currently two options for recursion_init: 'eigen' and 'eigen+'. 'eigen+' uses Dxy.num_rows^2 degrees of freedom, which is sufficient but may not be necessary; 'eigen' uses only Dxy.num_rows degrees of freedom. With both 'eigen' and 'eigen+', the recursion converges instantly for pure states.
num_qbits = 4
num_up = 1
dm1 = DenMat(1 << num_qbits, tuple([2]*num_qbits))
st = SymNupState(num_up, num_qbits)
st_vec = st.get_st_vec()
dm1.set_arr_from_st_vec(st_vec)
recursion_init = 'eigen+'
num_ab_steps = 5
print('recursion_init=', recursion_init)
print('num_ab_steps=', num_ab_steps)
ecase = SquashedEnt(dm1, num_ab_steps,
                    recursion_init=recursion_init, verbose=True)
print('entang_023: algo value, known value\n',
      ecase.get_entang({0, 2, 3}),
      st.get_known_entang(3))
print('entang_02: algo value, known value\n',
      ecase.get_entang({0, 2}),
      st.get_known_entang(2))
print('entang_1: algo value, known value\n',
      ecase.get_entang({1}),
      st.get_known_entang(1))
recursion_init= eigen+
num_ab_steps= 5
initial norm of Dxy - sum_alp Kxy_alp, should be 0 2.7194799110210365e-16
--ab step= 0 , entang= 0.562335 , err= 0.000000
--ab step= 1 , entang= 0.562335 , err= 0.000000
--ab step= 2 , entang= 0.562335 , err= 0.000000
--ab step= 3 , entang= 0.562335 , err= 0.000000
--ab step= 4 , entang= 0.562335 , err= 0.000000
entang_023: algo value, known value
 0.5623351446188093 0.5623351446188083
initial norm of Dxy - sum_alp Kxy_alp, should be 0 2.7194799110210365e-16
--ab step= 0 , entang= 0.693147 , err= 0.000000
--ab step= 1 , entang= 0.693147 , err= 0.000000
--ab step= 2 , entang= 0.693147 , err= 0.000000
--ab step= 3 , entang= 0.693147 , err= 0.000000
--ab step= 4 , entang= 0.693147 , err= 0.000000
entang_02: algo value, known value
 0.6931471805599457 0.6931471805599453
initial norm of Dxy - sum_alp Kxy_alp, should be 0 2.7194799110210365e-16
--ab step= 0 , entang= 0.562335 , err= 0.000000
--ab step= 1 , entang= 0.562335 , err= 0.000000
--ab step= 2 , entang= 0.562335 , err= 0.000000
--ab step= 3 , entang= 0.562335 , err= 0.000000
--ab step= 4 , entang= 0.562335 , err= 0.000000
entang_1: algo value, known value
 0.5623351446188094 0.5623351446188083
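The known values above can be reproduced without entanglish: for a symmetrized state with num_up qubits up, the number of up-qubits landing in a given subset of the qubits follows a hypergeometric distribution, and the entanglement of that bipartition is the classical entropy (natural log) of this distribution. A quick sanity-check sketch (our own helper, not part of the library):

```python
from math import comb, log

def sym_nup_known_entang(num_qbits, num_up, sub_size):
    # hypergeometric probabilities for the number of up-qubits in the subset
    probs = [comb(sub_size, k) * comb(num_qbits - sub_size, num_up - k)
             / comb(num_qbits, num_up)
             for k in range(max(0, num_up - (num_qbits - sub_size)),
                            min(sub_size, num_up) + 1)]
    # entanglement = classical entropy (natural log) of that distribution
    return -sum(p * log(p) for p in probs if p > 0)

print(sym_nup_known_entang(4, 1, 3))  # ≈ 0.5623, matches entang_023
print(sym_nup_known_entang(4, 1, 2))  # ≈ 0.6931, matches entang_02
```

For num_qbits=4, num_up=1 this gives 0.5623... for subsets of size 1 or 3 and ln 2 = 0.6931... for size 2, in agreement with the algo and known values printed above.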
Next we consider two random density matrices (more precisely, only their eigenvectors are random; their eigenvalues are specified by the user). For each of the two density matrices, we calculate the Arimoto-Blahut algorithm's entanglement value for a bi-partition of the set of row axes.
np.random.seed(123)
dm = DenMat(8, (2, 2, 2))
evas_of_dm_list = [
    np.array([.07, .03, .25, .15, .3, .1, .06, .04]),
    np.array([.05, .05, .2, .2, .3, .1, .06, .04])
]
recursion_init = 'eigen+'
num_ab_steps = 100
print('recursion_init=', recursion_init)
print('num_ab_steps=', num_ab_steps)
for evas_of_dm in evas_of_dm_list:
    evas_of_dm /= np.sum(evas_of_dm)
    print('***************new dm')
    print('evas_of_dm\n', evas_of_dm)
    dm.set_arr_to_rand_den_mat(evas_of_dm)
    ecase = SquashedEnt(dm, num_ab_steps,
                        recursion_init=recursion_init, verbose=True)
    print('ent_02_1=', ecase.get_entang({0, 2}))
recursion_init= eigen+
num_ab_steps= 100
***************new dm
evas_of_dm
 [0.07 0.03 0.25 0.15 0.3 0.1 0.06 0.04]
initial norm of Dxy - sum_alp Kxy_alp, should be 0 4.880055036277153e-16
--ab step= 0 , entang= 0.446881 , err= 26.212601
--ab step= 1 , entang= 0.169440 , err= 18.442138
--ab step= 2 , entang= 0.240223 , err= 8.346209
--ab step= 3 , entang= 0.090830 , err= 2.642048
--ab step= 4 , entang= 0.051815 , err= 0.848501
--ab step= 5 , entang= 0.041352 , err= 0.599272
--ab step= 6 , entang= 0.034656 , err= 0.515530
--ab step= 7 , entang= 0.029621 , err= 0.450331
--ab step= 8 , entang= 0.025599 , err= 0.402359
--ab step= 9 , entang= 0.022220 , err= 0.367524
--ab step= 10 , entang= 0.019274 , err= 0.344233
--ab step= 11 , entang= 0.016668 , err= 0.328257
--ab step= 12 , entang= 0.014370 , err= 0.316916
--ab step= 13 , entang= 0.012373 , err= 0.307283
--ab step= 14 , entang= 0.010676 , err= 0.298265
--ab step= 15 , entang= 0.009261 , err= 0.289201
--ab step= 16 , entang= 0.008100 , err= 0.279912
--ab step= 17 , entang= 0.007158 , err= 0.270411
--ab step= 18 , entang= 0.006397 , err= 0.260751
--ab step= 19 , entang= 0.005784 , err= 0.251003
--ab step= 20 , entang= 0.005289 , err= 0.241213
--ab step= 21 , entang= 0.004887 , err= 0.231420
--ab step= 22 , entang= 0.004561 , err= 0.221658
--ab step= 23 , entang= 0.004294 , err= 0.211962
--ab step= 24 , entang= 0.004075 , err= 0.202367
--ab step= 25 , entang= 0.003896 , err= 0.192912
--ab step= 26 , entang= 0.003748 , err= 0.183631
--ab step= 27 , entang= 0.003626 , err= 0.174525
--ab step= 28 , entang= 0.003525 , err= 0.165595
--ab step= 29 , entang= 0.003442 , err= 0.156900
--ab step= 30 , entang= 0.003374 , err= 0.148464
--ab step= 31 , entang= 0.003317 , err= 0.140289
--ab step= 32 , entang= 0.003270 , err= 0.132369
--ab step= 33 , entang= 0.003231 , err= 0.124695
--ab step= 34 , entang= 0.003198 , err= 0.117257
--ab step= 35 , entang= 0.003171 , err= 0.110049
--ab step= 36 , entang= 0.003149 , err= 0.103073
--ab step= 37 , entang= 0.003130 , err= 0.096333
--ab step= 38 , entang= 0.003115 , err= 0.089844
--ab step= 39 , entang= 0.003101 , err= 0.083622
--ab step= 40 , entang= 0.003090 , err= 0.077686
--ab step= 41 , entang= 0.003081 , err= 0.072055
--ab step= 42 , entang= 0.003073 , err= 0.066746
--ab step= 43 , entang= 0.003067 , err= 0.061771
--ab step= 44 , entang= 0.003061 , err= 0.057135
--ab step= 45 , entang= 0.003056 , err= 0.052841
--ab step= 46 , entang= 0.003052 , err= 0.048883
--ab step= 47 , entang= 0.003048 , err= 0.045250
--ab step= 48 , entang= 0.003045 , err= 0.041929
--ab step= 49 , entang= 0.003042 , err= 0.038903
--ab step= 50 , entang= 0.003040 , err= 0.036151
--ab step= 51 , entang= 0.003038 , err= 0.033654
--ab step= 52 , entang= 0.003036 , err= 0.031390
--ab step= 53 , entang= 0.003034 , err= 0.029340
--ab step= 54 , entang= 0.003033 , err= 0.027484
--ab step= 55 , entang= 0.003032 , err= 0.025803
--ab step= 56 , entang= 0.003031 , err= 0.024280
--ab step= 57 , entang= 0.003030 , err= 0.022899
--ab step= 58 , entang= 0.003029 , err= 0.021646
--ab step= 59 , entang= 0.003028 , err= 0.020506
--ab step= 60 , entang= 0.003027 , err= 0.019468
--ab step= 61 , entang= 0.003026 , err= 0.018519
--ab step= 62 , entang= 0.003026 , err= 0.017650
--ab step= 63 , entang= 0.003025 , err= 0.016850
--ab step= 64 , entang= 0.003025 , err= 0.016111
--ab step= 65 , entang= 0.003024 , err= 0.015426
--ab step= 66 , entang= 0.003024 , err= 0.014787
--ab step= 67 , entang= 0.003023 , err= 0.014188
--ab step= 68 , entang= 0.003023 , err= 0.013625
--ab step= 69 , entang= 0.003023 , err= 0.013092
--ab step= 70 , entang= 0.003022 , err= 0.012586
--ab step= 71 , entang= 0.003022 , err= 0.012104
--ab step= 72 , entang= 0.003022 , err= 0.011644
--ab step= 73 , entang= 0.003022 , err= 0.011203
--ab step= 74 , entang= 0.003021 , err= 0.010781
--ab step= 75 , entang= 0.003021 , err= 0.010375
--ab step= 76 , entang= 0.003021 , err= 0.009985
--ab step= 77 , entang= 0.003021 , err= 0.009610
--ab step= 78 , entang= 0.003021 , err= 0.009249
--ab step= 79 , entang= 0.003021 , err= 0.008901
--ab step= 80 , entang= 0.003020 , err= 0.008566
--ab step= 81 , entang= 0.003020 , err= 0.008244
--ab step= 82 , entang= 0.003020 , err= 0.007933
--ab step= 83 , entang= 0.003020 , err= 0.007635
--ab step= 84 , entang= 0.003020 , err= 0.007347
--ab step= 85 , entang= 0.003020 , err= 0.007070
--ab step= 86 , entang= 0.003020 , err= 0.006804
--ab step= 87 , entang= 0.003020 , err= 0.006548
--ab step= 88 , entang= 0.003020 , err= 0.006303
--ab step= 89 , entang= 0.003020 , err= 0.006066
--ab step= 90 , entang= 0.003019 , err= 0.005840
--ab step= 91 , entang= 0.003019 , err= 0.005622
--ab step= 92 , entang= 0.003019 , err= 0.005413
--ab step= 93 , entang= 0.003019 , err= 0.005213
--ab step= 94 , entang= 0.003019 , err= 0.005021
--ab step= 95 , entang= 0.003019 , err= 0.004837
--ab step= 96 , entang= 0.003019 , err= 0.004661
--ab step= 97 , entang= 0.003019 , err= 0.004493
--ab step= 98 , entang= 0.003019 , err= 0.004332
--ab step= 99 , entang= 0.003019 , err= 0.004178
ent_02_1= 0.0030188886598406317
***************new dm
evas_of_dm
 [0.05 0.05 0.2 0.2 0.3 0.1 0.06 0.04]
initial norm of Dxy - sum_alp Kxy_alp, should be 0 3.5153434668415376e-16
--ab step= 0 , entang= 0.441242 , err= 26.182794
--ab step= 1 , entang= 0.178945 , err= 17.149929
--ab step= 2 , entang= 0.157736 , err= 8.343695
--ab step= 3 , entang= 0.080895 , err= 3.064098
--ab step= 4 , entang= 0.053597 , err= 0.823209
--ab step= 5 , entang= 0.044044 , err= 0.512262
--ab step= 6 , entang= 0.037648 , err= 0.446695
--ab step= 7 , entang= 0.032125 , err= 0.419670
--ab step= 8 , entang= 0.027126 , err= 0.403061
--ab step= 9 , entang= 0.022587 , err= 0.389903
--ab step= 10 , entang= 0.018556 , err= 0.376987
--ab step= 11 , entang= 0.015128 , err= 0.361961
--ab step= 12 , entang= 0.012336 , err= 0.344634
--ab step= 13 , entang= 0.010124 , err= 0.326300
--ab step= 14 , entang= 0.008395 , err= 0.308335
--ab step= 15 , entang= 0.007049 , err= 0.291393
--ab step= 16 , entang= 0.005998 , err= 0.275618
--ab step= 17 , entang= 0.005171 , err= 0.260934
--ab step= 18 , entang= 0.004516 , err= 0.247196
--ab step= 19 , entang= 0.003991 , err= 0.234285
--ab step= 20 , entang= 0.003565 , err= 0.222246
--ab step= 21 , entang= 0.003215 , err= 0.211055
--ab step= 22 , entang= 0.002925 , err= 0.200671
--ab step= 23 , entang= 0.002680 , err= 0.191050
--ab step= 24 , entang= 0.002472 , err= 0.182143
--ab step= 25 , entang= 0.002293 , err= 0.173895
--ab step= 26 , entang= 0.002138 , err= 0.166252
--ab step= 27 , entang= 0.002003 , err= 0.159154
--ab step= 28 , entang= 0.001883 , err= 0.152555
--ab step= 29 , entang= 0.001777 , err= 0.146431
--ab step= 30 , entang= 0.001681 , err= 0.140749
--ab step= 31 , entang= 0.001596 , err= 0.135479
--ab step= 32 , entang= 0.001518 , err= 0.130593
--ab step= 33 , entang= 0.001448 , err= 0.126066
--ab step= 34 , entang= 0.001384 , err= 0.121875
--ab step= 35 , entang= 0.001325 , err= 0.117998
--ab step= 36 , entang= 0.001270 , err= 0.114413
--ab step= 37 , entang= 0.001221 , err= 0.111097
--ab step= 38 , entang= 0.001174 , err= 0.108027
--ab step= 39 , entang= 0.001132 , err= 0.105181
--ab step= 40 , entang= 0.001092 , err= 0.102536
--ab step= 41 , entang= 0.001056 , err= 0.100070
--ab step= 42 , entang= 0.001022 , err= 0.097761
--ab step= 43 , entang= 0.000990 , err= 0.095592
--ab step= 44 , entang= 0.000961 , err= 0.093543
--ab step= 45 , entang= 0.000933 , err= 0.091599
--ab step= 46 , entang= 0.000908 , err= 0.089745
--ab step= 47 , entang= 0.000884 , err= 0.087971
--ab step= 48 , entang= 0.000862 , err= 0.086266
--ab step= 49 , entang= 0.000842 , err= 0.084621
--ab step= 50 , entang= 0.000823 , err= 0.083031
--ab step= 51 , entang= 0.000805 , err= 0.081490
--ab step= 52 , entang= 0.000788 , err= 0.079994
--ab step= 53 , entang= 0.000772 , err= 0.078540
--ab step= 54 , entang= 0.000758 , err= 0.077125
--ab step= 55 , entang= 0.000744 , err= 0.075749
--ab step= 56 , entang= 0.000732 , err= 0.074409
--ab step= 57 , entang= 0.000720 , err= 0.073106
--ab step= 58 , entang= 0.000709 , err= 0.071838
--ab step= 59 , entang= 0.000698 , err= 0.070605
--ab step= 60 , entang= 0.000688 , err= 0.069408
--ab step= 61 , entang= 0.000679 , err= 0.068246
--ab step= 62 , entang= 0.000671 , err= 0.067120
--ab step= 63 , entang= 0.000662 , err= 0.066009
--ab step= 64 , entang= 0.000655 , err= 0.064327
--ab step= 65 , entang= 0.000648 , err= 0.062313
--ab step= 66 , entang= 0.000642 , err= 0.060533
--ab step= 67 , entang= 0.000636 , err= 0.059003
--ab step= 68 , entang= 0.000630 , err= 0.057613
--ab step= 69 , entang= 0.000625 , err= 0.056295
--ab step= 70 , entang= 0.000620 , err= 0.055025
--ab step= 71 , entang= 0.000615 , err= 0.053807
--ab step= 72 , entang= 0.000611 , err= 0.052639
--ab step= 73 , entang= 0.000607 , err= 0.051523
--ab step= 74 , entang= 0.000603 , err= 0.050453
--ab step= 75 , entang= 0.000599 , err= 0.049426
--ab step= 76 , entang= 0.000596 , err= 0.048439
--ab step= 77 , entang= 0.000593 , err= 0.047488
--ab step= 78 , entang= 0.000590 , err= 0.046571
--ab step= 79 , entang= 0.000587 , err= 0.045685
--ab step= 80 , entang= 0.000584 , err= 0.044826
--ab step= 81 , entang= 0.000582 , err= 0.043994
--ab step= 82 , entang= 0.000580 , err= 0.043184
--ab step= 83 , entang= 0.000577 , err= 0.042395
--ab step= 84 , entang= 0.000575 , err= 0.041625
--ab step= 85 , entang= 0.000573 , err= 0.040873
--ab step= 86 , entang= 0.000571 , err= 0.040135
--ab step= 87 , entang= 0.000570 , err= 0.039412
--ab step= 88 , entang= 0.000568 , err= 0.038701
--ab step= 89 , entang= 0.000566 , err= 0.038001
--ab step= 90 , entang= 0.000565 , err= 0.037311
--ab step= 91 , entang= 0.000563 , err= 0.036631
--ab step= 92 , entang= 0.000562 , err= 0.035959
--ab step= 93 , entang= 0.000561 , err= 0.035296
--ab step= 94 , entang= 0.000560 , err= 0.034639
--ab step= 95 , entang= 0.000558 , err= 0.033990
--ab step= 96 , entang= 0.000557 , err= 0.033347
--ab step= 97 , entang= 0.000556 , err= 0.032711
--ab step= 98 , entang= 0.000555 , err= 0.032080
--ab step= 99 , entang= 0.000554 , err= 0.031456
ent_02_1= 0.0005543764855846465