Imagine a 2-dimensional lattice of $n \times n$ magnetic dipole moments (spins), each of which can be in one of two states ($+1$ or $-1$; Ising model). Since the interactions between spins are short-ranged, each spin interacts only with its four nearest neighbors. The probability of finding a spin flipped against its neighbors depends on temperature $T$ as $p \sim e^{-a/T}$, $a = \mathrm{const.}$
At extremely low temperatures $T \rightarrow 0$, neighboring spins have a very low probability of being oppositely oriented, so a uniform overall state (ferromagnetic state) is adopted, characterized by all $+1$ or all $-1$. At very high temperatures $T \rightarrow \infty$, a paramagnetic phase with random spin alignment results, yielding $50\%$ $+1$ and $50\%$ $-1$ orientations. Below the critical temperature, $0 < T < T_c$, stable ferromagnetic domains emerge, with both orientations being equally probable in the absence of an external magnetic field. The spin-spin correlation length diverges at $T_c$, whereas the correlation decays for $T > T_c$.
The data for this task contain the $n \times n$ dipole orientations on the lattice for different temperatures $T$. Classify the two magnetic phases (paramagnetic/ferromagnetic)!
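The Boltzmann-type flip probability $p \sim e^{-a/T}$ can be made concrete with a minimal Metropolis Monte Carlo sketch. This is not how the dataset for this task was generated (see the referenced paper for that); it is only an illustration of how such configurations arise, with hypothetical parameters ($J = 1$, a $32 \times 32$ lattice, periodic boundaries):

```python
import numpy as np

def metropolis_step(spins, T, rng, J=1.0):
    """One Metropolis update on a 2D Ising lattice with periodic boundaries."""
    n = spins.shape[0]
    i, j = rng.integers(0, n, size=2)
    # Sum of the four nearest neighbors (periodic boundary conditions)
    nn = (spins[(i + 1) % n, j] + spins[(i - 1) % n, j]
          + spins[i, (j + 1) % n] + spins[i, (j - 1) % n])
    dE = 2.0 * J * spins[i, j] * nn  # energy cost of flipping spin (i, j)
    # Accept the flip with the Boltzmann weight exp(-dE / T)
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(32, 32))
for _ in range(10000):
    metropolis_step(spins, T=1.0, rng=rng)
```

At low $T$ almost no energetically unfavorable flips are accepted and domains grow; at high $T$ the acceptance probability approaches $1/2$ and the lattice stays random.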
from tensorflow import keras
import numpy as np

callbacks = keras.callbacks
layers = keras.layers

# Keras version used: 2.4.0
# See https://doi.org/10.1038/nphys4035 for more information
import gdown
url = "https://drive.google.com/u/0/uc?export=download&confirm=HgGH&id=1Ihxt1hb3Kyv0IrjHlsYb9x9QY7l7n2Sl"
output = 'ising_data.npz'
gdown.download(url, output, quiet=True)
f = np.load(output, allow_pickle=True)
n_train = 20000
x_train, x_test = f["C"][:n_train], f["C"][n_train:]
T_train, T_test = f["T"][:n_train], f["T"][n_train:]
import matplotlib.pyplot as plt
for i, j in enumerate(np.random.choice(n_train, 6)):
    plt.subplot(2, 3, i + 1)
    plt.imshow(x_train[j])
    plt.title("T: %.2f" % T_train[j])
plt.tight_layout()
plt.show()
plt.hist(T_test)
plt.xlabel("T")
plt.ylabel("frequency")
Tc = 2.27
y_train, y_test = T_train > Tc, T_test > Tc
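Comparing against $T_c$ yields boolean labels: `True` marks the paramagnetic phase ($T > T_c$), `False` the ferromagnetic one. A quick illustration with hypothetical temperatures (not the dataset's values):

```python
import numpy as np

T_demo = np.array([1.0, 2.0, 2.27, 2.5, 3.5])  # hypothetical temperatures
y_demo = T_demo > 2.27  # True = paramagnetic (T > Tc)
print(y_demo)  # [False False False  True  True]
```

Note that a sample at exactly $T = T_c$ is labeled ferromagnetic, since the comparison is strict.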
model = keras.models.Sequential()
model.add(layers.InputLayer(input_shape=(32, 32, 1)))
model.add(layers.Reshape((1024,)))
model.add(layers.Dense(32, activation='relu'))
model.add(layers.Dropout(0.3))
model.add(layers.Dense(32, activation='relu'))
model.add(layers.Dropout(0.3))
model.add(layers.Dense(1, activation='sigmoid'))
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
reshape (Reshape)            (None, 1024)              0
_________________________________________________________________
dense (Dense)                (None, 32)                32800
_________________________________________________________________
dropout (Dropout)            (None, 32)                0
_________________________________________________________________
dense_1 (Dense)              (None, 32)                1056
_________________________________________________________________
dropout_1 (Dropout)          (None, 32)                0
_________________________________________________________________
dense_2 (Dense)              (None, 1)                 33
=================================================================
Total params: 33,889
Trainable params: 33,889
Non-trainable params: 0
_________________________________________________________________
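The parameter counts in the summary can be checked by hand: a dense layer with $m$ inputs and $k$ units has $m \cdot k$ weights plus $k$ biases:

```python
# Dense layer parameters = inputs * units + units (biases)
p_dense   = 1024 * 32 + 32  # first hidden layer
p_dense_1 = 32 * 32 + 32    # second hidden layer
p_dense_2 = 32 * 1 + 1      # sigmoid output layer
total = p_dense + p_dense_1 + p_dense_2
print(p_dense, p_dense_1, p_dense_2, total)  # 32800 1056 33 33889
```

Reshape and Dropout layers contribute no trainable parameters.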
model.compile(
loss='binary_crossentropy',
optimizer=keras.optimizers.Adam(0.001),
metrics=['accuracy'])
results = model.fit(x_train, y_train,
batch_size=64,
epochs=50,
verbose=2,
validation_split=0.1,
callbacks=[
callbacks.EarlyStopping(patience=5, verbose=1),
callbacks.ReduceLROnPlateau(factor=0.67, patience=2, verbose=1)]
)
Epoch 1/50
282/282 - 1s - loss: 0.2575 - accuracy: 0.9009 - val_loss: 0.0738 - val_accuracy: 0.9770
Epoch 2/50
282/282 - 0s - loss: 0.1426 - accuracy: 0.9539 - val_loss: 0.0773 - val_accuracy: 0.9730
Epoch 3/50
282/282 - 0s - loss: 0.1159 - accuracy: 0.9632 - val_loss: 0.0722 - val_accuracy: 0.9755
Epoch 4/50
282/282 - 0s - loss: 0.0941 - accuracy: 0.9714 - val_loss: 0.0673 - val_accuracy: 0.9785
Epoch 5/50
282/282 - 0s - loss: 0.0800 - accuracy: 0.9752 - val_loss: 0.0886 - val_accuracy: 0.9755
Epoch 6/50
282/282 - 0s - loss: 0.0752 - accuracy: 0.9745 - val_loss: 0.0939 - val_accuracy: 0.9690
Epoch 00006: ReduceLROnPlateau reducing learning rate to 0.0006700000318232924.
Epoch 7/50
282/282 - 0s - loss: 0.0571 - accuracy: 0.9822 - val_loss: 0.0935 - val_accuracy: 0.9685
Epoch 8/50
282/282 - 0s - loss: 0.0565 - accuracy: 0.9824 - val_loss: 0.0729 - val_accuracy: 0.9735
Epoch 00008: ReduceLROnPlateau reducing learning rate to 0.0004489000252215192.
Epoch 9/50
282/282 - 0s - loss: 0.0510 - accuracy: 0.9827 - val_loss: 0.0756 - val_accuracy: 0.9725
Epoch 00009: early stopping
plt.figure(1, (12, 4))
plt.subplot(1, 2, 1)
plt.plot(results.history['loss'])
plt.plot(results.history['val_loss'])
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper right')
plt.figure(1, (12, 4))
plt.subplot(1, 2, 1)
plt.plot(results.history['accuracy'])
plt.plot(results.history['val_accuracy'])
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper right')
test_loss, test_accuracy = model.evaluate(x_test, y_test, verbose=0)
import seaborn as sns
preds = model.predict(x_test).round().squeeze()
acc = (preds == y_test).astype(float)  # np.float is deprecated; use the builtin
ax = sns.regplot(x=T_test, y=acc, x_estimator=np.mean, fit_reg=False)
ax.set_ylabel("accuracy")
ax.set_xlabel("T")
plt.axvline(x=Tc, color='k', linestyle='--', label='Tc')
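The same accuracy-versus-temperature curve can be computed without seaborn by averaging the per-sample correctness in equal-width temperature bins. A sketch with hypothetical toy arrays in place of `T_test` and `acc`:

```python
import numpy as np

def accuracy_per_bin(T, correct, n_bins=10):
    """Mean accuracy in equal-width temperature bins."""
    edges = np.linspace(T.min(), T.max(), n_bins + 1)
    idx = np.clip(np.digitize(T, edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    acc = np.array([correct[idx == b].mean() if np.any(idx == b) else np.nan
                    for b in range(n_bins)])
    return centers, acc

# Hypothetical example: predictions fail only near Tc
T = np.array([1.0, 1.5, 2.0, 2.3, 2.5, 3.0])
correct = np.array([1.0, 1.0, 1.0, 0.0, 1.0, 1.0])
centers, acc = accuracy_per_bin(T, correct, n_bins=4)
print(acc)  # [1.  1.  0.5 1. ]
```

Whichever way it is plotted, the accuracy dips in the bins closest to $T_c$, where the two phases are hardest to distinguish.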