Improvise a Jazz Solo with an LSTM Network

Welcome to your final programming assignment of this week! In this notebook, you will implement a model that uses an LSTM to generate music. You will even be able to listen to your own music at the end of the assignment.

You will learn to:

  • Apply an LSTM to music generation.
  • Generate your own jazz music with deep learning.

Updates

If you were working on the notebook before this update...

  • The current notebook is version "3a".
  • You can find your original work saved in the notebook with the previous version name ("v3").
  • To view the file directory, go to the menu "File->Open", and this will open a new tab that shows the file directory.

List of updates

  • djmodel
    • Explains Input layer and its parameter shape.
    • Explains Lambda layer and replaces the given solution with hints and sample code (to improve the learning experience).
    • Adds hints for using the Keras Model.
  • music_inference_model
    • Explains each line of code in the one_hot function.
    • Explains how to apply one_hot with a Lambda layer instead of giving the code solution (to improve the learning experience).
    • Adds instructions on defining the Model.
  • predict_and_sample
    • Provides detailed instructions for each step.
    • Clarifies which variable/function to use for inference.
  • Spelling, grammar and wording corrections.

Please run the following cell to load all the packages required in this assignment. This may take a few minutes.

In [18]:
from __future__ import print_function
import IPython
import sys
from music21 import *
import numpy as np
from grammar import *
from qa import *
from preprocess import * 
from music_utils import *
from data_utils import *
from keras.models import load_model, Model
from keras.layers import Dense, Activation, Dropout, Input, LSTM, Reshape, Lambda, RepeatVector
from keras.initializers import glorot_uniform
from keras.utils import to_categorical
from keras.optimizers import Adam
from keras import backend as K

1 - Problem statement

You would like to create a jazz music piece specially for a friend's birthday. However, you don't know any instruments or music composition. Fortunately, you know deep learning and will solve this problem using an LSTM network.

You will train a network to generate novel jazz solos in a style representative of a body of performed work.

1.1 - Dataset

You will train your algorithm on a corpus of Jazz music. Run the cell below to listen to a snippet of the audio from the training set:

In [20]:
IPython.display.Audio('./data/30s_seq.mp3')
Out[20]:

We have taken care of the preprocessing of the musical data to render it in terms of musical "values."

Details about music (optional)

You can informally think of each "value" as a note, which comprises a pitch and duration. For example, if you press down a specific piano key for 0.5 seconds, then you have just played a note. In music theory, a "value" is actually more complicated than this--specifically, it also captures the information needed to play multiple notes at the same time. For example, when playing a music piece, you might press down two piano keys at the same time (playing multiple notes at the same time generates what's called a "chord"). But we don't need to worry about the details of music theory for this assignment.

Music as a sequence of values

  • For the purpose of this assignment, all you need to know is that we will obtain a dataset of values, and will train an RNN model to generate sequences of values.
  • Our music generation system will use 78 unique values.

Run the following code to load the raw music data and preprocess it into values. This might take a few minutes.

In [22]:
X, Y, n_values, indices_values = load_music_utils()
print('number of training examples:', X.shape[0])
print('Tx (length of sequence):', X.shape[1])
print('total # of unique values:', n_values)
print('shape of X:', X.shape)
print('Shape of Y:', Y.shape)
number of training examples: 60
Tx (length of sequence): 30
total # of unique values: 78
shape of X: (60, 30, 78)
Shape of Y: (30, 60, 78)

You have just loaded the following:

  • X: This is an (m, $T_x$, 78) dimensional array.

    • We have m training examples, each of which is a snippet of $T_x = 30$ musical values.
    • At each time step, the input is one of 78 different possible values, represented as a one-hot vector.
      • For example, X[i,t,:] is a one-hot vector representing the value of the i-th example at time t.
  • Y: a $(T_y, m, 78)$ dimensional array

    • This is essentially the same as X, but shifted one step to the left (to the past).
    • Notice that the data in Y is reordered to be dimension $(T_y, m, 78)$, where $T_y = T_x$. This format makes it more convenient to feed into the LSTM later.
    • Similar to the dinosaur assignment, we're using the previous values to predict the next value.
      • So our sequence model will try to predict $y^{\langle t \rangle}$ given $x^{\langle 1\rangle}, \ldots, x^{\langle t \rangle}$.
  • n_values: The number of unique values in this dataset. This should be 78.

  • indices_values: python dictionary mapping integers 0 through 77 to musical values.
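The relationship between X and Y can be pictured with toy numpy arrays. The sketch below uses made-up shapes and a wrap-around shift purely for illustration; the real preprocessing in load_music_utils() may handle the final time step differently.

```python
import numpy as np

# Toy dimensions standing in for (m, Tx, n_values) = (60, 30, 78)
m, Tx, n_values = 2, 4, 5

# Build a toy X of one-hot vectors from random value indices
rng = np.random.default_rng(0)
idx = rng.integers(0, n_values, size=(m, Tx))
X = np.eye(n_values)[idx]               # shape (m, Tx, n_values)

# Each X[i, t, :] is one-hot: exactly one entry equals 1
assert X.shape == (m, Tx, n_values)
assert (X.sum(axis=-1) == 1).all()

# Y reorders X to (Ty, m, n_values) with Ty = Tx, shifted one step:
# Y[t] holds the value at time t+1 (the "next value" the model predicts)
Y = np.swapaxes(np.roll(X, -1, axis=1), 0, 1)
assert Y.shape == (Tx, m, n_values)
assert (Y[0] == X[:, 1, :]).all()
```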

1.2 - Overview of our model

Here is the architecture of the model we will use. This is similar to the Dinosaurus model, except that you will implement it in Keras.

  • $X = (x^{\langle 1 \rangle}, x^{\langle 2 \rangle}, \cdots, x^{\langle T_x \rangle})$ is a window of size $T_x$ scanned over the musical corpus.
  • Each $x^{\langle t \rangle}$ is an index corresponding to a value.
  • $\hat{y}^{\langle t \rangle}$ is the prediction for the next value.
  • We will be training the model on random snippets of 30 values taken from a much longer piece of music.
    • Thus, we won't bother to set the first input $x^{\langle 1 \rangle} = \vec{0}$, since most of these snippets of audio start somewhere in the middle of a piece of music.
    • We are setting each of the snippets to have the same length $T_x = 30$ to make vectorization easier.
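The windowing idea can be sketched with plain numpy on a made-up corpus of value indices (the real dataset preparation is already done for you by load_music_utils(); this is only an illustration of drawing fixed-length snippets from a longer piece).

```python
import numpy as np

# A long "piece of music" as a sequence of value indices (made-up data)
rng = np.random.default_rng(1)
corpus = rng.integers(0, 78, size=500)

Tx = 30   # fixed snippet length, as in the assignment
m = 60    # number of training snippets

# Draw m random windows of length Tx from the corpus
starts = rng.integers(0, len(corpus) - Tx, size=m)
snippets = np.stack([corpus[s:s + Tx] for s in starts])

# Each row is a snippet starting somewhere in the middle of the piece
assert snippets.shape == (m, Tx)
```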

Overview of parts 2 and 3

  • We're going to train a model that predicts the next note in a style that is similar to the jazz music that it's trained on. The training is contained in the weights and biases of the model.
  • In Part 3, we're then going to use those weights and biases in a new model which predicts a series of notes, using the previous note to predict the next note.
  • The weights and biases are transferred to the new model using the 'global shared layers' described below.

2 - Building the model

  • In this part you will build and train a model that will learn musical patterns.
  • The model takes input X of shape $(m, T_x, 78)$ and labels Y of shape $(T_y, m, 78)$.
  • We will use an LSTM with hidden states that have $n_{a} = 64$ dimensions.
In [23]:
# number of dimensions for the hidden state of each LSTM cell.
n_a = 64 

Sequence generation uses a for-loop

  • If you're building an RNN where, at test time, the entire input sequence $x^{\langle 1 \rangle}, x^{\langle 2 \rangle}, \ldots, x^{\langle T_x \rangle}$ is given in advance, then Keras has simple built-in functions to build the model.
  • However, for sequence generation, at test time we don't know all the values of $x^{\langle t\rangle}$ in advance.
  • Instead we generate them one at a time using $x^{\langle t\rangle} = y^{\langle t-1 \rangle}$.
    • The input at time "t" is the prediction at the previous time step "t-1".
  • So you'll need to implement your own for-loop to iterate over the time steps.

Shareable weights

  • The function djmodel() will call the LSTM layer $T_x$ times using a for-loop.
  • It is important that all $T_x$ copies have the same weights.
    • The $T_x$ steps should have shared weights that aren't re-initialized.
  • Referencing a globally defined shared layer will utilize the same layer-object instance at each time step.
  • The key steps for implementing layers with shareable weights in Keras are:
  1. Define the layer objects (we will use global variables for this).
  2. Call these objects when propagating the input.
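The idea behind shared weights can be illustrated with a minimal stand-in class in plain numpy (not real Keras; SharedDense is a hypothetical name used only for this sketch): the weights are created once in the constructor, and every call reuses them.

```python
import numpy as np

# Minimal stand-in for a Keras layer object: weights are created once,
# and every call reuses them (no re-initialization).
class SharedDense:
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_in, n_out))  # created once

    def __call__(self, x):
        return x @ self.W                            # same W on every call

densor_demo = SharedDense(4, 3)
x1 = np.ones((1, 4))

# Calling the same object at different "time steps" uses identical weights
assert np.array_equal(densor_demo(x1), densor_demo(x1))

# A second object would have its own, separately initialized weights
other = SharedDense(4, 3, seed=1)
assert not np.array_equal(densor_demo.W, other.W)
```

This is why djmodel() calls the globally defined reshapor, LSTM_cell, and densor inside the loop rather than constructing new layers at each time step.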

3 types of layers

  • We have defined the layer objects you need as global variables.
  • Please run the next cell to create them.
  • Please read the Keras documentation and understand these layers:
    • Reshape(): Reshapes an output to a certain shape.
    • LSTM(): Long Short-Term Memory layer
    • Dense(): A regular fully-connected neural network layer.
In [25]:
n_values = 78 # number of music values
reshapor = Reshape((1, n_values))                  # Used in Step 2.B of djmodel(), below
LSTM_cell = LSTM(n_a, return_state=True)           # Used in Step 2.C
densor = Dense(n_values, activation='softmax')     # Used in Step 2.D
  • reshapor, LSTM_cell and densor are globally defined layer objects that you'll use to implement djmodel().
  • To propagate a Keras tensor object X through one of these layers, call the layer object on the tensor.
    • For one input, use layer_object(X)
    • For more than one input, put the inputs in a list: layer_object([X1, X2])

Exercise: Implement djmodel().

Inputs (given)

  • The Input() layer is used for defining the input X as well as the initial hidden state 'a0' and cell state c0.
  • The shape parameter takes a tuple that does not include the batch dimension (m).
    • For example,
      X = Input(shape=(Tx, n_values)) # X has 3 dimensions and not 2: (m, Tx, n_values)
      
Step 1: Outputs (TODO)

  1. Create an empty list "outputs" to save the outputs of the LSTM cell at every time step.

Step 2: Loop through time steps (TODO)

  • Loop for $t \in 1, \ldots, T_x$:

2A. Select the 't' time-step vector from X.

  • X has the shape (m, Tx, n_values).
  • The shape of the 't' selection should be (n_values,). (Keras shape conventions omit the batch dimension m.)
  • Recall that if you were implementing in numpy instead of Keras, you would extract a slice from a 3D numpy array like this:
    var1 = array1[:,1,:]
    

Lambda layer

  • Since we're using Keras, we need to define this step inside a custom layer.
  • In Keras, this is done with a Lambda layer.
  • As an example, a Lambda layer that takes the previous layer's output and adds '1' looks like this:
         lambda_layer1 = Lambda(lambda z: z + 1)(previous_layer)
  • The previous layer in this case is X.
  • z is a local variable of the lambda function.
    • The previous_layer gets passed into the parameter z in the lowercase lambda function.
    • You can choose the name of the variable to be something else if you want.
  • The operation after the colon ':' should be the operation to extract a slice from the previous layer.
  • Hint: You'll be using the variable t within the definition of the lambda layer even though it isn't passed in as an argument to Lambda.
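Before wrapping the slice in a Lambda layer, it may help to check what the slicing operation itself does on a plain numpy array (toy shapes; inside Keras the batch dimension would show up as None):

```python
import numpy as np

# What the Lambda layer's slice computes, shown on a plain numpy array
m, Tx, n_values = 2, 30, 78
X_demo = np.zeros((m, Tx, n_values))

t = 5
slice_fn = lambda z: z[:, t, :]    # the operation to put after the colon

x_t = slice_fn(X_demo)
# One time step per example: the batch dimension stays, Tx disappears
assert x_t.shape == (m, n_values)
```

Note that t is captured from the enclosing scope, exactly as the hint above describes for the loop in djmodel().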

2B. Reshape x to be (1,n_values).

  • Use the reshapor() layer. Like the other layer objects, it takes the previous layer's output as its input argument.

2C. Run x through one step of LSTM_cell.

  • Initialize the LSTM_cell with the previous step's hidden state $a$ and cell state $c$.
  • Use the following formatting:
    next_hidden_state, _, next_cell_state = LSTM_cell(inputs=input_x, initial_state=[previous_hidden_state, previous_cell_state])
    
    • Choose appropriate variables for inputs, hidden state and cell state.

2D. Dense layer

  • Propagate the LSTM's hidden state through a dense+softmax layer using densor.

2E. Append output

  • Append the output to the list of "outputs".

Step 3: After the loop, create the model

  • Use the Keras Model object to create a model.
  • Specify the inputs and outputs:
    model = Model(inputs=[input_x, initial_hidden_state, initial_cell_state], outputs=the_outputs)
    
    • Choose the appropriate variables for the input tensor, hidden state, cell state, and output.
  • See the documentation for Model
In [38]:
# GRADED FUNCTION: djmodel

def djmodel(Tx, n_a, n_values):
    """
    Implement the model
    
    Arguments:
    Tx -- length of the sequence in a corpus
    n_a -- the number of activations used in our model
    n_values -- number of unique values in the music data 
    
    Returns:
    model -- a Keras model instance with n_a activations
    """
    
    # Define the input layer and specify the shape
    X = Input(shape=(Tx, n_values))
    
    # Define the initial hidden state a0 and initial cell state c0
    # using `Input`
    a0 = Input(shape=(n_a,), name='a0')
    c0 = Input(shape=(n_a,), name='c0')
    a = a0
    c = c0
    
    ### START CODE HERE ### 
    # Step 1: Create empty list to append the outputs while you iterate (≈1 line)
    outputs = []
    
    # Step 2: Loop
    for t in range(Tx):
        
        # Step 2.A: select the "t"th time step vector from X.
        # (use a distinct variable name inside the lambda to avoid shadowing x)
        x = Lambda(lambda z: z[:, t, :])(X)
        # Step 2.B: Use reshapor to reshape x to be (1, n_values) (≈1 line)
        x = reshapor(x)
        # Step 2.C: Perform one step of the LSTM_cell
        a, _, c = LSTM_cell(x, initial_state=[a, c])
        # Step 2.D: Apply densor to the hidden state output of LSTM_Cell
        out = densor(a)
        # Step 2.E: add the output to "outputs"
        outputs.append(out)
        
    # Step 3: Create model instance
    model = Model(inputs=[X, a0, c0], outputs=outputs)
    
    ### END CODE HERE ###
    
    return model

Create the model object

  • Run the following cell to define your model.
  • We will use Tx=30, n_a=64 (the dimension of the LSTM activations), and n_values=78.
  • This cell may take a few seconds to run.
In [39]:
model = djmodel(Tx = 30 , n_a = 64, n_values = 78)
In [40]:
# Check your model
model.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_5 (InputLayer)             (None, 30, 78)        0                                            
____________________________________________________________________________________________________
lambda_141 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
reshape_3 (Reshape)              (None, 1, 78)         0           lambda_141[0][0]                 
                                                                   lambda_142[0][0]                 
                                                                   lambda_143[0][0]                 
                                                                   lambda_144[0][0]                 
                                                                   lambda_145[0][0]                 
                                                                   lambda_146[0][0]                 
                                                                   lambda_147[0][0]                 
                                                                   lambda_148[0][0]                 
                                                                   lambda_149[0][0]                 
                                                                   lambda_150[0][0]                 
                                                                   lambda_151[0][0]                 
                                                                   lambda_152[0][0]                 
                                                                   lambda_153[0][0]                 
                                                                   lambda_154[0][0]                 
                                                                   lambda_155[0][0]                 
                                                                   lambda_156[0][0]                 
                                                                   lambda_157[0][0]                 
                                                                   lambda_158[0][0]                 
                                                                   lambda_159[0][0]                 
                                                                   lambda_160[0][0]                 
                                                                   lambda_161[0][0]                 
                                                                   lambda_162[0][0]                 
                                                                   lambda_163[0][0]                 
                                                                   lambda_164[0][0]                 
                                                                   lambda_165[0][0]                 
                                                                   lambda_166[0][0]                 
                                                                   lambda_167[0][0]                 
                                                                   lambda_168[0][0]                 
                                                                   lambda_169[0][0]                 
                                                                   lambda_170[0][0]                 
____________________________________________________________________________________________________
a0 (InputLayer)                  (None, 64)            0                                            
____________________________________________________________________________________________________
c0 (InputLayer)                  (None, 64)            0                                            
____________________________________________________________________________________________________
lambda_142 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lstm_3 (LSTM)                    [(None, 64), (None, 6 36608       reshape_3[30][0]                 
                                                                   a0[0][0]                         
                                                                   c0[0][0]                         
                                                                   reshape_3[31][0]                 
                                                                   lstm_3[30][0]                    
                                                                   lstm_3[30][2]                    
                                                                   reshape_3[32][0]                 
                                                                   lstm_3[31][0]                    
                                                                   lstm_3[31][2]                    
                                                                   reshape_3[33][0]                 
                                                                   lstm_3[32][0]                    
                                                                   lstm_3[32][2]                    
                                                                   reshape_3[34][0]                 
                                                                   lstm_3[33][0]                    
                                                                   lstm_3[33][2]                    
                                                                   reshape_3[35][0]                 
                                                                   lstm_3[34][0]                    
                                                                   lstm_3[34][2]                    
                                                                   reshape_3[36][0]                 
                                                                   lstm_3[35][0]                    
                                                                   lstm_3[35][2]                    
                                                                   reshape_3[37][0]                 
                                                                   lstm_3[36][0]                    
                                                                   lstm_3[36][2]                    
                                                                   reshape_3[38][0]                 
                                                                   lstm_3[37][0]                    
                                                                   lstm_3[37][2]                    
                                                                   reshape_3[39][0]                 
                                                                   lstm_3[38][0]                    
                                                                   lstm_3[38][2]                    
                                                                   reshape_3[40][0]                 
                                                                   lstm_3[39][0]                    
                                                                   lstm_3[39][2]                    
                                                                   reshape_3[41][0]                 
                                                                   lstm_3[40][0]                    
                                                                   lstm_3[40][2]                    
                                                                   reshape_3[42][0]                 
                                                                   lstm_3[41][0]                    
                                                                   lstm_3[41][2]                    
                                                                   reshape_3[43][0]                 
                                                                   lstm_3[42][0]                    
                                                                   lstm_3[42][2]                    
                                                                   reshape_3[44][0]                 
                                                                   lstm_3[43][0]                    
                                                                   lstm_3[43][2]                    
                                                                   reshape_3[45][0]                 
                                                                   lstm_3[44][0]                    
                                                                   lstm_3[44][2]                    
                                                                   reshape_3[46][0]                 
                                                                   lstm_3[45][0]                    
                                                                   lstm_3[45][2]                    
                                                                   reshape_3[47][0]                 
                                                                   lstm_3[46][0]                    
                                                                   lstm_3[46][2]                    
                                                                   reshape_3[48][0]                 
                                                                   lstm_3[47][0]                    
                                                                   lstm_3[47][2]                    
                                                                   reshape_3[49][0]                 
                                                                   lstm_3[48][0]                    
                                                                   lstm_3[48][2]                    
                                                                   reshape_3[50][0]                 
                                                                   lstm_3[49][0]                    
                                                                   lstm_3[49][2]                    
                                                                   reshape_3[51][0]                 
                                                                   lstm_3[50][0]                    
                                                                   lstm_3[50][2]                    
                                                                   reshape_3[52][0]                 
                                                                   lstm_3[51][0]                    
                                                                   lstm_3[51][2]                    
                                                                   reshape_3[53][0]                 
                                                                   lstm_3[52][0]                    
                                                                   lstm_3[52][2]                    
                                                                   reshape_3[54][0]                 
                                                                   lstm_3[53][0]                    
                                                                   lstm_3[53][2]                    
                                                                   reshape_3[55][0]                 
                                                                   lstm_3[54][0]                    
                                                                   lstm_3[54][2]                    
                                                                   reshape_3[56][0]                 
                                                                   lstm_3[55][0]                    
                                                                   lstm_3[55][2]                    
                                                                   reshape_3[57][0]                 
                                                                   lstm_3[56][0]                    
                                                                   lstm_3[56][2]                    
                                                                   reshape_3[58][0]                 
                                                                   lstm_3[57][0]                    
                                                                   lstm_3[57][2]                    
                                                                   reshape_3[59][0]                 
                                                                   lstm_3[58][0]                    
                                                                   lstm_3[58][2]                    
____________________________________________________________________________________________________
lambda_143 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_144 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_145 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_146 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_147 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_148 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_149 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_150 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_151 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_152 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_153 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_154 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_155 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_156 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_157 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_158 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_159 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_160 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_161 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_162 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_163 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_164 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_165 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_166 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_167 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_168 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_169 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
lambda_170 (Lambda)              (None, 78)            0           input_5[0][0]                    
____________________________________________________________________________________________________
dense_3 (Dense)                  (None, 78)            5070        lstm_3[30][0]                    
                                                                   lstm_3[31][0]                    
                                                                   lstm_3[32][0]                    
                                                                   lstm_3[33][0]                    
                                                                   lstm_3[34][0]                    
                                                                   lstm_3[35][0]                    
                                                                   lstm_3[36][0]                    
                                                                   lstm_3[37][0]                    
                                                                   lstm_3[38][0]                    
                                                                   lstm_3[39][0]                    
                                                                   lstm_3[40][0]                    
                                                                   lstm_3[41][0]                    
                                                                   lstm_3[42][0]                    
                                                                   lstm_3[43][0]                    
                                                                   lstm_3[44][0]                    
                                                                   lstm_3[45][0]                    
                                                                   lstm_3[46][0]                    
                                                                   lstm_3[47][0]                    
                                                                   lstm_3[48][0]                    
                                                                   lstm_3[49][0]                    
                                                                   lstm_3[50][0]                    
                                                                   lstm_3[51][0]                    
                                                                   lstm_3[52][0]                    
                                                                   lstm_3[53][0]                    
                                                                   lstm_3[54][0]                    
                                                                   lstm_3[55][0]                    
                                                                   lstm_3[56][0]                    
                                                                   lstm_3[57][0]                    
                                                                   lstm_3[58][0]                    
                                                                   lstm_3[59][0]                    
====================================================================================================
Total params: 41,678
Trainable params: 41,678
Non-trainable params: 0
____________________________________________________________________________________________________

Expected Output
Scroll to the bottom of the output, and you'll see the following:

Total params: 41,678
Trainable params: 41,678
Non-trainable params: 0

Compile the model for training

  • You now need to compile your model before training it.
  • We will use:
    • Optimizer: Adam
    • Loss function: categorical cross-entropy (for multi-class classification)
In [41]:
opt = Adam(lr=0.01, beta_1=0.9, beta_2=0.999, decay=0.01)

model.compile(optimizer=opt, loss='categorical_crossentropy', metrics=['accuracy'])
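For intuition, categorical cross-entropy at a single time step compares the predicted (softmax) distribution over the 78 note values with the one-hot target. Here is a minimal NumPy sketch; the prediction values are made up purely for illustration:

```python
import numpy as np

def categorical_crossentropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy between a one-hot target and a predicted distribution."""
    return -np.sum(y_true * np.log(y_pred + eps))

# Hypothetical example: the true note is index 10 out of 78 classes.
y_true = np.zeros(78)
y_true[10] = 1.0

# A prediction that puts 90% of its probability mass on the correct note.
y_pred = np.full(78, 0.1 / 77)
y_pred[10] = 0.9

loss = categorical_crossentropy(y_true, y_pred)  # ≈ -log(0.9) ≈ 0.105
```

The loss only "sees" the predicted probability of the correct class, so confident correct predictions give a loss near 0, while confident wrong ones are penalized heavily.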

Initialize hidden state and cell state

Finally, let's initialize the LSTM's hidden state a0 and cell state c0 to zero.

In [42]:
m = 60
a0 = np.zeros((m, n_a))
c0 = np.zeros((m, n_a))

Train the model

  • Let's now fit the model!
  • We will turn Y into a list, since the cost function expects Y to be provided in this format.
    • list(Y) is a list with 30 items, where each item has shape (60, 78).
  • Let's train for 100 epochs. This will take a few minutes.
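To see why list(Y) has that structure: Y is an array of shape (30, 60, 78) (time steps × examples × note values), and calling list() on a NumPy array splits it along the first axis. A quick sketch with dummy data of the same shape:

```python
import numpy as np

# Dummy array with the same shape as Y: 30 time steps, 60 examples, 78 values.
Y = np.zeros((30, 60, 78))

Y_list = list(Y)  # splits along the first (time-step) axis

print(len(Y_list))      # 30 items
print(Y_list[0].shape)  # each item has shape (60, 78)
```

Each list item is the batch of targets for one time step, which is what a multi-output Keras model (one output per time step) expects.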
In [43]:
model.fit([X, a0, c0], list(Y), epochs=100)
Epoch 1/100
60/60 [==============================] - 6s - loss: 125.9067 - dense_3_loss_1: 4.3527 - dense_3_loss_2: 4.3483 - dense_3_loss_3: 4.3447 - dense_3_loss_4: 4.3393 - dense_3_loss_5: 4.3492 - dense_3_loss_6: 4.3405 - dense_3_loss_7: 4.3397 - dense_3_loss_8: 4.3431 - dense_3_loss_9: 4.3418 - dense_3_loss_10: 4.3331 - dense_3_loss_11: 4.3417 - dense_3_loss_12: 4.3482 - dense_3_loss_13: 4.3420 - dense_3_loss_14: 4.3365 - dense_3_loss_15: 4.3437 - dense_3_loss_16: 4.3417 - dense_3_loss_17: 4.3346 - dense_3_loss_18: 4.3381 - dense_3_loss_19: 4.3432 - dense_3_loss_20: 4.3412 - dense_3_loss_21: 4.3449 - dense_3_loss_22: 4.3310 - dense_3_loss_23: 4.3462 - dense_3_loss_24: 4.3400 - dense_3_loss_25: 4.3396 - dense_3_loss_26: 4.3388 - dense_3_loss_27: 4.3419 - dense_3_loss_28: 4.3393 - dense_3_loss_29: 4.3416 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.1000 - dense_3_acc_3: 0.0167 - dense_3_acc_4: 0.0500 - dense_3_acc_5: 0.0167 - dense_3_acc_6: 0.0500 - dense_3_acc_7: 0.0667 - dense_3_acc_8: 0.0500 - dense_3_acc_9: 0.0000e+00 - dense_3_acc_10: 0.1500 - dense_3_acc_11: 0.0333 - dense_3_acc_12: 0.0167 - dense_3_acc_13: 0.0167 - dense_3_acc_14: 0.0167 - dense_3_acc_15: 0.0167 - dense_3_acc_16: 0.0500 - dense_3_acc_17: 0.1167 - dense_3_acc_18: 0.0667 - dense_3_acc_19: 0.0333 - dense_3_acc_20: 0.0500 - dense_3_acc_21: 0.0167 - dense_3_acc_22: 0.0333 - dense_3_acc_23: 0.0333 - dense_3_acc_24: 0.0333 - dense_3_acc_25: 0.0167 - dense_3_acc_26: 0.0667 - dense_3_acc_27: 0.0000e+00 - dense_3_acc_28: 0.1000 - dense_3_acc_29: 0.0667 - dense_3_acc_30: 0.0000e+00                                                                     
Epoch 2/100
60/60 [==============================] - 0s - loss: 123.3107 - dense_3_loss_1: 4.3375 - dense_3_loss_2: 4.3123 - dense_3_loss_3: 4.2903 - dense_3_loss_4: 4.2854 - dense_3_loss_5: 4.2756 - dense_3_loss_6: 4.2733 - dense_3_loss_7: 4.2572 - dense_3_loss_8: 4.2511 - dense_3_loss_9: 4.2530 - dense_3_loss_10: 4.2287 - dense_3_loss_11: 4.2484 - dense_3_loss_12: 4.2689 - dense_3_loss_13: 4.2409 - dense_3_loss_14: 4.2262 - dense_3_loss_15: 4.2415 - dense_3_loss_16: 4.2369 - dense_3_loss_17: 4.2359 - dense_3_loss_18: 4.2445 - dense_3_loss_19: 4.2263 - dense_3_loss_20: 4.2470 - dense_3_loss_21: 4.2457 - dense_3_loss_22: 4.2345 - dense_3_loss_23: 4.2560 - dense_3_loss_24: 4.2314 - dense_3_loss_25: 4.2493 - dense_3_loss_26: 4.2125 - dense_3_loss_27: 4.2308 - dense_3_loss_28: 4.2280 - dense_3_loss_29: 4.2415 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.1500 - dense_3_acc_3: 0.1333 - dense_3_acc_4: 0.1667 - dense_3_acc_5: 0.2167 - dense_3_acc_6: 0.0833 - dense_3_acc_7: 0.1500 - dense_3_acc_8: 0.1833 - dense_3_acc_9: 0.1333 - dense_3_acc_10: 0.1333 - dense_3_acc_11: 0.1333 - dense_3_acc_12: 0.1167 - dense_3_acc_13: 0.0833 - dense_3_acc_14: 0.1167 - dense_3_acc_15: 0.0667 - dense_3_acc_16: 0.1000 - dense_3_acc_17: 0.1833 - dense_3_acc_18: 0.1500 - dense_3_acc_19: 0.1500 - dense_3_acc_20: 0.0667 - dense_3_acc_21: 0.0667 - dense_3_acc_22: 0.1167 - dense_3_acc_23: 0.0833 - dense_3_acc_24: 0.0833 - dense_3_acc_25: 0.1000 - dense_3_acc_26: 0.2167 - dense_3_acc_27: 0.0667 - dense_3_acc_28: 0.2000 - dense_3_acc_29: 0.1333 - dense_3_acc_30: 0.0000e+00     
Epoch 3/100
60/60 [==============================] - 0s - loss: 117.4297 - dense_3_loss_1: 4.3145 - dense_3_loss_2: 4.2654 - dense_3_loss_3: 4.2095 - dense_3_loss_4: 4.2015 - dense_3_loss_5: 4.1519 - dense_3_loss_6: 4.1541 - dense_3_loss_7: 4.1033 - dense_3_loss_8: 4.0278 - dense_3_loss_9: 4.0330 - dense_3_loss_10: 3.9978 - dense_3_loss_11: 3.9876 - dense_3_loss_12: 4.0718 - dense_3_loss_13: 4.0149 - dense_3_loss_14: 3.8780 - dense_3_loss_15: 3.9907 - dense_3_loss_16: 3.9961 - dense_3_loss_17: 3.9426 - dense_3_loss_18: 4.1699 - dense_3_loss_19: 3.9261 - dense_3_loss_20: 4.1486 - dense_3_loss_21: 4.1405 - dense_3_loss_22: 3.9439 - dense_3_loss_23: 4.0719 - dense_3_loss_24: 4.0760 - dense_3_loss_25: 4.0585 - dense_3_loss_26: 3.7204 - dense_3_loss_27: 3.8641 - dense_3_loss_28: 3.9029 - dense_3_loss_29: 4.0665 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.1000 - dense_3_acc_3: 0.1333 - dense_3_acc_4: 0.1167 - dense_3_acc_5: 0.1500 - dense_3_acc_6: 0.0833 - dense_3_acc_7: 0.0833 - dense_3_acc_8: 0.1500 - dense_3_acc_9: 0.0667 - dense_3_acc_10: 0.1000 - dense_3_acc_11: 0.0500 - dense_3_acc_12: 0.0667 - dense_3_acc_13: 0.0333 - dense_3_acc_14: 0.0667 - dense_3_acc_15: 0.0667 - dense_3_acc_16: 0.0833 - dense_3_acc_17: 0.1167 - dense_3_acc_18: 0.0667 - dense_3_acc_19: 0.0667 - dense_3_acc_20: 0.1000 - dense_3_acc_21: 0.0833 - dense_3_acc_22: 0.0667 - dense_3_acc_23: 0.0167 - dense_3_acc_24: 0.0167 - dense_3_acc_25: 0.0833 - dense_3_acc_26: 0.1500 - dense_3_acc_27: 0.0667 - dense_3_acc_28: 0.0833 - dense_3_acc_29: 0.1333 - dense_3_acc_30: 0.0000e+00                 
Epoch 4/100
60/60 [==============================] - 0s - loss: 113.4645 - dense_3_loss_1: 4.2957 - dense_3_loss_2: 4.2194 - dense_3_loss_3: 4.1234 - dense_3_loss_4: 4.1074 - dense_3_loss_5: 4.0115 - dense_3_loss_6: 4.0257 - dense_3_loss_7: 3.9557 - dense_3_loss_8: 3.7725 - dense_3_loss_9: 3.8240 - dense_3_loss_10: 3.6975 - dense_3_loss_11: 3.7843 - dense_3_loss_12: 4.0462 - dense_3_loss_13: 3.7754 - dense_3_loss_14: 3.7355 - dense_3_loss_15: 3.8115 - dense_3_loss_16: 3.8044 - dense_3_loss_17: 3.9172 - dense_3_loss_18: 3.9386 - dense_3_loss_19: 3.7156 - dense_3_loss_20: 4.0442 - dense_3_loss_21: 4.0320 - dense_3_loss_22: 3.8452 - dense_3_loss_23: 3.8858 - dense_3_loss_24: 3.8170 - dense_3_loss_25: 4.0390 - dense_3_loss_26: 3.5931 - dense_3_loss_27: 3.7707 - dense_3_loss_28: 3.8171 - dense_3_loss_29: 4.0590 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.1167 - dense_3_acc_3: 0.1667 - dense_3_acc_4: 0.1000 - dense_3_acc_5: 0.2000 - dense_3_acc_6: 0.1333 - dense_3_acc_7: 0.1333 - dense_3_acc_8: 0.1833 - dense_3_acc_9: 0.1333 - dense_3_acc_10: 0.1500 - dense_3_acc_11: 0.1167 - dense_3_acc_12: 0.1167 - dense_3_acc_13: 0.1667 - dense_3_acc_14: 0.1500 - dense_3_acc_15: 0.1500 - dense_3_acc_16: 0.1167 - dense_3_acc_17: 0.1500 - dense_3_acc_18: 0.1333 - dense_3_acc_19: 0.1167 - dense_3_acc_20: 0.1000 - dense_3_acc_21: 0.1000 - dense_3_acc_22: 0.1167 - dense_3_acc_23: 0.1167 - dense_3_acc_24: 0.1167 - dense_3_acc_25: 0.0833 - dense_3_acc_26: 0.2333 - dense_3_acc_27: 0.1167 - dense_3_acc_28: 0.1833 - dense_3_acc_29: 0.0833 - dense_3_acc_30: 0.0000e+00     
Epoch 5/100
60/60 [==============================] - 0s - loss: 110.4102 - dense_3_loss_1: 4.2781 - dense_3_loss_2: 4.1743 - dense_3_loss_3: 4.0514 - dense_3_loss_4: 4.0284 - dense_3_loss_5: 3.9277 - dense_3_loss_6: 3.9164 - dense_3_loss_7: 3.8665 - dense_3_loss_8: 3.6801 - dense_3_loss_9: 3.7591 - dense_3_loss_10: 3.6048 - dense_3_loss_11: 3.6930 - dense_3_loss_12: 3.9390 - dense_3_loss_13: 3.7007 - dense_3_loss_14: 3.6204 - dense_3_loss_15: 3.7007 - dense_3_loss_16: 3.6914 - dense_3_loss_17: 3.8117 - dense_3_loss_18: 3.8140 - dense_3_loss_19: 3.5708 - dense_3_loss_20: 3.8482 - dense_3_loss_21: 3.8875 - dense_3_loss_22: 3.7421 - dense_3_loss_23: 3.7026 - dense_3_loss_24: 3.6653 - dense_3_loss_25: 3.8916 - dense_3_loss_26: 3.4965 - dense_3_loss_27: 3.6475 - dense_3_loss_28: 3.7934 - dense_3_loss_29: 3.9069 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.1000 - dense_3_acc_2: 0.1167 - dense_3_acc_3: 0.2000 - dense_3_acc_4: 0.1667 - dense_3_acc_5: 0.2167 - dense_3_acc_6: 0.1333 - dense_3_acc_7: 0.1000 - dense_3_acc_8: 0.2167 - dense_3_acc_9: 0.2000 - dense_3_acc_10: 0.1667 - dense_3_acc_11: 0.1000 - dense_3_acc_12: 0.0667 - dense_3_acc_13: 0.1833 - dense_3_acc_14: 0.1833 - dense_3_acc_15: 0.1333 - dense_3_acc_16: 0.1333 - dense_3_acc_17: 0.1500 - dense_3_acc_18: 0.0333 - dense_3_acc_19: 0.1500 - dense_3_acc_20: 0.1167 - dense_3_acc_21: 0.0500 - dense_3_acc_22: 0.1833 - dense_3_acc_23: 0.1000 - dense_3_acc_24: 0.0833 - dense_3_acc_25: 0.0833 - dense_3_acc_26: 0.2500 - dense_3_acc_27: 0.1000 - dense_3_acc_28: 0.1167 - dense_3_acc_29: 0.0833 - dense_3_acc_30: 0.0000e+00         
Epoch 6/100
60/60 [==============================] - 0s - loss: 107.2678 - dense_3_loss_1: 4.2639 - dense_3_loss_2: 4.1327 - dense_3_loss_3: 3.9775 - dense_3_loss_4: 3.9502 - dense_3_loss_5: 3.8383 - dense_3_loss_6: 3.8287 - dense_3_loss_7: 3.7906 - dense_3_loss_8: 3.5722 - dense_3_loss_9: 3.6534 - dense_3_loss_10: 3.5154 - dense_3_loss_11: 3.5947 - dense_3_loss_12: 3.8219 - dense_3_loss_13: 3.6171 - dense_3_loss_14: 3.5404 - dense_3_loss_15: 3.5989 - dense_3_loss_16: 3.5627 - dense_3_loss_17: 3.6245 - dense_3_loss_18: 3.6384 - dense_3_loss_19: 3.4829 - dense_3_loss_20: 3.6784 - dense_3_loss_21: 3.7394 - dense_3_loss_22: 3.6098 - dense_3_loss_23: 3.5981 - dense_3_loss_24: 3.5687 - dense_3_loss_25: 3.7648 - dense_3_loss_26: 3.3818 - dense_3_loss_27: 3.5370 - dense_3_loss_28: 3.6661 - dense_3_loss_29: 3.7194 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.1000 - dense_3_acc_2: 0.1500 - dense_3_acc_3: 0.2167 - dense_3_acc_4: 0.2000 - dense_3_acc_5: 0.2167 - dense_3_acc_6: 0.1167 - dense_3_acc_7: 0.1000 - dense_3_acc_8: 0.1500 - dense_3_acc_9: 0.1333 - dense_3_acc_10: 0.1333 - dense_3_acc_11: 0.1000 - dense_3_acc_12: 0.0500 - dense_3_acc_13: 0.0833 - dense_3_acc_14: 0.1167 - dense_3_acc_15: 0.1000 - dense_3_acc_16: 0.0833 - dense_3_acc_17: 0.1833 - dense_3_acc_18: 0.0833 - dense_3_acc_19: 0.1167 - dense_3_acc_20: 0.1167 - dense_3_acc_21: 0.1000 - dense_3_acc_22: 0.1167 - dense_3_acc_23: 0.0833 - dense_3_acc_24: 0.0833 - dense_3_acc_25: 0.1167 - dense_3_acc_26: 0.2333 - dense_3_acc_27: 0.0833 - dense_3_acc_28: 0.2000 - dense_3_acc_29: 0.1500 - dense_3_acc_30: 0.0000e+00         
Epoch 7/100
60/60 [==============================] - 0s - loss: 103.5019 - dense_3_loss_1: 4.2472 - dense_3_loss_2: 4.0852 - dense_3_loss_3: 3.8976 - dense_3_loss_4: 3.8663 - dense_3_loss_5: 3.7379 - dense_3_loss_6: 3.7369 - dense_3_loss_7: 3.6998 - dense_3_loss_8: 3.4461 - dense_3_loss_9: 3.5140 - dense_3_loss_10: 3.3688 - dense_3_loss_11: 3.4746 - dense_3_loss_12: 3.6831 - dense_3_loss_13: 3.4492 - dense_3_loss_14: 3.4054 - dense_3_loss_15: 3.4593 - dense_3_loss_16: 3.4248 - dense_3_loss_17: 3.4793 - dense_3_loss_18: 3.4785 - dense_3_loss_19: 3.3816 - dense_3_loss_20: 3.4713 - dense_3_loss_21: 3.5390 - dense_3_loss_22: 3.4195 - dense_3_loss_23: 3.4759 - dense_3_loss_24: 3.4395 - dense_3_loss_25: 3.5541 - dense_3_loss_26: 3.2498 - dense_3_loss_27: 3.4065 - dense_3_loss_28: 3.5109 - dense_3_loss_29: 3.5997 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.1000 - dense_3_acc_2: 0.1167 - dense_3_acc_3: 0.2167 - dense_3_acc_4: 0.2000 - dense_3_acc_5: 0.2167 - dense_3_acc_6: 0.1000 - dense_3_acc_7: 0.1000 - dense_3_acc_8: 0.1833 - dense_3_acc_9: 0.1667 - dense_3_acc_10: 0.1500 - dense_3_acc_11: 0.1167 - dense_3_acc_12: 0.0667 - dense_3_acc_13: 0.1333 - dense_3_acc_14: 0.2000 - dense_3_acc_15: 0.1333 - dense_3_acc_16: 0.1500 - dense_3_acc_17: 0.1167 - dense_3_acc_18: 0.1167 - dense_3_acc_19: 0.1667 - dense_3_acc_20: 0.1000 - dense_3_acc_21: 0.1167 - dense_3_acc_22: 0.1333 - dense_3_acc_23: 0.1667 - dense_3_acc_24: 0.0667 - dense_3_acc_25: 0.0833 - dense_3_acc_26: 0.2000 - dense_3_acc_27: 0.1167 - dense_3_acc_28: 0.1833 - dense_3_acc_29: 0.1000 - dense_3_acc_30: 0.0000e+00     
Epoch 8/100
60/60 [==============================] - 0s - loss: 101.0993 - dense_3_loss_1: 4.2337 - dense_3_loss_2: 4.0426 - dense_3_loss_3: 3.8096 - dense_3_loss_4: 3.7776 - dense_3_loss_5: 3.6265 - dense_3_loss_6: 3.6401 - dense_3_loss_7: 3.6055 - dense_3_loss_8: 3.2965 - dense_3_loss_9: 3.3564 - dense_3_loss_10: 3.2483 - dense_3_loss_11: 3.3564 - dense_3_loss_12: 3.5072 - dense_3_loss_13: 3.2903 - dense_3_loss_14: 3.2627 - dense_3_loss_15: 3.3222 - dense_3_loss_16: 3.3103 - dense_3_loss_17: 3.3469 - dense_3_loss_18: 3.3981 - dense_3_loss_19: 3.2821 - dense_3_loss_20: 3.4338 - dense_3_loss_21: 3.4668 - dense_3_loss_22: 3.3057 - dense_3_loss_23: 3.4427 - dense_3_loss_24: 3.4588 - dense_3_loss_25: 3.5568 - dense_3_loss_26: 3.2285 - dense_3_loss_27: 3.3818 - dense_3_loss_28: 3.4580 - dense_3_loss_29: 3.6534 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.1000 - dense_3_acc_2: 0.1167 - dense_3_acc_3: 0.2167 - dense_3_acc_4: 0.1833 - dense_3_acc_5: 0.2833 - dense_3_acc_6: 0.1000 - dense_3_acc_7: 0.1000 - dense_3_acc_8: 0.2000 - dense_3_acc_9: 0.1667 - dense_3_acc_10: 0.1667 - dense_3_acc_11: 0.1833 - dense_3_acc_12: 0.1167 - dense_3_acc_13: 0.2000 - dense_3_acc_14: 0.2000 - dense_3_acc_15: 0.1167 - dense_3_acc_16: 0.1500 - dense_3_acc_17: 0.1667 - dense_3_acc_18: 0.1333 - dense_3_acc_19: 0.1833 - dense_3_acc_20: 0.1167 - dense_3_acc_21: 0.0667 - dense_3_acc_22: 0.1500 - dense_3_acc_23: 0.1000 - dense_3_acc_24: 0.0500 - dense_3_acc_25: 0.0833 - dense_3_acc_26: 0.1833 - dense_3_acc_27: 0.1167 - dense_3_acc_28: 0.1167 - dense_3_acc_29: 0.1167 - dense_3_acc_30: 0.0000e+00     
Epoch 9/100
60/60 [==============================] - 0s - loss: 96.5578 - dense_3_loss_1: 4.2201 - dense_3_loss_2: 4.0023 - dense_3_loss_3: 3.7328 - dense_3_loss_4: 3.6937 - dense_3_loss_5: 3.5251 - dense_3_loss_6: 3.5416 - dense_3_loss_7: 3.4910 - dense_3_loss_8: 3.1773 - dense_3_loss_9: 3.2248 - dense_3_loss_10: 3.0890 - dense_3_loss_11: 3.2406 - dense_3_loss_12: 3.3926 - dense_3_loss_13: 3.1693 - dense_3_loss_14: 3.1587 - dense_3_loss_15: 3.1628 - dense_3_loss_16: 3.1832 - dense_3_loss_17: 3.0909 - dense_3_loss_18: 3.1728 - dense_3_loss_19: 3.1359 - dense_3_loss_20: 3.2933 - dense_3_loss_21: 3.2941 - dense_3_loss_22: 3.0466 - dense_3_loss_23: 3.1844 - dense_3_loss_24: 3.2412 - dense_3_loss_25: 3.3630 - dense_3_loss_26: 3.0009 - dense_3_loss_27: 3.0564 - dense_3_loss_28: 3.2772 - dense_3_loss_29: 3.3964 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.1000 - dense_3_acc_2: 0.1167 - dense_3_acc_3: 0.2167 - dense_3_acc_4: 0.1833 - dense_3_acc_5: 0.2667 - dense_3_acc_6: 0.1000 - dense_3_acc_7: 0.1000 - dense_3_acc_8: 0.2167 - dense_3_acc_9: 0.1833 - dense_3_acc_10: 0.1667 - dense_3_acc_11: 0.1667 - dense_3_acc_12: 0.1167 - dense_3_acc_13: 0.1833 - dense_3_acc_14: 0.1667 - dense_3_acc_15: 0.1667 - dense_3_acc_16: 0.1833 - dense_3_acc_17: 0.2167 - dense_3_acc_18: 0.1167 - dense_3_acc_19: 0.1833 - dense_3_acc_20: 0.2000 - dense_3_acc_21: 0.2000 - dense_3_acc_22: 0.1667 - dense_3_acc_23: 0.1333 - dense_3_acc_24: 0.1167 - dense_3_acc_25: 0.1333 - dense_3_acc_26: 0.2500 - dense_3_acc_27: 0.2167 - dense_3_acc_28: 0.1833 - dense_3_acc_29: 0.1333 - dense_3_acc_30: 0.0000e+00     
Epoch 10/100
60/60 [==============================] - 0s - loss: 93.2772 - dense_3_loss_1: 4.2087 - dense_3_loss_2: 3.9654 - dense_3_loss_3: 3.6645 - dense_3_loss_4: 3.6087 - dense_3_loss_5: 3.4273 - dense_3_loss_6: 3.4251 - dense_3_loss_7: 3.3715 - dense_3_loss_8: 3.0531 - dense_3_loss_9: 3.1099 - dense_3_loss_10: 3.0066 - dense_3_loss_11: 3.1364 - dense_3_loss_12: 3.2442 - dense_3_loss_13: 3.0378 - dense_3_loss_14: 3.0149 - dense_3_loss_15: 3.0575 - dense_3_loss_16: 3.0612 - dense_3_loss_17: 3.0185 - dense_3_loss_18: 3.0208 - dense_3_loss_19: 3.0964 - dense_3_loss_20: 3.1264 - dense_3_loss_21: 3.1232 - dense_3_loss_22: 2.9505 - dense_3_loss_23: 3.1416 - dense_3_loss_24: 3.0676 - dense_3_loss_25: 3.2097 - dense_3_loss_26: 2.8680 - dense_3_loss_27: 3.0454 - dense_3_loss_28: 3.0294 - dense_3_loss_29: 3.1868 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0833 - dense_3_acc_2: 0.1833 - dense_3_acc_3: 0.2167 - dense_3_acc_4: 0.1833 - dense_3_acc_5: 0.2833 - dense_3_acc_6: 0.1000 - dense_3_acc_7: 0.1167 - dense_3_acc_8: 0.2500 - dense_3_acc_9: 0.1833 - dense_3_acc_10: 0.1833 - dense_3_acc_11: 0.1833 - dense_3_acc_12: 0.1500 - dense_3_acc_13: 0.2000 - dense_3_acc_14: 0.2333 - dense_3_acc_15: 0.1667 - dense_3_acc_16: 0.1500 - dense_3_acc_17: 0.2167 - dense_3_acc_18: 0.2167 - dense_3_acc_19: 0.2000 - dense_3_acc_20: 0.1667 - dense_3_acc_21: 0.1833 - dense_3_acc_22: 0.1833 - dense_3_acc_23: 0.2333 - dense_3_acc_24: 0.1333 - dense_3_acc_25: 0.1333 - dense_3_acc_26: 0.2667 - dense_3_acc_27: 0.2000 - dense_3_acc_28: 0.2000 - dense_3_acc_29: 0.1500 - dense_3_acc_30: 0.0000e+00     
Epoch 11/100
60/60 [==============================] - 0s - loss: 89.7774 - dense_3_loss_1: 4.1967 - dense_3_loss_2: 3.9256 - dense_3_loss_3: 3.5923 - dense_3_loss_4: 3.5117 - dense_3_loss_5: 3.3100 - dense_3_loss_6: 3.2893 - dense_3_loss_7: 3.2642 - dense_3_loss_8: 2.9527 - dense_3_loss_9: 2.9944 - dense_3_loss_10: 2.9050 - dense_3_loss_11: 3.0681 - dense_3_loss_12: 3.0528 - dense_3_loss_13: 2.8347 - dense_3_loss_14: 2.8333 - dense_3_loss_15: 2.9265 - dense_3_loss_16: 2.8885 - dense_3_loss_17: 2.7442 - dense_3_loss_18: 2.8901 - dense_3_loss_19: 2.9598 - dense_3_loss_20: 2.9747 - dense_3_loss_21: 2.9813 - dense_3_loss_22: 2.8475 - dense_3_loss_23: 3.1287 - dense_3_loss_24: 2.9326 - dense_3_loss_25: 3.0973 - dense_3_loss_26: 2.7781 - dense_3_loss_27: 2.9584 - dense_3_loss_28: 2.9191 - dense_3_loss_29: 3.0198 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2000 - dense_3_acc_3: 0.2333 - dense_3_acc_4: 0.1833 - dense_3_acc_5: 0.3000 - dense_3_acc_6: 0.1167 - dense_3_acc_7: 0.1667 - dense_3_acc_8: 0.2167 - dense_3_acc_9: 0.1667 - dense_3_acc_10: 0.2167 - dense_3_acc_11: 0.2167 - dense_3_acc_12: 0.1167 - dense_3_acc_13: 0.3000 - dense_3_acc_14: 0.3333 - dense_3_acc_15: 0.1833 - dense_3_acc_16: 0.1667 - dense_3_acc_17: 0.2667 - dense_3_acc_18: 0.2000 - dense_3_acc_19: 0.1833 - dense_3_acc_20: 0.2333 - dense_3_acc_21: 0.1667 - dense_3_acc_22: 0.2000 - dense_3_acc_23: 0.1000 - dense_3_acc_24: 0.1500 - dense_3_acc_25: 0.1667 - dense_3_acc_26: 0.3333 - dense_3_acc_27: 0.1000 - dense_3_acc_28: 0.2500 - dense_3_acc_29: 0.1833 - dense_3_acc_30: 0.0000e+00     
Epoch 12/100
60/60 [==============================] - 0s - loss: 86.1381 - dense_3_loss_1: 4.1851 - dense_3_loss_2: 3.8850 - dense_3_loss_3: 3.5297 - dense_3_loss_4: 3.4131 - dense_3_loss_5: 3.2159 - dense_3_loss_6: 3.1708 - dense_3_loss_7: 3.1263 - dense_3_loss_8: 2.8819 - dense_3_loss_9: 2.8328 - dense_3_loss_10: 2.7485 - dense_3_loss_11: 2.9368 - dense_3_loss_12: 2.9211 - dense_3_loss_13: 2.7100 - dense_3_loss_14: 2.7404 - dense_3_loss_15: 2.7736 - dense_3_loss_16: 2.8558 - dense_3_loss_17: 2.6848 - dense_3_loss_18: 2.7083 - dense_3_loss_19: 2.7587 - dense_3_loss_20: 2.8939 - dense_3_loss_21: 2.8221 - dense_3_loss_22: 2.6711 - dense_3_loss_23: 2.8983 - dense_3_loss_24: 2.7623 - dense_3_loss_25: 3.0229 - dense_3_loss_26: 2.6133 - dense_3_loss_27: 2.8025 - dense_3_loss_28: 2.7344 - dense_3_loss_29: 2.8388 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2000 - dense_3_acc_3: 0.2833 - dense_3_acc_4: 0.2000 - dense_3_acc_5: 0.2833 - dense_3_acc_6: 0.1667 - dense_3_acc_7: 0.1833 - dense_3_acc_8: 0.2500 - dense_3_acc_9: 0.2333 - dense_3_acc_10: 0.2833 - dense_3_acc_11: 0.2333 - dense_3_acc_12: 0.1833 - dense_3_acc_13: 0.3000 - dense_3_acc_14: 0.3333 - dense_3_acc_15: 0.2667 - dense_3_acc_16: 0.1667 - dense_3_acc_17: 0.3000 - dense_3_acc_18: 0.2167 - dense_3_acc_19: 0.2333 - dense_3_acc_20: 0.2667 - dense_3_acc_21: 0.2000 - dense_3_acc_22: 0.2167 - dense_3_acc_23: 0.2000 - dense_3_acc_24: 0.2333 - dense_3_acc_25: 0.1833 - dense_3_acc_26: 0.3667 - dense_3_acc_27: 0.2000 - dense_3_acc_28: 0.2333 - dense_3_acc_29: 0.2500 - dense_3_acc_30: 0.0000e+00     
Epoch 13/100
60/60 [==============================] - 0s - loss: 83.0048 - dense_3_loss_1: 4.1759 - dense_3_loss_2: 3.8460 - dense_3_loss_3: 3.4680 - dense_3_loss_4: 3.3173 - dense_3_loss_5: 3.1173 - dense_3_loss_6: 3.0510 - dense_3_loss_7: 2.9889 - dense_3_loss_8: 2.7988 - dense_3_loss_9: 2.7163 - dense_3_loss_10: 2.6419 - dense_3_loss_11: 2.8107 - dense_3_loss_12: 2.7711 - dense_3_loss_13: 2.6191 - dense_3_loss_14: 2.6568 - dense_3_loss_15: 2.6911 - dense_3_loss_16: 2.7627 - dense_3_loss_17: 2.5963 - dense_3_loss_18: 2.6295 - dense_3_loss_19: 2.6522 - dense_3_loss_20: 2.7368 - dense_3_loss_21: 2.6396 - dense_3_loss_22: 2.5529 - dense_3_loss_23: 2.7474 - dense_3_loss_24: 2.6789 - dense_3_loss_25: 2.8978 - dense_3_loss_26: 2.5337 - dense_3_loss_27: 2.7028 - dense_3_loss_28: 2.5108 - dense_3_loss_29: 2.6932 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2500 - dense_3_acc_3: 0.3167 - dense_3_acc_4: 0.2000 - dense_3_acc_5: 0.2667 - dense_3_acc_6: 0.1833 - dense_3_acc_7: 0.2500 - dense_3_acc_8: 0.2833 - dense_3_acc_9: 0.3167 - dense_3_acc_10: 0.2833 - dense_3_acc_11: 0.2667 - dense_3_acc_12: 0.1833 - dense_3_acc_13: 0.3333 - dense_3_acc_14: 0.3833 - dense_3_acc_15: 0.2833 - dense_3_acc_16: 0.1833 - dense_3_acc_17: 0.2833 - dense_3_acc_18: 0.2667 - dense_3_acc_19: 0.2333 - dense_3_acc_20: 0.2833 - dense_3_acc_21: 0.2667 - dense_3_acc_22: 0.2667 - dense_3_acc_23: 0.2667 - dense_3_acc_24: 0.2500 - dense_3_acc_25: 0.1667 - dense_3_acc_26: 0.3833 - dense_3_acc_27: 0.2167 - dense_3_acc_28: 0.3000 - dense_3_acc_29: 0.2000 - dense_3_acc_30: 0.0000e+00     
Epoch 14/100
60/60 [==============================] - 0s - loss: 79.0719 - dense_3_loss_1: 4.1663 - dense_3_loss_2: 3.8087 - dense_3_loss_3: 3.4003 - dense_3_loss_4: 3.2259 - dense_3_loss_5: 2.9954 - dense_3_loss_6: 2.9358 - dense_3_loss_7: 2.8625 - dense_3_loss_8: 2.6763 - dense_3_loss_9: 2.6039 - dense_3_loss_10: 2.4892 - dense_3_loss_11: 2.6584 - dense_3_loss_12: 2.6078 - dense_3_loss_13: 2.4825 - dense_3_loss_14: 2.5514 - dense_3_loss_15: 2.5484 - dense_3_loss_16: 2.6027 - dense_3_loss_17: 2.4027 - dense_3_loss_18: 2.4791 - dense_3_loss_19: 2.5219 - dense_3_loss_20: 2.5299 - dense_3_loss_21: 2.4862 - dense_3_loss_22: 2.4257 - dense_3_loss_23: 2.6476 - dense_3_loss_24: 2.4565 - dense_3_loss_25: 2.6770 - dense_3_loss_26: 2.3889 - dense_3_loss_27: 2.6094 - dense_3_loss_28: 2.3659 - dense_3_loss_29: 2.4654 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2333 - dense_3_acc_3: 0.3167 - dense_3_acc_4: 0.2000 - dense_3_acc_5: 0.3000 - dense_3_acc_6: 0.2167 - dense_3_acc_7: 0.2500 - dense_3_acc_8: 0.2667 - dense_3_acc_9: 0.3333 - dense_3_acc_10: 0.3000 - dense_3_acc_11: 0.3000 - dense_3_acc_12: 0.2500 - dense_3_acc_13: 0.3333 - dense_3_acc_14: 0.3000 - dense_3_acc_15: 0.2833 - dense_3_acc_16: 0.2500 - dense_3_acc_17: 0.3167 - dense_3_acc_18: 0.2833 - dense_3_acc_19: 0.2667 - dense_3_acc_20: 0.3333 - dense_3_acc_21: 0.3167 - dense_3_acc_22: 0.2333 - dense_3_acc_23: 0.2500 - dense_3_acc_24: 0.2333 - dense_3_acc_25: 0.1500 - dense_3_acc_26: 0.4167 - dense_3_acc_27: 0.2500 - dense_3_acc_28: 0.3333 - dense_3_acc_29: 0.2500 - dense_3_acc_30: 0.0000e+00     
Epoch 15/100
60/60 [==============================] - 0s - loss: 75.4414 - dense_3_loss_1: 4.1564 - dense_3_loss_2: 3.7674 - dense_3_loss_3: 3.3228 - dense_3_loss_4: 3.1187 - dense_3_loss_5: 2.8622 - dense_3_loss_6: 2.8109 - dense_3_loss_7: 2.7416 - dense_3_loss_8: 2.5302 - dense_3_loss_9: 2.4563 - dense_3_loss_10: 2.3512 - dense_3_loss_11: 2.5117 - dense_3_loss_12: 2.4126 - dense_3_loss_13: 2.2999 - dense_3_loss_14: 2.4006 - dense_3_loss_15: 2.3811 - dense_3_loss_16: 2.4643 - dense_3_loss_17: 2.2281 - dense_3_loss_18: 2.3949 - dense_3_loss_19: 2.4170 - dense_3_loss_20: 2.3722 - dense_3_loss_21: 2.4219 - dense_3_loss_22: 2.3248 - dense_3_loss_23: 2.5244 - dense_3_loss_24: 2.3268 - dense_3_loss_25: 2.5027 - dense_3_loss_26: 2.2544 - dense_3_loss_27: 2.5240 - dense_3_loss_28: 2.2352 - dense_3_loss_29: 2.3271 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2333 - dense_3_acc_3: 0.3500 - dense_3_acc_4: 0.2167 - dense_3_acc_5: 0.3167 - dense_3_acc_6: 0.1833 - dense_3_acc_7: 0.2500 - dense_3_acc_8: 0.2833 - dense_3_acc_9: 0.3333 - dense_3_acc_10: 0.3333 - dense_3_acc_11: 0.3000 - dense_3_acc_12: 0.2667 - dense_3_acc_13: 0.3833 - dense_3_acc_14: 0.3000 - dense_3_acc_15: 0.3167 - dense_3_acc_16: 0.3000 - dense_3_acc_17: 0.3167 - dense_3_acc_18: 0.2167 - dense_3_acc_19: 0.2333 - dense_3_acc_20: 0.3333 - dense_3_acc_21: 0.3000 - dense_3_acc_22: 0.2500 - dense_3_acc_23: 0.2667 - dense_3_acc_24: 0.2167 - dense_3_acc_25: 0.2500 - dense_3_acc_26: 0.4000 - dense_3_acc_27: 0.2500 - dense_3_acc_28: 0.4000 - dense_3_acc_29: 0.3333 - dense_3_acc_30: 0.0000e+00     
Epoch 16/100
60/60 [==============================] - 0s - loss: 71.7663 - dense_3_loss_1: 4.1465 - dense_3_loss_2: 3.7265 - dense_3_loss_3: 3.2414 - dense_3_loss_4: 3.0035 - dense_3_loss_5: 2.7335 - dense_3_loss_6: 2.6863 - dense_3_loss_7: 2.6216 - dense_3_loss_8: 2.3765 - dense_3_loss_9: 2.3230 - dense_3_loss_10: 2.2207 - dense_3_loss_11: 2.3979 - dense_3_loss_12: 2.2693 - dense_3_loss_13: 2.0989 - dense_3_loss_14: 2.1999 - dense_3_loss_15: 2.2687 - dense_3_loss_16: 2.3802 - dense_3_loss_17: 2.1471 - dense_3_loss_18: 2.2193 - dense_3_loss_19: 2.2368 - dense_3_loss_20: 2.2320 - dense_3_loss_21: 2.3192 - dense_3_loss_22: 2.1732 - dense_3_loss_23: 2.3451 - dense_3_loss_24: 2.2468 - dense_3_loss_25: 2.3530 - dense_3_loss_26: 2.0803 - dense_3_loss_27: 2.4072 - dense_3_loss_28: 2.1060 - dense_3_loss_29: 2.2058 - dense_3_loss_30: 0.0000e+00 - dense_3_acc_1: 0.0667 - dense_3_acc_2: 0.2333 - dense_3_acc_3: 0.3667 - dense_3_acc_4: 0.2667 - dense_3_acc_5: 0.3167 - dense_3_acc_6: