📚 The CoCalc Library - books, templates and other resources
Kernel: Python 3
Credits: Forked from deep-learning-keras-tensorflow by Valerio Maggio
RNN using LSTM
<img src="imgs/RNN-rolled.png" width="80px" height="80px"/>
<img src="imgs/RNN-unrolled.png" width="400px" height="400px"/>
<img src="imgs/LSTM3-chain.png" width="60%"/>
In [3]:
Reading blog posts from the data directory
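The original loading cell is not preserved in this export. A minimal sketch of reading plain-text blog posts from a data directory might look like the following; the `.txt` layout and the `load_blog_posts` name are assumptions, not the notebook's actual code.

```python
import os

def load_blog_posts(data_dir):
    """Read every .txt file in data_dir and return a list of post strings.

    The directory layout is an assumption: the notebook's data files
    are not shown in this export.
    """
    posts = []
    for name in sorted(os.listdir(data_dir)):
        path = os.path.join(data_dir, name)
        if os.path.isfile(path) and name.endswith(".txt"):
            with open(path, encoding="utf-8", errors="ignore") as f:
                posts.append(f.read())
    return posts
```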
In [4]:
In [5]:
Out[5]:
/home/valerio/deep-learning-keras-euroscipy2016/data
In [6]:
In [7]:
In [85]:
In [86]:
In [87]:
In [88]:
In [89]:
Out[89]:
X_train_rnn shape: (3873, 100) (3873,)
X_test_rnn shape: (969, 100) (969,)
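The shapes above show every sequence padded or truncated to exactly 100 tokens. A from-scratch numpy sketch of that step, mimicking the default (pre-padding, pre-truncating) behaviour of Keras's `pad_sequences`, might look like this; the notebook presumably calls the library function directly.

```python
import numpy as np

def pad_to_length(seqs, maxlen=100, value=0):
    """Left-pad (or left-truncate) each integer sequence to exactly
    maxlen, mirroring keras pad_sequences defaults."""
    out = np.full((len(seqs), maxlen), value, dtype=np.int64)
    for i, seq in enumerate(seqs):
        trunc = seq[-maxlen:]                  # keep the last maxlen tokens
        out[i, maxlen - len(trunc):] = trunc   # pad on the left with value
    return out
```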
In [90]:
In [91]:
In [92]:
Out[92]:
Train on 3873 samples, validate on 969 samples
Epoch 1/4
3873/3873 [==============================] - 3s - loss: 0.2487 - acc: 0.5378 - val_loss: 0.2506 - val_acc: 0.5191
Epoch 2/4
3873/3873 [==============================] - 3s - loss: 0.2486 - acc: 0.5401 - val_loss: 0.2508 - val_acc: 0.5191
Epoch 3/4
3873/3873 [==============================] - 3s - loss: 0.2484 - acc: 0.5417 - val_loss: 0.2496 - val_acc: 0.5191
Epoch 4/4
3873/3873 [==============================] - 3s - loss: 0.2484 - acc: 0.5399 - val_loss: 0.2502 - val_acc: 0.5191
<keras.callbacks.History at 0x7fa1e96ac4e0>
In [93]:
Out[93]:
969/969 [==============================] - 0s
In [94]:
Out[94]:
0.250189056399 0.519091847357
Using a TF-IDF vectorizer as input instead of a one-hot encoder
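The notebook presumably uses scikit-learn's `TfidfVectorizer` here; as a reminder of what that computes, a from-scratch sketch of a dense TF-IDF matrix (smoothed idf as in sklearn's default, no normalisation) over pre-tokenised documents is shown below. The function name and tokenised-input convention are illustrative assumptions.

```python
import math
from collections import Counter

def tfidf_matrix(docs):
    """Build a dense TF-IDF matrix from tokenised documents.

    Each row is one document; each column one vocabulary term.
    idf is smoothed the way sklearn's TfidfVectorizer does by default:
    log((1 + n_docs) / (1 + doc_freq)) + 1.
    """
    vocab = sorted({w for doc in docs for w in doc})
    index = {w: j for j, w in enumerate(vocab)}
    n = len(docs)
    df = Counter(w for doc in docs for w in set(doc))
    idf = {w: math.log((1 + n) / (1 + df[w])) + 1 for w in vocab}
    rows = []
    for doc in docs:
        tf = Counter(doc)
        row = [0.0] * len(vocab)
        for w, c in tf.items():
            row[index[w]] = c * idf[w]
        rows.append(row)
    return rows, vocab
```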
In [95]:
In [96]:
In [97]:
In [98]:
In [99]:
In [100]:
Out[100]:
X_train_rnn shape: (4152, 100) (4152,)
X_test_rnn shape: (1038, 100) (1038,)
In [101]:
In [102]:
In [103]:
Out[103]:
Train on 4152 samples, validate on 1038 samples
Epoch 1/4
4152/4152 [==============================] - 3s - loss: 0.2502 - acc: 0.4988 - val_loss: 0.2503 - val_acc: 0.4865
Epoch 2/4
4152/4152 [==============================] - 3s - loss: 0.2507 - acc: 0.4843 - val_loss: 0.2500 - val_acc: 0.4865
Epoch 3/4
4152/4152 [==============================] - 3s - loss: 0.2504 - acc: 0.4952 - val_loss: 0.2501 - val_acc: 0.4865
Epoch 4/4
4152/4152 [==============================] - 3s - loss: 0.2506 - acc: 0.4913 - val_loss: 0.2500 - val_acc: 0.5135
<keras.callbacks.History at 0x7fa1f466f278>
In [104]:
Out[104]:
1038/1038 [==============================] - 0s
In [105]:
Out[105]:
0.249981284572 0.513487476145
Sentence Generation using LSTM
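The `(2552476, 20, 152)` arrays printed below are consistent with overlapping 20-character windows one-hot encoded over a 152-character vocabulary, with the character following each window as the target. A minimal sketch of that preprocessing (function name and `step` parameter are assumptions) might be:

```python
import numpy as np

def make_char_windows(text, maxlen=20, step=1):
    """Slice text into overlapping character windows and one-hot encode.

    Returns X of shape (n_windows, maxlen, n_chars) and y of shape
    (n_windows, n_chars) holding the next-character targets.
    """
    chars = sorted(set(text))
    char_idx = {c: i for i, c in enumerate(chars)}
    windows, targets = [], []
    for i in range(0, len(text) - maxlen, step):
        windows.append(text[i:i + maxlen])
        targets.append(text[i + maxlen])
    X = np.zeros((len(windows), maxlen, len(chars)), dtype=bool)
    y = np.zeros((len(windows), len(chars)), dtype=bool)
    for i, win in enumerate(windows):
        for t, c in enumerate(win):
            X[i, t, char_idx[c]] = True
        y[i, char_idx[targets[i]]] = True
    return X, y, chars
```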
In [106]:
In [107]:
Out[107]:
(2552476, 20, 152) (2552476, 152)
(2552476, 20, 152) (2552476, 152)
In [109]:
Out[109]:
Build model...
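Each iteration below trains for one epoch and then generates text at several diversity (temperature) settings. The sampling helper is not preserved in this export; a sketch of the standard temperature-sampling function from the Keras char-RNN example, which this notebook very likely mirrors, is:

```python
import numpy as np

def sample(preds, diversity=1.0):
    """Sample an index from a probability array, rescaled by diversity.

    Low diversity sharpens the distribution toward the most likely
    character; high diversity flattens it, making output more random.
    """
    preds = np.asarray(preds, dtype=np.float64)
    preds = np.log(np.maximum(preds, 1e-12)) / diversity
    exp_preds = np.exp(preds)
    probs = exp_preds / np.sum(exp_preds)
    return int(np.random.choice(len(probs), p=probs))
```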
In [74]:
In [110]:
In [111]:
In [113]:
Out[113]:
--------------------------------------------------
Iteration 1
Epoch 1/1
2552476/2552476 [==============================] - 226s - loss: 1.8022
----- diversity: 0.2
----- Generating with seed: "p from the lack of "
sense of the search
----- diversity: 0.4
----- Generating with seed: "p from the lack of "
through that possibl
----- diversity: 0.6
----- Generating with seed: "p from the lack of "
. This is a " by p
----- diversity: 0.8
----- Generating with seed: "p from the lack of "
d he latermal ta we
--------------------------------------------------
Iteration 2
Epoch 1/1
2552476/2552476 [==============================] - 228s - loss: 1.7312
----- diversity: 0.2
----- Generating with seed: "s Last Dance" with t"
screening on the st
----- diversity: 0.4
----- Generating with seed: "s Last Dance" with t"
r song think of the
----- diversity: 0.6
----- Generating with seed: "s Last Dance" with t"
. I'm akin computer
----- diversity: 0.8
----- Generating with seed: "s Last Dance" with t"
played that comment
--------------------------------------------------
Iteration 3
Epoch 1/1
2552476/2552476 [==============================] - 229s - loss: 1.8693
----- diversity: 0.2
----- Generating with seed: ", as maybe someone w"
the ssone the so the
----- diversity: 0.4
----- Generating with seed: ", as maybe someone w"
the sasd nouts and t
----- diversity: 0.6
----- Generating with seed: ", as maybe someone w"
p hin I had at f¿ to
----- diversity: 0.8
----- Generating with seed: ", as maybe someone w"
oge rely bluy leanda
--------------------------------------------------
Iteration 4
Epoch 1/1
2552476/2552476 [==============================] - 228s - loss: 1.9135
----- diversity: 0.2
----- Generating with seed: "o the package :(. Ah"
suadedbe teacher th
----- diversity: 0.4
----- Generating with seed: "o the package :(. Ah"
e a searingly the id
----- diversity: 0.6
----- Generating with seed: "o the package :(. Ah"
propost the bure so
----- diversity: 0.8
----- Generating with seed: "o the package :(. Ah"
ing.Lever fan. By in
--------------------------------------------------
Iteration 5
Epoch 1/1
2552476/2552476 [==============================] - 229s - loss: 4.5892
----- diversity: 0.2
----- Generating with seed: "ot as long as my fri"
atde getu th> QQ.“]
----- diversity: 0.4
----- Generating with seed: "ot as long as my fri"
tQ t[we QaaefYhere Q
----- diversity: 0.6
----- Generating with seed: "ot as long as my fri"
ew[”*ing”e[ t[w that
----- diversity: 0.8
----- Generating with seed: "ot as long as my fri"
me]sQoonQ“]e” ti nw
--------------------------------------------------
Iteration 6
Epoch 1/1
2552476/2552476 [==============================] - 229s - loss: 6.7174
----- diversity: 0.2
----- Generating with seed: "use I'm pretty damn "
me g 'o a a a a
----- diversity: 0.4
----- Generating with seed: "use I'm pretty damn "
a o theT a o a
----- diversity: 0.6
----- Generating with seed: "use I'm pretty damn "
n . thot auupe to
----- diversity: 0.8
----- Generating with seed: "use I'm pretty damn "
tomalek ho tt Ion i
--------------------------------------------------
Iteration 7
Epoch 1/1
2552476/2552476 [==============================] - 227s - loss: 6.9138
----- diversity: 0.2
----- Generating with seed: "ats all got along be"
thrtg t ia thv i c
----- diversity: 0.4
----- Generating with seed: "ats all got along be"
th wtot.. t to gt?
----- diversity: 0.6
----- Generating with seed: "ats all got along be"
ed dthwnn,is a ment
----- diversity: 0.8
----- Generating with seed: "ats all got along be"
t incow . wmiyit
--------------------------------------------------
Iteration 8
Epoch 1/1
2552476/2552476 [==============================] - 228s - loss: 11.0629
----- diversity: 0.2
----- Generating with seed: "oot of my sleeping b"
m g te>t e s t anab
----- diversity: 0.4
----- Generating with seed: "oot of my sleeping b"
dttoe s s“snge es s
----- diversity: 0.6
----- Generating with seed: "oot of my sleeping b"
tut hou wen a onap
----- diversity: 0.8
----- Generating with seed: "oot of my sleeping b"
evtyr tt e io on tok
--------------------------------------------------
Iteration 9
Epoch 1/1
2552476/2552476 [==============================] - 228s - loss: 8.7874
----- diversity: 0.2
----- Generating with seed: " I’ve always looked "
ea e ton ann n ffee
----- diversity: 0.4
----- Generating with seed: " I’ve always looked "
o tire n a anV sia a
----- diversity: 0.6
----- Generating with seed: " I’ve always looked "
r i jooe Vag o en
----- diversity: 0.8
----- Generating with seed: " I’ve always looked "
ao at ge ena oro o