
GitHub Repository: y33-j3T/Coursera-Deep-Learning
Path: blob/master/Introduction to TensorFlow for Artificial Intelligence, Machine Learning, and Deep Learning/Week 2 - Introduction to Computer Vision/Exercise2-Question.ipynb
Kernel: Python 3

Exercise 2

In the course you learned how to do classification using Fashion MNIST, a dataset containing items of clothing. There's another, similar dataset called MNIST which has items of handwriting -- the digits 0 through 9.

Write an MNIST classifier that trains to 99% accuracy or above, and does it without a fixed number of epochs -- i.e. you should stop training once you reach that level of accuracy.

Some notes:

  1. It should succeed in less than 10 epochs, so it is okay to change epochs= to 10, but nothing larger

  2. When it reaches 99% or greater it should print out the string "Reached 99% accuracy so cancelling training!"

  3. If you add any additional variables, make sure you use the same names as the ones used in the class
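One way to meet the stopping requirement is a custom Keras callback that checks the training accuracy at the end of each epoch and sets stop_training once the threshold is reached. A minimal sketch (class and variable names here are illustrative, not the graded solution; the solution below uses the same idea):

import tensorflow as tf

class AccuracyThresholdCallback(tf.keras.callbacks.Callback):
    """Stop training once training accuracy reaches a given threshold."""
    def __init__(self, threshold=0.99):
        super().__init__()
        self.threshold = threshold

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # 'acc' is the training-accuracy key under the TF 1.x kernel used here
        if logs.get('acc', 0.0) >= self.threshold:
            print("\nReached 99% accuracy so cancelling training!")
            self.model.stop_training = True

# The callback is passed to model.fit via callbacks=[AccuracyThresholdCallback()]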

I've started the code for you below -- how would you finish it?

import tensorflow as tf
from os import path, getcwd, chdir

# DO NOT CHANGE THE LINE BELOW. If you are developing in a local
# environment, then grab mnist.npz from the Coursera Jupyter Notebook
# and place it inside a local folder and edit the path to that location
path = f"{getcwd()}/../tmp2/mnist.npz"
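If you are working locally and don't have the Coursera copy of mnist.npz, one alternative (a sketch, assuming internet access) is to let Keras download the dataset itself and skip the path argument to load_data:

import tensorflow as tf

# With no path argument, Keras fetches mnist.npz automatically and caches it
# under ~/.keras/datasets, so no local copy of the file is needed.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()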
# GRADED FUNCTION: train_mnist
def train_mnist():
    # Please write your code only where you are indicated.
    # please do not remove # model fitting inline comments.

    # YOUR CODE SHOULD START HERE
    class StopTrainingCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            if logs.get('acc') >= 0.99:
                print('\nReached desired accuracy (0.99), No more training.')
                self.model.stop_training = True

    callback = StopTrainingCallback()
    # YOUR CODE SHOULD END HERE

    mnist = tf.keras.datasets.mnist
    (x_train, y_train), (x_test, y_test) = mnist.load_data(path=path)
    # YOUR CODE SHOULD START HERE
    x_train = x_train / 255.0
    x_test = x_test / 255.0
    # YOUR CODE SHOULD END HERE

    model = tf.keras.models.Sequential([
        # YOUR CODE SHOULD START HERE
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(512, activation=tf.nn.relu),
        tf.keras.layers.Dense(10, activation=tf.nn.softmax)
        # YOUR CODE SHOULD END HERE
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # model fitting
    history = model.fit(
        # YOUR CODE SHOULD START HERE
        x_train, y_train, epochs=10, callbacks=[callback]
        # YOUR CODE SHOULD END HERE
    )
    # model fitting
    return history.epoch, history.history['acc'][-1]
train_mnist()
Epoch 1/10
60000/60000 [==============================] - 10s 173us/sample - loss: 0.2035 - acc: 0.9395
Epoch 2/10
60000/60000 [==============================] - 10s 160us/sample - loss: 0.0798 - acc: 0.9758
Epoch 3/10
60000/60000 [==============================] - 10s 159us/sample - loss: 0.0516 - acc: 0.9837
Epoch 4/10
60000/60000 [==============================] - 10s 161us/sample - loss: 0.0367 - acc: 0.9881
Epoch 5/10
59744/60000 [============================>.] - ETA: 0s - loss: 0.0262 - acc: 0.9917
Reached desired accuracy (0.99), No more training.
60000/60000 [==============================] - 10s 168us/sample - loss: 0.0262 - acc: 0.9917
([0, 1, 2, 3, 4], 0.9917)
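The run above was produced under a TensorFlow 1.x kernel, where training accuracy is logged as 'acc'. Under TensorFlow 2.x the key is 'accuracy', so the callback check would need a small adjustment. A version-robust sketch (an assumption about your local TF version, not part of the graded code):

import tensorflow as tf

class StopTrainingCallbackCompat(tf.keras.callbacks.Callback):
    """Version-robust check: TF 1.x logs 'acc', TF 2.x logs 'accuracy'."""
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        acc = logs.get('accuracy', logs.get('acc', 0.0))
        if acc >= 0.99:
            print('\nReached desired accuracy (0.99), No more training.')
            self.model.stop_training = True

# Under TF 2.x the final line of train_mnist would likewise change to:
# return history.epoch, history.history['accuracy'][-1]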
# Now click the 'Submit Assignment' button above.
# Once that is complete, please run the following two cells to save your work and close the notebook
%%javascript
// Save the notebook
IPython.notebook.save_checkpoint();
%%javascript
IPython.notebook.session.delete();
window.onbeforeunload = null;
setTimeout(function() { window.close(); }, 1000);