
GitHub Repository: y33-j3T/Coursera-Deep-Learning
Path: blob/master/Introduction to TensorFlow for Artificial Intelligence, Machine Learning, and Deep Learning/Week 4 - Using Real-world Images/Exercise4-Question.ipynb
Kernel: Python 3

Below is code that loads a happy-or-sad dataset containing 80 images: 40 happy and 40 sad. Create a convolutional neural network that trains to 100% accuracy on these images, cancelling training as soon as the training accuracy exceeds 0.999.

Hint -- it will work best with 3 convolutional layers.
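The key mechanism for the "cancel training" requirement is a custom Keras callback that checks the training accuracy at the end of each epoch and sets self.model.stop_training once the threshold is crossed. Below is a minimal sketch of just that pattern; the class name StopAtAccuracy is illustrative, and the metric key is 'acc' on the TF 1.x kernel used here ('accuracy' on newer TensorFlow releases).

# Minimal sketch of a stop-at-threshold callback (illustration only, not a graded cell).
import tensorflow as tf

class StopAtAccuracy(tf.keras.callbacks.Callback):
    def __init__(self, threshold=0.999):
        super().__init__()
        self.threshold = threshold

    def on_epoch_end(self, epoch, logs=None):
        acc = (logs or {}).get('acc')  # 'accuracy' on newer TensorFlow versions
        if acc is not None and acc > self.threshold:
            print('\nReached {:.1%} accuracy so cancelling training!'.format(self.threshold))
            self.model.stop_training = True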

import tensorflow as tf
import os
import zipfile
from os import path, getcwd, chdir

# DO NOT CHANGE THE LINE BELOW. If you are developing in a local
# environment, then grab happy-or-sad.zip from the Coursera Jupyter Notebook
# and place it inside a local folder and edit the path to that location
path = f"{getcwd()}/../tmp2/happy-or-sad.zip"

zip_ref = zipfile.ZipFile(path, 'r')
zip_ref.extractall("/tmp/h-or-s")
zip_ref.close()
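flow_from_directory infers the class labels from the sub-folder names, so it is worth confirming that the archive extracted into one folder per class under /tmp/h-or-s before training. A small optional check follows; the assumption is that the dataset ships with one 'happy' and one 'sad' folder, so verify the names on your copy.

# Optional sanity check: list each class folder under /tmp/h-or-s and count its images.
import os

base_dir = "/tmp/h-or-s"
for class_name in sorted(os.listdir(base_dir)):
    class_dir = os.path.join(base_dir, class_name)
    if os.path.isdir(class_dir):
        print(class_name, "->", len(os.listdir(class_dir)), "images")
# Expected: two folders with 40 images each (80 in total).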
# GRADED FUNCTION: train_happy_sad_model
def train_happy_sad_model():
    # Please write your code only where you are indicated.
    # Please do not remove the '# model fitting' inline comments.

    DESIRED_ACCURACY = 0.999

    class myCallback(tf.keras.callbacks.Callback):
        def on_epoch_end(self, epoch, logs={}):
            if logs.get('acc') > DESIRED_ACCURACY:
                print('\nReached 99.9% accuracy so cancelling training!')
                self.model.stop_training = True

    callbacks = myCallback()

    # This code block should define and compile the model.
    # Please assume the images are 150 x 150 in your implementation.
    model = tf.keras.models.Sequential([
        tf.keras.layers.Conv2D(64, (3, 3), activation='relu', input_shape=(150, 150, 3)),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Conv2D(64, (3, 3), activation='relu'),
        tf.keras.layers.MaxPooling2D(2, 2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(1, activation='sigmoid')
    ])

    from tensorflow.keras.optimizers import RMSprop  # imported by the starter code; 'adam' is used below instead

    model.compile(loss='binary_crossentropy',
                  optimizer='adam',
                  metrics=['acc'])

    # This code block should create an instance of an ImageDataGenerator called train_datagen
    # and a train_generator by calling train_datagen.flow_from_directory.
    from tensorflow.keras.preprocessing.image import ImageDataGenerator

    train_datagen = ImageDataGenerator(rescale=1/255.0)

    # Please use a target_size of 150 x 150.
    train_generator = train_datagen.flow_from_directory(
        '/tmp/h-or-s',
        target_size=(150, 150),
        batch_size=32,
        class_mode='binary'
    )
    # Expected output: 'Found 80 images belonging to 2 classes'

    # This code block should call model.fit_generator and train for a number of epochs.
    # model fitting
    history = model.fit_generator(
        train_generator,
        steps_per_epoch=1,
        epochs=30,
        verbose=1,
        callbacks=[callbacks]
    )
    # model fitting
    return history.history['acc'][-1]
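A note for anyone re-running this on a current TensorFlow install: model.fit_generator matches the TF 1.x kernel this notebook was executed with, but it is deprecated in newer releases, where model.fit accepts a generator directly and the metric is reported as 'accuracy' rather than 'acc'. A sketch of the drop-in replacement for the fitting call above, under those assumptions:

# Sketch: equivalent fitting call on TensorFlow 2.x (assumes the same `model`,
# `train_generator`, and `callbacks` objects defined in train_happy_sad_model).
history = model.fit(
    train_generator,
    steps_per_epoch=1,      # mirrors the original cell: one 32-image batch per epoch
    epochs=30,
    verbose=1,
    callbacks=[callbacks]
)
# On TF 2.x the callback and the return statement would read the metric via
# 'accuracy' instead of 'acc'.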
# The expected output: "Reached 99.9% accuracy so cancelling training!"
train_happy_sad_model()
WARNING: Logging before flag parsing goes to stderr.
W1214 10:41:39.849465 139889356474176 deprecation.py:506] From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/init_ops.py:1251: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version.
Instructions for updating:
Call initializer instance with the dtype argument instead of passing it to the constructor
W1214 10:41:40.224138 139889356474176 deprecation.py:323] From /usr/local/lib/python3.6/dist-packages/tensorflow/python/ops/nn_impl.py:180: add_dispatch_support.<locals>.wrapper (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.where in 2.0, which has the same broadcast rule as np.where
Found 80 images belonging to 2 classes.
Epoch 1/30
1/1 [==============================] - 5s 5s/step - loss: 0.6968 - acc: 0.5312
Epoch 2/30
1/1 [==============================] - 0s 86ms/step - loss: 1.3948 - acc: 0.4688
Epoch 3/30
1/1 [==============================] - 0s 407ms/step - loss: 0.7137 - acc: 0.5000
Epoch 4/30
1/1 [==============================] - 0s 91ms/step - loss: 0.7384 - acc: 0.4688
Epoch 5/30
1/1 [==============================] - 0s 93ms/step - loss: 0.6834 - acc: 0.5625
Epoch 6/30
1/1 [==============================] - 0s 27ms/step - loss: 0.6924 - acc: 0.4375
Epoch 7/30
1/1 [==============================] - 0s 178ms/step - loss: 0.6829 - acc: 0.5000
Epoch 8/30
1/1 [==============================] - 0s 88ms/step - loss: 0.6716 - acc: 0.5625
Epoch 9/30
1/1 [==============================] - 0s 109ms/step - loss: 0.6962 - acc: 0.4688
Epoch 10/30
1/1 [==============================] - 0s 116ms/step - loss: 0.6600 - acc: 0.7500
Epoch 11/30
1/1 [==============================] - 0s 16ms/step - loss: 0.6318 - acc: 0.6875
Epoch 12/30
1/1 [==============================] - 0s 113ms/step - loss: 0.7155 - acc: 0.4062
Epoch 13/30
1/1 [==============================] - 0s 176ms/step - loss: 0.5984 - acc: 0.5938
Epoch 14/30
1/1 [==============================] - 0s 114ms/step - loss: 0.6052 - acc: 0.8750
Epoch 15/30
1/1 [==============================] - 0s 83ms/step - loss: 0.5532 - acc: 0.8125
Epoch 16/30
1/1 [==============================] - 0s 119ms/step - loss: 0.6283 - acc: 0.5625
Epoch 17/30
1/1 [==============================] - 0s 13ms/step - loss: 0.4351 - acc: 0.9375
Epoch 18/30
Reached 99.9% accuracy so cancelling training!
1/1 [==============================] - 0s 118ms/step - loss: 0.4483 - acc: 1.0000
1.0
# Now click the 'Submit Assignment' button above.
# Once that is complete, please run the following two cells to save your work and close the notebook.
%%javascript
<!-- Save the notebook -->
IPython.notebook.save_checkpoint();
%%javascript
IPython.notebook.session.delete();
window.onbeforeunload = null
setTimeout(function() {
    window.close();
}, 1000);