STEP 1: Import ALL the things!

The packages

In [14]:
!pip install ipython-autotime
%load_ext autotime
Collecting ipython-autotime
  Downloading https://files.pythonhosted.org/packages/e6/f9/0626bbdb322e3a078d968e87e3b01341e7890544de891d0cb613641220e6/ipython-autotime-0.1.tar.bz2
Building wheels for collected packages: ipython-autotime
  Building wheel for ipython-autotime (setup.py) ... done
  Created wheel for ipython-autotime: filename=ipython_autotime-0.1-cp36-none-any.whl size=1832 sha256=b99dbb80eedb7316d13d31df07c66ac1021233c76b6b2b641c2511e1b0f8ea8d
  Stored in directory: /root/.cache/pip/wheels/d2/df/81/2db1e54bc91002cec40334629bc39cfa86dff540b304ebcd6e
Successfully built ipython-autotime
Installing collected packages: ipython-autotime
Successfully installed ipython-autotime-0.1
In [15]:
import matplotlib
matplotlib.use("Agg")
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from keras.models import Sequential
from keras.layers.core import Dense
from keras.optimizers import SGD
from imutils import paths
import matplotlib.pyplot as plt
import numpy as np
import argparse
import random
import pickle
import cv2
import os
time: 9.97 ms

The data

In [16]:
## mount your Google Drive folder
from google.colab import drive
drive.mount('/content/drive')
Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount("/content/drive", force_remount=True).
time: 2.67 ms
In [17]:
## change into the directory where your data is;
## use the absolute path so this cell is safe to re-run (a relative
## path like "drive/My Drive/data" breaks once the working directory
## has already changed)
os.chdir("/content/drive/My Drive/data")

STEP 2: Process the data

  1. Get image paths. Shuffle them.
  2. Loop over the paths and load the images

Inside the loop (for each image)

  • resize the image
  • flatten the image
  • store the (newly resized & flattened) image in the 'data' array
  • store the label in the 'labels' array
In [19]:
print("[INFO] loading images...")
data = []
labels = []

# imagePaths = sorted(list(paths.list_images('animals')))
imagePaths = sorted(list(paths.list_images('jcrew')))
random.seed(42)
random.shuffle(imagePaths)

for imagePath in imagePaths:
	# load the image, resize it to 32x32, and flatten it into a vector
	image = cv2.imread(imagePath)
	image = cv2.resize(image, (32, 32)).flatten()
	data.append(image)
	# the class label is the name of the image's parent directory
	label = imagePath.split(os.path.sep)[-2]
	labels.append(label)
[INFO] loading images...
time: 1min 43s
In [20]:
# scale the raw pixel intensities to the range [0, 1]
data = np.array(data, dtype="float") / 255.0
labels = np.array(labels)
time: 10.7 ms
In [21]:
(trainX, testX, trainY, testY) = train_test_split(data,
	labels, test_size=0.25, random_state=42)
time: 11.5 ms

Convert the labels from integers to vectors

NOTE: for 2-class (binary) classification, follow up with Keras' to_categorical function, because for two classes scikit-learn's LabelBinarizer returns a single 0/1 column rather than a one-hot vector
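
A minimal sketch of that two-class gotcha, using hypothetical labels (not this notebook's data):

import numpy as np
from keras.utils import to_categorical
from sklearn.preprocessing import LabelBinarizer

y = np.array(["cat", "dog", "dog", "cat"])
lb2 = LabelBinarizer()
binarized = lb2.fit_transform(y)
print(binarized.shape)                  # (4, 1) -- a single 0/1 column
print(to_categorical(binarized).shape)  # (4, 2) -- one-hot vectors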

In [22]:
lb = LabelBinarizer()
trainY = lb.fit_transform(trainY)
testY = lb.transform(testY)
time: 7.41 ms

STEP 3: Define the model

In [23]:
model = Sequential()
model.add(Dense(1024, input_shape=(3072,), activation="sigmoid"))
model.add(Dense(512, activation="sigmoid"))
model.add(Dense(len(lb.classes_), activation="softmax"))
time: 58 ms
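
The 3,072-dimensional input_shape matches the preprocessing in STEP 2: 32 x 32 pixels x 3 color channels = 3,072 values per flattened image. A quick sanity check against the arrays built earlier:

# each flattened image should line up with the Dense input_shape above
assert data.shape[1] == 32 * 32 * 3  # 3072
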
  1. Initialize learning rate and # of epochs
  2. Compile the model using:
  • OPTIMIZER: SGD
  • LOSS: categorical cross-entropy loss

NOTE: use binary_crossentropy for 2-class classification

In [24]:
INIT_LR = 0.01
EPOCHS = 75

print("[INFO] training network...")
opt = SGD(lr=INIT_LR)
model.compile(loss="categorical_crossentropy", optimizer=opt,
	metrics=["accuracy"])
[INFO] training network...
time: 52.8 ms
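
For the hypothetical 2-class case flagged in the note above, only the loss string changes (assuming the labels were one-hot encoded with to_categorical); a minimal sketch:

model.compile(loss="binary_crossentropy", optimizer=opt,
	metrics=["accuracy"])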

STEP 4: Train the neural network

In [25]:
H = model.fit(trainX, trainY, validation_data=(testX, testY),
	epochs=EPOCHS, batch_size=32)
Train on 341 samples, validate on 114 samples
Epoch 1/75
341/341 [==============================] - 1s 2ms/step - loss: 1.1165 - acc: 0.3695 - val_loss: 1.1075 - val_acc: 0.3596
Epoch 2/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0861 - acc: 0.4252 - val_loss: 1.0684 - val_acc: 0.3509
Epoch 3/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0697 - acc: 0.4370 - val_loss: 1.1091 - val_acc: 0.3596
Epoch 4/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0495 - acc: 0.4457 - val_loss: 1.0706 - val_acc: 0.4211
Epoch 5/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0429 - acc: 0.4692 - val_loss: 1.0699 - val_acc: 0.3596
Epoch 6/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0180 - acc: 0.4663 - val_loss: 1.0424 - val_acc: 0.5088
Epoch 7/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0139 - acc: 0.5718 - val_loss: 1.0202 - val_acc: 0.4474
Epoch 8/75
341/341 [==============================] - 0s 1ms/step - loss: 1.0080 - acc: 0.4985 - val_loss: 1.0259 - val_acc: 0.3596
Epoch 9/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9903 - acc: 0.5455 - val_loss: 0.9959 - val_acc: 0.5351
Epoch 10/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9698 - acc: 0.6364 - val_loss: 0.9801 - val_acc: 0.5614
Epoch 11/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9674 - acc: 0.6305 - val_loss: 0.9609 - val_acc: 0.7368
Epoch 12/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9521 - acc: 0.6510 - val_loss: 0.9498 - val_acc: 0.7368
Epoch 13/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9418 - acc: 0.7009 - val_loss: 0.9430 - val_acc: 0.6667
Epoch 14/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9224 - acc: 0.6833 - val_loss: 0.9413 - val_acc: 0.5702
Epoch 15/75
341/341 [==============================] - 0s 1ms/step - loss: 0.9137 - acc: 0.6804 - val_loss: 0.9358 - val_acc: 0.6404
Epoch 16/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8939 - acc: 0.7126 - val_loss: 0.9098 - val_acc: 0.6930
Epoch 17/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8826 - acc: 0.7214 - val_loss: 0.9010 - val_acc: 0.6228
Epoch 18/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8627 - acc: 0.7390 - val_loss: 0.8884 - val_acc: 0.6754
Epoch 19/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8402 - acc: 0.7155 - val_loss: 0.8712 - val_acc: 0.8070
Epoch 20/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8372 - acc: 0.7155 - val_loss: 0.8541 - val_acc: 0.8070
Epoch 21/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8160 - acc: 0.7889 - val_loss: 0.8458 - val_acc: 0.6842
Epoch 22/75
341/341 [==============================] - 0s 1ms/step - loss: 0.8024 - acc: 0.7771 - val_loss: 0.8341 - val_acc: 0.7105
Epoch 23/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7999 - acc: 0.7595 - val_loss: 0.8153 - val_acc: 0.7544
Epoch 24/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7731 - acc: 0.7771 - val_loss: 0.8048 - val_acc: 0.7281
Epoch 25/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7544 - acc: 0.8240 - val_loss: 0.8132 - val_acc: 0.6316
Epoch 26/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7438 - acc: 0.7859 - val_loss: 0.7948 - val_acc: 0.6842
Epoch 27/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7299 - acc: 0.7859 - val_loss: 0.7909 - val_acc: 0.6754
Epoch 28/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7181 - acc: 0.7918 - val_loss: 0.7551 - val_acc: 0.7544
Epoch 29/75
341/341 [==============================] - 0s 1ms/step - loss: 0.7017 - acc: 0.8240 - val_loss: 0.7749 - val_acc: 0.7719
Epoch 30/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6974 - acc: 0.7947 - val_loss: 0.7382 - val_acc: 0.7281
Epoch 31/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6790 - acc: 0.8240 - val_loss: 0.7244 - val_acc: 0.7456
Epoch 32/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6669 - acc: 0.8182 - val_loss: 0.7111 - val_acc: 0.7719
Epoch 33/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6704 - acc: 0.8211 - val_loss: 0.7091 - val_acc: 0.7368
Epoch 34/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6371 - acc: 0.8299 - val_loss: 0.7077 - val_acc: 0.7018
Epoch 35/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6254 - acc: 0.8328 - val_loss: 0.6930 - val_acc: 0.7281
Epoch 36/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6193 - acc: 0.8328 - val_loss: 0.7136 - val_acc: 0.6842
Epoch 37/75
341/341 [==============================] - 0s 1ms/step - loss: 0.6176 - acc: 0.8240 - val_loss: 0.6626 - val_acc: 0.7544
Epoch 38/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5951 - acc: 0.8387 - val_loss: 0.6641 - val_acc: 0.7456
Epoch 39/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5837 - acc: 0.8328 - val_loss: 0.6366 - val_acc: 0.8070
Epoch 40/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5616 - acc: 0.8710 - val_loss: 0.6285 - val_acc: 0.8684
Epoch 41/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5610 - acc: 0.8856 - val_loss: 0.6261 - val_acc: 0.7544
Epoch 42/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5525 - acc: 0.8768 - val_loss: 0.6807 - val_acc: 0.6491
Epoch 43/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5460 - acc: 0.8387 - val_loss: 0.6123 - val_acc: 0.7719
Epoch 44/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5306 - acc: 0.8856 - val_loss: 0.6086 - val_acc: 0.7456
Epoch 45/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5192 - acc: 0.8886 - val_loss: 0.5962 - val_acc: 0.7544
Epoch 46/75
341/341 [==============================] - 0s 1ms/step - loss: 0.5092 - acc: 0.8798 - val_loss: 0.5774 - val_acc: 0.8333
Epoch 47/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4955 - acc: 0.8915 - val_loss: 0.5655 - val_acc: 0.8684
Epoch 48/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4908 - acc: 0.9267 - val_loss: 0.5831 - val_acc: 0.7456
Epoch 49/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4768 - acc: 0.8915 - val_loss: 0.5710 - val_acc: 0.7544
Epoch 50/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4682 - acc: 0.9003 - val_loss: 0.5419 - val_acc: 0.8860
Epoch 51/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4574 - acc: 0.9384 - val_loss: 0.5484 - val_acc: 0.7895
Epoch 52/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4495 - acc: 0.9179 - val_loss: 0.5265 - val_acc: 0.8772
Epoch 53/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4402 - acc: 0.9267 - val_loss: 0.5202 - val_acc: 0.8947
Epoch 54/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4426 - acc: 0.9296 - val_loss: 0.5186 - val_acc: 0.8421
Epoch 55/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4266 - acc: 0.9384 - val_loss: 0.5158 - val_acc: 0.8684
Epoch 56/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4182 - acc: 0.9326 - val_loss: 0.5002 - val_acc: 0.8860
Epoch 57/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4084 - acc: 0.9296 - val_loss: 0.4996 - val_acc: 0.8596
Epoch 58/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4058 - acc: 0.9413 - val_loss: 0.4962 - val_acc: 0.8333
Epoch 59/75
341/341 [==============================] - 0s 1ms/step - loss: 0.4000 - acc: 0.9355 - val_loss: 0.4833 - val_acc: 0.8772
Epoch 60/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3961 - acc: 0.9296 - val_loss: 0.4751 - val_acc: 0.8860
Epoch 61/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3845 - acc: 0.9531 - val_loss: 0.4839 - val_acc: 0.8246
Epoch 62/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3787 - acc: 0.9355 - val_loss: 0.4811 - val_acc: 0.8509
Epoch 63/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3708 - acc: 0.9413 - val_loss: 0.4610 - val_acc: 0.8772
Epoch 64/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3656 - acc: 0.9560 - val_loss: 0.4635 - val_acc: 0.8596
Epoch 65/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3605 - acc: 0.9472 - val_loss: 0.4484 - val_acc: 0.8860
Epoch 66/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3551 - acc: 0.9648 - val_loss: 0.4452 - val_acc: 0.8772
Epoch 67/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3458 - acc: 0.9560 - val_loss: 0.4423 - val_acc: 0.9123
Epoch 68/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3443 - acc: 0.9560 - val_loss: 0.4363 - val_acc: 0.8860
Epoch 69/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3385 - acc: 0.9472 - val_loss: 0.4305 - val_acc: 0.8947
Epoch 70/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3357 - acc: 0.9501 - val_loss: 0.4268 - val_acc: 0.8860
Epoch 71/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3315 - acc: 0.9531 - val_loss: 0.4195 - val_acc: 0.8860
Epoch 72/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3208 - acc: 0.9560 - val_loss: 0.4303 - val_acc: 0.9386
Epoch 73/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3170 - acc: 0.9824 - val_loss: 0.4134 - val_acc: 0.8860
Epoch 74/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3135 - acc: 0.9589 - val_loss: 0.4178 - val_acc: 0.9035
Epoch 75/75
341/341 [==============================] - 0s 1ms/step - loss: 0.3154 - acc: 0.9472 - val_loss: 0.4045 - val_acc: 0.9123
time: 34 s

STEP 5: Evaluate the trained network

In [26]:
print("[INFO] evaluating network...")
predictions = model.predict(testX, batch_size=32)
print(classification_report(testY.argmax(axis=1),
	predictions.argmax(axis=1), target_names=lb.classes_))

N = np.arange(0, EPOCHS)
plt.style.use("ggplot")
plt.figure()
plt.plot(N, H.history["loss"], label="train_loss")
plt.plot(N, H.history["val_loss"], label="val_loss")
plt.plot(N, H.history["acc"], label="train_acc")
plt.plot(N, H.history["val_acc"], label="val_acc")
plt.title("Training Loss and Accuracy (Simple NN)")
plt.xlabel("Epoch #")
plt.ylabel("Loss/Accuracy")
plt.legend()
plt.savefig('results.png')
[INFO] evaluating network...
              precision    recall  f1-score   support

       pants       0.90      0.82      0.86        33
 shirts_tops       0.86      0.95      0.90        40
       shoes       0.97      0.95      0.96        41

    accuracy                           0.91       114
   macro avg       0.91      0.91      0.91       114
weighted avg       0.91      0.91      0.91       114

time: 242 ms
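
Because the Agg backend was selected at import time, the figure is written to 'results.png' rather than shown inline; to view it in the notebook you can read the file back. A minimal sketch, assuming the file landed in the current working directory:

from IPython.display import Image
Image('results.png')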

STEP 6: Test the model on new data

In [27]:
image = cv2.imread('images/jc_test.jpeg')
output = image.copy()

## doing the same pre-processing we did above
image = cv2.resize(image, (32, 32))
image = image.astype("float") / 255.0
image = image.flatten()
image = image.reshape((1, image.shape[0]))
time: 142 ms
In [34]:
preds = model.predict(image)

## take the label with the highest probability
i = preds.argmax(axis=1)[0]
label = lb.classes_[i]
print(label, preds[0][i])
shirts_tops 0.62997866
time: 13.5 ms
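
The 'output' copy made earlier goes unused above; a common follow-up is drawing the prediction onto it. A minimal sketch (the text position, scale, color, and output filename here are arbitrary choices, not from this notebook):

text = "{}: {:.2f}%".format(label, preds[0][i] * 100)
cv2.putText(output, text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
	0.7, (0, 255, 0), 2)
cv2.imwrite('labeled.png', output)  # hypothetical output path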