
# Using Transfer Learning to Capitalize on State-of-the-Art Networks

## Repurposing InceptionV3, VGG16, and ResNet50

Read my full write-up with visualizations on my website, [galenballew.github.io](https://galenballew.github.io).

Or check out the article on Medium.

**The Challenge:** Some of the most advanced convolutional neural networks are available with pretrained weights. Starting from these weights provides enormous leverage for your next image classification task. This project shows how the bottlenecking technique can significantly increase training speed for a repurposed network: the frozen convolutional base is run over the dataset once, its outputs (the "bottleneck features") are cached, and only a small classifier is trained on top of them.
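The bottlenecking idea can be sketched in a few lines of Keras. This is a minimal illustration, not the project's actual code: it uses VGG16 with random placeholder images and labels, and passes `weights=None` to avoid the ImageNet download (the real workflow would use `weights="imagenet"`).

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Convolutional base without the classifier head. Global average pooling
# collapses each feature map to a single value, giving a flat feature vector.
# weights=None avoids a download here; use weights="imagenet" in practice.
base = VGG16(weights=None, include_top=False,
             input_shape=(224, 224, 3), pooling="avg")

# Run the (frozen) base over the images ONCE and cache the outputs.
images = np.random.rand(8, 224, 224, 3).astype("float32")  # placeholder batch
bottleneck_features = base.predict(images, verbose=0)       # shape (8, 512)

# Train only a small classifier on the cached features -- this is fast,
# because the expensive convolutional base is never run again.
labels = np.random.randint(0, 10, size=8)  # placeholder labels, 10 classes
clf = models.Sequential([
    layers.Input(shape=bottleneck_features.shape[1:]),
    layers.Dense(10, activation="softmax"),
])
clf.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
clf.fit(bottleneck_features, labels, epochs=1, verbose=0)
```

Because the bottleneck features are computed once and reused every epoch, training the classifier head takes seconds instead of the minutes or hours needed to backpropagate through the full network.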

**The Toolkit:**

* TensorFlow
* scikit-learn
* Keras
* NumPy
* D3.js

**The Results:** Check out my website for the D3 visualization of training accuracy/loss for the different networks.