
Deeplearning-pipelines

A series of projects and tutorials on building production-grade deep learning pipelines.

Plan

Content for the first blog

  • A brief architecture overview of a simple, scalable deep learning pipeline.

  • Building blocks of a real-world deep learning pipeline.

  • An explanation of each building block.

  • HDFS vs. Minio for the storage layer.

  • Training on batch data and storing the data in Minio.

  • Data exploration using Spark SQL, Minio, and Jupyter notebooks.

  • Data preparation.

  • Hello-world TensorFlow + Minio.

  • A Docker container to run the example.

  • A code example running hello-world TensorFlow against Minio Play.

  • Push the code into Recipes.

  • Read the data from Minio into memory using TensorFlow and run a simple logistic regression training job.

  • Create Google Colab notebooks for easy running of the code.

  • Live coding video with explanation.

  • Push the code to the repo.
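The logistic regression training step above can be sketched in plain Python. This is a minimal, dependency-free illustration of the training loop only, assuming the feature rows have already been fetched from Minio into memory (for example with the Minio Python client's `get_object`); the toy dataset below is invented for demonstration.

```python
import math

def train_logistic_regression(rows, labels, lr=0.1, epochs=200):
    """Plain stochastic-gradient-descent logistic regression on in-memory rows."""
    n_features = len(rows[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            err = p - y                     # gradient of log-loss w.r.t. z
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z >= 0 else 0

# Tiny linearly separable data standing in for rows read from Minio;
# the label simply follows the first feature.
X = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
y = [0, 0, 1, 1]
w, b = train_logistic_regression(X, y)
print([predict(w, b, x) for x in X])  # → [0, 0, 1, 1]
```

The blog itself would use TensorFlow for the same loop; the pure-Python version just makes the gradient step explicit.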

Content for the second blog

  • Saving the model trained in the previous blog to Minio.
  • Serving the trained model stored in Minio.
  • Checkpointing training state to Minio.
  • Restoring checkpoints from Minio.
  • A neural network project.
  • Save the code examples on GitHub.
  • Create Google Colab notebooks for the project.
  • Live coding video with explanation.
  • Push the code to the repo.
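The checkpoint save/restore round trip above can be sketched as follows. This is a minimal illustration, assuming a local temporary directory as a stand-in for a Minio bucket; against a real deployment the same bytes would be uploaded and downloaded with an S3-compatible client (Minio speaks the S3 API).

```python
import json
import os
import tempfile

# Local directory standing in for a Minio bucket in this sketch.
bucket_dir = tempfile.mkdtemp()

def save_checkpoint(step, weights, name="model.ckpt"):
    """Serialize training state to one object per step."""
    path = os.path.join(bucket_dir, f"{name}-{step}.json")
    with open(path, "w") as f:
        json.dump({"step": step, "weights": weights}, f)
    return path

def restore_checkpoint(path):
    """Load training state back so a job can resume where it stopped."""
    with open(path) as f:
        state = json.load(f)
    return state["step"], state["weights"]

path = save_checkpoint(100, [0.5, -1.2, 3.0])
step, weights = restore_checkpoint(path)
print(step, weights)  # → 100 [0.5, -1.2, 3.0]
```

TensorFlow's own checkpoint files work the same way conceptually: opaque state written per step, restored by path.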

Content for the third blog

  • Running the previous blog's project end to end on Kubeflow.
  • Create Terraform templates to recreate the Kubeflow-with-Minio setup on GKE in one shot.
  • Live coding video with explanation.
  • Push the code to the repo.

Content for the fourth blog

  • Batch reading of data stored in Minio for training, scaling past in-memory training.
  • Neural network training on a large dataset.
  • Create containers for the code and push them to the repo.
  • Live coding video with explanation.
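The batch-reading idea above can be sketched with a simple generator: the training loop only ever holds one batch in memory, regardless of dataset size. The `range` below is a stand-in for whatever record stream the Minio object would yield; the function names are illustrative, not from any library.

```python
def batches(records, batch_size):
    """Yield fixed-size batches from any iterable, so training never
    materializes the full dataset in memory."""
    batch = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# `records` could be a generator streaming lines from a Minio object;
# a plain range stands in for the stream here.
sizes = [len(b) for b in batches(range(10), batch_size=4)]
print(sizes)  # → [4, 4, 2]
```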

Content for the fifth blog

  • Distributing TensorFlow training over large datasets stored in a distributed Minio setup.
  • Terraform scripts for a one-shot setup.
  • A Kubeflow project for the same.
  • Live coding video with explanation.
  • Push the code to the repo.
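One core piece of distributed data-parallel training is deciding which slice of the dataset each worker reads. A minimal sketch of contiguous sharding, assuming workers are numbered 0..N-1 (the function name is illustrative, not a TensorFlow API):

```python
def shard(num_items, num_workers, worker_index):
    """Return the contiguous range of dataset indices owned by one worker,
    distributing any remainder one item at a time to the first workers."""
    per_worker = num_items // num_workers
    remainder = num_items % num_workers
    start = worker_index * per_worker + min(worker_index, remainder)
    size = per_worker + (1 if worker_index < remainder else 0)
    return list(range(start, start + size))

shards = [shard(10, 3, w) for w in range(3)]
print(shards)  # → [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

Every item lands in exactly one shard, so workers can each stream their own slice of objects out of Minio without coordination.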

Content for the sixth blog

  • Ingest real-world data from Kafka and store it in Minio as shredded files.
  • Run batch deep learning jobs on the ingested stream data in Minio.
  • Live coding video with explanation.
  • Terraform scripts for the setup.
  • A Kubeflow project.
  • Push the code to the repo.
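The "shredded files" layout above can be sketched as grouping consumed records by a key and writing one file per group. This is a toy stand-in: a temporary directory plays the role of the Minio bucket, and the records are invented examples of messages a Kafka consumer would hand over.

```python
import json
import os
import tempfile
from collections import defaultdict

def shred_to_files(records, key, out_dir):
    """Group records by `key` and write one newline-delimited JSON file
    per group, mirroring a key=value object layout in a bucket."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key]].append(rec)
    paths = {}
    for k, recs in groups.items():
        path = os.path.join(out_dir, f"{key}={k}.jsonl")
        with open(path, "w") as f:
            for rec in recs:
                f.write(json.dumps(rec) + "\n")
        paths[k] = path
    return paths

# Toy records standing in for messages consumed from a Kafka topic.
records = [
    {"user": "a", "event": "click"},
    {"user": "b", "event": "view"},
    {"user": "a", "event": "view"},
]
out_dir = tempfile.mkdtemp()
paths = shred_to_files(records, "user", out_dir)
print(sorted(paths))  # → ['a', 'b']
```

Batch training jobs can then list and read only the shards they need instead of scanning the whole stream.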

Content for the seventh blog

  • Clean the data using Kafka Streams and KSQL.
  • Store the cleaned data in Minio.
  • Run deep learning training jobs.
  • Terraform scripts and a live coding video.
  • Push the code to the repo.
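The cleaning rules the blog would express as streaming SQL can be sketched in Python to make the logic concrete: drop null readings, drop out-of-range values, and de-duplicate on an id. The records and thresholds below are invented for illustration.

```python
def clean(stream, low=-50.0, high=60.0):
    """Drop nulls, out-of-range values, and duplicate ids; the same
    rules could be written as a filtering query over a stream."""
    seen = set()
    for rec in stream:
        if rec["temp"] is None or not (low <= rec["temp"] <= high):
            continue  # missing or implausible reading
        if rec["id"] in seen:
            continue  # duplicate id
        seen.add(rec["id"])
        yield rec

raw = [
    {"id": 1, "temp": 21.5},
    {"id": 2, "temp": None},    # missing reading -> dropped
    {"id": 1, "temp": 21.5},    # duplicate id    -> dropped
    {"id": 3, "temp": 900.0},   # out of range    -> dropped
    {"id": 4, "temp": 18.0},
]
cleaned = list(clean(raw))
print([r["id"] for r in cleaned])  # → [1, 4]
```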

Content for the eighth, ninth, and tenth blogs

  • A large-scale image processing pipeline using convolutional neural networks with Minio and TensorFlow.
  • Live coding video.
  • Terraform scripts and Kubeflow projects.
  • Push the code to the repo.
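The core operation of the convolutional networks above can be illustrated in a few lines. This is a dependency-free sketch of valid-mode 2D convolution (technically cross-correlation, as in most deep learning frameworks) on a single-channel image given as nested lists; the edge-detector kernel and tiny image are invented examples.

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a single-channel image."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    acc += image[i + di][j + dj] * kernel[di][dj]
            row.append(acc)
        out.append(row)
    return out

# A vertical-edge detector on a tiny image: left half 0s, right half 1s.
image = [[0, 0, 1, 1] for _ in range(3)]
kernel = [[-1, 1] for _ in range(3)]
print(conv2d(image, kernel))  # → [[0.0, 3.0, 0.0]]  (edge found in the middle)
```

TensorFlow's `tf.nn.conv2d` does the same thing, vectorized over batches, channels, and filters.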
