Argoproj is a set of loosely coupled tools that aim to unleash the power of Kubernetes:
- Argo Workflows - container-native workflow engine for orchestrating parallel jobs on Kubernetes.
- Argo Events - The event-driven workflow automation framework
- Argo CD - Declarative continuous delivery for Kubernetes
- Argo Rollouts - Progressive delivery for Kubernetes
Each Argo sub-project is focused on a separate use-case and can be used independently. But together, Argo projects complement each other and form a powerful application delivery platform.
This repository demonstrates how Argo allows gluing disconnected open-source projects into a complex system that solves real-world use cases, with zero code written.
We are going to build a web application consisting of multiple microservices as well as a background batch processing system that leverages machine learning. The application allows users to upload an image, stores the image in S3-compatible storage, and produces a new image with the detected faces highlighted, using a pre-trained ML model.
Instead of building an application from scratch, we are going to take existing open-source projects and use the Argo projects to glue them together:
- minio - Kubernetes-native S3-compatible storage. Minio is going to store user-uploaded images and ML workflow outputs.
- filestash - A modern web client for the storage of your choice. Filestash allows users to upload an image and view the processing results.
- facedetect - a simple face detector for batch processing. facedetect encapsulates the ML magic and produces an image with the detected faces.
- argo-workflows - workflow engine for Kubernetes. Argo Workflows orchestrates the steps required to access the uploaded image, process it, and store the processing results.
- argo-events - event-driven automation framework for Kubernetes. Argo Events watches for new images in the S3 storage and triggers the batch processing workflow.
- argo-cd - GitOps continuous delivery tool for Kubernetes. Argo CD manages the components listed above in the Kubernetes cluster and encapsulates both day-one and day-two operations.
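To make the event-driven part concrete, here is a rough sketch of the Argo Events wiring: a Sensor that submits a workflow whenever minio reports a new object. All names, the event source, and the workflow spec below are illustrative assumptions, not the actual manifests from this repository:

```yaml
# Hypothetical sketch only -- resource names, the event source,
# and the workflow body are placeholders; see the manifests in
# this repository for the real wiring.
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: minio-sensor
spec:
  dependencies:
    - name: image-uploaded
      eventSourceName: minio      # an EventSource watching the bucket
      eventName: images
  triggers:
    - template:
        name: facedetect-workflow
        argoWorkflow:
          operation: submit       # submit a new Workflow per event
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: facedetect-
              spec:
                entrypoint: process
```

The idea is that the Sensor decouples the storage from the processing: minio only has to emit bucket notifications, and Argo Events turns each notification into a workflow submission.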
- First of all, we need a Kubernetes cluster. Nothing fancy is required here. Use your GKE or EKS cluster, or just run a minikube cluster on your laptop.
- Install Argo CD. Follow the getting started instructions in Argo CD documentation.
TLDR:
kubectl create namespace argocd
kubectl apply -n argocd -f https://raw.githubusercontent.com/argoproj/argo-cd/stable/manifests/install.yaml
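Optionally, before a load balancer is available, you can reach the Argo CD UI by port-forwarding the API server and fetching the generated admin password (both commands come from the Argo CD getting-started documentation):

```shell
# Forward the Argo CD API server to https://localhost:8080
kubectl port-forward svc/argocd-server -n argocd 8080:443

# The initial admin password is stored in a secret (username: admin)
kubectl -n argocd get secret argocd-initial-admin-secret \
  -o jsonpath="{.data.password}" | base64 -d
```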
- Deploy the demo scenario using Argo CD, which in turn will spawn all required components in your K8s cluster:
kubectl apply -f https://raw.githubusercontent.com/alexmt/argo-combined-demo/master/demo.yaml -n argocd
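Under the hood, demo.yaml is an Argo CD Application resource. As a sketch, a minimal Application pointing at this repository might look like the following; the field values here are illustrative and the actual demo.yaml may differ:

```yaml
# Illustrative sketch of an Argo CD Application -- the real demo.yaml
# in this repository may use different paths and sync settings.
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: demo
  namespace: argocd
spec:
  project: default
  source:
    repoURL: https://github.com/alexmt/argo-combined-demo
    path: .                 # placeholder; the repo layout may differ
    targetRevision: master
  destination:
    server: https://kubernetes.default.svc
    namespace: default
  syncPolicy:
    automated: {}           # keep the cluster in sync with Git
```

Because Argo CD owns this Application, any change pushed to the Git repository is picked up and applied automatically, which is what makes day-two operations declarative.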
- Check the components' status in Argo CD user interface.
- Navigate to the external IP address of the demo-filestash service on port 9001 and upload an image using the filestash web user interface. Note: the image file must have a ".jpg" extension.
- Use the Argo Workflows user interface to observe the background processing. The Workflows user interface is available via the external IP address of the argo-server service on port 2746.
- See the image processing results in the filestash user interface.
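To find the external IP addresses mentioned above, a standard kubectl query works. The service names below assume the defaults used in this demo; adjust the namespace if the services were installed elsewhere:

```shell
# Show the demo services along with their external IPs and ports
kubectl get svc demo-filestash argo-server -o wide
```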