Domino

Build amazing ideas, piece by piece.



About

Domino is an open source workflow management platform, with:

  • 🖥️ an intuitive Graphical User Interface that facilitates creating, editing and monitoring any type of Workflow, from data processing to machine learning
  • 📦 a standard way of writing and publishing functional Pieces, which follows good practices for data modeling, documentation and distribution
  • ⚙️ a REST API that controls a running Apache Airflow instance

Creating Workflows in the GUI is as simple as dragging and dropping Pieces to the canvas, and connecting them. The user can schedule the Workflow to run periodically, at a specific date/time, or trigger it manually. The monitoring page shows the status of each Workflow Piece in real time, including the logs and results of each run.

Pieces are functional units that can be reused in multiple Workflows. Pieces can execute anything that can be written in Python, and can be easily distributed and installed directly from Github repositories to be used in Domino Workflows.

Every Domino Workflow corresponds to an Apache Airflow DAG, and each Piece corresponds to an Airflow task. Domino controls an Airflow instance, which is responsible for executing, scheduling and monitoring the Workflows (DAGs).
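
For readers unfamiliar with Airflow, the sketch below shows roughly what a two-Piece Workflow corresponds to in plain Airflow terms. It is only an illustration of the analogy: the DAG and task names are made up, and Domino generates its DAGs for you rather than asking you to write them by hand.

# Illustrative only: roughly what a two-Piece Workflow maps to in Airflow terms.
# Names are made up; this is not the code Domino actually generates.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_workflow():
    @task
    def extract_data():
        # plays the role of an upstream Piece
        return {"rows": 42}

    @task
    def build_report(payload):
        # plays the role of a downstream Piece, consuming the upstream output
        print(f"Report built from {payload['rows']} rows")

    build_report(extract_data())

example_workflow()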

You can think of Domino as Airflow with superpowers:

  • 🖥️ create highly complex Workflows with simple point-and-click and drag-and-drop operations, in a user-friendly GUI
  • 📦 make use of Pieces developed by other people, share and reuse your own Pieces
  • 👥 collaborate in groups to edit and monitor Workflows
  • 📈 experience a cleaner and more intuitive GUI for viewing Workflow results, including logs and richer reports with images and tables
  • 💽 shared storage for tasks in the same workflow
  • 🔄 use gitSync to sync DAGs from files stored in a Git repository
  • ☸️ scalable, Kubernetes-native platform
  • 🔋 powered by Apache Airflow for top-tier workflow scheduling and monitoring

Quick start

Check out the quick start guide in the documentation.

The Domino Python package can be installed via pip. We recommend installing it in a separate Python environment.

pip install domino-py[cli]

You can then use the Domino command line interface to easily run the Domino platform locally (requires Docker Compose V2). Go to a new, empty directory and run the following command:

domino platform run-compose

After all processes have started successfully, navigate to localhost:3000 to access the Domino frontend service.
Note: the first time you run the platform, it may take a few minutes to download the Docker images.

Running the Domino platform locally with Docker compose is useful for development and testing purposes. For production environments, we recommend you deploy Domino and Airflow to a Kubernetes cluster. For other deployment modes, check out the instructions in the documentation.


GUI

The Domino frontend service is a React application that provides the GUI for easily creating, editing and monitoring Workflows. Check out the GUI documentation for more details.

Access authentication: Sign up and log in to use the Domino platform.


Select or create Workspaces: Select an existing Workspace or create a new one.


Install Pieces repositories: Install bundles of Pieces to your Domino Workspaces directly from Github repositories, and use them in your Workflows.


Create Workflows: Create Workflows by dragging and dropping Pieces onto the canvas and connecting them.


Edit Pieces: Edit Pieces by changing their inputs. Outputs from upstream Pieces are automatically available as inputs to downstream Pieces. Pieces can pass forward any type of data, from simple strings to heavy files, all handled automatically by Domino's shared storage system.


Configure Workflows: Configure and schedule Workflows to run periodically, at a specific date/time, or trigger them manually.


Monitor Workflows: Monitor Workflows in real time, including the status of each Piece and the logs and results of each run.



Pieces


Pieces are the secret sauce of Domino: functional units that can be distributed and reused in multiple Workflows. Domino Pieces are special because they:

  • 🐍 can execute anything written in Python, heavy-weight (e.g. Machine Learning) as well as light-weight (e.g. sending emails) tasks
  • 🚥 have well defined data models for inputs, outputs and secrets
  • 📦 run in self-contained and isolated execution environments (Docker containers)
  • ⚙️ are immutable, guaranteeing reproducibility of your workflows
  • :octocat: are organized in git repositories, for easy packaging, distribution and installation
  • 📑 are properly versioned, tested and documented
  • ⚡ are plug-and-play and versatile, can be easily incorporated in any workflow

It is very easy to create and share your own Pieces:

  1. write your Python function as a Piece
  2. define the data types, dependencies, metadata and tests
  3. publish in a git repository (public or private)

The Pieces repository template provides the basic structure, example files and automatic actions for a seamless Pieces creation experience.
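
As a rough illustration, a minimal Piece tends to boil down to a pydantic input/output model plus a Python class with a piece_function. The sketch below follows the structure suggested by the repository template, but the exact module layout and base-class import (BasePiece) are assumptions; check the template and the Pieces documentation for the authoritative version.

# Sketch of a minimal Piece, assuming the structure from the Pieces repository template.
# The import path for BasePiece and the file layout are assumptions, not verified here.

# models.py: input and output data models
from pydantic import BaseModel, Field

class InputModel(BaseModel):
    text: str = Field(description="Text to be transformed")

class OutputModel(BaseModel):
    uppercase_text: str = Field(description="The input text in upper case")

# piece.py: the Piece itself
from domino.base_piece import BasePiece  # assumed import path

class UppercaseTextPiece(BasePiece):
    def piece_function(self, input_data: InputModel) -> OutputModel:
        # Any Python logic can go here, from a one-liner to a heavy ML job
        return OutputModel(uppercase_text=input_data.text.upper())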

Read more in the Pieces documentation.


REST

The Backend service is a REST API that controls a running Apache Airflow instance. It is responsible for:

  • executing operations requested by the frontend service
  • interacting with the Airflow instance, including triggering, creating, editing and deleting Workflows (DAGs)
  • interacting with the Domino Database

The REST service is written in Python, using the FastAPI framework. Read more about it in the REST documentation.
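
As a purely illustrative sketch of what "controlling a running Airflow instance" can look like, the snippet below shows a FastAPI endpoint that triggers a DAG run through Airflow's stable REST API. The route, payload and Airflow address are assumptions for illustration, not Domino's actual endpoints, and authentication is omitted.

# Illustrative only: not Domino's actual API. Shows the general pattern of a
# FastAPI service that drives Airflow via its stable REST API.
import httpx
from fastapi import FastAPI

app = FastAPI()

AIRFLOW_URL = "http://airflow-webserver:8080"  # hypothetical address, auth omitted

@app.post("/workflows/{workflow_id}/runs")  # hypothetical route
async def trigger_workflow(workflow_id: str):
    # A Workflow maps to an Airflow DAG, so triggering it means creating a DAG run
    async with httpx.AsyncClient() as client:
        response = await client.post(
            f"{AIRFLOW_URL}/api/v1/dags/{workflow_id}/dagRuns",
            json={"conf": {}},
        )
    response.raise_for_status()
    return response.json()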


Credits

Domino is developed and maintained by Tauffer Consulting.