Easy-to-use Airflow & gRPC starter project.
Clone me and get ready to schedule your first gRPC requests!
- `proto/`: Dummy protobuf service definition (with a simple "ping" service)
- `src/server`: gRPC server implementation in Python
- `src/airflow`: Airflow DAG implementation (gets copied to the docker-compose container)
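For reference, here is a minimal sketch of what such a "ping" proto definition might look like; the file name, package, and message/service names are assumptions, not necessarily the ones used in `proto/`:

```proto
// Illustrative sketch of a simple "ping" service definition.
syntax = "proto3";

package ping;

message PingRequest {
  string message = 1;
}

message PingResponse {
  string message = 1;
}

service PingService {
  rpc Ping (PingRequest) returns (PingResponse);
}
```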
Airflow is started using docker-compose, with the scheduler and the webserver started as two separate services.
Python dependencies are added to the Airflow image using a custom Dockerfile.
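As an illustration, such a Dockerfile might look roughly like this; the base image tag and the dependency list are assumptions, not the exact contents of this repo:

```dockerfile
# Illustrative sketch of a custom Airflow image (not the repo's actual Dockerfile).
FROM apache/airflow:2.7.3

# Extra Python dependencies needed by the DAGs, e.g. the gRPC provider
# and the runtime for the generated proto stubs.
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```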
To allow Airflow to communicate with the gRPC server that lives on the host, a `dockerhost` service is used: it proxies requests from the containers to the host.
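A sketch of what such a service can look like in `docker-compose.yml`; the image and options are assumptions based on a common docker-host setup, not necessarily this repo's exact configuration:

```yaml
# Illustrative excerpt, assuming the qoomon/docker-host image.
services:
  dockerhost:
    image: qoomon/docker-host
    cap_add: [NET_ADMIN, NET_RAW]  # required by docker-host to forward traffic
    restart: on-failure
```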
- Generate the proto files: `./generate-protos.sh`
- Start Airflow using docker-compose (the Airflow web UI will be at `localhost:8084`): `docker-compose up -d`
- Create a new gRPC connection. In the web UI, go to Admin -> Connections -> Create, and fill in the fields as follows:

  | Field Name | Value | Description |
  |---|---|---|
  | Conn id | `grpc_default` | Connection ID. Should match the ID specified in your gRPC operator, in `src/airflow/dags/ping_grpc.py` (sketched after these steps). |
  | Conn type | GRPC Connection | Connection type. |
  | Host | `dockerhost` | gRPC server host. In this setup, the `dockerhost` service in `docker-compose.yml` proxies requests from Docker to the host (where the server lives). In a real-world setup, you may want to set it to a DNS name or an IP. |
  | Port | `3170` | gRPC server port. |
  | Extra | `{"extra__grpc__auth_type": "NO_AUTH"}` | Extra configuration. Specifies the authentication mechanism; in this case, it creates an insecure channel (no TLS). You may want to use other settings, as listed in the Airflow gRPC provider documentation. |
  | Grpc Auth Type | NO_AUTH | gRPC auth type. Should match the setting above. |
- Start the gRPC server (see the server sketch below): `./start-server.sh`
- Manually trigger the `ping_grpc` DAG in Airflow (a sketch follows below) and check the output in the DAG run logs: you should see the gRPC server's response!
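For reference, a DAG using this connection could look roughly like the sketch below. This is not the repo's actual `ping_grpc.py`; the generated module names (`ping_pb2`, `ping_pb2_grpc`) and the service/message names are assumptions carried over from the proto sketch above.

```python
# Illustrative sketch of src/airflow/dags/ping_grpc.py (Airflow 2.x style).
from datetime import datetime

from airflow import DAG
from airflow.providers.grpc.operators.grpc import GrpcOperator

# Assumed names for the generated proto stubs; see the proto sketch above.
from ping_pb2 import PingRequest
from ping_pb2_grpc import PingServiceStub

with DAG(
    dag_id="ping_grpc",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # manually triggered
    catchup=False,
) as dag:
    ping = GrpcOperator(
        task_id="ping",
        grpc_conn_id="grpc_default",  # must match the Conn id created above
        stub_class=PingServiceStub,
        call_func="Ping",  # RPC method name, looked up on the stub
        data={"request": PingRequest(message="hello")},
        log_response=True,  # print the server's reply in the task logs
    )
```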
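Similarly, the server started by `./start-server.sh` could be as simple as this sketch; again, the module, class, and message names are assumptions matching the proto sketch, not necessarily the code in `src/server`:

```python
# Illustrative sketch of a minimal gRPC "ping" server.
from concurrent import futures

import grpc

import ping_pb2
import ping_pb2_grpc


class PingService(ping_pb2_grpc.PingServiceServicer):
    def Ping(self, request, context):
        # Echo the message back so the DAG run logs have something to show.
        return ping_pb2.PingResponse(message=f"pong: {request.message}")


def serve() -> None:
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    ping_pb2_grpc.add_PingServiceServicer_to_server(PingService(), server)
    server.add_insecure_port("[::]:3170")  # port used in the connection above
    server.start()
    server.wait_for_termination()


if __name__ == "__main__":
    serve()
```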