This project contains a docker-compose file and some Docker configuration to deploy an Airflow server with the LocalExecutor for testing/development.
- Python 3.8+
- Docker and Docker Compose
- the Python requirements, installed with:

  ```bash
  pip install -r requirements.txt
  ```
To contribute to this project, activate pre-commit so that the linters run on each new commit:

```bash
pre-commit install
```
To install additional Python packages and use them in the DAGs, add them to the requirements.txt file and update the Dockerfile as needed.
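For example, a minimal DAG dropped into the dags folder could use a package declared in requirements.txt (a sketch; the `requests` dependency and the DAG name below are hypothetical, not part of this repository):

```python
# dags/example_requests_dag.py -- hypothetical example.
# Assumes `requests` was added to requirements.txt and baked into the image.
from datetime import datetime

import requests
from airflow.decorators import dag, task


@dag(schedule=None, start_date=datetime(2023, 1, 1), catchup=False)
def example_requests_dag():
    @task
    def fetch_status() -> int:
        # Any package installed from requirements.txt is importable inside tasks.
        return requests.get("https://example.com").status_code

    fetch_status()


example_requests_dag()
```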
In this version of the project, the Airflow webserver credentials are configured in the Airflow docker-compose file as environment variables, which you can update with your own user information:
```yaml
_AIRFLOW_WWW_USER_USERNAME: airflow_user
_AIRFLOW_WWW_USER_FIRSTNAME: Airflow
_AIRFLOW_WWW_USER_LASTNAME: Admin
_AIRFLOW_WWW_USER_EMAIL: airflowadmin@example.com
_AIRFLOW_WWW_USER_ROLE: Admin
_AIRFLOW_WWW_USER_PASSWORD: airflow_password
```
Before deploying the server, make sure the dags, db, logs, and scripts folders exist, since they are mounted into some of the Docker services.
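If any of them are missing, you can create them up front, for instance with this small convenience sketch (the folder names come from the sentence above; `mkdir -p dags db logs scripts` does the same from a shell):

```python
# create_airflow_dirs.py -- convenience sketch, not part of this repo.
from pathlib import Path

# Folders mounted into the Airflow containers by the compose file.
for name in ("dags", "db", "logs", "scripts"):
    Path(name).mkdir(exist_ok=True)
```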
Then deploy the server:

```bash
# choose your airflow version
export AIRFLOW_VERSION=2.4.1
invoke compose.up-airflow --build
```
Finally, open http://localhost:8080 in the browser, enter the username and password you configured above, and click Log in.
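If you would rather check the deployment from the command line, a small script can authenticate against the Airflow REST API with the same credentials (a sketch; it assumes the webserver is reachable on localhost:8080 and that the basic-auth API backend is enabled, as in the official Airflow compose example):

```python
# check_airflow_login.py -- hypothetical helper, not part of this repo.
import requests

AIRFLOW_URL = "http://localhost:8080"
USERNAME = "airflow_user"      # _AIRFLOW_WWW_USER_USERNAME
PASSWORD = "airflow_password"  # _AIRFLOW_WWW_USER_PASSWORD

# /api/v1/dags requires authentication, so a 200 response confirms the credentials work.
response = requests.get(f"{AIRFLOW_URL}/api/v1/dags", auth=(USERNAME, PASSWORD))
response.raise_for_status()
print(f"Login OK, {response.json()['total_entries']} DAG(s) found")
```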
To stop the server, you have several options:
- Stop the containers without deleting them: `invoke compose.stop`
- Delete the containers: `invoke compose.down`
- Delete the containers and their volumes: `invoke compose.down --volumes`