Welcome to the repository of the AIDE federated learning project! 🚀
Start by cloning this repository to your local machine or a designated testbed server:
```
git clone <repository-url>
```
To set up your environment, either run the Docker containers locally or deploy the system on a (managed) Kubernetes (k8s) cluster.
Docker provides a consistent environment that simplifies dependency management. Follow these steps to set up using Docker:
- **Pull Docker Images**

  Pull the pre-built Docker images for both the server and the clients from the registry:

  ```
  docker pull [TODO build & push cnl image to github]   # Server
  docker pull [TODO build & push cnl image to github]   # Client
  ```
- **Run Server Container**

  Start the server in a Docker container, exposing the necessary ports and mounting volumes if required:

  ```
  docker run -p 8080:8080 -v /path/to/data:/app/dataset --network host \
    gitlab.ilabt.imec.be:4567/aide-fl/aide-infra/server \
    --experiment <experiment-name> \
    --server 0.0.0.0:8080
  ```

  This command runs the server, maps port 8080, and mounts the data directory from the host into the container. A sketch of running the server detached and following its logs is shown after this list.
- **Run Client Containers**

  To connect clients to the server, run the client containers:

  ```
  docker run -v /path/to/data:/app/dataset --network host \
    gitlab.ilabt.imec.be:4567/aide-fl/aide-infra/client \
    --cid <client-id> \
    --experiment <experiment-name> \
    --server <server-address>
  ```

  Use `--network host` to ensure proper network configuration for local testing, and adjust the server address as needed. A sketch for starting several clients at once follows this list.
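For local testing it can help to run the server detached and follow its logs. This is only a minimal sketch: the container name `fl-server` is an illustrative choice, and `<experiment-name>` must be replaced as above:

```
# Run the server detached under an illustrative name, then follow its logs
docker run -d --name fl-server \
  -p 8080:8080 -v /path/to/data:/app/dataset --network host \
  gitlab.ilabt.imec.be:4567/aide-fl/aide-infra/server \
  --experiment <experiment-name> \
  --server 0.0.0.0:8080

docker logs -f fl-server
```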
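Federated experiments typically need more than one client. A small shell loop can start several clients with distinct IDs; this sketch assumes host networking, a server reachable at 127.0.0.1:8080, and an arbitrarily chosen client count:

```
# Start three clients with distinct --cid values against the local server
for i in 1 2 3; do
  docker run -d -v /path/to/data:/app/dataset --network host \
    gitlab.ilabt.imec.be:4567/aide-fl/aide-infra/client \
    --cid "$i" \
    --experiment <experiment-name> \
    --server 127.0.0.1:8080
done
```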
If you prefer not to run the Docker containers locally but to deploy them directly onto k8s, there are two options:

- Option 1 (recommended): CloudNativeLab
- Option 2: local k8s (microk8s, kubeadm)
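For either option, a quick way to try the images on an existing cluster is to start the server image as a pod and expose its port. This is only a rough sketch using plain kubectl: the resource name `fl-server` is illustrative, real deployments will typically use proper manifests, and the cluster may additionally need an image pull secret for the GitLab registry:

```
# Illustrative only: run the server image as a pod and expose port 8080
kubectl run fl-server \
  --image=gitlab.ilabt.imec.be:4567/aide-fl/aide-infra/server \
  --port=8080 \
  -- --experiment <experiment-name> --server 0.0.0.0:8080

kubectl expose pod fl-server --port=8080
kubectl get pods -w   # wait until the server pod is Running
```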
Monitor training progress and visualize metrics with TensorBoard:
```
tensorboard --logdir=logs
```
Note, however, that TensorBoard logs are not available for every experiment ❗
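If TensorBoard is available for your experiment and you are working on a remote testbed server, you can bind it to all interfaces so it is reachable from your workstation; the port below is an arbitrary choice and the log directory depends on where your experiment writes its logs:

```
# Serve TensorBoard on all interfaces so it is reachable remotely
tensorboard --logdir=logs --port 6006 --bind_all
```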
With these options, you can choose the setup method that best fits your needs. Enjoy exploring Federated Learning with Flower! 🌸