Welcome to the AWS-Powered Web Architecture with Automated Backup System project! 🚀
In this project, I set out to deepen my expertise in Amazon Web Services (AWS) by designing and implementing a resilient and scalable web architecture. The focus was on combining AWS services such as EC2, RDS, S3, API Gateway, Lambda, and SNS with tools like Docker, Django, and Redis to create a robust and efficient system.
The architecture is designed to handle web traffic efficiently and ensure high availability:
- NGINX serves as the web server and reverse proxy, forwarding incoming traffic to the application.
- Django (running on Gunicorn) handles incoming requests, with the application running inside a Docker container for consistency across different environments.
- Docker Compose is used to manage the Docker containers, ensuring a seamless deployment process.
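As an illustration of the Gunicorn piece of this stack, a minimal `gunicorn.conf.py` might look like the sketch below; the port, worker count, and timeout are placeholder values, not the project's actual configuration:

```python
# gunicorn.conf.py -- illustrative Gunicorn configuration (values are placeholders)
bind = "0.0.0.0:8000"   # NGINX proxies requests to this port inside the container
workers = 3             # common rule of thumb: 2 * CPU cores + 1
timeout = 60            # seconds before a hung worker is recycled
accesslog = "-"         # write access logs to stdout so Docker can collect them
```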
Asynchronous tasks are a critical part of the architecture, managed through:
- Celery for task processing, interfacing with Redis as the message broker. This setup allows for efficient queuing and processing of tasks, improving the responsiveness of the application.
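A minimal sketch of this wiring, assuming a Django project whose settings module is `config.settings` (the module name and Redis URL are illustrative):

```python
# celery.py -- Celery application using Redis as the message broker
import os

from celery import Celery

# Make Django settings available to tasks (settings module name is hypothetical)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")

# "redis" resolves to the Redis service name in Docker Compose
app = Celery("config", broker="redis://redis:6379/0")
app.config_from_object("django.conf:settings", namespace="CELERY")  # read CELERY_* settings
app.autodiscover_tasks()  # pick up tasks.py modules from installed Django apps
```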
Data persistence and management are handled by:
- PostgreSQL hosted on Amazon RDS, providing a reliable and scalable database solution.
- Amazon S3 is utilized for storing unstructured data, static files, and backup files, offering robust storage with virtually unlimited scalability.
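A sketch of the relevant Django settings, assuming the `django-storages` package handles the S3 side (the database name, bucket, and environment variable names are illustrative):

```python
# settings.py (excerpt) -- PostgreSQL on RDS and file storage on S3
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME", "appdb"),  # hypothetical database name
        "USER": os.environ["DB_USER"],
        "PASSWORD": os.environ["DB_PASSWORD"],
        "HOST": os.environ["DB_HOST"],               # the RDS instance endpoint
        "PORT": "5432",
    }
}

# django-storages routes uploaded media files to the S3 bucket
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = os.environ["S3_BUCKET"]
```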
For external APIs and serverless tasks:
- Amazon API Gateway is used to define, secure, and monitor APIs.
- AWS Lambda runs serverless compute tasks, triggered by the API Gateway, allowing for scalable, on-demand processing without the need for managing infrastructure.
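With API Gateway's Lambda proxy integration, the function receives the HTTP request as an event dictionary and must return a response in a fixed shape. A minimal handler sketch (the echo logic is just a placeholder):

```python
# handler.py -- minimal Lambda handler behind an API Gateway proxy integration
import json

def lambda_handler(event, context):
    # API Gateway delivers the HTTP request as `event`; the body arrives as a string
    body = json.loads(event.get("body") or "{}")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"received": body}),
    }
```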
A robust notification system is implemented using:
- Amazon SNS for a pub/sub messaging system, enabling decoupled communication between services.
- Email notifications are triggered by specific application events, keeping users informed (see the publish sketch below).
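Publishing to a topic from application code is a single boto3 call; every subscriber of the topic (for example, an email endpoint) then receives the message. A sketch with a placeholder topic ARN:

```python
# notify.py -- publish an application event to an SNS topic
import boto3

sns = boto3.client("sns")

def notify(subject: str, message: str) -> None:
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:app-events",  # placeholder ARN
        Subject=subject,   # becomes the subject line for email subscribers
        Message=message,
    )
```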
A key feature of this project is the automated backup system for the 'notes' table in PostgreSQL:
- Celery Beat schedules tasks that are dispatched to Redis.
- Celery Workers process tasks, invoking the Amazon API Gateway, which triggers an AWS Lambda function.
- The Lambda function connects to Amazon RDS, retrieves data, and stores it in Amazon S3 as a JSON file.
- Upon successful completion, a message is published to an Amazon SNS topic, triggering an email notification to subscribed users.
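The Django-side half of this pipeline could look like the sketch below. The admin screenshots suggest the schedule is actually managed through django-celery-beat, so the static `CELERY_BEAT_SCHEDULE` shown here is just an illustrative equivalent, and the API Gateway URL is a placeholder:

```python
# tasks.py -- Celery task that kicks off a backup via API Gateway
import requests
from celery import shared_task

@shared_task(name="dump_notes_table")
def dump_notes_table():
    # POST to the API Gateway endpoint that fronts the backup Lambda
    resp = requests.post(
        "https://<api-id>.execute-api.<region>.amazonaws.com/prod/backup",  # placeholder
        timeout=30,
    )
    resp.raise_for_status()
    return resp.status_code
```

```python
# settings.py (excerpt) -- illustrative static schedule: run the backup nightly at 02:00
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "dump-notes-table": {
        "task": "dump_notes_table",
        "schedule": crontab(hour=2, minute=0),
    },
}
```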
The workflow of the backup process is illustrated as follows:
- Celery Beat dispatches scheduled tasks to Redis.
- Celery Workers retrieve and process tasks from Redis.
- Celery Workers execute tasks that invoke the Amazon API Gateway.
- The Amazon API Gateway triggers a function deployed on AWS Lambda.
- The Lambda function connects to Amazon RDS and retrieves data.
- The Lambda function stores the retrieved data in Amazon S3 as a JSON file.
- The Lambda function publishes a message to an Amazon SNS topic.
- Subscribed users receive an email notification upon completion.
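The Lambda side of this workflow (retrieve, store, notify) could be sketched as follows, assuming a psycopg2 Lambda layer and configuration via environment variables; the table columns, bucket, and variable names are illustrative, not the project's actual ones:

```python
# backup_lambda.py -- dump the notes table from RDS to S3, then notify via SNS
import json
import os
from datetime import datetime, timezone

import boto3
import psycopg2  # assumed to be provided by a Lambda layer

s3 = boto3.client("s3")
sns = boto3.client("sns")

def lambda_handler(event, context):
    # 1. Fetch all rows from the notes table on RDS
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, title, body, created_at FROM notes")  # hypothetical columns
        rows = [
            {"id": r[0], "title": r[1], "body": r[2], "created_at": str(r[3])}
            for r in cur.fetchall()
        ]
    conn.close()

    # 2. Store the dump in S3 as a timestamped JSON file
    key = f"backups/notes-{datetime.now(timezone.utc):%Y%m%dT%H%M%S}.json"
    s3.put_object(
        Bucket=os.environ["S3_BUCKET"],
        Key=key,
        Body=json.dumps(rows),
        ContentType="application/json",
    )

    # 3. Publish a completion message; SNS fans it out to email subscribers
    sns.publish(
        TopicArn=os.environ["SNS_TOPIC_ARN"],
        Subject="notes backup completed",
        Message=f"Backed up {len(rows)} rows to s3://{os.environ['S3_BUCKET']}/{key}",
    )
    return {"statusCode": 200, "body": json.dumps({"rows": len(rows), "key": key})}
```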
Below are some images demonstrating the key components and workflow of the project:
This diagram illustrates the overall architecture of the AWS-powered web application. It shows the interaction between different components such as NGINX, Docker, Django, Celery, Redis, Amazon RDS, Amazon S3, Amazon API Gateway, AWS Lambda, and Amazon SNS. The diagram visualizes how external users interact with the application through NGINX, which forwards requests to Django running on Gunicorn within a Docker container. Celery is used for asynchronous task processing, with Redis as the message broker. Data is managed using PostgreSQL on Amazon RDS and stored in Amazon S3. API requests are handled by Amazon API Gateway and processed by AWS Lambda. The architecture also includes an automated backup system, where the results of SQL queries are stored in Amazon S3, and notifications are sent via Amazon SNS.
This screenshot shows the Django administration interface where a note is being added.
This screenshot displays the Django administration interface for managing periodic tasks. The highlighted task, `dump_notes_table`, is responsible for periodically backing up data from the `notes` table in PostgreSQL to Amazon S3. Although this task normally runs on a schedule, it is being executed manually for demonstration purposes.
This screenshot shows the Amazon S3 interface, displaying the backup files generated from the `notes` table.
This screenshot shows the content of one of the JSON backup files stored in Amazon S3.
*This screenshot shows an email notification received from Amazon SNS, confirming the successful completion of the backup operation.*
This project was driven by my passion for leveraging AWS services to create scalable, efficient, and resilient web architectures. By implementing this solution, I've gained hands-on experience with various AWS components, enhancing my understanding of cloud infrastructure.
I invite you to explore the project's code on GitHub and welcome any constructive feedback to further refine and enhance my work.
Feel free to reach out if you have any questions, suggestions, or potential collaborations in mind. Let's build something amazing together!
Thank you for visiting the repository! 😊