A scalable AWS streaming application for real-time reporting of car movements. The application gathers the latitude and longitude of moving cars and persists the data into AWS S3 and Redshift so that data analysts can monitor car movements and other patterns.
- Find the detailed documentation of the steps here.
- Write a script to generate real-time vehicle data (a producer sketch is shown after the Redshift setup below).
- Write a script or create a Lambda function to load the data into S3 (a Lambda sketch is also shown below).
- Send the data to Redshift:
```sql
-- Map the Kinesis stream into Redshift through an external schema.
CREATE EXTERNAL SCHEMA streamdataschema
FROM KINESIS
IAM_ROLE 'arn:aws:iam::533267024701:role/redshiftkinesisrole';

-- Materialized view over the stream; the payload is parsed from the raw Kinesis bytes.
CREATE MATERIALIZED VIEW devicedataview AS
SELECT approximate_arrival_timestamp,
       partition_key,
       shard_id,
       sequence_number,
       json_parse(from_varbyte(kinesis_data, 'utf-8')) AS payload
FROM streamdataschema."d2d-app-kinesis-stream";

-- Pull new records from the stream and inspect them.
REFRESH MATERIALIZED VIEW devicedataview;
SELECT * FROM devicedataview;
```
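For the data-generation step above, a minimal producer sketch is shown below. It assumes the stream name `d2d-app-kinesis-stream` from the materialized view definition; the vehicle IDs, coordinate ranges, and payload field names are illustrative assumptions, not the project's actual schema.

```python
import json
import random
import time

import boto3

# Stream name taken from the materialized view definition above.
STREAM_NAME = "d2d-app-kinesis-stream"

kinesis = boto3.client("kinesis")


def generate_record(vehicle_id: str) -> dict:
    """Simulate one GPS reading for a vehicle (illustrative field names and ranges)."""
    return {
        "vehicle_id": vehicle_id,
        "latitude": round(random.uniform(-1.95, -1.90), 6),
        "longitude": round(random.uniform(30.05, 30.10), 6),
        "timestamp": int(time.time()),
    }


def main() -> None:
    vehicle_ids = [f"car-{i}" for i in range(1, 6)]
    while True:
        for vehicle_id in vehicle_ids:
            record = generate_record(vehicle_id)
            # Partition by vehicle so each car's readings stay ordered within a shard.
            kinesis.put_record(
                StreamName=STREAM_NAME,
                Data=json.dumps(record).encode("utf-8"),
                PartitionKey=vehicle_id,
            )
        time.sleep(1)


if __name__ == "__main__":
    main()
```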
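For the S3 loading step, one possible shape for the Lambda function is sketched below, assuming it is subscribed to the Kinesis stream as an event source; the bucket name, key prefix, and output layout are placeholders.

```python
import base64
import json
import os
import time

import boto3

s3 = boto3.client("s3")

# Placeholder bucket and prefix; replace with the values used in the project.
BUCKET = os.environ.get("BUCKET_NAME", "my-vehicle-data-bucket")
PREFIX = "vehicle-positions"


def lambda_handler(event, context):
    """Decode Kinesis records from the event and write them to S3 as one JSON-lines object."""
    records = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        records.append(json.loads(payload))

    if not records:
        return {"written": 0}

    key = f"{PREFIX}/{int(time.time())}.json"
    body = "\n".join(json.dumps(r) for r in records)
    s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
    return {"written": len(records), "key": key}
```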
AWS CODECOMMIT SETUP
Commit the local code to AWS CodeCommit:
- Set up an AWS CodeCommit IAM user with HTTPS Git credentials for AWS CodeCommit.
- Create a CodeCommit repo (not an ECR repo); a boto3 sketch follows this list.
- Copy the GitHub repo data to AWS CodeCommit.
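If you prefer scripting the repository creation over the console, a minimal boto3 sketch is shown below; the repository name and description are placeholders.

```python
import boto3

codecommit = boto3.client("codecommit")

# Placeholder repository name and description.
response = codecommit.create_repository(
    repositoryName="d2d-app-repo",
    repositoryDescription="Real-time vehicle data application",
)
print(response["repositoryMetadata"]["cloneUrlHttp"])
```

The printed HTTPS clone URL is what you pair with the Git credentials created for the IAM user.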
AWS CODEBUILD SETUP
- Prepare ECR for CodeBuild (see the sketch after this list).
- Set up CodeBuild.
- Set up IAM roles and permissions to allow CodeBuild to push Docker images to ECR. Note: the Docker image can be pushed to Docker Hub instead.
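A hedged boto3 sketch of the ECR and CodeBuild preparation is below; the repository name, CodeCommit clone URL, build image, and service role ARN are placeholders, and the build commands themselves would typically live in a buildspec.yml in the repository.

```python
import boto3

ecr = boto3.client("ecr")
codebuild = boto3.client("codebuild")

# Placeholder ECR repository for the application image.
ecr.create_repository(repositoryName="d2d-app")

# Placeholder CodeBuild project that builds from the CodeCommit repo.
codebuild.create_project(
    name="d2d-app-build",
    source={
        "type": "CODECOMMIT",
        "location": "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/d2d-app-repo",
    },
    artifacts={"type": "NO_ARTIFACTS"},
    environment={
        "type": "LINUX_CONTAINER",
        "image": "aws/codebuild/standard:7.0",
        "computeType": "BUILD_GENERAL1_SMALL",
        # privilegedMode is required so the build can run Docker and push to ECR.
        "privilegedMode": True,
    },
    serviceRole="arn:aws:iam::123456789012:role/codebuild-d2d-app-role",
)
```

The service role attached here is what needs the ECR push permissions mentioned in the last bullet.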
AWS CODEDEPLOY SETUP
CodeDeploy takes the Docker image created in the CodeBuild stage and deploys it to ECS.
- Make the ECS service infrastructure available.
- Create the deploy stage.
- Run the pipeline by making changes in the local repo and pushing to CodeCommit (or start it manually, as sketched below).
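Pushing a commit to CodeCommit is the normal trigger; to re-run the pipeline without a new commit, a one-line boto3 call also works (the pipeline name is a placeholder):

```python
import boto3

codepipeline = boto3.client("codepipeline")

# Placeholder pipeline name; manually re-runs the full pipeline.
codepipeline.start_pipeline_execution(name="d2d-app-pipeline")
```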
```bash
# Build a Lambda layer containing psycopg2 for Python 3.8 (run on Amazon Linux 2).
sudo amazon-linux-extras install python3.8

# get-pip.py must be downloaded first, e.g. from https://bootstrap.pypa.io/get-pip.py
python3.8 get-pip.py --user

# Install psycopg2 into a python/ directory, the layout Lambda layers expect.
sudo python3.8 -m pip install psycopg2-binary -t python/

# Zip the directory so it can be published as a layer.
zip -r dependencies.zip python
```
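The zip produced above can be published as a Lambda layer so a function can reach Redshift with psycopg2. A minimal sketch of such a function is below; the connection details come from environment variables and are placeholders, while `devicedataview` is the materialized view created earlier.

```python
import os

import psycopg2  # provided by the Lambda layer built above


def lambda_handler(event, context):
    """Refresh and query the streaming materialized view in Redshift (placeholder connection details)."""
    conn = psycopg2.connect(
        host=os.environ["REDSHIFT_HOST"],
        port=int(os.environ.get("REDSHIFT_PORT", "5439")),
        dbname=os.environ["REDSHIFT_DB"],
        user=os.environ["REDSHIFT_USER"],
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    conn.autocommit = True
    try:
        with conn.cursor() as cur:
            cur.execute("REFRESH MATERIALIZED VIEW devicedataview;")
            cur.execute("SELECT payload FROM devicedataview LIMIT 10;")
            rows = cur.fetchall()
        return {"rows": len(rows)}
    finally:
        conn.close()
```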
A view of data loaded into Redshift:
- Before sending the data to Redshift:
  - Read the data into a CSV.
  - Do visualization.
- Finish the CodePipeline part.
👤 Niyomukiza Thamar
- GitHub: Niyomukiza Thamar
- LinkedIn: Niyomukiza Thamar
Give a ⭐ if you like this project!