Commit: Initial commit
acbhatt12 committed Sep 15, 2023
0 parents commit 066de15
Showing 171 changed files with 6,174 additions and 0 deletions.
3 changes: 3 additions & 0 deletions .gitlab-ci.yml
@@ -0,0 +1,3 @@
---
include:
- local: gitlab-ci/test-pipeline/entrypoint-tests.yml
4 changes: 4 additions & 0 deletions CODE_OF_CONDUCT.md
@@ -0,0 +1,4 @@
## Code of Conduct
This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct).
For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact
opensource-codeofconduct@amazon.com with any additional questions or comments.
59 changes: 59 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,59 @@
# Contributing Guidelines

Thank you for your interest in contributing to our project. Whether it's a bug report, new feature, correction, or additional
documentation, we greatly value feedback and contributions from our community.

Please read through this document before submitting any issues or pull requests to ensure we have all the necessary
information to effectively respond to your bug report or contribution.


## Reporting Bugs/Feature Requests

We welcome you to use the GitHub issue tracker to report bugs or suggest features.

When filing an issue, please check existing open, or recently closed, issues to make sure somebody else hasn't already
reported the issue. Please try to include as much information as you can. Details like these are incredibly useful:

* A reproducible test case or series of steps
* The version of our code being used
* Any modifications you've made relevant to the bug
* Anything unusual about your environment or deployment


## Contributing via Pull Requests
Contributions via pull requests are much appreciated. Before sending us a pull request, please ensure that:

1. You are working against the latest source on the *main* branch.
2. You check existing open, and recently merged, pull requests to make sure someone else hasn't addressed the problem already.
3. You open an issue to discuss any significant work - we would hate for your time to be wasted.

To send us a pull request, please:

1. Fork the repository.
2. Modify the source; please focus on the specific change you are contributing. If you also reformat all the code, it will be hard for us to focus on your change.
3. Ensure local tests pass.
4. Commit to your fork using clear commit messages.
5. Send us a pull request, answering any default questions in the pull request interface.
6. Pay attention to any automated CI failures reported in the pull request, and stay involved in the conversation.

GitHub provides additional documentation on [forking a repository](https://help.github.com/articles/fork-a-repo/) and
[creating a pull request](https://help.github.com/articles/creating-a-pull-request/).


## Finding contributions to work on
Looking at the existing issues is a great way to find something to contribute to. As our projects, by default, use the default GitHub issue labels (enhancement/bug/duplicate/help wanted/invalid/question/wontfix), issues labelled 'help wanted' are a great place to start.


## Code of Conduct
This project has adopted the [Amazon Open Source Code of Conduct](https://aws.github.io/code-of-conduct).
For more information see the [Code of Conduct FAQ](https://aws.github.io/code-of-conduct-faq) or contact
opensource-codeofconduct@amazon.com with any additional questions or comments.


## Security issue notifications
If you discover a potential security issue in this project, we ask that you notify AWS/Amazon Security via our [vulnerability reporting page](http://aws.amazon.com/security/vulnerability-reporting/). Please do **not** create a public GitHub issue.


## Licensing

See the [LICENSE](LICENSE) file for our project's licensing. We will ask you to confirm the licensing of your contribution.
17 changes: 17 additions & 0 deletions LICENSE
@@ -0,0 +1,17 @@
MIT No Attribution

Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.

Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

109 changes: 109 additions & 0 deletions README.md
@@ -0,0 +1,109 @@
# DevOps Pipeline Accelerator (DPA)

This repository contains the code and resources for building centralised templates for your infrastructure deployment pipelines, covering several Infrastructure as Code (IaC) tools. Teams can consume these templates to orchestrate their infrastructure build and deployment pipelines with the standard baseline configuration.
This lets teams focus on developing features rather than on building deployment pipelines.

The DPA solution is built around three pillars: entrypoints, aggregators and stages.
1. The entrypoint is the configuration file you use to customise the default pipeline to the requirements of each application. It is the only file that resides in your application repository; all other files and configurations described in this solution reside in the common (pipeline) repository.

## Table of contents
* [Principles of DPA](#principles)
* [Prerequisites](#prerequisites)
* [Architecture](#architecture)
* [Deployment](#deployment)
* [Benefits](#benefits)
* [Limitations](#limitations)
* [Next Steps (Backlog)](#backlog)
* [Access Management](#access-management)
* [Examples](#examples)

## Principles
1. Deployments to environments must be consistent and use the same artifacts.
2. Each pipeline job should run in a specific Docker container.
3. DPA is designed to work with a feature-branch-based branching model.

DPA is made up of the following modules:

### Entrypoints
An entrypoint in DPA represents the starting point of a specific IaC pipeline that an application consumes; it consists of aggregators and various stages.
It is the configuration file you add to your application repository and use to customise the pre-defined stages.

***Example Entrypoint files :***
1. https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/examples/aws_codepipeline/terraform/entrypoint/terraform-infrastructure.json
2. https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/gitlab-ci/entrypoints/terraform-infrastructure.yml
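For illustration, an application repository's `.gitlab-ci.yml` consumes an entrypoint via an `include` (this is a sketch; the `project` path is hypothetical, while the entrypoint file path matches the second example above):

```yaml
# Application repository .gitlab-ci.yml: pull in a DPA entrypoint
# hosted in the common pipeline repository.
---
include:
  - project: my-group/pipeline-repo   # hypothetical common-repo path
    file: gitlab-ci/entrypoints/terraform-infrastructure.yml
```

The repository's own `.gitlab-ci.yml` in this commit uses the `local:` form of `include` for the same purpose.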

### Aggregators
Aggregators in DPA are collections of jobs managed by stages; multiple such wrappers form the entrypoint for a specific IaC pipeline.

### Stages
Stages contain the actual building blocks: the jobs that run inside each stage. Each stage represents a specific phase of pipeline execution.
***The default stages provisioned for a single environment (dev) are listed below.*** You can customise the pre-defined stages in the entrypoint configuration described above; this is explained in detail for each deployment type in the [Deployment](#deployment) section [below](#deployment).

The pipeline stops on the first failure in any of the stages below; fix the failure and re-run the pipeline to proceed.

1. Init
2. Build
3. Test
4. PreDeploy, Package, Publish
5. Deploy
6. Verify
7. Manual Approval
8. Destroy
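As a sketch of how stages are toggled, the entrypoint configuration uses string boolean flags per stage (key names taken from the CodePipeline example `config.json` shown later in this commit; treat the selection as illustrative):

```json
{
  "init_stage_required": "true",
  "test_stage_required": "false",
  "createinfra_stage_required": "true"
}
```

A disabled stage is skipped at run time rather than removed from the pipeline definition.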

## Prerequisites

1. An AWS account with the necessary permissions/roles to provision resources using IaC templates
2. A Docker image created and pushed to Amazon ECR as [outlined here](https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/shared-docker/docker-images/README.md)

## Architecture
<img width="947" alt="image" src="https://github.com/aws-samples/aws-devops-pipeline-accelerator/assets/106240341/217c927d-3e1d-4f95-8cce-34560da9ea0d">

## Deployment
1. [AWS CodePipeline for deploying Terraform resources](https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/aws-codepipeline/terraform/README.md)
2. [AWS CodePipeline for deploying CloudFormation resources](https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/aws-codepipeline/cloudformation/README.md)
3. [Gitlab CI for deploying Terraform, CDK and CloudFormation resources](https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/gitlab-ci/README.md)

## Benefits
1. ***Standardisation & consistency***: a standardised pipeline for any type of infrastructure brings consistency to infrastructure deployment.
2. ***Reusability***: the entire solution is reusable and scalable; the accelerators can be consumed to orchestrate pipelines.
3. ***Velocity***: application teams can focus on developing the application rather than building pipelines, improving team velocity.
4. ***Security***: built-in quality gates secure infrastructure deployment following DevSecOps concepts.
5. ***Scalability***: the outcome is a set of configurable templates that are highly scalable and integrate easily with any infrastructure that supports a CI/CD pipeline on the chosen platform.

## Limitations
1. The first release provides reusable code only for the AWS CodePipeline and GitLab CI platforms.

## Backlog
* Workflow templates for infrastructure-based applications on Amazon CodeCatalyst, GitHub Actions and Jenkins
* Java application pipeline templates that deploy to container ecosystems such as ECS and EKS
* Frontend (UI) application pipeline templates for deployment to AWS environments
* Multi-account and cross-region provisioning and deployment
* CodeBuild artifact KMS encryption using a CMK

## Access Management
There are several aspects to access when working with or using DPA templates:
1. Role-based access should be configured for pipeline visibility and workflow execution, irrespective of platform.
2. A service account should be used for pipeline execution and deployment to cloud environments. [Example policy](https://docs.aws.amazon.com/codepipeline/latest/userguide/security-iam-id-policies-examples.html)
3. Pipelines use least-privilege IAM roles to access AWS resources such as ECR Docker images, SNS and distribution lists during job execution.
4. Pipelines should be protected with limited access to prevent unwanted updates to pipeline configuration or deletion.
5. The GitLab-to-AWS integration works via environment variables, configured and explained [here](https://github.com/aws-samples/aws-devops-pipeline-accelerator/blob/main/gitlab-ci/README.md#usage).
6. CodeBuild artifacts can be encrypted using a CMK, as [outlined here](https://docs.aws.amazon.com/codebuild/latest/userguide/security-encryption.html).
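As an illustrative sketch of point 5 (the variable names below are the standard AWS SDK/CLI environment variables; the exact integration is described in the linked GitLab README):

```yaml
# .gitlab-ci.yml fragment: AWS credentials are supplied as masked
# GitLab CI/CD variables (Settings > CI/CD > Variables), never committed.
variables:
  AWS_DEFAULT_REGION: us-east-1
verify-aws-access:
  script:
    # AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are picked up from the
    # environment automatically by the AWS CLI and SDKs.
    - aws sts get-caller-identity
```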

## Examples
An examples folder is included in this solution to showcase how to integrate the DPA solution into your repositories.
As these are examples and not part of the solution itself, they contain only basic usage of resources such as EC2 and S3. Follow best practices when creating any resource in your AWS account:
* [S3 bucket best practices](https://docs.aws.amazon.com/AmazonS3/latest/userguide/security-best-practices.html)
* [EC2 best practices](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-best-practices.html)
* [Terraform backend](https://developer.hashicorp.com/terraform/language/settings/backends/s3)

## Contributing
See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information.

## License
This library is licensed under the MIT-0 License. See the LICENSE file.

## Contributors
* Ashish Bhatt
* Eknaprasath P
* Mayuri Patil
* Ruchika Modi
59 changes: 59 additions & 0 deletions aws-codepipeline/cloudformation/README.md
@@ -0,0 +1,59 @@
# AWS CodePipeline for deploying CloudFormation resources
This folder contains the code and resources for building centralised templates for your infrastructure deployment pipeline using CloudFormation and AWS CodePipeline.

## Requirements
1. An AWS account with the necessary permissions/roles to create CodeCommit repositories and CodePipeline components.
2. To deploy this solution we suggest using two separate CodeCommit repositories:
   * The first repository contains the shared libraries, buildspec files and dependencies; let's call it the `common-repo`.
   * The second repository contains the CloudFormation templates that deploy your infrastructure; let's call it the `app-repo`.

## Usage
This folder contains two directories:
* `pipeline-modules`: the CloudFormation code that deploys the standardised pipeline
* `shared`: the ready-to-use buildspec files

To begin:
1. Create a folder named `shared` in the `common-repo` and copy all the buildspec files from this repository's `shared` folder into it.
2. In the `app-repo`, create a folder named `entrypoint` and copy the file `examples/aws_codepipeline/cloudformation/entrypoint/config.json` into it.
3. Refer to the `examples/aws_codepipeline/cloudformation` folder for the structure. Note that this example creates S3 buckets.
4. Clone this repository and use the templates under `pipeline-modules` to set up your pipeline.


## Explanation of the `config.json` file
This is the main configuration file, where you can customise and enable/disable stages. Note that disabling a stage only skips its execution; it does not remove the stage from the pipeline.
```json
{
  "init_stage_required": "true",
  "test_stage_required": "true",
  "createinfra_stage_required": "true",
  "envType": "cloudformation",
  "stage_required": "true",
  "cft_s3_bucket": "pipeline-bucket",
  "stack_name": "aws-cft-poc",
  "account": "************",
  "roleName": "codestack-poc-cross-account-role",
  "region": "us-east-1",
  "destroy_stack": "false"
}
```
JSON does not allow comments, so the keys are explained here instead:
* `cft_s3_bucket`: S3 bucket in the destination account that holds the CloudFormation templates
* `stack_name`: CloudFormation stack name
* `account`: destination AWS account in which to deploy the stack
* `roleName`: cross-account IAM role name
* `destroy_stack`: set to `"true"` to destroy the provisioned stack

## Creating the pipeline with all stages defined
Once the configuration is filled in, you are ready to create the pipeline using the `pipeline-cft.yaml` file. Deploy that CloudFormation template, passing the stack parameters listed below.

* `ArtifactsBucket`: name of the bucket where the pipeline artifacts are stored
* `EcrDockerRepository`: ECR Docker repository name for the CodeBuild image
* `CodeCommitAppRepo`: CodeCommit repository name that contains the templates
* `CodeCommitBaseRepo`: CodeCommit repository name that contains the shared files
* `CodeCommitRepoBranch`: CodeCommit repository branch name
* `SNSMailAddress`: email address that receives SNS notifications for pipeline approval
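These parameters can be supplied, for example, as an AWS CLI parameters file (the values below are placeholders):

```json
[
  { "ParameterKey": "ArtifactsBucket",      "ParameterValue": "my-pipeline-artifacts" },
  { "ParameterKey": "EcrDockerRepository",  "ParameterValue": "dpa-codebuild-image" },
  { "ParameterKey": "CodeCommitAppRepo",    "ParameterValue": "app-repo" },
  { "ParameterKey": "CodeCommitBaseRepo",   "ParameterValue": "common-repo" },
  { "ParameterKey": "CodeCommitRepoBranch", "ParameterValue": "main" },
  { "ParameterKey": "SNSMailAddress",       "ParameterValue": "team@example.com" }
]
```

Saved as `params.json`, this file can be passed to `aws cloudformation create-stack` via `--parameters file://params.json` alongside `--template-body file://pipeline-cft.yaml`.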


* After deployment, log in to the AWS account; you should see the new pipeline.
* Add the necessary permissions to the new IAM role created for CodeBuild. For example, the cross-account IAM role should have permission to create S3 buckets, as required by the CloudFormation template in the examples directory.
