feat: update core and dotnet infra docs (#569)
* update azure infra docs

* add aws infra docs

* fix: linting
saulfrance authored Nov 5, 2024
1 parent 0281e68 commit 8ae2c2a
Showing 15 changed files with 305 additions and 224 deletions.
58 changes: 58 additions & 0 deletions docs/infrastructure/aws/core_infrastructure.md
@@ -0,0 +1,58 @@
---
id: core_infrastructure_aws
title: AWS Core Infrastructure
sidebar_label: Core Infrastructure
description: How to bootstrap the AWS tenant
keywords:
- github actions
- workload
- pipeline
- pipeline template
- resources
---

import HideNavigation from "../../../src/components/HideNavigation/HideNavigation";
import useBaseUrl from '@docusaurus/useBaseUrl';

The core infrastructure is the foundation for all other Ensono Stacks Workloads. As this will, in most cases, be the first part of Ensono Stacks that you deploy, we will also cover bootstrapping your AWS tenant.

## Resources Provisioned

Both the diagram and resource list below are for a single environment. By default, the pipeline template will create two environments (nonprod and prod).

### Diagram

<img alt="AWS Core Infrastructure" src={useBaseUrl('img/aws_core_infrastructure.png')} />

### Resource List

| Resource | Description |
| --------------------- | --------------------------------------------------- |
| Virtual Private Cloud | Fundamental building block for the network |
| Public Subnet | Dedicated subnet required for Network Load Balancer |
| Network Load Balancer | Web traffic load balancer |
| Private Subnet | Subnet used by the EKS cluster |
| EKS | Amazon Elastic Kubernetes Service |
| Route 53 | Hosted service for DNS domain |
| IAM | Identity and access management |
| KMS | Cryptographic keys and secrets management service |

## Deploying

### Bootstrap the AWS tenant

This process only needs to be run once on an administrator's workstation.

The administrator will need the permissions that allow them to:

1. Create an IAM User for use with Terraform. This will require permissions to read and create all the core resources.
- Make note of the Access Key ID and Secret Access Key
2. Create an [S3 Bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) for storing Terraform state.
- Take note of the S3 bucket name.
3. Create a [DynamoDB](https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/getting-started-step-1.html) table for locking Terraform state (example commands for these steps are sketched below).
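
The sketch below shows one way these steps could be scripted with the AWS CLI. It is a minimal example only, assuming the CLI is installed and authenticated with sufficient privileges; the user, bucket, table, and region names are placeholders, and the IAM permissions policy (not shown) must still be attached to suit your account.

```bash
# Illustrative placeholder values - replace with names that suit your organisation.
export TF_USER=stacks-terraform
export TF_STATE_BUCKET=my-org-stacks-tfstate
export TF_STATE_TABLE=my-org-stacks-tflock
export AWS_REGION=eu-west-2

# 1. IAM user for Terraform (attach a suitable policy, then note the returned access keys).
aws iam create-user --user-name "${TF_USER}"
aws iam create-access-key --user-name "${TF_USER}"

# 2. S3 bucket for Terraform state, with versioning enabled so state history is retained.
aws s3api create-bucket \
  --bucket "${TF_STATE_BUCKET}" \
  --region "${AWS_REGION}" \
  --create-bucket-configuration LocationConstraint="${AWS_REGION}"
aws s3api put-bucket-versioning \
  --bucket "${TF_STATE_BUCKET}" \
  --versioning-configuration Status=Enabled

# 3. DynamoDB table for Terraform state locking (the S3 backend expects a "LockID" string key).
aws dynamodb create-table \
  --table-name "${TF_STATE_TABLE}" \
  --attribute-definitions AttributeName=LockID,AttributeType=S \
  --key-schema AttributeName=LockID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST \
  --region "${AWS_REGION}"
```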

### Pipelines

The following pipelines are currently supported for automating the deployment:

- [GitHub Actions](./pipelines/github_actions.md)
57 changes: 57 additions & 0 deletions docs/infrastructure/aws/pipelines/github_actions.md
@@ -0,0 +1,57 @@
---
id: github_actions
title: GitHub Actions Pipeline
sidebar_label: GitHub Actions
description: How to set up a pipeline in GitHub Actions
keywords:
- workload
- pipeline
- github actions
- pipeline template
---

import HideNavigation from "../../../../src/components/HideNavigation/HideNavigation";
import useBaseUrl from '@docusaurus/useBaseUrl';

The pipeline will automate provisioning and updating the core infrastructure in AWS. This page assumes you have already completed the steps on the [core infrastructure page](../core_infrastructure.md).

The AWS infrastructure source code can be found [here](https://github.com/Ensono/stacks-infrastructure-eks).

## Pipeline Diagram

### Feature branch -> Non-Prod sequence

<img alt="AWS Core - GitHub Actions Pipeline" src={useBaseUrl('img/core_pipeline_nonprod.png')} />

### Main branch -> Prod sequence

<img alt="AWS Core - GitHub Actions Pipeline" src={useBaseUrl('img/core_pipeline_prod.png')} />

## Setting up GitHub Actions

### Environment Secrets

Environment secrets need to be created to store the sensitive variables used by the pipeline. Instructions for creating environment secrets can be found [here](https://docs.github.com/en/actions/managing-workflow-runs-and-deployments/managing-deployments/managing-environments-for-deployment#environment-secrets).

Add the following secrets (a scripted alternative using the GitHub CLI is sketched after the table):

| Variable Name | Description | Required for |
| ------------------------ | ------------------------------------------------- | -------------------------- |
| AWS_ACCESS_KEY_ID | AWS IAM User Access Key ID | AWS Authentication |
| AWS_ACCOUNT_ID | AWS Account ID | AWS Authentication |
| AWS_SECRET_ACCESS_KEY | AWS IAM User Secret Access Key | AWS Authentication |
| AWS_TF_STATE_BUCKET | S3 Bucket name for Terraform state | Terraform State Management |
| AWS_TF_STATE_DYNAMOTABLE | DynamoDB Table name for Terraform state | Terraform State Management |
| AWS_TF_STATE_ENCRYPTION | Encrypt Terraform state. `true` or `false` | Terraform State Management |
| AWS_TF_STATE_KEY         | Unique name for this application's Terraform state | Terraform State Management |
| AWS_TF_STATE_REGION | AWS region | Terraform State Management |
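
If you prefer to script this step, the sketch below uses the GitHub CLI. It assumes the `gh` CLI is installed and authenticated against the repository, that the target environment (for example `nonprod`) already exists, and that every value shown is a placeholder.

```bash
# Placeholder values - repeat for each environment (for example nonprod and prod).
ENVIRONMENT=nonprod

gh secret set AWS_ACCESS_KEY_ID        --env "${ENVIRONMENT}" --body "<access key id>"
gh secret set AWS_ACCOUNT_ID           --env "${ENVIRONMENT}" --body "111122223333"
gh secret set AWS_SECRET_ACCESS_KEY    --env "${ENVIRONMENT}" --body "<secret access key>"
gh secret set AWS_TF_STATE_BUCKET      --env "${ENVIRONMENT}" --body "my-org-stacks-tfstate"
gh secret set AWS_TF_STATE_DYNAMOTABLE --env "${ENVIRONMENT}" --body "my-org-stacks-tflock"
gh secret set AWS_TF_STATE_ENCRYPTION  --env "${ENVIRONMENT}" --body "true"
gh secret set AWS_TF_STATE_KEY         --env "${ENVIRONMENT}" --body "core/terraform.tfstate"
gh secret set AWS_TF_STATE_REGION      --env "${ENVIRONMENT}" --body "eu-west-2"
```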

### Update pipeline template placeholders

Where possible, the Stacks CLI will have populated the correct values in the pipeline environment file `.github/workflows/infrastructure.env`. We strongly recommend that you go through the whole `.github/workflows` directory to make sure the values are correct for your project. Once you are happy with the template, commit the changes to your repository.

### Create the pipeline

Stacks generates the GitHub Actions workflow file in the directory expected by GitHub Actions, `.github/workflows`. Committing this file to the `main` branch of a GitHub repository will "create" the pipeline. Instructions for viewing the results can be found [here](https://docs.github.com/en/actions/writing-workflows/quickstart#viewing-your-workflow-results).
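
As a minimal sketch of that flow, assuming the generated workflow is already present locally and the `gh` CLI is available (the workflow file name below is a placeholder):

```bash
# Commit the generated workflow to the default branch to register the pipeline.
git add .github/workflows/
git commit -m "Add core infrastructure pipeline"
git push origin main

# Optionally follow the run from the terminal (workflow file name is a placeholder).
gh run list --workflow=infrastructure.yml
gh run watch
```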

<HideNavigation next />
80 changes: 0 additions & 80 deletions docs/infrastructure/azure/core_infrastructure.md
@@ -5,7 +5,6 @@ sidebar_label: Core Infrastructure
description: How to bootstrap the Azure tenant
keywords:
- azure devops
- scaffolding cli
- workload
- pipeline
- pipeline template
@@ -66,87 +65,8 @@ With owner privileges:
2. Create a [Blob Storage instance](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create) and [container](https://docs.microsoft.com/en-us/cli/azure/storage/container?view=azure-cli-latest#az_storage_container_create) for storing Terraform state.
- Take note of the storage account and container name (an example set of Azure CLI commands is sketched below).
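
A minimal sketch of this step with the Azure CLI is shown below. It assumes you are already logged in with sufficient privileges; the resource group, storage account, and container names, the location, and the SKU are illustrative placeholders.

```bash
# Illustrative placeholder values - replace to suit your organisation.
RESOURCE_GROUP=my-org-stacks-terraform
STORAGE_ACCOUNT=myorgstacksterraform
CONTAINER_NAME=tfstate
LOCATION=uksouth

# Resource group and storage account to hold the Terraform state.
az group create --name "${RESOURCE_GROUP}" --location "${LOCATION}"
az storage account create \
  --name "${STORAGE_ACCOUNT}" \
  --resource-group "${RESOURCE_GROUP}" \
  --location "${LOCATION}" \
  --sku Standard_LRS

# Blob container for the state files.
az storage container create \
  --name "${CONTAINER_NAME}" \
  --account-name "${STORAGE_ACCOUNT}"
```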

### Using the Scaffolding CLI

The [Ensono Stacks Scaffolding CLI](https://www.npmjs.com/package/@amidostacks/scaffolding-cli) can be used to create a project consisting of the core infrastructure as code scripts and the deployment pipeline.

We are supporting and running [node@12](https://nodejs.org/en/about/releases/).
Please ensure that your local environment has the correct version [installed](https://nodejs.org/en/download/).

To run the Scaffolding CLI, use the following command:

```bash
npx @amidostacks/scaffolding-cli@latest run -i
```

You will be asked a number of questions. Make sure to select `Azure` and `Cloud platform shared services`.

<!-- TODO: Example video here -->

### Pipelines

The following pipelines are currently supported for automating the deployment:

- [Azure DevOps](./pipelines/azure_devops.md)

### Running Locally

Currently, vars.tf and provider configuration is not
automatically updated. Future iterations will include this.

The safest way to run and maintain this locally is to rely on Docker and environment
variables as that is the way the pipeline will trigger the
executions of Terraform.

Sample commands with example environment vars:

```bash
# Navigate to the infra directory
cd ${YOUR_DIRECTORY_PATH}/deploy/azure/infra

# Run Ensono Terraform Docker container
docker run -v $(pwd):/usr/data --rm -it amidostacks/ci-tf:0.0.4 /bin/bash

###########################################################################
# All commands from this point should be executed in the Docker container #
###########################################################################

# Navigate to /usr/data directory
cd /usr/data

# Export Azure Credentials. Replace the example values.
export ARM_CLIENT_ID=1111-2222-3333-444 \
ARM_CLIENT_SECRET=1111-2222-3333-4444 \
ARM_SUBSCRIPTION_ID=1111-2222-3333-444 \
ARM_TENANT_ID=1111-2222-3333-444

# Export Terraform variables. Replace the example values.
export TF_VAR_resource_group_location=uksouth \
TF_VAR_name_company=ensono \
TF_VAR_name_project=example \
TF_VAR_name_component=core \
TF_VAR_name_environment=nonprod \
TF_VAR_create_acr=true \
TF_VAR_acme_email=example@ensonostacks.com \
TF_VAR_is_cluster_private=true \
TF_VAR_cluster_version=1.17.11 \
TF_VAR_stage=nonprod \
TF_VAR_key_vault_name=example-core-nonprod \
TF_VAR_dns_zone=nonprod.ensonostacks.com \
TF_VAR_internal_dns_zone=nonprod.ensonostacks.internal

# Initial Terraform. Replace the example values.
terraform init \
-backend-config="resource_group_name=amido-stacks-terraform" \
-backend-config="storage_account_name=amidostacksterraform" \
-backend-config="container_name=tfstate" \
-backend-config="key=example-core"

# Select or create the "nonprod" workspace.
terraform workspace select nonprod || terraform workspace new nonprod

# Check the plan matches your expected changes.
terraform plan
```
<HideNavigation prev />
54 changes: 17 additions & 37 deletions docs/infrastructure/azure/pipelines/azure_devops.md
@@ -10,69 +10,49 @@ keywords:
- pipeline template
---

import HideNavigation from "../../../../src/components/HideNavigation/HideNavigation";
import useBaseUrl from '@docusaurus/useBaseUrl';

The pipeline will automate provisioning and updating the core infrastructure in Azure. This page assumes you have already completed the steps on the [core infrastructure page](../core_infrastructure.md).

Where possible, we are creating reusable steps ([stacks-pipeline-templates](https://github.com/Ensono/stacks-pipeline-templates)) that can be pulled into any base pipeline. Reusable steps can include tasks to deploy, build, test and more.
Azure infrastructure source code can be found [here](https://github.com/Ensono/stacks-infrastructure-aks).
The Azure infrastructure source code can be found [here](https://github.com/Ensono/stacks-infrastructure-aks).

## Pipeline Diagram

<img alt="Azure Core - Azure DevOps Pipeline" src={useBaseUrl('img/azure_core_azure_devops_pipeline.png')} />
### Feature branch -> Non-Prod sequence

## Setting up Azure DevOps

### Service connection

A service connection will need to be configured to ensure you can pull in pipeline templates from the public repo. The service connection will need a [GitHub Personal Access Token](https://github.com/settings/tokens) (or credentials) to pull in the code. At a minimum, the access token will need to include:

* read:repo
<img alt="Azure Core - Azure DevOps Pipeline" src={useBaseUrl('img/core_pipeline_nonprod.png')} />

Once a token is generated, the service connection can be configured for the project. Instructions can be found [here](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#create-a-service-connection).
### Main branch -> Prod sequence

Finally, the repository resource endpoint value will need to be updated in the `build/azDevops/azure/infra-pipeline.yml` file.
<img alt="Azure Core - Azure DevOps Pipeline" src={useBaseUrl('img/core_pipeline_prod.png')} />

```yaml
resources:
repositories:
- repository: templates
type: github
name: amido/stacks-pipeline-templates
ref: refs/tags/v1.1.0 # Ensure the correct tag is referenced here to ensure version control
endpoint: amidostacks # Name of the service account created for the connection to GitHub from Azure DevOps
```
## Setting up Azure DevOps

### Variable groups

Variable groups need to be created to store the Azure credentials used by the pipeline. Instructions for creating a variable group can be found [here](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups?view=azure-devops&tabs=classic#create-a-variable-group).

Create a variable group for the nonprod infrastructure. Give the variable group a name and description and make sure the **Allow access to all pipelines** option is checked. Add the following variables using the Service Connection details from [bootstrapping the Azure tenant](../core_infrastructure.md#bootstrap-the-azure-tenant):
* azure_tenant_id
* azure_subscription_id
* azure_client_id
* azure_client_secret
<img alt="Azure Core Variable Group" src={useBaseUrl('img/azure_core_variable_group.png')} />
Create a variable group called `azure-sp-creds`, add a description and make sure the **Allow access to all pipelines** option is checked. Add the following variables using the Service Connection details from [bootstrapping the Azure tenant](../core_infrastructure.md#bootstrap-the-azure-tenant):

Repeat this to create a prod infrastructure variable group with the variables below:
Add the following variables (a command-line sketch is provided after the table):

* prod_azure_tenant_id
* prod_azure_subscription_id
* prod_azure_client_id
* prod_azure_client_secret
| Variable Name | Description | Required for |
| ------------------- | ------------------------------------- | ------------------------------ |
| ARM_CLIENT_ID | Azure Service Principal Client ID | Terraform and Helm deployments |
| ARM_CLIENT_SECRET | Azure Service Principal Client Secret | Terraform and Helm deployments |
| ARM_SUBSCRIPTION_ID | Azure Subscription ID | Terraform and Helm deployments |
| ARM_TENANT_ID | Azure Tenant ID | Terraform and Helm deployments |
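
If you prefer to create the group from the command line, the sketch below uses the Azure CLI with the `azure-devops` extension. It assumes the extension is installed and you are signed in; the organisation, project, and variable values are placeholders, and the client secret should be added as a secret variable afterwards rather than in plain text.

```bash
# Placeholder organisation and project - replace with your own.
az extension add --name azure-devops
az devops configure --defaults organization=https://dev.azure.com/my-org project=my-project

# Create the variable group with the non-secret values; authorize it for all pipelines.
az pipelines variable-group create \
  --name azure-sp-creds \
  --authorize true \
  --variables \
    ARM_CLIENT_ID=00000000-0000-0000-0000-000000000000 \
    ARM_SUBSCRIPTION_ID=00000000-0000-0000-0000-000000000000 \
    ARM_TENANT_ID=00000000-0000-0000-0000-000000000000

# Add ARM_CLIENT_SECRET as a secret variable via the Azure DevOps UI
# (or the "az pipelines variable-group variable" commands).
```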

### Update pipeline template placeholders

Where possible, the scaffolding CLI will have populated the correct values in the pipeline template file `build/azDevops/azure/infra-pipeline.yml`. The values that need to be manually configured, such as the variable group name setup previously, will have placeholders using the prefix `%REPLACE_ME_FOR`. We very much recommend that you go through the whole template to make sure that values are correct for your project. Once you are happy with the template, commit the changes to your repository.
Where possible, the Stacks CLI will have populated the correct values in the pipeline template file `build/azDevOps/azure/pipeline-vars.yml`. We strongly recommend that you go through the whole template to make sure the values are correct for your project. Once you are happy with the template, commit the changes to your repository.

### Create the pipeline

Follow the steps below to create the pipeline and trigger the initial run.

*Please note that pipeline will create DNS zones for both nonprod and prod (by default, `nonprod.${BASE_DOMAIN}` and `prod.${BASE_DOMAIN}`). These will need NS records adding to the base domain and will cause the pipeline to fail on the initial run.*
_Please note that the pipeline will create DNS zones for both nonprod and prod (by default, `nonprod.${BASE_DOMAIN}` and `prod.${BASE_DOMAIN}`). These will need NS records added to the base domain and will cause the pipeline to fail on the initial run._

1. In the pipelines section of Azure DevOps, select **New Pipeline**.
2. Select your repository.
55 changes: 55 additions & 0 deletions docs/workloads/azure/backend/netcore/infrastructure_aws_netcore.md
@@ -0,0 +1,55 @@
---
id: infrastructure_aws_netcore
title: .NET - AWS Infrastructure
sidebar_label: AWS Infrastructure
description: High level design of the reference implementation of the .NET Ensono Stacks REST API with CQRS.
keywords:
- .net
- rest api
- cqrs
- pipeline
- aws
- elastic container registry
- elastic kubernetes service
- dynamodb
- deployment
---

import useBaseUrl from '@docusaurus/useBaseUrl';

## Overview

This page presents the high-level design of the reference implementation of the .NET Ensono Stacks REST API with CQRS.

<!-- **This page assumes that the core infrastructure has already been provisioned. Instructions and additional information on the core infrastructure can be found [here](../../../../infrastructure/aws/core_infrastructure.md)** -->

Both the diagram and resource list below are for a single environment.

### Diagram

<img alt="AWS .NET API Infrastructure" src={useBaseUrl('img/aws_rest_api_infrastructure.png')} />

### Resource List

| Resource | Description |
| ---------------------- | ---------------------------------------------------------------------------------------------------- |
| EKS **\*** | Amazon Elastic Kubernetes Service |
| Public DNS Record      | DNS record pointing to the Ingress (via application load balancer **\*** and internal load balancer **\***) |
| ECR **\*** | Elastic Container Registry |
| Namespace: `{env}-api` | Kubernetes namespace for the environment |
| Ingress | Kubernetes Ingress to handle routing to Service |
| Service | Kubernetes Service to handle routing to Deployment Pods |
| Deployment | Kubernetes Deployment for managing Pods |
| DynamoDB | Amazon managed NoSQL database |

**\*** _Resource is created by the core infrastructure deployment._

## Deploying

All infrastructure is deployed using the [Terraform](https://www.terraform.io/) code included with the generated project.
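
As a rough sketch of what a local run might look like, assuming the project's Terraform uses the S3 backend created during the core bootstrap (the directory path, bucket, table, key, and region below are placeholders):

```bash
# Placeholder values - align them with your generated project and core bootstrap.
cd deploy/aws/app

terraform init \
  -backend-config="bucket=my-org-stacks-tfstate" \
  -backend-config="key=netcore-api/terraform.tfstate" \
  -backend-config="region=eu-west-2" \
  -backend-config="dynamodb_table=my-org-stacks-tflock" \
  -backend-config="encrypt=true"

# Review the plan before applying; the pipeline performs the equivalent steps.
terraform plan
```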

### Pipelines

The following pipelines are currently supported for automating the deployment:

- [GitHub Action](./pipeline_gha_netcore.md)
