This repository contains the code used for a workshop on Infrastructure as Code (IaC) with Terraform.
- Terraform:
  - https://www.terraform.io/downloads.html
  - You should be able to run: `terraform -help`
- AWS CLI:
  - Download the CLI for your OS
  - You should be able to run: `aws --version`
- Install some form of Terraform extension for your IDE. This makes it easier to spot syntax errors.
- See if you have received AWS credentials from the course leader (you should get these shortly before the course starts).
  - These should include a username and password for the AWS Console, as well as access keys.
- Go to the AWS Console.
- Choose to sign in as an IAM user, type in `bouvet-ost-tech1` as the Account ID, and hit "Next".
- Under "IAM user name" and "Password", type in the username and password you received from the course leader.
- Create a new password.
In this lecture you will configure the AWS CLI, initialize Terraform, and deploy an S3 bucket.
Keep in mind that all the participants in this workshop will use the same AWS account, so try to give your resources names that you can easily identify.
- Configure the AWS CLI:
  - Run `aws configure`
  - Type in the personal access keys (`Access Key ID` and `Secret Access Key`) you received from the course leader.
  - When asked about the default region, type `eu-central-1`, and set the default output format to `json`.
- Clone this repository
- Open the code in your editor of choice.
- Set the S3 bucket name: open the variables.tf file and change the `s3_bucket_name` value from `<YOUR_BUCKET_NAME>` to whatever you want to call your S3 bucket.
- While in the variables.tf file, update the value of the `my_name` variable to your AWS username.
- Open lecture_1.tf and read the instructions there to set the S3 bucket name. (A rough sketch of what such a bucket resource looks like follows these steps.)
- Initialize Terraform: open your terminal in the project folder and run `terraform init`.
- Preview what will happen if you deploy this code: `terraform plan`
- Deploy your S3 bucket: `terraform apply` (when asked whether you want to apply this code, type `yes` and hit enter).
- Log in to the AWS Console and find your S3 bucket. Try uploading any text file to this bucket.
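For orientation, here is a minimal sketch of what a provider block and an S3 bucket resource can look like in Terraform. The resource name and tags are illustrative assumptions, not the workshop's exact code; the real definitions live in lecture_1.tf and variables.tf.

```hcl
# Illustrative sketch only; the real resource is defined in lecture_1.tf.
provider "aws" {
  region = "eu-central-1" # the region you configured with `aws configure`
}

resource "aws_s3_bucket" "my_bucket" {
  bucket = var.s3_bucket_name # the value you set in variables.tf

  tags = {
    Owner = var.my_name # helps identify your resources in the shared account
  }
}
```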
In this lecture we will create a lambda function that will run every time a file is uploaded to our S3 bucket.
All the lambda function code for each lecture can be found under lambda_code.
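For context, the wiring this lecture sets up looks roughly like the sketch below: an S3 bucket notification that triggers the function on every object upload, plus a permission that lets S3 invoke it. The resource names here are illustrative assumptions, not necessarily the ones used in lecture_2.tf.

```hcl
# Rough sketch of an S3 upload trigger (names are illustrative).
resource "aws_s3_bucket_notification" "upload_trigger" {
  bucket = aws_s3_bucket.my_bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.s3_consumer_lambda.arn
    events              = ["s3:ObjectCreated:*"] # fire on every new object
  }
}

# S3 must be explicitly allowed to invoke the function.
resource "aws_lambda_permission" "allow_s3_invoke" {
  statement_id  = "AllowS3Invoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.s3_consumer_lambda.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.my_bucket.arn
}
```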
- Uncomment the code in lecture_2.tf.
- Open the variables.tf file and make sure the `s3_consumer_lambda_function_code_path` variable is pointing to the folder containing the Python code for lecture 2. The path should be as follows: `./lambda_code/lecture_2`.
- Go to lecture_2.tf, find the AWS Lambda function configuration, and under environment variables update the S3 bucket variable to point to the bucket name variable in variables.tf (see the sketch after these steps).
- Run `terraform init` to install the archive provider.
- Preview the changes: `terraform plan`
- Deploy the changes: `terraform apply`
- Upload a new file to the S3 bucket.
- View the logs (it can take a minute or two before logs show up):
  - Open the Lambda function in the AWS Console.
  - Click on "Monitoring" and view the Lambda function logs in CloudWatch.
  - Open the CloudWatch log stream, and you should see that our application has run and printed the message `success!!!`.
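Below is a sketch tying together the archive data source from step 4 and the environment variable from step 3. All names here (function name, role, handler, environment variable key) are assumptions for illustration; match them to the actual resources in lecture_2.tf.

```hcl
# Illustrative only; the real definitions are in lecture_2.tf.
data "archive_file" "s3_consumer_lambda_zip" {
  type        = "zip"
  source_dir  = var.s3_consumer_lambda_function_code_path
  output_path = "s3_consumer_lambda.zip"
}

resource "aws_lambda_function" "s3_consumer_lambda" {
  function_name = "${var.my_name}-s3-consumer" # hypothetical naming scheme
  role          = aws_iam_role.lambda_role.arn # hypothetical role name
  handler       = "main.lambda_handler"        # hypothetical handler
  runtime       = "python3.9"
  filename      = data.archive_file.s3_consumer_lambda_zip.output_path

  environment {
    variables = {
      # The key name is an assumption; use the one already present in lecture_2.tf.
      BUCKET_NAME = var.s3_bucket_name
    }
  }
}
```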
In this lecture we will create a DynamoDB table and update our lambda function to consume files uploaded to S3 and store the content in the DynamoDB table.
The Lambda function only supports JSON lists, so use the file provided here when uploading to the S3 bucket.
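As a sketch, a DynamoDB table resource in Terraform looks roughly like the block below. The billing mode and the `id` hash key are assumptions for illustration; the workshop's actual table is defined in lecture_3.tf.

```hcl
# Illustrative sketch; the real table is defined in lecture_3.tf.
resource "aws_dynamodb_table" "my_dynamodb_table" {
  name         = var.dynamodb_table_name
  billing_mode = "PAY_PER_REQUEST" # assumption: on-demand capacity
  hash_key     = "id"              # assumption: items are keyed by an "id" field

  attribute {
    name = "id"
    type = "S" # string attribute
  }
}
```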
- Uncomment the code in lecture_3.tf.
- Set the DynamoDB table name: open the variables.tf file and change the `dynamodb_table_name` value from `<YOUR_TABLE_NAME>` to whatever you want to call your DynamoDB table.
- In variables.tf, update the `s3_consumer_lambda_function_code_path` variable to point to the Python code for lecture three. The path should be as follows: `./lambda_code/lecture_3`.
- Go to lecture_2.tf and find the `s3_consumer_lambda` resource. Set the environment variable `DB_NAME` to the name property of the `my_dynamodb_table` resource located in lecture_3.tf.
  - The example below shows how to retrieve properties from a Terraform resource. The output `api_name` retrieves the name property from the `birds_api` resource.

    ```hcl
    resource "aws_api_gateway_rest_api" "birds_api" {
      name = "birds_api"
    }

    output "api_name" {
      value = aws_api_gateway_rest_api.birds_api.name
    }
    ```
- Preview the changes: `terraform plan`
- Deploy the changes: `terraform apply`
- Upload the JSON file to your S3 bucket.
- Open the DynamoDB table in the AWS Console. The content of the file should now be stored in the DynamoDB table.
In this lecture we will create a lambda function which can get a single object from the database, given an id. We will then create an API Gateway where we will add an endpoint connected to the lambda function. This way we will be able to hit an API endpoint and get an object from the database in return.
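To preview the moving parts before you start: a REST endpoint in API Gateway is typically a resource (the path), a method on it, and an integration pointing at the Lambda function. The sketch below is illustrative only (the real version also handles the id path parameter, which this sketch omits); the actual resources are already written for you in lecture_4.tf.

```hcl
# Illustrative sketch of the API Gateway -> Lambda wiring (names are assumptions).
resource "aws_api_gateway_resource" "birds_resource" {
  rest_api_id = aws_api_gateway_rest_api.birds_api.id
  parent_id   = aws_api_gateway_rest_api.birds_api.root_resource_id
  path_part   = "birds"
}

resource "aws_api_gateway_method" "get_bird" {
  rest_api_id   = aws_api_gateway_rest_api.birds_api.id
  resource_id   = aws_api_gateway_resource.birds_resource.id
  http_method   = "GET"
  authorization = "NONE"
}

resource "aws_api_gateway_integration" "get_bird_integration" {
  rest_api_id             = aws_api_gateway_rest_api.birds_api.id
  resource_id             = aws_api_gateway_resource.birds_resource.id
  http_method             = aws_api_gateway_method.get_bird.http_method
  integration_http_method = "POST" # Lambda integrations are always invoked with POST
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.db_reader_lambda.invoke_arn
}
```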
- Uncomment the code in lecture_4.tf, including the line comment on line 128.
- In the lecture_4.tf file, find the `api_gateway_invoke_db_reader_lambda_permission` and, under `function_name`, get the lambda function name from the `db_reader_lambda`.
- Anywhere in the lecture_4.tf file, create a Terraform output with the value of the `demo_env` invoke URL (see the sketch after these steps).
- Find the `birds_resource` resource in lecture_4.tf and add a string path under `path_part`. Try to keep it simple, without any special characters, like "birds" or something similar.
- Preview the changes: `terraform plan`
- Deploy the changes: `terraform apply`
- The URL you can use to access the API should be printed in the terminal. Copy that URL and paste it into your web browser, followed by `/` and the path you wrote in step 4, followed by `/` and an object id provided in the json file. The complete URL should look something like this: https://xk5x3cs7ik.execute-api.eu-central-1.amazonaws.com/demo/birds/079b42b8-a1ab-11eb-bcbc-0242ac130002
- See if you get a JSON object in return from the URL in step 7. If so, then you have a working API 👏🏼.
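One way to write the output from step 3, assuming the `demo_env` name in lecture_4.tf refers to an `aws_api_gateway_deployment` resource (adjust the reference if the resource type differs):

```hcl
# Assumption: demo_env is an aws_api_gateway_deployment resource in lecture_4.tf.
output "invoke_url" {
  value = aws_api_gateway_deployment.demo_env.invoke_url
}
```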
In this lecture you will write your own terraform code to add a new endpoint to the existing API from lecture 4. This endpoint should be of type HTTP GET and should return a JSON list of all the objects in the database. You will find the python code in this folder.
Keep in mind that every terraform resource has to have a unique pair of labels (type and name). This is important for this lecture, since you are going to create multiple new terraform resources of the same type.
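For example, the following two buckets (illustrative resources, not part of the workshop) share a type but must differ in name:

```hcl
# Same type, different names: Terraform addresses each resource by (type, name).
resource "aws_s3_bucket" "bucket_a" {
  bucket = "example-bucket-a"
}

resource "aws_s3_bucket" "bucket_b" {
  bucket = "example-bucket-b"
}
```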
- Create a new `.tf` file in the project root folder.
- In your new terraform file, add all the necessary terraform code for a lambda function (TIP: copy much of the code from lecture_4.tf).
  - You can reuse the `db_reader_lambda_iam_role`, since it contains all the access you need.
  - In your `aws_lambda_function`, make sure to use "get_all_birds.lambda_handler" as the `handler`.
  - In your `archive_file`, point to the correct folder where the code is stored.
- Create your API endpoint: since the API is created beforehand, you only need to create three new terraform resources, based on the ones in lecture_4.tf, to add a new endpoint to the API (see the sketch after these steps):
  - `aws_api_gateway_method`: change the `resource_id` to point to the root resource `birds_resource`, and remove `request_parameters`.
  - `aws_api_gateway_integration`: change these parameters accordingly: `http_method`, `resource_id` and `uri`.
  - `aws_lambda_permission`: `function_name` should point to your new lambda function definition.
- Preview the changes: `terraform plan`
- Deploy the changes: `terraform apply`
- Test your new endpoint to retrieve a list of objects.
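Below is a sketch of the three resources from step 3. All names are illustrative assumptions; mirror the equivalents in lecture_4.tf and pick names that avoid label collisions with the existing resources.

```hcl
# Illustrative sketch; mirror lecture_4.tf and adjust names and references.
resource "aws_api_gateway_method" "get_all_birds_method" {
  rest_api_id   = aws_api_gateway_rest_api.birds_api.id
  resource_id   = aws_api_gateway_resource.birds_resource.id # the root "birds" resource
  http_method   = "GET"
  authorization = "NONE"
  # No request_parameters: this endpoint takes no id.
}

resource "aws_api_gateway_integration" "get_all_birds_integration" {
  rest_api_id             = aws_api_gateway_rest_api.birds_api.id
  resource_id             = aws_api_gateway_resource.birds_resource.id
  http_method             = aws_api_gateway_method.get_all_birds_method.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.get_all_birds_lambda.invoke_arn # your new lambda
}

resource "aws_lambda_permission" "api_gateway_invoke_get_all_birds" {
  statement_id  = "AllowAPIGatewayInvokeGetAllBirds"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.get_all_birds_lambda.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.birds_api.execution_arn}/*/*"
}
```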
- Empty the S3 bucket using the AWS Console.
- Run `terraform destroy`