This Terraform setup uses the ansraliant/s3-state/aws module to configure a remote backend that stores the Terraform state in S3 and uses a DynamoDB table for state locking.
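For reference, below is a minimal sketch of what that module call in `backend.tf` might look like. The input names (`bucket_name`, `dynamodb_table`) follow the customisation notes later in this README and the values are placeholders; check the module's documentation for its exact interface.

```hcl
# Sketch only: a module block along these lines provisions the S3 bucket and
# DynamoDB table used for remote state. Input names and values here are
# assumptions based on this README, not the module's verified interface.
module "remote_state" {
  source = "ansraliant/s3-state/aws"

  bucket_name    = "your-globally-unique-state-bucket" # placeholder
  dynamodb_table = "your-terraform-lock-table"         # placeholder
}
```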
Click here for 👉 Cloud Resume Frontend.
- Terraform installed on your machine.
- AWS CLI configured with AWS credentials that have sufficient permissions:
  - After installing the AWS CLI, run `aws configure`.
  - Get credentials: in the AWS console, open your profile menu (top right) -> Security credentials -> Create access key.
- Google Cloud SDK for creating a Google Cloud Storage bucket. Ensure you have an account and a project with billing enabled.
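For context, here is a rough sketch of the provider blocks these prerequisites feed; the region and project values are placeholders, and the actual `provider.tf` and `gcp.tf` in this repository may set different options.

```hcl
# Sketch of the provider blocks the prerequisites above support; values are
# placeholders, not the ones used in this repository.
provider "aws" {
  region = "us-east-1" # credentials come from `aws configure`
}

provider "google" {
  project = "your-gcp-project-id" # credentials come from `gcloud auth application-default login`
}
```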
Option 1) Create infrastructure from the console and import it to Terraform to recreate a similar code structure and avoid massive code changes:
- First, define the basic Terraform code for your resources (without any additional arguments like name, etc.), then (see the sketch after this list):
- Run `terraform init`
- Run `terraform validate`
- Run `terraform import resource.resource_name resource_identifier`
- "terraform scan"
- Find and copy the imported code block for your resource.
- Replace the basic code in your Terraform files with the imported block.
- Run `terraform validate` to see which code lines need to be removed from your Terraform files.
- Refine the code by replacing deprecated arguments or blocks with their recommended replacements to resolve any warnings seen when running `terraform plan`.
- `terraform plan` or `terraform apply` might show that new resources will be created. This is normal, since Terraform treats separate resource policy blocks as individual resources. However, it should not affect your existing infrastructure if you used the original code or carefully resolved the `terraform plan` warnings.
- Clone this repository and change into the Terraform directory:
  `git clone git@github.com:deepansharya1111/cloud-resume-backend.git`
  `cd cloud-resume-backend/terraform/`
- Customisation: replace the highlighted text with your desired globally unique names. Run `terraform fmt` to format the files if you make additional changes.
  - For the Terraform remote backend with S3 and DynamoDB, change the `bucket_name` and `dynamodb_table` names in `backend.tf`.
  - For creating a GCP Storage bucket, change the `project` and `name` variables in `gcp.tf` (a sketch of this bucket definition appears after the setup steps below).
  - AWS resources:
    - For the S3 bucket that stores the frontend code, change every `deepansh_app_bucket` and `deepansh.app` in `aws.tf`.
    - For the CloudFront distribution and ACM certificate, replace every instance of the bucket name `deepansh.app`, the domain `*.deepansh.app`, and the CNAME `www.deepansh.app` with your `bucket_name`, `*your.domain`, and the CNAME that is registered as a DNS record with your DNS provider.
    - (Optional) For the DynamoDB table and Lambda function, customize the Lambda function name, IAM policy names, etc., to your liking; otherwise, leave them as they are.
- Initialize Terraform: `terraform init`
- Validate your Terraform configuration: `terraform validate`
- Plan your Terraform configuration: `terraform plan`
- Apply your Terraform configuration: `terraform apply`
- See your Google Cloud Storage buckets: `gsutil ls`
- See your AWS DynamoDB tables: `aws dynamodb list-tables`
- See your AWS S3 buckets: `aws s3 ls`
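For the GCP customisation step above, the `project` and `name` values refer to a storage bucket definition roughly like the sketch below; the arguments shown are standard `google_storage_bucket` options, and the actual `gcp.tf` in this repository may be structured differently.

```hcl
# Sketch of a static-website bucket like the one gcp.tf configures; values
# are placeholders to replace with your own project ID and bucket name.
resource "google_storage_bucket" "static_site" {
  name     = "your-globally-unique-bucket-name" # the `name` to customise
  project  = "your-gcp-project-id"              # the `project` to customise
  location = "US"

  website {
    main_page_suffix = "index.html"
    not_found_page   = "404.html"
  }
}
```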
- `aws.tf`: Contains the main AWS Terraform configuration.
- `backend.tf`: Includes the ansraliant/s3-state/aws module for remote state configuration.
- `provider.tf`: Contains the AWS provider configuration.
- `gcp.tf`: Contains the GCP provider and Cloud Storage bucket configurations for hosting static websites.
- `backend.tf.json`: Backend configuration file with details for the S3 bucket, key, region, and DynamoDB table.
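As a rough illustration (the generated keys and values will be specific to your setup), a `backend.tf.json` for an S3 backend, written in Terraform's JSON syntax, typically looks something like this:

```json
{
  "terraform": {
    "backend": {
      "s3": {
        "bucket": "your-terraform-state-bucket",
        "key": "terraform.tfstate",
        "region": "us-east-1",
        "dynamodb_table": "your-terraform-lock-table",
        "encrypt": true
      }
    }
  }
}
```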