What is Terraform, and how is it different from other IaC tools?
Terraform is a tool for provisioning, managing, and deploying infrastructure resources. It is an open-source tool written in Go and created by HashiCorp. With Terraform, you can manage infrastructure for your applications across multiple cloud providers - AWS, Azure, GCP, etc. - using a single tool.
Terraform allows you to define your infrastructure requirements in a high-level configuration syntax. It supports multi-cloud deployments, ships with a robust CLI, and its modules package related resources into dedicated, reusable templates, giving you a uniform syntax for infrastructure as code.
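As a minimal illustration of this declarative, high-level syntax (the resource names and AMI ID here are hypothetical):

```hcl
# Declaratively describe a single AWS EC2 instance; Terraform
# computes the create/update/destroy actions needed to reach this state.
resource "aws_instance" "web" {
  ami           = "ami-0abcdef1234567890" # hypothetical AMI ID
  instance_type = "t2.micro"
}
```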
How do you call a module from a main.tf file?
To call a module from a main.tf file, simply write a module block with a reference name and set the source argument to the module's directory path. For example:

```hcl
module "foo" {
  source = "./my-module-directory"
}
```
What exactly is Sentinel? Can you provide a few examples of Sentinel policies?
Sentinel Policies are rules which are enforced on Terraform runs to validate that the plan and corresponding resources are in compliance with company policies.
Here are a few examples of Sentinel policies:
Compliance: A policy that checks whether a VM is deployed in a specific region to comply with data residency regulations.
Security: A policy that ensures that storage accounts are encrypted using a specific encryption method.
Naming conventions: A policy that checks whether a resource group name includes a specific prefix.
Resource limits: A policy that prevents users from creating more than a certain number of virtual machines in a specific subscription.
Cost optimization: A policy that checks whether resources are tagged with a specific label that indicates their business purpose.
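As a hedged sketch of what such a policy can look like, here is a Sentinel policy restricting EC2 instance types (the allowed types are illustrative assumptions, not part of the original examples):

```sentinel
import "tfplan/v2" as tfplan

# Illustrative allow-list of instance types
allowed_types = ["t2.micro", "t3.micro"]

# Collect all managed aws_instance resources in the plan
instances = filter tfplan.resource_changes as _, rc {
    rc.type is "aws_instance" and rc.mode is "managed"
}

# The run passes only if every instance uses an allowed type
main = rule {
    all instances as _, rc {
        rc.change.after.instance_type in allowed_types
    }
}
```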
You have a Terraform configuration file that defines an infrastructure deployment. However, there are multiple instances of the same resource that need to be created. How would you modify the configuration file to achieve this?
In this case, add a count argument to the resource block and set it to the number of instances you need. For example:
```hcl
resource "aws_instance" "server" {
  count         = 5 # create five similar EC2 instances
  ami           = "ami-xxx"
  instance_type = "t2.micro"

  tags = {
    Name = "My Server - ${count.index}"
  }
}
```
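If the instances need individually distinguishable configurations, for_each (available on resources since Terraform 0.12.6) is an alternative worth mentioning; a sketch with hypothetical role names:

```hcl
resource "aws_instance" "server" {
  for_each      = toset(["web", "api", "worker"]) # hypothetical role names
  ami           = "ami-xxx"
  instance_type = "t2.micro"

  tags = {
    Name = "My Server - ${each.key}"
  }
}
```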
You want to know from which paths Terraform is loading providers referenced in your Terraform configuration (*.tf files). You need to enable debug messages to find this out. Which of the following would achieve this?
A. Set the environment variable TF_LOG=TRACE
B. Set verbose logging for each provider in your Terraform configuration
C. Set the environment variable TF_VAR_log=TRACE
D. Set the environment variable TF_LOG_PATH
The correct answer is A. Setting the environment variable TF_LOG=TRACE enables the most verbose logging, which includes the paths from which Terraform loads providers.
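In practice, enabling trace logging looks like the following (the log file path is illustrative):

```shell
# Enable trace-level logging; Terraform reads these environment variables.
export TF_LOG=TRACE                       # levels: TRACE, DEBUG, INFO, WARN, ERROR
export TF_LOG_PATH=./terraform-debug.log  # optional: also write logs to a file
# Then run any Terraform command, e.g.:
#   terraform init
# The trace output includes the provider plugin paths being searched.
```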
The command below will destroy everything that has been created in the infrastructure. How would you protect a particular resource when destroying the complete infrastructure?

```shell
terraform destroy
```
If you want to protect a particular resource from being destroyed, you can use Terraform's resource lifecycle management feature. Modify the resource block in your Terraform configuration file by adding a lifecycle block and setting the prevent_destroy argument to true. Terraform will then reject, with an error, any plan (including terraform destroy) that would destroy that resource.
```hcl
resource "aws_instance" "example" {
  ami           = "ami-xxx"
  instance_type = "t2.micro"

  tags = {
    Name = "terraform-learn-state-ec2"
  }

  lifecycle {
    prevent_destroy = true
  }
}
```
Which configuration block is used to store the .tfstate file in S3?
The backend "s3" block inside the terraform block is used to store the .tfstate file in AWS S3.

Note: the S3 bucket (and, if used for state locking, the DynamoDB table) has to be created before the backend is initialized.
```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.16"
    }
  }

  required_version = ">= 1.2.0"

  backend "s3" {
    bucket         = "my-app-state-bucket"
    key            = "terraform.tfstate"
    region         = "us-east-1"
    dynamodb_table = "my-app-state-table"
  }
}
```
How do you manage sensitive data in Terraform, such as API keys or passwords?
There are multiple ways to handle sensitive data in Terraform:
Use Terraform input variables: Define input variables in your Terraform configuration file (optionally marked sensitive = true so values are redacted in CLI output) and supply sensitive values at runtime rather than hard-coding them.
Store sensitive data in environment variables: Store sensitive data as environment variables and reference them in your Terraform configuration file.
Use external secret management tools: Use external secret management tools like Vault or AWS Secrets Manager to store and manage sensitive data.
Use encrypted state files: Encrypt your Terraform state files using tools like sops or age to protect sensitive data stored in the state file.
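The input-variable approach can be sketched as follows (the variable name is illustrative):

```hcl
variable "db_password" {
  description = "Database administrator password"
  type        = string
  sensitive   = true # redacts the value in plan/apply output
}

# Supply the value at runtime, e.g. via the TF_VAR_db_password
# environment variable, rather than committing it to source control.
```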
You are working on a Terraform project that needs to provision an S3 bucket, and a user with read and write access to the bucket. What resources would you use to accomplish this, and how would you configure them?
To provision an S3 bucket and a user with read and write access to the bucket, I would use the following Terraform resources:
aws_s3_bucket : This resource would define the S3 bucket and its configuration options such as its name, region, access control, and versioning.
aws_iam_user: This resource would define the user and its configuration options such as its name and permissions.
aws_iam_access_key: This resource would create access keys for the user, which would be used to authenticate the user when accessing the S3 bucket.
aws_s3_bucket_policy: This resource would define the permissions that the user would have on the S3 bucket.
Here is an example Terraform configuration file that creates an S3 bucket, an IAM user, access keys, and a bucket policy:
```hcl
provider "aws" {
  region = "us-east-1"
}

resource "aws_s3_bucket" "foo_bucket" {
  bucket = "foo-bucket"
  acl    = "private"
}

resource "aws_iam_user" "foo_user" {
  name = "foo-user"
}

resource "aws_iam_access_key" "foo_access_key" {
  user = aws_iam_user.foo_user.name
}

resource "aws_s3_bucket_policy" "foo_policy" {
  bucket = aws_s3_bucket.foo_bucket.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect = "Allow"
        Principal = {
          AWS = aws_iam_user.foo_user.arn
        }
        Action = [
          "s3:GetObject",
          "s3:PutObject"
        ]
        Resource = "${aws_s3_bucket.foo_bucket.arn}/*"
      }
    ]
  })
}
```
This configuration file defines an S3 bucket with a private access control list, an IAM user named "foo-user", access keys for the user, and a bucket policy that allows the user to read and write objects in the bucket.
Who maintains Terraform providers?
Terraform providers are maintained by both the open-source community and the respective companies that the providers belong to.
The community contributes to the development of Terraform providers by submitting code contributions, bug reports, and feature requests through the project’s GitHub repository.
The respective companies, such as Amazon Web Services, Google Cloud Platform, and Microsoft Azure, maintain their Terraform providers by ensuring compatibility with their respective cloud services and updating them with new features and bug fixes as necessary.
Terraform providers are usually updated and released separately from the core Terraform project to provide a modular and extensible architecture.
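Because providers are versioned independently of Terraform core, configurations typically pin provider versions explicitly; a minimal sketch (the version constraint is illustrative):

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.0" # illustrative constraint: any 4.x release
    }
  }
}
```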
How can we export data from one module to another?
In Terraform, you can export data from one module to another using outputs.
Outputs allow you to expose values from one module that can be consumed by another module or referenced outside of the module.
To export data from one module, you define an output in the module’s outputs.tf file, and assign a value to it. For example:
```hcl
output "my_output" {
  value = "some value"
}
```
Then, in the configuration that calls the module, you can reference this output using the syntax module.<module_name>.<output_name>
For example:
```hcl
module "my_module" {
  source    = "./my_module"
  my_output = module.other_module.my_output
}
```
This passes the value of other_module's my_output output into my_module as an input variable, which my_module must declare with a matching variable block.
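The consuming module's side of this wiring could look like the following (the file layout is hypothetical):

```hcl
# my_module/variables.tf
variable "my_output" {
  type = string
}

# my_module can then reference var.my_output anywhere in its configuration.
```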