2024-11-28 This lab was developed with our partner, HashiCorp. Your personal information may be shared with HashiCorp, the lab sponsor, if you have opted in to receive product updates, announcements, and offers in your Account Profile.
As you manage your infrastructure with Terraform, increasingly complex configurations will be created. There is no intrinsic limit to the complexity of a single Terraform configuration file or directory, so it is possible to continue writing and updating your configuration files in a single directory. However, if you do, you may encounter one or more of the following problems:
In this lab, you will learn how modules can address these problems, the structure of a Terraform module, and best practices when using and creating modules.
Here are some of the ways that modules help solve the problems listed above:
Organize configuration: Modules make it easier to navigate, understand, and update your configuration by keeping related parts of your configuration together. Even moderately complex infrastructure can require hundreds or thousands of lines of configuration to implement. By using modules, you can organize your configuration into logical components.
Encapsulate configuration: Another benefit of using modules is to encapsulate configuration into distinct logical components. Encapsulation can help prevent unintended consequences, such as a change to one part of your configuration accidentally causing changes to other infrastructure, and reduce the chances of simple errors like using the same name for two different resources.
Re-use configuration: Writing all of your configuration without using existing code can be time-consuming and error-prone. Using modules can save time and reduce costly errors by re-using configuration written either by yourself, other members of your team, or other Terraform practitioners who have published modules for you to use. You can also share modules that you have written with your team or the general public, giving them the benefit of your hard work.
Provide consistency and ensure best practices: Modules also help to provide consistency in your configurations. Consistency makes complex configurations easier to understand, and it also helps to ensure that best practices are applied across all of your configuration. For example, cloud providers offer many options for configuring object storage services, such as Amazon S3 (Simple Storage Service) or Google's Cloud Storage buckets. Many high-profile security incidents have involved incorrectly secured object storage, and because of the number of complex configuration options involved, it's easy to accidentally misconfigure these services.
Using modules can help reduce these errors. For example, you might create a module to describe how all of your organization's public website buckets will be configured, and another module for private buckets used for logging applications. Also, if a configuration for a type of resource needs to be updated, using modules allows you to make that update in a single place and have it be applied to all cases where you use that module.
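As a sketch of this pattern (the module name, path, and inputs below are hypothetical, not part of this lab), a team might wrap its approved bucket settings in a module and call it wherever a public website bucket is needed:

```hcl
# Hypothetical call to an organization-internal module. Because every
# public website bucket goes through this module, a security or policy
# fix made in the module applies to all callers on the next apply.
module "marketing_site_bucket" {
  source = "./modules/public-website-bucket" # assumed local module path

  name       = "example-marketing-site" # hypothetical bucket name
  project_id = var.project_id
}
```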
In this lab, you will learn how to perform the following tasks:
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
To complete this lab, you need:
Note: Use an Incognito or private browser window to run this lab. This prevents any conflicts between your personal account and the Student account, which may cause extra charges incurred to your personal account.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.
Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method.
On the left is the Lab Details panel with the following:
Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).
The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Arrange the tabs in separate windows, side-by-side.
Note: If you see the Choose an account dialog, click Use Another Account.
If necessary, copy the Username below and paste it into the Sign in dialog.
{{{user_0.username | "Username"}}}
You can also find the Username in the Lab Details panel.
Click Next.
Copy the Password below and paste it into the Welcome dialog.
{{{user_0.password | "Password"}}}
You can also find the Password in the Lab Details panel.
Click Next.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.
Note: Using your own Google Cloud account for this lab may incur extra charges.
Click through the subsequent pages:
After a few moments, the Google Cloud console opens in this tab.
Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left.
Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.
When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.
Your Cloud Platform project in this session is set to {{{project_0.project_id | "project_id"}}}
gcloud
is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
gcloud auth list
Output:
ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}
To set the active account, run:
$ gcloud config set account `ACCOUNT`
gcloud config list project
Output:
[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}
Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.
A Terraform module is a set of Terraform configuration files in a single directory. Even a simple configuration consisting of a single directory with one or more .tf
files is a module. When you run Terraform commands directly from such a directory, it is considered the root module. So in this sense, every Terraform configuration is part of a module. You may have a simple set of Terraform configuration files like this:
├── LICENSE
├── README.md
├── main.tf
├── variables.tf
├── outputs.tf
In this case, when you run Terraform commands from within the minimal-module
directory, the contents of that directory are considered the root module.
Terraform commands will only directly use the configuration files in one directory, which is usually the current working directory. However, your configuration can use module blocks to call modules in other directories. When Terraform encounters a module block, it loads and processes that module's configuration files.
A module that is called by another configuration is sometimes referred to as a “child module” of that configuration.
Modules can be loaded from either the local filesystem or a remote source. Terraform supports a variety of remote sources, including the Terraform Registry, most version control systems, HTTP URLs, and Terraform Cloud or Terraform Enterprise private module registries.
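As a sketch of the source syntax only (a real module block must also set the module's required input variables), the same block shape covers these source types:

```hcl
# Terraform Registry source, with a version constraint:
module "network" {
  source  = "terraform-google-modules/network/google"
  version = "~> 6.0"
}

# Local filesystem source (a relative path; directory name is hypothetical):
module "local_example" {
  source = "./modules/example"
}

# Generic Git source, pinned to a tag with the ref argument:
module "git_example" {
  source = "git::https://github.com/terraform-google-modules/terraform-google-network.git?ref=v6.0.1"
}
```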
In many ways, Terraform modules are similar to the concepts of libraries, packages, or modules found in most programming languages, and they provide many of the same benefits. Just like almost any non-trivial computer program, real-world Terraform configurations should almost always use modules to provide the benefits mentioned above.
It is recommended that every Terraform practitioner use modules by following these best practices:
Start writing your configuration with a plan for modules. Even for slightly complex Terraform configurations managed by a single person, the benefits of using modules outweigh the time it takes to use them properly.
Use local modules to organize and encapsulate your code. Even if you aren't using or publishing remote modules, organizing your configuration in terms of modules from the beginning will significantly reduce the burden of maintaining and updating your configuration as your infrastructure grows in complexity.
Use the public Terraform Registry to find useful modules. This way you can quickly and confidently implement your configuration by relying on the work of others.
Publish and share modules with your team. Most infrastructure is managed by a team of people, and modules are an important tool that teams can use to create and maintain infrastructure. As mentioned earlier, you can publish modules either publicly or privately. You will see how to do this in a later lab in this series.
In this section, you use modules from the Terraform Registry to provision an example environment in Google Cloud. The concepts you use here will apply to any module from any source.
The page includes information about the module and a link to the source repository. The right side of the page includes a dropdown interface to select the module version and instructions for using the module to provision infrastructure.
When you call a module, the source
argument is required. In this example, Terraform will search for a module in the Terraform Registry that matches the given string. You could also use a URL or local file path for the source of your modules. See the Terraform documentation for a list of possible module sources.
The other argument shown here is the version
. For supported sources, the version will let you define what version or versions of the module will be loaded. In this lab, you will specify an exact version number for the modules you use. You can read about more ways to specify versions in the module documentation.
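For reference, Terraform's version argument accepts constraint expressions as well as exact versions; a few common forms (shown here as fragments, not a complete module block):

```hcl
version = "6.0.1"    # exactly this version
version = ">= 6.0.0" # this version or newer
version = "~> 6.0"   # any 6.x release, but not 7.0 (pessimistic constraint)
```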
Other arguments to module blocks are treated as input variables to the modules.
v6.0.1
branch:
git clone https://github.com/terraform-google-modules/terraform-google-network
cd terraform-google-network
git checkout tags/v6.0.1 -b v6.0.1
This ensures that you're using the correct version number.
On the Cloud Shell toolbar, click Open Editor. To switch between Cloud Shell and the code editor, click Open Editor or Open Terminal as required, or click Open in a new window to leave the Editor open in a separate tab.
In the editor, navigate to terraform-google-network/examples/simple_project
, and open the main.tf
file. Your main.tf
configuration will look like this:
module "test-vpc-module" {
  source       = "terraform-google-modules/network/google"
  version      = "~> 6.0"
  project_id   = var.project_id
  network_name = "my-custom-mode-network"
  mtu          = 1460
  subnets = [
    {
      subnet_name   = "subnet-01"
      subnet_ip     = "10.10.10.0/24"
      subnet_region = "us-west1"
    },
    {
      subnet_name           = "subnet-02"
      subnet_ip             = "10.10.20.0/24"
      subnet_region         = "us-west1"
      subnet_private_access = "true"
      subnet_flow_logs      = "true"
    },
    {
      subnet_name               = "subnet-03"
      subnet_ip                 = "10.10.30.0/24"
      subnet_region             = "us-west1"
      subnet_flow_logs          = "true"
      subnet_flow_logs_interval = "INTERVAL_10_MIN"
      subnet_flow_logs_sampling = 0.7
      subnet_flow_logs_metadata = "INCLUDE_ALL_METADATA"
      subnet_flow_logs_filter   = "false"
    }
  ]
}
This configuration includes one important block:
module "test-vpc-module"
defines a Virtual Private Cloud (VPC), which will provide networking services for the rest of your infrastructure. Some input variables are required, which means that the module doesn't provide a default value; an explicit value must be provided in order for Terraform to run correctly.
In order to use most modules, you will need to pass input variables to the module configuration. The configuration that calls a module is responsible for setting its input values, which are passed as arguments to the module block. Aside from source
and version
, most of the arguments to a module block will set variable values.
On the Terraform Registry page for the Google Cloud network module, an Inputs tab describes all of the input variables that module supports.
Using input variables with modules is very similar to how you use variables in any Terraform configuration. A common pattern is to identify which module input variables you might want to change in the future, and then create matching variables in your configuration's variables.tf
file with sensible default values. Those variables can then be passed to the module block as arguments.
gcloud config list --format 'value(core.project)'
In the Editor, still in the same directory, navigate to variables.tf
.
Fill in the variable project_id
with the output of the previous command. You must follow the format below and set the default
value for the variable:
variable "project_id" {
  description = "The project ID to host the network in"
  default     = "FILL IN YOUR PROJECT ID HERE"
}
variables.tf
, add in the variable network_name
. You can use the name example-vpc
or any other name you'd like. You must follow the format below and set the default
value for the variable:
variable "network_name" {
  description = "The name of the VPC network being created"
  default     = "example-vpc"
}
main.tf
file, update the network_name
parameter to use the variable you just defined by setting the value to var.network_name
.
module "test-vpc-module" {
  ...
  project_id   = var.project_id
  network_name = var.network_name
  ...
main.tf
file, update the subnet regions on lines 35, 40, and 47 from us-west1
to
  subnets = [
    {
      subnet_name   = "subnet-01"
      subnet_ip     = "10.10.10.0/24"
      subnet_region = "{{{project_0.default_region | REGION}}}"
    },
    {
      subnet_name           = "subnet-02"
      subnet_ip             = "10.10.20.0/24"
      subnet_region         = "{{{project_0.default_region | REGION}}}"
      subnet_private_access = "true"
      subnet_flow_logs      = "true"
    },
    {
      subnet_name   = "subnet-03"
      subnet_ip     = "10.10.30.0/24"
      subnet_region = "{{{project_0.default_region | REGION}}}"
      ...
    }
  ]
Modules also have output values, which are defined within the module with the output
keyword. You can access them by referring to module.<MODULE NAME>.<OUTPUT NAME>
. Like input variables, module outputs are listed under the outputs
tab in the Terraform Registry.
Module outputs are usually either passed to other parts of your configuration or defined as outputs in your root module. You will see both uses in this lab.
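As a sketch of both uses (the firewall resource here is hypothetical and is not created in this lab), a module output can feed another resource's argument, or be re-exported from the root module:

```hcl
# Passing a module output into another resource:
resource "google_compute_firewall" "allow_ssh" {
  name    = "allow-ssh"
  network = module.test-vpc-module.network_name # module output as an input

  allow {
    protocol = "tcp"
    ports    = ["22"]
  }
}

# Re-exporting a module output from the root module:
output "network_name" {
  value = module.test-vpc-module.network_name
}
```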
outputs.tf
file inside of your configuration's directory. Verify that the file contains the following:
output "network_name" {
  value       = module.test-vpc-module.network_name
  description = "The name of the VPC being created"
}
output "network_self_link" {
  value       = module.test-vpc-module.network_self_link
  description = "The URI of the VPC being created"
}
output "project_id" {
  value       = module.test-vpc-module.project_id
  description = "VPC project id"
}
output "subnets_names" {
  value       = module.test-vpc-module.subnets_names
  description = "The names of the subnets being created"
}
output "subnets_ips" {
  value       = module.test-vpc-module.subnets_ips
  description = "The IPs and CIDRs of the subnets being created"
}
output "subnets_regions" {
  value       = module.test-vpc-module.subnets_regions
  description = "The region where the subnets will be created"
}
output "subnets_private_access" {
  value       = module.test-vpc-module.subnets_private_access
  description = "Whether the subnets will have access to Google APIs without a public IP"
}
output "subnets_flow_logs" {
  value       = module.test-vpc-module.subnets_flow_logs
  description = "Whether the subnets will have VPC flow logs enabled"
}
output "subnets_secondary_ranges" {
  value       = module.test-vpc-module.subnets_secondary_ranges
  description = "The secondary ranges associated with these subnets"
}
output "route_names" {
  value       = module.test-vpc-module.route_names
  description = "The routes associated with this VPC"
}
simple_project
directory:
cd ~/terraform-google-network/examples/simple_project
terraform init
terraform apply
Great! You’ve just used your first module. Your configuration’s output should look like this:
Outputs:
network_name = "example-vpc"
network_self_link = "https://www.googleapis.com/compute/v1/projects/qwiklabs-gcp-01-a68489b0625b/global/networks/example-vpc"
project_id = ""
route_names = []
subnets_flow_logs = [
  false,
  true,
  true,
]
subnets_ips = [
  "10.10.10.0/24",
  "10.10.20.0/24",
  "10.10.30.0/24",
]
subnets_names = [
  "subnet-01",
  "subnet-02",
  "subnet-03",
]
...
When using a new module for the first time, you must run either terraform init
or terraform get
to install the module. When either of these commands is run, Terraform will install any new modules in the .terraform/modules
directory within your configuration's working directory. For local modules, Terraform will create a symlink to the module's directory. Because of this, any changes to local modules will be effective immediately, without having to re-run terraform get
.
Now you have seen how to use modules from the Terraform Registry, how to configure those modules with input variables, and how to get output values from those modules.
terraform destroy
Respond to the prompt with yes
.
Terraform will destroy the infrastructure you created.
Once you've destroyed your resources, delete the terraform-google-network
folder.
cd ~
rm -rd terraform-google-network -f
Click Check my progress to verify the objective.
Provision infrastructure.
In the last task, you used a module from the Terraform Registry to create a VPC network in Google Cloud. Although using existing Terraform modules correctly is an important skill, every Terraform practitioner will also benefit from learning how to create modules. We recommend that you create every Terraform configuration with the assumption that it may be used as a module, because this will help you design your configurations to be flexible, reusable, and composable.
As you may already know, Terraform treats every configuration as a module. When you run terraform
commands, or use Terraform Cloud or Terraform Enterprise to remotely run Terraform, the target directory containing Terraform configuration is treated as the root module.
In this task, you create a module to manage Cloud Storage buckets used to host static websites.
Terraform treats any local directory referenced in the source
argument of a module
block as a module. A typical file structure for a new module is:
├── LICENSE
├── README.md
├── main.tf
├── variables.tf
├── outputs.tf
.tf
file or use any other file structure you like.
Each of these files serves a purpose:
LICENSE
contains the license under which your module will be distributed. When you share your module, the LICENSE file will let people using it know the terms under which it has been made available. Terraform itself does not use this file. README.md
contains documentation in markdown format that describes how to use your module. Terraform does not use this file, but services like the Terraform Registry and GitHub will display the contents of this file to visitors to your module's Terraform Registry or GitHub page. main.tf
contains the main set of configuration for your module. You can also create other configuration files and organize them in a way that makes sense for your project. variables.tf
contains the variable definitions for your module. When your module is used by others, the variables will be configured as arguments in the module block. Because all Terraform values must be defined, any variables that don't have a default value will become required arguments. A variable with a default value can also be provided as a module argument, thus overriding the default value. outputs.tf
contains the output definitions for your module. Module outputs are made available to the configuration using the module, so they are often used to pass information about the parts of your infrastructure defined by the module to other parts of your configuration. Be aware of these files, and ensure that you don't distribute them as part of your module:
terraform.tfstate
and terraform.tfstate.backup
files contain your Terraform state and are how Terraform keeps track of the relationship between your configuration and the infrastructure provisioned by it. .terraform
directory contains the modules and plugins used to provision your infrastructure. These files are specific to an individual instance of Terraform when provisioning infrastructure, not the configuration of the infrastructure defined in .tf
files. *.tfvars
files don’t need to be distributed with your module unless you are also using it as a standalone Terraform configuration because module input variables are set via arguments to the module block in your configuration.
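If you keep your module in a Git repository, one common way to keep these files out of the module is a .gitignore like the following (a sketch; adjust to your own workflow):

```
# Local state and state backups -- never commit state
terraform.tfstate
terraform.tfstate.backup

# Installed modules and provider plugins
.terraform/

# Per-deployment variable values, not part of the module itself
*.tfvars
```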
Navigate to your home directory and create your root module by constructing a new main.tf
configuration file. Then create a directory called modules that contains another folder called gcs-static-website-bucket
. You will work with three Terraform configuration files inside the gcs-static-website-bucket
directory: website.tf
, variables.tf
, and outputs.tf
.
cd ~
touch main.tf
mkdir -p modules/gcs-static-website-bucket
cd modules/gcs-static-website-bucket
touch website.tf variables.tf outputs.tf
gcs-static-website-bucket
directory, run the following command to create a file called README.md
with the following content:
tee -a README.md <<EOF
# GCS static website bucket
This module provisions Cloud Storage buckets configured for static website hosting.
EOF
LICENSE
with the following content:
tee -a LICENSE <<EOF
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an “AS IS” BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express orimplied.
See the License for the specific language governing permissions and
limitations under the License.
EOF
Neither of these files is required or used by Terraform. Having them is a best practice for modules that might be shared with others.
Your current module directory structure should now look like this:
main.tf
modules/
└── gcs-static-website-bucket
    ├── LICENSE
    ├── README.md
    ├── website.tf
    ├── outputs.tf
    └── variables.tf
website.tf
file inside the modules/gcs-static-website-bucket
directory:
resource "google_storage_bucket" "bucket" {
  name          = var.name
  project       = var.project_id
  location      = var.location
  storage_class = var.storage_class
  labels        = var.labels
  force_destroy = var.force_destroy

  uniform_bucket_level_access = true

  versioning {
    enabled = var.versioning
  }

  dynamic "retention_policy" {
    for_each = var.retention_policy == null ? [] : [var.retention_policy]
    content {
      is_locked        = var.retention_policy.is_locked
      retention_period = var.retention_policy.retention_period
    }
  }

  dynamic "encryption" {
    for_each = var.encryption == null ? [] : [var.encryption]
    content {
      default_kms_key_name = var.encryption.default_kms_key_name
    }
  }

  dynamic "lifecycle_rule" {
    for_each = var.lifecycle_rules
    content {
      action {
        type          = lifecycle_rule.value.action.type
        storage_class = lookup(lifecycle_rule.value.action, "storage_class", null)
      }
      condition {
        age                   = lookup(lifecycle_rule.value.condition, "age", null)
        created_before        = lookup(lifecycle_rule.value.condition, "created_before", null)
        with_state            = lookup(lifecycle_rule.value.condition, "with_state", null)
        matches_storage_class = lookup(lifecycle_rule.value.condition, "matches_storage_class", null)
        num_newer_versions    = lookup(lifecycle_rule.value.condition, "num_newer_versions", null)
      }
    }
  }
}
The provider documentation is on GitHub.
variables.tf
file in your module andadd the following code:
variable "name" {
  description = "The name of the bucket."
  type        = string
}
variable "project_id" {
  description = "The ID of the project to create the bucket in."
  type        = string
}
variable "location" {
  description = "The location of the bucket."
  type        = string
}
variable "storage_class" {
  description = "The Storage Class of the new bucket."
  type        = string
  default     = null
}
variable "labels" {
  description = "A set of key/value label pairs to assign to the bucket."
  type        = map(string)
  default     = null
}
variable "bucket_policy_only" {
  description = "Enables Bucket Policy Only access to a bucket."
  type        = bool
  default     = true
}
variable "versioning" {
  description = "While set to true, versioning is fully enabled for this bucket."
  type        = bool
  default     = true
}
variable "force_destroy" {
  description = "When deleting a bucket, this boolean option will delete all contained objects. If false, Terraform will fail to delete buckets which contain objects."
  type        = bool
  default     = true
}
variable "iam_members" {
  description = "The list of IAM members to grant permissions on the bucket."
  type = list(object({
    role   = string
    member = string
  }))
  default = []
}
variable "retention_policy" {
  description = "Configuration of the bucket's data retention policy for how long objects in the bucket should be retained."
  type = object({
    is_locked        = bool
    retention_period = number
  })
  default = null
}
variable "encryption" {
  description = "A Cloud KMS key that will be used to encrypt objects inserted into this bucket"
  type = object({
    default_kms_key_name = string
  })
  default = null
}
variable "lifecycle_rules" {
  description = "The bucket's Lifecycle Rules configuration."
  type = list(object({
    # Object with keys:
    # - type - The type of the action of this Lifecycle Rule. Supported values: Delete and SetStorageClass.
    # - storage_class - (Required if action type is SetStorageClass) The target Storage Class of objects affected by this Lifecycle Rule.
    action = any
    # Object with keys:
    # - age - (Optional) Minimum age of an object in days to satisfy this condition.
    # - created_before - (Optional) Creation date of an object in RFC 3339 (e.g. 2017-06-13) to satisfy this condition.
    # - with_state - (Optional) Match to live and/or archived objects. Supported values include: "LIVE", "ARCHIVED", "ANY".
    # - matches_storage_class - (Optional) Storage Class of objects to satisfy this condition. Supported values include: MULTI_REGIONAL, REGIONAL, NEARLINE, COLDLINE, STANDARD, DURABLE_REDUCED_AVAILABILITY.
    # - num_newer_versions - (Optional) Relevant only for versioned objects. The number of newer versions of an object to satisfy this condition.
    condition = any
  }))
  default = []
}
outputs.tf
file inside your module:
output "bucket" {
  description = "The created storage bucket"
  value       = google_storage_bucket.bucket
}
Like variables, outputs in modules perform the same function as they do in the root module but are accessed in a different way. A module’s outputs can be accessed as read-only attributes on the module object, which is available within the configuration that calls the module.
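For example (a sketch: the bucket output defined in the module is the whole google_storage_bucket resource object, so its attributes, such as url, can be read with ordinary dot notation from the calling configuration):

```hcl
output "bucket-url" {
  # Reads the url attribute of the resource object exported by the module.
  value = module.gcs-static-website-bucket.bucket.url
}
```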
main.tf
in your root directory andadd a reference to the new module:
module "gcs-static-website-bucket" {
  source = "./modules/gcs-static-website-bucket"

  name       = var.name
  project_id = var.project_id
  location   = "{{{project_0.default_region | REGION}}}"

  lifecycle_rules = [{
    action = {
      type = "Delete"
    }
    condition = {
      age        = 365
      with_state = "ANY"
    }
  }]
}
outputs.tf
file for your root module:
cd ~
touch outputs.tf
outputs.tf
file:
output "bucket-name" {
  description = "Bucket names."
  value       = module.gcs-static-website-bucket.bucket
}
variables.tf
file:
touch variables.tf
variables.tf
file. Set the variables project_id
and name
to default to your Project ID:
variable "project_id" {
  description = "The ID of the project in which to provision resources."
  type        = string
  default     = "FILL IN YOUR PROJECT ID HERE"
}
variable "name" {
  description = "Name of the buckets to create."
  type        = string
  default     = "FILL IN A (UNIQUE) BUCKET NAME HERE"
}
Note: The name of your storage bucket must be globally unique. Using your name and the date is usually a good way to create a unique bucket name. You can also use your Project ID.
Whenever you add a new module to a configuration, Terraform must install the module before it can be used. Both the terraform get
and terraform init
commands will install and update modules. The terraform init
command will also initialize backends and install plugins.
terraform init
terraform apply
You have now configured and used your own module to create a static website. You may want to visit this static website. Right now there is nothing inside your bucket, so there is nothing to see at the website. In order to see any content, you will need to upload objects to your bucket. You can upload the contents of the www
directory in the GitHub repository.
cd ~
curl https://raw.githubusercontent.com/hashicorp/learn-terraform-modules/master/modules/aws-s3-static-website-bucket/www/index.html > index.html
curl https://raw.githubusercontent.com/hashicorp/learn-terraform-modules/master/modules/aws-s3-static-website-bucket/www/error.html > error.html
YOUR-BUCKET-NAME
with the name of your storage bucket:
gsutil cp *.html gs://YOUR-BUCKET-NAME
https://storage.cloud.google.com/YOUR-BUCKET-NAME/index.html
, replacing YOUR-BUCKET-NAME
with the name of your storage bucket. You should see a basic HTML web page that says Nothing to see here.
Click Check my progress to verify the objective.
Build a module.
Lastly, you will clean up your project by destroying the infrastructure you just created.
terraform destroy
After you respond to the prompt with yes
, Terraform will destroy all of the resources you created by following this lab.
In this lab, you learned the foundations of Terraform modules and how to use a pre-existing module from the Registry. You then built your own module to create a static website hosted on a Cloud Storage bucket. In doing so, you defined inputs, outputs, and variables for your configuration files and learned the best practices for building modules.
These links provide more hands-on practice with Terraform:
…helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated September 19, 2024
Lab Last Tested December 11, 2023
Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.