Terraform Nuggets: Disallow deletion of a Terraform resource

We can get into trouble if we run terraform destroy on a stack that contains a critical resource still in use by some other stack.


To avoid such situations and prevent deletion of critical resources managed via Terraform, we can add a lifecycle block to the resource definition with the prevent_destroy attribute set to true, as below:

 resource "digitalocean_droplet" "db" {
    # Required droplet arguments (example values)
    image  = "ubuntu-22-04-x64"
    name   = "db-1"
    region = "nyc3"
    size   = "s-1vcpu-1gb"

    lifecycle {
            prevent_destroy = true
    }
 }
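With this in place, any plan that would destroy the resource, including terraform destroy, is rejected. The exact wording varies by Terraform version, but the error reads roughly like this:

```
Error: Instance cannot be destroyed

Resource digitalocean_droplet.db has lifecycle.prevent_destroy set, but
the plan calls for this resource to be destroyed.
```

Note that prevent_destroy only guards against destruction through Terraform; removing the resource block from the configuration entirely (so the lifecycle setting disappears with it) would still allow deletion.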



The other arguments that can be used in the lifecycle block are:

1. create_before_destroy - By default, for resources that cannot be updated in place, Terraform first destroys the old resource and then creates its replacement. Setting this argument to true reverses that order: the new resource is created first, and the old one is destroyed only after the replacement succeeds, which reduces downtime during replacements.
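A minimal sketch of create_before_destroy (the resource name and argument values here are illustrative, not from the original post):

```hcl
resource "digitalocean_droplet" "web" {
  image  = "ubuntu-22-04-x64"
  name   = "web-1"
  region = "nyc3"
  size   = "s-1vcpu-1gb"

  lifecycle {
    # When a change forces replacement, build the new droplet first
    # and destroy the old one only after the new one is created.
    create_before_destroy = true
  }
}
```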

2. ignore_changes - By default, Terraform detects any difference between the current settings of a real infrastructure object and the configuration, and plans to update the remote object to match. In some cases, however, settings of a remote object are modified by processes outside of Terraform, which Terraform would then attempt to "fix" on the next run. To let Terraform share management of an object with a separate process, the ignore_changes meta-argument lists resource attributes that Terraform should ignore when planning updates to that remote object.

The listed attributes are still considered when planning a create operation, but are ignored when planning an update. Each entry is the relative address of an attribute within the resource; map and list elements can be referenced using index notation, like tags["Name"] and list[0] respectively.
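For example, if an external process applies extra tags to a droplet, we can tell Terraform not to treat those tags as drift (again, the values below are illustrative):

```hcl
resource "digitalocean_droplet" "db" {
  image  = "ubuntu-22-04-x64"
  name   = "db-1"
  region = "nyc3"
  size   = "s-1vcpu-1gb"

  tags = ["managed-by-terraform"]

  lifecycle {
    # Ignore tag changes made outside Terraform; drift in any
    # other attribute is still detected and corrected as usual.
    ignore_changes = [tags]
  }
}
```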
