Allow module to depend on another module #21378

Closed
Raviadonis opened this issue May 21, 2019 · 4 comments


@Raviadonis

Current Terraform Version

Terraform v0.11.6

Use-cases

I have two different files (i.e., module.tf and resource.tf).

I have written two modules in a single module.tf. The second module should run only after the first module has completed successfully. Each module has to run a provisioner within a null_resource with a different set of variables appropriate to that module, so I thought of making one module depend on the other.

Attempted Solutions

RESOURCE.TF:

resource "aws_instance" "testapp" {
  count                                = "${var.count}"
  ami                                   = "${var.ami_id}"
  key_name                         = "${var.key_name}"
  subnet_id                          = "${var.aws_subnet}"
  private_ip                          = "${var.private_ip}"
  instance_type                   = "${var.instance_type}"
  security_groups                = ["${var.security_groups}"]
  availability_zone               = "${var.availability_zone}"

  root_block_device {
  volume_type                      = "gp2"
  volume_size                       = "${var.root_volume}"
  delete_on_termination      = "true"
  }

  tags {
  Name                                 = "${var.name}${format("%02d", count.index + var.name_suffix)}"
  Owner                                = "${var.owner}"
  Project                                = "${var.project}"
     }
  }


resource "null_resource" "cluster" {

 provisioner "file" {
  source                                 = "${template_dir.scripts.destination_dir}"
  destination                          = "/home/user1/"
   }

 connection {
   host                                = "${element(aws_instance.testapp.*.private_ip, count.index)}"
   type                                = "ssh"
   user                                = "user1"
   private_key                     = "${file("/tmp/cluster/test_key_pair.pem")}"
   }

  provisioner "remote-exec" {
   inline                              = ["sh -x /home/user1/${var.destination}"]
   }
}

output "instance_id" {
  value = "${aws_instance.testapp.id}"
}

MODULE.TF:

module "node_1" {
  source                               = "../../resources/compute"
  ami_id                               = "${var.ami_id}"
  key_name                          = "${var.key_name}"
  private_ip                          = "${var.private_ip}"
  aws_subnet                       = "${var.aws_subnet}"
  name_suffix                          = "${var.name_suffix_dc}"
  destination                          = "${var.destination}"
  root_volume                          = "${var.root_volume}"
  instance_type                        = "${var.instance_type}"
  security_groups                      = ["${var.security_groups}"]
  availability_zone                    = "${var.availability_zone}"

  ### Tag variables ###
  os                                       = "${var.os}"
  name                                 = "${var.hostname}"
  owner                                = "${var.owner}"
  project                              = "${var.project}"
  additional_volume_size    = "${var.additional_volume_size}"
  }


module "node_2" {
  depends_on                           = ["${module.node_1.instance_id}"]
  source                               = "../../resources/compute"
  ami_id                               = "${var.ami_id}"
  key_name                             = "${var.key_name}"
  private_ip                           = "${var.private_ip}"
  aws_subnet                           = "${var.aws_subnet}"
  destination                          = "${var.destination}"
  name_suffix                          = "${var.name_suffix_hc}"
  root_volume                          = "${var.root_volume}"
  instance_type                        = "${var.instance_type}"
  security_groups                      = ["${var.security_groups}"]
  availability_zone                    = "${var.availability_zone}"
  
    ### Tag variables ###
  os                                           = "${var.os}"
  name                                      = "${var.hostname}"
  owner                                     = "${var.owner}"
  project                                    = "${var.project}"
  additional_volume_size          = "${var.additional_volume_size}"
  }

Note: Both modules use the same resource.tf file.

Output

module root: module node_2: depends_on is not a valid argument

References

#10462

@Prabhakar-cg

This is something that is really necessary when launching multiple applications with the same resources. Does anyone have any inputs or suggestions for this issue?

@Raviadonis
Author

Prior to 0.8.0, we had to trick Terraform into doing this with a combination of outputs and other trickery. But Terraform still doesn't allow a module to depend on another module directly via depends_on; that attribute is not available for modules at all.
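For illustration, here is a minimal sketch of that outputs-based trickery in 0.11-style syntax (the wait_on variable and the trigger are hypothetical additions, not part of the configuration above): the shared module accepts an extra variable, a resource inside the module interpolates it, and the root module passes an output of node_1 into node_2 so Terraform infers the ordering. Note that only resources which actually reference the variable are ordered after node_1, which is part of why a first-class depends_on for modules is being requested.

# Hypothetical addition inside ../../resources/compute:
variable "wait_on" {
  default = ""
}

resource "null_resource" "cluster" {
  # Interpolating var.wait_on here makes this resource (and anything that
  # depends on it) wait for whatever value is passed in from the root module.
  triggers {
    wait_on = "${var.wait_on}"
  }

  # ... existing connection and provisioners as shown above ...
}

# In module.tf, thread node_1's output into node_2:
module "node_2" {
  source  = "../../resources/compute"
  wait_on = "${module.node_1.instance_id}"

  # ... remaining arguments unchanged ...
}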

@jbardin
Member

jbardin commented May 21, 2019

Hi @Raviadonis,

Thanks for filing this. We're tracking this feature in the referenced issue #10462. Modules can already be the target of depends_on, so the remaining issue is that depends_on cannot be used in a module itself. That issue is still outstanding because it depends on major changes to how Terraform operates; for example, data sources are going to require something along the lines of #17034 first. Once that is implemented, there should be nothing else needed for this to work.

Thanks!

@jbardin closed this as completed May 21, 2019
@ghost

ghost commented Jul 25, 2019

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.

If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@ghost locked and limited conversation to collaborators Jul 25, 2019