I've created a straightforward module:
.
├── inputs.tf
└── main.tf
Input variables are declared in inputs.tf:
variable "workers" {
type = number
description = "Amount of spark workers"
}
variable "values_values_path" {}
main.tf is:
resource "helm_release" "spark" {
name = "spark"
repository = "https://charts.bitnami.com/bitnami"
chart = "spark"
version = "1.2.21"
namespace = ...
set {
name = "worker.replicaCount"
value = var.workers
}
values = [
"${file("${var.custom_values_path}")}"
]
}
As you can see, I'm trying to deploy a Helm release, and I'd like to set a custom values file, parameterized as custom_values_path. The calling configuration's directory structure is:
.
├── main.tf
├── provider.tf
└── spark_values.yaml
My main.tf here is:
module "spark" {
source = "../modules/spark"
workers = 1
custom_values_path = "./spark_values.yaml"
}
However, I'm getting:
Error: Error in function call
on ../modules/spark/main.tf line 14, in resource "helm_release" "spark":
14: "${file("${var.custom_values_path}")}"
|----------------
| var.custom_values_path is "./spark_values.yaml"
Call to function "file" failed: no file exists at spark_values.yaml.
Complete directory structure is:
.
├── stash
│ ├── main.tf
│ ├── provider.tf
│ └── spark_values.yaml
└── modules
└── spark
├── inputs.tf
└── main.tf
When I run terraform plan, I'm in ./stash, so the complete commands are:
$ cd ./stash
stash $ terraform plan

Error: Error in function call
on ../modules/spark/main.tf line 14, in resource "helm_release" "spark":
14: "${file("${var.custom_values_path}")}"
|----------------
| var.custom_values_path is "./spark_values.yaml"
Call to function "file" failed: no file exists at spark_values.yaml.
Why do I get Call to function "file" failed: no file exists?
Since you are referring to a file in the calling module from a child module, you should provide a path anchored to the calling module's directory using path.module, as follows:
module "spark" {
source = "../modules/spark"
workers = 1
custom_values_path = "${path.module}/spark_values.yaml"
}
I recommend against referring to files across module boundaries in Terraform. You are better off keeping dependencies between modules to variables only, to avoid weird issues like this. An alternative is to pass the entire file content as a variable (this is what the file function returns anyway).
module "spark" {
source = "../modules/spark"
workers = 1
custom_values = file("${path.module}/spark_values.yaml")
}
Then modify your Spark module to expect custom_values containing the file's content rather than its path:
resource "helm_release" "spark" {
name = "spark"
repository = "https://charts.bitnami.com/bitnami"
chart = "spark"
version = "1.2.21"
namespace = ...
set {
name = "worker.replicaCount"
value = var.workers
}
values = [
var.custom_values
]
}
Looking at that, I suspect the values parameter expects list(string), so you might need to use yamldecode on custom_values.
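For completeness, here is a minimal sketch of how the module's inputs.tf could declare the new variable (the type and description are my assumptions, not spelled out above):

variable "workers" {
  type        = number
  description = "Number of Spark workers"
}

variable "custom_values" {
  # Raw YAML content for the Helm release, produced by file() in the caller
  type        = string
  description = "YAML content to pass to the Helm release's values"
}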
I think what is happening here is a problem with the relative path. You are passing the variable to the module, and relative to the module's path, the file spark_values.yaml is located at "../../stash/spark_values.yaml".
When working with the file function, I usually use ${path.module}. I would name the variable after just the file name, spark_values.yaml, and then call it as follows: file("${path.module}/${var.file_name}"). Can you double-check whether that works for your case?
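If it helps, a minimal sketch of that idea (the variable name file_name is my assumption; note that path.module evaluated inside the child module resolves to the module's own directory, modules/spark, so the values file would have to live alongside the module for this to work):

variable "file_name" {
  type        = string
  description = "Name of the Helm values file, e.g. spark_values.yaml"
}

resource "helm_release" "spark" {
  # ... same arguments as above ...

  values = [
    # path.module here points at modules/spark, the child module's directory
    file("${path.module}/${var.file_name}")
  ]
}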