
object.metadata.finalizers causes a diff on every plan for elbv2.k8s.aws/v1beta1.TargetGroupBinding #1442

Closed
achille-roussel opened this issue Oct 6, 2021 · 8 comments
achille-roussel commented Oct 6, 2021

Terraform Version, Provider Version and Kubernetes Version

Terraform version: v1.0.8
Kubernetes provider version: v2.5.0
Kubernetes version: 1.21

Affected Resource(s)

  • kubernetes_manifest

Terraform Configuration Files

resource "kubernetes_manifest" "target_group_binding" {
  manifest = {
    apiVersion = "elbv2.k8s.aws/v1beta1"
    kind       = "TargetGroupBinding"

    metadata = {
      name      = "service"
      namespace = "default"
    }

    spec = {
      serviceRef     = module.tracedb_gateway.service_ref
      targetGroupARN = aws_lb_target_group.example.arn
      networking     = {
        ingress = [{
          from = [{
            securityGroup = {
              groupID = aws_security_group.example.id
            }
          }]
          ports = [{
            port     = 80
            protocol = "TCP"
          }]
        }]
      }
    }
  }
}

Steps to Reproduce

  • terraform apply

Expected Behavior

There should be no diffs in the terraform output when the resource has not changed.

Actual Behavior

│ Error: Provider produced inconsistent result after apply
│
│ When applying changes to module.tracedb_gateway.kubernetes_manifest.target_group_binding, provider "provider[\"registry.terraform.io/hashicorp/kubernetes\"]" produced an unexpected new value: .object: wrong final value type: incorrect object attributes.
│
│ This is a bug in the provider, which should be reported in the provider's own issue tracker.

Important Factoids

  • If the manifest.metadata.finalizers field is explicitly set, the plan does not show a diff, and there are no errors on apply:

resource "kubernetes_manifest" "target_group_binding" {
  manifest = {
    ...

    metadata = {
      ...
      finalizers = ["elbv2.k8s.aws/resources"]
    }

    ...
  }
}

  • Ignoring changes on the fields does not seem to address the issue.
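For reference, "ignoring changes" here presumably means a lifecycle block along these lines (a sketch of the attempted workaround, which per the report above does not suppress the diff):

```hcl
resource "kubernetes_manifest" "target_group_binding" {
  manifest = {
    # ... manifest as shown above ...
  }

  # Attempted workaround: tell Terraform to ignore drift on the manifest
  # argument. Per the report, this does not help, presumably because the
  # diff is detected on the computed "object" attribute rather than on
  # the "manifest" argument itself.
  lifecycle {
    ignore_changes = [manifest]
  }
}
```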

References

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment
@alexsomesan
Member

Hi! This sounds like the finalizers field is behaving like what Terraform calls a "computed" value, meaning the API will change its value from the one supplied during apply.

You can get around this by specifying the "finalizers" attribute in the list of computed fields, like this:

resource "kubernetes_manifest" "target_group_binding" {
  computed_fields = ["metadata.finalizers"]
  manifest = {

Be aware that, by default, when computed_fields is not set by the user, it has a default value that includes "metadata.annotations" and "metadata.labels". For robustness, you should also include these two whenever you explicitly set a value for "computed_fields".

In the end, it should look like this:

resource "kubernetes_manifest" "target_group_binding" {
  computed_fields = ["metadata.finalizers", "metadata.annotations", "metadata.labels"]
  manifest = {

Let us know if this resolves your issue.
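Combined with the manifest from the original report, the suggestion amounts to something like this (a sketch assembled from the snippets above; module.tracedb_gateway and the aws_lb_target_group reference come from the reporter's own configuration):

```hcl
resource "kubernetes_manifest" "target_group_binding" {
  # Declare server-populated fields as computed so that values added by
  # the cluster after apply (e.g. the controller's finalizer) are not
  # reported as drift on subsequent plans.
  computed_fields = ["metadata.finalizers", "metadata.annotations", "metadata.labels"]

  manifest = {
    apiVersion = "elbv2.k8s.aws/v1beta1"
    kind       = "TargetGroupBinding"

    metadata = {
      name      = "service"
      namespace = "default"
    }

    spec = {
      serviceRef     = module.tracedb_gateway.service_ref
      targetGroupARN = aws_lb_target_group.example.arn
      # spec.networking omitted here; same as in the original report.
    }
  }
}
```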

@achille-roussel
Author

Specifying the list of computed fields does not address the issue. So far the only workaround has been to explicitly declare the finalizers list.

@alexsomesan
Copy link
Member

@achille-roussel I was able to apply your example manifest on a freshly created EKS cluster.

We will need more precise information about the cluster configuration where you are seeing this issue. Please share versions of all components used / installed on the cluster and ideally also the provisioning procedure. We will then try to reproduce this again.

Thanks a lot!

@pastjean

I get a similar problem with this exact config, if this helps:

resource "kubernetes_namespace" "istio-system" {
  metadata {
    name = "istio-system"
  }
}

resource "helm_release" "operator" {
  name  = var.name
  chart = "${path.module}/charts/istio-operator"

  values = []
}

or, alternatively, install istioctl and run istioctl operator init

then

resource "kubernetes_manifest" "istio-operator-1-11-4" {
  computed_fields = ["metadata.finalizers", "metadata.annotations", "metadata.labels"]

  manifest = {
    apiVersion = "install.istio.io/v1alpha1"
    kind       = "IstioOperator"

    metadata = {
      name      = "istio-demo"
      namespace = "istio-system"
    }

    spec = {
      profile  = "demo"
    }
  }
}

The IstioOperator then gets finalizers added, and when I run apply a second time it tries to destroy everything.

@achille-roussel
Copy link
Author

@alexsomesan thanks for looking into this.

I also did not experience any issues applying the resource. The problem manifests on subsequent plans: the resource shows a diff every time, even when nothing has changed.

@toddgardner

I had this issue (computed_fields not working, and having to specify the finalizer explicitly), but it resolved after upgrading to the latest version of the provider (2.6.1).

@alexsomesan
Member

Closing this since there are reports of the issue being resolved. Anyone still seeing this issue, please feel free to reopen with relevant details.

@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 27, 2021