
run acceptance tests with multiple versions of kubernetes #2563

Merged 5 commits on Aug 13, 2024

Conversation

coryflucas
Contributor

Description

Leverage a matrix to run the acceptance tests against multiple versions of Kubernetes.

The versions are based on the default version previously used (1.27) and the versions used in the manifest acceptance tests, but with corrected SHAs to match the version of kind in use. Note: there is no 1.30 image available for kind 0.20 (it's available for kind 0.23), so I opted to use 1.29.
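
As a rough sketch of the approach (the job name, action versions, and the image tags below are illustrative placeholders, not the exact values in this PR; real kindest/node tags and digests must match the kind release in use), such a matrix might look like:

```yaml
# Illustrative sketch of a version matrix for the acceptance-test job.
# Each kind release only supports a specific set of node images, so the
# image tags here have to be chosen to match the pinned kind version.
jobs:
  acceptance_tests:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false          # let every k8s version report its result
      matrix:
        kubernetes_version:
          - v1.27.3
          - v1.25.11
          - v1.23.17
    steps:
      - uses: actions/checkout@v4
      - name: Create kind cluster
        uses: helm/kind-action@v1
        with:
          version: v0.20.0
          node_image: kindest/node:${{ matrix.kubernetes_version }}
```

With `fail-fast: false`, a failure against one Kubernetes version doesn't cancel the runs against the others, so the workflow shows which versions pass and which don't.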

Acceptance tests

N/A

Release Note

Release note for CHANGELOG:

NONE

References

fixes #2562

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

@BBBmau
Contributor

BBBmau commented Aug 5, 2024

Can you provide a link to a workflow run with your changes to confirm? We should be able to see the matrix of k8s versions all passing.

@coryflucas
Contributor Author

I had originally thought the workflow had already run in my branch, but it hadn't, because the workflow is restricted to run only against the HashiCorp org. I got it running and found that a subset of the tests fail against older versions of Kubernetes. The failures fall into 3 categories:

  1. They cover features that aren't supported in those k8s versions.
  2. They test a mix of features, some supported and some not supported in those k8s versions, so the tests just need refactoring.
  3. The provider has unexpected behavior with older versions of k8s. The number of issues in this category looks to be pretty small (3 right now), and they are relatively minor (mostly an apply followed by a plan still showing changes).

I'd like to propose the following to get the tests in a passing state:

  • For all 3 categories of tests, add a skipIfClusterVersionLessThan precheck to the tests. This resolves category 1, gets the category 2 and 3 tests back to a passing state, and still has them running against more versions of k8s than they run against today.
  • For tests in category 2, open an issue with the list of tests and the versions they should work under, which I'm happy to pick up as a follow-up to this. I'm thinking a single issue should be ok, given it should just be test refactoring. I had started doing this in my current branch, but it got a bit larger than I expected pretty quickly.
  • For tests in category 3, open an issue for each individual problem. Again, I am happy to start looking at these as well if we deem them worth fixing.

Let me know what you think about this approach and I'll follow up.
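
The version-precheck idea above boils down to comparing the cluster's major.minor version against a minimum before running a test. A minimal sketch of that comparison (the function name and signature here are hypothetical, illustrating the idea behind a skipIfClusterVersionLessThan precheck, not the provider's actual helper):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// clusterVersionLessThan reports whether a cluster version string such as
// "1.23" or "v1.23.17" is older than a required minimum "major.minor".
// Hypothetical sketch of a skip precheck; a real test would call
// t.Skip when this returns true.
func clusterVersionLessThan(version, minimum string) bool {
	parse := func(v string) (major, minor int) {
		parts := strings.SplitN(strings.TrimPrefix(v, "v"), ".", 3)
		major, _ = strconv.Atoi(parts[0])
		if len(parts) > 1 {
			minor, _ = strconv.Atoi(parts[1])
		}
		return major, minor
	}
	gotMajor, gotMinor := parse(version)
	wantMajor, wantMinor := parse(minimum)
	if gotMajor != wantMajor {
		return gotMajor < wantMajor
	}
	return gotMinor < wantMinor
}

func main() {
	// A test targeting a 1.25+ feature would be skipped on a 1.23 cluster.
	fmt.Println(clusterVersionLessThan("v1.23.17", "1.25")) // prints true
	fmt.Println(clusterVersionLessThan("v1.29.0", "1.25"))  // prints false
}
```
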

@coryflucas
Contributor Author

@BBBmau I went ahead and made my proposed change in a new branch and have all tests now passing. Here is a breakdown of the issues I found:

  • 4 tests were missing version checks when testing features that are not available on older clusters.
  • 4 tests need refactoring to split testing features that are not available on older clusters out to new tests.
  • 2 tests have failures due to provider issues. Both are cases where, after an apply, a plan still shows changes to be applied on older cluster versions.
  • 2 tests that only run for old versions of k8s had actual bugs that I fixed as well.

Here is a link to the workflow run with the changes + a change to enable the workflow in my fork. Please let me know if you'd like me to merge those changes into this branch or go with a different approach.

@BBBmau
Contributor

BBBmau commented Aug 9, 2024

@coryflucas thanks for working on this! I haven't had the chance to really look at your PR in depth. I plan on doing this next week, just as a heads-up. Apologies for the delay!

@BBBmau BBBmau self-assigned this Aug 12, 2024
@@ -5,7 +5,7 @@ on:
inputs:
kindVersion:
description: The kind version
default: 0.20.0 # Kubernetes version: 1.27.X
default: 0.20.0
Contributor

Really great stuff in your comments! We would love to take in your fixes involving the test failures in old k8s versions.

The only question I have in this PR is leaving it at 0.20. Was there a specific reason for not moving it up to the latest in order to test with 1.30 as well?

We are planning to include 1.30 in order to include some new fields in a few resources. Would upgrading the kind version result in more failing tests? Not sure if you tried this already or not.

Contributor Author

I'll go ahead and get the issues written up and start on PRs for the fixes 👍

On updating kind: I ran against kind 0.23 locally just fine, so I don't expect any real issues. I was mostly trying to keep the scope of this change down, since kind is used in a couple of other workflows as well and I assume we'd want to keep those in sync. Updating requires bumping the version of kind and updating the Kubernetes versions, since the node images are specific to a version of kind. I'm happy to tackle that as another PR or as part of this; it definitely makes sense to me to upgrade.

Contributor Author

Another note: kind v0.23.0 does not support versions of k8s as old as the ones the tests currently run against. It has supported node images going back to 1.25, but not 1.23. Kubernetes 1.23 is almost 3 years old, though, so maybe dropping it from the tests makes sense.

Contributor

Yep let's treat that as another PR. We can perform the kind bump once the PR of bumping kubernetes to v1.30.3 gets merged: #2561

This will also give us a chance to address the issues you discovered prior to bumping to v1.30.3 😄

@github-actions github-actions bot added size/M and removed size/XS labels Aug 13, 2024
Contributor

@BBBmau BBBmau left a comment

I look forward to your upcoming issues / PRs that you came across while working on this. Thanks again!

@BBBmau BBBmau merged commit 6f200bc into hashicorp:main Aug 13, 2024
22 checks passed
Successfully merging this pull request may close these issues.

Run acceptance tests across multiple versions of Kubernetes