
[release/v1.7] Update Go to 1.21.5 and machine-controller to v1.57.4 #2989

Merged — 2 commits merged into release/v1.7 on Dec 21, 2023

Conversation

xmudrii (Member) commented on Dec 19, 2023

What this PR does / why we need it:

Update Go to 1.21.5 and machine-controller to v1.57.4 on the release/v1.7 branch (cherry-pick of #2988).

What type of PR is this?

/kind feature

Does this PR introduce a user-facing change? If so, add your release note here:

- KubeOne is now built with Go 1.21.5
- machine-controller has been updated to v1.57.4

Documentation:

NONE

/assign @kron4eg

@kubermatic-bot kubermatic-bot added the do-not-merge/cherry-pick-not-approved label (indicates that a PR is not yet approved to merge into a release branch) on Dec 19, 2023
@kubermatic-bot kubermatic-bot added this to the KubeOne 1.7 milestone on Dec 19, 2023
@kubermatic-bot kubermatic-bot added the following labels on Dec 19, 2023: release-note (denotes a PR that will be considered when it comes time to generate release notes), kind/feature (categorizes issue or PR as related to a new feature), dco-signoff: yes (denotes that all commits in the pull request have the valid DCO signoff message), and size/XXL (denotes a PR that changes 1000+ lines, ignoring generated files)
xmudrii (Member, Author) commented on Dec 19, 2023

/test pull-kubeone-e2e-aws-default-install-containerd-v1.25.11

embik (Member) commented on Dec 20, 2023

/retest

embik (Member) commented on Dec 20, 2023

We bumped sonobuoy from 0.56.16 to 0.57.1 in https://github.com/kubermatic/infra/pull/2501; I'm fairly sure that's why the jobs started failing.

embik (Member) commented on Dec 20, 2023

I think sonobuoy wait is just broken in 0.57. vmware-tanzu/sonobuoy#1909 looks like a candidate for the breaking change, but I haven't validated that yet. The same thing happens on a KKP user cluster:

$ sonobuoy run --mode=conformance-lite --dns-pod-labels "app.kubernetes.io/name=kube-dns,app=coredns" && sonobuoy wait
INFO[0000] create request issued                         name=sonobuoy namespace= resource=namespaces
INFO[0000] create request issued                         name=sonobuoy-serviceaccount namespace=sonobuoy resource=serviceaccounts
INFO[0000] create request issued                         name=sonobuoy-serviceaccount-sonobuoy namespace= resource=clusterrolebindings
INFO[0000] create request issued                         name=sonobuoy-serviceaccount-sonobuoy namespace= resource=clusterroles
INFO[0000] create request issued                         name=sonobuoy-config-cm namespace=sonobuoy resource=configmaps
INFO[0000] create request issued                         name=sonobuoy-plugins-cm namespace=sonobuoy resource=configmaps
INFO[0000] create request issued                         name=sonobuoy namespace=sonobuoy resource=pods
INFO[0000] create request issued                         name=sonobuoy-aggregator namespace=sonobuoy resource=services
ERRO[0000] error attempting to run sonobuoy: waiting for run to finish: context deadline exceeded

sonobuoy wait just fails instantly:

$ time sonobuoy wait
ERRO[0000] error attempting to run sonobuoy: waiting for run to finish: context deadline exceeded
sonobuoy wait  0.03s user 0.01s system 14% cpu 0.295 total

embik (Member) commented on Dec 20, 2023

Okay, it seems this is the intended behaviour of sonobuoy wait: you need to pass --wait explicitly, because the flag's default value of 0 means the command does not wait at all.
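
For the e2e scripts this would mean passing an explicit duration. A minimal sketch of the adjusted invocation, assuming the same run/wait chain shown above; the 180-minute value is only an illustrative placeholder, and the unit (minutes) is assumed to match sonobuoy run --wait:

$ sonobuoy run --mode=conformance-lite --dns-pod-labels "app.kubernetes.io/name=kube-dns,app=coredns" && \
    sonobuoy wait --wait 180
# --wait 180: explicit wait duration (assumed to be minutes, as with sonobuoy run --wait);
# leaving it at the default of 0 makes the command return immediately, as observed above.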

kron4eg (Member) commented on Dec 21, 2023

/lgtm
/approve

@kubermatic-bot kubermatic-bot added the lgtm label (indicates that a PR is ready to be merged) on Dec 21, 2023
kubermatic-bot (Contributor) commented:
LGTM label has been added.

Git tree hash: 26f3831bad21e20c6d1ae58b70cf34b62347c68c

kubermatic-bot (Contributor) commented:
[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: kron4eg

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@kubermatic-bot kubermatic-bot added the approved label (indicates a PR has been approved by an approver from all required OWNERS files) on Dec 21, 2023
@xmudrii xmudrii added the cherry-pick-approved label (indicates a PR has been approved by release managers) on Dec 21, 2023
@kubermatic-bot kubermatic-bot removed the do-not-merge/cherry-pick-not-approved label (indicates that a PR is not yet approved to merge into a release branch) on Dec 21, 2023
@kubermatic-bot kubermatic-bot merged commit 77ee187 into release/v1.7 on Dec 21, 2023
13 checks passed
@kubermatic-bot kubermatic-bot deleted the go-1.21.5-1.7 branch December 21, 2023 10:00
kpython commented on Feb 21, 2024

When installing kubeone_1.7.2_darwin_arm64.zip, I still see "goVersion": "go1.21.3".
Shouldn't it be go1.21.5?
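
For reference, one way to check which toolchain a downloaded release binary was actually built with is the standard go version -m command; the ./kubeone path is assumed from the archive contents, and the output line is illustrative, based on the goVersion reported above:

$ go version -m ./kubeone
./kubeone: go1.21.3
# a binary rebuilt with the toolchain bump from this PR should report go1.21.5 here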
