k3s crashes completely with "Observed a panic: "integer divide by zero"" #10384

Closed
r1ggah opened this issue Jun 20, 2024 · 10 comments
Labels
kind/upstream-issue This issue appears to be caused by an upstream bug

Comments

r1ggah commented Jun 20, 2024

Environmental Info:
K3s Version:
v1.29.5+k3s1 (4e53a32)

Node(s) CPU architecture, OS, and Version:
Rocky Linux k3s.hostname 5.14.0-427.20.1.el9_4.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jun 7 14:51:39 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux

Cluster Configuration:
Single server configuration

Describe the bug:
As soon as we deploy a PersistentVolume whose nodeAffinity uses the node's short host name rather than its FQDN in the kubernetes.io/hostname values, the whole cluster crashes immediately. The only way we found to recover is a complete reinstall. After the reinstall, deploying the same PV with the fully qualified domain name does not trigger the crash.

Steps To Reproduce:

  • Installed K3s: curl -sfL https://get.k3s.io | sh --kubeconfig - (done on a single node).
  • kubectl apply -f master-pv.yaml

      apiVersion: v1
      kind: PersistentVolume
      metadata:
        name: master-data
      spec:
        capacity:
          storage: 20Gi
        accessModes:
          - ReadWriteOnce
        persistentVolumeReclaimPolicy: Retain
        storageClassName: local-path
        local:
          path: /data/master
        nodeAffinity:
          required:
            nodeSelectorTerms:
              - matchExpressions:
                  - key: kubernetes.io/hostname
                    operator: In
                    values:
                      - k3s
  • The manifest above uses the short host name (k3s) in the nodeAffinity values. For contrast, the same PV with the FQDN does not crash the cluster (see also the label check after this list):

        - key: kubernetes.io/hostname
          operator: In
          values:
            - k3s.domain.com

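The value the PV's nodeAffinity is matched against is the node's kubernetes.io/hostname label, not DNS resolution, so it is worth confirming what the node actually advertises. A quick check (the node name k3s is taken from the manifest above; adjust as needed):

    kubectl get nodes --show-labels | grep kubernetes.io/hostname
    kubectl get node k3s -o jsonpath='{.metadata.labels.kubernetes\.io/hostname}'
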
Expected behavior:
It should, in theory, work with the short host name just as it does with the FQDN.

Actual behavior:
The cluster crashes immediately after the PV is applied, with an "integer divide by zero" panic.

Additional context / logs:

E0619 16:07:33.139374   68254 runtime.go:79] Observed a panic: "integer divide by zero" (runtime error: integer divide by zero)
goroutine 17046 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x56df280?, 0xa49e700})
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:75 +0x85
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x0?})
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:49 +0x6b
panic({0x56df280?, 0xa49e700?})
	/usr/local/go/src/runtime/panic.go:914 +0x21f
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, {0x701faa0, 0xc00e8f9000}, 0x6fb82f0?, 0xc006e4e480)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:503 +0x9f1
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, {0x701faa0, 0xc00e8f9000}, 0xc00cf23a78?, 0xc006e4e480)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:400 +0x33f
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, 0x2?, {0x701faa0, 0xc00e8f9000}, 0xc006b5eff0, {0xc194de71484c8482, 0xb9836182, 0xa6cc5e0}, ...)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:150 +0xf3
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).scheduleOne(0xc00dbfd440, {0x6fb82f0, 0xc00a2d8dc0})
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:112 +0x5cd
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x22
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x6f62c00, 0xc0034be720}, 0x1, 0xc004d7f6e0)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0?, 0x0, 0x0, 0x0?, 0x1?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6fb82f0, 0xc00a2d8dc0}, 0xc00554f730, 0xc00cf23fd0?, 0x17783b8?, 0xd0?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x93
k8s.io/apimachinery/pkg/util/wait.UntilWithContext({0x6fb82f0?, 0xc00a2d8dc0?}, 0x6fb82f0?, 0xc006b5f6d0?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170 +0x25
created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 425
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:414 +0xf6
panic: runtime error: integer divide by zero [recovered]
	panic: runtime error: integer divide by zero

goroutine 17046 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x0?})
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:56 +0xcd
panic({0x56df280?, 0xa49e700?})
	/usr/local/go/src/runtime/panic.go:914 +0x21f
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, {0x701faa0, 0xc00e8f9000}, 0x6fb82f0?, 0xc006e4e480)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:503 +0x9f1
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, {0x701faa0, 0xc00e8f9000}, 0xc00cf23a78?, 0xc006e4e480)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:400 +0x33f
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0xc00dbfd440, {0x6fb82f0, 0xc006b5f810}, 0x2?, {0x701faa0, 0xc00e8f9000}, 0xc006b5eff0, {0xc194de71484c8482, 0xb9836182, 0xa6cc5e0}, ...)
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:150 +0xf3
k8s.io/kubernetes/pkg/scheduler.(*Scheduler).scheduleOne(0xc00dbfd440, {0x6fb82f0, 0xc00a2d8dc0})
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:112 +0x5cd
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x22
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x0?, {0x6f62c00, 0xc0034be720}, 0x1, 0xc004d7f6e0)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0?, 0x0, 0x0, 0x0?, 0x1?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6fb82f0, 0xc00a2d8dc0}, 0xc00554f730, 0xc00cf23fd0?, 0x17783b8?, 0xd0?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x93
k8s.io/apimachinery/pkg/util/wait.UntilWithContext({0x6fb82f0?, 0xc00a2d8dc0?}, 0x6fb82f0?, 0xc006b5f6d0?)
	/go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170 +0x25
created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 425
	/go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:414 +0xf6
@brandond (Member)

I'm not seeing any K3s code here; this appears to be a bug in the Kubernetes scheduler. Have you reported this upstream, or looked for an existing issue that covers it?

r1ggah commented Jun 20, 2024

I have not reported it anywhere else just yet. However, after a deep dive into the issue, it seems to be related to Go itself somehow; for example, Prometheus has also had an "Observed a panic: 'integer divide by zero'" issue.

@brandond (Member)

I mean, yes, divide by zero will cause Go to panic. It is, however, the responsibility of the application code (Kubernetes in this case) to ensure that this does not occur. It is not Go's fault that Kubernetes or Prometheus tries to divide by zero; the bug is in the application code that attempts an invalid mathematical operation.
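
To make that concrete, here is a minimal Go sketch (not the actual scheduler code, just the same kind of index arithmetic over a candidate list) showing the guard the application has to provide:

    package main

    import "fmt"

    // pickNode selects a candidate by cycling an index with a modulo. With an
    // empty slice the modulo divides by zero and Go panics at runtime, so the
    // application code has to check the length itself.
    func pickNode(nodes []string, next int) (string, error) {
        if len(nodes) == 0 { // without this: "runtime error: integer divide by zero"
            return "", fmt.Errorf("no candidate nodes")
        }
        return nodes[next%len(nodes)], nil
    }

    func main() {
        if _, err := pickNode(nil, 3); err != nil {
            fmt.Println("handled gracefully:", err) // instead of crashing the whole process
        }
    }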


r1ggah commented Jun 20, 2024

I figured, yeah. In case you're interested, I'll keep you updated here (or elsewhere) on how it goes, as I'll then file it against the k8s repo specifically.

@brandond (Member)

Feel free to link this issue from your issue in k/k, or vice versa.

caroline-suse-rancher added the kind/upstream-issue label Jun 20, 2024
@debugger24

I am also facing the same issue with v1.29.5+k3s1.

debugger24 commented Jun 22, 2024

Looks like it's fixed by kubernetes/kubernetes#124933.

@debugger24

No, I still have the issue with 1.30.1 too.

Jun 23 01:41:26 pi2 k3s[91704]: I0623 01:41:26.441764   91704 leaderelection.go:260] successfully acquired lease kube-system/kube-scheduler
Jun 23 01:41:26 pi2 k3s[91704]: E0623 01:41:26.472103   91704 runtime.go:79] Observed a panic: "integer divide by zero" (runtime error: integer divide by zero)
Jun 23 01:41:26 pi2 k3s[91704]: goroutine 32916 [running]:
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/runtime.logPanic({0x52e8f60, 0xa1f3860})
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:75 +0x7c
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x40270a81c0?})
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:49 +0x78
Jun 23 01:41:26 pi2 k3s[91704]: panic({0x52e8f60?, 0xa1f3860?})
Jun 23 01:41:26 pi2 k3s[91704]:         /usr/local/go/src/runtime/panic.go:770 +0x124
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, {0x6cd47f8, 0x4038ad6008}, 0x4028dbee40, 0x402a005688)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:505 +0x8a0
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, {0x6cd47f8, 0x4038ad6008}, 0x4028dbee40, 0x402a005688)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:402 +0x25c
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, 0x4028dbee40, {0x6cd47f8, 0x4038ad6008}, 0x40273be230, {0x2?, 0x4025528a20?, 0xa45d580?}, ...)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:149 +0xb8
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).ScheduleOne(0x4038ad0780, {0x6c65c50, 0x4017ce3cc0})
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:111 +0x4c0
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x2c
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x403484dec8?)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x40
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x403484df68, {0x6c0e880, 0x402e0131d0}, 0x1, 0x400c304fc0)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0x90
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x403659ef68, 0x0, 0x0, 0x1, 0x400c304fc0)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x80
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6c65c50, 0x4017ce3cc0}, 0x402a16a840, 0x0, 0x0, 0x1)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x80
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.UntilWithContext(...)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170
Jun 23 01:41:26 pi2 k3s[91704]: created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 32867
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:445 +0x104
Jun 23 01:41:26 pi2 k3s[91704]: panic: runtime error: integer divide by zero [recovered]
Jun 23 01:41:26 pi2 k3s[91704]:         panic: runtime error: integer divide by zero
Jun 23 01:41:26 pi2 k3s[91704]: goroutine 32916 [running]:
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x40270a81c0?})
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:56 +0xe0
Jun 23 01:41:26 pi2 k3s[91704]: panic({0x52e8f60?, 0xa1f3860?})
Jun 23 01:41:26 pi2 k3s[91704]:         /usr/local/go/src/runtime/panic.go:770 +0x124
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, {0x6cd47f8, 0x4038ad6008}, 0x4028dbee40, 0x402a005688)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:505 +0x8a0
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, {0x6cd47f8, 0x4038ad6008}, 0x4028dbee40, 0x402a005688)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:402 +0x25c
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0x4038ad0780, {0x6c65c50, 0x4019ba4b90}, 0x4028dbee40, {0x6cd47f8, 0x4038ad6008}, 0x40273be230, {0x2?, 0x4025528a20?, 0xa45d580?}, ...)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:149 +0xb8
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).ScheduleOne(0x4038ad0780, {0x6c65c50, 0x4017ce3cc0})
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:111 +0x4c0
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x2c
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x403484dec8?)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x40
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x403484df68, {0x6c0e880, 0x402e0131d0}, 0x1, 0x400c304fc0)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0x90
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x403659ef68, 0x0, 0x0, 0x1, 0x400c304fc0)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x80
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6c65c50, 0x4017ce3cc0}, 0x402a16a840, 0x0, 0x0, 0x1)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x80
Jun 23 01:41:26 pi2 k3s[91704]: k8s.io/apimachinery/pkg/util/wait.UntilWithContext(...)
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170
Jun 23 01:41:26 pi2 k3s[91704]: created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 32867
Jun 23 01:41:26 pi2 k3s[91704]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:445 +0x104
Jun 23 01:41:27 pi2 systemd[1]: k3s.service: Main process exited, code=exited, status=2/INVALIDARGUMENT

@brandond (Member)

Try with v1.30.2-rc3+k3s1 - if you still get a crash please post the logs.
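
For reference, one way to test a specific release with the install script is to pin the version via the INSTALL_K3S_VERSION environment variable (a sketch, assuming the standard get.k3s.io install script):

    curl -sfL https://get.k3s.io | INSTALL_K3S_VERSION="v1.30.2-rc3+k3s1" sh -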

nettnikl commented Jul 1, 2024

Had the same thing occur on k3s v1.30.0+k3s1 (14549535). It had been in a crash loop since I added a few resources there.

Jul 02 00:01:43 mydevice k3s[540610]: E0702 00:01:43.815367  540610 runtime.go:79] Observed a panic: "integer divide by zero" (runtime error: integer divide by zero)
Jul 02 00:01:43 mydevice k3s[540610]: goroutine 33922 [running]:
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/runtime.logPanic({0x52e9ae0, 0xa1f3860})
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:75 +0x7c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x400c54e700?})
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:49 +0x78
Jul 02 00:01:43 mydevice k3s[540610]: panic({0x52e9ae0?, 0xa1f3860?})
Jul 02 00:01:43 mydevice k3s[540610]:         /usr/local/go/src/runtime/panic.go:770 +0x124
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0x401a353d40, {0x6c66870, 0x40034fe280}, {0x6cd5418, 0x4006bced88}, 0x4007b58380, 0x401a9f6908)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:502 +0x88c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0x401a353d40, {0x6c66870, 0x40034fe280}, {0x6cd5418, 0x4006bced88}, 0x4007b58380, 0x401a9f6908)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:402 +0x25c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0x401a353d40, {0x6c66870, 0x40034fe280}, 0x4007b58380, {0x6cd5418, 0x4006bced88}, 0x401a88c050, {0x2?, 0x28?, 0xa45d5c0?}, ...)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:149 +0xb8
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).ScheduleOne(0x401a353d40, {0x6c66870, 0x40041b9310})
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:111 +0x4c0
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x2c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x40
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x4004b25f68, {0x6c0f4a0, 0x401a6dba10}, 0x1, 0x401a3e8a80)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0x90
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400b609768, 0x0, 0x0, 0x1, 0x401a3e8a80)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x80
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6c66870, 0x40041b9310}, 0x400a774160, 0x0, 0x0, 0x1)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x80
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.UntilWithContext(...)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170
Jul 02 00:01:43 mydevice k3s[540610]: created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 2344
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:445 +0x104
Jul 02 00:01:43 mydevice k3s[540610]: panic: runtime error: integer divide by zero [recovered]
Jul 02 00:01:43 mydevice k3s[540610]:         panic: runtime error: integer divide by zero
Jul 02 00:01:43 mydevice k3s[540610]: goroutine 33922 [running]:
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x400c54e700?})
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/runtime/runtime.go:56 +0xe0
Jul 02 00:01:43 mydevice k3s[540610]: panic({0x52e9ae0?, 0xa1f3860?})
Jul 02 00:01:43 mydevice k3s[540610]:         /usr/local/go/src/runtime/panic.go:770 +0x124
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).findNodesThatFitPod(0x401a353d40, {0x6c66870, 0x40034fe280}, {0x6cd5418, 0x4006bced88}, 0x4007b58380, 0x401a9f6908)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:502 +0x88c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulePod(0x401a353d40, {0x6c66870, 0x40034fe280}, {0x6cd5418, 0x4006bced88}, 0x4007b58380, 0x401a9f6908)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:402 +0x25c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).schedulingCycle(0x401a353d40, {0x6c66870, 0x40034fe280}, 0x4007b58380, {0x6cd5418, 0x4006bced88}, 0x401a88c050, {0x2?, 0x28?, 0xa45d5c0?}, ...)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:149 +0xb8
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/kubernetes/pkg/scheduler.(*Scheduler).ScheduleOne(0x401a353d40, {0x6c66870, 0x40041b9310})
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/schedule_one.go:111 +0x4c0
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext.func1()
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x2c
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:226 +0x40
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x4004b25f68, {0x6c0f4a0, 0x401a6dba10}, 0x1, 0x401a3e8a80)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:227 +0x90
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x400b609768, 0x0, 0x0, 0x1, 0x401a3e8a80)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:204 +0x80
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.JitterUntilWithContext({0x6c66870, 0x40041b9310}, 0x400a774160, 0x0, 0x0, 0x1)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:259 +0x80
Jul 02 00:01:43 mydevice k3s[540610]: k8s.io/apimachinery/pkg/util/wait.UntilWithContext(...)
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/kubernetes/staging/src/k8s.io/[email protected]/pkg/util/wait/backoff.go:170
Jul 02 00:01:43 mydevice k3s[540610]: created by k8s.io/kubernetes/pkg/scheduler.(*Scheduler).Run in goroutine 2344
Jul 02 00:01:43 mydevice k3s[540610]:         /go/pkg/mod/github.com/k3s-io/[email protected]/pkg/scheduler/scheduler.go:445 +0x104
Jul 02 00:01:43 mydevice systemd[1]: k3s.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Jul 02 00:01:43 mydevice systemd[1]: k3s.service: Failed with result 'exit-code'.

Looking at the linked commit, I suppose the bit that caused the trouble was:

  nodeAffinity:
    required:
      nodeSelectorTerms:
      - matchExpressions:
        - key: kubernetes.io/hostname
          operator: In
          values:
          - mydevice.lan

Updated to v1.30.2+k3s1 (aa4794b3); the issue is now solved. Thanks.

brandond closed this as completed Jul 1, 2024