On deploying both the operator and the agent and integrating, it gets stuck in maintenance #193

javierdelapuente opened this issue Sep 27, 2024 · 0 comments

Bug Description

Deploying both the jenkins-k8s and jenkins-agent-k8s charms and integrating them immediately (without waiting for them to settle) puts the jenkins-k8s charm into maintenance status, and it never gets out of it.

The cause looks related to _on_agent_relation_joined firing before pebble-ready. That event gets deferred. However, when the pebble-ready event later runs, the framework first re-emits the deferred _on_agent_relation_joined, which looks to block here. It then raises TimeoutError, the hook fails, and the charm enters the loop.
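The failure mode above can be sketched without ops: the framework replays deferred events before emitting the new one, so a blocking wait inside the deferred relation-joined handler runs before the pebble-ready handler ever gets a chance to start the service. This is a minimal simulation with hypothetical names (the real handlers live in src/agent.py and src/jenkins.py), not the actual charm code:

```python
# Minimal simulation of the deferred-event replay loop (hypothetical names;
# the real code is _on_agent_relation_joined in src/agent.py, which calls
# jenkins.wait_ready() in src/jenkins.py).
from collections import deque

service_started = False

def wait_ready():
    # Stand-in for jenkins.wait_ready(): the service only starts in
    # on_pebble_ready, which has not run yet, so this always times out.
    if not service_started:
        raise TimeoutError("Timed out waiting for Jenkins to become ready.")

def on_agent_relation_joined():
    wait_ready()  # blocks on a workload that pebble-ready has not started

def on_pebble_ready():
    global service_started
    service_started = True  # never reached while the deferred handler fails

# relation-joined arrived before pebble-ready and was deferred
deferred = deque([on_agent_relation_joined])

def dispatch(new_event):
    # Like framework.reemit(): deferred events replay BEFORE the new event,
    # so the blocking handler runs first and the hook exits non-zero.
    while deferred:
        deferred[0]()       # raises TimeoutError, event stays deferred
        deferred.popleft()
    new_event()

try:
    dispatch(on_pebble_ready)
except TimeoutError as exc:
    print(f"hook failed: {exc}")  # mirrors the jenkins-pebble-ready failures
```

On every subsequent hook the same replay happens first, which matches the repeating TimeoutError tracebacks in the log below.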

ubuntu@my-juju-vm:~$ juju status
Model             Controller  Cloud/Region        Version  SLA          Timestamp
jenkins-tutorial  microk8s    microk8s/localhost  3.5.3    unsupported  12:57:52+02:00

App                Version  Status       Scale  Charm              Channel        Rev  Address         Exposed  Message
jenkins-agent-k8s           waiting          1  jenkins-agent-k8s  latest/stable   25  10.152.183.112  no       installing agent
jenkins-k8s                 maintenance      1  jenkins-k8s        latest/edge    125  10.152.183.207  no       Adding agent node.

Unit                  Workload     Agent  Address      Ports  Message
jenkins-agent-k8s/0*  waiting      idle   10.1.32.139         Waiting for complete relation data.
jenkins-k8s/0*        maintenance  idle   10.1.32.144         Adding agent node.

To Reproduce

Deploying everything at the same time sometimes gets the deployment stuck:

juju add-model jenkins-tutorial
juju deploy jenkins-k8s --channel=latest/edge; juju deploy jenkins-agent-k8s; juju integrate jenkins-k8s:agent jenkins-agent-k8s:agent

I have found it easier to reproduce after deleting the ctr images with something like microk8s ctr images list | awk '{print $1}' | xargs microk8s ctr images delete.

Environment

I used a Multipass VM created with multipass launch --cpus 8 --memory 16G --disk 50G --name my-juju-vm charm-dev.

ubuntu@my-juju-vm:~$ uname -a
Linux my-juju-vm 5.15.0-122-generic #132-Ubuntu SMP Thu Aug 29 13:45:52 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
ubuntu@my-juju-vm:~$ juju controllers
Use --refresh option with this command to see the latest information.

Controller  Model             User   Access     Cloud/Region         Models  Nodes    HA  Version
lxd         welcome-lxd       admin  superuser  localhost/localhost       2      1  none  3.5.3  
microk8s*   jenkins-tutorial  admin  superuser  microk8s/localhost        3      1     -  3.5.3  
ubuntu@my-juju-vm:~$ juju controllers --refresh
Controller  Model             User   Access     Cloud/Region         Models  Nodes    HA  Version
lxd         welcome-lxd       admin  superuser  localhost/localhost       2      1  none  3.5.3  
microk8s*   jenkins-tutorial  admin  superuser  microk8s/localhost        3      -     -  3.5.3  
ubuntu@my-juju-vm:~$ juju version
3.5.3-genericlinux-amd64
ubuntu@my-juju-vm:~$ microk8s version
MicroK8s v1.29.9 revision 7227
ubuntu@my-juju-vm:~$ lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 22.04.5 LTS
Release:	22.04
Codename:	jammy

ubuntu@my-juju-vm:~$ juju status --relations
Model             Controller  Cloud/Region        Version  SLA          Timestamp
jenkins-tutorial  microk8s    microk8s/localhost  3.5.3    unsupported  13:02:03+02:00

App                Version  Status       Scale  Charm              Channel        Rev  Address         Exposed  Message
jenkins-agent-k8s           waiting          1  jenkins-agent-k8s  latest/stable   25  10.152.183.112  no       installing agent
jenkins-k8s                 maintenance      1  jenkins-k8s        latest/edge    125  10.152.183.207  no       Adding agent node.

Unit                  Workload     Agent  Address      Ports  Message
jenkins-agent-k8s/0*  waiting      idle   10.1.32.139         Waiting for complete relation data.
jenkins-k8s/0*        maintenance  idle   10.1.32.144         Adding agent node.

Integration provider     Requirer           Interface         Type     Message
jenkins-agent-k8s:agent  jenkins-k8s:agent  jenkins_agent_v0  regular  

Relevant log output

ubuntu@my-juju-vm:~/github/canonical/jenkins-k8s-operator$ juju debug-log --replay
controller-0: 12:39:51 INFO juju.worker.apicaller [fcc206] "controller-0" successfully connected to "localhost:17070"
controller-0: 12:39:51 INFO juju.worker.logforwarder config change - log forwarding not enabled
controller-0: 12:39:51 INFO juju.worker.logger logger worker started
controller-0: 12:39:51 INFO juju.worker.pruner.action pruner config: max age: 336h0m0s, max collection size 5120M for jenkins-tutorial (fcc206c9-e564-49d0-8af7-2bbd3c537d25)
controller-0: 12:39:51 INFO juju.worker.pruner.statushistory pruner config: max age: 336h0m0s, max collection size 5120M for jenkins-tutorial (fcc206c9-e564-49d0-8af7-2bbd3c537d25)
model-fcc206c9-e564-49d0-8af7-2bbd3c537d25: 12:39:57 INFO juju.worker.caasupgrader abort check blocked until version event received
model-fcc206c9-e564-49d0-8af7-2bbd3c537d25: 12:39:57 INFO juju.worker.caasupgrader unblocking abort check
model-fcc206c9-e564-49d0-8af7-2bbd3c537d25: 12:39:57 INFO juju.worker.muxhttpserver starting http server on [::]:17071
model-fcc206c9-e564-49d0-8af7-2bbd3c537d25: 12:39:57 INFO juju.worker.caasadmission ensuring model k8s webhook configurations
controller-0: 12:40:02 INFO juju.worker.caasapplicationprovisioner.runner start "jenkins-k8s"
controller-0: 12:40:02 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:02 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:02 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:05 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:05 INFO juju.worker.caasapplicationprovisioner.runner start "jenkins-agent-k8s"
controller-0: 12:40:06 INFO juju.worker.caasapplicationprovisioner.jenkins-agent-k8s scaling application "jenkins-agent-k8s" to desired scale 1
controller-0: 12:40:06 INFO juju.worker.caasapplicationprovisioner.jenkins-agent-k8s scaling application "jenkins-agent-k8s" to desired scale 1
controller-0: 12:40:08 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:09 INFO juju.worker.caasapplicationprovisioner.jenkins-agent-k8s scaling application "jenkins-agent-k8s" to desired scale 1
controller-0: 12:40:12 INFO juju.worker.caasapplicationprovisioner.jenkins-agent-k8s scaling application "jenkins-agent-k8s" to desired scale 1
controller-0: 12:40:12 INFO juju.worker.caasapplicationprovisioner.jenkins-k8s scaling application "jenkins-k8s" to desired scale 1
controller-0: 12:40:23 INFO juju.worker.caasapplicationprovisioner.jenkins-agent-k8s scaling application "jenkins-agent-k8s" to desired scale 1
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.cmd running containerAgent [3.5.3 63d460f9ee6c7c710131961390687e7a0ab90470 gc go1.21.12]
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.cmd.containeragent.unit start "unit"
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.upgradesteps upgrade steps for 3.5.3 have already been run.
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.probehttpserver starting http server on 127.0.0.1:65301
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.api cannot resolve "controller-service.controller-microk8s.svc.cluster.local": lookup controller-service.controller-microk8s.svc.cluster.local: operation was canceled
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.api connection established to "wss://10.152.183.82:17070/model/fcc206c9-e564-49d0-8af7-2bbd3c537d25/api"
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.apicaller [fcc206] "unit-jenkins-agent-k8s-0" successfully connected to "10.152.183.82:17070"
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.migrationminion migration migration phase is now: NONE
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.logger logger worker started
unit-jenkins-agent-k8s-0: 12:40:25 WARNING juju.worker.proxyupdater unable to set snap core settings [proxy.http= proxy.https= proxy.store=]: exec: "snap": executable file not found in $PATH, output: ""
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.leadership jenkins-agent-k8s/0 promoted to leadership of jenkins-agent-k8s
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.caasupgrader abort check blocked until version event received
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.caasupgrader unblocking abort check
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.agent.tools ensure jujuc symlinks in /var/lib/juju/tools/unit-jenkins-agent-k8s-0
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.uniter unit "jenkins-agent-k8s/0" started
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.uniter resuming charm install
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.worker.uniter.charm downloading ch:amd64/jammy/jenkins-agent-k8s-25 from API server
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.downloader downloading from ch:amd64/jammy/jenkins-agent-k8s-25
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.downloader download complete ("ch:amd64/jammy/jenkins-agent-k8s-25")
unit-jenkins-agent-k8s-0: 12:40:25 INFO juju.downloader download verified ("ch:amd64/jammy/jenkins-agent-k8s-25")
unit-jenkins-k8s-0: 12:40:26 INFO juju.cmd running containerAgent [3.5.3 63d460f9ee6c7c710131961390687e7a0ab90470 gc go1.21.12]
unit-jenkins-k8s-0: 12:40:26 INFO juju.cmd.containeragent.unit start "unit"
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.upgradesteps upgrade steps for 3.5.3 have already been run.
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.probehttpserver starting http server on 127.0.0.1:65301
unit-jenkins-k8s-0: 12:40:26 INFO juju.api cannot resolve "controller-service.controller-microk8s.svc.cluster.local": lookup controller-service.controller-microk8s.svc.cluster.local: operation was canceled
unit-jenkins-k8s-0: 12:40:26 INFO juju.api connection established to "wss://10.152.183.82:17070/model/fcc206c9-e564-49d0-8af7-2bbd3c537d25/api"
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.apicaller [fcc206] "unit-jenkins-k8s-0" successfully connected to "10.152.183.82:17070"
unit-jenkins-k8s-0: 12:40:26 INFO juju.api connection established to "wss://controller-service.controller-microk8s.svc.cluster.local:17070/model/fcc206c9-e564-49d0-8af7-2bbd3c537d25/api"
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.apicaller [fcc206] "unit-jenkins-k8s-0" successfully connected to "controller-service.controller-microk8s.svc.cluster.local:17070"
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.migrationminion migration migration phase is now: NONE
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.logger logger worker started
unit-jenkins-k8s-0: 12:40:26 WARNING juju.worker.proxyupdater unable to set snap core settings [proxy.http= proxy.https= proxy.store=]: exec: "snap": executable file not found in $PATH, output: ""
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.leadership jenkins-k8s/0 promoted to leadership of jenkins-k8s
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.caasupgrader abort check blocked until version event received
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.caasupgrader unblocking abort check
unit-jenkins-k8s-0: 12:40:26 INFO juju.agent.tools ensure jujuc symlinks in /var/lib/juju/tools/unit-jenkins-k8s-0
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.uniter unit "jenkins-k8s/0" started
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.uniter resuming charm install
unit-jenkins-k8s-0: 12:40:26 INFO juju.worker.uniter.charm downloading ch:amd64/jammy/jenkins-k8s-125 from API server
unit-jenkins-k8s-0: 12:40:26 INFO juju.downloader downloading from ch:amd64/jammy/jenkins-k8s-125
unit-jenkins-k8s-0: 12:40:26 INFO juju.downloader download complete ("ch:amd64/jammy/jenkins-k8s-125")
unit-jenkins-k8s-0: 12:40:26 INFO juju.downloader download verified ("ch:amd64/jammy/jenkins-k8s-125")
unit-jenkins-agent-k8s-0: 12:40:29 INFO juju.worker.uniter hooks are retried true
unit-jenkins-agent-k8s-0: 12:40:29 INFO juju.worker.uniter found queued "install" hook
unit-jenkins-agent-k8s-0: 12:40:30 INFO unit.jenkins-agent-k8s/0.juju-log Running legacy hooks/install.
unit-jenkins-agent-k8s-0: 12:40:30 INFO juju.worker.uniter.operation ran "install" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:40:31 INFO juju.worker.uniter.operation ran "agent-relation-created" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:40:31 INFO juju.worker.uniter found queued "leader-elected" hook
unit-jenkins-agent-k8s-0: 12:40:31 INFO juju.worker.uniter.operation ran "leader-elected" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:31 INFO juju.worker.uniter hooks are retried true
unit-jenkins-agent-k8s-0: 12:40:32 WARNING unit.jenkins-agent-k8s/0.juju-log Jenkins agent container not yet ready. Deferring.
unit-jenkins-k8s-0: 12:40:32 INFO juju.worker.uniter found queued "install" hook
unit-jenkins-agent-k8s-0: 12:40:32 INFO juju.worker.uniter.operation ran "config-changed" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:40:32 INFO juju.worker.uniter found queued "start" hook
unit-jenkins-agent-k8s-0: 12:40:32 INFO unit.jenkins-agent-k8s/0.juju-log Running legacy hooks/start.
unit-jenkins-agent-k8s-0: 12:40:32 WARNING unit.jenkins-agent-k8s/0.juju-log Jenkins agent container not yet ready. Deferring.
unit-jenkins-k8s-0: 12:40:32 INFO unit.jenkins-k8s/0.juju-log Running legacy hooks/install.
unit-jenkins-agent-k8s-0: 12:40:33 INFO juju.worker.uniter.operation ran "start" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:33 INFO juju.worker.uniter.operation ran "install" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:40:33 WARNING unit.jenkins-agent-k8s/0.juju-log agent:0: Jenkins agent container not yet ready. Deferring.
unit-jenkins-agent-k8s-0: 12:40:33 INFO unit.jenkins-agent-k8s/0.juju-log agent:0: agent relation joined.
unit-jenkins-agent-k8s-0: 12:40:34 INFO juju.worker.uniter.operation ran "agent-relation-joined" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:34 INFO juju.worker.uniter.operation ran "agent-relation-created" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:34 INFO juju.worker.uniter found queued "leader-elected" hook
unit-jenkins-agent-k8s-0: 12:40:34 WARNING unit.jenkins-agent-k8s/0.juju-log agent:0: Jenkins agent container not yet ready. Deferring.
unit-jenkins-agent-k8s-0: 12:40:34 INFO unit.jenkins-agent-k8s/0.juju-log agent:0: agent relation changed.
unit-jenkins-agent-k8s-0: 12:40:34 WARNING unit.jenkins-agent-k8s/0.juju-log agent:0: Jenkins agent container not yet ready. Deferring.
unit-jenkins-agent-k8s-0: 12:40:34 INFO juju.worker.uniter.operation ran "agent-relation-changed" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:35 INFO juju.worker.uniter.operation ran "leader-elected" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:35 INFO juju.worker.uniter.operation ran "jenkins-home-storage-attached" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:36 INFO juju.worker.uniter.operation ran "config-changed" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:36 INFO juju.worker.uniter found queued "start" hook
unit-jenkins-k8s-0: 12:40:37 INFO unit.jenkins-k8s/0.juju-log Running legacy hooks/start.
unit-jenkins-k8s-0: 12:40:37 INFO juju.worker.uniter.operation ran "start" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:38 WARNING unit.jenkins-k8s/0.juju-log agent:0: Service not yet ready. Deferring.
unit-jenkins-k8s-0: 12:40:38 INFO juju.worker.uniter.operation ran "agent-relation-joined" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:40:38 WARNING unit.jenkins-k8s/0.juju-log agent:0: Service not yet ready. Deferring.
unit-jenkins-k8s-0: 12:40:39 INFO juju.worker.uniter.operation ran "agent-relation-changed" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:41:17 INFO unit.jenkins-agent-k8s/0.juju-log agent relation changed.
unit-jenkins-agent-k8s-0: 12:41:17 INFO unit.jenkins-agent-k8s/0.juju-log Waiting for complete relation data.
unit-jenkins-agent-k8s-0: 12:41:17 WARNING unit.jenkins-agent-k8s/0.juju-log Preconditions not ready.
unit-jenkins-agent-k8s-0: 12:41:18 INFO juju.worker.uniter.operation ran "jenkins-agent-k8s-pebble-ready" hook (via hook dispatching script: dispatch)
unit-jenkins-agent-k8s-0: 12:45:14 INFO unit.jenkins-agent-k8s/0.juju-log agent relation changed.
unit-jenkins-agent-k8s-0: 12:45:14 INFO unit.jenkins-agent-k8s/0.juju-log Waiting for complete relation data.
unit-jenkins-agent-k8s-0: 12:45:14 INFO juju.worker.uniter.operation ran "update-status" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:47:38 ERROR unit.jenkins-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 234, in wait_ready
    _wait_for(self._is_ready, timeout=timeout, check_interval=check_interval)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 672, in _wait_for
    raise TimeoutError()
TimeoutError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/./src/charm.py", line 207, in <module>
    ops.main.main(JenkinsK8sOperatorCharm)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 553, in main
    manager.run()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 529, in run
    self._emit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 515, in _emit
    self.framework.reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 863, in reemit
    self._reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 943, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/agent.py", line 165, in _on_agent_relation_joined
    self.jenkins.wait_ready()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 236, in wait_ready
    raise TimeoutError("Timed out waiting for Jenkins to become ready.") from exc
TimeoutError: Timed out waiting for Jenkins to become ready.
unit-jenkins-k8s-0: 12:47:38 ERROR juju.worker.uniter.operation hook "jenkins-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-jenkins-k8s-0: 12:47:38 ERROR juju.worker.uniter pebble poll failed for container "jenkins": failed to send pebble-ready event: hook failed
unit-jenkins-agent-k8s-0: 12:50:29 INFO unit.jenkins-agent-k8s/0.juju-log agent relation changed.
unit-jenkins-agent-k8s-0: 12:50:29 INFO unit.jenkins-agent-k8s/0.juju-log Waiting for complete relation data.
unit-jenkins-agent-k8s-0: 12:50:29 INFO juju.worker.uniter.operation ran "update-status" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:52:49 ERROR unit.jenkins-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 234, in wait_ready
    _wait_for(self._is_ready, timeout=timeout, check_interval=check_interval)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 672, in _wait_for
    raise TimeoutError()
TimeoutError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/./src/charm.py", line 207, in <module>
    ops.main.main(JenkinsK8sOperatorCharm)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 553, in main
    manager.run()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 529, in run
    self._emit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 515, in _emit
    self.framework.reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 863, in reemit
    self._reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 943, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/agent.py", line 165, in _on_agent_relation_joined
    self.jenkins.wait_ready()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 236, in wait_ready
    raise TimeoutError("Timed out waiting for Jenkins to become ready.") from exc
TimeoutError: Timed out waiting for Jenkins to become ready.
unit-jenkins-k8s-0: 12:52:50 ERROR juju.worker.uniter.operation hook "jenkins-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-jenkins-k8s-0: 12:52:50 ERROR juju.worker.uniter pebble poll failed for container "jenkins": failed to send pebble-ready event: hook failed
unit-jenkins-agent-k8s-0: 12:56:11 INFO unit.jenkins-agent-k8s/0.juju-log agent relation changed.
unit-jenkins-agent-k8s-0: 12:56:11 INFO unit.jenkins-agent-k8s/0.juju-log Waiting for complete relation data.
unit-jenkins-agent-k8s-0: 12:56:11 INFO juju.worker.uniter.operation ran "update-status" hook (via hook dispatching script: dispatch)
unit-jenkins-k8s-0: 12:58:01 ERROR unit.jenkins-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 234, in wait_ready
    _wait_for(self._is_ready, timeout=timeout, check_interval=check_interval)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 672, in _wait_for
    raise TimeoutError()
TimeoutError

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/./src/charm.py", line 207, in <module>
    ops.main.main(JenkinsK8sOperatorCharm)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 553, in main
    manager.run()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 529, in run
    self._emit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/main.py", line 515, in _emit
    self.framework.reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 863, in reemit
    self._reemit()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/venv/ops/framework.py", line 943, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/agent.py", line 165, in _on_agent_relation_joined
    self.jenkins.wait_ready()
  File "/var/lib/juju/agents/unit-jenkins-k8s-0/charm/src/jenkins.py", line 236, in wait_ready
    raise TimeoutError("Timed out waiting for Jenkins to become ready.") from exc
TimeoutError: Timed out waiting for Jenkins to become ready.
unit-jenkins-k8s-0: 12:58:01 ERROR juju.worker.uniter.operation hook "jenkins-pebble-ready" (via hook dispatching script: dispatch) failed: exit status 1
unit-jenkins-k8s-0: 12:58:01 ERROR juju.worker.uniter pebble poll failed for container "jenkins": failed to send pebble-ready event: hook failed
unit-jenkins-agent-k8s-0: 13:01:32 INFO unit.jenkins-agent-k8s/0.juju-log agent relation changed.
unit-jenkins-agent-k8s-0: 13:01:32 INFO unit.jenkins-agent-k8s/0.juju-log Waiting for complete relation data.
unit-jenkins-agent-k8s-0: 13:01:32 INFO juju.worker.uniter.operation ran "update-status" hook (via hook dispatching script: dispatch)

Additional context

No response
