2022.7.x docker container broken on 1st gen Raspberry Pi #75142
Comments
Edited to add that the 2022.6.6 container starts up on the same system.
Edited subject to clarify this is a new issue in 2022.7, and the problem persists in 2022.7.5. It's possible to start python3 itself inside the container, so I'd assume the problem is in some module that gets pulled in by homeassistant:
Can confirm Raspberry Pi 1b works with 2022.6.7 as well, but it does not work with 2022.7.*
Yesterday, I replaced the Pi 1b with a Pi 3b+, so I'm not directly affected anymore. I'll keep the old setup around for a couple more days in case anyone picks this up and needs some testing.
Duplicate of #74707?
@danielrheinbay I don't think this is a duplicate, as the ARM architecture (armv6l vs armv7), error message, and Linux kernel version (recent vs old) are all different. The Linux version on my RPi 1b running Raspbian/Debian Bullseye:
@richardzone Fair enough, the environments do differ. I am also affected by this issue (the most recent version of hass core that runs on my Raspi 1b using Docker is 2022.6.7), so I reviewed existing issues and found @frenck's comment in #74707 pointing to the update of the Alpine base image. I was wondering if this update was also what is causing the 2022.7.x versions of hass core to crash on Raspi 1s.
I have not found any indication of a general problem with Alpine 3.16 on the RPi 1. Maybe there's a connection to numpy/numpy#20765 and piwheels/packages#276 though? Unfortunately neither issue seems to mention how they fixed the build problem. Anyway, when I run the Alpine python3 inside my homeassistant container and do an …
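The truncated snippet above presumably shows an import crashing the interpreter. As a hedged sketch (not from the thread; the candidate module list is my assumption based on the packages discussed later in this issue), one way to hunt for the crashing module is to attempt each import in a subprocess, so a module dying with SIGILL does not take the probe itself down:

```python
# Hedged sketch: probe candidate imports in subprocesses so that a
# module dying with SIGILL (Illegal instruction) does not kill the
# probe script. The CANDIDATES list is an assumption, not from the thread.
import subprocess
import sys

CANDIDATES = ["numpy", "cryptography", "jwt"]

def probe(module):
    """Return the exit code of `python -c "import <module>"`.

    0 means the import worked; a negative value is the signal number,
    so -4 corresponds to SIGILL on Linux.
    """
    proc = subprocess.run(
        [sys.executable, "-c", f"import {module}"],
        capture_output=True,
    )
    return proc.returncode

if __name__ == "__main__":
    for mod in CANDIDATES:
        rc = probe(mod)
        print(f"{mod}: {'ok' if rc == 0 else 'crashed or failed (rc=%d)' % rc}")
```

Run inside the container, this narrows the failure down to one compiled module without needing a debugger.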
I am experiencing identical behavior, same platform, with the new 2022.8.1 |
According to the debugger it breaks here:
Stepping into this call just gives me "frozen" gibberish:
But poking around the config utils I found that … Does this make any sense to someone? 😄
Importing crypto libs seems to be the problem:
But this is wrapped in a try/except block... so it might be a red herring: https://github.com/jpadilla/pyjwt/blob/0bef0fbff5c245668578a43774d8620bdba4a6f7/jwt/algorithms.py#L18-L57
Good catch, @larsxschneider! Indeed, cryptography contains compiled modules: https://cryptography.io/en/latest/installation/
@danielrheinbay thanks! I think your link explains it - there is no ARM 32bit version. That's what my QNAP ARM would need, I think.
The same goes for our RasPi 1 and 2, I guess :/
So I have reviewed older versions of the documentation and noticed:
So we may have to dig further to identify the actual root cause.
Just for completeness, on
Apparently … However, I don't think that is the problem here, because …
Re-reading the thread, I think @abochmann really found the core problem in the link above. Apparently a newer …
What do you mean when you say "a newer piwheels", @larsxschneider? From piwheels.org:
From piwheels/packages#276 (comment):
Then, two days later in piwheels/packages#276 (comment):
So the way I understand it is this: …
@danielrheinbay I am no Home Assistant or Python expert, so read my comment with that in mind. Is it possible that the faulty wheels from piwheels were baked into the Home Assistant docker image? If that is true, then the problem could still be present, although the issue was fixed half a year ago.
I am still seeing a boot loop as of version 2022.9.
I found that Home Assistant has its own wheels repo: https://github.com/home-assistant/wheels
Also can confirm that
I found the issue numpy/numpy#18131 in the numpy repo, but it has been fixed.
Interestingly, this issue is also referenced in the wheels issue 276. Issue 276 and the numpy issue were both closed around January 17, but I'm not seeing a clear commit in the commit history. As @kuznetsss pointed out, HA has its own wheels, so it would be worthwhile to verify that the fix implemented on Jan 17 is also in the HA variant.
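Checking what a wheels index actually serves can be done without a Pi at all. A small sketch, under the assumption that the index is a simple PEP 503-style HTML page of links (the sample page below is synthetic, not fetched from wheels.home-assistant.io):

```python
# Hedged sketch: collect wheel links for a package from a simple-index
# HTML page, so the armv6l/armv7l platform tags can be compared by eye.
# The sample page below is synthetic, not real index output.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather every href from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

def wheel_links(index_html, package):
    """Return the wheel filenames for `package` linked on the page."""
    parser = LinkCollector()
    parser.feed(index_html)
    return [link for link in parser.links
            if link.startswith(package) and link.endswith(".whl")]

sample_index = (
    '<a href="numpy-1.23.2-cp310-cp310-musllinux_1_2_armv7l.whl">armv7l</a>'
    '<a href="numpy-1.23.2-cp310-cp310-musllinux_1_2_armv6l.whl">armv6l</a>'
)
print(wheel_links(sample_index, "numpy"))
```

Pointing this at the real index for numpy would show whether an armv6l build is published at all.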
I bricked my native Python install trying to upgrade, and interestingly it was getting stuck on building cryptography (for some reason no wheel was available), likely related to the Rust dependency: piwheels/packages#276 (comment). Cryptography broke a lot of builds and wheels last year: pyca/cryptography#5771. So now, with a fresh install, I'm trying docker and came across this issue. Interestingly, jwt imports cryptography: https://github.com/GehirnInc/python-jwt/blob/fd684745b2de884a32e6d7a423e659e4d262fb27/setup.py#L27, although I can't find that numpy does.
I've done some investigation into this issue; it is deeper than cryptography and numpy. It extends to all packages with compiled (C/C++/Rust) code.
I confirmed this problem still exists on 2022.11.5. I tried to build 2022.11.5 on my QNAP NAS, but so far I haven't been able to; I'm a noob when it comes to building from source. I tried downloading the HA core repo and adding the build command to my docker compose, but it gives me an error saying I need to check my YAML. I will build on my machine if I can figure it out.
I'm no expert, so I'm not sure if I'm on to something or not. I noticed that this file was created in June specifically for armhf. Could this be our issue? Edit: I've attempted to build the HA core container image using the Alpine and Alpine Python base images as well as the Debian base image. I always get errors installing dependencies for Python astral, and I'm not sure how to resolve that. This is my Dockerfile…
It should be noted that it builds fine with wheels, but then it loops, which is what we expect. I tried to build with wheels and the 2022.06 Alpine base image just to confirm that Alpine wasn't the problem. It loops as well. The next step is to build an image that includes building the wheels, but I'm not going to get there without help, since I keep getting errors while installing build dependencies.
TL;DR: I think the GitHub Actions are configured wrong, using armv7 instead of armv6 for the wheels packages.
Long story (for troubleshooting, in case that is not it): I ran into this same issue on a Raspberry Pi 1B rev 2. It is very easy to reproduce with the built docker containers:
Because it was suggested that the wheels might be broken, I tried to only pull the wheels directly (my poor Pi 1b does not like to compile or download the full requirements). So I created a Dockerfile which only installs the numpy package from the same repository as the GitHub Actions use, and built the container. Dockerfile:
ARG BUILD_FROM
FROM ghcr.io/home-assistant/armhf-homeassistant-base:2022.11.0@sha256:37b3385d3a10a30344ba73f4d5551b7ecb63aca1606b1586e67f34624f592d6c
# Synchronize with homeassistant/core.py:async_stop
ENV \
S6_SERVICES_GRACETIME=220000
WORKDIR /usr/src
## Setup Home Assistant Core dependencies
#COPY requirements.txt homeassistant/
#COPY homeassistant/package_constraints.txt homeassistant/homeassistant/
# test install numpy package
RUN pip3 install --no-cache-dir --no-index --only-binary=:all: --find-links "https://wheels.home-assistant.io/musllinux/" --use-deprecated=legacy-resolver numpy
Build output
The goal was just to get a simple system where I might have a chance of manually building the single package or swapping it to a different source. So I tried to run the import-crash code again, and to my surprise it was working:
I tried to pull the exact same image that was built in the Actions for that version and compared the numpy versions. That is where I noticed that the Actions are pulling from armv7, while my working version uses armv6:
Not working:
Working:
This is the same for all of the packages (although I did not test that every single one is working). So I think this easily explains the illegal instruction errors, unless I am mistaken and the Actions are only used for CI and the uploaded docker images come from somewhere else. I am not very familiar with the Actions workflows, but this should be pretty easy to fix without any code changes and without rebuilding the wheels. I noticed that the architecture everywhere is set to …
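The armv6/armv7 mix-up described above can be spotted mechanically, since a wheel filename carries its platform tag. A deliberately rough sketch (the filename is illustrative; real resolvers use the full PEP 425 tag machinery, and `str.removesuffix` needs Python 3.9+):

```python
# Hedged sketch: a crude compatibility check between a wheel's platform
# tag and the running CPU. On a Pi 1, platform.machine() returns
# "armv6l", so an armv7l-tagged wheel should never be installed there.
import platform

def wheel_matches_machine(wheel_filename, machine=None):
    """True if the wheel is pure-Python ('any') or its platform tag
    names this machine. Deliberately rough: real resolvers use the
    full PEP 425 compatibility tag set, not substring matching."""
    machine = machine or platform.machine()
    plat_tag = wheel_filename.rsplit("-", 1)[-1].removesuffix(".whl")
    return plat_tag.endswith("any") or machine in plat_tag

# The exact case from this issue: an armv7l wheel on an armv6l machine.
print(wheel_matches_machine(
    "numpy-1.23.2-cp310-cp310-musllinux_1_2_armv7l.whl",
    machine="armv6l"))  # prints False
```

Running the same check with an armv6l-tagged filename returns True, which is what a correctly configured build should have produced.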
I noticed that linuxserver.io is providing its own wheels, so I got the latest version working by switching to their container. I was thinking of trying to figure out how to get the wheels built by myself, but decided to go with this easy route for now.
Okay, I built an image on my QNAP NAS with the following Dockerfile...
This image starts setting up integrations before giving the same errors when it tries to set up …
@samitheberber, I had no idea linuxserver.io published a HA image. I grabbed the latest and can confirm it works. I just updated 6 months worth of HA 👍
The linuxserver.io image does not work for me; I get a warning that it is arm v7 and I need arm v6. What is the architecture that it is working on for you, @Wetzel402 @samitheberber? (
@Jojo-1000, my QNAP NAS is arm v7, so that makes sense why it's working for me. I wonder if the linuxserver.io image has "good" wheels but is using the wrong numpy version, like you observed for the HA in-house image.
I think you're right. Probably the only thing that is missing in the armhf configuration is an environment variable, QEMU_CPU=arm1176, that forces the build container to think it is on armv6, so that the build will just work for armv6; then, in the build for v7, remove that variable. I'm trying to build an image on my Linux computer using docker buildx and sending it to my Raspberry Pi Zero W; if it works, I will update this message.
Presumably HA's actions are open source and we can submit a PR?
I am trying to understand what is going on behind the scenes, because there are lots of repositories involved in the build process (builder, base, etc.).
I don't think there is an issue with the Alpine base image. I think the issue is specific to the wheels used when building HA core. The linuxserver.io image works for those of us with armv7 CPUs and uses different wheels than the OEM version. Separately, @Jojo-1000 seems to have found another issue where the wrong numpy wheel is being collected for armv6.
So I made it work on my Raspberry Pi Zero W (arch armv6l). I had to compile it from source, and from another PC, because the Raspberry was too loaded and struggled to create the image. Steps that I did:
// Modified the Dockerfile
// added …
// ran … on a faster machine
// sent the "out.tar" file over to the Raspberry Pi Zero W and ran …
// custom/homeassistant is the image tag; you can use whatever you want...
With this I could install the newer version. Unfortunately, I couldn't find how (and whether) it's possible to modify the main GitHub Actions to make it run on a Raspberry Pi Zero W directly.
EDIT: it's not necessary to replace ${WHEELS_LINK}, because it is present in the armhf-homeassistant-base image where the build begins ;)
I'm digging more into the details of the base image, and what I noticed is that the layers of armhf-homeassistant-base are full of …
UPDATE:
I've finally created a pull request. I've noted that, when the architecture is armhf, the builder used in the actions will provide QEMU_CPU=arm1176 as an ARG. I tested it by the same process: added ARG QEMU_CPU, then built it. I hope it will be merged soon, so that we will be able to use the official image from the repository. Thank you @Jojo-1000; without your analysis I would never have been able to identify what was going on.
It's merged, but what do you do when you compose on :dev? Is a flag necessary?
Already available: no flags needed. Thanks to everyone for the support!
Yes, it was merged almost 20 hours ago, but we had to wait until a build was triggered. That happened 2 hours ago with a development version. I will try it, and I hope it will soon be present in a stable version as well. @Metaln00b, if you are using the :dev image tag, you should stop your container, delete the image, and then pull the new one.
I've tested it on my Raspberry Pi Zero W and it works like a charm. I'm not sure about the others, but I think this issue can be closed as resolved.
Also running smoothly on my 1st gen Pi 👍🏾 Thanks for the fix!
Can confirm 2023.2.0.dev20230120 is running fine on 1st gen Pi.
Ironically I have just yesterday migrated to a Pi Zero W 2 to overcome this issue 😞 |
@HarvsG Is it much faster than the rpi0w?
I can confirm that this version works on the Raspberry Pi B+.
Yes, and I managed to install full HAOS, which really simplified my config. So it was not wasted time.
There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates. |
The problem
I'm trying to run home-assistant:stable with the docker compose method on a Raspberry Pi Model B Rev 2 (512MB RAM / armv6l). This is obviously not a good platform, but it's what I had available for a test. The OS is Raspbian/bullseye, with docker ce 20.10.17.
Running docker compose up, the container starts, but the homeassistant inside seems to crash:
A plain Home Assistant Core venv starts successfully on the same hardware, as does the previous release of the docker container using 2022.6.6@sha256:c2e7c0bb84e6f84d1c62ff34d9f286fb17486ccf2760a5612f65128848c59316.
What version of Home Assistant Core has the issue?
2022.7.5
What was the last working version of Home Assistant Core?
2022.6.6
What type of installation are you running?
Home Assistant Container
Integration causing the issue
No response
Link to integration documentation on our website
No response
Diagnostics information
When looking at the resulting core file with gdb, it reports an illegal instruction:
I assume the build uses some compiler flag that creates instructions not supported by the BCM2835?
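That assumption fits what the kernel reports: on a BCM2835 the /proc/cpuinfo feature list lacks armv7-era extensions such as NEON, so any extension compiled with those instructions faults at import. A small sketch (the cpuinfo excerpt is typical sample text for a Pi 1, not captured from this system):

```python
# Hedged sketch: extract the feature flags from /proc/cpuinfo-style text.
# The ARM1176 core in a Pi 1 reports armv6-era features; "neon" or
# "vfpv3" in the list would indicate an armv7-capable core, and their
# absence means armv7-built wheels can die with SIGILL at import time.
def cpu_flags(cpuinfo_text):
    """Return the flags from the first Features/flags line, or []."""
    for line in cpuinfo_text.splitlines():
        key = line.split(":", 1)[0].strip().lower()
        if key in ("features", "flags"):
            return line.split(":", 1)[1].split()
    return []

# Typical Pi 1 excerpt (sample text, not from this machine):
sample = "Features\t: half thumb fastmult vfp edsp java tls"
print("neon" in cpu_flags(sample))  # prints False: no armv7 SIMD on a Pi 1
```

On an actual device, feeding in `open("/proc/cpuinfo").read()` gives the real list to compare.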
Example YAML snippet
No response
Anything in the logs that might be useful for us?
No response
Additional information
No response