HTTP PUT crashes in static-linked linux builds #996
Thank you for the simple python server. Super helpful! If I build locally and statically link, I get a segfault. If I build locally and don't statically link, it works correctly. The official v2.6 binaries are statically linked.
With the non-static build, I get inconsistent results. In some cases, I get something like
When this failure occurs, IoCache has a large amount of data written to it, but libcurl has only read a small amount of it back out. In the run I just did, curl read out 8 x 16,372 byte chunks (total 130,976 bytes), but Packager wrote 333 x 65,536 byte chunks + 18,797 bytes at the end (total 21,842,285 bytes). When I've resolved this, I'll return to the question of why it is crashing in a static-link build.
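As a quick sanity check on the byte accounting above (variable names are just for illustration):

```python
# Verify the totals quoted in the comment above.
read_by_curl = 8 * 16372            # chunks libcurl read back out of IoCache
written_by_packager = 333 * 65536 + 18797  # chunks Packager wrote in

assert read_by_curl == 130976
assert written_by_packager == 21842285

# curl had consumed well under 1% of what Packager had queued.
print(read_by_curl / written_by_packager)
```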
The non-crash failure is caused by the python server, which doesn't read the input data and doesn't support chunked transfer encoding. If I update it to read the upload data and handle chunked transfer encoding, the failure goes away. Here's the new server code:

```python
from http.server import HTTPServer
from http.server import SimpleHTTPRequestHandler
import http.server


class HTTPRequestHandler(SimpleHTTPRequestHandler):
    def do_PUT(self):
        if "Content-Length" in self.headers:
            content_length = int(self.headers["Content-Length"])
            body = self.rfile.read(content_length)
        elif "chunked" in self.headers.get("Transfer-Encoding", ""):
            body = bytes()
            while True:
                line = self.rfile.readline().strip()
                chunk_length = int(line, 16)
                if chunk_length != 0:
                    chunk = self.rfile.read(chunk_length)
                    body += chunk
                    # Each chunk is followed by an additional empty newline
                    # that we have to consume.
                    self.rfile.readline()
                # Finally, a chunk size of 0 is an end indication
                if chunk_length == 0:
                    break
        else:
            # Avoid an UnboundLocalError below for a bodyless PUT.
            body = b''

        self.send_response(201, 'Created')
        self.end_headers()
        print(self.headers)
        print('Body length', len(body))


if __name__ == '__main__':
    http.server.test(HandlerClass=HTTPRequestHandler,
                     ServerClass=HTTPServer,
                     port=5000)
```
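For anyone unfamiliar with the wire format the server has to parse: each chunk is its size in hex, CRLF, the data, CRLF, and a zero-length chunk terminates the body. A minimal, self-contained sketch of that format (helper names are my own, not from Packager or the server above), using the same parsing loop as the server:

```python
import io

def encode_chunked(chunks):
    """Encode byte chunks into HTTP/1.1 chunked transfer encoding."""
    out = b''
    for chunk in chunks:
        out += b'%x\r\n' % len(chunk) + chunk + b'\r\n'
    out += b'0\r\n\r\n'  # a zero-length chunk marks the end of the body
    return out

def decode_chunked(stream):
    """Decode chunked data from a file-like object (same loop as the server)."""
    body = bytes()
    while True:
        chunk_length = int(stream.readline().strip(), 16)
        if chunk_length == 0:
            break
        body += stream.read(chunk_length)
        stream.readline()  # consume the CRLF that follows each chunk
    return body

wire = encode_chunked([b'hello ', b'world'])
assert decode_chunked(io.BytesIO(wire)) == b'hello world'
```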
I'm still getting a segfault on the static-linked binary with the fixed server. So the failures I was seeing were red herrings.
Backtrace from the crash:
Oops, I had already implemented a server that reads chunked transfers, but sent an oversimplified one for readability. It's weird, though, that the version of Packager I tested that works fine does so whether or not the chunks are read by the server. Maybe it has to do with the chunk size. Let me compile v2.6 to get a better view of the bug.
It appears that only the static-linked official binaries are crashing, and only inside of libcurl. If you build it yourself with standard flags, it should work correctly. I hope this unblocks you for now while I try to find a solution. |
It turns out that getaddrinfo and all of its alternatives in Linux's glibc fail when statically linked. Alpine Linux's musl (a replacement for glibc) doesn't seem to have this issue, but we can't require that on other distros. If I configure libcurl at build time to use libc-ares for DNS instead of libc's getaddrinfo, this seems to resolve the crash. Since this would be a new dependency on Linux, I need to test this across GitHub Actions.
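For reference, libcurl's autotools build can be pointed at c-ares at configure time; a rough sketch under the assumption that c-ares is already installed (the prefix path is illustrative, and curl's CMake build has an equivalent `-DENABLE_ARES=ON` switch):

```shell
# From a curl source tree: resolve DNS via c-ares instead of glibc's
# getaddrinfo, so a static link doesn't pull in the broken NSS path.
./configure --enable-ares=/usr/local
make
```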
The fix is in Packager v2.6.1, which the robots are pushing out now. After that, I'll update shaka-streamer and shaka-streamer-binaries as well. |
I have some Dockerfiles where I build static PIE binaries, and I test them in a build step that copies the binaries into a scratch image and runs some basic tests, network requests, etc. Maybe that's an interesting approach in this case?
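Something like this multi-stage sketch, I assume (image tags, source file, and the smoke test are all hypothetical):

```dockerfile
# Build stage: compile a static PIE binary.
FROM alpine:3 AS build
RUN apk add build-base
COPY hello.c .
RUN gcc -static-pie -o /hello hello.c

# Test stage: copy the binary into an empty image and run it.
# A binary that reaches into glibc NSS at runtime would fail here.
FROM scratch AS test
COPY --from=build /hello /hello
RUN ["/hello"]
```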
That sounds great. Would you like to submit a PR? |
The official, static-linked linux builds were crashing in their use of getaddrinfo, which libcurl was configured to use. Both getaddrinfo and all of its alternatives available in glibc fail with static linking. We can fix this by configuring libcurl to use libc-ares on Linux instead. This allows us to keep the benefits of a statically-linked Linux binary. Closes shaka-project#996 Change-Id: Ib4a9eb939813fd165727788726459ef4adf3fc4d
System info
Operating System: Ubuntu 18.04
Shaka Packager Version: v2.6
Issue and steps to reproduce the problem
Run the command below while having an HTTP server running on port 5000 on localhost.
Packager Command:
packager 'in=testvid1.mp4,stream=video,init_segment=http://localhost:5000/output_files/video_init.mp4,segment_template=http://localhost:5000/output_files/video_$Number$.mp4' --segment_duration 10 --generate_static_live_mpd --mpd_output http://localhost:5000/output_files/dash.mpd
What is the expected result?
Packaged content is sent to the server using PUT requests.
What happens instead?
Only the first init segment reaches the server, then Shaka Packager crashes with a segfault.
testvid1.mp4
repro.zip