Support streams (http) #97
Comments
Hi @haukepribnow we don't fully support streams right now. This is the expected behavior:
I'm cleaning up the stream issues and will use this to track support for http streams. This was finally unblocked for us in the host: Azure/azure-functions-dotnet-worker#1387. The .NET Isolated worker was the first one to use it, but we should be able to benefit in Node.js as well. Unfortunately I don't have any ETAs. For now we're focused on GA-ing the v4 programming model, but we will keep the roadmap updated with any stream plans: https://github.com/Azure/azure-functions-nodejs-library/wiki/Roadmap Streaming Azure resources will be done separately, tracked by #99
Given the slow responses from OpenAI, but their support for streaming, having support for streaming here would be critical. Azure's private OpenAI offering has me, a decades-long AWS customer, strongly considering switching to Azure for AI-based projects, but this issue will likely prevent me from using this serverless approach.
Thanks @ejizba for keeping this issue alive and referencing this in other issues. I am not sure how much this is requested by others, but I wanted to leave a +1 on this. Do you have any updates on this or ETAs?
@AlexPshul We're open to contributions, but not for this feature. This has been in-progress for a while and involves a lot of moving parts. In the next week or so I should actually be able to post unofficial instructions to try this out, so the best way you could help would be testing. As for the official preview announcement, that ETA is listed on the roadmap. I will update the roadmap as we get closer to an exact date - it depends how the testing goes.
Thanks @ejizba for the quick reply and the details. |
Hi folks, I was waiting to post instructions until core tools released and it just did today! |
Works great! Just what I need! @ejizba Thanks for providing us with an early preview. |
We just announced preview! 🎉 Check out our blog post here: https://aka.ms/AzFuncNodeHttpStreams |
Hi folks! HTTP streams work like a charm. I tested them using the `app.setup({ enableHttpStream: true });` setting together with the Azure OpenAI client:

```javascript
const { Readable } = require("node:stream");

// ...
const chunks = await client.streamChatCompletions(deploymentId, messages);
// ...

// Pump the async iterable of completion chunks into a Readable
function createStream(chunks) {
  const buffer = new Readable({
    read() {},
  });
  const stream = async () => {
    for await (const chunk of chunks) {
      // extract the text delta from each completion chunk
      const content = chunk.choices[0]?.delta?.content ?? "";
      if (content) buffer.push(content);
    }
    buffer.push(null); // end of stream
  };
  stream();
  return buffer;
}

return {
  headers: { "Content-Type": "text/plain" },
  body: createStream(chunks),
};
```
Hey everyone, as @ilyachenko indicates, streaming works like a charm, but in my scenario it only works on localhost. Has anyone else seen this? Is there any way to address the issue (Azure configs, a specific tier, etc.)? Thanks in advance!
Hi folks, http stream support is GA as of today! 🎉 Read more in our blog post here: https://aka.ms/azfuncnodehttpstreams @eric-gonzalez-tfs @georgal I'm going to close this issue, but I've moved your comments to a discussion here: #261 |
Hi, I am trying to send HTML chunks back from an Azure Function, but this still isn't working for me. Any suggestions on how else I can send HTML back to the browser in a chunked manner? Thanks!
Repro steps
Using programming model v4, create an HTTP-triggered function where:
- a `ReadableStream` is provided in the return object's `body` field,
- the `ReadableStream` enqueues data with a time delay,
- the `ReadableStream` continues to enqueue data after the handler function has returned, and
- `context.log()` gets called after the handler function has returned.

Call the function via HTTP.
(I recommend checking the source code below; I guess it's more telling than my description here.)
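The delayed-enqueue stream described above can be sketched roughly like this (a stand-in of mine, not the linked source; the real function would return it as the response `body` with `Transfer-Encoding: chunked`). `ReadableStream` is the global web-streams class available in Node 18+:

```javascript
// A ReadableStream that enqueues chunks with a time delay; in the repro,
// the handler returns before the last chunks are enqueued:
// return { headers: { "Content-Type": "text/plain" }, body: makeDelayedStream() };
function makeDelayedStream(chunkCount = 3, delayMs = 100) {
  let i = 0;
  return new ReadableStream({
    start(controller) {
      const timer = setInterval(() => {
        controller.enqueue(`chunk ${i}\n`);
        if (++i >= chunkCount) {
          clearInterval(timer);
          controller.close();
        }
      }, delayMs);
    },
  });
}

// Drain the stream to a single string (for demonstration/testing only;
// the point of the bug report is that each chunk should arrive separately):
async function drain(stream) {
  const reader = stream.getReader();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += value;
  }
}
```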
Expected behavior
When the function is called:
- Newly enqueued data should be sent directly to the client when it becomes available (using `Transfer-Encoding: chunked`).
- `context.log()` calls should log "as usual" and should not raise warnings/errors.

Actual behavior
- While `Transfer-Encoding: chunked` is set in the HTTP response, newly enqueued data is not directly sent when it becomes available. Instead, the completion of the stream is awaited before any data is sent. All data is then sent in a single chunk.
- While `context.log()` calls (that happen when data is enqueued after the handler function has returned) do output the log message, they also output a warning: `Unexpected call to 'log' on the context object after function execution has completed. Please check for asynchronous calls that are not awaited. Function name: streamingTest. Invocation Id: [removed].`
Known workarounds
n/a
Related information
Source
Screenshot of Wireshark showing information about the HTTP request & response