
Support streams (http) #97

Closed
haukepribnow opened this issue May 28, 2023 · 13 comments

@haukepribnow commented May 28, 2023

Repro steps

  1. Using programming model v4, create an HTTP-triggered function where:

    • The handler function provides a ReadableStream in the return object's body field.
    • The ReadableStream enqueues data with a time delay.
    • The ReadableStream continues to enqueue data after the handler function has returned.
    • While enqueuing data, context.log() gets called.
  2. Call the function via HTTP.

(I recommend checking the source code below; it's probably more telling than my description here.)

Expected behavior

When the function is called:

  • When new data is enqueued, it should be sent directly to the HTTP client as a chunk (using Transfer-Encoding: chunked).
  • The context.log() calls should log "as usual" and should not raise warnings/errors.

Actual behavior

  • While Transfer-Encoding: chunked is set in the HTTP response, newly enqueued data is not sent directly when it becomes available. Instead, the completion of the stream is awaited before any data is sent. All data is then sent in a single chunk.
  • While the context.log() calls (that happen when data is enqueued after the handler function has returned) do output the log message, they also output a warning: Unexpected call to 'log' on the context object after function execution has completed. Please check for asynchronous calls that are not awaited. Function name: streamingTest. Invocation Id: [removed].

Known workarounds

n/a

Related information

  • Programming language used: TypeScript
Source
import { ReadableStream } from "stream/web";
import { HttpRequest, HttpResponse, InvocationContext, app } from "@azure/functions";

// Test function that returns a ReadableStream as body.
export async function streamingTest(request: HttpRequest, context: InvocationContext): Promise<HttpResponse> {
    context.log(`Http function processed request for url "${request.url}"`);

    const encoder = new TextEncoder();
    let times = 0;

    const stream = new ReadableStream({
        type: 'bytes',
        async start(controller) {
            function onTimer() {
                if (times >= 1000) {
                    controller.close();
                    return;
                }

                context.log(`${times}`);
                // This raises a warning:
                // Unexpected call to 'log' on the context object after function execution
                // has completed. Please check for asynchronous calls that are not awaited.
                // Function name: streamingTest. Invocation Id: [removed].

                times++;
                const data = encoder.encode(`${times}\r\n`);
                controller.enqueue(data);
                setTimeout(onTimer, 10);
            }
            onTimer();
        } 
    }, {
        highWaterMark: 1
    });

    return new HttpResponse({
        status: 200,
        headers: {
            "Content-Type": "text/plain"
        },
        body: stream
    });
}

app.http('streamingTest', {
    methods: ['GET'],
    authLevel: 'anonymous',
    handler: streamingTest 
});
[Screenshot of Wireshark showing information about the HTTP request & response]

@ejizba ejizba changed the title v4 HTTP bindings: Unexpected behaviors when returning a ReadableStream that enqueues data & outputs log messages after returning from the handler function Support streams (http) May 31, 2023
@ejizba ejizba added this to the Backlog Candidates milestone May 31, 2023
@ejizba (Contributor) commented May 31, 2023

Hi @haukepribnow, we don't fully support streams right now. This is the expected behavior:

> Instead, the completion of the stream is awaited before any data is sent. All data is then sent in a single chunk.

I'm cleaning up the stream issues and will use this to track support for http streams. This was finally unblocked for us in the host: Azure/azure-functions-dotnet-worker#1387. The .NET Isolated worker was the first one to use it, but we should be able to benefit in Node.js as well.

Unfortunately I don't have any ETAs. For now we're focused on GA-ing the v4 programming model, but we will keep the roadmap updated with any stream plans: https://github.com/Azure/azure-functions-nodejs-library/wiki/Roadmap

Streaming Azure resources will be done separately, tracked by #99

@m14t commented Jul 19, 2023

Given OpenAI's slow responses but its support for streaming, having streaming support here would be critical.

Azure's private OpenAI offering has me, a decades-long AWS customer, strongly considering switching to Azure for AI-based projects, but this issue will likely prevent me from using this serverless approach.

@AlexPshul

Thanks @ejizba for keeping this issue alive and referencing this in other issues.

I am not sure how much this is requested by others, but I wanted to leave a +1 on this.
Especially with all the Azure OpenAI related stuff that is going on right now, having an Azure Function that can stream back responses would be super valuable!

Do you have any updates on this or ETAs?
Are you open to contributions? :)

@ejizba (Contributor) commented Jan 30, 2024

@AlexPshul We're open to contributions, but not for this feature. This has been in-progress for a while and involves a lot of moving parts. In the next week or so I should actually be able to post unofficial instructions to try this out, so the best way you could help would be testing.

As for the official preview announcement, that ETA is listed on the roadmap. I will update the roadmap as we get closer to an exact date; it depends on how the testing goes.

@AlexPshul

Thanks @ejizba for the quick reply and the details.
I'd love to try the feature when it's available, even if it's unofficial.
Once you have it, please let me know where I can access it.

@ejizba (Contributor) commented Feb 16, 2024

Hi folks, I was waiting to post instructions until Core Tools released, and it just did today! Here are the steps if you want to try out HTTP streams before we announce the preview: https://github.com/Azure/azure-functions-nodejs-library/wiki/Http-Stream-Support

Edit: we announced the preview; see here instead: https://aka.ms/AzFuncNodeHttpStreams
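For quick reference, the opt-in those instructions describe comes down to a single setup call in the app's entry point (the same enableHttpStream flag shown in later comments in this thread):

import { app } from "@azure/functions";

// Opt in to HTTP streaming once at startup, before any function runs.
app.setup({ enableHttpStream: true });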

@AlexPshul

Works great! Just what I need!

I had to play around a bit to understand how to return a stream from a function. I ended up implementing my own generator and sending back a Readable.from(generator) in the response.

@ejizba Thanks for providing us with an early preview.
I will be happy to contribute an example to the docs if you need one.
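For anyone looking for the pattern described above, here is a minimal sketch (the function name and chunk contents are illustrative, not from the original comment; it assumes the enableHttpStream flag from the preview instructions):

import { Readable } from "stream";
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

app.setup({ enableHttpStream: true });

// Hypothetical handler: an async generator produces chunks with a delay,
// and Readable.from turns it into a stream for the response body.
async function generatorStream(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    async function* chunks() {
        for (let i = 1; i <= 5; i++) {
            yield Buffer.from(`chunk ${i}\r\n`);
            await new Promise((resolve) => setTimeout(resolve, 100));
        }
    }

    return {
        status: 200,
        headers: { "Content-Type": "text/plain" },
        body: Readable.from(chunks())
    };
}

app.http('generatorStream', {
    methods: ['GET'],
    authLevel: 'anonymous',
    handler: generatorStream
});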

@ejizba (Contributor) commented Feb 28, 2024

We just announced preview! 🎉 Check out our blog post here: https://aka.ms/AzFuncNodeHttpStreams

@ilyachenko

Hi folks! HTTP streams work like a charm. I tested them using the @azure/openai package. The stream buffer should be returned in the body of the function:

const { Readable } = require("stream"); // needed for the Readable used below

app.setup({ enableHttpStream: true });

// ...

const chunks = await client.streamChatCompletions(deploymentId, messages);

// ...

function createStream(chunks) {
  const buffer = new Readable({
    read() {},
  });

  const stream = async () => {
    for await (const chunk of chunks) {
      // Push each streamed text delta into the Readable as it arrives.
      const content = chunk.choices[0]?.delta?.content ?? "";
      buffer.push(content);
    }
    buffer.push(null); // signal the end of the stream
  };

  stream();

  return buffer;
}

return {
    headers: { "Content-Type": "text/plain" },
    body: createStream(chunks)
};

@ejizba ejizba modified the milestones: Backlog Candidates, May 2024 May 6, 2024
@eric-gonzalez-tfs

Hi @ejizba, thank you for the update!

After calling the setup method with enableHttpStream set to true, will await request.formData() take advantage of the streaming request functionality?

The blog post suggests using request.body, but is request.formData() just as good? Thanks!
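For reference, consuming request.body incrementally looks roughly like this (a sketch; the handler name is hypothetical, and it assumes Node's web ReadableStream supports async iteration, as it does in Node 18+; it does not answer whether formData() also streams):

import { HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

// Hypothetical handler that counts bytes as they arrive instead of buffering the whole body.
async function countBytes(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    let bytes = 0;
    if (request.body) {
        // Web ReadableStreams are async-iterable in Node 18+.
        for await (const chunk of request.body) {
            bytes += (chunk as Uint8Array).byteLength;
        }
    }
    return { body: `received ${bytes} bytes` };
}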


@georgal commented May 21, 2024

Hey everyone,

As @ilyachenko indicates, the streaming works like a charm, but in my scenario only on localhost. Once running from App Service, it doesn't stream; instead, it returns the response only once the stream is completed.

Has anyone else witnessed this? Is there any way to address the issue (Azure configs, a specific tier, etc.)?

Thanks in advance!

@ejizba (Contributor) commented May 21, 2024

Hi folks, HTTP stream support is GA as of today! 🎉 Read more in our blog post here: https://aka.ms/azfuncnodehttpstreams

@eric-gonzalez-tfs @georgal I'm going to close this issue, but I've moved your comments to a discussion here: #261

@ejizba ejizba closed this as completed May 21, 2024
@d-tdhillon

Hi, I am trying to send HTML chunks back from an Azure Function, but this still isn't working for me. Any suggestions on how else I can send some HTML back to the browser in a chunked manner? Thanks

import { app } from '@azure/functions';
// Imports inferred for this fragment (it appears to be a NestJS controller method):
import { Get, HttpStatus, Res } from '@nestjs/common';
import { Response } from 'express';
import moment from 'moment';

@Get('ping-html')
pingHTML(@Res() response: Response) {
    // Note: app.setup is normally called once at startup, not inside a handler.
    app.setup({ enableHttpStream: true });
    response.status(HttpStatus.PARTIAL_CONTENT);
    response.setHeader('Content-Type', 'text/html');
    response.setHeader('X-Content-Type-Options', 'nosniff');
    response.setHeader('Transfer-Encoding', 'chunked');
    const html = `<h1>Controller alive and kicking ...${moment().utc()}</h1>`;

    response.write(html);

    setTimeout(() => {
        response.write(`Second line after timeout ${moment().utc()}`);
        response.end();
    }, 10000);

    console.log('response constructed');
    return html;
}
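For comparison, the approach that worked earlier in this thread goes through the v4 programming model's response body rather than writing to an Express-style response object. A sketch of that mapping (the function name and markup are illustrative):

import { Readable } from "stream";
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

app.setup({ enableHttpStream: true });

async function pingHtml(request: HttpRequest, context: InvocationContext): Promise<HttpResponseInit> {
    // Two HTML chunks, the second after a delay, streamed via Readable.from.
    async function* htmlChunks() {
        yield Buffer.from("<h1>Controller alive and kicking ...</h1>");
        await new Promise((resolve) => setTimeout(resolve, 10000));
        yield Buffer.from("Second line after timeout");
    }
    return {
        headers: { "Content-Type": "text/html" },
        body: Readable.from(htmlChunks())
    };
}

app.http('ping-html', {
    methods: ['GET'],
    authLevel: 'anonymous',
    handler: pingHtml
});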
