
StreamingResponse #9

Open
franzwilding opened this issue Apr 11, 2024 · 7 comments
@franzwilding

In order to have a good LLM chat UX, we need to stream the response to the client. Langserve does this with a dedicated endpoint; hayhooks could do the same (pseudocode):

async def pipeline_stream(pipeline_run_req: PipelineRunRequest) -> StreamingResponse:
    buffer = ...  # collects the chunks emitted by the pipeline's streaming callback
    result = pipe.run(data=pipeline_run_req.dict())
    return StreamingResponse(buffer_generator)

app.add_api_route(
    path=f"/{pipeline_def.name}/stream",
    endpoint=pipeline_stream,
    methods=["POST"],
    name=pipeline_def.name,
    response_model=PipelineRunResponse,
)

Additionally, Haystack should provide a special streaming_callback that writes each chunk's content to a buffer that is available to hayhooks. Maybe the Pipeline could add this logic and provide a pipe.stream method that returns a generator, or something like that.
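The callback-to-generator bridge described above can be sketched with nothing but the standard library: the streaming callback writes chunks into a queue while the (blocking) pipeline run happens in a background thread, and a generator drains the queue until a sentinel arrives. To be clear, `fake_pipeline_run`, `stream_pipeline`, and the sentinel are hypothetical names for illustration, not hayhooks or Haystack API:

```python
import queue
import threading
from typing import Callable, Iterator

_DONE = object()  # sentinel signalling that the pipeline run has finished


def fake_pipeline_run(streaming_callback: Callable[[str], None]) -> None:
    # Stand-in for pipe.run(...): a real pipeline would invoke the
    # streaming callback once per LLM chunk as tokens are generated.
    for chunk in ["Hello", ", ", "world", "!"]:
        streaming_callback(chunk)


def stream_pipeline(run: Callable[[Callable[[str], None]], None]) -> Iterator[str]:
    """Run `run` in a background thread and yield chunks as they arrive."""
    buffer: "queue.Queue[object]" = queue.Queue()

    def worker() -> None:
        try:
            run(buffer.put)    # every chunk lands in the queue
        finally:
            buffer.put(_DONE)  # always unblock the consumer, even on error

    threading.Thread(target=worker, daemon=True).start()
    while (item := buffer.get()) is not _DONE:
        yield item


print("".join(stream_pipeline(fake_pipeline_run)))  # -> Hello, world!
```

The generator returned by `stream_pipeline` is exactly the shape FastAPI's `StreamingResponse` accepts as its first argument, so the `/stream` endpoint above could pass it through directly.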

@vblagoje
Member

Yes @franzwilding we have this item on our roadmap, thanks for raising this issue and voicing your preferred solution.

@masci added the enhancement (New feature or request) label Apr 12, 2024
@Phlasse

Phlasse commented May 7, 2024

@vblagoje any idea yet when this feature will become available? We are using Haystack in quite a few projects now and want to know whether it is worth putting more energy into our workaround or whether we can expect proper streaming out of a pipeline soon :)

@vblagoje
Member

vblagoje commented May 7, 2024

Yes, I understand totally! The support is currently being worked on 😎

@aymbot

aymbot commented Jul 30, 2024

@vblagoje Any updates regarding an ETA for the feature? Thanks in advance for the heads-up

@vblagoje
Member

@aymbot on our immediate roadmap for Q3, starting soon 🙏

@ilkersigirci

ilkersigirci commented Sep 12, 2024

With this feature implemented, hayhooks would be a strong alternative to langserve. Thanks again for working on it

@ParseDark

We really need this feature. Is there any recent update? Streaming is very important because most third-party UIs and packages call models in streaming mode.
