
🧐 [Question] With Flask as the streaming server, ProChat cannot receive chunks incrementally #248

Open
billzhaoyansong opened this issue Jun 12, 2024 · 4 comments
Labels
documentation Improvements or additions to documentation

Comments

@billzhaoyansong

🧐 Problem description

Following the approach described here, I built a Flask server that sends SSE messages, but ProChat does not receive the data incrementally; the message only arrives after the whole response has finished. I'm not sure whether my Flask setup is wrong. If possible, could you provide a complete Flask server example together with a ProChat client example?

💻 Sample code

The server-side code is essentially the same as the one here; the ProChat client code is as follows:

request={async (messages: any) => {
  // In real usage, call the streaming endpoint like this:
  const response = await fetch('/stream-sse', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json;charset=UTF-8',
    },
    body: JSON.stringify({
      messages,
      stream: true,
    }),
  });
  console.log('messages', messages);

  // Make sure the server responded successfully and has a readable body
  if (!response.ok || !response.body) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }

  console.log('getting response');
  const decoder = new TextDecoder('utf-8');
  // response.body is guaranteed to be non-null after the check above
  const reader = response.body.getReader();

  const readableStream = new ReadableStream({
    async start(controller) {
      function push() {
        reader.read().then(({ done, value }) => {
          // No more data to read: close the downstream stream
          if (done) {
            console.log('done', done);
            controller.close();
            return;
          }

          // Decode only for logging; forward the raw bytes unchanged
          const chunk = decoder.decode(value, { stream: true });
          console.log(done, chunk);
          controller.enqueue(value);

          push();
        });
      }

      push();
    },
  });

  return new Response(readableStream);
}}
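For reference, here is a minimal sketch of a Flask endpoint that streams chunk by chunk. This is not an official ProChat example: the /stream-sse route, the payload shape, and the placeholder token loop are assumptions taken from the client code above, and it returns plain text chunks rather than SSE-framed events.

# app.py (a minimal sketch; route name, payload shape and the fake token loop are assumptions)
import time

from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)


@app.route('/stream-sse', methods=['POST'])
def stream_sse():
    payload = request.get_json(force=True)
    messages = payload.get('messages', [])
    print('received messages:', messages)

    def generate():
        # Replace this loop with your real model / token source.
        for token in ['Hello', ', ', 'this ', 'is ', 'a ', 'streamed ', 'reply.']:
            yield token        # each yield is flushed as its own chunk
            time.sleep(0.2)    # simulate generation latency

    # stream_with_context keeps the request context alive while yielding.
    # The headers below discourage buffering by proxies / dev servers.
    return Response(
        stream_with_context(generate()),
        mimetype='text/plain',
        headers={
            'Cache-Control': 'no-cache',
            'X-Accel-Buffering': 'no',
        },
    )


if __name__ == '__main__':
    # threaded=True lets the dev server keep streaming while serving other requests
    app.run(port=5000, threaded=True)

Two details tend to matter here: the response must come from a generator so each yield is flushed immediately, and no intermediate layer (nginx, a dev-server proxy) may buffer or compress the stream, which is what the UMI_DEV_SERVER_COMPRESS discussion below addresses.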

🚑 Other information

@ONLY-yours added the documentation label on Jun 18, 2024
@allendata0706

It seems I've run into this problem as well. Could you put together a complete example of streaming responses from Qwen (千问)?

@ONLY-yours
Collaborator

It seems I've run into this problem as well. Could you put together a complete example of streaming responses from Qwen (千问)?

@allendata0706 Could you share the Qwen docs you are using?
A PR would be even better.

@allendata0706

It seems I've run into this problem as well. Could you put together a complete example of streaming responses from Qwen (千问)?

My issue turned out to be caused by the proxy enabled in umi. Setting the environment variable described in the umi docs fixed it.
Reference: https://umijs.org/docs/guides/env-variables#umi_dev_server_compress
[screenshot: UMI_DEV_SERVER_COMPRESS environment-variable setting]

@lonrencn

lonrencn commented Jul 9, 2024

// package.json
{
  "scripts": {
    "analyze": "cross-env ANALYZE=1 max build",
    "build": "max build",
    "deploy": "npm run build && npm run gh-pages",
    "dev": "cross-env UMI_DEV_SERVER_COMPRESS=none umi dev",
    "start": "cross-env UMI_ENV=dev UMI_DEV_SERVER_COMPRESS=none max dev"
  }
}
Is this how it should be set up?
I configured it this way, but streaming output still doesn't work. @allendata0706
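One way to narrow this down is to call the Flask server directly with Python, bypassing the umi dev-server proxy. This is only a diagnostic sketch, assuming the server runs locally on port 5000 and exposes the /stream-sse route used above; if the chunks below print spread out over time, the server streams correctly and any remaining buffering is happening in the proxy layer.

# check_stream.py (a diagnostic sketch; URL and payload are assumptions)
import json
import time

import requests

resp = requests.post(
    'http://127.0.0.1:5000/stream-sse',
    data=json.dumps({'messages': [{'role': 'user', 'content': 'hi'}], 'stream': True}),
    headers={'Content-Type': 'application/json;charset=UTF-8'},
    stream=True,  # do not buffer the whole body before returning
)

start = time.time()
for chunk in resp.iter_content(chunk_size=None, decode_unicode=True):
    # If these lines print at different timestamps, the server streams fine
    # and the buffering is introduced by the proxy / dev server.
    print(f'{time.time() - start:.2f}s  {chunk!r}')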
