
azure openai o3-mini expects a stream response but stream is set to false in the request. #15663

Open · 5 tasks done
ufo009e opened this issue Mar 12, 2025 · 1 comment
Labels: 🐞 bug (Something isn't working), good first issue (Good first issue for newcomers)

Comments


ufo009e commented Mar 12, 2025

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

1.0.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I installed the Azure OpenAI plugin 0.0.8. When using the o3-mini model, stream is always set to false in the HTTP request, but the plugin expects a stream response.

2025-03-12 14:57:47.584 ERROR [Thread-411 (_generate_worker)] [app_generator.py:257] - Unknown Error when generating
Traceback (most recent call last):
  File "/app/api/core/app/apps/agent_chat/app_generator.py", line 237, in _generate_worker
    runner.run(
  File "/app/api/core/app/apps/agent_chat/app_runner.py", line 245, in run
    self._handle_invoke_result(
  File "/app/api/core/app/apps/base_app_runner.py", line 271, in _handle_invoke_result
    self._handle_invoke_result_stream(invoke_result=invoke_result, queue_manager=queue_manager, agent=agent)
  File "/app/api/core/app/apps/base_app_runner.py", line 312, in _handle_invoke_result_stream
    text += result.delta.message.content
TypeError: can only concatenate str (not "list") to str

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

@dosubot dosubot bot added the 🐞 bug Something isn't working label Mar 12, 2025

dosubot bot commented Mar 12, 2025

I found a similar discussion thread titled "azure openai api Stream options setting" which is currently unsolved [1].

Regarding your issue, the TypeError in the _handle_invoke_result_stream method suggests that result.delta.message.content is a list, but the code is attempting to concatenate it as if it were a string. To resolve this, ensure that result.delta.message.content is indeed a string before concatenating it. You might need to convert or join the list elements into a string format.
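
As a concrete illustration, a defensive fix could normalize the delta content before the concatenation in _handle_invoke_result_stream. This is only a minimal sketch: the helper name _normalize_delta_content and the assumed shape of the list parts (e.g. {"type": "text", "text": ...}) are illustrative assumptions, not the actual Dify implementation.

def _normalize_delta_content(content) -> str:
    # Hypothetical helper: coerce a message delta's content into a str.
    # Some providers return content as a list of parts instead of a
    # plain string, which triggers the TypeError shown above.
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        pieces = []
        for part in content:
            if isinstance(part, str):
                pieces.append(part)
            elif isinstance(part, dict):
                # Assumed part shape; adjust to the real delta schema.
                pieces.append(part.get("text", ""))
        return "".join(pieces)
    return str(content) if content is not None else ""

# In _handle_invoke_result_stream, the failing line would then become:
# text += _normalize_delta_content(result.delta.message.content)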

Additionally, the stream parameter in the HTTP request is set based on the response_mode. If response_mode is "streaming", stream should be set to True; otherwise, it is set to False [[2]](https://github.com/langgenius/dify/blob/main/sdks/python-client/dify_client/client.py) [[3]](https://github.com/langgenius/dify/blob/main/sdks/nodejs-client/index.js). Ensure that response_mode is correctly set to "streaming" if a stream response is expected.
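
For example, a streaming request via the Python SDK might look like the following. This sketch assumes the ChatClient.create_chat_message API from the python-client linked above; the API key, query, and user values are placeholders, so check the SDK source for the exact signature.

from dify_client import ChatClient

client = ChatClient(api_key="your-api-key")  # placeholder key
# response_mode="streaming" is what makes the backend issue the
# upstream model request with stream=True; "blocking" uses stream=False.
response = client.create_chat_message(
    inputs={},
    query="Hello",
    user="user-123",
    response_mode="streaming",
)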

To continue talking to Dosu, mention @dosu.



@crazywoola crazywoola reopened this Mar 13, 2025
@crazywoola crazywoola added the good first issue Good first issue for newcomers label Mar 13, 2025
Projects: None yet
Development: No branches or pull requests
Participants: 2