Self Checks
I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
[FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thanks! :)
Please do not modify this template :) and fill in all the required fields.
Dify version
1.0.1
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I installed the Azure OpenAI plugin 0.0.8. When using the o3-mini model, `stream` is always set to `false` in the HTTP request, but the plugin expects a streaming response.
```
2025-03-12 14:57:47.584 ERROR [Thread-411 (_generate_worker)] [app_generator.py:257] - Unknown Error when generating
Traceback (most recent call last):
  File "/app/api/core/app/apps/agent_chat/app_generator.py", line 237, in _generate_worker
    runner.run(
  File "/app/api/core/app/apps/agent_chat/app_runner.py", line 245, in run
    self._handle_invoke_result(
  File "/app/api/core/app/apps/base_app_runner.py", line 271, in _handle_invoke_result
    self._handle_invoke_result_stream(invoke_result=invoke_result, queue_manager=queue_manager, agent=agent)
  File "/app/api/core/app/apps/base_app_runner.py", line 312, in _handle_invoke_result_stream
    text += result.delta.message.content
TypeError: can only concatenate str (not "list") to str
```
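The failing line boils down to Python's rule that `str +=` only accepts another string. A two-line reproduction of the same TypeError (the list-shaped `delta_content` value here is illustrative, standing in for what the model returns):

```python
# Minimal reproduction of the TypeError from the traceback above:
# concatenating a list onto a str raises the same error.
text = ""
delta_content = [{"type": "text", "text": "hello"}]  # list-shaped content, not a str

try:
    text += delta_content  # mirrors: text += result.delta.message.content
    error_message = None
except TypeError as exc:
    error_message = str(exc)

print(error_message)  # can only concatenate str (not "list") to str
```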
✔️ Expected Behavior
No response
❌ Actual Behavior
No response
I found a similar discussion thread titled "azure openai api Stream options setting" which is currently unsolved [1].
Regarding your issue, the TypeError in the _handle_invoke_result_stream method indicates that result.delta.message.content is a list, while the code attempts to concatenate it as if it were a string. To resolve this, ensure that result.delta.message.content is a string before concatenating it; if it is a list, join its elements into a single string first.
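One way to apply that suggestion is a small normalizing guard before the `+=`. This is only a sketch, not Dify's actual code; the helper name `coerce_content_to_str` and the `{"type": ..., "text": ...}` part shape are assumptions for illustration:

```python
def coerce_content_to_str(content) -> str:
    """Normalize a delta's content to a plain string.

    Some providers return content as a list of parts (plain strings or
    dicts with a "text" field) rather than a single string; join the
    recoverable text in that case.
    """
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        parts = []
        for part in content:
            if isinstance(part, str):
                parts.append(part)
            elif isinstance(part, dict):
                parts.append(str(part.get("text", "")))
        return "".join(parts)
    return str(content)

# Accumulation then becomes safe for both str and list-shaped content:
text = ""
text += coerce_content_to_str([{"type": "text", "text": "hello "}, "world"])
print(text)  # hello world
```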
Additionally, the stream parameter in the HTTP request is set based on the response_mode. If response_mode is "streaming", stream should be set to True; otherwise, it is set to False <sup>[[2]](https://github.com/langgenius/dify/blob/main/sdks/python-client/dify_client/client.py)</sup><sup>[[3]](https://github.com/langgenius/dify/blob/main/sdks/nodejs-client/index.js)</sup>. Ensure that the response_mode is correctly set to "streaming" if a stream response is expected.
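The mapping described above can be sketched as follows; the function name and dict shape are illustrative, not the SDK's exact payload:

```python
def build_request_body(query: str, response_mode: str) -> dict:
    """Build a chat request body where stream mirrors response_mode,
    as the Python and Node.js SDK clients do."""
    return {
        "query": query,
        "response_mode": response_mode,
        # stream is True only when the caller asked for streaming
        "stream": response_mode == "streaming",
    }

print(build_request_body("hi", "streaming")["stream"])  # True
print(build_request_body("hi", "blocking")["stream"])   # False
```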