
Using the Anthropic API with streaming and handoffs raises an error #419


Closed
Daose1997 opened this issue Apr 2, 2025 · 0 comments · Fixed by #422
Labels
bug Something isn't working

Comments


Daose1997 commented Apr 2, 2025

I used Claude's model following the CUSTOM_MODEL_PROVIDER approach, as shown below:

from openai import AsyncOpenAI
from agents import Model, ModelProvider, OpenAIChatCompletionsModel

# Point the OpenAI client at Anthropic's OpenAI-compatible endpoint.
client = AsyncOpenAI(
    base_url="https://api.anthropic.com/v1/",
    api_key="sk-xxxxxxxxxxxxxxx-AAA",
)

class CustomModelProvider(ModelProvider):
    def get_model(self, model_name: str | None) -> Model:
        return OpenAIChatCompletionsModel(model="claude-3-5-haiku-20241022", openai_client=client)
        # return OpenAIChatCompletionsModel(model="claude-3-7-sonnet-20250219", openai_client=client)

CUSTOM_MODEL_PROVIDER = CustomModelProvider()
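
The handoff targets passed to tracking_task_agent below are not included in the report. For context, a minimal, hypothetical sketch of their shape — the variable names come from the snippet, the instructions are placeholders, and the "Eve" name is only inferred from the transfer_to_eve call in the output:

from agents import Agent

# Hypothetical stand-ins for the handoff targets; only the variable names
# appear in the report. "Eve" is inferred from the transfer_to_eve tool call.
project_manager = Agent(
    name="Eve",
    instructions="Analyze requirements and design the product.",
)
architect = Agent(
    name="architect",
    instructions="Design the system architecture.",
)
engineer = Agent(
    name="engineer",
    instructions="Implement the assigned tasks.",
)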


tracking_task_agent = Agent(
    name="team_leader",
    instructions=TRACKING_TASK_TEMPLATE.format(work_id=work_id),
    handoffs=[project_manager, architect, engineer],
    tools=[execute_mysql_sql],
)

result = Runner.run_streamed(
    starting_agent=tracking_task_agent,
    input=first_input,
    context=context,
    max_turns=30,
    run_config=RunConfig(model_provider=CUSTOM_MODEL_PROVIDER),
)
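
For reference, a minimal sketch of how the streamed result would then be consumed, assuming the SDK's stream_events() interface; the event handling here is illustrative, not the reporter's actual code:

# Sketch: drain the stream and watch for the handoff, assuming the
# openai-agents stream_events() API on the streamed result.
async for event in result.stream_events():
    if event.type == "raw_response_event":
        continue  # token-level deltas, skipped for brevity
    elif event.type == "agent_updated_stream_event":
        print(f"Handoff: now running {event.new_agent.name}")
    elif event.type == "run_item_stream_event":
        print(f"New item: {event.item.type}")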

When I use Runner.run, the program runs normally and completes my entire workflow. When I use Runner.run_streamed, as soon as the run reaches the handoff step, the API call returns the following error:
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_request_error', 'message': 'Failed to parse JSON: ', 'type': 'invalid_request_error', 'param': None}}
(Note the empty payload after "Failed to parse JSON:" — the server appears to have been asked to parse an empty string.)

The LLM's final output is:

[
  {
    "role": "assistant",
    "content": "I will help you plan this website project. First, I will use the Eve agent for requirements analysis and product design.",
    "tool_calls": [
      {
        "id": "toolu_01SnwkRUjwXYHygLL2uSy1gZ",
        "type": "function",
        "function": {
          "name": "transfer_to_eve",
          "arguments": ""
        }
      }
    ]
  },
  {
    "role": "tool",
    "tool_call_id": "toolu_01SnwkRUjwXYHygLL2uSy1gZ",
    "content": "{'assistant': 'Eve'}"
  }
]
Daose1997 added the bug label on Apr 2, 2025
rm-openai added a commit that referenced this issue Apr 2, 2025
## Summary:
Resolves #419. Unclear why, but `""` was raising errors.

## Test Plan:

Confirmed things work with both OpenAI and Anthropic.
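
For anyone hitting this before upgrading, a hedged sketch of the kind of guard the fix implies, assuming the 400 comes from replaying a tool call whose arguments field is the empty string (which is not valid JSON); the helper name is hypothetical:

# Hypothetical workaround: coerce empty tool-call arguments to an empty
# JSON object before the message list is re-sent to the API.
def normalize_tool_call_arguments(message: dict) -> dict:
    for tool_call in message.get("tool_calls", []):
        function = tool_call.get("function", {})
        if not function.get("arguments"):
            function["arguments"] = "{}"  # "" is not parseable JSON
    return message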