LiteLLM Integration
Since the Agents SDK no longer uses Assistants, most previously available frameworks became incompatible with it. One of the few frameworks that has been ported to the new SDK is LiteLLM, which you can use to connect your agent to various providers, such as Anthropic, Vertex AI, AWS Bedrock, Azure, and others.
Install LiteLLM
Install LiteLLM to get started with open-source model support:
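The original install snippet did not survive extraction; a minimal version, assuming a standard pip setup, is:

```shell
# Install LiteLLM alongside your existing Agency Swarm setup.
# If you hit dependency conflicts between litellm and openai-agents
# (see Limitations below), try pinning compatible versions.
pip install litellm
```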
Configure Agency Swarm Agent
Create an agent that connects to your LiteLLM proxy:
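The original configuration snippet was lost in extraction; a sketch of an agent routed through a LiteLLM proxy is shown below. The `LitellmModel` import path follows the openai-agents LiteLLM extension, and the proxy URL, model name, and environment variable are placeholders — adjust them to your setup.

```python
import os

from agency_swarm import Agent
from agents.extensions.models.litellm_model import LitellmModel

# Agent backed by a model served through a LiteLLM proxy.
agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model=LitellmModel(
        model="anthropic/claude-3-5-sonnet-20240620",  # any model your proxy serves
        base_url="http://localhost:4000",  # default LiteLLM proxy address
        api_key=os.getenv("LITELLM_API_KEY"),
    ),
)
```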
Create and Run Agency
Set up your agency and start using open-source models:
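A minimal sketch of wiring the patched agent into an agency and sending it a message; the method names follow the Agency Swarm v1 API and are assumptions — check your installed version.

```python
from agency_swarm import Agency

# The first positional agent is the agency's entry point.
agency = Agency(agent)

# Synchronous convenience wrapper (name assumed from Agency Swarm v1).
result = agency.get_response_sync("Write a haiku about open-source models")
print(result.final_output)
```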
Using model-specific tools
Some models, such as Gemini or Claude, come with built-in model-specific tools. These can be attached to an agent through the `extra_body` parameter in the agent's `model_settings`.
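For example, a sketch of attaching a provider's built-in tool (here, Gemini's Google Search) through `extra_body`; the exact tool payload shape is an assumption — check the provider's documentation for the current schema.

```python
from agency_swarm import Agent
from agents import ModelSettings
from agents.extensions.models.litellm_model import LitellmModel

# extra_body is forwarded to the underlying provider request, which is
# how model-internal tools are enabled (payload shape is provider-specific).
researcher = Agent(
    name="Researcher",
    instructions="Answer using up-to-date information.",
    model=LitellmModel(model="gemini/gemini-2.0-flash"),
    model_settings=ModelSettings(
        extra_body={"tools": [{"googleSearch": {}}]},  # assumed Gemini tool schema
    ),
)
```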
Limitations
Be aware of the limitations when using open-source models.
- Hosted tools are not supported: Patched agents cannot use hosted tools such as WebSearch, FileSearch, CodeInterpreter, and others.
- Patched and unpatched models should not use handoffs to communicate: You may use the standard OpenAI client and patched agents in a single agency; however, using a handoff to transfer a chat from a patched model to an unpatched one, or vice versa, will lead to an error.
- Function calling may not be supported by some open-source models: This limitation prevents the agent from communicating with other agents in the agency. Therefore, it must be positioned at the end of the agency chart and cannot utilize any tools.
- RAG is typically limited: Most open-source implementations have restricted Retrieval-Augmented Generation capabilities. It is recommended to develop a custom tool with your own vector database.
- Potential library conflicts: The Agents SDK is still a fairly new framework that is being actively developed and improved, so there may be conflicts between the litellm and openai-agents packages on recent releases.
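To illustrate the custom-RAG recommendation above, here is a dependency-free sketch of a retrieval helper over an in-memory vector store that you could wrap in an agent tool. The `embed` function is a toy stand-in for a real embedding model, and the store stands in for a real vector database.

```python
from math import sqrt

def embed(text: str) -> list[float]:
    # Toy embedding: character-frequency vector over a-z.
    # A real tool would call an embedding model here.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    """Minimal in-memory stand-in for a vector database."""

    def __init__(self) -> None:
        self.docs: list[tuple[str, list[float]]] = []

    def add(self, text: str) -> None:
        self.docs.append((text, embed(text)))

    def search(self, query: str, k: int = 1) -> list[str]:
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

store = VectorStore()
store.add("LiteLLM routes requests to many model providers.")
store.add("Agency Swarm coordinates multiple agents.")
print(store.search("Which library talks to model providers?")[0])
```

A custom tool built this way sidesteps the hosted FileSearch limitation, since retrieval runs entirely in your own code before the result is handed back to the model.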