While OpenAI is generally recommended, there are situations where you might prefer open-source models. Agency Swarm supports various open-source model integrations as alternatives to OpenAI:

LiteLLM Integration

Since the Agents SDK no longer uses the Assistants API, most previously available framework integrations became incompatible with it. One of the few libraries that has been ported to the new SDK is LiteLLM, which you can use to connect your agents to various providers, such as Anthropic, Vertex AI, AWS Bedrock, and Azure.
1. Install LiteLLM

Install LiteLLM to get started with open-source model support:
pip install "openai-agents[litellm]"
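LiteLLM reads provider credentials from environment variables. As a sketch (the variable names follow the examples in this guide; other providers use their own names):

```shell
# Set the key for the provider you plan to use (names are provider-specific)
export GEMINI_API_KEY="your-gemini-key"
export XAI_API_KEY="your-xai-key"
```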
2. Configure Agency Swarm Agent

Create an agent that uses a LiteLLM-backed model:
import os
from agency_swarm import Agent
from agents.extensions.models.litellm_model import LitellmModel

gemini_agent = Agent(
    name="GeminiAgent",
    instructions="You are a helpful assistant",
    model=LitellmModel(
      model="gemini/gemini-2.0-flash",
      api_key=os.getenv("GEMINI_API_KEY")
    )
)
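Because the API key is read when the model is constructed, a missing environment variable only surfaces later as an authentication error. It can help to fail fast instead. A minimal sketch (the `require_env` helper is hypothetical, not part of Agency Swarm):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise a clear error."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

# Usage: pass api_key=require_env("GEMINI_API_KEY") instead of os.getenv(...)
```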
3. Create and Run Agency

Set up your agency and start using open-source models:
from agency_swarm import Agency

agency = Agency(gemini_agent)

agency.terminal_demo()

Using model-specific tools

Some models, such as Gemini or Grok, provide native (built-in) tools, which can be attached to an agent by passing the extra_body parameter:
import os
from agency_swarm import Agent
from agents.extensions.models.litellm_model import LitellmModel

gemini_agent = Agent(
    name="GeminiAgent",
    instructions="You are a helpful assistant",
    model=LitellmModel(
      model="gemini/gemini-2.0-flash",
      api_key=os.getenv("GEMINI_API_KEY"),
      extra_body={"web_search_options": {"search_context_size": "medium"}}
    )
)

grok_agent = Agent(
    name="GrokAgent",
    instructions="You are a helpful assistant",
    model=LitellmModel(
      model="xai/grok-4-0709",
      api_key=os.getenv("XAI_API_KEY"),
      extra_body={
        "search_parameters": {
          "mode": "on", "returnCitations": True
        }
      }
    )
)
Here, both the Grok and Gemini agents will be able to use their native search tools, which are similar to OpenAI's WebSearch tool. Check LiteLLM's documentation for the full list of tools supported by each provider.
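If several agents share these payloads, the provider-specific extra_body values can be kept in one place. A small sketch (the helper and mapping names are hypothetical; the payloads are the ones shown in the examples above):

```python
# Hypothetical helper: map a LiteLLM model prefix to its native-tool payload.
NATIVE_TOOL_PAYLOADS = {
    "gemini": {"web_search_options": {"search_context_size": "medium"}},
    "xai": {"search_parameters": {"mode": "on", "returnCitations": True}},
}

def extra_body_for(model: str) -> dict:
    """Return the extra_body payload for a model string like 'gemini/gemini-2.0-flash'."""
    provider = model.split("/", 1)[0]
    return NATIVE_TOOL_PAYLOADS.get(provider, {})
```

Each agent can then pass extra_body=extra_body_for(model) when constructing its LitellmModel.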

Limitations

Be aware of the following limitations when using open-source models:
  • Hosted tools are not supported: Patched agents are not able to utilize hosted tools, such as WebSearch, FileSearch, CodeInterpreter and others.
  • Patched and unpatched models should not communicate via handoffs: You may use standard OpenAI-client agents and patched agents in a single agency; however, using a handoff to transfer a chat from a patched model to an unpatched one (or vice versa) will lead to an error.
  • Function calling may not be supported by some open-source models: Without function calling, the agent cannot communicate with other agents in the agency, so it must be positioned at the end of the agency chart and cannot use any tools.
  • RAG is typically limited: Most open-source implementations have restricted Retrieval-Augmented Generation capabilities. It is recommended to develop a custom tool with your own vector database.
  • Potential library conflicts: The Agents SDK is still a fairly new framework under active development, so there may be version conflicts between the litellm and openai-agents packages on recent releases.
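For the RAG limitation above, a custom retrieval tool can be as simple as cosine similarity over pre-computed embeddings. A minimal, dependency-free sketch (the documents, embeddings, and function names are illustrative; a real setup would use an embedding model and a vector database):

```python
import math

# Toy pre-computed embeddings; in practice these come from an embedding model.
DOCUMENTS = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "warranty terms": [0.0, 0.2, 0.9],
}

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_embedding: list[float], top_k: int = 1) -> list[str]:
    """Return the top_k document names ranked by cosine similarity."""
    ranked = sorted(
        DOCUMENTS,
        key=lambda name: cosine_similarity(DOCUMENTS[name], query_embedding),
        reverse=True,
    )
    return ranked[:top_k]
```

A function like this can then be exposed to the agent as a regular (non-hosted) tool, keeping retrieval fully under your control regardless of the model provider.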

Future Plans

Updates will be provided as new open-source model integrations stabilize. If you successfully integrate other projects with agency-swarm, please share your experience through an issue or pull request.