ORXHESTRA

Compose multi-agent AI systems with async event streaming, composable agent hierarchies, and built-in MCP & A2A integration.

$ pip install orxhestra

Everything you need to orchestrate agents

Agent Ensemble

LLM, ReAct, Sequential, Parallel, and Loop agents. Mix and match to build any orchestration pattern.

Event Streaming

Async event-driven architecture with real-time streaming. Every agent yields typed events you can observe and react to.
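
The streaming model can be sketched in plain asyncio. The event class and agent below are illustrative stand-ins, not orxhestra's actual types: an agent is an async generator, and the caller observes each typed event as it is yielded.

```python
import asyncio
from dataclasses import dataclass

# Illustrative event type -- not orxhestra's actual event classes.
@dataclass
class AgentEvent:
    author: str
    content: str

async def toy_agent(name: str, message: str):
    """A toy agent: an async generator that yields typed events."""
    yield AgentEvent(author=name, content=f"thinking about: {message}")
    await asyncio.sleep(0)  # stand-in for real model or tool calls
    yield AgentEvent(author=name, content="done")

async def collect() -> list[AgentEvent]:
    events = []
    async for event in toy_agent("assistant", "Hello!"):
        events.append(event)  # observe and react to each event as it arrives
    return events

events = asyncio.run(collect())
```

Because every agent speaks the same event vocabulary, the same consumer loop works whether the producer is a single LLM agent or a deeply nested orchestration.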

Composer

Conduct entire agent orchestras declaratively with YAML. No Python wiring needed for complex pipelines.

Tools

Function tools, filesystem, shell, agent-as-tool, and long-running tool support. Extend agents with any capability.
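
The function-tool idea can be sketched as a thin wrapper that derives a tool's name and description from an ordinary Python function. The `FunctionTool` class here is hypothetical, for illustration only; orxhestra's real wrapper may differ.

```python
import inspect
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical wrapper -- orxhestra's actual function-tool class may differ.
@dataclass
class FunctionTool:
    func: Callable[..., Any]

    @property
    def name(self) -> str:
        # Tool name comes straight from the wrapped function.
        return self.func.__name__

    @property
    def description(self) -> str:
        # The docstring doubles as the tool description shown to the model.
        return inspect.getdoc(self.func) or ""

    def call(self, **kwargs: Any) -> Any:
        return self.func(**kwargs)

def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

tool = FunctionTool(word_count)
result = tool.call(text="compose multi-agent systems")
```

The same shape extends naturally to agent-as-tool: wrap an agent's run method instead of a plain function, and a coordinator can invoke a whole sub-agent as if it were a single capability.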

Planners

Choreograph task execution with PlanReAct and TaskPlanner strategies. Agents that think before they act.
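
The plan-then-act idea reduces to a simple loop: produce an ordered list of steps first, then execute them one at a time. This sketch is only the control-flow skeleton; PlanReAct and TaskPlanner internals are not shown, and the function names are illustrative.

```python
# Minimal plan-then-act loop -- the skeleton behind planner strategies.
def plan(goal: str) -> list[str]:
    """Break a goal into ordered steps before doing anything."""
    return [f"research {goal}", f"draft {goal}", f"review {goal}"]

def act(step: str) -> str:
    """Execute one planned step."""
    return f"completed: {step}"

def run(goal: str) -> list[str]:
    results = []
    for step in plan(goal):         # think first...
        results.append(act(step))   # ...then act, step by step
    return results

results = run("summary")
```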

MCP & A2A

First-class Model Context Protocol and Agent-to-Agent protocol integration for cross-service harmonization.

Build agents in minutes

Create powerful AI agents with just a few lines of Python. Stream events in real-time and compose complex orchestration patterns.

  • Works with any LangChain-compatible LLM
  • OpenAI, Anthropic, Google, and custom providers
  • Async-first with full streaming support
  • Session management and memory built-in
  • Type-safe with full Pydantic models
agent.py
from orxhestra import LlmAgent, Runner
from langchain_openai import ChatOpenAI

agent = LlmAgent(
    name="assistant",
    model=ChatOpenAI(model="gpt-5.4"),
    instructions="You are a helpful assistant.",
)

runner = Runner(agent=agent, app_name="my-app")

async for event in runner.astream(
    user_id="user-1",
    session_id="session-1",
    new_message="Hello!",
):
    print(event.content)

A complete ensemble

LlmAgent

Chat model agent with tools, instructions, and structured output

ReActAgent

Reasoning + acting loop with automatic tool use

SequentialAgent

Runs sub-agents in order, like a pipeline

ParallelAgent

Runs sub-agents concurrently for maximum throughput

LoopAgent

Repeats sub-agents until an exit condition is met

A2AAgent

Connects to remote agents via the A2A protocol
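
The sequential and parallel patterns above can be sketched in plain asyncio. The toy agents and helper functions here are illustrative, not orxhestra's API: a sequential composition pipes each agent's output into the next, while a parallel composition fans the same input out to every agent concurrently.

```python
import asyncio
from typing import Awaitable, Callable

# Illustrative: model an agent as an async function from input to output.
Agent = Callable[[str], Awaitable[str]]

async def upper(text: str) -> str:      # toy sub-agent
    return text.upper()

async def exclaim(text: str) -> str:    # toy sub-agent
    return text + "!"

async def sequential(agents: list[Agent], text: str) -> str:
    """Pipe each agent's output into the next, pipeline-style."""
    for agent in agents:
        text = await agent(text)
    return text

async def parallel(agents: list[Agent], text: str) -> list[str]:
    """Run all agents concurrently on the same input."""
    return list(await asyncio.gather(*(agent(text) for agent in agents)))

seq = asyncio.run(sequential([upper, exclaim], "hi"))
par = asyncio.run(parallel([upper, exclaim], "hi"))
```

A loop composition is the same sequential body wrapped in a `while` that re-runs the sub-agents until an exit condition holds.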

Declare. Compose. Run.

compose.yaml
# Define an entire coding agent in YAML
defaults:
  model:
    provider: openai
    name: gpt-5.4

tools:
  filesystem:
    builtin: "filesystem"
  shell:
    builtin: "shell"

agents:
  planner:
    type: llm
    instructions: Output actionable steps for the coder.

  coder:
    type: llm
    tools: [filesystem, shell]
    instructions: Execute the plan. Never ask the user.

  reviewer:
    type: llm
    instructions: Review the coder's work and flag remaining issues.

  dev_loop:
    type: loop
    agents: [coder, reviewer]

  coordinator:
    type: sequential
    agents: [planner, dev_loop]

main_agent: coordinator

Ready to orchestrate?

Get started with orxhestra in minutes.