sherma

A Python framework for building LLM-powered agents that bridges A2A, LangGraph, and Agent Skills through a unified, declarative interface.

What is sherma?

sherma lets you define agents in YAML with Common Expression Language (CEL) for dynamic logic, while still giving you full programmatic control when you need it. It handles the wiring between protocols so you can focus on agent behavior.

Every utility in sherma – registries, tool wrapping, hooks, message converters, multi-agent primitives – works as a standalone building block. You can use them with DeclarativeAgent for zero-code YAML agents, with LangGraphAgent for custom graph construction, or mix both approaches.

Core value proposition:

- Declarative by default: define agents in YAML with CEL for dynamic logic, no glue code required.
- Composable building blocks: every utility works standalone or alongside custom LangGraph graphs.
- Protocol bridging: A2A, LangGraph, and Agent Skills behind one unified interface.

Documentation

Getting Started: Installation, setup, and your first agent
Core Concepts: Entities, registries, versioning, and the type system
Declarative Agents: YAML schema reference, node types, edges, and CEL expressions
Multi-Agent: Sub-agent orchestration, agent-as-tool wrapping
Skills: Skill cards, progressive disclosure, MCP and local tool integration
Hooks: Lifecycle hooks for observability, guardrails, and control flow
A2A Integration: A2A protocol support, agent executor, message conversion
API Reference: Exported classes, functions, and type definitions

Quick Example

# weather-agent.yaml
prompts:
  - id: weather-prompt
    version: "1.0.0"
    instructions: >
      You are a helpful weather assistant.
      Use the get_weather tool to look up current weather.

llms:
  - id: openai-gpt-4o-mini
    version: "1.0.0"
    provider: openai
    model_name: gpt-4o-mini

tools:
  - id: get_weather
    version: "1.0.0"
    import_path: my_tools.get_weather

agents:
  weather-agent:
    state:
      fields:
        - name: messages
          type: list
          default: []

    graph:
      entry_point: agent
      nodes:
        - name: agent
          type: call_llm
          args:
            llm:
              id: openai-gpt-4o-mini
              version: "1.0.0"
            prompt:
              - role: system
                content: 'prompts["weather-prompt"]["instructions"]'
              - role: messages
                content: 'messages'
            tools:
              - id: get_weather
                version: "1.0.0"

      edges:
        - source: agent
          target: __end__

Then drive the agent from Python:

import asyncio
from sherma import DeclarativeAgent

agent = DeclarativeAgent(
    id="weather-agent",
    version="1.0.0",
    yaml_path="weather-agent.yaml",
)

# Use with A2A messages
from a2a.types import Message, Part, Role, TextPart

msg = Message(
    message_id="1",
    role=Role.user,
    parts=[Part(root=TextPart(text="What's the weather in Tokyo?"))],
)

async def main():
    async for event in agent.send_message(msg):
        print(event)

asyncio.run(main())
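The YAML above points its get_weather tool at my_tools.get_weather. That module is your own code, not part of sherma; a minimal sketch of what it might contain (the function signature and return shape are assumptions, only the import path comes from the YAML):

```python
# my_tools.py -- hypothetical module referenced by import_path in the YAML.
# A real implementation would call a weather API; this stub returns canned
# data so the agent wiring can be exercised end to end.

def get_weather(location: str) -> dict:
    """Look up current weather for a location (stubbed)."""
    return {
        "location": location,
        "condition": "clear",
        "temperature_c": 21,
    }
```
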

Installation

pip install sherma

Or with uv:

uv add sherma

Requires Python 3.13+.