# Agent Backends
Mudabbir supports three agent backends. Each provides different capabilities and trade-offs.
## Backend Comparison
| Feature | Claude Agent SDK | Mudabbir Native | Open Interpreter |
|---|---|---|---|
| Provider | Anthropic or Ollama | Anthropic or Ollama | Ollama, OpenAI, or Anthropic |
| Built-in tools | Bash, Read, Write, Edit | Custom + OI execution | OI tools |
| Local models | Yes (Ollama) | Yes (Ollama) | Yes (Ollama) |
| Code execution | Claude tools | Open Interpreter | Open Interpreter |
| Streaming | Yes | Yes | Yes |
| MCP support | Native | Via tool registry | Via tool registry |
| Recommended for | Coding, complex tasks | Balanced workflows | Local/offline use |
## Switching Backends
Set the backend via an environment variable or in the config:
```bash
export MUDABBIR_AGENT_BACKEND="claude_agent_sdk"   # default

# or
export MUDABBIR_AGENT_BACKEND="mudabbir_native"

# or
export MUDABBIR_AGENT_BACKEND="open_interpreter"
```

Or change it in the web dashboard’s Settings panel.
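To make the choice persist across shell sessions, you can append the export to your shell profile. A minimal sketch, assuming a Bash setup (use `~/.zshrc` or your shell's equivalent where appropriate):

```bash
# Persist the backend selection for future sessions.
echo 'export MUDABBIR_AGENT_BACKEND="mudabbir_native"' >> ~/.bashrc

# Reload the profile so the current shell picks up the change.
source ~/.bashrc
```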
## Backend Details
### Claude Agent SDK
The recommended backend with native Claude tools and MCP support.
### Mudabbir Native
Custom orchestrator combining Anthropic SDK with Open Interpreter.
### Open Interpreter
Standalone engine with Ollama support for fully local operation.
### Ollama (Local LLMs)
Run fully locally with any backend: no API keys, no cloud, no costs.
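As a rough sketch of a fully local setup, assuming Ollama is already installed and using `llama3.1` only as an example model (any of the three backends can target Ollama; `open_interpreter` is shown here):

```bash
# Pull an example local model (substitute any Ollama model you prefer).
ollama pull llama3.1

# Start the Ollama server if it is not already running as a service.
ollama serve &

# Select a backend; all three can run against Ollama for fully local use.
export MUDABBIR_AGENT_BACKEND="open_interpreter"
```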