FastClaw is a high-performance AI agent framework with multi-agent support, multi-channel integration, and a comprehensive set of extensible features.

## Project Structure
```
fastclaw/
├── core/                  # Core framework
│   ├── gateway.py         # WebSocket gateway
│   ├── agent.py           # Agent runtime & loop
│   ├── llm_client.py      # LLM client integration
│   ├── tools.py           # Tool registry
│   ├── multi_agent.py     # Multi-agent orchestrator
│   ├── workspace.py       # Workspace management
│   ├── audit.py           # Audit logging
│   ├── error_handler.py   # Error handling
│   ├── plugin.py          # Plugin system
│   ├── cli.py             # CLI interface
│   ├── llm/               # LLM adapters
│   │   ├── base.py
│   │   ├── openai_adapter.py
│   │   ├── anthropic_adapter.py
│   │   └── ollama_adapter.py
│   ├── mcp/               # MCP protocol
│   │   ├── manager.py
│   │   └── protocol.py
│   └── skills/            # Skills system
│       ├── base.py
│       ├── manager.py
│       └── registry.py
├── adapters/              # Channel adapters
│   ├── base.py
│   ├── telegram.py
│   ├── slack.py
│   ├── discord.py
│   ├── feishu.py
│   ├── dingtalk.py
│   ├── qq.py
│   ├── wecom.py
│   ├── whatsapp.py
│   └── webhook.py
├── storage/               # Storage implementations
│   ├── session_store.py
│   └── memory_store.py
├── workspace/             # Agent workspace (markdown files)
│   ├── AGENTS.md
│   ├── SOUL.md
│   ├── TOOLS.md
│   └── MEMORY.md
├── state/                 # Runtime state
│   ├── sessions/
│   └── memory/
├── ui/                    # Web interface
│   └── index.html
├── main.py                # Entry point
├── config.yaml            # Configuration file
├── ARCHITECTURE.md        # Architecture documentation
├── ROADMAP.md             # Project roadmap
└── requirements.txt       # Python dependencies
```
## Quick Start

```bash
# Create virtual environment
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Set environment variables
export DEEPSEEK_API_KEY="your_api_key_here"
export DEEPSEEK_BASE_URL="https://api.deepseek.com/v1"

# Start the server
python3 main.py

# Access the web interface
# Open http://localhost:8000 in your browser
```
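Once the server is running, the chat endpoint can be exercised from Python with only the standard library. Since `/api/v1/chat` is described as OpenAI-compatible, the sketch below assumes an OpenAI-style chat-completions request body; the exact fields have not been verified against FastClaw's handler:

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "deepseek-chat") -> dict:
    """Build an OpenAI-style chat payload (assumed shape for POST /api/v1/chat)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_chat(prompt: str, base_url: str = "http://localhost:8000") -> dict:
    """POST the payload to the running FastClaw server and return the JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/chat",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    print(send_chat("Hello, FastClaw!"))
```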
## API Endpoints

- `POST /api/v1/chat` - OpenAI-compatible HTTP API
- `WS /api/ws` - WebSocket endpoint for real-time communication
- `GET /api/v1/tools` - List available tools
- `GET /api/v1/sessions` - List active sessions
- `GET /` - Web interface

## Configuration

Edit `config.yaml` to configure:
```yaml
llm:
  provider: "openai"  # openai | anthropic | ollama
  openai:
    api_key: "${DEEPSEEK_API_KEY}"
    base_url: "${DEEPSEEK_BASE_URL}"
    model: "deepseek-chat"

agents:
  - id: "coordinator"
    role: "coordinator"
    model: "deepseek-chat"
    enabled: true

channels:
  telegram:
    enabled: false
    bot_token: "${TELEGRAM_BOT_TOKEN}"
  slack:
    enabled: false
    bot_token: "${SLACK_BOT_TOKEN}"
```
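The `${VAR}` placeholders imply environment-variable substitution when the config is loaded. A minimal sketch of how such a loader could work — this illustrates the pattern only, and is not FastClaw's actual `config.yaml` parser:

```python
import os
import re

# Matches ${NAME} where NAME is an uppercase env-var-style identifier.
_VAR = re.compile(r"\$\{([A-Z0-9_]+)\}")

def expand_env(value: str) -> str:
    """Replace ${NAME} placeholders with os.environ values (missing vars -> '')."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

def expand_tree(node):
    """Recursively expand placeholders in a parsed YAML structure."""
    if isinstance(node, dict):
        return {k: expand_tree(v) for k, v in node.items()}
    if isinstance(node, list):
        return [expand_tree(v) for v in node]
    if isinstance(node, str):
        return expand_env(node)
    return node

if __name__ == "__main__":
    os.environ["DEEPSEEK_API_KEY"] = "sk-test"
    cfg = {"llm": {"openai": {"api_key": "${DEEPSEEK_API_KEY}"}}}
    print(expand_tree(cfg))
```

Keeping secrets in environment variables rather than in `config.yaml` itself means the file can be committed without leaking tokens.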
## Built-in Tools

- `bash` - Execute shell commands (with approval)
- `file_read` - Read file contents
- `file_write` - Write file contents
- `file_list` - List directory contents
- `calculator` - Evaluate mathematical expressions
- `memory_add` - Add to long-term memory
- `memory_search` - Search memory

## CLI Usage

```bash
# List all CLI commands
python3 -m core.cli --help

# Start agent with specific configuration
python3 -m core.cli start --config config.yaml

# List available tools
python3 -m core.cli tools list

# Run a test query
python3 -m core.cli query "Hello, FastClaw!"
```
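A tool like the built-in `calculator` presumably avoids raw `eval` on model-supplied input. One safe way to evaluate arithmetic is to walk the parsed AST against a whitelist of operators; the sketch below is an illustrative implementation of that idea, not FastClaw's actual tool:

```python
import ast
import operator

# Whitelisted operators; any other node type is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calculate(expression: str) -> float:
    """Evaluate a basic arithmetic expression without eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval"))

print(calculate("2 + 3 * 4"))  # 14
```

Because only numeric constants and whitelisted operators are walked, inputs such as `__import__('os')` raise `ValueError` instead of executing code.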
## License

MIT License