tRPC-Agent-Go
A powerful Go framework for building intelligent agent systems that transforms how you create AI applications. Build autonomous agents that think, remember, collaborate, and act with unprecedented ease.
Why tRPC-Agent-Go?
- SKILL.md workflows with safe execution
Ready to dive into tRPC-Agent-Go? Our documentation covers everything from basic concepts to advanced techniques, helping you build powerful AI applications with confidence. Whether you're new to AI agents or an experienced developer, you'll find detailed guides, practical examples, and best practices to accelerate your development journey.
See it in action: demo GIF showing agent reasoning and tool usage.
Get started in 3 simple steps:
```bash
# 1. Clone and set up
git clone https://github.com/trpc-group/trpc-agent-go.git
cd trpc-agent-go

# 2. Configure your LLM
export OPENAI_API_KEY="your-api-key-here"
export OPENAI_BASE_URL="your-base-url-here"  # Optional

# 3. Run your first agent!
cd examples/runner
go run . -model="gpt-4o-mini" -streaming=true
```
What you'll see:
Try asking: "What's the current time? Then calculate 15 * 23 + 100"
```go
package main

import (
	"context"
	"fmt"
	"log"

	"trpc.group/trpc-go/trpc-agent-go/agent/llmagent"
	"trpc.group/trpc-go/trpc-agent-go/model"
	"trpc.group/trpc-go/trpc-agent-go/model/openai"
	"trpc.group/trpc-go/trpc-agent-go/runner"
	"trpc.group/trpc-go/trpc-agent-go/tool"
	"trpc.group/trpc-go/trpc-agent-go/tool/function"
)

func main() {
	// Create the model.
	modelInstance := openai.New("deepseek-chat")

	// Create the tool.
	calculatorTool := function.NewFunctionTool(
		calculator,
		function.WithName("calculator"),
		function.WithDescription("Execute addition, subtraction, multiplication, and division. "+
			"Parameters: a, b are numeric values, op takes values add/sub/mul/div; "+
			"returns result as the calculation result."),
	)

	// Enable streaming output.
	genConfig := model.GenerationConfig{
		Stream: true,
	}

	// Create the agent.
	agent := llmagent.New("assistant",
		llmagent.WithModel(modelInstance),
		llmagent.WithTools([]tool.Tool{calculatorTool}),
		llmagent.WithGenerationConfig(genConfig),
	)

	// Create the runner. A distinct variable name avoids shadowing the runner package.
	r := runner.NewRunner("calculator-app", agent)

	// Execute the conversation.
	ctx := context.Background()
	events, err := r.Run(ctx,
		"user-001",
		"session-001",
		model.NewUserMessage("Calculate what 2+3 equals"),
	)
	if err != nil {
		log.Fatal(err)
	}

	// Process the event stream.
	for event := range events {
		if event.Object == "chat.completion.chunk" {
			fmt.Print(event.Response.Choices[0].Delta.Content)
		}
	}
	fmt.Println()
}

func calculator(ctx context.Context, req calculatorReq) (calculatorRsp, error) {
	var result float64
	switch req.Op {
	case "add", "+":
		result = req.A + req.B
	case "sub", "-":
		result = req.A - req.B
	case "mul", "*":
		result = req.A * req.B
	case "div", "/":
		if req.B == 0 {
			return calculatorRsp{}, fmt.Errorf("division by zero")
		}
		result = req.A / req.B
	default:
		return calculatorRsp{}, fmt.Errorf("invalid operation: %s", req.Op)
	}
	return calculatorRsp{Result: result}, nil
}

type calculatorReq struct {
	A  float64 `json:"A" jsonschema:"description=First operand,required"`
	B  float64 `json:"B" jsonschema:"description=Second operand,required"`
	Op string  `json:"Op" jsonschema:"description=Operation type,enum=add,enum=sub,enum=mul,enum=div,required"`
}

type calculatorRsp struct {
	Result float64 `json:"result"`
}
```
Sometimes your Agent must be created per request (for example: different
prompt, model, tools, sandbox instance). In that case, you can let Runner build
a fresh Agent for every Run(...):
```go
r := runner.NewRunnerWithAgentFactory(
	"my-app",
	"assistant",
	func(ctx context.Context, ro agent.RunOptions) (agent.Agent, error) {
		// Use ro to build an Agent for this request.
		a := llmagent.New("assistant",
			llmagent.WithInstruction(ro.Instruction),
		)
		return a, nil
	},
)

events, err := r.Run(ctx,
	"user-001",
	"session-001",
	model.NewUserMessage("Hello"),
	agent.WithInstruction("You are a helpful assistant."),
)
_ = events
_ = err
```
If you want to interrupt a running agent, cancel the context you passed to
Runner.Run (recommended). This stops model calls and tool calls safely and
lets the runner clean up.
Important: do not just “break” your event loop and walk away — the agent goroutine may keep running and can block on channel writes. Always cancel, then keep draining the event channel until it is closed.
Convert Ctrl+C into context cancellation:
```go
ctx, stop := signal.NotifyContext(context.Background(), os.Interrupt)
defer stop()

events, err := r.Run(ctx, userID, sessionID, message)
if err != nil {
	return err
}
for range events {
	// Drain until the runner stops (ctx canceled or run completed).
}
```
To cancel programmatically (for example, after a timeout), use context.WithCancel:

```go
ctx, cancel := context.WithCancel(context.Background())
defer cancel()

events, err := r.Run(ctx, userID, sessionID, message)
if err != nil {
	return err
}

go func() {
	time.Sleep(2 * time.Second)
	cancel()
}()

for range events {
	// Keep draining until the channel is closed.
}
```
You can also cancel a specific run by request ID through the ManagedRunner interface:

```go
requestID := "req-123"
events, err := r.Run(ctx, userID, sessionID, message,
	agent.WithRequestID(requestID),
)

mr := r.(runner.ManagedRunner)
_ = mr.Cancel(requestID)
```
For more details (including detached cancellation, resume, and server cancel
routes), see docs/mkdocs/en/runner.md and docs/mkdocs/en/agui.md.
The examples directory contains runnable demos covering every major feature.
Example: examples/llmagent
LLMAgent – emits event.Event updates while the model streams.
Example: examples/multiagent
Example: examples/graph
GraphAgent – demonstrates building and executing complex, conditional
workflows using the graph and agent/graph packages. It shows
how to construct a graph-based agent, manage state safely, implement
conditional routing, and orchestrate execution with the Runner.
Multi-conditional fan-out routing:
```go
// Return multiple branch keys and run targets in parallel.
sg := graph.NewStateGraph(schema)
sg.AddNode("router", func(ctx context.Context, s graph.State) (any, error) {
	return nil, nil
})
sg.AddNode("A", func(ctx context.Context, s graph.State) (any, error) {
	return graph.State{"a": 1}, nil
})
sg.AddNode("B", func(ctx context.Context, s graph.State) (any, error) {
	return graph.State{"b": 1}, nil
})
sg.SetEntryPoint("router")
sg.AddMultiConditionalEdges(
	"router",
	func(ctx context.Context, s graph.State) ([]string, error) {
		return []string{"goA", "goB"}, nil
	},
	map[string]string{"goA": "A", "goB": "B"}, // Path map or ends map.
)
sg.SetFinishPoint("A").SetFinishPoint("B")
```
Example: examples/memory
Example: examples/knowledge
Example: examples/telemetry
Example: examples/mcptool
Example: examples/agui
Example: examples/evaluation
Example: examples/skillrun
SKILL.md spec + optional docs/scripts. Provides the skill_load, skill_list_docs, skill_select_docs, and skill_run tools (skill_run runs commands in an isolated workspace). Use skill_run only for commands required by the selected skill docs, not for generic shell exploration.
Example: examples/artifact
Example: examples/a2aadk
Example: openclaw
Other notable examples:
See individual README.md files in each example folder for usage details.
Architecture
Key packages:
| Package | Responsibility |
|---|---|
| `agent` | Core execution unit, responsible for processing user input and generating responses. |
| `runner` | Agent executor, responsible for managing the execution flow and connecting Session/Memory Service capabilities. |
| `model` | Supports multiple LLM models (OpenAI, DeepSeek, etc.). |
| `tool` | Provides various tool capabilities (Function, MCP, DuckDuckGo, etc.). |
| `session` | Manages user session state and events. |
| `memory` | Records user long-term memory and personalized information. |
| `knowledge` | Implements RAG knowledge-retrieval capabilities. |
| `planner` | Provides agent planning and reasoning capabilities. |
| `artifact` | Stores and retrieves versioned files produced by agents and tools (images, reports, etc.). |
| `skill` | Loads and executes reusable Agent Skills defined by SKILL.md. |
| `event` | Defines event types and streaming payloads used across the Runner and servers. |
| `evaluation` | Evaluates agents on eval sets using pluggable metrics and stores results. |
| `server` | Exposes HTTP servers (Gateway, AG-UI, A2A) for integration and UIs. |
| `telemetry` | OpenTelemetry tracing and metrics instrumentation. |
For most applications you do not need to implement the agent.Agent
interface yourself. The framework already ships with several ready-to-use
agents that you can compose like Lego bricks:
| Agent | Purpose |
|---|---|
| `LLMAgent` | Wraps an LLM chat-completion model as an agent. |
| `ChainAgent` | Executes sub-agents sequentially. |
| `ParallelAgent` | Executes sub-agents concurrently and merges their output. |
| `CycleAgent` | Loops over a planner + executor until a stop signal. |
For example, chaining two LLM agents into a pipeline:

```go
// 1. Create a base LLM agent.
base := llmagent.New(
	"assistant",
	llmagent.WithModel(openai.New("gpt-4o-mini")),
)

// 2. Create a second LLM agent with a different instruction.
translator := llmagent.New(
	"translator",
	llmagent.WithInstruction("Translate everything to French"),
	llmagent.WithModel(openai.New("gpt-3.5-turbo")),
)

// 3. Combine them in a chain.
pipeline := chainagent.New(
	"pipeline",
	chainagent.WithSubAgents([]agent.Agent{base, translator}),
)

// 4. Run through the runner for sessions & telemetry.
run := runner.NewRunner("demo-app", pipeline)
events, _ := run.Run(ctx, "user-1", "sess-1",
	model.NewUserMessage("Hello!"))
for ev := range events { /* ... */ }
```
The composition API lets you nest chains, cycles, or parallels to build complex workflows without low-level plumbing.
We love contributions! Join our growing community of developers building the future of AI agents.
```bash
# Fork & clone the repo
git clone https://github.com/YOUR_USERNAME/trpc-agent-go.git
cd trpc-agent-go

# Run tests to ensure everything works
go test ./...
go vet ./...

# Make your changes and submit a PR!
```
Please read CONTRIBUTING.md for detailed guidelines and coding standards.
Special thanks to Tencent's business units including Tencent Yuanbao, Tencent Video, Tencent News, IMA, and QQ Music for their invaluable support and real-world validation. Production usage drives framework excellence!
Inspired by amazing frameworks like ADK, Agno, CrewAI, AutoGen, and many others. Standing on the shoulders of giants!
Licensed under the Apache 2.0 License - see LICENSE file for details.