# Pi Agent
Pi Agent is the open-source TypeScript monorepo toolkit powering OpenClaw. Created by Mario Zechner (badlogic), it provides a full agent stack from unified LLM API to coding agent CLI, terminal UI, web UI, Slack bot, and vLLM pod support.
- **GitHub:** https://github.com/earendil-works/pi
- **Website:** https://pi.dev
- **License:** MIT
- **Language:** TypeScript
## Installation

### Install via npm

```bash
# Install the coding agent CLI globally
npm install -g @pi-agent/cli

# Or install from source
git clone https://github.com/earendil-works/pi.git
cd pi
npm install
npm run build
```
### Self-Update

```bash
# Update Pi Agent to the latest version
pi update

# Check current version
pi --version
```
## Core Packages

Pi Agent is a monorepo with modular packages that can be used independently or together.

| Package | Description |
|---|---|
| `pi-ai` | Unified LLM API — streaming, completions, tool definitions, cost tracking |
| `pi-agent-core` | Agent loop with tool calling, message management, context overflow handling |
| `pi-coding-agent` | Built-in tools (read, write, edit, bash) plus session persistence |
| `pi-tui` | Terminal UI library for interactive agent sessions |
| `pi-web-ui` | Web-based UI for agent interaction |
| `pi-slack` | Slack bot integration for team-based agent access |
| `pi-vllm` | vLLM pod support for self-hosted model inference |
## Quick Start

### Start an Interactive Session

```bash
# Launch the coding agent in the current directory
pi

# Start with a specific prompt
pi "Explain the architecture of this project"

# Run a one-shot task
pi --prompt "Add error handling to the API routes"
```
### Project Configuration

```bash
# Initialize a Pi Agent config in your project
pi init

# This creates a .pi/ directory with configuration files
```
## Default Tools

The coding agent ships with four core tools out of the box.

| Tool | Description |
|---|---|
| `read` | Read file contents; supports partial reads with line ranges |
| `write` | Write or create files with full content |
| `edit` | Apply targeted string replacements to existing files |
| `bash` | Execute shell commands in the project context |
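The `edit` tool's targeted string replacement can be sketched as a pure function. This is a minimal illustration of the usual safety contract for such tools, not Pi Agent's actual implementation: the edit fails when the target string is missing or ambiguous.

```typescript
// Sketch of an edit-style targeted replacement (hypothetical helper,
// not Pi Agent's real code). Rejects missing or ambiguous targets.
function applyEdit(content: string, oldText: string, newText: string): string {
  const first = content.indexOf(oldText);
  if (first === -1) {
    throw new Error("oldText not found in file");
  }
  // Reject ambiguous edits: the target must occur exactly once
  if (content.indexOf(oldText, first + 1) !== -1) {
    throw new Error("oldText occurs more than once; provide more context");
  }
  return content.slice(0, first) + newText + content.slice(first + oldText.length);
}
```

Requiring a unique match is what forces the model to include enough surrounding context in `oldText` to pin down the right location.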
### Tool Usage Examples

```bash
# The agent uses tools automatically based on your request
pi "Read the README and summarize the project"

# Tools are invoked by the agent loop — you describe the task, the agent picks the tools
pi "Fix the failing test in src/utils/parser.test.ts"

# The bash tool runs commands in the project directory
pi "Run the test suite and fix any failures"
```
## LLM Provider Support (pi-ai)

The `pi-ai` package provides a unified API across all major LLM providers.
| Provider | Models | Configuration |
|---|---|---|
| Anthropic | Claude 4, Sonnet 4, Haiku | ANTHROPIC_API_KEY |
| OpenAI | GPT-4o, o3, o4-mini | OPENAI_API_KEY |
| Google | Gemini 2.5 Pro, Flash | GOOGLE_API_KEY |
| AWS Bedrock | Claude, Titan, Llama | AWS credentials |
| Mistral | Mistral Large, Codestral | MISTRAL_API_KEY |
| Groq | Llama, Mixtral | GROQ_API_KEY |
| xAI | Grok | XAI_API_KEY |
| OpenRouter | Multi-provider routing | OPENROUTER_API_KEY |
| Ollama | Local models | OLLAMA_HOST (default: localhost:11434) |
| Azure OpenAI | GPT-4o, o3 | Azure credentials |
### Using pi-ai Programmatically

```typescript
import { createClient } from "pi-ai";

// Create a client with a specific provider
const client = createClient({
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// Simple completion
const response = await client.complete({
  messages: [{ role: "user", content: "Hello, world!" }],
});

// Streaming completion
const stream = await client.stream({
  messages: [{ role: "user", content: "Explain TypeScript generics" }],
});

for await (const chunk of stream) {
  process.stdout.write(chunk.text);
}
```
### Tool Definitions

```typescript
import { defineTool } from "pi-ai";

const searchTool = defineTool({
  name: "web_search",
  description: "Search the web for information",
  parameters: {
    query: { type: "string", description: "Search query" },
  },
  execute: async ({ query }) => {
    // Implementation
    return { results: [] };
  },
});

const response = await client.complete({
  messages: [{ role: "user", content: "Search for TypeScript 5.7 features" }],
  tools: [searchTool],
});
```
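Under the hood, a provider's tool-call response has to be routed to the matching tool's `execute` function. A minimal dispatch sketch follows; the `Tool` and `ToolCall` shapes and the `dispatchToolCall` helper are illustrative stand-ins, not pi-ai's actual types.

```typescript
// Illustrative shapes; pi-ai's real types may differ
interface Tool {
  name: string;
  execute: (args: Record<string, unknown>) => Promise<unknown>;
}
interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

// Route a model-issued tool call to the matching tool and run it
async function dispatchToolCall(tools: Tool[], call: ToolCall): Promise<unknown> {
  const tool = tools.find((t) => t.name === call.name);
  if (!tool) throw new Error(`Unknown tool: ${call.name}`);
  return tool.execute(call.arguments);
}
```

The tool's return value is then serialized back into the conversation as a tool-result message, which is how the model sees what its call produced.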
### Cost Tracking

```typescript
// pi-ai tracks token usage and cost per request
const response = await client.complete({
  messages: [{ role: "user", content: "Hello" }],
});

console.log(response.usage.inputTokens);  // Tokens consumed
console.log(response.usage.outputTokens); // Tokens generated
console.log(response.usage.cost);         // Estimated cost in USD
```
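Conceptually, the cost figure is just token counts multiplied by per-model rates. A sketch of that arithmetic; the rate table here uses made-up placeholder numbers, not any provider's real pricing:

```typescript
// Hypothetical per-million-token rates in USD; real pricing varies by model
const RATES: Record<string, { input: number; output: number }> = {
  "example-model": { input: 3.0, output: 15.0 },
};

// Estimate request cost from token usage, as a cost tracker might
function estimateCost(model: string, inputTokens: number, outputTokens: number): number {
  const rate = RATES[model];
  if (!rate) throw new Error(`No pricing for model: ${model}`);
  return (inputTokens / 1_000_000) * rate.input + (outputTokens / 1_000_000) * rate.output;
}
```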
## Agent Core (pi-agent-core)

The agent loop manages tool calling, message history, and context overflow.
### Agent Loop

```typescript
import { createAgent } from "pi-agent-core";
// Built-in tools; exact import path assumed (the package list above places them in pi-coding-agent)
import { readTool, writeTool, editTool, bashTool } from "pi-coding-agent";

const agent = createAgent({
  client, // the pi-ai client created earlier
  tools: [readTool, writeTool, editTool, bashTool],
  systemPrompt: "You are a helpful coding assistant.",
});

// Run the agent loop — handles multi-turn tool calling automatically
const result = await agent.run("Refactor the utils module for better testability");
```
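The core of any such loop: call the model, execute whatever tool calls it requests, feed the results back, and repeat until the model answers without tools. A self-contained mock of that control flow; the message and model shapes below are simplified stand-ins, not pi-agent-core's real internals.

```typescript
interface Msg { role: "user" | "assistant" | "tool"; content: string; toolCall?: string }
type MockModel = (history: Msg[]) => Msg;

// Simplified agent loop: run tools until the model produces a plain answer
async function runLoop(
  model: MockModel,
  tools: Record<string, () => Promise<string>>,
  prompt: string,
): Promise<string> {
  const history: Msg[] = [{ role: "user", content: prompt }];
  for (let turn = 0; turn < 10; turn++) {         // hard cap on turns
    const reply = model(history);
    history.push(reply);
    if (!reply.toolCall) return reply.content;    // final answer, no tool requested
    const tool = tools[reply.toolCall];
    if (!tool) throw new Error(`Unknown tool: ${reply.toolCall}`);
    history.push({ role: "tool", content: await tool() });
  }
  throw new Error("Turn limit exceeded");
}
```

The turn cap is the loop's safety valve: a model that keeps requesting tools forever terminates with an error instead of spinning.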
### Subagents

```typescript
// Spawn a subagent for a focused subtask
const subagent = agent.createSubagent({
  systemPrompt: "You are a test-writing specialist.",
  tools: [readTool, writeTool, bashTool],
});

const testResult = await subagent.run("Write unit tests for src/utils/parser.ts");
```
## Session Management

Pi Agent persists sessions so you can resume work across terminal sessions.

| Command | Description |
|---|---|
| `pi --resume` | Resume the most recent session |
| `pi --resume <session-id>` | Resume a specific session |
| `pi sessions list` | List all saved sessions |
| `pi sessions delete <id>` | Delete a specific session |
| `pi sessions clear` | Clear all saved sessions |
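The commands above map onto a simple store of message histories keyed by session id. An in-memory sketch of that idea (illustrative only; the real CLI persists sessions to disk, and its on-disk format is not documented here):

```typescript
// In-memory sketch of a session store keyed by id (illustrative only)
interface Session { id: string; messages: { role: string; content: string }[] }

class SessionStore {
  private sessions = new Map<string, Session>();

  save(session: Session): void {
    this.sessions.set(session.id, session);
  }
  // No id: resume the most recently saved session (Map preserves insertion order)
  resume(id?: string): Session | undefined {
    if (id) return this.sessions.get(id);
    return [...this.sessions.values()].pop();
  }
  list(): string[] {
    return [...this.sessions.keys()];
  }
  delete(id: string): void {
    this.sessions.delete(id);
  }
  clear(): void {
    this.sessions.clear();
  }
}
```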
### Session Compaction

When context grows too large, Pi Agent compacts the conversation by summarizing older messages while preserving recent context.

```bash
# Manual compaction trigger
pi compact

# Configure compaction threshold (tokens)
pi config set compaction.threshold 100000

# Compaction strategy: summarize older messages, keep recent ones intact
pi config set compaction.strategy summarize
```
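The summarize strategy can be sketched as: once estimated tokens exceed the threshold, fold all but the most recent messages into a single summary message. A toy version of that shape; the 4-characters-per-token estimate and the injected `summarize` callback are assumptions, not Pi Agent's actual heuristics.

```typescript
interface Message { role: string; content: string }

// Very rough token estimate (~4 characters per token; an assumption)
const estimateTokens = (msgs: Message[]) =>
  Math.ceil(msgs.reduce((n, m) => n + m.content.length, 0) / 4);

// Keep the last `keep` messages intact; fold older ones into one summary
function compact(
  msgs: Message[],
  threshold: number,
  keep: number,
  summarize: (older: Message[]) => string,
): Message[] {
  if (estimateTokens(msgs) <= threshold || msgs.length <= keep) return msgs;
  const older = msgs.slice(0, msgs.length - keep);
  const recent = msgs.slice(msgs.length - keep);
  return [{ role: "system", content: summarize(older) }, ...recent];
}
```

In a real agent, `summarize` would itself be an LLM call; the structural point is that recent messages survive verbatim while older ones collapse to a summary.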
## Skills and Extensions

Skills are reusable prompt templates and tool bundles packaged as pi packages.
### Using Skills

```bash
# List available skills
pi skills list

# Install a skill package
pi skills install @pi-skills/code-review

# Use a skill in a session
pi --skill code-review "Review the changes in the last commit"
```
### Creating Custom Skills

```typescript
// skills/my-skill/index.ts
import { defineSkill } from "pi-agent-core";
// Built-in tools; exact import path assumed
import { bashTool, readTool } from "pi-coding-agent";

export default defineSkill({
  name: "deploy-checker",
  description: "Verify deployment readiness",
  systemPrompt: `You are a deployment readiness checker.
Verify tests pass, linting is clean, and no TODOs remain.`,
  tools: [bashTool, readTool],
});
```
## Terminal UI (pi-tui)

The `pi-tui` package provides the interactive terminal interface.

| Keybinding | Action |
|---|---|
| `Enter` | Send message |
| `Shift+Enter` | New line in input |
| `Ctrl+C` | Cancel current generation |
| `Ctrl+D` | Exit the session |
| `Ctrl+L` | Clear the screen |
| `/` | Open command palette |
### TUI Commands

| Command | Description |
|---|---|
| `/help` | Show available commands |
| `/model <name>` | Switch LLM model mid-session |
| `/system <prompt>` | Update the system prompt |
| `/clear` | Clear conversation history |
| `/cost` | Show token usage and cost for the current session |
| `/compact` | Trigger context compaction |
| `/tools` | List available tools |
## Configuration

### Global Configuration

```bash
# Set default provider and model
pi config set provider anthropic
pi config set model claude-sonnet-4-20250514

# Set API keys
pi config set anthropic.apiKey sk-ant-...
pi config set openai.apiKey sk-...

# View current configuration
pi config list
```
### Project-Level Configuration (.pi/config.json)

```json
{
  "model": "claude-sonnet-4-20250514",
  "provider": "anthropic",
  "systemPrompt": "You are a senior TypeScript developer.",
  "tools": {
    "bash": {
      "allowedCommands": ["npm", "node", "git", "tsc"],
      "timeout": 30000
    }
  },
  "session": {
    "compaction": {
      "threshold": 100000,
      "strategy": "summarize"
    }
  }
}
```
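The `allowedCommands` list implies a gate before the bash tool runs anything: check the command's first token against the allowlist. A minimal sketch of that check; the enforcement details are assumed, not taken from Pi Agent's source.

```typescript
// Allow a shell command only if its first token is on the allowlist
function isCommandAllowed(command: string, allowedCommands: string[]): boolean {
  const first = command.trim().split(/\s+/)[0];
  return first !== undefined && first !== "" && allowedCommands.includes(first);
}
```

A production-grade gate would also need to reject compound commands (`;`, `&&`, `|`, `$(...)`) that could smuggle in executables beyond the first token.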
## Slack Bot Integration

The `pi-slack` package provides a Slack bot that exposes agent capabilities to your team.

```bash
# Set up the Slack bot
export SLACK_BOT_TOKEN=xoxb-...
export SLACK_APP_TOKEN=xapp-...

# Start the Slack bot
pi slack start

# Configure allowed channels
pi slack config set channels "#dev,#engineering"
```
| Feature | Description |
|---|---|
| Threaded conversations | Each Slack thread is a separate agent session |
| File sharing | Upload and analyze files via Slack |
| Code execution | Run code snippets shared in Slack |
| Multi-model | Switch models per channel or thread |
## vLLM Pod Support

Run self-hosted models with vLLM integration for on-premise or GPU-constrained environments.

```bash
# Configure the vLLM endpoint
pi config set vllm.endpoint http://localhost:8000
pi config set vllm.model codellama/CodeLlama-34b-Instruct-hf

# Use vLLM as the provider
pi --provider vllm "Explain this function"

# Start a vLLM pod (requires a GPU)
pi vllm start --model codellama/CodeLlama-34b-Instruct-hf --gpu-memory-utilization 0.9

# Check pod status
pi vllm status

# Stop the pod
pi vllm stop
```
## Advanced Features

### LSP Integration

```bash
# Enable Language Server Protocol integration for richer code understanding
pi config set lsp.enabled true

# Supported languages: TypeScript, Python, Rust, Go, Java
pi config set lsp.languages "typescript,python"
```
### Browser Tools

```bash
# Enable browser automation tools
pi --tools browser "Navigate to localhost:3000 and check for console errors"
```
### Cost Budgets

```bash
# Set a cost budget for the session
pi --budget 5.00 "Refactor the entire utils directory"

# Set a global daily budget
pi config set budget.daily 20.00
```
## Common Workflows

| Workflow | Command |
|---|---|
| Fix a bug | `pi "Fix the TypeError in src/api/handler.ts"` |
| Add a feature | `pi "Add pagination to the /users endpoint"` |
| Write tests | `pi "Write comprehensive tests for the auth module"` |
| Code review | `pi "Review the changes in the last 3 commits"` |
| Refactor | `pi "Refactor database queries to use the repository pattern"` |
| Documentation | `pi "Generate JSDoc comments for all exported functions"` |
| Debugging | `pi "The app crashes on startup — diagnose and fix"` |
## Environment Variables

| Variable | Description |
|---|---|
| `ANTHROPIC_API_KEY` | Anthropic API key for Claude models |
| `OPENAI_API_KEY` | OpenAI API key |
| `GOOGLE_API_KEY` | Google AI API key |
| `GROQ_API_KEY` | Groq API key |
| `MISTRAL_API_KEY` | Mistral API key |
| `XAI_API_KEY` | xAI API key for Grok |
| `OPENROUTER_API_KEY` | OpenRouter API key |
| `OLLAMA_HOST` | Ollama server URL (default: http://localhost:11434) |
| `PI_HOME` | Custom config directory (default: `~/.pi`) |
| `PI_MODEL` | Override the default model |
| `PI_PROVIDER` | Override the default provider |
## Additional Resources

- Pi Agent Documentation
- GitHub Repository
- OpenClaw — the agent stack powered by Pi Agent
- npm Packages