
Goose

Goose is an open-source agentic AI platform that automates coding, DevOps, and system workflows through conversational interaction. Created by Block and now managed by the Agentic AI Foundation at the Linux Foundation, Goose integrates Model Context Protocol (MCP) extensions to extend AI capabilities across developer tools, cloud services, and local systems.

brew install block/tap/goose
goose version  # Verify installation
cargo install goose-cli
goose version

Download platform-specific installers from the official Goose repository:

  • macOS: .dmg installer
  • Windows: .exe installer
  • Linux: .AppImage or .deb package
# Using npm (if available)
npm install -g goose

# Using pipx (Python)
pipx install goose-cli

# Using cargo (recommended)
cargo install goose-cli
goose  # Start interactive Goose session

Once in a session, prompt naturally:

> Create a Python script that checks if a port is open
> Modify the docker-compose.yml file to add a PostgreSQL service
> Refactor this JavaScript function for performance
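For instance, the first prompt above might yield a script along these lines (a hand-written sketch for illustration, not actual Goose output):

```python
#!/usr/bin/env python3
"""Check whether a TCP port is open on a host (illustrative sketch)."""
import socket
import sys

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        # connect_ex returns 0 on success instead of raising
        return sock.connect_ex((host, port)) == 0

if __name__ == "__main__" and len(sys.argv) >= 3:
    print("open" if is_port_open(sys.argv[1], int(sys.argv[2])) else "closed")
```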
# Start a new session
goose

# View session history
goose session list

# Resume previous session
goose session resume

# Exit session
exit
  1. Start Goose: goose
  2. Describe your task in natural language
  3. Review suggested actions before approval (depending on mode)
  4. Let Goose execute multi-step workflows autonomously
| Command | Description |
| --- | --- |
| goose | Start interactive session in default mode |
| goose configure | Configure API keys and LLM providers |
| goose session list | List all saved sessions |
| goose session resume <id> | Resume a previous session |
| goose session delete <id> | Delete a session |
| goose update | Update Goose to latest version |
| goose version | Display installed version |
| goose help | Show CLI help |
| goose run --recipe <file> | Execute automation recipe |
# List all sessions with metadata
goose session list

# Resume specific session (shows recent interaction history)
goose session resume 1234abcd

# Start fresh session
goose

# Run headless recipe (no interactive prompts)
goose run --recipe automation.yaml

Slash commands control Goose behavior within a session:

| Command | Syntax | Purpose |
| --- | --- | --- |
| /mode | /mode auto, /mode approve, or /mode chat | Switch execution mode |
| /extension | /extension list, /extension add <name> | Manage extensions |
| /builtin | /builtin list | List built-in extensions |
| /plan | /plan | Show current action plan |
| /clear | /clear | Clear conversation history |
| /retry | /retry | Retry last action |
| /restart | /restart | Restart current task |
| /help | /help | Show command list |
> /mode auto
> /mode approve
> /extension list
> /builtin list
> /plan
> /clear
> /retry
> /restart
goose configure

This interactive command prompts for:

  • Default LLM provider (OpenAI, Anthropic, Google, Ollama, OpenRouter)
  • API keys for selected provider
  • Default model selection
  • Extension preferences
# OpenAI
export OPENAI_API_KEY="sk-..."

# Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Google
export GOOGLE_API_KEY="AIza..."

# Custom OpenRouter
export OPENROUTER_API_KEY="sk-or-..."

# Ollama (local)
export OLLAMA_BASE_URL="http://localhost:11434"
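As a sanity check, a small script can report which provider credentials are present in the environment (a hypothetical helper; the variable names match the list above):

```python
import os

# Environment variables the providers listed above read
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "google": "GOOGLE_API_KEY",
    "openrouter": "OPENROUTER_API_KEY",
    "ollama": "OLLAMA_BASE_URL",
}

def configured_providers(env=None):
    """Return providers whose credential variable is set and non-empty."""
    env = os.environ if env is None else env
    return [p for p, var in PROVIDER_ENV_VARS.items() if env.get(var)]
```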
goose configure

# Select provider: anthropic
# Enter API key when prompted
# Select default model: claude-3-5-sonnet
# macOS / Linux
~/.config/goose/config.yaml

# Windows
%APPDATA%\goose\config.yaml
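In scripts, the platform-specific locations above can be resolved like this (a sketch mirroring the two paths shown):

```python
import os
import sys
from pathlib import Path

def goose_config_path() -> Path:
    """Return the expected config.yaml location for the current platform."""
    if sys.platform == "win32":
        # %APPDATA%\goose\config.yaml
        return Path(os.environ["APPDATA"]) / "goose" / "config.yaml"
    # ~/.config/goose/config.yaml on macOS and Linux
    return Path.home() / ".config" / "goose" / "config.yaml"
```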
provider: anthropic
model: claude-3-5-sonnet
api_key: sk-ant-...
extensions:
  - developer
  - computeruse
  - memory
temperature: 0.7
max_tokens: 4096

Goose uses Model Context Protocol (MCP) to extend AI capabilities. Extensions provide tools for file operations, shell execution, cloud services, and specialized workflows.

| Extension | Purpose | Tools Provided |
| --- | --- | --- |
| Developer | File editing, shell commands | edit files, run commands, create directories |
| Computeruse | GUI automation, screenshot capture | desktop control, visual task automation |
| Memory | Persistent knowledge storage | save/retrieve context between sessions |
| JetBrains | IntelliJ/PyCharm integration | code navigation, refactoring |
| Google Drive | Document/spreadsheet access | read/write Google Docs, Sheets, Drive files |
# In session: list available built-in extensions
> /builtin list

# View loaded extensions
> /extension list
> /extension add developer
> /extension add computeruse
> /extension add memory
# Configure in config.yaml
extensions:
  - name: my-custom-server
    type: command-line
    command: python
    args:
      - /path/to/mcp_server.py

Or interactively:

> /extension add custom-server
# Built-in extension
- name: developer
  type: builtin

# Command-line MCP server
- name: postgres
  type: command-line
  command: npx
  args:
    - "@modelcontextprotocol/server-postgres"
    - "--connection-string"
    - "postgresql://user:pass@localhost/db"

# Remote MCP server
- name: remote-service
  type: remote
  url: "http://mcp-server:3000"

Profiles let you switch between different extension sets and configurations for various workflows.

# ~/.config/goose/profiles.yaml
profiles:
  web-dev:
    model: claude-3-5-sonnet
    extensions:
      - developer
      - memory
    temperature: 0.5
    
  devops:
    model: claude-3-5-sonnet
    extensions:
      - developer
      - computeruse
    temperature: 0.7
    
  research:
    model: claude-3-sonnet
    extensions:
      - memory
      - google-drive
    temperature: 0.3
# Start with specific profile
goose --profile web-dev

# In session, switch profiles
> /profile switch devops

Recipes enable headless automation and scripted workflows without interactive prompts.

name: "Deploy API Service"
description: "Build, test, and deploy Node.js API"
steps:
  - task: "Review package.json and identify dependencies"
  - task: "Run npm install to install dependencies"
  - task: "Execute npm test to run test suite"
  - task: "Build Docker image with tag api:latest"
  - task: "Push image to registry"
  - task: "Update deployment manifest with new image"
  - task: "Apply Kubernetes changes with kubectl apply"
mode: approve
extensions:
  - developer
  - computeruse
# Execute recipe file
goose run --recipe deploy-api.yaml

# Run with custom variables
goose run --recipe setup.yaml --vars '{"env":"production","region":"us-west-2"}'

# Run multiple recipes in sequence
goose run --recipe prep.yaml && goose run --recipe deploy.yaml
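The same exit-status convention makes goose run easy to drive from other tooling, for example a thin Python wrapper (a sketch; assumes goose is on PATH):

```python
import subprocess

def run_recipe(recipe_path: str, goose_cmd: str = "goose") -> bool:
    """Run a Goose recipe headlessly; return True if it exited cleanly.

    goose_cmd is parameterized only so the wrapper is easy to stub in tests.
    """
    result = subprocess.run([goose_cmd, "run", "--recipe", recipe_path])
    return result.returncode == 0
```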
name: "Setup Python Project"
description: "Initialize Python project with dependencies and testing"
steps:
  - task: "Create src/ and tests/ directories"
  - task: "Create requirements.txt with Flask, pytest, black, flake8"
  - task: "Run pip install -r requirements.txt"
  - task: "Create pytest configuration in pyproject.toml"
  - task: "Add GitHub Actions CI workflow"
  - task: "Initialize git repository"
mode: approve
name: "Build Multi-Stage Docker Image"
description: "Create optimized Node.js Docker image"
steps:
  - task: "Create Dockerfile with builder and runtime stages"
  - task: "Test build locally with docker build"
  - task: "Tag image as myapp:latest and myapp:v1.0"
  - task: "Push images to Docker registry"
mode: auto
# Set API key
export OPENAI_API_KEY="sk-proj-..."

# Configure in goose
goose configure
# Select: openai
# Model: gpt-4o or gpt-4-turbo

# Specify model in recipe
model: gpt-4o
export ANTHROPIC_API_KEY="sk-ant-..."

goose configure
# Select: anthropic
# Model: claude-3-5-sonnet or claude-3-opus

# Recommended model for Goose
model: claude-3-5-sonnet
export GOOGLE_API_KEY="AIza..."

goose configure
# Select: google
# Model: gemini-2.0-flash or gemini-1.5-pro

model: gemini-2.0-flash
# Start Ollama service
ollama serve

# In another terminal, pull model
ollama pull mistral

# Configure Goose
export OLLAMA_BASE_URL="http://localhost:11434"

goose configure
# Select: ollama
# Model: mistral or llama2

model: mistral
export OPENROUTER_API_KEY="sk-or-..."

goose configure
# Select: openrouter
# Model: any OpenRouter model (claude-3-5-sonnet, gpt-4o, mistral, etc.)

model: claude-3-5-sonnet

The Developer extension provides core file and system operations.

# In Goose session:
> Create a file called utils.py with helper functions
> Modify src/index.js line 45 to add error handling
> Delete old test fixtures in tests/old/

Goose automatically:

  • Creates/modifies files
  • Maintains proper indentation
  • Preserves file structure
  • Validates syntax where applicable
# Execute system commands
> Run npm test
> Check git status and show recent commits
> Install dependencies with pip install -r requirements.txt
> Build project with cargo build --release
# Initialize project structure
> Create a Python project with src/, tests/, and docs/ directories

# Generate configuration files
> Create .gitignore for Node.js projects
> Add ESLint and Prettier configuration

# Set up development tools
> Create GitHub Actions workflow for CI/CD
> Initialize Docker with Dockerfile and docker-compose.yml

Create a Python-based MCP server:

# custom_mcp_server.py
# Uses FastMCP from the official MCP Python SDK
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-custom-server")

@mcp.tool()
def get_system_info() -> str:
    """Get system information"""
    import platform
    return f"OS: {platform.system()}, Python: {platform.python_version()}"

@mcp.tool()
def execute_query(query: str) -> str:
    """Execute database query"""
    # Custom logic here
    return f"Results for: {query}"

if __name__ == "__main__":
    mcp.run()

Register in config.yaml:

extensions:
  - name: custom-server
    type: command-line
    command: python
    args:
      - /path/to/custom_mcp_server.py
# Build Docker container with Goose
> Create Dockerfile for Python service with health checks
> Build with docker build -t myservice:latest .
> Test container with docker run --rm myservice:latest

# Docker Compose automation
> Generate docker-compose.yml with PostgreSQL, Redis, and Node.js services
> Set environment variables and networks
> Validate with docker-compose config
#!/bin/bash
# ci-deploy.sh

# Export credentials. Note: GitHub Actions ${{ }} syntax only works inside
# workflow YAML; in a shell script, read the secret from an environment
# variable that the CI step passes in.
export ANTHROPIC_API_KEY="${ANTHROPIC_API_KEY:?API key not provided by CI}"

# Run deployment recipe and check its exit status
if goose run --recipe deploy.yaml; then
  echo "Deployment successful"
else
  echo "Deployment failed"
  exit 1
fi
# Use Goose as a library (illustrative sketch; check the Goose docs for the
# current Python API)
from goose import Goose

goose = Goose(
    provider="anthropic",
    model="claude-3-5-sonnet"
)

result = goose.execute("Create a Python CLI tool for file processing")
print(result.output)
# Verify environment variable
echo $ANTHROPIC_API_KEY

# Or set directly in config
goose configure
# Enter API key when prompted

# Check config file permissions
ls -la ~/.config/goose/config.yaml
# Verify MCP server is running
ps aux | grep mcp

# Test MCP connection
mcp-client --connect localhost:3000

# Check firewall/port access
netstat -an | grep 3000
# List all sessions
goose session list

# Check session ID format
goose session resume <full-session-id>

# If corrupted, start fresh
goose
# Clear conversation history
> /clear

# Or start new session
exit
goose

# Reduce max_tokens in config.yaml
max_tokens: 2048
# Verify extension in config
cat ~/.config/goose/config.yaml

# List built-in extensions (in session)
> /builtin list

# Restart Goose
exit
goose
✓ "Create a Python script that validates CSV files"
✗ "Do something useful"
  • Web Development: Lower temperature (0.3-0.5)
  • DevOps: Standard temperature (0.7)
  • Brainstorming: Higher temperature (0.8-0.9)
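Encoded as a lookup, the guidance above might look like this (a hypothetical helper; values taken from the ranges listed):

```python
# Suggested sampling temperatures by workflow type (from the guidance above)
TEMPERATURE_BY_WORKFLOW = {
    "web-dev": 0.4,        # lower (0.3-0.5) for precise code edits
    "devops": 0.7,         # standard
    "brainstorming": 0.85, # higher (0.8-0.9) for open-ended ideas
}

def pick_temperature(workflow: str, default: float = 0.7) -> float:
    """Return a suggested sampling temperature for a workflow type."""
    return TEMPERATURE_BY_WORKFLOW.get(workflow, default)
```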
# Save common workflows as recipes
goose run --recipe weekly-backup.yaml
goose run --recipe security-audit.yaml
# Use /mode approve to review actions
> /mode approve

# Verify changes before proceeding
# Then switch to /mode auto for efficiency
> /mode auto
extensions:
  - developer      # Always needed
  - memory         # Preserve context
  - computeruse    # For GUI tasks
  # Add only necessary extensions to reduce overhead
# Use lower token limits for quick tasks
max_tokens: 1024

# Use higher for complex work
max_tokens: 4096
# Resume interrupted work
goose session resume previous-id

# Maintain context across multiple interactions
| Tool | Purpose | Integration |
| --- | --- | --- |
| Claude (Anthropic) | Primary LLM provider | Via ANTHROPIC_API_KEY |
| OpenAI GPT-4 | Alternative LLM | Via OPENAI_API_KEY |
| MCP (Model Context Protocol) | Extension framework | Native support |
| Ollama | Local LLM execution | Via OLLAMA_BASE_URL |
| Docker | Container automation | Via Developer extension |
| GitHub Actions | CI/CD workflows | Can be configured by Goose |
| Kubernetes | Orchestration automation | Via kubectl commands |
| Terraform | Infrastructure as Code | Via Developer extension |
| VS Code | Code editor integration | Via JetBrains extension |
| Google Drive | Document storage | Via Google Drive extension |