Hoppscotch

Open-source API development ecosystem and lightweight alternative to Postman for testing REST, GraphQL, WebSocket, and real-time APIs.

Command                                 Description
Navigate to hoppscotch.io               Open Hoppscotch web app (no install needed)
Download from hoppscotch.io/download    Install desktop app (macOS, Windows, Linux)
Sign in with GitHub, Google, or email   Create account for team sync
Browser PWA install option              Install as Progressive Web App
Command                          Description
npm install -g @hoppscotch/cli   Install CLI globally with npm
npx @hoppscotch/cli              Run CLI without installing
hopp --version                   Show CLI version
hopp --help                      Show CLI help
Command                             Description
docker pull hoppscotch/hoppscotch   Pull Docker image
docker compose up -d                Start self-hosted instance
# docker-compose.yml for self-hosted Hoppscotch
version: "3"
services:
  hoppscotch:
    image: hoppscotch/hoppscotch:latest
    ports:
      - "3000:3000"   # Web UI
      - "3100:3100"   # Admin dashboard
      - "3170:3170"   # Backend API
    environment:
      DATABASE_URL: postgresql://user:pass@db:5432/hoppscotch
      JWT_SECRET: your-jwt-secret-here
      TOKEN_SALT_COMPLEXITY: 10
      MAGIC_LINK_TOKEN_VALIDITY: 3
      REFRESH_TOKEN_VALIDITY: 604800000
      ACCESS_TOKEN_VALIDITY: 86400000
      VITE_ALLOWED_AUTH_PROVIDERS: GITHUB,GOOGLE,EMAIL
    depends_on:
      - db

  db:
    image: postgres:15
    volumes:
      - postgres_data:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: pass
      POSTGRES_DB: hoppscotch

volumes:
  postgres_data:
Command                                         Description
Select method (GET, POST, PUT, DELETE, PATCH)   Choose HTTP method from dropdown
Enter URL in address bar                        Set request endpoint
Click “Send”                                    Execute the request
Ctrl + Enter                                    Send request (keyboard shortcut)
Ctrl + G                                        Send and download response
Ctrl + K                                        Open command palette
Command                                      Description
Click “Body” tab → select type               Set request body format
Select “application/json”                    JSON body (most common)
Select “multipart/form-data”                 File uploads
Select “application/x-www-form-urlencoded”   Form data
Select “text/plain”                          Raw text body
Select “application/xml”                     XML body
Select “Binary”                              Raw binary data
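For file uploads, each multipart/form-data row pairs a field name with either a text value or an attached file. A sketch (field names are illustrative):

```text
Body type: multipart/form-data
  Key: username   Value: alice       (text field)
  Key: avatar     Value: photo.png   (file field: attach a file on the row)
```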
Command                            Description
Click “Headers” tab → Add header   Add custom request headers
Click “Parameters” tab             Add URL query parameters
Bulk edit mode for headers         Paste multiple headers at once
Toggle header active/inactive      Disable without deleting
Use <<variable>> in values         Reference environment variables
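Bulk edit mode takes one key: value pair per line; the header names below are illustrative, and per the editor's own hint a leading # keeps a row defined but disabled:

```text
Content-Type: application/json
Accept: application/json
Authorization: Bearer <<auth_token>>
# X-Debug: true      (leading # keeps the row disabled)
```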
Command                           Description
View response body                JSON, HTML, XML, binary, image
Click “Headers” tab on response   Inspect response headers
Status code and time shown        HTTP status, latency, size
Click “Copy” on response          Copy response to clipboard
Click “Download”                  Download response as file
JSON response auto-formatted      Collapsible tree view
Click “Timeline”                  View request/response timeline
Method:  POST
URL:     https://api.example.com/users
Headers:
  Content-Type: application/json
  Authorization: Bearer <<auth_token>>

Body (JSON):
{
  "name": "Alice Smith",
  "email": "alice@example.com",
  "role": "admin"
}

Response: 201 Created (145ms, 234 B)
{
  "id": 42,
  "name": "Alice Smith",
  "email": "alice@example.com",
  "role": "admin",
  "created_at": "2024-01-15T10:30:00Z"
}
Command                                  Description
Click “+” in Collections panel           Create new collection
Right-click collection → “New Request”   Add request to collection
Right-click collection → “New Folder”    Organize with sub-folders
Drag requests between folders            Reorder collection items
Right-click → “Duplicate”                Clone a collection
Right-click → “Properties”               Edit collection settings
Right-click → “Delete”                   Remove collection
Ctrl + Shift + P                         Search within collections
Command                             Description
Click “Import” → select file        Import collection
Right-click → “Export”              Export collection as JSON
Import from Postman Collection v2   Postman compatibility
Import from OpenAPI/Swagger spec    Auto-generate requests
Import from Insomnia                Insomnia compatibility
Import from cURL                    Paste cURL command
Import from HAR                     Import HTTP Archive
Supported import formats:
  - Hoppscotch Collection (native JSON)
  - Postman Collection v2
  - OpenAPI 3.0 / Swagger 2.0
  - Insomnia v4
  - cURL commands
  - HAR (HTTP Archive)

Import from cURL:
  Paste a cURL command directly into the URL bar
  Hoppscotch auto-parses method, headers, body, and URL
Command                             Description
Click “Environments” in sidebar     Open environment manager
Click “+” to create environment     Create new environment
Add key: value pairs                Define variables
Use <<variable_name>> in requests   Reference variable in any field
Select environment from dropdown    Activate an environment
Click “Global” tab                  Set global variables (always active)
Right-click → “Export”              Export environment as JSON
Right-click → “Duplicate”           Clone environment
Type                   Description
Regular variable       Visible in UI, exported in collections
Secret variable        Masked in UI, not exported
Global variable        Available in all environments
Environment variable   Available only when env is active
Environment: "Production"
┌───────────────┬──────────────────────────────┬──────────┐
│ Key           │ Value                        │ Type     │
├───────────────┼──────────────────────────────┼──────────┤
│ base_url      │ https://api.example.com      │ Regular  │
│ api_version   │ v2                           │ Regular  │
│ auth_token    │ eyJhbGciOiJIUzI1NiIsIn...    │ Secret   │
│ timeout       │ 30000                        │ Regular  │
└───────────────┴──────────────────────────────┴──────────┘

Usage in requests:
  URL:    <<base_url>>/<<api_version>>/users
  Header: Authorization: Bearer <<auth_token>>
Command                              Description
Select “Bearer Token”                Add Bearer token authentication
Select “Basic Auth”                  Username/password authentication
Select “OAuth 2.0”                   Configure OAuth 2.0 flow
Select “API Key”                     Add API key to header or query param
Select “AWS Signature”               AWS Signature V4 auth
Select “Inherit from parent”         Inherit auth from collection
Select “None”                        Remove authentication
Token fields support <<variables>>   Use env vars in auth tokens
OAuth 2.0 Configuration:
  Grant Type:       Authorization Code
  Auth URL:         https://auth.example.com/authorize
  Token URL:        https://auth.example.com/token
  Client ID:        <<client_id>>
  Client Secret:    <<client_secret>>
  Scope:            read write
  Redirect URI:     Auto-configured

Supported grant types:
  - Authorization Code (+ PKCE)
  - Client Credentials
  - Implicit
  - Password Credentials
Command                          Description
Click “Realtime” → “WebSocket”   Open WebSocket client
Enter wss:// URL                 Set WebSocket endpoint
Click “Connect”                  Establish connection
Type message → click “Send”      Send message to server
View messages in log panel       Monitor incoming/outgoing
Click “Disconnect”               Close connection
Add protocols/headers            Configure connection params
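A typical session against an echo server (the endpoint below is a public example; any wss:// URL works the same way):

```text
URL:        wss://echo.websocket.org
Connect     → log: Connected
Send:       {"type": "ping"}
Receive:    {"type": "ping"}     (an echo server returns the payload unchanged)
Disconnect  → log: Disconnected
```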
Command                      Description
Click “Realtime” → “SSE”     Test Server-Sent Events
Enter SSE endpoint URL       Set event stream URL
Click “Connect”              Start receiving events
Events logged in real-time   Monitor event stream
Click “Disconnect”           Stop receiving
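Incoming events arrive in the standard SSE wire format; the endpoint and event names below are hypothetical:

```text
URL: https://api.example.com/stream

event: price_update
data: {"symbol": "ACME", "price": 42.5}

event: price_update
data: {"symbol": "ACME", "price": 42.7}
```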
Command                             Description
Click “Realtime” → “MQTT”           Test MQTT connections
Enter mqtt:// or wss:// URL         Set MQTT broker URL
Set Client ID, username, password   Configure credentials
Subscribe to topics                 Listen for messages
Publish to topics                   Send messages
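A minimal round trip: subscribe to a topic, then publish to the same topic and watch the message come back. The broker below is a public test instance; the topic and payload are illustrative:

```text
Broker URL:  wss://test.mosquitto.org:8081
Client ID:   hoppscotch-demo
Subscribe:   sensors/temperature
Publish:     topic:   sensors/temperature
             payload: {"celsius": 21.4}
→ The published message appears under the subscribed topic
```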
Command                          Description
Click “Realtime” → “Socket.IO”   Test Socket.IO connections
Enter server URL                 Set Socket.IO endpoint
Listen on events                 Subscribe to named events
Emit events with data            Send named events
View connection status           Monitor connect/disconnect
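A sketch of an event round trip (server URL and event names are hypothetical):

```text
URL:        https://chat.example.com
Listen on:  chat_message
Emit:       event:   chat_message
            payload: {"user": "alice", "text": "hello"}
Log:        Connected → emitted chat_message → received chat_message
```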
Command                            Description
Click “GraphQL” tab                Switch to GraphQL mode
Enter GraphQL endpoint URL         Set GraphQL server URL
Write query in editor panel        Compose GraphQL query
Click “Schema” to fetch schema     Load server schema for autocomplete
Add variables in Variables panel   Set query variables as JSON
Add headers for authentication     Set auth headers
Click “Run” or Ctrl + Enter        Execute query
Use autocomplete from schema       Get field suggestions
Endpoint: https://api.example.com/graphql

Query:
  query GetUsers($limit: Int!) {
    users(limit: $limit) {
      id
      name
      email
      posts {
        title
        createdAt
      }
    }
  }

Variables:
  {
    "limit": 10
  }

Headers:
  Authorization: Bearer <<auth_token>>
Command                                         Description
hopp test collection.json                       Run collection tests
hopp test collection.json -e environment.json   Run with environment
hopp test collection.json --delay 500           Delay between requests (ms)
hopp test collection.json --reporter junit      JUnit test report output
hopp test collection.json --reporter json       JSON test report output
hopp test collection.json --bail                Stop on first failure
hopp test collection.json --env-var KEY=VALUE   Override env variable
# Run collection with environment
hopp test api-tests.json -e production.json

# Run with delay and stop on failure
hopp test api-tests.json --delay 1000 --bail

# Generate JUnit report for CI/CD
hopp test api-tests.json --reporter junit > results.xml

# Override specific variables
hopp test api-tests.json \
  --env-var "base_url=https://staging.example.com" \
  --env-var "auth_token=test-token-123"

# CI/CD pipeline example
hopp test api-tests.json -e staging.json --bail --reporter junit
Command                                            Description
Click “Pre-request” tab                            Add script that runs before request
Click “Tests” tab                                  Add script that runs after response
pw.env.set("key", "value")                         Set environment variable
pw.env.get("key")                                  Get environment variable
pw.expect(pw.response.status).toBe(200)            Assert status code
pw.expect(pw.response.body).toHaveProperty("id")   Assert body property
// Pre-request: Generate timestamp
pw.env.set("timestamp", Date.now().toString());

// Pre-request: Generate random ID
pw.env.set("random_id", Math.random().toString(36).slice(2));

// Test: Check status code
pw.test("Status is 200", () => {
  pw.expect(pw.response.status).toBe(200);
});

// Test: Check response body
pw.test("Response has user data", () => {
  const body = pw.response.body;
  pw.expect(body).toHaveProperty("id");
  pw.expect(body).toHaveProperty("name");
  pw.expect(body.name).toBe("Alice");
});

// Test: Check the status is not a server error
pw.test("No server error", () => {
  pw.expect(pw.response.status).toBeLessThan(500);
});

// Test: Save token for next request
pw.test("Extract auth token", () => {
  const token = pw.response.body.access_token;
  pw.env.set("auth_token", token);
});
Shortcut           Description
Ctrl + Enter       Send request
Ctrl + K           Command palette
Ctrl + S           Save current request
Ctrl + Shift + P   Search collections
Ctrl + /           Toggle sidebar
Alt + ↑            Previous request in history
Alt + ↓            Next request in history
F11                Toggle full screen
Command                    Description
Create team from sidebar   Set up team workspace
Invite members by email    Add collaborators
Shared collections         Team-wide request libraries
Shared environments        Team-wide variable sets
Real-time collaboration    See team edits live
Role-based access          Owner, editor, viewer roles
  1. Organize collections by API domain — Group related endpoints into collections and use folders for resource types (users, orders, products) to keep things findable.

  2. Use environments for staging vs production — Create separate environments for dev, staging, and production with the same variable names but different values. Switch with one click.

  3. Store secrets as secret variables — Mark API keys and tokens as “secret” type so they’re masked in the UI and excluded from exports.

  4. Use pre-request scripts for auth — Generate timestamps, calculate signatures, or refresh tokens automatically before each request instead of copying them manually.

  5. Write tests for critical endpoints — Add test scripts to verify status codes, response structure, and key values. Run them via CLI in CI/CD pipelines.

  6. Import from OpenAPI specs — When working with documented APIs, import the OpenAPI/Swagger spec to auto-generate all endpoints with correct parameters.

  7. Use collection-level auth — Set authentication at the collection level and use “Inherit from parent” in individual requests to avoid duplicating auth config.

  8. Export collections to version control — Export collections as JSON and commit them to your repository for versioning and team sharing.

  9. Use the CLI in CI/CD — Add hopp test to your pipeline to run API tests on every deployment, ensuring endpoints work before going live.

  10. Self-host for sensitive APIs — If testing internal or sensitive APIs, deploy the self-hosted Docker version to keep all data within your infrastructure.
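Tips 8 and 9 combine naturally in CI. A minimal GitHub Actions sketch, assuming an exported collection (api-tests.json) and environment (staging.json) are committed to the repository; file names are placeholders:

```yaml
# .github/workflows/api-tests.yml
name: API tests
on: [push]
jobs:
  api-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      # Install the Hoppscotch CLI and run the collection;
      # --bail fails fast, --reporter junit emits a CI-readable report
      - run: npm install -g @hoppscotch/cli
      - run: hopp test api-tests.json -e staging.json --bail --reporter junit > results.xml
```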