Port Cheat Sheet

Overview

Port is an internal developer platform (IDP) that builds a comprehensive developer portal for managing and visualizing your entire software catalog. It provides a unified interface for services, infrastructure, deployments, and developer workflows, enabling self-service capabilities and adherence to governance and compliance standards.

Note: Free tier available for small teams. Paid plans start at $20/user/month for advanced features.

Getting Started

Account Setup

```bash

Sign up process:

1. Visit getport.io

2. Create account with email or SSO

3. Set up your organization

4. Configure initial data model

5. Connect your first data sources

Initial configuration:

- Organization settings

- User roles and permissions

- Data model design

- Integration setup

- Team onboarding

```

Data Model Design

```json
{
  "identifier": "service",
  "title": "Service",
  "icon": "Service",
  "schema": {
    "properties": {
      "name": { "title": "Name", "type": "string" },
      "description": { "title": "Description", "type": "string" },
      "owner": { "title": "Owner", "type": "string", "format": "team" },
      "language": {
        "title": "Language",
        "type": "string",
        "enum": ["Python", "JavaScript", "Java", "Go", "C#"]
      },
      "lifecycle": {
        "title": "Lifecycle",
        "type": "string",
        "enum": ["Production", "Staging", "Development", "Deprecated"]
      },
      "tier": {
        "title": "Tier",
        "type": "string",
        "enum": ["Mission Critical", "Customer Facing", "Internal", "Experimental"]
      }
    },
    "required": ["name", "owner", "lifecycle"]
  }
}
```
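Since blueprint schemas follow JSON Schema conventions, the `required` and `enum` constraints can be checked locally before sending an entity to the API. A minimal sketch (hypothetical helper, not part of any Port SDK):

```python
# Hypothetical validator: checks an entity's properties against the
# "required" and "enum" constraints of a blueprint schema like the one above.
def validate_entity(properties, schema):
    errors = []
    # Every required property must be present
    for field in schema.get("required", []):
        if field not in properties:
            errors.append(f"missing required property: {field}")
    # Enum-typed properties must use one of the allowed values
    for field, value in properties.items():
        spec = schema.get("properties", {}).get(field, {})
        if "enum" in spec and value not in spec["enum"]:
            errors.append(f"invalid value for {field}: {value!r}")
    return errors

schema = {
    "properties": {"lifecycle": {"enum": ["Production", "Staging", "Development", "Deprecated"]}},
    "required": ["name", "owner", "lifecycle"],
}
print(validate_entity({"name": "user-service", "owner": "backend-team", "lifecycle": "Prod"}, schema))
```

Catching schema violations client-side gives clearer error messages than a 422 response from the API.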

Blueprint Creation

```json
{
  "identifier": "microservice",
  "title": "Microservice",
  "icon": "Microservice",
  "schema": {
    "properties": {
      "name": { "title": "Service Name", "type": "string" },
      "repository": { "title": "Repository", "type": "string", "format": "url" },
      "health_status": {
        "title": "Health Status",
        "type": "string",
        "enum": ["Healthy", "Warning", "Critical", "Unknown"]
      },
      "deployment_status": {
        "title": "Deployment Status",
        "type": "string",
        "enum": ["Deployed", "Deploying", "Failed", "Not Deployed"]
      },
      "cpu_usage": { "title": "CPU Usage (%)", "type": "number" },
      "memory_usage": { "title": "Memory Usage (%)", "type": "number" },
      "last_deployment": { "title": "Last Deployment", "type": "string", "format": "date-time" }
    }
  },
  "relations": {
    "team": { "title": "Team", "target": "team", "required": true, "many": false },
    "domain": { "title": "Domain", "target": "domain", "required": false, "many": false }
  }
}
```

Data Ingestion

REST API Integration

```python
import requests
import json

# Port API configuration
PORT_API_URL = "https://api.getport.io/v1"
PORT_CLIENT_ID = "your-client-id"
PORT_CLIENT_SECRET = "your-client-secret"

# Get access token
def get_access_token():
    response = requests.post(
        f"{PORT_API_URL}/auth/access_token",
        json={
            "clientId": PORT_CLIENT_ID,
            "clientSecret": PORT_CLIENT_SECRET
        }
    )
    return response.json()["accessToken"]

# Create or update entity
def upsert_entity(blueprint_id, entity_data):
    token = get_access_token()
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json"
    }

    response = requests.post(
        f"{PORT_API_URL}/blueprints/{blueprint_id}/entities",
        headers=headers,
        json=entity_data
    )
    return response.json()

# Example: Create service entity
service_data = {
    "identifier": "user-service",
    "title": "User Service",
    "properties": {
        "name": "User Service",
        "description": "Handles user authentication and management",
        "owner": "backend-team",
        "language": "Python",
        "lifecycle": "Production",
        "tier": "Customer Facing"
    },
    "relations": {
        "team": "backend-team",
        "domain": "authentication"
    }
}

result = upsert_entity("service", service_data)
print(json.dumps(result, indent=2))
```

Webhook Integration

```python
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

@app.route('/port-webhook', methods=['POST'])
def handle_port_webhook():
    """Handle incoming webhooks from Port"""
    data = request.json

    # Process different event types
    if data.get('eventType') == 'ENTITY_CREATED':
        handle_entity_created(data)
    elif data.get('eventType') == 'ENTITY_UPDATED':
        handle_entity_updated(data)
    elif data.get('eventType') == 'ENTITY_DELETED':
        handle_entity_deleted(data)

    return jsonify({"status": "success"})

def handle_entity_created(data):
    """Handle entity creation events"""
    entity = data.get('entity', {})
    blueprint = data.get('blueprint', {})

    print(f"New {blueprint.get('identifier')} created: {entity.get('identifier')}")

    # Trigger downstream processes
    if blueprint.get('identifier') == 'service':
        setup_monitoring(entity)
        create_alerts(entity)

def setup_monitoring(entity):
    """Set up monitoring for new service"""
    service_name = entity.get('identifier')
    # Configure monitoring tools
    print(f"Setting up monitoring for {service_name}")

def create_alerts(entity):
    """Create alerts for new service"""
    service_name = entity.get('identifier')
    # Configure alerting
    print(f"Creating alerts for {service_name}")

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

GitHub Integration

```yaml
# .github/workflows/port-sync.yml
name: Sync with Port

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  sync-to-port:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Extract service metadata
        id: metadata
        run: |
          # Extract metadata from repository
          SERVICE_NAME=$(basename $GITHUB_REPOSITORY)
          LANGUAGE=$(find . -name "*.py" -o -name "*.js" -o -name "*.java" | head -1 | sed 's/.*\.//')
          echo "service_name=$SERVICE_NAME" >> $GITHUB_OUTPUT
          echo "language=$LANGUAGE" >> $GITHUB_OUTPUT

      - name: Update Port entity
        uses: port-labs/port-github-action@v1
        with:
          clientId: ${{ secrets.PORT_CLIENT_ID }}
          clientSecret: ${{ secrets.PORT_CLIENT_SECRET }}
          operation: UPSERT
          identifier: ${{ steps.metadata.outputs.service_name }}
          blueprint: service
          properties: |
            {
              "name": "${{ steps.metadata.outputs.service_name }}",
              "repository": "${{ github.repository }}",
              "language": "${{ steps.metadata.outputs.language }}",
              "lifecycle": "Production",
              "last_commit": "${{ github.sha }}"
            }
```
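The metadata-extraction step above yields a file extension, which then needs to map to the blueprint's language enum. The same logic can be sketched in Python for clarity (the extension map is illustrative; extend it to match your blueprint's enum values):

```python
# Map a repository file extension to the blueprint's language enum value.
# EXTENSION_MAP is an illustrative assumption, not part of the workflow above.
EXTENSION_MAP = {"py": "Python", "js": "JavaScript", "java": "Java", "go": "Go", "cs": "C#"}

def detect_language(filename):
    # Take everything after the last dot, if any
    ext = filename.rsplit(".", 1)[-1] if "." in filename else ""
    return EXTENSION_MAP.get(ext, "Unknown")

print(detect_language("src/app/main.py"))  # Python
```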

Kubernetes Integration

```yaml
# Port Kubernetes exporter
apiVersion: apps/v1
kind: Deployment
metadata:
  name: port-k8s-exporter
  namespace: port-k8s-exporter
spec:
  replicas: 1
  selector:
    matchLabels:
      app: port-k8s-exporter
  template:
    metadata:
      labels:
        app: port-k8s-exporter
    spec:
      serviceAccountName: port-k8s-exporter
      containers:
        - name: port-k8s-exporter
          image: ghcr.io/port-labs/port-k8s-exporter:latest
          env:
            - name: PORT_CLIENT_ID
              valueFrom:
                secretKeyRef:
                  name: port-credentials
                  key: clientId
            - name: PORT_CLIENT_SECRET
              valueFrom:
                secretKeyRef:
                  name: port-credentials
                  key: clientSecret
            - name: CONFIG_YAML
              value: |
                resources:
                  - kind: v1/pods
                    selector:
                      query: '.metadata.namespace | startswith("kube") | not'
                    port:
                      entity:
                        mappings:
                          - identifier: .metadata.name + "-" + .metadata.namespace
                            title: .metadata.name
                            blueprint: '"k8s-pod"'
                            properties:
                              namespace: .metadata.namespace
                              status: .status.phase
                              node: .spec.nodeName
                              created: .metadata.creationTimestamp
                  - kind: apps/v1/deployments
                    port:
                      entity:
                        mappings:
                          - identifier: .metadata.name + "-" + .metadata.namespace
                            title: .metadata.name
                            blueprint: '"k8s-deployment"'
                            properties:
                              namespace: .metadata.namespace
                              replicas: .spec.replicas
                              available_replicas: .status.availableReplicas
                              strategy: .spec.strategy.type
```
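The exporter's jq mappings can be sanity-checked by mimicking them in Python before deploying: the selector drops `kube*` namespaces and the identifier is built as `<name>-<namespace>`. A sketch (not the exporter's actual code):

```python
# Mimic the exporter's jq rules: filter out namespaces starting with "kube",
# and build the entity identifier as "<name>-<namespace>".
def map_pod(pod):
    meta = pod["metadata"]
    if meta["namespace"].startswith("kube"):
        return None  # filtered out by the selector query
    return {
        "identifier": f"{meta['name']}-{meta['namespace']}",
        "title": meta["name"],
        "blueprint": "k8s-pod",
        "properties": {
            "namespace": meta["namespace"],
            "status": pod.get("status", {}).get("phase"),
        },
    }

print(map_pod({"metadata": {"name": "api", "namespace": "prod"}, "status": {"phase": "Running"}}))
```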

Self-Service Actions

Action Definitions

```json
{
  "identifier": "deploy_service",
  "title": "Deploy Service",
  "icon": "Deploy",
  "description": "Deploy a service to the specified environment",
  "trigger": {
    "type": "self-service",
    "operation": "CREATE",
    "userInputs": {
      "properties": {
        "environment": {
          "title": "Environment",
          "type": "string",
          "enum": ["development", "staging", "production"],
          "default": "development"
        },
        "version": {
          "title": "Version",
          "type": "string",
          "description": "Docker image tag to deploy"
        },
        "replicas": {
          "title": "Replicas",
          "type": "number",
          "default": 1,
          "minimum": 1,
          "maximum": 10
        },
        "cpu_limit": {
          "title": "CPU Limit",
          "type": "string",
          "default": "500m",
          "enum": ["100m", "250m", "500m", "1000m"]
        },
        "memory_limit": {
          "title": "Memory Limit",
          "type": "string",
          "default": "512Mi",
          "enum": ["256Mi", "512Mi", "1Gi", "2Gi"]
        }
      },
      "required": ["environment", "version"]
    },
    "blueprintIdentifier": "service"
  },
  "invocationMethod": {
    "type": "WEBHOOK",
    "url": "https://api.company.com/deploy",
    "method": "POST",
    "headers": {
      "Authorization": "Bearer {{ .secrets.DEPLOY_TOKEN }}"
    },
    "body": {
      "service": "{{ .entity.identifier }}",
      "environment": "{{ .inputs.environment }}",
      "version": "{{ .inputs.version }}",
      "replicas": "{{ .inputs.replicas }}",
      "resources": {
        "cpu": "{{ .inputs.cpu_limit }}",
        "memory": "{{ .inputs.memory_limit }}"
      }
    }
  }
}
```
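When the action runs, Port substitutes `{{ .inputs.* }}` and `{{ .entity.* }}` placeholders into the webhook body. A rough Python illustration of that substitution (a stand-in, not Port's actual templating engine):

```python
import re

# Rough stand-in for Port's template substitution: replaces
# "{{ .inputs.x }}" and "{{ .entity.x }}" style placeholders.
def render(template, entity, inputs):
    def sub(match):
        scope, key = match.group(1), match.group(2)
        source = {"inputs": inputs, "entity": entity}[scope]
        return str(source.get(key, ""))
    return re.sub(r"\{\{\s*\.(inputs|entity)\.(\w+)\s*\}\}", sub, template)

body = render(
    '{"service": "{{ .entity.identifier }}", "version": "{{ .inputs.version }}"}',
    entity={"identifier": "user-service"},
    inputs={"version": "v1.2.3"},
)
print(body)  # {"service": "user-service", "version": "v1.2.3"}
```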

GitHub Actions Integration

```json
{
  "identifier": "create_repository",
  "title": "Create Repository",
  "icon": "Github",
  "description": "Create a new GitHub repository with template",
  "trigger": {
    "type": "self-service",
    "operation": "CREATE",
    "userInputs": {
      "properties": {
        "name": { "title": "Repository Name", "type": "string", "pattern": "^[a-z0-9-]+$" },
        "description": { "title": "Description", "type": "string" },
        "template": {
          "title": "Template",
          "type": "string",
          "enum": ["microservice", "frontend", "library", "documentation"]
        },
        "team": { "title": "Owning Team", "type": "string", "format": "team" },
        "visibility": {
          "title": "Visibility",
          "type": "string",
          "enum": ["private", "internal", "public"],
          "default": "private"
        }
      },
      "required": ["name", "template", "team"]
    }
  },
  "invocationMethod": {
    "type": "GITHUB",
    "org": "your-org",
    "repo": "platform-workflows",
    "workflow": "create-repository.yml",
    "workflowInputs": {
      "repository_name": "{{ .inputs.name }}",
      "description": "{{ .inputs.description }}",
      "template": "{{ .inputs.template }}",
      "team": "{{ .inputs.team }}",
      "visibility": "{{ .inputs.visibility }}"
    }
  }
}
```

Terraform Integration

```json
{
  "identifier": "provision_infrastructure",
  "title": "Provision Infrastructure",
  "icon": "Terraform",
  "description": "Provision cloud infrastructure using Terraform",
  "trigger": {
    "type": "self-service",
    "operation": "CREATE",
    "userInputs": {
      "properties": {
        "environment": { "title": "Environment", "type": "string", "enum": ["dev", "staging", "prod"] },
        "instance_type": {
          "title": "Instance Type",
          "type": "string",
          "enum": ["t3.micro", "t3.small", "t3.medium", "t3.large"]
        },
        "region": { "title": "AWS Region", "type": "string", "enum": ["us-east-1", "us-west-2", "eu-west-1"] },
        "auto_scaling": { "title": "Enable Auto Scaling", "type": "boolean", "default": false }
      },
      "required": ["environment", "instance_type", "region"]
    }
  },
  "invocationMethod": {
    "type": "WEBHOOK",
    "url": "https://terraform-api.company.com/provision",
    "method": "POST",
    "headers": {
      "Authorization": "Bearer {{ .secrets.TERRAFORM_TOKEN }}",
      "Content-Type": "application/json"
    },
    "body": {
      "workspace": "{{ .entity.identifier }}-{{ .inputs.environment }}",
      "variables": {
        "environment": "{{ .inputs.environment }}",
        "instance_type": "{{ .inputs.instance_type }}",
        "region": "{{ .inputs.region }}",
        "auto_scaling": "{{ .inputs.auto_scaling }}",
        "service_name": "{{ .entity.identifier }}"
      }
    }
  }
}
```

Scorecards and Standards

Scorecard Definition

```json
{
  "identifier": "production_readiness",
  "title": "Production Readiness",
  "description": "Measures how ready a service is for production deployment",
  "filter": {
    "combinator": "and",
    "rules": [
      { "property": "$blueprint", "operator": "=", "value": "service" },
      { "property": "lifecycle", "operator": "in", "value": ["Production", "Staging"] }
    ]
  },
  "rules": [
    {
      "identifier": "has_readme",
      "title": "Has README",
      "description": "Service repository contains a README file",
      "level": "Bronze",
      "query": {
        "combinator": "and",
        "rules": [{ "property": "has_readme", "operator": "=", "value": true }]
      }
    },
    {
      "identifier": "has_tests",
      "title": "Has Tests",
      "description": "Service has automated tests with >80% coverage",
      "level": "Silver",
      "query": {
        "combinator": "and",
        "rules": [{ "property": "test_coverage", "operator": ">=", "value": 80 }]
      }
    },
    {
      "identifier": "has_monitoring",
      "title": "Has Monitoring",
      "description": "Service has monitoring and alerting configured",
      "level": "Gold",
      "query": {
        "combinator": "and",
        "rules": [
          { "property": "has_monitoring", "operator": "=", "value": true },
          { "property": "has_alerts", "operator": "=", "value": true }
        ]
      }
    }
  ]
}
```
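Scorecard rules like these can be evaluated locally to preview which levels a service would achieve. A simplified evaluator, supporting only the operators used above (a sketch, not Port's evaluation engine):

```python
# Simplified scorecard evaluator supporting the operators used above.
OPS = {
    "=": lambda a, b: a == b,
    ">=": lambda a, b: a is not None and a >= b,
    "in": lambda a, b: a in b,
}

def rule_passes(rule, props):
    # "and" combinator: every sub-rule must hold
    return all(
        OPS[r["operator"]](props.get(r["property"]), r["value"])
        for r in rule["query"]["rules"]
    )

def levels_achieved(rules, props):
    return [r["level"] for r in rules if rule_passes(r, props)]

rules = [
    {"level": "Bronze", "query": {"rules": [{"property": "has_readme", "operator": "=", "value": True}]}},
    {"level": "Silver", "query": {"rules": [{"property": "test_coverage", "operator": ">=", "value": 80}]}},
]
print(levels_achieved(rules, {"has_readme": True, "test_coverage": 72}))  # ['Bronze']
```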

Quality Gates

```json
{
  "identifier": "security_compliance",
  "title": "Security Compliance",
  "description": "Ensures services meet security standards",
  "filter": {
    "combinator": "and",
    "rules": [{ "property": "$blueprint", "operator": "=", "value": "service" }]
  },
  "rules": [
    {
      "identifier": "vulnerability_scan",
      "title": "No Critical Vulnerabilities",
      "description": "Service has no critical security vulnerabilities",
      "level": "Bronze",
      "query": {
        "combinator": "and",
        "rules": [{ "property": "critical_vulnerabilities", "operator": "=", "value": 0 }]
      }
    },
    {
      "identifier": "secrets_management",
      "title": "Proper Secrets Management",
      "description": "Service uses proper secrets management",
      "level": "Silver",
      "query": {
        "combinator": "and",
        "rules": [
          { "property": "uses_secrets_manager", "operator": "=", "value": true },
          { "property": "hardcoded_secrets", "operator": "=", "value": 0 }
        ]
      }
    },
    {
      "identifier": "security_review",
      "title": "Security Review Completed",
      "description": "Service has completed security review",
      "level": "Gold",
      "query": {
        "combinator": "and",
        "rules": [{ "property": "security_review_status", "operator": "=", "value": "approved" }]
      }
    }
  ]
}
```

Dashboards and Visualization

Dashboard Creation

```json
{
  "identifier": "engineering_overview",
  "title": "Engineering Overview",
  "description": "High-level view of engineering metrics",
  "widgets": [
    {
      "id": "services_by_team",
      "title": "Services by Team",
      "type": "pie-chart",
      "dataset": {
        "combinator": "and",
        "rules": [{ "property": "$blueprint", "operator": "=", "value": "service" }]
      },
      "property": "team"
    },
    {
      "id": "deployment_frequency",
      "title": "Deployment Frequency",
      "type": "line-chart",
      "dataset": {
        "combinator": "and",
        "rules": [{ "property": "$blueprint", "operator": "=", "value": "deployment" }]
      },
      "property": "$createdAt",
      "timeframe": "last30days"
    },
    {
      "id": "production_readiness",
      "title": "Production Readiness Score",
      "type": "number",
      "dataset": {
        "combinator": "and",
        "rules": [
          { "property": "$blueprint", "operator": "=", "value": "service" },
          { "property": "lifecycle", "operator": "=", "value": "Production" }
        ]
      },
      "calculation": "average",
      "property": "$scorecard.production_readiness.level"
    }
  ]
}
```

Team Dashboard

```json
{
  "identifier": "team_dashboard",
  "title": "Team Dashboard",
  "description": "Team-specific metrics and services",
  "filters": [{ "property": "team", "operator": "=", "value": "{{ user.team }}" }],
  "widgets": [
    {
      "id": "my_services",
      "title": "My Services",
      "type": "table",
      "dataset": {
        "combinator": "and",
        "rules": [
          { "property": "$blueprint", "operator": "=", "value": "service" },
          { "property": "team", "operator": "=", "value": "{{ user.team }}" }
        ]
      },
      "columns": ["title", "lifecycle", "health_status", "last_deployment"]
    },
    {
      "id": "incident_count",
      "title": "Open Incidents",
      "type": "number",
      "dataset": {
        "combinator": "and",
        "rules": [
          { "property": "$blueprint", "operator": "=", "value": "incident" },
          { "property": "status", "operator": "=", "value": "open" },
          { "property": "assigned_team", "operator": "=", "value": "{{ user.team }}" }
        ]
      }
    }
  ]
}
```

Advanced Features

Custom Properties with Calculations

```json
{
  "identifier": "service",
  "title": "Service",
  "schema": {
    "properties": {
      "name": { "title": "Name", "type": "string" },
      "cpu_requests": { "title": "CPU Requests", "type": "number" },
      "cpu_limits": { "title": "CPU Limits", "type": "number" },
      "cpu_utilization": {
        "title": "CPU Utilization (%)",
        "type": "number",
        "calculation": {
          "type": "formula",
          "formula": "(cpu_requests / cpu_limits) * 100"
        }
      },
      "cost_per_month": { "title": "Monthly Cost", "type": "number", "format": "currency" },
      "cost_per_request": {
        "title": "Cost per Request",
        "type": "number",
        "calculation": {
          "type": "formula",
          "formula": "cost_per_month / (requests_per_month || 1)"
        }
      }
    }
  }
}
```
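The calculated properties above are plain arithmetic over other properties; evaluated in Python for illustration (the `|| 1` guard maps to Python's `or 1`, avoiding division by zero):

```python
# Evaluate the two formula properties from the blueprint above.
def cpu_utilization(cpu_requests, cpu_limits):
    # (cpu_requests / cpu_limits) * 100
    return (cpu_requests / cpu_limits) * 100

def cost_per_request(cost_per_month, requests_per_month):
    # cost_per_month / (requests_per_month || 1) -- falls back to 1 when zero
    return cost_per_month / (requests_per_month or 1)

print(cpu_utilization(250, 500))   # 50.0
print(cost_per_request(120.0, 0))  # 120.0
```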

Automation Rules

```json
{
  "identifier": "auto_assign_team",
  "title": "Auto-assign Team Based on Repository",
  "description": "Automatically assign team ownership based on repository path",
  "trigger": {
    "type": "entity-created",
    "blueprintIdentifier": "service"
  },
  "conditions": [
    { "property": "repository", "operator": "contains", "value": "/backend/" }
  ],
  "actions": [
    {
      "type": "update-entity",
      "properties": { "team": "backend-team" }
    },
    {
      "type": "send-notification",
      "channel": "slack",
      "message": "New backend service {{ entity.title }} has been registered and assigned to backend-team"
    }
  ]
}
```
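The rule fires on entity creation only when `repository` contains `/backend/`, then applies the update action. The condition-and-update flow reduces to the following sketch (illustrative only, not Port's rule engine):

```python
# Illustrative evaluation of the automation rule's "contains" condition
# and its update-entity action.
def apply_rule(entity, conditions, updates):
    props = entity.get("properties", {})
    for cond in conditions:
        value = props.get(cond["property"], "")
        if cond["operator"] == "contains" and cond["value"] not in value:
            return entity  # condition not met; leave entity unchanged
    props.update(updates)
    return entity

entity = {"properties": {"repository": "https://github.com/org/backend/user-service"}}
apply_rule(entity,
           [{"property": "repository", "operator": "contains", "value": "/backend/"}],
           {"team": "backend-team"})
print(entity["properties"]["team"])  # backend-team
```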

Data Enrichment

```python
# Port data enrichment webhook
from flask import Flask, request, jsonify
import requests
import json

app = Flask(__name__)

@app.route('/enrich-service', methods=['POST'])
def enrich_service():
    """Enrich service data with external information"""
    data = request.json
    entity = data.get('entity', {})

    # Get repository information from GitHub
    repo_url = entity.get('properties', {}).get('repository')
    if repo_url:
        github_data = get_github_metrics(repo_url)

        # Update entity with enriched data
        enriched_properties = {
            **entity.get('properties', {}),
            'stars': github_data.get('stargazers_count', 0),
            'forks': github_data.get('forks_count', 0),
            'open_issues': github_data.get('open_issues_count', 0),
            'last_commit': github_data.get('pushed_at'),
            'primary_language': github_data.get('language')
        }

        # Update Port entity
        update_port_entity(entity['identifier'], enriched_properties)

    return jsonify({"status": "success"})

def get_github_metrics(repo_url):
    """Fetch metrics from GitHub API"""
    # Extract owner/repo from URL
    parts = repo_url.replace('https://github.com/', '').split('/')
    owner, repo = parts[0], parts[1]

    response = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}",
        headers={"Authorization": f"token {GITHUB_TOKEN}"}
    )

    return response.json() if response.status_code == 200 else {}

def update_port_entity(identifier, properties):
    """Update Port entity with enriched data"""
    token = get_port_access_token()

    response = requests.patch(
        f"{PORT_API_URL}/blueprints/service/entities/{identifier}",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json"
        },
        json={"properties": properties}
    )

    return response.json()

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

Monitoring and Observability

Health Check Integration

```python
import requests
import json
from datetime import datetime

def check_service_health():
    """Check health of all services and update Port"""
    services = get_port_entities("service")

    for service in services:
        health_url = service.get('properties', {}).get('health_endpoint')
        if health_url:
            try:
                response = requests.get(health_url, timeout=10)
                health_status = "Healthy" if response.status_code == 200 else "Unhealthy"
                response_time = response.elapsed.total_seconds() * 1000

                # Update Port entity
                update_port_entity(service['identifier'], {
                    'health_status': health_status,
                    'response_time_ms': response_time,
                    'last_health_check': datetime.utcnow().isoformat()
                })

            except requests.RequestException:
                update_port_entity(service['identifier'], {
                    'health_status': 'Unreachable',
                    'last_health_check': datetime.utcnow().isoformat()
                })

def get_port_entities(blueprint):
    """Fetch entities from Port"""
    token = get_port_access_token()

    response = requests.get(
        f"{PORT_API_URL}/blueprints/{blueprint}/entities",
        headers={"Authorization": f"Bearer {token}"}
    )

    return response.json().get('entities', [])

# Run health checks every 5 minutes
if __name__ == '__main__':
    import schedule
    import time

    schedule.every(5).minutes.do(check_service_health)

    while True:
        schedule.run_pending()
        time.sleep(1)
```

Metrics Collection

```python
import requests
import json
from datetime import datetime
from prometheus_client.parser import text_string_to_metric_families

def collect_prometheus_metrics():
    """Collect metrics from Prometheus and update Port"""
    services = get_port_entities("service")

    for service in services:
        service_name = service['identifier']

        # Query Prometheus for service metrics
        metrics = {
            'cpu_usage': query_prometheus(f'avg(cpu_usage{{service="{service_name}"}})'),
            'memory_usage': query_prometheus(f'avg(memory_usage{{service="{service_name}"}})'),
            'request_rate': query_prometheus(f'rate(http_requests_total{{service="{service_name}"}}[5m])'),
            'error_rate': query_prometheus(f'rate(http_requests_total{{service="{service_name}",status=~"5.."}}[5m])'),
            'p95_latency': query_prometheus(f'histogram_quantile(0.95, http_request_duration_seconds{{service="{service_name}"}})'),
        }

        # Update Port entity with metrics
        update_port_entity(service['identifier'], {
            'cpu_usage_percent': metrics['cpu_usage'],
            'memory_usage_percent': metrics['memory_usage'],
            'requests_per_second': metrics['request_rate'],
            'error_rate_percent': metrics['error_rate'] * 100,
            'p95_latency_ms': metrics['p95_latency'] * 1000,
            'metrics_last_updated': datetime.utcnow().isoformat()
        })

def query_prometheus(query):
    """Query Prometheus for metrics"""
    response = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={'query': query}
    )

    data = response.json()
    if data['status'] == 'success' and data['data']['result']:
        return float(data['data']['result'][0]['value'][1])
    return 0
```

Security and Compliance

RBAC Configuration

```json
{
  "roles": [
    {
      "identifier": "developer",
      "title": "Developer",
      "description": "Standard developer access",
      "permissions": [
        {
          "action": "read",
          "resource": "entity",
          "condition": { "property": "team", "operator": "=", "value": "{{ user.team }}" }
        },
        {
          "action": "execute",
          "resource": "action",
          "condition": { "property": "identifier", "operator": "in", "value": ["deploy_service", "restart_service"] }
        }
      ]
    },
    {
      "identifier": "team_lead",
      "title": "Team Lead",
      "description": "Team lead with additional permissions",
      "permissions": [
        {
          "action": "*",
          "resource": "entity",
          "condition": { "property": "team", "operator": "=", "value": "{{ user.team }}" }
        },
        {
          "action": "execute",
          "resource": "action",
          "condition": { "property": "approval_required", "operator": "=", "value": false }
        }
      ]
    },
    {
      "identifier": "platform_admin",
      "title": "Platform Admin",
      "description": "Full platform access",
      "permissions": [{ "action": "*", "resource": "*" }]
    }
  ]
}
```
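A permission check under this model boils down to matching action, resource, and condition. A minimal sketch handling only the `=` condition and the `{{ user.team }}` placeholder (not Port's actual evaluation engine):

```python
# Minimal RBAC check: a permission matches if action and resource match
# (or are "*") and its condition, if any, holds for the target entity.
# Only the "=" operator and the "{{ user.team }}" placeholder are handled here.
def allowed(permissions, action, resource, entity, user):
    for perm in permissions:
        if perm["action"] not in (action, "*"):
            continue
        if perm["resource"] not in (resource, "*"):
            continue
        cond = perm.get("condition")
        if cond:
            expected = user["team"] if cond["value"] == "{{ user.team }}" else cond["value"]
            if entity.get(cond["property"]) != expected:
                continue
        return True
    return False

dev_perms = [{"action": "read", "resource": "entity",
              "condition": {"property": "team", "operator": "=", "value": "{{ user.team }}"}}]
print(allowed(dev_perms, "read", "entity", {"team": "backend"}, {"team": "backend"}))   # True
print(allowed(dev_perms, "read", "entity", {"team": "frontend"}, {"team": "backend"}))  # False
```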

Audit Logging

```python
import json
from datetime import datetime
import logging

# Configure audit logger
audit_logger = logging.getLogger('port_audit')
audit_handler = logging.FileHandler('port_audit.log')
audit_formatter = logging.Formatter('%(asctime)s - %(message)s')
audit_handler.setFormatter(audit_formatter)
audit_logger.addHandler(audit_handler)
audit_logger.setLevel(logging.INFO)

def log_audit_event(event_type, user, entity, action, details=None):
    """Log audit events for compliance"""
    audit_event = {
        'timestamp': datetime.utcnow().isoformat(),
        'event_type': event_type,
        'user': {
            'id': user.get('id'),
            'email': user.get('email'),
            'team': user.get('team')
        },
        'entity': {
            'identifier': entity.get('identifier'),
            'blueprint': entity.get('blueprint'),
            'title': entity.get('title')
        },
        'action': action,
        'details': details or {},
        'ip_address': get_client_ip(),
        'user_agent': get_user_agent()
    }

    audit_logger.info(json.dumps(audit_event))

# Example usage in webhook handler
@app.route('/port-action', methods=['POST'])
def handle_port_action():
    data = request.json

    # Log the action execution
    log_audit_event(
        event_type='action_executed',
        user=data.get('trigger', {}).get('by', {}),
        entity=data.get('entity', {}),
        action=data.get('action', {}).get('identifier'),
        details={
            'inputs': data.get('payload', {}).get('inputs', {}),
            'run_id': data.get('run', {}).get('id')
        }
    )

    # Process the action
    result = process_action(data)

    # Log the result
    log_audit_event(
        event_type='action_completed',
        user=data.get('trigger', {}).get('by', {}),
        entity=data.get('entity', {}),
        action=data.get('action', {}).get('identifier'),
        details={
            'status': result.get('status'),
            'duration': result.get('duration'),
            'output': result.get('output')
        }
    )

    return jsonify(result)
```

Best Practices

Data Model Design

```bash

Best practices for blueprint design:

1. Start with core entities (services, teams, environments)

2. Use consistent naming conventions

3. Define clear relationships between entities

4. Include metadata for governance (owner, lifecycle, tier)

5. Plan for scalability and evolution

6. Use enums for standardized values

7. Include validation rules

8. Design for automation and self-service

```

Integration Strategy

```bash

Integration best practices:

1. Start with read-only integrations

2. Implement gradual data ingestion

3. Use webhooks for real-time updates

4. Implement proper error handling and retries

5. Monitor integration health

6. Use batch processing for large datasets

7. Implement data validation and cleansing

8. Plan for data migration and evolution

```
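Point 4 above (error handling and retries) is often what separates a flaky integration from a reliable one. A common shape is exponential backoff around API calls; a generic sketch, not a feature of any Port SDK:

```python
import time

# Generic retry-with-exponential-backoff wrapper for integration calls.
# attempts and base_delay are illustrative defaults.
def with_retries(fn, attempts=3, base_delay=0.1):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}
def flaky():
    # Simulated transient failure: succeeds on the third attempt
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky, base_delay=0.001))  # ok
```

In production you would typically retry only on transient errors (timeouts, 429s, 5xx) rather than every exception.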

Self-Service Design

```bash

Self-service action design:

1. Design for common use cases first

2. Provide clear input validation

3. Include helpful descriptions and examples

4. Implement proper approval workflows

5. Provide feedback and status updates

6. Include rollback capabilities

7. Monitor action usage and success rates

8. Iterate based on user feedback

```

Governance and Compliance

```bash

Governance best practices:

1. Implement proper RBAC from the start

2. Enable comprehensive audit logging

3. Define clear ownership models

4. Implement quality gates and scorecards

5. Regular compliance reviews

6. Automate policy enforcement

7. Provide training and documentation

8. Monitor and measure adoption

```

Troubleshooting

Common Issues

```bash

API authentication issues

1. Verify client ID and secret

2. Check token expiration

3. Validate API permissions

4. Review rate limiting

Data ingestion problems

1. Check webhook endpoints

2. Validate JSON schema

3. Review entity relationships

4. Check for duplicate identifiers

Performance issues

1. Optimize API queries

2. Implement proper caching

3. Use batch operations

4. Monitor rate limits

```
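For the performance items above, batching entity upserts instead of calling the API once per entity is usually the biggest win. A chunking helper sketch (the batch size of 20 is an assumption, not a documented Port limit):

```python
# Split a list of entities into fixed-size batches for bulk upserts.
def chunked(items, size):
    return [items[i:i + size] for i in range(0, len(items), size)]

entities = [{"identifier": f"svc-{i}"} for i in range(95)]
batches = chunked(entities, 20)
print(len(batches), len(batches[-1]))  # 5 20-item batches, last one has 15
```

Each batch can then go through a single API call (or be rate-limited as a unit) instead of 95 separate requests.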

Debug Tools

```python
# Port API debugging utility
import requests
import json

def debug_port_api(endpoint, method='GET', data=None):
    """Debug Port API calls with detailed logging"""
    token = get_port_access_token()

    headers = {
        'Authorization': f'Bearer {token}',
        'Content-Type': 'application/json'
    }

    print(f"Making {method} request to: {PORT_API_URL}{endpoint}")
    print(f"Headers: {json.dumps(headers, indent=2)}")

    if data:
        print(f"Request body: {json.dumps(data, indent=2)}")

    response = requests.request(
        method=method,
        url=f"{PORT_API_URL}{endpoint}",
        headers=headers,
        json=data
    )

    print(f"Response status: {response.status_code}")
    print(f"Response headers: {dict(response.headers)}")
    print(f"Response body: {json.dumps(response.json(), indent=2)}")

    return response

# Example usage
debug_port_api('/blueprints/service/entities')
```

Resources

Documentation

Community

- [Port Community](LINK_9)
- [Slack Community](LINK_9)
- [GitHub Examples](LINK_9)

Training

- [Port Academy](LINK_9)
- [Webinar Series](LINK_9)