Core
Built-in SuperPlane components.
Triggers
Actions
Schedule
The Schedule trigger starts workflow executions automatically based on a configured schedule.
Use Cases
- Periodic tasks: Run daily reports, backups, or maintenance tasks
- Data synchronization: Regularly sync data between systems
- Monitoring: Periodic health checks and monitoring
- Batch processing: Process data on a recurring schedule
Schedule Types
- Minutes: Trigger every N minutes (1-59)
- Hours: Trigger every N hours at a specific minute (1-23 hours)
- Days: Trigger every N days at a specific time (1-31 days)
- Weeks: Trigger every N weeks on specific weekdays at a specific time (1-52 weeks)
- Months: Trigger every N months on a specific day and time (1-24 months)
- Cron: Use a cron expression for advanced scheduling patterns
Timezone Support
For days, weeks, months, and cron schedules, you can specify a timezone to ensure triggers occur at the correct local time.
Cron Expressions
Supports both 5-field and 6-field cron expressions:
- 5-field: `minute hour day month dayofweek` (e.g., `30 14 * * MON-FRI`)
- 6-field: `second minute hour day month dayofweek` (e.g., `0 30 14 * * MON-FRI`)
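As a rough illustration of the 5-field semantics, the example `30 14 * * MON-FRI` fires at 14:30 on weekdays. A minimal Python sketch of that one pattern (not a general cron parser; field meanings follow the standard cron convention):

```python
from datetime import datetime

def matches_weekday_1430(dt: datetime) -> bool:
    """Check whether dt matches the example expression 30 14 * * MON-FRI."""
    return (
        dt.minute == 30
        and dt.hour == 14
        and dt.weekday() < 5  # Monday=0 .. Friday=4
    )

# 2024-01-01 was a Monday, so 14:30 that day matches.
print(matches_weekday_1430(datetime(2024, 1, 1, 14, 30)))  # True
```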
Event Data
Each scheduled execution includes calendar information:
- calendar: Year, month, day, hour, minute, second, week_day
- timezone: Timezone information (for applicable schedule types)
Examples
- Every 15 minutes: Minutes schedule with 15-minute interval
- Daily at 9 AM: Days schedule with hour=9, minute=0
- Weekdays at 2 PM: Weeks schedule with weekDays=[Monday-Friday], hour=14
- First of every month: Months schedule with dayOfMonth=1
Example Data
```json
{
  "data": {
    "calendar": {
      "day": "1",
      "hour": "09",
      "minute": "00",
      "month": "January",
      "second": "00",
      "week_day": "Monday",
      "year": "2024"
    },
    "timezone": "+00:00"
  },
  "timestamp": "2024-01-01T09:00:00Z",
  "type": "scheduler.tick"
}
```
Manual Run
The Manual Run trigger allows you to start workflow executions manually from the SuperPlane UI.
Use Cases
- Testing workflows: Manually trigger workflows during development and testing
- One-off tasks: Run workflows on-demand for specific operations
- Debugging: Manually execute workflows to debug issues
- Ad-hoc processing: Process data when needed without automation
How It Works
- Add the Manual Run trigger as the starting node of your workflow
- Click the “Run” button in the workflow UI to start an execution
- The workflow begins immediately with empty event data
Configuration
The Manual Run trigger requires no configuration. It’s ready to use immediately after being added to a workflow.
Event Data
Manual runs start with an empty event payload. You can use this as a starting point and add data through subsequent components.
Example Data
```json
{
  "foo": "bar"
}
```
Webhook
The Webhook trigger starts a new workflow execution when an HTTP request is received at the generated webhook URL.
Use Cases
- External system integration: Receive events from third-party services
- CI/CD pipelines: Trigger workflows from build systems
- Form submissions: Process data from web forms
- Event notifications: Receive notifications from external applications
How It Works
- When you add a Webhook trigger to a workflow, SuperPlane generates a unique webhook URL
- Configure the authentication method for the webhook
- External systems can send HTTP requests to this URL
- Each request starts a new workflow execution with the request data
Authentication Methods
- Signature (HMAC): Verify requests using an HMAC-SHA256 signature in the `X-Signature-256` header
- Bearer Token: Require a Bearer token in the `Authorization` header
- Header Token: Require a raw token in a custom header (default: `X-Webhook-Token`)
- None (unsafe): No authentication (not recommended for production)
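For the HMAC option, the sender signs the raw request body with the shared secret and sends the digest in the `X-Signature-256` header. A sketch of how a client might compute and verify such a signature; the exact header value format (the `sha256=` prefix here) is an assumption, so check what your SuperPlane instance expects:

```python
import hashlib
import hmac

def sign_payload(secret: bytes, body: bytes) -> str:
    # HMAC-SHA256 over the raw request body; "sha256=" prefix is assumed.
    digest = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return "sha256=" + digest

def verify(secret: bytes, body: bytes, header_value: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign_payload(secret, body), header_value)

sig = sign_payload(b"webhook-secret", b'{"event": "push"}')
print(verify(b"webhook-secret", b'{"event": "push"}', sig))  # True
```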
Request Data
The webhook payload includes:
- body: Parsed request body (JSON if possible, otherwise raw data)
- headers: All HTTP headers from the request
Security
- Each webhook has a unique secret key for authentication
- Secrets can be reset using the “Reset Authentication” action
- Maximum payload size: 64KB
Example Usage
Send a POST request to the webhook URL with your payload. The workflow will receive the data and start execution.
Example Data
```json
{
  "data": {
    "body": {
      "event": "push",
      "repository": "superplanehq/superplane"
    },
    "headers": {
      "X-Event": [
        "push"
      ]
    }
  },
  "timestamp": "2026-01-19T12:00:00Z",
  "type": "webhook"
}
```
Add Memory
The Add Memory component appends a new item to canvas-level memory storage.
Use Cases
- Persist identifiers for later cleanup paths
- Store cross-run mappings (for example pull request to resource ID)
- Keep structured operational context per canvas
How It Works
- Reads `namespace` and value fields from configuration
- Appends a new memory row for the current canvas
- Emits `memory.added` with the saved payload
Example Output
```json
{
  "data": {
    "data": {
      "namespace": "machines",
      "values": {
        "creator": "alex",
        "id": "1",
        "pull_request": "123"
      }
    }
  },
  "timestamp": "2026-01-19T12:00:00Z",
  "type": "memory.added"
}
```
Approval
The Approval component pauses workflow execution and waits for manual approval from specified users, groups, or roles before continuing.
Use Cases
- Deployment approvals: Require approval before deploying to production
- Financial transactions: Get approval for high-value operations
- Content moderation: Review content before publishing
- Compliance workflows: Ensure regulatory approvals are obtained
How It Works
- When the Approval component executes, it creates approval requirements based on the configured approvers
- The workflow pauses and waits for all required approvals
- Approvers receive notifications and can approve or reject from the workflow UI
- Once all approvals are collected, the workflow continues:
- Approved channel: All required approvers approved
- Rejected channel: At least one approver rejected
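The collection logic described above amounts to a small state reducer: wait for every approver to respond, then reject if anyone rejected, otherwise approve. A sketch (illustrative only; the `pending`/`approved`/`rejected` state names mirror the record states shown in the example output below):

```python
def approval_result(states: list[str]) -> str:
    """Reduce per-approver states to an outcome channel."""
    if any(s == "pending" for s in states):
        return "pending"          # workflow stays paused
    if any(s == "rejected" for s in states):
        return "rejected"         # Rejected channel
    return "approved"             # Approved channel

print(approval_result(["approved", "pending"]))   # pending
print(approval_result(["approved", "approved"]))  # approved
print(approval_result(["approved", "rejected"]))  # rejected
```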
Configuration
- Approvers: List of users, groups, or roles who must approve
- Everyone: Any authenticated user can approve
- Specific user: Only the specified user can approve
- Group: Any member of the specified group can approve
- Role: Any user with the specified role can approve
Output Channels
- Approved: Emitted when all required approvers have approved
- Rejected: Emitted when at least one approver rejects (after all have responded)
Actions
- approve: Approve a pending requirement (can include an optional comment)
- reject: Reject a pending requirement (requires a reason)
Example Output
```json
{
  "data": {
    "records": [
      {
        "approval": {
          "approvedAt": "2024-01-01T12:00:00Z",
          "comment": "Looks good"
        },
        "index": 0,
        "state": "approved",
        "type": "user",
        "user": {
          "email": "alex@example.com",
          "id": "user_123",
          "name": "Alex Doe"
        }
      }
    ],
    "result": "approved"
  },
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "approval.finished"
}
```
Delete Memory
The Delete Memory component removes memory rows from canvas-level memory storage.
Use Cases
- Remove stale IDs after cleanup is complete
- Keep memory stores bounded over time
How It Works
- Reads `namespace` and `matchList` from configuration
- Deletes memory rows matching all configured key/value pairs
- Emits `memory.deleted` to the `deleted` or `notFound` channel
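The matching rule — a row is affected only when every configured key/value pair equals the row's values (AND semantics) — can be sketched as:

```python
def row_matches(row: dict, match: dict) -> bool:
    """True when every key/value pair in `match` equals the row's value."""
    return all(row.get(key) == value for key, value in match.items())

row = {"creator": "igor", "pull_request": 123, "sandbox_id": "sbx-001"}
print(row_matches(row, {"creator": "igor", "pull_request": 123}))  # True
print(row_matches(row, {"creator": "alex"}))                       # False
```

The same match semantics apply to the Read Memory, Update Memory, and Upsert Memory components described below.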
Output Channels
- Deleted: At least one matching memory row was removed
- Not Found: No matching memory rows were removed
Example Output
```json
{
  "data": {
    "data": {
      "count": 1,
      "deleted": [
        {
          "creator": "igor",
          "pull_request": 123,
          "sandbox_id": "sbx-001"
        }
      ],
      "matches": {
        "creator": "igor",
        "pull_request": 123
      },
      "namespace": "machines"
    }
  },
  "timestamp": "2026-02-28T00:00:00Z",
  "type": "memory.deleted"
}
```
Filter
The Filter component evaluates a boolean expression against incoming events and only forwards events that match the condition.
Use Cases
- Data validation: Only process events that meet certain criteria
- Event filtering: Filter out unwanted events before processing
- Conditional routing: Stop processing events that don’t match requirements
- Data quality: Ensure only valid data continues through the workflow
How It Works
- The Filter component evaluates a boolean expression against the incoming event data
- If the expression evaluates to `true`, the event is emitted to the default output channel
- If the expression evaluates to `false`, the execution passes without emitting (effectively filtering out the event)
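In other words, the node acts as a predicate gate: events that satisfy the expression pass through unchanged, and everything else is dropped. A minimal sketch (the real expression language is evaluated by SuperPlane; a Python callable stands in for it here):

```python
def filter_node(event: dict, predicate) -> list[dict]:
    # Emit the event to the default channel only when the predicate holds.
    return [event] if predicate(event) else []

events = [{"status": "active"}, {"status": "archived"}]
is_active = lambda ev: ev["status"] == "active"
passed = [out for ev in events for out in filter_node(ev, is_active)]
print(passed)  # [{'status': 'active'}]
```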
Expression Environment
The expression has access to:
- $: The run context data
- root(): Access to the root event data
- previous(): Access to previous node outputs (optionally with depth parameter)
Examples
- `$["Node Name"].status == "active"`: Only forward events where status is “active”
- `$["Node Name"].amount > 1000`: Filter events with amount greater than 1000
- `$["Node Name"].user.role == "admin" && $["Node Name"].action == "delete"`: Complex condition checking multiple fields
Example Output
```json
{
  "data": {},
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "filter.executed"
}
```
HTTP Request
The HTTP component allows you to make HTTP requests to external APIs and services as part of your workflow.
Use Cases
- API integration: Call external REST APIs
- Webhook notifications: Send notifications to external systems
- Data fetching: Retrieve data from external services
- Service orchestration: Coordinate with microservices
Supported Methods
- GET, POST, PUT, DELETE, PATCH
Request Configuration
- URL: The endpoint to call (supports expressions)
- Method: HTTP method to use
- Query Parameters: Optional URL query parameters
- Headers: Custom HTTP headers (header names cannot use expressions)
- Body: Request body in various formats:
- JSON: Structured JSON payload
- Form Data: URL-encoded form data
- Plain Text: Raw text content
- XML: XML formatted content
Response Handling
The component emits the response with:
- status: HTTP status code
- headers: Response headers
- body: Parsed response body (JSON if possible, otherwise string)
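The body-parsing rule — JSON if possible, otherwise a string — can be mimicked like this (a sketch of the behavior, not the component's actual implementation):

```python
import json

def parse_body(raw: bytes):
    """Try JSON first; fall back to a decoded string."""
    try:
        return json.loads(raw)
    except ValueError:  # also covers json.JSONDecodeError
        return raw.decode("utf-8", errors="replace")

print(parse_body(b'{"message": "ok"}'))  # {'message': 'ok'}
print(parse_body(b"plain text"))         # plain text
```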
Example Output
```json
{
  "data": {
    "body": {
      "message": "ok"
    },
    "error": "Error to read request body: EOF",
    "headers": {
      "Content-Type": [
        "application/json"
      ]
    },
    "status": 200
  },
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "http.request.finished"
}
```
If
The If component evaluates a boolean expression and routes events to different output channels based on the result.
Use Cases
- Conditional branching: Route events down different paths based on conditions
- Decision logic: Implement if-then-else logic in workflows
- Data routing: Send events to different processing paths
- Workflow control: Control workflow flow based on event properties
How It Works
- The If component evaluates a boolean expression against the incoming event data
- If the expression evaluates to `true`, the event is emitted to the “True” output channel
- If the expression evaluates to `false`, the event is emitted to the “False” output channel
Output Channels
- True: Events where the expression evaluates to `true`
- False: Events where the expression evaluates to `false`
Expression Environment
The expression has access to:
- $: The run context data
- root(): Access to the root event data
- previous(): Access to previous node outputs (optionally with depth parameter)
Examples
- `$["Node Name"].status == "approved"`: Route approved items to True channel
- `$["Node Name"].amount > 1000`: Route high-value items to True channel
- `$["Node Name"].user.role == "admin"`: Route admin actions to True channel
Example Output
```json
{
  "data": {},
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "if.executed"
}
```
Merge
The Merge component waits for events from all upstream nodes before forwarding a combined result downstream.
Use Cases
- Parallel processing: Wait for multiple parallel operations to complete
- Data aggregation: Combine results from multiple sources
- Synchronization: Synchronize multiple workflow branches
- Fan-in patterns: Collect outputs from multiple upstream nodes
How It Works
- The Merge component waits for events from all distinct upstream source nodes
- Once all inputs are received, it emits the combined data to the Success channel
- Optional timeout and conditional stop features allow early completion
Configuration Options
- Enable Timeout: Cancel merge after a specified time if not all inputs are received
- Enable Conditional Stop: Stop waiting early when a condition is met (e.g., if one branch fails)
Output Channels
- Success: Emitted when all upstream inputs are received
- Timeout: Emitted if the timeout is reached before all inputs are received
- Fail: Emitted if the conditional stop expression evaluates to true
Behavior
- Tracks distinct source nodes (ignoring multiple channels from the same source)
- Combines all received event data into the output
- Supports timeout to prevent indefinite waiting
- Supports conditional early stop based on expression evaluation
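The fan-in bookkeeping can be sketched as follows: the merge is ready once every distinct expected source has delivered at least one event, regardless of which channel it arrived on (the class and method names here are assumptions for illustration, not SuperPlane internals):

```python
class MergeGroup:
    """Collect one event per distinct upstream source; ready when all have arrived."""

    def __init__(self, expected_sources):
        self.expected = set(expected_sources)
        self.received = {}  # source node -> latest event data

    def deliver(self, source: str, event: dict) -> bool:
        self.received[source] = event
        return self.ready()

    def ready(self) -> bool:
        return self.expected <= set(self.received)

group = MergeGroup(["node_a", "node_b"])
print(group.deliver("node_a", {"x": 1}))  # False, still waiting for node_b
print(group.deliver("node_b", {"y": 2}))  # True, all inputs received
```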
Example Output
```json
{
  "data": {
    "eventIDs": [
      "event_1",
      "event_2"
    ],
    "groupKey": "merge-group-123",
    "sources": [
      "node_a",
      "node_b"
    ],
    "stopEarly": false
  },
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "merge.finished"
}
```
No Operation
The No Operation component is a pass-through component that forwards events to downstream nodes without any modification or processing.
Use Cases
- Testing workflows: Use this component to test workflow connections and flow without side effects
- Placeholder nodes: Temporarily replace components during workflow development
- Event forwarding: Simply forward events when no processing is needed
Behavior
When executed, the No Operation component immediately emits the incoming event data to the default output channel without any transformation. It has no configuration options and requires no setup.
Example Output
```json
{
  "data": {},
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "noop.finished"
}
```
Read Memory
The Read Memory component looks up values from canvas-level memory storage.
Use Cases
- Retrieve previously stored IDs before cleanup actions
- Check whether related data already exists
- Rehydrate context from prior runs
How It Works
- Reads `namespace`, `resultMode`, `emitMode`, and `matchList` from configuration
- Finds memory rows matching all configured key/value pairs
- Emits `memory.read` to the `found` or `notFound` channel
Output Channels
- Found: At least one matching memory row was found
- Not Found: No matching memory rows were found
Example Output
```json
{
  "data": {
    "data": {
      "count": 1,
      "emitMode": "allAtOnce",
      "matches": {
        "creator": "igor",
        "pull_request": 123
      },
      "namespace": "machines",
      "resultMode": "latest",
      "values": [
        {
          "creator": "igor",
          "pull_request": 123,
          "sandbox_id": "sbx-001"
        }
      ]
    }
  },
  "timestamp": "2026-02-28T00:00:00Z",
  "type": "memory.read"
}
```
Send Email Notification
The Send Email Notification component sends emails through the system’s configured email provider (Resend or SMTP) without requiring a separate integration setup.
Use Cases
- Notifications: Send email notifications for workflow events
- Alerts: Email alerts for errors or important conditions
- Status updates: Notify stakeholders about workflow progress
- User communications: Send emails to users as part of automated workflows
Recipients
Select recipients from your organization’s users, groups, or roles. The system resolves the actual email addresses at send time.
Configuration
- Recipients: List of users, groups, or roles
- Subject: Email subject line (supports expressions)
- Body: Email body content (supports expressions)
Output
Emits the list of recipients and the subject to the default output channel.
Example Output
```json
{
  "data": {
    "groups": [],
    "roles": [],
    "subject": "Deployment completed",
    "to": [
      "alice@example.com",
      "bob@example.com"
    ]
  },
  "timestamp": "2026-03-19T12:00:00.000000000Z",
  "type": "sendEmail.sent"
}
```
SSH Command
Run one or more commands on a remote host via SSH.
Authentication
Choose SSH key or Password, then select the organization Secret and the key name within that secret that holds the credential.
- SSH key: Secret key containing the private key (PEM/OpenSSH). Optionally, a second secret and key for the passphrase if the private key is encrypted.
- Password: Secret key containing the password.
Configuration
- Host, Port (default 22), Username: Connection details.
- Commands: One or more commands to run, one per line (supports expressions). The output payload is based on the last command.
- Working directory: Optional; changes to this directory before running the commands.
- Environment variables: Optional list of key/value pairs available during command execution.
- Timeout (seconds): How long the command may run (default 60).
- Connection retry (optional): Enable to retry connecting when the host is not reachable yet (e.g. server still booting). Set number of retries and interval between attempts.
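The connection retry option can be pictured as a simple loop: try to connect, and on failure wait for the configured interval before the next attempt. An illustrative sketch under stated assumptions (`connect` stands in for establishing the SSH session; the function name is hypothetical):

```python
import time

def connect_with_retries(connect, retries: int, interval: float, sleep=time.sleep):
    """Try `connect` up to retries+1 times, sleeping `interval` seconds between attempts."""
    for attempt in range(retries + 1):
        try:
            return connect()
        except ConnectionError:
            if attempt == retries:
                raise  # out of attempts; surface the error
            sleep(interval)

# Example: a host that becomes reachable on the third attempt.
attempts = {"n": 0}
def flaky_connect():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("host still booting")
    return "session"

print(connect_with_retries(flaky_connect, retries=5, interval=1, sleep=lambda s: None))  # session
```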
Output
- success: Exit code 0
- failed: Non-zero exit code
Example Output
```json
{
  "data": {
    "exitCode": 0,
    "stderr": "",
    "stdout": "Hello, World!\n"
  },
  "timestamp": "2026-01-19T12:00:00Z",
  "type": "ssh.command.executed"
}
```
Time Gate
The Time Gate component delays event processing until the next valid day and time window, with optional excluded dates.
Use Cases
- Business hours: Only process events during business hours
- Scheduled releases: Delay deployments until off-peak hours
- Holiday handling: Exclude specific dates from processing
- Time-based routing: Route events based on time of day or specific dates
Configuration
- Active Days: Days of the week when the gate can open
- Active Time: Start and end times in HH:MM-HH:MM format (24-hour)
- Timezone: Timezone offset for time calculations (default: current)
- Exclude Dates: Specific MM/DD dates that override the rules above
Behavior
- Events wait until the next valid time window is reached
- Exclude dates override the day/time rules
- Can be manually pushed through using the “Push Through” action
- Automatically schedules execution when the time window is reached
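The open/closed decision can be sketched like this, under stated assumptions: the active window is a zero-padded `HH:MM-HH:MM` string (so lexicographic comparison matches time order), and exclude dates are `MM/DD` strings that override everything else:

```python
from datetime import datetime

def gate_is_open(dt: datetime, active_days, window: str, exclude_dates=()) -> bool:
    """window is 'HH:MM-HH:MM' (24-hour); exclude_dates are 'MM/DD' strings."""
    if dt.strftime("%m/%d") in exclude_dates:
        return False  # excluded dates override the day/time rules
    start, end = window.split("-")
    return dt.strftime("%A") in active_days and start <= dt.strftime("%H:%M") < end

# Monday 2024-01-01 at 10:00, business-hours window:
print(gate_is_open(datetime(2024, 1, 1, 10, 0), {"Monday"}, "09:00-17:00"))  # True
print(gate_is_open(datetime(2024, 1, 1, 10, 0), {"Monday"}, "09:00-17:00", ("01/01",)))  # False
```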
Example Output
```json
{
  "data": {},
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "timegate.finished"
}
```
Update Memory
The Update Memory component updates matching rows in canvas-level memory storage.
Use Cases
- Patch stored records after external state changes
- Enrich existing memory rows with additional fields
- Keep identifiers and status data in sync
How It Works
- Reads `namespace`, `matchList`, and `valueList` from configuration
- Updates all matching memory rows in a single SQL operation
- Emits `memory.updated` to the `found` or `notFound` channel
Output Channels
- Found: At least one matching memory row was updated
- Not Found: No matching memory rows were updated
Example Output
```json
{
  "data": {
    "data": {
      "count": 1,
      "matches": {
        "creator": "igor",
        "pull_request": 123
      },
      "namespace": "machines",
      "updated": [
        {
          "creator": "igor",
          "pull_request": 123,
          "sandbox_id": "sbx-001",
          "status": "running",
          "updated_by": "workflow"
        }
      ],
      "values": {
        "status": "running",
        "updated_by": "workflow"
      }
    }
  },
  "timestamp": "2026-02-28T00:00:00Z",
  "type": "memory.updated"
}
```
Upsert Memory
The Upsert Memory component updates matching rows in canvas-level memory storage, and creates a new row when no matches are found.
Use Cases
- Keep one record per identifier (for example environment or pull request)
- Replace ad-hoc update-then-add branching with one component
- Persist latest status snapshots with stable matching keys
How It Works
- Reads `namespace`, `matchList`, and `valueList` from configuration
- Attempts to update all matching memory rows
- If no rows were updated, inserts a new memory row with the values
- Emits `memory.upserted` to the default channel with `operation` set to `updated` or `created`
Simplified Matching
If `matchList` is empty, the component treats the namespace as a singleton record and upserts at the namespace level.
This lets you store just one field (for example `value`) without extra marker fields.
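The update-or-create decision can be sketched with an in-memory stand-in for the canvas store (assumed structure, not the actual implementation). Note that an empty `matches` dict matches every row, which is consistent with the singleton behavior described above:

```python
def upsert(rows: list, matches: dict, values: dict) -> str:
    """Update every row matching all key/value pairs; insert a new row if none matched."""
    hit = [r for r in rows if all(r.get(k) == v for k, v in matches.items())]
    if hit:
        for row in hit:
            row.update(values)
        return "updated"
    rows.append({**matches, **values})
    return "created"

store = []
print(upsert(store, {"environment": "production"}, {"latest_deployment": "v1.0.0"}))  # created
print(upsert(store, {"environment": "production"}, {"latest_deployment": "v1.0.1"}))  # updated
print(store)  # [{'environment': 'production', 'latest_deployment': 'v1.0.1'}]
```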
Output
Always emits to the default channel. Check `data.operation` to know whether the component updated existing rows or created a new row.
Example Output
```json
{
  "data": {
    "data": {
      "count": 1,
      "matches": {
        "environment": "production",
        "latest_deployment_source": "manual_run"
      },
      "namespace": "deployments",
      "operation": "updated",
      "records": [
        {
          "environment": "production",
          "latest_deployment": "v1.0.1",
          "latest_deployment_source": "manual_run"
        }
      ],
      "values": {
        "environment": "production",
        "latest_deployment": "v1.0.1",
        "latest_deployment_source": "manual_run"
      }
    }
  },
  "timestamp": "2026-02-28T00:00:00Z",
  "type": "memory.upserted"
}
```
Wait
The Wait component pauses workflow execution for a specified duration or until a specific time is reached.
Use Cases
- Rate limiting: Add delays between API calls
- Scheduled execution: Wait until a specific time before proceeding
- Retry delays: Wait before retrying failed operations
- Time-based workflows: Delay processing until a specific date/time
Wait Modes
- Interval: Wait for a fixed duration (seconds, minutes, or hours)
  - Supports expressions for dynamic wait times
  - Example: `{{$.retry_delay}}` or `{{$.status == "urgent" ? 0 : 30}}`
- Countdown: Wait until a specific date/time is reached
  - Supports ISO 8601 date formats
  - Supports expressions for dynamic target times
  - Example: `{{$.release_date}}` or `{{$.run_time + duration("48h")}}`
Behavior
- Execution pauses until the wait period completes
- Can be manually pushed through using the “Push Through” action
- Automatically resumes when the wait time expires
- Emits metadata including start time, finish time, and result
Output
The component emits a payload with:
- started_at: When the wait began
- finished_at: When the wait completed
- result: Completion status (completed, cancelled)
- reason: How it completed (timeout, manual_override, user_cancel)
Example Output
```json
{
  "data": {
    "actor": {
      "display_name": "Alex Doe",
      "email": "alex@example.com"
    },
    "finished_at": "2024-01-01T12:05:00Z",
    "reason": "timeout",
    "result": "completed",
    "started_at": "2024-01-01T12:00:00Z"
  },
  "timestamp": "2026-01-16T17:56:16.680755501Z",
  "type": "wait.finished"
}
```