
CLI node

A CLI node lets you run custom CLI commands as a step in your flow and pass structured output to the next nodes. The script runs in a secure, sandboxed environment. You must attach an AWS or GCP connection; the script runs with that connection's credentials, and the gcloud or aws CLI is pre-installed.

The CLI node is useful for simplifying your flows. For example, you can run the aws ec2 attach-volume command in a single CLI node. With regular AWS nodes, you would need to chain three nodes: DescribeInstances (validation), DescribeVolumes (state check), and AttachVolume.

Before you begin

Create an AWS or GCP connection before adding a CLI node. For an overview of connections, see Connections.

Configure the CLI node

Selecting a CLI node opens a side panel with a Parameters tab.

CLI node Parameters tab showing connection, provider, script button, and output referencing

  • Cloud connection provider: Select Google Cloud (GCP) or Amazon Web Services (AWS). A connection is required; the script runs with that connection's credentials.

    (AWS only) Account and Region: When the connection is Amazon Web Services (AWS), select the account and optionally the region. If not set, Region defaults to all regions.

    CLI node with AWS connection showing Account and Region fields

  • Select Add script or Edit script to open the script editor. The script runs in a sandbox. When you have selected a connection, gcloud or aws is available and authenticated for the chosen provider. Print a single JSON value to standard output (stdout); downstream nodes can then reference individual fields from that output, use it in conditions, or pass it to the next step. Producing valid JSON is how you make the CLI node's result usable in the rest of your flow.

    CLI script editor with $nodes example and completion

    You can reference output from any preceding node in the flow using $nodes["<Node name>"]. Add an optional path to reach nested values, for example $nodes["Manually start"][0].results[0].currentDate.

    # Reference previous node output in your script with $nodes["<node name>"]
    echo $nodes["Manually start"][0].results[0].currentDate
    Tip

    The script editor supports completion for $nodes. Use it to insert node names and paths.

  • Referencing the output: Defines how other nodes in your flow can reference the CLI node's output:

    • Basic referencing: The output is referenced as a single field. Use this for simple return values.

    • Advanced referencing: Define a JSON schema so that specific fields in your output can be individually referenced by downstream nodes. This works the same way as for the Code node.
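As a concrete sketch of the contract described above, a script only needs to end by printing exactly one JSON value to stdout. The variable and field names here are hypothetical; in a real script the value would typically come from a gcloud/aws command or a $nodes reference:

```shell
# Minimal sketch: compute a value, then print exactly one JSON object to stdout.
# STATUS is a stand-in for a real command result or $nodes reference.
STATUS="ok"
printf '{"status": "%s", "checked": true}\n' "$STATUS"
```

With Basic referencing, downstream nodes see this object as a single field; with Advanced referencing and a matching schema, they can reference status and checked individually.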

Configure output schema

When you select Advanced referencing, you define a JSON schema so that downstream nodes can reference specific fields in the CLI node's output. In CloudFlow, the output schema describes only the value your node returns from its code or configuration (the actual payload that other nodes will use), not the run metadata or runtime envelope (e.g. results, message, context) shown in execution logs. Put simply: define the schema for the JSON value you would get if you took the node's functional output alone as a standalone JSON document; do not model the wrapper around it.
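To illustrate the distinction, suppose an execution log shows a wrapped result like the following (the field names inside results are hypothetical):

```json
{
  "results": [
    { "bucket": "logs", "objectCount": 42 }
  ],
  "message": "OK"
}
```

Here the schema should describe only the inner object, for example { "type": "object", "properties": { "bucket": { "type": "string" }, "objectCount": { "type": "number" } } }, and not the results or message wrapper.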

When defining an output schema:

  • Match the top-level type exactly: object, array, string, number, boolean, or null.

  • Define the shape recursively:

    • If it's an object, define properties (and optionally required).
    • If it's an array, define items (the schema for each element).
  • Avoid modeling CloudFlow's execution wrappers (e.g., results, message, context). Those are part of the runtime envelope, not the node's functional output.

Example patterns:

  • Node returns an object:
{ "type": "object", "properties": { "foo": { "type": "string" } } }
  • Node returns an array:
{ "type": "array", "items": { "type": "number" } }
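For a nested return value, the two rules compose recursively. For example, if the script prints an object containing a list of buckets (the field names here are illustrative), a schema might look like:

```json
{
  "type": "object",
  "properties": {
    "project": { "type": "string" },
    "buckets": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "name": { "type": "string" },
          "sizeBytes": { "type": "number" }
        },
        "required": ["name"]
      }
    }
  },
  "required": ["project", "buckets"]
}
```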

Execution limits

The script runs in a sandboxed environment with the following limits:

  • Maximum execution time: 10 seconds. If the script exceeds this, the node fails with a timeout error.

  • Rate limit: The sandbox may enforce rate limits. If exceeded, the node fails with a rate-limit error.
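A quick way to sanity-check a script against the 10-second budget is to run it locally under coreutils' timeout (an assumption about your local environment; the sandbox enforces its own limit):

```shell
# Run a candidate script under a local 10-second cap before deploying it;
# an exit status of 124 from timeout means the script exceeded the limit.
timeout 10s sh -c 'printf "{\"quick\": true}\n"'
```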

Examples

The following script lists Google Cloud Storage buckets when a GCP connection is configured. The command outputs JSON directly, so downstream nodes can reference the array or its fields (for example with advanced referencing and a schema).

gcloud storage buckets list --format=json --limit=5

With an AWS connection and account selected, you can list S3 buckets instead:

aws s3api list-buckets --output json

To reference and filter output from an upstream node (for example, an AWS Describe instances node), use $nodes and then process the JSON. The following script reads the first result from an upstream node named Describe instances, filters EC2 instances to those in running state, and outputs a JSON array for downstream nodes. Replace the node name with your own.

# Filter running EC2 instances from upstream Describe instances node
echo '$nodes["Describe instances"][0].results[0]' | python3 -c "
import json, sys

data = json.load(sys.stdin)
instances = [
    {\"InstanceId\": i[\"InstanceId\"], \"State\": i[\"State\"][\"Name\"]}
    for r in data.get(\"Reservations\", [])
    for i in r.get(\"Instances\", [])
    if i.get(\"State\", {}).get(\"Name\") == \"running\"
]
print(json.dumps(instances))
"
Note

The node name in $nodes["Describe instances"] must match the exact name of the upstream node in your flow (including spaces and capitalization).

See also