Tables
Tables are lightweight, user-authored data stores that your flows can read, write, and upsert. Think of them as shared state across automations — for deduplication, lookup caches, enrichment data, or custom audit trails.
When to use a table
- Deduplication: Before creating a HubSpot contact, check a “seen emails” table so you don't double-insert.
- Lookup caches: Map Shopify SKUs to internal product IDs without making an API call per flow run.
- Enrichment: Store per-customer context (tier, owner, custom tags) that multiple flows reference via table.lookup.
- Audit trails: Append one row per flow execution with the original trigger payload, for later search.
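The deduplication pattern can be sketched as a lookup followed by a filter. Everything here except table.lookup is an assumption: the seen_emails table is hypothetical, and the exact kind and config shape of a filter step depend on your flow engine.

```json
[
  {
    "kind": "table.lookup",
    "config": { "tableSlug": "seen_emails", "column": "email" },
    "inputMapping": { "value": "{{trigger.email}}" }
  },
  {
    "kind": "filter",
    "config": { "continueIf": "{{step_1.found}} == false" }
  }
]
```

If the email has been seen before, the flow stops before the HubSpot step ever runs.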
Creating a table
- Go to /admin/tables and click + New Table.
- Pick a name (e.g. Customers) and an optional description.
- Define your columns — name, type (string/number/boolean/date/json), and whether the column is required.
- Click Create table.
You can always edit the schema later from the Schema tab. Columns can be renamed, added, or removed without losing existing row data (the data is stored as JSONB and rows retain any fields not in the current schema).
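The same table can be created programmatically via POST /connect/v1/tables (see the API reference below). The request body shape shown here is a sketch, not a confirmed contract:

```json
{
  "name": "Customers",
  "description": "One row per known customer",
  "columns": [
    { "name": "email", "type": "string", "required": true },
    { "name": "tier", "type": "string", "required": false },
    { "name": "signupDate", "type": "date", "required": false }
  ]
}
```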
Working with rows
From the admin UI
On the table detail page, use the Rows tab to add rows manually. Each column gets a typed input (number pickers for numeric columns, dropdowns for booleans, date pickers for dates). The external_key field is optional but enables idempotent upserts from flows.
From a flow step
Tables have three step kinds in the flow engine:
- table.lookup: Find a row by column value. Config: {tableSlug, column}. Input: {value}. Output: {found, row: {id, externalKey, ...data} | null}.
- table.upsert: Insert or update a row by external_key. Config: {tableSlug}. Input: {externalKey, data}. Output: {created: boolean, row}.
- table.write: Append a new row unconditionally. Config: {tableSlug}. Input: {data, externalKey?}. Use it for event logs, never for lookup data.
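As a sketch, a table.write step that appends one audit row per run might look like this — the flow_audit table and the {{flow.id}} template variable are hypothetical:

```json
{
  "kind": "table.write",
  "config": { "tableSlug": "flow_audit" },
  "inputMapping": {
    "data": {
      "flowId": "{{flow.id}}",
      "payload": "{{trigger}}"
    }
  }
}
```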
```json
{
  "kind": "table.lookup",
  "config": { "tableSlug": "customers", "column": "email" },
  "inputMapping": { "value": "{{trigger.email}}" }
}
```

```json
{
  "kind": "table.upsert",
  "config": { "tableSlug": "customers" },
  "inputMapping": {
    "externalKey": "{{trigger.email}}",
    "data": {
      "email": "{{trigger.email}}",
      "firstSeen": "{{trigger.created_at}}",
      "source": "meta_lead_ads"
    }
  }
}
```

Referencing table output downstream
All three step kinds flatten row data onto the output, so the next step can reference any column directly:
```
{{step_1.row.email}}     // email column from the looked-up row
{{step_1.row.firstSeen}} // any JSONB field works
{{step_1.found}}         // boolean; pair with a filter step
{{step_2.created}}       // whether the upsert created a new row or updated one
```

CSV import / export
The Import / Export tab handles bulk data. For imports, paste a CSV with a header row. If the header includes a column named external_key or id, those values become upsert keys — existing rows with a matching key are updated in place. Rows without a key are always inserted fresh.
Exports dump every row in the table as CSV with external_key as the first column followed by every schema column, in schema order. Fields with commas or newlines are quoted and internal quotes are escaped.
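For instance, an import for a hypothetical customers table with email, tier, and notes columns could look like this. The first data row updates in place if a row with that external_key already exists; fields containing commas are quoted per the export rules above:

```csv
external_key,email,tier,notes
alice@example.com,alice@example.com,pro,"Prefers email, not phone"
bob@example.com,bob@example.com,free,New signup
```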
API reference
```
GET    /connect/v1/tables                   # tables.view
POST   /connect/v1/tables                   # tables.write
GET    /connect/v1/tables/:slug             # tables.view
PATCH  /connect/v1/tables/:slug             # tables.write
DELETE /connect/v1/tables/:slug             # tables.write
GET    /connect/v1/tables/:slug/rows        # tables.view
POST   /connect/v1/tables/:slug/rows        # tables.write
POST   /connect/v1/tables/:slug/rows/upsert # tables.write
GET    /connect/v1/tables/:slug/rows/lookup # tables.view
PATCH  /connect/v1/tables/:slug/rows/:rowId # tables.write
DELETE /connect/v1/tables/:slug/rows/:rowId # tables.write
POST   /connect/v1/tables/:slug/import      # tables.write
GET    /connect/v1/tables/:slug/export      # tables.view
```

Plan limits
Tables are plan-gated by total row count across every table in your workspace:
- Free: 500 rows
- Starter: 5,000 rows
- Professional: 50,000 rows
- Team: 250,000 rows
- Enterprise: unlimited
Once the row limit is reached, any request that would create a new row returns HTTP 403. Upserts that match an existing row still succeed, since they don't add rows. Your current usage is shown on the Billing page.