Scheduled Slack Digest
Every weekday morning at 9 AM, this workflow queries a data store table for the records added in the last 24 hours, aggregates them, formats a Slack-friendly summary, and posts it to a channel. Use it for a daily ops digest, sales lead summary, error count digest, or any “what happened yesterday” report.
Workflow JSON
The recipe assumes the data store table stores records with a top-level `createdAt` field (an ISO 8601 string) plus `status` and `amount` fields on each value. Adjust to match your table’s schema.
```json
{
  "name": "Daily Slack Digest",
  "initial": { "tableName": "leads" },
  "steps": [
    {
      "stepId": "yesterday",
      "stepType": "datetime.manipulate",
      "input": {
        "operation": "subtract",
        "dateTime": "now",
        "amount": 1,
        "unit": "days"
      }
    },
    {
      "stepId": "load-yesterday",
      "stepType": "data-store.query",
      "input": {
        "tableName": "{{ initial.tableName }}",
        "filters": [
          { "field": "createdAt", "operator": "gte", "value": "{{ yesterday.result }}" }
        ],
        "filterMode": "all",
        "limit": 1000
      }
    },
    {
      "stepId": "aggregate",
      "stepType": "data.reduce",
      "input": {
        "items": "{{ load-yesterday.entries }}",
        "reduce": {
          "totalCount": { "count": {} },
          "newCount": { "count": { "if": { "==": ["{{ $item.value.status }}", "new"] } } },
          "qualifiedCount": { "count": { "if": { "==": ["{{ $item.value.status }}", "qualified"] } } },
          "totalValue": { "sum": "{{ $item.value.amount }}" }
        }
      }
    },
    {
      "stepId": "format-message",
      "stepType": "core.set-variable",
      "input": {
        "messageText": "*Daily digest for {{ initial.tableName }}*\n\n• Total new records: *{{ aggregate.results.totalCount }}*\n• Status `new`: {{ aggregate.results.newCount }}\n• Status `qualified`: {{ aggregate.results.qualifiedCount }}\n• Total value: ${{ aggregate.results.totalValue }}\n\n_As of {{ yesterday.result }}_"
      }
    },
    {
      "stepId": "post-to-slack",
      "stepType": "core.http",
      "input": {
        "url": "{{ $env.SLACK_WEBHOOK_URL }}",
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": { "text": "{{ $vars.messageText }}" }
      }
    }
  ]
}
```

Connections needed: none.
Environment variable:
| Variable | Value |
|---|---|
| `SLACK_WEBHOOK_URL` | Your Slack incoming webhook URL (e.g. `https://hooks.slack.com/services/T.../B.../...`) |
Add it to your workflow’s environment in Environments. No Slack OAuth required — Slack’s incoming webhooks are unauthenticated URLs you store in an env var.
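To sanity-check the webhook before wiring up the workflow, you can POST a minimal body to it with any HTTP client; for a basic incoming-webhook message, `text` is the only field Slack requires:

```json
{ "text": "Hello from the digest workflow" }
```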
Trigger: add a Schedule trigger with:
| Field | Value |
|---|---|
| Cron Expression | 0 9 * * 1-5 (every weekday at 9 AM) |
| Timezone | Your local timezone, e.g. America/New_York |
| Initial Data | { "tableName": "leads" } |
The tableName in initial data lets you reuse the same workflow for multiple tables by creating multiple schedule triggers with different table names.
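For example, a second schedule trigger could run the same workflow against a different table by changing only its Initial Data (the `orders` name here is a placeholder; use a table that actually exists in your data store):

```json
{ "tableName": "orders" }
```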
How it works
Five steps:
- Compute yesterday’s cutoff — the `datetime.manipulate` step subtracts 24 hours from `now` to produce a clean ISO timestamp.
- Query yesterday’s rows — `data-store.query` filtered by `createdAt >= yesterday`.
- Aggregate — `data.reduce` rolls the rows up into counts and a sum total.
- Format the message — `core.set-variable` builds the Slack markdown block.
- Post to Slack — HTTP POST to the Slack incoming webhook URL.
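To make the aggregation step concrete, here is an illustrative sketch (the entry shape follows the `$item.value.*` references in the workflow; the key name and all values are made-up examples, not real output). Given three entries with statuses `new`, `new`, `qualified` and amounts 100, 250, 0, `data.reduce` would yield:

```json
{
  "sampleEntry": {
    "key": "lead-123",
    "value": { "status": "new", "amount": 100, "createdAt": "2024-05-01T14:03:00Z" }
  },
  "aggregateResults": {
    "totalCount": 3,
    "newCount": 2,
    "qualifiedCount": 1,
    "totalValue": 350
  }
}
```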
What to customize first
- Change the time window. Bump the `datetime.manipulate` `amount` to `7` and `unit` to `days` for a weekly digest, or to `1`/`hours` for an hourly heartbeat.
- Add per-status grouping. Use `data.group-by` instead of `data.reduce` to break the digest down by category, region, or owner.
- Format with Slack Block Kit. The current implementation uses simple markdown text. For richer formatting (color blocks, fields, dividers), build a Block Kit JSON object and post it as `{ "blocks": [...] }` in the Slack body.
- Send to multiple channels. Wrap the post step in a `core.for-each` over an array of webhook URLs to fan out to multiple channels.
- Email instead of Slack. Swap the HTTP step for the email step (`email.send-email`) and send the digest to a distribution list.
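As a sketch of the Block Kit variant, the `post-to-slack` request body could be replaced with something like the following (the specific block layout here is an assumption; `header`, `section`, `divider`, and `context` are standard Block Kit block types):

```json
{
  "blocks": [
    { "type": "header", "text": { "type": "plain_text", "text": "Daily digest for {{ initial.tableName }}" } },
    {
      "type": "section",
      "fields": [
        { "type": "mrkdwn", "text": "*Total:* {{ aggregate.results.totalCount }}" },
        { "type": "mrkdwn", "text": "*Qualified:* {{ aggregate.results.qualifiedCount }}" }
      ]
    },
    { "type": "divider" },
    { "type": "context", "elements": [ { "type": "mrkdwn", "text": "_As of {{ yesterday.result }}_" } ] }
  ]
}
```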
Related recipes
- Lead Dedupe + Enrich Pipeline — populates the data store this digest reads from
- Webhook → Weather API — simpler webhook-driven HTTP example