Scheduled Slack Digest

Every weekday morning at 9 AM, this workflow queries a data store table for the records added in the last 24 hours, aggregates them, formats a Slack-friendly summary, and posts it to a channel. Use it for a daily ops digest, sales lead summary, error count digest, or any “what happened yesterday” report.

The recipe assumes each record in the data store table has a top-level createdAt field (an ISO 8601 string) and status and amount fields on its value. Adjust to match your table’s schema.
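For concreteness, a record shaped like that assumption might look as follows (the key and all values are illustrative, not part of the recipe):

```python
# Illustrative record matching the assumed schema: a top-level createdAt
# (ISO 8601 string), plus status and amount fields on the value.
sample_record = {
    "key": "lead-0042",  # hypothetical record key
    "value": {
        "status": "new",      # e.g. "new" or "qualified"
        "amount": 1250.0,     # numeric value summed by the digest
    },
    "createdAt": "2024-05-01T14:32:00+00:00",
}
```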

scheduled-slack-digest.json
{
  "name": "Daily Slack Digest",
  "initial": {
    "tableName": "leads"
  },
  "steps": [
    {
      "stepId": "yesterday",
      "stepType": "datetime.manipulate",
      "input": {
        "operation": "subtract",
        "dateTime": "now",
        "amount": 1,
        "unit": "days"
      }
    },
    {
      "stepId": "load-yesterday",
      "stepType": "data-store.query",
      "input": {
        "tableName": "{{ initial.tableName }}",
        "filters": [
          {
            "field": "createdAt",
            "operator": "gte",
            "value": "{{ yesterday.result }}"
          }
        ],
        "filterMode": "all",
        "limit": 1000
      }
    },
    {
      "stepId": "aggregate",
      "stepType": "data.reduce",
      "input": {
        "items": "{{ load-yesterday.entries }}",
        "reduce": {
          "totalCount": {
            "count": {}
          },
          "newCount": {
            "count": {
              "if": { "==": ["{{ $item.value.status }}", "new"] }
            }
          },
          "qualifiedCount": {
            "count": {
              "if": { "==": ["{{ $item.value.status }}", "qualified"] }
            }
          },
          "totalValue": {
            "sum": "{{ $item.value.amount }}"
          }
        }
      }
    },
    {
      "stepId": "format-message",
      "stepType": "core.set-variable",
      "input": {
        "messageText": "*Daily digest for {{ initial.tableName }}*\n\n• Total new records: *{{ aggregate.results.totalCount }}*\n• Status `new`: {{ aggregate.results.newCount }}\n• Status `qualified`: {{ aggregate.results.qualifiedCount }}\n• Total value: ${{ aggregate.results.totalValue }}\n\n_As of {{ yesterday.result }}_"
      }
    },
    {
      "stepId": "post-to-slack",
      "stepType": "core.http",
      "input": {
        "url": "{{ $env.SLACK_WEBHOOK_URL }}",
        "method": "POST",
        "headers": {
          "Content-Type": "application/json"
        },
        "body": {
          "text": "{{ $vars.messageText }}"
        }
      }
    }
  ]
}

Connections needed: none.

Environment variable:

| Variable | Value |
| --- | --- |
| SLACK_WEBHOOK_URL | Your Slack incoming webhook URL (e.g. https://hooks.slack.com/services/T.../B.../...) |

Add it to your workflow’s environment in Environments. No Slack OAuth is required — an incoming webhook URL carries its own secret, so you only need to store it in an env var.
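To verify the webhook outside the workflow, a minimal sketch using only the Python standard library (the function names here are invented for illustration; the payload shape `{"text": "..."}` is what Slack incoming webhooks accept):

```python
import json
import os
import urllib.request

def build_slack_payload(text: str) -> bytes:
    """Build the JSON body a Slack incoming webhook expects: {"text": "..."}."""
    return json.dumps({"text": text}).encode("utf-8")

def post_digest(text: str) -> None:
    """POST the digest to the webhook URL stored in SLACK_WEBHOOK_URL."""
    req = urllib.request.Request(
        os.environ["SLACK_WEBHOOK_URL"],
        data=build_slack_payload(text),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()  # Slack replies with the body "ok" on success
```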

Trigger: add a Schedule trigger with:

| Field | Value |
| --- | --- |
| Cron Expression | 0 9 * * 1-5 (every weekday at 9 AM) |
| Timezone | Your local timezone, e.g. America/New_York |
| Initial Data | { "tableName": "leads" } |

The tableName in initial data lets you reuse the same workflow for multiple tables by creating multiple schedule triggers with different table names.
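Reading the cron expression field by field: 0 9 * * 1-5 means minute 0, hour 9, any day of month, any month, Monday through Friday. A quick sanity check of that schedule in plain Python (the fires_at helper is invented for illustration):

```python
from datetime import datetime

def fires_at(dt: datetime) -> bool:
    """True when 0 9 * * 1-5 would fire: 09:00 sharp, Monday through Friday."""
    # datetime.weekday(): Monday == 0 ... Sunday == 6
    return dt.weekday() < 5 and dt.hour == 9 and dt.minute == 0
```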

Five steps:

  1. Compute yesterday’s cutoff — the datetime.manipulate step subtracts one day from now to produce a clean ISO timestamp.
  2. Query yesterday’s rows — data-store.query filtered by createdAt >= yesterday.
  3. Aggregate — data.reduce rolls the rows up into counts and a sum total.
  4. Format the message — core.set-variable builds the Slack markdown block.
  5. Post to Slack — HTTP POST to the Slack incoming webhook URL.
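Under the same schema assumptions, steps 1–4 boil down to date math, a filter, and a fold. A plain-Python sketch (the build_digest name and its signature are invented for illustration):

```python
from datetime import datetime, timedelta, timezone

def build_digest(entries: list, table_name: str, now: datetime) -> str:
    """Mirror steps 1-4: compute the cutoff, filter, aggregate, format."""
    cutoff = now - timedelta(days=1)  # step 1: yesterday's cutoff
    # step 2: keep records created in the last 24 hours
    recent = [
        e for e in entries
        if datetime.fromisoformat(e["createdAt"]) >= cutoff
    ]
    # step 3: roll the rows up into counts and a sum, like data.reduce
    total_count = len(recent)
    new_count = sum(1 for e in recent if e["value"]["status"] == "new")
    qualified_count = sum(1 for e in recent if e["value"]["status"] == "qualified")
    total_value = sum(e["value"]["amount"] for e in recent)
    # step 4: the Slack-flavored markdown summary
    return (
        f"*Daily digest for {table_name}*\n\n"
        f"• Total new records: *{total_count}*\n"
        f"• Status `new`: {new_count}\n"
        f"• Status `qualified`: {qualified_count}\n"
        f"• Total value: ${total_value}\n\n"
        f"_As of {cutoff.isoformat()}_"
    )
```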
Variations:

  • Change the time window. Bump the datetime.manipulate amount to 7 with unit days for a weekly digest, or use amount 1 with unit hours for an hourly heartbeat.
  • Add per-status grouping. Use data.group-by instead of data.reduce to break the digest down by category, region, or owner.
  • Format with Slack Block Kit. The current implementation uses simple markdown text. For richer formatting (color blocks, fields, dividers), build a Block Kit JSON object and post it as { "blocks": [...] } in the Slack body.
  • Send to multiple channels. Wrap the post step in a core.for-each over an array of webhook URLs to fan out to multiple channels.
  • Email instead of Slack. Swap the HTTP step for the email step (email.send-email) and send the digest to a distribution list.
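For the Block Kit variation, the webhook body swaps "text" for "blocks". A minimal sketch of building that payload (the header, section, and divider block types are standard Block Kit; the helper name is invented):

```python
import json

def build_blocks_payload(table_name: str, total: int, total_value: float) -> str:
    """Build a minimal Block Kit body: a header, a markdown section, a divider."""
    payload = {
        "blocks": [
            {
                "type": "header",
                "text": {"type": "plain_text", "text": f"Daily digest for {table_name}"},
            },
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": f"*{total}* new records, ${total_value} total value",
                },
            },
            {"type": "divider"},
        ]
    }
    return json.dumps(payload)
```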