Webhook → AI URL Summary
A webhook URL that takes { "url": "https://..." }, fetches the page, runs an LLM Call in structured output mode to extract a typed summary, and returns clean JSON. AI-powered Cliffs Notes for any URL — drop it into a Slack bot, a browser extension, or a Notion automation, and you get back structured data you can pipe into anything.
Workflow JSON
Before pasting, swap the `"connection": "my-anthropic"` placeholder for the name of your actual Anthropic (or OpenAI / Google) connection.
```json
{
  "name": "URL Summary API",
  "initial": { "url": "https://example.com" },
  "steps": [
    {
      "stepId": "fetch-page",
      "stepType": "core.http",
      "input": {
        "url": "{{ initial.url }}",
        "method": "GET",
        "retry": { "limit": 3 }
      }
    },
    {
      "stepId": "summarize",
      "stepType": "ai.llm-call",
      "input": {
        "provider": {
          "provider": "anthropic",
          "connection": "my-anthropic",
          "model": "claude-sonnet-4-5-20250929"
        },
        "mode": {
          "mode": "structured_output",
          "system_prompt": "You are a concise content analyst. Extract a clean structured summary from the HTML or text the user gives you. Be terse and accurate.",
          "user_prompt": "Summarize the following content from {{ initial.url }}:\n\n{{ fetch-page.text }}",
          "output_schema": {
            "title": "string: The page's title",
            "summary": "string: A 2-3 sentence summary",
            "topics": "string[]: A short list of topical tags",
            "sentiment": "string: One of 'positive', 'neutral', 'critical', 'mixed'",
            "wordCount": "number: Approximate word count of the original page"
          }
        }
      }
    },
    {
      "stepId": "respond",
      "stepType": "core.return",
      "input": {
        "webhookResponse": {
          "statusCode": 200,
          "body": "{{ summarize.object }}"
        }
      }
    }
  ]
}
```

Connection needed: an Anthropic, OpenAI, or Google API key connection. Anthropic Claude is the default in the recipe — swap the connection name and provider if you use OpenAI or Google.
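The `output_schema` above is a `"type: description"` shorthand rather than formal JSON Schema, so a downstream consumer may still want a cheap shape check on the response. A hedged sketch of such a check; the field names simply mirror the recipe's schema, and `validate_summary` is a hypothetical helper, not part of the platform:

```python
# Minimal client-side shape check for the summarizer's response.
# Field names mirror the recipe's output_schema; adjust if you change it.
EXPECTED = {
    "title": str,
    "summary": str,
    "topics": list,
    "sentiment": str,
    "wordCount": (int, float),
}

ALLOWED_SENTIMENTS = {"positive", "neutral", "critical", "mixed"}

def validate_summary(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload looks right."""
    problems = []
    for key, typ in EXPECTED.items():
        if key not in payload:
            problems.append(f"missing key: {key}")
        elif not isinstance(payload[key], typ):
            problems.append(f"wrong type for {key}: {type(payload[key]).__name__}")
    if isinstance(payload.get("topics"), list) and not all(
        isinstance(t, str) for t in payload["topics"]
    ):
        problems.append("topics must be a list of strings")
    if payload.get("sentiment") not in ALLOWED_SENTIMENTS:
        problems.append(f"unexpected sentiment: {payload.get('sentiment')!r}")
    return problems
```

Structured output mode makes malformed JSON unlikely, but a check like this catches schema drift if you later edit the workflow without updating consumers.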
Trigger: add a Webhook trigger to the workflow with:
| Field | Value |
|---|---|
| Name | summarize-url |
| Method | POST |
| Authentication | Recommended on. Generate a webhook secret and pass it as a Bearer token. |
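Once the trigger is live, any HTTP client can call it, not just curl. A minimal Python caller as a sketch; the endpoint URL and secret are placeholders for your deployment's values, and `build_request` / `summarize` are hypothetical helper names:

```python
import json
import urllib.request

def build_request(target: str, endpoint: str, secret: str) -> urllib.request.Request:
    """Build the POST the webhook expects: JSON body {"url": ...},
    webhook secret passed as a Bearer token."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps({"url": target}).encode(),
        headers={
            "Authorization": f"Bearer {secret}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def summarize(target: str, endpoint: str, secret: str) -> dict:
    """Fire the request and parse the structured JSON reply."""
    with urllib.request.urlopen(build_request(target, endpoint, secret)) as resp:
        return json.loads(resp.read())
```

This is the shape a Slack bot or browser extension would use behind the scenes.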
How it works
Three steps:
- Fetch the page — HTTP GET against the URL the caller passed in.
- Summarize with structured output — LLM Call configured to return JSON matching a schema. Instead of asking the model for free text and parsing it, you define an output schema and the model is constrained to return JSON matching it. No regex, no JSON-extraction headaches, no hallucinated keys.
- Return the summary — webhook response with the structured object.
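The three steps boil down to fetch, constrained LLM call, respond. A rough sketch of that pipeline in plain Python, with the LLM call stubbed out (any structured-output client would slot in where `summarize_stub` sits; the function names here are illustrative, not platform APIs):

```python
import json
import urllib.request

def fetch_page(url: str) -> str:
    """Step 1: plain HTTP GET, mirroring the core.http step."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def summarize_stub(text: str, url: str) -> dict:
    """Step 2 stand-in. A real implementation would call an LLM in
    structured-output mode and get JSON matching the schema back."""
    words = text.split()
    return {
        "title": url,
        "summary": " ".join(words[:40]),
        "topics": [],
        "sentiment": "neutral",
        "wordCount": len(words),
    }

def handle_webhook(body: dict) -> dict:
    """Step 3: assemble the webhook response envelope."""
    summary = summarize_stub(fetch_page(body["url"]), body["url"])
    return {"statusCode": 200, "body": json.dumps(summary)}
```

The workflow engine handles retries, secrets, and the webhook envelope for you; the sketch just shows the data flow.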
Sample response:
```json
{
  "title": "Why writing a SaaS in Rust is harder than you think",
  "summary": "A blog post arguing that Rust's strict type system creates friction for the kind of rapid iteration most B2B SaaS startups need...",
  "topics": ["rust", "saas", "type systems", "developer productivity"],
  "sentiment": "critical",
  "wordCount": 1840
}
```

Test it
```bash
curl -X POST https://run.quickflo.app/w/@your-org/summarize-url \
  -H "Authorization: Bearer YOUR_WEBHOOK_SECRET" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://en.wikipedia.org/wiki/Workflow_engine"}'
```

What to customize first
- Strip HTML before sending to the model. Right now `fetch-page.text` includes raw HTML. Add a `set-variable` step that uses Liquid’s `strip_html` filter to clean it up first — saves tokens and improves quality.
- Cap input size. Long pages will blow past the model’s context window. Truncate `fetch-page.text` to the first ~30,000 characters with `{{ fetch-page.text | substring: 0, 30000 }}` before passing it in.
- Add a knowledge base for company-specific context. If you’re summarizing internal docs, attach a knowledge base to the LLM Call so the model has background context.
- Switch to AI Agent for multi-step extraction. If you want the model to also fetch a related URL or query a CRM as part of summarizing, swap the LLM Call for an AI Agent and define those capabilities as tools.
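The first two customizations (strip HTML, cap length) are easy to prototype outside the workflow to see what the model would actually receive. A rough Python equivalent of a `strip_html`-style filter plus a character cap; the tag-stripping regexes are a crude approximation, and 30,000 characters is the recipe's suggestion, not a hard limit:

```python
import re

def strip_html(html: str) -> str:
    """Crude tag stripper, roughly what a strip_html filter does:
    drop <script>/<style> bodies, remove remaining tags, collapse whitespace."""
    text = re.sub(r"(?is)<(script|style)\b.*?>.*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def cap_input(text: str, limit: int = 30_000) -> str:
    """Truncate to the first `limit` characters before passing to the model."""
    return text[:limit]

page = "<html><style>p{color:red}</style><p>Hello, <b>world</b></p></html>"
print(cap_input(strip_html(page)))  # → Hello, world
```

Running this against a real page quickly shows how much of `fetch-page.text` is markup rather than content, which is exactly the token savings the first bullet is after.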
Related recipes
- Webhook → Weather API — simpler 3-step webhook recipe with no LLM
- AI Post-Call Summary — a similar shape but transcribes audio first and writes the summary back to a CRM