Splitout Code Automation Scheduled – Business Process Automation | Complete n8n Scheduled Guide (Intermediate)
This article provides a complete, practical walkthrough of the Splitout Code Automation Scheduled n8n agent. It connects HTTP Request and Webhook in a compact workflow. Expect an Intermediate setup taking 15–45 minutes. One‑time purchase: €29.
What This Agent Does
This agent orchestrates a reliable automation between HTTP Request and Webhook, handling triggers, data enrichment, and delivery with guardrails for errors and rate limits.
It streamlines multi‑step processes that would otherwise require manual exports, spreadsheet cleanup, and repeated API requests. By centralizing logic in n8n, it reduces context switching, lowers error rates, and ensures consistent results across teams.
Typical outcomes include faster lead handoffs, automated notifications, accurate data synchronization, and better visibility via execution logs and optional Slack/Email alerts.
How It Works
The workflow uses standard n8n building blocks like Webhook or Schedule triggers, HTTP Request for API calls, and control nodes (IF, Merge, Set) to validate inputs, branch on conditions, and format outputs. Retries and timeouts improve resilience, while credentials keep secrets safe.
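As an illustration of that validate, branch, and format pattern, here is a minimal sketch in plain JavaScript, the language n8n's Code node uses. The field names (email, source) are assumptions for the example, not the agent's actual schema:

```javascript
// Sketch of the validate → branch → format pattern that IF and Set
// nodes implement, expressed as plain functions. Field names are
// illustrative assumptions, not this agent's real schema.

// Mimics an IF node's condition: accept only items with an email.
function isValid(item) {
  return Boolean(item && item.email && item.email.includes("@"));
}

// Mimics a Set node's assignments: normalize fields for output.
function formatItem(item) {
  return {
    email: item.email.trim().toLowerCase(),
    source: item.source || "webhook",
  };
}

// Route each item down the "true" (accepted) or "false" (rejected) branch.
function processPayload(items) {
  const accepted = [];
  const rejected = [];
  for (const item of items) {
    if (isValid(item)) accepted.push(formatItem(item));
    else rejected.push(item);
  }
  return { accepted, rejected };
}
```

In a real workflow the same logic lives in dedicated IF and Set nodes, which keeps each step visible in the node graph.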
Third‑Party Integrations
- HTTP Request
- Webhook
Import and Use in n8n
- Open n8n and create a new workflow or collection.
- Choose Import from File or Paste JSON.
- Paste the JSON below, then click Import.
Automate Trustpilot Review Scraping with n8n to Google Sheets
Third-Party APIs Used
- Trustpilot (scraped via HTTP request; technically not an official API)
- Google Sheets API
- HelpfulCrowd (for formatting reviews for import)
Manually checking online reviews is a task no busy marketer or business owner should have to do. Thanks to automation tools like n8n, gathering and processing customer feedback from sites like Trustpilot can happen at the click of a button, or better yet, on a schedule.
In this article, we explore how a workflow built in n8n automatically scrapes Trustpilot customer reviews, saves them to Google Sheets, and supports additional formatting for platforms like HelpfulCrowd. Let's break down the mechanics and value of this automation.
Overview of the Automation
This n8n workflow automates the process of:
1. Loading a specific Trustpilot company page.
2. Scraping reviews across multiple pagination levels.
3. Extracting key data such as review title, content, author, star rating, and publishing date.
4. Saving this structured data into Google Sheets for tracking, analysis, or public display.
5. Formatting a secondary dataset of reviews tailored for import into platforms like HelpfulCrowd.
Workflow Trigger Options
The workflow can be initiated in two primary ways:
- Manually, using the Manual Trigger node during testing or on-demand execution.
- Automatically, using a Schedule Trigger to run the process at defined intervals (e.g., daily, weekly).
This flexible setup means you can either control the refresh manually or let it run unattended.
Step 1: Setting Up Parameters
All input variables, such as the company's Trustpilot subdomain and the number of review pages to scrape, are defined early on in a Set node labeled "Global". For example:
- company_id: "n8n.io"
- max_page: 100
You can point this at your own company's Trustpilot URL by editing the node. This design allows rapid changes without touching the core logic of the workflow.
Step 2: Scraping Trustpilot Reviews
Using an HTTP Request node, the workflow accesses the target Trustpilot page. Pagination is handled dynamically, with a configured limit and an interval between requests to avoid triggering Trustpilot's defenses. Because Trustpilot uses modern client-side rendering, traditional scraping methods don't work easily. Instead, the workflow extracts the embedded JSON from the __NEXT_DATA__ script tag on each page, inside a custom Code node that uses the cheerio library to parse the HTML.
Step 3: Parsing the Data
The extracted review data is flattened and enriched in another JavaScript Code node. Each review is formatted into a structured object with fields such as:
- Date
- Author
- Body (review content)
- Heading (title)
- Rating (1–5 stars)
- Country/Location
Step 4: Handling Outputs: Two Sheets, One Source
The workflow prepares two distinct sets of review data.
1. General Sheet: A "General edits" node standardizes fields and feeds the formatted items into the "General sheet" node, updating a Google Sheet named "trustpilot" used for broad review analysis.
Updates are matched on a unique review_id field to avoid duplicate entries.
2. HelpfulCrowd Integration: For users importing reviews into HelpfulCrowd, the workflow provides enhanced formatting, including product_id, export status, and verified-purchase checks, in a separate set of assignments called "HelpfulCrowd edits". These are then appended or updated into a second Google Sheet tab named "helpfulcrowd".
The mapping and schema for both sheets are predefined in the workflow, and all review entries are matched via review_id to allow future updates without duplication.
User Notes and Configuration Tips
The workflow includes sticky notes offering quick-start tips:
- How to customize the company ID and page-count limit.
- A link to clone a preconfigured spreadsheet to get started quickly.
- Documentation links for HelpfulCrowd review formatting.
Real-World Use Cases
Here are a few scenarios where this automation shines:
- eCommerce brands syncing third-party reviews into their CMS or storefront.
- Customer experience managers tracking review sentiment over time.
- Marketing teams building testimonial or user-generated-content pipelines.
- Agencies offering reputation management as a service.
Future Enhancements
This workflow is modular and extensible. Some ideas for future upgrades:
- Integrate with sentiment-analysis APIs (e.g., Google Natural Language).
- Send new 1-star reviews as Slack alerts.
- Create email digests summarizing weekly review performance.
- Pull product-specific reviews using trustpilot.com/subdomain/product pages.
Conclusion
This n8n workflow demonstrates how powerful automation can be when paired with practical business needs. In very little time, you can turn publicly visible reviews into structured, analyzable data, directly in Google Sheets or third-party systems.
By automating the review scraping process with n8n, you not only save hours each week but also empower your business with data you can act on. Ready to get started? Clone the spreadsheet template, plug in your company details, and hit "Test Workflow"!
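The scrape, flatten, and match-by-review_id steps described above can be sketched as a small pipeline. The real workflow parses pages with cheerio inside a Code node and writes to Google Sheets; this dependency-free sketch uses a regex for the __NEXT_DATA__ extraction, and the JSON path under props.pageProps is an assumption about the page's shape, not Trustpilot's documented structure:

```javascript
// Sketch of the scrape → flatten → upsert pipeline. The workflow
// itself uses cheerio and the Google Sheets node; this version uses
// a regex and a Map so it runs standalone. The reviews path below
// is an assumed shape, not Trustpilot's actual schema.

// Pull the embedded JSON out of the __NEXT_DATA__ script tag.
function extractNextData(html) {
  const match = html.match(
    /<script id="__NEXT_DATA__"[^>]*>([\s\S]*?)<\/script>/
  );
  return match ? JSON.parse(match[1]) : null;
}

// Flatten raw review objects into the sheet's column shape.
function flattenReviews(nextData) {
  const reviews = nextData?.props?.pageProps?.reviews ?? []; // assumed path
  return reviews.map((r) => ({
    review_id: r.id,
    date: r.dates?.publishedDate,
    author: r.consumer?.displayName,
    heading: r.title,
    body: r.text,
    rating: r.rating,
  }));
}

// Merge new rows into existing ones, keyed on review_id, so re-runs
// update entries instead of duplicating them.
function upsertByReviewId(existingRows, newRows) {
  const byId = new Map(existingRows.map((row) => [row.review_id, row]));
  for (const row of newRows) {
    byId.set(row.review_id, { ...byId.get(row.review_id), ...row });
  }
  return [...byId.values()];
}
```

In the workflow itself, the upsert step is handled by the Google Sheets node's append-or-update mode with review_id as the matching column.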
- Set credentials for each API node (keys, OAuth) in Credentials.
- Run a test via Execute Workflow. Inspect Run Data, then adjust parameters.
- Enable the workflow to run on schedule, webhook, or triggers as configured.
Tips: keep secrets in credentials, add retries and timeouts on HTTP nodes, implement error notifications, and paginate large API fetches.
Validation: use IF/Code nodes to sanitize inputs and guard against empty payloads.
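A minimal guard of that kind, written as it might appear in an n8n Code node (the error message and field checks are illustrative):

```javascript
// Sketch: reject empty or malformed payloads early, before downstream
// nodes run. Throwing stops the execution and surfaces the failure on
// the Error Trigger path.

function assertNonEmptyPayload(items) {
  if (!Array.isArray(items) || items.length === 0) {
    throw new Error("Empty payload: nothing to process");
  }
  // Drop items that are not non-empty objects.
  return items.filter(
    (item) => item && typeof item === "object" && Object.keys(item).length > 0
  );
}
```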
Why Automate This with AI Agents
AI‑assisted automations offload repetitive, error‑prone tasks to a predictable workflow. Instead of manual copy‑paste and ad‑hoc scripts, your team gets a governed pipeline with versioned state, auditability, and observable runs.
n8n’s node graph makes data flow transparent while AI‑powered enrichment (classification, extraction, summarization) boosts throughput and consistency. Teams reclaim time, reduce operational costs, and standardize best practices without sacrificing flexibility.
Compared to one‑off integrations, an AI agent is easier to extend: swap APIs, add filters, or bolt on notifications without rewriting everything. You get reliability, control, and a faster path from idea to production.
Best Practices
- Credentials: restrict scopes and rotate tokens regularly.
- Resilience: configure retries, timeouts, and backoff for API nodes.
- Data Quality: validate inputs; normalize fields early to reduce downstream branching.
- Performance: batch records and paginate for large datasets.
- Observability: add failure alerts (Email/Slack) and persistent logs for auditing.
- Security: avoid sensitive data in logs; use environment variables and n8n credentials.
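The resilience practices above can be sketched as a small retry-with-backoff helper. Note that n8n's HTTP Request node has built-in retry and timeout settings, so this is only an illustration of the pattern, not something the workflow requires:

```javascript
// Sketch: retry a flaky async call with exponential backoff.
// n8n's HTTP Request node exposes retry/timeout options natively;
// this helper just illustrates the underlying idea.

async function withRetries(fn, { attempts = 3, baseDelayMs = 200 } = {}) {
  let lastError;
  for (let attempt = 0; attempt < attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      const delay = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```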
FAQs
Can I swap integrations later? Yes. Replace or add nodes and re‑map fields without rebuilding the whole flow.
How do I monitor failures? Use Execution logs and add notifications on the Error Trigger path.
Does it scale? Use queues, batching, and sub‑workflows to split responsibilities and control load.
Is my data safe? Keep secrets in Credentials, restrict token scopes, and review access logs.