Code Itemlists Create Scheduled – Business Process Automation | Complete n8n Scheduled Guide (Intermediate)
This article provides a complete, practical walkthrough of the Code Itemlists Create Scheduled n8n agent. It connects HTTP Request and Webhook nodes. Expect an Intermediate setup taking 15–45 minutes. One‑time purchase: €29.
What This Agent Does
This agent orchestrates a reliable automation between HTTP Request and Webhook nodes, handling triggers, data enrichment, and delivery with guardrails for errors and rate limits.
It streamlines multi‑step processes that would otherwise require manual exports, spreadsheet cleanup, and repeated API requests. By centralizing logic in n8n, it reduces context switching, lowers error rates, and ensures consistent results across teams.
Typical outcomes include faster lead handoffs, automated notifications, accurate data synchronization, and better visibility via execution logs and optional Slack/Email alerts.
How It Works
The workflow uses standard n8n building blocks like Webhook or Schedule triggers, HTTP Request for API calls, and control nodes (IF, Merge, Set) to validate inputs, branch on conditions, and format outputs. Retries and timeouts improve resilience, while credentials keep secrets safe.
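The input-validation step described above might look like the following inside an n8n Code node. This is a minimal sketch, not the agent's actual code; the field names (`email`, `payload`) are illustrative assumptions.

```javascript
// Hypothetical n8n Code-node sketch: validate and normalize incoming items
// before an IF node branches on the result. Field names are assumptions.
function validateItems(items) {
  return items.map((item) => {
    const data = item.json ?? {};
    const errors = [];
    // Basic shape checks: a plausible email and a non-empty payload.
    if (!data.email || !/^\S+@\S+\.\S+$/.test(data.email)) {
      errors.push('invalid email');
    }
    if (data.payload == null) {
      errors.push('empty payload');
    }
    return {
      json: { ...data, valid: errors.length === 0, errors },
    };
  });
}

// In a real Code node you would end with: return validateItems(items);
```

A downstream IF node can then route on `{{ $json.valid }}` to separate clean records from ones that need review.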
Third‑Party Integrations
- HTTP Request
- Webhook
Import and Use in n8n
- Open n8n and create a new workflow or collection.
- Choose Import from File or Paste JSON.
- Paste the JSON below, then click Import.
Automating n8n Workflow Backups to Google Drive with Auto-Archival and Purging

Third-Party APIs Used:
- Google Drive API (via OAuth2 credentials)
- n8n API (via API key credentials)

Managing workflow backups is a critical best practice for any automation ecosystem. If you're using n8n, a powerful and flexible workflow automation tool, consistent backups safeguard your workflow integrity and business continuity. In this article, we explore a purpose-built n8n workflow that automatically backs up your workflows nightly to Google Drive, archives older backups, and erases outdated files, all without manual intervention.

Let's break down how this automation works and why it is a must-have for n8n users with important workflows that should never be lost.

🧠 The Concept

This workflow executes a scheduled, nightly backup of all current workflows in an n8n instance. It uses Google Drive as the storage medium and follows a strict file organization protocol:

- New backups are pushed to a folder named n8n_backups
- The previous day's backups are moved to a separate folder called n8n_old
- Backups older than 30 days are purged

This ensures the latest snapshot is always available, older ones are archived, and obsolete backups don't consume unnecessary space.

📁 Folder Management: Do They Exist?
The workflow begins with a check on Google Drive to verify whether two essential folders exist: n8n_backups and n8n_old. Using the Google Drive API, the workflow searches My Drive's root directory, and a simple code block checks whether either required folder is missing. If one doesn't exist, node-based logic creates it using the Create Folder action.

💾 Backup Capture: Exporting Workflows

After ensuring the folder structure exists, the workflow retrieves all workflows from the n8n instance using the n8n API node configured with the user's API credentials. Each workflow is converted from JSON to binary data via a Move Binary Data node, renamed with a status tag (active/inactive) alongside the workflow name, and then uploaded to the n8n_backups folder on Google Drive.

📤 Workflow Storage: Uploading to Google Drive

Uploads happen on a per-file basis using batching logic (Split In Batches) so that the system remains responsive and API-limit friendly. Each file is saved with a filename that includes the workflow name and its last updatedAt timestamp.

📥 Cleaning Up the Backup Folder: Move Old Data

Before new workflows are added, the system loops through the current files in the n8n_backups folder, moving each to the n8n_old folder. This ensures new backups never overwrite old versions and historical versions remain traceable. The cleanup uses the MOVE INTO OLD FOLDER node, which calls the Google Drive API to relocate each backup file from the current folder to the archive folder.

⏳ Aging Out: Purging Backups Older than 30 Days

Backups stored in the n8n_old folder are not meant to stay forever. Using a second Schedule Trigger (set to 30-day intervals), the system lists all JSON files in the folder and deletes those beyond the age threshold. In future iterations, this logic may evolve to allow user-defined retention policies or tagging for particular backups that should never be deleted.
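Two of the checks described above, the folder-existence test and the age-threshold purge filter, can be sketched as Code-node logic. The data shapes (`item.json.name`, `item.json.modifiedTime`) are assumptions about what the preceding Google Drive list nodes return, not the workflow's actual field names.

```javascript
// Hypothetical sketches of two steps in the backup workflow.
const REQUIRED_FOLDERS = ['n8n_backups', 'n8n_old'];
const RETENTION_DAYS = 30;

// Folder check: which required folders are absent from a Drive listing?
function missingFolders(driveItems) {
  const found = new Set(driveItems.map((item) => item.json.name));
  return REQUIRED_FOLDERS.filter((name) => !found.has(name));
}

// Purge filter: which archived files fall outside the retention window?
function filesToPurge(driveItems, now = Date.now()) {
  const cutoff = now - RETENTION_DAYS * 24 * 60 * 60 * 1000;
  return driveItems.filter(
    (item) => new Date(item.json.modifiedTime).getTime() < cutoff
  );
}
```

An IF node branching on `missingFolders(...).length > 0` would trigger the Create Folder action, and the items returned by `filesToPurge` would feed the Google Drive delete node.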
🔔 Notifications and Bug Fixes

Throughout the workflow, sticky note nodes provide inline documentation, offering clarity on what each section accomplishes. These notes explain logic like folder creation, backup movement, and purging. The most recent version (v3) also includes fixes to ensure the workflow correctly processes more than 13 backup files; previous runs had batching limitations that have since been resolved. There is also a suggestion to replace Split In Batches with the more scalable Loop Over Items node in the future for better performance and control.

🎯 Prerequisites

To get started with this automatic backup workflow, you must:

- Have Google Drive credentials (OAuth2) ready
- Use n8n version 1.67.1 or later
- Generate and provide a valid n8n API key
- Create folders named "n8n_backups" and "n8n_old" in Google Drive manually if not using the auto-create functionality

📅 Customization Tips

1. Modify schedule times using the Schedule Trigger node.
2. Adjust purge duration by editing the second trigger and associated nodes from 30 days to your desired retention period.
3. Maintain your own Google Drive folder structure; just update folder IDs where necessary in the appropriate nodes.

🔚 Final Thoughts

This workflow demonstrates the power of combining n8n's low-code tools with external API integrations to fully automate backup management. It's an ideal solution for teams and solo developers who want peace of mind while building complex workflows. By automating the backup, archival, and deletion lifecycle, you reduce human error, maintain storage efficiency, and gain centralized control, all while leveraging Google Drive's robust cloud infrastructure. Need more customization or support? Contact the creator via the included sticky note or at: nuntius.creative.hub@gmail.com.
Now that backups are automated, you can focus on what matters: building workflows that bring your automations to life, worry-free. 🛠️ Happy Automating!
- Set credentials for each API node (keys, OAuth) in Credentials.
- Run a test via Execute Workflow. Inspect Run Data, then adjust parameters.
- Enable the workflow to run on schedule, webhook, or triggers as configured.
Tips: keep secrets in credentials, add retries and timeouts on HTTP nodes, implement error notifications, and paginate large API fetches.
Validation: use IF/Code nodes to sanitize inputs and guard against empty payloads.
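An empty-payload guard of the kind described above might look like this in a Code node. This is a minimal sketch under the assumption that an empty `json` object means the item is unusable; adapt the condition to your own payload shape.

```javascript
// Hypothetical guard: drop items with empty payloads and fail fast when
// nothing usable remains, so downstream nodes never see empty input.
function guardPayload(items) {
  const nonEmpty = items.filter(
    (item) => item.json && Object.keys(item.json).length > 0
  );
  if (nonEmpty.length === 0) {
    // Throwing stops this execution and surfaces on the Error Trigger path.
    throw new Error('No usable items in payload');
  }
  return nonEmpty;
}
```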
Why Automate This with AI Agents
AI‑assisted automations offload repetitive, error‑prone tasks to a predictable workflow. Instead of manual copy‑paste and ad‑hoc scripts, your team gets a governed pipeline with versioned state, auditability, and observable runs.
n8n’s node graph makes data flow transparent while AI‑powered enrichment (classification, extraction, summarization) boosts throughput and consistency. Teams reclaim time, reduce operational costs, and standardize best practices without sacrificing flexibility.
Compared to one‑off integrations, an AI agent is easier to extend: swap APIs, add filters, or bolt on notifications without rewriting everything. You get reliability, control, and a faster path from idea to production.
Best Practices
- Credentials: restrict scopes and rotate tokens regularly.
- Resilience: configure retries, timeouts, and backoff for API nodes.
- Data Quality: validate inputs; normalize fields early to reduce downstream branching.
- Performance: batch records and paginate for large datasets.
- Observability: add failure alerts (Email/Slack) and persistent logs for auditing.
- Security: avoid sensitive data in logs; use environment variables and n8n credentials.
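The batching and pagination practice above can be sketched as a generic cursor loop. `fetchPage` is a placeholder for an HTTP Request to any cursor-paged API; the `items`/`nextCursor` response shape is an assumption, so map it to whatever fields your API actually returns.

```javascript
// Illustrative cursor-pagination loop: keep requesting pages until the
// API stops returning a continuation cursor, accumulating all items.
async function fetchAll(fetchPage) {
  const results = [];
  let cursor = null;
  do {
    // fetchPage is assumed to resolve to { items: [...], nextCursor }.
    const page = await fetchPage(cursor);
    results.push(...page.items);
    cursor = page.nextCursor ?? null;
  } while (cursor);
  return results;
}
```

In n8n the same pattern is typically built with a loop back into an HTTP Request node, carrying the cursor in workflow data until it comes back empty.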
FAQs
Can I swap integrations later? Yes. Replace or add nodes and re‑map fields without rebuilding the whole flow.
How do I monitor failures? Use Execution logs and add notifications on the Error Trigger path.
Does it scale? Use queues, batching, and sub‑workflows to split responsibilities and control load.
Is my data safe? Keep secrets in Credentials, restrict token scopes, and review access logs.