Business Process Automation Webhook

Limit Code Create Webhook

3★ rating • 14 downloads • 15-45 minutes setup • Intermediate complexity
🔌 Integrations • 🚀 Ready to deploy • Tested & verified

What's Included

📁 Files & Resources

  • Complete N8N workflow file
  • Setup & configuration guide
  • API credentials template
  • Troubleshooting guide

🎯 Support & Updates

  • 30-day email support
  • Free updates for 1 year
  • Community Discord access
  • Commercial license included

Agent Documentation


Limit Code Create Webhook – Business Process Automation | Complete n8n Webhook Guide (Intermediate)

This article provides a complete, practical walkthrough of the Limit Code Create Webhook n8n agent. It wires together HTTP Request and Webhook nodes in a compact workflow. Expect an Intermediate-level setup taking 15-45 minutes. One-time purchase: €29.

What This Agent Does

This agent orchestrates a reliable automation between the HTTP Request and Webhook nodes, handling triggers, data enrichment, and delivery with guardrails for errors and rate limits.

It streamlines multi‑step processes that would otherwise require manual exports, spreadsheet cleanup, and repeated API requests. By centralizing logic in n8n, it reduces context switching, lowers error rates, and ensures consistent results across teams.

Typical outcomes include faster lead handoffs, automated notifications, accurate data synchronization, and better visibility via execution logs and optional Slack/Email alerts.

How It Works

The workflow uses standard n8n building blocks like Webhook or Schedule triggers, HTTP Request for API calls, and control nodes (IF, Merge, Set) to validate inputs, branch on conditions, and format outputs. Retries and timeouts improve resilience, while credentials keep secrets safe.
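
The resilience point can be made concrete with the backoff schedule behind typical retry settings (a sketch; the base delay and cap are illustrative defaults, not values taken from this workflow):

```js
// Exponential backoff: the delay doubles per attempt and is capped at capMs.
function backoffDelays(retries, baseMs = 500, capMs = 8000) {
  return Array.from({ length: retries }, (_, i) => Math.min(baseMs * 2 ** i, capMs));
}

console.log(backoffDelays(5)); // [ 500, 1000, 2000, 4000, 8000 ]
```

The HTTP Request node exposes retry and timeout options directly; a schedule like this is what those settings approximate under the hood.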

Third‑Party Integrations

  • HTTP Request
  • Webhook

Import and Use in n8n

  1. Open n8n and create a new workflow or collection.
  2. Choose Import from File or Paste JSON.
  3. Paste the JSON below, then click Import.
  4. Click Show n8n JSON to reveal the workflow JSON, then copy it.
    Third-Party APIs and Services Used:
    
    - Google News (RSS Feed Endpoint: https://news.google.com/rss)
    - Google News Internal API (Decoding endpoint: https://news.google.com/_/DotsSplashUi/data/batchexecute)
    
    ---
    
    Extract and Decode Clean Article URLs from Google News RSS with n8n
    
    Google News is an invaluable source of timely information, but when integrating it into automated workflows or scraping pipelines, its URLs can present a challenge. The RSS feeds from Google News often include obfuscated or encoded URLs, making it difficult to extract direct links to original articles. Using the open-source automation tool n8n, you can overcome this obstacle by building a workflow that automatically decodes and cleans these URLs, making them usable in downstream applications.
    
    In this article, we’ll explore a custom n8n workflow titled “Extract and Decode Google News RSS URLs to Clean Article Links” that orchestrates the full decoding pipeline. Whether you're a data analyst, SEO researcher, or news aggregator, this workflow will help you clean and extract article URLs in a structured, efficient way.
    
    🧰 What This Workflow Does
    
    The core goal of the workflow is to take raw entries from a Google News RSS feed and decode the embedded, often base64-encoded or otherwise obfuscated URLs to clean, direct article links. The entire process involves the following:
    
    1. Reading the Google News RSS feed.
    2. Limiting the number of items for performance.
    3. Fetching the encoded content hidden in individual links.
    4. Extracting necessary decoding keys like base64 strings, timestamp, and signature.
    5. Sending those keys to a Google backend decoding endpoint.
    6. Parsing the raw response and cleaning the output into usable article URLs.
    
    Let’s walk through each component of the workflow.
    
    📥 Step 1: Reading Google News RSS
    
    The initial node, Reading Google News RSS, fetches the raw RSS feed from a localized Google News endpoint. For instance, the feed used in this workflow is:
    https://news.google.com/rss?hl=it&gl=IT&ceid=IT:it
    
    The country (`gl`), language (`hl`), and edition (`ceid`) parameters can be modified to target different regions.
    
    ⚠️ Note: These endpoints are subject to rate limits, and excessive automation may result in being blocked by Google. Exercise caution and test responsibly.
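    
    The feed URL construction and link extraction can be sketched in plain JavaScript (a standalone sketch using an inline XML sample; in the workflow, the Reading Google News RSS node fetches the real feed, and a production pipeline should use a proper RSS parser):
    
    ```js
    // Build the localized feed URL from the hl/gl/ceid parameters.
    function buildFeedUrl(hl, gl, ceid) {
      const params = new URLSearchParams({ hl, gl, ceid });
      return `https://news.google.com/rss?${params}`;
    }
    
    // Naive <link> extraction; keeps only encoded article links and limits items.
    function extractLinks(xml, limit = 3) {
      const links = [...xml.matchAll(/<link>([^<]+)<\/link>/g)].map((m) => m[1]);
      return links.filter((u) => u.includes("/rss/articles/")).slice(0, limit);
    }
    
    // Tiny inline sample shaped like a Google News RSS feed.
    const sampleXml = `
      <rss><channel>
        <link>https://news.google.com</link>
        <item><link>https://news.google.com/rss/articles/CBMiAAA?oc=5</link></item>
        <item><link>https://news.google.com/rss/articles/CBMiBBB?oc=5</link></item>
      </channel></rss>`;
    
    console.log(buildFeedUrl("it", "IT", "IT:it"));
    console.log(extractLinks(sampleXml));
    ```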
    
    📦 Step 2: Limit the Results for Efficiency
    
    To mitigate load and avoid being blocked, the workflow limits the number of articles processed. While it's configured to fetch a maximum of 5 items, the sticky notes suggest reducing this to 3, considering the multiple HTTP requests made during the decoding.
    
    🕵️ Step 3: Extract Encoded Content
    
    After retrieving the RSS entries, the next step is to visit each article link. Contrary to appearances, these links don't point straight to the original article: they usually lead to a Google redirect page whose HTML contains encoded data.
    
    The workflow’s Get Encoded News URL node fetches the full HTML content of the redirect page, from which two important decoding keys — a signature and a timestamp stored in custom HTML attributes — are extracted using the Extract decoding keys node.
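    
    Under the assumption that the redirect page still exposes the keys as `data-n-a-sg` and `data-n-a-ts` attributes (the names observed at the time of writing; Google may rename them), the Extract decoding keys step reduces to two regular expressions:
    
    ```js
    // Pull the signature and timestamp out of the redirect page's HTML.
    function extractDecodingKeys(html) {
      const sg = html.match(/data-n-a-sg="([^"]+)"/);
      const ts = html.match(/data-n-a-ts="([^"]+)"/);
      if (!sg || !ts) {
        throw new Error("Decoding keys not found; the page structure may have changed");
      }
      return { signature: sg[1], timestamp: ts[1] };
    }
    
    // Minimal sample of the markup the regexes expect.
    const sampleHtml = '<c-wiz data-n-a-sg="sigValue" data-n-a-ts="86400"></c-wiz>';
    console.log(extractDecodingKeys(sampleHtml));
    ```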
    
    🔐 Step 4: Prepare Decoding Parameters
    
    In addition to the extracted signature and timestamp, the article’s encoded base64 string (usually found in the original URL’s GUID field) is mapped for use in the decoding request. All three values are passed to a JavaScript code node, which assembles a specially-crafted request payload to mimic a legitimate backend request to Google’s internal decoding endpoint.
    
    🛰️ Step 5: Send Decoding Request
    
    Next, a POST request is sent to https://news.google.com/_/DotsSplashUi/data/batchexecute using the n8n HTTP Request node. This URL is part of Google’s internal API infrastructure and isn't officially documented. Therefore, the workflow reconstructs the request headers (including `User-Agent`, `Referer`, and `Content-Type`) to closely mimic the behavior of a Google News web client.
    
    The body of the request contains the encoded payload with article reference, timestamp, and signature that Google expects for decoding.
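    
    The mechanics of that POST can be sketched as a pure request builder (the f.req value below is a placeholder; the real inner array is assembled by the workflow's Code node from the base64 id, timestamp, and signature, and its exact shape is an undocumented internal that may change):
    
    ```js
    // Build the batchexecute request: form-encoded body plus browser-like headers.
    function buildDecodeRequest(fReqPayload) {
      return {
        url: "https://news.google.com/_/DotsSplashUi/data/batchexecute",
        method: "POST",
        headers: {
          "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
          Referer: "https://news.google.com/",
          "User-Agent": "Mozilla/5.0", // mimic a Google News web client, as the workflow does
        },
        body: new URLSearchParams({ "f.req": fReqPayload }).toString(),
      };
    }
    
    // Placeholder payload: "Fbv4je" is the RPC id seen in public reverse
    // engineering of this endpoint; treat it as an assumption.
    const req = buildDecodeRequest('[[["Fbv4je","..."]]]');
    console.log(req.body);
    ```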
    
    🧹 Step 6: Decode and Clean the URLs
    
    Once the response is received, it's parsed to retrieve the direct article URL from a deeply nested JSON structure embedded within a string. Since the raw response contains extraneous characters, the final node — Decoded URL — includes a JavaScript expression that handles this parsing and string cleanup:
    ```js
    ={{ JSON.parse(JSON.parse($json.data.split('\n\n')[1])[0][2])[1] }}
    ```
    
    This cleansed direct URL is finally output in a structured object, ready to be used in news summaries, content extraction tools, or link aggregation websites.
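    
    The same expression, unrolled into a plain function with guards (the split and the [0][2] / [1] indices mirror the response nesting observed at the time of writing and may shift if Google changes the format):
    
    ```js
    // Unrolled version of the one-line n8n expression above.
    function decodeUrlFromResponse(data) {
      const chunk = data.split("\n\n")[1];   // drop the anti-JSON-hijacking preamble
      const outer = JSON.parse(chunk);       // envelope array
      const inner = JSON.parse(outer[0][2]); // the payload is itself a JSON string
      return inner[1];                       // the clean article URL
    }
    
    // Sample shaped like a batchexecute response, with a placeholder URL.
    const payload = JSON.stringify(["garturlres", "https://example.com/article"]);
    const sample = ")]}'\n\n" + JSON.stringify([["wrb.fr", "Fbv4je", payload]]);
    console.log(decodeUrlFromResponse(sample));
    ```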
    
    📌 Limitations and Warnings
    
    As with any workflow that reverse-engineers undocumented APIs, there are some caveats:
    
    - This technique relies on internal structures that may change at any time.
    - High request frequencies might trigger Google’s anti-bot mechanisms.
    - The decoding logic is hardcoded — it may require maintenance if Google modifies the structure or key naming conventions.
    
    🏁 Output and Next Steps
    
    Once the URLs are cleaned, you can easily expand the workflow to extract full article text using tools like:
    
    - n8n-nodes-webpage-content-extractor (custom node for scraping streamlined article content)
    - jina.ai or similar AI-powered summarizers
    - An HTML extraction node to parse titles or article bodies
    
    🚀 Use Cases
    
    This workflow opens up a wide range of use cases:
    
    - Aggregating news from specific regions or topics
    - Creating newsletters or summaries from trending content
    - Enriching datasets with original article links for NLP tasks
    - Integrating clean article URLs into content curation dashboards
    
    📚 Conclusion
    
    This n8n workflow is a powerful and effective way to automate the extraction and decoding of clean article URLs from Google News RSS feeds. Though the solution relies on a bit of reverse engineering, it provides a low-code, reusable automation that can plug into your news pipelines or data ingestion setups.
    
    Use it responsibly, monitor its performance, and tweak it based on your RSS input and downstream needs. With minimal code and powerful orchestration, n8n proves once again why it's a top choice for automation workflows.
    
    ---
    
    Happy automating! ⚙️📰
  5. Set credentials for each API node (keys, OAuth) in Credentials.
  6. Run a test via Execute Workflow. Inspect Run Data, then adjust parameters.
  7. Enable the workflow to run on schedule, webhook, or triggers as configured.

Tips: keep secrets in credentials, add retries and timeouts on HTTP nodes, implement error notifications, and paginate large API fetches.

Validation: use IF/Code nodes to sanitize inputs and guard against empty payloads.
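
A minimal sketch of such a guard as it might appear in a Code node (shown as a standalone function over a sample payload so it runs anywhere; field names like email and name are illustrative):

```js
// Drop empty payloads and normalize the fields relied on downstream.
function sanitize(items) {
  return items
    .filter((item) => item && item.email) // guard against empty payloads
    .map((item) => ({
      email: String(item.email).trim().toLowerCase(),
      name: (item.name || "unknown").trim(),
    }));
}

const sample = [{ email: " User@Example.COM ", name: "Ada" }, {}, { email: "b@c.io" }];
console.log(sanitize(sample));
```

In an actual n8n Code node, the same logic would read items via $input.all() and return them wrapped in { json: … } objects.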

Why Automate This with AI Agents

AI‑assisted automations offload repetitive, error‑prone tasks to a predictable workflow. Instead of manual copy‑paste and ad‑hoc scripts, your team gets a governed pipeline with versioned state, auditability, and observable runs.

n8n’s node graph makes data flow transparent while AI‑powered enrichment (classification, extraction, summarization) boosts throughput and consistency. Teams reclaim time, reduce operational costs, and standardize best practices without sacrificing flexibility.

Compared to one‑off integrations, an AI agent is easier to extend: swap APIs, add filters, or bolt on notifications without rewriting everything. You get reliability, control, and a faster path from idea to production.

Best Practices

  • Credentials: restrict scopes and rotate tokens regularly.
  • Resilience: configure retries, timeouts, and backoff for API nodes.
  • Data Quality: validate inputs; normalize fields early to reduce downstream branching.
  • Performance: batch records and paginate for large datasets.
  • Observability: add failure alerts (Email/Slack) and persistent logs for auditing.
  • Security: avoid sensitive data in logs; use environment variables and n8n credentials.
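
The batching-and-pagination practice boils down to walking an API page by page and stopping on a short or empty page; a synchronous sketch (real API calls would be async, and the page/perPage parameter names are illustrative):

```js
// Yield items page by page until the source runs dry.
function* paginate(fetchPage, perPage = 100) {
  for (let page = 1; ; page++) {
    const items = fetchPage(page, perPage);
    if (items.length === 0) return;
    yield* items;
    if (items.length < perPage) return; // short page means last page
  }
}

// Stubbed fetcher over a tiny dataset, page size 2.
const data = [1, 2, 3];
const fetchPage = (page, per) => data.slice((page - 1) * per, page * per);
console.log([...paginate(fetchPage, 2)]);
```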

FAQs

Can I swap integrations later? Yes. Replace or add nodes and re‑map fields without rebuilding the whole flow.

How do I monitor failures? Use Execution logs and add notifications on the Error Trigger path.

Does it scale? Use queues, batching, and sub‑workflows to split responsibilities and control load.

Is my data safe? Keep secrets in Credentials, restrict token scopes, and review access logs.


Integrations referenced: HTTP Request, Webhook

Complexity: Intermediate • Setup: 15-45 minutes • Price: €29

Requirements

  • N8N Version: v0.200.0 or higher
  • API Access: valid API keys for integrated services
  • Technical Skills: basic understanding of automation workflows

One-time purchase: €29 (lifetime access, no subscription)

Included in purchase:

  • Complete N8N workflow file
  • Setup & configuration guide
  • 30 days email support
  • Free updates for 1 year
  • Commercial license