Automate Google SERP Data Extraction with n8n and ScrapingRobot

Learn how to automate Google search results extraction for SEO analysis using n8n with ScrapingRobot API. This workflow fetches keyword rankings, processes SERP data, and organizes it for detailed SEO insights, saving hours of manual research.
Workflow Identifier: 1844
Nodes in use: Manual Trigger, Set, Split Out, Filter, Code, HTTP Request, NoOp, Sticky Note


What this workflow does

This workflow automatically pulls live Google search result data for many keywords at once.
It removes the need to copy results by hand, saving time and cutting down on errors.
For each keyword, it captures organic results, ads, and “People Also Ask” questions.
It cleans the data by dropping results without titles.
It then tags each result with its search position.
The output is ready to send to your database or SEO tool.


Who should use this workflow

If you track SEO rankings or need Google result data for many keywords, this workflow saves significant effort.
It suits analysts and SEO teams that want fast, reliable updates without manual work.
You can process anywhere from a few keywords to hundreds in a single run.
It fits anyone who wants clean, ranked Google data for reports or dashboards.


Tools and services used

  • n8n automation platform: Runs the workflow and connects services.
  • ScrapingRobot API: Fetches Google Search Engine Results Pages (SERP) live data.
  • Your own database or SEO tool: Stores the cleaned ranking results (optional).

Inputs, processing steps, and outputs

Inputs

There are two main ways to provide keywords:

  • A list inside the n8n workflow using a “Set” node.
  • An external database connected to the workflow, with keywords in a column named “Keyword”.
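Whichever source you use, the keyword input arrives as standard n8n items, one keyword per item. A minimal sketch of that shape (the `Keyword` field name mirrors the database column the workflow expects):

```javascript
// Shape of the keyword input items: one n8n item per keyword,
// each wrapping its data under the conventional "json" key.
const keywordItems = [
  { json: { Keyword: "best running shoes" } },
  { json: { Keyword: "n8n seo automation" } },
  { json: { Keyword: "serp api comparison" } },
];
```

The Split Out node then feeds these items one at a time into the API call step.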

Processing steps

  • The workflow starts manually with a Manual Trigger.
  • The keyword list is split into separate items for individual processing.
  • For each keyword, an API call is sent to ScrapingRobot’s Google scraping module to pull SERP data.
  • The received JSON is parsed to extract organic results, paid ads, “People Also Ask,” and original keyword.
  • Organic results are split into single entries for filtering and ranking.
  • Entries missing titles are filtered out to keep data clean.
  • Each organic result is re-tagged with its originating keyword.
  • JavaScript code assigns the correct rank number to each organic result by grouping them per keyword.
  • Finally, data is sent to your database or SEO tracking tool node (requires setup).
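The rank-assignment step above can be sketched as a small Code-node-style snippet. This is a hedged illustration rather than the exact code shipped with the workflow; field names such as `searchQuery`, `title`, and `url` are assumptions based on the description above:

```javascript
// Group organic results per keyword and assign a 1-based position.
// Assumed input shape: [{ json: { searchQuery, title, url } }, ...]
function rankResults(items) {
  const counters = {}; // running position counter per keyword
  return items.map((item) => {
    const key = item.json.searchQuery;
    counters[key] = (counters[key] || 0) + 1;
    return { json: { ...item.json, position: counters[key] } };
  });
}

// Example: two keywords' results interleaved as they arrive from the splitter.
const ranked = rankResults([
  { json: { searchQuery: "n8n", title: "A", url: "https://a.example" } },
  { json: { searchQuery: "n8n", title: "B", url: "https://b.example" } },
  { json: { searchQuery: "seo", title: "C", url: "https://c.example" } },
]);
```

Keeping a separate counter per keyword means results for different keywords can arrive in any order and still get correct positions.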

Output

The final output is a clean list of Google organic search results for each keyword.
Every entry includes title, URL, position ranking, related ads and questions (if any), and the keyword.
This data is ready for reports or dashboards to track SEO progress.
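A single output entry, then, looks roughly like this. Field names here are illustrative assumptions drawn from the description above, not the workflow's guaranteed schema:

```javascript
// Illustrative shape of one cleaned, ranked output entry.
const exampleEntry = {
  keyword: "n8n seo automation",           // the originating search term
  title: "Automate SEO reporting with n8n",
  url: "https://example.com/n8n-seo",
  position: 3,                              // 1-based organic rank
  peopleAlsoAsk: ["What is n8n?"],          // present only when Google returns them
  ads: [],                                  // paid results, if any
};
```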


Beginner step-by-step: How to run this workflow in n8n

Step 1: Import the workflow

  1. Download the provided workflow file using the Download button on this page.
  2. Open the n8n editor where you want to run the workflow.
  3. Use the “Import from File” option in n8n to upload this downloaded workflow.

Step 2: Add credentials and configure nodes

  1. In n8n, open the “GET SERP” HTTP Request node.
    Enter your active ScrapingRobot API key in the node’s credentials or query-auth field.
  2. Open the “Set Keywords to get SERPs for” node if you want to edit or add keywords inside the workflow.
  3. To fetch keywords from your own database instead, connect your database node before the splitter node and make sure it has a “Keyword” column.
  4. In the last node, “Connect to your own database2”, replace the placeholder with your actual database node or integration.
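To see where the API key and keyword fit into the request, here is a hedged sketch of building the GET SERP call. The endpoint and body fields (`module`, `url`) are assumptions; confirm the exact format against the current ScrapingRobot API documentation:

```javascript
// Hedged sketch of the GET SERP request configuration; endpoint and
// body field names are assumptions -- verify against ScrapingRobot's docs.
const SCRAPINGROBOT_TOKEN = "YOUR_API_KEY"; // in practice, store this in n8n credentials

function buildSerpRequest(keyword) {
  return {
    method: "POST",
    url: `https://api.scrapingrobot.com/?token=${SCRAPINGROBOT_TOKEN}`,
    body: {
      module: "GoogleScraper", // assumed module name for the Google SERP scraper
      url: `https://www.google.com/search?q=${encodeURIComponent(keyword)}`,
    },
  };
}

const req = buildSerpRequest("n8n seo automation");
```

In the actual workflow, this mapping lives in the HTTP Request node's URL and body parameters rather than in code.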

Step 3: Test and activate the workflow

  1. Click “Execute Workflow” manually in n8n to test the data flow.
  2. Check workflow execution results to confirm the data pulls, filters, and ranks properly.
  3. Once confirmed, activate the workflow in n8n by switching it on.
  4. To schedule automatic runs instead of starting manually, add a Cron (Schedule) trigger.
  5. If you self-host n8n, run the workflow on your server for continuous operation.

Common edge cases and failures

  • 401 Unauthorized error: Happens if ScrapingRobot API Key is wrong or missing. Fix by updating key in the GET SERP node.
  • No data or empty organic results: The API response format may have changed, or keywords may be misspelled. Check the API response JSON in n8n’s execution preview.
  • Titles filtered out unexpectedly: The filter step depends on a title field existing. Make sure the JSON path is correct, or adjust the filter logic.
  • Wrong position numbering: The Code node needs a proper searchQuery field in the data. Confirm the grouping logic matches your input format.
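The title-filtering edge case above can be handled defensively. This sketch mirrors what the workflow's Filter node does (the actual workflow uses a Filter node, not code; the `title` path is an assumption):

```javascript
// Drop entries whose title is missing, null, or empty/whitespace-only.
// Optional chaining guards against API response shapes that changed.
function keepTitled(items) {
  return items.filter((item) => {
    const title = item?.json?.title;
    return typeof title === "string" && title.trim().length > 0;
  });
}

const cleaned = keepTitled([
  { json: { title: "Valid result", url: "https://a.example" } },
  { json: { title: "", url: "https://b.example" } },  // empty title
  { json: { url: "https://c.example" } },              // no title at all
]);
```

If valid results are being dropped, log the raw items before this step to see whether the title actually lives at a different JSON path.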

Customization ideas

  • You can switch the static keyword list to a real database fetch to work with dynamic keywords.
  • Expand the paid ads parsing to get ads as arrays for detailed ads analysis.
  • Replace the final NoOp database node with nodes that export to Google Sheets, Airtable, or SEO software APIs.
  • Modify the API JSON body in the HTTP Request node to add location or language filters supported by ScrapingRobot.

Summary of workflow benefits and results

→ Automates getting real Google ranking data per keyword.
→ Removes manual copying and reduces errors.
→ Cleans data by filtering empty results.
→ Provides exact rank position for each URL.
→ Prepares data for easy export or reporting.
→ Works for small or large keyword lists with same process.


Frequently Asked Questions

Q: What does the GET SERP node do?
A: It sends requests to the ScrapingRobot API to get live Google search results for each keyword.

Q: How does the workflow handle multiple keywords?
A: It splits the keyword list into individual items and processes each one separately with its own API call.

Q: Why am I getting a 401 error?
A: A 401 error happens if the ScrapingRobot API key is missing or incorrect in the GET SERP node credentials.

Q: Where does the final data go?
A: It is sent to a database or SEO tool node, which you need to configure for storage.

Promoted by BULDRR AI
