Analyze SERP Keywords Automatically with n8n & SerpBear

Automate the process of fetching and analyzing SerpBear SEO keyword rankings using n8n workflows, and save AI-driven insights to Baserow for informed decision-making. Save time and improve keyword strategy effectively.
Workflow Identifier: 1810
Nodes in use: Manual Trigger, Schedule Trigger, HTTP Request, Code, Baserow, Sticky Note


Automating SEO Keyword Analysis with n8n and SerpBear for Sarah’s Digital Agency

Meet Sarah, a digital marketing manager juggling multiple client websites every week. One of her recurring headaches is manually pulling keyword ranking reports from SerpBear and summarizing the data to create actionable insights for her team. This tedious task eats up several hours weekly, increasing the risk of errors and delays in optimizing her clients’ SEO strategies.

Imagine spending up to 5 hours each week copying keyword data, manually finding trends, and drafting reports—time Sarah would rather spend growing her business. Without automation, errors in data interpretation can cost real money by missing opportunities or reacting late to ranking drops.

What This Automation Does ⚙️

This workflow streamlines Sarah’s SEO reporting by automatically:

  • Fetching fresh keyword data for a specific domain directly from the SerpBear API on a set schedule or manual trigger.
  • Parsing the keyword ranking data to calculate current positions, weekly averages, and ranking trends for each keyword.
  • Sending a detailed summary prompt to an AI (using OpenRouter’s LLaMA model) to analyze and generate expert SEO insights and actionable recommendations.
  • Saving the AI-generated analysis, along with the date and website name, into a structured Baserow database table for easy review and historical tracking.
  • Allowing on-demand manual testing of the workflow or scheduled weekly automation through n8n triggers.
  • Providing informative sticky notes with step-by-step guidance and credential setup instructions within the workflow for user ease.

By automating these steps, Sarah saves several hours each week, reduces human error, and ensures consistent, expert SEO keyword performance reports faster than ever before.

Prerequisites ⚙️

  • SerpBear API: Obtain your SerpBear API URL and authentication token to fetch keyword ranking data.
  • OpenRouter AI account: Access to OpenRouter API keys with permission to use the Meta LLaMA model for SEO text analysis.
  • Baserow database: A Baserow account and a table prepared with columns for Date, Note, and Blog site information.
  • n8n automation platform: An n8n instance (cloud or self-hosted if preferred) to build and run this workflow.
  • Basic familiarity with API keys and HTTP header authentication setup in n8n.

Having these tools ready is essential before implementing this SEO keyword analysis automation.

Step-by-Step Guide to Building This SEO Keyword Analysis Workflow

Step 1: Create n8n Triggers for Scheduling and Manual Testing

Navigate to your n8n editor:

  • Add a Schedule Trigger node.
    Click + Add Node → Search for Schedule Trigger → Select it.
    In this node, set the scheduling interval to every 1 week under the interval rules.
    This scheduling runs the workflow weekly automatically.
  • Add a Manual Trigger node for manual workflow tests. This helps for immediate data fetching when needed.
    Configure this with no parameters; this node only triggers when you press “Execute Workflow”.

Now, connect both triggers to the next node to fetch SerpBear data.

Common mistake: If either trigger is left unconnected, the workflow will not run on schedule or when executed manually.

Step 2: Fetch Keyword Ranking Data from SerpBear API

Add an HTTP Request node named “Get data from SerpBear.”

  • Set the method to GET.
  • Enter your SerpBear API URL, for example https://myserpbearinstance.com/api/keyword?id=22.
  • Add query parameter: domain=yourwebsite.com (replace with your domain).
  • Use HTTP Header Authentication credentials:
    Navigate to Credentials → Create HTTP Header Auth → Add your Authorization header with your SerpBear token.
  • In the settings, ensure Execute Once is unchecked to allow trigger-based execution.

Test this node to see a JSON response with keyword data.

Common mistake: Using an incorrect domain or expired API token will cause a 401 Unauthorized error.
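
If you want to verify the request shape outside n8n first, this sketch mirrors what the HTTP Request node sends. The instance URL, keyword id, and token are placeholders, not values from a real account:

```javascript
// Builds the same GET request the "Get data from SerpBear" node issues.
// All arguments here are placeholders -- substitute your own SerpBear values.
function buildSerpBearRequest(baseUrl, keywordId, domain, token) {
  const url = `${baseUrl}/api/keyword?id=${keywordId}&domain=${encodeURIComponent(domain)}`;
  return {
    method: 'GET',
    url,
    // Matches the HTTP Header Auth credential configured in n8n
    headers: { Authorization: token },
  };
}

const req = buildSerpBearRequest(
  'https://myserpbearinstance.com', 22, 'yourwebsite.com', 'YOUR_TOKEN'
);
```

Testing the built request locally makes it easy to spot a wrong domain or missing header before n8n ever reports a 401.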

Step 3: Parse the SerpBear Keyword Data with a Code Node

Add a Code node named “Parse data from SerpBear.”

  • Insert the following JavaScript code to process keyword rankings and calculate trends:
    const keywords = items[0].json.keywords;
    const today = new Date().toISOString().split('T')[0];
    
    // Create summary for each keyword
    const keywordSummaries = keywords.map(kw => {
      const position = kw.position || 0;
      const lastWeekPositions = Object.values(kw.history || {}).slice(-7);
      // Guard against keywords with no history yet (avoids NaN from 0/0)
      const avgPosition = lastWeekPositions.length
        ? lastWeekPositions.reduce((a, b) => a + b, 0) / lastWeekPositions.length
        : position;
      
      return {
        keyword: kw.keyword,
        currentPosition: position,
        averagePosition: Math.round(avgPosition * 10) / 10,
        trend: position < avgPosition ? 'improving' : position > avgPosition ? 'declining' : 'stable',
        url: kw.url || 'not ranking'
      };
    });
    
    // Create the prompt
    const prompt = `Here's the SEO ranking data for rumjahn.com as of ${today}:
    
    ${keywordSummaries.map(kw => `
    Keyword: "${kw.keyword}"
    Current Position: ${kw.currentPosition}
    7-Day Average: ${kw.averagePosition}
    Trend: ${kw.trend}
    Ranking URL: ${kw.url}
    `).join('\n')}
    
    Please analyze this data and provide:
    1. Key observations about ranking performance
    2. Keywords showing the most improvement
    3. Keywords needing attention
    4. Suggested actions for improvement`;
    
    // n8n Code nodes must return an array of items, not a bare object
    return [{ json: { prompt } }];
  • This code extracts keyword positions, calculates 7-day averages, determines trends, and formats a user-friendly prompt for AI analysis.

Common mistake: The SerpBear response must match the structure this code expects (a keywords array whose entries carry position, history, and url fields); a different shape causes runtime errors in the Code node.
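
You can sanity-check the trend logic before wiring it into the Code node by exercising the per-keyword calculation in isolation. The sample object below is hypothetical, shaped like the fields the code reads (keyword, position, history, url):

```javascript
// Mirrors the per-keyword summary from the Code node, including a guard
// for keywords that have no ranking history yet.
function summarizeKeyword(kw) {
  const position = kw.position || 0;
  const lastWeek = Object.values(kw.history || {}).slice(-7);
  const avgPosition = lastWeek.length
    ? lastWeek.reduce((a, b) => a + b, 0) / lastWeek.length
    : position;
  return {
    keyword: kw.keyword,
    currentPosition: position,
    averagePosition: Math.round(avgPosition * 10) / 10,
    trend: position < avgPosition ? 'improving' : position > avgPosition ? 'declining' : 'stable',
    url: kw.url || 'not ranking',
  };
}

// Hypothetical keyword entry: a lower current position than the recent
// average means the keyword is climbing, so the trend reads "improving".
const sample = summarizeKeyword({
  keyword: 'n8n automation',
  position: 4,
  history: { '2024-01-01': 8, '2024-01-02': 6, '2024-01-03': 5 },
  url: 'https://example.com/post',
});
```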

Step 4: Send Data to OpenRouter AI for SEO Analysis

Add another HTTP Request node named “Send data to A.I. for analysis.”

  • Set the method to POST.
  • URL: https://openrouter.ai/api/v1/chat/completions.
  • Set Body Type to JSON.
  • In the Body, set the model to meta-llama/llama-3.1-70b-instruct:free and pass the prompt from the previous node as the user message in the messages array (serialize it with JSON.stringify or an n8n expression so quotes and newlines are escaped correctly).
  • Setup Header Authentication with your OpenRouter API key as the Bearer token (Authorization header).

Run the node to receive expert SEO analysis and suggestions in response.

Common mistake: Missing the space after “Bearer” in the token causes authentication failure.
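
OpenRouter's endpoint accepts a standard OpenAI-style chat completions payload. A minimal sketch of how the headers and JSON body fit together (the API key is a placeholder; serializing with JSON.stringify handles the escaping of quotes and newlines in the prompt):

```javascript
// Builds the POST payload the "Send data to A.I. for analysis" node sends.
// The API key is a placeholder -- use your own OpenRouter key.
function buildOpenRouterRequest(prompt, apiKey) {
  return {
    method: 'POST',
    url: 'https://openrouter.ai/api/v1/chat/completions',
    headers: {
      Authorization: `Bearer ${apiKey}`, // note the space after "Bearer"
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'meta-llama/llama-3.1-70b-instruct:free',
      messages: [{ role: 'user', content: prompt }],
    }),
  };
}

const aiReq = buildOpenRouterRequest('Analyze my "SEO" data\nacross keywords', 'sk-or-xxxx');
```

Because JSON.stringify escapes the prompt, multi-line AI prompts with quotes survive the round trip intact, which is the usual cause of malformed-response errors.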

Step 5: Save AI SEO Analysis to Baserow Database

Add a Baserow node called “Save data to Baserow.”

  • Choose the create operation.
  • Select your database and the table prepared with columns for Date, Note, and Blog.
  • Map current date, AI analysis content (response from AI node), and website name (for example, “Rumjahn”) to respective fields.
  • Edit credentials for your Baserow API account.

Execute to confirm AI insights are stored for future reference and reporting.

Common mistake: Field names or IDs that don't match your Baserow table schema cause the write to fail silently or with a validation error.
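
Under the hood, the Baserow node issues a row-create call like the sketch below. The table id and the Date/Note/Blog field names are assumptions taken from this guide's table setup, so match them to your own schema:

```javascript
// Sketch of the row-create request the Baserow node makes.
// Table id, token, and field names are placeholders from this guide's setup.
function buildBaserowInsert(tableId, apiToken, row) {
  return {
    method: 'POST',
    url: `https://api.baserow.io/api/database/rows/table/${tableId}/?user_field_names=true`,
    headers: {
      Authorization: `Token ${apiToken}`, // Baserow uses "Token", not "Bearer"
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(row),
  };
}

const insert = buildBaserowInsert(123, 'baserow-token', {
  Date: '2024-01-08',
  Note: 'AI analysis text goes here',
  Blog: 'Rumjahn',
});
```

Passing user_field_names=true lets you address columns by name instead of numeric field IDs, which sidesteps the field-ID mismatch called out in the common mistake.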

Step 6: Add Sticky Notes for Workflow Documentation

Use Sticky Note nodes at various workflow points to explain the purpose and setup instructions:

  • Overview note describing the workflow purpose and link to a detailed tutorial.
  • Notes near the SerpBear data node with API key setup tips.
  • Notes near AI node about OpenRouter API setup.
  • Note by Baserow node with table setup instructions.

This documentation ensures anyone using the workflow understands key steps and credentials required.

Customizations ✏️

  • Change the domain parameter: In the HTTP Request node “Get data from SerpBear,” swap rumjahn.com for any client domain you manage to analyze other websites.
  • Modify the AI prompt: Adjust the JavaScript code node’s prompt template to ask additional questions or focus on specific keyword groups that matter to your SEO goals.
  • Schedule frequency: Alter the Schedule Trigger node interval from weekly to daily or monthly per your reporting needs.
  • Store results in different database columns: Add or remove fields in your Baserow table and update the Baserow node field mappings accordingly.
  • Add error handling: Introduce conditional nodes or error workflows to capture API failures or empty results for robust automation.

Troubleshooting 🔧

  • Problem: “401 Unauthorized error from SerpBear API”
    Cause: Invalid or expired API token.
    Solution: Regenerate the API token in SerpBear dashboard and update the HTTP Header Auth credentials in n8n.
  • Problem: “AI node returns empty or malformed response”
    Cause: Incorrect encoding or prompt formatting.
    Solution: Ensure the prompt is serialized as valid JSON (e.g. via JSON.stringify) and the body matches OpenRouter's chat completions format; test with a simpler prompt first.
  • Problem: “Data not saving to Baserow”
    Cause: Mismatched field IDs or operation settings.
    Solution: Verify Baserow table schema and map fields properly; test Baserow API credentials for access rights.

Pre-Production Checklist ✅

  • Confirm SerpBear API credentials are valid and domain parameter is correct.
  • Test manual trigger to ensure HTTP request node returns keyword data.
  • Verify JavaScript code node runs without errors and outputs the formatted prompt.
  • Validate AI node receives prompt correctly and returns meaningful analysis.
  • Check Baserow node stores a test record accurately in your database.
  • Add sticky notes to capture workflow instructions clearly.
  • Backup your n8n workflow JSON before production deployment.

Deployment Guide

Activate your workflow by toggling the active switch in n8n once testing is successful. The weekly Schedule Trigger will automatically execute the data retrieval and analysis.

Monitor workflow executions via n8n’s executions panel for any errors or warnings.

Ensure your API keys are secure and rotate tokens periodically for security best practices.

FAQs

  • Can I use a different AI provider? Yes, you can replace the OpenRouter HTTP Request node with another AI service node, adjusting the prompt and authentication accordingly.
  • Does this workflow consume API credits? Yes, both SerpBear and OpenRouter might have API usage limits depending on your plan; monitor usage to avoid overages.
  • Is my data safe? This workflow uses secure HTTP header authentication and saves data into your Baserow account only. Ensure credentials are stored safely in n8n.
  • Can I run this workflow more frequently? Yes, but note that higher frequency may increase API usage and potential costs.

Conclusion

By following this guide, you’ve created a powerful n8n automation that fetches real-time SEO keyword rankings from SerpBear, generates AI-driven insights, and archives them in Baserow—letting Sarah and others run their SEO reports with zero manual effort.

You save hours each week, reduce errors, and get expert recommendations on how to improve keyword positions reliably.

Next steps? Try customizing prompts for different niches, integrating Slack notifications for analysis summaries, or scheduling monthly competitor keyword analysis to stay ahead in SEO.

This workflow stands as a practical example of how modern automation and AI can transform tedious marketing tasks into seamless, insightful operations.

Related Workflows

Automate Viral UGC Video Creation Using n8n + Degaus (Beginner-Friendly Guide)

Learn how to automate viral UGC video creation using n8n, AI prompts, and Degaus. This beginner-friendly guide shows how to import, configure, and run the workflow without technical complexity.

AI SEO Blog Writer Automation in n8n (Beginner Guide)

A complete beginner guide to building an AI-powered SEO blog writer automation using n8n.

Automate CrowdStrike Alerts with VirusTotal, Jira & Slack

This workflow automates processing of CrowdStrike detections by enriching threat data via VirusTotal, creating Jira tickets for incident tracking, and notifying teams on Slack for quick response. Save hours daily by transforming complex threat data into actionable alerts effortlessly.

Automate Telegram Invoices to Notion with AI Summaries & Reports

Save hours on financial tracking by automating invoice extraction from Telegram photos to Notion using Google Gemini AI. This workflow extracts data, records transactions, and generates detailed spending reports with charts sent on schedule via Telegram.

Automate Email Replies with n8n and AI-Powered Summarization

Save hours managing your inbox with this n8n workflow that uses IMAP email triggers, AI summarization, and vector search to draft concise replies requiring minimal review. Automate business email processing efficiently with AI guidance and Gmail integration.

Automate Email Campaigns Using n8n with Gmail & Google Sheets

This n8n workflow automates personalized email outreach campaigns by integrating Gmail and Google Sheets, saving hours of manual follow-up work and reducing errors in email sequences. It ensures timely follow-ups based on previous email interactions, optimizing communication efficiency.