1. Opening Problem Statement
Meet Sarah, a digital marketing specialist managing SEO for multiple websites including rumjahn.com. Every week, she spends hours logging into SERPBear to manually check keyword rankings and trends, painstakingly summarizing the data in spreadsheets. Despite her efforts, important ranking shifts sometimes go unnoticed, leading to missed optimization opportunities and lost organic traffic. The manual process is error-prone and time-consuming, taking up 3-4 hours weekly, which could be better spent on strategy rather than data crunching.
For Sarah, this repetitive task is a bottleneck: SEO data is scattered, summaries are inconsistent, and insights are delayed, limiting proactive decision-making. She needs an automated solution that fetches SERPBear keyword data regularly, analyzes it intelligently, and stores well-organized reports for easy review and action.
2. What This Automation Does
This unique n8n workflow integrates SERPBear, an AI analysis engine (via OpenRouter), and Baserow database to create an end-to-end SEO analytics pipeline tailored for rumjahn.com. Here’s what happens automatically when triggered:
- Fetches keyword ranking data from SERPBear API for a specified domain and website ID without manual login.
- Parses the raw API response using a Code node to extract current positions, 7-day average rankings, and trends for all tracked keywords.
- Generates a detailed SEO analysis prompt summarizing keyword performance facts and sends it to a large language model AI via HTTP request.
- Receives an expert-styled SEO report from the AI, highlighting keywords that are improving, declining, or stable, and suggests specific optimization actions.
- Saves the AI-generated SEO insights into a Baserow table with timestamp and website name for long-term tracking and team collaboration.
- Can run on a weekly schedule or via manual trigger, ensuring flexibility for on-demand or periodic reporting.
With this workflow, Sarah saves 3-4 hours per week and gains consistent, rich SEO insights that guide her optimization efforts effectively.
3. Prerequisites ⚙️
- n8n Account – To set up and run this workflow. Optionally self-host with Hostinger.
- SERPBear API Access – Your SerpBear API URL, API key, and Website ID for your domain (e.g., rumjahn.com).
- OpenRouter AI API Key – For sending keyword data to an AI model for analysis.
- Baserow Account & API Token – To store the AI-generated SEO analysis reports in a database table.
4. Step-by-Step Guide
Step 1: Trigger Setup with Schedule or Manual Trigger
In your n8n editor, add the Schedule Trigger node from the node panel. Set it to run weekly (interval: weeks) to automate periodic SEO analysis. Alternatively, add a Manual Trigger node for on-demand runs during testing.
This enables automatic or manual execution of the workflow.
Common Mistake: Not setting the schedule interval correctly. Make sure you select “weeks” for weekly frequency.
Step 2: Configure the SERPBear HTTP Request Node
Add an HTTP Request node named “Get data from SerpBear.” Set the method to GET and enter the URL like https://myserpbearinstance.com/api/keyword?id=22. Replace with your actual SerpBear instance URL and website ID.
Under “Query Parameters,” add domain=rumjahn.com or your target domain.
Set authentication type to Header Auth and enter your API key in the headers.
Confirm the request fetches JSON data listing keywords and their rankings.
Common Mistake: Using incorrect domain or website ID will return empty or error responses.
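Before moving on, it helps to confirm the response has the shape the later Code node expects. The sketch below uses invented sample values (real SERPBear responses carry more fields), but the field names mirror the ones the parsing code relies on:

```javascript
// Hypothetical sample of the response shape the parsing Code node expects.
// Values here are invented for illustration only.
const sampleResponse = {
  keywords: [
    {
      keyword: "n8n seo automation",
      position: 12,
      history: { "2025-01-01": 14, "2025-01-02": 13 },
      url: "https://rumjahn.com/example-post"
    }
  ]
};

// Minimal sanity check to run before wiring up the parser.
function looksLikeSerpBearData(body) {
  return Array.isArray(body && body.keywords) &&
    body.keywords.every(kw => typeof kw.keyword === "string");
}

console.log(looksLikeSerpBearData(sampleResponse)); // true
```

If this check fails on your instance's output, inspect the raw JSON in the node's output panel before proceeding.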
Step 3: Parse the SERPBear API Data with Code Node
Add a Code node right after the HTTP Request node. Copy-paste the following JavaScript:

// Extract the tracked keywords from the SERPBear response
const keywords = items[0].json.keywords;
const today = new Date().toISOString().split('T')[0];

const keywordSummaries = keywords.map(kw => {
  const position = kw.position || 0;
  // Last 7 recorded positions from the history object
  const lastWeekPositions = Object.values(kw.history || {}).slice(-7);
  const avgPosition = lastWeekPositions.length
    ? lastWeekPositions.reduce((a, b) => a + b, 0) / lastWeekPositions.length
    : position;
  return {
    keyword: kw.keyword,
    currentPosition: position,
    averagePosition: Math.round(avgPosition * 10) / 10,
    // Lower position numbers are better, so below-average means improving
    trend: position < avgPosition ? 'improving' : position > avgPosition ? 'declining' : 'stable',
    url: kw.url || 'not ranking'
  };
});

const prompt = `Here's the SEO ranking data for rumjahn.com as of ${today}:
${keywordSummaries.map(kw => `
Keyword: "${kw.keyword}"
Current Position: ${kw.currentPosition}
7-Day Average: ${kw.averagePosition}
Trend: ${kw.trend}
Ranking URL: ${kw.url}
`).join('\n')}
Please analyze this data and provide:
1. Key observations about ranking performance
2. Keywords showing the most improvement
3. Keywords needing attention
4. Suggested actions for improvement`;

return { prompt };
This transforms raw keyword data into clear summaries and prepares a detailed prompt for the AI analysis.
Common Mistake: Removing or altering variable names incorrectly can break the code.
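To verify the trend logic outside n8n, here is a standalone replica of the node's math with made-up history values (assumption: history maps dates to integer rank positions, as the node's code implies):

```javascript
// Standalone replica of the Code node's trend math, with invented history data.
const history = { d1: 15, d2: 14, d3: 14, d4: 13, d5: 12, d6: 12, d7: 11 };
const position = 10; // current rank (lower is better)

const lastWeek = Object.values(history).slice(-7);
const avg = lastWeek.reduce((a, b) => a + b, 0) / lastWeek.length;
const trend = position < avg ? "improving" : position > avg ? "declining" : "stable";

console.log(Math.round(avg * 10) / 10, trend); // 13 "improving"
```

Running this in any Node.js shell is a quick way to satisfy the "test the Code node's JavaScript with sample data" item on the pre-production checklist.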
Step 4: Send Data to AI for Analysis via HTTP Request
Add an HTTP Request node named “Send data to A.I. for analysis.” Configure a POST request to https://openrouter.ai/api/v1/chat/completions.
Use this JSON body (replace the prompt dynamically):
{
  "model": "meta-llama/llama-3.1-70b-instruct:free",
  "messages": [
    {
      "role": "user",
      "content": "You are an SEO expert. This is keyword data for my site. Can you summarize the data into a table and then give me some suggestions: {{ encodeURIComponent($json.prompt) }}"
    }
  ]
}
Set authentication to Header Auth with “Authorization: Bearer {your API key}” header.
This sends prepared keyword ranking summaries to the AI model for expert insights.
Common Mistake: Forgetting the space after “Bearer” in the Authorization header leads to auth failures.
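If the template expression above gives you trouble with quotes or newlines in the prompt, an alternative (a sketch, not part of the original workflow) is to build the whole body in a Code node with JSON.stringify, which escapes the prompt safely:

```javascript
// Sketch: build the OpenRouter request body in a Code node instead of a template.
// JSON.stringify escapes quotes and newlines in the prompt, keeping the JSON valid.
function buildOpenRouterBody(prompt) {
  return JSON.stringify({
    model: "meta-llama/llama-3.1-70b-instruct:free",
    messages: [
      {
        role: "user",
        content: "You are an SEO expert. Summarize this keyword data into a table and give me some suggestions:\n" + prompt
      }
    ]
  });
}

// Example with a prompt containing quotes and a newline:
const body = buildOpenRouterBody('Keyword: "n8n automation"\nTrend: improving');
console.log(JSON.parse(body).messages[0].content.includes('"n8n automation"')); // true
```

You would then send this string as the raw body of the HTTP Request node rather than using the inline expression.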
Step 5: Save AI Analysis Results to Baserow
Add a Baserow node configured to create records in your SEO analytics table.
Map fields as follows:
- Date field → current date formatted as YYYY-MM-DD
- Note field → AI response content from $json.choices[0].message.content
- Blog field → Website name (e.g., “Rumjahn”)
Once saved, you have a historical audit trail of your SEO health reports accessible anytime.
Common Mistake: Not creating the Baserow table with correct columns beforehand causes write errors.
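The record the Baserow node writes can be sketched as plain JavaScript. The field names Date, Note, and Blog follow the mapping above and must match your table's columns exactly:

```javascript
// Sketch of the record mapping described above; field names must match your table.
function buildBaserowRecord(aiResponse, blog = "Rumjahn") {
  return {
    Date: new Date().toISOString().split("T")[0], // YYYY-MM-DD
    Note: aiResponse.choices[0].message.content,  // AI report text
    Blog: blog                                    // website name
  };
}

const record = buildBaserowRecord({
  choices: [{ message: { content: "Rankings improving overall." } }]
});
console.log(record.Note); // "Rankings improving overall."
```

If the node rejects the write, compare these keys letter-for-letter against the column names in your Baserow table.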
Step 6: Test Workflow Manually or Wait for Scheduled Run
Use the manual trigger to test your workflow initially. Check every node’s output for errors. Once confirmed, activate the schedule trigger node.
You should see keyword data flow from SERPBear, get parsed, analyzed by AI, and finally stored in Baserow seamlessly.
5. Customizations ✏️
- Change Target Domain: In the “Get data from SerpBear” HTTP Request node, update the domain query parameter to track keywords for a different website.
- Adjust Analysis Frequency: Modify the “Schedule Trigger” node’s interval to daily or monthly for different reporting cadences.
- Customize AI Prompt: Edit the JavaScript in the Code node to request different SEO insights or change the tone of the AI report (more detailed, summarized, etc.).
- Store to Another Database: Replace the Baserow node with any other database node n8n supports if you want to integrate with tools like Airtable or Google Sheets.
6. Troubleshooting 🔧
Problem: “401 Unauthorized” from SERPBear API
Cause: Incorrect API key or header authentication setup in the HTTP Request node.
Solution: Double-check your API key in the header auth credentials. Ensure “Authorization” header is correctly set with “Bearer your_api_key” format.
Problem: AI node returns empty or error response
Cause: Malformed JSON body or missing/incorrect API key for OpenRouter AI.
Solution: Verify the JSON body format matches the API spec. Check the header auth includes correct Bearer token with a space after “Bearer.” Test with a simple fixed prompt to isolate issues.
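A quick way to catch the most common header mistakes (missing “Bearer”, no space, double spaces) is a one-line format check, sketched here with a placeholder key:

```javascript
// Placeholder key; the check only validates the "Bearer <token>" shape.
const apiKey = "your_api_key";
const header = "Bearer " + apiKey;

// Exactly one space after "Bearer", then a non-empty token.
console.log(/^Bearer \S+$/.test(header)); // true
```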
Problem: Baserow node fails to save data
Cause: Incorrect table ID, missing fields, or an invalid API token.
Solution: Ensure the Baserow table exists with columns Date, Note, Blog as per configuration. Confirm your Baserow API token has write permissions.
7. Pre-Production Checklist ✅
- Verify SerpBear API credentials and domain parameters are accurate.
- Test the Code node’s JavaScript with sample data ensuring no errors.
- Confirm OpenRouter API keys and test with simple prompt for AI connectivity.
- Create and test Baserow table with required columns.
- Run workflow manually and check data flow and outputs at each node.
- Backup existing SEO data if migrating from manual systems.
8. Deployment Guide
Activate the schedule trigger to start automated weekly SEO reporting. Monitor workflow runs periodically in n8n to catch any errors. Review saved Baserow records regularly with your SEO team to track progress and plan optimizations. Adjust schedule frequency or domains anytime as needed.
9. FAQs
Can I use a different AI provider instead of OpenRouter?
Yes, this workflow uses standard HTTP request nodes for AI calls, so you can replace OpenRouter with other AI APIs like OpenAI, provided you adjust the request format and credentials accordingly.
Does this workflow consume API credits?
It depends on your SerpBear and AI provider plans. Regular API calls and AI completions count toward your quota, so monitor usage to avoid overruns.
Is my SEO data safe with this automation?
Yes, your API keys are stored securely via n8n credentials. Data flows only between trusted services you control.
Can this handle multiple domains?
This workflow is configured for one domain but can be duplicated or adapted to handle multiple by adding nodes or looping logic.
10. Conclusion
By building this n8n workflow integrating SERPBear, OpenRouter AI, and Baserow, you’ve automated tedious SEO keyword ranking analysis for rumjahn.com.
You now save up to 4 hours weekly, get intelligent AI-driven SEO insights, and maintain an organized historical database of performance reports. This empowers smarter, faster SEO decision-making and better organic traffic growth.
Next, try adding automated email alerts for critical ranking drops or integrating Google Analytics for richer SEO context. Keep experimenting and optimizing your automation to make SEO work less manual and more strategic — you’ve got this!