1. Opening Problem Statement
Meet Sarah, a small business owner running Pennibnotes.com, a growing content blog. Every week she spends up to 3 hours manually pulling website analytics from Umami, comparing the data week-over-week, and trying to extract actionable SEO insights. Because her technical skills are limited, the process is error-prone and tedious. She often misses small trends, like bounce rate changes or shifts in page popularity, that could boost her blog’s growth and revenue.
Sound familiar? Manually analyzing analytics data wastes time and risks overlooking critical details, especially if you want to optimize SEO effectively. Fortunately, this n8n workflow automates the entire process end-to-end, making Sarah’s life much easier.
2. What This Automation Does ⚙️
This custom n8n workflow connects to your Umami analytics API, fetches key KPI stats and page-specific data, analyzes it using an AI SEO expert prompt, and saves the summarized insights to a database automatically. Here’s what happens when it runs:
- Retrieves last 7 days’ summary statistics (pageviews, visitors, visits, bounces, total time) from Umami
- Fetches detailed page view data for this week and the previous week for comparison
- Parses and transforms raw API data into clean, URL-encoded JSON strings for AI processing
- Sends data to an Openrouter AI model specialized in SEO analysis to generate markdown tables and tailored improvement suggestions
- Saves AI-generated SEO summaries and top pages reports to a Baserow database table for long-term tracking
- Runs on a schedule every Thursday (or manually for testing), eliminating manual data wrangling and speeding up your SEO workflow
By automating these steps, Sarah can reclaim over 3 hours of precious time weekly and uncover detailed insights she previously missed. This means better content decisions and faster SEO wins.
3. Prerequisites ⚙️
- Umami Account with API access to your website’s analytics data
- n8n Account (cloud or self-hosted – check out Hostinger for n8n self-hosting)
- Openrouter AI Account for SEO analysis (API key for authorization header)
- Baserow Account to save SEO analysis outputs (API token and table configured)
- Generic HTTP Header Authentication Credentials set up in n8n for secure API calls (Umami and Openrouter)
4. Step-by-Step Guide to Build This Workflow ✏️
Step 1: Add a Manual Trigger Node for Testing
Click the plus icon → select Manual Trigger node → position it on the canvas.
This lets you run the workflow manually for testing before scheduling.
After adding, click the node and then “Execute Node” to start the test run.
Common mistake: Forgetting to enable the node before testing.
Step 2: Set Up a Scheduled Trigger
Click plus → add a Schedule Trigger node → configure it to trigger every Thursday (interval: weeks, trigger day: 4, matching the exported workflow JSON).
This automates regular weekly runs.
Check the “Rule” settings under parameters. Expected outcome: the workflow triggers automatically once a week.
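For reference, the trigger’s parameters in an exported workflow JSON look roughly like this (an illustrative sketch; exact field names can vary between n8n versions, and the hour shown is arbitrary):
{
  "rule": {
    "interval": [
      {
        "field": "weeks",
        "weeksInterval": 1,
        "triggerAtDay": [4],
        "triggerAtHour": 9,
        "triggerAtMinute": 0
      }
    ]
  }
}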
Common mistake: Incorrect day of week or interval settings can cause no runs.
Step 3: Configure HTTP Request Node to Fetch Summary Umami Stats
Add HTTP Request node → set Method to “GET” → Copy the URL exactly including templated timestamps:
https://umami.mydomain.com/api/websites/{websiteID}/stats?startAt={{ DateTime.now().minus({ days: 7 }).toMillis() }}&endAt={{ DateTime.now().toMillis() }}&unit=hour&timezone=Asia%2FHong_Kong
Replace umami.mydomain.com and {websiteID} with your actual domain and site ID.
Under Authentication, choose HTTP Header Auth and select your credential containing your API key in the header.
Execute the node to verify it returns JSON with pageviews, visitors, visits, bounces, and total time metrics.
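A successful response should resemble this shape (the numbers here are made up for illustration):
{
  "pageviews": { "value": 512, "prev": 430 },
  "visitors": { "value": 301, "prev": 275 },
  "visits": { "value": 340, "prev": 298 },
  "bounces": { "value": 120, "prev": 131 },
  "totaltime": { "value": 8450, "prev": 7900 }
}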
Common mistake: Using wrong API endpoint or missing credentials.
Step 4: Add Code Node to Parse Summary Data
Add a Code node → set language to JavaScript → paste this code to extract key values and encode as URL string for AI consumption:
function transformToUrlString(items) {
  const data = items[0].json;
  if (!data) return encodeURIComponent(JSON.stringify([]));

  try {
    // Keep only the five headline metrics, coercing values to integers
    const simplified = {
      pageviews: {
        value: parseInt(data.pageviews.value) || 0,
        prev: parseInt(data.pageviews.prev) || 0
      },
      visitors: {
        value: parseInt(data.visitors.value) || 0,
        prev: parseInt(data.visitors.prev) || 0
      },
      visits: {
        value: parseInt(data.visits.value) || 0,
        prev: parseInt(data.visits.prev) || 0
      },
      bounces: {
        value: parseInt(data.bounces.value) || 0,
        prev: parseInt(data.bounces.prev) || 0
      },
      totaltime: {
        value: parseInt(data.totaltime.value) || 0,
        prev: parseInt(data.totaltime.prev) || 0
      }
    };
    // URL-encode so the JSON can be embedded safely in the AI prompt
    return encodeURIComponent(JSON.stringify(simplified));
  } catch (error) {
    throw new Error('Invalid data structure');
  }
}

const items = $input.all();
const result = transformToUrlString(items);
// n8n Code nodes must return an array of items
return [{ json: { urlString: result } }];
This code grabs the main metrics, handles missing data gracefully, and prepares a URL-safe JSON string to send to AI.
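If you want to sanity-check the encoding before wiring up the AI node, a throwaway Code node can round-trip it (a quick test sketch, not part of the final workflow):
// Decode the urlString produced by the parse node and inspect the result
const decoded = JSON.parse(decodeURIComponent($json.urlString));
// Expect an object with pageviews, visitors, visits, bounces, and totaltime keys
return [{ json: decoded }];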
Common mistake: Modifying code without understanding the data structure.
Step 5: Send Summary Data to AI for SEO Analysis
Add another HTTP Request node → Method: POST → URL:
https://openrouter.ai/api/v1/chat/completions
Set body as JSON with this example prompt:
{
"model": "meta-llama/llama-3.1-70b-instruct:free",
"messages": [
{
"role": "user",
"content": "You are an SEO expert. Here is data from Umami analytics of Pennibnotes.com. Where X is URL and Y is number of visitors. Give me a table summary of this data in markdown format:{{ $('Parse Umami data').item.json.urlString }}."
}
]
}
Use HTTP Header Auth credential with your Openrouter API Key.
Execute the node to check that you get back a markdown-formatted table summarizing the stats.
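Openrouter follows the OpenAI chat-completions response schema, so the generated markdown sits at choices[0].message.content. If you prefer to pull it onto its own field with a small Code node, a sketch looks like this (the field name summary is an arbitrary choice):
// Extract the assistant's markdown reply from the chat-completions response
const aiText = $json.choices?.[0]?.message?.content ?? '';
return [{ json: { summary: aiText } }];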
Common mistake: Missing or invalid API key causes authorization errors.
Step 6: Fetch Detailed Page View Data for This Week
Add HTTP Request node → GET method → URL:
https://umami.mydomain.com/api/websites/{websiteID}/metrics?startAt={{Date.now() - (7 * 24 * 60 * 60 * 1000)}}&endAt={{Date.now()}}&type=url&tz=America/Los_Angeles
Replace domain and websiteID accordingly.
Authenticate with the same HTTP Header Auth credential as before.
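A successful call returns an array of URL/visitor pairs — this is what the AI prompts later refer to as X and Y (values illustrative):
[
  { "x": "/", "y": 120 },
  { "x": "/blog/my-first-post", "y": 45 },
  { "x": "/about", "y": 12 }
]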
Common mistake: Using wrong timezone or date math results in empty data.
Step 7: Parse This Week’s Page Data with Code Node
Create a Code node → JavaScript → paste:
// Encode all returned items as a URL-safe JSON string for the AI prompt
const data = $input.all();
const encodedData = encodeURIComponent(JSON.stringify(data));
// n8n Code nodes must return an array of items
return [{ json: { thisWeek: encodedData } }];
Converts the detailed API response into a URL-safe encoded JSON string for AI.
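Note that $input.all() returns n8n’s full item wrappers (including the json and pairedItem keys), which bloats the prompt with metadata. A leaner variant that keeps only the page data looks like this (an optional substitution, assuming the x/y response shape shown in Step 6):
// Strip n8n wrappers so only the { x, y } page objects are encoded
const pages = $input.all().map(item => item.json);
return [{ json: { thisWeek: encodeURIComponent(JSON.stringify(pages)) } }];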
Step 8: Fetch Last Week Page View Data
Add HTTP Request node with GET method → URL:
https://umami.mydomain.com/api/websites/{websiteID}/metrics?startAt={{Date.now() - (14 * 24 * 60 * 60 * 1000)}}&endAt={{Date.now() - (7 * 24 * 60 * 60 * 1000)}}&type=url&tz=America/Los_Angeles
Authentication as above.
Step 9: Parse Last Week Data with Code Node
Add Code node with JavaScript:
// Same as Step 7, but stored under "lastweek" for the comparison prompt
const data = $input.all();
const encodedData = encodeURIComponent(JSON.stringify(data));
return [{ json: { lastweek: encodedData } }];
Step 10: Send This Week vs Last Week Data to AI
Add HTTP Request node → POST → URL:
https://openrouter.ai/api/v1/chat/completions
Use JSON body with this prompt:
{
"model": "meta-llama/llama-3.1-70b-instruct:free",
"messages": [
{
"role": "user",
"content": "You are an SEO expert. Here is data from Umami analytics of Pennibnotes.com. Where X is URL and Y is number of visitors. Compare the data from this week to last week. Present the data in a table using markdown and offer 5 improvement suggestions. This week:{{ $('Parse Umami data1').first().json.thisWeek }} Lastweek:{{ $json.lastweek }}"
}
]
}
Use the same Openrouter API Header Auth credentials.
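If your site has many pages, the comparison prompt can exceed the model’s context window. One hedge is to keep only the top pages by visitors before encoding, for example inside the parse nodes from Steps 7 and 9 (a sketch; the cutoff of 25 is arbitrary):
// Keep only the most-visited pages to stay within the model's context limits
const TOP_N = 25; // arbitrary cutoff
const pages = $input.all()
  .map(item => item.json)
  .sort((a, b) => (b.y ?? 0) - (a.y ?? 0)) // most-visited first
  .slice(0, TOP_N);
return [{ json: { thisWeek: encodeURIComponent(JSON.stringify(pages)) } }];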
Step 11: Save AI Analysis to Baserow Database
Add the Baserow node → set the operation to “Create” a row in your configured table → map the fields (example expressions follow the list):
- Date: the current date, formatted as yyyy-MM-dd
- Summary: the AI-generated summary from the first AI node
- Top pages: the week-over-week comparison from the second AI node
- Blog name: your blog’s name (plain text)
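Example expressions for the mapping (the node names “Summary AI” and “Compare AI” are placeholders — use whatever your AI nodes are actually called):
Date: {{ $now.toFormat('yyyy-MM-dd') }}
Summary: {{ $('Summary AI').first().json.choices[0].message.content }}
Top pages: {{ $('Compare AI').first().json.choices[0].message.content }}
Blog name: Pennibnotes.com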
Test execution to ensure data saves correctly.
5. Customizations ✏️
- Change Schedule Frequency: In the Schedule Trigger node, adjust the “interval” and “triggerAtDay” to run daily or monthly to fit your analysis needs.
- Use Different Timezones: Modify the URLs in HTTP request nodes to set your preferred timezone instead of Asia/Hong_Kong or America/Los_Angeles.
- Switch AI Model: Update the AI HTTP Request nodes’ JSON body to use different models from Openrouter AI for higher accuracy or faster response.
- Expand Baserow Fields: Add fields like “Bounce Rate”, “Average Session Duration” to your Baserow table and update mapping accordingly.
- Add Slack Notification: Insert a Slack node after the Baserow save to notify you when new insights are available.
6. Troubleshooting 🔧
- Problem: “401 Unauthorized” from HTTP Request nodes
Cause: Invalid API key or missing header in authentication.
Solution: Go to Credentials → re-add the correct API key and ensure the “Bearer ” prefix is included.
- Problem: AI nodes return empty or garbled responses
Cause: Input data format is incorrect or the payload is too large.
Solution: Verify the encoding in the Code nodes; reduce the date range or page count if needed.
- Problem: Baserow save fails
Cause: Incorrect field IDs or the table isn’t set up.
Solution: Check the Baserow table schema, confirm the field IDs match, and test the connection.
7. Pre-Production Checklist ✅
- Test manual trigger and confirm API data returns expected JSON structure
- Validate that Code nodes output correctly encoded JSON strings
- Check that the AI prompts in the HTTP Request bodies reference the correct node names and fields
- Test Baserow node saving to confirm data writes properly
- Verify schedule trigger runs at expected time
8. Deployment Guide
Once your workflow tests successfully, activate it by toggling the Active switch in n8n.
Monitor regular runs via n8n dashboard to catch errors early.
Consider adding logs or Slack alerts for failures to enhance monitoring.
This workflow is light on resources and suitable for small to medium setups.
9. FAQs
- Q: Can I use Google Analytics instead of Umami?
A: This workflow is specific to the Umami API format; adapting it for Google Analytics requires rewriting the HTTP Request nodes and the parsing Code nodes.
- Q: Does this consume Openrouter API credits?
A: Yes, each AI request consumes credits (unless you stick to a free model tier like the one used here). Monitor usage to avoid unexpected costs.
- Q: Is my data secure?
A: API keys are stored securely in n8n credentials. Use HTTPS endpoints and follow best practices.
- Q: Can this handle multiple websites?
A: Yes — clone and adjust the workflow nodes for each site, or enhance the workflow with dynamic parameters.
10. Conclusion
By following this detailed n8n workflow tutorial, you’ve automated fetching and AI analysis of your Umami website analytics, generating clear SEO insights and saving them for tracking. You regained several hours per week previously spent on manual reporting, avoided errors, and unlocked actionable suggestions to grow your blog audience.
Next steps? Consider adding Slack alerts for instant insights, integrate Google Sheets for broader data visualization, or set up monthly trend reports. Keep automating to focus more on creating great content!