1. The Daily Dilemma of Workflow Backup for Sam
Meet Sam, an automation engineer responsible for maintaining a growing library of n8n workflows that power his company’s operational automations. Every morning, Sam painstakingly exports the workflows from n8n and manually commits them to his Bitbucket repository. This tedious process takes around 30 minutes and carries the risk of missing recent changes or overwriting updates. Occasionally he hits Bitbucket’s API rate limits, causing delays and failed uploads that cost precious time and risk data loss.
With nearly 50 active workflows, the manual backup approach is error-prone, wastes time, and introduces stress. Sam needs an automated, reliable process that securely backs up all his n8n workflows daily without hitting rate limits or causing failures.
2. What This Automation Does ⚙️
This n8n workflow automates the entire process of backing up n8n workflows to a Bitbucket repository with intelligent rate limit handling. Here’s what happens when it runs automatically every day at 2 AM:
- Fetches all n8n workflows from your environment without manual export.
- Iterates through each workflow individually to check the current version stored in Bitbucket.
- Compares the local workflow JSON to the repository version to detect new or changed workflows.
- Pushes new or updated workflow JSON files to Bitbucket via its API with an appropriate commit message.
- Calculates and applies dynamic wait times between API requests to avoid hitting Bitbucket’s rate limits.
- Runs unattended every day, freeing Sam from manual effort and reducing backup errors.
This workflow can save Sam approximately 30 minutes daily and prevent backup failures caused by API rate limiting.
3. Prerequisites ⚙️
- n8n account with API access configured (your automation environment). 🔑
- Bitbucket account with a repository ready for workflow backups. 🔑
- Bitbucket HTTP Basic Auth credentials set up in n8n for API requests. 🔐
- Access to n8n Nodes:
- Schedule Trigger Node
- n8n API Node
- Split in Batches Node
- HTTP Request Node (for Bitbucket API)
- Code Node (for rate limit wait-time calculation)
- Wait Node
- Set Node (for the Bitbucket workspace and repository settings)
4. Step-by-Step Guide to Build This Workflow
Step 1: Set Up a Scheduled Trigger
Navigate to Nodes panel → Add Node → Schedule Trigger. Configure the node to run daily at 2 AM. This timing automates your backups during off-hours, minimizing interference with daily operations.
Expected Outcome: The workflow will trigger once every 24 hours at 2 AM automatically.
Common Mistake: Not setting the correct timezone may cause the trigger to fire at unexpected hours.
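If you prefer an explicit schedule, the Schedule Trigger also accepts a standard five-field cron expression in its custom mode; a 2 AM daily run would be:
0 2 * * *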
Step 2: Configure Workflow to Set Bitbucket Workspace and Repository Info
Next, add a Set Node to define your Bitbucket workspace and repository slug. In Set Bitbucket Workspace & Repository, create two string fields:
- WorkspaceSlug – e.g., your-workspace
- RepositorySlug – e.g., your-repository
Expected Outcome: The workflow now holds the repository data needed for API URLs.
Tip: Use your actual Bitbucket workspace and repository slugs precisely.
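For reference, the Set node’s output item carries the two fields downstream roughly like this (the slugs are placeholders for your own values):
{
  "WorkspaceSlug": "your-workspace",
  "RepositorySlug": "your-repository"
}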
Step 3: Retrieve All n8n Workflows via n8n API Node
Add the n8n Node configured to call the endpoint that fetches all workflows without filters. Use API credentials with sufficient permissions.
Example setup: Filters set to {} to retrieve everything.
Expected Outcome: The node outputs a list of all workflows with their JSON definitions.
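Under the hood this is equivalent to calling n8n’s public REST API. A minimal standalone sketch, assuming the API is enabled and N8N_BASE_URL and N8N_API_KEY are placeholders for your instance URL and API key:
// Fetch workflows from the n8n public API (first page shown;
// the API paginates via a nextCursor field in the response).
const res = await fetch(`${process.env.N8N_BASE_URL}/api/v1/workflows`, {
  headers: { 'X-N8N-API-KEY': process.env.N8N_API_KEY },
});
const { data: workflows } = await res.json();
console.log(`Fetched ${workflows.length} workflows`);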
Step 4: Split the Workflow List into Batches
To handle workflows individually, add a Split In Batches Node and connect it to the previous node. Leave the batch options at their defaults to process one workflow per cycle, which prevents overloading Bitbucket with bulk requests.
Expected Outcome: Each workflow will be passed one at a time downstream.
Step 5: Retrieve Existing Workflow from Bitbucket
Add an HTTP Request Node named Get Existing Workflow from Bitbucket configured with:
- Method: GET
- URL Template:
https://api.bitbucket.org/2.0/repositories/{{ $json.WorkspaceSlug }}/{{ $json.RepositorySlug }}/src/main/{{ $json.name.replace(/[^a-zA-Z0-9]/g, '-').toLowerCase() }}
- Authentication: HTTP Basic Auth using your Bitbucket credentials
- Settings: return the full response so downstream nodes can read the status code and rate limit headers
Expected Outcome: Fetches the existing workflow file or returns 404 if it doesn’t exist.
Common Mistake: Forgetting to sanitize the workflow name into a filesystem-safe filename can cause API errors.
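To see what the URL template’s sanitization produces, here is the same expression as plain JavaScript, applied to a hypothetical workflow name:
// A workflow named "Customer Sync (v2)" maps to its Bitbucket file path
const name = 'Customer Sync (v2)';
const fileName = name.replace(/[^a-zA-Z0-9]/g, '-').toLowerCase();
console.log(fileName); // => 'customer-sync--v2-'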
Step 6: Check If Workflow is New or Changed
Add an If Node named New or Changed? with conditions:
- Check whether the HTTP Request returned a 404 status, meaning no file exists yet.
- OR compare the current workflow JSON with the Bitbucket file content; if they differ, the workflow needs to be uploaded.
Expected Outcome: Only new or modified workflows proceed to the upload step.
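Expressed as plain logic, the branch decision looks roughly like this; statusCode and body stand for fields of the previous node’s full response, and workflow for the current workflow JSON (the names are illustrative):
// Upload when the file is missing (404) or its content has changed
const isNew = statusCode === 404;
const isChanged = !isNew && JSON.stringify(workflow, null, 2) !== body;
const shouldUpload = isNew || isChanged;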
Step 7: Upload Workflow to Bitbucket
Add another HTTP Request Node named Upload Workflow to Bitbucket, configured to POST to Bitbucket’s file commit endpoint:
- Method: POST
- URL:
https://api.bitbucket.org/2.0/repositories/{{ $json.WorkspaceSlug }}/{{ $json.RepositorySlug }}/src
- Body Parameters:
– message – a commit message built from the workflow name and a timestamp.
– The sanitized workflow file name as the field name, with the workflow JSON as the file content.
- Headers: Content-Type set to application/x-www-form-urlencoded
- Authentication: HTTP Basic Auth using your Bitbucket credentials
Expected Outcome: The workflow commits the updated or new workflow JSON file to your Bitbucket repository.
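For clarity, here is a rough standalone equivalent of what this node sends, runnable in Node.js 18+; WORKSPACE, REPO, USER, and APP_PASSWORD are placeholders, and workflowJson stands for the current workflow’s JSON:
// Bitbucket's src endpoint takes form-encoded fields: each field named
// like a file path creates or updates that file with the field's value.
const body = new URLSearchParams({
  message: `Backup: customer-sync (${new Date().toISOString()})`,
  branch: 'main',
  'customer-sync': JSON.stringify(workflowJson, null, 2), // file path = field name
});
await fetch(`https://api.bitbucket.org/2.0/repositories/${WORKSPACE}/${REPO}/src`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded',
    Authorization: 'Basic ' + Buffer.from(`${USER}:${APP_PASSWORD}`).toString('base64'),
  },
  body,
});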
Step 8: Dynamically Calculate Wait Time to Avoid Rate Limits
Add a Code Node named Calculate Wait Time with this JavaScript code snippet:
// Read the rate limit headers returned by the previous HTTP Request node.
// With the full-response setting enabled, headers live under item.json.headers.
const items = $input.all();
if (items.length === 0 || !items[0].json || !items[0].json.headers) {
  // No headers available, so fall back to the default wait time
  return [{ json: { waitTime: 1 } }];
}

const headers = items[0].json.headers;
let waitTime = 1; // Default: 1 second between requests

// Parse Bitbucket's rate limit headers (NaN when absent)
const remaining = parseInt(headers['x-ratelimit-remaining'], 10);
const reset = parseInt(headers['x-ratelimit-reset'], 10);

// Only adjust the wait time when both headers are present and numeric
if (!Number.isNaN(remaining) && !Number.isNaN(reset)) {
  const now = Math.floor(Date.now() / 1000);
  if (remaining <= 0) {
    // Quota exhausted: wait until the limit resets
    waitTime = Math.max(reset - now, 1);
  } else if (remaining < 100) {
    // Running low: spread the remaining calls over the reset window
    const timeUntilReset = Math.max(reset - now, 0);
    waitTime = Math.ceil((timeUntilReset / remaining) * 1.1); // 10% buffer
  } else if (remaining < 500) {
    waitTime = 2;
  }
}

// Cap the maximum wait at 30 seconds
waitTime = Math.min(waitTime, 30);
return [{ json: { waitTime } }];
Explanation: This code reads Bitbucket’s API rate limit headers from the last API response and calculates a safe wait time, spreading requests evenly to avoid throttling.
Step 9: Wait Node to Pause Before Next API Call
Use a Wait Node configured to pause for the wait time calculated in the previous step: set it to resume after a time interval, choose seconds as the unit, and supply the calculated value with an expression such as {{ $json.waitTime }}. This spaces out Bitbucket API calls and prevents hitting rate limits.
Step 10: Loop Back to Process Next Workflow
The Wait Node’s output loops back to the Split In Batches Node to process the next workflow, repeating the flow until all workflows are backed up.
5. Customizations ✏️
- Adjust Daily Trigger Time: In the Schedule Trigger Node, modify the hour to your preferred backup time.
- Change Repository Branch or Folder: In the HTTP Request nodes’ URL templates, change main to your target branch or add a folder path.
- Add Commit Author Info: Extend the commit with author metadata by adding extra parameters to the upload HTTP request.
- Backup Only Specific Workflows: Add filter conditions in the Get All Workflows Node to backup selected workflows only.
- Increase Rate Limit Awareness: Tune the thresholds and multipliers in the Calculate Wait Time Code Node to be more or less aggressive.
6. Troubleshooting 🔧
- Problem: "HTTP request returned 404 for existing workflow retrieval"
Cause: Workflow filename sanitization does not match Bitbucket naming conventions.
Solution: Verify the replacement of invalid characters in filenames in HTTP request URL template. - Problem: "Rate limit exceeded errors during upload"
Cause: Insufficient wait time between API requests.
Solution: Adjust the wait time calculation code to increase delays, or add additional logging to monitor remaining calls. - Problem: "Authentication failed on Bitbucket API calls"
Cause: Incorrect or expired credentials.
Solution: Re-enter or renew the Http Basic Auth credentials under n8n Credentials manager.
7. Pre-Production Checklist ✅
- Ensure Bitbucket workspace and repository slugs are correctly set.
- Test Bitbucket HTTP requests independently with Postman or a similar tool (see the sketch after this list).
- Run the workflow manually before enabling the schedule.
- Check logs in n8n for API responses to confirm rate limit headers are received.
- Backup existing workflows manually as a fallback before first run.
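For the Postman-style check in the list above, a quick Node.js probe of the read endpoint looks like this, again with WORKSPACE, REPO, USER, and APP_PASSWORD as placeholders:
// Expect 200 with file content, or 404 if this workflow was never backed up
const res = await fetch(
  `https://api.bitbucket.org/2.0/repositories/${WORKSPACE}/${REPO}/src/main/customer-sync`,
  { headers: { Authorization: 'Basic ' + Buffer.from(`${USER}:${APP_PASSWORD}`).toString('base64') } },
);
console.log(res.status);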
8. Deployment Guide
Activate the workflow by enabling it in your n8n environment. Ensure that your n8n instance runs consistently (self-hosted options like Hostinger provide reliable uptime).
Monitor executions initially to verify backups complete successfully without errors or rate limiting.
You can view commit history in Bitbucket to confirm that workflows are backed up each day with unique timestamped messages.
9. FAQs
- Can I use GitHub instead of Bitbucket? Yes, but you will need to adjust the HTTP Request nodes to use GitHub’s API endpoints and authentication.
- Does this workflow count against Bitbucket’s API rate limits? Yes, but it dynamically spaces its requests to respect the limits and avoid hitting caps.
- Is my workflow data secure? Your data is transmitted securely via HTTPS and protected by Bitbucket authentication credentials.
10. Conclusion
By following this detailed guide, you have automated the otherwise manual, error-prone task of backing up your n8n workflows to Bitbucket. This workflow not only saves you around 30 minutes daily but also intelligently manages API rate limits, avoiding failed backups.
Next, consider enhancing this backup automation by adding notifications via Slack after backups complete, or create similar workflows to back up other critical automation configurations.
With this automated backup system in place, Sam can rest easy knowing his workflows are safely stored and version-controlled every day without manual intervention.