Opening Problem Statement
Meet Alejandro, a digital automation specialist who uses n8n to orchestrate multiple workflows daily. His workflows change constantly, generating critical data that needs to be backed up reliably every night. However, Alejandro found himself wasting over an hour each day manually downloading, organizing, and deleting outdated backups from Google Drive. Mistakes crept in: older backups cluttered his storage, leading to confusion about the latest versions and even accidental data loss. With no automated system to manage this backup lifecycle, Alejandro faced growing risk and inefficiency.
This problem is common for n8n users who rely on backup storage on cloud services like Google Drive but struggle to maintain an organized, updated backup repository automatically. Alejandro’s pain points reflected wasted time, potential loss of key workflow versions, and increased maintenance overhead. The frustration was real: how can one automate creating, organizing, and purging these backups without complex manual effort?
What This Automation Does
This n8n workflow is specially crafted to automate every essential step in managing backups for your n8n workflows on Google Drive. When triggered on a schedule, it accomplishes the following:
- Creates two backup folders on Google Drive, named `n8n_backups` and `n8n_old`, if they do not already exist, ensuring standard storage locations.
- Backs up all current workflows as JSON files in the `n8n_backups` folder, with filenames that include each workflow's active status and a timestamp for easy identification.
- Moves previous backups from the `n8n_backups` folder to the `n8n_old` folder, organizing backup history systematically.
- Purges backups older than 30 days from the `n8n_old` folder automatically to save storage space and maintain relevance.
- Implements batching and error handling using split nodes and continue-on-error options to robustly handle large backup sets without failure.
- Saves you hours each week by eliminating manual backup management and reducing the risk of errors or lost data.
Prerequisites ⚙️
- Google Drive account and Google Drive OAuth2 credentials configured in n8n (used by Google Drive nodes).
- n8n instance with access to the n8n API key for fetching workflow data (via the n8n node).
- n8n version 1.67.1 or higher (for compatibility with nodes and features).
- Two folders in Google Drive named `n8n_backups` and `n8n_old` (created automatically if missing).
- Basic working knowledge of n8n flows and scheduling triggers.
- Optional: Self-hosted n8n environment for tighter control (e.g., Hostinger at buldrr.com/hostinger).
Step-by-Step Guide
Step 1: Add and Configure the Schedule Trigger Node
In your n8n workflow, click Add Node → Search for Schedule Trigger → Select it. Configure the trigger to run nightly or on your preferred backup interval by setting the interval rules (e.g., every 24 hours).
You should see a green circle next to the node indicating it’s active.
Expected Outcome: The workflow will start running automatically on the defined schedule.
Common mistake: Forgetting to set the interval to daily to ensure nightly backups.
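If you prefer an exact run time rather than a plain interval, the Schedule Trigger also accepts a cron expression via its custom interval option. The time shown here is only a suggestion; pick whatever slot suits your workload:

```
0 2 * * *    # fire once a day at 02:00, a typical nightly backup slot
```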
Step 2: Get Current Google Drive Folders
Add the Google Drive node, set to resource: fileFolder and operation: list, filtered to return all folders in your drive.
This node fetches existing folders so the workflow can check if n8n_backups and n8n_old already exist.
Use your Google Drive OAuth2 credentials here.
Expected Outcome: You get a list of folders for the next logic step.
Common mistake: Not using the correct credentials or filtering for folders.
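To make the next step concrete, here is a sketch of the item shape this list node hands to the Code node. The field names follow the Google Drive v3 files resource; the IDs and the second folder name are placeholders:

```javascript
// Illustrative items as returned by the Google Drive list node (placeholder IDs).
const items = [
  { json: { id: "folder-id-1", name: "n8n_backups", mimeType: "application/vnd.google-apps.folder" } },
  { json: { id: "folder-id-2", name: "Invoices", mimeType: "application/vnd.google-apps.folder" } },
];

// The Code node in the next step only needs the names:
const folderNames = items.map(item => item.json.name);
// folderNames is now ["n8n_backups", "Invoices"]
```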
Step 3: Use the Code Node to Detect Missing Folders
Add a Code node with this JavaScript snippet to check for the existence of the two backup folders:
```javascript
// Collect all incoming items (the folder list from the previous node)
const items = $input.all();
const requiredNames = ["n8n_old", "n8n_backups"];

// Extract folder names from the items
const folderNames = items.map(item => item.json.name);

// Determine which required folders are missing
const missingNames = requiredNames.filter(name => !folderNames.includes(name));

if (missingNames.length === 0) {
  return [{ json: { message: "ok" } }];
} else {
  return [{ json: { message: `Missing folders: ${missingNames.join(', ')}` } }];
}
```
Explanation: This script receives the folder list and determines which backup folders don’t exist yet, signaling which folders need to be created.
Expected Outcome: The response indicates which folders to create.
Common Mistake: Syntax errors or incorrect property names.
Step 4: Use IF Nodes to Branch Folder Creation Logic
Add two IF nodes that check if the previous message from the Code node mentions missing n8n_old or n8n_backups. Connect the Code node’s output to both.
Expected Outcome: Conditional branching allowing the next nodes to create missing folders only.
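Inside each IF node, the condition can be a simple "contains" check on the Code node's message. A minimal sketch of that logic, assuming the message lists the missing folder names:

```javascript
// Returns true when the status message mentions the folder this branch creates.
function needsFolder(message, folderName) {
  return message.includes(folderName);
}

// One IF node checks for "n8n_old", the other for "n8n_backups":
needsFolder("Missing folders: n8n_old", "n8n_old"); // true
needsFolder("ok", "n8n_backups");                   // false
```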
Step 5: Create Missing Backup Folders
For each IF node’s true output, add a Google Drive node configured to create a folder:
Name it n8n_old or n8n_backups, resource: folder, and parent folder as root.
Expected Outcome: The folders appear in Google Drive if they didn’t exist before.
Common Mistake: Forgetting to set folder resource or name exactly.
Step 6: Merge Confirmations and Refresh Folder List
Use a Merge node (mode: Combine) to unify outputs after folder creation to continue the workflow.
Add another Google Drive folder list node to refresh the folder list after creation.
Expected Outcome: Updated folder list including any newly created folders.
Step 7: Filter to Use Only the n8n_backups Folder
Add a Filter node to ignore any folders other than n8n_backups. This ensures only your backup folder is processed in subsequent steps.
Expected Outcome: Only the correct folder is used to get current backup files.
Step 8: List Current Backups in n8n_backups
Add another Google Drive node filtered to list files in the n8n_backups folder. This node retrieves all current backups for processing.
Expected Outcome: You get a list of all backup JSON files.
Step 9: Split Backup File List to Batches
Use the Item Lists node and Split In Batches node to paginate through all backup files for safe processing. Batch size is set to 1 for controlled iteration.
Expected Outcome: Files are processed one at a time, avoiding rate limits or errors.
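Conceptually, Split In Batches with a batch size of 1 walks the file list one item at a time. A rough plain-JavaScript equivalent of what the node does:

```javascript
// Split an item list into fixed-size batches (size 1 mirrors the node's setting here).
function splitInBatches(items, batchSize = 1) {
  const batches = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

splitInBatches(["a.json", "b.json", "c.json"]); // → [["a.json"], ["b.json"], ["c.json"]]
```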
Step 10: Move Old Backups to n8n_old Folder
Add a Google Drive node with operation move configured to move each backup file from n8n_backups to n8n_old.
Connect the batch output here.
Expected Outcome: Previous backups are archived in the n8n_old folder.
Common Mistake: Incorrect folder ID mapping causing files to move to wrong folder.
Step 11: Fetch Current Workflows from n8n API Node
Add the n8n node to list current workflows, set to fetch JSON data about each workflow with details.
Expected Outcome: You receive metadata about all active or inactive workflows.
Step 12: Process and Upload Each Workflow Backup
Chain the Move Binary Data node to convert JSON objects into binary data files for upload.
Use Split In Batches to iterate over each workflow backup.
Finally, use a Google Drive Upload node to save the workflow backup files into the n8n_backups folder with distinctive file names including active status and timestamps.
Expected Outcome: Backup files created and uploaded to your drive reliably.
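The exact filename format is up to you. One sketch that combines the workflow name, its active status, and an ISO timestamp (sanitized so it is safe as a filename) might look like this; the `name` and `active` fields match what the n8n node returns per workflow:

```javascript
// Build a backup filename such as "My Workflow_active_2024-01-15T02-00-00-000Z.json".
function backupFileName(workflow, now = new Date()) {
  const status = workflow.active ? "active" : "inactive";
  // Replace ":" and "." so the timestamp is filename-safe.
  const stamp = now.toISOString().replace(/[:.]/g, "-");
  return `${workflow.name}_${status}_${stamp}.json`;
}
```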
Step 13: Purge Backups Older Than 30 Days Automatically
Add a second Schedule Trigger node to run every 30 days and trigger another sequence.
This sequence lists backups in the n8n_old folder, splits them in batches, and deletes any files older than the purging threshold.
Expected Outcome: Automated cleanup of old backups keeps your storage healthy.
Common Mistake: Forgetting to enable or connect purge nodes in your workflow.
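The deletion filter boils down to an age comparison. A sketch of the 30-day check, assuming each Drive file item carries a `modifiedTime` field as in the Drive v3 API:

```javascript
// True when the file's last modification is older than the retention window.
function isOlderThan(file, days, now = Date.now()) {
  const ageMs = now - new Date(file.modifiedTime).getTime();
  return ageMs > days * 24 * 60 * 60 * 1000;
}

// Example: a file modified 45 days ago fails a 30-day retention check.
const cutoffNow = Date.parse("2024-03-01T00:00:00Z");
isOlderThan({ modifiedTime: "2024-01-16T00:00:00Z" }, 30, cutoffNow); // true
```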
Customizations ✏️
- Change Backup Frequency: In the `Schedule Trigger` node, adjust the interval or specific run time to fit your workflow update cycles.
- Modify Backup Retention Period: Adjust the purge schedule trigger and deletion logic to retain backups for longer or shorter than 30 days by changing the interval and filter conditions.
- Rename Backup Files: Customize the file naming format in the `Move Binary Data` node by modifying its `fileName` expression to include other metadata, such as the workflow ID.
- Add Email Notifications: Add an email node after the backup or purge sequence to notify you of completion or errors for comprehensive monitoring.
- Include Additional Backup Destinations: Extend the flow by adding nodes for other cloud services like Dropbox or AWS S3 for multi-location backup.
Troubleshooting 🔧
Problem: “File not found” during move operation
Cause: The file ID passed to the Google Drive move node is incorrect or outdated.
Solution: Verify that your batch iterations correctly pass the file IDs retrieved from the GET CURRENT BACKUPS1 node. Use the execution data to inspect the IDs.
Problem: Folders are not created
Cause: The Code node message does not properly identify missing folders or Google Drive create folder nodes lack proper configuration.
Solution: Check the Code node script for accuracy. Confirm folder names are exactly n8n_old and n8n_backups. Verify Google Drive credentials and folder creation parameters.
Problem: Upload fails due to data format
Cause: The workflow data is not properly converted into binary for Google Drive upload.
Solution: Confirm the Move Binary Data node is correctly set to convert JSON data to binary and filename includes correct expression.
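Conceptually, that conversion step serializes the workflow JSON and attaches it as base64-encoded binary data, which is what the upload node expects. A simplified sketch, where the `binary.data` structure mirrors n8n's binary item format:

```javascript
// Serialize a workflow object into an n8n-style binary item for upload.
function toBinaryItem(workflowJson, fileName) {
  const data = Buffer.from(JSON.stringify(workflowJson, null, 2)).toString("base64");
  return {
    json: {},
    binary: {
      data: { data, fileName, mimeType: "application/json" },
    },
  };
}
```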
Pre-Production Checklist ✅
- Verify Google Drive credentials are correctly added and authorized.
- Ensure your n8n API key has permissions to list and access workflows.
- Confirm the Schedule Trigger nodes are configured with correct intervals and active.
- Test folder creation by temporarily deleting the `n8n_backups` or `n8n_old` folder and running the workflow manually.
- Run a test backup to confirm files upload with correct naming into Google Drive.
- Simulate the purge sequence by adjusting the purge interval and file age in Google Drive to test deletion.
Deployment Guide
Activate your n8n workflow by toggling it on once all nodes are configured and tested.
Monitor initial runs via n8n’s execution log to ensure backups and folder moves complete successfully.
Use the schedule triggers to maintain ongoing automation without manual intervention.
Set up alerts or logging if desired to track backup statuses for peace of mind.
This workflow requires minimal ongoing maintenance beyond periodic credentials renewal.
FAQs
Q: Can I use Dropbox instead of Google Drive for backups?
A: Yes, but you will need to replace the Google Drive nodes with equivalent Dropbox nodes and configure accordingly. This workflow’s logic remains similar but adapts to the different API.
Q: Will this workflow consume many API calls?
A: The workflow uses moderate Google Drive API usage depending on the number of workflows backed up and backup files to move. Batching helps mitigate hitting rate limits.
Q: Is my backup data secure in Google Drive?
A: Yes, your data is protected by Google Drive’s security protocols. Use secure OAuth2 credentials and maintain access control in Google Drive for privacy.
Conclusion
By following this detailed guide, you have automated the entire backup lifecycle for your n8n workflows leveraging Google Drive. You no longer need to manually organize, rename, or purge backup files—saving you significant time and reducing human error risks.
Specifically, Alejandro’s workflow now runs nightly to keep backups orderly, moving previous backups to an archive folder and purging anything older than 30 days, greatly optimizing storage management. You can protect your valuable workflow data in the same way.
Next steps could include adding notifications for backup success, expanding multi-cloud backup locations, or versioning backups with git integrations for even more control.
Get started now and take control of your n8n workflow backups effortlessly!