Automate n8n Workflow Backups to GitHub with Cron & API

Save time and avoid manual errors by automating n8n workflow backups directly to GitHub. This workflow uses scheduled cron triggers, GitHub API calls, and data merging to keep your workflow versions safe and up to date.
Workflow Identifier: 1390
Nodes in use: Cron, HTTP Request, Function, GitHub, Merge


1. Opening Problem Statement

Meet Sarah, an automation specialist who manages multiple complex workflows in n8n for her organization. Every day, she manually backs up critical workflow configurations to secure locations to prevent accidental data loss and to maintain version control. Despite her best efforts, this manual process is time-consuming and prone to mistakes; sometimes she even overwrites important updates without realizing it.

Sarah realized that manually maintaining backups costs her over an hour daily and poses risks of losing valuable workflow configurations. She needs a reliable, automated backup solution tailored specifically to her n8n environment and GitHub repositories to keep everything synchronized without lifting a finger.

2. What This Automation Does

This n8n workflow automates the entire backup process of your n8n workflows to GitHub. Here’s what happens when it runs:

  • At 23:59 daily, a Cron trigger activates the process, ensuring backups happen regularly without manual intervention.
  • It fetches all current workflows from your local n8n instance via HTTP API calls.
  • Each workflow’s detailed data is retrieved and compared with existing files in a specified GitHub repository.
  • New workflows get added as JSON files to GitHub, while existing files are updated only if the workflow data changed, preventing unnecessary commits.
  • The process smartly merges and removes duplicate workflows based on workflow name and updated timestamp to keep GitHub clean and organized.
  • The entire process reduces manual backup time by over an hour daily and minimizes risks of overwriting or losing workflow configurations.
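The decision logic in the bullets above can be sketched as a pure function. This is an illustration only, not the workflow's actual implementation; the field names (`name`, `updatedAt`) mirror what the n8n REST API returns for each workflow:

```javascript
// Sketch: decide what to do with each local workflow, given the
// backup files already in the GitHub repo. Illustrative helper,
// not part of the workflow itself.
function planBackup(localWorkflows, remoteBackups) {
  // Index remote backups by workflow name for quick lookup.
  const remoteByName = new Map(remoteBackups.map(b => [b.name, b]));
  return localWorkflows.map(wf => {
    const remote = remoteByName.get(wf.name);
    if (!remote) return { name: wf.name, action: 'create' };
    // Only commit when the workflow actually changed.
    if (remote.updatedAt !== wf.updatedAt) return { name: wf.name, action: 'update' };
    return { name: wf.name, action: 'skip' };
  });
}
```

For example, a workflow missing from the repo maps to `create`, while one with a newer `updatedAt` maps to `update`; everything else is skipped, which is what keeps commits minimal.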

3. Prerequisites ⚙️

  • n8n account and a running local instance with API enabled (http://localhost:5678/rest/workflows)
  • GitHub account with a repository dedicated to storing n8n workflow backups
  • GitHub API credentials for n8n — generate a personal access token with repo permissions
  • n8n nodes used: Cron, HTTP Request, GitHub, Function, and Merge
  • Basic Authentication enabled on n8n API for secure HTTP requests

4. Step-by-Step Guide

Step 1: Set up the Cron Trigger to Run Daily

In your n8n editor, click + Add Node → select Cron node. Configure it to trigger at 23:59 daily. This will kick off the backup process automatically every night.
You should see the node named “Daily at 23:59” with its schedule set.
Common pitfall: Forgetting to set the timezone properly can lead to backups running at unexpected hours.
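Conceptually, the schedule the Cron node evaluates is just "hour is 23 and minute is 59" in the instance's configured timezone, which is why a wrong timezone shifts the run. A minimal sketch of that predicate (illustrative, not n8n code):

```javascript
// Sketch: "daily at 23:59" as a predicate over a Date.
// The Date must already be in the n8n instance's timezone,
// which is exactly the pitfall mentioned above.
function isBackupTime(date) {
  return date.getHours() === 23 && date.getMinutes() === 59;
}
```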

Step 2: Fetch All Workflows from Local n8n Instance

Add an HTTP Request node named “Get workflows.” Navigate to node parameters:
– Method: GET (default)
– URL: http://localhost:5678/rest/workflows
– Authentication type: Basic Auth
– Provide your n8n instance credentials.
This node calls your local n8n REST API to fetch metadata of all workflows.
After execution, expect to see JSON data with workflow IDs and names.
Common mistake: Forgetting to enable the n8n REST API or to use the correct credentials.
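Under the hood, the Basic Auth option simply sends an `Authorization` header built from your username and password, base64-encoded. A sketch of that header construction (placeholder credentials, Node's `Buffer` used for encoding):

```javascript
// Sketch: the Authorization header that HTTP Basic Auth sends.
// Credentials here are placeholders, not real values.
function basicAuthHeader(user, password) {
  const token = Buffer.from(`${user}:${password}`).toString('base64');
  return `Basic ${token}`;
}
```

If authentication fails, comparing this header against what your reverse proxy or n8n logs expect is a quick way to rule out credential typos.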

Step 3: Transform the Workflows Array into Individual Items

Add a Function node called “transform.”
Paste this code to transform array data into separate items:

const newItems = [];
for (const item of items[0].json.data) {
  newItems.push({ json: item });
}
return newItems;

This breaks the workflows array into individual messages for easier handling downstream.
Expected: Individual workflow JSON objects ready for next API calls.
Common error: Modifying the original array incorrectly can cause empty outputs.
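To see what the transform does, here is the same logic wrapped in a named function and run against a mocked `items` array (the shape is assumed from the n8n REST response in Step 2):

```javascript
// Same transform as the Function node above, wrapped for illustration.
function transform(items) {
  const newItems = [];
  for (const item of items[0].json.data) {
    newItems.push({ json: item });
  }
  return newItems;
}

// Mocked input: one item whose json.data holds two workflows.
const out = transform([{ json: { data: [{ id: 1, name: 'A' }, { id: 2, name: 'B' }] } }]);
// out is now two separate items, one per workflow.
```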

Step 4: Retrieve Complete Data for Each Workflow

Add another HTTP Request node named “Get workflow data.” Set URL as:
=http://localhost:5678/rest/workflows/{{$json["id"]}}
Use the same Basic Auth credentials.
This node fetches detailed configurations of each workflow.
Expected output: Complete JSON data of each workflow.
Be aware of rate limits or API downtime if you have many workflows.
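The `=http://...{{$json["id"]}}` expression resolves to a different URL per incoming item. A sketch of the equivalent string construction (illustrative helper, not n8n expression syntax):

```javascript
// Sketch: what the URL expression resolves to for each item.
function workflowUrl(item) {
  return `http://localhost:5678/rest/workflows/${item.json.id}`;
}
```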

Step 5: Fetch Existing Backup Files from GitHub

Add a GitHub node named “Get Files.” Configure for “Get” operation on the root directory of your GitHub repo:
– Owner: YOUR_USERNAME
– Repository: REPO
– File Path: /
– Authentication: GitHub API credentials
Output: File listing JSON from your repository to check for existing backups.
Watch out: Make sure the repo and access rights are correct to avoid unauthorized errors.

Step 6: Transform GitHub File Data

Add a Function node “Transform.” Use the code:

return items[0].json.map(item => {
  return {
    json: item
  };
});

This prepares file data for merging and comparison.
Outcome: Array of GitHub files as individual JSON items.
Common issue: Improper data mapping can cause node crashes.

Step 7: Download Raw Content of Each GitHub Backup File

Add HTTP Request node “Download Raw Content.”
Set it to GET the URL from each file’s download_url property:
={{$json["download_url"]}}
Use header authentication with GitHub token.
Response format: String
Data returned: Raw JSON backup content.
Watch for rate limits and ensure the token has read permissions.

Step 8: Merge Local Workflows with GitHub Data to Identify Updates

Add a Merge node called “Merge.” Configure its mode as “removeKeyMatches” and set data.name as the key property on both inputs.
This compares local workflow names with GitHub backup names to isolate new or updated workflows.
Expected to branch workflows needing creation or update.
Note: This is critical to avoid redundant commits.
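Roughly, the “removeKeyMatches” mode keeps items from input 1 whose key value does not appear in input 2. A sketch of that behavior (my approximation of the mode, using the dotted `data.name` key configured above):

```javascript
// Sketch of Merge's "removeKeyMatches" mode: keep input-1 items
// whose key value is absent from input 2. Approximation for
// illustration, not n8n's actual implementation.
function removeKeyMatches(input1, input2, keyPath) {
  // Resolve a dotted path like "data.name" against item.json.
  const get = (obj, path) => path.split('.').reduce((o, k) => (o ? o[k] : undefined), obj);
  const seen = new Set(input2.map(item => get(item.json, keyPath)));
  return input1.filter(item => !seen.has(get(item.json, keyPath)));
}
```

With local workflows on input 1 and GitHub backups on input 2, what survives is exactly the set of workflows that have no backup file yet.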

Step 9: Create or Update Workflow Backup Files on GitHub

Add a GitHub node “Create file” to add new workflows:
– Operation: “create”
– File Path: Use {{$json.data.name}}.json
– File Content: the workflow JSON from the “Merge” node output, stringified
– Commit message including timestamp and workflow name
This creates new JSON backup files for workflows not yet on GitHub.
Validation: Confirm files created without errors.
Common mistake: Wrong JSON stringification causing invalid files.
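The path, content, and commit message fields can be sketched as a single payload builder. This is an illustrative helper with an assumed item shape (`$json.data` holding the full workflow object, as in the earlier steps):

```javascript
// Sketch: file path, content, and commit message for the Create file
// node. Illustrative only; field shapes are assumptions.
function createFilePayload(item, now = new Date()) {
  const name = item.json.data.name;
  return {
    filePath: `${name}.json`,
    // Pretty-print so diffs in GitHub stay readable.
    fileContent: JSON.stringify(item.json.data, null, 2),
    commitMessage: `Backup ${name} at ${now.toISOString()}`,
  };
}
```

Note the `JSON.stringify` call: passing the raw object without stringifying is the usual cause of the invalid-file mistake mentioned above.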

Step 10: Update Existing Workflow Backup Files if Needed

Add another Merge node “Merge1” to compare using “data.updatedAt” keys.
Add a final GitHub node “GitHub Edit” to perform the update operation:
– Use file path as workflow name plus “.json” extension
– File content: Stringify the updated workflow JSON
– Commit message includes update timestamp
This node commits updates only when workflow data has changed remotely.
Outcome: Efficient version control with minimal GitHub commits.
Common error: Failing to remove key matches causes redundant updates.
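The update branch boils down to "name matches, timestamp differs." A sketch of that filter (my approximation of what the data.updatedAt comparison achieves, with assumed item shapes):

```javascript
// Sketch: keep only matched workflows whose local updatedAt differs
// from the remote backup's, i.e. the ones that need a new commit.
function needsUpdate(localItems, remoteItems) {
  const remoteStamp = new Map(
    remoteItems.map(i => [i.json.data.name, i.json.data.updatedAt])
  );
  return localItems.filter(i =>
    remoteStamp.has(i.json.data.name) &&
    remoteStamp.get(i.json.data.name) !== i.json.data.updatedAt
  );
}
```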

5. Customizations ✏️

  • Change Backup Timing: In the “Daily at 23:59” Cron node, modify the hour and minute fields to suit your backup schedule.
  • Target a Different GitHub Repository: Update the GitHub nodes to point to a different repo or branch by modifying the “Repository” field.
  • Add Workflow Filtering: Introduce additional filters in the “Get workflows” HTTP Request node to exclude certain workflow IDs or names.
  • Enable Backup to Multiple Repositories: Duplicate the GitHub nodes and adjust credentials to backup to multiple repositories.
  • Extend to Include Workflow Executions: Enhance HTTP requests to fetch latest workflow executions and back them up in separate files.

6. Troubleshooting 🔧

Problem: “GitHub API rate limit exceeded or 403 Forbidden errors”
Cause: GitHub API throttling due to too many requests
Solution: Use a token with adequate permissions, keep the schedule to once daily, and avoid re-downloading files that have not changed.

Problem: “Empty or incorrect workflow data in HTTP Request node output”
Cause: Incorrect n8n API URL or missing authentication credentials
Solution: Double-check the localhost API endpoint URL and validate HTTP Basic Auth credentials in n8n.

Problem: “Merge node does not filter duplicates as expected”
Cause: Using incorrect property names or mismatched data structures in Merge node
Solution: Ensure the property names used in the Merge node exactly match the data fields (e.g., “data.name”), and validate JSON structure.

7. Pre-Production Checklist ✅

  • Verify that your n8n instance REST API is accessible and Basic Auth is correctly configured.
  • Confirm GitHub API credentials work and have repo write permissions.
  • Run the workflow manually to inspect data flow between nodes step-by-step.
  • Check that merge operations correctly identify new vs. existing workflows.
  • Test the backup on a small number of workflows before scheduling the automation.

8. Deployment Guide

Once all nodes are configured and tested, activate the workflow in n8n by toggling the enable switch at the top right.
Monitor the initial runs via execution logs to ensure backups succeed.
Adjust the Cron schedule if needed based on your operational backup window.
Regularly review your GitHub repository for created or updated workflow JSON files.
For safe rollback, keep manual backups of your workflows before automation.

9. FAQs

Q: Can I use a cloud-hosted n8n instance instead of localhost?
A: Yes, just update HTTP Request URLs to your cloud instance address and ensure API access is correctly set.

Q: Does this workflow consume many GitHub API calls?
A: It makes calls to list files, download contents, and create/update files daily—fairly minimal if scheduled once per day.

Q: Is my workflow data secure with this method?
A: Using Basic Auth and personal access tokens with limited scopes keeps data reasonably secure. Avoid sharing credentials.

10. Conclusion

By setting up this n8n workflow, you have automated the crucial task of backing up your n8n workflows to GitHub securely and efficiently. You saved over an hour daily and mitigated risks of losing your valuable automation configurations.

Next steps could include automating workflow documentation generation, syncing workflow executions logs, or integrating notifications on backup success or failure.
This targeted automation brings peace of mind and operational resilience for anyone managing critical n8n workflows.
