Automate n8n Workflow Backups to GitHub Easily

Struggling with manual n8n workflow backups that lead to errors and lost configurations? This n8n workflow automatically backs up all of your workflows as JSON files to GitHub, saving you hours and keeping your automation data safe and versioned.
Workflow Identifier: 1356
Nodes in use: Manual Trigger, Sticky Note, n8n, Set, HTTP Request, If, Merge, Code, Switch, No Operation, GitHub, Split In Batches, Schedule Trigger, Execute Workflow Trigger, Execute Workflow


1. Opening Problem Statement

Meet Sarah, an automation engineer managing dozens of workflows in her n8n instance. Every day she tweaks workflows to optimize business processes, but she spends hours manually exporting and backing up her work to avoid data loss and version confusion. Some days she forgets to back up a workflow or loses track of changes, risking costly setbacks and troubleshooting sessions that consume her entire day. What she needs is an automated way to ensure her entire n8n workflow library is backed up regularly and stored safely under version control.

This is exactly the pain this workflow solves: it automatically backs up every workflow from an n8n instance to a specified GitHub repository as JSON files, eliminating manual export errors, version conflicts, and time wasted on repeated backups. Sarah now saves multiple hours every week and can trust that each workflow iteration is safely stored and tracked.

2. What This Automation Does

When running, this n8n workflow performs the following key actions:

  • Fetches the list of all workflows in your n8n instance using the n8n node.
  • Loops over each workflow and retrieves existing backup files from a configured GitHub repository.
  • Compares each workflow against its backed-up version to determine whether it is new, changed, or unchanged.
  • Automatically creates a new file in GitHub if the workflow backup doesn’t exist yet.
  • Edits the existing GitHub file if differences are detected, committing updated JSON content.
  • Skips any workflows that have not changed to avoid unnecessary commits, saving API quota and clutter.

This automation ensures all your workflows are accurately mirrored on GitHub after every run, cutting hours of manual backup work each week and minimizing human error in versioning.

3. Prerequisites ⚙️

  • n8n account with access to your workflows to automate backups.
  • GitHub account with a repository ready to store JSON workflow backups.
  • Configured n8n credentials for GitHub API access with permissions to create/edit files.
  • Basic understanding of n8n nodes and working with JSON data.
  • Optional: Self-hosted n8n instance for full control over scheduled runs.

4. Step-by-Step Guide

Step 1: Setting up Globals Node for GitHub Repo Info

Navigate to the Globals node, which is a Set node. Fill in your repository details:

  • repo.owner: Your GitHub username or organization, e.g., islamnazmi.
  • repo.name: The repository name, e.g., n8n.
  • repo.path: The path in the repo where backups will be saved, e.g., workflows/{{ $json.tags[0].name }}. This organizes backups in folders based on workflow tags.

Save changes. This node centralizes your GitHub repository settings.

Tip: Ensure you have a repo and path created or ready to be created by GitHub upon first commit.
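
For orientation, the data this node produces ends up looking roughly like the object below. The values are placeholders and the exact shape depends on your Set node options; treat it as a sketch rather than the node's literal configuration:

const globals = {
  repo: {
    owner: "your-github-username",                  // placeholder user or organization
    name: "your-backup-repo",                       // placeholder repository name
    path: "workflows/{{ $json.tags[0].name }}",     // one folder per workflow tag
  },
};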

Step 2: Triggering Workflow Execution

The Schedule Trigger node runs the workflow daily at 7am by default. You can also run manually using the manualTrigger node named On clicking ‘execute’.

Tip: Customize Schedule Trigger if you want backups at different intervals.
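
If it helps to think of the default schedule in cron terms, daily at 7am corresponds to the expression below; the Schedule Trigger's own interval fields express the same cadence:

// Minute 0, hour 7, every day - i.e. daily at 7:00.
const dailyAt7am = "0 7 * * *";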

Step 3: Fetching All n8n Workflows

The n8n node calls the n8n API to list all workflows in your instance. It returns workflow metadata, including IDs and tags.
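
Under the hood this is the same as calling n8n's public REST API. A minimal sketch, assuming a placeholder instance URL and an API key stored in an environment variable:

// List all workflows via the n8n public API (v1).
const res = await fetch("https://your-n8n.example.com/api/v1/workflows", {
  headers: { "X-N8N-API-KEY": process.env.N8N_API_KEY },   // read-access API key (placeholder)
});
const { data: workflows } = await res.json();               // each item includes id, name, and tags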

Step 4: Looping Over Workflows for Processing

The data is passed to the Loop Over Items node (a Split In Batches node), which processes each workflow one at a time to reduce memory load.

Step 5: Executing Subworkflow for Each Workflow

The Execute Workflow node calls this same workflow recursively (entering via the Execute Workflow Trigger node), passing the current workflow's data as input so each workflow is backed up individually.

Step 6: Checking for Tags Presence

The tag? switch node routes workflows based on whether they have tags. Tagged workflows proceed to create a path under a folder named after the tag.

Step 7: Constructing Repository Path

The Set node named / formats the directory path by appending a slash to the first tag name to organize backups nicely.
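
In plain JavaScript, the path logic amounts to roughly this (variable names are illustrative):

// Use "tagName/" as the folder prefix when the workflow has a tag, otherwise no prefix.
const folder = $json.tags?.[0]?.name ?? "";
const pathPrefix = folder ? `${folder}/` : "";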

Step 8: Retrieving Backed-Up File from GitHub

The Get file data node uses the GitHub API to fetch the current backup file, if it exists, from the configured repo path, using the workflow ID as the filename (e.g., {workflowID}.json).

Note: It continues on fail for missing files to handle new workflows gracefully.
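
Behind the node, this is GitHub's repository contents endpoint. A hedged sketch with placeholder owner, repo, token, and workflowId values:

// GET /repos/{owner}/{repo}/contents/{path} returns file metadata plus base64 content
// for files under roughly 1 MB; a 404 simply means no backup exists yet.
const url = `https://api.github.com/repos/${owner}/${repo}/contents/workflows/${workflowId}.json`;
const res = await fetch(url, { headers: { Authorization: `Bearer ${token}` } });
if (res.status === 404) {
  // New workflow: no backup file yet, which is why the node continues on fail.
}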

Step 9: Handling Large Files

The If file too large node checks whether the returned file content is empty or an error occurred (for example, the file was too large to download directly through the contents API). If so, the workflow falls back to fetching the file content with a plain HTTP Request.
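
A minimal sketch of that fallback, assuming the contents response still exposes a download_url for the file (an assumption worth verifying against your own responses):

// When the contents API omits `content`, fetch the raw file directly instead.
const raw = await fetch(fileMeta.download_url);   // fileMeta is the contents-API response item
const backupJson = await raw.text();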

Step 10: Merging Workflow Data

At Merge Items, outputs from the previous steps combine to prepare for comparison.

Step 11: Comparing Workflow Files for Differences

The isDiffOrNew Code node runs JavaScript that:

  • Decodes base64 content from GitHub if present.
  • Parses the JSON to order keys consistently for reliable comparison.
  • Determines if the current workflow matches the backed-up file, is different, or is new entirely.
  • Sets github_status accordingly (“same”, “different”, “new”).

Here’s the core code snippet for this logic used inside the node:

const orderJsonKeys = (jsonObj) => {
  // Sort top-level keys so that two JSON documents with the same data
  // serialize identically and can be compared as strings.
  const ordered = {};
  Object.keys(jsonObj).sort().forEach(key => {
    ordered[key] = jsonObj[key];
  });
  return ordered;
};

// Item 0 is the GitHub file response; item 1 is the live workflow from the n8n node.
// If GitHub returned file content, compare it against the live workflow.
if (Object.keys($input.all()[0].json).includes("content")) {
  // GitHub delivers file content base64-encoded; decode and parse it.
  const origWorkflow = JSON.parse(Buffer.from($input.all()[0].json.content, 'base64').toString());
  const n8nWorkflow = $input.all()[1].json;

  const orderedOriginal = orderJsonKeys(origWorkflow);
  const orderedActual = orderJsonKeys(n8nWorkflow);

  if (JSON.stringify(orderedOriginal) === JSON.stringify(orderedActual)) {
    $input.all()[0].json.github_status = "same";
  } else {
    // Changed: keep a pretty-printed copy to commit back to GitHub.
    $input.all()[0].json.github_status = "different";
    $input.all()[0].json.n8n_data_stringy = JSON.stringify(orderedActual, null, 2);
  }
  $input.all()[0].json.content_decoded = orderedOriginal;
} else {
  // No backup file exists yet: mark as new and prepare the content to commit.
  const n8nWorkflow = $input.all()[1].json;
  const orderedActual = orderJsonKeys(n8nWorkflow);
  $input.all()[0].json.github_status = "new";
  $input.all()[0].json.n8n_data_stringy = JSON.stringify(orderedActual, null, 2);
}

return $input.all();

Step 12: Branching Based on File Status

The Check Status switch node routes to one of three paths based on github_status:

  • same: No changes, skip backup update. Goes to Same file – Do nothing no-op.
  • different: Edit the existing GitHub file.
  • new: Create a new GitHub file.
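
Expressed as plain JavaScript, the routing is equivalent to this small sketch (the real work happens in the nodes each branch leads to):

switch ($json.github_status) {
  case "same":      break;                      // Same file - Do nothing (no-op)
  case "different": /* Edit existing file */    break;
  case "new":       /* Create new file */       break;
}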

Step 13: Creating or Editing GitHub Backup Files

Depending on the file status, either the Create new file or the Edit existing file GitHub node runs, committing the JSON content to the repo with a commit message indicating the workflow name and file status.
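
For example, a commit message along these lines can be built with an expression; the field names here are assumptions about the item data, not values taken from the workflow itself:

// Illustrative commit message, e.g. "Order Sync Workflow (different)".
const commitMessage = `${$json.name} (${$json.github_status})`;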

Step 14: Ending the Workflow

The Return node sets a Done: true flag to signal that processing of each workflow item has finished cleanly.

5. Customizations ✏️

  • Change Backup Folder Path: In the Globals node, update repo.path to any folder you prefer within your GitHub repo. This helps organize backups logically by project or environment.
  • Add More Metadata to Files: Modify the isDiffOrNew code node to inject additional metadata or timestamps into the backup JSON content before committing (see the sketch after this list).
  • Schedule Backups More Often: Edit the Schedule Trigger node’s interval to run hourly or multiple times per day for faster sync.
  • Filter Workflows by Tag: Extend the tag? switch logic to backup only workflows matching specific tags by adjusting conditions.
  • Switch GitHub Credentials: Easily swap githubApi credentials in the GitHub nodes if you want to use alternative GitHub accounts or tokens.
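
As a sketch of the metadata customization above, you could extend the isDiffOrNew node right before it serializes the workflow; the __backupMeta field name is purely illustrative:

// Hypothetical extension of isDiffOrNew: stamp the JSON that gets committed.
orderedActual.__backupMeta = { backedUpAt: new Date().toISOString() };   // illustrative field name
$input.all()[0].json.n8n_data_stringy = JSON.stringify(orderedActual, null, 2);

Keep in mind that a per-run timestamp makes every backup look "different", so apply it only on the "different" and "new" branches if you want to keep skipping unchanged workflows.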

6. Troubleshooting 🔧

Problem: “GitHub API rate limit exceeded”

Cause: Too many backup attempts or commits in a short time.

Solution: Reduce backup frequency in the Schedule Trigger or optimize by skipping unchanged workflows.

Problem: “File too large to download from GitHub” fallback triggering too often

Cause: Large workflows being backed up trigger HTTP fallback repeatedly.

Solution: Review workflow size, possibly split complex workflows or optimize the fallback node.

Problem: “No tags present on workflow” causes files to be backed up to root directory

Cause: If a workflow has no tags, the tag? switch routes to default path.

Solution: Add at least one tag on your workflows or modify path logic in the Globals node for these cases.

7. Pre-Production Checklist ✅

  • Verify your GitHub credentials node is correctly configured with proper scope permissions (repo access).
  • Test your GitHub repo path and access manually to ensure proper write permissions.
  • Run manual trigger and check the workflow list outputs correctly.
  • Confirm that backups are created or updated correctly in the repository after runs.
  • Test the workflow with workflows having tags and without tags to verify path logic.
  • Backup existing GitHub repo before first run as a rollback precaution.

8. Deployment Guide

Activate the Schedule Trigger to run the backup daily or at your desired frequency. For immediate testing, use the manual trigger node “On clicking ‘execute’”.

Monitor your GitHub repository for new or updated backup files named after your workflow IDs.

Use n8n’s execution logs to troubleshoot any failures or unexpected data flow issues.

9. FAQs

Q: Can I use another Git platform instead of GitHub?

A: This workflow specifically uses GitHub nodes, but you could adapt it to GitLab or Bitbucket by replacing GitHub nodes with HTTP request calls to their APIs.

Q: Will backing up large workflows slow down n8n?

A: The Split In Batches node helps mitigate memory usage by processing one workflow at a time, but very large workflows may take longer to process.

Q: Is my data secure during backups?

A: Yes. Data is transferred over HTTPS via the GitHub API, and your GitHub token is stored in n8n’s credential manager. Keep the token private and scoped to the backup repository.

10. Conclusion

By the end of this guide, you’ve automated the entire backup of your n8n workflows to GitHub, safeguarding your work with version control and eliminating manual export headaches. This workflow saves you hours weekly, reduces error risks, and gives a reliable historical record of all your automation workflows.

Next steps could include automating rollback on errors, notifying your team of changes via Slack, or integrating backups with other cloud storage for redundancy.

Get started now and enjoy peace of mind knowing your n8n workflows are always backed up flawlessly to GitHub.

