Backup n8n Credentials to GitHub Automatically

This workflow automates backing up all n8n instance credentials securely to GitHub. It saves time by running scheduled exports, comparing with existing backups, and updating or creating files only when necessary.
Workflow Identifier: 2063
Nodes in use: ManualTrigger, StickyNote, Set, HttpRequest, If, Merge, Code, Switch, NoOp, GitHub, SplitInBatches, ScheduleTrigger, ExecuteCommand, ExecuteWorkflow, ExecuteWorkflowTrigger


Opening Problem Statement

Meet Sarah, a workflow automation developer managing multiple n8n instances for her company. Every week, she manually exports and backs up her n8n credentials to avoid losing critical configuration data, a process that takes hours and risks human error. Without an automated backup system, Sarah faces the constant stress of accidental data loss, delayed recovery times, and disruption of operations.

Sarah needs a reliable way to automatically back up all her n8n instance credentials to a secure, version-controlled location like GitHub, without spending hours on manual exports and file management.

What This Automation Does

This n8n workflow automates backing up all your n8n credentials to GitHub. When triggered manually or on a scheduled interval, it:

  • Exports all decrypted credentials from your n8n instance using the Execute Command node with npx n8n export:credentials --all --decrypted.
  • Formats the exported JSON data to ensure it’s ready for processing.
  • Loops through each credential item to initiate backup processing.
  • Checks if a backup file already exists in GitHub for each credential.
  • Compares the current exported JSON with the existing GitHub file to detect new, different, or unchanged backups.
  • Creates new files or edits existing ones in GitHub based on comparison results, avoiding unnecessary commits.
  • Runs on a schedule (every two hours) or can be triggered manually for an instant backup.

By automating this, Sarah saves hours of manual work, reduces the risk of lost credentials, and keeps a clean, versioned backup repository updated automatically.

Prerequisites ⚙️

  • n8n instance with credentials configured 🔐
  • Access to GitHub with a personal access token credential for the GitHub node 🔑
  • Ability to run shell commands via the n8n Execute Command node (available on self-hosted instances)
  • Repository on GitHub prepared and accessible for storing backups 📁
  • n8n account or self-hosted environment ⏱️

Step-by-Step Guide

Step 1: Configure GitHub Repository Details in Globals Node

Navigate to the Globals (Set node). Here, you specify your GitHub repository information:

  • repo.owner: Your GitHub username, e.g., john-doe
  • repo.name: The name of your GitHub repository, e.g., n8n-backup
  • repo.path: The folder where backups will be saved, e.g., credentials/

This centralizes your GitHub path configuration and makes it easy to update in one place.

Common mistake: Forgetting the trailing slash in the repo.path, which could cause path issues.
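A small sketch of how a downstream node might assemble the backup file path from these values (the helper function and its normalization are illustrative, not part of the workflow itself):

```javascript
// Hypothetical helper showing how repo.path and a credential ID combine
// into the final GitHub file path. Normalizing the trailing slash guards
// against the common mistake noted above.
function buildBackupPath(repoPath, credentialId) {
  const folder = repoPath.endsWith('/') ? repoPath : repoPath + '/';
  return folder + credentialId + '.json';
}

// e.g. buildBackupPath('credentials/', 'abc123') -> 'credentials/abc123.json'
```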

Step 2: Run Export Command for Credentials

Look at the Execute Command node. It executes the shell command npx n8n export:credentials --all --decrypted, exporting all credentials decrypted.

This node produces JSON output in the stdout field, which contains all credentials as an array.

Note: You must have npx and n8n CLI tools available in the environment where this is running.
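Before wiring up the rest of the workflow, you can sanity-check that output by confirming stdout parses as a JSON array. The shape below is assumed from n8n's Execute Command output, and the field values are illustrative:

```javascript
// Illustrative (assumed) shape of the Execute Command node's output:
const exampleOutput = {
  stdout: '[{"id":"1","name":"My API","type":"httpBasicAuth","data":{}}]',
  stderr: '',
  exitCode: 0,
};

// Quick check that stdout is a parseable JSON array of credentials.
function isValidJsonArray(stdout) {
  try {
    return Array.isArray(JSON.parse(stdout));
  } catch (err) {
    return false;
  }
}
```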

Step 3: Format and Parse Exported JSON

The JSON formatting (Code node) extracts the JSON array from the raw command output, parses it, and returns each credential as an individual item for further processing.

// JavaScript snippet from the Code node (regex fixed: the array brackets
// must be escaped, and items are returned as an array of n8n items)
const input = $input.all()[0].json;
const jsonString = input.stdout.match(/\[{.*}\]/s);
if (!jsonString) {
  return [{ json: { error: "No valid JSON string found in stdout." } }];
}
const beautifiedJson = JSON.parse(jsonString[0]);
return beautifiedJson.map(entry => ({ json: entry }));

This prepares the data for item-wise processing.

Step 4: Loop Each Credential Item

The Loop Over Items (SplitInBatches) node processes credentials in batches to optimize memory usage.

Ensure batch size fits your environment limits.

Step 5: Fetch Existing Backup File from GitHub

The Get file data (GitHub node) attempts to fetch the existing backup file based on the credential ID (like credentialID.json) and repository path.

The node is set to Continue On Fail, so absence of the file won’t stop the workflow.

Step 6: Check If Backup File is Too Large

The If file too large (If node) checks if the fetched file content exists or if an error was returned.

If content is empty (file might be too large), it fetches the raw file using the Get File (HTTP Request) node directly from the download URL.
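This check can be sketched as follows (logic assumed from this guide; GitHub's contents API omits inline content for large files, roughly 1 MB and above, which is why the raw download URL fallback exists):

```javascript
// Sketch of the If node's condition (function name hypothetical):
// fall back to fetching the raw download URL when the contents API
// response carries no inline content.
function needsRawFetch(githubResponse) {
  return !githubResponse || !githubResponse.content;
}
```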

Step 7: Merge Fetched Data

The Merge Items node combines the results from the GitHub file fetch and the HTTP fallback file fetch, so the data flows to comparison.

Step 8: Compare Credential JSONs for Differences

The isDiffOrNew (Code node) compares the exported credential JSON with the existing GitHub backup.

It sorts keys consistently and flags if the file is the same, different, or new.

// Key comparison snippet
if (JSON.stringify(orderedOriginal) === JSON.stringify(orderedActual)) {
  status = "same";
} else {
  status = "different";
}
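The key-sorting step this comparison relies on can be sketched like so (helper name assumed; JSON.stringify is order-sensitive, so both objects are rebuilt with sorted keys before being compared):

```javascript
// Recursively rebuild an object with its keys in sorted order so that
// two semantically equal objects stringify identically.
function orderKeys(value) {
  if (Array.isArray(value)) return value.map(orderKeys);
  if (value !== null && typeof value === 'object') {
    return Object.keys(value).sort().reduce((acc, key) => {
      acc[key] = orderKeys(value[key]);
      return acc;
    }, {});
  }
  return value;
}

// e.g. JSON.stringify(orderKeys({ b: 1, a: 2 })) -> '{"a":2,"b":1}'
```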

Step 9: Branch Based on Comparison Result

The Check Status (Switch node) routes the flow based on the comparison result to either:

  • Same file – Do nothing: Ends flow with no GitHub update.
  • File is different: Edits the existing backup file on GitHub with new content.
  • File is new: Creates a new backup file in the GitHub repository.
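The routing reduces to a simple switch over the comparison status (a sketch using the status values from Step 8, not the node's actual configuration):

```javascript
// Map the comparison status to the action the workflow takes.
function routeByStatus(status) {
  switch (status) {
    case 'same':
      return 'do-nothing';
    case 'different':
      return 'edit-file';
    case 'new':
      return 'create-file';
    default:
      return 'do-nothing';
  }
}
```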

Step 10: Edit or Create File on GitHub

Depending on the path chosen, the workflow either:

  • Edit existing file: Updates the file content and commits to GitHub.
  • Create new file: Adds a new file in the designated repo folder with the credentials JSON.

Step 11: End Workflow and Return

The Return (Set node) marks each credential processed as done.

Step 12: Schedule the Workflow

The Schedule Trigger runs the entire backup process every two hours automatically, ensuring your backups stay updated without manual intervention. You can also trigger it manually using the Manual Trigger.

Customizations ✏️

  • Change backup folder path: In the Globals node, update repo.path to change where credentials are saved in your GitHub repo.
  • Adjust schedule interval: Modify the Schedule Trigger node to run at different intervals, such as every hour or daily.
  • Batch size tuning: Change the Loop Over Items node batch size to control memory and processing speed based on your instance’s capacity.
  • Modify commit messages: In the Create new file and Edit existing file GitHub nodes, customize the commit message template to include timestamps or custom notes.
  • Add error handling: Add additional workflow logic after GitHub nodes to notify failures via email or Slack.

Troubleshooting 🔧

  • Problem: “No valid JSON string found in stdout.” from JSON formatting node.

    Cause: The export command did not return valid JSON, possibly due to environment path or command execution issues.

    Solution: Verify the n8n CLI is installed and accessible. Test running the export command manually in your environment.
  • Problem: “File not found” errors in the Get file data GitHub node.

    Cause: The file doesn’t exist yet for the credential ID.

    Solution: This is handled by the workflow—confirm Continue On Fail is enabled and the fallback HTTP Request node is configured properly.
  • Problem: Workflow runs slowly or hits memory limits.

    Cause: Exported credential count is large or batch size is too big.

    Solution: Reduce batch size in Loop Over Items node or increase your instance resources.

Pre-Production Checklist ✅

  • Ensure GitHub personal access token credentials have sufficient scope to read/write files in the repository.
  • Test export command manually to confirm output is valid JSON.
  • Verify GitHub repo exists and is accessible with correct owner, name, and path set in the Globals node.
  • Run manual trigger to verify initial backup completes successfully.
  • Check logs for any errors or data mismatches.

Deployment Guide

Activate the workflow by enabling the Schedule Trigger node. This ensures your n8n credentials backup runs automatically every two hours.

Use the Manual Trigger node to run immediate backups as needed.
You can monitor activity and data flow via n8n execution logs to confirm regular backups.

FAQs

  • Can I use a private GitHub repo for backups?
    Yes, just ensure your personal access token credentials have access to that repository.
  • Does this workflow consume GitHub API limits?
    Yes, but usage is minimal because the workflow only fetches and commits files when changes are detected.
  • Is it safe to store decrypted credentials on GitHub?
    Storing decrypted credentials is inherently sensitive. Only use a private repository with strict access controls, and consider limiting your token's scope or encrypting the backup files for additional protection.
  • Can I run this backup workflow on self-hosted n8n?
    Absolutely! Make sure your environment supports the Execute Command node and GitHub node access.

Conclusion

With this workflow, Sarah—like you—can confidently automate backing up all n8n credentials to GitHub, eliminating manual export drudgery while ensuring reliable, version-controlled backups. This solution saves her hours every week, reduces risk of credential loss, and streamlines disaster recovery.

Next steps might include automating backups of workflows themselves or triggering notifications on backup failures for improved monitoring.

Start using this workflow today, and never worry about losing your n8n credentials again!
