Automate S3 File Download and ZIP Compression with n8n

This workflow uses n8n to automate downloading every file from a specific AWS S3 bucket folder and compressing them into a single ZIP archive. Save the hours spent manually fetching files and streamline your file management with this automation.
Workflow Identifier: 1843
Nodes in use: Manual Trigger, AWS S3, Aggregate, Compression, Sticky Note


1. The Frustration of Managing AWS S3 Files Manually

Meet Sarah, a data analyst at a growing e-commerce company. Every week, she needs to gather dozens of sales reports and images stored in a specific folder in the company’s AWS S3 bucket. Sarah spends hours navigating the AWS console, downloading each file individually, and then compressing them before sending them to her team for review. This tedious process eats up well over two hours every week, time she’d rather invest in analyzing data and generating insights.

Moreover, manual file handling can lead to missed files or errors in compression, creating delays in her team’s workflow. Sarah needed an efficient, automated way to gather all those files from S3, bundle them neatly, and have them ready to download or use in further automation tasks.

2. What This Automation Does

This n8n workflow tackles Sarah’s problem smartly by automating the entire process. When triggered, it:

  • Lists all files inside a specified folder within your AWS S3 bucket.
  • Downloads every listed file to n8n’s workflow context as binary data.
  • Aggregates all files into one item that includes all file data with their binaries intact.
  • Compresses all downloaded files into a single ZIP archive named s3-export.zip.
  • Outputs the ZIP file, which you can download directly from n8n or pass on for further processing.
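
Under the hood, the data changes shape at each stage: many items become one item carrying every file’s binary, which then becomes a single ZIP. The minimal TypeScript sketch below illustrates those shapes; the field names are illustrative assumptions, not exact n8n output.

```typescript
// Rough shape of the data at each stage of the workflow (illustrative only).
type ListedFile     = { Key: string };                                  // one item per S3 object (listing)
type DownloadedFile = { json: ListedFile; binary: { data: unknown } };  // one item per file, with binary data (download)
type BundledFiles   = { binary: Record<string, unknown> };              // single item holding every binary (aggregate)
type ZipArchive     = { binary: { data: unknown } };                    // single item holding s3-export.zip (compression)
```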

Thanks to this workflow, Sarah saves more than two hours every week by eliminating manual downloading and zipping. It ensures all files from the folder are bundled seamlessly without errors, ready for any next steps.

3. Prerequisites ⚙️

  • AWS S3 account with access credentials – to connect and retrieve files.
  • n8n automation platform account – either cloud or self-hosted.
  • A configured AWS S3 node in n8n with your bucket name and folder path.

4. Step-by-Step Guide to Automate Download and ZIP Compression of S3 Files

Step 1: Start with a Manual Trigger to Test the Workflow

In your n8n canvas, click on + → choose Manual Trigger node. This node lets you manually trigger the workflow during development or actual use.

You should see a simple interface with an “Execute Workflow” button when running this node.

Common mistake: Forgetting to connect the Manual Trigger as the first node will prevent the workflow from starting.

Step 2: Add the AWS S3 “List ALL Files” Node

Click + and add an AWS S3 node (Operation: getAll). Configure it as follows:

  • Under Bucket Name, enter your S3 bucket name, e.g., yourBucket.
  • Under Folder Key, specify the folder path, e.g., yourFolder.
  • Toggle Return All to true to fetch all files.

This node fetches a list of all files inside the target folder.

Common mistake: Leaving the Folder Key empty lists files from the bucket root, which may not be what you want.
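
To see the configuration at a glance, here is a rough TypeScript sketch of the settings above and of the kind of item the node returns for each file. The setting names mirror the UI labels (the node’s internal parameter keys may differ), and the metadata fields beyond Key are standard S3 object attributes that may vary in your output.

```typescript
// Settings for the "List ALL Files" AWS S3 node, as described above (placeholder values).
const listFilesSettings = {
  operation: "getAll",       // list objects instead of downloading them
  bucketName: "yourBucket",  // your S3 bucket name
  folderKey: "yourFolder",   // folder (prefix) to list; empty means the bucket root
  returnAll: true,           // fetch every file, not just the first page
};

// Each output item carries the object's metadata in $json.
// Key is the full object key, e.g. "yourFolder/report-week-01.csv".
interface ListedS3Object {
  Key: string;
  Size?: number;
  LastModified?: string;
  ETag?: string;
}
```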

Step 3: Connect to the “Download ALL Files from Folder” Node

Add another AWS S3 node. Set operation to Download files.

In the File Key field, use an expression to download each file listed. Use: {{ $json.Key }}.

Make sure the bucket name matches the previous node.

This node downloads each file from S3 so it’s accessible inside n8n.

Common mistake: Hardcoding the File Key instead of using an expression means every execution fetches the same single file rather than each file in the list.
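
To see why the expression matters, here is a small TypeScript sketch of how n8n resolves {{ $json.Key }}: the download node runs once per incoming item, and the expression picks up that item’s own key each time. The file names below are made up for illustration.

```typescript
// Items arriving from the "List ALL Files" node, one per S3 object (sample data).
const incomingItems = [
  { json: { Key: "yourFolder/sales-week-01.csv" } },
  { json: { Key: "yourFolder/sales-week-02.csv" } },
  { json: { Key: "yourFolder/banner.jpg" } },
];

// {{ $json.Key }} is evaluated against each item in turn,
// so every listed file is downloaded, not just one.
const resolvedFileKeys = incomingItems.map((item) => item.json.Key);
// => ["yourFolder/sales-week-01.csv", "yourFolder/sales-week-02.csv", "yourFolder/banner.jpg"]
```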

Step 4: Aggregate All Files into One Item with Binary Data

Add an Aggregate node. Set it to aggregate all item data and enable the option to include binary data.

This consolidates multiple file items into a single item, maintaining all file binaries in one payload.

Common mistake: Forgetting to enable binary data aggregation or choosing an incorrect aggregation method.
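
As a rough illustration of what the Aggregate node produces, assuming n8n keeps duplicate binary property names apart by suffixing them (which is why Step 5 reads the names dynamically instead of hardcoding them):

```typescript
// Before aggregation: one item per downloaded file, each with a single binary property.
const downloadedItems = [
  { binary: { data: "<binary of sales-week-01.csv>" } },
  { binary: { data: "<binary of sales-week-02.csv>" } },
  { binary: { data: "<binary of banner.jpg>" } },
];

// After aggregation: a single item holding every binary under distinct property names.
// The exact property names are assigned by n8n and may differ from this sketch.
const aggregatedItem = {
  binary: {
    data: "<binary of sales-week-01.csv>",
    data_1: "<binary of sales-week-02.csv>",
    data_2: "<binary of banner.jpg>",
  },
};
```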

Step 5: Compress All Files into a ZIP Archive

Add a Compression node. Configure:

  • Operation: compress
  • File Name: s3-export.zip
  • Binary Property Name: Use the expression {{ Object.keys($binary).join(',') }} to include all downloaded files.

This bundles all downloaded files into one ZIP.

Common mistake: Leaving the binary property name empty or pointing it at the wrong property produces an empty ZIP.
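
Continuing the aggregated item sketched in Step 4, here is what that Binary Property Name expression evaluates to (the actual property names come from your own Aggregate output):

```typescript
// $binary on the aggregated item, as sketched in Step 4 (illustrative names).
const $binary: Record<string, unknown> = {
  data: "<binary of sales-week-01.csv>",
  data_1: "<binary of sales-week-02.csv>",
  data_2: "<binary of banner.jpg>",
};

// {{ Object.keys($binary).join(',') }} builds a comma-separated list of every
// binary property, so the Compression node zips all of them into s3-export.zip.
const binaryPropertyName = Object.keys($binary).join(",");
// => "data,data_1,data_2"
```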

5. Customizations ✏️

  • Change ZIP File Name: In the Compression node, replace s3-export.zip with any desired file name like monthly-reports.zip.
  • Filter Files by Type: After listing the files and before downloading them, add a Filter node to include only files with specific extensions, e.g., .csv or .jpg (see the sketch after this list). Useful if your S3 folder contains mixed content.
  • Trigger on Schedule: Replace the Manual Trigger with a Cron node to automate downloads on a schedule, like every Monday morning.
  • Upload ZIP to S3: Extend the workflow by adding another AWS S3 node to upload the generated ZIP back to your S3 bucket for sharing.
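
For the file-type filter mentioned above, the condition boils down to a key check like the one below, shown as plain TypeScript; in an n8n Filter or IF node you would express the same test with something like {{ $json.Key.endsWith('.csv') }}. The item data is made up for illustration.

```typescript
// Keep only listed objects whose key ends with one of the allowed extensions.
const allowedExtensions = [".csv", ".jpg"];

const listedItems = [
  { json: { Key: "yourFolder/sales-week-01.csv" } },
  { json: { Key: "yourFolder/notes.txt" } },
  { json: { Key: "yourFolder/banner.jpg" } },
];

const filteredItems = listedItems.filter((item) =>
  allowedExtensions.some((ext) => item.json.Key.toLowerCase().endsWith(ext))
);
// => only the .csv and .jpg items continue to the download node
```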

6. Troubleshooting 🔧

Problem: “AWS S3 node returns access denied error”

Cause: Incorrect IAM permissions or wrong credentials.

Solution: Verify your AWS credentials in n8n. Ensure the IAM user has permissions for ListBucket and GetObject actions.
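
As a reference point, the minimal policy for this workflow needs roughly the two statements below, sketched as a TypeScript object so the JSON shape is easy to copy; replace yourBucket with your actual bucket name.

```typescript
// Minimal read-only permissions for listing and downloading objects from one bucket.
// s3:ListBucket applies to the bucket itself; s3:GetObject applies to the objects inside it.
const minimalS3ReadPolicy = {
  Version: "2012-10-17",
  Statement: [
    {
      Effect: "Allow",
      Action: ["s3:ListBucket"],
      Resource: ["arn:aws:s3:::yourBucket"],
    },
    {
      Effect: "Allow",
      Action: ["s3:GetObject"],
      Resource: ["arn:aws:s3:::yourBucket/*"],
    },
  ],
};
```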

Problem: “ZIP file is empty or missing files”

Cause: The binary data wasn’t properly aggregated or the Compression node’s binary property was wrong.

Solution: Double-check the Aggregate node settings to include binary data. Confirm the compression node uses the correct binary keys expression.

7. Pre-Production Checklist ✅

  • Confirm AWS credentials are valid and have the right permissions.
  • Verify bucket name and folder path in S3 nodes exactly match your AWS setup.
  • Test manual trigger to ensure files are downloaded successfully and zipped.
  • Check the Aggregate node’s output contains all files.
  • Ensure the ZIP file downloads correctly and opens to contain all files.

8. Deployment Guide

Once tested successfully, activate your workflow by toggling it to Active in n8n.

If your use case requires periodic file updates, replace the Manual Trigger with a Cron node and schedule it accordingly.
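
If you go the scheduled route, a standard five-field cron expression covers the “every Monday morning” example from the customizations section:

```typescript
// Standard cron fields: minute hour day-of-month month day-of-week.
// "0 8 * * 1" fires at 08:00 every Monday (in the timezone your n8n instance uses).
const weeklyExportSchedule = "0 8 * * 1";
```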

Monitor executions in n8n for failures and review the logs to ensure smooth operation over time.

9. FAQs ✏️

Can I use a different cloud storage like Google Drive instead of AWS S3?
Yes, but you’ll need to adapt the workflow and use the respective cloud storage nodes. The core logic remains the same.

Does this workflow consume AWS API credits or cost money?
AWS charges are based on operations performed. Listing and downloading files consume API calls, so check your AWS pricing.

Is my data safe?
All data transfer happens via your AWS connection configured with your credentials. n8n doesn’t store your data externally.

10. Conclusion

By following this detailed guide, you’ve automated the tedious and error-prone task of collecting files from a specific S3 folder and compressing them into a ZIP archive. This not only saves hours of manual work but also provides a reliable, repeatable process that scales as your file counts grow.

Next, you might consider automating uploading this ZIP to other storage, sending notifications with links, or scheduling regular exports. This workflow is a solid foundation to accelerate your data management with n8n.

Related Workflows

Automate Viral UGC Video Creation Using n8n + Degaus (Beginner-Friendly Guide)

Learn how to automate viral UGC video creation using n8n, AI prompts, and Degaus. This beginner-friendly guide shows how to import, configure, and run the workflow without technical complexity.

AI SEO Blog Writer Automation in n8n (Beginner Guide)

A complete beginner guide to building an AI-powered SEO blog writer automation using n8n.

Automate CrowdStrike Alerts with VirusTotal, Jira & Slack

This workflow automates processing of CrowdStrike detections by enriching threat data via VirusTotal, creating Jira tickets for incident tracking, and notifying teams on Slack for quick response. Save hours daily by transforming complex threat data into actionable alerts effortlessly.

Automate Telegram Invoices to Notion with AI Summaries & Reports

Save hours on financial tracking by automating invoice extraction from Telegram photos to Notion using Google Gemini AI. This workflow extracts data, records transactions, and generates detailed spending reports with charts sent on schedule via Telegram.

Automate Email Replies with n8n and AI-Powered Summarization

Save hours managing your inbox with this n8n workflow that uses IMAP email triggers, AI summarization, and vector search to draft concise replies requiring minimal review. Automate business email processing efficiently with AI guidance and Gmail integration.

Automate Email Campaigns Using n8n with Gmail & Google Sheets

This n8n workflow automates personalized email outreach campaigns by integrating Gmail and Google Sheets, saving hours of manual follow-up work and reducing errors in email sequences. It ensures timely follow-ups based on previous email interactions, optimizing communication efficiency.