Automate Twitter Sentiment Analysis with n8n ETL Pipeline

Discover how this n8n ETL pipeline extracts tweets tagged #OnThisDay, analyzes their sentiment with Google Cloud Natural Language, stores the results in MongoDB and Postgres, and posts positive tweets to Slack, saving hours of manual monitoring and enabling timely insights.
Workflow Identifier: 1609
NODES in Use: Twitter, MongoDB, Google Cloud Natural Language, Set, Postgres, IF, Slack, NoOp, Cron

Opening Problem Statement

Meet Sarah, a social media analyst for a cultural heritage organization. Every morning, she spends over an hour manually searching Twitter for posts tagged with #OnThisDay to find interesting or positive mentions of the historical events her organization promotes. She then copies tweets into spreadsheets, runs sentiment checks across multiple tools, and finally summarizes the positive posts for her team in Slack. This tedious process often leads to errors: missed tweets, delayed notifications, and lost insight into audience sentiment. If Sarah misses or delays sharing an important tweet, her team loses engagement opportunities and falls behind on timely content planning.

Clearly, Sarah needs an automated pipeline that can fetch relevant tweets daily, analyze their sentiment, store results for future reference, and instantly notify her team of positive posts.

What This Automation Does

This n8n workflow acts as a complete ETL (Extract, Transform, Load) pipeline tailored for Twitter sentiment analysis centered on the #OnThisDay hashtag. When triggered each morning, it:

  • Extracts the latest three tweets containing “#OnThisDay” using the Twitter node.
  • Loads the raw tweets into MongoDB for initial storage.
  • Transforms the tweet content by analyzing its sentiment score and magnitude via Google Cloud Natural Language.
  • Stores the processed text alongside sentiment metrics in a Postgres database for structured querying.
  • Applies a condition to identify positive sentiment tweets (score > 0) using the IF node.
  • Posts these positive tweets to a Slack channel with dynamic messages showing sentiment scores.

Benefits include saving Sarah around 30-45 minutes each day, reducing human error, enabling rich data archival, and improving team awareness with real-time notifications.

Prerequisites ⚙️

  • Twitter account with API access enabled (OAuth1 credentials configured).
  • MongoDB database instance with credentials set in n8n.
  • Postgres database with the “tweets” table containing columns: text, score, magnitude.
  • Google Cloud account with Natural Language API enabled and OAuth2 credentials configured in n8n.
  • Slack workspace and bot with API token configured for n8n.
  • n8n automation tool (cloud or self-hosted) with all credentials securely stored.

Optionally, you can self-host n8n with a reliable provider like Hostinger to control your data and run workflows continuously.
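
If the “tweets” table doesn’t exist yet, a minimal schema matching the columns listed above could look like this (the id column and the double precision types are assumptions; adapt them to your needs):

```sql
CREATE TABLE tweets (
  id        SERIAL PRIMARY KEY,
  text      TEXT NOT NULL,
  score     DOUBLE PRECISION,
  magnitude DOUBLE PRECISION
);
```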

Step-by-Step Guide

1. Schedule Daily Workflow Trigger with Cron Node

Navigate to the Cron node configuration inside n8n. Set the “Trigger Times” to run at 06:00 AM daily. This ensures your tweets are fetched fresh early every day without manual intervention.

Common mistake: Forgetting to set the “Trigger Times” or choosing the wrong timezone may cause unexpected trigger times.
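
For reference, a Trigger Time of 06:00 daily corresponds to this standard cron expression (fields: minute, hour, day of month, month, day of week):

```
0 6 * * *
```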

2. Extract Tweets Using the Twitter Node

Click on the Twitter node next to Cron. Configure it with your Twitter OAuth1 API credentials. Set the operation to “Search” and enter =#OnThisDay in the search text field with a limit of 3 tweets.

You should see the node sampling recent tweets tagged #OnThisDay after running the workflow.

Common mistake: Using incorrect or expired Twitter API credentials will cause the node to fail.

3. Insert Raw Tweets into MongoDB

Select the MongoDB node connected to Twitter’s output. Choose the “Insert” operation and specify the “tweets” collection. Map the field “text” from the Twitter node’s output to the MongoDB insert fields.

This step creates a raw tweet archive for historical data.

Common mistake: Incorrect MongoDB URI or missing database permissions can cause insert errors.
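
Conceptually, the insert mapping amounts to the small helper below (a sketch; the sample tweet text is made up, and n8n wraps each item’s data in a `json` property):

```javascript
// Shape the Twitter node's output items into documents for the
// "tweets" collection. Only the "text" field is mapped, matching
// the node configuration above.
function toTweetDocs(twitterItems) {
  return twitterItems.map((item) => ({ text: item.json.text }));
}

// n8n passes items as [{ json: {...} }, ...]
const docs = toTweetDocs([
  { json: { text: "#OnThisDay in 1969, Apollo 11 landed on the Moon." } },
]);
```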

4. Perform Sentiment Analysis with Google Cloud Natural Language Node

Select the Google Cloud Natural Language node linked to MongoDB. Configure your Google API OAuth2 credentials here. Set the “content” parameter to pull the stored tweet text dynamically from MongoDB like this: {{$node["MongoDB"].json["text"]}}.

You should get a sentiment score and magnitude in the node output.

Common mistake: Forgetting to enable the Natural Language API or using invalid API credentials.
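
Under the hood, the node calls the public documents:analyzeSentiment REST endpoint. A sketch of the request body it sends for each tweet:

```javascript
// Build the JSON body for a POST to
// https://language.googleapis.com/v1/documents:analyzeSentiment
function buildSentimentRequest(text) {
  return {
    document: { type: "PLAIN_TEXT", content: text },
    encodingType: "UTF8",
  };
}

const body = buildSentimentRequest("What a wonderful #OnThisDay moment!");
```

The response carries a documentSentiment object with score (negative to positive, -1.0 to 1.0) and magnitude (overall emotional strength), which the next steps consume.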

5. Set Sentiment Score and Text Using Set Node

The Set node extracts the sentiment score and magnitude from the Google Cloud node’s output and the tweet text from Twitter. Configure variables “score”, “magnitude”, and “text” like so:

score = {{$json["documentSentiment"]["score"]}}
magnitude = {{$json["documentSentiment"]["magnitude"]}}
text = {{$node["Twitter"].json["text"]}}

This node formats your data clearly for database storage and conditional checks.

Common mistake: Misreferencing node outputs can lead to empty or wrong values.
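
The three expressions above amount to the following mapping (a plain JavaScript sketch; the input shapes mirror the Google Cloud node’s documentSentiment output and the Twitter node’s text field):

```javascript
// Combine the sentiment output with the original tweet text,
// mirroring the Set node's three expressions.
function buildRecord(sentimentJson, twitterJson) {
  return {
    score: sentimentJson.documentSentiment.score,
    magnitude: sentimentJson.documentSentiment.magnitude,
    text: twitterJson.text,
  };
}

const record = buildRecord(
  { documentSentiment: { score: 0.7, magnitude: 0.9 } },
  { text: "#OnThisDay: a great moment in history!" }
);
```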

6. Store Processed Tweets in Postgres

Connect the Set node to the Postgres node. Configure your Postgres credentials and set the target table to “tweets”. Map columns “text”, “score”, and “magnitude” accordingly.

After executing, your Postgres database will contain sentiment-scored tweet data ready for querying.

Common mistake: Schema mismatches or incorrect column names in your Postgres table will cause insert failures.

7. Filter Positive Sentiment Tweets with IF Node

Link the Postgres node’s output to the IF node. Add a Number condition with Value 1 set to the expression {{$json["score"]}} and the operation “Larger” than 0 to identify positive tweets.

Tweets with positive sentiment pass the true branch; others go to NoOp.

Common mistake: Using the wrong field name or not using double curly braces {{}} for expressions causes condition errors.
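
The IF node’s check boils down to this one-line predicate (a sketch; raising the threshold later only means changing the 0):

```javascript
// True branch: positive sentiment; false branch: everything else.
const isPositive = (record) => record.score > 0;

isPositive({ score: 0.4, magnitude: 0.5, text: "..." }); // true
isPositive({ score: -0.2, magnitude: 0.1, text: "..." }); // false
```

Note that a score of exactly 0 (neutral) does not pass, which matches the strict "larger than" comparison.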

8. Post Positive Tweets to Slack

On the IF node’s true branch, add the Slack node. Configure with Slack API credentials and target the “tweets” channel. Use this dynamic text template:

=🐦 NEW TWEET with sentiment score {{$json["score"]}} and magnitude {{$json["magnitude"]}} ⬇️
{{$json["text"]}}

On running, your Slack channel receives timely notifications about positive tweets automatically.

Common mistake: Incorrect Slack channel name or expired Slack tokens cause message sending failures.
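
The message template above can be reproduced as a small formatting helper (a sketch; the wording and emoji match the Slack node’s template):

```javascript
// Render the Slack message from a processed tweet record,
// matching the template used in the Slack node.
function formatSlackMessage(record) {
  return (
    `🐦 NEW TWEET with sentiment score ${record.score} ` +
    `and magnitude ${record.magnitude} ⬇️\n${record.text}`
  );
}

const msg = formatSlackMessage({
  score: 0.7,
  magnitude: 0.9,
  text: "#OnThisDay: a great moment in history!",
});
```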

Customizations ✏️

  • Change the hashtag search term: In the Twitter node, replace =#OnThisDay with any hashtag or keyword you want to monitor. This lets you adapt the ETL pipeline to any Twitter topic.
  • Adjust the sentiment threshold: In the IF node, change the condition value from 0 to any score you consider “positive” or worth notifying. For example, use 0.2 for stricter filtering.
  • Store additional tweet fields: Modify the MongoDB and Postgres nodes to include extra tweet fields like username, date, or tweet ID for richer data records.
  • Change Slack message formatting: In the Slack node, customize the text or add attachments with images or links for better engagement.
  • Increase tweet fetch limit: In the Twitter node, raise the “limit” parameter to gather more tweets per run if you want more comprehensive daily data.

Troubleshooting 🔧

Problem: Twitter node returns “Rate limit exceeded” error.
Cause: Twitter enforces API rate limits per account.
Solution: Reduce the number of tweets fetched or increase time between runs in your Cron node.

Problem: Google Cloud Natural Language node returns authentication errors.
Cause: OAuth credentials are incorrect or expired.
Solution: Re-authenticate Google Cloud credentials in n8n and verify API access in the Google Cloud Console.

Problem: Slack messages are not sent to the channel.
Cause: Incorrect Slack channel name or permissions.
Solution: Double-check channel names and ensure the Slack bot token has permission to post messages.

Pre-Production Checklist ✅

  • Verify all API credentials (Twitter, Google Cloud, Slack) are correctly set in n8n.
  • Ensure the MongoDB and Postgres databases are accessible with proper permissions.
  • Test the workflow manually for each node output and confirm expected data flows.
  • Check the Postgres table schema matches the fields used: text, score, magnitude.
  • Make sure the Slack channel exists and is ready to receive messages.

Deployment Guide

Once tested, activate the workflow in n8n by toggling the “Active” switch. Your Cron node will trigger the automation every morning at 6 AM, running the entire pipeline end to end.

Monitor execution results via n8n’s dashboard. Enable error notifications or logging if your setup requires robust monitoring.

For uninterrupted service, consider running n8n self-hosted or in a cloud environment with persistent uptime.

FAQs

Q: Can I analyze other hashtags besides #OnThisDay?
A: Absolutely! Just change the search term in the Twitter node to any hashtag or keyword you prefer.

Q: Will this workflow consume my Twitter API limits quickly?
A: Fetching 3 tweets daily is low usage and unlikely to hit limits. Adjust frequency or tweet limit cautiously.

Q: Is my tweet data stored securely?
A: Yes, all data is stored securely in your own MongoDB and Postgres databases controlled by you.

Conclusion

By building and deploying this n8n ETL pipeline, you have automated the tedious task of monitoring #OnThisDay tweets with sentiment analysis. You’ve saved valuable time previously spent on manual searching, properly archived tweet data for future analysis, and improved team communication with real-time Slack alerts for positive content.

This specific automation is perfect for social media analysts like Sarah who need timely insights without the hassle. Next, you might consider expanding with automatic report generation from Postgres data, integrating other social platforms, or adding deeper AI-driven text analytics.

Keep experimenting with your workflows and enjoy the power of tailored automation!

