Automate Reddit Trend Analysis and PR Reports with n8n

Discover how this n8n automation identifies hot Reddit trends, filters top posts by upvotes, analyzes comments for sentiment, and generates comprehensive PR reports automatically. Save hours weekly by transforming social data into actionable PR insights with ease.
Workflow Identifier: 1267
Nodes in Use: Code, Reddit, If, Set, SplitInBatches, HTTP Request, Google Drive, Compression, ConvertToFile, Aggregate, StickyNote, Anthropic Chat Model

1. Real Problem: Emily’s Struggle with Finding Trending Topics for PR Campaigns

Emily is a PR strategist focusing on digital campaigns for her firm’s clients. Every Monday morning, she spends hours scouring Reddit, a hotspot for emerging trends, looking for newsworthy stories related to politics and current affairs — topics that her clients care about deeply. She manually compiles posts, checks each post’s engagement metrics, reads comments for sentiment, and tries to synthesize what angle might work for the week’s pitching strategy.

This manual process is time-consuming, taking 3-4 hours weekly, and error-prone. Emily often misses viral opportunities because she can’t keep up with the sheer volume of content and fast-moving discussions. She also wastes time on low-impact posts or ones that don’t fit her clients’ criteria. This inefficiency slows the PR team’s ability to react quickly, costing the firm competitive advantage and potential revenue.

2. What This Automation Does

This custom-built n8n workflow automates Emily’s Monday morning routine by connecting directly to Reddit and other online tools to:

  • Automatically split her interest topics (e.g., “Donald Trump”, “Politics”) for targeted searches.
  • Search Reddit across all communities for posts matching those topics and filter posts with over 100 upvotes, excluding certain URLs (like native Reddit links and undesired domains).
  • Remove duplicate posts by URL, prioritizing the ones with the most upvotes.
  • Fetch all comments for each qualifying post, extract the top 30 comments by score, and maintain comment-reply hierarchies while excluding deleted comments.
  • Format these comments into readable Markdown and analyze the discussion sentiment using Anthropic’s AI models.
  • Retrieve the full source news content via an HTTP API call (to a Jina.ai endpoint) for deeper context analysis.
  • Aggregate the combined Reddit data, news content, and sentiment analysis into a comprehensive PR report.
  • Store the final report as a text file, compress it, upload to Google Drive, and send a notification with the download link to a Mattermost channel for immediate team collaboration.

This automation saves Emily 3-4 hours every week by consolidating complex manual steps into a single automated process, and it delivers rich, actionable insights that help produce timely, targeted PR pitches.

3. Prerequisites ⚙️

  • n8n account with the ability to add custom credentials ⏱️
  • Reddit OAuth2 API credential to allow searching Reddit and retrieving posts/comments 📧
  • Anthropic account credential for AI-powered sentiment and content analysis 🔑
  • Google Drive OAuth2 API credential for file upload and sharing 📁
  • Jina.ai API key for fetching detailed news content 🔐
  • Mattermost webhook URL and channel info for team notifications 💬

4. Step-by-Step Guide to Build and Run This Workflow

Step 1: Schedule the Weekly Trigger

In n8n, add a Schedule Trigger node and configure it to run every Monday at 6 AM to match Emily’s briefing time. Once activated, the node (shown with a calendar-and-clock icon) starts workflow runs automatically at that time, with no manual intervention.
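
If you prefer the node’s Cron mode over the interval fields, a Monday 6 AM schedule looks like this in standard five-field cron syntax:

0 6 * * 1

That is: minute 0, hour 6, any day of month, any month, weekday 1 (Monday).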

Common Mistake: Forgetting to adjust the timezone if your team is in a different zone — confirm timezone in Schedule Trigger settings.

Step 2: Set Topics and API Keys

Add a Set node named “Set Data” to define input data like the search topics (each on a new line, e.g., Donald Trump, Politics) and your Jina API key. Emily sets her key here and lists the topics to search Reddit for.

Verify that this node’s output shows a Topics field containing a multi-line string and a Jina API Key field.
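
For reference, the node’s output should look roughly like this (key names assumed to match the expressions used in later steps):

{
  "Topics": "Donald Trump\nPolitics",
  "Jina API Key": "jina_xxxxxxxxxxxx"
}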

Step 3: Split Topics Into Individual Items

Use a Code node named “Split Topics into Items” with JavaScript that splits the multi-line topics string into separate items for parallel Reddit searches.

// Split the multi-line Topics string into one item per topic
const topicsString = $json.Topics;
const topicsArray = topicsString
  .split('\n')
  .map(topic => topic.trim())
  .filter(topic => topic.length > 0); // ignore blank lines
return topicsArray.map(topic => ({ json: { Topic: topic } }));

This breaks down the input into consumable chunks for querying Reddit.

Step 4: Search Posts on Reddit for Each Topic

Add a Reddit node called “Search Posts” with the operation set to “search” across all of Reddit, taking the keyword from the incoming item (e.g., ={{ $json.Topic }} for dynamic input).

Enable “Return All” to gather every matching post, and sort by “hot” to surface trending content.

Common mistakes include invalid OAuth2 setup or exceeding Reddit API rate limits.

Step 5: Filter Posts by Upvotes and URL Conditions

Use an If node called “Upvotes Requirement Filtering” to keep only posts with at least 100 upvotes and a post_hint of ‘link’, while excluding any URL that contains ‘bsky.app’.

The conditions ensure you only keep highly engaging, relevant posts for analysis.
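
In plain JavaScript, the three conditions are roughly equivalent to the following (field names taken from Reddit’s post schema, so verify them against your node’s actual output):

// Keep a post only if all three conditions hold
const keep =
  $json.ups >= 100 &&                         // engagement threshold
  $json.post_hint === 'link' &&               // external link posts only
  !($json.url || '').includes('bsky.app');    // exclude Bluesky links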

Step 6: Set and Format Data Fields

Add a Set node named “Set Reddit Posts” to rearrange and store the useful post attributes, including title, subreddit, upvotes, comment count, URL, date, and post ID, for downstream nodes.

Step 7: Remove Duplicate URLs Keeping Highest Upvoted

Add a Code node “Remove Duplicates” that deduplicates posts by URL, discarding redd.it short links, and keeps the highest-upvoted post for each URL.

// Eliminate duplicate URLs, keeping the copy with the most upvotes
const inputItems = $input.all();
const uniqueItemsMap = new Map();
for (const item of inputItems) {
  const url = item.json.URL;
  // Skip redd.it short links entirely
  if (url && url.includes("redd.it")) continue;
  const upvotes = parseInt(item.json.Upvotes, 10) || 0;
  const existing = uniqueItemsMap.get(url);
  // Keep this item if the URL is new or it outscores the stored copy
  if (!existing || upvotes > (parseInt(existing.json.Upvotes, 10) || 0)) {
    uniqueItemsMap.set(url, item);
  }
}
return Array.from(uniqueItemsMap.values());

Step 8: Process Each Post One-by-One in Batches

Use a SplitInBatches node named “Loop Over Items” with a batch size of 1 so posts are processed one at a time, which keeps memory use low and spaces out API calls.

Step 9: Set Data for Each Loop Item

Add a Set node “Set for Loop” to ensure all necessary post data fields are available within each loop iteration for further processing.

Step 10: Retrieve Comments for Each Post

Add another Reddit node “Get Comments” using the post ID and subreddit from the loop item to fetch all comments on a post.

Step 11: Extract and Organize Top 30 Comments

Use a Code node “Extract Top Comments” to recursively filter out deleted comments and flatten the Reddit comment tree, keeping only the top 30 comments by score while recording each comment’s depth so the reply structure can be reconstructed. A sketch of this logic follows.
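
This sketch assumes each input item carries Reddit’s raw comment fields (body, author, score, and a nested replies listing); adjust the field paths to match what your “Get Comments” node actually returns:

const comments = [];

// Walk the comment tree depth-first, skipping deleted/removed comments
function walk(node, depth) {
  if (!node || !node.body || node.body === '[deleted]' || node.body === '[removed]') return;
  comments.push({
    author: node.author,
    score: node.score || 0,
    body: node.body,
    depth, // recorded so replies can be re-indented later
  });
  const children = (node.replies && node.replies.data && node.replies.data.children) || [];
  for (const child of children) {
    if (child.kind === 't1') walk(child.data, depth + 1); // 't1' marks a comment
  }
}

for (const item of $input.all()) {
  walk(item.json, 0);
}

// Keep only the 30 highest-scoring comments
comments.sort((a, b) => b.score - a.score);
return comments.slice(0, 30).map(c => ({ json: c }));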

Step 12: Format Comments for Readability

Add another Code node “Format Comments” that converts the filtered comments into indented Markdown, highlighting each comment’s author, score, and content.
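
Continuing with the field names from the previous sketch, the formatting pass could look like this:

// Render each comment as an indented Markdown bullet
const lines = $input.all().map(item => {
  const c = item.json;
  const indent = '  '.repeat(c.depth || 0); // indent replies under their parents
  return `${indent}- **${c.author}** (${c.score} points): ${c.body}`;
});
return [{ json: { FormattedComments: lines.join('\n') } }];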

Step 13: Perform Sentiment and Engagement Analysis of Comments

Use an Anthropic Chat Model node “Comments Analysis” with a detailed prompt that asks the model to assess comment sentiment, engagement, and the narratives driving the discussion.
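
The exact prompt is yours to tune; a condensed example of the kind of instruction that works here:

You are a PR analyst. Given the Markdown-formatted Reddit comments below, report:
(1) overall sentiment (positive, negative, or mixed) with supporting quotes,
(2) the dominant narratives and counter-narratives, and
(3) engagement signals suggesting the story is gaining or losing momentum.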

Step 14: Retrieve News Content from External API

Add an HTTP Request node “Get News Content” that calls the Jina.ai endpoint with the post URL to extract the original news article content for more detailed context.
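
At the time of writing, Jina’s Reader endpoint works by prefixing the target URL, so the node configuration is roughly (verify against Jina’s current docs):

Method: GET
URL: ={{ 'https://r.jina.ai/' + $json.URL }}
Header: Authorization: Bearer <your Jina API key from “Set Data”>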

Step 15: Extract Final Article Text from Stream Data

Add a Code node “Keep Last” to parse the streaming API response and keep only the latest content snippet from the HTTP Request output.
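
A rough sketch, assuming the response arrives as newline-separated chunks in a data field (your response format may differ):

// Keep only the last non-empty chunk of a streamed response
const raw = $json.data || '';
const chunks = raw.split('\n\n').map(c => c.trim()).filter(Boolean);
return [{ json: { Article: chunks[chunks.length - 1] || '' } }];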

Step 16: Analyze News Content Together with Reddit Insights

Add an Anthropic Chat Model node “News Analysis” that evaluates the news content along with Reddit metrics and comment sentiment to gauge virality and PR angles.

Step 17: Generate a Full PR Stories Report

Add another Anthropic Chat Model node “Stories Report” to synthesize all analysis results into actionable PR story ideas with priority rankings and a strategic roadmap.

Step 18: Format the Final Report for Export

Add a Set node “Set Final Report” to assemble the combined textual report, merging metrics and AI insights into a plain-text string ready for file conversion.

Step 19: Convert Report to Text File

Add a Convert to File node to transform the final report string into a UTF-8 encoded text file. The filename is built dynamically from the leading story headline.

Step 20: Process Files for Final Delivery

Add a second SplitInBatches node “Loop Over Items” and then an Aggregate node to group all report files for compression.

Step 21: Merge Binary Files into One

Add a Code node “Merge Binary Files” to collect every binary property from the incoming items into a single item so they can all be zipped together; a sketch follows.
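
This sketch assumes the report files arrive as binary properties on separate items:

// Gather every binary property into one item for zipping
const merged = {};
let i = 0;
for (const item of $input.all()) {
  for (const key of Object.keys(item.binary || {})) {
    merged[`file_${i++}`] = item.binary[key]; // rename keys to avoid collisions
  }
}
return [{ json: {}, binary: merged }];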

Step 22: Compress Files into a ZIP Archive

Add a Compression node “Compress files” to create a ZIP archive whose name includes the date and a random suffix so each upload is unique.
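
One way to build such a filename as an n8n expression, assuming the Luxon-based $now helper:

={{ $now.toFormat('yyyy-MM-dd') }}-pr-report-{{ Math.random().toString(36).slice(2, 8) }}.zip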

Step 23: Upload ZIP Archive to Google Drive

Add a Google Drive node “Google Drive6”, configured with your Drive and destination folder ID, to upload the ZIP file.

Step 24: Share Uploaded File Publicly

Add another Google Drive node “Google Drive7” that takes the uploaded file ID and sets the permission role to “reader” with type “anyone”, so anyone with the link can access it.

Step 25: Notify Team Via Mattermost

Finally, add an HTTP Request node “Send files to Mattermost3” posting a JSON message with the file download link to your designated Mattermost channel for immediate team awareness.
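
Mattermost incoming webhooks accept a simple JSON body; the link field name depends on your Google Drive node’s output (webViewLink is the usual Drive API field, but confirm it in the node’s result):

{
  "text": "Weekly PR report is ready: {{ $json.webViewLink }}"
}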

5. Customizations ✏️

  • Adjust Topics: Change the multiline string in “Set Data” → Topics to track different themes like “Technology” or “Health” for other PR segments.
  • Upvote Threshold: Modify the numeric value in “Upvotes Requirement Filtering” node from 100 to a higher or lower number based on desired engagement level.
  • Comment Count: In the “Extract Top Comments” code node, change the slice count from 30 to a different number to widen or narrow comment analysis.
  • File Naming Convention: Update the “Convert to File” node filename expression to include dates or client names for personalized report files.
  • Mattermost Channel: Change the webhook URL and payload in “Send files to Mattermost3” to notify a different team or channel.

6. Troubleshooting 🔧

Problem: “Reddit API returns 401 Unauthorized”
Cause: Missing or incorrect OAuth2 Reddit credentials.
Solution: Go to Credentials → Update Reddit OAuth2, ensure token is valid and assigned to Reddit nodes.

Problem: “HTTP Request to Jina API fails or times out”
Cause: API key missing or invalid, or network issues.
Solution: Verify the Jina API key in the “Set Data” node, check your network, and retry after a delay if you hit rate limits.

Problem: “Google Drive upload permission denied”
Cause: OAuth2 token expired or wrong folder access.
Solution: Reauthorize Google Drive credential, confirm folder ID and permissions.

7. Pre-Production Checklist ✅

  • Verify all API credentials are added and active in n8n.
  • Test Reddit search for one topic to confirm search and filtering works.
  • Test comment extraction and formatting on a sample post.
  • Check Jina API response correctness for news content retrieval.
  • Run full workflow in test mode to ensure outputs and report generation match expectations.

8. Deployment Guide

Activate the workflow by toggling the switch in n8n once you’ve verified that all nodes are configured and credentials are set. The scheduled trigger will then run the workflow weekly without manual input.

Monitor workflow executions from the n8n dashboard to catch errors and review logs. Consider setting up alerts or retries for critical API failure nodes.

9. FAQs

Q: Can I track topics other than “Donald Trump” or “Politics”?
A: Yes, simply update the topics list in the “Set Data” node.

Q: Does it consume high API credits?
A: Reddit and Anthropic usage scales with post and comment volume; keep costs in check with the upvote threshold, batch size, and comment count settings.

Q: Is the data secure?
A: All credentials use OAuth2 with encrypted storage in n8n; use private credentials only.

10. Conclusion

By following this guide, you have automated the weekly identification of trending Reddit stories, extracted top public sentiment insights, analyzed news content, and produced a rich PR report ready for your team.

This automation saves Emily and any digital PR professional 3-4 valuable hours per week, improves the quality of story ideas, and accelerates campaign pitch preparation. Next steps include expanding topics, integrating other social platforms, or automating email pitches directly.

With this powerful n8n workflow, your digital PR team can stay ahead of trends effortlessly, turning social media chatter into strategic opportunities.
