Automate Trending GitHub Content Sharing with n8n & AI

This n8n workflow automatically curates trending GitHub discussions from Hacker News, enriches them with AI-generated social media posts, and publishes to Twitter and LinkedIn. Save hours on content creation while ensuring fresh, engaging tech content on your feeds.
Workflow Identifier: 1667
NODES in Use: httpRequest, code, airtable, markdown, twitter, linkedIn, telegram, wait, scheduleTrigger, noOp, merge, openAi

1. Opening Problem Statement

Meet Alex, a social media manager at a fast-paced tech startup. Every day, Alex spends hours sifting through dozens of tech news sites and Hacker News to find interesting GitHub projects worth sharing with her followers on Twitter and LinkedIn. This manual curation process is time-consuming, error-prone, and often leads to duplicated posts or missed trending projects, costing Alex about 4-5 hours weekly and putting audience engagement at risk.

What if Alex could automate this entire pipeline, from discovering trending GitHub repositories on Hacker News to generating polished, platform-optimized social media posts and publishing them seamlessly? This is exactly what the “Social Media AI Agent – Telegram” n8n workflow achieves, helping social media teams like Alex’s focus on strategy instead of manual busywork.

2. What This Automation Does

This workflow brings multiple steps and tools together into one smooth content curation and distribution process:

  • Crawls the Hacker News homepage every 6 hours to fetch the latest discussions.
  • Extracts GitHub links and detailed metadata using a custom Python code node running BeautifulSoup.
  • Filters out already posted items by searching an Airtable base configured as a content database.
  • Visits each GitHub page to gather detailed project insights and converts the HTML content to Markdown for clarity.
  • Uses OpenAI (GPT-4o-mini) to generate social media posts tailored specifically for Twitter and LinkedIn, following strict style guidelines.
  • Stores posts in Airtable and notifies via Telegram for content owner review with a 5-minute buffer.
  • Automatically posts to Twitter (now X) and LinkedIn after approval.

By automating these tasks, Alex can save at least 4 hours per week, eliminate human errors in content duplication, and ensure consistent, timely social media updates about trending open source tech projects.

3. Prerequisites ⚙️

  • n8n instance (cloud or self-hosted) 🔌
  • OpenAI account with API access 🔑
  • Airtable account with a base configured to store posts 📊
  • Twitter developer account with OAuth2 credentials for posting to X 📧
  • LinkedIn developer account with OAuth2 credentials for posting LinkedIn content 💬
  • Telegram bot token and chat ID for notifications 📱
  • Basic familiarity with Python for code node customization (optional) 🐍

4. Step-by-Step Guide

Step 1: Setting Up Scheduled Trigger

Navigate to the n8n dashboard and add a Schedule Trigger node. Configure it to run every 6 hours:

  • Click “Add Node” → Search “Schedule Trigger” → Select it.
  • In settings, choose “Every 6 Hours” interval.
  • This will ensure the workflow checks Hacker News regularly for new GitHub project discussions.

After saving, your trigger node is ready to start the workflow automatically on schedule.

Step 2: Crawling Hacker News Homepage

Add an HTTP Request node named “Crawl HN Home”:

  • Set the URL to https://news.ycombinator.com/
  • Enable “Full Response” to get all HTML content
  • This node pulls the entire Hacker News homepage HTML for scraping.
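
The downstream “Extract Meta” code node reads this HTML from the item’s data field (see items[0]['json']['data'] in the Step 3 snippet), so if you change how the response is returned, update that reference accordingly.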

Step 3: Parsing Hacker News HTML to Extract GitHub Links

Add a Python Code node “Extract Meta” right after the HTTP Request:

  • Install the beautifulsoup4 and simplejson packages at runtime via micropip, then parse the HTML with BeautifulSoup.
  • Extract all posts from Hacker News that hyperlink to GitHub repositories.
  • Capture data points: post ID, title, URL, site, score, author, age, comments, and Hacker News item URL.

Here’s the key snippet from the Python code node that does this:

import asyncio
import micropip

async def install_packages():
    await micropip.install("beautifulsoup4")
    await micropip.install("simplejson")

asyncio.get_event_loop().run_until_complete(install_packages())

import simplejson as json
from bs4 import BeautifulSoup

html_content = items[0].get('json', {}).get('data', '')
soup = BeautifulSoup(html_content, 'html.parser')
github_posts = []

posts = soup.find_all('tr', class_='athing submission')
for post in posts:
    post_id = post.get('id')
    title_line = post.find('span', class_='titleline')
    if not title_line:
        continue
    title_tag = title_line.find('a')
    if not title_tag:
        continue
    title = title_tag.get_text(strip=True)
    url = title_tag.get('href', '')
    if 'github.com' not in url.lower():
        continue
    site_bit = title_line.find('span', class_='sitebit comhead')
    site = site_bit.find('span', class_='sitestr').get_text(strip=True) if site_bit else ''
    subtext_tr = post.find_next_sibling('tr')
    if not subtext_tr:
        continue
    subtext_td = subtext_tr.find('td', class_='subtext')
    if not subtext_td:
        continue
    score_span = subtext_td.find('span', class_='score')
    score = score_span.get_text(strip=True) if score_span else '0 points'
    author_a = subtext_td.find('a', class_='hnuser')
    author = author_a.get_text(strip=True) if author_a else 'unknown'
    age_span = subtext_td.find('span', class_='age')
    age_a = age_span.find('a') if age_span else None
    age = age_a.get_text(strip=True) if age_a else 'unknown'
    comments_a = subtext_td.find_all('a')[-1] if subtext_td.find_all('a') else None
    comments_text = comments_a.get_text(strip=True) if comments_a else '0 comments'
    hn_url = f"https://news.ycombinator.com/item?id={post_id}"
    post_metadata = {
        'Post': post_id,
        'title': title,
        'url': url,
        'site': site,
        'score': score,
        'author': author,
        'age': age,
        'comments': comments_text,
        'hn_url': hn_url
    }
    github_posts.append(post_metadata)
output = [{'json': post} for post in github_posts]
return output

After running, this node outputs an array of posts with all metadata for further filtering.

Step 4: Avoid Duplicate Posting Using Airtable Search

Add an Airtable node (“Search Item”) connected to the extracted meta node:

  • Configure it to search by post ID in your Airtable base (a formula example follows below).
  • It returns any matching records so duplicates can be filtered out downstream.
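
For example, a filter formula along the lines of {Post} = '{{ $json.Post }}' matches records by Hacker News post ID; this assumes the Airtable field is named Post, as in the metadata built in Step 3, so adjust it to your base’s schema.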

Step 5: Filtering Unique GitHub Posts

Add a JavaScript Code node named “Filter Unposted Items”:

  • Reads the list of posts already in Airtable.
  • Filters and passes only new, unposted GitHub repositories downstream.

Snippet from its code:

// Airtable records returned by "Search Item" carry an "id" field; freshly
// scraped Hacker News items do not, which is how the two streams are told apart.
const items = [];

// Collect the post IDs that already exist in Airtable.
const processedPosts = new Set(
  $input.all().filter(item => item.json.id).map(item => item.json.Post)
);

// Keep only scraped items whose post ID has not been stored yet.
for (const item of $input.all()) {
  if (!item.json.id) {
    if (item.json.Post != undefined && !processedPosts.has(item.json.Post)) {
      items.push(item);
    }
  }
}
return items;

Step 6: Visiting Each GitHub Page for In-Depth Content

Add an HTTP Request node “Visit GH Page”:

  • Use a dynamic URL built from the extracted GitHub link (see the expression example after this list).
  • Fetches the HTML content of the GitHub repository page for more detailed insights.
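
In the URL field, an n8n expression such as {{ $json.url }} (assuming the url field produced by the Extract Meta node in Step 3) pulls each repository link from the incoming item.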

Step 7: Convert GitHub HTML Content to Markdown

Add a Markdown node (“Convert HTML To Markdown”) connected to the GitHub HTTP request:

  • Transforms the raw HTML into readable Markdown format to use for AI post generation.

Step 8: Generate Social Media Content with OpenAI

Add the LangChain OpenAI node (“Generate Content”) configured with GPT-4o-mini:

  • Feed the Markdown details, title, and URL into the AI prompt specialized for Twitter and LinkedIn posts.
  • The AI returns two stylized social media posts following strict guidelines (no emojis, distinct tones, call to action, character limits).
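
The exact prompt lives in the node’s system message. As a rough illustration (not the workflow’s literal wording), it asks the model for one tweet under 280 characters and one longer, more professional LinkedIn post about the repository described in the supplied Markdown, with no emojis, a distinct tone per platform, a closing call to action, and the two posts returned as separate fields so downstream nodes can split and validate them.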

Step 9: Validate Generated Content Format

Add a JavaScript Code node (“Validate Generate Content”) that checks if the AI output contains both Twitter and LinkedIn post content.
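
A minimal sketch of such a check, assuming the generated posts arrive as twitter and linkedin fields on each item (an assumption; adapt the field names to whatever your Generate Content node actually returns):

// n8n Code node (JavaScript). Flags items whose AI output is incomplete
// so the downstream "Filter Errored" node can drop them.
// Field names (twitter, linkedin) are illustrative; match them to your output.
return $input.all().map(item => {
  const content = item.json; // adjust if your OpenAI node nests the output, e.g. under message.content
  const hasTwitter = typeof content.twitter === 'string' && content.twitter.trim().length > 0;
  const hasLinkedIn = typeof content.linkedin === 'string' && content.linkedin.trim().length > 0;
  return {
    json: {
      ...item.json,
      error: !(hasTwitter && hasLinkedIn),
    },
  };
});

Downstream, the “Filter Errored” node from Step 10 then only lets through items where error is false.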

Step 10: Filter Out Any Generation Errors

Add a Filter node (“Filter Errored”) to allow only items without errors to proceed.

Step 11: Save New Posts to Airtable

Add an Airtable node (“Create Item”) to insert new unique posts into your base.

Step 12: Notify Content Owner on Telegram

Add a Telegram node (“Ping Me”) that sends a message with the ready-to-post tweets and LinkedIn content.

Step 13: Waiting Period Before Posting

Add a Wait node (“Wait for 5 mins before posting”) to provide a buffer for review.

Step 14: Post to Twitter (X) and LinkedIn

Add a Twitter (X) node (“X”) and a LinkedIn node to post the generated social media content.

Step 15: Update Post Status in Airtable

Add two Airtable nodes (“Update X Status” and “Update L Status”) to mark posts as done after posting on Twitter and LinkedIn respectively.

5. Customizations ✏️

  • Change Posting Interval: In the Schedule Trigger node, modify the hours interval to your preferred frequency, e.g., every 3 hours instead of 6.
  • Filter by GitHub Topics or Users: Enhance the Extract Meta Python code to keep only GitHub URLs that match specific repository topics or user handles (see the note after this list).
  • Improve AI Prompt Tone: Edit the Generate Content node’s system message to adjust the style—more formal, casual, or technical—to match your brand voice.
  • Additional Social Channels: Add nodes for other platforms like Mastodon or Facebook and adjust content accordingly.
  • Notification Channel: Switch the Telegram node to Slack or email to notify different team members.
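
For the topic/user customization, one option is to extend the 'github.com' check in the Extract Meta snippet with an additional condition, for example only keeping a post when a chosen user handle or topic string also appears in url.lower(); the exact strings to match depend on the repositories you want to surface.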

6. Troubleshooting 🔧

Problem: “Error: No posts found with GitHub links.”

Cause: Hacker News HTML structure changed or no new GitHub links.

Solution: Inspect the Extract Meta Python node selectors, update class names or tags based on new HTML structure, or run a manual test on recent pages.

Problem: “AI output missing Twitter or LinkedIn post content.”

Cause: OpenAI API rate limits or prompt formatting errors.

Solution: Check API usage, refine AI prompt in Generate Content node, and ensure JSON output format is correct.

Problem: “Duplicates appearing in Airtable base.”

Cause: Incorrect filtering logic or missing Airtable search configuration.

Solution: Review Search Item node’s filterByFormula and Filter Unposted Items code for proper duplicate detection.

7. Pre-Production Checklist ✅

  • Confirm all API credentials for Airtable, OpenAI, Twitter, LinkedIn, and Telegram are valid and connected.
  • Test crawler by running the Crawl HN Home node manually; verify GitHub posts output.
  • Check AI prompt responses with sample Markdown input.
  • Run full test cycle to ensure posts are created, notified, and published correctly.
  • Backup Airtable base before deploying to production.

8. Deployment Guide

Activate the n8n workflow by turning it ON in your dashboard. The scheduled trigger will kick off the process automatically every 6 hours. Monitor executions via the n8n execution log for errors. The Telegram notification node keeps you updated in real-time before final posts go live. For scaling, ensure API rate limits are managed and consider upgrading OpenAI plans as needed.

9. FAQs

Q: Can I use GitLab or Bitbucket links instead of GitHub?
A: Yes, by modifying the Extract Meta node’s Python code to recognize those URLs, you can adapt this workflow to other platforms.
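In the Step 3 snippet, that amounts to extending the check if 'github.com' not in url.lower(): to also accept gitlab.com or bitbucket.org URLs.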

Q: Does this workflow consume a lot of OpenAI credits?
A: Since it processes a limited number of posts every 6 hours, the usage is moderate. You can control volume by adjusting schedule frequency.

Q: Is my data secure?
A: The workflow uses OAuth2 tokens and personal API keys. Ensure n8n runs in a secure environment for data protection.

10. Conclusion

By implementing this n8n automation, you transform how you curate and share trending GitHub projects from Hacker News to social media platforms. Alex now saves hours weekly on content discovery and publishing, gains consistency in posting, and boosts audience engagement with fresh, professionally styled posts.

Next automation ideas could include adding sentiment analysis on posts to prioritize content, integrating LinkedIn groups or Twitter threads, or auto-responding to comments via the Telegram bot.

Give this workflow a try and free up your social media team’s time for bigger creative strategies!

Related Workflows

Automate Viral UGC Video Creation Using n8n + Degaus (Beginner-Friendly Guide)

Learn how to automate viral UGC video creation using n8n, AI prompts, and Degaus. This beginner-friendly guide shows how to import, configure, and run the workflow without technical complexity.

AI SEO Blog Writer Automation in n8n

A complete beginner guide to building an AI-powered SEO blog writer automation using n8n.

Automate CrowdStrike Alerts with VirusTotal, Jira & Slack

This workflow automates processing of CrowdStrike detections by enriching threat data via VirusTotal, creating Jira tickets for incident tracking, and notifying teams on Slack for quick response. Save hours daily by transforming complex threat data into actionable alerts effortlessly.

Automate Telegram Invoices to Notion with AI Summaries & Reports

Save hours on financial tracking by automating invoice extraction from Telegram photos to Notion using Google Gemini AI. This workflow extracts data, records transactions, and generates detailed spending reports with charts sent on schedule via Telegram.

Automate Email Replies with n8n and AI-Powered Summarization

Save hours managing your inbox with this n8n workflow that uses IMAP email triggers, AI summarization, and vector search to draft concise replies requiring minimal review. Automate business email processing efficiently with AI guidance and Gmail integration.

Automate Email Campaigns Using n8n with Gmail & Google Sheets

This n8n workflow automates personalized email outreach campaigns by integrating Gmail and Google Sheets, saving hours of manual follow-up work and reducing errors in email sequences. It ensures timely follow-ups based on previous email interactions, optimizing communication efficiency.