Automate Chat-Based Queries with PostgreSQL and n8n LangChain

Learn how to build a chat automation that interacts with your PostgreSQL database using n8n and LangChain. Streamline querying with natural language, reducing time spent on manual SQL queries and ensuring accurate database insights.
Build this workflow with AI:
Workflow identifier: 1745
Nodes in use: chatTrigger, agent, lmChatOpenAi, postgresTool, memoryBufferWindow


Automate Complex Database Queries via Chat: Maria’s Challenge with PostgreSQL

Maria is a data analyst managing a large PostgreSQL database for her retail company. Each day she receives numerous requests from sales, marketing, and executive teams, all needing quick, precise data insights. Writing custom SQL queries for each one drains hours, invites errors, and creates bottlenecks.

Imagine Maria spending three to four hours a day just translating user questions into SQL and validating the results, only to repeat the cycle the next day. Any misinterpretation of a question risks wrong data, which skews decision-making and costs the company money.

This workflow addresses Maria’s exact pain by letting her converse with her database through chat. It turns natural-language questions into SQL queries and fetches and aggregates data directly, all within n8n’s automation platform using LangChain AI agents. No more manual query writing, wasted time, or data inaccuracies.

What This Automation Does ⚙️

This specific n8n workflow leverages LangChain’s AI agent to provide an intuitive chat interface that interacts with PostgreSQL databases. Here’s precisely what happens when the workflow runs:

  • The chat trigger node activates when a new message arrives, capturing the user’s natural-language query.
  • An AI Agent node (an OpenAI functions agent built on LangChain) interprets the question and decides what SQL to run by consulting the database schema and table structure.
  • The workflow dynamically fetches schema and table definitions from PostgreSQL to understand the database layout on the fly.
  • It executes the custom SQL query generated by the AI agent against the PostgreSQL database.
  • A Chat History node keeps a context window, maintaining conversational flow across multi-turn dialogues.
  • The OpenAI Chat Model handles language generation, turning query results into user-friendly responses.

By automating data querying this way, teams save hours of manual SQL coding per day. Answers are more reliable as the system uses live database insights, reducing human error risk.
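
To make this concrete, a question like “What were our top five products by revenue last month?” might be translated by the agent into SQL along these lines. The table and column names below are hypothetical, not part of the template:

-- Hypothetical schema: public.orders, public.order_items, public.products
SELECT p.product_name,
       SUM(oi.quantity * oi.unit_price) AS revenue
FROM public.order_items oi
JOIN public.products p ON p.product_id = oi.product_id
JOIN public.orders o ON o.order_id = oi.order_id
WHERE o.order_date >= date_trunc('month', CURRENT_DATE) - INTERVAL '1 month'
  AND o.order_date < date_trunc('month', CURRENT_DATE)
GROUP BY p.product_name
ORDER BY revenue DESC
LIMIT 5;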

Prerequisites ⚙️

  • n8n Account (Cloud or Self-hosted) 🔑
  • PostgreSQL Database credentials 🔐
  • OpenAI Account credentials (for the GPT-4o-mini model) 📧
  • Knowledge of your database schema and permission to query metadata tables

For those interested in self-hosting, n8n can run easily on platforms like Hostinger.

Step-by-Step Guide ✏️

Step 1: Set Up PostgreSQL Credentials in n8n

Navigate to n8n’s credentials section:
Click Settings → Credentials → Create New → Select PostgreSQL.
Fill in your database host, port, user, password, and database name.
Save the credentials as Your Postgresql Database Credentials.
Expected outcome: Your PostgreSQL credentials are stored and ready to use.
Common mistake: Incorrect host or port leads to connection errors. Double-check these values.

Step 2: Set Up OpenAI Credentials

Similarly, add OpenAI credentials:
Click Create New in credentials → Choose OpenAI → Enter your API key.
Save it as Your OpenAI Account Credentials.
Expected: OpenAI node can authenticate and send queries.
Common mistake: Using expired or invalid API keys causes authorization failures.

Step 3: Configure the Chat Trigger Node

This node listens for chat messages to start the workflow.
Go to the When chat message received node.
Ensure the webhook is active or set for testing.
No extra parameters are needed here; the default options work well.
Expected: Incoming chat messages start your workflow automatically.
Tip: Use the provided webhook URL to connect your front-end or test the chat trigger.

Step 4: Customize the AI Agent Node

The AI Agent uses LangChain with a specialized OpenAI functions agent.
Check the systemMessage parameter, which instructs the AI to behave as a database assistant.
It references three key tools:
– Execute SQL Query
– Get DB Schema and Tables List
– Get Table Definition
These let it understand your data and generate queries accurately.
Expected: AI agent translates natural language into precise SQL.
Common mistake: Changing the system message may cause unexpected query behavior.

Step 5: Understand Get DB Schema and Tables Node

This PostgreSQL Tool node fetches all schema names and tables.
It runs the SQL:

SELECT 
    table_schema,
    table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
AND table_schema NOT IN ('pg_catalog', 'information_schema')
ORDER BY table_schema, table_name;

Expected: A comprehensive list of user tables with schema names.
Tip: This lets the AI query metadata dynamically, giving it up-to-date context about your database.

Step 6: Set Get Table Definition Node

This node extracts detailed info about columns, data types, nullable status, and foreign keys for any table.
It uses the dynamic parameters {{ $fromAI("table_name") }} and {{ $fromAI("schema_name") }}, which the AI agent fills in.
Expected: Detailed table metadata to aid AI in query formulation.
Common mistake: If the table or schema name doesn’t match the actual database, the query errors out.
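
The exact SQL inside this node isn’t reproduced here, but a typical information_schema query for this job looks roughly like the sketch below; the $fromAI() placeholders are filled in by the agent at run time, and your template’s query may differ in detail:

-- Sketch: column details plus foreign-key targets for one table
SELECT
    c.column_name,
    c.data_type,
    c.is_nullable,
    c.column_default,
    tc.constraint_type,
    ccu.table_name AS referenced_table,
    ccu.column_name AS referenced_column
FROM information_schema.columns c
LEFT JOIN information_schema.key_column_usage kcu
    ON c.table_schema = kcu.table_schema
    AND c.table_name = kcu.table_name
    AND c.column_name = kcu.column_name
LEFT JOIN information_schema.table_constraints tc
    ON kcu.constraint_name = tc.constraint_name
    AND tc.constraint_type = 'FOREIGN KEY'
LEFT JOIN information_schema.constraint_column_usage ccu
    ON tc.constraint_name = ccu.constraint_name
WHERE c.table_schema = '{{ $fromAI("schema_name") }}'
  AND c.table_name = '{{ $fromAI("table_name") }}'
ORDER BY c.ordinal_position;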

Step 7: Configure Execute SQL Query Node

Executes SQL queries generated by the AI agent.
The query parameter uses {{ $fromAI("sql_query", "SQL Query") }} to take in the AI’s output.
Ensure credentials are set for database access.
Expected: Real-time query results from your PostgreSQL database.
Tip: Always schema-qualify table names in queries (e.g., public.orders rather than orders).
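
For instance, here is a well-formed, schema-qualified query the agent might produce for a hypothetical public.orders table (names are illustrative only):

-- Hypothetical example: revenue by category over the last 30 days
SELECT product_category,
       SUM(total_amount) AS revenue
FROM public.orders
WHERE order_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY product_category
ORDER BY revenue DESC;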

Step 8: Integrate OpenAI Chat Model

This node generates the final human-readable response text.
It uses the GPT-4o-mini model and OpenAI credentials.
Expected: Polished replies that summarize or interpret SQL results.
Customization tip: You can swap in any other supported OpenAI model.

Step 9: Manage Chat History with Memory Buffer Window

Keep conversational context by storing several previous messages.
This preserves multi-turn conversations, remembering user queries and AI answers.
Expected: Smooth, continuous chat interactions.
Tip: Adjust the number of past messages kept by changing “Context Window Length.”

Customizations ✏️

  • Change the OpenAI model in the OpenAI Chat Model node to other GPT versions for different performance or cost balance.
  • Adjust context window size in the Chat History node’s “Context Window Length” to keep longer or shorter conversations.
  • Customize the system role message in the AI Agent node to handle specific domain queries, e.g., sales reports, product inventory.
  • Activate the workflow and expose the trigger webhook publicly for real-time chat integration with a website or app beyond n8n editor.
  • Modify the SQL queries in the PostgreSQL nodes to add filters or join additional tables for deeper insights (see the sketch below).
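
As an example of that last customization, if the agent should only ever see a single schema, the schema-listing query from Step 5 could be narrowed like this ('sales' is a placeholder for your own schema name):

SELECT table_schema,
       table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
  AND table_schema = 'sales' -- placeholder: restrict the agent to one schema
ORDER BY table_name;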

Troubleshooting 🔧

  • Problem: “Connection refused” or timeout errors from PostgreSQL nodes.
    Cause: Incorrect database credentials or a network firewall blocking the connection.
    Solution: Verify credentials, whitelist n8n’s IP address, and check the DB server status (a quick connectivity check is sketched after this list).
  • Problem: AI agent returns incorrect or nonsensical SQL queries.
    Cause: Incomplete or outdated database schema info.
    Solution: Run “Get DB Schema and Tables List” node manually to refresh metadata; verify AI system message clarity.
  • Problem: OpenAI authentication failures.
    Cause: Invalid API key or permissions.
    Solution: Regenerate API key from OpenAI dashboard and update credentials in n8n.
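
For the first problem, a quick way to isolate connectivity from everything else is to run a trivial query through the Execute SQL Query node (or psql) before involving the agent:

-- Sanity check: confirms credentials, network path, and server identity
SELECT version(), current_database(), current_user;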

Pre-Production Checklist ✅

  • Confirm PostgreSQL and OpenAI credentials are valid and tested.
  • Test the chat trigger by sending demo questions related to your database.
  • Validate that “Get DB Schema and Tables List” returns current tables.
  • Run sample queries through the AI agent and verify the generated SQL is correct (a handy cross-check query follows this list).
  • Check final chat responses are accurate, relevant, and well formatted.
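
A convenient cross-check for the last two items: ask the agent something you can verify by hand, such as “How many rows does each table have?”, and compare its answer against a known-good query. The statistics view below gives approximate counts, which is enough to spot a wildly wrong answer:

-- Approximate row counts per user table (estimates from the stats collector)
SELECT relname AS table_name,
       n_live_tup AS approx_rows
FROM pg_stat_user_tables
ORDER BY n_live_tup DESC;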

Deployment Guide

Activate the workflow by toggling the switch in n8n, then share the generated webhook URL with your users or front-end interface.

Monitor executions via n8n’s workflow run history to watch for errors or slow queries.

Regularly update OpenAI and PostgreSQL credentials and review schema changes to maintain accuracy.

FAQs

  • Q: Can I use MySQL instead of PostgreSQL?
    A: This workflow relies on PostgreSQL-specific metadata queries, so it needs adjustments for MySQL (a sketch follows these FAQs).
  • Q: Does this consume OpenAI API credits?
    A: Yes. Every call to the GPT model consumes tokens, billed according to your OpenAI plan.
  • Q: Is this secure?
    A: Credentials are stored in n8n’s credential manager, and traffic between services is encrypted in transit. For extra safety, give the workflow a database user with read-only permissions.
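
As a sketch of the MySQL adjustment mentioned above, the schema-listing query from Step 5 would become something like the following; MySQL’s system schemas differ from PostgreSQL’s, and the other metadata queries would need similar rework:

-- Untested MySQL variant of the schema listing
SELECT table_schema,
       table_name
FROM information_schema.tables
WHERE table_type = 'BASE TABLE'
  AND table_schema NOT IN ('mysql', 'information_schema', 'performance_schema', 'sys')
ORDER BY table_schema, table_name;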

Conclusion

You have now built a powerful chat interface that understands your PostgreSQL database and answers natural-language questions automatically. It saves the hours previously lost each day to translating questions into SQL and hunting for data manually.

Not only is data retrieval faster, but it’s also more accurate and conversational, opening new frontiers in how teams interact with databases.

Next, consider extending this workflow to integrate Slack or Microsoft Teams for messaging platform chats or adding caching layers for performance boosts.

Happy automating!
