Opening Problem Statement
Meet Sean, an experienced software engineer with a rich portfolio and years of achievements documented across resume files stored in his Google Drive. Sean spends hours manually answering inquiries about his qualifications, roles, and skills. These repetitive interactions eat into his time, delay responses, and sometimes cost him chances to impress potential employers or clients. Imagine his frustration trying to keep his portfolio information current and instantly accessible without digging through documents or answering the same questions over and over.
This problem intensifies because Sean updates his portfolio frequently: every new resume upload or revision needs to be quickly parsed, indexed, and made accessible in a conversational way. Without automation, Sean risks losing hours every week managing his CV presence, and both accuracy and consistency suffer under the manual approach.
What This Automation Does
This n8n workflow addresses Sean’s challenge by creating a personal AI assistant chatbot that intelligently manages and queries his resume data stored in Google Drive. Here’s what happens when this workflow runs:
- Automatically triggers when Sean uploads or updates his resume files in a specific Google Drive folder.
- Downloads and processes the resume into chunks using a recursive text splitter for better comprehension.
- Converts these chunks into vector embeddings using Google Gemini’s text-embedding model for semantic search capability.
- Stores these vectors in the Pinecone vector database for fast, relevant retrieval during queries.
- Enables real-time chatbot interaction via a Webhook where users can ask questions that the bot answers based on indexed resume data.
- Maintains conversation history with session-aware memory and optionally stores chat data in a NocoDB database for reporting.
- Runs a daily email scheduler that compiles all conversations from that day and sends a formatted summary report to Sean’s Gmail.
By automating these steps, Sean saves significant manual effort weekly. He eliminates hours searching for information or repeating answers, automates updates, and gains professional daily insights into user interactions with his resume chatbot.
Prerequisites ⚙️
- Google Drive account with a dedicated folder for resume/CV documents.
- Pinecone account to create a vector database index (e.g., named “seanrag”).
- Google Cloud Project with Vertex AI API enabled and Google Gemini (PaLM) API key.
- NocoDB account for optional conversation history storage and reporting.
- Gmail account with OAuth access configured in n8n for sending daily email reports.
- n8n account or self-hosting setup to run the workflow with these credential integrations.
Step-by-Step Guide
Step 1: Set Up Google Drive Folder Trigger
Navigate to the Google Drive nodes Google Drive – Resume CV File Created and Google Drive – Resume CV File Updated. Configure each to watch the specific folder where Sean’s resume documents will be added or updated. This ensures the workflow triggers automatically on changes. Use the folder ID of the dedicated resume folder in Drive.
Outcome: The workflow activates whenever a resume file is uploaded or modified.
Common mistake: Forgetting to specify the correct folder ID or permissions can prevent the trigger from working.
Step 2: Download the Resume File
The Download CV File From Google Drive node downloads the file based on the file ID from the trigger. It uses OAuth credentials linked to Google Drive.
Outcome: Resume content is downloaded into the workflow for processing.
Common mistake: Not setting the operation to ‘download’, or mapping the fileId incorrectly.
Step 3: Split the Resume Content into Chunks
The CV content – Recursive Character Text Splitter node breaks large resume text into smaller, overlapping chunks (with a 100-character overlap) to improve embedding quality and semantic analysis accuracy.
Outcome: The resume is segmented into smaller pieces suitable for embedding.
Common mistake: Misconfigured chunk-size or overlap settings, leading to fragmented or incomplete embeddings.
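To make the overlap behavior concrete, here is a simplified sketch in plain JavaScript. This is not the actual LangChain splitter (which also respects separator boundaries such as paragraphs and sentences); it only illustrates the fixed-size-plus-overlap mechanics, with an assumed chunk size of 1000 characters:

```javascript
// Simplified illustration of overlapping chunking. The real Recursive
// Character Text Splitter also splits on separators (paragraphs, then
// sentences, then words); this sketch shows only the size/overlap logic.
function chunkText(text, chunkSize = 1000, overlap = 100) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step back by the overlap amount
  }
  return chunks;
}

// Adjacent chunks share their trailing/leading 100 characters, so a
// sentence cut at a chunk boundary still appears whole in one chunk.
const sample = Array.from({ length: 2500 }, (_, i) =>
  String.fromCharCode(65 + (i % 26))
).join("");
const chunks = chunkText(sample, 1000, 100);
```

Because each chunk starts 900 characters after the previous one, the last 100 characters of chunk N equal the first 100 characters of chunk N+1.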
Step 4: Load the Document Data
CV File Data Loader converts the chunked binary data into a usable text format for embedding.
Outcome: Document content is ready for the embedding process.
Common mistake: Incorrect binary field name or data type mismatch.
Step 5: Generate Embeddings Using Google Gemini
The Embeddings Google Gemini node uses the model ‘models/text-embedding-004’ to create vector embeddings representing the semantic meaning of each text chunk.
Outcome: Each chunk is transformed into a vector suitable for vector database search.
Common mistake: Using wrong model names or not providing the right API credentials.
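If you want to sanity-check the embedding call outside n8n, the REST request for `models/text-embedding-004` looks roughly like the sketch below. The endpoint and field names follow Google's public Generative Language API; verify them against the current Google docs, and treat `GEMINI_API_KEY` as a placeholder:

```javascript
// Sketch of the embedding request made behind the scenes (field names per
// the Generative Language API's embedContent method; confirm against
// current Google documentation before relying on this shape).
function buildEmbeddingRequest(chunk, apiKey) {
  return {
    url:
      "https://generativelanguage.googleapis.com/v1beta/" +
      "models/text-embedding-004:embedContent?key=" + apiKey,
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "models/text-embedding-004",
      content: { parts: [{ text: chunk }] },
    }),
  };
}

const req = buildEmbeddingRequest(
  "Sean – Senior Software Engineer, 10+ years experience",
  "GEMINI_API_KEY" // placeholder, not a real key
);
// Sending this with fetch() returns { embedding: { values: [...] } }.
```

A mismatch between this model name and the one configured in the node is the quickest way to reproduce the "wrong model name" failure mentioned above.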
Step 6: Store Embeddings in Pinecone Vector Store
The Pinecone – Vector Store for CV Content node inserts the generated embeddings into the Pinecone index named ‘seanrag’.
Outcome: The resume data is indexed for quick semantic retrieval during chat queries.
Common mistake: Incorrect Pinecone index name or missing API key.
Step 7: Configure the Chat API Webhook Endpoint
The Chat API – webhook node listens for POST requests at the ‘/chat’ endpoint. When a chat input is received, it forwards the question to the Personal CV AI Agent Assistant.
Outcome: Enables real-time interaction with an AI chatbot based on Sean’s resume data.
Common mistake: Not updating the webhook URL, or testing with malformed POST data.
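A quick way to exercise the endpoint is a small script like the one below. The field names `chatInput` and `sessionId` are assumptions here — match them to whatever your webhook node actually expects, and replace `N8N_BASE_URL` with your instance URL:

```javascript
// Hypothetical payload shape for the /chat webhook. The "chatInput" and
// "sessionId" field names are assumptions -- align them with the fields
// your Chat API webhook node reads.
function buildChatPayload(question, sessionId) {
  return { chatInput: question, sessionId };
}

// Example call (uncomment and set N8N_BASE_URL to test against a live
// instance):
// const res = await fetch(`${N8N_BASE_URL}/webhook/chat`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(
//     buildChatPayload("What are Sean's main skills?", "demo-session")
//   ),
// });

const payload = buildChatPayload("What are Sean's main skills?", "demo-session");
```

Keeping the session ID stable across requests is what lets the memory node (Step 9) thread a multi-turn conversation together.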
Step 8: Set Up Personal CV AI Agent Assistant
This @n8n/n8n-nodes-langchain.agent node uses Google Gemini’s language model to answer queries. It consults the Pinecone vector store ‘seanrag’ to retrieve relevant context for precise answers, acting as Sean’s smart assistant.
Outcome: Users get accurate, context-aware answers about Sean’s resume in chat form.
Common mistake: Misconfiguring the system message prompt, which defines Sean’s background and answering style.
Step 9: Manage Chat Memory for Session Awareness
The Chat Memory – Window Buffer node stores recent conversation history keyed by session input, allowing the chatbot to maintain context across messages.
Outcome: Conversations feel more natural and contextually linked.
Common mistake: Incorrect session key setup causing context loss or mixup.
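Conceptually, the window buffer keeps only the last N messages per session key, so each user's context stays isolated. A minimal sketch (not the actual node implementation) looks like this:

```javascript
// Minimal sketch of window-buffer memory: retain only the most recent
// N messages per session key, so contexts never bleed across users.
class WindowBufferMemory {
  constructor(windowSize = 5) {
    this.windowSize = windowSize;
    this.sessions = new Map(); // sessionId -> message array
  }
  add(sessionId, role, text) {
    const history = this.sessions.get(sessionId) || [];
    history.push({ role, text });
    // Drop the oldest messages once the window overflows.
    this.sessions.set(sessionId, history.slice(-this.windowSize));
  }
  get(sessionId) {
    return this.sessions.get(sessionId) || [];
  }
}

const memory = new WindowBufferMemory(2);
memory.add("alice", "user", "What roles has Sean held?");
memory.add("alice", "assistant", "Senior engineering roles at ...");
memory.add("alice", "user", "And his key skills?");
memory.add("bob", "user", "Hello");
// "alice" keeps only her last 2 messages; "bob" has his own history.
```

This also shows why a wrong session key causes the "context mixup" mistake above: two users sharing one key would share one message window.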
Step 10: Save Conversation History via API Webhook
The Save Conversation API – Webhook and Save Conversation – NocoDB nodes accept POST requests to save chat sessions into a NocoDB table. A corresponding Save Conversation API Webhook – Response node confirms the successful save.
Outcome: Chat histories are stored for analytics and review.
Common mistake: Not configuring allowed origins, causing CORS errors during frontend integration.
Step 11: Schedule Daily Conversation Summary Email
The Schedule Trigger runs daily at 18:00 and queries the NocoDB – get all todays conversation node to collect all chats from that day.
Outcome: All of that day’s conversations are retrieved, ready for grouping.
Common mistake: Timezone mismatch affecting when the email is sent.
Step 12: Group and Format Conversation Data for Email
The Group Conversation By Unique Session + Email – Code node groups chats by session id and email address, then passes the grouped data to the Format HTML Display For email node that generates an email-friendly HTML summary.
Outcome: A clean, readable daily report showcasing user interactions.
Common mistake: Errors in JavaScript grouping logic causing empty or misgrouped data.
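The grouping logic in the Code node amounts to bucketing rows by a composite key. A sketch of that idea follows; the field names `sessionId`, `email`, and `message` are assumptions and should be matched to your actual NocoDB column names:

```javascript
// Sketch of the Code node's grouping step: bucket conversation rows by
// a "sessionId::email" composite key. Field names here ("sessionId",
// "email", "message") are assumptions -- adapt to your NocoDB schema.
function groupConversations(rows) {
  const groups = {};
  for (const row of rows) {
    const key = `${row.sessionId}::${row.email}`;
    (groups[key] = groups[key] || []).push(row);
  }
  return groups;
}

const grouped = groupConversations([
  { sessionId: "s1", email: "a@x.com", message: "Hi" },
  { sessionId: "s1", email: "a@x.com", message: "What skills?" },
  { sessionId: "s2", email: "b@y.com", message: "Hello" },
]);
// grouped -> { "s1::a@x.com": [2 rows], "s2::b@y.com": [1 row] }
```

If the incoming rows use different field names than the code expects, every row collapses into one `undefined::undefined` bucket, which is the typical cause of the "empty or misgrouped data" symptom above.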
Step 13: Send Report via Gmail
The Send Report To Gmail node mails the HTML conversation report to Sean’s configured Gmail address with a subject line including the current date.
Outcome: Sean receives a daily summary of AI chatbot conversations for review.
Common mistake: Gmail OAuth tokens not authorized, causing send failures.
Customizations ✏️
- Change Vector Store Index Name: In the Pinecone nodes, update the index name “seanrag” to any preferred vector database index to use a different data source.
- Modify System AI Assistant Message: Edit the “systemMessage” text in the Personal CV AI Agent Assistant node to adjust how Sean’s assistant responds, e.g., tone, persona, or additional info.
- Adjust Chunk Overlap and Size: Tune settings in the Recursive Character Text Splitter to control chunk size and overlap for larger or smaller documents, optimizing embedding quality.
- Customize Daily Report Email Format: Modify the HTML template in the Format HTML Display For email node, changing colors, fonts, or layout to suit Sean’s branding.
- Enable/Disable Conversation History Logging: Toggle the Save Conversation API – Webhook and NocoDB nodes to control whether session data gets saved, useful for privacy control.
Troubleshooting 🔧
Problem: “Google Drive trigger not firing when file is updated or created.”
Cause: Incorrect folder ID or insufficient Google Drive API permissions.
Solution: Verify the correct folder ID is set in the trigger node and ensure OAuth scopes include Google Drive file read access.
Problem: “Pinecone API refuses insertion or retrieval.”
Cause: Wrong API key, index name mismatch, or region misconfiguration.
Solution: Double-check Pinecone API key credentials and that the index exists and matches the configured name exactly.
Problem: “Chatbot responds with ‘I cannot find the answer in the available resources.’ unexpectedly.”
Cause: Resume vectors were not properly inserted or up-to-date.
Solution: Re-upload or update the resume file to trigger vector regeneration and confirm Pinecone vector store insertion succeeded.
Problem: “Daily email summary is empty.”
Cause: NocoDB query for today’s conversations returns no results or data grouping code error.
Solution: Confirm conversations are being saved in NocoDB correctly and verify that the JavaScript grouping code executes without errors.
Pre-Production Checklist ✅
- Verify Google Drive API credentials have correct scopes.
- Confirm Pinecone index exists and API key is valid.
- Test Google Gemini API connectivity and valid keys.
- Ensure webhook endpoints are publicly accessible for chat and save conversation APIs.
- Test chat interaction manually via curl or Postman.
- Validate the NocoDB configuration and connectivity for storing chats.
- Run the scheduled trigger and confirm email delivery time and formatting.
- Backup credentials and workflow configurations before deployment.
Deployment Guide
Activate the workflow in n8n by toggling its status to “Active.” Ensure all necessary credentials (Google Drive, Pinecone, Google Gemini, NocoDB, Gmail) are correctly configured in n8n’s credentials manager.
Monitor webhook logs and node executions to confirm chat requests are handled promptly and conversation data is saved. The daily report email will automatically send at your configured hour.
If self-hosting, consider using a VPS provider like Hostinger (https://buldrr.com/hostinger) for better uptime and secure domain setup.
FAQs
Can I use a different vector database instead of Pinecone?
Yes, but you will need to adapt the vector store nodes accordingly. Pinecone is supported out of the box in this workflow.
Does the Google Gemini API usage incur costs?
Yes, Google Gemini usage may incur charges depending on your Google Cloud plan. Monitor usage to avoid surprises.
Is my data secure when using this chatbot?
All data stays within the authorized APIs and Pinecone vector store. This workflow does not expose sensitive data publicly, but follow best practices on credential security.
Can this workflow handle multiple concurrent users?
Yes. Session memory is handled by the Chat Memory – Window Buffer node, which keys history by session ID, so each user’s context stays isolated from other concurrent conversations.
Conclusion
By implementing this detailed n8n workflow, you’ve transformed Sean’s manual, time-consuming resume management into a dynamic AI-powered chatbot experience. Sean now automatically indexes his updated CVs, answers user questions instantly with contextual understanding, and receives daily engagement reports via email.
This automation saves Sean multiple hours weekly, improves his professional representation, and streamlines communication. Next, consider expanding the chatbot to handle cover letters, project portfolios, or integrate LinkedIn data for even richer AI-driven personal branding.
Let’s make your personal portfolio smart, interactive, and effortless with n8n and Google Gemini!