Track AI Cost Per Client with n8n Langchain Code Node

This n8n workflow lets you track and log AI token usage and cost per client by extracting resume data from uploaded PDFs with Langchain's custom LLM code node, then storing per-request costs in Google Sheets for accurate billing.
Workflow Identifier: 2102
Nodes in use: formTrigger, extractFromFile, set, informationExtractor, code, googleSheetsTool, googleSheets, filter, summarize, gmail


1. Opening Problem Statement

Meet Jim, who runs a small AI-powered resume parsing service. Every day, Jim receives dozens of CVs from clients in PDF format. While the core AI extraction service works smoothly, Jim struggles to track exactly how many AI resources each client consumes and how much to bill them afterwards. Manual tracking is time-consuming, error-prone, and leads to inconsistent billing, sometimes undercharging clients or losing money on heavy usage.

For Jim, not knowing the exact token usage per request means lost revenue and an inability to scale his business confidently. With numerous clients and ongoing AI model costs, he wastes hours every month compiling logs from multiple sources and calculating costs by hand. Jim needs an automated, reliable method to log each AI interaction and calculate costs per client accurately and effortlessly.

2. What This Automation Does ⚙️

This n8n workflow offers Jim an end-to-end solution to his problem by combining PDF resume uploads, AI data extraction, usage tracking, and automatic billing. Here’s what happens when the workflow runs:

  • A client submits a CV in PDF form via a user-friendly n8n form.
  • The PDF is parsed to extract raw text data.
  • Langchain’s custom LLM subnode organizes extracted text into a structured JSON resume format.
  • The workflow taps into Langchain's lifecycle hooks to capture token usage metadata (input and output tokens) and calculate costs precisely.
  • Client usage data and calculated token costs are appended to a Google Sheets log for ongoing record-keeping.
  • At month-end, the workflow aggregates usage for each client and sends an automated invoice email with detailed billing.

By using this automation, Jim saves hours each month on manual logging and billing. It also removes human errors in billing calculations and supports scaling the business with multiple clients effortlessly.

3. Prerequisites ⚙️

  • 📁 Self-hosted n8n instance (required for Langchain Code node compatibility).
  • 📧 Gmail account configured in n8n for sending invoice emails.
  • 📊 Google Sheets account with a pre-created “Client Usage Log” spreadsheet for appending token usage and cost data.
  • 🔑 OpenAI API key with access to models like GPT-4o-mini.
  • 💬 Langchain integration enabled in your n8n environment.

4. Step-by-Step Guide ✏️

Step 1: Set Up the Form Trigger for CV Upload

Navigate to Trigger Nodes and add a Form Trigger. Configure the form titled “CV Parsing Service” with fields:

  • “Upload a file” as a required file input accepting .pdf files.
  • An acknowledgement dropdown confirming the client's billing agreement.

This form will be the public-facing UI where clients submit their resumes. The webhook URL generated by this node will handle incoming submissions.

Common mistake: Forgetting to mark the file upload field as “required” may lead to empty submissions.

Step 2: Parse the Uploaded PDF

Add an Extract from File node and connect it to the form trigger output. In its parameters, select "pdf" as the operation and set the binary property to the uploaded file field (e.g., "Upload_a_file"). This extracts the text content from the PDF for further AI processing.

After running a test submission, you should see the extracted PDF text in the node’s output.

Common mistake: The binary property name must match the form's file field name exactly; a mismatch leaves the node with no file to parse.

Step 3: Set Logging Attributes for Workflow & Client IDs

Insert a Set node to attach variables such as workflow ID, execution ID, and a static client ID (e.g., “12345” for demo purposes). This information is essential for logging each request’s context in the Google Sheets log.

Example settings for the node:

{
  workflow_id: "={{ $workflow.id }}",
  execution_id: "={{ $execution.id }}",
  client_id: "12345"
}

This ensures each logged entry is traceable to the specific run and client.

Step 4: Extract Resume Data Using Langchain Information Extractor

Add the Information Extractor node (from the Langchain integration). Pass the extracted PDF text as input. Configure the node with a manual JSON schema specifying the structured format for resumes, including properties such as name, email, work experience, skills, education, etc.
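
As a rough sketch, a manual JSON schema for resume extraction might look like the following; the property names are illustrative, so adjust them to the fields you actually need:

{
  "type": "object",
  "properties": {
    "name": { "type": "string" },
    "email": { "type": "string" },
    "skills": { "type": "array", "items": { "type": "string" } },
    "work_experience": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "company": { "type": "string" },
          "title": { "type": "string" },
          "start_date": { "type": "string" },
          "end_date": { "type": "string" }
        }
      }
    },
    "education": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "institution": { "type": "string" },
          "degree": { "type": "string" },
          "year": { "type": "string" }
        }
      }
    }
  },
  "required": ["name", "email"]
}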

Test this with sample resumes to ensure the AI extracts structured JSON data successfully.

Common mistake: Not adhering to the schema formatting causes extraction errors.

Step 5: Use Custom LLM Langchain Code Node to Track Token Usage & Cost

The Langchain Code node provides a unique lifecycle hook to tap into the AI response metadata. Configure it with JavaScript code that reads token usage from the OpenAI response (input_tokens, output_tokens), calculates the costs based on predefined per-token prices, and assembles a data row with all logged info.

Here’s the key code snippet to use within the node:

const { ChatOpenAI } = require("@langchain/openai");

// Model and pricing configuration (USD per 1 million tokens for gpt-4o-mini)
const openAIApiKey = "sk-...";
const model = "gpt-4o-mini";
const input_token_cost = 0.150;
const output_token_cost = 0.600;

// Grab the Google Sheets Tool connected to this node's ai_tool input
const tools = await this.getInputConnectionData('ai_tool', 0);
const googleSheetTool = tools[0];

// Logging attributes supplied by the Set node in Step 3
const {
  workflow_id,
  execution_id,
  client_id
} = $input.first().json;

const llm = new ChatOpenAI({
  apiKey: openAIApiKey,
  model,
  callbacks: [
    {
      // Lifecycle hook that fires after every LLM call completes
      handleLLMEnd: async function (output, runId, parentId) {
        const generation = output.generations[0][0];
        const message = generation.message;
        // Assemble one usage row: token counts plus costs from the per-million pricing above
        const row = {
          date: (new Date()).toUTCString(),
          workflow_id,
          execution_id,
          client_id,
          input_tokens: message.usage_metadata.input_tokens,
          output_tokens: message.usage_metadata.output_tokens,
          total_tokens: message.usage_metadata.total_tokens,
          input_cost: (message.usage_metadata.input_tokens / 1_000_000) * input_token_cost,
          output_cost: (message.usage_metadata.output_tokens / 1_000_000) * output_token_cost,
        };
        row.total_cost = row.input_cost + row.output_cost;
        // Append the row to the Google Sheet via the connected tool
        await googleSheetTool.func(row);
      }
    }
  ]
});

// Return the model instance so downstream Langchain nodes can use it
return llm;

This code captures token usage and pushes that data to Google Sheets through the linked tool, enabling precise billing.
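
As a quick sanity check on the math: if a single CV parse consumed 2,000 input tokens and 500 output tokens at the gpt-4o-mini rates above ($0.150 and $0.600 per million tokens), the logged costs would be 2,000 / 1,000,000 × 0.150 = $0.0003 for input and 500 / 1,000,000 × 0.600 = $0.0003 for output, so total_cost for that execution would be $0.0006.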

Step 6: Append Client Usage Data to Google Sheets

The Google Sheets Tool node appends each usage record to the “Client Usage Log” spreadsheet. It’s connected as a tool input to the Langchain Code node and receives the row object with detailed usage metrics.
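
The sheet's header row should contain one column per key in the row object assembled in Step 5, for example:

date, workflow_id, execution_id, client_id, input_tokens, output_tokens, total_tokens, input_cost, output_cost, total_cost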

Ensure your Google Sheets OAuth2 credentials are set properly for authentication.

Common mistake: Using the wrong sheet name or document ID will cause appends to fail silently.

Step 7: Display Extracted Resume JSON Data Back to Client

Add a Form node configured as a completion node. Link it after the data extraction node. It displays the structured JSON resume in a formatted code block to the client.

Use custom CSS in the form for better code readability. This feature provides transparent feedback and validation for clients.

Step 8: Trigger Monthly Aggregation of Client Usage Logs

Add a Schedule Trigger node configured to run on the last day of each month at 18:00. This initiates monthly billing aggregation.

Connect this to a Google Sheets node that retrieves all logged rows for the specific client ID, then pass the results through a Filter node that keeps only rows from the current month.
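
A minimal sketch of a current-month condition, assuming the Filter node compares against the date column written in Step 5 (a GMT date string that JavaScript's Date can parse):

{{ new Date($json.date).getUTCMonth() === new Date().getUTCMonth() && new Date($json.date).getUTCFullYear() === new Date().getUTCFullYear() }}

One way to use this is as a Boolean condition that must evaluate to true.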

Common mistake: Incorrect filter conditions on dates will produce empty data sets.

Step 9: Summarize Monthly Token Usage and Cost

Use the Summarize node to sum “total_cost” and “total_tokens” fields from the filtered monthly logs.

This prepares the data for invoicing.

Step 10: Send Invoice Email via Gmail

Connect the summarized output to a Gmail node to send an invoice email to the client. Customize the email body using n8n expressions to dynamically insert usage, cost, tax, and total payable.
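
A minimal sketch of the email body using n8n expressions, assuming the Summarize node outputs fields named sum_total_tokens and sum_total_cost (the exact names depend on how you configure the Summarize node) and an illustrative flat 10% tax:

Hi,

Here is your AI usage invoice for this month.

Total tokens used: {{ $json.sum_total_tokens }}
Usage cost: ${{ $json.sum_total_cost.toFixed(4) }}
Tax (10%): ${{ ($json.sum_total_cost * 0.1).toFixed(4) }}
Total payable: ${{ ($json.sum_total_cost * 1.1).toFixed(4) }}

Thank you for your business.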

Common mistake: Forgetting to set up OAuth2 credentials or using an incorrect recipient address will cause email failures.

5. Customizations ✏️

  • Change Client ID for Multi-Client Support: In the “Logging Attributes” Set node, replace the static client_id with a dynamic value based on the form input so the workflow supports multiple clients (see the expression sketch after this list).
  • Adjust Token Cost Values: In the Langchain Code node’s JavaScript, change input_token_cost and output_token_cost variables to reflect your current AI provider pricing.
  • Enhance Invoice Formatting: Modify the Gmail node’s HTML email body template to include logos, payment links, or detailed line items.
  • Add More Client Metadata: Extend the Google Sheets schema with additional columns like client email or project codes by updating the Google Sheets Tool and the Langchain Code node row object.
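
For the first customization, here is a hedged sketch of the Set node values with a dynamic client ID. It assumes the Form Trigger node is named “On form submission” and that you add a “Client ID” field to the form; neither exists in the original workflow as described, so adjust both names to match your setup:

{
  workflow_id: "={{ $workflow.id }}",
  execution_id: "={{ $execution.id }}",
  client_id: "={{ $('On form submission').first().json['Client ID'] }}"
}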

6. Troubleshooting 🔧

Problem: “Google Sheets append failing silently or no data recorded.”
Cause: Improper sheet name or document ID in Google Sheets Tool node.
Solution: Double-check Google Sheets document ID and sheet gid match exactly. Test connection with a simple append before running full workflow.

Problem: “Invoices not sending to clients via Gmail node.”
Cause: OAuth2 credentials missing or incorrect recipient email.
Solution: Verify Gmail OAuth2 credentials under node settings and confirm the email address is valid. Send a manual test email.

Problem: “Information Extractor node returns errors or empty JSON.”
Cause: Schema JSON misformatted or input text empty.
Solution: Validate JSON schema syntax carefully and confirm the extracted text from PDF is not empty.

7. Pre-Production Checklist ✅

  • Confirm OpenAI API key is correctly loaded and has quota.
  • Validate Google Sheets document ID, sheet name, and OAuth2 authentication.
  • Test form submission with sample PDF resumes.
  • Verify that token usage logs are appended correctly to the Google Sheet.
  • Run monthly invoice aggregation manually to check correct filtering and summarization.

8. Deployment Guide

To activate, deploy the workflow in your self-hosted n8n instance and enable the schedule trigger for monthly invoicing. Share the form trigger URL publicly or embed it in your website for clients to upload CVs.

Monitor workflow executions via n8n’s execution logs for errors. Set up alerts or notifications in n8n for failed runs to ensure billing continuity.

9. FAQs

Q: Can I replace Gmail with another email service?
A: Yes, n8n supports various email nodes such as SMTP or Outlook. Just configure the corresponding node with your credentials.

Q: Does this workflow consume extra OpenAI API credits?
A: No extra credits beyond the AI model usage itself. The custom Langchain Code node just reads metadata.

Q: Is this workflow secure?
A: Yes. Credentials are stored securely in n8n, and connections to Google Sheets and Gmail use OAuth2 authentication over encrypted HTTPS connections.

10. Conclusion

By following this tutorial, you’ve built a powerful n8n automation that tracks AI token usage and costs per client, extracts valuable structured data from resumes, and automates monthly billing. This saves Jim countless hours of manual work, reduces billing errors, and fosters transparent client chargebacks.

Next steps? Consider expanding to multiple clients by making client IDs dynamic, adding support for multiple document types, or integrating payment gateways to receive online payments automatically.

With this setup, your AI-powered SaaS business gains reliable cost tracking and billing automation — a key competitive advantage!

