Automate CV Screening with n8n, OpenAI, and Supabase

This workflow automates resume screening by extracting CV data from PDFs, analyzing candidate-job fit with OpenAI, and storing the results in Supabase. It removes the hassle of manual screening, saving recruiters hours and improving hiring accuracy.
Workflow Identifier: 1717
NODES in Use: manualTrigger, set, httpRequest, extractFromFile

Opening Problem Statement

Meet Anna, a busy recruiter at a fast-growing tech startup. Every week, she receives hundreds of CVs for various software engineering roles. Manually sifting through these documents to identify qualified candidates not only consumes hours each day but also increases the risk of overlooking emerging talent or making biased decisions. Anna’s team struggles to keep up, slowing down the hiring process and risking losing top applicants to competitors.

This workflow is tailored to solve Anna’s exact problem by automating CV screening, especially for tech roles demanding detailed skill matching. Instead of spending days reviewing resumes, Anna wants an automated system that extracts relevant candidate information from PDF CVs, evaluates the fit against a specific job description, and stores insightful results for easy access and auditing. This boosts efficiency, reduces manual errors, and supports better hiring decisions.

What This Automation Does

Once triggered, this n8n workflow completes several critical recruitment steps automatically, including:

  • Downloading the candidate’s CV directly from a public URL.
  • Extracting textual data from the PDF CV to prepare it for analysis.
  • Sending the extracted CV text and job description to OpenAI’s API for detailed evaluation.
  • Receiving a structured JSON response that includes a match percentage, a candidate summary, and detailed reasons why the candidate fits or doesn’t fit the role.
  • Parsing and saving the evaluated JSON data in a format ready for storage or further processing.
  • Improving hiring accuracy by relying on consistent, AI-driven candidate evaluation criteria.

By automating these steps, recruiters like Anna can save hours per candidate screened, reduce subjective errors, and quickly identify the best applicants for each role.

Prerequisites ⚙️

  • n8n Automation Platform Account (Cloud or self-hosted) 🔌
  • Access to candidate CVs via direct public URLs 🔐 (e.g., Supabase public storage or Dropbox links)
  • OpenAI API Key with access to GPT-4o-mini or compatible chat completion models 🔑
  • Basic understanding of JSON to handle the structured evaluation results
  • Optional: Supabase account or another database/storage solution to save results 📁

Step-by-Step Guide to Build This CV Screening Automation

Step 1: Trigger the Workflow Manually for Testing

Go to your n8n dashboard and add a Manual Trigger node. This lets you test the workflow with example data before automating further.

Where to click: In n8n editor, click Add Node → Trigger → Manual Trigger.

What you see: A trigger node titled “When clicking ‘Test workflow’” appears.

Outcome: This setup lets you run the entire process by clicking “Execute Workflow.”

Common mistake: Forgetting to add this trigger, or running in production without testing first, can cause unintended executions.

Step 2: Set Variables for Example Data

Add a Set node connected from the Manual Trigger. Configure it to store:

  • file_url: direct link to a sample PDF CV (e.g., a Supabase public file URL)
  • job_description: the text of the job’s requirements and details
  • prompt: a detailed instruction string to guide OpenAI on how to evaluate the resume
  • json_schema: a JSON Schema string that defines the response structure OpenAI should return (an illustrative example appears at the end of this step)

Where to click: Add Node → Data → Set

Example value for file_url:
https://cflobdhpqwnoisuctsoc.supabase.co/storage/v1/object/public/my_storage/software_engineer_resume_example.pdf

Common mistake: Using private or expired URLs, causing download failures.
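
For illustration, a minimal value for json_schema might look like the sketch below (stored as a string in the Set node, since it is injected directly into the request body in Step 5). The outer wrapper (name, strict, schema) follows OpenAI’s structured-output format; the inner field names (match_percentage, summary, reasons, suitable) are assumptions you can rename to suit your own prompt and reporting needs:

    {
      "name": "cv_evaluation",
      "strict": true,
      "schema": {
        "type": "object",
        "properties": {
          "match_percentage": { "type": "number" },
          "summary": { "type": "string" },
          "reasons": { "type": "array", "items": { "type": "string" } },
          "suitable": { "type": "boolean" }
        },
        "required": ["match_percentage", "summary", "reasons", "suitable"],
        "additionalProperties": false
      }
    }

Your prompt variable should then instruct the model to score the CV against the job description and fill exactly these fields, so the evaluation stays consistent across candidates.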

Step 3: Download the Resume File

Add an HTTP Request node.

Configure it to perform a GET request to {{$json.file_url}} to fetch the candidate CV.

Where: Add Node → Core Nodes → HTTP Request

Settings: Set URL to {{$json.file_url}}. Leave other options default.

Outcome: The binary PDF file will be downloaded into the workflow for further steps.

Common mistake: Not using the dynamic expression {{$json.file_url}}, which results in a static, incorrect URL.

Step 4: Extract Text from the PDF

Add an Extract Data from File node. This node extracts the raw text content from the downloaded PDF resume.

Where: Add Node → Core Nodes → Extract Data from File

Configure: In the node parameters, set Operation to PDF. No additional options are needed.

This extracts readable text from the PDF for analysis.

Common mistake: Uploading unsupported file formats or corrupt files.
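
Once this node runs, each item carries the extracted resume text in a text field, which is exactly what the next step references as $json.text. A simplified view of the output item (the node typically also includes PDF metadata such as page count and document info alongside it):

    {
      "text": "Jane Doe\nSenior Software Engineer\n8 years of experience with Python, AWS, and distributed systems..."
    }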

Step 5: Send Data to OpenAI for Analysis

Add an HTTP Request node to call OpenAI’s API.

Configure:

  • Method: POST
  • URL: https://api.openai.com/v1/chat/completions
  • Authentication: Use an OpenAI API credential stored in n8n
  • Body: JSON payload containing the chat messages:

    {
      "model": "gpt-4o-mini",
      "messages": [
        { "role": "system", "content": "{{ $('Set Variables').item.json.prompt }}" },
        { "role": "user", "content": {{ JSON.stringify(encodeURIComponent($json.text)) }} }
      ],
      "response_format": {
        "type": "json_schema",
        "json_schema": {{ $('Set Variables').item.json.json_schema }}
      }
    }

This sends the CV text along with the job description and instructs OpenAI to respond in a structured manner per the given schema.

Common mistake: Forgetting to encode text properly or mismatching the schema, causing API errors.
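
To see why the next step needs a parsing pass, here is roughly what the relevant part of OpenAI’s raw response looks like: the evaluation arrives as a JSON string inside choices[0].message.content, not as a ready-to-use object. The field names below assume the illustrative schema from Step 2:

    {
      "choices": [
        {
          "message": {
            "role": "assistant",
            "content": "{\"match_percentage\": 82, \"summary\": \"Strong backend profile with close overlap to the role's core stack.\", \"reasons\": [\"5+ years of Python\", \"AWS experience\"], \"suitable\": true}"
          }
        }
      ]
    }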

Step 6: Parse the JSON Response from OpenAI

Add a Set node to parse the raw JSON string returned by OpenAI into a JavaScript object.

Use this exact expression to parse:
={{ JSON.parse($json.choices[0].message.content) }}

This converts OpenAI’s response from a string into an accessible JSON object for downstream usage.

Common mistake: Skipping the parsing step leaves the data fields inaccessible to downstream nodes.
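
After parsing, downstream nodes can reference the evaluation fields with ordinary n8n expressions, for example (again assuming the illustrative schema from Step 2):

    {{ $json.match_percentage }}
    {{ $json.summary }}
    {{ $json.reasons[0] }}

You might, for instance, add an IF node that routes candidates whose match_percentage exceeds your threshold to a “shortlist” branch.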

Customizations ✏️

  • Change Candidate Scoring Threshold: In the prompt variable inside the “Set Variables” node, adjust how strict the matching percentage cutoff is for “suitable” vs. “not suitable.” This lets you tailor the pass/fail criteria.
  • Switch Resume Source: Instead of direct URLs, adapt the Download File node to work with Dropbox, Google Drive, or other storage by changing the URL format and authentication headers.
  • Expand File Format Support: The workflow currently extracts PDFs; add additional extraction nodes or third-party nodes to handle DOCX or plain-text resumes.
  • Save Results to Supabase: Add a database node at the end of the workflow to insert the parsed evaluation data into your SQL table for analytics and audit logs (a sketch of the inserted row follows this list).
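
As one possible shape for that last customization, the row you insert, via n8n’s Supabase node or an HTTP Request node pointed at Supabase’s REST endpoint (https://<your-project>.supabase.co/rest/v1/<table>, with your apikey and Authorization headers), could look like the sketch below. The table and column names are assumptions; adjust them to your own schema:

    {
      "candidate_file_url": "{{ $('Set Variables').item.json.file_url }}",
      "match_percentage": "{{ $json.match_percentage }}",
      "summary": "{{ $json.summary }}",
      "suitable": "{{ $json.suitable }}",
      "evaluated_at": "{{ $now.toISO() }}"
    }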

Troubleshooting 🔧

Problem: “Download File node returns 404 or empty file.”

Cause: The file URL is incorrect, expired, or requires authentication.

Solution: Verify the URL manually in a browser. If private, generate a public link or update authentication headers in the HTTP Request node.

Problem: “OpenAI API call fails or returns errors.”

Cause: Malformed JSON, schema mismatch, or invalid API key.

Solution: Check your JSON payload, validate the schema, and ensure your OpenAI API key is active and properly linked in n8n credentials.

Pre-Production Checklist ✅

  • Confirm the candidate CV URLs are publicly accessible without authentication.
  • Test the PDF extraction node with a variety of CV files to ensure text is correctly extracted.
  • Validate the JSON schema used in the OpenAI call; test with sample prompts.
  • Run the entire workflow end-to-end manually and review the parsed output.
  • If saving to a database like Supabase, verify connectivity and that the table schema matches the parsed output.

Deployment Guide

Activate the workflow in n8n once fully tested. Connect your trigger (manual or webhook) depending on your use case.

Monitor executions via n8n’s built-in execution logs to catch any errors early.

Schedule automatic runs if you want to process incoming CV links periodically.

FAQs

Can this workflow handle DOCX files instead of PDFs?
Currently, it extracts PDFs only, but you can add additional extraction nodes to support DOCX or convert DOCX to PDF first.

Does it use many OpenAI API credits?
The model used is relatively efficient, but cost depends on CV size and frequency of processing.

Is candidate data secure?
Data travels through secured APIs and can be stored safely if using services like Supabase with proper access controls.

Conclusion

By building this automated CV screening workflow with n8n, OpenAI, and Supabase, you can revolutionize your hiring process. Anna and her team now save hours every week and improve their candidate evaluation consistency with AI’s structured, criteria-driven insights. This workflow not only minimizes manual labor but also maximizes your chances of finding the perfect match for every job.

Next, consider automating interview scheduling or integrating candidate scoring dashboards for even smarter recruiting.
