Automate Bulk Contact Enrichment with Dropcontact & n8n

This workflow automates the enrichment of business contacts by querying a Postgres database, batching requests to Dropcontact’s API, and updating profiles asynchronously. It saves hours of manual data entry and avoids costly errors for sales teams needing accurate contact data.
Workflow Identifier: 2057
Nodes in Use: Schedule Trigger, Postgres, SplitInBatches, Aggregate, Code, HTTP Request, Wait, SplitOut, Slack, Sticky Note


Opening Problem Statement

Meet Marieke, a business development manager at a fast-growing SaaS company. Every week she struggles to keep an accurate, up-to-date list of key decision-makers for outreach campaigns. Manually searching and verifying contacts consumes at least six hours every Monday and often yields outdated or incomplete contact details, leading to missed sales opportunities and frustrated colleagues.

Marieke’s company stores profiles in a Postgres database, but many entries lack verified email addresses and phone numbers. Manual bulk enrichment is tedious and unreliable. Could there be a way to automate batch contact enrichment, updating thousands of profiles asynchronously with accurate, qualified data, without exhausting API rate limits or coding from scratch?

What This Automation Does

This n8n workflow is tailored to help Marieke and teams like hers automate bulk contact enrichment using the Dropcontact API and a Postgres database. When run, the workflow:

  • Queries the Postgres database for profiles missing emails and filtered by specific domains and titles.
  • Aggregates and transforms queried data into Dropcontact API request format.
  • Splits the data into asynchronous batches of 250 requests to comply with Dropcontact’s 1500/hour rate limit.
  • Sends batch requests to Dropcontact’s batch enrichment API.
  • Implements a timed wait of 10 minutes to allow Dropcontact to process the batch asynchronously.
  • Fetches enriched data results for each batch and updates the Postgres profiles, marking them as enriched.
  • Optionally sends Slack notifications for errors, such as API credit problems.

This process drastically reduces manual data entry time, improves data accuracy, and scales enrichment up to thousands of contacts per hour.

Prerequisites ⚙️

  • A Postgres database account with profile and account tables as per query.
  • An active Dropcontact API key for batch contact enrichment.
  • An n8n account configured with the necessary credentials for Postgres and Dropcontact.
  • Optional: a Slack workspace with webhook or OAuth for error notifications.
  • Basic knowledge of n8n workflow editor.

Step-by-Step Guide

Step 1: Schedule Trigger Setup to Run Automatically

Open your n8n editor, click + Add Node, then select Schedule Trigger. Configure the interval for how often you want this enrichment to run (e.g., daily at midnight). This node initializes the workflow.

Expected: The trigger will fire on schedule and start the workflow.

Common mistake: Forgetting to set interval or timezone, causing unintended runs.

Step 2: Query Profiles Missing Emails From Postgres

Add a Postgres node connected to the Schedule Trigger. Enter the SQL query exactly as follows:

select first_name, last_name, domain, full_name
from accounts a
left join profiles p on a.company_id = p.company_id
where title = 'Bestuurder'
  and p.email is null
  and a.domain != ''
  and domain not in ('gmail.com', 'hotmail.com', 'hotmail.be', 'hotmail%', 'outlook.com', 'telenet.be', 'live.be', 'skynet.be', 'SKYNET%', 'yahoo.com', 'yahoo%', 'msn%', 'hotmail', 'belgacom%')
  and dropcontact_found is null
limit 1000

This filters profiles by title ('Bestuurder') and excludes common consumer email domains. Note that IN compares literal strings, so the entries containing % (such as 'hotmail%') are not treated as wildcards; add NOT LIKE conditions instead if you need pattern matching.

Expected: The node returns up to 1000 profiles requiring enrichment.

Common mistake: Query syntax errors or incorrect table structure.
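The query's intent is to skip consumer email providers and keep only business domains worth enriching. If you want to sanity-check or reuse that exclusion rule outside SQL, here is a minimal Python sketch of it (the domain list mirrors the query; the function name is illustrative):

```python
# Sketch: the generic-domain exclusion from the SQL WHERE clause, expressed in
# Python so it can be unit-tested. The prefixes cover the '%' patterns that the
# SQL IN list could not actually match.
GENERIC_DOMAINS = {
    "gmail.com", "hotmail.com", "hotmail.be", "outlook.com",
    "telenet.be", "live.be", "skynet.be", "yahoo.com",
}
GENERIC_PREFIXES = ("hotmail", "skynet", "yahoo", "msn", "belgacom")

def is_business_domain(domain: str) -> bool:
    """Return True when the domain is worth enriching (not a consumer provider)."""
    d = domain.strip().lower()
    if not d:
        return False
    return d not in GENERIC_DOMAINS and not d.startswith(GENERIC_PREFIXES)

print(is_business_domain("acme.be"))     # True: a business domain
print(is_business_domain("hotmail.fr"))  # False: caught by the prefix rule
```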

Step 3: Split Results into Batches of 250

Add the SplitInBatches node to split the query results into batches, since Dropcontact limits batch submissions to 1,500 requests per hour.

Set Batch Size to 250. Connect this node to the Postgres query node.

Expected: Profiles are chunked into smaller sets for batch processing.

Common mistake: Setting batch sizes too high may cause API rate errors.
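The batching arithmetic is worth confirming: up to 1,000 queried profiles split into chunks of 250 yields at most four batches per run, comfortably under the 1,500/hour limit. A small sketch of what SplitInBatches does (the function name is illustrative):

```python
# Sketch: chunk the queried profiles into batches of 250, mirroring the
# SplitInBatches node's behavior.
def split_in_batches(items: list, batch_size: int = 250) -> list[list]:
    """Return consecutive chunks of at most batch_size items each."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

profiles = [{"full_name": f"Contact {n}"} for n in range(1000)]
batches = split_in_batches(profiles)
print(len(batches))     # 4
print(len(batches[0]))  # 250
```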

Step 4: Aggregate Profile Fields for Transformation

Add the Aggregate node to gather fields like first_name, last_name, domain, and full_name into arrays for transformation.

Expected: Data grouped by field for next processing step.

Step 5: Transform Data for Dropcontact with Python Code Node

Add a Code node, set its language to Python, and paste this script:

# Build one Dropcontact batch payload from the aggregated fields.
output_data = {"data": [], "siren": True}

for item in _input.all():
    data = item.json

    # Each field arrives as an array, courtesy of the Aggregate node.
    first_names = data["first_name"]
    last_names = data["last_name"]
    domains = data["domain"]
    full_names = data["full_name"]

    for first_name, last_name, domain_name, full_name_value in zip(
        first_names, last_names, domains, full_names
    ):
        output_data["data"].append({
            "first_name": first_name,
            "last_name": last_name,
            "website": domain_name,
            # Carry full_name along so results can be matched back to rows.
            "custom_fields": {"full_name": full_name_value},
        })

# Return once, after all items have been processed.
return output_data

This formats data as required for Dropcontact batch API.

Expected: JSON payload ready for batch upload.

Common mistake: Incorrect field names or Python syntax errors.

Step 6: Send Batch Requests to Dropcontact API

Use the HTTP Request node to POST the transformed data to https://api.dropcontact.io/batch. Authenticate with your API key via credentials.

Expected: Dropcontact accepts batch job and returns a request_id.

Common mistake: Missing auth header or malformed JSON body.
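Under the hood, the HTTP Request node performs a POST like the one sketched below. The endpoint matches the one above; the X-Access-Token header name is an assumption, so confirm it against your Dropcontact credential setup before relying on it:

```python
# Sketch: build the POST request the HTTP Request node sends to Dropcontact.
# Only request construction is shown; sending it requires network access.
import json
import urllib.request

DROPCONTACT_BATCH_URL = "https://api.dropcontact.io/batch"

def build_batch_request(payload: dict, api_key: str) -> urllib.request.Request:
    """Prepare the batch POST; payload is the {"data": [...], "siren": True} dict."""
    return urllib.request.Request(
        DROPCONTACT_BATCH_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "X-Access-Token": api_key,  # assumed header name - verify in your docs
        },
        method="POST",
    )

req = build_batch_request({"data": [], "siren": True}, "YOUR_API_KEY")
print(req.full_url)      # https://api.dropcontact.io/batch
print(req.get_method())  # POST
```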

Step 7: Wait for 10 Minutes to Allow Asynchronous Processing

Add a Wait node set to 600 seconds (10 minutes) after the Dropcontact request node.

Expected: The workflow pauses to let Dropcontact process the batch.

Common mistake: Skipping the wait causes the workflow to request results before Dropcontact has finished processing them.

Step 8: Download Processed Batch Results

Add another HTTP Request node to GET results from https://api.dropcontact.io/batch/{{ $json.request_id }}.

This fetches enriched contact details asynchronously.

Expected: Full batch data including emails and phone numbers in the response.

Common mistake: Incorrect request_id usage or missing authentication.
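Below is a sketch of the results request plus a helper that pulls out just the fields the later Postgres update needs. The response shape assumed here (a "data" list with "email", "phone", and "custom_fields" keys) is an assumption based on the fields this workflow writes back; verify it against a real Dropcontact response:

```python
# Sketch: fetch enriched results for a batch and extract the update fields.
# The response layout is an assumption - check it against a live response.
import json
import urllib.request

def build_results_request(request_id: str, api_key: str) -> urllib.request.Request:
    """Prepare the GET used in Step 8 to download a batch's results."""
    return urllib.request.Request(
        f"https://api.dropcontact.io/batch/{request_id}",
        headers={"X-Access-Token": api_key},  # assumed header name
    )

def extract_contacts(response_body: str) -> list[dict]:
    """Keep only the fields the Postgres update step needs."""
    data = json.loads(response_body).get("data", [])
    return [
        {
            "full_name": c.get("custom_fields", {}).get("full_name"),
            "email": c.get("email"),
            "phone": c.get("phone"),
        }
        for c in data
    ]

sample = ('{"data": [{"email": "jan@acme.be", "phone": "+32 2 555 0100",'
          ' "custom_fields": {"full_name": "Jan Peeters"}}]}')
print(extract_contacts(sample)[0]["email"])  # jan@acme.be
```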

Step 9: Split Batch Results for Database Update

Add a SplitOut node to separate the received batch results into individual items.

Expected: Each enriched profile becomes a separate message for the next update step.

Step 10: Update Profiles in Postgres with New Contact Details

Use the Postgres node to update the profiles table:

  • Match rows by full_name
  • Update email, phone, dropcontact_found (set to true), and email_qualification.

Use dynamic expressions to map incoming data fields to the table columns.

Expected: Profiles have updated and verified contacts.

Common mistake: Incorrect mapping or update conditions leading to no changes.
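The update can be expressed as a single parameterized statement. Column names follow the list above; executing it needs a live connection (for example via psycopg2), so only the statement and the parameter mapping are sketched here:

```python
# Sketch: the UPDATE the Postgres node performs, as a parameterized statement
# with a helper that maps one enriched contact onto its parameters.
UPDATE_SQL = """
update profiles
set email = %(email)s,
    phone = %(phone)s,
    email_qualification = %(email_qualification)s,
    dropcontact_found = true
where full_name = %(full_name)s
"""

def build_update_params(contact: dict) -> dict:
    """Map one enriched Dropcontact result onto the update's parameters."""
    return {
        "email": contact.get("email"),
        "phone": contact.get("phone"),
        "email_qualification": contact.get("email_qualification"),
        "full_name": contact["full_name"],  # matching column - must be present
    }

params = build_update_params({"full_name": "Jan Peeters", "email": "jan@acme.be"})
print(params["email"])  # jan@acme.be
```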

Step 11: Optional Slack Notification on Errors

Add a Slack node connected to error output of the batch download to notify the team if API credits are low or if errors occur.

Expected: Error messages posted to a Slack channel for monitoring.

Customizations ✏️

  • Change Target Title in Postgres Query: Modify the SQL in the PROFILES QUERY node to enrich contacts with a different job title, e.g., change Bestuurder to Manager.
  • Adjust Batch Size: In the SplitInBatches node, change batchSize from 250 to fit your API rate limits or volume.
  • Slack Alerts on Different Events: Add or customize Slack messages for successful batches or other statuses by linking nodes differently.
  • Add Email Domain Filters: Update the SQL WHERE clause to include or exclude more domains tailored to your data quality needs.
  • Modify Data Transformation: Edit the Python code in the DATA TRANSFORMATION node to include additional fields or change output structure for Dropcontact API compatibility.

Troubleshooting 🔧

  • Problem: “Invalid API Key” or authentication errors when sending batch requests.

    Cause: Wrong or expired API token in Dropcontact credentials.

    Solution: Re-enter the Dropcontact API key in n8n credentials under the HTTP Request node settings.
  • Problem: “Too Many Requests” or hitting API limits.

    Cause: Batch request sizes or frequency too high.

    Solution: Lower the batch size in the SplitInBatches node and increase the wait time in the Wait node.
  • Problem: Query returns no profiles.

    Cause: Incorrect SQL filters or database connection.

    Solution: Verify SQL query syntax and database connectivity with test queries.
  • Problem: Postgres updates not saving.

    Cause: Wrong mapping of fields or missing matching column.

    Solution: Check the mapping in the Postgres update node, especially matching column full_name.

Pre-Production Checklist ✅

  • Verify n8n credentials for Postgres and Dropcontact are valid and tested.
  • Run SQL queries manually in Postgres client to confirm results match expected profiles.
  • Test batch requests in Dropcontact sandbox or with a small dataset before scaling.
  • Check API rate limits and configure batch sizes and wait times accordingly.
  • Perform trial runs and verify data updates in the database.
  • Backup database before running bulk updates.

Deployment Guide

Activate the workflow in n8n after final testing. Ensure the Schedule Trigger is enabled for routine runs.

Monitor workflow execution logs regularly for failed nodes or errors.

Adjust batch sizes or wait times when scaling up.

Consider integrating with Slack or email for operational alerts.

FAQs

  • Q: Can I use a different database than Postgres?

    A: Yes, but you’d need to adjust the query and update nodes accordingly.
  • Q: Does this consume a lot of Dropcontact API credits?

    A: Batch requests are efficient but do consume credits per contact processed. Monitor usage regularly.
  • Q: Is my contact data safe?

    A: Dropcontact uses secured communication over HTTPS and respects privacy regulations. Ensure your API keys remain secure.
  • Q: Can the workflow handle thousands of contacts daily?

    A: Yes, with appropriate batch sizes and waits the workflow scales well.

Conclusion

By setting up this automated workflow, Marieke now enriches thousands of contact profiles weekly without manual effort. The automation saves her an estimated 6 hours per week, eliminates costly data errors, and ensures up-to-date contact info for the sales team. You’ve learned how to combine Postgres, n8n, and Dropcontact APIs to achieve asynchronous bulk enrichment with batching and error handling.

Next, consider building workflows to automate follow-up email campaigns, integrate enriched contacts with CRM systems, or add other data quality checks. With n8n’s flexibility and Dropcontact’s powerful API, you can keep your contact data fresh and actionable effortlessly. Keep experimenting—automation brings clarity and efficiency to your sales processes!


Related Workflows

Automate Viral UGC Video Creation Using n8n + Degaus (Beginner-Friendly Guide)

Learn how to automate viral UGC video creation using n8n, AI prompts, and Degaus. This beginner-friendly guide shows how to import, configure, and run the workflow without technical complexity.

AI SEO Blog Writer Automation in n8n (Beginner Guide)

A complete beginner guide to building an AI-powered SEO blog writer automation using n8n.

Automate CrowdStrike Alerts with VirusTotal, Jira & Slack

This workflow automates processing of CrowdStrike detections by enriching threat data via VirusTotal, creating Jira tickets for incident tracking, and notifying teams on Slack for quick response. Save hours daily by transforming complex threat data into actionable alerts effortlessly.

Automate Telegram Invoices to Notion with AI Summaries & Reports

Save hours on financial tracking by automating invoice extraction from Telegram photos to Notion using Google Gemini AI. This workflow extracts data, records transactions, and generates detailed spending reports with charts sent on schedule via Telegram.

Automate Email Replies with n8n and AI-Powered Summarization

Save hours managing your inbox with this n8n workflow that uses IMAP email triggers, AI summarization, and vector search to draft concise replies requiring minimal review. Automate business email processing efficiently with AI guidance and Gmail integration.

Automate Email Campaigns Using n8n with Gmail & Google Sheets

This n8n workflow automates personalized email outreach campaigns by integrating Gmail and Google Sheets, saving hours of manual follow-up work and reducing errors in email sequences. It ensures timely follow-ups based on previous email interactions, optimizing communication efficiency.