Opening Problem Statement
Meet Marieke, a business development manager at a fast-growing SaaS company. Every week, she struggles to maintain an accurate, up-to-date list of key decision-makers for outreach campaigns. Manually searching for and verifying contacts eats at least 6 hours of her Monday, and the results are often outdated or incomplete. The result: missed sales opportunities and frustrated colleagues.
Marieke’s company stores profiles in a Postgres database, but many entries lack verified email addresses and phone numbers, and enriching them manually in bulk is slow and unreliable. Could there be a way to automate batch contact enrichment, updating thousands of profiles asynchronously with accurate, qualified data, without exhausting API rate limits or coding from scratch?
What This Automation Does
This n8n workflow is tailored to help Marieke and teams like hers automate bulk contact enrichment using the Dropcontact API and a Postgres database. When run, the workflow:
- Queries the Postgres database for profiles missing emails and filtered by specific domains and titles.
- Aggregates and transforms queried data into Dropcontact API request format.
- Splits the data into batches of 250 contacts to comply with Dropcontact’s rate limit of 1,500 contacts per hour (see the throughput sketch at the end of this section).
- Sends batch requests to Dropcontact’s batch enrichment API.
- Waits 10 minutes so Dropcontact can process each batch asynchronously.
- Fetches enriched data results for each batch and updates the Postgres profiles, marking them as enriched.
- Optionally sends Slack notifications for errors, such as API credit problems.
This process drastically reduces manual data entry time, improves data accuracy, and scales enrichment up to thousands of contacts per hour.
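As a quick sanity check on those numbers, here is a minimal back-of-the-envelope sketch (plain Python, not part of the workflow; the 1,500/hour cap comes from Dropcontact's documentation):

# Rough throughput check for the batching strategy described above.
HOURLY_CAP = 1500   # Dropcontact's documented contacts-per-hour limit
BATCH_SIZE = 250    # size used by the SplitInBatches node
QUERY_LIMIT = 1000  # the SQL query fetches at most 1,000 profiles per run

batches_per_run = -(-QUERY_LIMIT // BATCH_SIZE)  # ceiling division -> 4
contacts_per_run = batches_per_run * BATCH_SIZE  # 1,000 contacts
print(f"{batches_per_run} batches x {BATCH_SIZE} = {contacts_per_run} contacts,"
      f" cap is {HOURLY_CAP}/hour")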
Prerequisites ⚙️
- A Postgres database with accounts and profiles tables matching the query in Step 2.
- An active Dropcontact API key for batch contact enrichment.
- An n8n account configured with the necessary credentials for Postgres and Dropcontact.
- Optional: a Slack workspace with webhook or OAuth for error notifications.
- Basic knowledge of n8n workflow editor.
Step-by-Step Guide
Step 1: Schedule Trigger Setup to Run Automatically
Open your n8n editor, click + Add Node, then select Schedule Trigger. Configure the interval for how often you want this enrichment to run (e.g., daily at midnight). This node initializes the workflow.
Expected: The trigger will fire on schedule and start the workflow.
Common mistake: Forgetting to set interval or timezone, causing unintended runs.
Step 2: Query Profiles Missing Emails From Postgres
Add a Postgres node connected to the Schedule Trigger and enter a query along these lines (adjust table and column names to your schema):
select p.first_name, p.last_name, a.domain, p.full_name
from accounts a
left join profiles p on a.company_id = p.company_id
where p.title = 'Bestuurder' and p.email is null and a.domain != ''
and a.domain not in ('gmail.com', 'hotmail.com', 'hotmail.be', 'outlook.com', 'telenet.be', 'live.be', 'skynet.be', 'yahoo.com')
and a.domain not like all (array['hotmail%', 'skynet%', 'yahoo%', 'msn%', 'belgacom%'])
and p.dropcontact_found is null
limit 1000
This filters profiles by job title and excludes generic free-mail domains. Note that wildcard patterns such as hotmail% never match inside a NOT IN list (IN compares exact strings), so the patterns are applied with NOT LIKE instead.
Expected: The node returns up to 1000 profiles requiring enrichment.
Common mistake: Query syntax errors or incorrect table structure.
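Before wiring the query into n8n, it helps to sanity-check it from a throwaway script. A minimal sketch using psycopg2 (the connection string is a placeholder; adapt it to your database):

import psycopg2

# Placeholder connection string; replace with your own credentials.
conn = psycopg2.connect("postgresql://user:password@localhost:5432/crm")

with conn, conn.cursor() as cur:
    # Same filters as the PROFILES QUERY node, limited to a quick sample.
    cur.execute("""
        select p.first_name, p.last_name, a.domain, p.full_name
        from accounts a
        left join profiles p on a.company_id = p.company_id
        where p.title = 'Bestuurder' and p.email is null
          and a.domain != '' and p.dropcontact_found is null
        limit 10
    """)
    for row in cur.fetchall():
        print(row)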
Step 3: Split Results into Batches of 250
Add a SplitInBatches node and connect it to the Postgres query node. Set Batch Size to 250; this paginates the results so each run stays within Dropcontact's limit of 1,500 contacts per hour.
Expected: Profiles are chunked into smaller sets for batch processing.
Common mistake: Setting batch sizes too high may cause API rate errors.
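Conceptually, SplitInBatches just slices the result set into fixed-size chunks, along these lines (illustrative Python, not node configuration):

# What SplitInBatches does conceptually: fixed-size chunks of the result set.
def split_in_batches(items, batch_size=250):
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

profiles = [{"full_name": f"Contact {i}"} for i in range(1000)]
for batch in split_in_batches(profiles):
    print(len(batch))  # 250, 250, 250, 250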
Step 4: Aggregate Profile Fields for Transformation
Add the Aggregate node to gather fields like first_name, last_name, domain, and full_name into arrays for transformation.
Expected: Data grouped by field for next processing step.
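To picture what Aggregate does, here is the same transformation in plain Python (illustrative sample rows):

# Aggregate turns a list of row-items into one item with an array per field.
rows = [
    {"first_name": "An", "last_name": "Peeters", "domain": "acme.be", "full_name": "An Peeters"},
    {"first_name": "Jo", "last_name": "Claes", "domain": "example.be", "full_name": "Jo Claes"},
]

aggregated = {key: [row[key] for row in rows] for key in rows[0]}
print(aggregated["first_name"])  # ['An', 'Jo']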
Step 5: Transform Data for Dropcontact with Python Code Node
Add a Code node, set its language to Python, and paste this script:
# Build the Dropcontact batch payload from the aggregated field arrays.
output_data = {"data": [], "siren": True}

for item in _input.all():
    data = item.json
    first_names = data["first_name"]
    last_names = data["last_name"]
    domains = data["domain"]
    full_names = data["full_name"]

    # Zip the parallel arrays back into one record per contact.
    for first_name, last_name, domain_name, full_name_value in zip(
        first_names, last_names, domains, full_names
    ):
        output_data["data"].append({
            "first_name": first_name,
            "last_name": last_name,
            "website": domain_name,
            # full_name is carried through so results can be matched
            # back to the profiles table in Step 10.
            "custom_fields": {"full_name": full_name_value},
        })

return output_data
This formats data as required for Dropcontact batch API.
Expected: JSON payload ready for batch upload.
Common mistake: Incorrect field names or Python syntax errors.
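For reference, with the two sample contacts from the sketch in Step 4, the script returns a payload shaped like this (illustrative values):

output_data = {
    "siren": True,
    "data": [
        {"first_name": "An", "last_name": "Peeters", "website": "acme.be",
         "custom_fields": {"full_name": "An Peeters"}},
        {"first_name": "Jo", "last_name": "Claes", "website": "example.be",
         "custom_fields": {"full_name": "Jo Claes"}},
    ],
}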
Step 6: Send Batch Requests to Dropcontact API
Use the HTTP Request node to POST the transformed data to https://api.dropcontact.io/batch. Authenticate with your API key via credentials.
Expected: Dropcontact accepts batch job and returns a request_id.
Common mistake: Missing auth header or malformed JSON body.
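For debugging outside n8n, the same call can be reproduced with a short script using the requests library. The X-Access-Token header name is an assumption based on Dropcontact's public docs, so verify it against the current API reference:

import requests

API_KEY = "YOUR_DROPCONTACT_API_KEY"  # placeholder

payload = {
    "siren": True,
    "data": [{"first_name": "An", "last_name": "Peeters", "website": "acme.be",
              "custom_fields": {"full_name": "An Peeters"}}],
}

# Header name assumed from Dropcontact's documentation; verify before relying on it.
resp = requests.post(
    "https://api.dropcontact.io/batch",
    headers={"X-Access-Token": API_KEY},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print("Batch accepted, request_id:", resp.json()["request_id"])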
Step 7: Wait for 10 Minutes to Allow Asynchronous Processing
Add a Wait node set to 600 seconds (10 minutes) after the Dropcontact request node.
Expected: The workflow pauses to let Dropcontact process the batch.
Common mistake: Skipping the wait means the workflow tries to download results before Dropcontact has finished processing them.
Step 8: Download Processed Batch Results
Add another HTTP Request node to GET results from https://api.dropcontact.io/batch/{{ $json.request_id }}.
This fetches enriched contact details asynchronously.
Expected: Full batch data including emails and phone numbers in the response.
Common mistake: Incorrect request_id usage or missing authentication.
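The download step, sketched as a standalone script (same header assumption as in Step 6; the success field name should also be checked against Dropcontact's API reference):

import time
import requests

API_KEY = "YOUR_DROPCONTACT_API_KEY"   # placeholder
REQUEST_ID = "the-id-from-step-6"      # placeholder

# Poll until the batch is ready instead of relying on a single fixed wait.
while True:
    resp = requests.get(
        f"https://api.dropcontact.io/batch/{REQUEST_ID}",
        headers={"X-Access-Token": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    body = resp.json()
    if body.get("success"):  # field name assumed; check the API reference
        break
    time.sleep(60)  # not ready yet, retry in a minute

for contact in body.get("data", []):
    print(contact.get("email"), contact.get("phone"))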
Step 9: Split Batch Results for Database Update
Add a SplitOut node to separate the batch results into individual items.
Expected: Each enriched profile becomes a separate message for the next update step.
Step 10: Update Profiles in Postgres with New Contact Details
Use the Postgres node to update the profiles table:
- Match rows by full_name.
- Update email, phone, dropcontact_found (set to true), and email_qualification.
Use dynamic expressions to map incoming data fields to the table columns.
Expected: Profiles have updated and verified contacts.
Common mistake: Incorrect mapping or update conditions leading to no changes.
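The update the node performs is equivalent to a parameterized statement like this (a psycopg2 sketch; column names assume the schema from Step 2, and the sample values are illustrative):

import psycopg2

conn = psycopg2.connect("postgresql://user:password@localhost:5432/crm")  # placeholder

# One enriched contact as it might come back from Dropcontact (illustrative).
enriched = {"full_name": "An Peeters", "email": "an.peeters@acme.be",
            "phone": "+32 470 00 00 00", "email_qualification": "nominative@pro"}

with conn, conn.cursor() as cur:
    cur.execute(
        """
        update profiles
        set email = %(email)s,
            phone = %(phone)s,
            email_qualification = %(email_qualification)s,
            dropcontact_found = true
        where full_name = %(full_name)s
        """,
        enriched,
    )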
Step 11: Optional Slack Notification on Errors
Add a Slack node connected to the error output of the batch download node to notify the team if API credits run low or other errors occur.
Expected: Error messages posted to a Slack channel for monitoring.
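If you prefer a plain webhook over the Slack node, the alert boils down to one HTTP call. A sketch (the webhook URL is a placeholder you create in your Slack workspace):

import requests

# Placeholder incoming-webhook URL; create one under Slack's app settings.
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def notify_error(message: str) -> None:
    # Post a plain-text alert to the monitoring channel.
    requests.post(SLACK_WEBHOOK, json={"text": f"Dropcontact enrichment error: {message}"}, timeout=10)

notify_error("API credits exhausted while downloading batch results")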
Customizations ✏️
- Change Target Title in Postgres Query: Modify the SQL in the PROFILES QUERY node to enrich contacts with a different job title, e.g., change Bestuurder to Manager.
- Adjust Batch Size: In the SplitInBatches node, change batchSize from 250 to fit your API rate limits or volume.
- Slack Alerts on Different Events: Add or customize Slack messages for successful batches or other statuses by linking nodes differently.
- Add Email Domain Filters: Update the SQL WHERE clause to include or exclude more domains tailored to your data quality needs.
- Modify Data Transformation: Edit the Python code in the DATA TRANSFORMATION node to include additional fields or change the output structure for Dropcontact API compatibility.
Troubleshooting 🔧
- Problem: “Invalid API Key” or authentication errors when sending batch requests.
Cause: Wrong or expired API token in Dropcontact credentials.
Solution: Re-enter the Dropcontact API key in the n8n credentials used by the HTTP Request node.
- Problem: “Too Many Requests” or hitting API limits.
Cause: Batch request sizes or frequency too high.
Solution: Lower the batch size in the SplitInBatches node and increase the wait time in the Wait node.
- Problem: Query returns no profiles.
Cause: Incorrect SQL filters or database connection.
Solution: Verify SQL query syntax and database connectivity with test queries.
- Problem: Postgres updates not saving.
Cause: Wrong mapping of fields or a missing matching column.
Solution: Check the mapping in the Postgres update node, especially the matching column full_name.
Pre-Production Checklist ✅
- Verify n8n credentials for Postgres and Dropcontact are valid and tested.
- Run SQL queries manually in Postgres client to confirm results match expected profiles.
- Test batch requests in Dropcontact sandbox or with a small dataset before scaling.
- Check API rate limits and configure batch sizes and wait times accordingly.
- Perform trial runs and verify data updates in the database.
- Back up the database before running bulk updates.
Deployment Guide
Activate the workflow in n8n after final testing. Ensure the Schedule Trigger is enabled for routine runs.
Monitor workflow execution logs regularly for failed nodes or errors.
Adjust batch sizes or wait times when scaling up.
Consider integrating with Slack or email for operational alerts.
FAQs
- Q: Can I use a different database than Postgres?
A: Yes, but you’d need to adjust the query and update nodes accordingly.
- Q: Does this consume a lot of Dropcontact API credits?
A: Batch requests are efficient but do consume credits per contact processed. Monitor usage regularly.
- Q: Is my contact data safe?
A: Dropcontact uses secured communication over HTTPS and respects privacy regulations. Ensure your API keys remain secure.
- Q: Can the workflow handle thousands of contacts daily?
A: Yes, with appropriate batch sizes and wait times the workflow scales well.
Conclusion
By setting up this automated workflow, Marieke now enriches thousands of contact profiles weekly without manual effort. The automation saves her an estimated 6 hours per week, eliminates costly data errors, and ensures up-to-date contact info for the sales team. You’ve learned how to combine Postgres, n8n, and Dropcontact APIs to achieve asynchronous bulk enrichment with batching and error handling.
Next, consider building workflows to automate follow-up email campaigns, integrate enriched contacts with CRM systems, or add other data quality checks. With n8n’s flexibility and Dropcontact’s powerful API, you can keep your contact data fresh and actionable effortlessly. Keep experimenting—automation brings clarity and efficiency to your sales processes!