Scrape and Sync Trustpilot Reviews to Google Sheets with n8n

Discover how to automate scraping Trustpilot reviews directly into Google Sheets using n8n. This workflow saves hours by extracting, parsing, and syncing customer feedback efficiently without manual effort.
Learn how to build this workflow with AI:
Workflow Identifier: 1729
Nodes in use: Manual Trigger, Sticky Note, HTTP Request, Set, Code, Split Out, Google Sheets, Schedule Trigger

Opening Problem Statement

Meet Sarah, a digital marketing manager at a growing e-commerce company selling eco-friendly products online. Sarah relies heavily on customer reviews from Trustpilot to gauge product satisfaction and identify areas for improvement. However, checking Trustpilot manually to gather review data is a time-consuming, repetitive task that often leads to errors and lost insights.

Sarah was spending over 5 hours a week just copying and pasting new reviews into her team’s Google Sheets tracker, making it hard to keep up with the rapid influx of feedback and delaying critical product adjustments. The constant manual effort also increased the chance of missing important negative reviews or new trends in customer sentiment.

This is a common problem for many marketing teams that depend on accurate and up-to-date review data from Trustpilot but lack an automated way to collect and organize the information efficiently.

What This Automation Does

This n8n workflow transforms how you collect and manage Trustpilot reviews by automating the entire process seamlessly. Here’s what happens when the workflow runs:

  • Fetches reviews directly from Trustpilot pages by scraping new review content for any specified company.
  • Parses the review data using custom JavaScript to extract detailed fields like author, date, rating, and review text.
  • Splits out individual reviews to prepare for insertion into Google Sheets.
  • Updates or appends review entries to two Google Sheets documents: one for general tracking and one for integration with the HelpfulCrowd platform.
  • Limits scraping to a specified number of pages with pagination control to avoid unnecessary requests.
  • Accepts manual or scheduled triggers so you control exactly when it runs.

Overall, Sarah went from spending 5 hours weekly to just a couple of minutes setting up this workflow and running it periodically, freeing up valuable time for strategic marketing initiatives.

Prerequisites ⚙️

  • n8n Account to build and execute this automated workflow.
  • Google Sheets account with OAuth2 Credentials 📊🔑 connected to n8n for data insertion and updates.
  • Trustpilot company ID/name to specify the target of review scraping.
  • Basic JavaScript knowledge is helpful but not mandatory, as the code is included.
  • Optional: Self-hosting n8n for enhanced control (e.g., Hostinger with Buldrr).

Step-by-Step Guide to Build This Workflow

1. Start with a Trigger (Manual/Schedule)

Open the n8n editor and add either a Manual Trigger or a Schedule Trigger. For manual testing, use the When clicking ‘Test workflow’ (manual trigger) node and run the workflow with the Test workflow button. For automated runs, configure the Schedule Trigger with the desired interval, e.g., daily or hourly.

You should see the nodes ready to execute the workflow when triggered.

Tip: Use the manual trigger to confirm your workflow works step-by-step before scheduling.

2. Set Global Workflow Parameters

Add a Set Node named Global. Configure it to set these variables:

  • company_id: The Trustpilot company ID or slug, e.g., "n8n.io"
  • max_page: Maximum number of pages to scrape, e.g., 100

This will inform the scraping node what to target and how many pages at most to fetch.
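As a rough sketch, the item the Global node emits might look like this (values are examples; downstream nodes read them via n8n expressions such as {{ $json.company_id }}):

```javascript
// Illustrative shape of the item the Global Set node emits.
// Values are examples; adjust them to your target company.
const globalParams = {
  company_id: "n8n.io", // Trustpilot company slug from the review page URL
  max_page: 100         // upper bound on pages the HTTP Request node may fetch
};

// Downstream nodes reference these with n8n expressions,
// e.g. {{ $json.company_id }} in the HTTP Request URL.
console.log(globalParams);
```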

3. Fetch Trustpilot Reviews with HTTP Request

Add an HTTP Request Node named Get reviews.

Set the URL as: https://trustpilot.com/review/{{ $json.company_id }}.

Configure pagination:

  • Enable pagination with a page query parameter that increments on each request.
  • Set the maximum number of requests equal to max_page from the Global node.
  • Wait 5 seconds between requests to be respectful to the server.
  • Stop paginating when a request returns HTTP status 404 (no more pages).

Set the sorting query parameter sort=recency to fetch the newest reviews first.

After setup, test the HTTP request and verify it retrieves HTML page content.
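To make the pagination concrete, here is a hypothetical helper showing how each request URL is assembled from the settings above (the HTTP Request node does this internally; the function exists only for illustration):

```javascript
// Hypothetical helper mirroring the HTTP Request node's URL construction.
// The node itself increments `page` until max_page is reached or a 404
// response signals the last page.
function buildReviewUrl(companyId, page) {
  return `https://trustpilot.com/review/${companyId}?page=${page}&sort=recency`;
}

console.log(buildReviewUrl("n8n.io", 2));
// → https://trustpilot.com/review/n8n.io?page=2&sort=recency
```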

4. Parse Reviews Using Code Node and Cheerio

Add a Code Node named Parse reviews.

Use the following JavaScript code to extract review data embedded in the page:

const cheerio = require('cheerio');

async function getReviewsFromPage(content) {
    try {
        const $ = cheerio.load(content);
        const scriptTag = $('#__NEXT_DATA__');
        
        if (!scriptTag.length) {
            console.warn("Warning: Could not find review data in page");
            return [];
        }

        const reviewsRaw = JSON.parse(scriptTag.html());
        return reviewsRaw.props.pageProps.reviews || [];
    } catch (error) {
        console.error(`Error fetching reviews: ${error.message}`);
        return [];
    }
}

async function scrapeTrustpilotReviews() {
    let reviewsData = [];
    
    for (let page = 0; page < $input.all().length; page++) {
        console.log(`\nScraping page ${page}...`);
        const content = $input.all()[page].json.data;
        const reviews = await getReviewsFromPage(content);
        
        if (!reviews.length) {
            console.log("No more reviews found.");
            break;
        }

        console.log(`Found ${reviews.length} reviews on page ${page}`);
        reviews.forEach(review => {
            const data = {
                Date: new Date(review.dates.publishedDate).toISOString().split('T')[0],
                Author: review.consumer.displayName,
                Body: review.text,
                Heading: review.title,
                Rating: review.rating,
                Location: review.consumer.countryCode
            };
            reviewsData.push(data); // push the cleaned object, not the raw review
        });
    }
    
    return reviewsData;
}

const reviews = await scrapeTrustpilotReviews();

return {reviews:reviews};

This script uses Cheerio to parse HTML, find the Next.js embedded JSON data, extract reviews, and return them for subsequent nodes.

5. Split Reviews into Individual Items

Add a Split Out Node named Split Out configured to split on the reviews field returned from the Parse reviews node.

This prepares each review as a separate item for further manipulation.
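Conceptually, the Split Out node does the equivalent of the following sketch (not n8n internals): one item carrying a reviews array becomes one item per review.

```javascript
// Sketch of Split Out behavior on the parser's output (sample data).
const parsed = {
  reviews: [
    { Author: "Alice", Rating: 5 },
    { Author: "Bob", Rating: 3 }
  ]
};

// One n8n item per review, each with its own json payload.
const items = parsed.reviews.map(review => ({ json: review }));
console.log(items.length); // 2
```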

6. Format Review Data for Different Sheets

Add two Set Nodes called General edits and HelpfulCrowd edits to tailor the review fields for insertion into two Google Sheets.

General edits sets fields like Date, Author, Body, Heading, Rating, Location, review_id.

HelpfulCrowd edits maps fields required by HelpfulCrowd, including status, verification, and placeholders for product_id, comments, and media.

7. Append or Update Sheets with Review Data

Add two Google Sheets Nodes: General sheet and HelpfulCrowd Sheets.

Configure each node to connect your Google Sheets account.

Set the document ID and sheet name for each node, and map the columns to the fields prepared in the previous Set nodes.

Use appendOrUpdate operation based on unique review_id to avoid duplicates.
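The appendOrUpdate semantics can be sketched as follows (illustrative only; the Google Sheets node performs this matching against the live sheet):

```javascript
// Sketch of appendOrUpdate keyed on review_id.
function appendOrUpdate(rows, incoming) {
  const i = rows.findIndex(r => r.review_id === incoming.review_id);
  if (i === -1) {
    rows.push(incoming);  // unknown review_id: append a new row
  } else {
    rows[i] = incoming;   // known review_id: update the existing row
  }
  return rows;
}

let rows = [{ review_id: "a1", Rating: 4 }];
rows = appendOrUpdate(rows, { review_id: "a1", Rating: 5 }); // updated in place
rows = appendOrUpdate(rows, { review_id: "b2", Rating: 3 }); // appended
console.log(rows.length); // 2
```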

8. Test and Adjust Workflow Parameters

Run the workflow using manual trigger to verify successful scraping, parsing, and spreadsheet updates.

Check Google Sheets to confirm reviews appear correctly without duplicates.

Adjust company_id or max_page values in the Global set node as needed for different companies or scraping scope.

Customizations ✏️

  1. Change Target Company
    • Edit the Global set node’s company_id field to your desired Trustpilot company slug, e.g., “amazon”.
  2. Limit Pages to Scrape
    • Modify the max_page value in Global set node to control how many pages of reviews to fetch.
  3. Extend Review Details
    • Add fields in the Parse reviews code node to extract more info, like review ID, reviewer image, or product references (if available).
  4. Use Schedule Instead of Manual Trigger
    • Replace the manual trigger with the Schedule Trigger node to automate scraping daily or at intervals.
  5. Customize Google Sheets Layout
    • Adjust the mapping schema in the Google Sheets nodes to fit your sheet’s columns or add new columns.
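For customization 3, extra fields can be pulled from each raw review object inside the Parse reviews code node. The field names below are assumptions about Trustpilot's embedded JSON; verify them against a real __NEXT_DATA__ payload before relying on them:

```javascript
// Sample raw review shape (assumed; check against the actual payload).
const review = {
  id: "r-123",
  rating: 5,
  consumer: { displayName: "Alice", countryCode: "GB", imageUrl: null }
};

// Additional fields to merge into the data object built in Parse reviews.
const extra = {
  review_id: review.id,                    // useful as the dedup key in Sheets
  ReviewerImage: review.consumer.imageUrl  // may be null for many reviewers
};
console.log(extra);
```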

Troubleshooting 🔧

Problem:

“No more reviews found” message appears early, and fewer reviews are scraped than expected.

Cause:

Trustpilot pagination limits reached or the company ID is incorrect.

Solution:

  • Double-check the company_id value in the Global node.
  • Increase max_page cautiously but be mindful of request limits.
  • Check the HTTP request URL and response status for any access issues.

Problem:

Google Sheets nodes fail with authentication errors.

Cause:

Google Sheets OAuth credentials are invalid or expired.

Solution:

  • Re-authenticate Google Sheets in n8n Credential settings.
  • Ensure the connected Google account has edit permissions on the target sheets.

Pre-Production Checklist ✅

  • Verify that your Trustpilot company ID is correct and public reviews exist for scraping.
  • Ensure your Google Sheets documents are accessible and shared with your Google credentials linked in n8n.
  • Test the HTTP Request node separately to confirm pages are fetched without errors.
  • Run the code node manually with sample data to validate review parsing logic.
  • Backup your sheets before running any updates to prevent accidental data loss.

Deployment Guide

Activate your workflow after thorough testing.

For automated operations, enable the Schedule Trigger with your desired frequency.

Monitor execution via n8n’s execution list for failures or errors.

Periodically verify Google Sheets data integrity and update the company_id or other parameters as business needs evolve.

FAQs

Can I scrape reviews for multiple companies?

Yes, by cloning the workflow and changing the company_id in the Global node or by expanding it to accept multiple IDs with loop control.
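One way to sketch the multi-company variant: have the Global node (or a preceding Code node) emit one item per slug, so every downstream node runs once per company. The slugs below are examples:

```javascript
// Emitting one n8n item per company slug (example slugs).
const companyIds = ["n8n.io", "amazon"];

const items = companyIds.map(company_id => ({
  json: { company_id, max_page: 100 }
}));
console.log(items.length); // one scrape pass per company
```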

Does this workflow consume Google Sheets API quota?

Yes, every appendOrUpdate operation uses API quota, so monitor your usage in the Google Cloud Console to avoid hitting limits.

Is my data secure?

Data passes through n8n and Google Sheets securely with OAuth. Avoid sharing credentials or exposing sensitive info in public workflows.

Can I run this workflow on schedule?

Absolutely. Use the Schedule Trigger node to automate running at specified times.

Conclusion

By setting up this customized n8n workflow, you have automated the tedious task of gathering customer reviews from Trustpilot into Google Sheets. Sarah now saves more than 5 hours weekly — which she re-invests in crafting better marketing strategies and improving customer experience.

This workflow demonstrates how focused scraping, parsing, and smart data syncing can transform raw online feedback into actionable insights without manual drudgery.

Next steps could include automating sentiment analysis on reviews, notifying your team via Slack about negative feedback, or integrating further review sources like Amazon or Google Reviews for a broader view.

Happy automating!

Related Workflows

Automate Viral UGC Video Creation Using n8n + Degaus (Beginner-Friendly Guide)

Learn how to automate viral UGC video creation using n8n, AI prompts, and Degaus. This beginner-friendly guide shows how to import, configure, and run the workflow without technical complexity.

AI SEO Blog Writer Automation in n8n (Beginner Guide)

A complete beginner guide to building an AI-powered SEO blog writer automation using n8n.

Automate CrowdStrike Alerts with VirusTotal, Jira & Slack

This workflow automates processing of CrowdStrike detections by enriching threat data via VirusTotal, creating Jira tickets for incident tracking, and notifying teams on Slack for quick response. Save hours daily by transforming complex threat data into actionable alerts effortlessly.

Automate Telegram Invoices to Notion with AI Summaries & Reports

Save hours on financial tracking by automating invoice extraction from Telegram photos to Notion using Google Gemini AI. This workflow extracts data, records transactions, and generates detailed spending reports with charts sent on schedule via Telegram.

Automate Email Replies with n8n and AI-Powered Summarization

Save hours managing your inbox with this n8n workflow that uses IMAP email triggers, AI summarization, and vector search to draft concise replies requiring minimal review. Automate business email processing efficiently with AI guidance and Gmail integration.

Automate Email Campaigns Using n8n with Gmail & Google Sheets

This n8n workflow automates personalized email outreach campaigns by integrating Gmail and Google Sheets, saving hours of manual follow-up work and reducing errors in email sequences. It ensures timely follow-ups based on previous email interactions, optimizing communication efficiency.