E-commerce Technical SEO

Recovering 1.8M Product Pages from JavaScript Rendering Failure

A framework migration left most of a Brazilian retailer's product catalogue invisible to Google. The fix required working around strict constraints.

Key Results

  • +70% organic traffic from post-migration low
  • 1.8M pages re-indexed
  • 6 weeks to full recovery

The symptom

Three weeks after launching a redesigned product catalogue on a new JavaScript framework, the client's organic traffic had dropped by 40%. The pattern was telling: product pages were losing visibility rapidly, while category pages remained stable.

The internal team had already ruled out the obvious causes. Redirects were correct. Canonicals pointed where they should. The pages rendered perfectly in a browser. But Search Console told a different story: indexed pages were declining week over week, and impressions for product queries had collapsed.

The business impact was significant. The client estimated £350–400K in lost organic revenue in the first month, with projections worsening if the trend continued.

Note: All values converted to GBP for convenience. Market: Brazil.

The diagnosis

The problem wasn't visible in Chrome DevTools. It only appeared during Googlebot's rendering process.

The new React-based product pages fetched core content (titles, descriptions, prices, availability) from API calls that completed after the initial page load. This works for users, but Googlebot operates with stricter timeout constraints than a regular browser.
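
The pattern, in simplified form, looked something like the sketch below. Component and endpoint names are illustrative rather than the client's actual code; the point is that the initial HTML contains only a skeleton, and the real product data arrives only after a client-side fetch resolves.

    // Illustrative only: a product page that renders a placeholder first,
    // then fills in content once a client-side API call resolves.
    import { useEffect, useState } from "react";

    type Product = { title: string; description: string; price: string };

    export function ProductPage({ sku }: { sku: string }) {
      const [product, setProduct] = useState<Product | null>(null);

      useEffect(() => {
        // Data arrives after the initial render, so the first snapshot a
        // crawler captures may contain only the skeleton below.
        fetch(`/api/products/${sku}`) // hypothetical endpoint
          .then((res) => res.json())
          .then(setProduct);
      }, [sku]);

      if (!product) return <div className="skeleton">Loading…</div>;

      return (
        <main>
          <h1>{product.title}</h1>
          <p>{product.description}</p>
          <span>{product.price}</span>
        </main>
      );
    }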

Using Google's URL Inspection tool in "live test" mode, I compared what Googlebot actually rendered against what the page should display. The difference was stark: Googlebot's version showed placeholder elements where product data should have been.
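
Part of that verification can be scripted. The URL Inspection API (listed in the tech stack below) does not return rendered HTML, so the live test in the Search Console UI remained the source of truth for rendering, but it does report index coverage per URL, which is useful for sampling a large catalogue. A minimal sketch, assuming an OAuth access token for an account with access to the verified property (the site URL, token variable, and sample URLs are placeholders):

    // Sketch: check Google's reported index coverage for a batch of URLs
    // via the Search Console URL Inspection API.
    const SITE_URL = "https://www.example.com/";   // placeholder property
    const ACCESS_TOKEN = process.env.GSC_TOKEN!;   // hypothetical env var

    async function inspect(url: string) {
      const res = await fetch(
        "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
        {
          method: "POST",
          headers: {
            Authorization: `Bearer ${ACCESS_TOKEN}`,
            "Content-Type": "application/json",
          },
          body: JSON.stringify({ inspectionUrl: url, siteUrl: SITE_URL }),
        },
      );
      const data = await res.json();
      const status = data.inspectionResult?.indexStatusResult;
      return {
        url,
        verdict: status?.verdict,        // e.g. PASS / NEUTRAL / FAIL
        coverage: status?.coverageState, // e.g. "Crawled - currently not indexed"
        lastCrawl: status?.lastCrawlTime,
      };
    }

    // Usage: sample a handful of product URLs and tabulate the results.
    const sampleUrls = ["https://www.example.com/produto/123"]; // placeholders
    Promise.all(sampleUrls.map(inspect)).then((rows) => console.table(rows));

The API is quota-limited per property, so it suits sampling representative sets of product URLs rather than sweeping all 1.8M pages.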

The root cause: a third-party analytics script was blocking the main thread during a critical window. On a normal connection, the delay was 2–3 seconds. But Googlebot's rendering infrastructure has tighter timeouts. That delay meant product content never populated before Googlebot captured its snapshot.

Why this matters: Googlebot's renderer doesn't wait indefinitely. A delay that feels instantaneous to users can determine whether content gets indexed or ignored entirely.
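
One generic way to surface this kind of blocking, which first-party-focused monitoring tends to miss, is a Long Tasks observer running in the page. The snippet below is a diagnostic sketch rather than the exact tooling used on this project:

    // Diagnostic sketch: log main-thread long tasks and, where the browser
    // provides attribution, the script container responsible.
    type LongTask = PerformanceEntry & {
      attribution?: { name: string; containerSrc?: string; containerName?: string }[];
    };

    const longTaskObserver = new PerformanceObserver((list) => {
      for (const entry of list.getEntries() as LongTask[]) {
        const sources = (entry.attribution ?? [])
          .map((a) => a.containerSrc || a.containerName || a.name)
          .join(", ");
        console.warn(
          `Long task: ${Math.round(entry.duration)}ms at ${Math.round(entry.startTime)}ms`,
          sources || "(no attribution)",
        );
      }
    });

    longTaskObserver.observe({ type: "longtask", buffered: true });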

The constraints

This wasn't a straightforward fix. Several factors limited the available options:

  • No SSR in the architecture: The site was entirely client-side rendered. Rebuilding for server-side rendering would have taken the engineering team 8–10 weeks.
  • Third-party script requirements: The analytics integration was contractually required for marketing attribution. Removing it wasn't an option.
  • Deployment freeze: A major sale event in 4 weeks meant any fix had to be small enough to ship quickly and stable enough not to risk revenue.

The approach

Given those constraints, I recommended a hybrid solution that avoided both full SSR and script removal. The engineering team implemented the following:

  1. Script deferral: The analytics integration was restructured to avoid blocking critical rendering. Initialisation moved to fire after DOMContentLoaded, allowing product content to hydrate before third-party JavaScript ran (a sketch of the idea follows this list).

  2. Build-time pre-rendering: A build step was added to bake core product data (title, description, main image, price) directly into the HTML. This content was immediately visible to crawlers, while React enhanced it with dynamic features after load (second sketch below).

  3. Dynamic rendering for edge cases: Approximately 15% of pages had genuinely dynamic content (real-time stock levels, personalised pricing). For these, the team implemented a lightweight dynamic rendering solution using Cloudflare Workers, serving pre-rendered content to bots while preserving the full SPA experience for users (see the Worker sketch after the tech stack note).
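
To make the deferral concrete: assuming the vendor tag can be injected dynamically (the tag URL below is a placeholder), the idea is to load it only after the document has parsed, preferably in an idle slot, so product content hydrates before any third-party code runs.

    // Sketch of the deferral: inject the analytics tag after DOMContentLoaded,
    // preferring an idle slot so it cannot block initial hydration.
    function loadAnalytics(): void {
      const s = document.createElement("script");
      s.src = "https://analytics.example.com/tag.js"; // placeholder vendor URL
      s.async = true;
      document.head.appendChild(s);
    }

    function scheduleAnalytics(): void {
      // Fall back to a short timeout where requestIdleCallback is unavailable.
      if ("requestIdleCallback" in window) {
        window.requestIdleCallback(loadAnalytics, { timeout: 5000 });
      } else {
        setTimeout(loadAnalytics, 2000);
      }
    }

    if (document.readyState === "loading") {
      document.addEventListener("DOMContentLoaded", scheduleAnalytics);
    } else {
      scheduleAnalytics();
    }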
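
The build-time pre-rendering amounts to a step along these lines. File paths, the API endpoint, and field names are illustrative; in practice the injected markup should match what the React component renders initially so hydration does not discard it.

    // Sketch of the build step: bake core product fields into the HTML shell
    // so crawlers see real content without executing any JavaScript.
    import { mkdir, readFile, writeFile } from "node:fs/promises";

    type Product = { sku: string; title: string; description: string; price: string; image: string };

    const esc = (s: string) =>
      s.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;").replace(/"/g, "&quot;");

    async function prerender(product: Product, template: string): Promise<void> {
      // Replace the empty mount point with static, crawlable markup.
      const html = template.replace(
        '<div id="root"></div>',
        `<div id="root">
          <h1>${esc(product.title)}</h1>
          <img src="${product.image}" alt="${esc(product.title)}">
          <p>${esc(product.description)}</p>
          <span data-price>${esc(product.price)}</span>
        </div>`,
      );
      await mkdir(`dist/produto/${product.sku}`, { recursive: true });
      await writeFile(`dist/produto/${product.sku}/index.html`, html, "utf8");
    }

    async function main(): Promise<void> {
      const template = await readFile("dist/product-shell.html", "utf8");
      const products: Product[] = await (await fetch("https://api.example.com/products?fields=core")).json();
      for (const p of products) await prerender(p, template);
    }

    main();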

Tech stack: React, Cloudflare Workers, Cloud Crawler, Google Search Console URL Inspection API
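
As an illustration of the edge routing in item 3, a Cloudflare Worker along the lines below is enough to split traffic. The bot list and the snapshot origin are placeholders rather than the production configuration, and the snapshot served to crawlers must stay equivalent to what users see to avoid cloaking concerns.

    // Sketch: serve a pre-rendered snapshot to known crawlers on product
    // routes and pass everyone else through to the normal SPA.
    const BOT_UA = /googlebot|bingbot|duckduckbot|baiduspider|yandex/i;

    export default {
      async fetch(request: Request): Promise<Response> {
        const ua = request.headers.get("user-agent") ?? "";
        const url = new URL(request.url);

        if (BOT_UA.test(ua) && url.pathname.startsWith("/produto/")) {
          // Hypothetical internal service holding rendered HTML snapshots;
          // the cf options are Cloudflare-specific edge-cache hints.
          const snapshot = await fetch(
            `https://prerender.internal.example${url.pathname}`,
            { cf: { cacheTtl: 300, cacheEverything: true } } as RequestInit,
          );
          if (snapshot.ok) return snapshot;
          // Fall through to the SPA if no snapshot exists for this URL.
        }

        return fetch(request); // origin SPA for users and unmatched routes
      },
    };

Google has described dynamic rendering as a workaround rather than a long-term recommendation, which matches its limited role here: it covered only the roughly 15% of pages that could not be pre-rendered at build time.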

The result

The staged rollout began in week 2, with full deployment by week 4. Within six weeks:

  • Indexing: 1.8M product pages returned to the index
  • Traffic: Recovered to 70% above the post-migration low, ultimately exceeding pre-migration levels by 10–15%
  • Revenue: The client estimated £500K in recovered organic revenue that quarter
  • Performance: LCP improved from 4+ seconds to approximately 2 seconds as a byproduct of the pre-rendering work

Key takeaways

  1. Test with Googlebot, not just your browser. The URL Inspection tool's "live test" feature shows what Googlebot actually sees after rendering. This should be standard pre-launch QA for JavaScript-heavy sites.

  2. Third-party scripts can cause rendering failures invisible to standard monitoring. Most performance monitoring focuses on first-party code. Third-party scripts can introduce blocking behaviour that only manifests under specific conditions, such as Googlebot's rendering environment.

  3. Hybrid rendering approaches often outperform "pure" solutions. The choice isn't always between full SSR and pure client-side rendering. Pre-rendering critical content combined with selective dynamic rendering provided the best balance of engineering effort, performance, and SEO outcomes.

"We'd assumed it was a penalty and were bracing for the worst. Having someone explain exactly why it was happening—and that we didn't need a full rebuild—made all the difference."

Head of Product, Brazil E-commerce Platform

Struggling with JavaScript rendering issues?

Let's diagnose whether your content is actually reaching search engines.

Get in Touch

Your Brand, VISIVELY!