The 2024 Google Reviews bot purge was a blunt reminder of why offline copies of your reviews matter. Local businesses watched their counts drop overnight — sometimes by hundreds of reviews — and the only operators who could explain what happened were the ones who'd been exporting their reviews monthly. Reputation lives on Google's servers right up until it doesn't. This guide walks you through using ExportComments' Google Reviews exporter to pull every public review for any business location into Excel, CSV, or JSON, with owner responses, photos, and permalinks intact.

Why export Google reviews

Google Reviews are the highest-stakes review surface most local businesses have. They feed the local pack, the Maps card, and the voice-search results — and they're the first thing a prospect reads about you. Two things make them harder to manage than they should be. First, Google's UI shows you a paginated, "most relevant" sort by default, with no native bulk export. Second, the multi-location operators who need this data most (franchises, restaurant chains, dental groups, dealerships) get no roll-up across locations. Once a location's reviews are in a spreadsheet:

  • Reputation reporting — chart the percentage of 4–5 star reviews over time and you have the single chart every executive review meeting actually wants to see.
  • Competitor benchmarking — pull the top-3 competitors in the trading area, diff the rating distribution and review velocity, and you've got an honest competitive position rather than a self-flattering one.
  • Local-SEO audits — for multi-location brands, pivot by location and surface the outliers (the one store dragging the average down, the one that quietly built a moat).
  • Owner-response gap — filter where owner_response_text is null and you've got the precise list of reviews the local manager has ignored. Closing that gap is the lowest-effort reputation lever there is.
  • Language-bucket sentiment — for international brands, pivot by the language column and you'll often find that the German reviews skew differently from the Spanish ones, which differ again from the English.
  • Negative-review triage — when a supply-chain issue or a viral incident hits, exporting the last 30 days of negatives gives the comms team a real corpus to triage instead of guessing.
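As a concrete example of the owner-response gap above: once the export is loaded into pandas, the unanswered-review list is a single filter. The column names follow the field list later in this guide; the sample rows below are stand-ins for a real file.

```python
import pandas as pd

# Stand-in rows shaped like the export's columns;
# in practice: df = pd.read_csv("location_reviews.csv")
df = pd.DataFrame({
    "reviewer_name": ["Ana", "Ben", "Cara", "Dev"],
    "rating": [1, 5, 2, 4],
    "review_text": ["Cold food", "Great!", "Slow service", "Nice staff"],
    "owner_response_text": [None, "Thanks!", None, "Appreciated!"],
})

# Owner-response gap: reviews with no reply, worst ratings first
gap = df[df["owner_response_text"].isna()].sort_values("rating")
print(gap[["reviewer_name", "rating", "review_text"]])
```

Sorting ascending by rating puts the angriest unanswered reviews at the top, which is the order the local manager should work through them.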

How to export — step by step

Step 1: Grab the Google business URL

Open the location's listing on Google Maps or in Google Search and copy the URL. Both work — the Maps URL (https://www.google.com/maps/place/...) and the search-result Knowledge Panel URL resolve to the same business. The exporter uses the Place ID under the hood, so you don't need to find it manually.

Step 2: Paste the URL into the exporter

Head to the Google Reviews exporter and drop the URL into the input field. If you operate a multi-location brand and want every location in one batch, switch to bulk mode and paste one URL per line. Bulk runs return one file per location, packaged together in a single ZIP at the end of the job, so each location's reviews stay cleanly separated for downstream pivots and reporting.

Step 3: Pick a format

Choose Excel (.xlsx), CSV, or JSON. Excel suits marketing and ops teams that want to pivot, filter, and chart immediately. CSV is the safest choice for BI imports into Looker, Tableau, or Power BI. JSON fits best if you're piping into a notebook or feeding the reviews into a sentiment model.
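If you take the JSON route, loading the file into a notebook is a few lines. The exact top-level shape is an assumption here (a flat array of review objects), so adjust to what your export actually contains.

```python
import pandas as pd

# Assumed shape: a flat JSON array of review objects.
# In practice: reviews = json.load(open("location_reviews.json"))
reviews = [
    {"reviewer_name": "Ana", "rating": 5, "language": "en"},
    {"reviewer_name": "Luis", "rating": 3, "language": "es"},
]
df = pd.DataFrame(reviews)
print(df["rating"].mean())  # quick sanity check on the load
```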

Step 4: Start the export

Click Export. The job runs server-side and paginates through every public review for the location, including owner responses and any photos attached. Locations with thousands of reviews take a few minutes; you can close the tab, and the file lands in your dashboard and your inbox when it's ready.

Step 5: Open the file

Open the .xlsx in Excel, Numbers, or Google Sheets and you're ready to filter, pivot, and chart. Each row is one review, with the columns described in the next section.

Inside the export — what fields you get

Each row is one Google review. You'll find columns for:

  • reviewer_name — the display name shown on the review.
  • reviewer_profile_url — direct link back to the reviewer's Google profile.
  • reviewer_total_reviews — how many reviews the reviewer has left across all of Google. Useful for filtering out single-review accounts.
  • rating — the 1–5 star score.
  • review_text — the full review body.
  • language — the language Google auto-detected for the review.
  • relative_time — the "2 weeks ago"-style string Google shows on the page.
  • created_at_estimated — UTC timestamp derived from the relative-time string, so you can sort and pivot chronologically.
  • photos — direct URLs to any photos the reviewer attached.
  • owner_response_text — the business owner's response, blank if no response was posted.
  • owner_response_at_estimated — UTC timestamp of the owner response, also derived from a relative-time string.
  • likes — the "Helpful" vote count.
  • permalink — direct link back to the review on Google.
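To give a sense of how a field like created_at_estimated can be derived, here is an illustrative parser for Google's relative-time strings. This is a sketch, not the exporter's actual implementation, and the result is inherently approximate: a "2 weeks ago" string only pins the date to within days.

```python
from datetime import datetime, timedelta, timezone
import re

UNITS = {
    "minute": timedelta(minutes=1),
    "hour": timedelta(hours=1),
    "day": timedelta(days=1),
    "week": timedelta(weeks=1),
    "month": timedelta(days=30),   # approximation
    "year": timedelta(days=365),   # approximation
}

def estimate_utc(relative: str, now: datetime) -> datetime:
    """Turn '2 weeks ago' / 'an hour ago' into a rough UTC timestamp."""
    m = re.match(r"(a|an|\d+)\s+(minute|hour|day|week|month|year)s?\s+ago", relative)
    if not m:
        raise ValueError(f"unrecognized relative time: {relative!r}")
    count = 1 if m.group(1) in ("a", "an") else int(m.group(1))
    return now - count * UNITS[m.group(2)]

now = datetime(2024, 6, 15, tzinfo=timezone.utc)
print(estimate_utc("2 weeks ago", now))  # 2024-06-01 00:00:00+00:00
```

This is why the original string is preserved in relative_time alongside the estimate: the estimate is good enough for monthly bucketing, not for day-precise audits.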

Common workflows

  • Reputation reporting — bucket created_at_estimated by month, compute the percentage of 4–5 star reviews per bucket, and chart it. That's the single time-series most leadership reviews actually need; everything else is decoration.
  • Competitor benchmarking — export your location plus the top-3 competitors in the trading area, stack the rating distributions side by side, and overlay review velocity. The picture is usually less flattering than the brand deck claims.
  • Local-SEO audits for multi-location brands — pivot a bulk export by location, sort by average rating, and the bottom-10 list is your operational priority list. Cross-reference with reviewer_total_reviews to filter out drive-by single-review accounts.
  • Owner-response gap closure — filter where owner_response_text is null, sort by rating ascending, and you've got the prioritized list of reviews the local manager hasn't responded to. Replying to negatives — even with a templated apology — measurably moves perceived sentiment.
  • Language-bucket sentiment — international brands almost always discover that one language cohort is materially less satisfied than another. Pivot by language, run sentiment per bucket, and the gap is usually a localization or staffing issue you can fix.
  • Negative-review triage — when a viral incident or supply-chain issue hits, export the last 30 days, filter to 1–2 stars, group by emerging keywords (delivery, refund, broken), and the comms team has a triage queue instead of an inbox panic.
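The reputation-reporting bullet above reduces to a short pandas groupby. The column names follow the export's field list; the rows here are stand-ins for a real file.

```python
import pandas as pd

# Stand-in rows; in practice: df = pd.read_excel("location_reviews.xlsx")
df = pd.DataFrame({
    "created_at_estimated": pd.to_datetime(
        ["2024-01-05", "2024-01-20", "2024-02-03", "2024-02-18", "2024-02-25"]
    ),
    "rating": [5, 2, 4, 5, 1],
})

# Monthly share of 4-5 star reviews, as a percentage
monthly = (
    df.assign(positive=df["rating"] >= 4)
      .groupby(df["created_at_estimated"].dt.to_period("M"))["positive"]
      .mean()
      .mul(100)
)
print(monthly)
```

Chart that series and you have the executive time-series; swap the groupby key for a location column and the same three lines produce the multi-location roll-up.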

Plan limits and API access

The Free tier returns up to 100 reviews per export, enough to evaluate the format on a smaller location. Personal scales to 5,000 results per export, Premium to 50,000, and Business to 250,000 — enough to capture every review for the largest multi-location chains in a single bulk run. If you'd rather pull on a schedule (monthly is the standard cadence for reputation reporting) or trigger exports from your own pipeline, the REST API and webhooks handle it. See pricing for the full breakdown and docs.exportcomments.com for endpoints and authentication.
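A scheduled pull might look like the sketch below. The base URL, endpoint path, field names, and response shape are all assumptions for illustration, not the real contract; check docs.exportcomments.com for the actual endpoints and authentication.

```python
import os

API_BASE = "https://api.exportcomments.com"  # assumption: verify against the docs

def build_export_job(business_url: str, fmt: str = "csv") -> dict:
    """Assemble a hypothetical export-job request; field names are illustrative."""
    token = os.environ.get("EXPORTCOMMENTS_TOKEN", "dummy-token")
    return {
        "url": f"{API_BASE}/v1/exports",  # hypothetical endpoint
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {"source_url": business_url, "format": fmt},
    }

job = build_export_job("https://www.google.com/maps/place/Example+Cafe")
# To submit for real (requires the `requests` package and a valid token):
#   import requests
#   resp = requests.post(job["url"], headers=job["headers"], json=job["json"])
print(job["json"])
```

Wire a call like this into a monthly cron job or your pipeline scheduler and the offline archive from the intro maintains itself.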

FAQ

  • How does the exporter handle the relative-time strings ("2 weeks ago")?
    Google doesn't expose absolute timestamps on reviews, so the exporter parses the relative-time string into a UTC estimate and writes it to created_at_estimated. The original string is also preserved in relative_time for transparency.
  • Will it pull owner responses too?
    Yes. The owner_response_text and owner_response_at_estimated columns capture the business response and when it was posted. To find unanswered reviews, filter where owner_response_text is null.
  • Can I export every location for a multi-location brand in one job?
    Yes. Use bulk mode and paste one Google business URL per line. The run returns one file per location, packaged in a single ZIP, so each location's data stays cleanly separated for per-location pivots and the executive roll-up.
  • What about reviews removed in the 2024 bot purge?
    Once Google removes a review, it's gone from the public listing and the exporter can only pull what's currently visible. The case for monthly scheduled exports is exactly this: an offline archive lets you reconcile against the live count after any future purge or policy change.
  • Does it pull photos attached to reviews?
    Yes. The photos column contains the direct URLs to each attached photo, so you can download or display them outside Google.
  • Will the language column let me run separate sentiment per market?
    Yes. Google auto-detects the review language and the exporter writes the detected code into the language column, so you can pivot by market and run a separate sentiment pass per language without false-positive cross-contamination.
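The per-market pivot from that last answer is a one-liner once the file is loaded. The sentiment scores below are stand-ins for whatever model you run over review_text; only the language column comes from the export itself.

```python
import pandas as pd

# Stand-in rows; sentiment would come from your own model pass over review_text
df = pd.DataFrame({
    "language": ["en", "en", "de", "de", "es"],
    "sentiment": [0.8, 0.6, -0.2, 0.1, 0.5],  # hypothetical scores
})

# Mean sentiment per detected language, lowest-scoring market first
by_market = df.groupby("language")["sentiment"].mean().sort_values()
print(by_market)
```

The bottom of that sorted series is the cohort to investigate, and because the pivot runs on the detected language code, each sentiment pass stays within a single language.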