How to Web Scrape Reviews from Google Maps

Andrei Ogiolan on Apr 21 2023


Introduction

Google Maps is one of the most widely used mapping and navigation services in the world, providing users with an easy way to find and explore places, businesses, and points of interest. One of the key features of Google Maps is the ability to search for places and view detailed information about them, such as their location, reviews, photos, and more.

Scraping this data from Google Maps can be useful for businesses to track and analyze the performance of their locations, for researchers to study patterns in consumer behavior and for individuals to find and explore new places.

The purpose of this article is to provide a step-by-step guide on how to scrape Google Maps reviews with our API using Node.js. We will cover everything from setting up the development environment to extracting relevant data and discussing potential issues. By the end of this article, you will have the knowledge and tools you need to scrape Google Maps reviews on your own.

Why should you use a professional scraper instead of building your own?

Using a professional scraper can be a better option than building your own for several reasons. Firstly, professional scrapers are built to handle a wide variety of scraping tasks and are optimized for performance, reliability, and scalability. They are designed to process large amounts of data and to work with many different types of websites and web technologies, which means they can often extract data faster and more accurately than a custom-built scraper.

Additionally, professional scrapers often come with built-in features such as CAPTCHA solving, IP rotation, and error handling, which can make the scraping process more efficient and less prone to errors. They also offer support and documentation which can be helpful when you face any issues.

Another important aspect is that professional scraper providers comply with the scraping policies of the websites they target and can help you use the data legally, which is important to keep in mind when scraping.

Finally, in our particular case, to scrape Google Maps reviews with the best results, you need to pass a data_id parameter in your request. This parameter usually looks something like this: 0x87c0ef253b04093f:0xafdfd6dc1d3a2b4e. That may sound intimidating at first, since you probably have no idea how to get the data_id for a specific place, and rightly so: Google hides this information, and it is not visible on the page when you search for a place on Google Maps. Fortunately, using a professional scraper like ours takes care of that by finding this data for you. In the sections below, we will cover how to get the data_id and how to scrape Google Maps reviews using our API.

Defining our target

What are Google Maps reviews?

Google Maps reviews are the ratings and comments left by users on Google Maps about a specific place. These reviews include information such as the user's name, the date the review was left, the rating given, and the review text.

Scraping Google Maps reviews can be useful for businesses who want to track and analyze the performance of their locations, researchers who want to study patterns in consumer behavior, and individuals who want to find and explore new places. By extracting the reviews data, businesses can identify the strengths and weaknesses of their locations, and make improvements accordingly. Researchers can study the sentiment of the reviews and find patterns in consumer behavior. Individuals can also use this information to make decisions about where to go and what to do.

What does our target look like?

[Screenshot: the reviews panel for a place on Google Maps]

Setting up

Before beginning to scrape Google Maps reviews, it's important to have the necessary tools in place. The primary requirement is Node.js, a JavaScript runtime that enables the execution of JavaScript on the server-side, which can be downloaded from their official website. Additionally, an API KEY is required, which can be obtained by creating an account here and activating the SERP service.
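If you are not sure whether Node.js is already installed on your machine, you can check by printing its version in a terminal; any reasonably recent version should work for this guide:

$ node --version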

After setting up Node.js and obtaining an API KEY, the next step is to create a Node.js script file. This can be done by running the following command:

$ touch scraper.js 

And now paste the following line in your file:

console.log("Hello World!")

And then run the following command:

$ node scraper.js

If you see the message "Hello World!" displayed in the terminal, it means Node.js has been installed successfully and you are ready to proceed to the final step: obtaining the data_id of the place whose reviews you want to scrape. This is where our API comes in handy; it's easy to use and doesn't require any additional libraries to be installed.

Firstly, in a js file you need to import the Node.js `https` built-in module in order to be able to send requests to our API. This can be done as follows:

const https = require("https");

Secondly, you need to specify your API key, a search term, and the coordinates of the place you are interested in:

const API_KEY = "<YOUR-API-KEY-HERE>" // You can get one by creating an account - https://app.webscrapingapi.com/register
const query = "Waldo%20Pizza"
const coords = "@38.99313451901278,-94.59368586441806"
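Note that the search term must be URL-encoded, which is why the space in "Waldo Pizza" appears as %20. If you would rather not encode it by hand, a small sketch using JavaScript's built-in encodeURIComponent produces the same value:

const query = encodeURIComponent("Waldo Pizza"); // evaluates to "Waldo%20Pizza"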

Tip: you can grab the coordinates for a place directly from the Google Maps URL. Search for the place on Google Maps and copy the part of the address bar that starts with "@", for example @38.99313451901278,-94.59368586441806.

The next step is to include the search term and coordinates in an options object, so that our API knows which place to look up and can return its data_id:

const options = {
    "method": "GET",
    "hostname": "serpapi.webscrapingapi.com",
    "port": null,
    "path": `/v1?engine=google_maps&api_key=${API_KEY}&type=search&q=${query}&ll=${coords}`,
    "headers": {}
};

Next, you need to set up a call to our API with all this information:

const req = https.request(options, function (res) {
    const chunks = [];

    res.on("data", function (chunk) {
        chunks.push(chunk);
    });

    res.on("end", function () {
        const body = Buffer.concat(chunks);
        const response = JSON.parse(body.toString());

        // place_results may be missing if the query did not match a place
        const data_id = response.place_results && response.place_results.data_id;

        if (data_id) {
            console.log(data_id);
        } else {
            console.log('We could not find a data_id property for your query. Please try using another query');
        }
    });
});

req.end();
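Optionally, you can also listen for the request's 'error' event so that network failures are reported instead of crashing the script; this is a minimal addition using the standard Node.js request API:

req.on("error", function (err) {
    // network-level failures (DNS lookup, connection refused, timeouts) end up here
    console.error("Request failed:", err.message);
});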

Lastly, you can execute the script you have just created and wait for the results to be returned:

$ node scraper.js

And you should get back the data_id property printed on the screen:

0x87c0ef253b04093f:0xafdfd6dc1d3a2b4e

That concludes the setup process. With the data_id property in hand, you have all the information necessary to build a Google Maps reviews scraper with our API using Node.js.

Let's start scraping Google Maps reviews

With the environment set up, you are ready to begin scraping Google Maps reviews with our API. To proceed, you need to set the data_id parameter mentioned earlier. With all the necessary information available, you can define it as follows:

const data_id = "0x87c0ef253b04093f:0xafdfd6dc1d3a2b4e" // the data_id we retrieved earlier

Now, the only thing left to do is to modify the options object, thus telling our API that you would like to scrape reviews from Google Maps:

const options = {
    "method": "GET",
    "hostname": "serpapi.webscrapingapi.com",
    "port": null,
    "path": `/v1?engine=google_maps_reviews&api_key=${API_KEY}&data_id=${data_id}`, // no need for a query anymore, data_id is enough to identify a place
    "headers": {}
};
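If you want to sanity-check the endpoint before wiring it into the script, you can send the same request from the command line (assuming curl is available), substituting your own API key:

$ curl "https://serpapi.webscrapingapi.com/v1?engine=google_maps_reviews&api_key=<YOUR-API-KEY-HERE>&data_id=0x87c0ef253b04093f:0xafdfd6dc1d3a2b4e"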

And this is everything you need to do. Your script should now look like this:

const https = require("https");

const API_KEY = "<YOUR-API-KEY-HERE>"
const data_id = "0x87c0ef253b04093f:0xafdfd6dc1d3a2b4e" // the data_id we retrieved earlier

const options = {
    "method": "GET",
    "hostname": "serpapi.webscrapingapi.com",
    "port": null,
    "path": `/v1?engine=google_maps_reviews&api_key=${API_KEY}&data_id=${data_id}`, // no need for a query anymore, data_id is enough to identify a place
    "headers": {}
};

const req = https.request(options, function (res) {
    const chunks = [];

    res.on("data", function (chunk) {
        chunks.push(chunk);
    });

    res.on("end", function () {
        const body = Buffer.concat(chunks);
        const response = JSON.parse(body.toString());
        console.log(response);
    });
});

req.end();

After executing this script, you should receive a response that appears similar to this:

reviews: [
    {
        link: 'https://www.google.com/maps/reviews/data=!4m8!14m7!1m6!2m5!1sChZDSUhNMG9nS0VJQ0FnSUMyem9pOEdBEAE!2m1!1s0x0:0xafdfd6dc1d3a2b4e!3m1!1s2@1:CIHM0ogKEICAgIC2zoi8GA%7CCgwI1vuBkwYQiKeWyQE%7C?hl=en-US',
        date: '8 months ago',
        rating: 5,
        snippet: 'Wow, if you have dietary restrictions this is absolutely the place to go! Both for the variety of restrictions they cater to as well as the taste of the dishes.The good: great tasting food. Very conscious of dietary restrictions which include multiple types of vegan cheeses as well as gluten free. Decent drink selection.The meh: service is nice but a touch slow. Maybe understaffed? Prices are average for pizzas.The bad: noneFeatures: Did not see any masks on anyone inside. Unsure of cleaning practices so I cannot speak to that.Dine in: Yes\n' +
            'Takeout: Yes\n' +
            'Curbside pickup: YesWow, if you have dietary restrictions this is absolutely the place to go! Both for the variety of restrictions they cater to as well as the taste of the dishes. ...More',
        likes: 3,
        user: [Object],
        images: [Array]
    },
    {
        link: 'https://www.google.com/maps/reviews/data=!4m8!14m7!1m6!2m5!1sChZDSUhNMG9nS0VJQ0FnSURXOUxHSUl3EAE!2m1!1s0x0:0xafdfd6dc1d3a2b4e!3m1!1s2@1:CIHM0ogKEICAgIDW9LGIIw%7CCgwI3OnIkQYQwLGL1gM%7C?hl=en-US',
        date: '9 months ago',
        rating: 5,
        snippet: "We love Waldo Pizza! We have dairy allergies and Waldo offers a wide range of vegan cheeses as well as a ton of different toppings. The vegan dessert here is always excellent as well, super rich in flavor. Of course the traditional pizza, pasta and dessert are also amazing! It's great to have both options under one roof!Dine in: Yes\n" +
            'Outdoor seating: No ...More',
        likes: 1,
        user: [Object],
        images: [Array]
    }
    . . .
]

And that's it! You have successfully scraped Google Maps reviews using our API, and you can now use the data for various purposes such as data analysis, business analysis, machine learning and more. For further reference and code samples in 6 other programming languages, check out our Google Maps reviews documentation.
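As a quick illustration of such analysis, here is a minimal sketch that, assuming the parsed response contains the reviews array shown above, computes the average rating and prints each review's date and snippet:

// `response` is the parsed JSON object produced by the script above
const reviews = response.reviews || [];

const averageRating = reviews.reduce(function (sum, review) {
    return sum + review.rating;
}, 0) / (reviews.length || 1);

console.log(`Average rating over ${reviews.length} reviews: ${averageRating.toFixed(2)}`);

reviews.forEach(function (review) {
    console.log(`${review.date} | ${review.rating} stars | ${review.snippet}`);
});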

Limitations of Google Maps Reviews

Even though using a professional scraper to extract Google Maps reviews can be more efficient and accurate than building your own, there are still some limitations to keep in mind. One limitation is that some professional scrapers have usage limits, meaning you can only scrape a certain number of reviews per day or per month. Another is that some scrapers cannot bypass IP blocks or CAPTCHAs, which makes it difficult to extract large amounts of data without running into errors. Luckily, at WebScrapingAPI we offer residential proxies that rotate IP addresses, so you don't have to worry about being banned or rate limited. Finally, keep in mind that Google Maps reviews are written in natural language, which can make them difficult to analyze and interpret without natural language processing techniques.
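If you do run into rate limits with any provider, a common mitigation is to retry the request after a short delay. Below is a minimal sketch reusing the https module and options object from the script above; it assumes rate limiting is signalled with an HTTP 429 status code, which you should verify against your provider's documentation:

// Hypothetical retry helper: resends the request when a 429 status is returned
function requestWithRetry(options, retries = 3, delayMs = 2000) {
    return new Promise(function (resolve, reject) {
        const req = https.request(options, function (res) {
            if (res.statusCode === 429 && retries > 0) {
                res.resume(); // discard the body and retry after a delay
                setTimeout(function () {
                    requestWithRetry(options, retries - 1, delayMs * 2).then(resolve, reject);
                }, delayMs);
                return;
            }
            const chunks = [];
            res.on("data", (chunk) => chunks.push(chunk));
            res.on("end", () => resolve(JSON.parse(Buffer.concat(chunks).toString())));
        });
        req.on("error", reject);
        req.end();
    });
}

// Usage: requestWithRetry(options).then(console.log).catch(console.error);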

Conclusion

In conclusion, scraping Google Maps reviews can be a valuable tool for businesses, researchers, and individuals. It allows you to gather data on a large scale and analyze it for various purposes. However, it's important to keep in mind that there are limitations, including usage limits, CAPTCHAs, IP blocks, and the need for natural language processing. Using a professional scraper can make the process more efficient and accurate and can eliminate some of these limitations. Overall, scraping Google Maps reviews can provide useful information, but it's important to approach it with caution and care.
