Author Profile

Andrei Ogiolan

Full Stack Developer

Andrei Ogiolan is a Full Stack Developer at WebScrapingAPI, contributing across the product and helping build reliable tools and features for the platform.

Python · web scraping · proxy infrastructure · search data extraction · e-commerce data extraction · Guides · Use Cases

Published Articles (11)
Guides · May 1, 2026 · 17 min read

7 Best SERP APIs in 2026: Pricing & Features Compared

TL;DR: There is no official Google SERP API, so third-party providers fill the gap. Pricing ranges from roughly $0.30 to $15 per thousand searches, and the right choice depends on your volume, budget, and the SERP features you need to extract. This guide compares the top providers side by side, breaks down true cost at scale, and gives you a decision framework to shortlist the best SERP API for your project.

Read article
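The "true cost at scale" comparison the teaser promises reduces to simple arithmetic. A minimal sketch, using the teaser's $0.30–$15 per 1,000 searches range (the 500k monthly volume is an illustrative assumption, not a figure from the article):

```python
def monthly_cost(searches_per_month: int, price_per_1k: float) -> float:
    """Estimate monthly spend for a SERP API billed per 1,000 searches."""
    return searches_per_month / 1000 * price_per_1k

# Compare the low and high ends of the quoted range at 500k searches/month.
low = monthly_cost(500_000, 0.30)   # cheapest tier in the quoted range
high = monthly_cost(500_000, 15.0)  # priciest tier in the quoted range
print(low, high)  # 150.0 7500.0
```

The 50x spread is why volume matters more than the headline per-request price when shortlisting providers.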

Guides · May 7, 2026 · 11 min read

Web Scraping JavaScript Tables in Python: From Hidden APIs to Playwright

TL;DR: Web scraping JavaScript tables in Python rarely needs a headless browser. Open DevTools, find the JSON endpoint that hydrates the grid, replay it with requests, paginate it, and fall back to Playwright only when the network call is signed, encrypted, or otherwise sealed shut.

Read article
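The replay-and-paginate loop the teaser describes can be sketched without a browser at all. A minimal, dependency-free sketch: the `fetch_page` callable and the `{"rows": [...], "has_more": bool}` response shape are hypothetical stand-ins for whatever endpoint DevTools reveals:

```python
def fetch_all_rows(fetch_page, max_pages=50):
    """Drain a paginated JSON endpoint that hydrates a JavaScript table.

    `fetch_page(page)` returns a dict like {"rows": [...], "has_more": bool}
    -- a hypothetical shape; inspect the real call in DevTools' Network tab.
    With requests it might be:
        lambda p: session.get(URL, params={"page": p}, timeout=10).json()
    """
    rows = []
    for page in range(1, max_pages + 1):
        payload = fetch_page(page)
        rows.extend(payload["rows"])
        if not payload.get("has_more"):  # stop when the API says it's done
            break
    return rows
```

Keeping the HTTP call injectable makes the pagination logic trivially testable before you point it at a live endpoint.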

Guides · May 7, 2026 · 10 min read

How to Scrape HTML Tables in Golang with Colly: End-to-End Guide

TL;DR: This guide shows how to scrape HTML tables in Golang end to end: choose between Colly, goquery, and golang.org/x/net/html, target the right <tbody>, model rows as a typed struct, and export clean JSON and CSV. You also get pagination, anti-block, and JavaScript-rendered table patterns.

Read article

Guides · May 7, 2026 · 15 min read

How to Scrape Google Maps for Reviews: A Practical Python Guide

TL;DR: Figuring out how to scrape Google Maps for reviews comes down to three method tracks: a DIY Selenium scraper behind a rotating proxy, a scraping API with render instructions, or a structured Maps Reviews API that returns parsed JSON. This guide walks through all three in Python with copy-pasteable code, pagination patterns, anti-block tactics, and a final cleaning step that turns raw reviews into something a business can actually use.

Read article

Guides · Apr 22, 2026 · 7 min read

How to Web Scrape Google Maps Place Results

Learn how to scrape Google Maps place results with our API using Node.js: a step-by-step guide, the benefits of a professional scraper, and more. Easily get the data_id and coordinates, and build the data parameter.

Read article

Guides · May 7, 2026 · 15 min read

How to Scrape HTML Tables Using Python

TL;DR: Most HTML tables can be scraped with a single line of pandas.read_html. When the table is paginated, JavaScript-rendered, or has merged headers, switch to Requests + BeautifulSoup or a headless browser like Playwright. This guide gives you a decision matrix, working code for all three approaches, and the cleaning steps that turn scraped rows into pipeline-ready data.

Read article
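The `pandas.read_html` one-liner the teaser mentions needs an HTML parser backend such as lxml installed. To make the parse step concrete, here is a deliberately simple stdlib-only sketch of the same idea; it ignores the messy cases (nested tables, colspans, merged headers) that pandas or BeautifulSoup handle for you:

```python
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect <table> rows into lists of cell strings (stdlib only)."""

    def __init__(self):
        super().__init__()
        self.rows, self._row, self._cell, self._in_cell = [], [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell, self._cell = True, []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._row.append("".join(self._cell).strip())
            self._in_cell = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_cell:
            self._cell.append(data)

html = "<table><tr><th>Name</th><th>Score</th></tr><tr><td>Ada</td><td>10</td></tr></table>"
parser = TableExtractor()
parser.feed(html)
print(parser.rows)  # [['Name', 'Score'], ['Ada', '10']]
```

For real pipelines, prefer the pandas route from the guide; this sketch is only to show what the parse step is doing under the hood.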

Guides · Apr 22, 2026 · 6 min read

How to Web Scrape Google Shopping Nearby Sellers with Node.js

Learn how to use Node.js and our API to scrape nearby sellers from Google Shopping. Extract valuable data quickly and easily with our professional web scraper.

Read article

Guides · Apr 22, 2026 · 6 min read

Learn How to Web Scrape Google Shopping Product Specs with Node.js

Discover the step-by-step guide to web scraping Google Shopping product specs using Node.js. Improve your web scraping skills with this tutorial.

Read article

Guides · Apr 22, 2026 · 10 min read

Find out how to use cURL in Python

Learn how to use cURL in Python to build data extraction scripts

Read article
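A common first step when moving a cURL one-liner into a Python extraction script is building the argv list programmatically. A minimal sketch, with `curl_command` a hypothetical helper name; the flags shown (`-sS`, `-L`, `-H`, `-o`) are standard curl options:

```python
import shlex

def curl_command(url, headers=None, output=None):
    """Build a curl argv list from Python values (hypothetical helper)."""
    cmd = ["curl", "-sS", "-L", url]  # -sS: quiet but show errors; -L: follow redirects
    for key, value in (headers or {}).items():
        cmd += ["-H", f"{key}: {value}"]
    if output:
        cmd += ["-o", output]
    return cmd

cmd = curl_command("https://example.com/api", headers={"Accept": "application/json"})
print(shlex.join(cmd))
# Execute it with: subprocess.run(cmd, capture_output=True, text=True, check=True)
```

Passing an argv list (rather than a shell string) to `subprocess.run` avoids shell-quoting bugs when URLs or header values contain special characters.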

Use Cases · Apr 22, 2026 · 7 min read

Find out how to use cURL with a Proxy

Learn how to use cURL, a command-line data-transfer tool for developers, with a proxy. This article explains what a proxy is, how to set one up, and how to use it with cURL.

Read article
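Routing cURL through a proxy comes down to the `-x`/`--proxy` flag; in Python's requests library the same thing is a `proxies` mapping. A minimal sketch, with the proxy URL a placeholder to be replaced by your own endpoint:

```python
def with_proxy(cmd, proxy_url):
    """Append curl's -x/--proxy flag to an argv list.

    proxy_url like "http://user:pass@proxy.example.com:8080" -- a placeholder;
    substitute your own proxy endpoint and credentials.
    """
    return cmd + ["-x", proxy_url]

cmd = with_proxy(["curl", "-sS", "https://httpbin.org/ip"],
                 "http://proxy.example.com:8080")
print(cmd)

# The requests-library equivalent uses a mapping instead of a flag:
# requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
```

Hitting an IP-echo endpoint such as httpbin.org/ip is a quick way to confirm the request actually went through the proxy.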

Guides · Apr 22, 2026 · 10 min read

Automated Web Scraping: Easy Retrieval of Reliable Structured Web Data

Automated web scraping is a reliable way to obtain valuable structured data from multiple websites and support well-considered, data-driven decisions.

Read article