google-shopping-scraper-api
Google Shopping Scraper can scrape and parse various Google Shopping page types to collect structured e-commerce data.
As part of E-Commerce Scraper API, Google Shopping Scraper extracts timely e-commerce data in raw HTML or structured JSON format. The scraper offers a maintenance-free data collection infrastructure that automates the bulk of underlying processes, from sending HTTP requests to data parsing.
The underlying measures, such as proxies, ensure considerably fewer CAPTCHAs and IP blocks. The scraper supports localized results from almost any locale worldwide (195 countries) with country-level and postal code targeting.
Additionally, the scraper can automate recurring scraping and parsing jobs through Scheduler, load dynamic websites that use JavaScript for rendering content, and retrieve results via the API or directly to Google Cloud Storage or Amazon S3 storage bucket.
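For orientation, here is a minimal sketch of how those options can be combined in a single request payload: `parse` for structured JSON output, `render` for JavaScript-heavy pages, and `geo_location` for localized results. The `render` value `'html'` and the overall payload shape are assumptions drawn from the parameter tables and examples further down, not a definitive reference.

```python
import requests
from pprint import pprint

# Illustrative payload combining the options described above.
# Assumption: 'html' is an accepted value for the render parameter.
payload = {
    'source': 'google_shopping_search',
    'domain': 'com',
    'query': 'adidas',
    'parse': True,                                      # structured JSON instead of raw HTML
    'render': 'html',                                   # JavaScript rendering (assumed value)
    'geo_location': 'New York,New York,United States',  # localized results
}

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),  # replace with your API credentials
    json=payload,
)
pprint(response.json())
```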
There are various page types we can scrape and parse on Google Shopping. You can either provide us with a full URL or a few input parameters via specifically built data sources (e.g., Search, Product, Product Pricing) so we can form the URL on our end.
Below is a quick overview of all the available data source values we support with Google Shopping.
| Source | Description | Structured data |
| --- | --- | --- |
| google | Submit any Google Shopping URL you like. | Depends on the URL. |
| google_shopping_search | Search results for a search term of your choice. | Yes. |
| google_shopping_product | Product page of a product ID of your choice. | Yes. |
| google_shopping_pricing | List of offers available for a product ID of your choice. | Yes. |
The `google` source is designed to retrieve content from various Google Shopping URLs. Instead of sending multiple parameters and letting us form and scrape Google Shopping URLs, you can provide us with a URL to the required Google Shopping page. We do not strip any parameters or alter your URLs in any other way.
This data source also supports parsed data (structured data in JSON format), as long as the URL submitted links to a page that we can parse.
| Parameter | Description | Default Value |
| --- | --- | --- |
| source | Data source. More info. | google |
| url | Direct URL (link) to Google page | - |
| user_agent_type | Device type and browser. The full list can be found here. | desktop |
| render | Enables JavaScript rendering. More info. | - |
| callback_url | URL to your callback endpoint. More info. | - |
| geo_location | The geographical location that the result should be adapted for. Using this parameter correctly is extremely important to get the right data. For more information, read about our suggested geo_location parameter structures here. | - |
| parse | true will return parsed data, as long as the URL submitted is for Google Search. | - |
- required parameter
In this example, we make a request to retrieve a Google Shopping search result for the keyword `adidas`, as seen in New York, USA.
```python
import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google',
    'url': 'https://www.google.com/search?tbm=shop&q=adidas&hl=en',
    'geo_location': 'New York,New York,United States'
}

# Get response.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

# Instead of a response with the job status and results URL, this will return
# the JSON response with the results.
pprint(response.json())
```
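If you want structured output from this URL rather than raw HTML, the same payload can carry `parse`. The snippet below is a sketch under one assumption: that parsed content is exposed under `results[0]['content']` in the JSON response, which is not documented on this page, so adjust the access path if your response differs.

```python
import requests
from pprint import pprint

# Same request as above, but asking for parsed (structured) data.
payload = {
    'source': 'google',
    'url': 'https://www.google.com/search?tbm=shop&q=adidas&hl=en',
    'geo_location': 'New York,New York,United States',
    'parse': True,
}

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

data = response.json()
# Assumption: parsed content sits under results[0]['content'].
pprint(data.get('results', [{}])[0].get('content'))
```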
Code examples for other languages can be found here.
The `google_shopping_search` source is designed to retrieve Google Shopping search results.
| Parameter | Description | Default Value |
| --- | --- | --- |
| source | Data source. More info. | google_shopping_search |
| domain | Domain localization | com |
| query | UTF-encoded keyword | - |
| start_page | Starting page number | 1 |
| pages | Number of pages to retrieve | 1 |
| locale | Accept-Language header value which changes your Google Shopping page web interface language. More info. | - |
| results_language | Results language. List of supported Google languages can be found here. | - |
| geo_location | The geographical location that the result should be adapted for. Using this parameter correctly is extremely important to get the right data. For more information, read about our suggested geo_location parameter structures here. | - |
| user_agent_type | Device type and browser. The full list can be found here. | desktop |
| render | Enables JavaScript rendering. More info. | - |
| callback_url | URL to your callback endpoint. More info. | - |
| parse | true will return parsed data. | - |
| context: nfpr | true will turn off spelling auto-correction. | false |
| context: sort_by | Sort the product list by a given criterion. r applies default Google sorting, rv - by review score, p - by price ascending, pd - by price descending. | r |
| context: min_price | Minimum price of products to filter | - |
| context: max_price | Maximum price of products to filter | - |
- required parameter
In this example, we make a request to retrieve the first 4 pages of Google Shopping search results for the search term `adidas`, sorted by descending price, with a minimum price of $20.
```python
import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google_shopping_search',
    'domain': 'com',
    'query': 'adidas',
    'pages': 4,
    'context': [
        {'key': 'sort_by', 'value': 'pd'},
        {'key': 'min_price', 'value': 20},
    ],
}

# Get response.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())
```
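The `context` list accepts the other filters from the table in the same way. As a variation on the example above, the sketch below sorts by ascending price and bounds results between $20 and $100; the `max_price` key is assumed by symmetry with `min_price`, since only the latter appears in the documented example.

```python
import requests
from pprint import pprint

# Variation: ascending price sort with both a minimum and a maximum price filter.
payload = {
    'source': 'google_shopping_search',
    'domain': 'com',
    'query': 'adidas',
    'pages': 2,
    'context': [
        {'key': 'sort_by', 'value': 'p'},    # 'p' - by price ascending
        {'key': 'min_price', 'value': 20},
        {'key': 'max_price', 'value': 100},  # assumed key, by symmetry with min_price
    ],
}

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)
pprint(response.json())
```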
Code examples for other languages can be found here.
The `google_shopping_product` source is designed to retrieve a Google Shopping product page for a specified product.
| Parameter | Description | Default Value |
| --- | --- | --- |
| source | Data source. More info. | google_shopping_product |
| domain | Domain localization | com |
| query | UTF-encoded product code | - |
| locale | Accept-Language header value which changes your Google Shopping page web interface language. More info. | - |
| results_language | Results language. List of supported Google languages can be found here. | - |
| geo_location | The geographical location that the result should be adapted for. Using this parameter correctly is extremely important to get the right data. For more information, read about our suggested geo_location parameter structures here. | - |
| user_agent_type | Device type and browser. The full list can be found here. | desktop |
| render | Enables JavaScript rendering. More info. | - |
| callback_url | URL to your callback endpoint. More info. | - |
| parse | true will return parsed data. | - |
- required parameter
In the code example below, we make a request to retrieve the product page for product ID `5007040952399054528` from Google Shopping on the `com` domain.
```python
import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google_shopping_product',
    'domain': 'com',
    'query': '5007040952399054528',
}

# Get response.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())
```
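As with the other sources, setting `parse` to true should return structured product data instead of raw HTML. The sketch below only adds that flag to the example above; the exact shape of the parsed product object is not documented here, so the response is printed as-is.

```python
import requests
from pprint import pprint

# Same product request, but with structured (parsed) output enabled.
payload = {
    'source': 'google_shopping_product',
    'domain': 'com',
    'query': '5007040952399054528',
    'parse': True,
}

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)
pprint(response.json())
```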
Code examples for other languages can be found here.
The `google_shopping_pricing` source is designed to retrieve pages containing lists of offers available for a product ID of your choice.
| Parameter | Description | Default Value |
| --- | --- | --- |
| source | Data source. More info. | google_shopping_pricing |
| domain | Domain localization | com |
| query | UTF-encoded product code | - |
| start_page | Starting page number | 1 |
| pages | Number of pages to retrieve | 1 |
| locale | Accept-Language header value which changes your Google Shopping page web interface language. More info. | - |
| results_language | Results language. List of supported Google languages can be found here. | - |
| geo_location | The geographical location that the result should be adapted for. Using this parameter correctly is extremely important to get the right data. For more information, read about our suggested geo_location parameter structures here. | - |
| user_agent_type | Device type and browser. The full list can be found here. | desktop |
| render | Enables JavaScript rendering. More info. | - |
| callback_url | URL to your callback endpoint. More info. | - |
| parse | true will return parsed data. | - |
- required parameter
In the code example below, we make a request to retrieve the product pricing page for product ID `5007040952399054528` from Google Shopping on google.com.
```python
import requests
from pprint import pprint

# Structure payload.
payload = {
    'source': 'google_shopping_pricing',
    'domain': 'com',
    'query': '5007040952399054528',
}

# Get response.
response = requests.request(
    'POST',
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

# Print prettified response to stdout.
pprint(response.json())
```
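Because this source also accepts `start_page` and `pages`, several pages of offers can be pulled in one job. The sketch below requests pages 1 through 3; it assumes, without confirmation from this page, that each requested page appears as a separate entry in the response's `results` list with its own `status_code`.

```python
import requests

# Retrieve the first three pages of offers for the same product ID.
payload = {
    'source': 'google_shopping_pricing',
    'domain': 'com',
    'query': '5007040952399054528',
    'start_page': 1,
    'pages': 3,
}

response = requests.post(
    'https://realtime.oxylabs.io/v1/queries',
    auth=('user', 'pass1'),
    json=payload,
)

# Assumption: one entry per requested page in the 'results' list.
for i, result in enumerate(response.json().get('results', []), start=1):
    print(f'Page {i}: status code {result.get("status_code")}')
```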
Code examples for other languages can be found here.
If you have questions or concerns about Google Shopping Scraper or associated features, get in touch via email (support@oxylabs.io) or through the live chat on our website.