
open-graph-scraper
A simple node module for scraping Open Graph and Twitter Card info off a site. For browser usage, we recommend using ky to make the requests (or a backend service) and then passing the HTML into open-graph-scraper using the html option.
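For example, here is a minimal Node-side sketch of the html option (the URL and the use of got here are illustrative; in a browser you would fetch the page with ky or similar and pass the body the same way):
const ogs = require('open-graph-scraper');
const got = require('got'); // any HTTP client works; got is only used here to fetch the page

got('http://ogp.me/')
  .then((response) => ogs({ html: response.body })) // note: don't set options.url when passing html
  .then(({ error, result }) => {
    console.log('error:', error);
    console.log('result:', result);
  });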
npm install open-graph-scraper --save
Callback Example:
const ogs = require('open-graph-scraper');
const options = { url: 'http://ogp.me/' };
ogs(options, (error, results, response) => {
console.log('error:', error); // This returns true or false. True if there was an error. The error itself is inside the results object.
console.log('results:', results); // This contains all of the Open Graph results
console.log('response:', response); // This contains the HTML of the page
});
Promise Example:
const ogs = require('open-graph-scraper');
const options = { url: 'http://ogp.me/' };
ogs(options)
.then((data) => {
const { error, result, response } = data;
console.log('error:', error); // This returns true or false. True if there was an error. The error itself is inside the results object.
console.log('result:', result); // This contains all of the Open Graph results
console.log('response:', response); // This contains the HTML of the page
})
Check the return for a success flag. If success is set to true, then the url input was valid; otherwise it will be set to false. The above example will return something like...
{
ogTitle: 'Open Graph protocol',
ogType: 'website',
ogUrl: 'http://ogp.me/',
ogDescription: 'The Open Graph protocol enables any web page to become a rich object in a social graph.',
ogImage: {
url: 'http://ogp.me/logo.png',
width: '300',
height: '300',
type: 'image/png'
},
requestUrl: 'http://ogp.me/',
success: true
}
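Since success can be false even when the call completes, here is a minimal sketch of gating on that flag (whether failures resolve or reject can vary by version, so both paths are handled):
const ogs = require('open-graph-scraper');

ogs({ url: 'http://ogp.me/' })
  .then(({ error, result }) => {
    if (!error && result.success) {
      console.log('title:', result.ogTitle);
    } else {
      console.log('scrape failed:', result); // success is false when the url input was not valid
    }
  })
  .catch((data) => {
    console.log('request failed:', data); // some failures may reject the promise instead
  });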
Name | Info | Default Value | Required |
---|---|---|---|
url | URL of the site. | | x |
timeout | Timeout of the request | 2000 ms | |
html | You can pass in an HTML string to run ogs on it. (use without options.url) | | |
blacklist | Pass in an array of sites you don't want ogs to run on. | [] | |
onlyGetOpenGraphInfo | Only fetch open graph info and don't fall back on anything else. | false | |
ogImageFallback | Fetch other images if no open graph ones are found. | true | |
customMetaTags | Here you can define custom meta tags you want to scrape. | [] | |
allMedia | By default, OGS will only send back the first image/video it finds | false | |
decompress | Set the accept-encoding to gzip/deflate | true | |
followRedirect | Defines if redirect responses should be followed automatically. | true | |
maxRedirects | Max number of redirects ogs will follow. | 10 | |
retry | Number of times ogs will retry the request. | 2 | |
headers | An object containing request headers. Useful for setting the user-agent | {} | |
peekSize | Sets the peekSize for the request | 1024 | |
agent | Used for proxies. See the proxy example below for notes on how to use it. | null | |
downloadLimit | Maximum size of the content downloaded from the server, in bytes | 1000000 (1 MB) | |
urlValidatorSettings | Sets the options used by validator.js for testing the URL | module defaults | |
Note: open-graph-scraper uses got for requests, and most of got's options should work as open-graph-scraper options.
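For example, here is a sketch combining several of the options from the table above (the values are illustrative):
const ogs = require('open-graph-scraper');

const options = {
  url: 'http://ogp.me/',
  timeout: 5000,                // allow more than the 2000 ms default
  onlyGetOpenGraphInfo: true,   // don't fall back on other meta tags
  ogImageFallback: false,       // skip scanning for non-Open Graph images
  headers: { 'user-agent': 'my-scraper-bot' }, // custom request headers
};

ogs(options)
  .then(({ error, result }) => {
    console.log('error:', error);
    console.log('result:', result);
  });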
Custom Meta Tag Example:
const ogs = require('open-graph-scraper');
const options = {
url: 'https://github.com/jshemas/openGraphScraper',
customMetaTags: [{
multiple: false, // is there more than one of these tags on a page (normally this is false)
property: 'hostname', // meta tag name/property attribute
fieldName: 'hostnameMetaTag', // name of the result variable
}],
};
ogs(options)
.then((data) => {
const { error, result, response } = data;
console.log('hostnameMetaTag:', result.hostnameMetaTag); // hostnameMetaTag: github.com
})
Proxy Example: see got's documentation for more info on how to use proxies.
const ogs = require('open-graph-scraper');
const tunnel = require('tunnel');
const options = {
url: 'https://whatismyipaddress.com/',
timeout: 15000,
agent: {
// setting proxy agent for https requests
https: tunnel.httpsOverHttp({
// test proxies can be found here: https://hidemy.name/en/proxy-list/?country=US&type=h#list or http://free-proxy.cz/en/proxylist/country/US/https/ping/all
proxy: {
host: 'proxy_ip', // replace with your proxy host
port: proxyPort, // replace with your proxy port number
rejectUnauthorized: false,
}
})
}
};
ogs(options)
.then((data) => {
const { error, result, response } = data;
console.log('response:', response); // you should see the proxy IP in here
})
const ogs = require("open-graph-scraper");
const options = {
url: "https://twitter.com/elonmusk/status/1364826301027115008",
headers: {
"user-agent": "Googlebot/2.1 (+http://www.google.com/bot.html)",
},
};
ogs(options, (error, results) => {
console.log("error:", error); // This returns true or false. True if there was an error. The error itself is inside the results object.
console.log("results:", results); // This contains all of the Open Graph results
});
You can run the tests by running...
npm run test
4.10.0
- The content-type header now has to contain text/html.
- Added options.downloadLimit; it sets the maximum size of the content downloaded from the server, in bytes.
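As a sketch, options.downloadLimit can be lowered below the 1 MB default (the value is illustrative; what happens when the limit is exceeded depends on the version):
const ogs = require('open-graph-scraper');

const options = {
  url: 'http://ogp.me/',
  downloadLimit: 500000, // stop downloading page content after ~500 kB
};

ogs(options)
  .then(({ error, result }) => {
    console.log('error:', error);
    console.log('ogTitle:', result.ogTitle);
  });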