
Fast and lightweight promise/event-based web krawler with built-in cheerio, XML and JSON parsers. And of course ... the best :)
npm install krawler
var Krawler = require('krawler');

var urls = [
    'http://ondraplsek.cz'
];

var krawler = new Krawler();

krawler
    .queue(urls)
    .on('data', function($, url, response) {
        // $ - cheerio instance of the fetched page
        // url - URL of the current web page
        // response - response object from mikeal/request
    })
    .on('error', function(err, url) {
        // there has been an 'error' on 'url'
    })
    .on('end', function() {
        // all URLs have been fetched
    });
Krawler provides the following API:
var krawler = new Krawler({
    maxConnections: 10, // maximum number of simultaneously opened connections, default 10
    parser: 'cheerio',  // web page parser, default 'cheerio'
                        // other options are 'xml', 'json' or false (no parser will be used, raw data will be returned)
    forceUTF8: false,   // whether Krawler should convert the source string to UTF-8, default false
});
mikeal/request is used for fetching web pages, so any option from that package can be passed to Krawler's constructor.
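For instance, a custom User-Agent header, a timeout and redirect handling (all standard mikeal/request options) can be mixed into the constructor options. A minimal sketch, assuming Krawler simply forwards options it does not recognize to request:

var Krawler = require('krawler');

var krawler = new Krawler({
    // Krawler's own options:
    maxConnections: 10,
    parser: 'cheerio',
    // standard mikeal/request options, forwarded to the underlying request:
    headers: { 'User-Agent': 'krawler-example/1.0' },
    timeout: 5000,        // abort a request after 5 seconds
    followRedirect: true  // follow HTTP 3xx redirects
});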
var urls = [
    'https://graph.facebook.com/nodejs',
    'https://graph.facebook.com/facebook',
    'https://graph.facebook.com/cocacola',
    'https://graph.facebook.com/google',
    'https://graph.facebook.com/microsoft',
];

var krawler = new Krawler({
    maxConnections: 5,
    parser: 'json',
    forceUTF8: true
});

krawler
    .queue(urls)
    .on('data', function(json, url, response) {
        // do something with json...
    })
    .on('error', function(err, url) {
        // there has been an 'error' on 'url'
    })
    .on('end', function() {
        // all URLs have been fetched
    });
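The raw mode mentioned above works the same way. A minimal sketch, assuming that with parser set to false the 'data' listener receives the unparsed response body as a string:

var krawler = new Krawler({
    parser: false // no parser, raw data will be returned
});

krawler
    .queue(['http://ondraplsek.cz'])
    .on('data', function(body, url, response) {
        // body - raw response body as a string
        console.log(url + ' returned ' + body.length + ' bytes');
    })
    .on('error', function(err, url) {
        console.error('failed to fetch ' + url + ': ' + err.message);
    });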
If your program flow is based on promises, you can easily attach Krawler to your promise chain. The method fetchUrl() returns a Q promise. When the promise is fulfilled, the callback function is called with a result object.
The result object has two properties: data (the parsed page, e.g. a cheerio instance) and response (the response object from mikeal/request).
var krawler = new Krawler();

findUrl()
    .then(function(url) {
        return krawler.fetchUrl(url);
    })
    .then(function(result) {
        // in this case result.data is a cheerio instance
        return processData(result.data);
    });
    // and so on ...
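Because fetchUrl() returns a standard Q promise, several pages can also be fetched in parallel with Q.all. A sketch under that assumption (the URL list and logging are illustrative, not part of Krawler's API):

var Q = require('q');
var Krawler = require('krawler');

var krawler = new Krawler({ parser: 'cheerio' });

// Q.all resolves once every fetchUrl() promise has been fulfilled.
Q.all([
    krawler.fetchUrl('http://ondraplsek.cz'),
    krawler.fetchUrl('https://www.npmjs.com/package/krawler')
])
.then(function(results) {
    results.forEach(function(result) {
        // result.data - cheerio instance, result.response - response object
        console.log(result.response.statusCode);
    });
})
.fail(function(err) {
    // Q promises report errors through fail()
    console.error(err);
});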