Crawler is a web spider written in Node.js. It gives you the full power of jQuery on the server to parse a large number of pages as they are downloaded, asynchronously.
node-crawler aims to be the best crawling/scraping package for Node.
It features a configurable connection pool, retries, a server-side DOM with jQuery, charset handling, and caching (see the options below).
The case for creating this package was made at ParisJS #2 in 2010 (lightning talk slides).
Help & forks welcome!
```sh
$ npm install crawler
```
```js
var Crawler = require("crawler").Crawler;

var c = new Crawler({
    "maxConnections": 10,

    // This will be called for each crawled page
    "callback": function(error, result, $) {

        // $ is a jQuery instance scoped to the server-side DOM of the page
        // (jQuery's each() passes the index first, then the element)
        $("#content a").each(function(index, a) {
            c.queue(a.href);
        });
    }
});

// Queue just one URL, with the default callback
c.queue("http://joshfire.com");

// Queue a list of URLs
c.queue(["http://jamendo.com/", "http://tedxparis.com"]);

// Queue URLs with custom callbacks & parameters
c.queue([{
    "uri": "http://parishackers.org/",
    "jQuery": false,

    // The global callback won't be called
    "callback": function(error, result) {
        console.log("Grabbed", result.body.length, "bytes");
    }
}]);

// Queue some HTML code directly without grabbing (mostly for tests)
c.queue([{
    "html": "<p>This is a <strong>test</strong></p>"
}]);
```
You can pass these options to the Crawler() constructor if you want them to be global, or as items in the queue() calls if you want them to be specific to that item (overriding the global options); a sketch follows the list of categories below. This list of options is a strict superset of mikeal's request options and is passed directly to the request() method.
The options fall into the following categories: basic request options, callbacks, pool options, retry options, server-side DOM options, charset encoding, and cache.
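As a rough sketch of how global and per-item options combine: maxConnections, jQuery, uri, and callback appear in the examples above, while retries, retryTimeout, and forceUTF8 are assumed names for the retry and charset options of this era and should be checked against the crawler source.

```js
var Crawler = require("crawler").Crawler;

var c = new Crawler({
    // Pool option: global default applied to every queued item
    "maxConnections": 10,

    // Retry options (assumed names, verify against the source)
    "retries": 3,
    "retryTimeout": 10000,

    // Charset encoding (assumed name): re-encode pages to UTF-8
    "forceUTF8": true,

    // Global callback, used unless an item provides its own
    "callback": function(error, result, $) {
        if (error) { return console.error(error); }
        console.log("Fetched", result.body.length, "bytes");
    }
});

// Per-item options override the globals for that item only
c.queue([{
    "uri": "http://example.com/raw",
    "jQuery": false, // skip the server-side DOM for this page
    "callback": function(error, result) {
        console.log("Raw body:", result.body.length, "bytes");
    }
}]);
```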
When using timeouts, you should run Node > 0.8.14 to avoid triggering Node issue #3076.
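Continuing with the crawler instance c from the sketch above, a per-item timeout (timeout is one of mikeal's request options, forwarded to request()) surfaces as an error in the callback:

```js
c.queue([{
    "uri": "http://example.com/slow",
    "timeout": 2000, // milliseconds, passed through to request()
    "callback": function(error, result, $) {
        if (error) {
            // On Node <= 0.8.14, timing out here could hit Node issue #3076
            return console.error("Request failed:", error);
        }
        console.log("OK,", result.body.length, "bytes");
    }
}]);
```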
There is now a complete memory leak test for node-crawler :)
```sh
$ npm install && npm test
```
Feel free to add more tests!
Rough todolist:
- Make the Sizzle tests pass (jsdom bug? https://github.com/tmpvar/jsdom/issues#issue/81)
- More crawling tests
- Document the API more (+ the result object)
- Get feedback on the feature set for a 1.0 release (option for auto-following links?)
- Check how we can support mimetypes other than HTML
- Option to wait for the callback to finish before freeing the pool resource (via another callback like next())
ChangeLog: 0.2.2, 0.2.1, 0.2.0, 0.1.0
FAQs
Crawler is a ready-to-use web spider with support for proxies, asynchronous crawling, rate limiting, configurable request pools, jQuery, and HTTP/2.
The npm package crawler receives a total of 2,729 weekly downloads, which classifies it as popular. We found that crawler demonstrates a healthy release cadence and project activity, as its last version was released less than a year ago. It lists 0 open source maintainers collaborating on the project.