# Crawler
Crawler is a simple web crawler written in Ruby. Given a URL, it crawls the domain and recursively finds all links associated with it, keeping track of the static content related to each of those links. It uses EventMachine and Fibers (via em-synchrony) to issue concurrent, non-blocking requests. Crawler stores the site map using a variation of the adjacency-list data structure, and it can pretty-print the map once a URL has been crawled.
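The adjacency-list idea can be sketched in plain Ruby as below. Note this is a hypothetical illustration, not Crawler's actual internals: the `SiteMap` class, its `Node` struct, and the method names are all made up for the sketch. Each page URL maps to a node holding the page's outgoing links and its static assets.

```ruby
# Hypothetical adjacency-list site map (illustration only; not Crawler's real API).
class SiteMap
  # One node per page: outgoing links plus static assets (images, CSS, JS).
  Node = Struct.new(:links, :assets)

  def initialize
    # Auto-create an empty node the first time a URL is touched.
    @nodes = Hash.new { |hash, url| hash[url] = Node.new([], []) }
  end

  # Record a hyperlink from one page to another.
  def add_link(from, to)
    @nodes[from].links << to
  end

  # Record a static asset referenced by a page.
  def add_asset(page, asset)
    @nodes[page].assets << asset
  end

  def [](url)
    @nodes[url]
  end
end

map = SiteMap.new
map.add_link('http://example.com/', 'http://example.com/about')
map.add_asset('http://example.com/', 'http://example.com/logo.png')
```

Keeping assets on the node (rather than as graph edges) mirrors the description above: links drive the recursion, while static content is simply tracked per page.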
## Installation
Add this line to your application's Gemfile:

```ruby
gem 'crawler'
```

And then execute:

```sh
$ bundle
```

Or install it yourself as:

```sh
$ gem install crawler
```
## Usage
### Using Crawler as a library

```ruby
crawler = Crawler.new('http://google.com')
crawler.crawl
map = crawler.map
crawler.print
```
### Using Crawler as a binary

```sh
$ crawler http://google.com
```
## Contributing
- Fork it ( http://github.com/anupom/crawler/fork )
- Create your feature branch (`git checkout -b my-new-feature`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create new Pull Request