RobotsTagParser
A simple gem to parse X-Robots-Tag HTTP headers according to Google's X-Robots-Tag HTTP header specification.
Installation
Add this line to your application's Gemfile:
gem 'robots_tag_parser', git: 'https://github.com/GSA/robots_tag_parser'
And then execute:
$ bundle
Usage
Basic examples
Get rules applying to all user agents:
headers = { 'X-Robots-Tag' => ['noindex,noarchive', 'googlebot: nofollow'] }
RobotsTagParser.get_rules(headers: headers)
=> ['noindex', 'noarchive']
Get rules applying to a specific user agent (these include the generic rules):
headers = { 'X-Robots-Tag' => ['noindex,noarchive', 'googlebot: nofollow'] }
RobotsTagParser.get_rules(headers: headers, user_agent: 'googlebot')
=> ['noindex', 'noarchive', 'nofollow']
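The combining behavior above can be sketched in plain Ruby. This is a simplified, hypothetical illustration of the parsing logic (the gem's actual implementation may differ): each header value is either a bare directive list like 'noindex,noarchive' (applies to all user agents) or prefixed with an agent name like 'googlebot: nofollow' (applies only to that agent).

```ruby
# Illustrative sketch only -- not the gem's internals.
# Each value is "rule1,rule2" (generic) or "agent: rule1,rule2" (agent-specific).
def parse_robots_tags(values, user_agent: nil)
  rules = []
  values.each do |value|
    agent, directives = value.split(':', 2)
    if directives.nil?
      # No "agent:" prefix -- generic rules that apply to every user agent
      rules.concat(agent.split(',').map(&:strip))
    elsif user_agent && agent.strip.downcase == user_agent.downcase
      # Agent-specific rules, included only when the agent matches
      rules.concat(directives.split(',').map(&:strip))
    end
  end
  rules
end

parse_robots_tags(['noindex,noarchive', 'googlebot: nofollow'])
# => ["noindex", "noarchive"]
parse_robots_tags(['noindex,noarchive', 'googlebot: nofollow'], user_agent: 'googlebot')
# => ["noindex", "noarchive", "nofollow"]
```

Note that a real parser also has to handle directives that themselves contain a colon (e.g. 'unavailable_after: 25 Jun 2010 15:00:00 PST'); this sketch ignores that case for clarity.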
Development
After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.
To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and tags, and push the .gem file to rubygems.org.
Contributing
Bug reports and pull requests are welcome on GitHub.
License
The gem is available as open source under the terms of the CC0 License.
Directives: the full set of supported directives is documented at https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag