
robotstxt
a robots.txt parser for node.js
Install:
npm install robotstxt
All examples use CoffeeScript syntax.
Require:
robotsTxt = require 'robotstxt'
Parse a robots.txt:
#robotsTxt(url, user_agent)
google_robots_txt = robotsTxt 'http://www.google.com/robots.txt', 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.iamnnotreallyagooglebot.com/)'
Assign an event handler that fires after all the parsing is done:
google_robots_txt.on 'ready', (gate_keeper) ->
  #returns false
  console.log gate_keeper.isAllowed 'http://www.google.com/setnewsprefs?sfsdfg'
  #returns false
  console.log gate_keeper.isAllowed '/setnewsprefs?sfsdfg'
  #returns true
  console.log gate_keeper.isDisallowed 'http://www.google.com/setnewsprefs?sfsdfg'
  #returns true
  console.log gate_keeper.isDisallowed '/setnewsprefs?sfsdfg'
gate_keeper methods:
#asks the gate_keeper if it's ok to crawl a URL
isAllowed url
#asks the gate_keeper if it's not ok to crawl a URL
isDisallowed url
#answers the question of why a URL is allowed/disallowed
why url
#changes the user agent that is used for these questions
setUserAgent user_agent
#tells you which robots.txt group is used for a given user_agent
#by default it uses the user agent set with setUserAgent
getGroup (user_agent)
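A minimal sketch of these methods used together inside the 'ready' handler. The URL and user agent strings here are made up, and the exact output shapes of why and getGroup are assumptions:
robotsTxt = require 'robotstxt'

checker = robotsTxt 'http://www.example.com/robots.txt', 'Mozilla/5.0 (compatible; MyAppBot/1.0; +http://www.example.com/)'
checker.on 'ready', (gate_keeper) ->
  url = 'http://www.example.com/private/index.html'
  if gate_keeper.isAllowed url
    console.log "ok to crawl #{url}"
  else
    #why reports the rule behind the decision (output shape assumed)
    console.log gate_keeper.why url
  #re-ask the same questions as a different crawler
  gate_keeper.setUserAgent 'Mozilla/5.0 (compatible; OtherBot/1.0; +http://www.example.com/)'
  #which robots.txt group now applies (no argument, so the user agent set above is used)
  console.log gate_keeper.getGroup()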
robotsTxt methods
#fetches and parses url with user_agent
#returns a robots_txt event emitter
robotsTxt(url, user_agent)
#blank robots_txt object
blank_robots_txt = robotsTxt()
#crawls and parses a robots.txt
#emits a 'crawled' event
blank_robots_txt.crawl(protocol, host, port, path, user_agent, encoding)
#parses a txt string line by line
#emits a 'ready' event
blank_robots_txt.parse(txt)
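For instance, parse lets you feed in a robots.txt you already have as a string; a minimal sketch (the rules below are made up for illustration):
blank_robots_txt = robotsTxt()
#register the handler first; 'ready' fires once every line is parsed
blank_robots_txt.on 'ready', (gate_keeper) ->
  #returns false, because /tmp/ is disallowed for every user agent below
  console.log gate_keeper.isAllowed '/tmp/cache.html'
blank_robots_txt.parse """
  User-agent: *
  Disallow: /tmp/
  """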
robotsTxt events
#emitted after the whole robots.txt has been crawled
robotsTxt.on 'crawled', (txt) -> ...
#emitted after all lines of the robots.txt have been parsed
robotsTxt.on 'ready', (gate_keeper) -> ...
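Putting both events together, a sketch assuming crawl's argument order as listed above; the concrete values (the 'http' protocol string, port 80, the 'utf8' encoding) are guesses for illustration:
blank_robots_txt = robotsTxt()
#fires first, with the raw text of the fetched file
blank_robots_txt.on 'crawled', (txt) ->
  console.log "fetched #{txt.length} characters"
#fires once every line has been parsed
blank_robots_txt.on 'ready', (gate_keeper) ->
  console.log gate_keeper.isAllowed '/search'
blank_robots_txt.crawl 'http', 'www.example.com', 80, '/robots.txt', 'Mozilla/5.0 (compatible; MyAppBot/1.0; +http://www.example.com/)', 'utf8'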
NOTES
The default user agent is:
Mozilla/5.0 (compatible; Open-Source-Coffee-Script-Robots-Txt-Checker/2.1; +http://example.com/bot.html)
I strongly recommend using your own user agent, e.g.:
myapp_robots_txt = robotsTxt 'http://www.google.com/robots.txt', 'Mozilla/5.0 (compatible; MyAppBot/2.1; +http://www.example.com/)'
If you want to simulate another crawler (for testing purposes only, of course), see this list for the correct user agent strings.