@crawlify.io/robots

Parses robots.txt files to provide meaningful, useful output as well as reporting syntax errors.

  • Version: 1.0.1 (latest)
  • Last updated: 10 Feb 2019
  • Maintainers: 2
  • Weekly downloads: 1

Crawlify/Robots

A robots.txt parser for Node.js

This package provides two ways to parse robots.txt files: fetch, which retrieves the file from the URL you provide and parses the response, and parse, which parses robots.txt content you supply directly.

let RobotFetch = require('@crawlify.io/robots');

// Fetch the robots.txt file from the given URL and parse the response.
RobotFetch.fetch('https://reckless.agency/robots.txt', function() {
  console.log(RobotFetch.rulesets); // rules the parser understood
  console.log(RobotFetch.sitemaps); // sitemap entries found in the file
});

let RobotParse = require('@crawlify.io/robots');

// Parse robots.txt content you have already retrieved yourself.
RobotParse.parse(someRobotsContent, function() {
  console.log(RobotParse.rulesets);
  console.log(RobotParse.sitemaps);
});

If any lines of the robots.txt file cannot be understood by the parser, they are returned in Robot.unknown.
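
As a minimal sketch (assuming unknown is simply a collection of the raw lines the parser could not interpret), unrecognised directives can be inspected right after parsing:

let Robot = require('@crawlify.io/robots');

// "Dissallow" is deliberately misspelled so the parser cannot interpret that line.
let content = 'User-agent: *\nDissallow: /private\nSitemap: https://example.com/sitemap.xml\n';

Robot.parse(content, function() {
  console.log(Robot.rulesets); // directives the parser understood
  console.log(Robot.sitemaps); // sitemap entries
  console.log(Robot.unknown);  // lines the parser could not understand
});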

More features will be added as we move forward.
