@crawlify.io/robots

Parses robots.txt files to provide meaningful, useful output as well as reporting syntax errors.

Version 1.0.1 (latest), published to npm. Package last updated 10 Feb 2019.

Crawlify/Robots

A robots.txt parser for Node.js

This package offers two options for parsing robots.txt files: fetch, which retrieves the file from the URL provided and parses the response, or parse, which parses robots.txt text you supply directly.

// Fetch a robots.txt file from a URL and parse the response.
let RobotFetch = require('@crawlify.io/robots');
RobotFetch.fetch('https://reckless.agency/robots.txt', function() {
  console.log(RobotFetch.rulesets);  // parsed rule sets, grouped by user agent
  console.log(RobotFetch.sitemaps);  // any Sitemap: entries found
});

// Parse robots.txt content you have already retrieved.
let RobotParse = require('@crawlify.io/robots');
RobotParse.parse(someRobotsContent, function() {
  console.log(RobotParse.rulesets);
  console.log(RobotParse.sitemaps);
});
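
Here, someRobotsContent stands for the raw text of a robots.txt file. For illustration only, it might be defined like this:

let someRobotsContent = `User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml`;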

If any lines of the robots.txt file cannot be understood by the parser, they will be returned in Robot.unknown.
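
For example, sticking with the callback style above (the exact shape of the entries in unknown is not documented here, so treat this as a sketch):

let Robot = require('@crawlify.io/robots');
Robot.parse('User-agent: *\nNot a valid robots.txt directive', function() {
  // Lines the parser could not interpret are collected here.
  console.log(Robot.unknown);
});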

More features will be added as we move forward.

Keywords

robots.txt
