robots-parser - npm Package Versions

Version 3.0.1

  • Fixed bug with https: URLs defaulting to port 80 instead of 443 when no port is specified. – Thanks to @dskvr for reporting

    This affects comparing URLs that include the default HTTPS port to URLs that omit it. For example, comparing https://example.com/ to https://example.com:443/ or vice versa.

    They should be treated as equivalent but weren't, due to the incorrect port being used for https:.
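The root cause is easiest to see with the WHATWG URL parser, which normalizes a scheme's default port away; falling back to port 80 for https: URLs therefore broke the comparison. A minimal illustration (not the library's code):

```javascript
const a = new URL('https://example.com/');
const b = new URL('https://example.com:443/');

// The parser drops a scheme's default port, so both URLs
// normalize to the same href and an empty port string:
console.log(a.href === b.href); // true
console.log(b.port === '');     // true (443 is the https: default)

// The bug: using 80 as the fallback for an empty port made the
// implicit-port URL compare as port 80 rather than 443:
console.log(a.port || 80); // 80 (wrong fallback for https:)
```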

samclarke published 3.0.0

Version 3.0.0

  • Changed to use the global URL object instead of importing it. – Thanks to @brendankenny
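For context, a sketch of the difference (assumed usage, not the project's source):

```javascript
// Before: the WHATWG URL class had to be imported from the
// 'url' module (available there since Node 7):
// const { URL } = require('url');

// After: URL is available as a global (since Node 10), so no
// import is needed:
const parsed = new URL('https://example.com/robots.txt');
console.log(parsed.pathname); // '/robots.txt'
```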
samclarke published 2.4.0

Version 2.4.0

  • Added TypeScript definitions. – Thanks to @danhab99 for creating them
  • Added SECURITY.md policy and CodeQL scanning
samclarke published 2.3.0

Version 2.3.0

  • Fixed bug where passing a user-agent called "constructor" to isAllowed() / isDisallowed() would throw an error.

  • Added support for relative URLs. This does not affect the default behavior, so it is safe to upgrade.

    Relative matching is only allowed if both the robots.txt URL and the URLs being checked are relative.

    For example:

    var robots = robotsParser('/robots.txt', [
        'User-agent: *',
        'Disallow: /dir/',
        'Disallow: /test.html',
        'Allow: /dir/test.html',
        'Allow: /test.html'
    ].join('\n'));
    
    robots.isAllowed('/test.html', 'Sams-Bot/1.0'); // false
    robots.isAllowed('/dir/test.html', 'Sams-Bot/1.0'); // true
    robots.isDisallowed('/dir/test2.html', 'Sams-Bot/1.0'); // true
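The "constructor" crash is an instance of a classic plain-object lookup pitfall; a sketch of the general issue (not robots-parser's actual code):

```javascript
// Looking up user-agent rules keyed in a plain object collides
// with properties inherited from Object.prototype:
const rules = {};
console.log(typeof rules['constructor']); // 'function', not 'undefined'

// A null-prototype object (or a Map) has no inherited keys,
// so the lookup behaves as expected:
const safeRules = Object.create(null);
console.log(typeof safeRules['constructor']); // 'undefined'
```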
    
samclarke published 2.2.0

Version 2.2.0

  • Fixed bug with matching wildcard patterns against some URLs – Thanks to @ckylape for reporting and fixing
  • Changed matching algorithm to match Google's implementation in google/robotstxt
  • Changed order of precedence to match the current spec
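As background, robots.txt patterns support '*' (any sequence of characters) and a trailing '$' (end-of-URL anchor). A hedged sketch of such a matcher, not the library's implementation:

```javascript
// Convert a robots.txt path pattern to a RegExp: '*' matches any
// character sequence, a trailing '$' anchors the match at the end
// of the URL, and everything else is literal.
function robotsPatternToRegExp(pattern) {
    let anchored = false;
    if (pattern.endsWith('$')) {
        anchored = true;
        pattern = pattern.slice(0, -1);
    }
    const source = pattern
        .split('*')
        .map(part => part.replace(/[.+?^${}()|[\]\\]/g, '\\$&'))
        .join('.*');
    return new RegExp('^' + source + (anchored ? '$' : ''));
}

console.log(robotsPatternToRegExp('/fish*').test('/fishing/area'));    // true
console.log(robotsPatternToRegExp('/*.php$').test('/index.php'));      // true
console.log(robotsPatternToRegExp('/*.php$').test('/index.php?id=1')); // false
```

Under the spec's order of precedence, the longest matching rule wins, with Allow preferred when an Allow and a Disallow rule match at equal length.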
samclarke published 2.1.1

Version 2.1.1

  • Fixed bug that could be used to cause rule checking to take a long time – Thanks to @andeanfog
samclarke published 2.1.0

Version 2.1.0

  • Removed use of the punycode module APIs, as the new URL API handles it
  • Improved test coverage
  • Added tests for percent-encoded paths and improved support for them
  • Added getMatchingLineNumber() method
  • Fixed bug with comments on the same line as a directive
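Both the punycode and percent-encoding points lean on normalization the WHATWG URL parser performs itself; for example (assuming a Node build with ICU for IDN support):

```javascript
// The URL parser percent-encodes non-ASCII path characters, so
// raw and encoded forms normalize consistently:
console.log(new URL('https://example.com/caf\u00e9').pathname); // '/caf%C3%A9'

// It also converts internationalized hostnames to punycode,
// which previously required the punycode module:
console.log(new URL('https://b\u00fccher.example/').hostname); // 'xn--bcher-kva.example'
```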
samclarke published 2.0.0

Version 2.0.0

This release is not 100% backwards compatible, as it now uses the new URL APIs, which are not supported in Node < 7.

  • Updated code to not use deprecated URL module APIs. – Thanks to @kdzwinel
samclarke published 1.0.2

Version 1.0.2

  • Fixed error caused by invalid URLs missing the protocol.
samclarke published 1.0.1

Version 1.0.1

  • Fixed bug with the "user-agent" rule being treated as case-sensitive. – Thanks to @brendonboshell
  • Improved test coverage. – Thanks to @schornio
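Robots.txt user-agent names are conventionally matched case-insensitively. A minimal sketch of such a comparison (a hypothetical helper, not the library's code):

```javascript
// Compare a robots.txt User-agent rule against a crawler's UA
// string case-insensitively:
function agentMatches(ruleAgent, userAgent) {
    return userAgent.toLowerCase().indexOf(ruleAgent.toLowerCase()) !== -1;
}

console.log(agentMatches('sams-bot', 'Sams-Bot/1.0')); // true
console.log(agentMatches('SAMS-BOT', 'sams-bot/1.0')); // true
```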