crawler-url-parser

A URL parser for crawling purposes


Installation

npm install crawler-url-parser

Usage

Parse

const cup = require('crawler-url-parser');

//// parse(current_url,base_url)
let url = cup.parse("../ddd","http://question.stackoverflow.com/aaa/bbb/ccc/");
console.log(url.normalized);//http://question.stackoverflow.com/aaa/bbb/ddd
console.log(url.host); // question.stackoverflow.com
console.log(url.domain); // stackoverflow.com
console.log(url.subdomain); // question
console.log(url.protocol); // http:
console.log(url.path); // /aaa/bbb/ddd
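
The fields returned by parse() can drive simple crawl-scope checks. The sketch below is only an illustration built on the fields shown above: the staysOnSite helper and the stackoverflow.com target domain are made up for this example, not part of the package's API.

const cup = require('crawler-url-parser');

//// hypothetical helper: true if a link found on pageUrl stays on siteDomain
function staysOnSite(href, pageUrl, siteDomain) {
    let parsed = cup.parse(href, pageUrl);
    return parsed.domain === siteDomain;
}

console.log(staysOnSite("../ddd", "http://question.stackoverflow.com/aaa/bbb/ccc/", "stackoverflow.com")); // true
console.log(staysOnSite("http://www.google.com/external-1", "http://question.stackoverflow.com/aaa/", "stackoverflow.com")); // false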

Extract

const cup = require('crawler-url-parser');

//// extract(html_str,current_url);
let htmlStr =
    '<html> \
        <body> \
            <a href="http://www.stackoverflow.com/internal-1">test-link-4</a><br /> \
            <a href="http://www.stackoverflow.com/internal-2">test-link-5</a><br /> \
            <a href="http://www.stackoverflow.com/internal-2">test-link-6</a><br /> \
            <a href="http://faq.stackoverflow.com/subdomain-1">test-link-7</a><br /> \
            <a href="http://faq.stackoverflow.com/subdomain-2">test-link-8</a><br /> \
            <a href="http://faq.stackoverflow.com/subdomain-2">test-link-9</a><br /> \
            <a href="http://www.google.com/external-1">test-link-10</a><br /> \
            <a href="http://www.google.com/external-2">test-link-11</a><br /> \
            <a href="http://www.google.com/external-2">test-link-12</a><br /> \
        </body> \
    </html>';
let currentUrl= "http://www.stackoverflow.com/aaa/bbb/ccc";
let urls = cup.extract(htmlStr,currentUrl);
console.log(urls.length); // 6 (duplicate links are counted only once)
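
The shape of the items returned by extract() is not documented above, so the grouping sketch below assumes each entry is either a URL string or an object exposing a url property; groupByDomain is a made-up helper, and urls and currentUrl come from the Extract example.

const cup = require('crawler-url-parser');

//// hypothetical helper: bucket extracted links by registered domain
//// assumes each item is a URL string or an object with a `url` property
function groupByDomain(extracted, pageUrl) {
    let groups = {};
    for (let item of extracted) {
        let href = (typeof item === 'string') ? item : item.url;
        let parsed = cup.parse(href, pageUrl);
        (groups[parsed.domain] = groups[parsed.domain] || []).push(parsed.normalized);
    }
    return groups;
}

// using urls and currentUrl from the Extract example above
console.log(Object.keys(groupByDomain(urls, currentUrl))); // e.g. [ 'stackoverflow.com', 'google.com' ]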

Level

const cup = require('crawler-url-parser');

//// getlevel(current_url,base_url);
let level = cup.getlevel("sub.domain.com/aaa/bbb/","sub.domain.com/aaa/bbb/ccc");
console.log(level); //sublevel

level = cup.getlevel("sub.domain.com/aaa/bbb/ccc/ddd","sub.domain.com/aaa/bbb/ccc");
console.log(level); //uplevel

level = cup.getlevel("sub.domain.com/aaa/bbb/eee","sub.domain.com/aaa/bbb/ccc");
console.log(level); //samelevel

level = cup.getlevel("sub.domain.com/aaa/bbb/eee","sub.anotherdomain.com/aaa/bbb/ccc");
console.log(level); //null
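
Because getlevel() returns null when the two URLs do not share a host, it can also serve as a cheap same-site check before queueing a link. The sameSite helper below is only an illustration of that behaviour, not part of the package's API.

const cup = require('crawler-url-parser');

//// hypothetical helper: a non-null getlevel result means both URLs share a host
function sameSite(candidateUrl, currentUrl) {
    return cup.getlevel(candidateUrl, currentUrl) !== null;
}

console.log(sameSite("sub.domain.com/aaa/bbb/eee", "sub.domain.com/aaa/bbb/ccc")); // true
console.log(sameSite("sub.domain.com/aaa/bbb/eee", "sub.anotherdomain.com/aaa/bbb/ccc")); // false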

Query

const cup = require('crawler-url-parser');

//// querycount(url)
let count = cup.querycount("sub.domain.com/aaa/bbb?q1=data1&q2=data2&q3=data3");
console.log(count); //3
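
A common crawler heuristic is to skip URLs with many query parameters to avoid faceted-search or calendar traps, and querycount() makes that a one-liner. The tooManyParams helper and its threshold of 2 are arbitrary choices for this sketch.

const cup = require('crawler-url-parser');

//// hypothetical helper: skip URLs carrying more than `max` query parameters
function tooManyParams(url, max) {
    return cup.querycount(url) > max;
}

console.log(tooManyParams("sub.domain.com/aaa/bbb?q1=data1&q2=data2&q3=data3", 2)); // true
console.log(tooManyParams("sub.domain.com/aaa/bbb?q1=data1", 2)); // false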

Test

Run mocha or npm test.

More than 200 unit test cases. Check the test folder and QUICKSTART.js for more usage examples.
