ccht


Command-line Crawling HTTP Testing tool

ccht is a simple command-line tool that crawls your website and tests the HTTP status codes of its resources, like a broken link checker.

Installation

You can skip installation if you use npx for a one-time invocation.

$ npm i -D ccht

# or
$ yarn add -D ccht

Usage

ccht [options] <url>
# to crawl and test "https://example.com"
$ npx ccht 'https://example.com'

# to show help
$ npx ccht --help

ccht will crawl the site starting from the given URL.

Options

To see more options, run npx ccht --help.

Global Options

--crawler <name>

Choose the crawler. Available crawlers:

node-http

Default. Crawls pages by using Node.js' HTTP module and cheerio.

puppeteer

Crawls pages by using a real browser through Puppeteer. You need to install puppeteer (npm i -D puppeteer) or configure your environment (browser DevTools protocol connection, executable).
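For example, to crawl a site with the Puppeteer-backed crawler after installing puppeteer (the URL is illustrative):

$ npx ccht 'https://example.com' --crawler=puppeteer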

--reporter <name>

Specify the reporter, which formats and outputs the test results.

code-frame

Default. Outputs a human-friendly, visualized result.

json

Prints a JSON string. Useful for programmatic access to results.

$ npx ccht 'https://example.com' --reporter=json | jq

--include <urls>

A comma-separated list of URLs to include in the result. Any URL that forward-matches one of them will be crawled and reported.

Defaults to the given URL. For example, given npx ccht 'https://example.com', --include defaults to https://example.com.
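For example, to also crawl and report pages under an additional prefix (the second URL is illustrative):

$ npx ccht 'https://example.com' --include='https://example.com,https://blog.example.com'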

--exclude <urls>

A comma-separated list of URLs to exclude from the result. Any URL that forward-matches one of them will be skipped and removed from the result.
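For example, to skip everything under an illustrative admin path:

$ npx ccht 'https://example.com' --exclude='https://example.com/admin'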

--expected-status <HTTP status codes>

A comma-separated list of expected HTTP status codes for pages. Any page that responds with another status code results in an error (unexpected_status).

Defaults to 200.
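For example, to also treat redirects as expected (the status codes here are only an illustration):

$ npx ccht 'https://example.com' --expected-status=200,301,302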

--exit-error-severity <list of severity>

Change which severities cause exit status 1. Available severities are listed below:

  • danger
  • warning
  • info
  • debug

Defaults to danger.
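For example, to make warnings fail the run as well (assuming the list is comma-separated like the other list options):

$ npx ccht 'https://example.com' --exit-error-severity=danger,warning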

Crawler Options

--timeout <ms>

Timeout for each page to load/respond during the crawling phase. This value is passed directly to node-fetch or Puppeteer.

Defaults to 3000 (3s).
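For example, to allow each page up to 10 seconds (an illustrative value):

$ npx ccht 'https://example.com' --timeout=10000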

--concurrency <uint>

How many connections can exist at the same time, i.e. the size of the connection pool.

Defaults to 1.
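For example, to crawl with four parallel connections (an illustrative value):

$ npx ccht 'https://example.com' --concurrency=4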
