
@jbruni/linkinator
Find broken links, missing images, and more in your HTML. Scurry around your site and find all those broken links.
A super simple site crawler and broken link checker.
Behold my latest inator! The linkinator
provides an API and CLI for crawling websites and validating links.
$ npm install linkinator
You can use this as a library, or as a CLI. Let's see the CLI!
$ linkinator LOCATION [ --arguments ]
Positional arguments
LOCATION
Required. Either the URL or the path on disk to check for broken links.
Flags
--recurse, -r
Recursively follow links on the same root domain.
--skip, -s
List of URLs, in regex form, to exclude from the check.
--include, -i
List of URLs, in regex form, to include. The opposite of --skip.
--format, -f
Return the data in CSV or JSON format.
--help
Show help for this command.
You can run a shallow scan of a website for busted links:
$ npx linkinator http://jbeckwith.com
That was fun. What about local files? The linkinator will stand up a static web server for yinz:
$ npx linkinator ./docs
But that only gets the top level of links. Let's go deeper and do a full recursive scan!
$ npx linkinator ./docs --recurse
Aw, snap. I didn't want it to check those links. Let's skip 'em:
$ npx linkinator ./docs --skip www.googleapis.com
The --skip
parameter will accept any regex! You can do more complex matching, or even tell it to only scan links with a given domain:
$ linkinator http://jbeckwith.com --skip '^(?!http://jbeckwith.com)'
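Under the hood, these skip patterns are ordinary regular expressions tested against each URL. A minimal sketch of that matching logic (a hypothetical `shouldSkip` helper, not linkinator's actual code):

```javascript
// Hypothetical helper sketching how regex skip patterns can be applied;
// not linkinator's actual implementation.
function shouldSkip(url, patterns) {
  // A URL is skipped if any pattern string, compiled as a RegExp, matches it.
  return patterns.some(pattern => new RegExp(pattern).test(url));
}

// Mirrors the CLI example above: skip anything NOT on jbeckwith.com.
console.log(shouldSkip('http://example.com/page', ['^(?!http://jbeckwith.com)'])); // true
console.log(shouldSkip('http://jbeckwith.com/about', ['^(?!http://jbeckwith.com)'])); // false
```

The negative lookahead only ever matches at the start of the string, so same-domain URLs fall through and get scanned.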
Maybe you're going to pipe the output to another program. Use the --format
option to get JSON or CSV!
$ linkinator ./docs --format CSV
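If you're post-processing results in your own code instead, the per-link records are easy to serialize by hand. A sketch of building CSV-style rows, assuming each checked link is a `{ url, status, state }` record (the helper itself is hypothetical, not linkinator's formatter):

```javascript
// Hypothetical CSV serializer for link results; assumes each link is a
// { url, status, state } record. Not linkinator's actual formatter.
function toCsv(links) {
  const header = 'url,status,state';
  const rows = links.map(link => `${link.url},${link.status},${link.state}`);
  return [header, ...rows].join('\n');
}

console.log(toCsv([
  { url: 'http://example.com', status: 200, state: 'OK' },
  { url: 'http://example.com/missing', status: 404, state: 'BROKEN' }
]));
```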
linkinator.check(options)
Asynchronous method that runs a site wide scan. Options come in the form of an object that includes:
path
(string) - A fully qualified path to the URL to be scanned, or the path to the directory on disk that contains files to be scanned. Required.
port
(number) - When the path is provided as a local path on disk, the port on which to start the temporary web server. Defaults to a random high range port.
recurse
(boolean) - By default, all scans are shallow. Only the top level links on the requested page will be scanned. By setting recurse to true, the crawler will follow all links on the page, and continue scanning links on the same domain for as long as it can go. Results are cached, so no worries about loops.
linksToSkip
(array) - An array of regular expression strings that should be skipped during the scan.
linkinator.LinkChecker()
Constructor method that can be used to create a new LinkChecker instance. This is particularly useful if you want to receive events as the crawler crawls. Exposes the following events:
pagestart
(string) - Provides the url that the crawler has just started to scan.
link
(object) - Provides an object with:
url
(string) - The url that was scanned.
state
(string) - The result of the scan. Potential values include BROKEN, OK, or SKIPPED.
status
(number) - The HTTP status code of the request.
const link = require('linkinator');
async function simple() {
const results = await link.check({
path: 'http://example.com'
});
// To see if all the links passed, you can check `passed`
console.log(`Passed: ${results.passed}`);
// Show the list of scanned links and their results
console.log(results);
// Example output:
// {
// passed: true,
// links: [
// {
// url: 'http://example.com',
// status: 200,
// state: 'OK'
// },
// {
// url: 'http://www.iana.org/domains/example',
// status: 200,
// state: 'OK'
// }
// ]
// }
}
simple();
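Once you have a results object like the one above, summarizing it is just a matter of filtering on `state`. A sketch (hypothetical `summarize` helper, assuming the `{ passed, links }` shape shown in the example output):

```javascript
// Hypothetical helper that summarizes a results object shaped like the
// example output above: { passed, links: [{ url, status, state }] }.
function summarize(results) {
  const broken = results.links.filter(link => link.state === 'BROKEN');
  return {
    total: results.links.length,
    brokenUrls: broken.map(link => `${link.url} (${link.status})`),
    passed: broken.length === 0
  };
}

console.log(summarize({
  passed: false,
  links: [
    { url: 'http://example.com', status: 200, state: 'OK' },
    { url: 'http://example.com/missing', status: 404, state: 'BROKEN' }
  ]
}));
// { total: 2, brokenUrls: [ 'http://example.com/missing (404)' ], passed: false }
```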
In most cases you're going to want to respond to events, as running the check command can take a long time.
const link = require('linkinator');
async function complex() {
// create a new `LinkChecker` that we'll use to run the scan.
const checker = new link.LinkChecker();
// Respond to the beginning of a new page being scanned
checker.on('pagestart', url => {
console.log(`Scanning ${url}`);
});
// After a page is scanned, check out the results!
checker.on('link', result => {
// check the specific url that was scanned
console.log(` ${result.url}`);
// How did the scan go? Potential states are `BROKEN`, `OK`, and `SKIPPED`
console.log(` ${result.state}`);
// What was the status code of the response?
console.log(` ${result.status}`);
});
// Go ahead and start the scan! As events occur, we will see them above.
const result = await checker.check({
path: 'http://example.com',
// port: 8673,
// recurse: true,
// linksToSkip: [
// 'https://jbeckwith.com/some/link',
// 'http://example.com'
// ]
});
// Check to see if the scan passed!
console.log(result.passed ? 'PASSED :D' : 'FAILED :(');
// How many links did we scan?
console.log(`Scanned a total of ${result.links.length} links!`);
// The final result will contain the list of checked links, and the pass/fail
const brokenLinks = result.links.filter(x => x.state === 'BROKEN');
console.log(`Detected ${brokenLinks.length} broken links.`);
}
complex();