sitemap-generator - npm package version comparison

Comparing version 6.0.0 to 7.0.0

lib/createCrawler.js: 54 lines changed (diff not shown)

package.json

{
  "name": "sitemap-generator",
-  "version": "6.0.0",
+  "version": "7.0.0",
  "description": "Easily create XML sitemaps for your website.",

@@ -21,3 +21,3 @@ "homepage": "https://github.com/lgraubner/sitemap-generator",

  ],
-  "main": "lib/SitemapGenerator.js",
+  "main": "lib/index.js",
  "repository": {

@@ -31,9 +31,9 @@ "type": "git",

  "dependencies": {
-    "cheerio": "^0.22.0",
-    "lodash.assign": "^4.0.9",
-    "lodash.chunk": "4.2.0",
-    "lodash.forin": "^4.2.0",
-    "robots": "^0.9.5",
-    "simplecrawler": "1.0.5",
-    "xmlbuilder": "^8.2.2"
+    "async": "^2.5.0",
+    "babel-eslint": "^7.2.3",
+    "cheerio": "^1.0.0-rc.2",
+    "crypto-random-string": "^1.0.0",
+    "mitt": "^1.1.2",
+    "simplecrawler": "^1.1.4",
+    "url-parse": "^1.1.9"
  },

@@ -44,20 +44,36 @@ "engines": {

  "license": "MIT",
-  "pre-commit": [
-    "precommit-msg",
-    "lint"
+  "files": [
+    "lib",
+    "!**/__tests__"
  ],
  "devDependencies": {
-    "ava": "^0.18.2",
-    "eslint": "^3.16.1",
-    "eslint-config-graubnla": "^3.0.0",
+    "ava": "^0.21.0",
+    "babel-preset-es2015": "^6.24.1",
+    "eslint": "4.3.0",
+    "eslint-config-airbnb": "15.1.0",
+    "eslint-config-prettier": "^2.3.0",
+    "eslint-plugin-import": "2.7.0",
+    "eslint-plugin-jsx-a11y": "5.1.1",
+    "eslint-plugin-react": "7.1.0",
+    "husky": "^0.14.3",
+    "jest": "^20.0.4",
+    "lint-staged": "^4.0.2",
+    "lodash.isarray": "4.0.0",
+    "lodash.isobject": "^3.0.2",
+    "lodash.isstring": "4.0.1",
-    "pre-commit": "^1.2.2"
+    "pre-commit": "^1.2.2",
+    "prettier": "^1.5.3"
  },
  "scripts": {
-    "lint": "eslint SitemapGenerator.js",
-    "test": "npm run lint && ava test/all.js --timeout=10s",
-    "precommit-msg": "echo 'Pre-commit checks...' && exit 0"
+    "lint": "eslint lib",
+    "precommit": "lint-staged",
+    "test": "jest"
  },
+  "lint-staged": {
+    "*.js": [
+      "eslint",
+      "prettier --single-quote --trailing-comma es5 --write",
+      "git add"
+    ]
+  }
}
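The `main` entry moving from `lib/SitemapGenerator.js` to `lib/index.js` goes hand in hand with the API change visible in the README diff below: 7.0.0 exposes a factory function instead of a constructor. A minimal sketch of the difference (the URL is a placeholder):

```JavaScript
const SitemapGenerator = require('sitemap-generator'); // resolves via "main"

// 6.x: constructor, called with `new`
// const generator = new SitemapGenerator('http://example.com');

// 7.x: factory function, no `new`
const generator = SitemapGenerator('http://example.com');
```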
README.md

# Sitemap Generator
-[![Travis](https://img.shields.io/travis/lgraubner/sitemap-generator.svg)](https://travis-ci.org/lgraubner/sitemap-generator) [![David](https://img.shields.io/david/lgraubner/sitemap-generator.svg)](https://david-dm.org/lgraubner/sitemap-generator) [![David Dev](https://img.shields.io/david/dev/lgraubner/sitemap-generator.svg)](https://david-dm.org/lgraubner/sitemap-generator#info=devDependencies) [![npm](https://img.shields.io/npm/v/sitemap-generator-cli.svg)](https://www.npmjs.com/package/sitemap-generator)
+[![Travis](https://img.shields.io/travis/lgraubner/sitemap-generator.svg)](https://travis-ci.org/lgraubner/sitemap-generator) [![David](https://img.shields.io/david/lgraubner/sitemap-generator.svg)](https://david-dm.org/lgraubner/sitemap-generator) [![npm](https://img.shields.io/npm/v/sitemap-generator.svg)](https://www.npmjs.com/package/sitemap-generator)
> Easily create XML sitemaps for your website.
-## Installation
-```BASH
+Generates a sitemap by crawling your site. Uses streams to write the sitemap to your drive efficiently and runs asynchronously to avoid blocking the thread. It is capable of creating multiple sitemaps if a threshold is reached. Respects robots.txt and meta tags.
+## Table of contents
+- [Install](#install)
+- [Usage](#usage)
+- [API](#api)
+- [Options](#options)
+- [Events](#events)
+- [License](#license)
+## Install
+This module is available on [npm](https://www.npmjs.com/).
+```
$ npm install -S sitemap-generator
```
+This module runs only with Node.js and is not meant to be used in the browser.
+```JavaScript
+const SitemapGenerator = require('sitemap-generator');
+```
## Usage
```JavaScript
-var SitemapGenerator = require('sitemap-generator');
+const SitemapGenerator = require('sitemap-generator');
// create generator
-var generator = new SitemapGenerator('http://example.com');
+const generator = SitemapGenerator('http://example.com', {
+  stripQuerystring: false
+});
// register event listeners
-generator.on('done', function (sitemaps) {
-  console.log(sitemaps); // => array of generated sitemaps
+generator.on('done', () => {
+  // sitemaps created
});
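Pulling the added usage lines together, here is a hedged, minimal end-to-end sketch for 7.0.0 (the start URL is a placeholder; `start()` and the `error` event are documented in the API and Events sections below):

```JavaScript
const SitemapGenerator = require('sitemap-generator');

// create the generator for a placeholder site
const generator = SitemapGenerator('http://example.com', {
  stripQuerystring: false
});

// register event listeners
generator.on('done', () => {
  // sitemaps created
});

generator.on('error', (error) => {
  console.log(error); // e.g. { code: 404, message: 'Not found.', url: '...' }
});

// start the crawler
generator.start();
```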

@@ -31,2 +52,18 @@

+## API
+The generator offers straightforward methods to start and stop it. `getStatus` also offers a way to query the current status, since the crawler runs asynchronously.
+### start
+Starts the crawler asynchronously and writes the sitemap to disk.
+### stop
+Stops the running crawler and halts the sitemap generation.
+### getStatus
+Returns the status of the generator. Possible values are `waiting`, `started`, `stopped` and `done`.
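Taken together, these three methods might be combined as in the following sketch; the polling loop is illustrative only and not part of the documented API:

```JavaScript
const SitemapGenerator = require('sitemap-generator');

const generator = SitemapGenerator('http://example.com');
generator.start(); // crawl runs asynchronously

// poll until the crawl finishes (illustrative only)
const timer = setInterval(() => {
  const status = generator.getStatus(); // 'waiting', 'started', 'stopped' or 'done'
  console.log('status:', status);
  if (status === 'done') {
    clearInterval(timer);
  }
}, 1000);

// generator.stop() would halt the crawler and sitemap generation early
```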
## Options

@@ -37,26 +74,24 @@

```JavaScript
-var generator = new SitemapGenerator('http://example.com', {
-  restrictToBasepath: false,
-  stripQuerystring: true,
-  crawlerMaxDepth: 0,
+var generator = SitemapGenerator('http://example.com', {
+  crawlerMaxDepth: 0,
+  filepath: path.join(process.cwd(), 'sitemap.xml'),
+  maxEntriesPerFile: 50000,
+  stripQuerystring: true
});
```
Since version 5, port is not an option anymore. If you are using the default ports for http/https you are fine. If you are using a custom port, just append it to the URL.
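For example, a non-default port simply becomes part of the start URL (port 8080 here is a placeholder):

```JavaScript
// no `port` option; the custom port lives in the URL itself
const generator = SitemapGenerator('http://example.com:8080');
```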
-### restrictToBasepath
-Type: `boolean`
-Default: `false`
-If you specify a URL with a path (e.g. `example.com/foo/`) and this option is set to `true`, the crawler will only fetch URLs matching `example.com/foo/*`. Otherwise it could also fetch `example.com` in case a link to this URL is provided.
+### crawlerMaxDepth
+Type: `number`
+Default: `0`
+Defines a maximum distance from the original request at which resources will be fetched.
-### stripQueryString
-Type: `boolean`
-Default: `true`
-Whether to treat URLs with query strings like `http://www.example.com/?foo=bar` as individual sites and to add them to the sitemap.
+### filepath
+Type: `string`
+Default: `./sitemap.xml`
+Filepath for the new sitemap. If multiple sitemaps are created, "part_$index" is appended to each filename.
### maxEntriesPerFile

@@ -67,21 +102,21 @@

-Google limits the maximum number of URLs in one sitemap to 50000. If this limit is reached, the sitemap-generator creates another sitemap. In that case the first entry of the `sitemaps` array is a sitemapindex file.
+Google limits the maximum number of URLs in one sitemap to 50000. If this limit is reached, the sitemap-generator creates another sitemap. A sitemap index file will be created as well.
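As a rough illustration of the splitting behaviour (the URL count and resulting filenames are hypothetical, derived from the options described above):

```JavaScript
const path = require('path');
const SitemapGenerator = require('sitemap-generator');

// 120000 discovered URLs at 50000 entries per file => 3 sitemap parts
const generator = SitemapGenerator('http://example.com', {
  maxEntriesPerFile: 50000,
  filepath: path.join(process.cwd(), 'sitemap.xml')
});
// per the filepath option above, "part_$index" is appended to each
// part's filename, and an index file references the parts
```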
-### crawlerMaxDepth
-Type: `number`
-Default: `0`
-Defines a maximum distance from the original request at which resources will be fetched.
+### stripQueryString
+Type: `boolean`
+Default: `true`
+Whether to treat URLs with query strings like `http://www.example.com/?foo=bar` as individual sites and add them to the sitemap.
## Events
-The Sitemap Generator emits several events using Node's `EventEmitter`.
+The Sitemap Generator emits several events which can be listened to.
-### `fetch`
-Triggered when the crawler tries to fetch a resource. Passes the status and the url as arguments. The status can be any HTTP status.
+### `add`
+Triggered when the crawler successfully adds a resource to the sitemap. Passes the url as argument.
```JavaScript
-generator.on('fetch', function (status, url) {
+generator.on('add', (url) => {
// log url

@@ -91,30 +126,35 @@ });

-### `ignore`
-If a URL matches a disallow rule in the `robots.txt` file, this event is triggered. The URL will not be added to the sitemap. Passes the ignored url as argument.
+### `done`
+Triggered when the crawler has finished and the sitemap is created. Passes the created sitemaps as callback argument. The second argument provides an object containing found URLs, ignored URLs and faulty URLs.
```JavaScript
-generator.on('ignore', function (url) {
-  // log ignored url
+generator.on('done', () => {
+  // sitemaps created
});
```
-### `clienterror`
-Thrown if there was an error on the client side while fetching a URL. Passes the crawler error and additional error data as arguments.
+### `error`
+Thrown if there was an error while fetching a URL. Passes an object with the http status code, a message and the url as argument.
```JavaScript
-generator.on('clienterror', function (queueError, errorData) {
-  // log error
+generator.on('error', (error) => {
+  console.log(error);
+  // => { code: 404, message: 'Not found.', url: 'http://example.com/foo' }
});
```
-### `done`
-Triggered when the crawler has finished and the sitemap is created. Passes the created sitemaps as callback argument. The second argument provides an object containing found URLs, ignored URLs and faulty URLs.
+### `ignore`
+If a URL matches a disallow rule in the `robots.txt` file or a meta robots noindex is present, this event is triggered. The URL will not be added to the sitemap. Passes the ignored url as argument.
```JavaScript
-generator.on('done', function (sitemaps, store) {
-  // do something with the sitemaps, e.g. save as file
+generator.on('ignore', (url) => {
+  // log ignored url
});
```
## License
[MIT](https://github.com/lgraubner/sitemap-generator/blob/master/LICENSE) © [Lars Graubner](https://larsgraubner.com)