sitemap-generator-cli
Comparing version 4.1.0 to 4.2.0
package.json

{
  "name": "sitemap-generator-cli",
-  "version": "4.1.0",
+  "version": "4.2.0",
  "description": "Create xml sitemaps from the command line.",
@@ -21,3 +21,3 @@ "homepage": "https://github.com/lgraubner/sitemap-generator-cli",
  ],
-  "main": "cli.js",
+  "main": "index.js",
  "repository": {
@@ -31,22 +31,59 @@ "type": "git",
  "dependencies": {
-    "chalk": "^1.1.3",
-    "commander": "^2.9.0",
-    "sitemap-generator": "^4.1.1"
+    "chalk": "2.4.1",
+    "commander": "2.19.0",
+    "sitemap-generator": "^8.3.2"
  },
+  "devDependencies": {
+    "eslint": "5.9.0",
+    "execa": "1.0.0",
+    "husky": "1.2.0",
+    "jest": "23.6.0",
+    "lint-staged": "8.1.0",
+    "prettier": "1.15.3"
+  },
  "preferGlobal": true,
  "engines": {
-    "node": ">=0.12"
+    "node": ">=6"
  },
  "bin": {
-    "sitemap-generator": "cli.js"
+    "sitemap-generator": "index.js"
  },
  "license": "MIT",
-  "devDependencies": {
-    "ava": "^0.15.2",
-    "eslint": "^3.0.0",
-    "eslint-config-graubnla": "^3.0.0"
-  },
-  "scripts": {
-    "test": "eslint cli.js && NODE_ENV=development ava test/cli.js"
-  }
+  "scripts": {
+    "lint": "eslint .",
+    "test": "jest"
+  },
+  "lint-staged": {
+    "*.js": [
+      "eslint --fix",
+      "prettier --write",
+      "git add"
+    ]
+  },
+  "prettier": {
+    "singleQuote": true
+  },
+  "eslintConfig": {
+    "parserOptions": {
+      "ecmaVersion": 6
+    },
+    "extends": [
+      "eslint:recommended"
+    ],
+    "env": {
+      "jest": true,
+      "node": true
+    },
+    "rules": {
+      "no-console": 0
+    }
+  },
+  "jest": {
+    "testEnvironment": "node"
+  },
+  "husky": {
+    "hooks": {
+      "pre-commit": "lint-staged"
+    }
+  }
}
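The engines bump to Node >= 6 and the bin move from `cli.js` to `index.js` mean an existing global install should be reinstalled so the bin link points at the new entry point. A minimal check, assuming the published `@4.2.0` tag (the version specifier is the only detail not taken from the diff above):

```BASH
# reinstall globally so the sitemap-generator bin resolves to index.js
npm install -g sitemap-generator-cli@4.2.0

node --version              # should report 6.x or later, per the new engines field
sitemap-generator --version # --version is listed in the new --help output below
```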
README.md

@@ -1,71 +1,104 @@
-# Node Sitemap Generator
+# Sitemap Generator CLI

-[![Travis](https://img.shields.io/travis/lgraubner/sitemap-generator-cli.svg)](https://travis-ci.org/lgraubner/sitemap-generator-cli) [![David](https://img.shields.io/david/lgraubner/sitemap-generator-cli.svg)](https://david-dm.org/lgraubner/sitemap-generator-cli) [![David Dev](https://img.shields.io/david/dev/lgraubner/sitemap-generator-cli.svg)](https://david-dm.org/lgraubner/sitemap-generator-cli#info=devDependencies) [![npm](https://img.shields.io/npm/v/sitemap-generator-cli.svg)](https://www.npmjs.com/package/sitemap-generator-cli)
+[![Travis](https://img.shields.io/travis/lgraubner/sitemap-generator-cli.svg)](https://travis-ci.org/lgraubner/sitemap-generator-cli) [![David](https://img.shields.io/david/lgraubner/sitemap-generator-cli.svg)](https://david-dm.org/lgraubner/sitemap-generator-cli) [![npm](https://img.shields.io/npm/v/sitemap-generator-cli.svg)](https://www.npmjs.com/package/sitemap-generator-cli)

> Create xml sitemaps from the command line.

+Generates a sitemap by crawling your site. Uses streams to efficiently write the sitemap to your drive. Is capable of creating multiple sitemaps if a threshold is reached. Respects robots.txt and meta tags.
+
+## Table of contents
+
+- [Install](#install)
+- [Usage](#usage)
+- [Options](#options)
+- [License](#license)

-## Installation
+## Install

This module is available on [npm](https://www.npmjs.com/).

```BASH
-$ npm install -g sitemap-generator-cli
+npm install -g sitemap-generator-cli
+
+# or execute it directly with npx (since npm v5.2)
+npx sitemap-generator-cli https://example.com
```

## Usage

```BASH
-$ sitemap-generator [options] <url>
+sitemap-generator [options] <url>
```

The crawler will fetch all folder URL pages and file types [parsed by Google](https://support.google.com/webmasters/answer/35287?hl=en). If present, the `robots.txt` is taken into account and its rules are applied to each URL to decide whether it should be added to the sitemap. The crawler will also not fetch URLs from a page if the robots meta tag with the value `nofollow` is present, and will ignore pages completely if the `noindex` rule is present. The crawler is able to apply the `base` value to found links.

-When the crawler has finished, the XML sitemap will be built and printed directly to your console. Redirect the output to save the sitemap as a file or do something else:
-
-```BASH
-$ sitemap-generator http://example.com > sitemap.xml
-```
-
-To save it in a subfolder simply provide a relative path. You can pick any filename you want.
+When the crawler has finished, the XML sitemap will be built and saved to your specified filepath. If the count of fetched pages is greater than 50000 it will be split into several sitemap files and a sitemapindex file will be created. Google does not allow more than 50000 items in one sitemap.

Example:

```BASH
-$ sitemap-generator http://example.com > ./subfolder/mysitemap.xml
+sitemap-generator http://example.com
```
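Note the behavioral change documented above: 4.1.x printed the sitemap to stdout and relied on shell redirection, while 4.2.0 writes the file itself. A before/after sketch (paths are illustrative; `--filepath` is documented in the options below):

```BASH
# 4.1.x: capture stdout yourself
sitemap-generator http://example.com > sitemap.xml

# 4.2.0: the CLI writes sitemap.xml itself; use --filepath to pick the location
sitemap-generator --filepath ./public/sitemap.xml http://example.com
```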
## Options

```BASH
-$ sitemap-generator --help
+sitemap-generator --help

-Usage: sitemap-generator [options] <url>
+Usage: cli [options] <url>

Options:

-  -h, --help       output usage information
-  -V, --version    output the version number
-  -b, --baseurl    only allow URLs which match given <url>
-  -d, --dry        show status messages without generating a sitemap
-  -q, --query      consider query string
+  -V, --version                            output the version number
+  -f, --filepath <filepath>                path to file including filename (default: sitemap.xml)
+  -m, --max-entries <maxEntries>           limits the maximum number of URLs per sitemap file (default: 50000)
+  -d, --max-depth <maxDepth>               limits the maximum distance from the original request (default: 0)
+  -q, --query                              consider query string
+  -u, --user-agent <agent>                 set custom User Agent
+  -v, --verbose                            print details when crawling
+  -c, --max-concurrency <maxConcurrency>   maximum number of requests the crawler will run simultaneously (default: 5)
+  -r, --no-respect-robots-txt              controls whether the crawler should respect rules in robots.txt
+  -h, --help                               output usage information
```
-Example:
-
-```Bash
-# strictly match given path and consider query string
-$ sitemap-generator -bq example.com/foo/
-```
-
-### `--baseurl`
-
-Default: `false`
-
-If you specify a URL with a path (e.g. `http://example.com/foo/`) and this option is set to `true`, the crawler will only fetch URLs matching `example.com/foo/*`. Otherwise it could also fetch `example.com` in case a link to this URL is provided.
-
-### `--dry`
-
-Default: `false`
-
-Use this option to make a dry run and check the generation process to see which sites are fetched and if there are any errors. Will not create a sitemap!
-
-### `--query`
-
-Default: `false`
+### filepath
+
+Path to the file to write, including the filename itself. The path can be absolute or relative. Default is `sitemap.xml`.
+
+Examples:
+
+- `sitemap.xml`
+- `mymap.xml`
+- `/var/www/sitemap.xml`
+- `./sitemap.myext`
+
+### maxConcurrency
+
+Sets the maximum number of requests the crawler will run simultaneously (default: 5).
+
+### maxEntries
+
+Define a limit of URLs per sitemap file, useful for sites with lots of URLs. Defaults to 50000.
+
+### maxDepth
+
+Set a maximum distance from the original request to crawl URLs, useful for generating smaller `sitemap.xml` files. Defaults to 0, which means it will crawl all levels.
+
+### noRespectRobotsTxt
+
+Controls whether the crawler should respect rules in robots.txt.
+
+### query

Consider URLs with query strings like `http://www.example.com/?foo=bar` as individual sites and add them to the sitemap.

+### user-agent
+
+Set a custom User Agent used for crawling. Default is `Node/SitemapGenerator`.
+
+### verbose
+
+Print debug messages during the crawling process. Also prints a summary when finished.
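The options compose on a single command line. A sketch combining several of the flags documented above (all values and the User Agent string are illustrative):

```BASH
sitemap-generator \
  --filepath ./sitemap.xml \
  --max-entries 10000 \
  --max-depth 3 \
  --query \
  --user-agent "MyCrawler/1.0" \
  --verbose \
  https://example.com
```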
## License | ||
[MIT](https://github.com/lgraubner/sitemap-generator/blob/master/LICENSE) © [Lars Graubner](https://larsgraubner.com) |
Supply chain risk flags (each found in 1 instance in 1 package):

Major refactor: the package has recently undergone a major refactor. It may be unstable or indicate significant internal changes. Use caution when updating to versions that include significant changes.

Filesystem access: accesses the file system, and could potentially read sensitive data.

Network access: this module accesses the network.

Shell access: this module accesses the system shell, which increases the risk of executing arbitrary code.

Environment variable access: the package accesses environment variables, which may be a sign of credential stuffing or data theft.
+ Added @types/node@22.13.1 (transitive)
+ Added ansi-styles@3.2.1 (transitive)
+ Added async@2.6.1, 3.2.6 (transitive)
+ Added chalk@2.4.1 (transitive)
+ Added cheerio@1.0.0-rc.2 (transitive)
+ Added color-convert@1.9.3 (transitive)
+ Added color-name@1.1.3 (transitive)
+ Added commander@2.19.0 (transitive)
+ Added cp-file@6.0.0 (transitive)
+ Added crypto-random-string@1.0.0 (transitive)
+ Added date-fns@1.29.0 (transitive)
+ Added domhandler@2.4.2 (transitive)
+ Added graceful-fs@4.2.11 (transitive)
+ Added has-flag@3.0.0 (transitive)
+ Added htmlparser2@3.10.1 (transitive)
+ Added iconv-lite@0.5.2 (transitive)
+ Added lodash@4.17.20 (transitive)
+ Added make-dir@1.3.0 (transitive)
+ Added mitt@1.1.3 (transitive)
+ Added nested-error-stacks@2.1.1 (transitive)
+ Added normalize-url@3.3.0 (transitive)
+ Added parse5@3.0.3 (transitive)
+ Added pify@3.0.0 (transitive)
+ Added querystringify@2.2.0 (transitive)
+ Added readable-stream@3.6.2 (transitive)
+ Added requires-port@1.0.0 (transitive)
+ Added robots-parser@2.4.0 (transitive)
+ Added simplecrawler@1.1.9 (transitive)
+ Added sitemap-generator@8.5.1 (transitive)
+ Added string_decoder@1.3.0 (transitive)
+ Added supports-color@5.5.0 (transitive)
+ Added undici-types@6.20.0 (transitive)
+ Added url-parse@1.4.7 (transitive)
+ Added util-deprecate@1.0.2 (transitive)
- Removed abab@1.0.4 (transitive)
- Removed acorn@2.7.0 (transitive)
- Removed acorn-globals@1.0.9 (transitive)
- Removed ajv@6.12.6 (transitive)
- Removed ansi-regex@2.1.1 (transitive)
- Removed ansi-styles@2.2.1 (transitive)
- Removed asn1@0.2.6 (transitive)
- Removed assert-plus@1.0.0 (transitive)
- Removed asynckit@0.4.0 (transitive)
- Removed aws-sign2@0.7.0 (transitive)
- Removed aws4@1.13.2 (transitive)
- Removed bcrypt-pbkdf@1.0.2 (transitive)
- Removed caseless@0.12.0 (transitive)
- Removed chalk@1.1.3 (transitive)
- Removed cheerio@0.20.0 (transitive)
- Removed combined-stream@1.0.8 (transitive)
- Removed commander@2.20.3 (transitive)
- Removed core-util-is@1.0.2, 1.0.3 (transitive)
- Removed cssom@0.3.8 (transitive)
- Removed cssstyle@0.2.37 (transitive)
- Removed dashdash@1.14.1 (transitive)
- Removed deep-is@0.1.4 (transitive)
- Removed delayed-stream@1.0.0 (transitive)
- Removed domhandler@2.3.0 (transitive)
- Removed ecc-jsbn@0.1.2 (transitive)
- Removed entities@1.0.0 (transitive)
- Removed escodegen@1.14.3 (transitive)
- Removed esprima@4.0.1 (transitive)
- Removed estraverse@4.3.0 (transitive)
- Removed esutils@2.0.3 (transitive)
- Removed extend@3.0.2 (transitive)
- Removed extsprintf@1.3.0 (transitive)
- Removed fast-deep-equal@3.1.3 (transitive)
- Removed fast-json-stable-stringify@2.1.0 (transitive)
- Removed fast-levenshtein@2.0.6 (transitive)
- Removed forever-agent@0.6.1 (transitive)
- Removed form-data@2.3.3 (transitive)
- Removed getpass@0.1.7 (transitive)
- Removed har-schema@2.0.0 (transitive)
- Removed har-validator@5.1.5 (transitive)
- Removed has-ansi@2.0.0 (transitive)
- Removed htmlparser2@3.8.3 (transitive)
- Removed http-signature@1.2.0 (transitive)
- Removed iconv-lite@0.4.24 (transitive)
- Removed is-typedarray@1.0.0 (transitive)
- Removed isarray@0.0.1 (transitive)
- Removed isstream@0.1.2 (transitive)
- Removed jsbn@0.1.1 (transitive)
- Removed jsdom@7.2.2 (transitive)
- Removed json-schema@0.4.0 (transitive)
- Removed json-schema-traverse@0.4.1 (transitive)
- Removed json-stringify-safe@5.0.1 (transitive)
- Removed jsprim@1.4.2 (transitive)
- Removed levn@0.3.0 (transitive)
- Removed lodash@4.17.21 (transitive)
- Removed lodash.assign@4.2.0 (transitive)
- Removed lodash.forin@4.4.0 (transitive)
- Removed mime-db@1.52.0 (transitive)
- Removed mime-types@2.1.35 (transitive)
- Removed nwmatcher@1.4.4 (transitive)
- Removed oauth-sign@0.9.0 (transitive)
- Removed optionator@0.8.3 (transitive)
- Removed parse5@1.5.1 (transitive)
- Removed performance-now@2.1.0 (transitive)
- Removed prelude-ls@1.1.2 (transitive)
- Removed psl@1.15.0 (transitive)
- Removed punycode@2.3.1 (transitive)
- Removed qs@6.5.3 (transitive)
- Removed readable-stream@1.1.14 (transitive)
- Removed request@2.88.2 (transitive)
- Removed robots@0.9.5 (transitive)
- Removed sax@1.4.1 (transitive)
- Removed simplecrawler@0.7.0 (transitive)
- Removed sitemap-generator@4.1.1 (transitive)
- Removed source-map@0.6.1 (transitive)
- Removed sshpk@1.18.0 (transitive)
- Removed string_decoder@0.10.31 (transitive)
- Removed strip-ansi@3.0.1 (transitive)
- Removed supports-color@2.0.0 (transitive)
- Removed symbol-tree@3.2.4 (transitive)
- Removed tough-cookie@2.5.0 (transitive)
- Removed tr46@0.0.3 (transitive)
- Removed tunnel-agent@0.6.0 (transitive)
- Removed tweetnacl@0.14.5 (transitive)
- Removed type-check@0.3.2 (transitive)
- Removed uri-js@4.4.1 (transitive)
- Removed uuid@3.4.0 (transitive)
- Removed verror@1.10.0 (transitive)
- Removed webidl-conversions@2.0.1 (transitive)
- Removed whatwg-url-compat@0.6.5 (transitive)
- Removed word-wrap@1.2.5 (transitive)
- Removed xml-name-validator@2.0.1 (transitive)
- Removed xmlbuilder@8.2.2 (transitive)
Updated chalk@2.4.1
Updated commander@2.19.0
Updated sitemap-generator@^8.3.2
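After updating, the resolved versions can be compared against the lists above; `npm ls` prints what actually landed in the tree (a sketch, assuming a global install):

```BASH
npm ls -g sitemap-generator-cli sitemap-generator chalk commander
```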