markdownlint
Comparing version 0.27.0 to 0.28.0
@@ -16,5 +16,5 @@ # Contributing
-Do not add new [`dependencies` to `package.json`][dependencies]. The excellent
-Markdown parser [markdown-it][markdown-it] is this project's one and only
-dependency.
+Do not add new [`dependencies` to `package.json`][dependencies]. The Markdown
+parsers [`markdown-it`][markdown-it] and [`micromark`][micromark] are the
+project's only dependencies.
@@ -84,4 +84,5 @@ If developing a new rule, start by creating a [custom rule][custom-rules] in its
 [markdown-it]: https://www.npmjs.com/package/markdown-it
+[micromark]: https://www.npmjs.com/package/micromark
 [new-rule]: https://github.com/DavidAnson/markdownlint/labels/new%20rule
 [npm-scripts]: https://docs.npmjs.com/misc/scripts
 [rewriting-history]: https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History
@@ -37,3 +37,3 @@ # Custom Rules
   "function": function rule(params, onError) {
-    params.tokens.filter(function filterToken(token) {
+    params.parsers.markdownit.tokens.filter(function filterToken(token) {
       return token.type === "blockquote_open";
@@ -124,4 +124,2 @@ }).forEach(function forToken(blockquote) {
 - Packages should export a single rule object or an `Array` of rule objects
-- [Custom rules from the Microsoft/vscode-docs-authoring
-  repository][vscode-docs-authoring]
 - [Custom rules from the axibase/docs-util repository][docs-util]
@@ -379,2 +377,1 @@ - [Custom rules from the webhintio/hint repository][hint]
 [test-rules-npm]: ../test/rules/npm
-[vscode-docs-authoring]: https://github.com/microsoft/vscode-docs-authoring/tree/main/packages/docs-linting/markdownlint-custom-rules
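The documented change above moves markdown-it tokens from `params.tokens` to `params.parsers.markdownit.tokens`. The following is a minimal, self-contained sketch of a custom rule using the new location; the `params` object and its token list are hand-built mocks standing in for what markdownlint would pass at lint time, and the rule name is hypothetical.

```javascript
// Sketch of a custom rule reading params.parsers.markdownit.tokens
// (the 0.28.0 token location). Everything below is mocked for illustration.
const rule = {
  "names": [ "sketch-no-blockquote" ],
  "description": "Reports every blockquote",
  "tags": [ "example" ],
  "function": function rule(params, onError) {
    params.parsers.markdownit.tokens
      .filter((token) => token.type === "blockquote_open")
      .forEach((token) => onError({ "lineNumber": token.lineNumber }));
  }
};

// Mock invocation with hand-built markdown-it-style tokens
const errors = [];
const params = {
  "parsers": {
    "markdownit": {
      "tokens": [
        { "type": "paragraph_open", "lineNumber": 1 },
        { "type": "blockquote_open", "lineNumber": 3 }
      ]
    }
  }
};
rule.function(params, (info) => errors.push(info));
console.log(errors); // one error, for line 3
```

A rule written this way only needs the `params.tokens` → `params.parsers.markdownit.tokens` substitution to migrate from 0.27.0 to 0.28.0.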
@@ -9,24 +9,24 @@ # `MD034` - Bare URL used
-This rule is triggered whenever a URL is given that isn't surrounded by angle
-brackets:
+This rule is triggered whenever a URL or email address appears without
+surrounding angle brackets:
 ```markdown
-For more information, see https://www.example.com/.
+For more info, visit https://www.example.com/ or email user@example.com.
 ```
-To fix this, add angle brackets around the URL:
+To fix this, add angle brackets around the URL or email address:
 ```markdown
-For more information, see <https://www.example.com/>.
+For more info, visit <https://www.example.com/> or email <user@example.com>.
 ```
-Note: To use a bare URL without it being converted into a link, enclose it in
-a code block, otherwise in some Markdown parsers it *will* be converted:
+Note: To include a bare URL or email without it being converted into a link,
+wrap it in a code span:
 ```markdown
-`https://www.example.com`
+Not a clickable link: `https://www.example.com`
 ```
-Note: The following scenario does *not* trigger this rule to avoid conflicts
-with `MD011`/`no-reversed-links`:
+Note: The following scenario does not trigger this rule because it could be a
+shortcut link:
@@ -37,10 +37,16 @@ ```markdown
-The use of quotes around a bare link will *not* trigger this rule, either:
+Note: The following syntax triggers this rule because the nested link could be
+a shortcut link (which takes precedence):
 ```markdown
-"https://www.example.com"
-'https://www.example.com'
+[text [shortcut] text](https://example.com)
 ```
-Rationale: Without angle brackets, the URL isn't converted into a link by many
-Markdown parsers.
+To avoid this, escape both inner brackets:
+```markdown
+[link \[text\] link](https://example.com)
+```
+Rationale: Without angle brackets, a bare URL or email isn't converted into a
+link by some Markdown parsers.
@@ -50,2 +50,2 @@ # `MD051` - Link fragments should be valid
 [github-section-links]: https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax#section-links
-[github-heading-algorithm]: https://github.com/gjtorikian/html-pipeline/blob/main/lib/html/pipeline/toc_filter.rb
+[github-heading-algorithm]: https://github.com/gjtorikian/html-pipeline/blob/f13a1534cb650ba17af400d1acd3a22c28004c09/lib/html/pipeline/toc_filter.rb
@@ -1413,24 +1413,24 @@ # Rules
-This rule is triggered whenever a URL is given that isn't surrounded by angle
-brackets:
+This rule is triggered whenever a URL or email address appears without
+surrounding angle brackets:
 ```markdown
-For more information, see https://www.example.com/.
+For more info, visit https://www.example.com/ or email user@example.com.
 ```
-To fix this, add angle brackets around the URL:
+To fix this, add angle brackets around the URL or email address:
 ```markdown
-For more information, see <https://www.example.com/>.
+For more info, visit <https://www.example.com/> or email <user@example.com>.
 ```
-Note: To use a bare URL without it being converted into a link, enclose it in
-a code block, otherwise in some Markdown parsers it *will* be converted:
+Note: To include a bare URL or email without it being converted into a link,
+wrap it in a code span:
 ```markdown
-`https://www.example.com`
+Not a clickable link: `https://www.example.com`
 ```
-Note: The following scenario does *not* trigger this rule to avoid conflicts
-with `MD011`/`no-reversed-links`:
+Note: The following scenario does not trigger this rule because it could be a
+shortcut link:
@@ -1441,12 +1441,18 @@ ```markdown
-The use of quotes around a bare link will *not* trigger this rule, either:
+Note: The following syntax triggers this rule because the nested link could be
+a shortcut link (which takes precedence):
 ```markdown
-"https://www.example.com"
-'https://www.example.com'
+[text [shortcut] text](https://example.com)
 ```
-Rationale: Without angle brackets, the URL isn't converted into a link by many
-Markdown parsers.
+To avoid this, escape both inner brackets:
+```markdown
+[link \[text\] link](https://example.com)
+```
+Rationale: Without angle brackets, a bare URL or email isn't converted into a
+link by some Markdown parsers.
 <a name="md035"></a>
@@ -2189,3 +2195,3 @@
 [github-section-links]: https://docs.github.com/en/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax#section-links
-[github-heading-algorithm]: https://github.com/gjtorikian/html-pipeline/blob/main/lib/html/pipeline/toc_filter.rb
+[github-heading-algorithm]: https://github.com/gjtorikian/html-pipeline/blob/f13a1534cb650ba17af400d1acd3a22c28004c09/lib/html/pipeline/toc_filter.rb
@@ -2192,0 +2198,0 @@ <a name="md052"></a>
@@ -5,2 +5,4 @@ // @ts-check
+const micromark = require("./micromark.cjs");
 // Regular expression for matching common newline characters
@@ -37,6 +39,2 @@ // See NEWLINES_RE in markdown-it/lib/rules_core/normalize.js
-// Regular expression for reference links (full, collapsed, and shortcut)
-const referenceLinkRe =
-  /!?\\?\[((?:\[[^\]\0]*\]|[^[\]\0])*)\](?:\[([^\]\0]*)\]|([^(])|$)/g;
 // Regular expression for link reference definitions
@@ -198,15 +196,2 @@ const linkReferenceDefinitionRe = /^ {0,3}\[([^\]]*[^\\])\]:/;
-// Un-escapes Markdown content (simple algorithm; not a parser)
-const escapedMarkdownRe = /\\./g;
-module.exports.unescapeMarkdown =
-  function unescapeMarkdown(markdown, replacement) {
-    return markdown.replace(escapedMarkdownRe, (match) => {
-      const char = match[1];
-      if ("!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~".includes(char)) {
-        return replacement || char;
-      }
-      return match;
-    });
-  };
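For context, the `unescapeMarkdown` helper removed in this hunk behaved as follows. The function body is copied from the 0.27.0 source shown above; the sample inputs are illustrative.

```javascript
// Behavior of the unescapeMarkdown helper removed in 0.28.0 (copied from
// the 0.27.0 source above): a backslash-escaped punctuation character is
// un-escaped, or swapped for a replacement character when one is given;
// escapes of non-punctuation characters are left alone.
const escapedMarkdownRe = /\\./g;
function unescapeMarkdown(markdown, replacement) {
  return markdown.replace(escapedMarkdownRe, (match) => {
    const char = match[1];
    if ("!\"#$%&'()*+,-./:;<=>?@[\\]^_`{|}~".includes(char)) {
      return replacement || char;
    }
    return match;
  });
}

console.log(unescapeMarkdown("\\*not emphasis\\*")); // *not emphasis*
console.log(unescapeMarkdown("\\*hidden\\*", "_")); // _hidden_
```

Callers that still need this behavior after upgrading can inline the function as above, since it has no other dependencies.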
 /**
@@ -294,3 +279,3 @@ * Return the string representation of a fence markup character.
 function filterTokens(params, type, handler) {
-  for (const token of params.tokens) {
+  for (const token of params.parsers.markdownit.tokens) {
     if (token.type === type) {
@@ -350,3 +335,3 @@ handler(token);
   });
-  for (const token of params.tokens.filter(isMathBlock)) {
+  for (const token of params.parsers.markdownit.tokens.filter(isMathBlock)) {
     for (let i = token.map[0]; i < token.map[1]; i++) {
@@ -445,3 +430,3 @@ lineMetadata[i][7] = true;
   let heading = null;
-  for (const token of params.tokens) {
+  for (const token of params.parsers.markdownit.tokens) {
     if (token.type === "heading_open") {
@@ -827,133 +812,113 @@ heading = token;
*
* @param {Object} lineMetadata Line metadata object.
* @param {Object} params RuleParams instance.
* @returns {Object} Reference link/image data.
*/
function getReferenceLinkImageData(lineMetadata) {
// Initialize return values
function getReferenceLinkImageData(params) {
const normalizeReference = (s) => s.toLowerCase().trim().replace(/\s+/g, " ");
const definitions = new Map();
const definitionLineIndices = [];
const duplicateDefinitions = [];
const references = new Map();
const shortcuts = new Map();
const definitions = new Map();
const duplicateDefinitions = [];
const definitionLineIndices = [];
// Define helper functions
const normalizeLabel = (s) => s.toLowerCase().trim().replace(/\s+/g, " ");
const exclusions = [];
const excluded = (match) => withinAnyRange(
exclusions, 0, match.index, match[0].length - (match[3] || "").length
);
// Convert input to single-line so multi-line links/images are easier
const lineOffsets = [];
let currentOffset = 0;
const contentLines = [];
forEachLine(lineMetadata, (line, lineIndex, inCode) => {
lineOffsets[lineIndex] = currentOffset;
if (!inCode) {
line = line.replace(blockquotePrefixRe, "");
if (line.trim().length === 0) {
// Allow RegExp to detect the end of a block
line = "\0";
}
contentLines.push(line);
currentOffset += line.length + 1;
}
});
lineOffsets.push(currentOffset);
const contentLine = contentLines.join(" ");
// Determine single-line exclusions for inline code spans
forEachInlineCodeSpan(contentLine, (code, lineIndex, columnIndex) => {
exclusions.push([ 0, columnIndex, code.length ]);
});
// Identify all link/image reference definitions
forEachLine(lineMetadata, (line, lineIndex, inCode) => {
if (!inCode) {
const linkReferenceDefinitionMatch = linkReferenceDefinitionRe.exec(line);
if (linkReferenceDefinitionMatch) {
const label = normalizeLabel(linkReferenceDefinitionMatch[1]);
if (definitions.has(label)) {
duplicateDefinitions.push([ label, lineIndex ]);
} else {
definitions.set(label, lineIndex);
const filteredTokens =
micromark.filterByTypes(
params.parsers.micromark.tokens,
[
// definitionLineIndices
"definition", "gfmFootnoteDefinition",
// definitions and definitionLineIndices
"definitionLabelString", "gfmFootnoteDefinitionLabelString",
// references and shortcuts
"gfmFootnoteCall", "image", "link"
]
);
for (const token of filteredTokens) {
let labelPrefix = "";
// eslint-disable-next-line default-case
switch (token.type) {
case "definition":
case "gfmFootnoteDefinition":
// definitionLineIndices
for (let i = token.startLine; i <= token.endLine; i++) {
definitionLineIndices.push(i - 1);
}
const labelLength = linkReferenceDefinitionMatch[0].length;
exclusions.push([ 0, lineOffsets[lineIndex], labelLength ]);
const hasDefinition = line.slice(labelLength).trim().length > 0;
definitionLineIndices.push(lineIndex);
if (!hasDefinition) {
definitionLineIndices.push(lineIndex + 1);
}
}
}
});
// Identify all link and image references
let lineIndex = 0;
const pendingContents = [
{
"content": contentLine,
"contentLineIndex": 0,
"contentIndex": 0,
"topLevel": true
}
];
let pendingContent = null;
while ((pendingContent = pendingContents.shift())) {
const { content, contentLineIndex, contentIndex, topLevel } =
pendingContent;
let referenceLinkMatch = null;
while ((referenceLinkMatch = referenceLinkRe.exec(content)) !== null) {
const [ matchString, matchText, matchLabel ] = referenceLinkMatch;
if (
!matchString.startsWith("\\") &&
!matchString.startsWith("!\\") &&
!matchText.endsWith("\\") &&
!(matchLabel || "").endsWith("\\") &&
!(topLevel && excluded(referenceLinkMatch))
) {
const shortcutLink = (matchLabel === undefined);
const collapsedLink =
(!shortcutLink && (matchLabel.length === 0));
const label = normalizeLabel(
(shortcutLink || collapsedLink) ? matchText : matchLabel
);
if (label.length > 0) {
const referenceindex = referenceLinkMatch.index;
if (topLevel) {
// Calculate line index
while (lineOffsets[lineIndex + 1] <= referenceindex) {
lineIndex++;
}
break;
case "gfmFootnoteDefinitionLabelString":
labelPrefix = "^";
case "definitionLabelString": // eslint-disable-line no-fallthrough
{
// definitions and definitionLineIndices
const reference = normalizeReference(`${labelPrefix}${token.text}`);
if (definitions.has(reference)) {
duplicateDefinitions.push([ reference, token.startLine - 1 ]);
} else {
// Use provided line index
lineIndex = contentLineIndex;
definitions.set(reference, token.startLine - 1);
}
const referenceIndex = referenceindex +
(topLevel ? -lineOffsets[lineIndex] : contentIndex);
const referenceDatum = [
lineIndex,
referenceIndex,
matchString.length,
matchText.length,
(matchLabel || "").length
];
if (shortcutLink) {
// Track separately due to ambiguity in "text [text] text"
const shortcutData = shortcuts.get(label) || [];
shortcutData.push(referenceDatum);
shortcuts.set(label, shortcutData);
} else {
// Track reference and location
const referenceData = references.get(label) || [];
}
break;
case "gfmFootnoteCall":
case "image":
case "link":
{
let isShortcut = false;
let isFullOrCollapsed = false;
let labelText = null;
let referenceStringText = null;
const shortcutCandidate =
micromark.matchAndGetTokensByType(token.children, [ "label" ]);
if (shortcutCandidate) {
labelText =
micromark.getTokenTextByType(
shortcutCandidate[0].children, "labelText"
);
isShortcut = (labelText !== null);
}
const fullAndCollapsedCandidate =
micromark.matchAndGetTokensByType(
token.children, [ "label", "reference" ]
);
if (fullAndCollapsedCandidate) {
labelText =
micromark.getTokenTextByType(
fullAndCollapsedCandidate[0].children, "labelText"
);
referenceStringText =
micromark.getTokenTextByType(
fullAndCollapsedCandidate[1].children, "referenceString"
);
isFullOrCollapsed = (labelText !== null);
}
const footnote = micromark.matchAndGetTokensByType(
token.children,
[
"gfmFootnoteCallLabelMarker", "gfmFootnoteCallMarker",
"gfmFootnoteCallString", "gfmFootnoteCallLabelMarker"
],
[ "gfmFootnoteCallMarker", "gfmFootnoteCallString" ]
);
if (footnote) {
const callMarkerText = footnote[0].text;
const callString = footnote[1].text;
labelText = `${callMarkerText}${callString}`;
isShortcut = true;
}
// Track shortcuts separately due to ambiguity in "text [text] text"
if (isShortcut || isFullOrCollapsed) {
const referenceDatum = [
token.startLine - 1,
token.startColumn - 1,
token.text.length,
// @ts-ignore
labelText.length,
(referenceStringText || "").length
];
const reference =
normalizeReference(referenceStringText || labelText);
const dictionary = isShortcut ? shortcuts : references;
const referenceData = dictionary.get(reference) || [];
referenceData.push(referenceDatum);
references.set(label, referenceData);
dictionary.set(reference, referenceData);
}
// Check for links embedded in brackets
if (!matchString.startsWith("!")) {
pendingContents.push({
"content": matchText,
"contentLineIndex": lineIndex,
"contentIndex": referenceIndex + 1,
"topLevel": false
});
}
}
}
break;
}
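Both the old and the new `getReferenceLinkImageData` implementations rely on the same label-normalization and duplicate-tracking idea, which can be isolated as a small sketch. The label strings and line numbers below are illustrative mocks.

```javascript
// Sketch of the label normalization and duplicate-definition tracking used
// by getReferenceLinkImageData: labels are lower-cased, trimmed, and
// whitespace-collapsed before being used as Map keys, so "[Label  One]:"
// and "[label one]:" collide as duplicates.
const normalizeReference = (s) => s.toLowerCase().trim().replace(/\s+/g, " ");

const definitions = new Map();
const duplicateDefinitions = [];
const labelsWithLines = [
  [ "Label  One", 0 ],
  [ "label one", 4 ], // normalizes to the same key as the first entry
  [ "other", 7 ]
];
for (const [ label, lineIndex ] of labelsWithLines) {
  const reference = normalizeReference(label);
  if (definitions.has(reference)) {
    // First definition wins; later ones are reported as duplicates
    duplicateDefinitions.push([ reference, lineIndex ]);
  } else {
    definitions.set(reference, lineIndex);
  }
}
console.log([ ...definitions.keys() ]); // [ "label one", "other" ]
console.log(duplicateDefinitions); // [ [ "label one", 4 ] ]
```

The first-definition-wins behavior matches what rules like MD053 need: the surviving Map entry locates the usable definition, while `duplicateDefinitions` locates the lines to flag.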
@@ -1197,108 +1162,1 @@ }
 module.exports.expandTildePath = expandTildePath;
-/**
- * RegExp.exec-style implementation of function expressions.
- *
- * @param {Function} funcExp Function that takes string and returns
- *   [index, length] or null.
- * @param {string} input String to search.
- * @returns {string[] | null} RegExp.exec-style [match] with an index property.
- */
-function funcExpExec(funcExp, input) {
-  // Start or resume match
-  // @ts-ignore
-  const lastIndex = funcExp.lastIndex || 0;
-  const result = funcExp(input.slice(lastIndex));
-  if (result) {
-    // Update lastIndex and return match
-    const [ subIndex, length ] = result;
-    const index = lastIndex + subIndex;
-    // @ts-ignore
-    funcExp.lastIndex = index + length;
-    const match = [ input.slice(index, index + length) ];
-    // @ts-ignore
-    match.index = index;
-    return match;
-  }
-  // Reset lastIndex and return no match
-  // @ts-ignore
-  funcExp.lastIndex = 0;
-  return null;
-}
-module.exports.funcExpExec = funcExpExec;
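For context on what this removed helper did: `funcExpExec` drove a "function expression" (a function returning `[index, length]` for the next match) the way `RegExp.exec` drives a sticky regular expression, carrying `lastIndex` across calls. The function body is copied from the 0.27.0 source above; `findDigits` is an illustrative function expression, not part of the library.

```javascript
// Usage sketch for the removed funcExpExec helper (copied from the 0.27.0
// source above): repeated calls walk through the input, and a failed match
// resets lastIndex so the expression can be reused on new input.
function funcExpExec(funcExp, input) {
  const lastIndex = funcExp.lastIndex || 0;
  const result = funcExp(input.slice(lastIndex));
  if (result) {
    const [ subIndex, length ] = result;
    const index = lastIndex + subIndex;
    funcExp.lastIndex = index + length;
    const match = [ input.slice(index, index + length) ];
    match.index = index;
    return match;
  }
  funcExp.lastIndex = 0;
  return null;
}

// Illustrative function expression matching a run of digits
function findDigits(input) {
  const match = input.match(/\d+/);
  return match ? [ match.index, match[0].length ] : null;
}

const input = "a1bc23d";
console.log(funcExpExec(findDigits, input)[0]); // "1"
console.log(funcExpExec(findDigits, input)[0]); // "23"
console.log(funcExpExec(findDigits, input)); // null (lastIndex reset)
```

In 0.27.0 this mechanism powered `urlFe` below; in 0.28.0 both are gone because MD034 now works from micromark tokens instead of scanning lines.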
-const urlFeProtocolRe = /(?:http|ftp)s?:\/\//i;
-const urlFeAutolinkTerminalsRe = / |$/;
-const urlFeBareTerminalsRe = /[ ,!`'"\]]|$/;
-const urlFeNonTerminalsRe = "-#/";
-const urlFePunctuationRe = /\p{Punctuation}/u;
-const urlFePrefixToPostfix = new Map([
-  [ " ", " " ],
-  [ "`", "`" ],
-  [ "'", "'" ],
-  [ "\"", "\"" ],
-  [ "‘", "’" ],
-  [ "“", "”" ],
-  [ "«", "»" ],
-  [ "*", "*" ],
-  [ "_", "_" ],
-  [ "(", ")" ],
-  [ "[", "]" ],
-  [ "{", "}" ],
-  [ "<", ">" ],
-  [ ">", "<" ]
-]);
-/**
- * Function expression that matches URLs.
- *
- * @param {string} input Substring to search for a URL.
- * @returns {Array | null} [index, length] of URL or null.
- */
-function urlFe(input) {
-  // Find start of URL by searching for protocol
-  const match = input.match(urlFeProtocolRe);
-  if (match) {
-    // Look for matching pre/postfix characters (ex: <...>)
-    const start = match.index || 0;
-    const length = match[0].length;
-    const prefix = input[start - 1] || " ";
-    const postfix = urlFePrefixToPostfix.get(prefix);
-    // @ts-ignore
-    let endPostfix = input.indexOf(postfix, start + length);
-    if (endPostfix === -1) {
-      endPostfix = input.length;
-    }
-    // Look for characters that terminate a URL
-    const terminalsRe =
-      (prefix === "<") ? urlFeAutolinkTerminalsRe : urlFeBareTerminalsRe;
-    const endTerminal = start + input.slice(start).search(terminalsRe);
-    // Determine tentative end of URL
-    let end = Math.min(endPostfix, endTerminal);
-    if (prefix === " ") {
-      // If the URL used " " as pre/postfix characters, trim the end
-      if (input[end - 1] === ")") {
-        // Trim any ")" beyond the last "(...)" pair
-        const lastOpenParen = input.lastIndexOf("(", end - 2);
-        if (lastOpenParen <= start) {
-          end--;
-        } else {
-          const nextCloseParen = input.indexOf(")", lastOpenParen + 1);
-          end = nextCloseParen + 1;
-        }
-      } else {
-        // Trim unwanted punctuation
-        while (
-          !urlFeNonTerminalsRe.includes(input[end - 1]) &&
-          urlFePunctuationRe.test(input[end - 1])
-        ) {
-          end--;
-        }
-      }
-    }
-    return [ start, end - start ];
-  }
-  // No match
-  return null;
-}
-module.exports.urlFe = urlFe;
 {
   "name": "markdownlint-rule-helpers",
-  "version": "0.18.0",
+  "version": "0.19.0",
   "description": "A collection of markdownlint helper functions for custom rules",
@@ -18,2 +18,5 @@ "main": "./helpers.js",
   },
+  "dependencies": {
+    "markdownlint-micromark": "0.1.2"
+  },
   "keywords": [
@@ -20,0 +23,0 @@ "markdownlint",
@@ -14,2 +14,2 @@ // @ts-check
 module.exports.homepage = "https://github.com/DavidAnson/markdownlint";
-module.exports.version = "0.27.0";
+module.exports.version = "0.28.0";
@@ -5,7 +5,7 @@ export = markdownlint;
  *
- * @param {Options} options Configuration options.
+ * @param {Options | null} options Configuration options.
  * @param {LintCallback} callback Callback (err, result) function.
  * @returns {void}
  */
-declare function markdownlint(options: Options, callback: LintCallback): void;
+declare function markdownlint(options: Options | null, callback: LintCallback): void;
 declare namespace markdownlint {
@@ -37,3 +37,3 @@ export { markdownlintSync as sync, readConfig, readConfigSync, getVersion, promises, RuleFunction, RuleParams, MarkdownItToken, RuleOnError, RuleOnErrorInfo, RuleOnErrorFixInfo, Rule, Options, Plugin, ToStringCallback, LintResults, LintError, FixInfo, LintCallback, Configuration, RuleConfiguration, ConfigurationParser, ReadConfigCallback, ResolveConfigExtendsCallback };
      */
-    frontMatter?: RegExp;
+    frontMatter?: RegExp | null;
     /**
@@ -73,6 +73,6 @@ * File system implementation.
  *
- * @param {Options} options Configuration options.
+ * @param {Options | null} options Configuration options.
  * @returns {LintResults} Results object.
  */
-declare function markdownlintSync(options: Options): LintResults;
+declare function markdownlintSync(options: Options | null): LintResults;
 /**
@@ -79,0 +79,0 @@ * Read specified configuration file.
@@ -7,3 +7,4 @@ // @ts-check
 const { promisify } = require("node:util");
-const markdownIt = require("markdown-it");
+const markdownit = require("markdown-it");
+const micromark = require("../helpers/micromark.cjs");
 const { deprecatedRuleNames } = require("./constants");
@@ -170,3 +171,3 @@ const rules = require("./rules");
  * @param {string} content Markdown content.
- * @param {RegExp} frontMatter Regular expression to match front matter.
+ * @param {RegExp | null} frontMatter Regular expression to match front matter.
  * @returns {Object} Trimmed content and front matter lines.
@@ -513,2 +514,4 @@ */
  * @param {Rule[]} ruleList List of rules.
+ * @param {Object.<string, string[]>} aliasToRuleNames Map of alias to rule
+ *   names.
  * @param {string} name Identifier for the content.
@@ -519,3 +522,3 @@ * @param {string} content Markdown content.
  * @param {ConfigurationParser[] | null} configParsers Configuration parsers.
- * @param {RegExp} frontMatter Regular expression for front matter.
+ * @param {RegExp | null} frontMatter Regular expression for front matter.
  * @param {boolean} handleRuleFailures Whether to handle exceptions in rules.
@@ -529,2 +532,3 @@ * @param {boolean} noInlineConfig Whether to allow inline configuration.
   ruleList,
+  aliasToRuleNames,
   name,
@@ -555,14 +559,25 @@ content,
     configParsers,
-    mapAliasToRuleNames(ruleList)
+    aliasToRuleNames
   );
-  // Hide the content of HTML comments from rules, etc.
+  // Parse content into parser tokens
+  const markdownitTokens = md.parse(content, {});
+  const micromarkTokens = micromark.parse(content);
+  // Hide the content of HTML comments from rules
   content = helpers.clearHtmlCommentText(content);
-  // Parse content into tokens and lines
-  const tokens = md.parse(content, {});
+  // Parse content into lines and update markdown-it tokens
   const lines = content.split(helpers.newLineRe);
-  annotateAndFreezeTokens(tokens, lines);
+  annotateAndFreezeTokens(markdownitTokens, lines);
+  // Create (frozen) parameters for rules
+  const parsers = Object.freeze({
+    "markdownit": Object.freeze({
+      "tokens": markdownitTokens
+    }),
+    "micromark": Object.freeze({
+      "tokens": micromarkTokens
+    })
+  });
   const paramsBase = {
     name,
-    tokens,
+    parsers,
+    "tokens": markdownitTokens,
     "lines": Object.freeze(lines),
@@ -576,7 +591,7 @@ "frontMatterLines": Object.freeze(frontMatterLines)
   const flattenedLists =
-    helpers.flattenLists(paramsBase.tokens);
+    helpers.flattenLists(paramsBase.parsers.markdownit.tokens);
   const htmlElementRanges =
     helpers.htmlElementRanges(paramsBase, lineMetadata);
   const referenceLinkImageData =
-    helpers.getReferenceLinkImageData(lineMetadata);
+    helpers.getReferenceLinkImageData(paramsBase);
   cache.set({
@@ -790,2 +805,4 @@ codeBlockAndSpanRanges,
  * @param {Rule[]} ruleList List of rules.
+ * @param {Object.<string, string[]>} aliasToRuleNames Map of alias to rule
+ *   names.
  * @param {string} file Path of file to lint.
@@ -795,3 +812,3 @@ * @param {Object} md Instance of markdown-it.
  * @param {ConfigurationParser[] | null} configParsers Configuration parsers.
- * @param {RegExp} frontMatter Regular expression for front matter.
+ * @param {RegExp | null} frontMatter Regular expression for front matter.
  * @param {boolean} handleRuleFailures Whether to handle exceptions in rules.
@@ -807,2 +824,3 @@ * @param {boolean} noInlineConfig Whether to allow inline configuration.
   ruleList,
+  aliasToRuleNames,
   file,
@@ -826,2 +844,3 @@ md,
     ruleList,
+    aliasToRuleNames,
     file,
@@ -850,3 +869,3 @@ content,
  *
- * @param {Options} options Options object.
+ * @param {Options | null} options Options object.
  * @param {boolean} synchronous Whether to execute synchronously.
@@ -883,3 +902,3 @@ * @param {Function} callback Callback (err, result) function.
     3 : options.resultVersion;
-  const md = markdownIt({ "html": true });
+  const md = markdownit({ "html": true });
   const markdownItPlugins = options.markdownItPlugins || [];
@@ -891,2 +910,3 @@ for (const plugin of markdownItPlugins) {
   const fs = options.fs || require("node:fs");
+  const aliasToRuleNames = mapAliasToRuleNames(ruleList);
   const results = newResults(ruleList);
@@ -919,2 +939,3 @@ let done = false;
     ruleList,
+    aliasToRuleNames,
     currentItem,
@@ -937,2 +958,3 @@ md,
     ruleList,
+    aliasToRuleNames,
     currentItem,
@@ -978,3 +1000,3 @@ strings[currentItem] || "",
  *
- * @param {Options} options Configuration options.
+ * @param {Options | null} options Configuration options.
  * @param {LintCallback} callback Callback (err, result) function.
@@ -1003,3 +1025,3 @@ * @returns {void}
  *
- * @param {Options} options Configuration options.
+ * @param {Options | null} options Configuration options.
  * @returns {LintResults} Results object.
@@ -1316,3 +1338,3 @@ */
  * @property {string[] | string} [files] Files to lint.
- * @property {RegExp} [frontMatter] Front matter pattern.
+ * @property {RegExp | null} [frontMatter] Front matter pattern.
  * @property {Object} [fs] File system implementation.
@@ -1319,0 +1341,0 @@ * @property {boolean} [handleRuleFailures] True to catch exceptions.
@@ -14,3 +14,3 @@ // @ts-check
   const tag = "h" + level;
-  params.tokens.every(function forToken(token) {
+  params.parsers.markdownit.tokens.every(function forToken(token) {
     if (token.type === "heading_open") {
@@ -17,0 +17,0 @@ addErrorDetailIf(onError, token.lineNumber, tag, token.tag);
@@ -16,3 +16,3 @@ // @ts-check
   let listItemNesting = 0;
-  for (const token of params.tokens) {
+  for (const token of params.parsers.markdownit.tokens) {
     const { content, lineNumber, type } = token;
@@ -19,0 +19,0 @@ if (type === "blockquote_open") {
@@ -14,3 +14,3 @@ // @ts-check
   let prevLineNumber = null;
-  for (const token of params.tokens) {
+  for (const token of params.parsers.markdownit.tokens) {
     if ((token.type === "blockquote_open") &&
@@ -17,0 +17,0 @@ (prevToken.type === "blockquote_close")) {
@@ -5,13 +5,7 @@ // @ts-check | ||
const { | ||
addError, forEachLine, htmlElementRe, withinAnyRange, unescapeMarkdown | ||
} = require("../helpers"); | ||
const { codeBlockAndSpanRanges, lineMetadata, referenceLinkImageData } = | ||
require("./cache"); | ||
const { addError } = require("../helpers"); | ||
const { filterByTypes, getHtmlTagInfo, parse } = | ||
require("../helpers/micromark.cjs"); | ||
const linkDestinationRe = /\]\(\s*$/; | ||
// See https://spec.commonmark.org/0.29/#autolinks | ||
const emailAddressRe = | ||
// eslint-disable-next-line max-len | ||
/^[\w.!#$%&'*+/=?^`{|}~-]+@[a-zA-Z\d](?:[a-zA-Z\d-]{0,61}[a-zA-Z\d])?(?:\.[a-zA-Z\d](?:[a-zA-Z\d-]{0,61}[a-zA-Z\d])?)*$/; | ||
const nextLinesRe = /[\r\n][\s\S]*$/; | ||
@@ -26,36 +20,49 @@ module.exports = { | ||
allowedElements = allowedElements.map((element) => element.toLowerCase()); | ||
const exclusions = codeBlockAndSpanRanges(); | ||
const { references, definitionLineIndices } = referenceLinkImageData(); | ||
for (const datas of references.values()) { | ||
for (const data of datas) { | ||
const [ lineIndex, index, , textLength, labelLength ] = data; | ||
if (labelLength > 0) { | ||
exclusions.push([ lineIndex, index + 3 + textLength, labelLength ]); | ||
const pending = [ [ 0, params.parsers.micromark.tokens ] ]; | ||
let current = null; | ||
while ((current = pending.shift())) { | ||
const [ offset, tokens ] = current; | ||
for (const token of filterByTypes(tokens, [ "htmlFlow", "htmlText" ])) { | ||
if (token.type === "htmlText") { | ||
const htmlTagInfo = getHtmlTagInfo(token); | ||
if ( | ||
htmlTagInfo && | ||
!htmlTagInfo.close && | ||
!allowedElements.includes(htmlTagInfo.name.toLowerCase()) | ||
) { | ||
const range = [ | ||
token.startColumn, | ||
token.text.replace(nextLinesRe, "").length | ||
]; | ||
addError( | ||
onError, | ||
token.startLine + offset, | ||
"Element: " + htmlTagInfo.name, | ||
undefined, | ||
range | ||
); | ||
} | ||
} else { | ||
// token.type === "htmlFlow" | ||
// Re-parse without "htmlFlow" to get only "htmlText" tokens | ||
const options = { | ||
"extensions": [ | ||
{ | ||
"disable": { | ||
"null": [ "codeIndented", "htmlFlow" ] | ||
} | ||
} | ||
] | ||
}; | ||
// Use lines instead of token.text for accurate columns | ||
const lines = | ||
params.lines.slice(token.startLine - 1, token.endLine).join("\n"); | ||
const flowTokens = parse(lines, options); | ||
pending.push( | ||
[ token.startLine - 1, flowTokens ] | ||
); | ||
} | ||
} | ||
} | ||
forEachLine(lineMetadata(), (line, lineIndex, inCode) => { | ||
let match = null; | ||
// eslint-disable-next-line no-unmodified-loop-condition | ||
while (!inCode && ((match = htmlElementRe.exec(line)) !== null)) { | ||
const [ tag, content, element ] = match; | ||
if ( | ||
!allowedElements.includes(element.toLowerCase()) && | ||
!tag.endsWith("\\>") && | ||
!emailAddressRe.test(content) && | ||
!withinAnyRange(exclusions, lineIndex, match.index, tag.length) && | ||
!definitionLineIndices.includes(lineIndex) | ||
) { | ||
const prefix = line.substring(0, match.index); | ||
if (!linkDestinationRe.test(prefix)) { | ||
const unescaped = unescapeMarkdown(prefix + "<", "_"); | ||
if (!unescaped.endsWith("_")) { | ||
addError(onError, lineIndex + 1, "Element: " + element, | ||
undefined, [ match.index + 1, tag.length ]); | ||
} | ||
} | ||
} | ||
} | ||
}); | ||
} | ||
}; |
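The md033.js changes above repeatedly walk a micromark token tree with `filterByTypes`. As a rough illustration of what such a helper does (the `{ type, children }` token shape and the names here are assumptions for the sketch, not markdownlint's exact internals):

```javascript
// Sketch of a filterByTypes-style helper: breadth-first walk of a
// micromark-like token tree, collecting tokens whose type matches.
const filterByTypes = (tokens, types) => {
  const result = [];
  const pending = [ ...tokens ];
  while (pending.length > 0) {
    const token = pending.shift();
    if (types.includes(token.type)) {
      result.push(token);
    }
    // Descend into child tokens, if any
    pending.push(...(token.children || []));
  }
  return result;
};

// Example: collect HTML tokens at any depth of the tree
const tree = [
  { "type": "paragraph", "children": [
    { "type": "htmlText", "children": [] },
    { "type": "data", "children": [] }
  ] },
  { "type": "htmlFlow", "children": [
    { "type": "htmlText", "children": [] }
  ] }
];
const matches = filterByTypes(tree, [ "htmlText", "htmlFlow" ]);
console.log(matches.length); // 3
```

The flat result list is why rules like MD033 can loop over matching tokens without recursing themselves.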
lib/md034.js
@@ -5,9 +5,6 @@ // @ts-check | ||
const { addErrorContext, filterTokens, funcExpExec, urlFe, withinAnyRange } = | ||
require("../helpers"); | ||
const { codeBlockAndSpanRanges, htmlElementRanges, referenceLinkImageData } = | ||
require("./cache"); | ||
const { addErrorContext } = require("../helpers"); | ||
const { filterByPredicate, getHtmlTagInfo } = | ||
require("../helpers/micromark.cjs"); | ||
const htmlLinkRe = /<a(?:\s[^>]*)?>[^<>]*<\/a\s*>/gi; | ||
module.exports = { | ||
@@ -18,71 +15,56 @@ "names": [ "MD034", "no-bare-urls" ], | ||
"function": function MD034(params, onError) { | ||
const { lines } = params; | ||
const codeExclusions = [ | ||
...codeBlockAndSpanRanges(), | ||
...htmlElementRanges() | ||
]; | ||
filterTokens(params, "html_block", (token) => { | ||
for (let i = token.map[0]; i < token.map[1]; i++) { | ||
codeExclusions.push([ i, 0, lines[i].length ]); | ||
} | ||
}); | ||
const { definitionLineIndices } = referenceLinkImageData(); | ||
for (const [ lineIndex, line ] of lines.entries()) { | ||
if (definitionLineIndices[0] === lineIndex) { | ||
definitionLineIndices.shift(); | ||
} else { | ||
let match = null; | ||
const lineExclusions = []; | ||
while ((match = htmlLinkRe.exec(line)) !== null) { | ||
lineExclusions.push([ lineIndex, match.index, match[0].length ]); | ||
} | ||
while ((match = funcExpExec(urlFe, line)) !== null) { | ||
const [ bareUrl ] = match; | ||
// @ts-ignore | ||
const matchIndex = match.index; | ||
const bareUrlLength = bareUrl.length; | ||
const prefix = line.slice(0, matchIndex); | ||
const postfix = line.slice(matchIndex + bareUrlLength); | ||
if ( | ||
// Allow <...> to avoid reporting non-bare links | ||
!(prefix.endsWith("<") && postfix.startsWith(">")) && | ||
// Allow >...</ to avoid reporting <code>...</code> | ||
!(prefix.endsWith(">") && postfix.startsWith("</")) && | ||
// Allow "..." and '...' to allow quoting a bare link | ||
!(prefix.endsWith("\"") && postfix.startsWith("\"")) && | ||
!(prefix.endsWith("'") && postfix.startsWith("'")) && | ||
// Allow ](... to avoid reporting Markdown-style links | ||
!(/\]\(\s*$/.test(prefix)) && | ||
// Allow [...] to avoid MD011/no-reversed-links and nested links | ||
!(/\[[^\]]*$/.test(prefix) && /^[^[]*\]/.test(postfix)) && | ||
!withinAnyRange( | ||
lineExclusions, lineIndex, matchIndex, bareUrlLength | ||
) && | ||
!withinAnyRange( | ||
codeExclusions, lineIndex, matchIndex, bareUrlLength | ||
) | ||
) { | ||
const range = [ | ||
matchIndex + 1, | ||
bareUrlLength | ||
]; | ||
const fixInfo = { | ||
"editColumn": range[0], | ||
"deleteCount": range[1], | ||
"insertText": `<${bareUrl}>` | ||
}; | ||
addErrorContext( | ||
onError, | ||
lineIndex + 1, | ||
bareUrl, | ||
null, | ||
null, | ||
range, | ||
fixInfo | ||
); | ||
const literalAutolinks = | ||
filterByPredicate( | ||
params.parsers.micromark.tokens, | ||
(token) => token.type === "literalAutolink", | ||
(token) => { | ||
const { children } = token; | ||
const result = []; | ||
for (let i = 0; i < children.length; i++) { | ||
const openToken = children[i]; | ||
const openTagInfo = getHtmlTagInfo(openToken); | ||
if (openTagInfo && !openTagInfo.close) { | ||
let count = 1; | ||
for (let j = i + 1; j < children.length; j++) { | ||
const closeToken = children[j]; | ||
const closeTagInfo = getHtmlTagInfo(closeToken); | ||
if (closeTagInfo && (openTagInfo.name === closeTagInfo.name)) { | ||
if (closeTagInfo.close) { | ||
count--; | ||
if (count === 0) { | ||
i = j; | ||
break; | ||
} | ||
} else { | ||
count++; | ||
} | ||
} | ||
} | ||
} else { | ||
result.push(openToken); | ||
} | ||
} | ||
} | ||
} | ||
return result; | ||
}); | ||
for (const token of literalAutolinks) { | ||
const range = [ | ||
token.startColumn, | ||
token.endColumn - token.startColumn | ||
]; | ||
const fixInfo = { | ||
"editColumn": range[0], | ||
"deleteCount": range[1], | ||
"insertText": `<${token.text}>` | ||
}; | ||
addErrorContext( | ||
onError, | ||
token.startLine, | ||
token.text, | ||
null, | ||
null, | ||
range, | ||
fixInfo | ||
); | ||
} | ||
} | ||
}; |
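The new MD034 logic pairs opening and closing HTML tags by keeping a nesting count and skipping everything up to the matching close. A self-contained sketch of that pairing technique (the token strings and `tagInfo` helper here are illustrative stand-ins, not markdownlint's real token objects):

```javascript
// Skip tokens wrapped by a matched open/close tag pair, keeping only
// unwrapped tokens. A nesting count handles repeated same-name tags.
const dropWrappedTokens = (children, getTagInfo) => {
  const result = [];
  for (let i = 0; i < children.length; i++) {
    const info = getTagInfo(children[i]);
    if (info && !info.close) {
      let count = 1;
      for (let j = i + 1; j < children.length; j++) {
        const candidate = getTagInfo(children[j]);
        if (candidate && (candidate.name === info.name)) {
          count += candidate.close ? -1 : 1;
          if (count === 0) {
            i = j; // resume after the matching close tag
            break;
          }
        }
      }
    } else {
      result.push(children[i]);
    }
  }
  return result;
};

const tokens = [ "<a>", "text-in-link", "</a>", "bare" ];
const tagInfo = (t) => (t.startsWith("<") ?
  { "name": "a", "close": t.startsWith("</") } : null);
console.log(dropWrappedTokens(tokens, tagInfo)); // [ "bare" ]
```

This is why a URL inside an explicit `<a>…</a>` element is not reported as a bare autolink.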
@@ -5,3 +5,4 @@ // @ts-check | ||
const { addErrorDetailIf, filterTokens } = require("../helpers"); | ||
const { addErrorDetailIf } = require("../helpers"); | ||
const { filterByTypes } = require("../helpers/micromark.cjs"); | ||
@@ -14,15 +15,12 @@ module.exports = { | ||
let style = String(params.config.style || "consistent").trim(); | ||
filterTokens(params, "hr", (token) => { | ||
const { line, lineNumber } = token; | ||
let { markup } = token; | ||
const match = line.match(/[_*\-\s]+$/); | ||
if (match) { | ||
markup = match[0].trim(); | ||
} | ||
const thematicBreaks = | ||
filterByTypes(params.parsers.micromark.tokens, [ "thematicBreak" ]); | ||
for (const token of thematicBreaks) { | ||
const { startLine, text } = token; | ||
if (style === "consistent") { | ||
style = markup; | ||
style = text; | ||
} | ||
addErrorDetailIf(onError, lineNumber, style, markup); | ||
}); | ||
addErrorDetailIf(onError, startLine, style, text); | ||
} | ||
} | ||
}; |
@@ -52,3 +52,3 @@ // @ts-check | ||
let state = base; | ||
for (const token of params.tokens) { | ||
for (const token of params.parsers.markdownit.tokens) { | ||
state = state(token); | ||
@@ -55,0 +55,0 @@ } |
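The `state = state(token)` loop above uses a function-as-state machine: each state is a function that consumes one token and returns the next state function. A self-contained sketch of the pattern (the token types here are illustrative):

```javascript
// Function-as-state machine: feed tokens one at a time; the current
// state function decides how to react and which state comes next.
const makeCounter = () => {
  let blockquotes = 0;
  // Base state: count a top-level blockquote and enter it
  const base = (token) => {
    if (token.type === "blockquote_open") {
      blockquotes++;
      return inBlockquote;
    }
    return base;
  };
  // Inside a blockquote: wait for its close, ignore everything else
  const inBlockquote = (token) =>
    ((token.type === "blockquote_close") ? base : inBlockquote);
  let state = base;
  return {
    "feed": (tokens) => {
      for (const token of tokens) {
        state = state(token);
      }
    },
    "count": () => blockquotes
  };
};

const counter = makeCounter();
counter.feed([
  { "type": "blockquote_open" },
  { "type": "paragraph_open" },
  { "type": "blockquote_close" },
  { "type": "blockquote_open" }
]);
console.log(counter.count()); // 2
```

Encoding state as the function itself avoids a separate enum plus switch statement.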
lib/md038.js
@@ -5,13 +5,19 @@ // @ts-check | ||
const { addErrorContext, filterTokens, forEachInlineCodeSpan, newLineRe } = | ||
require("../helpers"); | ||
const { addErrorContext } = require("../helpers"); | ||
const { filterByTypes } = require("../helpers/micromark.cjs"); | ||
const leftSpaceRe = /^\s(?:[^`]|$)/; | ||
const rightSpaceRe = /[^`]\s$/; | ||
const trimCodeText = (text, start, end) => { | ||
text = text.replace(/^\s+$/, ""); | ||
if (start) { | ||
text = text.replace(/^\s+?(\s`|\S)/, "$1"); | ||
} | ||
if (end) { | ||
text = text.replace(/(`\s|\S)\s+$/, "$1"); | ||
} | ||
return text; | ||
}; | ||
const tokenIfType = (token, type) => token && (token.type === type) && token; | ||
const spaceInsideCodeInline = (token) => ( | ||
(token.type === "code_inline") && | ||
(leftSpaceRe.test(token.content) || rightSpaceRe.test(token.content)) | ||
); | ||
module.exports = { | ||
@@ -22,52 +28,71 @@ "names": [ "MD038", "no-space-in-code" ], | ||
"function": function MD038(params, onError) { | ||
filterTokens(params, "inline", (token) => { | ||
if (token.children.some(spaceInsideCodeInline)) { | ||
const tokenLines = params.lines.slice(token.map[0], token.map[1]); | ||
forEachInlineCodeSpan( | ||
tokenLines.join("\n"), | ||
(code, lineIndex, columnIndex, tickCount) => { | ||
let rangeIndex = columnIndex - tickCount; | ||
let rangeLength = code.length + (2 * tickCount); | ||
let rangeLineOffset = 0; | ||
let fixIndex = columnIndex; | ||
let fixLength = code.length; | ||
const codeLines = code.split(newLineRe); | ||
const left = leftSpaceRe.test(code); | ||
const right = !left && rightSpaceRe.test(code); | ||
if (right && (codeLines.length > 1)) { | ||
rangeIndex = 0; | ||
rangeLineOffset = codeLines.length - 1; | ||
fixIndex = 0; | ||
} | ||
if (left || right) { | ||
const codeLinesRange = codeLines[rangeLineOffset]; | ||
if (codeLines.length > 1) { | ||
rangeLength = codeLinesRange.length + tickCount; | ||
fixLength = codeLinesRange.length; | ||
} | ||
const context = tokenLines[lineIndex + rangeLineOffset] | ||
.substring(rangeIndex, rangeIndex + rangeLength); | ||
const codeLinesRangeTrim = codeLinesRange.trim(); | ||
const fixText = | ||
(codeLinesRangeTrim.startsWith("`") ? " " : "") + | ||
codeLinesRangeTrim + | ||
(codeLinesRangeTrim.endsWith("`") ? " " : ""); | ||
addErrorContext( | ||
onError, | ||
token.lineNumber + lineIndex + rangeLineOffset, | ||
context, | ||
left, | ||
right, | ||
[ rangeIndex + 1, rangeLength ], | ||
{ | ||
"editColumn": fixIndex + 1, | ||
"deleteCount": fixLength, | ||
"insertText": fixText | ||
} | ||
); | ||
} | ||
}); | ||
const codeTextTokens = | ||
filterByTypes(params.parsers.micromark.tokens, [ "codeText" ]); | ||
for (const token of codeTextTokens) { | ||
const { children } = token; | ||
const first = 0; | ||
const last = children.length - 1; | ||
const startSequence = tokenIfType(children[first], "codeTextSequence"); | ||
const endSequence = tokenIfType(children[last], "codeTextSequence"); | ||
const startData = | ||
tokenIfType(children[first + 1], "codeTextData") || | ||
tokenIfType(children[first + 2], "codeTextData"); | ||
const endData = | ||
tokenIfType(children[last - 1], "codeTextData") || | ||
tokenIfType(children[last - 2], "codeTextData"); | ||
if (startSequence && endSequence && startData && endData) { | ||
const spaceLeft = leftSpaceRe.test(startData.text); | ||
const spaceRight = !spaceLeft && rightSpaceRe.test(endData.text); | ||
if (spaceLeft || spaceRight) { | ||
let lineNumber = startSequence.startLine; | ||
let range = null; | ||
let fixInfo = null; | ||
if (startSequence.startLine === endSequence.endLine) { | ||
range = [ | ||
startSequence.startColumn, | ||
endSequence.endColumn - startSequence.startColumn | ||
]; | ||
fixInfo = { | ||
"editColumn": startSequence.endColumn, | ||
"deleteCount": endSequence.startColumn - startSequence.endColumn, | ||
"insertText": trimCodeText(startData.text, true, true) | ||
}; | ||
} else if (spaceLeft) { | ||
range = [ | ||
startSequence.startColumn, | ||
startData.endColumn - startSequence.startColumn | ||
]; | ||
fixInfo = { | ||
"editColumn": startSequence.endColumn, | ||
"deleteCount": startData.endColumn - startData.startColumn, | ||
"insertText": trimCodeText(startData.text, true, false) | ||
}; | ||
} else { | ||
lineNumber = endSequence.endLine; | ||
range = [ | ||
endData.startColumn, | ||
endSequence.endColumn - endData.startColumn | ||
]; | ||
fixInfo = { | ||
"editColumn": endData.startColumn, | ||
"deleteCount": endData.endColumn - endData.startColumn, | ||
"insertText": trimCodeText(endData.text, false, true) | ||
}; | ||
} | ||
const context = params | ||
.lines[lineNumber - 1] | ||
.substring(range[0] - 1, range[0] - 1 + range[1]); | ||
addErrorContext( | ||
onError, | ||
lineNumber, | ||
context, | ||
spaceLeft, | ||
spaceRight, | ||
range, | ||
fixInfo | ||
); | ||
} | ||
} | ||
}); | ||
} | ||
} | ||
}; |
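MD038's fix logic, including the `trimCodeText` helper in the diff above, has to trim padding spaces from a code span while preserving the single space of padding CommonMark requires when the content itself begins or ends with a backtick. A simplified sketch of that idea (function name and shape are illustrative, not the rule's exact helper):

```javascript
// Sketch of MD038's intent: "` text `" should become "`text`", but a
// span whose content starts/ends with a backtick keeps one space of
// padding so the embedded backtick still parses.
const trimCodeSpan = (text) => {
  const trimmed = text.trim();
  const left = trimmed.startsWith("`") ? " " : "";
  const right = trimmed.endsWith("`") ? " " : "";
  return left + trimmed + right;
};

console.log(trimCodeSpan("  code  "));     // "code"
console.log(trimCodeSpan(" `backtick` ")); // " `backtick` "
```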
@@ -21,3 +21,3 @@ // @ts-check | ||
const htmlHeadingRe = new RegExp(`^<h${level}[ />]`, "i"); | ||
params.tokens.every((token) => { | ||
params.parsers.markdownit.tokens.every((token) => { | ||
let isError = false; | ||
@@ -24,0 +24,0 @@ if (token.type === "html_block") { |
lib/md044.js
@@ -5,8 +5,11 @@ // @ts-check | ||
const { addErrorDetailIf, escapeForRegExp, forEachLine, forEachLink, | ||
funcExpExec, linkReferenceDefinitionRe, urlFe, withinAnyRange } = | ||
const { addErrorDetailIf, escapeForRegExp, withinAnyRange } = | ||
require("../helpers"); | ||
const { codeBlockAndSpanRanges, htmlElementRanges, lineMetadata } = | ||
require("./cache"); | ||
const { filterByPredicate, filterByTypes, parse } = | ||
require("../helpers/micromark.cjs"); | ||
const ignoredChildTypes = new Set( | ||
[ "codeFencedFence", "definition", "reference", "resource" ] | ||
); | ||
module.exports = { | ||
@@ -26,27 +29,21 @@ "names": [ "MD044", "proper-names" ], | ||
(htmlElements === undefined) ? true : !!htmlElements; | ||
const scannedTypes = new Set([ "data", "htmlFlowData" ]); | ||
if (includeCodeBlocks) { | ||
scannedTypes.add("codeFlowValue"); | ||
scannedTypes.add("codeTextData"); | ||
} | ||
const contentTokens = | ||
filterByPredicate( | ||
params.parsers.micromark.tokens, | ||
(token) => scannedTypes.has(token.type), | ||
(token) => { | ||
let { children } = token; | ||
if (!includeHtmlElements && (token.type === "htmlFlow")) { | ||
children = children.slice(1, -1); | ||
} | ||
return children.filter((t) => !ignoredChildTypes.has(t.type)); | ||
} | ||
); | ||
const exclusions = []; | ||
forEachLine(lineMetadata(), (line, lineIndex) => { | ||
if (linkReferenceDefinitionRe.test(line)) { | ||
exclusions.push([ lineIndex, 0, line.length ]); | ||
} else { | ||
let match = null; | ||
while ((match = funcExpExec(urlFe, line)) !== null) { | ||
// @ts-ignore | ||
exclusions.push([ lineIndex, match.index, match[0].length ]); | ||
} | ||
forEachLink(line, (index, _, text, destination) => { | ||
if (destination) { | ||
exclusions.push( | ||
[ lineIndex, index + text.length, destination.length ] | ||
); | ||
} | ||
}); | ||
} | ||
}); | ||
if (!includeCodeBlocks) { | ||
exclusions.push(...codeBlockAndSpanRanges()); | ||
} | ||
if (!includeHtmlElements) { | ||
exclusions.push(...htmlElementRanges()); | ||
} | ||
const autoLinked = new Set(); | ||
for (const name of names) { | ||
@@ -59,16 +56,33 @@ const escapedName = escapeForRegExp(name); | ||
const nameRe = new RegExp(namePattern, "gi"); | ||
forEachLine(lineMetadata(), (line, lineIndex, inCode, onFence) => { | ||
if (includeCodeBlocks || (!inCode && !onFence)) { | ||
let match = null; | ||
while ((match = nameRe.exec(line)) !== null) { | ||
const [ , leftMatch, nameMatch ] = match; | ||
const index = match.index + leftMatch.length; | ||
const length = nameMatch.length; | ||
if ( | ||
!withinAnyRange(exclusions, lineIndex, index, length) && | ||
!names.includes(nameMatch) | ||
) { | ||
for (const token of contentTokens) { | ||
let match = null; | ||
while ((match = nameRe.exec(token.text)) !== null) { | ||
const [ , leftMatch, nameMatch ] = match; | ||
const index = token.startColumn - 1 + match.index + leftMatch.length; | ||
const length = nameMatch.length; | ||
const lineIndex = token.startLine - 1; | ||
if ( | ||
!withinAnyRange(exclusions, lineIndex, index, length) && | ||
!names.includes(nameMatch) | ||
) { | ||
let urlRanges = []; | ||
if (!autoLinked.has(token)) { | ||
urlRanges = filterByTypes( | ||
parse(token.text), | ||
[ "literalAutolink" ] | ||
).map( | ||
(t) => [ | ||
lineIndex, | ||
token.startColumn - 1 + t.startColumn - 1, | ||
t.endColumn - t.startColumn | ||
] | ||
); | ||
exclusions.push(...urlRanges); | ||
autoLinked.add(token); | ||
} | ||
if (!withinAnyRange(urlRanges, lineIndex, index, length)) { | ||
const column = index + 1; | ||
addErrorDetailIf( | ||
onError, | ||
lineIndex + 1, | ||
token.startLine, | ||
name, | ||
@@ -78,5 +92,5 @@ nameMatch, | ||
null, | ||
[ index + 1, length ], | ||
[ column, length ], | ||
{ | ||
"editColumn": index + 1, | ||
"editColumn": column, | ||
"deleteCount": length, | ||
@@ -87,8 +101,8 @@ "insertText": name | ||
} | ||
exclusions.push([ lineIndex, index, length ]); | ||
} | ||
exclusions.push([ lineIndex, index, length ]); | ||
} | ||
}); | ||
} | ||
} | ||
} | ||
}; |
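At its core, MD044 scans text for case-variant matches of each configured proper name and reports the correctly-cased replacement. A minimal sketch of that matching step (the word-boundary pattern is a simplification of the rule's real regular expression, and real names would need regex escaping as in `escapeForRegExp`):

```javascript
// Find case-insensitive matches of a proper name that differ from the
// configured capitalization, reporting 1-based columns.
const findNameIssues = (line, name) => {
  const issues = [];
  const nameRe = new RegExp(`\\b${name}\\b`, "gi");
  let match = null;
  while ((match = nameRe.exec(line)) !== null) {
    if (match[0] !== name) {
      issues.push({
        "column": match.index + 1,
        "actual": match[0],
        "expected": name
      });
    }
  }
  return issues;
};

const issues = findNameIssues("Use javascript and JavaScript here", "JavaScript");
console.log(issues); // [ { column: 5, actual: "javascript", expected: "JavaScript" } ]
```

The exclusion ranges in the diff above exist precisely because such matches must be suppressed inside URLs, link destinations, and (optionally) code.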
@@ -18,3 +18,3 @@ // @ts-check | ||
let expectedStyle = String(params.config.style || "consistent"); | ||
const codeBlocksAndFences = params.tokens.filter( | ||
const codeBlocksAndFences = params.parsers.markdownit.tokens.filter( | ||
(token) => (token.type === "code_block") || (token.type === "fence") | ||
@@ -21,0 +21,0 @@ ); |
@@ -14,3 +14,5 @@ // @ts-check | ||
let expectedStyle = style; | ||
const fenceTokens = params.tokens.filter((token) => token.type === "fence"); | ||
const fenceTokens = params.parsers.markdownit.tokens.filter( | ||
(token) => token.type === "fence" | ||
); | ||
for (const fenceToken of fenceTokens) { | ||
@@ -17,0 +19,0 @@ const { lineNumber, markup } = fenceToken; |
{ | ||
"name": "markdownlint", | ||
"version": "0.27.0", | ||
"version": "0.28.0", | ||
"description": "A Node.js style checker and lint tool for Markdown/CommonMark files.", | ||
@@ -28,6 +28,7 @@ "type": "commonjs", | ||
"build-config-schema": "node schema/build-config-schema.js", | ||
"build-declaration": "tsc --allowJs --declaration --emitDeclarationOnly --module commonjs --resolveJsonModule --target es2015 lib/markdownlint.js && node scripts delete 'lib/{c,md,r}*.d.ts' 'helpers/*.d.ts'", | ||
"build-demo": "node scripts copy node_modules/markdown-it/dist/markdown-it.min.js demo/markdown-it.min.js && cd demo && webpack --no-stats", | ||
"build-declaration": "tsc --allowJs --declaration --emitDeclarationOnly --module commonjs --resolveJsonModule --target es2015 lib/markdownlint.js && node scripts delete 'lib/{c,md,r}*.d.ts' 'micromark/*.d.cts' 'helpers/*.d.{cts,ts}'", | ||
"build-demo": "node scripts copy node_modules/markdown-it/dist/markdown-it.min.js demo/markdown-it.min.js && node scripts copy node_modules/markdownlint-micromark/micromark-browser.js demo/micromark-browser.js && cd demo && webpack --no-stats", | ||
"build-docs": "node doc-build/build-rules.mjs", | ||
"build-example": "npm install --no-save --ignore-scripts grunt grunt-cli gulp through2", | ||
"build-micromark": "cd micromark && npm run build", | ||
"ci": "npm-run-all --continue-on-error --parallel lint serial-config-docs serial-declaration-demo test-cover && git diff --exit-code", | ||
@@ -39,2 +40,3 @@ "clone-test-repos-apache-airflow": "cd test-repos && git clone https://github.com/apache/airflow apache-airflow --depth 1 --no-tags --quiet", | ||
"clone-test-repos-mdn-content": "cd test-repos && git clone https://github.com/mdn/content mdn-content --depth 1 --no-tags --quiet", | ||
"clone-test-repos-mdn-translated-content": "cd test-repos && git clone https://github.com/mdn/translated-content mdn-translated-content --depth 1 --no-tags --quiet", | ||
"clone-test-repos-mkdocs-mkdocs": "cd test-repos && git clone https://github.com/mkdocs/mkdocs mkdocs-mkdocs --depth 1 --no-tags --quiet", | ||
@@ -46,3 +48,3 @@ "clone-test-repos-mochajs-mocha": "cd test-repos && git clone https://github.com/mochajs/mocha mochajs-mocha --depth 1 --no-tags --quiet", | ||
"clone-test-repos-webpack-webpack-js-org": "cd test-repos && git clone https://github.com/webpack/webpack.js.org webpack-webpack-js-org --depth 1 --no-tags --quiet", | ||
"clone-test-repos": "mkdir test-repos && cd test-repos && npm run clone-test-repos-apache-airflow && npm run clone-test-repos-dotnet-docs && npm run clone-test-repos-electron-electron && npm run clone-test-repos-eslint-eslint && npm run clone-test-repos-mdn-content && npm run clone-test-repos-mkdocs-mkdocs && npm run clone-test-repos-mochajs-mocha && npm run clone-test-repos-pi-hole-docs && npm run clone-test-repos-v8-v8-dev && npm run clone-test-repos-webhintio-hint && npm run clone-test-repos-webpack-webpack-js-org", | ||
"clone-test-repos": "mkdir test-repos && cd test-repos && npm run clone-test-repos-apache-airflow && npm run clone-test-repos-dotnet-docs && npm run clone-test-repos-electron-electron && npm run clone-test-repos-eslint-eslint && npm run clone-test-repos-mdn-content && npm run clone-test-repos-mdn-translated-content && npm run clone-test-repos-mkdocs-mkdocs && npm run clone-test-repos-mochajs-mocha && npm run clone-test-repos-pi-hole-docs && npm run clone-test-repos-v8-v8-dev && npm run clone-test-repos-webhintio-hint && npm run clone-test-repos-webpack-webpack-js-org", | ||
"declaration": "npm run build-declaration && npm run test-declaration", | ||
@@ -52,11 +54,11 @@ "example": "cd example && node standalone.js && grunt markdownlint --force && gulp markdownlint", | ||
"docker-npm-run-upgrade": "docker run --rm --tty --name npm-run-upgrade --volume $PWD:/home/workdir --workdir /home/workdir --user node node:16 npm run upgrade", | ||
"lint": "eslint --ext .js,.mjs --max-warnings 0 .", | ||
"lint-test-repos": "ava --timeout=5m test/markdownlint-test-repos.js", | ||
"lint": "eslint --ext .js,.cjs,.mjs --max-warnings 0 .", | ||
"lint-test-repos": "ava --timeout=10m test/markdownlint-test-repos.js", | ||
"serial-config-docs": "npm run build-config && npm run build-docs", | ||
"serial-declaration-demo": "npm run build-declaration && npm-run-all --continue-on-error --parallel build-demo test-declaration", | ||
"test": "ava test/markdownlint-test.js test/markdownlint-test-config.js test/markdownlint-test-custom-rules.js test/markdownlint-test-helpers.js test/markdownlint-test-result-object.js test/markdownlint-test-scenarios.js", | ||
"test-cover": "c8 --check-coverage --branches 100 --functions 100 --lines 100 --statements 100 npm test", | ||
"test": "ava test/markdownlint-test.js test/markdownlint-test-config.js test/markdownlint-test-custom-rules.js test/markdownlint-test-helpers.js test/markdownlint-test-micromark.mjs test/markdownlint-test-result-object.js test/markdownlint-test-scenarios.js", | ||
"test-cover": "c8 --check-coverage --branches 100 --functions 100 --lines 100 --statements 100 --exclude 'test/**' --exclude 'micromark/**' npm test", | ||
"test-declaration": "cd example/typescript && tsc && node type-check.js", | ||
"test-extra": "ava --timeout=5m test/markdownlint-test-extra-parse.js test/markdownlint-test-extra-type.js", | ||
"update-snapshots": "ava --update-snapshots test/markdownlint-test-scenarios.js", | ||
"update-snapshots": "ava --update-snapshots test/markdownlint-test-micromark.mjs test/markdownlint-test-scenarios.js", | ||
"upgrade": "npx --yes npm-check-updates --upgrade" | ||
@@ -68,16 +70,19 @@ }, | ||
"dependencies": { | ||
"markdown-it": "13.0.1" | ||
"markdown-it": "13.0.1", | ||
"markdownlint-micromark": "0.1.2" | ||
}, | ||
"devDependencies": { | ||
"ava": "5.1.0", | ||
"c8": "7.12.0", | ||
"eslint": "8.30.0", | ||
"ava": "5.2.0", | ||
"babel-loader": "9.1.2", | ||
"@babel/core": "7.21.3", | ||
"@babel/preset-env": "7.20.2", | ||
"c8": "7.13.0", | ||
"eslint": "8.36.0", | ||
"eslint-plugin-es": "4.1.0", | ||
"eslint-plugin-jsdoc": "39.6.4", | ||
"eslint-plugin-n": "15.6.0", | ||
"eslint-plugin-regexp": "1.11.0", | ||
"eslint-plugin-unicorn": "45.0.2", | ||
"eslint-plugin-jsdoc": "40.1.0", | ||
"eslint-plugin-n": "15.6.1", | ||
"eslint-plugin-regexp": "1.13.0", | ||
"eslint-plugin-unicorn": "46.0.0", | ||
"globby": "13.1.3", | ||
"js-yaml": "4.1.0", | ||
"markdown-it-footnote": "3.0.3", | ||
"markdown-it-for-inline": "0.1.1", | ||
@@ -87,12 +92,12 @@ "markdown-it-sub": "1.0.0", | ||
"markdown-it-texmath": "1.0.0", | ||
"markdownlint-rule-helpers": "0.17.2", | ||
"markdownlint-rule-helpers": "0.18.0", | ||
"npm-run-all": "4.1.5", | ||
"strip-json-comments": "5.0.0", | ||
"terser-webpack-plugin": "5.3.6", | ||
"terser-webpack-plugin": "5.3.7", | ||
"toml": "3.0.0", | ||
"ts-loader": "9.4.2", | ||
"tv4": "1.3.0", | ||
"typescript": "4.9.4", | ||
"webpack": "5.75.0", | ||
"webpack-cli": "5.0.1" | ||
"typescript": "5.0.2", | ||
"webpack": "5.76.3", | ||
"webpack-cli": "5.0.1", | ||
"yaml": "2.2.1" | ||
}, | ||
@@ -99,0 +104,0 @@ "keywords": [ |
README.md
@@ -1023,125 +1023,3 @@ # markdownlint | ||
- 0.0.1 - Initial release, includes tests MD001-MD032. | ||
- 0.0.2 - Improve documentation, tests, and code. | ||
- 0.0.3 - Add synchronous API, improve documentation and code. | ||
- 0.0.4 - Add tests MD033-MD040, update dependencies. | ||
- 0.0.5 - Add `strings` option to enable file-less scenarios, add in-browser | ||
demo. | ||
- 0.0.6 - Improve performance, simplify in-browser, update dependencies. | ||
- 0.0.7 - Add MD041, improve MD003, ignore front matter, update dependencies. | ||
- 0.0.8 - Support disabling/enabling rules inline, improve code fence, | ||
dependencies. | ||
- 0.1.0 - Add aliases, exceptions for MD033, exclusions for MD013, dependencies. | ||
- 0.1.1 - Fix bug handling HTML in tables, reference markdownlint-cli. | ||
- 0.2.0 - Add MD042/MD043, enhance MD002/MD003/MD004/MD007/MD011/MD025/MD041, | ||
dependencies. | ||
- 0.3.0 - More detailed error reporting with `resultVersion`, enhance | ||
MD010/MD012/MD036, fixes for MD027/MD029/MD030, include JSON schema, | ||
dependencies. | ||
- 0.3.1 - Fix regressions in MD032/MD038, update dependencies. | ||
- 0.4.0 - Add MD044, enhance MD013/MD032/MD041/MD042/MD043, fix for MD038, | ||
dependencies. | ||
- 0.4.1 - Fixes for MD038/front matter, improvements to MD044, update | ||
dependencies. | ||
- 0.5.0 - Add shareable configuration, `noInlineConfig` option, README links, | ||
fix MD030, improve MD009/MD041, update dependencies. | ||
- 0.6.0 - `resultVersion` defaults to 1 (breaking change), ignore HTML comments, | ||
TOML front matter, fixes for MD044, update dependencies. | ||
- 0.6.1 - Update `markdown-it` versioning, exclude demo/test from publishing. | ||
- 0.6.2 - Improve MD013/MD027/MD034/MD037/MD038/MD041/MD044, update | ||
dependencies. | ||
- 0.6.3 - Improve highlighting for MD020. | ||
- 0.6.4 - Improve MD029/MD042, update dependencies. | ||
- 0.7.0 - `resultVersion` defaults to 2 (breaking change), add MD045, improve | ||
MD029, remove trimLeft/trimRight, split rules, refactor, update | ||
dependencies. | ||
- 0.8.0 - Add support for using and authoring custom rules, improve | ||
MD004/MD007/MD013, add `engines` to `package.json`, refactor, update | ||
dependencies. | ||
- 0.8.1 - Update item loop to be iterative, improve MD014, update | ||
dependencies. | ||
- 0.9.0 - Remove support for end-of-life Node versions 0.10/0.12/4, change | ||
"header" to "heading" per spec (non-breaking), improve | ||
MD003/MD009/MD041, handle uncommon line-break characters, refactor for | ||
ES6, update dependencies. | ||
- 0.10.0 - Add support for non-JSON configuration files, pass file/string name | ||
to custom rules, update dependencies. | ||
- 0.11.0 - Improve MD005/MD024/MD029/MD038, improve custom rule example, add | ||
CONTRIBUTING.md, update dependencies. | ||
- 0.12.0 - Add `information` link for custom rules, `markdownItPlugins` for | ||
extensibility, improve MD023/MD032/MD038, update dependencies. | ||
- 0.13.0 - Improve MD013/MD022/MD025/MD029/MD031/MD032/MD037/MD041/, deprecate | ||
MD002, improve pandoc YAML support, update dependencies. | ||
- 0.14.0 - Remove support for end-of-life Node version 6, introduce | ||
`markdownlint-rule-helpers`, add MD046/MD047, improve | ||
MD033/MD034/MD039, improve custom rule validation and in-browser | ||
demo, update dependencies. | ||
- 0.14.1 - Improve MD033. | ||
- 0.14.2 - Improve MD047, add `handleRuleFailures` option. | ||
- 0.15.0 - Add `markdownlint-capture`/`markdownlint-restore` inline comments, | ||
improve MD009/MD013/MD026/MD033/MD036, update dependencies. | ||
- 0.16.0 - Add custom rule sample for linting code, improve | ||
MD026/MD031/MD033/MD038, update dependencies. | ||
- 0.17.0 - Add `resultVersion` 3 to support fix information for default and | ||
custom rules, add fix information for 24 rules, update newline | ||
handling to match latest CommonMark specification, improve | ||
MD014/MD037/MD039, update dependencies. | ||
- 0.17.1 - Fix handling of front matter by fix information. | ||
- 0.17.2 - Improve MD020/MD033/MD044. | ||
- 0.18.0 - Add MD048/code-fence-style, add fix information for MD007/ul-indent, | ||
add `markdownlint-disable-file`/`markdownlint-enable-file` inline | ||
comments, add type declaration file (.d.ts) for TypeScript | ||
dependents, update schema, improve MD006/MD007/MD009/MD013/MD030, | ||
update dependencies. | ||
- 0.19.0 - Remove support for end-of-life Node version 8, add fix information | ||
for MD005/list-indent, improve MD007/MD013/MD014, deprecate | ||
MD006/ul-start-left, add rationale for every rule, update test runner | ||
and code coverage, add more JSDoc comments, update dependencies. | ||
- 0.20.0 - Add `markdownlint-configure-file` inline comment, reimplement MD037, | ||
improve MD005/MD007/MD013/MD018/MD029/MD031/MD034/MD038/MD039, | ||
improve HTML comment handling, update dependencies. | ||
- 0.20.1 - Fix regression in MD037. | ||
- 0.20.2 - Fix regression in MD037, improve MD038. | ||
- 0.20.3 - Fix regression in MD037, improve MD044, add automatic regression | ||
testing. | ||
- 0.20.4 - Fix regression in MD037, improve MD034/MD044, improve | ||
documentation. | ||
- 0.21.0 - Lint concurrently for better performance (async only), add | ||
Promise-based APIs, update TypeScript declaration file, hide | ||
`toString` on `LintResults`, add ability to fix in browser demo, | ||
allow custom rules in `.markdownlint.json` schema, improve | ||
MD042/MD044, improve documentation, update dependencies. | ||
- 0.21.1 - Improve MD011/MD031, export `getVersion` API. | ||
- 0.22.0 - Allow `extends` in config to reference installed packages by name, | ||
add `markdownlint-disable-next-line` inline comment, support JSON | ||
front matter, improve MD009/MD026/MD028/MD043, update dependencies | ||
(including `markdown-it` to v12). | ||
- 0.23.0 - Add comprehensive example `.markdownlint.jsonc`/`.markdownlint.yaml` | ||
files, add fix information for MD004/ul-style, improve | ||
MD018/MD019/MD020/MD021/MD037/MD041, improve HTML comment handling, | ||
update test runner and test suite, update dependencies. | ||
- 0.23.1 - Work around lack of webpack support for dynamic calls to | ||
`require`(`.resolve`). | ||
- 0.24.0 - Remove support for end-of-life Node version 10, add support for | ||
custom file system module, improve MD010/MD011/MD037/MD043/MD044, | ||
improve TypeScript declaration file and JSON schema, update | ||
dependencies. | ||
- 0.25.0 - Add MD049/MD050 for consistent emphasis/strong style (both | ||
auto-fixable), improve MD007/MD010/MD032/MD033/MD035/MD037/MD039, | ||
support asynchronous custom rules, improve performance, improve CI | ||
process, reduce dependencies, update dependencies. | ||
- 0.25.1 - Update dependencies for CVE-2022-21670. | ||
- 0.26.0 - Add MD051/MD052/MD053 for validating link fragments & reference | ||
links/images & link/image reference definitions (MD053 auto-fixable), | ||
improve MD010/MD031/MD035/MD039/MD042/MD044/MD049/MD050, add | ||
`markdownlint-disable-line` inline comment, support `~` paths in | ||
`readConfig/Sync`, add `configParsers` option, remove support for | ||
end-of-life Node version 12, default `resultVersion` to 3, update | ||
browser script to use ES2015, simplify JSON schema, address remaining | ||
CodeQL issues, improve performance, update dependencies. | ||
- 0.26.1 - Improve MD051. | ||
- 0.26.2 - Improve MD037/MD051/MD053. | ||
- 0.27.0 - Improve MD011/MD013/MD022/MD031/MD032/MD033/MD034/MD040/MD043/MD051/ | ||
MD053, generate/separate documentation, improve documentation, update | ||
dependencies. | ||
See [CHANGELOG.md](CHANGELOG.md). | ||
@@ -1148,0 +1026,0 @@ [npm-image]: https://img.shields.io/npm/v/markdownlint.svg |
License Policy Violation
License: This package is not allowed per your license policy. Review the package's license to ensure compliance.
Found 1 instance in 1 package
+ Added markdownlint-micromark@0.1.2
+ Added markdownlint-micromark@0.1.2 (transitive)