pygments (pypi package): comparing version 2.16.1 to 2.17.0
.coveragerc

Sorry, the diff of this file is not supported yet

doc
tests
TAGS
build
dist
htmlcov
venv
**/__pycache__
.*
*.pyo
.*.sw[op]
!/doc/pyodide/meta.yaml
tests/examplefiles/*/*.output linguist-generated
name: Pygments
on: [push, pull_request]
env:
FORCE_COLOR: 1
permissions:
contents: read # to fetch code (actions/checkout)
jobs:
build:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
matrix:
os: [windows-latest, ubuntu-latest]
python-version: ["3.7", "3.8", "3.9", "3.10", "3.11", "3.12"]
steps:
- name: Checkout repository
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
cache: 'pip'
- name: Install tox
run: pip install -r requirements.txt
- name: Test package
run: tox -- -W error
check:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: "3.x"
- name: Install tox
run: pip install -r requirements.txt
- name: Perform basic checks
run: tox -e check
if: runner.os == 'Linux'
check-mapfiles:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: "3.x"
- name: Install tox
run: pip install -r requirements.txt
- name: Regenerate mapfiles
run: tox -e mapfiles
- name: Fail if mapfiles changed
run: |
if git ls-files -m | grep mapping; then
echo 'Please run "tox -e mapfiles" and add the changes to a commit.'
exit 1
fi
lint:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: "3.x"
- name: Install tox
run: pip install -r requirements.txt
- name: Run regexlint
run: tox -e regexlint
name: Docs
on:
push:
branches:
- master
env:
FORCE_COLOR: 1
permissions: {}
jobs:
build:
permissions:
contents: write # to push pages branch (peaceiris/actions-gh-pages)
runs-on: ubuntu-latest
steps:
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: "3.x"
- name: Checkout Pygments
uses: actions/checkout@v4
- name: Install tox
run: pip install -r requirements.txt
- name: Sphinx build
run: |
tox -e web-doc -- dirhtml
touch doc/_build/dirhtml/.nojekyll
echo -e 'pygments.org\nwww.pygments.org' > doc/_build/dirhtml/CNAME
echo 'Automated deployment of docs for GitHub pages.' > doc/_build/dirhtml/README
- name: Deploy to repo
if: github.repository_owner == 'pygments'
uses: peaceiris/actions-gh-pages@v3
with:
deploy_key: ${{ secrets.ACTIONS_DEPLOY_KEY }}
external_repository: pygments/pygments.github.io
publish_branch: master
publish_dir: ./doc/_build/dirhtml
*.pyc
*.pyo
.*.sw[op]
/.pytest_cache/
/.idea/
/.project
/.tags
/.tox/
/.cache/
/TAGS
/build/*
/dist/*
/doc/_build
/.coverage
/htmlcov
/.vscode
venv/
.venv/
.DS_Store
"""
pygments.lexers.jsx
~~~~~~~~~~~~~~~~~~~
Lexers for JSX (React).
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
import re
from pygments.lexer import bygroups, default, include, inherit
from pygments.lexers.javascript import JavascriptLexer
from pygments.token import Name, Operator, Punctuation, String, Text, \
Whitespace
__all__ = ['JsxLexer']
class JsxLexer(JavascriptLexer):
"""For JavaScript Syntax Extension (JSX).
.. versionadded:: 2.17
"""
name = "JSX"
aliases = ["jsx", "react"]
filenames = ["*.jsx", "*.react"]
mimetypes = ["text/jsx", "text/typescript-jsx"]
url = "https://facebook.github.io/jsx/"
flags = re.MULTILINE | re.DOTALL
# Use same tokens as `JavascriptLexer`, but with tags and attributes support
tokens = {
"root": [
include("jsx"),
inherit,
],
"jsx": [
(r"</?>", Punctuation), # JSXFragment <>|</>
(r"(<)(\w+)(\.?)", bygroups(Punctuation, Name.Tag, Punctuation), "tag"),
(
r"(</)(\w+)(>)",
bygroups(Punctuation, Name.Tag, Punctuation),
),
(
r"(</)(\w+)",
bygroups(Punctuation, Name.Tag),
"fragment",
), # Same for React.Context
],
"tag": [
(r"\s+", Whitespace),
(r"([\w-]+)(\s*)(=)(\s*)", bygroups(Name.Attribute, Whitespace, Operator, Whitespace), "attr"),
(r"[{}]+", Punctuation),
(r"[\w\.]+", Name.Attribute),
(r"(/?)(\s*)(>)", bygroups(Punctuation, Text, Punctuation), "#pop"),
],
"fragment": [
(r"(.)(\w+)", bygroups(Punctuation, Name.Attribute)),
(r"(>)", bygroups(Punctuation), "#pop"),
],
"attr": [
(r"\{", Punctuation, "expression"),
(r'".*?"', String, "#pop"),
(r"'.*?'", String, "#pop"),
default("#pop"),
],
"expression": [
(r"\{", Punctuation, "#push"),
(r"\}", Punctuation, "#pop"),
include("root"),
],
}
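As a quick standalone check (outside the lexer's state machine), the tag-opening rule from the `jsx` state can be exercised with plain `re`; the sample inputs are illustrative only:

```python
import re

# Tag-opening rule from JsxLexer's "jsx" state: punctuation, tag name, and an
# optional dot that leads into member expressions such as React.Fragment.
tag_open = re.compile(r"(<)(\w+)(\.?)")

m = tag_open.match("<Button.Group>")
print(m.groups())  # ('<', 'Button', '.')
print(tag_open.match("<div>").groups())  # ('<', 'div', '')
```

Each captured group is routed to a token type by `bygroups(Punctuation, Name.Tag, Punctuation)` in the real rule.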
"""
pygments.lexers.kusto
~~~~~~~~~~~~~~~~~~~~~
Lexers for Kusto Query Language (KQL).
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
from pygments.lexer import RegexLexer, words
from pygments.token import (Comment, Keyword, Name, Number, Punctuation,
String, Whitespace)
__all__ = ["KustoLexer"]
# Although these all seem to be keywords
# https://github.com/microsoft/Kusto-Query-Language/blob/master/src/Kusto.Language/Syntax/SyntaxFacts.cs
# it appears that only the ones with tags here
# https://github.com/microsoft/Kusto-Query-Language/blob/master/src/Kusto.Language/Parser/QueryGrammar.cs
# are highlighted in the Azure portal log query editor.
KUSTO_KEYWORDS = [
'and', 'as', 'between', 'by', 'consume', 'contains', 'containscs', 'count',
'distinct', 'evaluate', 'extend', 'facet', 'filter', 'find', 'fork',
'getschema', 'has', 'invoke', 'join', 'limit', 'lookup', 'make-series',
'matches regex', 'mv-apply', 'mv-expand', 'notcontains', 'notcontainscs',
'!contains', '!has', '!startswith', 'on', 'or', 'order', 'parse', 'parse-where',
'parse-kv', 'partition', 'print', 'project', 'project-away', 'project-keep',
'project-rename', 'project-reorder', 'range', 'reduce', 'regex', 'render',
'sample', 'sample-distinct', 'scan', 'search', 'serialize', 'sort', 'startswith',
'summarize', 'take', 'top', 'top-hitters', 'top-nested', 'typeof', 'union',
'where', 'bool', 'date', 'datetime', 'int', 'long', 'real', 'string', 'time'
]
# From
# https://github.com/microsoft/Kusto-Query-Language/blob/master/src/Kusto.Language/Syntax/SyntaxFacts.cs
KUSTO_PUNCTUATION = [
"(", ")", "[", "]", "{", "}", "|", "<|", "+", "-", "*", "/",
"%", "..", "!", "<", "<=", ">", ">=", "=", "==", "!=", "<>",
":", ";", ",", "=~", "!~", "?", "=>",
]
class KustoLexer(RegexLexer):
"""For Kusto Query Language source code.
.. versionadded:: 2.17
"""
name = "Kusto"
aliases = ["kql", "kusto"]
filenames = ["*.kql", "*.kusto", "*.csl"]
url = "https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query"
tokens = {
"root": [
(r"\s+", Whitespace),
(words(KUSTO_KEYWORDS, suffix=r"\b"), Keyword),
(r"//.*", Comment),
(words(KUSTO_PUNCTUATION), Punctuation),
(r"[^\W\d]\w*", Name),
# Numbers can take the form 1, .1, 1., 1.1, 1.1111, etc.
(r"\d+[.]\d*|[.]\d+", Number.Float),
(r"\d+", Number.Integer),
(r"'", String, "single_string"),
(r'"', String, "double_string"),
(r"@'", String, "single_verbatim"),
(r'@"', String, "double_verbatim"),
(r"```", String, "multi_string"),
],
"single_string": [
(r"'", String, "#pop"),
(r"\\.", String.Escape),
(r"[^'\\]+", String),
],
"double_string": [
(r'"', String, "#pop"),
(r"\\.", String.Escape),
(r'[^"\\]+', String),
],
"single_verbatim": [
(r"'", String, "#pop"),
(r"[^']+", String),
],
"double_verbatim": [
(r'"', String, "#pop"),
(r'[^"]+', String),
],
"multi_string": [
(r"[^`]+", String),
(r"```", String, "#pop"),
(r"`", String),
],
}
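The comment above the number rules can be verified in isolation: because the Float rule is tried before the Integer rule, inputs like `1.` and `.1` lex as floats. A minimal sketch with illustrative inputs:

```python
import re

# Number rules from KustoLexer's "root" state, tried in the lexer's order.
float_re = re.compile(r"\d+[.]\d*|[.]\d+")
int_re = re.compile(r"\d+")

def classify(text):
    if float_re.fullmatch(text):
        return "Number.Float"
    if int_re.fullmatch(text):
        return "Number.Integer"
    return "error"

for sample in ("1", ".1", "1.", "1.25"):
    print(sample, classify(sample))  # 1 -> Integer, the rest -> Float
```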
"""
pygments.lexers.ldap
~~~~~~~~~~~~~~~~~~~~
Pygments lexers for LDAP.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
import re
from pygments.lexer import RegexLexer, bygroups, default
from pygments.token import Operator, Comment, Keyword, Literal, Name, String, \
Number, Punctuation, Whitespace, Escape
__all__ = ['LdifLexer', 'LdaprcLexer']
class LdifLexer(RegexLexer):
"""
Lexer for LDIF
.. versionadded:: 2.17
"""
name = 'LDIF'
aliases = ['ldif']
filenames = ['*.ldif']
mimetypes = ["text/x-ldif"]
url = "https://datatracker.ietf.org/doc/html/rfc2849"
tokens = {
'root': [
(r'\s*\n', Whitespace),
(r'(-)(\n)', bygroups(Punctuation, Whitespace)),
(r'(#.*)(\n)', bygroups(Comment.Single, Whitespace)),
(r'(version)(:)([ \t]*)(.*)([ \t]*\n)', bygroups(Keyword,
Punctuation, Whitespace, Number.Integer, Whitespace)),
(r'(control)(:)([ \t]*)([\.0-9]+)([ \t]+)((?:true|false)?)([ \t]*)',
bygroups(Keyword, Punctuation, Whitespace, Name.Other, Whitespace, Keyword, Whitespace), "after-control"),
(r'(deleteoldrdn)(:)([ \n]*)([0-1]+)([ \t]*\n)',
bygroups(Keyword, Punctuation, Whitespace, Number, Whitespace)),
(r'(add|delete|replace)(::?)(\s*)(.*)([ \t]*\n)', bygroups(
Keyword, Punctuation, Whitespace, Name.Attribute, Whitespace)),
(r'(changetype)(:)([ \t]*)([a-z]*)([ \t]*\n)',
bygroups(Keyword, Punctuation, Whitespace, Keyword, Whitespace)),
(r'(dn|newrdn)(::)', bygroups(Keyword, Punctuation), "base64-dn"),
(r'(dn|newrdn)(:)', bygroups(Keyword, Punctuation), "dn"),
(r'(objectclass)(:)([ \t]*)([^ \t\n]*)([ \t]*\n)', bygroups(
Keyword, Punctuation, Whitespace, Name.Class, Whitespace)),
(r'([a-zA-Z]*|[0-9][0-9\.]*[0-9])(;)',
bygroups(Name.Attribute, Punctuation), "property"),
(r'([a-zA-Z]*|[0-9][0-9\.]*[0-9])(:<)',
bygroups(Name.Attribute, Punctuation), "url"),
(r'([a-zA-Z]*|[0-9][0-9\.]*[0-9])(::?)',
bygroups(Name.Attribute, Punctuation), "value"),
],
"after-control": [
(r":<", Punctuation, ("#pop", "url")),
(r"::?", Punctuation, ("#pop", "value")),
default("#pop"),
],
'property': [
(r'([-a-zA-Z0-9]*)(;)', bygroups(Name.Property, Punctuation)),
(r'([-a-zA-Z0-9]*)(:<)',
bygroups(Name.Property, Punctuation), ("#pop", "url")),
(r'([-a-zA-Z0-9]*)(::?)',
bygroups(Name.Property, Punctuation), ("#pop", "value")),
],
'value': [
(r'(\s*)([^\n]+\S)(\n )',
bygroups(Whitespace, String, Whitespace)),
(r'(\s*)([^\n]+\S)(\n)',
bygroups(Whitespace, String, Whitespace), "#pop"),
],
'url': [
(r'([ \t]*)(\S*)([ \t]*\n )',
bygroups(Whitespace, Comment.PreprocFile, Whitespace)),
(r'([ \t]*)(\S*)([ \t]*\n)', bygroups(Whitespace,
Comment.PreprocFile, Whitespace), "#pop"),
],
"dn": [
(r'([ \t]*)([-a-zA-Z0-9\.]+)(=)', bygroups(Whitespace,
Name.Attribute, Operator), ("#pop", "dn-value")),
],
"dn-value": [
(r'\\[^\n]', Escape),
(r',', Punctuation, ("#pop", "dn")),
(r'\+', Operator, ("#pop", "dn")),
(r'[^,\+\n]+', String),
(r'\n ', Whitespace),
(r'\n', Whitespace, "#pop"),
],
"base64-dn": [
(r'([ \t]*)([^ \t\n][^ \t\n]*[^\n])([ \t]*\n )',
bygroups(Whitespace, Name, Whitespace)),
(r'([ \t]*)([^ \t\n][^ \t\n]*[^\n])([ \t]*\n)',
bygroups(Whitespace, Name, Whitespace), "#pop"),
]
}
class LdaprcLexer(RegexLexer):
"""
Lexer for OpenLDAP configuration files.
.. versionadded:: 2.17
"""
name = 'LDAP configuration file'
aliases = ['ldapconf', 'ldaprc']
filenames = ['.ldaprc', 'ldaprc', 'ldap.conf']
mimetypes = ["text/x-ldapconf"]
url = 'https://www.openldap.org/software//man.cgi?query=ldap.conf&sektion=5&apropos=0&manpath=OpenLDAP+2.4-Release'
_sasl_keywords = r'SASL_(?:MECH|REALM|AUTHCID|AUTHZID|CBINDING)'
_tls_keywords = r'TLS_(?:CACERT|CACERTDIR|CERT|ECNAME|KEY|CIPHER_SUITE|PROTOCOL_MIN|RANDFILE|CRLFILE)'
_literal_keywords = rf'(?:URI|SOCKET_BIND_ADDRESSES|{_sasl_keywords}|{_tls_keywords})'
_boolean_keywords = r'GSSAPI_(?:ALLOW_REMOTE_PRINCIPAL|ENCRYPT|SIGN)|REFERRALS|SASL_NOCANON'
_integer_keywords = r'KEEPALIVE_(?:IDLE|PROBES|INTERVAL)|NETWORK_TIMEOUT|PORT|SIZELIMIT|TIMELIMIT|TIMEOUT'
_secprops = r'none|noanonymous|noplain|noactive|nodict|forwardsec|passcred|(?:minssf|maxssf|maxbufsize)=\d+'
flags = re.IGNORECASE | re.MULTILINE
tokens = {
'root': [
(r'#.*', Comment.Single),
(r'\s+', Whitespace),
(rf'({_boolean_keywords})(\s+)(on|true|yes|off|false|no)$',
bygroups(Keyword, Whitespace, Keyword.Constant)),
(rf'({_integer_keywords})(\s+)(\d+)',
bygroups(Keyword, Whitespace, Number.Integer)),
(r'(VERSION)(\s+)(2|3)', bygroups(Keyword, Whitespace, Number.Integer)),
# Constants
(r'(DEREF)(\s+)(never|searching|finding|always)',
bygroups(Keyword, Whitespace, Keyword.Constant)),
(rf'(SASL_SECPROPS)(\s+)((?:{_secprops})(?:,{_secprops})*)',
bygroups(Keyword, Whitespace, Keyword.Constant)),
(r'(SASL_CBINDING)(\s+)(none|tls-unique|tls-endpoint)',
bygroups(Keyword, Whitespace, Keyword.Constant)),
(r'(TLS_REQ(?:CERT|SAN))(\s+)(allow|demand|hard|never|try)',
bygroups(Keyword, Whitespace, Keyword.Constant)),
(r'(TLS_CRLCHECK)(\s+)(none|peer|all)',
bygroups(Keyword, Whitespace, Keyword.Constant)),
# Literals
(r'(BASE|BINDDN)(\s+)(\S+)$',
bygroups(Keyword, Whitespace, Literal)),
# Accepts hostname with or without port.
(r'(HOST)(\s+)([a-z0-9]+)((?::(\d+))?)',
bygroups(Keyword, Whitespace, Literal, Number.Integer)),
(rf'({_literal_keywords})(\s+)(\S+)$',
bygroups(Keyword, Whitespace, Literal)),
],
}
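The `value` and `url` states above both treat a newline followed by a space as a folded continuation, mirroring RFC 2849 line folding. A minimal sketch of that unfolding (the helper name is ours, not Pygments'):

```python
# RFC 2849 folds long LDIF lines by starting the continuation with a single
# space; unfolding deletes each "\n " pair. This is why the lexer's "value"
# state keeps consuming on '\n ' and only pops on a bare '\n'.
def unfold(ldif_text: str) -> str:
    return ldif_text.replace("\n ", "")

folded = "dn: cn=Barbara Jensen,\n ou=Product Development,dc=example,dc=com"
print(unfold(folded))
# dn: cn=Barbara Jensen,ou=Product Development,dc=example,dc=com
```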
"""
pygments.lexers.lean
~~~~~~~~~~~~~~~~~~~~
Lexers for the Lean theorem prover.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
import re
from pygments.lexer import RegexLexer, default, words, include
from pygments.token import Text, Comment, Operator, Keyword, Name, String, \
Number, Punctuation, Generic, Whitespace
__all__ = ['Lean3Lexer']
class Lean3Lexer(RegexLexer):
"""
For the Lean 3 theorem prover.
.. versionadded:: 2.0
"""
name = 'Lean'
url = 'https://leanprover-community.github.io/lean3'
aliases = ['lean', 'lean3']
filenames = ['*.lean']
mimetypes = ['text/x-lean', 'text/x-lean3']
tokens = {
'expression': [
(r'\s+', Text),
(r'/--', String.Doc, 'docstring'),
(r'/-', Comment, 'comment'),
(r'--.*?$', Comment.Single),
(words((
'forall', 'fun', 'Pi', 'from', 'have', 'show', 'assume', 'suffices',
'let', 'if', 'else', 'then', 'in', 'with', 'calc', 'match',
'do'
), prefix=r'\b', suffix=r'\b'), Keyword),
(words(('sorry', 'admit'), prefix=r'\b', suffix=r'\b'), Generic.Error),
(words(('Sort', 'Prop', 'Type'), prefix=r'\b', suffix=r'\b'), Keyword.Type),
(words((
'(', ')', ':', '{', '}', '[', ']', '⟨', '⟩', '‹', '›', '⦃', '⦄', ':=', ',',
)), Operator),
(r'[A-Za-z_\u03b1-\u03ba\u03bc-\u03fb\u1f00-\u1ffe\u2100-\u214f]'
r'[.A-Za-z_\'\u03b1-\u03ba\u03bc-\u03fb\u1f00-\u1ffe\u2070-\u2079'
r'\u207f-\u2089\u2090-\u209c\u2100-\u214f0-9]*', Name),
(r'0x[A-Za-z0-9]+', Number.Integer),
(r'0b[01]+', Number.Integer),
(r'\d+', Number.Integer),
(r'"', String.Double, 'string'),
(r"'(?:(\\[\\\"'nt])|(\\x[0-9a-fA-F]{2})|(\\u[0-9a-fA-F]{4})|.)'", String.Char),
(r'[~?][a-z][\w\']*:', Name.Variable),
(r'\S', Name.Builtin.Pseudo),
],
'root': [
(words((
'import', 'renaming', 'hiding',
'namespace',
'local',
'private', 'protected', 'section',
'include', 'omit', 'export',
'open',
'attribute',
), prefix=r'\b', suffix=r'\b'), Keyword.Namespace),
(words((
'lemma', 'theorem', 'def', 'definition', 'example',
'axiom', 'axioms', 'constant', 'constants',
'universe', 'universes',
'inductive', 'coinductive', 'structure', 'extends',
'class', 'instance',
'abbreviation',
'noncomputable theory',
'noncomputable', 'mutual', 'meta',
'attribute',
'parameter', 'parameters',
'variable', 'variables',
'reserve', 'precedence',
'postfix', 'prefix', 'notation', 'infix', 'infixl', 'infixr',
'begin', 'by', 'end',
'set_option',
'run_cmd',
), prefix=r'\b', suffix=r'\b'), Keyword.Declaration),
(r'@\[', Keyword.Declaration, 'attribute'),
(words((
'#eval', '#check', '#reduce', '#exit',
'#print', '#help',
), suffix=r'\b'), Keyword),
include('expression')
],
'attribute': [
(r'\]', Keyword.Declaration, '#pop'),
include('expression'),
],
'comment': [
(r'[^/-]', Comment.Multiline),
(r'/-', Comment.Multiline, '#push'),
(r'-/', Comment.Multiline, '#pop'),
(r'[/-]', Comment.Multiline)
],
'docstring': [
(r'[^/-]', String.Doc),
(r'-/', String.Doc, '#pop'),
(r'[/-]', String.Doc)
],
'string': [
(r'[^\\"]+', String.Double),
(r"(?:(\\[\\\"'nt])|(\\x[0-9a-fA-F]{2})|(\\u[0-9a-fA-F]{4}))", String.Escape),
('"', String.Double, '#pop'),
],
}
LeanLexer = Lean3Lexer
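The character-literal rule in the `expression` state accepts a single plain character or one of the listed escape forms; it can be probed directly (the inputs are illustrative):

```python
import re

# Character-literal rule from Lean3Lexer's "expression" state: a quoted
# backslash escape, hex escape, unicode escape, or any single character.
char_re = re.compile(
    r"'(?:(\\[\\\"'nt])|(\\x[0-9a-fA-F]{2})|(\\u[0-9a-fA-F]{4})|.)'")

for sample in ("'a'", r"'\n'", r"'\u03b1'", r"'\q'"):
    print(sample, bool(char_re.fullmatch(sample)))
# 'a', '\n' and '\u03b1' match; '\q' is not a recognized escape
```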
"""
pygments.lexers.prql
~~~~~~~~~~~~~~~~~~~~
Lexer for the PRQL query language.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
from pygments.lexer import RegexLexer, combined, words, include, bygroups
from pygments.token import Comment, Literal, Keyword, Name, Number, Operator, \
Punctuation, String, Text, Whitespace
__all__ = ['PrqlLexer']
class PrqlLexer(RegexLexer):
"""
For PRQL source code.
.. versionadded:: 2.17
grammar: https://github.com/PRQL/prql/tree/main/grammars
"""
name = 'PRQL'
url = 'https://prql-lang.org/'
aliases = ['prql']
filenames = ['*.prql']
mimetypes = ['application/prql', 'application/x-prql']
builtinTypes = words((
"bool",
"int",
"int8", "int16", "int32", "int64", "int128",
"float",
"text",
"set"), suffix=r'\b')
def innerstring_rules(ttype):
return [
# the new style '{}'.format(...) string formatting
(r'\{'
r'((\w+)((\.\w+)|(\[[^\]]+\]))*)?' # field name
r'(\:(.?[<>=\^])?[-+ ]?#?0?(\d+)?,?(\.\d+)?[E-GXb-gnosx%]?)?'
r'\}', String.Interpol),
(r'[^\\\'"%{\n]+', ttype),
(r'[\'"\\]', ttype),
(r'%|(\{{1,2})', ttype)
]
def fstring_rules(ttype):
return [
(r'\}', String.Interpol),
(r'\{', String.Interpol, 'expr-inside-fstring'),
(r'[^\\\'"{}\n]+', ttype),
(r'[\'"\\]', ttype),
]
tokens = {
'root': [
# Comments
(r'#!.*', String.Doc),
(r'#.*', Comment.Single),
# Whitespace
(r'\s+', Whitespace),
# Modules
(r'^(\s*)(module)(\s*)',
bygroups(Whitespace, Keyword.Namespace, Whitespace),
'imports'),
(builtinTypes, Keyword.Type),
# Main
(r'^prql ', Keyword.Reserved),
('let', Keyword.Declaration),
include('keywords'),
include('expr'),
# Transforms
(r'^[A-Za-z_][a-zA-Z0-9_]*', Keyword),
],
'expr': [
# non-raw f-strings
('(f)(""")', bygroups(String.Affix, String.Double),
combined('fstringescape', 'tdqf')),
("(f)(''')", bygroups(String.Affix, String.Single),
combined('fstringescape', 'tsqf')),
('(f)(")', bygroups(String.Affix, String.Double),
combined('fstringescape', 'dqf')),
("(f)(')", bygroups(String.Affix, String.Single),
combined('fstringescape', 'sqf')),
# non-raw s-strings
('(s)(""")', bygroups(String.Affix, String.Double),
combined('stringescape', 'tdqf')),
("(s)(''')", bygroups(String.Affix, String.Single),
combined('stringescape', 'tsqf')),
('(s)(")', bygroups(String.Affix, String.Double),
combined('stringescape', 'dqf')),
("(s)(')", bygroups(String.Affix, String.Single),
combined('stringescape', 'sqf')),
# raw strings
('(?i)(r)(""")',
bygroups(String.Affix, String.Double), 'tdqs'),
("(?i)(r)(''')",
bygroups(String.Affix, String.Single), 'tsqs'),
('(?i)(r)(")',
bygroups(String.Affix, String.Double), 'dqs'),
("(?i)(r)(')",
bygroups(String.Affix, String.Single), 'sqs'),
# non-raw strings
('"""', String.Double, combined('stringescape', 'tdqs')),
("'''", String.Single, combined('stringescape', 'tsqs')),
('"', String.Double, combined('stringescape', 'dqs')),
("'", String.Single, combined('stringescape', 'sqs')),
# Time and dates
(r'@\d{4}-\d{2}-\d{2}T\d{2}(:\d{2})?(:\d{2})?(\.\d{1,6})?(Z|[+-]\d{1,2}(:\d{1,2})?)?', Literal.Date),
(r'@\d{4}-\d{2}-\d{2}', Literal.Date),
(r'@\d{2}(:\d{2})?(:\d{2})?(\.\d{1,6})?(Z|[+-]\d{1,2}(:\d{1,2})?)?', Literal.Date),
(r'[^\S\n]+', Text),
include('numbers'),
(r'->|=>|==|!=|>=|<=|~=|&&|\|\||\?\?|\/\/', Operator),
(r'[-~+/*%=<>&^|.@]', Operator),
(r'[]{}:(),;[]', Punctuation),
include('functions'),
# Variable Names
(r'[A-Za-z_][a-zA-Z0-9_]*', Name.Variable),
],
'numbers': [
(r'(\d(?:_?\d)*\.(?:\d(?:_?\d)*)?|(?:\d(?:_?\d)*)?\.\d(?:_?\d)*)'
r'([eE][+-]?\d(?:_?\d)*)?', Number.Float),
(r'\d(?:_?\d)*[eE][+-]?\d(?:_?\d)*j?', Number.Float),
(r'0[oO](?:_?[0-7])+', Number.Oct),
(r'0[bB](?:_?[01])+', Number.Bin),
(r'0[xX](?:_?[a-fA-F0-9])+', Number.Hex),
(r'\d(?:_?\d)*', Number.Integer),
],
'fstringescape': [
include('stringescape'),
],
'bytesescape': [
(r'\\([\\bfnrt"\']|\n|x[a-fA-F0-9]{2}|[0-7]{1,3})', String.Escape)
],
'stringescape': [
(r'\\(N\{.*?\}|u\{[a-fA-F0-9]{1,6}\})', String.Escape),
include('bytesescape')
],
'fstrings-single': fstring_rules(String.Single),
'fstrings-double': fstring_rules(String.Double),
'strings-single': innerstring_rules(String.Single),
'strings-double': innerstring_rules(String.Double),
'dqf': [
(r'"', String.Double, '#pop'),
(r'\\\\|\\"|\\\n', String.Escape), # included here for raw strings
include('fstrings-double')
],
'sqf': [
(r"'", String.Single, '#pop'),
(r"\\\\|\\'|\\\n", String.Escape), # included here for raw strings
include('fstrings-single')
],
'dqs': [
(r'"', String.Double, '#pop'),
(r'\\\\|\\"|\\\n', String.Escape), # included here for raw strings
include('strings-double')
],
'sqs': [
(r"'", String.Single, '#pop'),
(r"\\\\|\\'|\\\n", String.Escape), # included here for raw strings
include('strings-single')
],
'tdqf': [
(r'"""', String.Double, '#pop'),
include('fstrings-double'),
(r'\n', String.Double)
],
'tsqf': [
(r"'''", String.Single, '#pop'),
include('fstrings-single'),
(r'\n', String.Single)
],
'tdqs': [
(r'"""', String.Double, '#pop'),
include('strings-double'),
(r'\n', String.Double)
],
'tsqs': [
(r"'''", String.Single, '#pop'),
include('strings-single'),
(r'\n', String.Single)
],
'expr-inside-fstring': [
(r'[{([]', Punctuation, 'expr-inside-fstring-inner'),
# without format specifier
(r'(=\s*)?' # debug (https://bugs.python.org/issue36817)
r'\}', String.Interpol, '#pop'),
# with format specifier
# we'll catch the remaining '}' in the outer scope
(r'(=\s*)?' # debug (https://bugs.python.org/issue36817)
r':', String.Interpol, '#pop'),
(r'\s+', Whitespace), # allow new lines
include('expr'),
],
'expr-inside-fstring-inner': [
(r'[{([]', Punctuation, 'expr-inside-fstring-inner'),
(r'[])}]', Punctuation, '#pop'),
(r'\s+', Whitespace), # allow new lines
include('expr'),
],
'keywords': [
(words((
'into', 'case', 'type', 'module', 'internal',
), suffix=r'\b'),
Keyword),
(words(('true', 'false', 'null'), suffix=r'\b'), Keyword.Constant),
],
'functions': [
(words((
"min", "max", "sum", "average", "stddev", "every", "any",
"concat_array", "count", "lag", "lead", "first", "last",
"rank", "rank_dense", "row_number", "round", "as", "in",
"tuple_every", "tuple_map", "tuple_zip", "_eq", "_is_null",
"from_text", "lower", "upper", "read_parquet", "read_csv"),
suffix=r'\b'),
Name.Function),
],
'comment': [
(r'-(?!\})', Comment.Multiline),
(r'\{-', Comment.Multiline, 'comment'),
(r'[^-}]', Comment.Multiline),
(r'-\}', Comment.Multiline, '#pop'),
],
'imports': [
(r'\w+(\.\w+)*', Name.Class, '#pop'),
],
}
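The date and timestamp literal rules in the `expr` state can be exercised on their own; sample literals below are illustrative:

```python
import re

# Date/timestamp literal rules from PrqlLexer's "expr" state: seconds,
# fractional seconds, and the timezone suffix are all optional.
date_re = re.compile(r'@\d{4}-\d{2}-\d{2}')
ts_re = re.compile(
    r'@\d{4}-\d{2}-\d{2}T\d{2}(:\d{2})?(:\d{2})?(\.\d{1,6})?'
    r'(Z|[+-]\d{1,2}(:\d{1,2})?)?')

print(bool(ts_re.fullmatch('@2023-11-14T12:30:00Z')))    # True
print(bool(date_re.fullmatch('@2023-11-14')))            # True
print(bool(ts_re.fullmatch('@2023-11-14T12:30+02:00')))  # True
```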
"""
pygments.lexers.vip
~~~~~~~~~~~~~~~~~~~
Lexers for Visual Prolog & Grammar files.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
import re
from pygments.lexer import RegexLexer, inherit, words, include
from pygments.token import Text, Comment, Operator, Keyword, Name, String, \
Number, Punctuation, Whitespace
__all__ = ['VisualPrologLexer', 'VisualPrologGrammarLexer']
class VisualPrologBaseLexer(RegexLexer):
minorendkw = ('try', 'foreach', 'if')
minorkwexp = ('and', 'catch', 'do', 'else', 'elseif', 'erroneous', 'externally', 'failure', 'finally', 'foreach', 'if', 'or', 'orelse', 'otherwise', 'then',
'try', 'div', 'mod', 'rem', 'quot')
dockw = ('short', 'detail', 'end', 'withdomain')
tokens = {
'root': [
(r'\s+', Whitespace),
(words(minorendkw, prefix=r'\bend\s+', suffix=r'\b'), Keyword.Minor),
(r'end', Keyword),
(words(minorkwexp, suffix=r'\b'), Keyword.Minor),
(r'0[xo][\da-fA-F_]+', Number),
(r'((\d[\d_]*)?\.)?\d[\d_]*([eE][\-+]?\d+)?', Number),
(r'_\w*', Name.Variable.Anonymous),
(r'[A-Z]\w*', Name.Variable),
(r'@\w+', Name.Variable),
(r'[a-z]\w*', Name),
(r'/\*', Comment, 'comment'),
(r'\%', Comment, 'commentline'),
(r'"', String.Symbol, 'string'),
(r'\'', String.Symbol, 'stringsingle'),
(r'@"', String.Symbol, 'atstring'),
(r'[\-+*^/!?<>=~:]+', Operator),
(r'[$,.[\]|(){}\\]+', Punctuation),
(r'.', Text),
],
'commentdoc': [
(words(dockw, prefix=r'@', suffix=r'\b'), Comment.Preproc),
(r'@', Comment),
],
'commentline': [
include('commentdoc'),
(r'[^@\n]+', Comment),
(r'$', Comment, '#pop'),
],
'comment': [
include('commentdoc'),
(r'[^@*/]+', Comment),
(r'/\*', Comment, '#push'),
(r'\*/', Comment, '#pop'),
(r'[*/]', Comment),
],
'stringescape': [
(r'\\u[0-9a-fA-F]{4}', String.Escape),
(r'\\[\'"ntr\\]', String.Escape),
],
'stringsingle': [
include('stringescape'),
(r'\'', String.Symbol, '#pop'),
(r'[^\'\\\n]+', String),
(r'\n', String.Escape.Error, '#pop'),
],
'string': [
include('stringescape'),
(r'"', String.Symbol, '#pop'),
(r'[^"\\\n]+', String),
(r'\n', String.Escape.Error, '#pop'),
],
'atstring': [
(r'""', String.Escape),
(r'"', String.Symbol, '#pop'),
(r'[^"]+', String),
]
}
class VisualPrologLexer(VisualPrologBaseLexer):
"""Lexer for VisualProlog
.. versionadded:: 2.17
"""
name = 'Visual Prolog'
url = 'https://www.visual-prolog.com/'
aliases = ['visualprolog']
filenames = ['*.pro', '*.cl', '*.i', '*.pack', '*.ph']
majorkw = ('goal', 'namespace', 'interface', 'class', 'implement', 'where', 'open', 'inherits', 'supports', 'resolve',
'delegate', 'monitor', 'constants', 'domains', 'predicates', 'constructors', 'properties', 'clauses', 'facts')
minorkw = ('align', 'anyflow', 'as', 'bitsize', 'determ', 'digits', 'erroneous', 'externally', 'failure', 'from',
'guard', 'multi', 'nondeterm', 'or', 'orelse', 'otherwise', 'procedure', 'resolve', 'single', 'suspending')
directivekw = ('bininclude', 'else', 'elseif', 'endif', 'error', 'export', 'externally', 'from', 'grammargenerate',
'grammarinclude', 'if', 'include', 'message', 'options', 'orrequires', 'requires', 'stringinclude', 'then')
tokens = {
'root': [
(words(minorkw, suffix=r'\b'), Keyword.Minor),
(words(majorkw, suffix=r'\b'), Keyword),
(words(directivekw, prefix='#', suffix=r'\b'), Keyword.Directive),
inherit
]
}
def analyse_text(text):
"""Competes with IDL and Prolog on *.pro; various Lisps on *.cl and SwigLexer on *.i"""
# These are *really* good indicators (and not conflicting with the other languages)
# end-scope first on line e.g. 'end implement'
# section keyword alone on line e.g. 'clauses'
if re.search(r'^\s*(end\s+(interface|class|implement)|(clauses|predicates|domains|facts|constants|properties)\s*$)', text, re.MULTILINE):
return 0.98
else:
return 0
class VisualPrologGrammarLexer(VisualPrologBaseLexer):
"""Lexer for VisualProlog grammar
.. versionadded:: 2.17
"""
name = 'Visual Prolog Grammar'
url = 'https://www.visual-prolog.com/'
aliases = ['visualprologgrammar']
filenames = ['*.vipgrm']
majorkw = ('open', 'namespace', 'grammar', 'nonterminals',
'startsymbols', 'terminals', 'rules', 'precedence')
directivekw = ('bininclude', 'stringinclude')
tokens = {
'root': [
(words(majorkw, suffix=r'\b'), Keyword),
(words(directivekw, prefix='#', suffix=r'\b'), Keyword.Directive),
inherit
]
}
def analyse_text(text):
"""No competitors (currently)"""
# These are *really* good indicators
# end-scope first on line e.g. 'end grammar'
# section keyword alone on line e.g. 'rules'
if re.search(r'^\s*(end\s+grammar|(nonterminals|startsymbols|terminals|rules|precedence)\s*$)', text, re.MULTILINE):
return 0.98
else:
return 0
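The `analyse_text` heuristics look for an end-scope or a bare section keyword at the start of a line; note that the `^`/`$` anchors only behave per line under `re.MULTILINE`. A standalone sketch with an invented sample program:

```python
import re

# Scope/section heuristic from VisualPrologLexer.analyse_text; MULTILINE is
# needed so '^' and '$' anchor at each line, not just the whole string.
scope_re = re.compile(
    r'^\s*(end\s+(interface|class|implement)'
    r'|(clauses|predicates|domains|facts|constants|properties)\s*$)',
    re.MULTILINE)

sample = "implement main\nclauses\n    run() :- succeed().\nend implement\n"
print(bool(scope_re.search(sample)))  # True
```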
"""
pygments.lexers.vyper
~~~~~~~~~~~~~~~~~~~~~
Lexer for the Vyper Smart Contract language.
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.
:license: BSD, see LICENSE for details.
"""
from pygments.lexer import RegexLexer, bygroups, words
from pygments.token import (Comment, String, Name, Keyword, Number,
Operator, Punctuation, Text, Whitespace)
__all__ = ['VyperLexer']
class VyperLexer(RegexLexer):
"""For the Vyper smart contract language.
.. versionadded:: 2.17
"""
name = 'Vyper'
aliases = ['vyper']
filenames = ['*.vy']
url = "https://vyper.readthedocs.io"
tokens = {
'root': [
# Whitespace
(r'\s+', Whitespace),
# Line continuations
(r'(\\)(\n|\r\n|\r)', bygroups(Text, Whitespace)),
# Comments - inline and multiline
(r'#.*$', Comment.Single),
(r'\"\"\"', Comment.Multiline, 'multiline-comment'),
# Strings - single and double
(r"'", String.Single, 'single-string'),
(r'"', String.Double, 'double-string'),
# Functions (working)
(r'(def)(\s+)([a-zA-Z_][a-zA-Z0-9_]*)',
bygroups(Keyword, Whitespace, Name.Function)),
# Event and Struct
(r'(event|struct|interface|log)(\s+)([a-zA-Z_][a-zA-Z0-9_]*)',
bygroups(Keyword, Whitespace, Name.Class)),
# Imports
(r'(from)(\s+)(vyper\.\w+)(\s+)(import)(\s+)(\w+)',
bygroups(Keyword, Whitespace, Name.Namespace, Whitespace,
Keyword, Whitespace, Name.Class)),
# Numeric literals (Float is tried before Integer so '3.14' is not split at the dot)
(r'\b0x[0-9a-fA-F]+\b', Number.Hex),
(r'\b\d+\.\d*\b', Number.Float),
(r'\b(\d{1,3}(?:_\d{3})*|\d+)\b', Number.Integer),
# Keywords
(words(('def', 'event', 'pass', 'return', 'for', 'while', 'if', 'elif',
'else', 'assert', 'raise', 'import', 'in', 'struct', 'implements',
'interface', 'from', 'indexed', 'log'),
prefix=r'\b', suffix=r'\b'), Keyword),
# Visibility and State Mutability
(words(('public', 'private', 'view', 'pure', 'constant',
'immutable', 'nonpayable'), prefix=r'\b', suffix=r'\b'),
Keyword.Declaration),
# Built-in Functions
(words(('bitwise_and', 'bitwise_not', 'bitwise_or', 'bitwise_xor', 'shift',
'create_minimal_proxy_to', 'create_copy_of', 'create_from_blueprint',
'ecadd', 'ecmul', 'ecrecover', 'keccak256', 'sha256', 'concat', 'convert',
'uint2str', 'extract32', 'slice', 'abs', 'ceil', 'floor', 'max', 'max_value',
'min', 'min_value', 'pow_mod256', 'sqrt', 'isqrt', 'uint256_addmod',
'uint256_mulmod', 'unsafe_add', 'unsafe_sub', 'unsafe_mul', 'unsafe_div',
'as_wei_value', 'blockhash', 'empty', 'len', 'method_id', '_abi_encode',
'_abi_decode', 'print', 'range'), prefix=r'\b', suffix=r'\b'),
Name.Builtin),
# Built-in Variables and Attributes
(words(('msg.sender', 'msg.value', 'block.timestamp', 'block.number', 'msg.gas'),
prefix=r'\b', suffix=r'\b'),
Name.Builtin.Pseudo),
(words(('uint', 'uint8', 'uint16', 'uint32', 'uint64', 'uint128', 'uint256',
'int', 'int8', 'int16', 'int32', 'int64', 'int128', 'int256', 'bool',
'decimal', 'bytes', 'bytes1', 'bytes2', 'bytes3', 'bytes4', 'bytes5',
'bytes6', 'bytes7', 'bytes8', 'bytes9', 'bytes10', 'bytes11',
'bytes12', 'bytes13', 'bytes14', 'bytes15', 'bytes16', 'bytes17',
'bytes18', 'bytes19', 'bytes20', 'bytes21', 'bytes22', 'bytes23',
'bytes24', 'bytes25', 'bytes26', 'bytes27', 'bytes28', 'bytes29',
'bytes30', 'bytes31', 'bytes32', 'string', 'String', 'address',
'enum', 'struct'), prefix=r'\b', suffix=r'\b'),
Keyword.Type),
# indexed keywords
(r'\b(indexed)\b(\s*)(\()(\s*)(\w+)(\s*)(\))',
bygroups(Keyword, Whitespace, Punctuation, Whitespace,
Keyword.Type, Whitespace, Punctuation)),
# Operators and Punctuation
(r'(\+|\-|\*|\/|<=?|>=?|==|!=|=|\||&|%)', Operator),
(r'[.,:;()\[\]{}]', Punctuation),
# Other variable names and types
(r'@[\w.]+', Name.Decorator),
(r'__\w+__', Name.Magic), # Matches double underscores followed by word characters
(r'EMPTY_BYTES32', Name.Constant),
(r'\bERC20\b', Name.Class),
(r'\bself\b', Name.Attribute),
(r'Bytes\[\d+\]', Keyword.Type),
# Generic names and variables
(r'\b[a-zA-Z_]\w*\b:', Name.Variable),
(r'\b[a-zA-Z_]\w*\b', Name),
],
'multiline-comment': [
(r'\"\"\"', Comment.Multiline, '#pop'),
(r'[^"]+', Comment.Multiline),
(r'\"', Comment.Multiline)
],
'single-string': [
(r"[^\\']+", String.Single),
(r"'", String.Single, '#pop'),
(r'\\.', String.Escape),
],
'double-string': [
(r'[^\\"]+', String.Double),
(r'"', String.Double, '#pop'),
(r'\\.', String.Escape),
]
}
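The Vyper numeric rules can also be checked in isolation; the integer rule accepts underscore separators only between groups of exactly three digits (inputs are illustrative):

```python
import re

# Numeric literal rules from VyperLexer's "root" state.
hex_re = re.compile(r'\b0x[0-9a-fA-F]+\b')
int_re = re.compile(r'\b(\d{1,3}(?:_\d{3})*|\d+)\b')

print(bool(hex_re.fullmatch('0xDEADBEEF')))  # True
print(bool(int_re.fullmatch('1_000_000')))   # True
print(bool(int_re.fullmatch('10_00')))       # False: groups must be 3 digits
```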
# Automatically generated by scripts/gen_mapfiles.py.
# DO NOT EDIT BY HAND; run `tox -e mapfiles` instead.
STYLES = {
'AbapStyle': ('pygments.styles.abap', 'abap', ()),
'AlgolStyle': ('pygments.styles.algol', 'algol', ()),
'Algol_NuStyle': ('pygments.styles.algol_nu', 'algol_nu', ()),
'ArduinoStyle': ('pygments.styles.arduino', 'arduino', ()),
'AutumnStyle': ('pygments.styles.autumn', 'autumn', ()),
'BlackWhiteStyle': ('pygments.styles.bw', 'bw', ()),
'BorlandStyle': ('pygments.styles.borland', 'borland', ()),
'ColorfulStyle': ('pygments.styles.colorful', 'colorful', ()),
'DefaultStyle': ('pygments.styles.default', 'default', ()),
'DraculaStyle': ('pygments.styles.dracula', 'dracula', ()),
'EmacsStyle': ('pygments.styles.emacs', 'emacs', ()),
'FriendlyGrayscaleStyle': ('pygments.styles.friendly_grayscale', 'friendly_grayscale', ()),
'FriendlyStyle': ('pygments.styles.friendly', 'friendly', ()),
'FruityStyle': ('pygments.styles.fruity', 'fruity', ()),
'GhDarkStyle': ('pygments.styles.gh_dark', 'github-dark', ()),
'GruvboxDarkStyle': ('pygments.styles.gruvbox', 'gruvbox-dark', ()),
'GruvboxLightStyle': ('pygments.styles.gruvbox', 'gruvbox-light', ()),
'IgorStyle': ('pygments.styles.igor', 'igor', ()),
'InkPotStyle': ('pygments.styles.inkpot', 'inkpot', ()),
'LightbulbStyle': ('pygments.styles.lightbulb', 'lightbulb', ()),
'LilyPondStyle': ('pygments.styles.lilypond', 'lilypond', ()),
'LovelaceStyle': ('pygments.styles.lovelace', 'lovelace', ()),
'ManniStyle': ('pygments.styles.manni', 'manni', ()),
'MaterialStyle': ('pygments.styles.material', 'material', ()),
'MonokaiStyle': ('pygments.styles.monokai', 'monokai', ()),
'MurphyStyle': ('pygments.styles.murphy', 'murphy', ()),
'NativeStyle': ('pygments.styles.native', 'native', ()),
'NordDarkerStyle': ('pygments.styles.nord', 'nord-darker', ()),
'NordStyle': ('pygments.styles.nord', 'nord', ()),
'OneDarkStyle': ('pygments.styles.onedark', 'one-dark', ()),
'ParaisoDarkStyle': ('pygments.styles.paraiso_dark', 'paraiso-dark', ()),
'ParaisoLightStyle': ('pygments.styles.paraiso_light', 'paraiso-light', ()),
'PastieStyle': ('pygments.styles.pastie', 'pastie', ()),
'PerldocStyle': ('pygments.styles.perldoc', 'perldoc', ()),
'RainbowDashStyle': ('pygments.styles.rainbow_dash', 'rainbow_dash', ()),
'RrtStyle': ('pygments.styles.rrt', 'rrt', ()),
'SasStyle': ('pygments.styles.sas', 'sas', ()),
'SolarizedDarkStyle': ('pygments.styles.solarized', 'solarized-dark', ()),
'SolarizedLightStyle': ('pygments.styles.solarized', 'solarized-light', ()),
'StarofficeStyle': ('pygments.styles.staroffice', 'staroffice', ()),
'StataDarkStyle': ('pygments.styles.stata_dark', 'stata-dark', ()),
'StataLightStyle': ('pygments.styles.stata_light', 'stata-light', ()),
'TangoStyle': ('pygments.styles.tango', 'tango', ()),
'TracStyle': ('pygments.styles.trac', 'trac', ()),
'VimStyle': ('pygments.styles.vim', 'vim', ()),
'VisualStudioStyle': ('pygments.styles.vs', 'vs', ()),
'XcodeStyle': ('pygments.styles.xcode', 'xcode', ()),
'ZenburnStyle': ('pygments.styles.zenburn', 'zenburn', ()),
}
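The generated `STYLES` mapping defers imports until a style is actually requested: each entry maps a class name to its module path, canonical name, and extra aliases. A minimal stdlib sketch of that lazy-resolution pattern — the entry below uses a stdlib class as a stand-in, not a real pygments style, and the real lookup lives in `pygments.styles.get_style_by_name`:

```python
from importlib import import_module

# Entries mirror the generated mapfile format:
# class name -> (module path, canonical name, extra aliases)
MAPPING = {
    'JSONDecoder': ('json.decoder', 'json-decoder', ()),  # stdlib stand-in entry
}

def resolve(name, mapping):
    """Import the module on demand and return the named class."""
    for class_name, (module_path, canonical, aliases) in mapping.items():
        if name == canonical or name in aliases:
            return getattr(import_module(module_path), class_name)
    raise ValueError(f"no entry named {name!r}")

decoder_cls = resolve('json-decoder', MAPPING)
```

Keeping the mapping as plain tuples means importing `pygments.styles` stays cheap even though dozens of style modules are registered.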
tox ~= 4.4
# Other requirements are installed by tox.
const isOldEnough = (value, ownProps) => {
if (parseInt(value, 10) < 14) {
return "Only 14yo and older can register to the site."
}
};
// functional component
const BlogTitle = ({ children }) => (
<h3>{children}</h3>
);
// class component
class BlogPost extends React.Component {
renderTitle(title) {
return <BlogTitle>{title}</BlogTitle>
};
render() {
return (
<div className="blog-body">
{this.renderTitle(this.props.title)}
<p>{this.props.body}</p>
<CustomComponent>text</CustomComponent>
<input type="text" {...props.inputProps} />
<button aria-label="Submit">Submit</button>
</div>
);
}
}
const body = "Hello World!";
const blogNode = <BlogPost title="What's going on?" body={body} />;
// some comment. Tags shouldn't be lexed in here
// <div class="blog-body">
// <h3>What's going on?</h3>
// <p>Hello World!</p>
// </div>
/*
Some comment. Tags shouldn't be lexed in here either
<div class="blog-body">
<h3>What's going on?</h3>
<p>Hello World!</p>
</div>
*/
const shortSyntaxfragmentEmptyBody = <></>;
const shortSyntaxfragmentFullBody = <><div/></>;
const reactDotFragment = <React.Fragment><div/></React.Fragment>;
const reactDotContext = <Context.Provider><div/></Context.Provider>;
const reactDotContextValue = <Context.Provider value="Hello"><div/></Context.Provider>;
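The comments in the JSX example above assert that tags inside `//` and `/* */` comments must not be tokenized as markup. A minimal stdlib sketch of that rule — a deliberately simplified stand-in for the real lexer's comment states, which swallow tag-like text before the tag rules ever see it:

```python
import re

# Strip line and block comments first, mirroring how a lexer's comment
# state consumes tag-like text so the tag rules never see it.
COMMENT = re.compile(r"//[^\n]*|/\*.*?\*/", re.S)
TAG = re.compile(r"<[A-Za-z][\w.]*")

src = '''
const node = <BlogPost title="hi" />;
// <div class="blog-body">  tags in comments are ignored
'''
code_only = COMMENT.sub("", src)
tags = TAG.findall(code_only)
assert tags == ["<BlogPost"]  # the commented-out <div> is never matched
```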

Sorry, the diff of this file is not supported yet

# pragma version 0.3.10
# pragma optimize codesize
# pragma evm-version shanghai
"""
@title CurveStableSwapNG
@author Curve.Fi
@license Copyright (c) Curve.Fi, 2020-2023 - all rights reserved
@notice Stableswap implementation for up to 8 coins with no rehypothecation,
i.e. the AMM does not deposit tokens into other contracts. The Pool contract also
records exponential moving averages for coins relative to coin 0.
@dev Asset Types:
0. Standard ERC20 token with no additional features.
Note: Users are advised to do careful due-diligence on
ERC20 tokens that they interact with, as this
contract cannot differentiate between harmless and
malicious ERC20 tokens.
1. Oracle - token with rate oracle (e.g. wstETH)
Note: Oracles may be controlled externally by an EOA. Users
are advised to proceed with caution.
2. Rebasing - token with rebase (e.g. stETH).
Note: Users and Integrators are advised to understand how
the AMM contract works with rebasing balances.
3. ERC4626 - token with convertToAssets method (e.g. sDAI).
Note: Some ERC4626 implementations may be susceptible to
Donation/Inflation attacks. Users are advised to
proceed with caution.
Supports:
1. ERC20 support for return True/revert, return True/False, return None
2. ERC20 tokens can have arbitrary decimals (<=18).
3. ERC20 tokens that rebase (either positive or fee on transfer)
4. ERC20 tokens that have a rate oracle (e.g. wstETH, cbETH, sDAI, etc.)
Note: Oracle precision _must_ be 10**18.
5. ERC4626 tokens with arbitrary precision (<=18) of Vault token and underlying
asset.
Additional features include:
1. Adds price oracles based on AMM State Price (and _not_ last traded price).
2. Adds TVL oracle based on D.
3. `exchange_received`: swaps that expect an ERC20 transfer to have occurred
prior to executing the swap.
Note: a. If pool contains rebasing tokens and one of the `asset_types` is 2 (Rebasing)
then calling `exchange_received` will REVERT.
b. If pool contains rebasing token and `asset_types` does not contain 2 (Rebasing)
then this is an incorrect implementation and rebases can be
stolen.
4. Adds `get_dx`: Similar to `get_dy` which returns an expected output
of coin[j] for given `dx` amount of coin[i], `get_dx` returns expected
input of coin[i] for an output amount of coin[j].
5. Fees are dynamic: AMM will charge a higher fee if pool depegs. This can cause very
slight discrepancies between calculated fees and realised fees.
"""
from vyper.interfaces import ERC20
from vyper.interfaces import ERC20Detailed
from vyper.interfaces import ERC4626
implements: ERC20
# ------------------------------- Interfaces ---------------------------------
interface Factory:
def fee_receiver() -> address: view
def admin() -> address: view
def views_implementation() -> address: view
interface ERC1271:
def isValidSignature(_hash: bytes32, _signature: Bytes[65]) -> bytes32: view
interface StableSwapViews:
def get_dx(i: int128, j: int128, dy: uint256, pool: address) -> uint256: view
def get_dy(i: int128, j: int128, dx: uint256, pool: address) -> uint256: view
def dynamic_fee(i: int128, j: int128, pool: address) -> uint256: view
def calc_token_amount(
_amounts: DynArray[uint256, MAX_COINS],
_is_deposit: bool,
_pool: address
) -> uint256: view
# --------------------------------- Events -----------------------------------
event Transfer:
sender: indexed(address)
receiver: indexed(address)
value: uint256
event Approval:
owner: indexed(address)
spender: indexed(address)
value: uint256
event TokenExchange:
buyer: indexed(address)
sold_id: int128
tokens_sold: uint256
bought_id: int128
tokens_bought: uint256
event TokenExchangeUnderlying:
buyer: indexed(address)
sold_id: int128
tokens_sold: uint256
bought_id: int128
tokens_bought: uint256
event AddLiquidity:
provider: indexed(address)
token_amounts: DynArray[uint256, MAX_COINS]
fees: DynArray[uint256, MAX_COINS]
invariant: uint256
token_supply: uint256
event RemoveLiquidity:
provider: indexed(address)
token_amounts: DynArray[uint256, MAX_COINS]
fees: DynArray[uint256, MAX_COINS]
token_supply: uint256
event RemoveLiquidityOne:
provider: indexed(address)
token_id: int128
token_amount: uint256
coin_amount: uint256
token_supply: uint256
event RemoveLiquidityImbalance:
provider: indexed(address)
token_amounts: DynArray[uint256, MAX_COINS]
fees: DynArray[uint256, MAX_COINS]
invariant: uint256
token_supply: uint256
event RampA:
old_A: uint256
new_A: uint256
initial_time: uint256
future_time: uint256
event StopRampA:
A: uint256
t: uint256
event ApplyNewFee:
fee: uint256
offpeg_fee_multiplier: uint256
MAX_COINS: constant(uint256) = 8 # max coins is 8 in the factory
MAX_COINS_128: constant(int128) = 8
# ---------------------------- Pool Variables --------------------------------
N_COINS: public(immutable(uint256))
N_COINS_128: immutable(int128)
PRECISION: constant(uint256) = 10 ** 18
factory: immutable(Factory)
coins: public(immutable(DynArray[address, MAX_COINS]))
asset_types: immutable(DynArray[uint8, MAX_COINS])
stored_balances: DynArray[uint256, MAX_COINS]
# Fee specific vars
FEE_DENOMINATOR: constant(uint256) = 10 ** 10
fee: public(uint256) # fee * 1e10
offpeg_fee_multiplier: public(uint256) # * 1e10
admin_fee: public(constant(uint256)) = 5000000000
MAX_FEE: constant(uint256) = 5 * 10 ** 9
# ---------------------- Pool Amplification Parameters -----------------------
A_PRECISION: constant(uint256) = 100
MAX_A: constant(uint256) = 10 ** 6
MAX_A_CHANGE: constant(uint256) = 10
initial_A: public(uint256)
future_A: public(uint256)
initial_A_time: public(uint256)
future_A_time: public(uint256)
# ---------------------------- Admin Variables -------------------------------
MIN_RAMP_TIME: constant(uint256) = 86400
admin_balances: public(DynArray[uint256, MAX_COINS])
# ----------------------- Oracle Specific vars -------------------------------
rate_multipliers: immutable(DynArray[uint256, MAX_COINS])
# [bytes4 method_id][bytes8 <empty>][bytes20 oracle]
oracles: DynArray[uint256, MAX_COINS]
# For ERC4626 tokens, we need:
call_amount: immutable(DynArray[uint256, MAX_COINS])
scale_factor: immutable(DynArray[uint256, MAX_COINS])
last_prices_packed: DynArray[uint256, MAX_COINS] # packing: last_price, ma_price
last_D_packed: uint256 # packing: last_D, ma_D
ma_exp_time: public(uint256)
D_ma_time: public(uint256)
ma_last_time: public(uint256) # packing: ma_last_time_p, ma_last_time_D
# ma_last_time has a distinction for p and D because p is _not_ updated if
# users remove_liquidity, but D is.
# shift(2**32 - 1, 224)
ORACLE_BIT_MASK: constant(uint256) = (2**32 - 1) * 256**28
# --------------------------- ERC20 Specific Vars ----------------------------
name: public(immutable(String[64]))
symbol: public(immutable(String[32]))
decimals: public(constant(uint8)) = 18
version: public(constant(String[8])) = "v7.0.0"
balanceOf: public(HashMap[address, uint256])
allowance: public(HashMap[address, HashMap[address, uint256]])
total_supply: uint256
nonces: public(HashMap[address, uint256])
# keccak256("isValidSignature(bytes32,bytes)")[:4] << 224
ERC1271_MAGIC_VAL: constant(bytes32) = 0x1626ba7e00000000000000000000000000000000000000000000000000000000
EIP712_TYPEHASH: constant(bytes32) = keccak256("EIP712Domain(string name,string version,uint256 chainId,address verifyingContract,bytes32 salt)")
EIP2612_TYPEHASH: constant(bytes32) = keccak256("Permit(address owner,address spender,uint256 value,uint256 nonce,uint256 deadline)")
VERSION_HASH: constant(bytes32) = keccak256(version)
NAME_HASH: immutable(bytes32)
CACHED_CHAIN_ID: immutable(uint256)
salt: public(immutable(bytes32))
CACHED_DOMAIN_SEPARATOR: immutable(bytes32)
# ------------------------------ AMM Setup -----------------------------------
@external
def __init__(
_name: String[32],
_symbol: String[10],
_A: uint256,
_fee: uint256,
_offpeg_fee_multiplier: uint256,
_ma_exp_time: uint256,
_coins: DynArray[address, MAX_COINS],
_rate_multipliers: DynArray[uint256, MAX_COINS],
_asset_types: DynArray[uint8, MAX_COINS],
_method_ids: DynArray[bytes4, MAX_COINS],
_oracles: DynArray[address, MAX_COINS],
):
"""
@notice Initialize the pool contract
@param _name Name of the new plain pool.
@param _symbol Symbol for the new plain pool.
@param _A Amplification coefficient - a lower value here means
less tolerance for imbalance within the pool's assets.
Suggested values include:
* Uncollateralized algorithmic stablecoins: 5-10
* Non-redeemable, collateralized assets: 100
* Redeemable assets: 200-400
@param _fee Trade fee, given as an integer with 1e10 precision. The
maximum is 1% (100000000).
50% of the fee is distributed to veCRV holders.
@param _offpeg_fee_multiplier A multiplier that determines how much to increase
fees by when assets in the AMM depeg. Example value: 20000000000
@param _ma_exp_time Averaging window of oracle. Set as time_in_seconds / ln(2)
Example: for 10 minute EMA, _ma_exp_time is 600 / ln(2) ~= 866
@param _coins List of addresses of the coins being used in the pool.
@param _rate_multipliers An array of: [10 ** (36 - _coins[n].decimals()), ... for n in range(N_COINS)]
@param _asset_types Array of uint8 representing tokens in pool
@param _method_ids Array of the first four bytes of the Keccak-256 hash of the function signatures
of the rate oracle methods at the corresponding oracle addresses.
Calculated as: keccak(text=event_signature.replace(" ", ""))[:4]
@param _oracles Array of rate oracle addresses.
"""
coins = _coins
asset_types = _asset_types
__n_coins: uint256 = len(_coins)
N_COINS = __n_coins
N_COINS_128 = convert(__n_coins, int128)
rate_multipliers = _rate_multipliers
factory = Factory(msg.sender)
A: uint256 = _A * A_PRECISION
self.initial_A = A
self.future_A = A
self.fee = _fee
self.offpeg_fee_multiplier = _offpeg_fee_multiplier
assert _ma_exp_time != 0
self.ma_exp_time = _ma_exp_time
self.D_ma_time = 62324 # <--------- 12 hours default on contract start.
self.ma_last_time = self.pack_2(block.timestamp, block.timestamp)
# ------------------- initialize storage for DynArrays ------------------
_call_amount: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
_scale_factor: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if i < N_COINS_128 - 1:
self.last_prices_packed.append(self.pack_2(10**18, 10**18))
self.oracles.append(convert(_method_ids[i], uint256) * 2**224 | convert(_oracles[i], uint256))
self.stored_balances.append(0)
self.admin_balances.append(0)
if _asset_types[i] == 3:
_call_amount.append(10**convert(ERC20Detailed(_coins[i]).decimals(), uint256))
_underlying_asset: address = ERC4626(_coins[i]).asset()
_scale_factor.append(10**(18 - convert(ERC20Detailed(_underlying_asset).decimals(), uint256)))
else:
_call_amount.append(0)
_scale_factor.append(0)
call_amount = _call_amount
scale_factor = _scale_factor
# ----------------------------- ERC20 stuff ------------------------------
name = _name
symbol = _symbol
# EIP712 related params -----------------
NAME_HASH = keccak256(name)
salt = block.prevhash
CACHED_CHAIN_ID = chain.id
CACHED_DOMAIN_SEPARATOR = keccak256(
_abi_encode(
EIP712_TYPEHASH,
NAME_HASH,
VERSION_HASH,
chain.id,
self,
salt,
)
)
# ------------------------ Fire a transfer event -------------------------
log Transfer(empty(address), msg.sender, 0)
# ------------------ Token transfers in and out of the AMM -------------------
@internal
def _transfer_in(
coin_idx: int128,
dx: uint256,
sender: address,
expect_optimistic_transfer: bool,
) -> uint256:
"""
@notice Contains all logic to handle ERC20 token transfers.
@param coin_idx Index of the coin to transfer in.
@param dx amount of `_coin` to transfer into the pool.
@param sender address to transfer `_coin` from.
@param expect_optimistic_transfer True if contract expects an optimistic coin transfer
"""
_dx: uint256 = ERC20(coins[coin_idx]).balanceOf(self)
# ------------------------- Handle Transfers -----------------------------
if expect_optimistic_transfer:
_dx = _dx - self.stored_balances[coin_idx]
assert _dx >= dx
else:
assert dx > 0 # dev: do not transferFrom 0 tokens into the pool
assert ERC20(coins[coin_idx]).transferFrom(
sender, self, dx, default_return_value=True
)
_dx = ERC20(coins[coin_idx]).balanceOf(self) - _dx
# --------------------------- Store transferred in amount ---------------------------
self.stored_balances[coin_idx] += _dx
return _dx
@internal
def _transfer_out(_coin_idx: int128, _amount: uint256, receiver: address):
"""
@notice Transfer a single token from the pool to receiver.
@dev This function is called by `remove_liquidity` and
`remove_liquidity_one`, `_exchange` and `_withdraw_admin_fees` methods.
@param _coin_idx Index of the token to transfer out
@param _amount Amount of token to transfer out
@param receiver Address to send the tokens to
"""
coin_balance: uint256 = ERC20(coins[_coin_idx]).balanceOf(self)
# ------------------------- Handle Transfers -----------------------------
assert ERC20(coins[_coin_idx]).transfer(
receiver, _amount, default_return_value=True
)
# ----------------------- Update Stored Balances -------------------------
self.stored_balances[_coin_idx] = coin_balance - _amount
# -------------------------- AMM Special Methods -----------------------------
@view
@internal
def _stored_rates() -> DynArray[uint256, MAX_COINS]:
"""
@notice Gets rate multipliers for each coin.
@dev If the coin has a rate oracle that has been properly initialised,
this method queries that rate by static-calling an external
contract.
"""
rates: DynArray[uint256, MAX_COINS] = rate_multipliers
oracles: DynArray[uint256, MAX_COINS] = self.oracles
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if asset_types[i] == 1 and not oracles[i] == 0:
# NOTE: fetched_rate is assumed to be 10**18 precision
fetched_rate: uint256 = convert(
raw_call(
convert(oracles[i] % 2**160, address),
_abi_encode(oracles[i] & ORACLE_BIT_MASK),
max_outsize=32,
is_static_call=True,
),
uint256
)
rates[i] = unsafe_div(rates[i] * fetched_rate, PRECISION)
elif asset_types[i] == 3: # ERC4626
# fetched_rate: uint256 = ERC4626(coins[i]).convertToAssets(call_amount[i]) * scale_factor[i]
# here: call_amount has ERC4626 precision, but the returned value is scaled up to 18
# using scale_factor which is (18 - n) if underlying asset has n decimals.
rates[i] = unsafe_div(
rates[i] * ERC4626(coins[i]).convertToAssets(call_amount[i]) * scale_factor[i],
PRECISION
) # 1e18 precision
return rates
@view
@internal
def _balances() -> DynArray[uint256, MAX_COINS]:
"""
@notice Calculates the pool's balances _excluding_ the admin's balances.
@dev If the pool contains rebasing tokens, this method ensures LPs keep all
rebases and admin only claims swap fees. This also means that, since
admin's balances are stored in an array and not inferred from read balances,
the fees in the rebasing token that the admin collects are immune to
slashing events.
"""
result: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
balances_i: uint256 = 0
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if 2 in asset_types:
balances_i = ERC20(coins[i]).balanceOf(self) - self.admin_balances[i]
else:
balances_i = self.stored_balances[i] - self.admin_balances[i]
result.append(balances_i)
return result
# -------------------------- AMM Main Functions ------------------------------
@external
@nonreentrant('lock')
def exchange(
i: int128,
j: int128,
_dx: uint256,
_min_dy: uint256,
_receiver: address = msg.sender,
) -> uint256:
"""
@notice Perform an exchange between two coins
@dev Index values can be found via the `coins` public getter method
@param i Index value for the coin to send
@param j Index value of the coin to receive
@param _dx Amount of `i` being exchanged
@param _min_dy Minimum amount of `j` to receive
@return Actual amount of `j` received
"""
return self._exchange(
msg.sender,
i,
j,
_dx,
_min_dy,
_receiver,
False
)
@external
@nonreentrant('lock')
def exchange_received(
i: int128,
j: int128,
_dx: uint256,
_min_dy: uint256,
_receiver: address = msg.sender,
) -> uint256:
"""
@notice Perform an exchange between two coins without transferring token in
@dev The contract swaps tokens based on a change in balance of coin[i]. The
dx = ERC20(coin[i]).balanceOf(self) - self.stored_balances[i]. Users of
this method are dex aggregators, arbitrageurs, or other users who do not
wish to grant approvals to the contract: they would instead send tokens
directly to the contract and call `exchange_received`.
Note: This is disabled if pool contains rebasing tokens.
@param i Index value for the coin to send
@param j Index value of the coin to receive
@param _dx Amount of `i` being exchanged
@param _min_dy Minimum amount of `j` to receive
@return Actual amount of `j` received
"""
assert not 2 in asset_types # dev: exchange_received not supported if pool contains rebasing tokens
return self._exchange(
msg.sender,
i,
j,
_dx,
_min_dy,
_receiver,
True, # <--------------------------------------- swap optimistically.
)
@external
@nonreentrant('lock')
def add_liquidity(
_amounts: DynArray[uint256, MAX_COINS],
_min_mint_amount: uint256,
_receiver: address = msg.sender
) -> uint256:
"""
@notice Deposit coins into the pool
@param _amounts List of amounts of coins to deposit
@param _min_mint_amount Minimum amount of LP tokens to mint from the deposit
@param _receiver Address that owns the minted LP tokens
@return Amount of LP tokens received by depositing
"""
amp: uint256 = self._A()
old_balances: DynArray[uint256, MAX_COINS] = self._balances()
rates: DynArray[uint256, MAX_COINS] = self._stored_rates()
# Initial invariant
D0: uint256 = self.get_D_mem(rates, old_balances, amp)
total_supply: uint256 = self.total_supply
new_balances: DynArray[uint256, MAX_COINS] = old_balances
# -------------------------- Do Transfers In -----------------------------
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if _amounts[i] > 0:
new_balances[i] += self._transfer_in(
i,
_amounts[i],
msg.sender,
False, # expect_optimistic_transfer
)
else:
assert total_supply != 0 # dev: initial deposit requires all coins
# ------------------------------------------------------------------------
# Invariant after change
D1: uint256 = self.get_D_mem(rates, new_balances, amp)
assert D1 > D0
# We need to recalculate the invariant accounting for fees
# to calculate fair user's share
fees: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
mint_amount: uint256 = 0
if total_supply > 0:
ideal_balance: uint256 = 0
difference: uint256 = 0
new_balance: uint256 = 0
ys: uint256 = (D0 + D1) / N_COINS
xs: uint256 = 0
_dynamic_fee_i: uint256 = 0
# Only account for fees if we are not the first to deposit
base_fee: uint256 = self.fee * N_COINS / (4 * (N_COINS - 1))
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
ideal_balance = D1 * old_balances[i] / D0
difference = 0
new_balance = new_balances[i]
if ideal_balance > new_balance:
difference = ideal_balance - new_balance
else:
difference = new_balance - ideal_balance
# fee[i] = _dynamic_fee(i, j) * difference / FEE_DENOMINATOR
xs = unsafe_div(rates[i] * (old_balances[i] + new_balance), PRECISION)
_dynamic_fee_i = self._dynamic_fee(xs, ys, base_fee)
fees.append(_dynamic_fee_i * difference / FEE_DENOMINATOR)
self.admin_balances[i] += fees[i] * admin_fee / FEE_DENOMINATOR
new_balances[i] -= fees[i]
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(rates, new_balances)
D1 = self.get_D(xp, amp) # <--------------- Reuse D1 for new D value.
mint_amount = total_supply * (D1 - D0) / D0
self.upkeep_oracles(xp, amp, D1)
else:
mint_amount = D1 # Take the dust if there was any
# (re)instantiate D oracle if totalSupply is zero.
self.last_D_packed = self.pack_2(D1, D1)
assert mint_amount >= _min_mint_amount, "Slippage screwed you"
# Mint pool tokens
total_supply += mint_amount
self.balanceOf[_receiver] += mint_amount
self.total_supply = total_supply
log Transfer(empty(address), _receiver, mint_amount)
log AddLiquidity(msg.sender, _amounts, fees, D1, total_supply)
return mint_amount
@external
@nonreentrant('lock')
def remove_liquidity_one_coin(
_burn_amount: uint256,
i: int128,
_min_received: uint256,
_receiver: address = msg.sender,
) -> uint256:
"""
@notice Withdraw a single coin from the pool
@param _burn_amount Amount of LP tokens to burn in the withdrawal
@param i Index value of the coin to withdraw
@param _min_received Minimum amount of coin to receive
@param _receiver Address that receives the withdrawn coins
@return Amount of coin received
"""
assert _burn_amount > 0 # dev: do not remove 0 LP tokens
dy: uint256 = 0
fee: uint256 = 0
xp: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
amp: uint256 = empty(uint256)
D: uint256 = empty(uint256)
dy, fee, xp, amp, D = self._calc_withdraw_one_coin(_burn_amount, i)
assert dy >= _min_received, "Not enough coins removed"
self.admin_balances[i] += fee * admin_fee / FEE_DENOMINATOR
self._burnFrom(msg.sender, _burn_amount)
self._transfer_out(i, dy, _receiver)
log RemoveLiquidityOne(msg.sender, i, _burn_amount, dy, self.total_supply)
self.upkeep_oracles(xp, amp, D)
return dy
@external
@nonreentrant('lock')
def remove_liquidity_imbalance(
_amounts: DynArray[uint256, MAX_COINS],
_max_burn_amount: uint256,
_receiver: address = msg.sender
) -> uint256:
"""
@notice Withdraw coins from the pool in an imbalanced amount
@param _amounts List of amounts of underlying coins to withdraw
@param _max_burn_amount Maximum amount of LP token to burn in the withdrawal
@param _receiver Address that receives the withdrawn coins
@return Actual amount of the LP token burned in the withdrawal
"""
amp: uint256 = self._A()
rates: DynArray[uint256, MAX_COINS] = self._stored_rates()
old_balances: DynArray[uint256, MAX_COINS] = self._balances()
D0: uint256 = self.get_D_mem(rates, old_balances, amp)
new_balances: DynArray[uint256, MAX_COINS] = old_balances
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if _amounts[i] != 0:
new_balances[i] -= _amounts[i]
self._transfer_out(i, _amounts[i], _receiver)
D1: uint256 = self.get_D_mem(rates, new_balances, amp)
base_fee: uint256 = self.fee * N_COINS / (4 * (N_COINS - 1))
ys: uint256 = (D0 + D1) / N_COINS
fees: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
dynamic_fee: uint256 = 0
xs: uint256 = 0
ideal_balance: uint256 = 0
difference: uint256 = 0
new_balance: uint256 = 0
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
ideal_balance = D1 * old_balances[i] / D0
difference = 0
new_balance = new_balances[i]
if ideal_balance > new_balance:
difference = ideal_balance - new_balance
else:
difference = new_balance - ideal_balance
xs = unsafe_div(rates[i] * (old_balances[i] + new_balance), PRECISION)
dynamic_fee = self._dynamic_fee(xs, ys, base_fee)
fees.append(dynamic_fee * difference / FEE_DENOMINATOR)
self.admin_balances[i] += fees[i] * admin_fee / FEE_DENOMINATOR
new_balances[i] -= fees[i]
D1 = self.get_D_mem(rates, new_balances, amp) # dev: reuse D1 for new D.
self.upkeep_oracles(new_balances, amp, D1)
total_supply: uint256 = self.total_supply
burn_amount: uint256 = ((D0 - D1) * total_supply / D0) + 1
assert burn_amount > 1 # dev: zero tokens burned
assert burn_amount <= _max_burn_amount, "Slippage screwed you"
total_supply -= burn_amount
self._burnFrom(msg.sender, burn_amount)
log RemoveLiquidityImbalance(msg.sender, _amounts, fees, D1, total_supply)
return burn_amount
@external
@nonreentrant('lock')
def remove_liquidity(
_burn_amount: uint256,
_min_amounts: DynArray[uint256, MAX_COINS],
_receiver: address = msg.sender,
_claim_admin_fees: bool = True,
) -> DynArray[uint256, MAX_COINS]:
"""
@notice Withdraw coins from the pool
@dev Withdrawal amounts are based on current deposit ratios
@param _burn_amount Quantity of LP tokens to burn in the withdrawal
@param _min_amounts Minimum amounts of underlying coins to receive
@param _receiver Address that receives the withdrawn coins
@return List of amounts of coins that were withdrawn
"""
total_supply: uint256 = self.total_supply
assert _burn_amount > 0 # dev: invalid burn amount
amounts: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
balances: DynArray[uint256, MAX_COINS] = self._balances()
value: uint256 = 0
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
value = balances[i] * _burn_amount / total_supply
assert value >= _min_amounts[i], "Withdrawal resulted in fewer coins than expected"
amounts.append(value)
self._transfer_out(i, value, _receiver)
self._burnFrom(msg.sender, _burn_amount) # <---- Updates self.total_supply
# --------------------------- Upkeep D_oracle ----------------------------
ma_last_time_unpacked: uint256[2] = self.unpack_2(self.ma_last_time)
last_D_packed_current: uint256 = self.last_D_packed
old_D: uint256 = last_D_packed_current & (2**128 - 1)
self.last_D_packed = self.pack_2(
old_D - unsafe_div(old_D * _burn_amount, total_supply), # new_D = proportionally reduce D.
self._calc_moving_average(
last_D_packed_current,
self.D_ma_time,
ma_last_time_unpacked[1]
)
)
if ma_last_time_unpacked[1] < block.timestamp:
ma_last_time_unpacked[1] = block.timestamp
self.ma_last_time = self.pack_2(ma_last_time_unpacked[0], ma_last_time_unpacked[1])
# ------------------------------- Log event ------------------------------
log RemoveLiquidity(
msg.sender,
amounts,
empty(DynArray[uint256, MAX_COINS]),
total_supply - _burn_amount
)
# ------- Withdraw admin fees if _claim_admin_fees is set to True --------
if _claim_admin_fees:
self._withdraw_admin_fees()
return amounts
@external
def withdraw_admin_fees():
"""
@notice Claim admin fees. Callable by anyone.
"""
self._withdraw_admin_fees()
# ------------------------ AMM Internal Functions ----------------------------
@view
@internal
def _dynamic_fee(xpi: uint256, xpj: uint256, _fee: uint256) -> uint256:
_offpeg_fee_multiplier: uint256 = self.offpeg_fee_multiplier
if _offpeg_fee_multiplier <= FEE_DENOMINATOR:
return _fee
xps2: uint256 = (xpi + xpj) ** 2
return (
(_offpeg_fee_multiplier * _fee) /
((_offpeg_fee_multiplier - FEE_DENOMINATOR) * 4 * xpi * xpj / xps2 + FEE_DENOMINATOR)
)
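A direct Python port of `_dynamic_fee` (with `//` mirroring Vyper's integer division) shows the behavior the docstring describes: a perfectly balanced pool pays exactly the base fee, while a depegged pool pays more. The numeric inputs below are illustrative, not taken from any deployed pool:

```python
FEE_DENOMINATOR = 10**10  # same constant as the contract

def dynamic_fee(xpi: int, xpj: int, base_fee: int, offpeg_mult: int) -> int:
    # Python port of the contract's _dynamic_fee using integer math.
    if offpeg_mult <= FEE_DENOMINATOR:
        return base_fee
    xps2 = (xpi + xpj) ** 2
    return (offpeg_mult * base_fee) // (
        (offpeg_mult - FEE_DENOMINATOR) * 4 * xpi * xpj // xps2 + FEE_DENOMINATOR
    )

# Balanced pool (xpi == xpj): 4*xpi*xpj/xps2 == 1, so the result is base_fee.
balanced = dynamic_fee(10**18, 10**18, 10**6, 2 * 10**10)
# Depegged pool: 4*xpi*xpj/xps2 < 1 shrinks the denominator, raising the fee.
depegged = dynamic_fee(2 * 10**18, 10**18, 10**6, 2 * 10**10)
```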
@internal
def __exchange(
x: uint256,
_xp: DynArray[uint256, MAX_COINS],
rates: DynArray[uint256, MAX_COINS],
i: int128,
j: int128,
) -> uint256:
amp: uint256 = self._A()
D: uint256 = self.get_D(_xp, amp)
y: uint256 = self.get_y(i, j, x, _xp, amp, D)
dy: uint256 = _xp[j] - y - 1 # -1 just in case there were some rounding errors
dy_fee: uint256 = dy * self._dynamic_fee((_xp[i] + x) / 2, (_xp[j] + y) / 2, self.fee) / FEE_DENOMINATOR
# Convert all to real units
dy = (dy - dy_fee) * PRECISION / rates[j]
self.admin_balances[j] += (
dy_fee * admin_fee / FEE_DENOMINATOR
) * PRECISION / rates[j]
# Calculate and store state prices:
xp: DynArray[uint256, MAX_COINS] = _xp
xp[i] = x
xp[j] = y
# D is not changed because we did not apply a fee
self.upkeep_oracles(xp, amp, D)
return dy
@internal
def _exchange(
sender: address,
i: int128,
j: int128,
_dx: uint256,
_min_dy: uint256,
receiver: address,
expect_optimistic_transfer: bool
) -> uint256:
assert i != j # dev: coin index out of range
assert _dx > 0 # dev: do not exchange 0 coins
rates: DynArray[uint256, MAX_COINS] = self._stored_rates()
old_balances: DynArray[uint256, MAX_COINS] = self._balances()
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(rates, old_balances)
# --------------------------- Do Transfer in -----------------------------
# `dx` is whatever the pool received after ERC20 transfer:
dx: uint256 = self._transfer_in(
i,
_dx,
sender,
expect_optimistic_transfer
)
# ------------------------------- Exchange -------------------------------
x: uint256 = xp[i] + dx * rates[i] / PRECISION
dy: uint256 = self.__exchange(x, xp, rates, i, j)
assert dy >= _min_dy, "Exchange resulted in fewer coins than expected"
# --------------------------- Do Transfer out ----------------------------
self._transfer_out(j, dy, receiver)
# ------------------------------------------------------------------------
log TokenExchange(msg.sender, i, _dx, j, dy)
return dy
@internal
def _withdraw_admin_fees():
fee_receiver: address = factory.fee_receiver()
assert fee_receiver != empty(address) # dev: fee receiver not set
admin_balances: DynArray[uint256, MAX_COINS] = self.admin_balances
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
if admin_balances[i] > 0:
self._transfer_out(i, admin_balances[i], fee_receiver)
admin_balances[i] = 0
self.admin_balances = admin_balances
# --------------------------- AMM Math Functions -----------------------------
@view
@internal
def get_y(
i: int128,
j: int128,
x: uint256,
xp: DynArray[uint256, MAX_COINS],
_amp: uint256,
_D: uint256
) -> uint256:
"""
Calculate x[j] if one makes x[i] = x
    Done by solving the quadratic equation iteratively.
x_1**2 + x_1 * (sum' - (A*n**n - 1) * D / (A * n**n)) = D ** (n + 1) / (n ** (2 * n) * prod' * A)
x_1**2 + b*x_1 = c
x_1 = (x_1**2 + c) / (2*x_1 + b)
"""
# x in the input is converted to the same price/precision
assert i != j # dev: same coin
assert j >= 0 # dev: j below zero
assert j < N_COINS_128 # dev: j above N_COINS
# should be unreachable, but good for safety
assert i >= 0
assert i < N_COINS_128
amp: uint256 = _amp
D: uint256 = _D
S_: uint256 = 0
_x: uint256 = 0
y_prev: uint256 = 0
c: uint256 = D
Ann: uint256 = amp * N_COINS
for _i in range(MAX_COINS_128):
if _i == N_COINS_128:
break
if _i == i:
_x = x
elif _i != j:
_x = xp[_i]
else:
continue
S_ += _x
c = c * D / (_x * N_COINS)
c = c * D * A_PRECISION / (Ann * N_COINS)
b: uint256 = S_ + D * A_PRECISION / Ann # - D
y: uint256 = D
for _i in range(255):
y_prev = y
y = (y*y + c) / (2 * y + b - D)
# Equality with the precision of 1
if y > y_prev:
if y - y_prev <= 1:
return y
else:
if y_prev - y <= 1:
return y
raise
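The loop at the end of `get_y` is a scalar Newton iteration on `y**2 + y*(b - D) = c`. A minimal Python sketch of just that step (the names `b`, `c`, `D`, `y0` stand in for the values `get_y` computes above):

```python
# The scalar Newton step used by get_y above:
# solve y**2 + y*(b - D) = c by iterating y <- (y*y + c) // (2*y + b - D).
def solve_y(b: int, c: int, D: int, y0: int) -> int:
    y = y0
    for _ in range(255):
        y_prev = y
        y = (y * y + c) // (2 * y + b - D)
        # Converged once successive iterates agree to within 1 unit.
        if abs(y - y_prev) <= 1:
            return y
    raise RuntimeError("no convergence")
```

For example, with `b - D = 5` the root of `y**2 + 5*y = 10500` is `y = 100`, which the iteration reaches in a handful of steps from `y0 = 1000`.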
@pure
@internal
def get_D(_xp: DynArray[uint256, MAX_COINS], _amp: uint256) -> uint256:
"""
D invariant calculation in non-overflowing integer operations
iteratively
A * sum(x_i) * n**n + D = A * D * n**n + D**(n+1) / (n**n * prod(x_i))
Converging solution:
D[j+1] = (A * n**n * sum(x_i) - D[j]**(n+1) / (n**n prod(x_i))) / (A * n**n - 1)
"""
S: uint256 = 0
for x in _xp:
S += x
if S == 0:
return 0
D: uint256 = S
Ann: uint256 = _amp * N_COINS
D_P: uint256 = 0
Dprev: uint256 = 0
for i in range(255):
D_P = D
for x in _xp:
D_P = D_P * D / (x * N_COINS)
Dprev = D
# (Ann * S / A_PRECISION + D_P * N_COINS) * D / ((Ann - A_PRECISION) * D / A_PRECISION + (N_COINS + 1) * D_P)
D = (
(unsafe_div(Ann * S, A_PRECISION) + D_P * N_COINS) *
D / (
unsafe_div((Ann - A_PRECISION) * D, A_PRECISION) +
unsafe_add(N_COINS, 1) * D_P
)
)
# Equality with the precision of 1
if D > Dprev:
if D - Dprev <= 1:
return D
else:
if Dprev - D <= 1:
return D
# convergence typically occurs in 4 rounds or less, this should be unreachable!
# if it does happen the pool is borked and LPs can withdraw via `remove_liquidity`
raise
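The `get_D` invariant solver above translates almost directly to Python. This sketch assumes `A_PRECISION = 100` (a typical StableSwap parameterization) and takes `amp` already scaled by `A_PRECISION`, as the contract does:

```python
# Illustrative Python version of get_D's Newton iteration.
A_PRECISION = 100  # assumption mirroring typical StableSwap parameters

def get_D(xp: list[int], amp: int) -> int:
    n = len(xp)
    S = sum(xp)
    if S == 0:
        return 0
    D = S
    Ann = amp * n
    for _ in range(255):
        # D_P = D**(n+1) / (n**n * prod(x_i))
        D_P = D
        for x in xp:
            D_P = D_P * D // (x * n)
        Dprev = D
        D = (
            (Ann * S // A_PRECISION + D_P * n) * D
            // ((Ann - A_PRECISION) * D // A_PRECISION + (n + 1) * D_P)
        )
        # Converged once successive iterates agree to within 1 unit.
        if abs(D - Dprev) <= 1:
            return D
    raise RuntimeError("D did not converge")
```

For a perfectly balanced pool, D equals the plain sum of the scaled balances and the iteration converges immediately.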
@pure
@internal
def get_y_D(
A: uint256,
i: int128,
xp: DynArray[uint256, MAX_COINS],
D: uint256
) -> uint256:
"""
Calculate x[i] if one reduces D from being calculated for xp to D
Done by solving quadratic equation iteratively.
x_1**2 + x_1 * (sum' - (A*n**n - 1) * D / (A * n**n)) = D ** (n + 1) / (n ** (2 * n) * prod' * A)
x_1**2 + b*x_1 = c
x_1 = (x_1**2 + c) / (2*x_1 + b)
"""
# x in the input is converted to the same price/precision
assert i >= 0 # dev: i below zero
assert i < N_COINS_128 # dev: i above N_COINS
S_: uint256 = 0
_x: uint256 = 0
y_prev: uint256 = 0
c: uint256 = D
Ann: uint256 = A * N_COINS
for _i in range(MAX_COINS_128):
if _i == N_COINS_128:
break
if _i != i:
_x = xp[_i]
else:
continue
S_ += _x
c = c * D / (_x * N_COINS)
c = c * D * A_PRECISION / (Ann * N_COINS)
b: uint256 = S_ + D * A_PRECISION / Ann
y: uint256 = D
for _i in range(255):
y_prev = y
y = (y*y + c) / (2 * y + b - D)
# Equality with the precision of 1
if y > y_prev:
if y - y_prev <= 1:
return y
else:
if y_prev - y <= 1:
return y
raise
@view
@internal
def _A() -> uint256:
"""
Handle ramping A up or down
"""
t1: uint256 = self.future_A_time
A1: uint256 = self.future_A
if block.timestamp < t1:
A0: uint256 = self.initial_A
t0: uint256 = self.initial_A_time
# Expressions in uint256 cannot have negative numbers, thus "if"
if A1 > A0:
return A0 + (A1 - A0) * (block.timestamp - t0) / (t1 - t0)
else:
return A0 - (A0 - A1) * (block.timestamp - t0) / (t1 - t0)
else: # when t1 == 0 or block.timestamp >= t1
return A1
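The A-ramping logic in `_A()` above is a linear interpolation between the initial and future values over the ramp window. A sketch (names stand in for the contract's storage variables):

```python
# Illustrative sketch of linear A ramping as in _A() above.
def current_A(initial_A: int, future_A: int, t0: int, t1: int, now: int) -> int:
    if now >= t1:  # ramp finished, or never started (t1 == 0)
        return future_A
    if future_A > initial_A:  # ramping up; uint math forbids negatives
        return initial_A + (future_A - initial_A) * (now - t0) // (t1 - t0)
    return initial_A - (initial_A - future_A) * (now - t0) // (t1 - t0)
```

Halfway through the window, the interpolated value is exactly the midpoint of the two endpoints, in either direction.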
@pure
@internal
def _xp_mem(
_rates: DynArray[uint256, MAX_COINS],
_balances: DynArray[uint256, MAX_COINS]
) -> DynArray[uint256, MAX_COINS]:
result: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
result.append(_rates[i] * _balances[i] / PRECISION)
return result
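`_xp_mem` above scales each raw balance by its rate so all coins share 1e18 precision. A one-line Python equivalent (`PRECISION` is an assumption mirroring the contract's 1e18 constant):

```python
PRECISION = 10**18  # assumption mirroring the contract constant

def xp_mem(rates: list[int], balances: list[int]) -> list[int]:
    # Scale raw balances by their rates so all coins share 1e18 precision.
    return [r * b // PRECISION for r, b in zip(rates, balances)]
```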
@view
@internal
def get_D_mem(
_rates: DynArray[uint256, MAX_COINS],
_balances: DynArray[uint256, MAX_COINS],
_amp: uint256
) -> uint256:
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(_rates, _balances)
return self.get_D(xp, _amp)
@view
@internal
def _calc_withdraw_one_coin(
_burn_amount: uint256,
i: int128
) -> (
uint256,
uint256,
DynArray[uint256, MAX_COINS],
uint256,
uint256
):
# First, need to calculate
# * Get current D
# * Solve Eqn against y_i for D - _token_amount
amp: uint256 = self._A()
rates: DynArray[uint256, MAX_COINS] = self._stored_rates()
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(rates, self._balances())
D0: uint256 = self.get_D(xp, amp)
total_supply: uint256 = self.total_supply
D1: uint256 = D0 - _burn_amount * D0 / total_supply
new_y: uint256 = self.get_y_D(amp, i, xp, D1)
base_fee: uint256 = self.fee * N_COINS / (4 * (N_COINS - 1))
ys: uint256 = (D0 + D1) / (2 * N_COINS)
xp_reduced: DynArray[uint256, MAX_COINS] = xp
dx_expected: uint256 = 0
xp_j: uint256 = 0
xavg: uint256 = 0
dynamic_fee: uint256 = 0
for j in range(MAX_COINS_128):
if j == N_COINS_128:
break
dx_expected = 0
xp_j = xp[j]
if j == i:
dx_expected = xp_j * D1 / D0 - new_y
xavg = (xp_j + new_y) / 2
else:
dx_expected = xp_j - xp_j * D1 / D0
xavg = xp_j
dynamic_fee = self._dynamic_fee(xavg, ys, base_fee)
xp_reduced[j] = xp_j - dynamic_fee * dx_expected / FEE_DENOMINATOR
dy: uint256 = xp_reduced[i] - self.get_y_D(amp, i, xp_reduced, D1)
dy_0: uint256 = (xp[i] - new_y) * PRECISION / rates[i] # w/o fees
dy = (dy - 1) * PRECISION / rates[i] # Withdraw less to account for rounding errors
# update xp with new_y for p calculations.
xp[i] = new_y
return dy, dy_0 - dy, xp, amp, D1
# -------------------------- AMM Price Methods -------------------------------
@pure
@internal
def pack_2(p1: uint256, p2: uint256) -> uint256:
assert p1 < 2**128
assert p2 < 2**128
return p1 | (p2 << 128)
@pure
@internal
def unpack_2(packed: uint256) -> uint256[2]:
return [packed & (2**128 - 1), packed >> 128]
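`pack_2`/`unpack_2` above squeeze two 128-bit values into one 256-bit storage slot — the low 128 bits hold `p1`, the high 128 bits hold `p2`. The same bit manipulation in Python:

```python
# Packing two sub-2**128 values into one 256-bit word, as pack_2/unpack_2 do.
def pack_2(p1: int, p2: int) -> int:
    assert p1 < 2**128 and p2 < 2**128
    return p1 | (p2 << 128)  # p1 in the low bits, p2 shifted to the high bits

def unpack_2(packed: int) -> list[int]:
    return [packed & (2**128 - 1), packed >> 128]
```

Round-tripping through `pack_2` then `unpack_2` recovers the original pair.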
@internal
@pure
def _get_p(
xp: DynArray[uint256, MAX_COINS],
amp: uint256,
D: uint256,
) -> DynArray[uint256, MAX_COINS]:
    # dx_0 / dx_1 only; however, the pool can have any number of coins
ANN: uint256 = unsafe_mul(amp, N_COINS)
Dr: uint256 = unsafe_div(D, pow_mod256(N_COINS, N_COINS))
for i in range(MAX_COINS_128):
if i == N_COINS_128:
break
Dr = Dr * D / xp[i]
p: DynArray[uint256, MAX_COINS] = empty(DynArray[uint256, MAX_COINS])
xp0_A: uint256 = ANN * xp[0] / A_PRECISION
for i in range(1, MAX_COINS):
if i == N_COINS:
break
p.append(10**18 * (xp0_A + Dr * xp[0] / xp[i]) / (xp0_A + Dr))
return p
@internal
def upkeep_oracles(xp: DynArray[uint256, MAX_COINS], amp: uint256, D: uint256):
"""
@notice Upkeeps price and D oracles.
"""
ma_last_time_unpacked: uint256[2] = self.unpack_2(self.ma_last_time)
last_prices_packed_current: DynArray[uint256, MAX_COINS] = self.last_prices_packed
last_prices_packed_new: DynArray[uint256, MAX_COINS] = last_prices_packed_current
spot_price: DynArray[uint256, MAX_COINS] = self._get_p(xp, amp, D)
# -------------------------- Upkeep price oracle -------------------------
for i in range(MAX_COINS):
if i == N_COINS - 1:
break
if spot_price[i] != 0:
            # Update packed prices -----------------
last_prices_packed_new[i] = self.pack_2(
spot_price[i],
self._calc_moving_average(
last_prices_packed_current[i],
self.ma_exp_time,
                    ma_last_time_unpacked[0],  # index 0 is ma_last_time for prices
)
)
self.last_prices_packed = last_prices_packed_new
# ---------------------------- Upkeep D oracle ---------------------------
last_D_packed_current: uint256 = self.last_D_packed
self.last_D_packed = self.pack_2(
D,
self._calc_moving_average(
last_D_packed_current,
self.D_ma_time,
            ma_last_time_unpacked[1],  # index 1 is ma_last_time for D
)
)
# Housekeeping: Update ma_last_time for p and D oracles ------------------
for i in range(2):
if ma_last_time_unpacked[i] < block.timestamp:
ma_last_time_unpacked[i] = block.timestamp
self.ma_last_time = self.pack_2(ma_last_time_unpacked[0], ma_last_time_unpacked[1])
@internal
@view
def _calc_moving_average(
packed_value: uint256,
averaging_window: uint256,
ma_last_time: uint256
) -> uint256:
last_spot_value: uint256 = packed_value & (2**128 - 1)
last_ema_value: uint256 = (packed_value >> 128)
if ma_last_time < block.timestamp: # calculate new_ema_value and return that.
alpha: uint256 = self.exp(
-convert(
(block.timestamp - ma_last_time) * 10**18 / averaging_window, int256
)
)
return (last_spot_value * (10**18 - alpha) + last_ema_value * alpha) / 10**18
return last_ema_value
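`_calc_moving_average` above is an exponential moving average whose weight `alpha = exp(-dt / window)` decays with the time elapsed since the last update. A Python sketch using float `math.exp` for clarity (the contract uses its fixed-point `exp()` for determinism):

```python
import math

# Illustrative sketch of _calc_moving_average: an EMA whose weight on the
# old value decays exponentially with elapsed time.
def calc_moving_average(last_spot: int, last_ema: int,
                        averaging_window: int, ma_last_time: int,
                        now: int) -> int:
    if ma_last_time >= now:
        return last_ema  # no time elapsed, nothing to average
    alpha = int(math.exp(-(now - ma_last_time) / averaging_window) * 10**18)
    # alpha near 1e18 (little time passed) keeps the old EMA;
    # alpha near 0 (long gap) snaps to the latest spot value.
    return (last_spot * (10**18 - alpha) + last_ema * alpha) // 10**18
```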
@view
@external
def last_price(i: uint256) -> uint256:
return self.last_prices_packed[i] & (2**128 - 1)
@view
@external
def ema_price(i: uint256) -> uint256:
return (self.last_prices_packed[i] >> 128)
@external
@view
def get_p(i: uint256) -> uint256:
"""
@notice Returns the AMM State price of token
@dev if i = 0, it will return the state price of coin[1].
@param i index of state price (0 for coin[1], 1 for coin[2], ...)
@return uint256 The state price quoted by the AMM for coin[i+1]
"""
amp: uint256 = self._A()
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(
self._stored_rates(), self._balances()
)
D: uint256 = self.get_D(xp, amp)
return self._get_p(xp, amp, D)[i]
@external
@view
@nonreentrant('lock')
def price_oracle(i: uint256) -> uint256:
return self._calc_moving_average(
self.last_prices_packed[i],
self.ma_exp_time,
self.ma_last_time & (2**128 - 1)
)
@external
@view
@nonreentrant('lock')
def D_oracle() -> uint256:
return self._calc_moving_average(
self.last_D_packed,
self.D_ma_time,
self.ma_last_time >> 128
)
# ----------------------------- Math Utils -----------------------------------
@internal
@pure
def exp(x: int256) -> uint256:
"""
@dev Calculates the natural exponential function of a signed integer with
a precision of 1e18.
@notice Note that this function consumes about 810 gas units. The implementation
is inspired by Remco Bloemen's implementation under the MIT license here:
https://xn--2-umb.com/22/exp-ln.
@dev This implementation is derived from Snekmate, which is authored
by pcaversaccio (Snekmate), distributed under the AGPL-3.0 license.
https://github.com/pcaversaccio/snekmate
@param x The 32-byte variable.
    @return uint256 The 32-byte calculation result.
"""
value: int256 = x
# If the result is `< 0.5`, we return zero. This happens when we have the following:
# "x <= floor(log(0.5e18) * 1e18) ~ -42e18".
if (x <= -42139678854452767551):
return empty(uint256)
# When the result is "> (2 ** 255 - 1) / 1e18" we cannot represent it as a signed integer.
# This happens when "x >= floor(log((2 ** 255 - 1) / 1e18) * 1e18) ~ 135".
assert x < 135305999368893231589, "wad_exp overflow"
# `x` is now in the range "(-42, 136) * 1e18". Convert to "(-42, 136) * 2 ** 96" for higher
# intermediate precision and a binary base. This base conversion is a multiplication with
# "1e18 / 2 ** 96 = 5 ** 18 / 2 ** 78".
value = unsafe_div(x << 78, 5 ** 18)
# Reduce the range of `x` to "(-½ ln 2, ½ ln 2) * 2 ** 96" by factoring out powers of two
    # so that "exp(x) = exp(x') * 2 ** k", where `k` is a signed integer. Solving this gives
# "k = round(x / log(2))" and "x' = x - k * log(2)". Thus, `k` is in the range "[-61, 195]".
k: int256 = unsafe_add(unsafe_div(value << 96, 54916777467707473351141471128), 2 ** 95) >> 96
value = unsafe_sub(value, unsafe_mul(k, 54916777467707473351141471128))
# Evaluate using a "(6, 7)"-term rational approximation. Since `p` is monic,
# we will multiply by a scaling factor later.
y: int256 = unsafe_add(unsafe_mul(unsafe_add(value, 1346386616545796478920950773328), value) >> 96, 57155421227552351082224309758442)
p: int256 = unsafe_add(unsafe_mul(unsafe_add(unsafe_mul(unsafe_sub(unsafe_add(y, value), 94201549194550492254356042504812), y) >> 96,\
28719021644029726153956944680412240), value), 4385272521454847904659076985693276 << 96)
# We leave `p` in the "2 ** 192" base so that we do not have to scale it up
# again for the division.
q: int256 = unsafe_add(unsafe_mul(unsafe_sub(value, 2855989394907223263936484059900), value) >> 96, 50020603652535783019961831881945)
q = unsafe_sub(unsafe_mul(q, value) >> 96, 533845033583426703283633433725380)
q = unsafe_add(unsafe_mul(q, value) >> 96, 3604857256930695427073651918091429)
q = unsafe_sub(unsafe_mul(q, value) >> 96, 14423608567350463180887372962807573)
q = unsafe_add(unsafe_mul(q, value) >> 96, 26449188498355588339934803723976023)
# The polynomial `q` has no zeros in the range because all its roots are complex.
# No scaling is required, as `p` is already "2 ** 96" too large. Also,
# `r` is in the range "(0.09, 0.25) * 2**96" after the division.
r: int256 = unsafe_div(p, q)
# To finalise the calculation, we have to multiply `r` by:
# - the scale factor "s = ~6.031367120",
# - the factor "2 ** k" from the range reduction, and
# - the factor "1e18 / 2 ** 96" for the base conversion.
# We do this all at once, with an intermediate result in "2**213" base,
# so that the final right shift always gives a positive value.
# Note that to circumvent Vyper's safecast feature for the potentially
# negative parameter value `r`, we first convert `r` to `bytes32` and
# subsequently to `uint256`. Remember that the EVM default behaviour is
# to use two's complement representation to handle signed integers.
return unsafe_mul(convert(convert(r, bytes32), uint256), 3822833074963236453042738258902158003155416615667) >> convert(unsafe_sub(195, k), uint256)
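The `exp()` routine above computes `e**x` in 1e18 "wad" fixed point. A float reference for the same convention (illustrative only — the contract's integer polynomial is what actually runs on-chain):

```python
import math

WAD = 10**18  # 1e18 fixed-point scale used by the contract's exp()

def wad_exp_float(x_wad: int) -> int:
    """Float reference for e**x in wad units; the contract computes the
    same quantity purely in integer arithmetic for determinism."""
    return int(math.exp(x_wad / WAD) * WAD)
```

At the contract's lower cutoff (~-42e18) the true result drops below one wad unit, which is why `exp()` short-circuits to zero there.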
# ---------------------------- ERC20 Utils -----------------------------------
@view
@internal
def _domain_separator() -> bytes32:
if chain.id != CACHED_CHAIN_ID:
return keccak256(
_abi_encode(
EIP712_TYPEHASH,
NAME_HASH,
VERSION_HASH,
chain.id,
self,
salt,
)
)
return CACHED_DOMAIN_SEPARATOR
@internal
def _transfer(_from: address, _to: address, _value: uint256):
    # NOTE: vyper does not allow underflows,
    # so the following subtraction would revert on insufficient balance
self.balanceOf[_from] -= _value
self.balanceOf[_to] += _value
log Transfer(_from, _to, _value)
@internal
def _burnFrom(_from: address, _burn_amount: uint256):
self.total_supply -= _burn_amount
self.balanceOf[_from] -= _burn_amount
log Transfer(_from, empty(address), _burn_amount)
@external
def transfer(_to : address, _value : uint256) -> bool:
"""
@dev Transfer token for a specified address
@param _to The address to transfer to.
@param _value The amount to be transferred.
"""
self._transfer(msg.sender, _to, _value)
return True
@external
def transferFrom(_from : address, _to : address, _value : uint256) -> bool:
"""
@dev Transfer tokens from one address to another.
@param _from address The address which you want to send tokens from
@param _to address The address which you want to transfer to
@param _value uint256 the amount of tokens to be transferred
"""
self._transfer(_from, _to, _value)
_allowance: uint256 = self.allowance[_from][msg.sender]
if _allowance != max_value(uint256):
self.allowance[_from][msg.sender] = _allowance - _value
return True
@external
def approve(_spender : address, _value : uint256) -> bool:
"""
@notice Approve the passed address to transfer the specified amount of
tokens on behalf of msg.sender
@dev Beware that changing an allowance via this method brings the risk that
someone may use both the old and new allowance by unfortunate transaction
ordering: https://github.com/ethereum/EIPs/issues/20#issuecomment-263524729
@param _spender The address which will transfer the funds
@param _value The amount of tokens that may be transferred
@return bool success
"""
self.allowance[msg.sender][_spender] = _value
log Approval(msg.sender, _spender, _value)
return True
@external
def permit(
_owner: address,
_spender: address,
_value: uint256,
_deadline: uint256,
_v: uint8,
_r: bytes32,
_s: bytes32
) -> bool:
"""
@notice Approves spender by owner's signature to expend owner's tokens.
See https://eips.ethereum.org/EIPS/eip-2612.
@dev Inspired by https://github.com/yearn/yearn-vaults/blob/main/contracts/Vault.vy#L753-L793
@dev Supports smart contract wallets which implement ERC1271
https://eips.ethereum.org/EIPS/eip-1271
@param _owner The address which is a source of funds and has signed the Permit.
@param _spender The address which is allowed to spend the funds.
@param _value The amount of tokens to be spent.
@param _deadline The timestamp after which the Permit is no longer valid.
@param _v The bytes[64] of the valid secp256k1 signature of permit by owner
@param _r The bytes[0:32] of the valid secp256k1 signature of permit by owner
@param _s The bytes[32:64] of the valid secp256k1 signature of permit by owner
@return True, if transaction completes successfully
"""
assert _owner != empty(address)
assert block.timestamp <= _deadline
nonce: uint256 = self.nonces[_owner]
digest: bytes32 = keccak256(
concat(
b"\x19\x01",
self._domain_separator(),
keccak256(_abi_encode(EIP2612_TYPEHASH, _owner, _spender, _value, nonce, _deadline))
)
)
if _owner.is_contract:
sig: Bytes[65] = concat(_abi_encode(_r, _s), slice(convert(_v, bytes32), 31, 1))
# reentrancy not a concern since this is a staticcall
assert ERC1271(_owner).isValidSignature(digest, sig) == ERC1271_MAGIC_VAL
else:
assert ecrecover(digest, convert(_v, uint256), convert(_r, uint256), convert(_s, uint256)) == _owner
self.allowance[_owner][_spender] = _value
self.nonces[_owner] = nonce + 1
log Approval(_owner, _spender, _value)
return True
@view
@external
def DOMAIN_SEPARATOR() -> bytes32:
"""
@notice EIP712 domain separator.
@return bytes32 Domain Separator set for the current chain.
"""
return self._domain_separator()
# ------------------------- AMM View Functions -------------------------------
@view
@external
def get_dx(i: int128, j: int128, dy: uint256) -> uint256:
"""
@notice Calculate the current input dx given output dy
@dev Index values can be found via the `coins` public getter method
@param i Index value for the coin to send
    @param j Index value of the coin to receive
@param dy Amount of `j` being received after exchange
@return Amount of `i` predicted
"""
return StableSwapViews(factory.views_implementation()).get_dx(i, j, dy, self)
@view
@external
def get_dy(i: int128, j: int128, dx: uint256) -> uint256:
"""
@notice Calculate the current output dy given input dx
@dev Index values can be found via the `coins` public getter method
@param i Index value for the coin to send
    @param j Index value of the coin to receive
@param dx Amount of `i` being exchanged
@return Amount of `j` predicted
"""
return StableSwapViews(factory.views_implementation()).get_dy(i, j, dx, self)
@view
@external
def calc_withdraw_one_coin(_burn_amount: uint256, i: int128) -> uint256:
"""
@notice Calculate the amount received when withdrawing a single coin
@param _burn_amount Amount of LP tokens to burn in the withdrawal
@param i Index value of the coin to withdraw
@return Amount of coin received
"""
return self._calc_withdraw_one_coin(_burn_amount, i)[0]
@view
@external
@nonreentrant('lock')
def totalSupply() -> uint256:
"""
@notice The total supply of pool LP tokens
@return self.total_supply, 18 decimals.
"""
return self.total_supply
@view
@external
@nonreentrant('lock')
def get_virtual_price() -> uint256:
"""
@notice The current virtual price of the pool LP token
@dev Useful for calculating profits.
         The method may be vulnerable to donation-style attacks if the pool
         contains rebasing tokens. For integrators, caution is advised.
@return LP token virtual price normalized to 1e18
"""
amp: uint256 = self._A()
xp: DynArray[uint256, MAX_COINS] = self._xp_mem(
self._stored_rates(), self._balances()
)
D: uint256 = self.get_D(xp, amp)
# D is in the units similar to DAI (e.g. converted to precision 1e18)
# When balanced, D = n * x_u - total virtual value of the portfolio
return D * PRECISION / self.total_supply
@view
@external
def calc_token_amount(
_amounts: DynArray[uint256, MAX_COINS],
_is_deposit: bool
) -> uint256:
"""
@notice Calculate addition or reduction in token supply from a deposit or withdrawal
@param _amounts Amount of each coin being deposited
@param _is_deposit set True for deposits, False for withdrawals
@return Expected amount of LP tokens received
"""
return StableSwapViews(factory.views_implementation()).calc_token_amount(_amounts, _is_deposit, self)
@view
@external
def A() -> uint256:
return self._A() / A_PRECISION
@view
@external
def A_precise() -> uint256:
return self._A()
@view
@external
def balances(i: uint256) -> uint256:
"""
@notice Get the current balance of a coin within the
pool, less the accrued admin fees
@param i Index value for the coin to query balance of
@return Token balance
"""
return self._balances()[i]
@view
@external
def get_balances() -> DynArray[uint256, MAX_COINS]:
return self._balances()
@view
@external
def stored_rates() -> DynArray[uint256, MAX_COINS]:
return self._stored_rates()
@view
@external
def dynamic_fee(i: int128, j: int128) -> uint256:
"""
@notice Return the fee for swapping between `i` and `j`
@param i Index value for the coin to send
    @param j Index value of the coin to receive
@return Swap fee expressed as an integer with 1e10 precision
"""
return StableSwapViews(factory.views_implementation()).dynamic_fee(i, j, self)
# --------------------------- AMM Admin Functions ----------------------------
@external
def ramp_A(_future_A: uint256, _future_time: uint256):
assert msg.sender == factory.admin() # dev: only owner
assert block.timestamp >= self.initial_A_time + MIN_RAMP_TIME
assert _future_time >= block.timestamp + MIN_RAMP_TIME # dev: insufficient time
_initial_A: uint256 = self._A()
_future_A_p: uint256 = _future_A * A_PRECISION
assert _future_A > 0 and _future_A < MAX_A
if _future_A_p < _initial_A:
assert _future_A_p * MAX_A_CHANGE >= _initial_A
else:
assert _future_A_p <= _initial_A * MAX_A_CHANGE
self.initial_A = _initial_A
self.future_A = _future_A_p
self.initial_A_time = block.timestamp
self.future_A_time = _future_time
log RampA(_initial_A, _future_A_p, block.timestamp, _future_time)
@external
def stop_ramp_A():
assert msg.sender == factory.admin() # dev: only owner
current_A: uint256 = self._A()
self.initial_A = current_A
self.future_A = current_A
self.initial_A_time = block.timestamp
self.future_A_time = block.timestamp
# now (block.timestamp < t1) is always False, so we return saved A
log StopRampA(current_A, block.timestamp)
@external
def set_new_fee(_new_fee: uint256, _new_offpeg_fee_multiplier: uint256):
assert msg.sender == factory.admin()
# set new fee:
assert _new_fee <= MAX_FEE
self.fee = _new_fee
# set new offpeg_fee_multiplier:
assert _new_offpeg_fee_multiplier * _new_fee <= MAX_FEE * FEE_DENOMINATOR # dev: offpeg multiplier exceeds maximum
self.offpeg_fee_multiplier = _new_offpeg_fee_multiplier
log ApplyNewFee(_new_fee, _new_offpeg_fee_multiplier)
@external
def set_ma_exp_time(_ma_exp_time: uint256, _D_ma_time: uint256):
"""
@notice Set the moving average window of the price oracles.
    @param _ma_exp_time Moving average window for prices. It is time_in_seconds / ln(2)
    @param _D_ma_time Moving average window for the D oracle, also time_in_seconds / ln(2)
"""
assert msg.sender == factory.admin() # dev: only owner
assert 0 not in [_ma_exp_time, _D_ma_time]
self.ma_exp_time = _ma_exp_time
self.D_ma_time = _D_ma_time


---input---
<button aria-label="Submit">Submit</button>
---tokens---
'<' Punctuation
'button' Name.Tag
' ' Text.Whitespace
'aria-label' Name.Attribute
'=' Operator
'"Submit"' Literal.String
'>' Punctuation
'Submit' Name.Other
'</' Punctuation
'button' Name.Tag
'>' Punctuation
'\n' Text.Whitespace
---input---
<button onClick={e => e.preventDefault ()} />
---tokens---
'<' Punctuation
'button' Name.Tag
' ' Text.Whitespace
'onClick' Name.Attribute
'=' Operator
'{' Punctuation
'e' Name.Other
' ' Text.Whitespace
'=>' Punctuation
' ' Text.Whitespace
'e' Name.Other
'.' Punctuation
'preventDefault' Name.Other
' ' Text.Whitespace
'(' Punctuation
')' Punctuation
'}' Punctuation
' ' Text.Whitespace
'/' Punctuation
'>' Punctuation
'\n' Text.Whitespace
---input---
<React.Fragment></React.Fragment>
---tokens---
'<' Punctuation
'React' Name.Tag
'.' Punctuation
'Fragment' Name.Attribute
'>' Punctuation
'</' Punctuation
'React' Name.Tag
'.' Punctuation
'Fragment' Name.Attribute
'>' Punctuation
'\n' Text.Whitespace


---input---
<User name={'john'} last={'doe'} />
---tokens---
'<' Punctuation
'User' Name.Tag
' ' Text.Whitespace
'name' Name.Attribute
'=' Operator
'{' Punctuation
"'john'" Literal.String.Single
'}' Punctuation
' ' Text.Whitespace
'last' Name.Attribute
'=' Operator
'{' Punctuation
"'doe'" Literal.String.Single
'}' Punctuation
' ' Text.Whitespace
'/' Punctuation
'>' Punctuation
'\n' Text.Whitespace
---input---
<div style={{ color: 'red' }} />
---tokens---
'<' Punctuation
'div' Name.Tag
' ' Text.Whitespace
'style' Name.Attribute
'=' Operator
'{' Punctuation
'{' Punctuation
' ' Text.Whitespace
'color' Name.Other
':' Operator
' ' Text.Whitespace
"'red'" Literal.String.Single
' ' Text.Whitespace
'}' Punctuation
'}' Punctuation
' ' Text.Whitespace
'/' Punctuation
'>' Punctuation
'\n' Text.Whitespace
---input---
<></>
---tokens---
'<>' Punctuation
'</>' Punctuation
'\n' Text.Whitespace
---input---
StormEvents
| where StartTime between (datetime(2007-11-01) .. datetime(2007-12-01))
| where State == "FLORIDA"
| count
---tokens---
'StormEvents' Name
'\n' Text.Whitespace
'|' Punctuation
' ' Text.Whitespace
'where' Keyword
' ' Text.Whitespace
'StartTime' Name
' ' Text.Whitespace
'between' Keyword
' ' Text.Whitespace
'(' Punctuation
'datetime' Keyword
'(' Punctuation
'2007' Literal.Number.Integer
'-' Punctuation
'11' Literal.Number.Integer
'-' Punctuation
'01' Literal.Number.Integer
')' Punctuation
' ' Text.Whitespace
'.' Error
'.' Error
' ' Text.Whitespace
'datetime' Keyword
'(' Punctuation
'2007' Literal.Number.Integer
'-' Punctuation
'12' Literal.Number.Integer
'-' Punctuation
'01' Literal.Number.Integer
')' Punctuation
')' Punctuation
'\n' Text.Whitespace
'|' Punctuation
' ' Text.Whitespace
'where' Keyword
' ' Text.Whitespace
'State' Name
' ' Text.Whitespace
'==' Punctuation
' ' Text.Whitespace
'"' Literal.String
'FLORIDA' Literal.String
'"' Literal.String
'\n' Text.Whitespace
'|' Punctuation
' ' Text.Whitespace
'count' Keyword
'\n' Text.Whitespace
---input---
"hello world"
"hello ${ { a = "world"; }.a }"
"1 2 ${toString 3}"
"${pkgs.bash}/bin/sh"
true, false, null, 123, 3.141
-1
/etc
./foo.png
~/.config
<nixpkgs>
''
multi
line
string
''
''
multi
${value}
string
''
---tokens---
'"' Literal.String.Double
'hello world' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'hello ' Literal.String.Double
'${' Literal.String.Interpol
' ' Text
'{' Punctuation
' ' Text
'a' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'"' Literal.String.Double
'world' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
' ' Text
'}' Punctuation
'.' Operator
'a' Text
' ' Text
'}' Literal.String.Interpol
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'1 2 ' Literal.String.Double
'${' Literal.String.Interpol
'toString' Name.Builtin
' ' Text
'3' Literal.Number.Integer
'}' Literal.String.Interpol
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'${' Literal.String.Interpol
'pkgs' Text
'.' Operator
'bash' Text
'}' Literal.String.Interpol
'/bin/sh' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'true' Name.Constant
',' Punctuation
' ' Text
'false' Name.Constant
',' Punctuation
' ' Text
'null' Name.Constant
',' Punctuation
' ' Text
'123' Literal.Number.Integer
',' Punctuation
' ' Text
'3.141' Literal.Number.Float
'\n\n' Text
'-1' Literal.Number.Integer
'\n\n' Text
'/etc' Literal
'\n' Text
'./foo.png' Literal
'\n' Text
'~/.config' Literal
'\n\n' Text
'<nixpkgs>' Literal
'\n\n' Text
"''" Literal.String.Multiline
'\n multi\n line\n string\n' Literal.String.Multiline
"''" Literal.String.Multiline
'\n\n' Text
"''" Literal.String.Multiline
'\n multi\n ' Literal.String.Multiline
'${' Literal.String.Interpol
'value' Text
'}' Literal.String.Interpol
'\n string\n' Literal.String.Multiline
"''" Literal.String.Multiline
'\n' Text
---input---
import ./foo.nix
map (x: x + x) [ 1 2 3 ]
---tokens---
'import' Name.Builtin
' ' Text
'./foo.nix' Literal
'\n\n' Text
'map' Name.Builtin
' ' Text
'(' Punctuation
'x' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'x' Text
')' Punctuation
' ' Text
'[' Punctuation
' ' Text
'1' Literal.Number.Integer
' ' Text
'2' Literal.Number.Integer
' ' Text
'3' Literal.Number.Integer
' ' Text
']' Punctuation
'\n' Text
---input---
# single line comment
/* multi
line
comment */
---tokens---
'# single line comment' Comment.Single
'\n\n' Text
'/*' Comment.Multiline
' multi\nline\ncomment ' Comment.Multiline
'*/' Comment.Multiline
'\n' Text
---input---
{ x = 1; y = 2; }
{ foo.bar = 1; }
rec { x = "foo"; y = x + "bar"; }
[ "foo" "bar" "baz" ]
[ 1 2 3 ]
[ (f 1) { a = 1; b = 2; } [ "c" ] ]
---tokens---
'{' Punctuation
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
'\n\n' Text
'{' Punctuation
' ' Text
'foo' Text
'.' Operator
'bar' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
'\n\n' Text
'rec' Keyword
' ' Text
'{' Punctuation
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
' ' Text
'}' Punctuation
'\n\n' Text
'[' Punctuation
' ' Text
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
' ' Text
'"' Literal.String.Double
'baz' Literal.String.Double
'"' Literal.String.Double
' ' Text
']' Punctuation
'\n\n' Text
'[' Punctuation
' ' Text
'1' Literal.Number.Integer
' ' Text
'2' Literal.Number.Integer
' ' Text
'3' Literal.Number.Integer
' ' Text
']' Punctuation
'\n\n' Text
'[' Punctuation
' ' Text
'(' Punctuation
'f' Text
' ' Text
'1' Literal.Number.Integer
')' Punctuation
' ' Text
'{' Punctuation
' ' Text
'a' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'b' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
' ' Text
'[' Punctuation
' ' Text
'"' Literal.String.Double
'c' Literal.String.Double
'"' Literal.String.Double
' ' Text
']' Punctuation
' ' Text
']' Punctuation
'\n' Text
---input---
let
bar = "bar";
in {
foo.${bar} = 3;
foo.${bar + "bar"} = 3;
}
---tokens---
'let' Keyword
'\n ' Text
'bar' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
'\n' Text
'in' Keyword
' ' Text
'{' Punctuation
'\n ' Text
'foo' Text
'.' Operator
'${' Literal.String.Interpol
'bar' Text
'}' Literal.String.Interpol
' ' Text
'=' Operator
' ' Text
'3' Literal.Number.Integer
';' Punctuation
'\n ' Text
'foo' Text
'.' Operator
'${' Literal.String.Interpol
'bar' Text
' ' Text
'+' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
'}' Literal.String.Interpol
' ' Text
'=' Operator
' ' Text
'3' Literal.Number.Integer
';' Punctuation
'\n' Text
'}' Punctuation
'\n' Text
---input---
if 1 + 1 == 2 then "yes!" else "no!"
assert 1 + 1 == 2; "yes!"
let x = "foo"; y = "bar"; in x + y
with builtins; head [ 1 2 3 ]
---tokens---
'if' Keyword
' ' Text
'1' Literal.Number.Integer
' ' Text
'+' Operator
' ' Text
'1' Literal.Number.Integer
' ' Text
'==' Operator
' ' Text
'2' Literal.Number.Integer
' ' Text
'then' Keyword
' ' Text
'"' Literal.String.Double
'yes!' Literal.String.Double
'"' Literal.String.Double
' ' Text
'else' Keyword
' ' Text
'"' Literal.String.Double
'no!' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'assert' Keyword
' ' Text
'1' Literal.Number.Integer
' ' Text
'+' Operator
' ' Text
'1' Literal.Number.Integer
' ' Text
'==' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'"' Literal.String.Double
'yes!' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'let' Keyword
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
';' Punctuation
' ' Text
'in' Keyword
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'with' Keyword
' ' Text
'builtins' Name.Builtin
';' Punctuation
' ' Text
'head' Text
' ' Text
'[' Punctuation
' ' Text
'1' Literal.Number.Integer
' ' Text
'2' Literal.Number.Integer
' ' Text
'3' Literal.Number.Integer
' ' Text
']' Punctuation
'\n' Text
---input---
0.3
.1
0.1e4
-3.0
0.1E-4
0.2e+4
---tokens---
'0.3' Literal.Number.Float
'\n' Text
'.1' Literal.Number.Float
'\n' Text
'0.1e4' Literal.Number.Float
'\n' Text
'-3.0' Literal.Number.Float
'\n' Text
'0.1E-4' Literal.Number.Float
'\n' Text
'0.2e+4' Literal.Number.Float
'\n' Text
---input---
x: x + 1
A function that expects an integer and returns it increased by 1
x: y: x + y
(x: x + 1) 100
let inc = x: x + 1; in inc (inc (inc 100))
{ x, y }: x + y
{ x, y ? "bar" }: x + y
{ x, y, ... }: x + y
{ x, y } @ args: x + y
args @ { x, y }: x + y
---tokens---
'x' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'1' Literal.Number.Integer
'\n\n\n' Text
'A' Text
' ' Text
'function' Text
' ' Text
'that' Text
' ' Text
'expects' Text
' ' Text
'an' Text
' ' Text
'integer' Text
' ' Text
'and' Operator.Word
' ' Text
'returns' Text
' ' Text
'it' Text
' ' Text
'increased' Text
' ' Text
'by' Text
' ' Text
'1' Literal.Number.Integer
'\n\n' Text
'x' Text
':' Punctuation
' ' Text
'y' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'(' Punctuation
'x' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'1' Literal.Number.Integer
')' Punctuation
' ' Text
'100' Literal.Number.Integer
'\n\n' Text
'let' Keyword
' ' Text
'inc' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'x' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'in' Keyword
' ' Text
'inc' Text
' ' Text
'(' Punctuation
'inc' Text
' ' Text
'(' Punctuation
'inc' Text
' ' Text
'100' Literal.Number.Integer
')' Punctuation
')' Punctuation
'\n\n' Text
'{' Punctuation
' ' Text
'x' Text
',' Punctuation
' ' Text
'y' Text
' ' Text
'}' Punctuation
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'{' Punctuation
' ' Text
'x' Text
',' Punctuation
' ' Text
'y' Text
' ' Text
'?' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
' ' Text
'}' Punctuation
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'{' Punctuation
' ' Text
'x' Text
',' Punctuation
' ' Text
'y' Text
',' Punctuation
' ' Text
'.' Operator
'.' Operator
'.' Operator
' ' Text
'}' Punctuation
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'{' Punctuation
' ' Text
'x' Text
',' Punctuation
' ' Text
'y' Text
' ' Text
'}' Punctuation
' ' Text
'@' Punctuation
' ' Text
'args' Text
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n\n' Text
'args' Text
' ' Text
'@' Punctuation
' ' Text
'{' Punctuation
' ' Text
'x' Text
',' Punctuation
' ' Text
'y' Text
' ' Text
'}' Punctuation
':' Punctuation
' ' Text
'x' Text
' ' Text
'+' Operator
' ' Text
'y' Text
'\n' Text
---input---
"foo" + "bar"
1 + 2
2 / 1
"foo" == "f" + "oo"
"foo" != "bar"
!true
2 * 2 < 5
5 > 1
2 - 1
{ x = 1; y = 2; }.x
{ x = 1; y = 2; }.z or 3
{ x = 1; y = 2; } // { z = 3; }
---tokens---
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
' ' Text
'+' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'1' Literal.Number.Integer
' ' Text
'+' Operator
' ' Text
'2' Literal.Number.Integer
'\n\n' Text
'2' Literal.Number.Integer
' ' Text
'/' Operator
' ' Text
'1' Literal.Number.Integer
'\n\n' Text
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
' ' Text
'==' Operator
' ' Text
'"' Literal.String.Double
'f' Literal.String.Double
'"' Literal.String.Double
' ' Text
'+' Operator
' ' Text
'"' Literal.String.Double
'oo' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
' ' Text
'!' Operator
'=' Operator
' ' Text
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'!' Operator
'true' Name.Constant
'\n\n' Text
'2' Literal.Number.Integer
' ' Text
'*' Operator
' ' Text
'2' Literal.Number.Integer
' ' Text
'<' Operator
' ' Text
'5' Literal.Number.Integer
'\n\n' Text
'5' Literal.Number.Integer
' ' Text
'>' Operator
' ' Text
'1' Literal.Number.Integer
'\n\n' Text
'2' Literal.Number.Integer
' ' Text
'-' Operator
' ' Text
'1' Literal.Number.Integer
'\n\n' Text
'{' Punctuation
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
'.' Operator
'x' Text
'\n\n' Text
'{' Punctuation
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
'.' Operator
'z' Text
' ' Text
'or' Operator.Word
' ' Text
'3' Literal.Number.Integer
'\n\n' Text
'{' Punctuation
' ' Text
'x' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'1' Literal.Number.Integer
';' Punctuation
' ' Text
'y' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'2' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
' ' Text
'//' Operator
' ' Text
'{' Punctuation
' ' Text
'z' Literal.String.Symbol
' ' Text
'=' Operator
' ' Text
'3' Literal.Number.Integer
';' Punctuation
' ' Text
'}' Punctuation
'\n' Text
---input---
''
''\t ''\n ''\r ''' ''$ '
''
"\"\$${builtins.toString maxArgIndex}\""
"\n"
"cp \"$(nix-build -A ${attr})\" \"$0\" > /dev/null"
"$" #
''$'' #
" \ "
'' ''\ ''
---tokens---
"''" Literal.String.Multiline
'\n ' Literal.String.Multiline
"''\\t" Literal.String.Escape
' ' Literal.String.Multiline
"''\\n" Literal.String.Escape
' ' Literal.String.Multiline
"''\\r" Literal.String.Escape
' ' Literal.String.Multiline
"'''" Literal.String.Escape
' ' Literal.String.Multiline
"''$" Literal.String.Escape
' ' Literal.String.Multiline
"'\n" Literal.String.Multiline
"''" Literal.String.Multiline
'\n\n' Text
'"' Literal.String.Double
'\\"' Literal.String.Escape
'\\$' Literal.String.Escape
'${' Literal.String.Interpol
'builtins' Name.Builtin
'.' Operator
'toString' Name.Builtin
' ' Text
'maxArgIndex' Text
'}' Literal.String.Interpol
'\\"' Literal.String.Escape
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'\\n' Literal.String.Escape
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'cp ' Literal.String.Double
'\\"' Literal.String.Escape
'$(' Literal.String.Double
'nix-build -A ' Literal.String.Double
'${' Literal.String.Interpol
'attr' Text
'}' Literal.String.Interpol
')' Literal.String.Double
'\\"' Literal.String.Escape
' ' Literal.String.Double
'\\"' Literal.String.Escape
'$0' Literal.String.Double
'\\"' Literal.String.Escape
' > /dev/null' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
'"' Literal.String.Double
'$' Literal.String.Double
'"' Literal.String.Double
' ' Text
'#' Comment.Single
'\n\n' Text
"''" Literal.String.Multiline
'$' Literal.String.Multiline
"''" Literal.String.Multiline
' ' Text
'#' Comment.Single
'\n\n' Text
'"' Literal.String.Double
' ' Literal.String.Double
'\\' Literal.String.Double
' ' Literal.String.Double
'"' Literal.String.Double
'\n\n' Text
"''" Literal.String.Multiline
' ' Literal.String.Multiline
"''\\" Literal.String.Escape
' ' Literal.String.Multiline
"''" Literal.String.Multiline
'\n' Text
---input---
from invoices
filter invoice_date >= @1970-01-16
derive {
transaction_fees = 0.8,
income = total - transaction_fees
}
filter income > 1
group customer_id (
aggregate {
average total,
sum_income = sum income,
ct = count total,
}
)
sort {-sum_income}
take 10
join c=customers (==customer_id)
derive name = f"{c.last_name}, {c.first_name}"
select {
c.customer_id, name, sum_income
}
derive db_version = s"version()"
---tokens---
'from' Name.Variable
' ' Text.Whitespace
'invoices' Name.Variable
'\n' Text.Whitespace
'filter' Name.Variable
' ' Text.Whitespace
'invoice_date' Name.Variable
' ' Text.Whitespace
'>=' Operator
' ' Text.Whitespace
'@1970-01-16' Literal.Date
'\n' Text.Whitespace
'derive' Name.Variable
' ' Text.Whitespace
'{' Punctuation
'\n ' Text.Whitespace
'transaction_fees' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'0.8' Literal.Number.Float
',' Punctuation
'\n ' Text.Whitespace
'income' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'total' Name.Variable
' ' Text.Whitespace
'-' Operator
' ' Text.Whitespace
'transaction_fees' Name.Variable
'\n' Text.Whitespace
'}' Punctuation
'\n' Text.Whitespace
'filter' Name.Variable
' ' Text.Whitespace
'income' Name.Variable
' ' Text.Whitespace
'>' Operator
' ' Text.Whitespace
'1' Literal.Number.Integer
'\n' Text.Whitespace
'group' Name.Variable
' ' Text.Whitespace
'customer_id' Name.Variable
' ' Text.Whitespace
'(' Punctuation
'\n ' Text.Whitespace
'aggregate' Name.Variable
' ' Text.Whitespace
'{' Punctuation
'\n ' Text.Whitespace
'average' Name.Function
' ' Text.Whitespace
'total' Name.Variable
',' Punctuation
'\n ' Text.Whitespace
'sum_income' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'sum' Name.Function
' ' Text.Whitespace
'income' Name.Variable
',' Punctuation
'\n ' Text.Whitespace
'ct' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'count' Name.Function
' ' Text.Whitespace
'total' Name.Variable
',' Punctuation
'\n ' Text.Whitespace
'}' Punctuation
'\n' Text.Whitespace
')' Punctuation
'\n' Text.Whitespace
'sort' Name.Variable
' ' Text.Whitespace
'{' Punctuation
'-' Operator
'sum_income' Name.Variable
'}' Punctuation
'\n' Text.Whitespace
'take' Name.Variable
' ' Text.Whitespace
'10' Literal.Number.Integer
'\n' Text.Whitespace
'join' Name.Variable
' ' Text.Whitespace
'c' Name.Variable
'=' Operator
'customers' Name.Variable
' ' Text.Whitespace
'(' Punctuation
'==' Operator
'customer_id' Name.Variable
')' Punctuation
'\n' Text.Whitespace
'derive' Name.Variable
' ' Text.Whitespace
'name' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'f' Literal.String.Affix
'"' Literal.String.Double
'{' Literal.String.Interpol
'c' Name.Variable
'.' Operator
'last_name' Name.Variable
'}' Literal.String.Interpol
', ' Literal.String.Double
'{' Literal.String.Interpol
'c' Name.Variable
'.' Operator
'first_name' Name.Variable
'}' Literal.String.Interpol
'"' Literal.String.Double
'\n' Text.Whitespace
'select' Name.Variable
' ' Text.Whitespace
'{' Punctuation
'\n ' Text.Whitespace
'c' Name.Variable
'.' Operator
'customer_id' Name.Variable
',' Punctuation
' ' Text.Whitespace
'name' Name.Variable
',' Punctuation
' ' Text.Whitespace
'sum_income' Name.Variable
'\n' Text.Whitespace
'}' Punctuation
'\n' Text.Whitespace
'derive' Name.Variable
' ' Text.Whitespace
'db_version' Name.Variable
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
's' Literal.String.Affix
'"' Literal.String.Double
'version()' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
---input---
f"Hello {name}!"
---tokens---
'f' Literal.String.Affix
'"' Literal.String.Double
'Hello ' Literal.String.Double
'{' Literal.String.Interpol
'name' Name.Variable
'}' Literal.String.Interpol
'!' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
---input---
r"Not \n escaped"
---tokens---
'r' Literal.String.Affix
'"' Literal.String.Double
'Not ' Literal.String.Double
'\\' Literal.String.Double
'n escaped' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
---input---
s"version()"
---tokens---
's' Literal.String.Affix
'"' Literal.String.Double
'version()' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
---input---
foo = true # comment used to prevent bool from being recognized
---tokens---
'foo' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'true' Keyword.Constant
' ' Text.Whitespace
'# comment used to prevent bool from being recognized' Comment.Single
'\n' Text.Whitespace
---input---
[example] # A comment can appear on the same line as a section header
foo = "bar"
---tokens---
'[' Keyword
'example' Keyword
']' Keyword
' ' Text.Whitespace
'# A comment can appear on the same line as a section header' Comment.Single
'\n' Text.Whitespace
'foo' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"' Literal.String.Double
'bar' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
---input---
message = """multiline strings
can be followed by""" # comments
---tokens---
'message' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"""' Literal.String.Double
'multiline strings\n can be followed by' Literal.String.Double
'"""' Literal.String.Double
' ' Text.Whitespace
'# comments' Comment.Single
'\n' Text.Whitespace
---input---
1234 = "foo" # 1234 is a name
foo = 1234 # 1234 is a number
foo2 = [
1234, # number
{ 1234 = "foo", foo = 1234 } # name then number
]
---tokens---
'1234' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
' ' Text.Whitespace
'# 1234 is a name' Comment.Single
'\n' Text.Whitespace
'foo' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'1234' Literal.Number.Integer
' ' Text.Whitespace
'# 1234 is a number' Comment.Single
'\n' Text.Whitespace
'foo2' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'[' Punctuation
'\n ' Text.Whitespace
'1234' Literal.Number.Integer
',' Punctuation
' ' Text.Whitespace
'# number' Comment.Single
'\n ' Text.Whitespace
'{' Punctuation
' ' Text.Whitespace
'1234' Name
' ' Text.Whitespace
'=' Punctuation
' ' Text.Whitespace
'"' Literal.String.Double
'foo' Literal.String.Double
'"' Literal.String.Double
',' Punctuation
' ' Text.Whitespace
'foo' Name
' ' Text.Whitespace
'=' Punctuation
' ' Text.Whitespace
'1234' Literal.Number.Integer
' ' Text.Whitespace
'}' Punctuation
' ' Text.Whitespace
'# name then number' Comment.Single
'\n' Text.Whitespace
']' Punctuation
'\n' Text.Whitespace
---input---
[ foo . bar ] # whitespace allowed in table headers
---tokens---
'[' Keyword
' ' Text.Whitespace
'foo' Keyword
' ' Text.Whitespace
'.' Keyword
' ' Text.Whitespace
'bar' Keyword
' ' Text.Whitespace
']' Keyword
' ' Text.Whitespace
'# whitespace allowed in table headers' Comment.Single
'\n' Text.Whitespace
---input---
[strings]
basic-string = "I'm a basic string. I can contain 'single quotes', \u0055nicode escapes \U0001f61b \U0001F61B, \"escaped\" double quotes, \n and \t more."
literal-string = 'I am literal string. Escapes like \this have no effect on me. I can contain "double quotes".'
multiline-basic-string = """
I'm a multiline basic string.
I can span several lines and contain 'single' and "double" quotes
as well as \u0055nicode escapes. Line continuations \
work too.
"""
multiline-literal-string = '''
I'm a "multiline" 'literal' string.
Escapes like \this have no effect on me. Neither does this: \
it is not a line continuation.'''
---tokens---
'[' Keyword
'strings' Keyword
']' Keyword
'\n' Text.Whitespace
'basic-string' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"' Literal.String.Double
"I'm a basic string. I can contain 'single quotes', " Literal.String.Double
'\\u0055' Literal.String.Escape
'nicode escapes ' Literal.String.Double
'\\U0001f61b' Literal.String.Escape
' ' Literal.String.Double
'\\U0001F61B' Literal.String.Escape
', ' Literal.String.Double
'\\"' Literal.String.Escape
'escaped' Literal.String.Double
'\\"' Literal.String.Escape
' double quotes, ' Literal.String.Double
'\\n' Literal.String.Escape
' and ' Literal.String.Double
'\\t' Literal.String.Escape
' more.' Literal.String.Double
'"' Literal.String.Double
'\n' Text.Whitespace
'literal-string' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
"'" Literal.String.Single
'I am literal string. Escapes like \\this have no effect on me. I can contain "double quotes".\'' Literal.String.Single
'\n' Text.Whitespace
'multiline-basic-string' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"""' Literal.String.Double
"\nI'm a multiline basic string.\nI can span several lines and contain 'single' and " Literal.String.Double
'"' Literal.String.Double
'double' Literal.String.Double
'"' Literal.String.Double
' quotes\nas well as ' Literal.String.Double
'\\u0055' Literal.String.Escape
'nicode escapes. Line continuations ' Literal.String.Double
'\\' Literal.String.Escape
'\n' Text.Whitespace
' work too.\n' Literal.String.Double
'"""' Literal.String.Double
'\n' Text.Whitespace
'multiline-literal-string' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
"'''" Literal.String.Single
'\nI' Literal.String.Single
"'" Literal.String.Single
'm a "multiline" ' Literal.String.Single
"'" Literal.String.Single
'literal' Literal.String.Single
"'" Literal.String.Single
' string.\nEscapes like \\this have no effect on me. Neither does this: \\\n it is not a line continuation.' Literal.String.Single
"'''" Literal.String.Single
'\n' Text.Whitespace
---input---
foo = "this string should" # not extend into the comment that contains "quotes"
bar = 'same with a' # basic 'string'
baz = """same with a""" # multiline """basic string"""
spam = '''same with a''' # multiline '''literal string'''
---tokens---
'foo' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"' Literal.String.Double
'this string should' Literal.String.Double
'"' Literal.String.Double
' ' Text.Whitespace
'# not extend into the comment that contains "quotes"' Comment.Single
'\n' Text.Whitespace
'bar' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
"'" Literal.String.Single
"same with a' # basic 'string'" Literal.String.Single
'\n' Text.Whitespace
'baz' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
'"""' Literal.String.Double
'same with a' Literal.String.Double
'"""' Literal.String.Double
' ' Text.Whitespace
'# multiline """basic string"""' Comment.Single
'\n' Text.Whitespace
'spam' Name
' ' Text.Whitespace
'=' Operator
' ' Text.Whitespace
"'''" Literal.String.Single
'same with a' Literal.String.Single
"'''" Literal.String.Single
' ' Text.Whitespace
"# multiline '''literal string'''" Comment.Single
'\n' Text.Whitespace
---input---
[table.header.with.some."strings".and."escape\nsequences\u263a"]
---tokens---
'[' Keyword
'table' Keyword
'.' Keyword
'header' Keyword
'.' Keyword
'with' Keyword
'.' Keyword
'some' Keyword
'.' Keyword
'"' Literal.String.Double
'strings' Literal.String.Double
'"' Literal.String.Double
'.' Keyword
'and' Keyword
'.' Keyword
'"' Literal.String.Double
'escape' Literal.String.Double
'\\n' Literal.String.Escape
'sequences' Literal.String.Double
'\\u263a' Literal.String.Escape
'"' Literal.String.Double
']' Keyword
'\n' Text.Whitespace
---input---
log Transfer
event Transfer:
struct Tessst:
---tokens---
'log' Keyword
' ' Text.Whitespace
'Transfer' Name.Class
'\n\n' Text.Whitespace
'event' Keyword
' ' Text.Whitespace
'Transfer' Name.Class
':' Punctuation
'\n\n' Text.Whitespace
'struct' Keyword
' ' Text.Whitespace
'Tessst' Name.Class
':' Punctuation
'\n' Text.Whitespace
[tox]
envlist = py
[testenv]
description =
run tests with pytest (you can pass extra arguments for pytest,
e.g., "tox -- --update-goldens")
deps =
pytest >= 7.0
pytest-cov
pytest-randomly
wcag-contrast-ratio
commands = pytest {posargs}
use_develop = True
[testenv:regexlint]
description =
lint regular expressions with regexlint
deps =
git+https://github.com/pygments/regexlint.git@master
commands = regexlint pygments.lexers
[testenv:pylint]
description =
lint code with pylint
deps =
pylint
skip_install = True # no need to install Pygments into the venv
commands =
pylint --rcfile scripts/pylintrc pygments
[testenv:check]
description =
miscellaneous checks on the source code, including pyflakes
deps =
flake8
commands =
python scripts/check_crlf.py pygments external
python scripts/detect_missing_analyse_text.py --skip-no-aliases
# We only use pyflakes, not pycodestyle, but use it through flake8 nevertheless
# to be able to use the --ignore option.
flake8 --select F --ignore F401 pygments
python scripts/check_sources.py -i pygments/lexers/_mapping.py \
-i pygments/styles/_mapping.py \
-i docs/_build -i pygments/formatters/_mapping.py -i pygments/unistring.py \
-i tests/support/empty.py
python scripts/count_token_references.py --minfiles=1 --maxfiles=1 \
--minlines=1 --maxlines=3 --subtoken
[testenv:mapfiles]
description =
regenerate map files
deps =
commands =
python scripts/gen_mapfiles.py
[testenv:coverage]
description =
run tests, and generate a coverage report in htmlcov/
commands =
pytest --cov --cov-report=html --cov-report=term {posargs}
[testenv:doc]
description =
compile documentation with Sphinx. You can pass a builder name,
like "tox -e doc -- latex". You can also add extra options, like
"SPHINXOPTS='-D latex_paper_size=letter' tox -e doc -- latex".
change_dir = doc
deps =
sphinx
wcag-contrast-ratio
commands =
sphinx-build -b {posargs:html} -n {env:SPHINXOPTS} . _build/{posargs:html}
[testenv:web-doc]
description =
same as doc, but also build the demo by compiling Pygments to WASM.
change_dir = doc
deps = {[testenv:doc]deps}
allowlist_externals =
docker
setenv =
# Enable the BuildKit backend to use the --output option.
DOCKER_BUILDKIT = 1
# Build the demo page.
WEBSITE_BUILD = 1
commands =
docker build --file pyodide/Dockerfile --output _build/pyodide/pyodide ..
sphinx-build -b {posargs:html} {env:SPHINXOPTS} . _build/{posargs:html}
+2
-0

@@ -143,2 +143,3 @@ Pygments is written and maintained by Georg Brandl <georg@python.org>.

* Mark Lee -- Vala lexer
* Thomas Linder Puls -- Visual Prolog lexer
* Pete Lomax -- Phix lexer

@@ -274,2 +275,3 @@ * Valentin Lorentz -- C++ lexer improvements

* diskdance -- Wikitext lexer
* vanillajonathan -- PRQL lexer
* Nikolay Antipov -- OpenSCAD lexer

@@ -276,0 +278,0 @@ * Markus Meyer, Nextron Systems -- YARA lexer

+2
-2

@@ -88,4 +88,4 @@ {#

<div class="footer" role="contentinfo">
&copy; Copyright 2006-2022, Georg Brandl and Pygments contributors.
Created using <a href="https://sphinx-doc.org/">Sphinx</a> {{
&copy; Copyright 2006-2023, Georg Brandl and Pygments contributors.
Created using <a href="https://www.sphinx-doc.org/">Sphinx</a> {{
sphinx_version }}. <br/>

@@ -92,0 +92,0 @@ Pygments logo created by <a href="https://joelunger.com">Joel Unger</a>.

@@ -59,3 +59,3 @@ =====================

.. autoclass:: Lexer
:members: __init__, get_tokens, get_tokens_unprocessed, analyse_text
:members: __init__, add_filter, get_tokens, get_tokens_unprocessed, analyse_text

@@ -62,0 +62,0 @@ There are several base classes derived from ``Lexer`` you can use to build your lexer from:

@@ -16,2 +16,10 @@ .. -*- mode: rst -*-

you didn't give an explicit formatter name).
.. note::
If you are on Windows, an extra tool may be needed for colored output to
work in the terminal. You can make sure Pygments is installed with
Windows console coloring support by installing Pygments with the ``windows-terminal``
extra (e.g., ``pip install pygments[windows-terminal]``).
:program:`pygmentize` attempts to

@@ -139,3 +147,3 @@ detect the maximum number of colors that the terminal supports. The difference

The ``-s`` option processes lines one at a time until EOF, rather than waiting
to process the entire file. This only works for stdin, only for lexers with no
line-spanning constructs, and is intended for streaming input such as you get

@@ -142,0 +150,0 @@ from `tail -f`. Usage is as follows::

@@ -31,63 +31,6 @@ ========================

* Make sure to add a test for your new functionality, and where applicable,
write documentation. See below on how to test lexers.
write documentation.
* Use the standard importing convention: ``from pygments.token import Punctuation``
How to add a lexer
==================
To add a lexer, you have to perform the following steps:
* Select a matching module under ``pygments/lexers``, or create a new
module for your lexer class.
.. note::
We encourage you to put your lexer class into its own module, unless it's a
very small derivative of an already existing lexer.
* Next, make sure the lexer is known from outside the module. All modules
in the ``pygments.lexers`` package specify ``__all__``. For example,
``esoteric.py`` sets::
__all__ = ['BrainfuckLexer', 'BefungeLexer', ...]
Add the name of your lexer class to this list (or create the list if your lexer
is the only class in the module).
* Finally the lexer can be made publicly known by rebuilding the lexer mapping.
.. code-block:: console
$ tox -e mapfiles
How lexers are tested
=====================
To add a new lexer test, create a file with just your code snippet
under ``tests/snippets/<lexer_alias>/``. Then run
``tox -- --update-goldens <filename.txt>`` to auto-populate the
currently expected tokens. Check that they look good and check in the
file.
Lexer tests are run with ``tox``, like all other tests. While
working on a lexer, you can also run only the tests for that lexer
with ``tox -- tests/snippets/language-name/`` and/or
``tox -- tests/examplefiles/language-name/``.
Running the test suite with ``tox`` will run lexers on the test
inputs, and check that the output matches the expected tokens. If you
are improving a lexer, it is normal that the token output changes. To
update the expected token output for the tests, again use
``tox -- --update-goldens <filename.txt>``. Review the changes and
check that they are as intended, then commit them along with your
proposed code change.
Large test files should go in ``tests/examplefiles``. This works
similar to ``snippets``, but the token output is stored in a separate
file. Output can also be regenerated with ``--update-goldens``.
Goals & non-goals of Pygments

@@ -117,1 +60,12 @@ =============================

be worth it.
Language support
----------------
While we strive for the broadest language support possible, we can't support
every programming language on the planet. Our minimum bar is fairly low, but to
avoid pet projects and other one-off languages, we expect any language that is
proposed for inclusion to have a reasonably sized community around it. If you
need a syntax highlighter for your in-house programming language or the brand
new language project you kicked off, consider writing a :doc:`plugin <plugins>`
until it gains enough popularity.

@@ -23,2 +23,61 @@ .. -*- mode: rst -*-

How to add a lexer
==================
To add a lexer, you have to perform the following steps:
* Select a matching module under ``pygments/lexers``, or create a new
module for your lexer class.
.. note::
We encourage you to put your lexer class into its own module, unless it's a
very small derivative of an already existing lexer.
* Next, make sure the lexer is known from outside the module. All modules
in the ``pygments.lexers`` package specify ``__all__``. For example,
``automation.py`` sets::
__all__ = ['AutohotkeyLexer', 'AutoItLexer']
Add the name of your lexer class to this list (or create the list if your
lexer is the only class in the module).
* Finally the lexer can be made publicly known by rebuilding the lexer mapping.
.. code-block:: console
$ tox -e mapfiles
How to test your lexer
======================
To add a new lexer test, create a file with just your code snippet
under ``tests/snippets/<lexer_alias>/``. Then run
``tox -- --update-goldens <filename.txt>`` to auto-populate the
currently expected tokens. Check that they look good and check in the
file.
Lexer tests are run with ``tox``, like all other tests. While
working on a lexer, you can also run only the tests for that lexer
with ``tox -- tests/snippets/language-name/`` and/or
``tox -- tests/examplefiles/language-name/``.
Running the test suite with ``tox`` will run lexers on the test
inputs, and check that the output matches the expected tokens. If you
are improving a lexer, it is normal that the token output changes. To
update the expected token output for the tests, again use
``tox -- --update-goldens <filename.txt>``. Review the changes and
check that they are as intended, then commit them along with your
proposed code change.
Large test files should go in ``tests/examplefiles``. This works
similar to ``snippets``, but the token output is stored in a separate
file. Output can also be regenerated with ``--update-goldens``.
.. note::
When contributing a new lexer, you *must* provide an example file or test
snippet. Lexers which can't be tested will not be accepted.
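For illustration, a minimal snippet file uses the same ``---input---``/``---tokens---`` layout seen in the golden files earlier in this diff (the token column is auto-populated by ``--update-goldens``, so the exact alignment need not be written by hand):

```
---input---
# single line comment

---tokens---
'# single line comment' Comment.Single
'\n'          Text
```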
RegexLexer

@@ -35,5 +94,19 @@ ==========

type, or changing state), the current position is set to where the last match
ended and the matching process continues with the first regex of the current
ended and the matching process continues with the *first* regex of the current
state.
.. note::
This means matching always restarts at the first entry of the state, i.e. you cannot rely on rules being tried in a fixed order across matches. For example, a state with the following rules won't work as intended:
.. code:: python
'state': [
(r'\w+', Name,),
(r'\s+', Whitespace,),
(r'\w+', Keyword,)
]
In the example above, ``Keyword`` will never be matched. To match certain token types in order, see below for the `bygroups` helper.
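The usual way around this ordering limitation is the ``bygroups`` helper, which assigns one token type per regex group; a minimal sketch (the lexer and its rule are illustrative):

```python
from pygments.lexer import RegexLexer, bygroups
from pygments.token import Keyword, Name, Whitespace


class DeclLexer(RegexLexer):
    """Hypothetical fragment: match 'let <name>' as Keyword, Whitespace, Name."""
    tokens = {
        'root': [
            # Each group gets its own token type, in order.
            (r'(let)(\s+)(\w+)', bygroups(Keyword, Whitespace, Name)),
            (r'\s+', Whitespace),
        ],
    }
```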
Lexer states are kept on a stack: each time a new state is entered, the new

@@ -720,3 +793,3 @@ state is pushed onto the stack. The most basic lexers (like the `DiffLexer`)

a rule like ``@.*@`` will match the whole string ``@first@ second @third@``,
instead of matching ``@first@`` and ``@second@``. You can use ``@.*?@`` in
instead of matching ``@first@`` and ``@third@``. You can use ``@.*?@`` in
this case to stop early. The ``?`` tries to match *as few times* as possible.
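The greedy/non-greedy difference can be checked directly with Python's ``re`` module:

```python
import re

# Greedy: ``@.*@`` swallows everything between the first and last "@".
# Non-greedy: ``@.*?@`` stops at the earliest closing "@".
text = '@first@ second @third@'
greedy = re.findall(r'@.*@', text)
lazy = re.findall(r'@.*?@', text)
```

``greedy`` is the single whole string, while ``lazy`` is ``['@first@', '@third@']``, matching the behaviour described above.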

@@ -723,0 +796,0 @@

@@ -5,13 +5,17 @@ =======

If you want to extend Pygments without hacking the sources, you can use
package `entry points`_ to add new lexers, formatters, styles or filters
as if they were in the Pygments core.
.. _entry points: https://packaging.python.org/en/latest/guides/creating-and-discovering-plugins/
The idea is to create a Python package, declare how it extends Pygments,
and install it.
This will allow you to use your custom lexers, formatters, styles or
filters with the ``pygmentize`` command. They will also be found by the
lookup functions (``lexers.get_lexer_by_name`` et al.), which makes them
available to tools such as Sphinx and MkDocs.
Plugin discovery

@@ -26,3 +30,3 @@ ================

``pkg_resources`` is not available, no plugins will be loaded at
all. Note that ``pkg_resources`` is distributed with setuptools, and
thus available on most Python environments. However, ``pkg_resources``

@@ -47,74 +51,11 @@ is considerably slower than ``importlib.metadata`` or its

Defining plugins through entry points
=====================================
We have created a repository with a project template for defining your
own plugins. It is available at
https://github.com/pygments/pygments-plugin-scaffolding
Extending The Core

@@ -121,0 +62,0 @@ ==================

@@ -44,10 +44,4 @@ .. -*- mode: rst -*-

* add ``your.py`` file
* regenerate the mappings file using ``tox -e mapfiles``
.. note::

@@ -54,0 +48,0 @@

@@ -11,3 +11,3 @@ Download and installation

<https://pypi.python.org/pypi/Pygments>`_. For installation of packages from
PyPI, we recommend `Pip <https://pypi.org/project/pip/>`_, which works on all
major platforms.

@@ -14,0 +14,0 @@

@@ -38,4 +38,3 @@ :orphan:

Pygments only needs a standard Python install. No additional libraries are needed.

@@ -42,0 +41,0 @@ How can I use Pygments?

@@ -38,3 +38,3 @@ Welcome!

Many lexers and fixes have been contributed by **Armin Ronacher**, the rest of
the `Pocoo <https://www.pocoo.org/>`_ team and **Tim Hatch**.

@@ -41,0 +41,0 @@ .. toctree::

@@ -17,3 +17,3 @@ # Makefile for Sphinx documentation

.PHONY: help clean html dirhtml singlehtml pickle json htmlhelp qthelp devhelp epub latex latexpdf text man changes linkcheck doctest gettext

@@ -20,0 +20,0 @@ help:

# Dockerfile for building Pyodide with a Pygments version from the current checkout.
# For an example of how to use this image, see the `pyodide` target in the documentation's Makefile.
# This is used by the `pyodide` tox environment (see /tox.ini).
FROM ghcr.io/pyodide/pyodide:0.20.0 AS build-stage

@@ -4,0 +4,0 @@

Metadata-Version: 2.1
Name: Pygments
Version: 2.17.0
Summary: Pygments is a syntax highlighting package written in Python.
Author-email: Georg Brandl <georg@python.org>
Maintainer: Matthäus G. Chajdas
Maintainer-email: Georg Brandl <georg@python.org>, Jean Abou Samra <jean@abou-samra.fr>
License: BSD-2-Clause
Project-URL: Homepage, https://pygments.org

@@ -14,2 +10,8 @@ Project-URL: Documentation, https://pygments.org/docs

Project-URL: Changelog, https://github.com/pygments/pygments/blob/master/CHANGES
Author-email: Georg Brandl <georg@python.org>
Maintainer: Matthäus G. Chajdas
Maintainer-email: Georg Brandl <georg@python.org>, Jean Abou Samra <jean@abou-samra.fr>
License: BSD-2-Clause
License-File: AUTHORS
License-File: LICENSE
Keywords: syntax highlighting

@@ -29,2 +31,3 @@ Classifier: Development Status :: 6 - Mature

Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: Implementation :: CPython

@@ -35,6 +38,7 @@ Classifier: Programming Language :: Python :: Implementation :: PyPy

Requires-Python: >=3.7
Provides-Extra: plugins
Requires-Dist: importlib-metadata; python_version < '3.8' and extra == 'plugins'
Provides-Extra: windows-terminal
Requires-Dist: colorama>=0.4.6; extra == 'windows-terminal'
Description-Content-Type: text/x-rst

@@ -41,0 +45,0 @@ Pygments

@@ -29,3 +29,3 @@ """

__version__ = '2.17.0'
__docformat__ = 'restructuredtext'

@@ -32,0 +32,0 @@

@@ -134,3 +134,3 @@ """

return _formatter_cache[name](**options)
for _name, cls in find_plugin_formatters():
for filename in cls.filenames:

@@ -137,0 +137,0 @@ if _fn_matches(fn, filename):

@@ -326,2 +326,3 @@ """

tags file should specify line numbers (see the `-n` option to ctags).
The tags file is assumed to be encoded in UTF-8.

@@ -912,3 +913,3 @@ .. versionadded:: 1.6

if self._ctags.find(entry, token.encode(), 0):
return entry['file'].decode(), entry['lineNumber']
else:

@@ -915,0 +916,0 @@ return None, None

@@ -202,16 +202,5 @@ """

def _preprocess_lexer_input(self, text):
"""Apply preprocessing such as decoding the input, removing BOM and normalizing newlines."""
if not isinstance(text, str):

@@ -259,2 +248,20 @@ if self.encoding == 'guess':

return text
def get_tokens(self, text, unfiltered=False):
"""
This method is the basic interface of a lexer. It is called by
the `highlight()` function. It must process the text and return an
iterable of ``(tokentype, value)`` pairs from `text`.
Normally, you don't need to override this method. The default
implementation processes the options recognized by all lexers
(`stripnl`, `stripall` and so on), and then yields all tokens
from `get_tokens_unprocessed()`, with the ``index`` dropped.
If `unfiltered` is set to `True`, the filtering mechanism is
bypassed even if filters are defined.
"""
text = self._preprocess_lexer_input(text)
def streamer():

@@ -261,0 +268,0 @@ for _, t, v in self.get_tokens_unprocessed(text):

@@ -25,2 +25,3 @@ """

'Python3TracebackLexer': 'PythonTracebackLexer',
'LeanLexer': 'Lean3Lexer',
}

@@ -27,0 +28,0 @@

@@ -28,3 +28,3 @@ """

name = 'GAP'
url = 'https://www.gap-system.org'
aliases = ['gap']

@@ -31,0 +31,0 @@ filenames = ['*.g', '*.gd', '*.gi', '*.gap']

@@ -545,11 +545,10 @@ """

ipv4_group = r'(\d+|0x[0-9a-f]+)'
ipv4 = rf'({ipv4_group}(\.{ipv4_group}){{3}})'
ipv6_group = r'([0-9a-f]{0,4})'
ipv6 = rf'({ipv6_group}(:{ipv6_group}){{1,7}})'
bare_ip = rf'({ipv4}|{ipv6})'
# XXX: /integer is a subnet mark, but what is /IP ?
# There is no test where it is used.
ip = rf'{bare_ip}(/({bare_ip}|\d+))?'
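The simplified address patterns above can be exercised standalone with ``re``, without Pygments installed, to confirm they cover plain IPv4, CIDR subnets and compressed IPv6:

```python
import re

# Reproduction of the Squid lexer's simplified address patterns.
ipv4_group = r'(\d+|0x[0-9a-f]+)'
ipv4 = rf'({ipv4_group}(\.{ipv4_group}){{3}})'
ipv6_group = r'([0-9a-f]{0,4})'
ipv6 = rf'({ipv6_group}(:{ipv6_group}){{1,7}})'
bare_ip = rf'({ipv4}|{ipv6})'
# Optional "/subnet" suffix: either another address or a prefix length.
ip = rf'{bare_ip}(/({bare_ip}|\d+))?'
```

Note these patterns are deliberately loose (they accept e.g. out-of-range octets); precision is traded for simplicity compared to the old ``ip_re``.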

@@ -567,3 +566,3 @@ tokens = {

(words(acls, prefix=r'\b', suffix=r'\b'), Keyword),
(ip, Number.Float),
(r'(?:\b\d+\b(?:-\b\d+|%)?)', Number),

@@ -1119,4 +1118,3 @@ (r'\S+', Text),

"""
Lexer for TOML, a simple language for config files.

@@ -1127,43 +1125,161 @@ .. versionadded:: 2.4

name = 'TOML'
aliases = ['toml']
filenames = ['*.toml', 'Pipfile', 'poetry.lock']
mimetypes = ['application/toml']
url = 'https://toml.io'
# Based on the TOML spec: https://toml.io/en/v1.0.0
# The following is adapted from CPython's tomllib:
_time = r"\d\d:\d\d:\d\d(\.\d+)?"
_datetime = rf"""(?x)
\d\d\d\d-\d\d-\d\d # date, e.g., 1988-10-27
(
[Tt ] {_time} # optional time
(
[Zz]|[+-]\d\d:\d\d # optional time offset
)?
)?
"""
tokens = {
'root': [
# Note that we make an effort in order to distinguish
# moments at which we're parsing a key and moments at
# which we're parsing a value. In the TOML code
#
# 1234 = 1234
#
# the first "1234" should be Name, the second Integer.
# Whitespace
(r'\s+', Whitespace),
# Comment
(r'#.*', Comment.Single),
# Assignment keys
include('key'),
# After "=", find a value
(r'(=)(\s*)', bygroups(Operator, Whitespace), 'value'),
# Table header
(r'\[\[?', Keyword, 'table-key'),
],
'key': [
# Start of bare key (only ASCII is allowed here).
(r'[A-Za-z0-9_-]+', Name),
# Quoted key
(r'"', String.Double, 'basic-string'),
(r"'", String.Single, 'literal-string'),
# Dots act as separators in keys
(r'\.', Punctuation),
],
'table-key': [
# This is like 'key', but highlights the name components
# and separating dots as Keyword because it looks better
# when the whole table header is Keyword. We do highlight
# strings as strings though.
# Start of bare key (only ASCII is allowed here).
(r'[A-Za-z0-9_-]+', Keyword),
(r'"', String.Double, 'basic-string'),
(r"'", String.Single, 'literal-string'),
(r'\.', Keyword),
(r'\]\]?', Keyword, '#pop'),
# Inline whitespace allowed
(r'[ \t]+', Whitespace),
],
'value': [
# Datetime, baretime
(_datetime, Literal.Date, '#pop'),
(_time, Literal.Date, '#pop'),
# Recognize as float if there is a fractional part
# and/or an exponent.
(r'[+-]?\d[0-9_]*[eE][+-]?\d[0-9_]*', Number.Float, '#pop'),
(r'[+-]?\d[0-9_]*\.\d[0-9_]*([eE][+-]?\d[0-9_]*)?',
Number.Float, '#pop'),
# Infinities and NaN
(r'[+-]?(inf|nan)', Number.Float, '#pop'),
# Integers
(r'-?0b[01_]+', Number.Bin, '#pop'),
(r'-?0o[0-7_]+', Number.Oct, '#pop'),
(r'-?0x[0-9a-fA-F_]+', Number.Hex, '#pop'),
(r'[+-]?[0-9_]+', Number.Integer, '#pop'),
# Strings
(r'"""', String.Double, ('#pop', 'multiline-basic-string')),
(r'"', String.Double, ('#pop', 'basic-string')),
(r"'''", String.Single, ('#pop', 'multiline-literal-string')),
(r"'", String.Single, ('#pop', 'literal-string')),
# Booleans
(r'true|false', Keyword.Constant, '#pop'),
# Start of array
(r'\[', Punctuation, ('#pop', 'array')),
# Start of inline table
(r'\{', Punctuation, ('#pop', 'inline-table')),
],
'array': [
# Whitespace, including newlines, is ignored inside arrays,
# and comments are allowed.
(r'\s+', Whitespace),
(r'#.*', Comment.Single),
# Delimiters
(r',', Punctuation),
# End of array
(r'\]', Punctuation, '#pop'),
# Parse a value and come back
default('value'),
],
'inline-table': [
# Note that unlike inline arrays, inline tables do not
# allow newlines or comments.
(r'[ \t]+', Whitespace),
# Keys
include('key'),
# Values
(r'(=)(\s*)', bygroups(Punctuation, Whitespace), 'value'),
# Delimiters
(r',', Punctuation),
# End of inline table
(r'\}', Punctuation, '#pop'),
],
'basic-string': [
(r'"', String.Double, '#pop'),
include('escapes'),
(r'[^"\\]+', String.Double),
],
'literal-string': [
(r".*'", String.Single, '#pop'),
],
'multiline-basic-string': [
(r'"""', String.Double, '#pop'),
(r'(\\)(\n)', bygroups(String.Escape, Whitespace)),
include('escapes'),
(r'[^"\\]+', String.Double),
(r'"', String.Double),
],
'multiline-literal-string': [
(r"'''", String.Single, '#pop'),
(r"[^']+", String.Single),
(r"'", String.Single),
],
'escapes': [
(r'\\u[0-9a-fA-F]{4}|\\U[0-9a-fA-F]{8}', String.Escape),
(r'\\.', String.Escape),
],
}

@@ -1170,0 +1286,0 @@

@@ -35,3 +35,3 @@ """

name = 'Crystal'
url = 'https://crystal-lang.org'
aliases = ['cr', 'crystal']

@@ -38,0 +38,0 @@ filenames = ['*.cr']

@@ -453,4 +453,4 @@ """

aliases = ['json', 'json-object']
filenames = ['*.json', '*.jsonl', '*.ndjson', 'Pipfile.lock']
mimetypes = ['application/json', 'application/json-object', 'application/x-ndjson', 'application/jsonl', 'application/json-seq']

@@ -457,0 +457,0 @@ # No validation of integers, floats, or constants is done.

@@ -882,3 +882,3 @@ """

name = 'Snowball'
url = 'https://snowballstem.org/'
aliases = ['snowball']

@@ -933,3 +933,4 @@ filenames = ['*.sbl']

'root': [
(r'len\b', Name.Builtin),
(r'lenof\b', Operator.Word),
include('root1'),

@@ -936,0 +937,0 @@ ],

@@ -25,3 +25,3 @@ """

name = 'Eiffel'
url = 'https://www.eiffel.com'
aliases = ['eiffel']

@@ -28,0 +28,0 @@ filenames = ['*.e']

@@ -26,3 +26,3 @@ """

name = 'Elm'
url = 'https://elm-lang.org/'
aliases = ['elm']

@@ -29,0 +29,0 @@ filenames = ['*.elm']

@@ -62,4 +62,4 @@ """

'CONTINUE', 'CRITICAL', 'CYCLE', 'DATA', 'DEALLOCATE', 'DECODE',
'DEFERRED', 'DIMENSION', 'DO', 'ELEMENTAL', 'ELSE', 'ELSEIF', 'ENCODE',
'END', 'ENDASSOCIATE', 'ENDBLOCK', 'ENDDO', 'ENDENUM', 'ENDFORALL',
'ENDFUNCTION', 'ENDIF', 'ENDINTERFACE', 'ENDMODULE', 'ENDPROGRAM',

@@ -66,0 +66,0 @@ 'ENDSELECT', 'ENDSUBMODULE', 'ENDSUBROUTINE', 'ENDTYPE', 'ENDWHERE',

@@ -38,3 +38,2 @@ """

'root': [
include('clauses'),

@@ -46,6 +45,4 @@ include('keywords'),

include('barewords'),
include('comment'),
],
'keywords': [

@@ -81,2 +78,6 @@ (r'(create|order|match|limit|set|skip|start|return|with|where|'

bygroups(Keyword, Whitespace, Keyword, Whitespace, Keyword)),
(r'(using)(\s+)(index)\b',
bygroups(Keyword, Whitespace, Keyword)),
(r'(using)(\s+)(range|text|point)(\s+)(index)\b',
bygroups(Keyword, Whitespace, Name, Whitespace, Keyword)),
(words((

@@ -87,3 +88,3 @@ 'all', 'any', 'as', 'asc', 'ascending', 'assert', 'call', 'case', 'create',

'remove', 'return', 'set', 'skip', 'single', 'start', 'then', 'union',
'unwind', 'yield', 'where', 'when', 'with', 'collect'), suffix=r'\b'), Keyword),
],

@@ -99,3 +100,3 @@ 'relations': [

'strings': [
(r'([\'"])(?:\\[tbnrf\'"\\]|[^\\])*?\1', String),
(r'`(?:``|[^`])+`', Name.Variable),

@@ -110,2 +111,5 @@ ],

],
'comment': [
(r'//.*$', Comment.Single),
],
}

@@ -1722,3 +1722,3 @@ """

name = 'Macaulay2'
url = 'https://macaulay2.com/'
aliases = ['macaulay2']

@@ -1725,0 +1725,0 @@ filenames = ['*.m2']

@@ -182,2 +182,3 @@ """

(r'\\\S+', String),
(r'\[(?P<level>=*)\[[\w\W]*?\](?P=level)\]', String.Multiline),
(r'[^)$"# \t\n]+', String),

@@ -184,0 +185,0 @@ (r'\n', Whitespace), # explicitly legal

@@ -933,3 +933,5 @@ """

'zh', 'zh-hans', 'zh-hant', 'zh-cn', 'zh-hk', 'zh-mo', 'zh-my', 'zh-sg', 'zh-tw',
# UnConverter.php
# WuuConverter.php
'wuu', 'wuu-hans', 'wuu-hant',
# UzConverter.php
'uz', 'uz-latn', 'uz-cyrl',

@@ -1080,3 +1082,3 @@ # TlyConverter.php

""" % ('|'.join(protocols), title_char.replace('/', ''),
title_char, f'{title_char}#'),
bygroups(Punctuation, Name.Namespace, Punctuation,

@@ -1093,3 +1095,3 @@ using(this, state=['wikilink-name']), Punctuation, Name.Label, Punctuation)

""" % ('|'.join(protocols), title_char.replace('/', ''),
title_char, f'{title_char}#'),
bygroups(Punctuation, Name.Namespace, Punctuation,

@@ -1194,12 +1196,34 @@ using(this, state=['wikilink-name']), Punctuation, Name.Label, Punctuation),

r"""(?xi)
(-\{{) # Use {{ to escape format()
([^|]) (\|)
(?:
(?: ([^;]*?) (=>))?
(\s* (?:{variants}) \s*) (:)
)?
""".format(variants='|'.join(variant_langs)),
bygroups(Punctuation, Keyword, Punctuation,
using(this, state=['root', 'lc-raw']),
Operator, Name.Label, Punctuation),
'lc-inner'
),
(r'-\{(?!\{)', Punctuation, 'lc-raw'),
# LanguageConverter markups: composite conversion grammar
(
r"""(?xi)
(-\{)
([a-z\s;-]*?) (\|)
""",
bygroups(Punctuation,
using(this, state=['root', 'lc-flag']),
Punctuation),
'lc-raw'
),
# LanguageConverter markups: fallbacks
(
r"""(?xi)
(-\{{) (?!\{{) # Use {{ to escape format()
(?: (\s* (?:{variants}) \s*) (:))?
""".format(variants='|'.join(variant_langs)),
bygroups(Punctuation, Name.Label, Punctuation),
'lc-inner'
),
],

@@ -1267,2 +1291,7 @@ 'wikilink-name': [

],
'lc-flag': [
(r'\s+', Whitespace),
(r';', Punctuation),
*text_rules(Keyword),
],
'lc-inner': [

@@ -1272,6 +1301,6 @@ (

(;)
(?: ([^;]*?) (=>))?
(\s* (?:{variants}) \s*) (:)
""".format(variants='|'.join(variant_langs)),
bygroups(Punctuation, using(this, state=['root', 'lc-raw']),
Operator, Name.Label, Punctuation)

@@ -1278,0 +1307,0 @@ ),

@@ -8,7 +8,7 @@ """

SNBT. A data communication format used in Minecraft.
wiki: https://minecraft.wiki/w/NBT_format
MCFunction. The Function file for Minecraft Data packs and Add-ons.
official: https://learn.microsoft.com/en-us/minecraft/creator/documents/functionsintroduction
wiki: https://minecraft.wiki/w/Function

@@ -37,3 +37,3 @@ MCSchema. A kind of data Schema for Minecraft Add-on Development.

name = "SNBT"
url = "https://minecraft.wiki/w/NBT_format"
aliases = ["snbt"]

@@ -112,3 +112,3 @@ filenames = ["*.snbt"]

name = "MCFunction"
url = "https://minecraft.wiki/w/Commands"
aliases = ["mcfunction", "mcf"]

@@ -115,0 +115,0 @@ filenames = ["*.mcfunction"]

@@ -369,3 +369,3 @@ """

keywords = (
'and', 'as', 'assert', 'begin', 'class', 'constraint', 'do', 'done',
'downto', 'else', 'end', 'exception', 'external', 'false',

@@ -376,3 +376,3 @@ 'for', 'fun', 'function', 'functor', 'if', 'in', 'include',

'raise', 'rec', 'sig', 'struct', 'then', 'to', 'true', 'try',
'type', 'val', 'virtual', 'when', 'while', 'with',
)

@@ -387,3 +387,3 @@ keyopts = (

operators = r'[!$%&*+\./:<=>?@^|~-]'
word_operators = ('asr', 'land', 'lor', 'lsl', 'lxor', 'mod', 'or')
prefix_syms = r'[!?~]'

@@ -390,0 +390,0 @@ infix_syms = r'[=<>@^|&+\*/$%-]'

@@ -37,4 +37,4 @@ """

'map', 'removeAttrs', 'throw', 'toString', 'derivation']
operators = ['++', '+', '?', '.', '!', '//', '==', '/',
'!=', '&&', '||', '->', '=', '<', '>', '*', '-']

@@ -63,2 +63,13 @@ punctuations = ["(", ")", "[", "]", ";", "{", "}", ":", ",", "@"]

# floats
(r'-?(\d+\.\d*|\.\d+)([eE][-+]?\d+)?', Number.Float),
# integers
(r'-?[0-9]+', Number.Integer),
# paths
(r'[\w.+-]*(\/[\w.+-]+)+', Literal),
(r'~(\/[\w.+-]+)+', Literal),
(r'\<[\w.+-]+(\/[\w.+-]+)*\>', Literal),
# operators

@@ -71,16 +82,11 @@ ('(%s)' % '|'.join(re.escape(entry) for entry in operators),

(r'\{', Punctuation, 'block'),
# punctuations
('(%s)' % '|'.join(re.escape(entry) for entry in punctuations), Punctuation),
# strings
(r'"', String.Double, 'doublequote'),
(r"''", String.Multiline, 'multiline'),
# urls

@@ -90,5 +96,6 @@ (r'[a-zA-Z][a-zA-Z0-9\+\-\.]*\:[\w%/?:@&=+$,\\.!~*\'-]+', Literal),

# names of variables
(r'[\w-]+(?=\s*=)', String.Symbol),
(r'[a-zA-Z_][\w\'-]*', Text),
(r"\$\{", String.Interpol, 'antiquote'),
],

@@ -101,20 +108,19 @@ 'comment': [

],
'multiline': [
(r"''(\$|'|\\n|\\r|\\t|\\)", String.Escape),
(r"''", String.Multiline, '#pop'),
(r'\$\{', String.Interpol, 'antiquote'),
(r"[^'\$]+", String.Multiline),
(r"\$[^\{']", String.Multiline),
(r"'[^']", String.Multiline),
(r"\$(?=')", String.Multiline),
],
'doublequote': [
(r'\\(\\|"|\$|n)', String.Escape),
(r'"', String.Double, '#pop'),
(r'\$\{', String.Interpol, 'antiquote'),
(r'[^"\\\$]+', String.Double),
(r'\$[^\{"]', String.Double),
(r'\$(?=")', String.Double),
(r'\\', String.Double),
],
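The first rule of the ``'doublequote'`` state above can be checked standalone: inside double quotes, Nix recognizes ``\\``, ``\"``, ``\$`` and ``\n`` as escape sequences, and nothing else.

```python
import re

# Escape rule from the Nix lexer's 'doublequote' state, reproduced
# so it can be tested without Pygments installed.
escape = re.compile(r'\\(\\|"|\$|n)')
```

Anything else after a backslash falls through to the catch-all ``(r'\\', String.Double)`` rule at the end of the state.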

@@ -127,2 +133,6 @@ 'antiquote': [

],
'block': [
(r"\}", Punctuation, '#pop'),
include('root'),
],
}

@@ -129,0 +139,0 @@

@@ -81,3 +81,8 @@ """

def analyse_text(text):
"""Competes with IDL and Visual Prolog on *.pro"""
if ':-' in text:
# Visual Prolog also uses :-
return 0.5
else:
return 0

@@ -84,0 +89,0 @@

@@ -38,4 +38,4 @@ """

name = 'Python'
url = 'https://www.python.org'
aliases = ['python', 'py', 'sage', 'python3', 'py3', 'bazel', 'starlark']
filenames = [

@@ -429,3 +429,3 @@ '*.py',

name = 'Python 2.x'
url = 'https://www.python.org'
aliases = ['python2', 'py2']

@@ -835,3 +835,3 @@ filenames = [] # now taken over by PythonLexer (3.x)

name = 'Cython'
url = 'https://cython.org'
aliases = ['cython', 'pyx', 'pyrex']

@@ -838,0 +838,0 @@ filenames = ['*.pyx', '*.pxd', '*.pxi']

@@ -263,2 +263,6 @@ """

# BlankNodeLabel
(r'(_)(:)([' + PN_CHARS_U_GRP + r'0-9]([' + PN_CHARS_GRP + r'.]*' + PN_CHARS + ')?)',
bygroups(Name.Namespace, Punctuation, Name.Tag)),
# Comment

@@ -265,0 +269,0 @@ (r'#[^\n]+', Comment),

@@ -51,3 +51,3 @@ """

'bool', 'dyn'), suffix=r'\b'), Keyword.Type),
(words(('printf', 'sizeof', 'alignof', 'len', 'panic'), suffix=r'\b(\()'),
bygroups(Name.Builtin, Punctuation)),

@@ -66,3 +66,3 @@ # numeric literals

(r'<<=|>>=|<<|>>|<=|>=|\+=|-=|\*=|/=|\%=|\|=|&=|\^=|&&|\|\||&|\||'
r'\+\+|--|\%|\^|\~|==|!=|->|::|[.]{3}|#!|#|[+\-*/&]', Operator),
(r'[|<>=!()\[\]{}.,;:\?]', Punctuation),

@@ -69,0 +69,0 @@ # identifiers

@@ -7,2 +7,4 @@ """

See also :mod:`pygments.lexers.lean`
:copyright: Copyright 2006-2023 by the Pygments team, see AUTHORS.

@@ -17,4 +19,5 @@ :license: BSD, see LICENSE for details.

Number, Punctuation, Generic, Whitespace
from pygments.lexers.lean import LeanLexer
__all__ = ['CoqLexer', 'IsabelleLexer']

@@ -391,106 +394,1 @@

}

@@ -193,2 +193,8 @@ """

#: user-friendly style name (used when selecting the style, so this
# should be all-lowercase, no spaces, hyphens)
name = 'unnamed'
aliases = []
# Attribute for lexers defined within Pygments. If set

@@ -195,0 +201,0 @@ # to True, the style is not shown in the style gallery

@@ -13,58 +13,13 @@ """

from pygments.util import ClassNotFound
from pygments.styles._mapping import STYLES
#: This list is deprecated. Use `pygments.styles.STYLES` instead
STYLE_MAP = {v[1]: v[0].split('.')[-1] + '::' + k for k, v in STYLES.items()}
#: Internal reverse mapping to make `get_style_by_name` more efficient
_STYLE_NAME_TO_MODULE_MAP = {v[1]: (v[0], k) for k, v in STYLES.items()}
def get_style_by_name(name):

@@ -78,4 +33,4 @@ """

"""
if name in STYLE_MAP:
mod, cls = STYLE_MAP[name].split('::')
if name in _STYLE_NAME_TO_MODULE_MAP:
mod, cls = _STYLE_NAME_TO_MODULE_MAP[name]
builtin = "yes"

@@ -88,10 +43,11 @@ else:

builtin = ""
mod = name
mod = 'pygments.styles.' + name
cls = name.title() + "Style"
try:
mod = __import__('pygments.styles.' + mod, None, None, [cls])
mod = __import__(mod, None, None, [cls])
except ImportError:
raise ClassNotFound("Could not find style module %r" % mod +
(builtin and ", though it should be builtin") + ".")
(builtin and ", though it should be builtin")
+ ".")
try:

@@ -105,4 +61,5 @@ return getattr(mod, cls)

"""Return a generator for all styles by name, both builtin and plugin."""
yield from STYLE_MAP
for v in STYLES.values():
yield v[1]
for name, _ in find_plugin_styles():
yield name
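The refactor above replaces the hardcoded `STYLE_MAP` dictionary with lookups derived from the new `pygments.styles.STYLES` table. The public lookup API is unchanged; a minimal sketch of how it behaves (the `"#272822"` value matches the Monokai background shown later in this diff):

```python
from pygments.styles import get_style_by_name, get_all_styles

# Built-in style names still resolve to their Style class.
style = get_style_by_name("monokai")
assert style.background_color == "#272822"

# get_all_styles() yields every known style name, builtin and plugin,
# now sourced from STYLES instead of the deprecated STYLE_MAP.
names = list(get_all_styles())
assert "default" in names
```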

@@ -16,3 +16,7 @@ """

__all__ = ['AbapStyle']
class AbapStyle(Style):
name = 'abap'

@@ -19,0 +23,0 @@ styles = {

@@ -36,3 +36,7 @@ """

__all__ = ['Algol_NuStyle']
class Algol_NuStyle(Style):
name = 'algol_nu'

@@ -39,0 +43,0 @@ background_color = "#ffffff"

@@ -36,3 +36,7 @@ """

__all__ = ['AlgolStyle']
class AlgolStyle(Style):
name = 'algol'

@@ -39,0 +43,0 @@ background_color = "#ffffff"

@@ -16,2 +16,5 @@ """

__all__ = ['ArduinoStyle']
class ArduinoStyle(Style):

@@ -22,2 +25,3 @@ """

"""
name = 'arduino'

@@ -27,73 +31,73 @@ background_color = "#ffffff"

styles = {
Whitespace: "", # class: 'w'
Error: "#a61717", # class: 'err'
Whitespace: "", # class: 'w'
Error: "#a61717", # class: 'err'
Comment: "#95a5a6", # class: 'c'
Comment.Multiline: "", # class: 'cm'
Comment.Preproc: "#728E00", # class: 'cp'
Comment.Single: "", # class: 'c1'
Comment.Special: "", # class: 'cs'
Comment: "#95a5a6", # class: 'c'
Comment.Multiline: "", # class: 'cm'
Comment.Preproc: "#728E00", # class: 'cp'
Comment.Single: "", # class: 'c1'
Comment.Special: "", # class: 'cs'
Keyword: "#728E00", # class: 'k'
Keyword.Constant: "#00979D", # class: 'kc'
Keyword.Declaration: "", # class: 'kd'
Keyword.Namespace: "", # class: 'kn'
Keyword.Pseudo: "#00979D", # class: 'kp'
Keyword.Reserved: "#00979D", # class: 'kr'
Keyword.Type: "#00979D", # class: 'kt'
Keyword: "#728E00", # class: 'k'
Keyword.Constant: "#00979D", # class: 'kc'
Keyword.Declaration: "", # class: 'kd'
Keyword.Namespace: "", # class: 'kn'
Keyword.Pseudo: "#00979D", # class: 'kp'
Keyword.Reserved: "#00979D", # class: 'kr'
Keyword.Type: "#00979D", # class: 'kt'
Operator: "#728E00", # class: 'o'
Operator.Word: "", # class: 'ow'
Operator: "#728E00", # class: 'o'
Operator.Word: "", # class: 'ow'
Name: "#434f54", # class: 'n'
Name.Attribute: "", # class: 'na'
Name.Builtin: "#728E00", # class: 'nb'
Name.Builtin.Pseudo: "", # class: 'bp'
Name.Class: "", # class: 'nc'
Name.Constant: "", # class: 'no'
Name.Decorator: "", # class: 'nd'
Name.Entity: "", # class: 'ni'
Name.Exception: "", # class: 'ne'
Name.Function: "#D35400", # class: 'nf'
Name.Property: "", # class: 'py'
Name.Label: "", # class: 'nl'
Name.Namespace: "", # class: 'nn'
Name.Other: "#728E00", # class: 'nx'
Name.Tag: "", # class: 'nt'
Name.Variable: "", # class: 'nv'
Name.Variable.Class: "", # class: 'vc'
Name.Variable.Global: "", # class: 'vg'
Name.Variable.Instance: "", # class: 'vi'
Name: "#434f54", # class: 'n'
Name.Attribute: "", # class: 'na'
Name.Builtin: "#728E00", # class: 'nb'
Name.Builtin.Pseudo: "", # class: 'bp'
Name.Class: "", # class: 'nc'
Name.Constant: "", # class: 'no'
Name.Decorator: "", # class: 'nd'
Name.Entity: "", # class: 'ni'
Name.Exception: "", # class: 'ne'
Name.Function: "#D35400", # class: 'nf'
Name.Property: "", # class: 'py'
Name.Label: "", # class: 'nl'
Name.Namespace: "", # class: 'nn'
Name.Other: "#728E00", # class: 'nx'
Name.Tag: "", # class: 'nt'
Name.Variable: "", # class: 'nv'
Name.Variable.Class: "", # class: 'vc'
Name.Variable.Global: "", # class: 'vg'
Name.Variable.Instance: "", # class: 'vi'
Number: "#8A7B52", # class: 'm'
Number.Float: "", # class: 'mf'
Number.Hex: "", # class: 'mh'
Number.Integer: "", # class: 'mi'
Number.Integer.Long: "", # class: 'il'
Number.Oct: "", # class: 'mo'
Number: "#8A7B52", # class: 'm'
Number.Float: "", # class: 'mf'
Number.Hex: "", # class: 'mh'
Number.Integer: "", # class: 'mi'
Number.Integer.Long: "", # class: 'il'
Number.Oct: "", # class: 'mo'
String: "#7F8C8D", # class: 's'
String.Backtick: "", # class: 'sb'
String.Char: "", # class: 'sc'
String.Doc: "", # class: 'sd'
String.Double: "", # class: 's2'
String.Escape: "", # class: 'se'
String.Heredoc: "", # class: 'sh'
String.Interpol: "", # class: 'si'
String.Other: "", # class: 'sx'
String.Regex: "", # class: 'sr'
String.Single: "", # class: 's1'
String.Symbol: "", # class: 'ss'
String: "#7F8C8D", # class: 's'
String.Backtick: "", # class: 'sb'
String.Char: "", # class: 'sc'
String.Doc: "", # class: 'sd'
String.Double: "", # class: 's2'
String.Escape: "", # class: 'se'
String.Heredoc: "", # class: 'sh'
String.Interpol: "", # class: 'si'
String.Other: "", # class: 'sx'
String.Regex: "", # class: 'sr'
String.Single: "", # class: 's1'
String.Symbol: "", # class: 'ss'
Generic: "", # class: 'g'
Generic.Deleted: "", # class: 'gd',
Generic.Emph: "", # class: 'ge'
Generic.Error: "", # class: 'gr'
Generic.Heading: "", # class: 'gh'
Generic.Inserted: "", # class: 'gi'
Generic.Output: "", # class: 'go'
Generic.Prompt: "", # class: 'gp'
Generic.Strong: "", # class: 'gs'
Generic.Subheading: "", # class: 'gu'
Generic.Traceback: "", # class: 'gt'
Generic: "", # class: 'g'
Generic.Deleted: "", # class: 'gd',
Generic.Emph: "", # class: 'ge'
Generic.Error: "", # class: 'gr'
Generic.Heading: "", # class: 'gh'
Generic.Inserted: "", # class: 'gi'
Generic.Output: "", # class: 'go'
Generic.Prompt: "", # class: 'gp'
Generic.Strong: "", # class: 'gs'
Generic.Subheading: "", # class: 'gu'
Generic.Traceback: "", # class: 'gt'
}

@@ -16,2 +16,5 @@ """

__all__ = ['AutumnStyle']
class AutumnStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'autumn'

@@ -23,0 +27,0 @@ styles = {

@@ -16,2 +16,5 @@ """

__all__ = ['BorlandStyle']
class BorlandStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'borland'

@@ -23,0 +27,0 @@ styles = {

@@ -16,3 +16,7 @@ """

__all__ = ['BlackWhiteStyle']
class BlackWhiteStyle(Style):
name = 'bw'

@@ -19,0 +23,0 @@ background_color = "#ffffff"

@@ -16,2 +16,5 @@ """

__all__ = ['ColorfulStyle']
class ColorfulStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'colorful'

@@ -23,0 +27,0 @@ styles = {

@@ -16,2 +16,5 @@ """

__all__ = ['DefaultStyle']
class DefaultStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'default'

@@ -23,0 +27,0 @@ background_color = "#f8f8f8"

@@ -19,86 +19,73 @@ """

__all__ = ['DraculaStyle']
background = "#282a36"
foreground = "#f8f8f2"
selection = "#44475a"
comment = "#6272a4"
cyan = "#8be9fd"
green = "#50fa7b"
orange = "#ffb86c"
pink = "#ff79c6"
purple = "#bd93f9"
red = "#ff5555"
yellow = "#f1fa8c"
deletion = "#8b080b"
class DraculaStyle(Style):
name = 'dracula'
background_color = "#282a36"
highlight_color = "#44475a"
line_number_color = "#f1fa8c"
line_number_background_color = "#44475a"
line_number_special_color = "#50fa7b"
line_number_special_background_color = "#6272a4"
background_color = background
highlight_color = selection
line_number_color = yellow
line_number_background_color = selection
line_number_special_color = green
line_number_special_background_color = comment
styles = {
Whitespace: "#f8f8f2",
Whitespace: foreground,
Comment: "#6272a4",
Comment.Hashbang: "#6272a4",
Comment.Multiline: "#6272a4",
Comment.Preproc: "#ff79c6",
Comment.Single: "#6272a4",
Comment.Special: "#6272a4",
Comment: comment,
Comment.Preproc: pink,
Generic: "#f8f8f2",
Generic.Deleted: "#8b080b",
Generic.Emph: "#f8f8f2 underline",
Generic.Error: "#f8f8f2",
Generic.Heading: "#f8f8f2 bold",
Generic.Inserted: "#f8f8f2 bold",
Generic.Output: "#44475a",
Generic.Prompt: "#f8f8f2",
Generic.Strong: "#f8f8f2",
Generic.EmphStrong: "#f8f8f2 underline",
Generic.Subheading: "#f8f8f2 bold",
Generic.Traceback: "#f8f8f2",
Generic: foreground,
Generic.Deleted: deletion,
Generic.Emph: "underline",
Generic.Heading: "bold",
Generic.Inserted: "bold",
Generic.Output: selection,
Generic.EmphStrong: "underline",
Generic.Subheading: "bold",
Error: "#f8f8f2",
Keyword: "#ff79c6",
Keyword.Constant: "#ff79c6",
Keyword.Declaration: "#8be9fd italic",
Keyword.Namespace: "#ff79c6",
Keyword.Pseudo: "#ff79c6",
Keyword.Reserved: "#ff79c6",
Keyword.Type: "#8be9fd",
Literal: "#f8f8f2",
Literal.Date: "#f8f8f2",
Name: "#f8f8f2",
Name.Attribute: "#50fa7b",
Name.Builtin: "#8be9fd italic",
Name.Builtin.Pseudo: "#f8f8f2",
Name.Class: "#50fa7b",
Name.Constant: "#f8f8f2",
Name.Decorator: "#f8f8f2",
Name.Entity: "#f8f8f2",
Name.Exception: "#f8f8f2",
Name.Function: "#50fa7b",
Name.Label: "#8be9fd italic",
Name.Namespace: "#f8f8f2",
Name.Other: "#f8f8f2",
Name.Tag: "#ff79c6",
Name.Variable: "#8be9fd italic",
Name.Variable.Class: "#8be9fd italic",
Name.Variable.Global: "#8be9fd italic",
Name.Variable.Instance: "#8be9fd italic",
Number: "#ffb86c",
Number.Bin: "#ffb86c",
Number.Float: "#ffb86c",
Number.Hex: "#ffb86c",
Number.Integer: "#ffb86c",
Number.Integer.Long: "#ffb86c",
Number.Oct: "#ffb86c",
Operator: "#ff79c6",
Operator.Word: "#ff79c6",
Other: "#f8f8f2",
Punctuation: "#f8f8f2",
String: "#bd93f9",
String.Backtick: "#bd93f9",
String.Char: "#bd93f9",
String.Doc: "#bd93f9",
String.Double: "#bd93f9",
String.Escape: "#bd93f9",
String.Heredoc: "#bd93f9",
String.Interpol: "#bd93f9",
String.Other: "#bd93f9",
String.Regex: "#bd93f9",
String.Single: "#bd93f9",
String.Symbol: "#bd93f9",
Text: "#f8f8f2",
Error: foreground,
Keyword: pink,
Keyword.Constant: pink,
Keyword.Declaration: cyan + " italic",
Keyword.Type: cyan,
Literal: foreground,
Name: foreground,
Name.Attribute: green,
Name.Builtin: cyan + " italic",
Name.Builtin.Pseudo: foreground,
Name.Class: green,
Name.Function: green,
Name.Label: cyan + " italic",
Name.Tag: pink,
Name.Variable: cyan + " italic",
Number: orange,
Operator: pink,
Other: foreground,
Punctuation: foreground,
String: purple,
Text: foreground,
}
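The Dracula rewrite above replaces repeated hex literals with module-level palette constants and drops subtoken entries that merely duplicated their parent. The same pattern works for any custom style; this is a hedged sketch with a hypothetical palette (the style name and colors below are illustrative, not Pygments builtins):

```python
from pygments.style import Style
from pygments.token import Comment, Keyword, String

# Hypothetical palette constants, mirroring the pattern used above.
BACKGROUND = "#1e1e2e"
MUTED = "#6c7086"
ACCENT = "#f38ba8"

class MyDarkStyle(Style):
    """A minimal custom style built from named palette constants."""
    name = "my-dark"  # hypothetical, not a registered builtin
    background_color = BACKGROUND
    styles = {
        # Subtokens inherit from their parent, so only overrides are listed.
        Comment: MUTED + " italic",
        Keyword: ACCENT,
        String: "#a6e3a1",
    }

assert MyDarkStyle.styles[Keyword] == ACCENT
```

Because token styles inherit hierarchically, omitting e.g. `Comment.Single` means it falls back to the `Comment` entry — which is why the new Dracula definition is so much shorter.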

@@ -16,2 +16,5 @@ """

__all__ = ['EmacsStyle']
class EmacsStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'emacs'

@@ -23,0 +27,0 @@ background_color = "#f8f8f8"

@@ -19,2 +19,5 @@ """

__all__ = ['FriendlyGrayscaleStyle']
class FriendlyGrayscaleStyle(Style):

@@ -26,2 +29,3 @@ """

"""
name = 'friendly_grayscale'

@@ -28,0 +32,0 @@ background_color = "#f0f0f0"

@@ -16,2 +16,5 @@ """

__all__ = ['FriendlyStyle']
class FriendlyStyle(Style):

@@ -21,2 +24,3 @@ """

"""
name = 'friendly'

@@ -23,0 +27,0 @@ background_color = "#f0f0f0"

@@ -15,2 +15,6 @@ """

__all__ = ['FruityStyle']
class FruityStyle(Style):

@@ -21,2 +25,4 @@ """

name = 'fruity'
background_color = '#111111'

@@ -23,0 +29,0 @@ highlight_color = '#333333'

@@ -17,2 +17,5 @@ """

__all__ = ['GhDarkStyle']
# vars are defined to match the defs in

@@ -44,2 +47,4 @@ # - [GitHub's VS Code theme](https://github.com/primer/github-vscode-theme) and

"""
name = 'github-dark'

@@ -46,0 +51,0 @@ background_color = BG_DEFAULT

@@ -17,2 +17,5 @@ """

__all__ = ['GruvboxDarkStyle', 'GruvboxLightStyle']
class GruvboxDarkStyle(Style):

@@ -22,2 +25,4 @@ """

"""
name = 'gruvbox-dark'

@@ -68,2 +73,3 @@ background_color = '#282828'

class GruvboxLightStyle(Style):

@@ -74,2 +80,4 @@ """

name = 'gruvbox-light'
background_color = '#fbf1c7'

@@ -76,0 +84,0 @@ highlight_color = '#3c3836'

@@ -15,2 +15,5 @@ """

__all__ = ['IgorStyle']
class IgorStyle(Style):

@@ -21,2 +24,4 @@ """

name = 'igor'
styles = {

@@ -23,0 +28,0 @@ Comment: 'italic #FF0000',

@@ -16,4 +16,8 @@ """

__all__ = ['InkPotStyle']
class InkPotStyle(Style):
name = 'inkpot'
background_color = "#1e1e27"

@@ -20,0 +24,0 @@

@@ -27,2 +27,5 @@ """

__all__ = ['LightbulbStyle']
COLORS = {

@@ -51,2 +54,5 @@ "bg": "#1d2331",

"""
name = 'lightbulb'
background_color = COLORS['bg']

@@ -53,0 +59,0 @@ highlight_color = COLORS['gray_3']

@@ -14,9 +14,15 @@ """

__all__ = ['LilyPondStyle']
class LilyPondStyle(Style):
"""
Style for the LilyPond language.
.. versionadded:: 2.11
"""
name = 'lilypond'
# Don't show it in the gallery, it's intended for LilyPond

@@ -23,0 +29,0 @@ # input only and doesn't show good output on Python code.

@@ -20,2 +20,5 @@ """

__all__ = ['LovelaceStyle']
class LovelaceStyle(Style):

@@ -26,11 +29,13 @@ """

"""
_KW_BLUE = '#2838b0'
_NAME_GREEN = '#388038'
_DOC_ORANGE = '#b85820'
_OW_PURPLE = '#a848a8'
_FUN_BROWN = '#785840'
_STR_RED = '#b83838'
_CLS_CYAN = '#287088'
_ESCAPE_LIME = '#709030'
_LABEL_CYAN = '#289870'
name = 'lovelace'
_KW_BLUE = '#2838b0'
_NAME_GREEN = '#388038'
_DOC_ORANGE = '#b85820'
_OW_PURPLE = '#a848a8'
_FUN_BROWN = '#785840'
_STR_RED = '#b83838'
_CLS_CYAN = '#287088'
_ESCAPE_LIME = '#709030'
_LABEL_CYAN = '#289870'
_EXCEPT_YELLOW = '#908828'

@@ -37,0 +42,0 @@

@@ -19,2 +19,5 @@ """

__all__ = ['ManniStyle']
class ManniStyle(Style):

@@ -24,3 +27,4 @@ """

"""
name = 'manni'
background_color = '#f0f3f3'

@@ -27,0 +31,0 @@

@@ -17,2 +17,6 @@ """

__all__ = ['MaterialStyle']
class MaterialStyle(Style):

@@ -22,16 +26,18 @@ """

"""
name = 'material'
dark_teal = '#263238'
white= '#FFFFFF'
black= '#000000'
red= '#FF5370'
orange= '#F78C6C'
yellow= '#FFCB6B'
green= '#C3E88D'
cyan= '#89DDFF'
blue= '#82AAFF'
paleblue= '#B2CCD6'
purple= '#C792EA'
brown= '#C17E70'
pink= '#F07178'
violet= '#BB80B3'
white = '#FFFFFF'
black = '#000000'
red = '#FF5370'
orange = '#F78C6C'
yellow = '#FFCB6B'
green = '#C3E88D'
cyan = '#89DDFF'
blue = '#82AAFF'
paleblue = '#B2CCD6'
purple = '#C792EA'
brown = '#C17E70'
pink = '#F07178'
violet = '#BB80B3'
foreground = '#EEFFFF'

@@ -38,0 +44,0 @@ faded = '#546E7A'

@@ -17,2 +17,6 @@ """

__all__ = ['MonokaiStyle']
class MonokaiStyle(Style):

@@ -22,3 +26,4 @@ """

"""
name = 'monokai'
background_color = "#272822"

@@ -25,0 +30,0 @@ highlight_color = "#49483e"

@@ -16,2 +16,5 @@ """

__all__ = ['MurphyStyle']
class MurphyStyle(Style):

@@ -21,3 +24,4 @@ """

"""
name = 'murphy'
styles = {

@@ -24,0 +28,0 @@ Whitespace: "#bbbbbb",

@@ -16,2 +16,5 @@ """

__all__ = ['NativeStyle']
class NativeStyle(Style):

@@ -21,3 +24,4 @@ """

"""
name = 'native'
background_color = '#202020'

@@ -24,0 +28,0 @@ highlight_color = '#404040'

@@ -17,2 +17,5 @@ """

__all__ = ['NordStyle', 'NordDarkerStyle']
class NordStyle(Style):

@@ -22,3 +25,4 @@ """

"""
name = 'nord'
line_number_color = "#D8DEE9"

@@ -92,3 +96,4 @@ line_number_background_color = "#242933"

"""
name = 'nord-darker'
line_number_color = "#D8DEE9"

@@ -95,0 +100,0 @@ line_number_background_color = "#242933"

@@ -19,2 +19,5 @@ """

__all__ = ['OneDarkStyle']
class OneDarkStyle(Style):

@@ -26,3 +29,4 @@ """

"""
name = 'one-dark'
background_color = '#282C34'

@@ -29,0 +33,0 @@

@@ -20,2 +20,5 @@ """

__all__ = ['ParaisoDarkStyle']
BACKGROUND = "#2f1e2e"

@@ -36,3 +39,4 @@ CURRENT_LINE = "#41323f"

class ParaisoDarkStyle(Style):
name = 'paraiso-dark'
background_color = BACKGROUND

@@ -39,0 +43,0 @@ highlight_color = SELECTION

@@ -20,2 +20,5 @@ """

__all__ = ['ParaisoLightStyle']
BACKGROUND = "#e7e9db"

@@ -36,3 +39,4 @@ CURRENT_LINE = "#b9b6b0"

class ParaisoLightStyle(Style):
name = 'paraiso-light'
background_color = BACKGROUND

@@ -39,0 +43,0 @@ highlight_color = SELECTION

@@ -18,2 +18,5 @@ """

__all__ = ['PastieStyle']
class PastieStyle(Style):

@@ -24,2 +27,4 @@ """

name = 'pastie'
styles = {

@@ -26,0 +31,0 @@ Whitespace: '#bbbbbb',

@@ -18,2 +18,5 @@ """

__all__ = ['PerldocStyle']
class PerldocStyle(Style):

@@ -24,2 +27,4 @@ """

name = 'perldoc'
background_color = '#eeeedd'

@@ -26,0 +31,0 @@

@@ -17,2 +17,6 @@ """

__all__ = ['RainbowDashStyle']
BLUE_LIGHT = '#0080ff'

@@ -41,2 +45,4 @@ BLUE = '#2c5dcd'

name = 'rainbow_dash'
background_color = WHITE

@@ -43,0 +49,0 @@

@@ -12,5 +12,8 @@ """

from pygments.style import Style
from pygments.token import Token, Comment, Name, Keyword, String
from pygments.token import Token, Comment, Name, Keyword, String, Number
__all__ = ['RrtStyle']
class RrtStyle(Style):

@@ -21,2 +24,4 @@ """

name = 'rrt'
background_color = '#000000'

@@ -35,2 +40,3 @@ highlight_color = '#0000ff'

Keyword.Type: '#ee82ee',
Number: '#ff00ff',
}

@@ -18,2 +18,5 @@ """

__all__ = ['SasStyle']
class SasStyle(Style):

@@ -26,2 +29,4 @@ """

name = 'sas'
styles = {

@@ -28,0 +33,0 @@ Whitespace: '#bbbbbb',

@@ -19,2 +19,5 @@ """

__all__ = ['SolarizedLightStyle', 'SolarizedDarkStyle']
def make_style(colors):

@@ -122,2 +125,4 @@ return {

name = 'solarized-dark'
styles = make_style(DARK_COLORS)

@@ -135,2 +140,4 @@ background_color = DARK_COLORS['base03']

name = 'solarized-light'
styles = make_style(LIGHT_COLORS)

@@ -137,0 +144,0 @@ background_color = LIGHT_COLORS['base03']

@@ -15,2 +15,5 @@ """

__all__ = ['StarofficeStyle']
class StarofficeStyle(Style):

@@ -20,3 +23,5 @@ """

"""
name = 'staroffice'
styles = {

@@ -23,0 +28,0 @@ Token: '#000080', # Blue

@@ -18,4 +18,8 @@ """

__all__ = ['StataDarkStyle']
class StataDarkStyle(Style):
name = 'stata-dark'
background_color = "#232629"

@@ -22,0 +26,0 @@ highlight_color = "#49483e"

@@ -17,2 +17,5 @@ """

__all__ = ['StataLightStyle']
class StataLightStyle(Style):

@@ -24,2 +27,4 @@ """

name = 'stata-light'
styles = {

@@ -26,0 +31,0 @@ Text: '#111111',

@@ -44,2 +44,5 @@ """

__all__ = ['TangoStyle']
class TangoStyle(Style):

@@ -51,4 +54,4 @@ """

# work in progress...
name = 'tango'
background_color = "#f8f8f8"

@@ -55,0 +58,0 @@

@@ -16,2 +16,5 @@ """

__all__ = ['TracStyle']
class TracStyle(Style):

@@ -22,2 +25,4 @@ """

name = 'trac'
styles = {

@@ -24,0 +29,0 @@ Whitespace: '#bbbbbb',

@@ -16,2 +16,5 @@ """

__all__ = ['VimStyle']
class VimStyle(Style):

@@ -22,2 +25,4 @@ """

name = 'vim'
background_color = "#000000"

@@ -24,0 +29,0 @@ highlight_color = "#222222"

@@ -16,4 +16,8 @@ """

__all__ = ['VisualStudioStyle']
class VisualStudioStyle(Style):
name = 'vs'
background_color = "#ffffff"

@@ -20,0 +24,0 @@

@@ -16,2 +16,5 @@ """

__all__ = ['XcodeStyle']
class XcodeStyle(Style):

@@ -22,2 +25,4 @@ """

name = 'xcode'
styles = {

@@ -24,0 +29,0 @@ Comment: '#177500',

@@ -19,2 +19,5 @@ """

__all__ = ['ZenburnStyle']
class ZenburnStyle(Style):

@@ -25,2 +28,4 @@ """

name = 'zenburn'
background_color = '#3f3f3f'

@@ -27,0 +32,0 @@ highlight_color = '#484848'

[build-system]
# setuptools added pyproject.toml support in v61.0.0
requires = ["setuptools >= 61"]
build-backend = "setuptools.build_meta"
requires = ["hatchling"]
build-backend = "hatchling.build"

@@ -37,2 +36,3 @@ [project]

"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: Implementation :: CPython",

@@ -46,2 +46,3 @@ "Programming Language :: Python :: Implementation :: PyPy",

plugins = ["importlib-metadata;python_version<'3.8'"]
windows-terminal = ["colorama >= 0.4.6"]

@@ -58,6 +59,3 @@ [project.urls]

[tool.setuptools.dynamic]
version = {attr = "pygments.__version__" }
[tool.setuptools.packages.find]
include = ["pygments", "pygments.*"]
[tool.hatch.version]
path = "pygments/__init__.py"
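With the switch from setuptools to hatchling above, the version is no longer declared via setuptools' dynamic `attr`; hatchling reads the `__version__` attribute out of `pygments/__init__.py` at build time. The runtime attribute is the same one users already rely on:

```python
# The value hatchling extracts at build time is this same attribute.
import pygments

assert isinstance(pygments.__version__, str)
print(pygments.__version__)
```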

@@ -22,3 +22,3 @@ """

def main():
for key in ['lexers', 'formatters']:
for key in ['lexers', 'formatters', 'styles']:
lines = []

@@ -31,7 +31,9 @@ for file in (pygments_package / key).glob('[!_]*.py'):

obj = getattr(module, obj_name)
desc = (module_name, obj.name, tuple(obj.aliases), tuple(obj.filenames))
desc = (module_name, obj.name, tuple(obj.aliases))
if key == 'lexers':
desc += (tuple(obj.mimetypes),)
desc += (tuple(obj.filenames), tuple(obj.mimetypes),)
elif key == 'formatters':
desc += (docstring_headline(obj),)
desc += (tuple(obj.filenames), docstring_headline(obj),)
elif key == 'styles':
pass
else:

@@ -38,0 +40,0 @@ assert False
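The mapfile generator above now also regenerates a mapping for styles, and builds each entry from a common `(module, name, aliases)` prefix before appending the per-kind fields. For lexers the resulting tuple layout is unchanged; a sketch of what a generated entry looks like (using the real `PythonLexer` entry as the example):

```python
from pygments.lexers._mapping import LEXERS

# Each lexer entry: (module, name, aliases, filenames, mimetypes)
module, name, aliases, filenames, mimetypes = LEXERS["PythonLexer"]
assert module == "pygments.lexers.python"
assert "python" in aliases
assert "*.py" in filenames
```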

@@ -49,5 +49,11 @@ """

lexer = pygments.lexers.get_lexer_by_name(self.lexer)
tokens = lexer.get_tokens(self.input)
tokens = list(lexer.get_tokens(self.input))
self.actual = '\n'.join(self._prettyprint_tokens(tokens)).rstrip('\n') + '\n'
if not self.config.getoption('--update-goldens'):
if self.config.getoption('--update-goldens'):
# Make sure the new golden output corresponds to the input.
output = ''.join(val for (tok, val) in tokens)
preproc_input = lexer._preprocess_lexer_input(self.input) # remove BOMs etc.
assert output == preproc_input
else:
# Make sure the output is the expected golden output
assert self.actual == self.expected
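The new `--update-goldens` branch above checks a lossless-lexing invariant before writing a fresh golden file: concatenating every token's value must reproduce the (preprocessed) input. That round-trip property can be sketched outside the test harness as:

```python
from pygments.lexers import get_lexer_by_name

code = "print('hi')\n"
lexer = get_lexer_by_name("python")
tokens = list(lexer.get_tokens(code))

# Pygments lexers are lossless: joining all token values yields the
# input back (modulo preprocessing such as newline normalization).
assert "".join(value for _, value in tokens) == code
```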

@@ -63,8 +69,14 @@

if isinstance(excinfo.value, AssertionError):
rel_path = self._test_file_rel_path()
message = (
'The tokens produced by the "{}" lexer differ from the '
'expected ones in the file "{}".\n'
'Run `pytest {} --update-goldens` to update it.'
).format(self.lexer, rel_path, Path(*rel_path.parts[:2]))
if self.config.getoption('--update-goldens'):
message = (
f'The tokens produced by the "{self.lexer}" lexer '
'do not add up to the input.'
)
else:
rel_path = self._test_file_rel_path()
message = (
'The tokens produced by the "{}" lexer differ from the '
'expected ones in the file "{}".\n'
'Run `tox -- {} --update-goldens` to update it.'
).format(self.lexer, rel_path, Path(*rel_path.parts[:2]))
diff = str(excinfo.value).split('\n', 1)[-1]

@@ -71,0 +83,0 @@ return message + '\n\n' + diff

@@ -23,2 +23,7 @@ CMAKE_MINIMUM_REQUIRED(VERSION 2.6 FATAL_ERROR)

message([==[
this is a
multiline argument
]==])
SET( x y A B C ) # stores "y;A;B;C" in x (without quote)

@@ -56,2 +61,2 @@ SET( ${x} ) # => SET( y;A;B;C ) => SET( y A B C)

#[==[ #[[ A "nested" comment ]] ]==]
#[==[ #[[ A "nested" comment ]] ]==]
---input---
-{R|zh-cn:博客;; zh-hk:網誌; zh-tw:部落格;}-
-{T|zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{A|zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{H|zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{-|zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{H|zh-cn:[[博客]]; zh-hk:網誌; zh-tw:部落格}-
-{H|zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{H|zh-cn:[[博客]]; zh-hk:網誌; zh-tw:部落格; }-
-{zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{zh-cn:[[博客]]; zh-hk:網誌; zh-tw:部落格}-
-{zh-cn:博客; zh-hk:網誌; zh-tw:部落格;}-
-{zh-cn:[[博客]]; zh-hk:網誌; zh-tw:部落格; }-
-{zh-tw=>zh-cn:[[博客]];zh-hk=>zh-cn:[[博客]]; }-
-{H|巨集=>zh-cn:宏;}-
-{D|U槽=>zh-cn:U盘; U槽=>zh-sg:U盘; U槽=>zh-my:U盘; U槽=>zh-tw:USB磁碟機; U槽=>zh-hk:U磁碟機; U槽=>zh-mo:U磁碟機}-
北-{}-韓、北朝-{}-鲜
-{部落格}- -{[[部落格]]}-
-{zh;zh-hans;zh-hant|博客、網誌、部落格}-
-{zh;zh-hans;zh-hant|zh-hans:博客、網誌、部落格;zh-hant:;;;;;;;;;}-
-{zh;zh-cn;zh-hk|博客、網誌、部落格}-
-{zh-cn:-{[[博客]]}-; zh-invalid:網誌; zh-tw:部落格}-
-{zh-invalid:''a''[[博客]];;;zh-cn:a}-
-{zh:''a''[[博客]]}-
-{zh-hans:<span style="font-size:120%;">xxx</span>;zh-hant:\

@@ -23,3 +31,2 @@ <span style="font-size:120%;">yyy</span>;}-

-{{This is a template}}-
-{{{This is a template parameter}}}-

@@ -29,9 +36,8 @@

'-{' Punctuation
'H' Keyword
'R' Keyword
'|' Punctuation
'zh-cn' Name.Label
':' Punctuation
'[[' Punctuation
'博客' Name.Tag
']]' Punctuation
'博客' Text
';' Text
';' Punctuation

@@ -45,7 +51,7 @@ ' zh-hk' Name.Label

'部落格' Text
'}-' Punctuation
';}-' Punctuation
'\n' Text
'-{' Punctuation
'H' Keyword
'T' Keyword
'|' Punctuation

@@ -67,2 +73,19 @@ 'zh-cn' Name.Label

'-{' Punctuation
'A' Keyword
'|' Punctuation
'zh-cn' Name.Label
':' Punctuation
'博客' Text
';' Punctuation
' zh-hk' Name.Label
':' Punctuation
'網誌' Text
';' Punctuation
' zh-tw' Name.Label
':' Punctuation
'部落格' Text
';}-' Punctuation
'\n' Text
'-{' Punctuation
'H' Keyword

@@ -72,5 +95,3 @@ '|' Punctuation

':' Punctuation
'[[' Punctuation
'博客' Name.Tag
']]' Punctuation
'博客' Text
';' Punctuation

@@ -84,8 +105,25 @@ ' zh-hk' Name.Label

'部落格' Text
'; }-' Punctuation
';}-' Punctuation
'\n' Text
'-{' Punctuation
'-' Keyword
'|' Punctuation
'zh-cn' Name.Label
':' Punctuation
'博客' Text
';' Punctuation
' zh-hk' Name.Label
':' Punctuation
'網誌' Text
';' Punctuation
' zh-tw' Name.Label
':' Punctuation
'部落格' Text
';}-' Punctuation
'\n' Text
'-{' Punctuation
'H' Keyword
'|' Punctuation
'zh-cn' Name.Label

@@ -107,2 +145,4 @@ ':' Punctuation

'\n' Text
'-{' Punctuation

@@ -137,10 +177,6 @@ 'zh-cn' Name.Label

'部落格' Text
'; }-' Punctuation
'}-' Punctuation
'\n' Text
'\n' Text
'-{' Punctuation
'zh-tw' Name.Label
'=>' Operator
'zh-cn' Name.Label

@@ -152,15 +188,90 @@ ':' Punctuation

';' Punctuation
'zh-hk' Name.Label
' zh-hk' Name.Label
':' Punctuation
'網誌' Text
';' Punctuation
' zh-tw' Name.Label
':' Punctuation
'部落格' Text
';' Punctuation
' ' Text
'}-' Punctuation
'\n' Text
'\n' Text
'-{' Punctuation
'H' Keyword
'|' Punctuation
'巨集' Text
'=>' Operator
'zh-cn' Name.Label
':' Punctuation
'[[' Punctuation
'博客' Name.Tag
']]' Punctuation
'; }-' Punctuation
'宏' Text
';' Punctuation
'}-' Punctuation
'\n' Text
'-{' Punctuation
'D' Keyword
'|' Punctuation
'U槽' Text
'=>' Operator
'zh-cn' Name.Label
':' Punctuation
'U盘' Text
';' Punctuation
' ' Text
'U槽' Text
'=>' Operator
'zh-sg' Name.Label
':' Punctuation
'U盘' Text
';' Punctuation
' ' Text
'U槽' Text
'=>' Operator
'zh-my' Name.Label
':' Punctuation
'U盘' Text
';' Punctuation
' ' Text
'U槽' Text
'=>' Operator
'zh-tw' Name.Label
':' Punctuation
'USB磁碟機' Text
';' Punctuation
' ' Text
'U槽' Text
'=>' Operator
'zh-hk' Name.Label
':' Punctuation
'U磁碟機' Text
';' Punctuation
' ' Text
'U槽' Text
'=>' Operator
'zh-mo' Name.Label
':' Punctuation
'U磁碟機' Text
'}-' Punctuation
'\n' Text
'\n' Text
'北' Text
'-{' Punctuation
'}-' Punctuation
'韓' Text
'、' Text
'北朝' Text
'-{' Punctuation
'}-' Punctuation
'鲜' Text
'\n' Text
'\n' Text
'-{' Punctuation
'部落格' Text

@@ -179,2 +290,79 @@ '}-' Punctuation

'-{' Punctuation
'zh' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'hans' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'hant' Keyword
'|' Punctuation
'博客' Text
'、' Text
'網誌' Text
'、' Text
'部落格' Text
'}-' Punctuation
'\n' Text
'-{' Punctuation
'zh' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'hans' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'hant' Keyword
'|' Punctuation
'zh' Text
'-' Text
'hans' Text
':' Text
'博客' Text
'、' Text
'網誌' Text
'、' Text
'部落格' Text
';' Text
'zh' Text
'-' Text
'hant' Text
':' Text
';' Text
';' Text
';' Text
';' Text
';' Text
';' Text
';' Text
';' Text
';' Text
'}-' Punctuation
'\n' Text
'-{' Punctuation
'zh' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'cn' Keyword
';' Punctuation
'zh' Keyword
'-' Keyword
'hk' Keyword
'|' Punctuation
'博客' Text
'、' Text
'網誌' Text
'、' Text
'部落格' Text
'}-' Punctuation
'\n' Text
'\n' Text
'-{' Punctuation
'zh-cn' Name.Label

@@ -201,4 +389,2 @@ ':' Punctuation

'\n' Text
'-{' Punctuation

@@ -217,7 +403,5 @@ 'zh' Text

';' Text
';' Text
'zh' Text
'-' Text
'cn' Text
':' Text
';' Punctuation
'zh-cn' Name.Label
':' Punctuation
'a' Text

@@ -239,4 +423,2 @@ '}-' Punctuation

'\n' Text
'-{' Punctuation

@@ -307,4 +489,2 @@ 'zh-hans' Name.Label

'\n' Text
'-' Text

@@ -311,0 +491,0 @@ '{{{' Punctuation

@@ -238,3 +238,3 @@ """

assert pytest.raises(
RuntimeError, HtmlFormatter, tagsfile='support/tags'
RuntimeError, HtmlFormatter, tagsfile='tests/support/tags'
)

@@ -244,3 +244,3 @@ else:

# anymore in the actual source
fmt = HtmlFormatter(tagsfile='support/tags', lineanchors='L',
fmt = HtmlFormatter(tagsfile='tests/support/tags', lineanchors='L',
tagurlformat='%(fname)s%(fext)s')

@@ -247,0 +247,0 @@ outfile = StringIO()

include Makefile CHANGES LICENSE AUTHORS
include external/*
recursive-include tests *
recursive-include doc *
recursive-include scripts *
[console_scripts]
pygmentize = pygments.cmdline:main
Metadata-Version: 2.1
Name: Pygments
Version: 2.16.1
Summary: Pygments is a syntax highlighting package written in Python.
Author-email: Georg Brandl <georg@python.org>
Maintainer: Matthäus G. Chajdas
Maintainer-email: Georg Brandl <georg@python.org>, Jean Abou Samra <jean@abou-samra.fr>
License: BSD-2-Clause
Project-URL: Homepage, https://pygments.org
Project-URL: Documentation, https://pygments.org/docs
Project-URL: Source, https://github.com/pygments/pygments
Project-URL: Bug Tracker, https://github.com/pygments/pygments/issues
Project-URL: Changelog, https://github.com/pygments/pygments/blob/master/CHANGES
Keywords: syntax highlighting
Classifier: Development Status :: 6 - Mature
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Programming Language :: Python :: Implementation :: PyPy
Classifier: Topic :: Text Processing :: Filters
Classifier: Topic :: Utilities
Requires-Python: >=3.7
Description-Content-Type: text/x-rst
Provides-Extra: plugins
License-File: LICENSE
License-File: AUTHORS
Pygments
~~~~~~~~
Pygments is a syntax highlighting package written in Python.
It is a generic syntax highlighter suitable for use in code hosting, forums,
wikis or other applications that need to prettify source code. Highlights
are:
* a wide range of over 500 languages and other text formats is supported
* special attention is paid to details, increasing quality by a fair amount
* support for new languages and formats is added easily
* a number of output formats, presently HTML, LaTeX, RTF, SVG, all image
formats that PIL supports and ANSI sequences
* it is usable as a command-line tool and as a library
Copyright 2006-2023 by the Pygments team, see ``AUTHORS``.
Licensed under the BSD, see ``LICENSE`` for details.
[plugins]
[plugins:python_version < "3.8"]
importlib-metadata

Sorry, the diff of this file is too big to display

[egg_info]
tag_build =
tag_date = 0
