
glsl-tokenizer
Maps GLSL string data into GLSL tokens, either synchronously or using a streaming API.
var tokenString = require('glsl-tokenizer/string')
var tokenStream = require('glsl-tokenizer/stream')
var fs = require('fs')
// Synchronously:
var tokens = tokenString(fs.readFileSync('some.glsl'))
// Streaming API:
fs.createReadStream('some.glsl')
  .pipe(tokenStream())
  .on('data', function(token) {
    console.log(token.data, token.position, token.type)
  })
tokenString(src, [opt]) returns an array of tokens given the GLSL source string src.
You can specify an opt.version string to use different keywords/builtins, such as '300 es' for WebGL2; otherwise GLSL 100 (WebGL1) is assumed.
var tokens = tokenString(src, {
  version: '300 es'
})
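For instance, here is a minimal sketch of working with the returned array; the file name some.glsl is a placeholder, and the token types used in the filter are documented further down.

var fs = require('fs')
var tokenString = require('glsl-tokenizer/string')

var tokens = tokenString(fs.readFileSync('some.glsl', 'utf8'))

// Drop whitespace and comment tokens, then count what remains by type.
var counts = {}
tokens
  .filter(function (t) {
    return t.type !== 'whitespace' &&
           t.type !== 'line-comment' &&
           t.type !== 'block-comment'
  })
  .forEach(function (t) {
    counts[t.type] = (counts[t.type] || 0) + 1
  })

console.log(counts) // e.g. { keyword: 12, ident: 30, operator: 25, ... }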
tokenStream([opt]) returns a stream that emits a 'data' event with a token object whenever a token is parsed.
As above, you can specify opt.version.
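As a sketch, assuming the stream constructor accepts the same opt object as the string API (the file name below is illustrative):

var fs = require('fs')
var tokenStream = require('glsl-tokenizer/stream')

// Tokenize a WebGL2 (GLSL ES 3.00) shader from disk and log only keyword tokens.
fs.createReadStream('some300es.glsl')
  .pipe(tokenStream({ version: '300 es' }))
  .on('data', function (token) {
    if (token.type === 'keyword') console.log(token.data, token.line)
  })

Each token object, whether emitted by the stream or returned from the string API, has the following shape: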
{ 'type': TOKEN_TYPE
, 'data': "string of constituent data"
, 'position': integer position within the GLSL source
, 'line': line number within the GLSL source
, 'column': column number within the GLSL source }
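For example, here is a small sketch that uses the line and column fields to report where builtin functions appear; the file name is again a placeholder.

var fs = require('fs')
var tokenString = require('glsl-tokenizer/string')

var tokens = tokenString(fs.readFileSync('some.glsl', 'utf8'))

// Report each builtin token along with where it occurs in the source.
tokens
  .filter(function (t) { return t.type === 'builtin' })
  .forEach(function (t) {
    console.log('builtin ' + t.data + ' at line ' + t.line + ', column ' + t.column)
  })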
The available token types are:
- block-comment: /* ... */
- line-comment: // ... \n
- preprocessor: # ... \n
- operator: Any operator. If it looks like punctuation, it's an operator.
- float: Optionally suffixed with f
- ident: User defined identifier.
- builtin: Builtin function.
- eof: Emitted on end; data will === '(eof)'.
- integer
- whitespace
- keyword

License
MIT, see LICENSE.md for further information.
The glsl-parser package is used for parsing GLSL code into an abstract syntax tree (AST). While glsl-tokenizer focuses on breaking down the code into tokens, glsl-parser goes a step further by providing a structured representation of the code. This can be useful for more complex analysis and transformations.
The glsl-transpiler package is designed to transpile GLSL code to other shading languages or JavaScript. It includes tokenization and parsing as part of its process but extends the functionality to code transformation. This makes it more comprehensive compared to glsl-tokenizer, which is focused solely on tokenization.
The glsl-optimizer package is used for optimizing GLSL code. It includes tokenization and parsing as part of its optimization pipeline. While glsl-tokenizer provides the basic building blocks for breaking down GLSL code, glsl-optimizer focuses on improving the performance and efficiency of the code.
FAQs
What is glsl-tokenizer?
glsl-tokenizer is a read/write (r/w) stream of GLSL tokens.
The npm package glsl-tokenizer receives a total of 190,056 weekly downloads, and its popularity is classified as popular.
We found that glsl-tokenizer has an unhealthy version release cadence and project activity because the last version was released a year ago. It has 17 open source maintainers collaborating on the project.