
glsl-tokenizer
The glsl-tokenizer npm package is a tool for tokenizing GLSL (OpenGL Shading Language) code. It breaks down GLSL code into a stream of tokens, which can then be used for further processing, such as parsing, analysis, or transformation.
Tokenizing GLSL code
This feature allows you to tokenize a GLSL shader code. The code sample reads a GLSL file, tokenizes its content, and prints the tokens to the console.
const tokenize = require('glsl-tokenizer/string');
const fs = require('fs');
const glslCode = fs.readFileSync('shader.glsl', 'utf8');
const tokens = tokenize(glslCode);
console.log(tokens);
Handling different types of tokens
This feature demonstrates how to handle different types of tokens produced by the tokenizer. The code sample tokenizes a simple GLSL code snippet and prints the type and data of each token.
const tokenize = require('glsl-tokenizer/string');
const glslCode = 'void main() { gl_FragColor = vec4(1.0); }';
const tokens = tokenize(glslCode);
for (const token of tokens) {
  console.log(`Type: ${token.type}, Data: ${token.data}`);
}
Customizing tokenization
This feature shows how to customize the tokenization process by specifying options. The code sample tokenizes GLSL code with a specified GLSL version.
const tokenize = require('glsl-tokenizer/string');
const glslCode = 'void main() { gl_FragColor = vec4(1.0); }';
const tokens = tokenize(glslCode, { version: '300 es' });
console.log(tokens);
The glsl-parser package is used for parsing GLSL code into an abstract syntax tree (AST). While glsl-tokenizer focuses on breaking down the code into tokens, glsl-parser goes a step further by providing a structured representation of the code. This can be useful for more complex analysis and transformations.
The glsl-transpiler package is designed to transpile GLSL code to other shading languages or JavaScript. It includes tokenization and parsing as part of its process but extends the functionality to code transformation. This makes it more comprehensive than glsl-tokenizer, which focuses solely on tokenization.
The glsl-optimizer package is used for optimizing GLSL code. It includes tokenization and parsing as part of its optimization pipeline. While glsl-tokenizer provides the basic building blocks for breaking down GLSL code, glsl-optimizer focuses on improving the performance and efficiency of the code.
Maps GLSL string data into GLSL tokens, either synchronously or using a streaming API.
var tokenString = require('glsl-tokenizer/string')
var tokenStream = require('glsl-tokenizer/stream')
var fs = require('fs')
// Synchronously:
var tokens = tokenString(fs.readFileSync('some.glsl', 'utf8'))

// Streaming API:
fs.createReadStream('some.glsl')
  .pipe(tokenStream())
  .on('data', function (token) {
    console.log(token.data, token.position, token.type)
  })
tokenString(src, opt): returns an array of tokens given the GLSL source string src. You can specify an opt.version string to use different keywords/builtins, such as '300 es' for WebGL2; otherwise GLSL 100 (WebGL1) is assumed.
var tokens = tokenString(src, {
  version: '300 es'
})
tokenStream(opt): emits a 'data' event with a token object whenever a token is parsed. As above, you can specify opt.version. Token objects have the form:
{ 'type': TOKEN_TYPE
, 'data': "string of constituent data"
, 'position': integer position within the GLSL source
, 'line': line number within the GLSL source
, 'column': column number within the GLSL source }
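As a quick sketch of how the location fields might be used, the line and column of a token make it easy to produce compiler-style diagnostics. The token below is hand-written sample data mirroring the shape described above, not real tokenizer output:

```javascript
// Hypothetical token object mirroring the documented shape.
var token = {
  type: 'ident',
  data: 'gl_FragColor',
  position: 14,
  line: 1,
  column: 15
};

// Format a diagnostic message from a token's source location.
function describe(token) {
  return 'line ' + token.line + ', column ' + token.column +
    ': ' + token.type + ' "' + token.data + '"';
}

console.log(describe(token));
// line 1, column 15: ident "gl_FragColor"
```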
The available token types are:

block-comment: /* ... */
line-comment: // ... \n
preprocessor: # ... \n
operator: Any operator. If it looks like punctuation, it's an operator.
float: Optionally suffixed with f.
ident: User-defined identifier.
builtin: Builtin function.
eof: Emitted on end; data will === '(eof)'.
integer
whitespace
keyword
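A common use of these types is filtering a token array before further processing, for example dropping comments and whitespace. The sample tokens below are hand-written to mirror the tokenizer's output shape, not generated by the library:

```javascript
// Hand-written sample tokens mirroring glsl-tokenizer's output shape.
var tokens = [
  { type: 'keyword', data: 'void' },
  { type: 'whitespace', data: ' ' },
  { type: 'ident', data: 'main' },
  { type: 'line-comment', data: '// entry point\n' },
  { type: 'eof', data: '(eof)' }
];

// Token types that carry no semantic meaning for later passes.
var IGNORED = ['whitespace', 'line-comment', 'block-comment', 'eof'];

// Keep only tokens a parser or analyzer would care about.
function significant(tokens) {
  return tokens.filter(function (t) {
    return IGNORED.indexOf(t.type) === -1;
  });
}

console.log(significant(tokens).map(function (t) { return t.data; }));
// [ 'void', 'main' ]
```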
License: MIT; see LICENSE.md for further information.
The npm package glsl-tokenizer receives a total of 311,517 weekly downloads, which classifies it as a popular package. However, its version release cadence and project activity are not healthy: the last version was released a year ago. The project has 17 open source maintainers collaborating on it.