pex-renderer v3

Physically based renderer (PBR) and scene graph for PEX.

This is an experimental API and it's likely to change in the future.

Key dependencies:
  • pex-context: modern WebGL wrapper (buffers, textures, pipelines, commands, etc.)
  • pex-math: array-based math (vec3, mat4, quat, etc.)


Usage

PEX Renderer v3 is currently in beta. You can install the latest version via npm:

npm i pex-renderer@next

This will install v3 with the beta release number after the dash, e.g. pex-renderer@3.0.0-4.

PEX Renderer is a CommonJS module and you will need a bundler (e.g. Browserify) to run it in the browser.
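
For example, a minimal setup might look like this (a sketch assuming an entry file named index.js; any CommonJS-aware bundler will do):

# install the renderer and a bundler (browserify used here as an example)
npm i pex-renderer@next pex-context browserify

# bundle your entry file for the browser
npx browserify index.js -o bundle.js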

Examples

Open live examples here.

const createContext = require('pex-context')
const createRenderer = require('pex-renderer')
const createSphere = require('primitive-sphere')

const ctx = createContext({ width: 800, height: 600 })

const renderer = createRenderer({
  ctx: ctx
})

const camera = renderer.entity([
  renderer.transform({ position: [0, 0, 3] }),
  renderer.camera({
    fov: Math.PI / 2,
    aspect: ctx.gl.drawingBufferWidth / ctx.gl.drawingBufferHeight,
    near: 0.1,
    far: 100
  })
])
renderer.add(camera)

const sphere = renderer.entity([
  renderer.transform({ position: [0, 0, 0] }),
  renderer.geometry(createSphere(1)),
  renderer.material({
    baseColor: [1, 0, 0, 1]
  })
])
renderer.add(sphere)

const skybox = renderer.entity([
  renderer.skybox({
    sunPosition: [1, 1, 1]
  })
])
renderer.add(skybox)

const reflectionProbe = renderer.entity([renderer.reflectionProbe()])
renderer.add(reflectionProbe)

ctx.frame(() => {
  renderer.draw()
})

You can find runnable examples in the /examples folder of this repository. To run an example, install Node.js, clone or download this repository, and then:

# go to the example folder
cd examples

# install dependencies
npm install

# start a webpack-dev-server to run all the examples
npm start

API

Renderer

Main class responsible for managing scene hierarchy and rendering. You add your entities to the renderer and call draw every frame.

Note: PEX Renderer doesn't currently have a concept of a scene. This can be simulated by creating multiple root entities with their own scene hierarchies and adding / removing them as necessary.
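
For example, a minimal sketch of that approach (sceneA and sceneB are hypothetical root entities):

// each "scene" is just a root entity with its own children
const sceneA = renderer.entity([renderer.transform()])
const sceneB = renderer.entity([renderer.transform()])

// show scene A
renderer.add(sceneA)

// later: switch to scene B by swapping root entities
renderer.remove(sceneA)
renderer.add(sceneB)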

renderer = createRenderer(opts)
const createRenderer = require('pex-renderer')
const renderer = createRenderer({
  ctx,
  shadowQuality: 2,
  rgbm: false,
  profile: false,
  pauseOnBlur: true
})
  • renderer.paused
  • renderer.profiler
property | info | type | default
ctx * | rendering context | pex-context.Context | null
shadowQuality | shadow smoothness | Integer 0-4 | 2
rgbm | use RGBM color packing for the rendering pipeline | Boolean | false
profile | enable profiling | Boolean | false
pauseOnBlur | stop rendering when the window loses focus | Boolean | false
entities ** | list of entities in the scene | Array of Entity | []

* required   ** read only

renderer.draw()
function frame() {
  renderer.draw()
  requestAnimationFrame(frame)
}

requestAnimationFrame(frame)

// or using built-in frame() from pex-context
ctx.frame(() => {
  renderer.draw()
})

Updates transforms, shadow maps, reflection probes, materials and shaders, renders the scene and applies post-processing. Should be called every frame.

Entities

Entities are collections of components representing objects in the scene graph.

NOTE: In its current form PEX Renderer doesn't implement an Entity-Component-System architecture. Components are self-contained and fully functional, not merely buckets of data to be processed by a collection of systems. In that regard it's comparable to Unity and its GameObject and MonoBehaviour implementation.

entity = renderer.entity(components, tags)

Creates an entity from a list of components.

  • components: Array of Component - list of components that the entity is made of
  • tags - Array of String - list of tags

Note: entities are not added to the scene graph automatically.

Note on tagging: the Camera component also accepts tags. Only entities matching one or more of the camera's tags will be rendered. If the camera doesn't have any tags, only untagged entities will be rendered.

const entity = renderer.entity(
  [
    renderer.transform({ position: [0, 1, 0] }),
    renderer.geometry({ positions: [], normals: [], cells: [] }),
    renderer.material({ baseColor: [1, 0, 0, 1] })
  ],
  ['opaque', 'debug-only']
)
entity = renderer.add(entity, parent)

Adds an entity to the scene graph and attaches it to a parent as a child.
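
A short sketch of building a hierarchy this way (entity names are illustrative):

const parent = renderer.entity([renderer.transform({ position: [0, 1, 0] })])
renderer.add(parent)

// the child's position is relative to its parent's transform
const child = renderer.entity([renderer.transform({ position: [0.5, 0, 0] })])
renderer.add(child, parent)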

renderer.remove(entity)

Removes entity from the scene graph.

entity.addComponent(component)

Adds component to an entity.

component = entity.getComponent(type)
const entity = renderer.entity([renderer.pointLight()])
entity.getComponent('PointLight')

Gets a component by its class name.

  • type - upper camel case name of the component class
entity.dispose()

Removes the entity from the scene and disposes of all its components and their resources.

Components

Components are bits of functionality (transform, light type, geometry, material etc) that are added to an entity.

Properties shared by all components:
property | info | type | default
type * | component class name | String | ''
entity * | entity the component is attached to | Entity | null
changed * | event emitted whenever a component property changes | Signal | null

* read only

Observing component changes
const entity = renderer.entity([renderer.transform()])
function onParamChange(name) {
  console.log(`param ${name} has changed`)
}

// start listening
entity.transform.changed.add(onParamChange)

// done internally by the transform whenever position changes
entity.transform.dispatch('position')

// stop listening
entity.transform.changed.remove(onParamChange)
Update components
transformComponent.set({
  position: [Math.cos(time), 0, 0]
})
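
For example, to animate a transform every frame (a sketch; assumes an entity named sphere with a transform component, as in the earlier example):

const startTime = Date.now()

ctx.frame(() => {
  const time = (Date.now() - startTime) / 1000
  sphere.transform.set({
    position: [Math.cos(time), 0, 0]
  })
  renderer.draw()
})
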
component.dispose()

Scene Components

transform = renderer.transform(opts)

const transform = renderer.transform({
  position: [0, 0, 0],
  scale: [1, 1, 1],
  rotation: [0, 0, 0, 1]
})
property | info | type | default
position | entity position relative to its parent | Vec3 / [x, y, z] | [0, 0, 0]
scale | entity scale relative to its parent | Vec3 / [x, y, z] | [1, 1, 1]
rotation | entity rotation relative to its parent | Quat / [x, y, z, w] | [0, 0, 0, 1]
parent | entity's parent entity | Entity | null
enabled | should the entity be rendered | Boolean | true
children * | child entities | Array of Entity | false
bounds*
worldBounds*
localModelMatrix*
modelMatrix*

* read only

camera = renderer.camera(opts)

Defines rendering viewport and projection.

Note: camera position/rotation are derived from entity.transform.position/rotation. It's probably easier to use the Orbiter component at the moment.

const camera = renderer.camera({
  fov: Math.PI / 4,
  aspect: ctx.gl.drawingBufferWidth / ctx.gl.drawingBufferHeight,
  near: 0.1,
  far: 100
})
property | info | type | default
projection | camera projection type | 'perspective' or 'orthographic' | 'perspective'
viewport | camera viewport | Array [x, y, width, height] | [0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight]
near | near plane distance | Number | 0.1
far | far plane distance | Number | 100
aspect | aspect ratio | Number | 1
exposure | exposure value | Number | 1
fov | perspective vertical field of view (yfov) | Number [rad] | Math.PI / 4 ¹
focalLength | focal length of the camera lens [10mm - 200mm] | Number [mm] | 50
fStop | ratio of camera lens opening, f-number, f/N, aperture [1.2 - 32] | Number | 2.8
sensorSize | physical camera sensor or film size [sensorWidth, sensorHeight] | Vec2 [mm, mm] | [36, 24]
sensorFit | how the camera frame matches the sensor frame | 'vertical', 'horizontal', 'fit' or 'overscan' | 'vertical'
left, right, top, bottom | orthographic frustum bounds | Number | ¹
zoom | orthographic zoom | Number | 1
projectionMatrix*
viewMatrix*
inverseViewMatrix*

* read only   ¹ depends on viewport aspect ratio, focalLength and sensorFit
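
For example, an orthographic camera could be configured roughly like this (a sketch; the bound values are illustrative):

const orthoCamera = renderer.camera({
  projection: 'orthographic',
  left: -2,
  right: 2,
  top: 2,
  bottom: -2,
  near: 0.1,
  far: 100,
  zoom: 1
})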

postProcessing = renderer.postProcessing(opts)

Defines rendering post-processing.

const postProcessing = renderer.postProcessing({
  fxaa: true,
  ssao: true,
  dof: true,
  bloom: true
})
Antialiasing

property | info | type | default
fxaa | FXAA antialiasing on/off | Boolean | false

Screen Space Ambient Occlusion

property | info | type | default
ssao | SSAO on/off | Boolean | false
ssaoIntensity | SSAO shadows | Number | 5
ssaoRadius | SSAO shadows | Number | 12
ssaoBias | SSAO shadows | Number | 0.01
ssaoBlurRadius | SSAO shadows | Number | 2
ssaoBlurSharpness | SSAO shadows | Number | 10

Depth Of Field

property | info | type | default
dof | DoF on/off | Boolean | false
dofFocusDistance | distance to the focus plane | Number [meters] | 5

Bloom

property | info | type | default
bloom | Bloom on/off | Boolean | false
bloomRadius | amount of bloom blur | Number | 1
bloomThreshold | bloom color cut-off (default 1 = only "hdr" colors will bloom) | Number | 1
bloomIntensity | amount of bloom to add to the scene | Number | 0.1
Fog

TODO: fog, fogColor, fogStart, fogDensity, inscatteringCoeffs, sunPosition, sunColor, sunDispertion, sunIntensity
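
As with any component, these post-processing parameters can also be changed at runtime with set() (the values below are illustrative):

postProcessing.set({
  ssaoRadius: 6,
  bloomThreshold: 0.8,
  dofFocusDistance: 3
})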

orbiter = renderer.orbiter(opts)

Orbiter controller for camera component.

Note: the orbiter doesn't actually modify the camera but the entity's transform; therefore both the Orbiter and Camera components should be attached to the same entity.

const orbiter = renderer.orbiter({
  target: [0, 0, 0],
  position: [1, 1, 1],
  lat: 0,
  lon: Math.PI / 2,
  easing: 0.1
})
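
Following the note above, a minimal camera entity with an orbiter might look like this (a sketch):

const cameraEntity = renderer.entity([
  renderer.transform({ position: [2, 2, 2] }),
  renderer.camera({
    fov: Math.PI / 3,
    aspect: ctx.gl.drawingBufferWidth / ctx.gl.drawingBufferHeight
  }),
  renderer.orbiter({ position: [2, 2, 2] })
])
renderer.add(cameraEntity)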

overlay = renderer.overlay(opts)

Flat 2D overlay, useful for text and logos.

const overlay = renderer.overlay({
  x: 0,
  y: 0,
  width: 1,
  height: 1,
  texture: ctx.Texture
})

Geometry Components

geometry = renderer.geometry(opts)

Represents 3d mesh geometry attributes.

const geometry = renderer.geometry({
  positions: [[0, 0, 1], [1, 2, 3], ...[]],
  normals: [[0, 0, 1], [0, 0, 1], ...[]],
  uvs: [[0, 0], [0, 1], ...[]],
  indices: [[0, 1, 2], [3, 4, 5], ...[]],
  offsets: { data: [[0, 0, 0], [0, 1, 0], ...[]], divisor: 1 }
})
property | info | type | default
positions | vertex positions | Array of Vec3 [x, y, z] | null
normals | vertex normals | Array of Vec3 [x, y, z] | null
texCoords | vertex tex coords | Array of Vec2 [u, v] | null
uvs ¹ | alias of texCoords | Array of Vec2 [u, v] | null
colors | vertex colors | Array of Vec4 [r, g, b, a] | null
indices | indices | Array of Vec3 | null
cells ¹ | geometry faces | Array of Vec3 of Int [i, j, k] | null
offsets ² | instance offsets | Array of Vec3 [x, y, z] | null
rotations ² | instance rotations | Array of Quat/Vec4 [x, y, z, w] | null
scales ² | instance scales | Array of Vec3 [x, y, z] | null
tints ² | instance tints | Array of Color/Vec4 [r, g, b, a] | null

¹ write-only aliases: uvs data will be stored in texCoords, cells data will be stored in indices

² these attributes are always instanced and need to be defined with a divisor; additionally, the number of instances needs to be specified:

const offsets = [[x, y, z], ...[]]
const g = renderer.geometry({
  positions: [[x, y, z], ...[]],
  offsets: { data: offsets, divisor: 1 },
  instances: offsets.length
})

material = renderer.material(opts)

Physically based material description. Defaults to a metallic/roughness workflow but can also use a specular/glossiness workflow or even be unlit.

const material = renderer.material({
  baseColor: [1, 1, 1, 1],
  emissiveColor: [0, 0, 0, 1],
  metallic: 0.8,
  roughness: 0.2,
  castShadows: false,
  receiveShadows: false,
  alphaTest: 0.5,
  alphaMap: ctx.Texture2D
})
property | info | type | default
baseColor | albedo | Color/Vec4 [r, g, b, a] | [1, 1, 1, 1]
baseColorMap | base color texture, multiplied by baseColor | ctx.Texture or TextureMap | null
unlit | no lighting / shadowing, uses baseColor only | Boolean | false
metallic | metallic factor, used if no metallicMap is provided | Number | 1
metallicMap | metallic texture, used if no metallicRoughnessMap is provided | ctx.Texture or TextureMap | null
roughness | roughness factor, used if no roughnessMap is provided | Number | 1
roughnessMap | roughness texture, used if no metallicRoughnessMap is provided | ctx.Texture or TextureMap | null
metallicRoughnessMap | metallic (b channel) and roughness (g channel) combined in one texture | ctx.Texture or TextureMap | null
useSpecularGlossinessWorkflow | use a specular/glossiness PBR workflow instead of metallic/roughness | Boolean | false
diffuse | diffuse color, used if no diffuseMap is provided | Color/Vec4 [r, g, b, a] | 1
diffuseMap | diffuse color texture | ctx.Texture or TextureMap | null
specular | specular color, used if no specularGlossinessMap is provided | Color/Vec3 [r, g, b] | 1
glossiness | glossiness or smoothness, used if no specularGlossinessMap is provided | Number | 1
specularGlossinessMap | specular and glossiness combined in one texture | ctx.Texture or TextureMap | null
normalMap | normal texture, doesn't modify vertex positions, only impacts lighting | ctx.Texture or TextureMap | null
normalScale | normal factor, controls how much the normalMap affects lighting | Number | 1
displacementMap | displacement texture, modifies vertex positions (r channel) | ctx.Texture or TextureMap | null
displacement | displacement factor, controls how much the displacementMap affects vertices | Number | 0
emissiveColor | light emitted | Color/Vec4 [r, g, b, a] | null
emissiveIntensity | emissive factor | Number | 1
emissiveColorMap | emissive color texture, multiplied by emissiveColor and emissiveIntensity | ctx.Texture or TextureMap | null
occlusionMap | occlusion texture, indicates areas of indirect lighting | ctx.Texture or TextureMap | null
reflectance | controls specular intensity on non-metallic surfaces | Number 0-1 | 0.5
clearCoat | strength of the clear coat layer | Number 0-1 | null
clearCoatRoughness | roughness of the clear coat layer | Number 0-1 | null
clearCoatNormalMap | normal texture for the clear coat layer | ctx.Texture or TextureMap | null
clearCoatNormalMapScale | clear coat normal factor | Number | 1
alphaMap | alpha texture, impacts opacity (r channel) | ctx.Texture or TextureMap | null
alphaTest | value against which to test alpha | Number 0-1 | true
depthWrite | depth write mask | Boolean | true
depthTest | depth test on/off | Boolean | true
depthFunc | depth test function | ctx.DepthFunc | ctx.DepthFunc.LessEqual
blend | blending on/off | Boolean | false
blendSrcRGBFactor | blending source color factor | ctx.BlendFactor | ctx.BlendFactor.One
blendSrcAlphaFactor | blending source alpha factor | ctx.BlendFactor | ctx.BlendFactor.One
blendDstRGBFactor | blending destination color factor | ctx.BlendFactor | ctx.BlendFactor.One
blendDstAlphaFactor | blending destination alpha factor | ctx.BlendFactor | ctx.BlendFactor.One
cullFace | face culling on/off | Boolean | false
cullFaceMode | face culling mode | ctx.Face | ctx.Face.Back
pointSize | sets gl_PointSize for ctx.Primitive.Points | Number | 1
castShadows | cast shadows from this object | Boolean | false
receiveShadows | receive potential shadowing | Boolean | false

Texture transforms are achieved by optionally passing a TextureMap object with offset, rotation and/or scale alongside the texture itself: { texture: ctx.Texture, offset?: Vec2 [x, y], rotation?: Radians, scale?: Vec2 [x, y] }

The reflectance value represents a remapping of a percentage of reflectance (with a default of 4%: 0.16 * pow(0.5, 2) = 0.04) and replaces an explicit index of refraction (IOR).
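
As an example, a semi-transparent material could combine the blending flags above (a sketch; the factor choices are illustrative):

const transparentMaterial = renderer.material({
  baseColor: [1, 1, 1, 0.5],
  metallic: 0,
  roughness: 0.2,
  blend: true,
  blendSrcRGBFactor: ctx.BlendFactor.SrcAlpha,
  blendSrcAlphaFactor: ctx.BlendFactor.One,
  blendDstRGBFactor: ctx.BlendFactor.OneMinusSrcAlpha,
  blendDstAlphaFactor: ctx.BlendFactor.One,
  depthWrite: false
})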

animation = renderer.animation(opts)

Geometry attribute animations based on glTF 2.0 Spec / Animations.

const animation = renderer.animation({
  channels: [], // Array of Channels
  autoplay: true,
  loop: true
})
// TODO
// const Channel = {
//   input: null,
//   output: null,
//   interpolation: null,
//   target: null,
//   path: null,
// }

morph = renderer.morph(opts)

Geometry morph targets based on glTF 2.0 Spec / Morph Targets.

const morph = renderer.morph({
  sources: { positions, normals, tangents, ...attributes },
  targets: { positions, normals, tangents, ...attributes },
  weights: [0.0, 0.0, ...weights]
})
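
Morph weights can then be driven at runtime via set(), e.g. from an animation loop (a sketch):

morph.set({ weights: [0.25, 0.75] })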

skin = renderer.skin(opts)

Geometry vertex skin based on glTF 2.0 Spec / Skin.

const skin = renderer.skin({
  joints: [entity, entity, ...entities],
  inverseBindMatrices: [mat4, mat4, ...mat4]
})

Lighting Components

Components representing light sources used for rendering of the scene.

Note on position and orientation of lights: as with the camera, a light's position and orientation are controlled via the transform component of the entity the light is attached to.

const { quat } = require('pex-math')

const directionalLightEntity = renderer.entity([
  renderer.transform({
    rotation: quat.fromAxisAngle(quat.create(), [0, 0, 1], Math.PI / 2)
  }),
  renderer.directionalLight({
    color: [1, 1, 1, 1],
    intensity: 1,
    castShadows: true
  })
])

ambientLight = renderer.ambientLight(opts)

const ambientLight = renderer.ambientLight({
  color: [1, 1, 1, 1],
  intensity: 1
})

directionalLight = renderer.directionalLight(opts)

const directionalLight = renderer.directionalLight({
  color: [1, 1, 1, 1],
  intensity: 1,
  castShadows: true
})

Note: directionalLight direction is derived from entity.transform.rotation

areaLight = renderer.areaLight(opts)

Rectangular area light.

const areaLight = renderer.areaLight({
  color: [1, 1, 1, 1],
  intensity: 1
})

Note: areaLight position/rotation/size are derived from entity.transform.position/rotation/scale

spotLight = renderer.spotLight(opts)

const spotLight = renderer.spotLight({
  color: [1, 1, 1, 1],
  intensity: 1
})

Note: spotLight direction is derived from entity.transform.rotation

skybox = renderer.skybox(opts)

const skybox = renderer.skybox({
  sunPosition: [1, 1, 1], // sky gradient used for reflections
  texture: ctx.texture2D(), // used for reflections instead of the sky
  backgroundTexture: ctx.texture2D(), // used for background rendering, not reflections
  backgroundBlur: 0 // if set to 1, blurs texture for background rendering, not reflections
})

Note: By default a sky background is rendered unless an HDR equirectangular panorama texture is provided. Note: Skybox orientation differs from engine to engine; to update it, set the entity's transform component rotation and set any reflection probe to dirty.

reflectionProbe = renderer.reflectionProbe(opts)

Captures an environment map of the scene for Image Based Lighting (IBL): specular reflections and diffuse irradiance. Currently requires a Skybox component to be present in the scene, as only the Skybox background is captured.

const reflectionProbe = renderer.reflectionProbe({})

Note: Due to the cost of updating and pre-filtering the environment map, the ReflectionProbe is not updated automatically and requires reflectionProbe.set({ dirty: true }) whenever the Skybox changes. The dirty flag is true by default so the Reflection Probe will get updated once on init.
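
For example, after changing the sky at runtime (a sketch using the skybox and reflectionProbe entities from the first example):

skybox.getComponent('Skybox').set({ sunPosition: [0.5, 1, 0.5] })
reflectionProbe.getComponent('ReflectionProbe').set({ dirty: true })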

Loaders

scene = renderer.loadScene(url, opts)

Load a 3D model as a scene: an object containing a root entity hierarchy that you can add to the renderer like any other entity.

const scene = await renderer.loadScene('model.gltf')
renderer.add(scene.root)

Note: Currently only glTF is supported (JSON, binary and Embedded).
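
Since scene.root is a regular entity, it can be transformed like any other (a sketch; assumes the loaded root has a transform component):

const scene = await renderer.loadScene('model.gltf')
scene.root.transform.set({ scale: [0.5, 0.5, 0.5] })
renderer.add(scene.root)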

Creating Custom Components

Start by creating a new component class as follows:

// MyComponent.js
const Signal = require('signals')

function MyComponent(opts) {
  this.type = 'MyComponent'
  this.entity = null
  this.numberParameter = 1
  this.stringParameter = 'some text'
  this.changed = new Signal()
  this.dirty = false
  this.set(opts)
}

// this function gets called when the component is added
// to an entity
MyComponent.prototype.init = function(entity) {
  this.entity = entity
}

MyComponent.prototype.set = function(opts) {
  Object.assign(this, opts)
  this.dirty = true
  Object.keys(opts).forEach((prop) => this.changed.dispatch(prop))
}

MyComponent.prototype.update = function() {
  if (!this.dirty) return
  this.dirty = false

  const transform = this.entity.transform
  // do sth with transform

  const geom = this.entity.getComponent('Geometry')
  // do sth with geom
}

// by pex-renderer convention we export a factory function
// instead of the class type
module.exports = function createMyComponent(opts) {
  return new MyComponent(opts)
}

Create an instance of your component and add it to an entity.

const createMyComponent = require('/path/to/MyComponent')

const myComponent = createMyComponent({ numberParameter: 1 })
const entity = renderer.entity([myComponent])

License

MIT, see LICENSE.md for details.
