@twilio/video-processors
Comparing version 2.0.0 to 2.1.0
@@ -0,1 +1,7 @@
2.1.0 (December 12, 2023)
=========================
* Previously, the VideoProcessors SDK failed to compile with TypeScript 5.x. This release contains changes to support TypeScript 5.x.
* Fixed a bug where WebGL2-based VideoProcessors sometimes generated very low output fps, especially on low-powered Intel graphics cards.
2.0.0 (March 21, 2023)
======================
@@ -5,3 +11,3 @@
* The VideoProcessors now work on browsers that do not support `OffscreenCanvas`. With this release, when used with [twilio-video v2.27.0](https://www.npmjs.com/package/twilio-video/v/2.27.0), the Virtual Background feature will work on browsers that support [WebGL2](https://developer.mozilla.org/en-US/docs/Web/API/WebGL2RenderingContext). See [VideoTrack.addProcessor](https://sdk.twilio.com/js/video/releases/2.27.0/docs/VideoTrack.html#addProcessor__anchor) for details.
* On Chrome, our tests show up to a 30% reduction in CPU usage when WebGL2 is used as opposed to Canvas2D.
* On Chrome, our tests with 640x480 VideoTracks show up to a 30% reduction in CPU usage when WebGL2 is used as opposed to Canvas2D. Higher resolutions degrade performance as compared to Canvas2D. While we work to support higher resolutions in future releases, we strongly recommend that you set the maximum resolution to 640x480 for WebGL2, or use Canvas2D instead.
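The guidance above ties the pipeline choice to the capture resolution. A minimal sketch of that decision rule follows; `choosePipeline` is a hypothetical helper (not an SDK API), and the returned strings merely mirror the SDK's `Pipeline` names:

```javascript
// Hypothetical decision helper mirroring the guidance above: prefer WebGL2
// only when it is available and the capture area is within the recommended
// 640x480 budget; otherwise fall back to Canvas2D.
// choosePipeline is illustrative, not part of the SDK.
function choosePipeline(captureWidth, captureHeight, webgl2Available) {
  const withinRecommendedBudget = captureWidth * captureHeight <= 640 * 480;
  return webgl2Available && withinRecommendedBudget ? 'WebGL2' : 'Canvas2D';
}

console.log(choosePipeline(640, 480, true));  // 'WebGL2'
console.log(choosePipeline(1280, 720, true)); // 'Canvas2D': above the budget
```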
@@ -8,0 +14,0 @@ #### API Changes
# Common Issues
* On Chrome, resolutions higher than 640x480 degrade performance as compared to Canvas2D. While we work to support higher resolutions in future releases, we strongly recommend that you set the maximum resolution to 640x480 for WebGL2, or use Canvas2D instead.
* Video Processor execution will result in a significant increase in CPU usage.
* Precision of the segmentation mask can be poor under certain conditions, such as uneven lighting and increased body movement.
* Currently, desktop Safari and iOS browsers do not support [WebAssembly SIMD](https://v8.dev/features/simd). It is recommended to use camera input dimensions of 640x480 or lower to maintain an acceptable frame rate.
* Currently, the SDK [throws](https://github.com/twilio/twilio-video.js/issues/1629) a TypeScript error at compile time when used in projects with TypeScript versions in the range 4.4.2 - 4.8.4. You can work around this by following this [suggestion](https://github.com/twilio/twilio-video.js/issues/1629#issuecomment-1701877481).
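The 640x480 cap recommended above can be enforced before creating a track. A sketch of the aspect-preserving downscale, assuming nothing beyond the recommendation itself; `capResolution` is a hypothetical helper, not an SDK API:

```javascript
// Hypothetical helper: scale capture dimensions down to the recommended
// 640x480 budget while preserving aspect ratio. The 640x480 cap comes from
// the note above; the math here is my own illustration.
function capResolution(width, height, maxWidth = 640, maxHeight = 480) {
  const scale = Math.min(1, maxWidth / width, maxHeight / height);
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}

console.log(capResolution(1280, 720)); // 720p scales by 0.5 -> 640x360
console.log(capResolution(320, 240));  // already within budget -> unchanged
```

The result could then be passed as `width`/`height` constraints when creating the local video track.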
@@ -28,9 +28,4 @@ "use strict";
window.Twilio = window.Twilio || {};
window.Twilio.VideoProcessors = __assign(__assign({}, window.Twilio.VideoProcessors), { GaussianBlurBackgroundProcessor: GaussianBlurBackgroundProcessor_1.GaussianBlurBackgroundProcessor,
ImageFit: types_1.ImageFit,
Pipeline: types_1.Pipeline,
isSupported: support_1.isSupported,
version: version_1.version,
VirtualBackgroundProcessor: VirtualBackgroundProcessor_1.VirtualBackgroundProcessor });
window.Twilio.VideoProcessors = __assign(__assign({}, window.Twilio.VideoProcessors), { GaussianBlurBackgroundProcessor: GaussianBlurBackgroundProcessor_1.GaussianBlurBackgroundProcessor, ImageFit: types_1.ImageFit, Pipeline: types_1.Pipeline, isSupported: support_1.isSupported, version: version_1.version, VirtualBackgroundProcessor: VirtualBackgroundProcessor_1.VirtualBackgroundProcessor });
}
//# sourceMappingURL=index.js.map
@@ -10,2 +10,4 @@ "use strict";
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
@@ -31,3 +33,3 @@ function __() { this.constructor = d; }
if (f) throw new TypeError("Generator is already executing.");
while (_) try {
while (g && (g = 0, op[0] && (_ = 0)), _) try {
if (f = 1, y && (t = op[0] & 2 ? y["return"] : op[0] ? y["throw"] || ((t = y["return"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;
@@ -122,3 +124,3 @@ if (y = 0, t) op = [op[0] & 2, t.value];
if (typeof radius !== 'number' || radius < 0) {
console.warn("Valid mask blur radius not found. Using " + constants_1.MASK_BLUR_RADIUS + " as default.");
console.warn("Valid mask blur radius not found. Using ".concat(constants_1.MASK_BLUR_RADIUS, " as default."));
radius = constants_1.MASK_BLUR_RADIUS;
@@ -241,3 +243,3 @@ }
ctx.save();
ctx.filter = "blur(" + this._maskBlurRadius + "px)";
ctx.filter = "blur(".concat(this._maskBlurRadius, "px)");
ctx.globalCompositeOperation = 'copy';
@@ -308,7 +310,7 @@ ctx.drawImage(this._maskCanvas, 0, 0, captureWidth, captureHeight);
BackgroundProcessor.prototype._createWebGL2Pipeline = function (inputFrame, captureWidth, captureHeight, inferenceWidth, inferenceHeight) {
this._webgl2Pipeline = webgl2_1.buildWebGL2Pipeline({
this._webgl2Pipeline = (0, webgl2_1.buildWebGL2Pipeline)({
htmlElement: inputFrame,
width: captureWidth,
height: captureHeight,
}, this._backgroundImage, { type: this._getWebGL2PipelineType() }, { inputResolution: inferenceWidth + "x" + inferenceHeight }, this._outputCanvas, this._tflite, this._benchmark, this._debounce);
}, this._backgroundImage, { type: this._getWebGL2PipelineType() }, { inputResolution: "".concat(inferenceWidth, "x").concat(inferenceHeight) }, this._outputCanvas, this._tflite, this._benchmark, this._debounce);
this._webgl2Pipeline.updatePostProcessingConfig({
@@ -315,0 +317,0 @@ smoothSegmentationMask: true,
@@ -39,3 +39,4 @@ import { BackgroundProcessor, BackgroundProcessorOptions } from './BackgroundProcessor';
* // especially on browsers that do not support SIMD
* // such as desktop Safari and iOS browsers
* // such as desktop Safari and iOS browsers, or on Chrome
* // with capture resolutions above 640x480 for webgl2.
* width: 640,
@@ -42,0 +43,0 @@ * height: 480,
@@ -10,2 +10,4 @@ "use strict";
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
@@ -44,3 +46,4 @@ function __() { this.constructor = d; }
* // especially on browsers that do not support SIMD
* // such as desktop Safari and iOS browsers
* // such as desktop Safari and iOS browsers, or on Chrome
* // with capture resolutions above 640x480 for webgl2.
* width: 640,
@@ -87,3 +90,3 @@ * height: 480,
if (!radius) {
console.warn("Valid blur filter radius not found. Using " + constants_1.BLUR_FILTER_RADIUS + " as default.");
console.warn("Valid blur filter radius not found. Using ".concat(constants_1.BLUR_FILTER_RADIUS, " as default."));
radius = constants_1.BLUR_FILTER_RADIUS;
@@ -104,3 +107,3 @@ }
var ctx = this._outputContext;
ctx.filter = "blur(" + this._blurFilterRadius + "px)";
ctx.filter = "blur(".concat(this._blurFilterRadius, "px)");
ctx.drawImage(inputFrame, 0, 0);
@@ -107,0 +110,0 @@ };
@@ -15,3 +15,4 @@ import { BackgroundProcessor, BackgroundProcessorOptions } from './BackgroundProcessor';
/**
* The [[ImageFit]] to use for positioning of the background image in the viewport.
* The [[ImageFit]] to use for positioning of the background image in the viewport. Only the Canvas2D [[Pipeline]]
* supports this option. WebGL2 ignores this option and falls back to Cover.
* @default
@@ -55,3 +56,4 @@ * ```html
* // especially on browsers that do not support SIMD
* // such as desktop Safari and iOS browsers
* // such as desktop Safari and iOS browsers, or on Chrome
* // with capture resolutions above 640x480 for webgl2.
* width: 640,
@@ -58,0 +60,0 @@ * height: 480,
@@ -10,2 +10,4 @@ "use strict";
return function (d, b) {
if (typeof b !== "function" && b !== null)
throw new TypeError("Class extends value " + String(b) + " is not a constructor or null");
extendStatics(d, b);
@@ -51,3 +53,4 @@ function __() { this.constructor = d; }
* // especially on browsers that do not support SIMD
* // such as desktop Safari and iOS browsers
* // such as desktop Safari and iOS browsers, or on Chrome
* // with capture resolutions above 640x480 for webgl2.
* width: 640,
@@ -123,3 +126,3 @@ * height: 480,
if (!validTypes.includes(fitType)) {
console.warn("Valid fitType not found. Using '" + types_1.ImageFit.Fill + "' as default.");
console.warn("Valid fitType not found. Using '".concat(types_1.ImageFit.Fill, "' as default."));
fitType = types_1.ImageFit.Fill;
@@ -177,3 +180,4 @@ }
return {
x: x, y: y,
x: x,
y: y,
w: newContentWidth,
@@ -180,0 +184,0 @@ h: newContentHeight,
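The `x`/`y`/`w`/`h` object returned above positions the background image for the configured [[ImageFit]]. A simplified sketch of the 'Contain' vs 'Cover' placement math follows; this is my own illustration under assumed object-fit semantics, not the SDK's exact implementation:

```javascript
// Illustrative sketch of ImageFit 'Contain' vs 'Cover' placement math
// (a simplification, not the SDK's exact code). 'Contain' scales so the
// whole image is visible inside the viewport; 'Cover' scales so the
// viewport is fully covered, cropping the overflow. Either way the scaled
// content is centered, which can make x or y negative for 'Cover'.
function fitContent(fitType, viewport, content) {
  const viewportRatio = viewport.width / viewport.height;
  const contentRatio = content.width / content.height;
  const scaleByWidth = fitType === 'Contain'
    ? contentRatio >= viewportRatio
    : contentRatio < viewportRatio;
  const w = scaleByWidth ? viewport.width : viewport.height * contentRatio;
  const h = scaleByWidth ? viewport.width / contentRatio : viewport.height;
  return { x: (viewport.width - w) / 2, y: (viewport.height - h) / 2, w, h };
}

// A 16:9 image in a 4:3 viewport: 'Contain' letterboxes, 'Cover' crops.
console.log(fitContent('Contain', { width: 640, height: 480 }, { width: 1920, height: 1080 }));
console.log(fitContent('Cover', { width: 640, height: 480 }, { width: 1920, height: 1080 }));
```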
@@ -1,4 +0,4 @@
export declare type BackgroundConfig = {
export type BackgroundConfig = {
type: 'none' | 'blur' | 'image';
url?: string;
};
@@ -1,3 +0,3 @@
export declare type BlendMode = 'screen' | 'linearDodge';
export declare type PostProcessingConfig = {
export type BlendMode = 'screen' | 'linearDodge';
export type PostProcessingConfig = {
smoothSegmentationMask: boolean;
@@ -9,5 +9,5 @@ jointBilateralFilter: JointBilateralFilterConfig;
};
export declare type JointBilateralFilterConfig = {
export type JointBilateralFilterConfig = {
sigmaSpace: number;
sigmaColor: number;
};
import { PostProcessingConfig } from './postProcessingHelper';
export declare type RenderingPipeline = {
export type RenderingPipeline = {
render(): Promise<void>;
@@ -4,0 +4,0 @@ updatePostProcessingConfig(newPostProcessingConfig: PostProcessingConfig): void;
@@ -1,7 +0,7 @@
export declare type InputResolution = '640x360' | '256x256' | '256x144' | '160x96' | string;
export type InputResolution = '640x360' | '256x256' | '256x144' | '160x96' | string;
export declare const inputResolutions: {
[resolution in InputResolution]: [number, number];
};
export declare type SegmentationConfig = {
export type SegmentationConfig = {
inputResolution: InputResolution;
};
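`InputResolution` values are 'WxH' strings, the same shape as the `inputResolution` string built in `_createWebGL2Pipeline` above. A hypothetical parser back to the `[number, number]` pair shape of the `inputResolutions` table entries; `parseInputResolution` is not an SDK export:

```javascript
// Hypothetical parser for 'WxH' InputResolution strings like '640x360'.
// Illustrative only; the SDK resolves these via its own inputResolutions map.
function parseInputResolution(resolution) {
  const match = /^(\d+)x(\d+)$/.exec(resolution);
  if (!match) {
    throw new Error('Invalid input resolution: ' + resolution);
  }
  return [Number(match[1]), Number(match[2])];
}

console.log(parseInputResolution('256x144')); // [256, 144]
```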
@@ -1,6 +0,6 @@
export declare type SourceConfig = {
export type SourceConfig = {
type: 'image' | 'video' | 'camera';
url?: string;
};
export declare type SourcePlayback = {
export type SourcePlayback = {
htmlElement: HTMLImageElement | HTMLVideoElement;
@@ -7,0 +7,0 @@ width: number;
@@ -10,7 +10,9 @@ /**
*/
export declare const glsl: (template: TemplateStringsArray, ...substitutions: any[]) => string;
export declare const glsl: (template: {
raw: ArrayLike<string> | readonly string[];
}, ...substitutions: any[]) => string;
export declare function createPiplelineStageProgram(gl: WebGL2RenderingContext, vertexShader: WebGLShader, fragmentShader: WebGLShader, positionBuffer: WebGLBuffer, texCoordBuffer: WebGLBuffer): WebGLProgram;
export declare function createProgram(gl: WebGL2RenderingContext, vertexShader: WebGLShader, fragmentShader: WebGLShader): WebGLProgram;
export declare function compileShader(gl: WebGL2RenderingContext, shaderType: number, shaderSource: string): WebGLShader;
export declare function createTexture(gl: WebGL2RenderingContext, internalformat: number, width: number, height: number, minFilter?: number, magFilter?: number): WebGLTexture | null;
export declare function createTexture(gl: WebGL2RenderingContext, internalformat: number, width: number, height: number, minFilter?: GLint, magFilter?: GLint): WebGLTexture | null;
export declare function readPixelsAsync(gl: WebGL2RenderingContext, x: number, y: number, width: number, height: number, format: number, type: number, dest: ArrayBufferView): Promise<ArrayBufferView>;
@@ -17,3 +17,3 @@ "use strict";
if (f) throw new TypeError("Generator is already executing.");
while (_) try {
while (g && (g = 0, op[0] && (_ = 0)), _) try {
if (f = 1, y && (t = op[0] & 2 ? y["return"] : op[0] ? y["throw"] || ((t = y["return"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;
@@ -70,3 +70,3 @@ if (y = 0, t) op = [op[0] & 2, t.value];
if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
throw new Error("Could not link WebGL program: " + gl.getProgramInfoLog(program));
throw new Error("Could not link WebGL program: ".concat(gl.getProgramInfoLog(program)));
}
@@ -81,3 +81,3 @@ return program;
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
throw new Error("Could not compile shader: " + gl.getShaderInfoLog(shader));
throw new Error("Could not compile shader: ".concat(gl.getShaderInfoLog(shader)));
}
@@ -84,0 +84,0 @@ return shader;
@@ -1,2 +0,2 @@
export declare type BackgroundBlurStage = {
export type BackgroundBlurStage = {
render(): void;
@@ -3,0 +3,0 @@ updateCoverage(coverage: [number, number]): void;
@@ -31,3 +31,3 @@ "use strict";
function buildBlurPass(gl, vertexShader, positionBuffer, texCoordBuffer, personMaskTexture, canvas) {
var fragmentShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform vec2 u_texelSize;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n const float offset[5] = float[](0.0, 1.0, 2.0, 3.0, 4.0);\n const float weight[5] = float[](0.2270270270, 0.1945945946, 0.1216216216,\n 0.0540540541, 0.0162162162);\n\n void main() {\n vec4 centerColor = texture(u_inputFrame, v_texCoord);\n float personMask = texture(u_personMask, v_texCoord).a;\n\n vec4 frameColor = centerColor * weight[0] * (1.0 - personMask);\n\n for (int i = 1; i < 5; i++) {\n vec2 offset = vec2(offset[i]) * u_texelSize;\n\n vec2 texCoord = v_texCoord + offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n\n texCoord = v_texCoord - offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n }\n outColor = vec4(frameColor.rgb + (1.0 - frameColor.a) * centerColor.rgb, 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform vec2 u_texelSize;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n const float offset[5] = float[](0.0, 1.0, 2.0, 3.0, 4.0);\n const float weight[5] = float[](0.2270270270, 0.1945945946, 0.1216216216,\n 0.0540540541, 0.0162162162);\n\n void main() {\n vec4 centerColor = texture(u_inputFrame, v_texCoord);\n float personMask = texture(u_personMask, v_texCoord).a;\n\n vec4 frameColor = centerColor * weight[0] * (1.0 - personMask);\n\n for (int i = 1; i < 5; i++) {\n vec2 offset = vec2(offset[i]) * u_texelSize;\n\n vec2 texCoord = v_texCoord + offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n\n texCoord = v_texCoord - offset;\n frameColor += texture(u_inputFrame, 
texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n }\n outColor = vec4(frameColor.rgb + (1.0 - frameColor.a) * centerColor.rgb, 1.0);\n }\n "]))); | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform vec2 u_texelSize;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n const float offset[5] = float[](0.0, 1.0, 2.0, 3.0, 4.0);\n const float weight[5] = float[](0.2270270270, 0.1945945946, 0.1216216216,\n 0.0540540541, 0.0162162162);\n\n void main() {\n vec4 centerColor = texture(u_inputFrame, v_texCoord);\n float personMask = texture(u_personMask, v_texCoord).a;\n\n vec4 frameColor = centerColor * weight[0] * (1.0 - personMask);\n\n for (int i = 1; i < 5; i++) {\n vec2 offset = vec2(offset[i]) * u_texelSize;\n\n vec2 texCoord = v_texCoord + offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n\n texCoord = v_texCoord - offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n }\n outColor = vec4(frameColor.rgb + (1.0 - frameColor.a) * centerColor.rgb, 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform vec2 u_texelSize;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n const float offset[5] = float[](0.0, 1.0, 2.0, 3.0, 4.0);\n const float weight[5] = float[](0.2270270270, 0.1945945946, 0.1216216216,\n 0.0540540541, 0.0162162162);\n\n void main() {\n vec4 centerColor = texture(u_inputFrame, v_texCoord);\n float personMask = texture(u_personMask, v_texCoord).a;\n\n vec4 frameColor = centerColor * weight[0] * (1.0 - personMask);\n\n for (int i = 1; i < 5; i++) {\n vec2 offset = vec2(offset[i]) * u_texelSize;\n\n vec2 texCoord = v_texCoord + offset;\n frameColor += texture(u_inputFrame, texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n\n texCoord = v_texCoord - offset;\n frameColor += texture(u_inputFrame, 
texCoord) * weight[i] *\n (1.0 - texture(u_personMask, texCoord).a);\n }\n outColor = vec4(frameColor.rgb + (1.0 - frameColor.a) * centerColor.rgb, 1.0);\n }\n "]))); | ||
var scale = 0.5;
@@ -38,9 +38,9 @@ var outputWidth = canvas.width * scale;
var texelHeight = 1 / outputHeight;
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer);
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer);
var inputFrameLocation = gl.getUniformLocation(program, 'u_inputFrame');
var personMaskLocation = gl.getUniformLocation(program, 'u_personMask');
var texelSizeLocation = gl.getUniformLocation(program, 'u_texelSize');
var texture1 = webglHelper_1.createTexture(gl, gl.RGBA8, outputWidth, outputHeight, gl.NEAREST, gl.LINEAR);
var texture2 = webglHelper_1.createTexture(gl, gl.RGBA8, outputWidth, outputHeight, gl.NEAREST, gl.LINEAR);
var texture1 = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, outputWidth, outputHeight, gl.NEAREST, gl.LINEAR);
var texture2 = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, outputWidth, outputHeight, gl.NEAREST, gl.LINEAR);
var frameBuffer1 = gl.createFramebuffer();
@@ -87,8 +87,8 @@ gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer1);
function buildBlendPass(gl, positionBuffer, texCoordBuffer, canvas) {
var vertexShaderSource = webglHelper_1.glsl(templateObject_2 || (templateObject_2 = __makeTemplateObject(["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "], ["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "]))); | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_3 || (templateObject_3 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_blurredInputFrame;\n uniform vec2 u_coverage;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec3 color = texture(u_inputFrame, v_texCoord).rgb;\n vec3 blurredColor = texture(u_blurredInputFrame, v_texCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(mix(blurredColor, color, personMask), 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_blurredInputFrame;\n uniform vec2 u_coverage;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec3 color = texture(u_inputFrame, v_texCoord).rgb;\n vec3 blurredColor = texture(u_blurredInputFrame, v_texCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(mix(blurredColor, color, personMask), 1.0);\n }\n "]))); | ||
var vertexShaderSource = (0, webglHelper_1.glsl)(templateObject_2 || (templateObject_2 = __makeTemplateObject(["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "], ["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "]))); | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_3 || (templateObject_3 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_blurredInputFrame;\n uniform vec2 u_coverage;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec3 color = texture(u_inputFrame, v_texCoord).rgb;\n vec3 blurredColor = texture(u_blurredInputFrame, v_texCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(mix(blurredColor, color, personMask), 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_blurredInputFrame;\n uniform vec2 u_coverage;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec3 color = texture(u_inputFrame, v_texCoord).rgb;\n vec3 blurredColor = texture(u_blurredInputFrame, v_texCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(mix(blurredColor, color, personMask), 1.0);\n }\n "]))); | ||
var outputWidth = canvas.width, outputHeight = canvas.height;
var vertexShader = webglHelper_1.compileShader(gl, gl.VERTEX_SHADER, vertexShaderSource);
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer);
var vertexShader = (0, webglHelper_1.compileShader)(gl, gl.VERTEX_SHADER, vertexShaderSource);
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer);
var inputFrameLocation = gl.getUniformLocation(program, 'u_inputFrame');
@@ -95,0 +95,0 @@ var personMaskLocation = gl.getUniformLocation(program, 'u_personMask');
import { BlendMode } from '../helpers/postProcessingHelper';
export declare type BackgroundImageStage = {
export type BackgroundImageStage = {
render(): void;
@@ -4,0 +4,0 @@ updateCoverage(coverage: [number, number]): void;
@@ -10,9 +10,9 @@ "use strict";
function buildBackgroundImageStage(gl, positionBuffer, texCoordBuffer, personMaskTexture, backgroundImage, canvas) {
var vertexShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n uniform vec2 u_backgroundScale;\n uniform vec2 u_backgroundOffset;\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n out vec2 v_backgroundCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n v_backgroundCoord = a_texCoord * u_backgroundScale + u_backgroundOffset;\n }\n "], ["#version 300 es\n\n uniform vec2 u_backgroundScale;\n uniform vec2 u_backgroundOffset;\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n out vec2 v_backgroundCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n v_backgroundCoord = a_texCoord * u_backgroundScale + u_backgroundOffset;\n }\n "]))); | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_2 || (templateObject_2 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_background;\n uniform vec2 u_coverage;\n uniform float u_lightWrapping;\n uniform float u_blendMode;\n\n in vec2 v_texCoord;\n in vec2 v_backgroundCoord;\n\n out vec4 outColor;\n\n vec3 screen(vec3 a, vec3 b) {\n return 1.0 - (1.0 - a) * (1.0 - b);\n }\n\n vec3 linearDodge(vec3 a, vec3 b) {\n return a + b;\n }\n\n void main() {\n vec3 frameColor = texture(u_inputFrame, v_texCoord).rgb;\n vec3 backgroundColor = texture(u_background, v_backgroundCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n float lightWrapMask = 1.0 - max(0.0, personMask - u_coverage.y) / (1.0 - u_coverage.y);\n vec3 lightWrap = u_lightWrapping * lightWrapMask * backgroundColor;\n frameColor = u_blendMode * linearDodge(frameColor, lightWrap) +\n (1.0 - u_blendMode) * screen(frameColor, lightWrap);\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(frameColor * personMask + backgroundColor * (1.0 - personMask), 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_background;\n uniform vec2 u_coverage;\n uniform float u_lightWrapping;\n uniform float u_blendMode;\n\n in vec2 v_texCoord;\n in vec2 v_backgroundCoord;\n\n out vec4 outColor;\n\n vec3 screen(vec3 a, vec3 b) {\n return 1.0 - (1.0 - a) * (1.0 - b);\n }\n\n vec3 linearDodge(vec3 a, vec3 b) {\n return a + b;\n }\n\n void main() {\n vec3 frameColor = texture(u_inputFrame, v_texCoord).rgb;\n vec3 backgroundColor = texture(u_background, v_backgroundCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n float lightWrapMask = 1.0 - max(0.0, personMask - u_coverage.y) / (1.0 - u_coverage.y);\n vec3 lightWrap = u_lightWrapping * 
lightWrapMask * backgroundColor;\n frameColor = u_blendMode * linearDodge(frameColor, lightWrap) +\n (1.0 - u_blendMode) * screen(frameColor, lightWrap);\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(frameColor * personMask + backgroundColor * (1.0 - personMask), 1.0);\n }\n "]))); | ||
var vertexShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n uniform vec2 u_backgroundScale;\n uniform vec2 u_backgroundOffset;\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n out vec2 v_backgroundCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n v_backgroundCoord = a_texCoord * u_backgroundScale + u_backgroundOffset;\n }\n "], ["#version 300 es\n\n uniform vec2 u_backgroundScale;\n uniform vec2 u_backgroundOffset;\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n out vec2 v_backgroundCoord;\n\n void main() {\n // Flipping Y is required when rendering to canvas\n gl_Position = vec4(a_position * vec2(1.0, -1.0), 0.0, 1.0);\n v_texCoord = a_texCoord;\n v_backgroundCoord = a_texCoord * u_backgroundScale + u_backgroundOffset;\n }\n "]))); | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_2 || (templateObject_2 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_background;\n uniform vec2 u_coverage;\n uniform float u_lightWrapping;\n uniform float u_blendMode;\n\n in vec2 v_texCoord;\n in vec2 v_backgroundCoord;\n\n out vec4 outColor;\n\n vec3 screen(vec3 a, vec3 b) {\n return 1.0 - (1.0 - a) * (1.0 - b);\n }\n\n vec3 linearDodge(vec3 a, vec3 b) {\n return a + b;\n }\n\n void main() {\n vec3 frameColor = texture(u_inputFrame, v_texCoord).rgb;\n vec3 backgroundColor = texture(u_background, v_backgroundCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n float lightWrapMask = 1.0 - max(0.0, personMask - u_coverage.y) / (1.0 - u_coverage.y);\n vec3 lightWrap = u_lightWrapping * lightWrapMask * backgroundColor;\n frameColor = u_blendMode * linearDodge(frameColor, lightWrap) +\n (1.0 - u_blendMode) * screen(frameColor, lightWrap);\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(frameColor * personMask + backgroundColor * (1.0 - personMask), 1.0);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_personMask;\n uniform sampler2D u_background;\n uniform vec2 u_coverage;\n uniform float u_lightWrapping;\n uniform float u_blendMode;\n\n in vec2 v_texCoord;\n in vec2 v_backgroundCoord;\n\n out vec4 outColor;\n\n vec3 screen(vec3 a, vec3 b) {\n return 1.0 - (1.0 - a) * (1.0 - b);\n }\n\n vec3 linearDodge(vec3 a, vec3 b) {\n return a + b;\n }\n\n void main() {\n vec3 frameColor = texture(u_inputFrame, v_texCoord).rgb;\n vec3 backgroundColor = texture(u_background, v_backgroundCoord).rgb;\n float personMask = texture(u_personMask, v_texCoord).a;\n float lightWrapMask = 1.0 - max(0.0, personMask - u_coverage.y) / (1.0 - u_coverage.y);\n vec3 lightWrap = u_lightWrapping * 
lightWrapMask * backgroundColor;\n frameColor = u_blendMode * linearDodge(frameColor, lightWrap) +\n (1.0 - u_blendMode) * screen(frameColor, lightWrap);\n personMask = smoothstep(u_coverage.x, u_coverage.y, personMask);\n outColor = vec4(frameColor * personMask + backgroundColor * (1.0 - personMask), 1.0);\n }\n "]))); | ||
var outputWidth = canvas.width, outputHeight = canvas.height; | ||
var outputRatio = outputWidth / outputHeight; | ||
var vertexShader = webglHelper_1.compileShader(gl, gl.VERTEX_SHADER, vertexShaderSource); | ||
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var vertexShader = (0, webglHelper_1.compileShader)(gl, gl.VERTEX_SHADER, vertexShaderSource); | ||
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var backgroundScaleLocation = gl.getUniformLocation(program, 'u_backgroundScale'); | ||
@@ -59,3 +59,3 @@ var backgroundOffsetLocation = gl.getUniformLocation(program, 'u_backgroundOffset'); | ||
function updateBackgroundImage(backgroundImage) { | ||
backgroundTexture = webglHelper_1.createTexture(gl, gl.RGBA8, backgroundImage.naturalWidth, backgroundImage.naturalHeight, gl.LINEAR, gl.LINEAR); | ||
backgroundTexture = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, backgroundImage.naturalWidth, backgroundImage.naturalHeight, gl.LINEAR, gl.LINEAR); | ||
gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, backgroundImage.naturalWidth, backgroundImage.naturalHeight, gl.RGBA, gl.UNSIGNED_BYTE, backgroundImage); | ||
@@ -62,0 +62,0 @@ var xOffset = 0; |
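Nearly every hunk in this diff replaces calls like `webglHelper_1.compileShader(...)` with `(0, webglHelper_1.compileShader)(...)`. This is the emit pattern newer TypeScript compilers use for calls to imported functions: the comma expression extracts the function value first, so the call is made without the namespace object as its receiver (`this` is `undefined` in strict mode), matching native ES module call semantics. A minimal sketch of the difference:

```javascript
// Sketch of why TypeScript emits (0, ns.fn)(...) for imported calls.
// The comma operator evaluates to the function value alone, so the
// call is made without `ns` as the receiver, matching ES module
// call semantics (this === undefined in strict mode).
'use strict';

const webglHelper_1 = {
  compileShader: function () {
    // Return the receiver so we can observe how the call was made.
    return this;
  },
};

// Direct method call: `this` is the namespace object.
const asMethod = webglHelper_1.compileShader();

// TS 5-style indirect call: `this` is undefined.
const asIndirect = (0, webglHelper_1.compileShader)();
```

The behavior of both emit shapes is identical for ordinary functions that never touch `this`, which is why this change is cosmetic for the SDK's WebGL helpers.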
@@ -11,3 +11,3 @@ "use strict"; | ||
function buildJointBilateralFilterStage(gl, vertexShader, positionBuffer, texCoordBuffer, inputTexture, segmentationConfig, outputTexture, canvas) { | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_segmentationMask;\n uniform vec2 u_texelSize;\n uniform float u_step;\n uniform float u_radius;\n uniform float u_offset;\n uniform float u_sigmaTexel;\n uniform float u_sigmaColor;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n float gaussian(float x, float sigma) {\n float coeff = -0.5 / (sigma * sigma * 4.0 + 1.0e-6);\n return exp((x * x) * coeff);\n }\n\n void main() {\n vec2 centerCoord = v_texCoord;\n vec3 centerColor = texture(u_inputFrame, centerCoord).rgb;\n float newVal = 0.0;\n\n float spaceWeight = 0.0;\n float colorWeight = 0.0;\n float totalWeight = 0.0;\n\n // Subsample kernel space.\n for (float i = -u_radius + u_offset; i <= u_radius; i += u_step) {\n for (float j = -u_radius + u_offset; j <= u_radius; j += u_step) {\n vec2 shift = vec2(j, i) * u_texelSize;\n vec2 coord = vec2(centerCoord + shift);\n vec3 frameColor = texture(u_inputFrame, coord).rgb;\n float outVal = texture(u_segmentationMask, coord).a;\n\n spaceWeight = gaussian(distance(centerCoord, coord), u_sigmaTexel);\n colorWeight = gaussian(distance(centerColor, frameColor), u_sigmaColor);\n totalWeight += spaceWeight * colorWeight;\n\n newVal += spaceWeight * colorWeight * outVal;\n }\n }\n newVal /= totalWeight;\n\n outColor = vec4(vec3(0.0), newVal);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_segmentationMask;\n uniform vec2 u_texelSize;\n uniform float u_step;\n uniform float u_radius;\n uniform float u_offset;\n uniform float u_sigmaTexel;\n uniform float u_sigmaColor;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n float gaussian(float x, float sigma) {\n float coeff = -0.5 / (sigma * sigma * 4.0 + 1.0e-6);\n return exp((x * x) * coeff);\n }\n\n void main() {\n vec2 centerCoord = v_texCoord;\n vec3 
centerColor = texture(u_inputFrame, centerCoord).rgb;\n float newVal = 0.0;\n\n float spaceWeight = 0.0;\n float colorWeight = 0.0;\n float totalWeight = 0.0;\n\n // Subsample kernel space.\n for (float i = -u_radius + u_offset; i <= u_radius; i += u_step) {\n for (float j = -u_radius + u_offset; j <= u_radius; j += u_step) {\n vec2 shift = vec2(j, i) * u_texelSize;\n vec2 coord = vec2(centerCoord + shift);\n vec3 frameColor = texture(u_inputFrame, coord).rgb;\n float outVal = texture(u_segmentationMask, coord).a;\n\n spaceWeight = gaussian(distance(centerCoord, coord), u_sigmaTexel);\n colorWeight = gaussian(distance(centerColor, frameColor), u_sigmaColor);\n totalWeight += spaceWeight * colorWeight;\n\n newVal += spaceWeight * colorWeight * outVal;\n }\n }\n newVal /= totalWeight;\n\n outColor = vec4(vec3(0.0), newVal);\n }\n "]))); | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_segmentationMask;\n uniform vec2 u_texelSize;\n uniform float u_step;\n uniform float u_radius;\n uniform float u_offset;\n uniform float u_sigmaTexel;\n uniform float u_sigmaColor;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n float gaussian(float x, float sigma) {\n float coeff = -0.5 / (sigma * sigma * 4.0 + 1.0e-6);\n return exp((x * x) * coeff);\n }\n\n void main() {\n vec2 centerCoord = v_texCoord;\n vec3 centerColor = texture(u_inputFrame, centerCoord).rgb;\n float newVal = 0.0;\n\n float spaceWeight = 0.0;\n float colorWeight = 0.0;\n float totalWeight = 0.0;\n\n vec2 leftTopCoord = vec2(centerCoord + vec2(-u_radius, -u_radius) * u_texelSize);\n vec2 rightTopCoord = vec2(centerCoord + vec2(u_radius, -u_radius) * u_texelSize);\n vec2 leftBottomCoord = vec2(centerCoord + vec2(-u_radius, u_radius) * u_texelSize);\n vec2 rightBottomCoord = vec2(centerCoord + vec2(u_radius, u_radius) * u_texelSize);\n\n float leftTopSegAlpha = texture(u_segmentationMask, leftTopCoord).a;\n float rightTopSegAlpha = texture(u_segmentationMask, rightTopCoord).a;\n float leftBottomSegAlpha = texture(u_segmentationMask, leftBottomCoord).a;\n float rightBottomSegAlpha = texture(u_segmentationMask, rightBottomCoord).a;\n float totalSegAlpha = leftTopSegAlpha + rightTopSegAlpha + leftBottomSegAlpha + rightBottomSegAlpha;\n\n if (totalSegAlpha <= 0.0) {\n outColor = vec4(vec3(0.0), 0.0);\n } else if (totalSegAlpha >= 4.0) {\n outColor = vec4(vec3(0.0), 1.0);\n } else {\n for (float i = -u_radius + u_offset; i <= u_radius; i += u_step) {\n for (float j = -u_radius + u_offset; j <= u_radius; j += u_step) {\n vec2 shift = vec2(j, i) * u_texelSize;\n vec2 coord = vec2(centerCoord + shift);\n vec3 frameColor = texture(u_inputFrame, coord).rgb;\n float outVal = 
texture(u_segmentationMask, coord).a;\n\n spaceWeight = gaussian(distance(centerCoord, coord), u_sigmaTexel);\n colorWeight = gaussian(distance(centerColor, frameColor), u_sigmaColor);\n totalWeight += spaceWeight * colorWeight;\n\n newVal += spaceWeight * colorWeight * outVal;\n }\n }\n newVal /= totalWeight;\n\n outColor = vec4(vec3(0.0), newVal);\n }\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n uniform sampler2D u_segmentationMask;\n uniform vec2 u_texelSize;\n uniform float u_step;\n uniform float u_radius;\n uniform float u_offset;\n uniform float u_sigmaTexel;\n uniform float u_sigmaColor;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n float gaussian(float x, float sigma) {\n float coeff = -0.5 / (sigma * sigma * 4.0 + 1.0e-6);\n return exp((x * x) * coeff);\n }\n\n void main() {\n vec2 centerCoord = v_texCoord;\n vec3 centerColor = texture(u_inputFrame, centerCoord).rgb;\n float newVal = 0.0;\n\n float spaceWeight = 0.0;\n float colorWeight = 0.0;\n float totalWeight = 0.0;\n\n vec2 leftTopCoord = vec2(centerCoord + vec2(-u_radius, -u_radius) * u_texelSize);\n vec2 rightTopCoord = vec2(centerCoord + vec2(u_radius, -u_radius) * u_texelSize);\n vec2 leftBottomCoord = vec2(centerCoord + vec2(-u_radius, u_radius) * u_texelSize);\n vec2 rightBottomCoord = vec2(centerCoord + vec2(u_radius, u_radius) * u_texelSize);\n\n float leftTopSegAlpha = texture(u_segmentationMask, leftTopCoord).a;\n float rightTopSegAlpha = texture(u_segmentationMask, rightTopCoord).a;\n float leftBottomSegAlpha = texture(u_segmentationMask, leftBottomCoord).a;\n float rightBottomSegAlpha = texture(u_segmentationMask, rightBottomCoord).a;\n float totalSegAlpha = leftTopSegAlpha + rightTopSegAlpha + leftBottomSegAlpha + rightBottomSegAlpha;\n\n if (totalSegAlpha <= 0.0) {\n outColor = vec4(vec3(0.0), 0.0);\n } else if (totalSegAlpha >= 4.0) {\n outColor = vec4(vec3(0.0), 1.0);\n } else {\n for (float i = -u_radius + u_offset; i <= u_radius; 
i += u_step) {\n for (float j = -u_radius + u_offset; j <= u_radius; j += u_step) {\n vec2 shift = vec2(j, i) * u_texelSize;\n vec2 coord = vec2(centerCoord + shift);\n vec3 frameColor = texture(u_inputFrame, coord).rgb;\n float outVal = texture(u_segmentationMask, coord).a;\n\n spaceWeight = gaussian(distance(centerCoord, coord), u_sigmaTexel);\n colorWeight = gaussian(distance(centerColor, frameColor), u_sigmaColor);\n totalWeight += spaceWeight * colorWeight;\n\n newVal += spaceWeight * colorWeight * outVal;\n }\n }\n newVal /= totalWeight;\n\n outColor = vec4(vec3(0.0), newVal);\n }\n }\n "]))); | ||
var _a = segmentationHelper_1.inputResolutions[segmentationConfig.inputResolution], segmentationWidth = _a[0], segmentationHeight = _a[1]; | ||
@@ -17,4 +17,4 @@ var outputWidth = canvas.width, outputHeight = canvas.height; | ||
var texelHeight = 1 / outputHeight; | ||
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var inputFrameLocation = gl.getUniformLocation(program, 'u_inputFrame'); | ||
@@ -21,0 +21,0 @@ var segmentationMaskLocation = gl.getUniformLocation(program, 'u_segmentationMask'); |
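The rewritten fragment shader above is the fix for the low-output-fps WebGL2 issue noted in the 2.1.0 changelog: before running the full joint bilateral filter loop, it samples the segmentation mask at the four corners of the kernel window and skips the loop entirely when the neighborhood is uniformly background (corner alphas sum to 0) or uniformly person (sum ≥ 4.0). A CPU sketch of that per-pixel decision, with illustrative helper names that are not from the SDK:

```javascript
// CPU sketch of the shader's early-exit logic (helper names are
// illustrative, not the SDK's). `maskAt(x, y)` returns the
// segmentation alpha in [0, 1]; `radius` is the kernel radius.
function smoothedMaskValue(maskAt, x, y, radius, runFullFilter) {
  // Sample the four corners of the kernel window.
  const corners =
    maskAt(x - radius, y - radius) +
    maskAt(x + radius, y - radius) +
    maskAt(x - radius, y + radius) +
    maskAt(x + radius, y + radius);

  if (corners <= 0.0) return 0.0; // uniformly background: skip the filter
  if (corners >= 4.0) return 1.0; // uniformly person: skip the filter

  // Mixed neighborhood (a mask edge): fall back to the full
  // joint bilateral filter loop, as in the original shader.
  return runFullFilter(x, y, radius);
}
```

On frames where most pixels sit far from the person/background boundary, this replaces an O(radius²) filter loop with four texture reads per pixel, which is where the fps gain on low-powered GPUs comes from.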
@@ -11,3 +11,3 @@ "use strict"; | ||
function buildLoadSegmentationStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite, outputTexture) { | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n float segmentation = texture(u_inputSegmentation, v_texCoord).r;\n outColor = vec4(vec3(0.0), segmentation);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n float segmentation = texture(u_inputSegmentation, v_texCoord).r;\n outColor = vec4(vec3(0.0), segmentation);\n }\n " | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n float segmentation = texture(u_inputSegmentation, v_texCoord).r;\n outColor = vec4(vec3(0.0), segmentation);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n float segmentation = texture(u_inputSegmentation, v_texCoord).r;\n outColor = vec4(vec3(0.0), segmentation);\n }\n " | ||
// TFLite memory will be accessed as float32 | ||
@@ -18,6 +18,6 @@ ]))); | ||
var _a = segmentationHelper_1.inputResolutions[segmentationConfig.inputResolution], segmentationWidth = _a[0], segmentationHeight = _a[1]; | ||
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var inputLocation = gl.getUniformLocation(program, 'u_inputSegmentation'); | ||
var inputTexture = webglHelper_1.createTexture(gl, gl.R32F, segmentationWidth, segmentationHeight); | ||
var inputTexture = (0, webglHelper_1.createTexture)(gl, gl.R32F, segmentationWidth, segmentationHeight); | ||
var frameBuffer = gl.createFramebuffer(); | ||
@@ -24,0 +24,0 @@ gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer); |
@@ -11,3 +11,3 @@ "use strict"; | ||
function buildResizingStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite) { | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n outColor = texture(u_inputFrame, v_texCoord);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n outColor = texture(u_inputFrame, v_texCoord);\n }\n " | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n outColor = texture(u_inputFrame, v_texCoord);\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputFrame;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n outColor = texture(u_inputFrame, v_texCoord);\n }\n " | ||
// TFLite memory will be accessed as float32 | ||
@@ -19,6 +19,6 @@ ]))); | ||
var outputPixelCount = outputWidth * outputHeight; | ||
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var inputFrameLocation = gl.getUniformLocation(program, 'u_inputFrame'); | ||
var outputTexture = webglHelper_1.createTexture(gl, gl.RGBA8, outputWidth, outputHeight); | ||
var outputTexture = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, outputWidth, outputHeight); | ||
var frameBuffer = gl.createFramebuffer(); | ||
@@ -36,3 +36,3 @@ gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer); | ||
// Downloads pixels asynchronously from GPU while rendering the current frame | ||
webglHelper_1.readPixelsAsync(gl, 0, 0, outputWidth, outputHeight, gl.RGBA, gl.UNSIGNED_BYTE, outputPixels); | ||
(0, webglHelper_1.readPixelsAsync)(gl, 0, 0, outputWidth, outputHeight, gl.RGBA, gl.UNSIGNED_BYTE, outputPixels); | ||
for (var i = 0; i < outputPixelCount; i++) { | ||
@@ -39,0 +39,0 @@ var tfliteIndex = tfliteInputMemoryOffset + i * 3; |
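The loop at the end of this hunk copies the asynchronously downloaded RGBA pixels into the TFLite input buffer with an `i * 3` stride, i.e. as packed RGB with the alpha channel dropped. A hedged sketch of that repacking (the normalization to [0, 1] is an assumption here; the SDK's exact scaling may differ):

```javascript
// Sketch of repacking RGBA bytes into a packed float32 RGB buffer
// for the TFLite input. Alpha is dropped; /255 normalization is an
// assumption for illustration.
function packRgbaToFloatRgb(rgba, pixelCount) {
  const out = new Float32Array(pixelCount * 3);
  for (let i = 0; i < pixelCount; i++) {
    const src = i * 4; // RGBA stride in the downloaded pixels
    const dst = i * 3; // RGB stride in the TFLite input
    out[dst] = rgba[src] / 255;
    out[dst + 1] = rgba[src + 1] / 255;
    out[dst + 2] = rgba[src + 2] / 255;
  }
  return out;
}
```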
@@ -11,3 +11,3 @@ "use strict"; | ||
function buildSoftmaxStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite, outputTexture) { | ||
var fragmentShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec2 segmentation = texture(u_inputSegmentation, v_texCoord).rg;\n float shift = max(segmentation.r, segmentation.g);\n float backgroundExp = exp(segmentation.r - shift);\n float personExp = exp(segmentation.g - shift);\n outColor = vec4(vec3(0.0), personExp / (backgroundExp + personExp));\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec2 segmentation = texture(u_inputSegmentation, v_texCoord).rg;\n float shift = max(segmentation.r, segmentation.g);\n float backgroundExp = exp(segmentation.r - shift);\n float personExp = exp(segmentation.g - shift);\n outColor = vec4(vec3(0.0), personExp / (backgroundExp + personExp));\n }\n " | ||
var fragmentShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec2 segmentation = texture(u_inputSegmentation, v_texCoord).rg;\n float shift = max(segmentation.r, segmentation.g);\n float backgroundExp = exp(segmentation.r - shift);\n float personExp = exp(segmentation.g - shift);\n outColor = vec4(vec3(0.0), personExp / (backgroundExp + personExp));\n }\n "], ["#version 300 es\n\n precision highp float;\n\n uniform sampler2D u_inputSegmentation;\n\n in vec2 v_texCoord;\n\n out vec4 outColor;\n\n void main() {\n vec2 segmentation = texture(u_inputSegmentation, v_texCoord).rg;\n float shift = max(segmentation.r, segmentation.g);\n float backgroundExp = exp(segmentation.r - shift);\n float personExp = exp(segmentation.g - shift);\n outColor = vec4(vec3(0.0), personExp / (backgroundExp + personExp));\n }\n " | ||
// TFLite memory will be accessed as float32 | ||
@@ -18,6 +18,6 @@ ]))); | ||
var _a = segmentationHelper_1.inputResolutions[segmentationConfig.inputResolution], segmentationWidth = _a[0], segmentationHeight = _a[1]; | ||
var fragmentShader = webglHelper_1.compileShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = webglHelper_1.createPiplelineStageProgram(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var fragmentShader = (0, webglHelper_1.compileShader)(gl, gl.FRAGMENT_SHADER, fragmentShaderSource); | ||
var program = (0, webglHelper_1.createPiplelineStageProgram)(gl, vertexShader, fragmentShader, positionBuffer, texCoordBuffer); | ||
var inputLocation = gl.getUniformLocation(program, 'u_inputSegmentation'); | ||
var inputTexture = webglHelper_1.createTexture(gl, gl.RG32F, segmentationWidth, segmentationHeight); | ||
var inputTexture = (0, webglHelper_1.createTexture)(gl, gl.RG32F, segmentationWidth, segmentationHeight); | ||
var frameBuffer = gl.createFramebuffer(); | ||
@@ -24,0 +24,0 @@ gl.bindFramebuffer(gl.FRAMEBUFFER, frameBuffer); |
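The softmax shader above converts the model's two raw scores (background in `.r`, person in `.g`) into a person probability, subtracting the larger score before exponentiation so `exp` cannot overflow for large inputs. The same computation in plain JavaScript:

```javascript
// Numerically stable two-class softmax mirroring the fragment
// shader above: returns P(person) given raw background/person scores.
function personProbability(backgroundScore, personScore) {
  const shift = Math.max(backgroundScore, personScore);
  const backgroundExp = Math.exp(backgroundScore - shift);
  const personExp = Math.exp(personScore - shift);
  return personExp / (backgroundExp + personExp);
}
```

Without the shift, a raw score of a few hundred would overflow `exp` to `Infinity` and the ratio would become `NaN`; with it, at least one exponent is always `exp(0) = 1`.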
@@ -21,3 +21,3 @@ "use strict"; | ||
if (f) throw new TypeError("Generator is already executing."); | ||
while (_) try { | ||
while (g && (g = 0, op[0] && (_ = 0)), _) try { | ||
if (f = 1, y && (t = op[0] & 2 ? y["return"] : op[0] ? y["throw"] || ((t = y["return"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t; | ||
@@ -54,7 +54,7 @@ if (y = 0, t) op = [op[0] & 2, t.value]; | ||
var shouldRunInference = true; | ||
var vertexShaderSource = webglHelper_1.glsl(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n gl_Position = vec4(a_position, 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "], ["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n gl_Position = vec4(a_position, 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "]))); | ||
var vertexShaderSource = (0, webglHelper_1.glsl)(templateObject_1 || (templateObject_1 = __makeTemplateObject(["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n gl_Position = vec4(a_position, 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "], ["#version 300 es\n\n in vec2 a_position;\n in vec2 a_texCoord;\n\n out vec2 v_texCoord;\n\n void main() {\n gl_Position = vec4(a_position, 0.0, 1.0);\n v_texCoord = a_texCoord;\n }\n "]))); | ||
var frameWidth = sourcePlayback.width, frameHeight = sourcePlayback.height; | ||
var _a = segmentationHelper_1.inputResolutions[segmentationConfig.inputResolution], segmentationWidth = _a[0], segmentationHeight = _a[1]; | ||
var gl = canvas.getContext('webgl2'); | ||
var vertexShader = webglHelper_1.compileShader(gl, gl.VERTEX_SHADER, vertexShaderSource); | ||
var vertexShader = (0, webglHelper_1.compileShader)(gl, gl.VERTEX_SHADER, vertexShaderSource); | ||
var vertexArray = gl.createVertexArray(); | ||
@@ -79,10 +79,10 @@ gl.bindVertexArray(vertexArray); | ||
// TODO Rename segmentation and person mask to be more specific | ||
var segmentationTexture = webglHelper_1.createTexture(gl, gl.RGBA8, segmentationWidth, segmentationHeight); | ||
var personMaskTexture = webglHelper_1.createTexture(gl, gl.RGBA8, frameWidth, frameHeight); | ||
var resizingStage = resizingStage_1.buildResizingStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite); | ||
var loadSegmentationStage = loadSegmentationStage_1.buildLoadSegmentationStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite, segmentationTexture); | ||
var jointBilateralFilterStage = jointBilateralFilterStage_1.buildJointBilateralFilterStage(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationTexture, segmentationConfig, personMaskTexture, canvas); | ||
var segmentationTexture = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, segmentationWidth, segmentationHeight); | ||
var personMaskTexture = (0, webglHelper_1.createTexture)(gl, gl.RGBA8, frameWidth, frameHeight); | ||
var resizingStage = (0, resizingStage_1.buildResizingStage)(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite); | ||
var loadSegmentationStage = (0, loadSegmentationStage_1.buildLoadSegmentationStage)(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationConfig, tflite, segmentationTexture); | ||
var jointBilateralFilterStage = (0, jointBilateralFilterStage_1.buildJointBilateralFilterStage)(gl, vertexShader, positionBuffer, texCoordBuffer, segmentationTexture, segmentationConfig, personMaskTexture, canvas); | ||
var backgroundStage = backgroundConfig.type === 'blur' | ||
? backgroundBlurStage_1.buildBackgroundBlurStage(gl, vertexShader, positionBuffer, texCoordBuffer, personMaskTexture, canvas) | ||
: backgroundImageStage_1.buildBackgroundImageStage(gl, positionBuffer, texCoordBuffer, personMaskTexture, backgroundImage, canvas); | ||
? (0, backgroundBlurStage_1.buildBackgroundBlurStage)(gl, vertexShader, positionBuffer, texCoordBuffer, personMaskTexture, canvas) | ||
: (0, backgroundImageStage_1.buildBackgroundImageStage)(gl, positionBuffer, texCoordBuffer, personMaskTexture, backgroundImage, canvas); | ||
function render() { | ||
@@ -89,0 +89,0 @@ return __awaiter(this, void 0, void 0, function () { |
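The stage builders above assemble the per-frame pipeline that `render()` drives: resize the input frame to the model resolution, run TFLite inference, load the raw segmentation into a texture, smooth it with the joint bilateral filter, then composite the blur or image background. A sketch of that orchestration (the stage and tflite method names here are assumptions, not the SDK's actual API):

```javascript
// Illustrative sketch of the per-frame order implied by the stage
// builders above (method names are assumptions, not the SDK's API).
function renderFrame(stages, tflite) {
  stages.resizing.render();             // input frame -> model-sized RGB
  tflite.runInference();                // produce person/background scores
  stages.loadSegmentation.render();     // scores -> segmentation texture
  stages.jointBilateralFilter.render(); // smooth the mask against the frame
  stages.background.render();           // composite blur or virtual image
}
```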
@@ -11,3 +11,3 @@ "use strict"; | ||
WebGL2PipelineType["Image"] = "image"; | ||
})(WebGL2PipelineType = exports.WebGL2PipelineType || (exports.WebGL2PipelineType = {})); | ||
})(WebGL2PipelineType || (exports.WebGL2PipelineType = WebGL2PipelineType = {})); | ||
/** | ||
@@ -37,3 +37,3 @@ * ImageFit specifies the positioning of an image inside a viewport. | ||
ImageFit["None"] = "None"; | ||
})(ImageFit = exports.ImageFit || (exports.ImageFit = {})); | ||
})(ImageFit || (exports.ImageFit = ImageFit = {})); | ||
/** | ||
@@ -58,3 +58,3 @@ * Specifies which pipeline to use when processing video frames. | ||
Pipeline["WebGL2"] = "WebGL2"; | ||
})(Pipeline = exports.Pipeline || (exports.Pipeline = {})); | ||
})(Pipeline || (exports.Pipeline = Pipeline = {})); | ||
//# sourceMappingURL=types.js.map |
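The three enum hunks above show TypeScript 5's revised enum emit: the IIFE argument no longer pre-reads the value from `exports`, and the local binding is assigned alongside the export (`exports.Pipeline = Pipeline = {}`). Both shapes build the same runtime object, as this side-by-side sketch shows:

```javascript
// Both emit shapes produce the same string-enum object at runtime.
const exportsOld = {};
let PipelineOld;
(function (Pipeline) {
  Pipeline['WebGL2'] = 'WebGL2';
})(PipelineOld = exportsOld.Pipeline || (exportsOld.Pipeline = {})); // TS 4.x-style emit

const exportsNew = {};
let PipelineNew;
(function (Pipeline) {
  Pipeline['WebGL2'] = 'WebGL2';
})(PipelineNew || (exportsNew.Pipeline = PipelineNew = {})); // TS 5.x-style emit
```

In both cases the local binding and the exported property end up referencing the same object, so consumers of the SDK see no behavioral difference.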
@@ -8,3 +8,3 @@ "use strict"; | ||
*/ | ||
exports.version = '2.0.0'; | ||
exports.version = '2.1.0'; | ||
//# sourceMappingURL=version.js.map |
@@ -5,3 +5,3 @@ { | ||
"description": "Twilio Video Processors JavaScript Library", | ||
"version": "2.0.0", | ||
"version": "2.1.0", | ||
"homepage": "https://github.com/twilio/twilio-video-processors.js#readme", | ||
@@ -69,5 +69,6 @@ "author": "Charlie Santos <csantos@twilio.com>", | ||
"twilio-release-tool": "^1.0.2", | ||
"typedoc": "0.20.28", | ||
"typedoc": "0.25.0", | ||
"typedoc-plugin-as-member-of": "^1.0.2", | ||
"typescript": "4.1.5", | ||
"typescript": "5.2.2", | ||
"uglify-js": "^3.17.4", | ||
"vinyl-fs": "^3.0.3", | ||
@@ -74,0 +75,0 @@ "vinyl-source-stream": "^2.0.0" |
# Twilio Video Processors | ||
> [!WARNING] | ||
> We are no longer allowing new customers to onboard to Twilio Video. Effective **December 5th, 2024**, Twilio Video will End of Life (EOL) and will cease to function for all customers. Customers may transition to any video provider they choose, however, we are recommending customers migrate to the Zoom Video SDK and we have prepared a [Migration Guide](https://developers.zoom.us/docs/video-sdk/twilio/). Additional information on this EOL is available in our Help Center [here](https://support.twilio.com/hc/en-us/articles/20950630029595-Programmable-Video-End-of-Life-Notice). | ||
Twilio Video Processors is a collection of video processing tools which can be used with [Twilio Video JavaScript SDK](https://github.com/twilio/twilio-video.js) to apply transformations and filters to a VideoTrack. | ||
@@ -11,4 +14,4 @@ | ||
- [Virtual Background](https://twilio.github.io/twilio-video-processors.js/classes/virtualbackgroundprocessor.html) | ||
- [Background Blur](https://twilio.github.io/twilio-video-processors.js/classes/gaussianblurbackgroundprocessor.html) | ||
- [Virtual Background](https://twilio.github.io/twilio-video-processors.js/classes/VirtualBackgroundProcessor.html) | ||
- [Background Blur](https://twilio.github.io/twilio-video-processors.js/classes/GaussianBlurBackgroundProcessor.html) | ||
@@ -57,7 +60,7 @@ ## Prerequisites | ||
In order to achieve the best performance, the VideoProcessors use WebAssembly to run TensorFlow Lite for person segmentation. You need to serve the tflite model and binaries so they can be loaded properly. These files can be downloaded from the `dist/build` folder. Check the [API docs](https://twilio.github.io/twilio-video-processors.js/interfaces/virtualbackgroundprocessoroptions.html#assetspath) for details and the [examples](https://github.com/twilio/twilio-video-processors.js/tree/master/examples) folder for reference. | ||
In order to achieve the best performance, the VideoProcessors use WebAssembly to run TensorFlow Lite for person segmentation. You need to serve the tflite model and binaries so they can be loaded properly. These files can be downloaded from the `dist/build` folder. Check the [API docs](https://twilio.github.io/twilio-video-processors.js/interfaces/VirtualBackgroundProcessorOptions.html#assetsPath) for details and the [examples](https://github.com/twilio/twilio-video-processors.js/tree/master/examples) folder for reference. | ||
## Usage | ||
These processors run TensorFlow Lite using [MediaPipe Selfie Segmentation Landscape Model](https://drive.google.com/file/d/1dCfozqknMa068vVsO2j_1FgZkW_e3VWv/preview) and requires [WebAssembly SIMD](https://v8.dev/features/simd) support in order to achieve the best performance. We recommend that, when calling [Video.createLocalVideoTrack](https://sdk.twilio.com/js/video/releases/2.23.1/docs/module-twilio-video.html#.createLocalVideoTrack__anchor), the video capture constraints be set to `24 fps` frame rate with `640x480` capture dimensions. Higher resolutions can still be used for increased accuracy, but may degrade performance, resulting in a lower output frame rate on low powered devices. | ||
These processors run TensorFlow Lite using [MediaPipe Selfie Segmentation Landscape Model](https://drive.google.com/file/d/1dCfozqknMa068vVsO2j_1FgZkW_e3VWv/preview) and require [WebAssembly SIMD](https://v8.dev/features/simd) support in order to achieve the best performance. We recommend that, when calling [Video.createLocalVideoTrack](https://sdk.twilio.com/js/video/releases/2.28.0/docs/module-twilio-video.html#.createLocalVideoTrack__anchor), the video capture constraints be set to `24 fps` frame rate with `640x480` capture dimensions. Higher resolutions can still be used for increased accuracy, but may degrade performance, resulting in a lower output frame rate on low powered devices. | ||
@@ -68,3 +71,3 @@ ## Best Practice | ||
* [VirtualBackgroundProcessor](https://twilio.github.io/twilio-video-processors.js/classes/virtualbackgroundprocessor.html) | ||
* [GaussianBlurBackgroundProcessor](https://twilio.github.io/twilio-video-processors.js/classes/gaussianblurbackgroundprocessor.html) | ||
* [VirtualBackgroundProcessor](https://twilio.github.io/twilio-video-processors.js/classes/VirtualBackgroundProcessor.html) | ||
* [GaussianBlurBackgroundProcessor](https://twilio.github.io/twilio-video-processors.js/classes/GaussianBlurBackgroundProcessor.html) |
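The README's capture recommendation above (24 fps at 640x480, which also matches the 2.1.0 WebGL2 resolution guidance) maps to constraints like the following; the track/processor wiring is shown only as a commented sketch since it requires the twilio-video SDK:

```javascript
// Recommended capture constraints for use with the VideoProcessors
// (shape follows standard MediaTrackConstraints).
const captureConstraints = {
  frameRate: 24,
  width: 640,
  height: 480,
};

// With twilio-video (sketch, not executed here):
//   const track = await Video.createLocalVideoTrack(captureConstraints);
//   track.addProcessor(processor);
```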
Sorry, the diff of this file is too big to display
Sorry, the diff of this file is not supported yet
Long strings (supply chain risk): Contains long string literals, which may be a sign of obfuscated or packed code. Found 1 instance in 1 package.
Major refactor (supply chain risk): Package has recently undergone a major refactor. It may be unstable or indicate significant internal changes. Use caution when updating to versions that include significant changes. Found 1 instance in 1 package.