videocontext
Comparing version 0.53.0 to 0.53.1
CHANGELOG.md

#### 0.53.1 (2019-07-05)
### 0.53.0 (2019-04-08)
package.json

  {
    "name": "videocontext",
-   "version": "0.53.0",
+   "version": "0.53.1",
    "description": "A WebGL & HTML5 graph based video composition library",
@@ -55,3 +55,3 @@ "repository": {
    "babel-preset-env": "^1.7.0",
-   "eslint": "^3.9.1",
+   "eslint": "^4.18.2",
    "eslint-loader": "^2.0.0",
    "generate-changelog": "^1.7.1",
README.md
# VideoContext

![build status](https://travis-ci.org/bbc/VideoContext.svg?branch=master)
It consists of two main components: a graph based, shader accelerated processing pipeline, and a media playback sequencing timeline.

The design is heavily inspired by the Web Audio API, so it should feel familiar for people with experience in the Web Audio world.
## Table of Contents

- [Demo](#demo)
- [Debugging](#debugging)
- [Documentation](#documentation)
- [Node Types](#node-types)
  - [VideoNode](#videonode)
  - [ImageNode](#imagenode)
  - [CanvasNode](#canvasnode)
  - [CustomSourceNode](#customsourcenode)
  - [EffectNode](#effectnode)
  - [TransitionNode](#transitionnode)
  - [CompositingNode](#compositingnode)
- [Writing Custom Effect Definitions](#writing-custom-effect-definitions)
- [Advanced Examples](#advanced-examples)
- [Development](#development)
  - [Gitflow](#gitflow)
  - [Releases](#releases)
  - [CI](#ci)
[Live examples can be found here](http://bbc.github.io/VideoContext/)
## Demo

> View on [CodeSandbox](https://codesandbox.io/embed/nostalgic-meitner-08sh2).

```JavaScript
<!DOCTYPE html>
<html>
...
<!--
If omitted, the canvas dimensions will be 300x150 and your videos will not be
rendered at their optimal definition.
https://webglfundamentals.org/webgl/lessons/webgl-resizing-the-canvas.html
-->
...
```
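The comment above warns that an unsized canvas defaults to 300x150. One way to size the output canvas before constructing the context is to compute dimensions that preserve the source's aspect ratio; the helper below is a plain-JS sketch (the function name and the 1280x1280 bound are illustrative, not part of VideoContext):

```javascript
// Fit a source resolution into a bounding box while preserving aspect ratio,
// e.g. to pick width/height attributes for the output canvas.
function fitWithin(srcW, srcH, maxW, maxH) {
    var scale = Math.min(maxW / srcW, maxH / srcH);
    return {
        width: Math.round(srcW * scale),
        height: Math.round(srcH * scale)
    };
}

// A 1080p source fitted into a 1280x1280 box keeps its 16:9 shape:
var size = fitWithin(1920, 1080, 1280, 1280); // { width: 1280, height: 720 }
```

The computed values would then be assigned to the canvas element's `width` and `height` attributes before the VideoContext is created.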
## Debugging

If you need to debug VideoContext graphs or get a better insight into what is happening under the hood, there's a browser extension for Chrome, [videocontext-devtools](https://github.com/bbc/videocontext-devtools).

![Debugging view](../master/readme-debugging.jpg?raw=true)
## Documentation

API documentation can be built using [ESDoc](https://esdoc.org/) by running the following commands:

```
npm install
...
```

The documentation will be generated in the "./doc" folder of the repository.
## Node Types

There are a number of different types of nodes which can be used in the VideoContext's processing graph. Here's a quick list of each one; following that is a more in-depth discussion of each type.

- [VideoNode](#videonode) - Plays video.
- [AudioNode](#audionode) - Plays audio.
- [ImageNode](#imagenode) - Displays images for a specified time.
- [CanvasNode](#canvasnode) - Displays the output of a canvas for a specified time.
- [EffectNode](#effectnode) - Applies a shader to a limited number of inputs.
- [TransitionNode](#transitionnode) - Applies a shader to a limited number of inputs, modifying properties at specific times.
- [CompositingNode](#compositingnode) - Applies the same shader to an unlimited number of inputs, rendering to the same output.
- [DestinationNode](#destinationnode) - Represents the output canvas. There can only be one.
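These nodes always wire up into a directed graph that ends at the single destination. The toy model below is *not* the VideoContext API; it is just a plain-JS illustration of how sources, processors, and the destination relate in such a graph:

```javascript
// Toy model of a VideoContext-style graph: each node holds a list of inputs,
// and connect(a, b) wires node a into node b.
function makeNode(name) {
    return { name: name, inputs: [] };
}

function connect(from, to) {
    to.inputs.push(from);
}

var video1 = makeNode("VideoNode");
var video2 = makeNode("VideoNode");
var crossfade = makeNode("TransitionNode");
var destination = makeNode("DestinationNode");

// Two video sources feed a transition, which feeds the one output canvas.
connect(video1, crossfade);
connect(video2, crossfade);
connect(crossfade, destination);
```

In the real API the equivalent wiring is done with each node's `connect` method, as the examples below show.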
### VideoNode

A video source node.

> View on [CodeSandbox](https://codesandbox.io/embed/naughty-sea-dv0x1)

```JavaScript
var videoNode = videoCtx.video("./video1.mp4");
...
videoNode.connect(videoCtx.destination);
...
```
### ImageNode

An image source node.

> View on [CodeSandbox](https://codesandbox.io/embed/crazy-bas-6m7r7)

```JavaScript
var imageNode = videoCtx.image("cats.png");
...
imageNode.connect(videoCtx.destination);
...
```
### CanvasNode

A canvas source node.

> View on [CodeSandbox](https://codesandbox.io/embed/peaceful-meninsky-jkscs)

```JavaScript
var canvas = document.getElementById("input-canvas");
var canvasNode = videoCtx.canvas(canvas);
...
canvasNode.connect(videoCtx.destination);
canvasNode.stop(4);
```
### CustomSourceNode

```JavaScript
import Hls from "hls.js";
...
}
```
### EffectNode

An EffectNode is the simplest form of processing node. It's built from a definition object, which is a combination of fragment shader code, vertex shader code, input descriptions, and property descriptions. A number of common operations are available as definitions accessible as static properties on the VideoContext at `VideoContext.DEFINITIONS`.

The vertex and fragment shader code is GLSL code which gets compiled to produce the shader program. The input description tells the VideoContext how many ports there are to connect to and the name of the image associated with each port within the shader code. Inputs are always renderable textures (i.e. images, videos, canvases). The property descriptions tell the VideoContext what controls to attach to the EffectNode, and the name, type, and default value of each control within the shader code.
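As a sketch of that definition-object shape, here is a minimal invert effect. The field names (`vertexShader`, `fragmentShader`, `properties`, `inputs`) follow the monochrome and combine examples in this README, but treat the exact schema and the GLSL boilerplate as illustrative rather than canonical:

```javascript
// Illustrative effect definition: invert the input's colours.
// properties.amount becomes a bound control; inputs names the sampler port.
var invertDefinition = {
    title: "Invert",
    description: "Inverts the colours of the input (amount = 0 leaves it unchanged).",
    vertexShader:
        "attribute vec2 a_position;" +
        "attribute vec2 a_texCoord;" +
        "varying vec2 v_texCoord;" +
        "void main() {" +
        "    gl_Position = vec4(vec2(2.0, 2.0) * a_position - vec2(1.0, 1.0), 0.0, 1.0);" +
        "    v_texCoord = a_texCoord;" +
        "}",
    fragmentShader:
        "precision mediump float;" +
        "uniform sampler2D u_image;" +
        "uniform float amount;" +
        "varying vec2 v_texCoord;" +
        "void main() {" +
        "    vec4 color = texture2D(u_image, v_texCoord);" +
        "    gl_FragColor = vec4(mix(color.rgb, vec3(1.0) - color.rgb, amount), color.a);" +
        "}",
    properties: {
        amount: { type: "uniform", value: 1.0 }
    },
    inputs: ["u_image"]
};
```

A node built from a definition like this would expose one input port (`u_image`) and one bound control (`amount`).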
> View on [CodeSandbox](https://codesandbox.io/embed/hopeful-shtern-q6lvy)

```JavaScript
var monochromeDescription = {
    title: "Monochrome",
    ...
};
```
```JavaScript
// Setup the video context.
var canvas = document.getElementById("canvas");
...
// Create the sepia effect node (from the monochrome effect description above).
var sepiaEffect = ctx.effect(monochromeDescription);
...
// Give a sepia tint to the monochrome output (note how shader description
// properties are automatically bound to the JavaScript object).
...
ctx.play();
```
### TransitionNode

> View on [CodeSandbox](https://codesandbox.io/embed/modest-sutherland-gp2c5)

```JavaScript
var crossfadeDescription = {
    ...
};
```

```JavaScript
// Setup the video context.
var canvas = document.getElementById("canvas");
...
ctx.play();
```
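A transition's timed property change amounts to interpolating a value (such as the crossfade's mix) between a start and end time. The helper below is a plain-JS sketch of that ramp; the function name is illustrative and not part of the library API:

```javascript
// Value of a transitioned property at time t, ramping linearly
// from v0 to v1 over the window [t0, t1].
function transitionValue(t, t0, t1, v0, v1) {
    if (t <= t0) return v0;
    if (t >= t1) return v1;
    return v0 + (v1 - v0) * ((t - t0) / (t1 - t0));
}

// Halfway through a crossfade from mix 0.0 to 1.0 between t=2s and t=4s:
transitionValue(3, 2, 4, 0.0, 1.0); // 0.5
```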
### CompositingNode

Compositing nodes are different from regular effect nodes in that they can have an infinite number of nodes connected to them. They operate by running their effect shader on each connected input in turn and rendering the output to the same texture. This makes them particularly suitable for layering inputs which have alpha channels.

When compositing nodes are run, they map each input in turn to the first input in the definition. This means compositing node definitions typically only have a single input defined. It's also worth noting that an effect node definition with a single input can be used as a compositing shader with no additional modifications.

A common use for compositing nodes is to collect a series of source nodes which exist at distinct points on a timeline into a single connection for passing on to further processing. This effectively turns the sources into a single video track.

> View on [CodeSandbox](https://codesandbox.io/embed/sweet-bartik-6cz3d).

Here's a really simple shader which renders all the inputs to the same output.
```JavaScript
var combineDescription = {
    title: "Combine",
    ...
};
```

```JavaScript
// Setup the video context.
var canvas = document.getElementById("canvas");
...
```
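The "single video track" idiom described above boils down to scheduling clips back to back, and that arithmetic is plain JS. The helper name below is illustrative; each computed window's start/stop would be passed to the corresponding source node before connecting it to the compositing node:

```javascript
// Turn a list of clip durations (seconds) into back-to-back
// { start, stop } windows, beginning at time t0.
function sequenceClips(durations, t0) {
    var windows = [];
    var t = t0;
    for (var i = 0; i < durations.length; i++) {
        windows.push({ start: t, stop: t + durations[i] });
        t += durations[i];
    }
    return windows;
}

// Three clips of 3s, 2s and 4s starting at t=0:
sequenceClips([3, 2, 4], 0);
// [ { start: 0, stop: 3 }, { start: 3, stop: 5 }, { start: 5, stop: 9 } ]
```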
## Writing Custom Effect Definitions

```JavaScript
var effectDefinition = {
    title: "",  // A title for the effect.
    ...
};
```
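When writing your own definitions, a quick structural check can catch a missing field before the shader fails to compile. This helper is illustrative and not part of VideoContext; the required field names follow the definition examples earlier in this README:

```javascript
// Check that an effect definition has the fields the examples above use.
// Returns a list of problems; an empty list means the shape looks plausible.
function checkEffectDefinition(def) {
    var problems = [];
    if (typeof def.vertexShader !== "string") problems.push("missing vertexShader string");
    if (typeof def.fragmentShader !== "string") problems.push("missing fragmentShader string");
    if (!Array.isArray(def.inputs) || def.inputs.length === 0) problems.push("inputs must name at least one sampler");
    if (typeof def.properties !== "object" || def.properties === null) problems.push("missing properties object");
    return problems;
}

// e.g. checkEffectDefinition({}) reports all four problems.
```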
## Advanced Examples

You can view more advanced usage examples [here](AdvancedExamples.md).

## Development

VideoContext has a pretty standard `package.json`.
### Gitflow

VideoContext uses the gitflow branching model. To contribute, raise a pull request against the `develop` branch.

### Releases

Releases are prepared in release branches. When the release is ready, run one of the `npm run release:*` scripts (see the step-by-step below).
#### Release step-by-step

1. `git checkout develop`
2. `git pull`
3. `git checkout -b release-xxx`
4. Tag and push using the release script:
   - `npm run release:patch|minor|major`
5. Open a pull request against `master`.
6. Merge when tests have passed.
7. Merge `master` back into `develop`:
   - `git pull`
   - `git checkout develop`
   - `git merge master`

There is one remaining housekeeping task (this will be automated at some point):

1. Update the CodeSandbox examples to use the latest release.
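The `release:patch|minor|major` script in step 4 bumps the package's semver version (this very comparison, 0.53.0 to 0.53.1, is a patch bump). The bump rule in plain JS, with an illustrative helper name:

```javascript
// Apply a semver bump ("major" | "minor" | "patch") to an x.y.z version string.
function bumpVersion(version, level) {
    var parts = version.split(".").map(Number);
    if (level === "major") return (parts[0] + 1) + ".0.0";
    if (level === "minor") return parts[0] + "." + (parts[1] + 1) + ".0";
    return parts[0] + "." + parts[1] + "." + (parts[2] + 1);
}

bumpVersion("0.53.0", "patch"); // "0.53.1"
bumpVersion("0.53.1", "minor"); // "0.54.0"
```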
### CI

VideoContext uses the BBC's public Travis account to run all tests and publish to npmjs. All tests must pass before PRs can be merged.
### Other options

```
npm run build # build dist packages
...
```
---

**Major refactor** (supply chain risk): this package has recently undergone a major refactor. It may be unstable or indicate significant internal changes. Use caution when updating to versions that include significant changes. Found 1 instance in 1 package.