@thi.ng/transducers
This project is part of the
@thi.ng/umbrella monorepo.
About
This library provides ~120 transducers, reducers, sequence generators
(iterators) and other supporting functions for composing data
transformation pipelines.
The overall concept and many of the core functions offered here are
directly inspired by the original Clojure implementation by Rich Hickey,
though the implementation differs heavily (also in contrast to some
other JS-based implementations) and dozens of less common, but generally
highly useful operators have been added. See the full list below.
Furthermore, most transducers & reducers provided here accept an
optional input iterable, which allows them to be used directly as
transformers instead of having to wrap them in one of the execution
functions (i.e. transduce(), reduce(), iterator(), run(), step()).
If called this way, transducer functions will return a transforming
ES6 iterator (generator) and reducing functions will return a reduced
result of the given input iterable.
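For example (a short sketch using map() and frequencies(), both shown in more detail further below):

// transducer applied directly to an input, yielding an iterator
[...tx.map((x) => x * 10, tx.range(4))]
// [ 0, 10, 20, 30 ]

// reducer applied directly to an input, returning the reduced result
tx.frequencies("abracadabra")
// Map { 'a' => 5, 'b' => 2, 'r' => 2, 'c' => 1, 'd' => 1 }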
Tutorial
There's an ongoing multi-part blog series about internals, use cases &
patterns of this package, specifically these 3 parts:
5.0.0 release
Several previously included internal support functions have been
migrated to the
@thi.ng/arrays
package. You'll need to update your imports if you've been using any of
these directly. Note that the argument order of some of these functions
has changed as well.
Functions using randomness now all support an optional PRNG
implementation of the IRandom
interface from the
@thi.ng/random
package.
Related packages
Extended functionality
Packages utilizing transducers
Installation
yarn add @thi.ng/transducers
Dependencies
Usage examples
There are several standalone example projects using this library in the
/examples
directory.
Almost all functions can be imported selectively, but for development
purposes full module re-exports are defined.
import * as tx from "@thi.ng/transducers";
import { transduce } from "@thi.ng/transducers";
Basic usage patterns
// compose transducer: keep odd numbers, remove duplicates, then scale
xform = tx.comp(
    tx.filter((x) => (x & 1) > 0),
    tx.distinct(),
    tx.map((x) => x * 3)
);

tx.transduce(xform, tx.push(), [1, 2, 3, 4, 5, 4, 3, 2, 1]);
// [ 3, 9, 15 ]

tx.transduce(xform, tx.conj(), [1, 2, 3, 4, 5, 4, 3, 2, 1]);
// Set { 3, 9, 15 }

// as transforming iterator
[...tx.iterator(xform, [1, 2, 3, 4, 5])]
// [ 3, 9, 15 ]

[...tx.filter((x) => /[A-Z]/.test(x), "Hello World!")]
// [ 'H', 'W' ]

// single-step execution (returns undefined if no result was produced)
f = tx.step(xform);
f(1) // 3
f(2) // undefined
f(3) // 9
f(4) // undefined

// steppers can wrap any other transducer, e.g. take()
f = tx.step(tx.take(2));
Fuzzy search
[...tx.filterFuzzy("ho", ["hello", "hallo", "hey", "heyoka"])]
[...tx.filterFuzzy("hlo", ["hello", "hallo", "hey", "heyoka"])]
[...tx.filterFuzzy(
[1, 3],
{ key: (x) => x.tags },
[
{ tags: [1, 2, 3] },
{ tags: [2, 3, 4] },
{ tags: [4, 5, 6] },
{ tags: [1, 3, 6] }
]
)]
Histogram generation & result grouping
tx.transduce(tx.map((x) => x.toUpperCase()), tx.frequencies(), "hello world");
// Map { 'H' => 1, 'E' => 1, 'L' => 3, 'O' => 2, ' ' => 1, 'W' => 1, 'R' => 1, 'D' => 1 }
tx.reduce(tx.frequencies(), [1, 1, 1, 2, 3, 4, 4]);
// Map { 1 => 3, 2 => 1, 3 => 1, 4 => 2 }
// reducers can also be called directly with an input
tx.frequencies([1, 1, 1, 2, 3, 4, 4]);
// Map { 1 => 3, 2 => 1, 3 => 1, 4 => 2 }
// with optional key fn (here: bin words by length)
tx.frequencies(
    (x) => x.length,
    "my camel is collapsing and needs some water".split(" ")
);
// Map { 2 => 2, 5 => 3, 10 => 1, 3 => 1, 4 => 1 }
// group values (instead of only counting them)
tx.groupByMap(
    { key: (x) => x.length },
    "my camel is collapsing and needs some water".split(" ")
);
// Map { 2 => [ 'my', 'is' ], 5 => [ 'camel', 'needs', 'water' ],
//       10 => [ 'collapsing' ], 3 => [ 'and' ], 4 => [ 'some' ] }
Pagination
[...tx.page(0, 5, tx.range(12))]
// [ 0, 1, 2, 3, 4 ]
[...tx.iterator(tx.comp(tx.page(1, 5), tx.map(x => x * 10)), tx.range(12))]
// [ 50, 60, 70, 80, 90 ]
[...tx.iterator(tx.comp(tx.page(2, 5), tx.padLast(5, "n/a")), tx.range(12))]
// [ 10, 11, 'n/a', 'n/a', 'n/a' ]
[...tx.page(3, 5, tx.range(12))]
// [ ]
Multiplexing / parallel transducer application
multiplex
and multiplexObj
can be used to transform values in parallel using the provided
transducers (which can be composed as usual), producing a tuple or
keyed object of results, respectively.
tx.transduce(
    tx.multiplex(
        tx.map((x) => x.charAt(0)),
        tx.map((x) => x.toUpperCase()),
        tx.map((x) => x.length)
    ),
    tx.push(),
    ["Alice", "Bob", "Charlie"]
);
// [ [ 'A', 'ALICE', 5 ], [ 'B', 'BOB', 3 ], [ 'C', 'CHARLIE', 7 ] ]

tx.transduce(
    tx.multiplexObj({
        initial: tx.map((x) => x.charAt(0)),
        name: tx.map((x) => x.toUpperCase()),
        len: tx.map((x) => x.length)
    }),
    tx.push(),
    ["Alice", "Bob", "Charlie"]
);
// [ { initial: 'A', name: 'ALICE', len: 5 },
//   { initial: 'B', name: 'BOB', len: 3 },
//   { initial: 'C', name: 'CHARLIE', len: 7 } ]
Moving average using sliding window
tx.transduce(
    tx.comp(
        // sliding window of 5 values, stepping by 1
        tx.partition(5, 1),
        // compute mean of each window
        tx.map((x) => tx.reduce(tx.mean(), x))
    ),
    tx.push(),
    [1, 2, 3, 3, 4, 5, 5, 6, 7, 8, 8, 9, 10]
)
// [ 2.6, 3.4, 4, 4.6, 5.4, 6.2, 6.8, 7.6, 8.4 ]

// the same, using the preconfigured movingAverage() transducer
[...tx.movingAverage(5, [1, 2, 3, 3, 4, 5, 5, 6, 7, 8, 8, 9, 10])]
// [ 2.6, 3.4, 4, 4.6, 5.4, 6.2, 6.8, 7.6, 8.4 ]
Benchmark function execution time
fn = () => {
    let x;
    for (let i = 0; i < 1e6; i++) {
        x = Math.cos(i);
    }
    return x;
};
// mean execution time (in milliseconds) of 100 runs
tx.transduce(tx.benchmark(), tx.mean(), tx.repeatedly(fn, 100));
Apply inspectors to debug transducer pipeline
tx.transduce(
tx.comp(
tx.trace("orig"),
tx.map((x) => x + 1),
tx.trace("mapped"),
tx.filter((x) => (x & 1) > 0)
),
tx.push(),
[1, 2, 3, 4]
);
// trace() logs each value to the console as it passes through
// result: [ 3, 5 ]
Stream parsing / structuring
The struct
transducer is simply a composition of: partitionOf -> partition -> rename -> mapKeys
. See code
here.
[
    ...tx.struct(
        [["id", 1, (id) => id[0]], ["pos", 2], ["vel", 2], ["color", 4]],
        [0, 100, 200, -1, 0, 1, 0.5, 0, 1, 1, 0, 0, 5, 4, 0, 0, 1, 1]
    )
];
// [ { id: 0, pos: [ 100, 200 ], vel: [ -1, 0 ], color: [ 1, 0.5, 0, 1 ] },
//   { id: 1, pos: [ 0, 0 ], vel: [ 5, 4 ], color: [ 0, 0, 1, 1 ] } ]
CSV parsing
tx.transduce(
tx.comp(
tx.mapcat((x) => x.split("\n")),
tx.map((x) => x.split(",")),
tx.rename({ id: 0, name: 1, alias: 2, num: "length" })
),
tx.push(),
["100,typescript\n101,clojure,clj\n110,rust,rs"]
);
Early termination
tx.transduce(tx.comp(tx.flatten(), tx.take(7)), tx.push(), [
    1,
    [2, [3, 4, [5, 6, [7, 8], 9, [10]]]]
]);
// [ 1, 2, 3, 4, 5, 6, 7 ]
Scan operator
xform = tx.comp(
    tx.scan(tx.count()),
    tx.map(x => [...tx.repeat(x, x)]),
    tx.scan(tx.pushCopy())
)

[...tx.iterator(xform, [1, 1, 1, 1])]
// [ [ [ 1 ] ],
//   [ [ 1 ], [ 2, 2 ] ],
//   [ [ 1 ], [ 2, 2 ], [ 3, 3, 3 ] ],
//   [ [ 1 ], [ 2, 2 ], [ 3, 3, 3 ], [ 4, 4, 4, 4 ] ] ]

tx.transduce(tx.comp(tx.scan(tx.count()), tx.scan(tx.pushCopy())), tx.push(), [1, 1, 1, 1])
// [ [ 1 ], [ 1, 2 ], [ 1, 2, 3 ], [ 1, 2, 3, 4 ] ]
Weighted random choices
// infinite iterator of weighted random choices (limited to 10 here)
[...tx.take(10, tx.choices("abcd", [1, 0.5, 0.25, 0.125]))];

// sample distribution of 1000 choices,
// approximately proportional to the given weights
tx.transduce(
    tx.take(1000),
    tx.frequencies(),
    tx.choices("abcd", [1, 0.5, 0.25, 0.125])
);
Keyframe interpolation
See
interpolate()
docs for details.
[
...interpolate(
10,
0,
100,
(a, b) => [a, b],
([a, b], t) => Math.floor(a + (b - a) * t),
[20, 100],
[50, 200],
[80, 0]
)
];
API
Documentation is slowly forthcoming in the form of doc comments (incl.
code examples) for a growing number of the functions listed below.
Please see the source code for now.
Types
Apart from type aliases, the only real types defined are:
Reducer
Reducers are the core building blocks of transducers. Unlike other
implementations using OOP approaches, a Reducer
in this lib is a
simple 3-element array of functions, each addressing a separate
processing step.
Since v0.6.0 the bundled reducers are all wrapped in functions to
provide a uniform API (and some of them can be preconfigured and/or are
stateful closures). However, it's fine to define stateless reducers as
constant arrays.
interface Reducer<A, B> extends Array<any> {
    // init
    [0]: () => A;
    // completion
    [1]: (acc: A) => A;
    // reduction step (may return a wrapped value to signal early termination)
    [2]: (acc: A, x: B) => A | Reduced<A>;
}

// e.g. a stateless reducer collecting values into an array
const push: Reducer<any[], any> = [
    () => [],
    (acc) => acc,
    (acc, x) => (acc.push(x), acc)
];
partition, partitionBy, streamSort and streamShuffle are examples of
transducers making use of their 1-arity completing function.
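For instance, partition() buffers values internally and (assuming its optional all flag, as used below) its completing function flushes any remaining partial chunk once the input is exhausted:

// incomplete last chunk is discarded by default
[...tx.partition(3, [1, 2, 3, 4, 5])]
// [ [ 1, 2, 3 ] ]

// with the `all` flag, the completing fn flushes the remainder
[...tx.partition(3, true, [1, 2, 3, 4, 5])]
// [ [ 1, 2, 3 ], [ 4, 5 ] ]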
Reduced
class Reduced<T> implements IDeref<T> {
protected value: T;
constructor(val: T);
deref(): T;
}
Simple type wrapper to identify early termination of a reducer. Does not
modify wrapped value by injecting magic properties. Instances can be
created via reduced(x)
and handled via these helper functions:
reduced(x: any): any
isReduced(x: any): boolean
ensureReduced(x: any): Reduced<any>
unreduced(x: any): any
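As a minimal sketch (sumUntil is a hypothetical reducer, not part of this package), a reducer can trigger early termination by wrapping its accumulator via reduced():

// hypothetical reducer: sums values & stops once `limit` is reached
sumUntil = (limit) => [
    () => 0,
    (acc) => acc,
    (acc, x) => (acc + x >= limit ? tx.reduced(acc + x) : acc + x)
];

tx.reduce(sumUntil(10), [1, 2, 3, 4, 5, 6]);
// 10 (input only consumed up to the 4th value)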
IReducible
By default reduce()
consumes inputs via the standard ES6 Iterable
interface, i.e. using a for..of..
loop. Array-like inputs are consumed
via a traditional for
-loop and custom optimized iterations can be
provided via implementations of the IReducible
interface in the source
collection type. Examples can be found here:
Note: The IReducible interface is only used by reduce(), transduce()
and run().
Transducer
From Rich Hickey's original definition:
A transducer is a transformation from one reducing function to another
As shown in the examples above, transducers can be dynamically composed
(using comp()
) to form arbitrary data transformation pipelines without
causing large overheads for intermediate collections.
type Transducer<A, B> = (rfn: Reducer<any, B>) => Reducer<any, A>;
function map<A, B>(fn: (x: A) => B): Transducer<A, B> {
    return (rfn: Reducer<any, B>) => {
        return [
            () => rfn[0](),
            (acc) => rfn[1](acc),
            // apply fn, then delegate to the wrapped reducer's step
            (acc, x: A) => rfn[2](acc, fn(x))
        ];
    };
}

function dedupe<T>(): Transducer<T, T> {
    return (rfn: Reducer<any, T>) => {
        // stateful closure: tracks the previously seen value
        let prev = {};
        return [
            () => rfn[0](),
            (acc) => rfn[1](acc),
            (acc, x) => {
                // only forward x if it differs from the previous value
                acc = prev === x ? acc : rfn[2](acc, x);
                prev = x;
                return acc;
            }
        ];
    };
}
Composition & execution
comp(f1, f2, ...)
Returns new transducer composed from given transducers. Data flow is
from left to right. Offers fast paths for up to 10 args. If more are
given, composition is done dynamically via for loop.
compR(rfn: Reducer<any, any>, fn: (acc, x) => any): Reducer<any, any>
Helper function to compose reducers.
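For example, a minimal custom transducer (plus10 is a hypothetical name used only for illustration) can be built by wrapping the downstream reducer's step function via compR:

// hypothetical transducer adding 10 to each value
plus10 = () => (rfn) => tx.compR(rfn, (acc, x) => rfn[2](acc, x + 10));

[...tx.iterator(plus10(), [1, 2, 3])]
// [ 11, 12, 13 ]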
iterator<A, B>(tx: Transducer<A, B>, xs: Iterable<A>): IterableIterator<B>
Similar to transduce(), but emits results as an ES6 iterator (and hence
doesn't use a reduction function).
reduce<A, B>(rfn: Reducer<A, B>, acc: A, xs: Iterable<B>): A
Reduces xs using the given reducer and optional initial
accumulator/result. If xs implements the IReducible interface,
delegates to that implementation. Likewise, uses a fast route if xs is
an ArrayLike type.
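A short example (assuming the bundled add() reducer):

tx.reduce(tx.add(), [1, 2, 3, 4]);
// 10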
transduce<A, B, C>(tx: Transducer<A, B>, rfn: Reducer<C, B>, acc: C, xs: Iterable<A>): C
Transforms iterable using given transducer and combines results with
given reducer and optional initial accumulator/result.
run<A, B>(tx: Transducer<A, B>, fx: (x: B) => void, xs: Iterable<A>)
Transforms iterable with given transducer and optional side effect
without any reduction step. If fx is given, it will be called with
every value produced by the transducer. If fx is not given, the
transducer is assumed to include at least one sideEffect() step itself.
Returns nothing.
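A brief sketch (the logged output is shown as comments):

tx.run(
    tx.map((x) => x * 10),
    (x) => console.log("received", x),
    tx.range(3)
);
// received 0
// received 10
// received 20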
Transducers
All of the following functions can be used and composed as transducers.
With a few exceptions, most also accept an input iterable and then
directly yield a transforming iterator, e.g.
tx.transduce(tx.map((x) => x*10), tx.push(), tx.range(4))
// [ 0, 10, 20, 30 ]

[...tx.map((x) => x*10, tx.range(4))]
// [ 0, 10, 20, 30 ]
Generators / Iterators
Reducers
As with transducer functions, reducer functions can also be given an
optional input iterable. If so, the function will consume the input
and return a reduced result (as if it had been called via reduce()).
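For example, assuming the bundled mean() reducer (also used in the moving average example above):

tx.mean([1, 2, 3, 4]);
// 2.5

// equivalent to
tx.reduce(tx.mean(), [1, 2, 3, 4]);
// 2.5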
Authors
License
© 2016-2018 Karsten Schmidt // Apache Software License 2.0