
Transducers for JavaScript.
Transducers are composable algorithmic transformations. They are independent from the context of their input and output sources and specify only the essence of the transformation in terms of an individual element. Because transducers are decoupled from input or output sources, they can be used in many different processes - collections, streams, channels, observables, etc. Transducers compose directly, without awareness of input or creation of intermediate aggregates.
http://clojure.org/transducers
If you are not familiar with transducers, check out Transducers Explained.
The functions and transducers below are collected as a convenience into one aggregated API. Any of them can also be bundled separately in browserify builds by requiring it by path from transduce.
What does this mean? You can require the whole thing:
var tr = require('transduce')
tr.into([], [1,2,3,4,5,6])
// [1,2,3,4,5,6]
tr.into('', [1,2,3,4,5,6])
// '123456'
tr.into([], tr.filter(isEven), [1,2,3,4,5,6])
// [2,4,6]
tr.into([], tr.cat, [[1,2],[3,4],[5,6]])
// [1,2,3,4,5,6]
tr.into({}, [[1,2],[3,4],[5,6]])
// {1:2,3:4,5:6}
var transducer = tr.compose(tr.cat, tr.array.unshift(0), tr.map(add(1)))
tr.into([], transducer, [[1,2],[3,4],[5,6]])
// [1,2,3,4,5,6,7]
If you want to reduce bundle size (or just like to be explicit), require by path from transduce.
var into = require('transduce/base/into'),
compose = require('transduce/base/compose'),
cat = require('transduce/base/cat'),
map = require('transduce/base/map'),
unshift = require('transduce/array/unshift')
var transducer = compose(cat, unshift(0), map(add(1)))
into([], transducer, [[1,2],[3,4],[5,6]])
// [1,2,3,4,5,6,7]
Too explicit? Require the packages:
var base = require('transduce/base'),
array = require('transduce/array')
var transducer = base.compose(base.cat, array.unshift(0), base.map(add(1)))
base.into([], transducer, [[1,2],[3,4],[5,6]])
// [1,2,3,4,5,6,7]
Input Source: A source of values, normally a collection, coll. This library supports arrays, plain objects, strings, and anything that can be converted to an iterator (see the iterator package below). Input sources can also be push based; see the push package below.
Reducing Function: A two-arity function appropriate for passing to reduce. The first argument is the accumulator and the second argument is the iteration value. When using transducers, the accumulator is normally a collection, but it is not required to be.
Initial Value: The initial accumulator value, init, to use with Reduce.
Transformer: An object that provides a reducing function, step, an initial value function, init, and a result extraction function, result. Combines the steps of reduce into a single object.
Reduce: A function that folds over an Input Source to produce an Output Source. Accepts a Reducing Function or Transformer, xf, as the first argument, an optional initial value, init, as the second argument, and an Input Source, coll, as the third argument.
Also known as foldLeft or foldl in other languages and libraries.
Reduce begins by calling the Reducing Function of xf, step, with the initial accumulator, init, and the first item of the Input Source, coll. The return value from the reducing function is used as the next accumulator with the next item in coll. The process repeats until either coll is exhausted or xf indicates early termination with reduced. Finally, the result extraction function of xf, result, is called with the final accumulator to perform any potentially delayed actions and optionally convert the accumulator to the Output Source.
Reduce defines a Transducible Process.
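The reduce protocol described above can be sketched in a few lines. This is only an illustration of the contract between step, result, and reduced values, not the library's actual implementation (which also handles iterators, plain objects, and the optional init):
var tr = require('transduce')
// Illustrative sketch of reduce over an array-like input source
function sketchReduce(xf, init, coll) {
  var acc = init
  for (var i = 0; i < coll.length; i++) {
    acc = xf.step(acc, coll[i])
    if (tr.isReduced(acc)) {     // the transformer signaled early termination
      acc = tr.unreduced(acc)    // unwrap the Reduced value
      break
    }
  }
  return xf.result(acc)          // finish any delayed work and extract the result
}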
Transducer: A function, t, that accepts a transformer, xf, and returns a transformer. All transformations are defined in terms of transducers, independent of the Transducible Process. Transducers can be composed directly to create new transducers.
Transducible Process: A process that begins with an initial accumulator value, steps through the items of an input source, optionally transforming them with a transducer, t, and optionally completes with a result. Transduce is one transducible process. Transducible processes can also be push based; see the push package below or transduce-stream for a few examples. The same transducer can be used with any transducible process.
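To make the Transducer definition concrete, here is a hand-rolled equivalent of map written directly against the transformer protocol. It is illustrative only; in practice you would use the bundled map:
var tr = require('transduce')
// A transducer is a function of a transformer xf that returns a new transformer
function mapping(f) {
  return function(xf) {
    return {
      init: function() { return xf.init() },
      step: function(acc, item) { return xf.step(acc, f(item)) },
      result: function(acc) { return xf.result(acc) }
    }
  }
}
tr.into([], mapping(function(x) { return x * 10 }), [1, 2, 3])
// [10, 20, 30]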
Supports the following functions:
// base functions
reduce: function(xf, init?, coll)
transduce: function(t, xf, init?, coll)
into: function(init, t?, coll)
toArray: function(t?, coll)
// base transducers
map: function(f)
filter: function(predicate)
remove: function(predicate)
take: function(n)
takeWhile: function(predicate)
drop: function(n)
dropWhile: function(predicate)
cat: transducer
mapcat: function(f)
partitionAll: function(n)
partitionBy: function(f)
// base utils
compose: function(/*fns*/)
isReduced: function(value)
reduced: function(value, force?)
unreduced: function(value)
array {
forEach: function(callback)
find: function(predicate)
push: function(/*args*/)
unshift: function(/*args*/)
every: function(predicate)
some: function(predicate)
contains: function(target)
slice: function(begin?, end?)
initial: function(n?)
last: function(n?)
}
math {
min: function(f?)
max: function(f?)
}
push {
tap: function(interceptor)
asCallback: function(t, xf?)
asyncCallback: function(t, continuation, xf?)
}
string {
split: function(separator, limit?)
join: function(separator)
nonEmpty: function()
lines: function(limit?)
chars: function(limit?)
words: function(delimiter?, limit?)
}
unique {
dedupe: function()
unique: function(f?)
}
iterator {
symbol: Symbol.iterator || '@@iterator'
isIterable: function(value)
isIterator: function(value)
iterable: function(value)
iterator: function(value)
toArray: function(value)
sequence: function(t, value)
}
transformer {
symbol: Symbol('transformer') || '@@transformer'
isTransformer: function(value)
transformer: function(value)
}
util {
isFunction: function(value)
isArray: function(value)
isString: function(value)
isRegExp: function(value)
isNumber: function(value)
isUndefined: function(value)
identity: function(value)
arrayPush: function(arr, item)
objectMerge: function(obj, item)
stringAppend: function(str, item)
}
reduce(xf, init?, coll): Reduces over a transformation. If xf is not a transformer, it is converted to one. coll is converted to an iterator. Arrays are special cased to reduce using a for loop. If the function is called with arity 2, xf.init() is used as the init value.
transduce(t, xf, init?, coll): Transduces over a transformation. The transducer t is initialized with xf and the result is passed to reduce. xf is converted to a transformer if it is not one already. If the function is called with arity 3, xf.init() is used as the init value.
into(init, t?, coll): Returns a new collection created by appending all items into init, passing all items from the source collection coll through the optional transducer t. Chooses the transformer, xf, from the type of init, which can be an array, object, string, or anything with @@transformer. coll is converted to an iterator.
toArray(t?, coll): Transduces a collection into an array with an optional transducer, t. coll is converted to an iterator.
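For example, transduce with a plain reducing function (which is wrapped into a transformer as described in the transformer package below) produces the same result as the into convenience. The inc helper is assumed for illustration:
var tr = require('transduce')
function inc(x) { return x + 1 }  // assumed helper
tr.transduce(tr.map(inc), function(arr, x) { arr.push(x); return arr }, [], [1, 2, 3])
// [2, 3, 4]
tr.into([], tr.map(inc), [1, 2, 3])
// [2, 3, 4]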
map(f): Transducer that steps every item after applying the mapping function f to it.
filter(predicate): Transducer that steps only items which pass the predicate test.
remove(predicate): Transducer that removes all items that pass the predicate.
take(n): Transducer that steps the first n items and then terminates with reduced.
takeWhile(predicate): Transducer that steps items while predicate returns true. Terminates with reduced once predicate returns false.
drop(n): Transducer that drops the first n items and steps the remaining items untouched.
dropWhile(predicate): Transducer that drops items while predicate returns true and steps the remaining items untouched.
cat: Concatenating transducer. Reduces over every item of each stepped collection using the provided transformer.
mapcat(f): Transducer that applies a mapping function f to each item, then concatenates the result of the mapping function. Same as compose(map(f), cat).
partitionAll(n): Partitions the source into arrays of size n. When the transformer completes, it is stepped with any remaining items.
partitionBy(f): Partitions the source into sub-arrays each time the value returned by f changes (compared by equality). When the transformer completes, it is stepped with any remaining items.
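A few of the base transducers above in use (results shown as comments; compose chains transducers left to right, as in the earlier example):
var tr = require('transduce')
tr.into([], tr.take(3), [1, 2, 3, 4, 5])
// [1, 2, 3]
tr.into([], tr.partitionAll(2), [1, 2, 3, 4, 5])
// [[1, 2], [3, 4], [5]]
tr.into([], tr.compose(tr.drop(1), tr.partitionBy(function(x) { return x < 4 })), [1, 2, 3, 4, 5])
// [[2, 3], [4, 5]]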
The array package uses Array methods as transducers. It treats each stepped item as an item in an array and defines transducers that step items with the same contract as the corresponding array methods.
forEach(callback): Passes every item through unchanged, but only after executing callback(item, idx). Can be useful for "tapping into" composed transducer pipelines. The return value of the callback is ignored; the item is passed through unchanged.
find(predicate): Like filter, but terminates the transducer pipeline after stepping the first item that passes the predicate test. Will always step either 0 (if not found) or 1 (if found) items.
push(/*args*/): Passes all items straight through until the result is requested. Once completed, steps every argument through the pipeline before returning the result. This effectively pushes values onto the end of the stream.
unshift(/*args*/): Before stepping the first item, steps all arguments through the pipeline, then passes every item through unchanged. This effectively unshifts values onto the beginning of the stream.
every(predicate): Checks whether every item passes the predicate test. Steps a single item, true or false. Terminates early on false.
some(predicate): Checks whether some item passes the predicate test. Steps a single item, true or false. Terminates early on true.
contains(target): Does the stream contain the target value (target === item)? Steps a single item, true or false. Terminates early on true.
slice(begin?, end?): Like array slice, but with transducers. Steps items between begin (inclusive) and end (exclusive). If either index is negative, it indexes from the end of the transformation. If end is undefined, steps until the result of the transformation. If begin is undefined, begins at 0.
Note that if either index is negative, items will be buffered until completion.
initial(n?): Steps everything but the last item. Passing n steps all values excluding the last n.
Note that no items will be sent and all items will be buffered until completion.
last(n?): Steps the last item. Passing n steps the last n values.
Note that no items will be sent until completion.
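A rough sketch of a few array package transducers, using the tr.array namespace shown in the aggregated example above:
var tr = require('transduce')
tr.into([], tr.array.slice(1, 4), [10, 20, 30, 40, 50])
// [20, 30, 40]
tr.into([], tr.array.some(function(x) { return x > 3 }), [1, 2, 3, 4, 5])
// [true]  (terminates early once a match is found)
tr.into([], tr.array.last(2), [1, 2, 3, 4, 5])
// [4, 5]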
min(f?) / max(f?): Steps the min/max value on the result of the transformation. If f is provided, it is called with each item and the return value is used to compare values. Otherwise, the items are compared as numbers.
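A quick sketch, assuming the math package is exposed as tr.math in the aggregated API (by analogy with tr.array above):
var tr = require('transduce')
tr.into([], tr.math.max(), [3, 1, 4, 1, 5])
// [5]
tr.into([], tr.math.min(function(s) { return s.length }), ['apple', 'fig', 'cherry'])
// ['fig']  (compared by length; the original item is stepped)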
Normally transducers are used with pull streams: reduce "pulls" values out of an array, iterator, etc. The push package adds basic support for using transducers with push streams. See transduce-stream for using transducers with Node.js streams, or the underscore-transducer demo for an example of using transducers as event listeners.
tap(interceptor): Transducer that invokes interceptor with each result and input, and then passes the input through unchanged. The primary purpose of this method is to "tap into" a method chain in order to perform operations on intermediate results within the chain.
asCallback(t, xf?): Creates a callback that starts a transducer process and accepts each call's parameter as a new item in the process. Each item advances the state of the transducer. If the transducer exhausts due to early termination, all subsequent calls to the callback will no-op and return the computed result. If the callback is called with no argument, the transducer terminates, and all subsequent calls will no-op and return the computed result. The callback returns undefined until completion. Once completed, the result is always returned. If xf is not defined, the process maintains the last value and does not buffer results.
asyncCallback(t, continuation, xf?): Creates an async callback, cb(err, item), that starts a transducer process and accepts item as a new item in the process. The returned callback and the optional continuation follow Node.js conventions, fn(err, item). Each item advances the state of the transducer. If the continuation is provided, it will be called on completion or error. An error terminates the transducer and is propagated to the continuation. If the transducer exhausts due to early termination, any further call will be a no-op. Calling the callback with no item terminates the transducer process. If xf is not defined, the process maintains the last value and does not buffer results.
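A hedged sketch of asCallback, assuming the push package is exposed as tr.push in the aggregated API. Values are pushed into the process one call at a time; exact return values depend on the optional xf:
var tr = require('transduce')
var double = function(x) { return x * 2 }  // assumed helper
var callback = tr.push.asCallback(tr.compose(tr.map(double), tr.take(2)))
callback(1)  // undefined: the process is not yet complete
callback(2)  // take(2) exhausts the transducer; the computed result is returned
callback(3)  // no-op: the previously computed result is returned again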
The string package transduces over sequences of strings. Particularly useful with transduce-stream.
It treats every item as a substring and splits across the entire transducer sequence. This allows functions to work with chunks sent through streams. When using transducers with streams, it is helpful to compose the transformation you want with one of these functions so it operates against a given line/word/etc.
split(separator, limit?): Works like ''.split, but splits across the entire sequence of items. Accepts a separator (String or RegExp) and an optional limit on the number of items to send.
join(separator): Buffers all items and joins them with separator on the transducer result.
nonEmpty(): Only steps items that are non-empty strings (input.trim().length > 0).
lines(limit?), chars(limit?), words(delimiter?, limit?): Split chunks into lines/chars/words and step each one, with an optional limit.
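A rough sketch of the string transducers splitting across chunk boundaries, assuming the package is exposed as tr.string:
var tr = require('transduce')
tr.into([], tr.string.split(','), ['a,b', ',c', 'd,e'])
// ['a', 'b', 'cd', 'e']  (split is applied across the whole sequence 'a,b,cd,e')
tr.into('', tr.string.join('-'), ['a', 'b', 'c'])
// 'a-b-c'  (items are buffered and joined on result)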
The unique package provides transducers that remove duplicate values from the transformation.
dedupe(): Removes consecutive duplicates from the transformation. Subsequent stepped values are checked for equality using ===.
unique(f?): Produces a duplicate-free version of the transformation. If f is passed, it is called with each item and its return value is used for the uniqueness check. Uniqueness is checked across all values already seen, and as such, the items (or their computed keys) are buffered.
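For example (assuming the package is exposed as tr.unique):
var tr = require('transduce')
tr.into([], tr.unique.dedupe(), [1, 1, 2, 2, 1, 3])
// [1, 2, 1, 3]  (only consecutive duplicates are removed)
tr.into([], tr.unique.unique(), [1, 1, 2, 2, 1, 3])
// [1, 2, 3]  (duplicates removed across all values seen)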
symbol: The Symbol (or a string that acts as a symbol) for @@iterator that you can use to configure your custom objects.
isIterable(value): Does the parameter conform to the iterable protocol?
iterable(value): Returns the iterable for the parameter. Returns value if it conforms to the iterable protocol. Returns undefined if an iterable cannot be returned.
The return value will either conform to the iterable protocol, so it can be used for iteration, or will be undefined.
Supports anything that returns true for isIterable, converts arrays to iterables over each indexed item, and converts functions to infinite iterables that always call the function on next.
isIterator(value): Does the parameter conform to the iterator protocol or have a next method?
iterator(value): Returns the iterator for the parameter, invoking @@iterator if the parameter has an iterator protocol, or returning the parameter itself if it has a next method. Returns undefined if an iterator cannot be created.
The return value will either have a next function that can be invoked for iteration or will be undefined.
Supports anything that returns true for isIterator, converts arrays to iterators over each indexed item, and converts functions to infinite iterators that always call the function on next.
toArray(value): Converts the value to an iterator and iterates it into an array.
sequence(t, value): Creates an ES6 Iterable by transforming an input source using transducer t.
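A small sketch of sequence and toArray, assuming the package is exposed as tr.iterator. The isEven helper is assumed for illustration:
var tr = require('transduce')
function isEven(x) { return x % 2 === 0 }  // assumed helper
var evens = tr.iterator.sequence(tr.filter(isEven), [1, 2, 3, 4, 5, 6])
tr.iterator.toArray(evens)
// [2, 4, 6]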
symbol: The Symbol (or a string that acts as a symbol) for @@transformer that you can use to configure your custom objects.
isTransformer(value): Does the parameter have a transformer protocol or provide init, step, and result functions?
transformer(value): Attempts to convert the parameter into a transformer. If it cannot be converted, returns undefined. If defined, the return value will have init, step, and result functions that can be used for transformation. Converts arrays (arrayPush), strings (stringAppend), objects (objectMerge), functions (wrapped as a reducing function), or anything that passes isTransformer into a transformer.
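Any object with init, step, and result can be used where a transformer is expected. A hand-written summing transformer, used directly with transduce:
var tr = require('transduce')
var sum = {
  init: function() { return 0 },
  step: function(acc, item) { return acc + item },
  result: function(acc) { return acc }
}
tr.transduce(tr.map(function(x) { return x * x }), sum, 0, [1, 2, 3])
// 14  (1 + 4 + 9)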
compose(/*fns*/): Simple function composition of the arguments. Useful for composing (combining) transducers.
isReduced(value): Is the value reduced (the signal for early termination)?
reduced(value, force?): Ensures the value is reduced (useful for early termination). If force is not provided or is false, only wraps value as a Reduced value if it does not already pass isReduced. If force is true, always wraps value as a Reduced value.
unreduced(value): Ensures the value is not reduced (unwraps reduced values if necessary).
identity(value): Always returns value.
arrayPush(arr, item): Array.prototype.push as a reducing function. Pushes item onto arr and returns arr.
objectMerge(obj, item): Merges item into the object. If item is an array of length 2, uses the first element (index 0) as the key and the second (index 1) as the value. Otherwise, iterates over the own properties of item and merges values with the same keys into the result object.
stringAppend(str, item): Appends item onto str using +.
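These reducing functions can be passed to reduce directly (reduce converts them to transformers, as described above). A sketch, assuming the util package is exposed as tr.util in the aggregated API:
var tr = require('transduce')
tr.reduce(tr.util.arrayPush, [], [1, 2, 3])
// [1, 2, 3]
tr.reduce(tr.util.objectMerge, {}, [['a', 1], ['b', 2]])
// {a: 1, b: 2}
tr.reduce(tr.util.stringAppend, '', ['a', 'b', 'c'])
// 'abc'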
Extracted from underscore-transducer, which was created initially as a translation from Clojure. Now compatible with and inspired by protocols defined by transducers-js and transducers.js.
MIT