streamerate: a fluent and expressive Python library for chainable iterable processing, inspired by Java 8 streams.
streamerate is a pure-Python library inspired by the Fluent Interface pattern (as popularized by Java 8 streams), providing a chainable and expressive approach to processing iterable data.
By leveraging the Fluent Interface pattern, streamerate enables you to chain together multiple operations, such as filtering, mapping, and reducing, to create complex data processing pipelines with ease. With streamerate, you can write elegant and readable code that efficiently operates on streams of data, facilitating the development of clean and expressive Python applications.
streamerate empowers you to write elegant and functional code, unlocking the full potential of your iterable data processing pipelines.
The library is distributed under the permissive MIT license, allowing you to freely use, modify, and distribute it in both open-source and commercial projects.
Note: streamerate originated as part of the pyxtension project but has since been migrated as a standalone library.
pip install streamerate
or from GitHub:
git clone https://github.com/asuiu/streamerate.git
cd streamerate
python setup.py install
or as a git submodule:
git submodule add https://github.com/asuiu/streamerate.git
stream subclasses collections.Iterable. It's the same Python iterable, but with added methods suitable for multithreaded and multiprocess processing.
It's used to create stream-processing pipelines, similar to those used in Scala and in the MapReduce programming model.
Those who have used Apache Spark's RDD functions will find this processing model very easy to use.
Never again will you have to write code like this:
>>> from functools import reduce
>>> lst = range(1, 6)
>>> reduce(lambda x, y: x * y, map(lambda _: _ * _, filter(lambda _: _ % 2 == 0, lst)))
64
From now on, you may simply write the following:
>>> the_stream = stream(range(1, 6))
>>> the_stream.\
...     filter(lambda _: _ % 2 == 0).\
...     map(lambda _: _ * _).\
...     reduce(lambda x, y: x * y)
64
corpus = [
    "MapReduce is a programming model and an associated implementation for processing and generating large data sets with a parallel, distributed algorithm on a cluster.",
    "At Google, MapReduce was used to completely regenerate Google's index of the World Wide Web",
    "Conceptually similar approaches have been very well known since 1995 with the Message Passing Interface standard having reduce and scatter operations."]

def reduceMaps(m1, m2):
    for k, v in m2.items():
        m1[k] = m1.get(k, 0) + v
    return m1

word_counts = stream(corpus).\
    mpmap(lambda line: stream(line.lower().split(' ')).countByValue()).\
    reduce(reduceMaps)
Identical to the builtin map, but returns a stream.
Parallel ordered map using multiprocessing.Pool.imap().
It can replace map when computations need to be split across multiple cores and the order of results matters.
It spawns at most poolSize processes and applies the f function.
It won't take more than bufferSize elements from the input unless they were already required by the output, so you can use it with takeWhile on infinite streams without fearing that it will keep working in the background.
The elements in the result stream appear in the same order they appear in the initial iterable.
:type f: (T) -> V
:rtype: `stream`
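For intuition, the ordered behavior described above can be sketched with the stdlib primitive mpmap is documented to use, multiprocessing.Pool.imap. The helper names below are illustrative, not streamerate's internals:

```python
# A minimal sketch of an ordered parallel map over processes, built on
# multiprocessing.Pool.imap. Illustrative only, not streamerate's code.
from multiprocessing import Pool

def square(x):
    # worker functions must be picklable, so no lambdas across processes
    return x * x

def parallel_ordered_map(func, iterable, pool_size=4):
    with Pool(pool_size) as pool:
        # imap preserves input order even when workers finish out of order
        return list(pool.imap(func, iterable))

if __name__ == "__main__":
    print(parallel_ordered_map(square, range(1, 6)))  # [1, 4, 9, 16, 25]
```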
Parallel unordered map using multiprocessing.Pool.imap_unordered().
It can replace map when the order of results doesn't matter.
It spawns at most poolSize processes and applies the f function.
It won't take more than bufferSize elements from the input unless they were already required by the output, so you can use it with takeWhile on infinite streams without fearing that it will keep working in the background.
The elements in the result stream appear in an unpredictable order.
:type f: (T) -> V
:rtype: `stream`
Parallel unordered map using a multithreaded pool.
It can replace map when the order of results doesn't matter.
It spawns at most poolSize threads and applies the f function.
The elements in the result stream appear in an unpredictable order.
It won't take more than bufferSize elements from the input unless they were already required by the output, so you can use it with takeWhile on infinite streams without fearing that it will keep working in the background.
Because of the CPython GIL, it's most useful for I/O-bound tasks or CPU-intensive native functions that release the GIL, or on the Jython or IronPython interpreters.
:type f: (T) -> V
:rtype: stream
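The unordered, thread-based behavior described above resembles the stdlib ThreadPool's imap_unordered; a sketch under that assumption (not the library's implementation):

```python
# A minimal sketch of an unordered parallel map over threads, using the
# stdlib ThreadPool's imap_unordered. Illustrative, not streamerate's code.
from multiprocessing.pool import ThreadPool

def word_length(word):
    # stand-in for an I/O-bound call (e.g. a network request)
    return len(word)

with ThreadPool(4) as pool:
    # each result is yielded as soon as its worker finishes, in any order
    lengths = set(pool.imap_unordered(word_length, ["a", "bb", "ccc"]))

print(lengths)  # contents are fixed even though arrival order is not
```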
Parallel ordered map using a multithreaded pool.
It can replace map, and the order of the output stream will be the same as that of the input.
It spawns at most poolSize threads and applies the f function.
The elements in the result stream appear in the same order as in the input.
It won't take more than bufferSize elements from the input unless they were already required by the output, so you can use it with takeWhile on infinite streams without fearing that it will keep working in the background.
Because of the CPython GIL, it's most useful for I/O-bound tasks or CPU-intensive native functions that release the GIL, or on the Jython or IronPython interpreters.
:type f: (T) -> V
:rtype: stream
:param predicate: a function that receives elements of this collection and returns an iterable.
By default, predicate is the identity function.
:type predicate: (V) -> collections.Iterable[T]
:return: a stream of objects of the same type as the elements of the stream returned by predicate()
Example:
stream([[1, 2], [3, 4], [4, 5]]).flatMap().toList() == [1, 2, 3, 4, 4, 5]
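The same one-level flattening can be reproduced with stdlib itertools, which may help clarify what flatMap does (a plain-Python equivalent, not the library's implementation):

```python
# One-level flattening with itertools.chain.from_iterable -- the same
# result as the flatMap example above, using only the stdlib.
from itertools import chain

nested = [[1, 2], [3, 4], [4, 5]]
flat = list(chain.from_iterable(nested))
print(flat)  # [1, 2, 3, 4, 4, 5]
```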
Identical to the builtin filter, but returns a stream.
Returns a reversed stream.
Tests whether a value exists among the elements of this sequence.
:rtype: bool
Example:
stream([1, 2, 3]).exists(0) -> False
stream([1, 2, 3]).exists(1) -> True
Transforms a stream of values into a stream of (key, value) tuples.
:param keyfunc: function mapping values to keys
:type keyfunc: (V) -> T
:return: stream of (key, value) pairs
:rtype: stream[(T, V)]
Example:
stream([1, 2, 3, 4]).keyBy(lambda _:_ % 2) -> [(1, 1), (0, 2), (1, 3), (0, 4)]
groupBy([keyfunc]) -> Makes an iterator that returns consecutive keys and groups from the iterable.
The iterable doesn't need to be sorted by the key function, but the key function needs to return hashable objects.
:param keyfunc: [Optional] The key is a function computing a key value for each element.
:type keyfunc: (T) -> (V)
:return: (key, sub-iterator) grouped by each value of key(value).
:rtype: stream[ ( V, slist[T] ) ]
Example:
stream([1, 2, 3, 4]).groupBy(lambda _: _ % 2) -> [(0, [2, 4]), (1, [1, 3])]
Returns a collections.Counter of the values.
Example:
stream(['a', 'b', 'a', 'b', 'c', 'd']).countByValue() == {'a': 2, 'b': 2, 'c': 1, 'd': 1}
Returns a stream of distinct values. Values must be hashable.
stream(['a', 'b', 'a', 'b', 'c', 'd']).distinct() == {'a', 'b', 'c', 'd'}
same arguments as the builtin reduce() function
Throttles the stream.
:param max_req: number of requests
:param interval: period in seconds
:return: throttled stream
Example:
>>> s = Stream()
>>> throttled_stream = s.throttle(10, 1.5)
>>> for item in throttled_stream:
... print(item)
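The throttling contract (at most max_req items per interval seconds) can be sketched as a plain generator; this illustrates the semantics, not streamerate's implementation:

```python
# A sketch of window-based throttling: let at most max_req items through
# per interval seconds, sleeping out the remainder of each window.
# Illustrative only; streamerate's throttle may work differently.
import time

def throttled(iterable, max_req, interval):
    count = 0
    window_start = time.monotonic()
    for item in iterable:
        if count >= max_req:
            remaining = interval - (time.monotonic() - window_start)
            if remaining > 0:
                time.sleep(remaining)  # wait until the window expires
            count = 0
            window_start = time.monotonic()
        count += 1
        yield item

items = list(throttled(range(5), max_req=5, interval=0.1))
print(items)  # under the limit, so all items pass without sleeping
```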
returns an sset() instance
returns an slist() instance
returns an sdict() instance
same arguments as the builtin sorted()
returns the length of the stream. Use carefully on infinite streams.
Returns a string joined by f. Provides the same functionality as the str.join() builtin method.
If f is a string, it is used to join the stream; otherwise f should be a callable that returns the string used for joining.
identical to join(f)
returns the first n elements of the stream
returns the first element of the stream
the same behavior as the builtin zip()
Returns a stream of unique (according to predicate) elements, appearing in the same order as in the original stream.
The items returned by predicate should be hashable and comparable.
calculates the Shannon entropy of the values in the stream
Calculates the population standard deviation.
returns the arithmetic mean of the values
returns the sum of the elements in the stream
same functionality as the builtin min() function
same functionality as min(), but returns :default: when called on an empty stream
same functionality as the builtin max()
returns a stream of the max values from the stream
returns a stream of the min values from the stream
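For reference, the Shannon entropy mentioned above corresponds to H = -Σ p·log2(p) over the empirical distribution of values; a stdlib sketch, assuming a base-2 logarithm (not the library's code):

```python
# Shannon entropy of a collection of values, computed from value
# frequencies with the stdlib; a plain-Python equivalent of entropy(),
# assuming a base-2 logarithm.
from collections import Counter
from math import log2

def shannon_entropy(values):
    counts = Counter(values)
    total = sum(counts.values())
    # H = -sum(p * log2(p)) over the empirical probability of each value
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(shannon_entropy(['a', 'a', 'b', 'b']))  # two equiprobable values -> 1.0
```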
Inherits streams.stream and the built-in list classes, and keeps the list in memory, allowing faster index access.
Inherits streams.stream and the built-in set classes, and keeps the whole set of values in memory.
Inherits streams.stream and the built-in dict, and keeps the dict object in memory.
Inherits streams.sdict and adds the functionality of collections.defaultdict from the stdlib.
streamerate is released under the MIT license.
There are other libraries that offer Fluent Interface streams as alternatives to streamerate, though with far fewer streaming features:
and, taking an approach quite different from the Fluent pattern, libraries that implement a kind of piping: https://github.com/sspipe/sspipe and https://github.com/JulienPalard/Pipe