
# TensorBuilder

A light wrapper over TensorFlow that enables you to easily create complex deep neural networks using the Builder Pattern through a functional, fluent, immutable API.
TensorBuilder is a TensorFlow-based library that enables you to easily create complex neural networks using functional programming.
## Import
For demonstration purposes, we will import everything we need for the rest of the exercises up front:
```python
from tensorbuilder.api import *
import tensorflow as tf
```
You can also import just what you need from the `tensorbuilder` module, as sketched below.
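For example, a selective import might look like this (a minimal sketch; `T` is confirmed by the examples below, while importing `Pipe` this way at the top level is an assumption):

```python
# Selective import sketch: T appears later in this README;
# a top-level Pipe export is assumed, not confirmed.
from tensorbuilder import T, Pipe
```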
With the `T` object you can create quick math-like lambdas using any operator, which lets you write things like:
```python
x, b = tf.placeholder('float'), tf.placeholder('float')

f = (T + b) / (T + 10)  # lambda x: (x + b) / (x + 10)

y = f(x)
assert "div" in y.name
```
Use function composition with the `>>` operator to improve readability:
```python
x, w, b = tf.placeholder('float', [None, 5]), tf.placeholder('float', [5, 3]), tf.placeholder('float', [3])

f = T.matmul(w) >> T + b >> T.sigmoid()

y = f(x)
assert "Sigmoid" in y.name
```
Any function from the `tf` and `nn` modules is a method on the `T` object. As before, you can use the `>>` operator, or you can chain methods to produce complex functions:
```python
x, w, b = tf.placeholder('float', [None, 5]), tf.placeholder('float', [5, 3]), tf.placeholder('float', [3])

f = T.matmul(w).add(b).sigmoid()

y = f(x)
assert "Sigmoid" in y.name
```
You can use functions from the `tf.contrib.layers` module via the `T.layers` property. Here we will use `Pipe` to apply a value directly to an expression:
```python
x = tf.placeholder('float', [None, 5])

y = Pipe(
    x,
    T.layers.fully_connected(64, activation_fn=tf.nn.sigmoid)  # sigmoid layer, 64 units
    .layers.fully_connected(32, activation_fn=tf.nn.tanh)      # tanh layer, 32 units
    .layers.fully_connected(16, activation_fn=None)            # linear layer, 16 units
    .layers.fully_connected(8, activation_fn=tf.nn.relu)       # relu layer, 8 units
)

assert "Relu" in y.name
```
However, since building fully connected layers with the different functions from the `tf.nn` module is such a common task, we have (dynamically) created all combinations of these as their own methods, so you can rewrite the previous example as:
```python
x = tf.placeholder('float', [None, 5])

y = Pipe(
    x,
    T.sigmoid_layer(64)  # sigmoid layer, 64 units
    .tanh_layer(32)      # tanh layer, 32 units
    .linear_layer(16)    # linear layer, 16 units
    .relu_layer(8)       # relu layer, 8 units
)

assert "Relu" in y.name
```
The latter is much more compact and readable, and removes a lot of noise.
## Installation

TensorBuilder assumes you have a working `tensorflow` installation. We don't include it in `requirements.txt` since the installation of TensorFlow varies depending on your setup.
```bash
pip install tensorbuilder
```
For the latest development version:

```bash
pip install git+https://github.com/cgarciae/tensorbuilder.git@develop
```
Create a neural network with a [5, 10, 3] architecture, a `softmax` output layer, and a `tanh` hidden layer through a Builder, then get back its tensor:
```python
import tensorflow as tf
from tensorbuilder import T

x = tf.placeholder(tf.float32, shape=[None, 5])
keep_prob = tf.placeholder(tf.float32)

h = T.Pipe(
    x,
    T.tanh_layer(10)     # tanh(x * w + b)
    .dropout(keep_prob)  # dropout(x, keep_prob)
    .softmax_layer(3)    # softmax(x * w + b)
)
```
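The example above stops at graph construction. As a minimal sketch (not part of the original example), the resulting tensor `h` could be evaluated in a TensorFlow 1.x session by feeding `x` and `keep_prob`:

```python
import numpy as np

# Hypothetical usage sketch: evaluate the network on random inputs.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # initialize layer weights
    batch = np.random.randn(4, 5).astype(np.float32)
    probs = sess.run(h, feed_dict={x: batch, keep_prob: 1.0})
    print(probs.shape)  # (4, 3): softmax over the 3 output units
```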
Next is an example with all the features of TensorBuilder, including the DSL, branching, and scoping. It creates a branched computation where each branch is executed on a different device. All branches are then reduced to a single layer, and the computation is branched again to obtain both the activation function and the trainer.
```python
import tensorflow as tf
from tensorbuilder import T

x = tf.placeholder(tf.float32, shape=[None, 10])
y = tf.placeholder(tf.float32, shape=[None, 5])

[activation, trainer] = T.Pipe(
    x,
    [
        T.With(tf.device("/gpu:0"),
            T.relu_layer(20)
        ),
        T.With(tf.device("/gpu:1"),
            T.sigmoid_layer(20)
        ),
        T.With(tf.device("/cpu:0"),
            T.tanh_layer(20)
        )
    ],
    T.linear_layer(5),
    [
        T.softmax(),  # activation
        T
        .softmax_cross_entropy_with_logits(y)    # loss
        .minimize(tf.train.AdamOptimizer(0.01))  # trainer
    ]
)
```
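The example above only builds the graph. A minimal training-loop sketch under the same TensorFlow 1.x assumptions follows (the random `X_batch` and `Y_batch` arrays are hypothetical stand-ins for real data):

```python
import numpy as np

# Hypothetical training sketch: run the trainer op repeatedly,
# then evaluate the softmax activation on the same batch.
X_batch = np.random.randn(32, 10).astype(np.float32)
Y_batch = np.eye(5)[np.random.randint(0, 5, size=32)].astype(np.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(100):
        sess.run(trainer, feed_dict={x: X_batch, y: Y_batch})
    probs = sess.run(activation, feed_dict={x: X_batch})
```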