adnn.ts
adnn.ts provides type-safe, JavaScript-native neural networks on top of general scalar/tensor reverse-mode automatic differentiation. You can use just the AD code, or the NN layer built on top of it. This architecture makes it easy to define big, complex numerical computations and compute derivatives w.r.t. their inputs/parameters. adnn also includes utilities for optimizing/training the parameters of such computations.
This is a TypeScript wrapper on top of adnn.
npm install adnn.ts
You can also install adnn.ts with pnpm, yarn, or slnpm.
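For example, with pnpm or yarn (the commands below assume a default setup; slnpm's invocation is omitted here):
pnpm install adnn.ts
yarn add adnn.ts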
The simplest use case for adnn:
import { ScalarNode, ad, scalar } from 'adnn.ts'
// Can use normal number or lifted ScalarNode
function dist(x1: number, y1: number, x2: number, y2: number): number
function dist(x1: scalar, y1: scalar, x2: scalar, y2: scalar): ScalarNode
function dist(x1: scalar, y1: scalar, x2: scalar, y2: scalar): scalar {
  var xdiff = ad.scalar.sub(x1, x2)
  var ydiff = ad.scalar.sub(y1, y2)
  return ad.scalar.sqrt(
    ad.scalar.add(ad.scalar.mul(xdiff, xdiff), ad.scalar.mul(ydiff, ydiff)),
  )
}
// number in, number out
var number_output = dist(0, 1, 1, 4)
console.log(number_output) // 3.162...
// Use 'lifted' inputs to track derivatives
var x1 = ad.lift(0)
var y1 = ad.lift(1)
var x2 = ad.lift(1)
var y2 = ad.lift(4)
// scalar in, scalar out
var scalar_output = dist(x1, y1, x2, y2)
console.log(ad.value(scalar_output)) // still 3.162...
scalar_output.backprop() // Compute derivatives of inputs
console.log(ad.derivative(x1)) // -0.316...
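The same backprop() call populates a derivative for every lifted input, so the remaining partials can be read off the same way:
// Partial derivatives of the other lifted inputs
console.log(ad.derivative(y1)) // -0.948...
console.log(ad.derivative(x2)) // 0.316...
console.log(ad.derivative(y2)) // 0.948...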
adnn also supports computations involving tensors, or a mixture of scalars and tensors:
import { Tensor, TensorNode, ad } from 'adnn.ts'
function dot(vec: TensorNode) {
  var sq = ad.tensor.mul(vec, vec)
  return ad.tensor.sumreduce(sq)
}
function dist(vec1: TensorNode, vec2: TensorNode) {
  return ad.scalar.sqrt(dot(ad.tensor.sub(vec1, vec2)))
}
var vec1 = ad.lift(new Tensor([3]).fromFlatArray([0, 1, 1]))
var vec2 = ad.lift(new Tensor([3]).fromFlatArray([2, 0, 3]))
var out = dist(vec1, vec2)
console.log(ad.value(out)) // 3
out.backprop()
console.log(ad.derivative(vec1).toFlatArray()) // [-0.66, 0.33, -0.66]
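If you want to sanity-check a gradient, a finite-difference estimate needs nothing beyond the calls already shown. This is a quick sketch of our own (the eps value and the choice to perturb the first component are not part of adnn.ts):
// Numerically estimate d(dist)/d(vec1[0]) and compare with backprop
var eps = 1e-6
var v2 = new Tensor([3]).fromFlatArray([2, 0, 3])
var f0 = ad.value(dist(ad.lift(new Tensor([3]).fromFlatArray([0, 1, 1])), ad.lift(v2)))
var f1 = ad.value(dist(ad.lift(new Tensor([3]).fromFlatArray([0 + eps, 1, 1])), ad.lift(v2)))
console.log((f1 - f0) / eps) // ~ -0.666, matching ad.derivative(vec1)[0] above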
adnn makes it easy to define simple, feedforward neural networks. Here's a basic multilayer perceptron that takes a feature vector as input and outputs class probabilities:
import { Tensor, TrainingData, nn, opt } from 'adnn.ts'
var nInputs = 20
var nHidden = 10
var nClasses = 5
// Definition using basic layers
var net = nn.sequence([
  nn.linear(nInputs, nHidden),
  nn.tanh,
  nn.linear(nHidden, nClasses),
  nn.softmax,
])
// Alternate definition using 'nn.mlp' utility
net = nn.sequence([
  nn.mlp(nInputs, [{ nOut: nHidden, activation: nn.tanh }, { nOut: nClasses }]),
  nn.softmax,
])
// Train the parameters of the network from some dataset
// 'loadData' is a stand-in for a user-provided function that
// loads in an array of {input: , output: } objects
// Here, 'input' is a feature vector, and 'output' is a class label
var trainingData = loadData(100)
opt.nnTrain(net, trainingData, opt.classificationLoss, {
  batchSize: 10,
  iterations: 100,
  method: opt.adagrad(),
})
// Predict class probabilities for new, unseen features
var features = new Tensor([nInputs]).fillRandom()
var classProbs = net.eval(features)
console.log({ features, classProbs })
function loadData(sampleSize: number): TrainingData {
  return new Array(sampleSize).fill(0).map(() => ({
    input: new Tensor([nInputs]).fillRandom(),
    output: Math.floor(Math.random() * nClasses),
  }))
}
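To turn the probability vector into a hard class prediction, an argmax over the flattened output is enough. predictClass below is a hypothetical helper of our own, not part of adnn.ts; it assumes net.eval returns a plain Tensor here, as in the snippet above:
// Hypothetical helper: index of the highest class probability
function predictClass(probs: Tensor): number {
  var flat = probs.toFlatArray()
  var best = 0
  for (var i = 1; i < flat.length; i++) {
    if (flat[i] > flat[best]) best = i
  }
  return best
}
console.log(predictClass(classProbs)) // e.g. 2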
The sections below are still a work in progress; you can read the JS version's documentation in the meantime.
ad module
The ad module has its own documentation here.
nn module
The nn module has its own documentation here.
opt module
The opt module has its own documentation here.
For more details, see adnn.ts.
This project is licensed under the BSD-2-Clause license.
This is free, libre, and open-source software. It comes down to four essential freedoms [ref]:
- The freedom to run the program as you wish, for any purpose
- The freedom to study how the program works, and change it so it does your computing as you wish
- The freedom to redistribute copies so you can help others
- The freedom to distribute copies of your modified versions to others