abstract-algorithm - npm Package Compare versions

Comparing version 0.1.4 to 0.1.5


package.json
{
   "name": "abstract-algorithm",
-  "version": "0.1.4",
+  "version": "0.1.5",
   "description": "Abstract-algorithm for optimal reduction of stratifiable lambda terms",

@@ -5,0 +5,0 @@ "main": "src/api.js",

# Lamping's Abstract Algorithm
A cleaned up adaptation of [Lamping's abstract algorithm](http://dl.acm.org/citation.cfm?id=96711). This is the most beautiful algorithm I know. It evaluates functions optimally, in the sense that no other strategy evaluates them in fewer parallel reduction steps. Not only that, it automatically exploits any inherent parallelizability of your program without any added effort. My implementation is sequential, but optimal for that case. It works by converting λ-calculus terms to a form of graph known as [interaction combinators](http://www.sciencedirect.com/science/article/pii/S0890540197926432) (but [symmetric](https://scholar.google.com.br/scholar?q=symmetric+interaction+combinators&hl=en&as_sdt=0&as_vis=1&oi=scholart&sa=X&ved=0ahUKEwjNgZbO7aTVAhUEkZAKHYyTAFgQgQMIJjAA), and with infinite node colors), reducing the graph to normal form (by applying the rules below lazily), and then reading it back as a λ-term.
A cleaned up adaptation of [Lamping's abstract algorithm](http://dl.acm.org/citation.cfm?id=96711). It evaluates functions optimally by encoding a λ-term as ([symmetric](https://scholar.google.com.br/scholar?q=symmetric+interaction+combinators&hl=en&as_sdt=0&as_vis=1&oi=scholart&sa=X&ved=0ahUKEwjNgZbO7aTVAhUEkZAKHYyTAFgQgQMIJjAA)) [interaction combinators](http://www.sciencedirect.com/science/article/pii/S0890540197926432), normalizing the resulting graph, and decoding it back. It asymptotically beats all usual evaluators of functional programs, including Haskell's GHC, Scheme compilers, Google's V8 and so on. Moreover, it is capable of automatically exploiting any inherent parallelizability of your program, since interaction nets are a naturally concurrent model of computation. This implementation consists of two files, [a 120-loc reducer](src/abstract-algorithm.js) for interaction combinators and [a 71-loc converter](https://github.com/MaiaVictor/abstract-algorithm/blob/master/src/lambda-encoder.js) of λ-terms to and from those. It isn't parallel, but it is completely lazy: it doesn't perform any work that won't influence the normal form. There is an inherent tradeoff between the two.
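
To make the graph rewriting concrete, here is a minimal JavaScript sketch of the two rules such a reducer applies when two nodes meet on their principal ports: same-colored nodes annihilate, and differently-colored nodes commute (duplicate each other). This is illustrative code only, not this package's source or API; the flat `color`/`wire` arrays and the helper names are assumptions made for the example.

```js
// Each node has a color and 3 ports: 0 is the principal port, 1 and 2 are
// auxiliary. Port s of node n is addressed as n * 3 + s, and wire[q] holds
// the port that q is connected to.

const color = [];                      // color[n] = the node's color
const wire  = [];                      // wire[q]  = the port q is wired to

const port = (n, s) => n * 3 + s;

function newNode(col) {
  const n = color.length;
  color.push(col);
  wire.push(port(n, 0), port(n, 1), port(n, 2)); // ports start wired to themselves
  return n;
}

function link(a, b) {                  // connect two ports symmetrically
  wire[a] = b;
  wire[b] = a;
}

// Rewrite an active pair: nodes a and b whose principal ports face each other.
function rewrite(a, b) {
  if (color[a] === color[b]) {
    // Annihilation: both nodes vanish and their auxiliary wires are joined.
    link(wire[port(a, 1)], wire[port(b, 1)]);
    link(wire[port(a, 2)], wire[port(b, 2)]);
  } else {
    // Commutation: each node duplicates the other. Two copies of b take a's
    // place, two copies of a take b's place, wired pairwise in a square.
    const b1 = newNode(color[b]), b2 = newNode(color[b]);
    const a1 = newNode(color[a]), a2 = newNode(color[a]);
    link(port(b1, 1), port(a1, 1));
    link(port(b1, 2), port(a2, 1));
    link(port(b2, 1), port(a1, 2));
    link(port(b2, 2), port(a2, 2));
    link(port(b1, 0), wire[port(a, 1)]);
    link(port(b2, 0), wire[port(a, 2)]);
    link(port(a1, 0), wire[port(b, 1)]);
    link(port(a2, 0), wire[port(b, 2)]);
  }
  // The old nodes a and b are simply left behind as garbage in this sketch.
}
```

Each rewrite only touches a constant number of neighboring wires, which is what makes both the laziness and the parallelism mentioned above possible in principle.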

@@ -26,5 +26,4 @@ ![combinator_rules](images/combinators_rules.png)

Here is a [crappy handwritten explanation](images/handwritten_example.jpg) of how it works. Very simple, isn't it? [Here](example.js) is a working example of how to use this lib to reduce untyped λ-terms. The entire reduction algorithm is implemented as a <150 LOC file operating on arrays ([sharing-graph.js](sharing-graph.js)), and the converter between λ-terms and graphs is about half of that ([lambda-encoder](lambda-encoder.js)). The only drawback is that this only works with stratifiable terms, which roughly means that the algorithm breaks when you have duplications inside of duplications. I personally blame λ-calculus for that defect, since it allows expressing very illogical / garbage terms. The good news is that well-written, sensible terms work; if yours doesn't, your λ-term is probably bad and you should feel bad.
Here is a [crappy handwritten explanation](images/handwritten_example.jpg) of how it works. [Here](src/example.js) is a working example of how to use this lib to reduce untyped λ-terms. [Here](src/fast-reducer.js) is an inlined reducer (less pretty, but faster and with GC; the exported lib actually uses it), and [here](stuff/wasmReducer.js) is a poor attempt at making a WASM reducer (tip: it is not any faster, which is weird, because the [C](stuff/ic.c) version is 10x faster than the JS one). [Here](https://github.com/MaiaVictor/parallel_lambda_computer_tests) is an attempt to flatten the graph-reduction algorithm into a linear automaton, in such a way that it could, in theory, be implemented as a massively parallel ASIC where each 128-bit memory cell is actually a nano-CPU capable of updating its state by looking at its neighbors, so that the emergent global behavior of the memory converges to the normal form of the initial graph.
[Here](stuff/cleanReducer.js) is a non-inlined reducer (much prettier, great for understanding the algorithm), and [here](stuff/wasmReducer.js) is a poor attempt at making a WASM reducer (tip: it is not any faster, which is weird, because the [C](stuff/ic.c) version is 10x faster than the JS one). [Here](https://github.com/MaiaVictor/parallel_lambda_computer_tests) is an attempt to flatten the graph-reduction algorithm into a linear automaton, in such a way that it could, in theory, be implemented as a massively parallel ASIC where each 128-bit memory node is actually a CPU that performs a computation by looking at the state of its neighbors and updating its own based on a simple rule, so that the memory converges to the normal form of the graph.
Note that this only works with stratified terms, which roughly means that the algorithm breaks when you have duplications inside of duplications. To make it work with arbitrary terms, you'd need to augment the system with a complex piece of machinery known as the oracle. There are many ways to do it, but all of them ruin the performance. I personally prefer to call non-stratified terms defective and ignore them. It is not hard to avoid those terms; you just have to be cautious about the implicit duplication of arguments performed by beta-reduction (which is the root of all evil).
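
The laziness mentioned above can be sketched on top of the same hypothetical representation: walk the net starting from its root and only rewrite the active pairs the walk actually runs into, so no work is spent on parts of the graph that cannot influence the normal form. Again, this is an illustrative sketch, not the package's actual reducer; the root convention and the `warp`/`exit` bookkeeping are assumptions of the example.

```js
// Continues the sketch above. Node 0 is reserved as the root of the net:
// wire[port(0, 0)] points at the encoded term, and node 0 is never rewritten.
const node = p => Math.floor(p / 3);   // which node a port belongs to
const slot = p => p % 3;               // which of the node's 3 ports it is

function reduceLazy() {
  const warp = [];           // auxiliary ports left to explore later
  const exit = [];           // aux slots through which nodes on the path were entered
  let prev = port(0, 0);     // the port we are currently leaving
  while (true) {
    const next = wire[prev]; // the port we arrive at
    if (node(next) === 0) {                   // walked back up to the root
      if (warp.length === 0) return;          // nothing pending: normal form reached
      prev = warp.pop();                      // otherwise explore a pending aux port
    } else if (slot(next) === 0) {            // arrived at a principal port
      if (slot(prev) === 0 && node(prev) !== 0) {
        // Two principal ports facing each other form an active pair. Remember
        // where to resume (the wire into the aux slot we entered through),
        // rewrite, then walk back into the rewritten region.
        const resume = wire[port(node(prev), exit.pop())];
        rewrite(node(prev), node(next));
        prev = resume;
      } else {
        // This node survives into the normal form: explore aux port 1 now
        // and schedule aux port 2 for later.
        warp.push(port(node(next), 2));
        prev = port(node(next), 1);
      }
    } else {
      // Arrived at an auxiliary port: dive through the node's principal port,
      // remembering which aux slot to resume from if a rewrite happens there.
      exit.push(slot(next));
      prev = port(node(next), 0);
    }
  }
}
```

Only redexes reachable from the root this way are ever reduced, which is the sense in which the reducer "doesn't perform any work that won't influence the normal form".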

@@ -1,2 +0,2 @@

// Implement symmetric interaction combinators with infinite
// Implements symmetric interaction combinators with infinite
// node colors. By encoding a λ-calculus term to an interaction

@@ -3,0 +3,0 @@ // net, this becomes Lamping's Abstract Algorithm.
