
With statecraft you can easily load states from a file or from the community repository and use them in your SSMs or stateful models. The interface is just like Huggingface's Transformers library:
from statecraft import StatefulModel

model = StatefulModel.from_pretrained(
    model_name="state-spaces/mamba-130m-hf",
    initial_state_name="koayon/state-a",
)
Now with every forward pass you get the knowledge from the state you loaded. And because StatefulModel inherits from Huggingface's PreTrainedModel, you can use it just like any other Huggingface model.
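For instance, here is a minimal generation sketch, assuming StatefulModel exposes the standard PreTrainedModel generation API; the tokenizer choice and prompt are placeholders rather than anything prescribed by statecraft:

```python
# Minimal sketch, assuming StatefulModel supports the standard
# PreTrainedModel generation workflow; the prompt is just a placeholder.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-130m-hf")

# `model` is the StatefulModel created in the snippet above.
input_ids = tokenizer("Explain what a state space model is.", return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```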
To install:

pip install statecraft

Then:
- Use model.build_state() to generate a new state from your context.
- Use model.save_state() or model.save_current_state to save your state to a file so that you can use it again in the future.
- With model.load_state() you can load a state either from a file or from the community repository.
- Use statecraft.list_states() to see the states available in the community repository.

If you'd like to pass in a .state directly instead of context, watch this space 👀
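To tie these calls together, here is a hedged sketch of the full workflow; the exact argument forms (raw context text, file paths) are assumptions for illustration rather than documented signatures:

```python
# Hedged workflow sketch: the argument forms below (context text, paths)
# are assumptions, not documented signatures.
import statecraft
from statecraft import StatefulModel

model = StatefulModel.from_pretrained(
    model_name="state-spaces/mamba-130m-hf",
    initial_state_name="koayon/state-a",
)

# Run the model over your own context to produce a fresh state.
model.build_state("Full text of my domain-specific notes ...")

# Persist the current state so it can be reused later.
model.save_state("my_notes.state")

# Later (or on another machine), load a state from a file or the community repository.
model.load_state("my_notes.state")

# Browse the states the community has shared.
print(statecraft.list_states())
```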
Currently, we often use RAG to give a transformer contextual information.
With Mamba-like models, you could instead imagine having a library of states created by running the model over specialised data. States could be shared kinda like LoRAs for image models.
For example, I could do inference on 20 physics textbooks and, say, 100 physics questions and answers. Then I have a state which I can give to you. Now you don't need to add any few-shot examples; you simply ask your question. The in-context learning is in the state.
In other words, you can drag and drop downloaded states into your model, like literal plug-in cartridges. And note that “training” a state doesn't require any backprop. It's more like a highly specialised, one-pass, fixed-size compression algorithm. This is unlimited in-context learning applied at inference time, at zero additional compute or latency cost.
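As a sketch of that “plug-in cartridge” idea (the community state name below is hypothetical, and generation is assumed to follow the standard transformers pattern):

```python
# Sketch of the "plug-in cartridge" idea. The state name below is hypothetical;
# generation is assumed to follow the standard transformers API.
from transformers import AutoTokenizer
from statecraft import StatefulModel

model = StatefulModel.from_pretrained(
    model_name="state-spaces/mamba-130m-hf",
    initial_state_name="someuser/physics-textbooks",  # hypothetical shared state
)
tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-130m-hf")

# No few-shot examples and no retrieved context: just the question.
input_ids = tokenizer("Why is the sky blue?", return_tensors="pt").input_ids
output_ids = model.generate(input_ids, max_new_tokens=60)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```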
The structure of an effective Transformer LLM call is roughly: retrieved (RAG) context + few-shot examples + your question. With statecraft and your SSM, we instead simply have: a pre-built state + your question.
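A toy comparison of the two call shapes (all strings are placeholders):

```python
# Toy comparison of the two call structures; every string is a placeholder.
rag_context = "<retrieved passages>"
few_shot_examples = "<worked Q&A examples>"
user_question = "Why is the sky blue?"

# Transformer-style call: context and examples are re-packed into every prompt.
transformer_prompt = "\n\n".join([rag_context, few_shot_examples, user_question])

# statecraft + SSM: the context and examples are already compressed into the
# loaded state, so the prompt reduces to the question itself.
ssm_prompt = user_question
```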
For more details see here
We welcome contributions to the statecraft library!
And of course, please contribute your SSM states with statecraft.upload_state(...) 🪄
Proudly powering the SSM revolution.