transformer-tricks

A collection of tricks to speed up LLMs; see our Transformer Tricks papers on arXiv.

  • Version 0.2.3 on PyPI

Setup

pip3 install transformer-tricks

Example

The example below converts SmolLM-135M to FlashNorm and measures the perplexity of both the original and the modified model.

import transformer_tricks as tt

# convert model and store the new model in ./SmolLM-135M_flashNorm_test
tt.flashify_repo('HuggingFaceTB/SmolLM-135M')

# run example inference of original and modified model
tt.hello_world('HuggingFaceTB/SmolLM-135M')
tt.hello_world('./SmolLM-135M_flashNorm_test')

# measure perplexity of original and modified model
tt.perplexity('HuggingFaceTB/SmolLM-135M', speedup=16)
tt.perplexity('./SmolLM-135M_flashNorm_test', speedup=16)

Results:

Once upon a time there was a curious little girl
Once upon a time there was a curious little girl
perplexity = 16.083
perplexity = 16.083

You can run the example in your browser by clicking on this notebook: Colab. Hit "cancel" when it says "Notebook does not have secret access", because we don't need an HF_TOKEN for SmolLM.
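For intuition, FlashNorm folds the RMSNorm gain into the weight matrix of the following linear layer, so the online normalization no longer needs its own weights. The sketch below is our own illustration of that idea in NumPy, not the package's internals; all names here are made up for the example:

```python
import numpy as np

def rmsnorm(x, g, eps=1e-6):
    """RMSNorm: divide x by its root-mean-square, then scale by the gain g."""
    return x / np.sqrt(np.mean(x**2, axis=-1, keepdims=True) + eps) * g

# toy dimensions: hidden size 4, output size 3
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 4))   # batch of activations
g = rng.normal(size=4)        # RMSNorm gain (one weight per hidden unit)
W = rng.normal(size=(3, 4))   # weights of the next linear layer

# original path: RMSNorm with gain, then the linear layer
y_orig = rmsnorm(x, g) @ W.T

# FlashNorm idea: fold the gain into the weights once, offline ...
W_merged = W * g              # scale each input column of W by g

# ... after which the online normalization is weightless
y_flash = rmsnorm(x, np.ones(4)) @ W_merged.T

print(np.allclose(y_orig, y_flash))  # the two paths are mathematically identical
```

Because multiplying by the gain commutes with the matrix product, the merged model computes exactly the same outputs while doing one fewer element-wise multiply per layer at inference time.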

Test FlashNorm

# setup
git clone https://github.com/OpenMachine-ai/transformer-tricks.git
cd transformer-tricks/python
pip3 install --quiet -r requirements.txt

# run tests
python3 flashNorm_test.py

Results:

Once upon a time there was a curious little girl
Once upon a time there was a curious little girl
Once upon a time there was a little girl named
Once upon a time there was a little girl named
perplexity = 16.083
perplexity = 16.083
perplexity = 12.086
perplexity = 12.086
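For context, the perplexity numbers above are the exponential of the mean per-token negative log-likelihood: lower means the model assigns higher probability to the reference text. A toy illustration of the formula (our sketch with made-up probabilities, not the library's exact procedure):

```python
import math

# hypothetical probabilities a model assigns to each correct next token
token_probs = [0.25, 0.5, 0.125, 0.25]

# mean negative log-likelihood over the sequence
nll = -sum(math.log(p) for p in token_probs) / len(token_probs)

# perplexity is exp of the mean NLL; here the product of probs is 2**-8
# over 4 tokens, so the geometric mean is 2**-2 and perplexity is exactly 4
perplexity = math.exp(nll)
print(perplexity)  # → 4.0
```

Since the conversion to FlashNorm is mathematically exact, the original and modified models produce identical perplexities, as the results above show.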

To run Llama and other models that require accepting a license agreement (SmolLM does not), you first have to log in with your Hugging Face token by typing:

huggingface-cli login

Contributing

Before making a change to this repo, please do the following:

  • Format your code by running autopep8 *.py, which uses the config in pyproject.toml.
  • Whenever you change transformer_tricks.py, publish a new version of the package as follows:
    • First, update the version number in pyproject.toml and in requirements.txt
    • Then, push the package to PyPI by typing ./push_pypi.sh
  • Whenever you modify flashNorm_example.py, generate the corresponding notebook as follows:
    jupytext --to ipynb flashNorm_example.py -o ../notebooks/flashNorm_example.ipynb
    

Notes on the Python package

  • Link to package here
  • Link to stats here
  • Source of this README file here

Please give us a ⭐ if you like this repo, thanks!
