
axial-positional-embedding
A type of positional embedding that is very effective when working with attention networks on multi-dimensional data, or for language models in general.
$ pip install axial-positional-embedding
import torch
from axial_positional_embedding import AxialPositionalEmbedding

pos_emb = AxialPositionalEmbedding(
    dim = 512,
    axial_shape = (64, 64),   # the axial shape multiplies up to the maximum sequence length allowed (64 * 64 = 4096)
    axial_dims = (256, 256)   # if not specified, each axial embedding defaults to 'dim' dimensions and they are summed at the end; if specified, each axial embedding gets its own dimension and they are concatenated, so the specified dimensions must sum to `dim` (256 + 256 = 512)
)

tokens = torch.randn(1, 1024, 512)  # token embeddings of shape (batch, seq_len, dim)
tokens = pos_emb(tokens) + tokens   # add the positional embedding to the token embeddings
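
To see why this factorization saves parameters, here is a minimal from-scratch sketch of the idea (the NaiveAxialPositionalEmbedding class and its internals are hypothetical, for illustration only, and not this library's actual implementation): each flat position is decomposed into a (row, column) coordinate, each axis keeps its own small learned table, and the per-axis embeddings are concatenated. With the settings above this costs 64 * 256 + 64 * 256 = 32,768 parameters, versus 4096 * 512 = 2,097,152 for one dense table covering every position.

import torch
import torch.nn as nn

class NaiveAxialPositionalEmbedding(nn.Module):
    # hypothetical 2-axis sketch, not the library's implementation
    def __init__(self, dim, axial_shape, axial_dims):
        super().__init__()
        assert sum(axial_dims) == dim, 'axial_dims must sum to dim'
        self.axial_shape = axial_shape
        # one small learned table per axis, e.g. (64, 256) each
        self.tables = nn.ParameterList([
            nn.Parameter(torch.randn(size, d) * 0.02)
            for size, d in zip(axial_shape, axial_dims)
        ])

    def forward(self, x):
        batch, seq_len, _ = x.shape
        rows, cols = self.axial_shape
        pos = torch.arange(seq_len, device = x.device)
        coords = (pos // cols, pos % cols)  # flat position -> (row, col)
        # per-axis lookup, then concatenate: (seq, 256) and (seq, 256) -> (seq, 512)
        emb = torch.cat([t[c] for t, c in zip(self.tables, coords)], dim = -1)
        return emb.unsqueeze(0).expand(batch, -1, -1)

pos_emb = NaiveAxialPositionalEmbedding(512, (64, 64), (256, 256))
tokens = torch.randn(1, 1024, 512)
tokens = pos_emb(tokens) + tokens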
A continuous version with better extrapolation ability (each axis is parameterized by a 2-layer MLP):
import torch
from axial_positional_embedding import ContinuousAxialPositionalEmbedding

pos_emb = ContinuousAxialPositionalEmbedding(
    dim = 512,
    num_axial_dims = 3
)

tokens = torch.randn(1, 8, 16, 32, 512)  # say, a video with 8 frames at 16 x 32 spatial resolution
axial_pos_emb = pos_emb((8, 16, 32))     # pass in the axial sizes from above
tokens = axial_pos_emb + tokens          # add the positional embedding to the token embeddings
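
For intuition, here is a minimal sketch of the continuous variant under stated assumptions (the NaiveContinuousAxialPositionalEmbedding class, the SiLU MLP, and the normalized-coordinate scheme are hypothetical details, not this library's actual implementation): each axis runs its coordinates through a small 2-layer MLP, and the per-axis embeddings are summed with broadcasting. Because the MLPs accept arbitrary coordinates rather than indexing a fixed table, the module can be queried at sizes other than those seen during training.

import torch
import torch.nn as nn

class NaiveContinuousAxialPositionalEmbedding(nn.Module):
    # hypothetical sketch, not the library's implementation
    def __init__(self, dim, num_axial_dims, hidden = 256):
        super().__init__()
        # one small 2-layer MLP per axis: scalar coordinate in [0, 1] -> (dim,) embedding
        self.mlps = nn.ModuleList([
            nn.Sequential(nn.Linear(1, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_axial_dims)
        ])

    def forward(self, shape):
        embs = []
        for size, mlp in zip(shape, self.mlps):
            coords = torch.linspace(0., 1., size).unsqueeze(-1)  # (size, 1)
            embs.append(mlp(coords))                             # (size, dim)
        # sum the per-axis embeddings with broadcasting to shape (*shape, dim)
        out = 0.
        n = len(shape)
        for i, e in enumerate(embs):
            view = [1] * n + [-1]
            view[i] = shape[i]
            out = out + e.view(*view)
        return out  # broadcasts against (batch, *shape, dim) token tensors

pos_emb = NaiveContinuousAxialPositionalEmbedding(512, 3)
tokens = torch.randn(1, 8, 16, 32, 512)
tokens = pos_emb((8, 16, 32)) + tokens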
Citations

@inproceedings{kitaev2020reformer,
    title     = {Reformer: The Efficient Transformer},
    author    = {Nikita Kitaev and Lukasz Kaiser and Anselm Levskaya},
    booktitle = {International Conference on Learning Representations},
    year      = {2020},
    url       = {https://openreview.net/forum?id=rkgNKkHtvB}
}

@misc{ho2019axial,
    title         = {Axial Attention in Multidimensional Transformers},
    author        = {Jonathan Ho and Nal Kalchbrenner and Dirk Weissenborn and Tim Salimans},
    year          = {2019},
    eprint        = {1912.12180},
    archivePrefix = {arXiv}
}