
axial-positional-embedding
A type of positional embedding that is very effective when working with attention networks on multi-dimensional data, or for language models in general.
$ pip install axial-positional-embedding
import torch
from axial_positional_embedding import AxialPositionalEmbedding

pos_emb = AxialPositionalEmbedding(
    dim = 512,
    axial_shape = (64, 64),   # the axial shape multiplies up to the maximum sequence length allowed (64 * 64 = 4096)
    axial_dims = (256, 256)   # if not specified, each axial embedding defaults to 'dim' dimensions and they are summed at the end; if specified, each axial embedding gets the given dimension and they are concatenated, so the dimensions must sum to 'dim' (256 + 256 = 512)
)

tokens = torch.randn(1, 1024, 512)   # assume these are the token embeddings
tokens = pos_emb(tokens) + tokens    # add positional embedding to token embeddings
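For intuition, here is a minimal from-scratch sketch of the factorization trick (illustrative only, not the package's internal code; the class and variable names are made up): a single table for 4096 positions at dim 512 would need 4096 * 512 ≈ 2.1M parameters, while two per-axis tables of shape (64, 256) need only about 33K.

# Minimal sketch of axial factorization (illustrative, not the package's internals):
# learn one small table per axis and concatenate along the feature dimension.
import torch
import torch.nn as nn

class NaiveAxialEmbedding(nn.Module):  # hypothetical class, for illustration only
    def __init__(self, axial_shape = (64, 64), axial_dims = (256, 256)):
        super().__init__()
        self.axial_shape = axial_shape
        # one learnable table per axis: two (64, 256) tables instead of one (4096, 512)
        self.tables = nn.ParameterList([
            nn.Parameter(torch.randn(n, d) * 0.02)
            for n, d in zip(axial_shape, axial_dims)
        ])

    def forward(self, seq_len):
        h, w = self.axial_shape
        rows = self.tables[0][:, None, :].expand(h, w, -1)          # (64, 64, 256)
        cols = self.tables[1][None, :, :].expand(h, w, -1)          # (64, 64, 256)
        emb = torch.cat((rows, cols), dim = -1).reshape(h * w, -1)  # (4096, 512)
        return emb[:seq_len]                                        # crop to the sequence length

pos = NaiveAxialEmbedding()(1024)   # (1024, 512), broadcasts over the batch dimension

The real AxialPositionalEmbedding handles the reshaping and cropping for you and also supports the summed variant when axial_dims is omitted; the sketch only shows where the parameter savings come from.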
A continuous version with better extrapolation ability (each axis parameterized by a 2-layer MLP):
import torch
from axial_positional_embedding import ContinuousAxialPositionalEmbedding

pos_emb = ContinuousAxialPositionalEmbedding(
    dim = 512,
    num_axial_dims = 3
)

tokens = torch.randn(1, 8, 16, 32, 512)   # say a video with 8 frames and 16 x 32 spatial dimensions

axial_pos_emb = pos_emb((8, 16, 32))      # pass in the sizes from above
tokens = axial_pos_emb + tokens           # add positional embedding to token embeddings
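To see why the continuous variant extrapolates better, here is a hedged sketch of the idea (illustrative; the class name and the summing of axes are assumptions, not the package's actual implementation): each axis gets its own small 2-layer MLP that maps a normalized scalar coordinate to an embedding, so the grid sizes passed at inference time can differ from those seen during training.

# Hedged sketch of a continuous axial positional embedding (illustrative only):
# each axis is parameterized by a 2-layer MLP over a normalized coordinate.
import torch
import torch.nn as nn

class NaiveContinuousAxialEmbedding(nn.Module):  # hypothetical class name
    def __init__(self, dim = 512, num_axial_dims = 3, hidden = 512):
        super().__init__()
        # one 2-layer MLP per axis, mapping a scalar position to a 'dim'-sized vector
        self.mlps = nn.ModuleList([
            nn.Sequential(nn.Linear(1, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_axial_dims)
        ])

    def forward(self, axial_shape):
        grid = None
        for i, (size, mlp) in enumerate(zip(axial_shape, self.mlps)):
            coords = torch.linspace(-1., 1., size).unsqueeze(-1)   # (size, 1) continuous positions
            emb = mlp(coords)                                      # (size, dim)
            # reshape so this axis broadcasts against all the others
            shape = [1] * len(axial_shape) + [emb.shape[-1]]
            shape[i] = size
            emb = emb.reshape(shape)
            grid = emb if grid is None else grid + emb             # summing the axes is an assumption here
        return grid

pos = NaiveContinuousAxialEmbedding()((8, 16, 32))   # (8, 16, 32, 512)
video = torch.randn(1, 8, 16, 32, 512)
video = pos + video                                  # broadcasts over the batch dimension

Because the MLPs take coordinates rather than indexing a fixed-size table, the same module can be queried at shapes it never saw during training, which is where the improved extrapolation comes from.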
Citations

@inproceedings{kitaev2020reformer,
    title     = {Reformer: The Efficient Transformer},
    author    = {Nikita Kitaev and Lukasz Kaiser and Anselm Levskaya},
    booktitle = {International Conference on Learning Representations},
    year      = {2020},
    url       = {https://openreview.net/forum?id=rkgNKkHtvB}
}

@misc{ho2019axial,
    title         = {Axial Attention in Multidimensional Transformers},
    author        = {Jonathan Ho and Nal Kalchbrenner and Dirk Weissenborn and Tim Salimans},
    year          = {2019},
    archivePrefix = {arXiv}
}
FAQs
We found that axial-positional-embedding demonstrated a healthy version release cadence and project activity because its last version was released less than a year ago. It has one open source maintainer collaborating on the project.