@xenova/transformers - npm Package Compare versions

Comparing version 2.17.1 to 2.17.2


package.json
{
  "name": "@xenova/transformers",
- "version": "2.17.1",
+ "version": "2.17.2",
  "description": "State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!",

@@ -77,4 +77,3 @@ "main": "./src/transformers.js",

  "sharp": false,
- "onnxruntime-node": false,
- "stream/web": false
+ "onnxruntime-node": false
  },

@@ -81,0 +80,0 @@ "publishConfig": {


README.md

@@ -104,3 +104,3 @@

  <script type="module">
-   import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.1';
+   import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.2';
  </script>

@@ -138,3 +138,3 @@ ```

- By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.1/dist/), which should work out-of-the-box. You can customize this as follows:
+ By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models?library=transformers.js) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.17.2/dist/), which should work out-of-the-box. You can customize this as follows:
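The customization that sentence refers to goes through the library's exported `env` object. A minimal sketch, assuming the documented `env` fields of Transformers.js v2 (the paths here are placeholders, not values from this diff):

```javascript
import { env } from '@xenova/transformers';

// Load models from your own server instead of the Hugging Face Hub.
env.allowRemoteModels = false;
env.localModelPath = '/models/';             // placeholder path

// Serve the ONNX Runtime WASM binaries yourself instead of the jsDelivr CDN.
env.backends.onnx.wasm.wasmPaths = '/dist/'; // placeholder path
```

These settings must be applied before the first `pipeline(...)` call, since model and WASM resolution happens at load time.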

@@ -303,2 +303,3 @@

  1. **[Falcon](https://huggingface.co/docs/transformers/model_doc/falcon)** (from Technology Innovation Institute) by Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme.
+ 1. **FastViT** (from Apple) released with the paper [FastViT: A Fast Hybrid Vision Transformer using Structural Reparameterization](https://arxiv.org/abs/2303.14189) by Pavan Kumar Anasosalu Vasu, James Gabriel, Jeff Zhu, Oncel Tuzel and Anurag Ranjan.
  1. **[FLAN-T5](https://huggingface.co/docs/transformers/model_doc/flan-t5)** (from Google AI) released in the repository [google-research/t5x](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints) by Hyung Won Chung, Le Hou, Shayne Longpre, Barret Zoph, Yi Tay, William Fedus, Eric Li, Xuezhi Wang, Mostafa Dehghani, Siddhartha Brahma, Albert Webson, Shixiang Shane Gu, Zhuyun Dai, Mirac Suzgun, Xinyun Chen, Aakanksha Chowdhery, Sharan Narang, Gaurav Mishra, Adams Yu, Vincent Zhao, Yanping Huang, Andrew Dai, Hongkun Yu, Slav Petrov, Ed H. Chi, Jeff Dean, Jacob Devlin, Adam Roberts, Denny Zhou, Quoc V. Le, and Jason Wei

@@ -324,2 +325,3 @@ 1. **[GLPN](https://huggingface.co/docs/transformers/model_doc/glpn)** (from KAIST) released with the paper [Global-Local Path Networks for Monocular Depth Estimation with Vertical CutDepth](https://arxiv.org/abs/2201.07436) by Doyeon Kim, Woonghyun Ga, Pyungwhan Ahn, Donggyu Joo, Sehwan Chun, Junmo Kim.

  1. **[MobileViT](https://huggingface.co/docs/transformers/model_doc/mobilevit)** (from Apple) released with the paper [MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer](https://arxiv.org/abs/2110.02178) by Sachin Mehta and Mohammad Rastegari.
+ 1. **[MobileViTV2](https://huggingface.co/docs/transformers/model_doc/mobilevitv2)** (from Apple) released with the paper [Separable Self-attention for Mobile Vision Transformers](https://arxiv.org/abs/2206.02680) by Sachin Mehta and Mohammad Rastegari.
  1. **[MPNet](https://huggingface.co/docs/transformers/model_doc/mpnet)** (from Microsoft Research) released with the paper [MPNet: Masked and Permuted Pre-training for Language Understanding](https://arxiv.org/abs/2004.09297) by Kaitao Song, Xu Tan, Tao Qin, Jianfeng Lu, Tie-Yan Liu.

@@ -326,0 +328,0 @@ 1. **[MPT](https://huggingface.co/docs/transformers/model_doc/mpt)** (from MosaicML) released with the repository [llm-foundry](https://github.com/mosaicml/llm-foundry/) by the MosaicML NLP Team.

@@ -32,3 +32,3 @@ /**

- const VERSION = '2.17.1';
+ const VERSION = '2.17.2';

@@ -35,0 +35,0 @@ // Check if various APIs are available (depends on environment)

@@ -476,2 +476,9 @@ /**

+ if (power === null && mel_filters !== null) {
+     throw new Error(
+         "You have provided `mel_filters` but `power` is `None`. Mel spectrogram computation is not yet supported for complex-valued spectrogram. " +
+         "Specify `power` to fix this issue."
+     );
+ }
  if (center) {

@@ -551,4 +558,2 @@ if (pad_mode !== 'reflect') {

- // TODO what should happen if power is None?
- // https://github.com/huggingface/transformers/issues/27772
  if (power !== null && power !== 2) {

@@ -555,0 +560,0 @@ // slight optimization to not sqrt
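The "slight optimization to not sqrt" in the hunk header above can be illustrated standalone: since |X|^p = (re² + im²)^(p/2), the common p = 2 case needs neither a square root nor `Math.pow`. A hypothetical sketch, not the library's actual code:

```javascript
// Raise the magnitude of a complex spectrogram bin to `power`,
// skipping sqrt/pow entirely in the power === 2 case.
function binPower(re, im, power) {
    const magSq = re * re + im * im; // |X|^2, no sqrt needed
    return power === 2 ? magSq : Math.pow(magSq, power / 2);
}
```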

@@ -10,3 +10,2 @@

  import path from 'path';
- import stream from 'stream/web';

@@ -16,7 +15,2 @@ import { env } from '../env.js';

- if (!globalThis.ReadableStream) {
-     // @ts-ignore
-     globalThis.ReadableStream = stream.ReadableStream; // ReadableStream is not a global with Node 16
- }
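The block removed above was a Node 16 workaround: in Node.js ≥ 18 (and modern browsers) `ReadableStream` is already a global, which is why both the polyfill and the `stream/web` import could be dropped. A quick capability check, as a sketch rather than library code:

```javascript
// ReadableStream became a global in Node.js 18; earlier versions needed
// the 'stream/web' polyfill that this release removes.
const hasNativeReadableStream = typeof globalThis.ReadableStream === 'function';
```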
/**

@@ -156,9 +150,9 @@ * @typedef {Object} PretrainedOptions Options for loading a pretrained model.

  /**
-  * Determines whether the given string is a valid HTTP or HTTPS URL.
-  * @param {string|URL} string The string to test for validity as an HTTP or HTTPS URL.
+  * Determines whether the given string is a valid URL.
+  * @param {string|URL} string The string to test for validity as an URL.
+  * @param {string[]} [protocols=null] A list of valid protocols. If specified, the protocol must be in this list.
   * @param {string[]} [validHosts=null] A list of valid hostnames. If specified, the URL's hostname must be in this list.
-  * @returns {boolean} True if the string is a valid HTTP or HTTPS URL, false otherwise.
+  * @returns {boolean} True if the string is a valid URL, false otherwise.
   */
- function isValidHttpUrl(string, validHosts = null) {
  // https://stackoverflow.com/a/43467144
+ function isValidUrl(string, protocols = null, validHosts = null) {
      let url;

@@ -170,6 +164,9 @@ try {

      }
+     if (protocols && !protocols.includes(url.protocol)) {
+         return false;
+     }
      if (validHosts && !validHosts.includes(url.hostname)) {
          return false;
      }
-     return url.protocol === "http:" || url.protocol === "https:";
+     return true;
  }
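Pieced together from the hunks above, the renamed helper behaves roughly like this self-contained sketch (reconstructed from the diff; the shipped implementation may differ in details):

```javascript
// Reconstructed sketch of the new isValidUrl: optional protocol allow-list,
// optional hostname allow-list; any parse failure means "not a URL".
function isValidUrl(string, protocols = null, validHosts = null) {
    let url;
    try {
        url = new URL(string);
    } catch {
        return false;
    }
    if (protocols && !protocols.includes(url.protocol)) {
        return false;
    }
    if (validHosts && !validHosts.includes(url.hostname)) {
        return false;
    }
    return true;
}
```

With this shape, a call like `isValidUrl(urlOrPath, ['http:', 'https:', 'blob:'])` now accepts `blob:` URLs, which the old HTTP/HTTPS-only check rejected.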

@@ -185,3 +182,3 @@

- if (env.useFS && !isValidHttpUrl(urlOrPath)) {
+ if (env.useFS && !isValidUrl(urlOrPath, ['http:', 'https:', 'blob:'])) {
return new FileResponse(urlOrPath);

@@ -197,3 +194,3 @@

// Check whether we are making a request to the Hugging Face Hub.
- const isHFURL = isValidHttpUrl(urlOrPath, ['huggingface.co', 'hf.co']);
+ const isHFURL = isValidUrl(urlOrPath, ['http:', 'https:'], ['huggingface.co', 'hf.co']);
if (isHFURL) {

@@ -442,3 +439,3 @@ // If an access token is present in the environment variables,

// If request is a valid HTTP URL, we skip the local file check. Otherwise, we try to get the file locally.
- const isURL = isValidHttpUrl(requestURL);
+ const isURL = isValidUrl(requestURL, ['http:', 'https:']);
if (!isURL) {

@@ -445,0 +442,0 @@ try {

@@ -42,3 +42,3 @@

- let { data, info } = await img.raw().toBuffer({ resolveWithObject: true });
+ let { data, info } = await img.rotate().raw().toBuffer({ resolveWithObject: true });

@@ -45,0 +45,0 @@ const newImage = new RawImage(new Uint8ClampedArray(data), info.width, info.height, info.channels);

@@ -368,16 +368,2 @@

- /**
-  * Completes the spectrum by adding its mirrored negative frequency components.
-  * @param {Float64Array} spectrum The input spectrum.
-  * @returns {void}
-  */
- completeSpectrum(spectrum) {
-     const size = this._csize;
-     const half = size >>> 1;
-     for (let i = 2; i < half; i += 2) {
-         spectrum[size - i] = spectrum[i];
-         spectrum[size - i + 1] = -spectrum[i + 1];
-     }
- }
/**
* Performs a Fast Fourier Transform (FFT) on the given input data and stores the result in the output buffer.

@@ -470,2 +456,3 @@ *

  // Loop through steps in decreasing order
+ const table = this.table;
  for (step >>= 2; step >= 2; step >>= 2) {

@@ -495,14 +482,14 @@ len = (size / step) << 1;

- const tableBr = this.table[k];
- const tableBi = inv * this.table[k + 1];
+ const tableBr = table[k];
+ const tableBi = inv * table[k + 1];
  const MBr = Br * tableBr - Bi * tableBi;
  const MBi = Br * tableBi + Bi * tableBr;
- const tableCr = this.table[2 * k];
- const tableCi = inv * this.table[2 * k + 1];
+ const tableCr = table[2 * k];
+ const tableCi = inv * table[2 * k + 1];
  const MCr = Cr * tableCr - Ci * tableCi;
  const MCi = Cr * tableCi + Ci * tableCr;
- const tableDr = this.table[3 * k];
- const tableDi = inv * this.table[3 * k + 1];
+ const tableDr = table[3 * k];
+ const tableDi = inv * table[3 * k + 1];
const MDr = Dr * tableDr - Di * tableDi;

@@ -640,14 +627,14 @@ const MDi = Dr * tableDi + Di * tableDr;

- // TODO: Optimize once https://github.com/indutny/fft.js/issues/25 is fixed
  // Loop through steps in decreasing order
+ const table = this.table;
  for (step >>= 2; step >= 2; step >>= 2) {
      len = (size / step) << 1;
-     const quarterLen = len >>> 2;
+     const halfLen = len >>> 1;
+     const quarterLen = halfLen >>> 1;
+     const hquarterLen = quarterLen >>> 1;
      // Loop through offsets in the data
      for (outOff = 0; outOff < size; outOff += len) {
-         // Full case
-         const limit = outOff + quarterLen - 1;
-         for (let i = outOff, k = 0; i < limit; i += 2, k += step) {
-             const A = i;
+         for (let i = 0, k = 0; i <= hquarterLen; i += 2, k += step) {
+             const A = outOff + i;
              const B = A + quarterLen;

@@ -667,14 +654,18 @@ const C = B + quarterLen;

- const tableBr = this.table[k];
- const tableBi = inv * this.table[k + 1];
+ // Middle values
+ const MAr = Ar;
+ const MAi = Ai;
+ const tableBr = table[k];
+ const tableBi = inv * table[k + 1];
  const MBr = Br * tableBr - Bi * tableBi;
  const MBi = Br * tableBi + Bi * tableBr;
- const tableCr = this.table[2 * k];
- const tableCi = inv * this.table[2 * k + 1];
+ const tableCr = table[2 * k];
+ const tableCi = inv * table[2 * k + 1];
  const MCr = Cr * tableCr - Ci * tableCi;
  const MCi = Cr * tableCi + Ci * tableCr;
- const tableDr = this.table[3 * k];
- const tableDi = inv * this.table[3 * k + 1];
+ const tableDr = table[3 * k];
+ const tableDi = inv * table[3 * k + 1];
const MDr = Dr * tableDr - Di * tableDi;

@@ -684,6 +675,6 @@ const MDi = Dr * tableDi + Di * tableDr;

  // Pre-Final values
- const T0r = Ar + MCr;
- const T0i = Ai + MCi;
- const T1r = Ar - MCr;
- const T1i = Ai - MCi;
+ const T0r = MAr + MCr;
+ const T0i = MAi + MCi;
+ const T1r = MAr - MCr;
+ const T1i = MAi - MCi;
  const T2r = MBr + MDr;

@@ -699,9 +690,31 @@ const T2i = MBi + MDi;

  out[B + 1] = T1i - T3r;
- out[C] = T0r - T2r;
- out[C + 1] = T0i - T2i;
- out[D] = T1r - T3i;
- out[D + 1] = T1i + T3r;
+ // Output final middle point
+ if (i === 0) {
+     out[C] = T0r - T2r;
+     out[C + 1] = T0i - T2i;
+     continue;
+ }
+ // Do not overwrite ourselves
+ if (i === hquarterLen)
+     continue;
+ const SA = outOff + quarterLen - i;
+ const SB = outOff + halfLen - i;
+ out[SA] = T1r - inv * T3i;
+ out[SA + 1] = -T1i - inv * T3r;
+ out[SB] = T0r - inv * T2r;
+ out[SB + 1] = -T0i + inv * T2i;
  }
  }
  }
+ // Complete the spectrum by adding its mirrored negative frequency components.
+ const half = size >>> 1;
+ for (let i = 2; i < half; i += 2) {
+     out[size - i] = out[i];
+     out[size - i + 1] = -out[i + 1];
+ }
}
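The mirroring loop that replaces the removed `completeSpectrum` method exploits the conjugate symmetry of a real signal's FFT: X[N − k] = conj(X[k]). On an interleaved [re, im, re, im, …] buffer, that step looks like this standalone sketch:

```javascript
// Fill the upper half of an interleaved complex spectrum from the lower half,
// using the conjugate symmetry X[N - k] = conj(X[k]) of a real signal's FFT.
function completeSpectrum(out) {
    const size = out.length;  // 2 * N entries for N complex bins
    const half = size >>> 1;
    for (let i = 2; i < half; i += 2) {
        out[size - i] = out[i];          // real part copied
        out[size - i + 1] = -out[i + 1]; // imaginary part negated
    }
}
```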

@@ -708,0 +721,0 @@

@@ -22,3 +22,3 @@ export namespace env {

declare const __dirname: any;
- declare const VERSION: "2.17.1";
+ declare const VERSION: "2.17.2";
declare const localModelPath: any;

@@ -25,0 +25,0 @@ declare const FS_AVAILABLE: boolean;

@@ -43,2 +43,4 @@ declare const FeatureExtractor_base: new () => {

  * @param {number|Object} config.size The size to resize the image to.
+ * @param {boolean} [config.do_flip_channel_order=false] Whether to flip the color channels from RGB to BGR.
+ * Can be overridden by the `do_flip_channel_order` parameter in the `preprocess` method.
  */
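The new `do_flip_channel_order` option swaps RGB to BGR, an order expected by some models ported from OpenCV-based pipelines. On interleaved pixel data, the operation is a per-pixel reversal of the channel triplet (an illustrative sketch, not the library's implementation):

```javascript
// Swap the R and B channels of interleaved RGB pixel data in place,
// yielding BGR order.
function flipChannelOrder(data, channels = 3) {
    for (let i = 0; i < data.length; i += channels) {
        const tmp = data[i];    // R
        data[i] = data[i + 2];  // R <- B
        data[i + 2] = tmp;      // B <- R
    }
    return data;
}
```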

@@ -54,2 +56,3 @@ constructor(config: {

  size: number | any;
+ do_flip_channel_order?: boolean;
  });

@@ -72,2 +75,3 @@ image_mean: any;

  do_pad: any;
+ do_flip_channel_order: any;
  /**

@@ -144,3 +148,3 @@ * Resize the image to make a thumbnail. The image is resized so that no dimension is larger than any

*/
- preprocess(image: RawImage, { do_normalize, do_pad, do_convert_rgb, do_convert_grayscale, }?: any): Promise<{
+ preprocess(image: RawImage, { do_normalize, do_pad, do_convert_rgb, do_convert_grayscale, do_flip_channel_order, }?: any): Promise<{
/**

@@ -216,2 +220,4 @@ * The original size of the image.

  }
+ export class MobileViTImageProcessor extends MobileViTFeatureExtractor {
+ }
  export class OwlViTFeatureExtractor extends ImageFeatureExtractor {

@@ -681,2 +687,3 @@ /**

  MobileViTFeatureExtractor: typeof MobileViTFeatureExtractor;
+ MobileViTImageProcessor: typeof MobileViTImageProcessor;
  OwlViTFeatureExtractor: typeof OwlViTFeatureExtractor;

@@ -683,0 +690,0 @@ Owlv2ImageProcessor: typeof Owlv2ImageProcessor;

@@ -146,5 +146,6 @@ declare const TokenizerModel_base: new () => {

  * @param {boolean} [options.return_tensor=true] Whether to return the results as Tensors or arrays.
+ * @param {boolean} [options.return_token_type_ids=null] Whether to return the token type ids.
  * @returns {BatchEncoding} Object to be passed to the model.
  */
- _call(text: string | string[], { text_pair, add_special_tokens, padding, truncation, max_length, return_tensor, }?: {
+ _call(text: string | string[], { text_pair, add_special_tokens, padding, truncation, max_length, return_tensor, return_token_type_ids, }?: {
      text_pair?: string | string[];

@@ -156,2 +157,3 @@ padding?: boolean | 'max_length';

  return_tensor?: boolean;
+ return_token_type_ids?: boolean;
  }): {

@@ -185,2 +187,3 @@ /**

  * @param {boolean} [options.add_special_tokens=true] Whether or not to add the special tokens associated with the corresponding model.
+ * @param {boolean} [options.return_token_type_ids=null] Whether to return token_type_ids.
  * @returns {EncodingSingle} An object containing the encoded text.

@@ -197,6 +200,8 @@ * @private

  * @param {boolean} [options.add_special_tokens=true] Whether or not to add the special tokens associated with the corresponding model.
+ * @param {boolean} [options.return_token_type_ids=null] Whether to return token_type_ids.
  * @returns {number[]} An array of token IDs representing the encoded text(s).
  */
- encode(text: string, text_pair?: string | null, { add_special_tokens, }?: {
+ encode(text: string, text_pair?: string | null, { add_special_tokens, return_token_type_ids, }?: {
      add_special_tokens?: boolean;
+     return_token_type_ids?: boolean;
  }): number[];
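The `token_type_ids` (segment ids) surfaced by this new option mark which of two input sequences each token belongs to, which BERT-style models require for sentence-pair tasks. A minimal illustration with a hypothetical helper, not the tokenizer's actual code:

```javascript
// Build segment ids for a two-sequence encoding:
// 0 for every token of the first sequence, 1 for every token of the second.
function buildTokenTypeIds(numTokensA, numTokensB) {
    return [
        ...new Array(numTokensA).fill(0), // first segment
        ...new Array(numTokensB).fill(1), // second segment
    ];
}
```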

@@ -253,3 +258,3 @@ /**

*
- * const tokenizer = await AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1");
+ * const tokenizer = await AutoTokenizer.from_pretrained("Xenova/mistral-tokenizer-v1");
*

@@ -256,0 +261,0 @@ * const chat = [

@@ -157,8 +157,2 @@ /**

- /**
-  * Completes the spectrum by adding its mirrored negative frequency components.
-  * @param {Float64Array} spectrum The input spectrum.
-  * @returns {void}
-  */
- completeSpectrum(spectrum: Float64Array): void;
/**
* Performs a Fast Fourier Transform (FFT) on the given input data and stores the result in the output buffer.

@@ -165,0 +159,0 @@ *

(Diffs of the remaining changed files are either too big to display or not yet supported by the diff viewer.)
