@xenova/transformers
Comparing version 2.5.0 to 2.5.1
 {
   "name": "@xenova/transformers",
-  "version": "2.5.0",
+  "version": "2.5.1",
   "description": "State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!",
@@ -5,0 +5,0 @@ "main": "./src/transformers.js",
@@ -101,3 +101,3 @@
 <script type="module">
-  import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.0';
+  import { pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.1';
 </script>
@@ -116,2 +116,3 @@ ```
 | Code Playground | In-browser code completion website | [link](./examples/code-completion/) |
+| Semantic Image Search | Search for images with text (Next.js + Supabase) | [link](./examples/semantic-image-search/) |
 | React | Multilingual translation website | [link](./examples/react-translator/) |
@@ -129,3 +130,3 @@ | Browser extension | Text classification extension | [link](./examples/extension/) |
-By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.0/dist/), which should work out-of-the-box. You can customize this as follows:
+By default, Transformers.js uses [hosted pretrained models](https://huggingface.co/models) and [precompiled WASM binaries](https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.1/dist/), which should work out-of-the-box. You can customize this as follows:
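The customization hooks referenced above live on the library's `env` object. A minimal sketch, assuming the documented `env` settings (`allowRemoteModels`, `localModelPath`, and the ONNX WASM path override); the filesystem paths shown are placeholders:

```javascript
// Sketch of customizing model and WASM sources via the Transformers.js
// env object. The paths below are hypothetical placeholders.
import { env } from '@xenova/transformers';

// Serve models from your own server instead of the Hugging Face Hub.
env.allowRemoteModels = false;
env.localModelPath = '/path/to/models/';

// Point the ONNX runtime at self-hosted WASM binaries instead of the CDN.
env.backends.onnx.wasm.wasmPaths = '/path/to/wasm/';
```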
@@ -263,2 +264,4 @@
 1. **[CodeGen](https://huggingface.co/docs/transformers/model_doc/codegen)** (from Salesforce) released with the paper [A Conversational Paradigm for Program Synthesis](https://arxiv.org/abs/2203.13474) by Erik Nijkamp, Bo Pang, Hiroaki Hayashi, Lifu Tu, Huan Wang, Yingbo Zhou, Silvio Savarese, Caiming Xiong.
+1. **[DeBERTa](https://huggingface.co/docs/transformers/model_doc/deberta)** (from Microsoft) released with the paper [DeBERTa: Decoding-enhanced BERT with Disentangled Attention](https://arxiv.org/abs/2006.03654) by Pengcheng He, Xiaodong Liu, Jianfeng Gao, Weizhu Chen.
+1. **[DeBERTa-v2](https://huggingface.co/docs/transformers/model_doc/deberta-v2)** (from Microsoft) released with the paper [DeBERTa: Decoding-enhanced BERT with Disentangled Attention](https://arxiv.org/abs/2006.03654) by Pengcheng He, Xiaodong Liu, Jianfeng Gao, Weizhu Chen.
 1. **[DETR](https://huggingface.co/docs/transformers/model_doc/detr)** (from Facebook) released with the paper [End-to-End Object Detection with Transformers](https://arxiv.org/abs/2005.12872) by Nicolas Carion, Francisco Massa, Gabriel Synnaeve, Nicolas Usunier, Alexander Kirillov, Sergey Zagoruyko.
@@ -265,0 +268,0 @@ 1. **[DistilBERT](https://huggingface.co/docs/transformers/model_doc/distilbert)** (from HuggingFace), released together with the paper [DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter](https://arxiv.org/abs/1910.01108) by Victor Sanh, Lysandre Debut and Thomas Wolf. The same method has been applied to compress GPT2 into [DistilGPT2](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation), RoBERTa into [DistilRoBERTa](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation), Multilingual BERT into [DistilmBERT](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation) and a German version of DistilBERT.
@@ -32,3 +32,3 @@ /**
-const VERSION = '2.5.0';
+const VERSION = '2.5.1';
@@ -35,0 +35,0 @@ // Check if various APIs are available (depends on environment)
@@ -463,4 +463,4 @@ /**
  * let text = 'I enjoy walking with my cute dog,';
- * let generator = await pipeline('text-generation', 'Xenova/distilgpt2');
- * let output = await generator(text);
+ * let classifier = await pipeline('text-generation', 'Xenova/distilgpt2');
+ * let output = await classifier(text);
  * console.log(output);
@@ -473,4 +473,4 @@ * // [{ generated_text: "I enjoy walking with my cute dog, and I love to play with the other dogs." }]
  * let text = 'Once upon a time, there was';
- * let generator = await pipeline('text-generation', 'Xenova/distilgpt2');
- * let output = await generator(text, {
+ * let classifier = await pipeline('text-generation', 'Xenova/distilgpt2');
+ * let output = await classifier(text, {
  * temperature: 2,
@@ -494,4 +494,4 @@ * max_new_tokens: 10,
  * let text = 'def fib(n):';
- * let generator = await pipeline('text-generation', 'Xenova/codegen-350M-mono');
- * let output = await generator(text, {
+ * let classifier = await pipeline('text-generation', 'Xenova/codegen-350M-mono');
+ * let output = await classifier(text, {
  * max_new_tokens: 40,
@@ -556,2 +556,31 @@ * });
  * can be chosen at runtime. It usually means it's slower but it is **much** more flexible.
+ *
+ * **Example:** Zero shot classification with `Xenova/mobilebert-uncased-mnli`.
+ * ```javascript
+ * let text = 'Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app.';
+ * let labels = [ 'mobile', 'billing', 'website', 'account access' ];
+ * let classifier = await pipeline('zero-shot-classification', 'Xenova/mobilebert-uncased-mnli');
+ * let output = await classifier(text, labels);
+ * console.log(output);
+ * // {
+ * //   sequence: 'Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app.',
+ * //   labels: [ 'mobile', 'website', 'billing', 'account access' ],
+ * //   scores: [ 0.5562091040482018, 0.1843621307860853, 0.13942646639336376, 0.12000229877234923 ]
+ * // }
+ * ```
+ *
+ * **Example:** Zero shot classification with `Xenova/nli-deberta-v3-xsmall` (multi-label).
+ * ```javascript
+ * let text = 'I have a problem with my iphone that needs to be resolved asap!';
+ * let labels = [ 'urgent', 'not urgent', 'phone', 'tablet', 'computer' ];
+ * let classifier = await pipeline('zero-shot-classification', 'Xenova/nli-deberta-v3-xsmall');
+ * let output = await classifier(text, labels, { multi_label: true });
+ * console.log(output);
+ * // {
+ * //   sequence: 'I have a problem with my iphone that needs to be resolved asap!',
+ * //   labels: [ 'urgent', 'phone', 'computer', 'tablet', 'not urgent' ],
+ * //   scores: [ 0.9958870956360275, 0.9923963400697035, 0.002333537946160235, 0.0015134138567598765, 0.0010699384208377163 ]
+ * // }
+ * ```
+ *
  * @extends Pipeline
@@ -583,3 +612,3 @@ */
-    this.contradiction_id = this.label2id['contradiction'];
+    this.contradiction_id = this.label2id['contradiction'] ?? this.label2id['not_entailment'];
     if (this.contradiction_id === undefined) {
@@ -586,0 +615,0 @@ console.warn("Could not find 'contradiction' in label2id mapping. Using 0 as contradiction_id.");
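The fix above uses the nullish-coalescing operator (`??`): if the model's `label2id` mapping has no `'contradiction'` entry, `'not_entailment'` is tried next, and the surrounding code falls back to 0 with a warning when neither exists. Crucially, `??` only falls through on `null`/`undefined`, so a legitimate label id of 0 is preserved (unlike `||`). A standalone sketch of that lookup chain, using a hypothetical `resolveContradictionId` helper:

```javascript
// Hypothetical helper mirroring the fallback chain used when resolving
// the contradiction label id from a model's label2id mapping.
function resolveContradictionId(label2id) {
    // Try 'contradiction' first, then 'not_entailment' (used by some NLI models).
    let id = label2id['contradiction'] ?? label2id['not_entailment'];
    if (id === undefined) {
        console.warn("Could not find 'contradiction' in label2id mapping. Using 0 as contradiction_id.");
        id = 0;
    }
    return id;
}
```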
@@ -159,4 +159,4 @@
  * Adapted from https://stackoverflow.com/a/6768642/13989043
- * @param {...any} arrs Arrays to merge.
- * @returns The merged array.
+ * @param {Array[]} arrs Arrays to merge.
+ * @returns {Array} The merged array.
  */
@@ -163,0 +163,0 @@ export function mergeArrays(...arrs) {
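A minimal sketch consistent with the JSDoc above: `mergeArrays` takes any number of arrays via a rest parameter and flattens them into a single array (the `export` keyword is dropped here so the sketch is self-contained; the body is an assumption, not the library's actual implementation):

```javascript
// Merge any number of arrays into one flat array.
// Equivalent to [].concat(arr1, arr2, ...), per the Stack Overflow
// answer cited in the JSDoc.
function mergeArrays(...arrs) {
    return Array.prototype.concat.apply([], arrs);
}
```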
@@ -22,3 +22,3 @@ export namespace env {
 declare const __dirname: any;
-declare const VERSION: "2.5.0";
+declare const VERSION: "2.5.1";
 declare const localModelPath: any;
@@ -25,0 +25,0 @@ declare const FS_AVAILABLE: boolean;
@@ -162,4 +162,4 @@ /**
  * let text = 'I enjoy walking with my cute dog,';
- * let generator = await pipeline('text-generation', 'Xenova/distilgpt2');
- * let output = await generator(text);
+ * let classifier = await pipeline('text-generation', 'Xenova/distilgpt2');
+ * let output = await classifier(text);
  * console.log(output);
@@ -172,4 +172,4 @@ * // [{ generated_text: "I enjoy walking with my cute dog, and I love to play with the other dogs." }]
  * let text = 'Once upon a time, there was';
- * let generator = await pipeline('text-generation', 'Xenova/distilgpt2');
- * let output = await generator(text, {
+ * let classifier = await pipeline('text-generation', 'Xenova/distilgpt2');
+ * let output = await classifier(text, {
  * temperature: 2,
@@ -193,4 +193,4 @@ * max_new_tokens: 10,
  * let text = 'def fib(n):';
- * let generator = await pipeline('text-generation', 'Xenova/codegen-350M-mono');
- * let output = await generator(text, {
+ * let classifier = await pipeline('text-generation', 'Xenova/codegen-350M-mono');
+ * let output = await classifier(text, {
  * max_new_tokens: 40,
@@ -223,2 +223,31 @@ * });
  * can be chosen at runtime. It usually means it's slower but it is **much** more flexible.
+ *
+ * **Example:** Zero shot classification with `Xenova/mobilebert-uncased-mnli`.
+ * ```javascript
+ * let text = 'Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app.';
+ * let labels = [ 'mobile', 'billing', 'website', 'account access' ];
+ * let classifier = await pipeline('zero-shot-classification', 'Xenova/mobilebert-uncased-mnli');
+ * let output = await classifier(text, labels);
+ * console.log(output);
+ * // {
+ * //   sequence: 'Last week I upgraded my iOS version and ever since then my phone has been overheating whenever I use your app.',
+ * //   labels: [ 'mobile', 'website', 'billing', 'account access' ],
+ * //   scores: [ 0.5562091040482018, 0.1843621307860853, 0.13942646639336376, 0.12000229877234923 ]
+ * // }
+ * ```
+ *
+ * **Example:** Zero shot classification with `Xenova/nli-deberta-v3-xsmall` (multi-label).
+ * ```javascript
+ * let text = 'I have a problem with my iphone that needs to be resolved asap!';
+ * let labels = [ 'urgent', 'not urgent', 'phone', 'tablet', 'computer' ];
+ * let classifier = await pipeline('zero-shot-classification', 'Xenova/nli-deberta-v3-xsmall');
+ * let output = await classifier(text, labels, { multi_label: true });
+ * console.log(output);
+ * // {
+ * //   sequence: 'I have a problem with my iphone that needs to be resolved asap!',
+ * //   labels: [ 'urgent', 'phone', 'computer', 'tablet', 'not urgent' ],
+ * //   scores: [ 0.9958870956360275, 0.9923963400697035, 0.002333537946160235, 0.0015134138567598765, 0.0010699384208377163 ]
+ * // }
+ * ```
+ *
  * @extends Pipeline
@@ -225,0 +254,0 @@ */
@@ -40,12 +40,12 @@ declare const TokenizerModel_base: new () => {
  * @param {string[]} tokens The tokens to encode.
- * @returns {number[]} The encoded token IDs.
+ * @returns {string[]} The encoded token IDs.
  */
-_call(tokens: string[]): number[];
+_call(tokens: string[]): string[];
 /**
  * Encodes a list of tokens into a list of token IDs.
  * @param {string[]} tokens The tokens to encode.
- * @returns {number[]} The encoded token IDs.
+ * @returns {string[]} The encoded tokens.
  * @throws Will throw an error if not implemented in a subclass.
  */
-encode(tokens: string[]): number[];
+encode(tokens: string[]): string[];
 /**
@@ -207,2 +207,6 @@ * Converts a list of tokens into a list of token IDs.
 }
+export class DebertaTokenizer extends PreTrainedTokenizer {
+}
+export class DebertaV2Tokenizer extends PreTrainedTokenizer {
+}
 export class DistilBertTokenizer extends PreTrainedTokenizer {
@@ -391,2 +395,4 @@ }
 DistilBertTokenizer: typeof DistilBertTokenizer;
+DebertaTokenizer: typeof DebertaTokenizer;
+DebertaV2Tokenizer: typeof DebertaV2Tokenizer;
 BertTokenizer: typeof BertTokenizer;
@@ -430,2 +436,24 @@ MobileBertTokenizer: typeof MobileBertTokenizer;
 export type PretrainedOptions = import('./utils/hub.js').PretrainedOptions;
+export type BPENode = {
+    /**
+     * The token associated with the node
+     */
+    token: string;
+    /**
+     * A positional bias for the node.
+     */
+    bias: number;
+    /**
+     * The score of the node.
+     */
+    score?: number;
+    /**
+     * The previous node in the linked list.
+     */
+    prev?: BPENode;
+    /**
+     * The next node in the linked list.
+     */
+    next?: BPENode;
+};
 export type SplitDelimiterBehavior = 'removed' | 'isolated' | 'mergedWithPrevious' | 'mergedWithNext' | 'contiguous';
@@ -432,0 +460,0 @@ declare const Normalizer_base: new () => {
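The new `BPENode` type describes a doubly-linked list of tokens, the structure typically walked during byte-pair-encoding merge loops. A hypothetical sketch of building and traversing such a list (the `makeList`/`listTokens` helpers and the `bias` formula are illustrative assumptions, not library code):

```javascript
// Build a doubly-linked list of BPENode-shaped objects from a token array.
// Each node carries its token, a positional bias, and prev/next links.
function makeList(tokens) {
    let head = null;
    let prev = null;
    for (let i = 0; i < tokens.length; ++i) {
        const node = { token: tokens[i], bias: i / tokens.length, prev, next: null };
        if (prev) prev.next = node;
        else head = node;
        prev = node;
    }
    return head;
}

// Collect tokens by walking the list from the head via .next links.
function listTokens(head) {
    const out = [];
    for (let node = head; node; node = node.next) out.push(node.token);
    return out;
}
```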
@@ -78,6 +78,6 @@ /**
  * Adapted from https://stackoverflow.com/a/6768642/13989043
- * @param {...any} arrs Arrays to merge.
- * @returns The merged array.
+ * @param {Array[]} arrs Arrays to merge.
+ * @returns {Array} The merged array.
  */
-export function mergeArrays(...arrs: any[]): any;
+export function mergeArrays(...arrs: any[][]): any[];
 /**
@@ -84,0 +84,0 @@ * A base class for creating callable objects.
License Policy Violation
License: This package is not allowed per your license policy. Review the package's license to ensure compliance.
Found 1 instance in 1 package.