@isdk/ai-tool-prompt - npm Package Versions

All versions listed below were published by isdk.

0.0.8 (2024-06-11)

0.0.7 (2024-06-10)

0.0.6 (2024-06-10)

Bug Fixes

  • prompts: phi-3 bug (0dcdf12)

0.0.5 (2024-06-09)

0.0.4 (2024-06-09)

Bug Fixes

  • prompt: minor bug fixed (945c6ab)
  • use the first one when multiple versions match (6e37a85)

0.0.3 (2024-06-08)

Bug Fixes

0.0.2 (2024-06-08)

Features

  • add $getDefaultPrompt() method (15e161f)
  • add $getParameters to get LLM default parameters if any (7066928)
  • add all matched prompts (9418140)
  • add array support to registerYamlTag (644b1d2)
  • add config init (880729f)
  • add const AIPromptsName (ff19eb2)
  • add extends support for prompt (d884979)
  • add a prompt type filter to $getPrompt (139c47a)
  • add formatPrompt func (9006544)
  • add input, output to AIPromptSettings (6c07ed4)
  • add isFitForLLM method (b42af0a)
  • add priority support for prompt (6314dec)
  • add registerYamlTag func (f47990e)
  • AIPrompt: add isFitForLLM method to check whether the prompt is a good fit for the LLM (f49d303)
  • AIPrompts: add custom method getPrompt to get a better prompt for LLM (9d91a57)
  • AIPromptSettings: add optional contentType field (4495900)
  • export ConfigFile (91a8c8c)
  • export parseYaml (32b70aa)
  • more system prompt templates added (8549f34)
  • promptIsFitForLLM: the modelName argument can be prompt id now (36e5fbf)
  • prompts: $getDefaultPrompt can pass type now (a9363d4)
  • prompts: add codeqwen support to ChatML (a8f6601)
  • prompts: add Deepseek system prompts (355a50e)
  • prompts: add default prompt (5159ba3)
  • return all fits for LLM (2acaf91)
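
Taken together, these features describe a registry of prompt templates keyed to model names, with priority, an extends mechanism, fit checks (isFitForLLM, promptIsFitForLLM), and helpers such as $getDefaultPrompt, $getParameters and formatPrompt. The TypeScript sketch below is a self-contained model of that idea; every name in it (PromptSettings, pickPromptFor, the rule and template fields) is an illustrative assumption, not the package's documented API.

// Standalone model of a prompt registry with priority and model-name matching.
// All names here are illustrative assumptions, not the @isdk/ai-tool-prompt API.

interface PromptSettings {
  id: string
  // rule matched against the lowercased model name
  rule: RegExp
  // higher priority wins when several prompts match the same model
  priority?: number
  // prompt template; {content} is a placeholder filled by formatPrompt below
  template: string
  // optional default LLM parameters (cf. $getParameters)
  parameters?: Record<string, unknown>
}

const prompts: PromptSettings[] = [
  { id: 'chatml', rule: /qwen|codeqwen/, priority: 10,
    template: '<|im_start|>user\n{content}<|im_end|>\n<|im_start|>assistant\n' },
  { id: 'phi-3', rule: /phi-3/, priority: 10,
    template: '<|user|>\n{content}<|end|>\n<|assistant|>\n',
    parameters: { stop_words: ['<|end|>'] } },
  { id: 'default', rule: /.*/, priority: 0, template: '{content}' },
]

// cf. isFitForLLM: does this prompt match the given model name?
function isFitForLLM(p: PromptSettings, modelName: string): boolean {
  return p.rule.test(modelName.toLowerCase())
}

// cf. "add all matched prompts" / "return all fits for LLM": collect every match,
// sort by priority, and use the first one; the catch-all default always matches.
function pickPromptFor(modelName: string): PromptSettings {
  const fits = prompts
    .filter((p) => isFitForLLM(p, modelName))
    .sort((a, b) => (b.priority ?? 0) - (a.priority ?? 0))
  return fits[0]
}

// cf. formatPrompt: substitute the user content into the chosen template.
function formatPrompt(p: PromptSettings, content: string): string {
  return p.template.replace('{content}', content)
}

const chosen = pickPromptFor('Phi-3-mini-4k-instruct')
console.log(chosen.id)                      // "phi-3"
console.log(formatPrompt(chosen, 'Hello!')) // rendered prompt text
console.log(chosen.parameters)              // default LLM parameters, if any

The point hinted at by "add all matched prompts", "add priority support for prompt" and "return all fits for LLM" is that several templates may match one model name, so the matches are sorted by priority and the first one is used.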

Bug Fixes

  • add interactive mode support (54e00ec)
  • AIPromptsFunc: extends should concat array(messages) (4e6041b)
  • AIPromptsFunc: get should extends array(concat) too (f4454f0)
  • building commonjs needs import.meta.url shims (8a6dc4a)
  • build: export commonjs and esm (0936b7f)
  • cannot getPrompt by LLM (7e34c74)
  • ChatML: the gguf file built-in eot_token is wrong, so correct it here (260d54c)
  • default prompt template (367b858)
  • forget to add new options to AIPromptSchema (11bdf21)
  • OpenChat: use eot_token instead of end_of_turn (10bc4a0)
  • parameters rule matched name should always be lowercase (13c9772)
  • Phi-2: forget to add rule (d993921)
  • prompt: add stop words to Phi-3 (59f7920)
  • prompt: add stop_words for qwen system prompt template (a499d25)
  • prompt: add stop_words to llama3 (90ab21f)
  • prompt: for bagel (1ba0c0b)
  • promptIsFitForLLM returned a non-matching prompt (7a655b4)
  • prompt: use the openbuddy embedded system prompt template first (b16b423)
  • prompts: add interactive mode support (c2ac49e)
  • prompts: many changes (2aee0c7)
  • prompts: update phi-3 rule (75354f9)
  • prompt: with line break (abed8df)
  • ts: add_generation_prompt should be a boolean (71007a7)
  • ts: treat id as any (4469376)
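
Several of these fixes (the AIPromptsFunc entries) concern how extends merges a prompt with the one it extends: array fields such as messages should be concatenated rather than overwritten. A minimal TypeScript sketch of that merge rule follows; extendPrompt and the example field names are hypothetical, not the package's implementation.

// Hypothetical sketch: merge a child prompt over the prompt it extends.
// Array fields (e.g. messages) are concatenated; other child fields override.
type PromptRecord = Record<string, unknown>

function extendPrompt(parent: PromptRecord, child: PromptRecord): PromptRecord {
  const result: PromptRecord = { ...parent }
  for (const [key, value] of Object.entries(child)) {
    const base = parent[key]
    if (Array.isArray(base) && Array.isArray(value)) {
      result[key] = [...base, ...value] // concat arrays such as messages
    } else {
      result[key] = value               // otherwise the child value wins
    }
  }
  return result
}

const base = { messages: [{ role: 'system', content: 'You are helpful.' }] }
const child = { messages: [{ role: 'user', content: 'Hi' }], temperature: 0.2 }
console.log(extendPrompt(base, child)) // both messages kept, temperature added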