@promptbook/browser - npm Package Versions

hejny published 0.94.0-6
hejny published 0.94.0-4
hejny published 0.94.0-2
hejny published 0.94.0-3
hejny published 0.94.0-1
hejny published 0.94.0-0
hejny published 0.93.0

Changelog

Source

0.93.0 (2025-05-14)

Enhance the presentation of the Promptbook

✨ First release mainly managed by AI

hejny published 0.92.0

Changelog

Source

0.92.0 (2025-05-13)

Models, migrations, and processing of big tables

  • Models are picked by description
  • During pipeline preparation, instead of picking a single model, all models relevant to the task are sorted by relevance
  • Implement real RAG over the knowledge
  • Remove "(boilerplate)" from model names
  • Sort model providers by relevance
  • Export the utility function filterModels from @promptbook/core
  • All OpenAI models contain a description
  • All Anthropic models contain a description
  • All DeepSeek models contain a description
  • All Google models contain a description
  • Fix remote server POST /login
  • Update and fix all status codes and responses in the OpenAPI spec
  • Migrate JSON.parse -> jsonParse (preparation for formats)
  • Migrate papaparse.parse -> csvParse (preparation for formats)
  • Rename FormatDefinition -> FormatParser
  • Limit the rate of requests to models
  • Autoheal \r in CsvFormatParser ~~CsvFormatDefinition~~
  • Add getIndexedDbStorage
  • Pipeline migrations
  • Add formfactor COMPLETION, which emulates the completion variant of a model
  • Add JSDoc annotations to all entities exported from any package
  • When processing more than 50 values, if many items pass but some fail, use "~" for the failed value and just log the error to the console
  • Fix OpenAI pricing
  • Fix LLM cache
  • Add title and promptbookVersion to ExecutionTask
  • Cache getLocalStorage, getSessionStorage and getIndexedDbStorage
  • Pass databaseName and storeName into getIndexedDbStorage (see the sketch after this list)
  • Fix AzureOpenAiExecutionTools
  • Add maxRequestsPerMinute to LLM provider boilerplate configurations
  • ✨ Auto-enhance model providers; try an autonomous agent working on Promptbook
  • ✨ Auto-fix grammar and typos
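
The storage items above can be illustrated with a short sketch. This is a hypothetical usage example, not the documented Promptbook API: the import path, the options-object shape of getIndexedDbStorage, and the zero-argument getLocalStorage call are all assumptions based only on the names that appear in this changelog.

```ts
// Hypothetical sketch only: import path and signatures are assumed from the
// changelog entry names, not taken from the actual Promptbook documentation.
import { getIndexedDbStorage, getLocalStorage } from '@promptbook/browser';

// The changelog says databaseName and storeName are now passed into
// getIndexedDbStorage; the options-object shape here is an assumption.
const cacheStorage = getIndexedDbStorage({
    databaseName: 'promptbook', // assumed example value
    storeName: 'llm-cache', // assumed example value
});

// Storage factories are cached per the changelog, so a second call with the
// same arguments is expected to return the same instance (caching key assumed).
const sameStorage = getIndexedDbStorage({
    databaseName: 'promptbook',
    storeName: 'llm-cache',
});
console.log(cacheStorage === sameStorage); // expected: true

// getLocalStorage and getSessionStorage are cached as well (zero-argument
// signature assumed here).
const localStorageWrapper = getLocalStorage();
```

Accepting databaseName and storeName lets callers keep separate IndexedDB stores (for example, one per pipeline), which is presumably the motivation for adding the parameters.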
hejny published 0.92.0-34
hejny published 0.92.0-33