✨ Promptbook


🌟 New Features

  • 📂 We have a plugin for VSCode that supports the .book file extension
  • 🐳 Docker image available
  • 💫 Support for OpenAI's o3-mini model
  • 🐋 Support for DeepSeek models

📦 Package @promptbook/utils

To install this package, run:

# Install entire promptbook ecosystem
npm i ptbk

# Install just this package to save space
npm install @promptbook/utils

Utility functions used in the library, but also useful on their own for pre- and post-processing of LLM inputs and outputs.

Here is an overview of the functions exported from the @promptbook/utils package that you can use in your own projects:

Simple templating

The prompt template tag function helps format prompt strings for LLM interactions. It handles string interpolation, maintains consistent formatting for multiline strings and lists, and also helps guard against prompt injection.

import { prompt } from '@promptbook/utils';

const promptString = prompt`
    Correct the following sentence:

    > ${unsecureUserInput}
`;

The name prompt may be overloaded by multiple things in your code. In that case, you can use promptTemplate, which is an alias for prompt:

import { promptTemplate } from '@promptbook/utils';

const promptString = promptTemplate`
    Correct the following sentence:

    > ${unsecureUserInput}
`;

Advanced templating

The templateParameters function replaces the parameters in a given template and is optimized for LLM prompt templates.

import { templateParameters } from '@promptbook/utils';

templateParameters('Hello, {name}!', { name: 'world' }); // 'Hello, world!'

It also works with multiline templates and blockquotes:

import { templateParameters, spaceTrim } from '@promptbook/utils';

templateParameters(
    spaceTrim(`
        Hello, {name}!

        > {answer}
    `),
    {
        name: 'world',
        answer: spaceTrim(`
            I'm fine,
            thank you!

            And you?
        `),
    },
);

// Hello, world!
//
// > I'm fine,
// > thank you!
// >
// > And you?

Counting

These functions are useful for counting stats about the input/output in human terms (not tokens and bytes). You can use countCharacters, countLines, countPages, countParagraphs, countSentences and countWords:

import { countWords } from '@promptbook/utils';

console.log(countWords('Hello, world!')); // 2
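
The other counting functions follow the same pattern. Here is a minimal sketch; the values in the comments are assumptions about how characters and sentences are counted, so check the package documentation for the exact behavior:

import { countCharacters, countSentences } from '@promptbook/utils';

console.log(countCharacters('Hello, world!')); // presumably 13
console.log(countSentences('Hello, world! How are you?')); // presumably 2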

Splitting

Splitting functions are similar to the counting functions, but they return the split parts of the input/output. You can use splitIntoCharacters, splitIntoLines, splitIntoPages, splitIntoParagraphs, splitIntoSentences and splitIntoWords:

import { splitIntoWords } from '@promptbook/utils';

console.log(splitIntoWords('Hello, world!')); // ['Hello', 'world']
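
The other splitting functions work the same way. A minimal sketch follows; the exact splitting rules (for example, whether punctuation is kept) are an assumption here:

import { splitIntoSentences } from '@promptbook/utils';

console.log(splitIntoSentences('Hello, world! How are you?'));
// presumably ['Hello, world!', 'How are you?']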

Normalization

Normalization functions put a string into a normalized form. You can use kebab-case, PascalCase, SCREAMING_CASE and snake_case:

import { normalizeTo } from '@promptbook/utils';

console.log(normalizeTo['kebab-case']('Hello, world!')); // 'hello-world'
  • There are more normalization functions like capitalize, decapitalize, removeDiacritics,... (see the sketch below)
  • These can also be used as postprocessing functions in the POSTPROCESS command in Promptbook
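
A minimal sketch of the extra helpers mentioned above; the outputs in the comments are assumptions based on the usual meaning of these names, not verified behavior:

import { capitalize, removeDiacritics } from '@promptbook/utils';

console.log(capitalize('hello world')); // presumably 'Hello world'
console.log(removeDiacritics('Černá Hora')); // presumably 'Cerna Hora'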

Postprocessing

Sometimes you need to postprocess the output of the LLM model. Every postprocessing function available through the POSTPROCESS command in Promptbook is exported from @promptbook/utils.

Very often you will use unwrapResult, which extracts the result you need from an output that contains additional surrounding text:

import { unwrapResult } from '@promptbook/utils';

unwrapResult('Best greeting for the user is "Hi Pavol!"'); // 'Hi Pavol!'

Misc

See also the documentation for all the functions in the @promptbook/utils package; every function is documented with JSDoc, typed with TypeScript and tested with Jest. Among others, you can use the helpers listed below (a short usage sketch follows the list):

  • checkExpectations,
  • executionReportJsonToString,
  • isPassingExpectations,
  • isValidJsonString,
  • parseNumber
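
For example, a couple of these helpers might be used like this; the return values in the comments are assumptions, so check the generated documentation for the real signatures:

import { isValidJsonString, parseNumber } from '@promptbook/utils';

console.log(isValidJsonString('{"name": "world"}')); // presumably true
console.log(isValidJsonString('not json')); // presumably false
console.log(parseNumber('42')); // presumably 42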

The rest of the documentation is common to the entire Promptbook ecosystem:

๐Ÿค The Book Abstract

It's time for a paradigm shift! The future of software is in plain English, French or Latin.

During the computer revolution, we have seen multiple generations of computer languages, from the physical rewiring of vacuum tubes through low-level machine code to high-level languages like Python or JavaScript. And now, we're on the edge of the next revolution!

It's a revolution of writing software in plain human language that is understandable and executable by both humans and machines – and it's going to change everything!

The incredible growth in the power of microprocessors and Moore's Law have been the driving force behind ever more powerful languages, and it's been an amazing journey! Similarly, large language models (like GPT or Claude) are the next big thing in language technology, and they're set to transform the way we interact with computers.

This shift is going to happen whether we are ready for it or not. Our mission is to make it happen excellently, not just adequately.

Join us in this journey!

🚀 Get started

Take a look at the simple starter kit with books integrated into the Hello World sample applications:

💜 The Promptbook Project

The Promptbook project is an ecosystem of multiple projects and tools. The most important pieces are listed below:

  • Book language: Book is a human-understandable markup language for writing AI applications such as chatbots, knowledge bases, agents, avatars, translators, automations and more. There is also a plugin for VSCode to support the .book file extension.
  • Promptbook Engine: The Promptbook engine runs applications written in the Book language. It is released as multiple NPM packages and a Docker Hub image.
  • Promptbook Studio: Promptbook.studio is a web-based editor and runner for book applications. It is still in the experimental MVP stage.

We also have a community of developers and users of Promptbook:

And Promptbook.studio branded socials:

And the Promptujeme sub-brand:

/Sub-brand for Czech clients/

And Promptbook.city branded socials:

/Sub-brand for images and graphics generated via Promptbook prompting/

💙 The Book language

Following is the documentation and blueprint of the Book language.

Example

# 🌟 My first Book

-   BOOK VERSION 1.0.0
-   URL https://promptbook.studio/my-first-book/

# Write an article

-   PERSONA Jane, marketing specialist with prior experience in writing articles about technology and artificial intelligence
-   KNOWLEDGE https://ptbk.io
-   KNOWLEDGE ./promptbook.pdf
-   EXPECT MIN 1 Sentence
-   EXPECT MAX 1 Paragraph

> Write an article about the future of artificial intelligence in the next 10 years and how metalanguages will change the way AI is used in the world.
> Look specifically at the impact of Promptbook on the AI industry.

-> {article}

What: Workflows, Tasks and Parameters

Who: Personas

How: Knowledge, Instruments and Actions

General principles of the Book language

Book language is based on Markdown; it is a subset of Markdown. It is designed to be easy to read and write, and to be understandable by both humans and machines without specific knowledge of the language.

Book files use the .book extension and UTF-8 encoding without BOM.

Book has two variants: flat, which is just a prompt with no structure, and full, which has a structure with tasks, commands and prompts.

As it is source code, it can leverage all the features of version control systems like git and does not suffer from the problems of binary formats, proprietary formats, or no-code solutions.

But unlike programming languages, it is designed to be understandable by non-programmers and non-technical people.

📦 Packages (for developers)

This library is divided into several packages, all published from a single monorepo. You can install all of them at once:

npm i ptbk

Or you can install them separately:

โญ Marked packages are worth to try first

📚 Dictionary


The following glossary is used to clarify certain concepts:

General LLM / AI terms

  • Prompt drift is a phenomenon where the AI model starts to generate outputs that are not aligned with the original prompt. This can happen due to the model's training data, the prompt's wording, or the model's architecture.
  • Pipeline, workflow or chain is a sequence of tasks that are executed in a specific order. In the context of AI, a pipeline can refer to a sequence of AI models that are used to process data.
  • Fine-tuning is a process where a pre-trained AI model is further trained on a specific dataset to improve its performance on a specific task.
  • Zero-shot learning is a machine learning paradigm where a model is trained to perform a task without any labeled examples. Instead, the model is provided with a description of the task and is expected to generate the correct output.
  • Few-shot learning is a machine learning paradigm where a model is trained to perform a task with only a few labeled examples. This is in contrast to traditional machine learning, where models are trained on large datasets.
  • Meta-learning is a machine learning paradigm where a model is trained on a variety of tasks and is able to learn new tasks with minimal additional training. This is achieved by learning a set of meta-parameters that can be quickly adapted to new tasks.
  • Retrieval-augmented generation is a machine learning paradigm where a model generates text by retrieving relevant information from a large database of text. This approach combines the benefits of generative models and retrieval models.
  • Longtail refers to non-common or rare events, items, or entities that are not well-represented in the training data of machine learning models. Longtail items are often challenging for models to predict accurately.

Note: This section is not a complete dictionary, but rather a list of general AI / LLM terms that are connected to Promptbook.

Promptbook core

  • Organization (legacy name: collection) groups jobs, workforce, knowledge, instruments, and actions into one package. Entities in one organization can share resources (= import resources from each other).
    • Jobs
      • Task
      • Subtask
    • Workforce
      • Persona
      • Team
      • Role
    • Knowledge
      • Public
      • Private
      • Protected
    • Instruments
    • Actions

Book language

  • Book file
    • Section
      • Heading
      • Description
      • Command
      • Block
      • Return statement
    • Comment
    • Import
    • Scope

💯 Core concepts

Advanced concepts

Terms specific to Promptbook TypeScript implementation

  • Anonymous mode
  • Application mode

🔌 Usage in TypeScript / JavaScript

➕➖ When to use Promptbook?

➕ When to use

  • When you are writing an app that generates complex things via LLM, like websites, articles, presentations, code, stories, songs,...
  • When you want to separate code from text prompts
  • When you want to describe complex prompt pipelines and don't want to do it in the code
  • When you want to orchestrate multiple prompts together
  • When you want to reuse parts of prompts in multiple places
  • When you want to version your prompts and test multiple versions
  • When you want to log the execution of prompts and backtrace the issues

See more

➖ When not to use

  • When you have already implemented a single simple prompt and it works fine for your job
  • When OpenAI Assistant (GPTs) is enough for you
  • When you need streaming (this may be implemented in the future, see discussion).
  • When you need to use something other than JavaScript or TypeScript (other languages are on the way, see the discussion)
  • When your main focus is on something other than text - like images, audio, video, spreadsheets (other media types may be added in the future, see discussion)
  • When you need to use recursion (see the discussion)

See more

๐Ÿœ Known issues

๐Ÿงผ Intentionally not implemented features

โ” FAQ

If you have a question start a discussion, open an issue or write me an email.

⌚ Changelog

See CHANGELOG.md

📜 License

The Promptbook project is licensed under BUSL-1.1, which is an SPDX license.

🎯 Todos

See TODO.md

๐Ÿค Partners

๐Ÿ–‹๏ธ Contributing

We are open to pull requests, feedback, and suggestions.

You can also ⭐ star the project, follow us on GitHub or various other social networks.
