@promptbook/remote-client - npm Package Compare versions

Comparing version 0.68.0-1 to 0.68.0-3

esm/index.es.js

@@ -8,3 +8,3 @@ import { io } from 'socket.io-client';

*/
- var PROMPTBOOK_VERSION = '0.68.0-0';
+ var PROMPTBOOK_VERSION = '0.68.0-2';
// TODO: !!!! List here all the versions and annotate + put into script

@@ -11,0 +11,0 @@

import { COMMANDS } from '../../index';
/**
- * Command is one piece of the prompt template which adds some logic to the prompt template or the whole pipeline.
+ * Command is one piece of the template which adds some logic to the template or the whole pipeline.
* It is parsed from the markdown from ul/ol items - one command per one item.
*/
export type Command = ReturnType<typeof COMMANDS[number]['parse']>;
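The `Command` type in the hunk above is derived from the `parse` methods of the `COMMANDS` array. A minimal TypeScript sketch of that derivation pattern, with hypothetical commands standing in for the package's real ones:

```typescript
// Hypothetical stand-ins for the package's COMMANDS array, used only to show how
// `ReturnType<typeof COMMANDS[number]['parse']>` becomes a union of parsed command objects.
const COMMANDS = [
    { name: 'EXPECT', parse: (raw: string) => ({ type: 'EXPECT' as const, raw }) },
    { name: 'JOKER', parse: (raw: string) => ({ type: 'JOKER' as const, raw }) },
] as const;

// `(typeof COMMANDS)[number]` is the union of both entries; indexing with ['parse']
// and applying ReturnType yields the union of everything the parsers can return.
type Command = ReturnType<(typeof COMMANDS)[number]['parse']>;
// => { type: 'EXPECT'; raw: string } | { type: 'JOKER'; raw: string }

const parsed: Command = COMMANDS[0].parse('EXPECT MAX 10 WORDS');
console.log(parsed.type); // 'EXPECT'
```
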
import type { ExpectationAmount } from '../../types/PipelineJson/Expectations';
import type { ExpectationUnit } from '../../types/PipelineJson/Expectations';
/**
- * Expect amount command describes the desired output of the prompt template (after post-processing)
+ * Expect amount command describes the desired output of the template (after post-processing)
* It can set limits for the maximum/minimum length of the output, measured in characters, words, sentences, paragraphs,...
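The comment above describes length expectations measured in characters, words, sentences or paragraphs. A rough sketch of how such an expectation could be represented and checked; the shapes of `ExpectationUnit` and `ExpectationAmount` below are assumptions, not the package's definitions:

```typescript
// Assumed shapes, loosely modelled on the ExpectationUnit / ExpectationAmount imports
// above; these are NOT the package's actual type definitions.
type ExpectationUnit = 'characters' | 'words' | 'sentences' | 'paragraphs';
type ExpectationAmount = number;
type Expectations = Partial<Record<ExpectationUnit, { min?: ExpectationAmount; max?: ExpectationAmount }>>;

// "The output should be between 10 and 50 words long."
const expectations: Expectations = { words: { min: 10, max: 50 } };

// Check just the word-count expectation against a produced output.
function meetsWordExpectation(output: string, e: Expectations): boolean {
    const words = output.trim().split(/\s+/).filter(Boolean).length;
    const { min = 0, max = Infinity } = e.words ?? {};
    return words >= min && words <= max;
}

console.log(meetsWordExpectation('Only three words', expectations)); // false
```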

@@ -6,0 +6,0 @@ *

@@ -9,3 +9,3 @@ import type { PipelineJson } from '../../types/PipelineJson/PipelineJson';

/**
- * Callback for creating from prompt template graph node
+ * Callback for creating from template graph node
*/

@@ -12,0 +12,0 @@ linkTemplate?(template: TemplateJson): {

import type { TemplateJson } from '../../types/PipelineJson/TemplateJson';
import type { string_parameter_name } from '../../types/typeAliases';
/**
- * Parses the prompt template and returns the set of all used parameters
+ * Parses the template and returns the set of all used parameters
*

@@ -6,0 +6,0 @@ * @param template the template with used parameters
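The function documented above parses a template and returns the set of parameters it uses. A simplified sketch of that extraction, assuming the `{parameterName}` placeholder syntax of promptbook templates; the function below is illustrative, not the package's implementation:

```typescript
// Illustrative sketch only: collect `{parameterName}` placeholders from a template string.
function extractParameterNames(templateContent: string): Set<string> {
    const names = new Set<string>();
    for (const match of templateContent.matchAll(/\{(\w+)\}/g)) {
        names.add(match[1]);
    }
    return names;
}

console.log(extractParameterNames('Write a poem about {topic} in {language}.'));
// Set(2) { 'topic', 'language' }
```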

@@ -19,4 +19,4 @@ import type { PipelineJson } from '../../types/PipelineJson/PipelineJson';

/**
- * Function renameParameter will find all usable parameters for given prompt template
- * In other words, it will find all parameters that are not used in the prompt template itseld and all its dependencies
+ * Function `renameParameter` will find all usable parameters for given template
+ * In other words, it will find all parameters that are not used in the template itseld and all its dependencies
*

@@ -23,0 +23,0 @@ * @throws {PipelineLogicError} If the new parameter name is already used in the pipeline
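The hunk above documents `renameParameter`, which throws a `PipelineLogicError` when the new parameter name is already used in the pipeline. A hypothetical, simplified sketch of that behaviour; the signature, the template shape and the string-replacement strategy are all assumptions:

```typescript
// Hypothetical sketch mirroring the @throws note above; not the package's code.
class PipelineLogicError extends Error {}

interface SimpleTemplate {
    content: string;                    // template text with {parameter} placeholders
    dependentParameterNames: string[];  // parameters the template relies on
}

function renameParameter(templates: SimpleTemplate[], oldName: string, newName: string): SimpleTemplate[] {
    if (templates.some((t) => t.dependentParameterNames.includes(newName))) {
        // Refuse to rename onto a name that is already used in the pipeline.
        throw new PipelineLogicError(`Parameter {${newName}} is already used in the pipeline`);
    }
    return templates.map((t) => ({
        content: t.content.split(`{${oldName}}`).join(`{${newName}}`),
        dependentParameterNames: t.dependentParameterNames.map((n) => (n === oldName ? newName : n)),
    }));
}
```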

@@ -10,3 +10,3 @@ import type { PipelineJson } from '../../types/PipelineJson/PipelineJson';

*
- * @param path - The path to the file relative to samples/templates directory
+ * @param path - The path to the file relative to samples/pipelines directory
* @private internal function of tests

@@ -13,0 +13,0 @@ */

@@ -12,3 +12,3 @@ import type { PipelineJson } from '../types/PipelineJson/PipelineJson';

/**
- * @@@ Sequence of prompt templates that are chained together to form a pipeline
+ * @@@ Sequence of templates that are chained together to form a pipeline
*/

@@ -15,0 +15,0 @@ readonly templatesPrepared: Array<TemplateJson>;

@@ -38,5 +38,5 @@ import type { string_markdown_text } from '../typeAliases';

/**
- * Sequence of prompt templates in order which were executed
+ * Sequence of templates in order which were executed
*/
readonly promptExecutions: Array<ExecutionPromptReportJson>;
};

@@ -16,3 +16,3 @@ import type { ModelRequirements } from '../ModelRequirements';

* Promptbook is the **core concept of this package**.
- * It represents a series of prompt templates chained together to form a pipeline / one big prompt template with input and result parameters.
+ * It represents a series of templates chained together to form a pipeline / one big template with input and result parameters.
*

@@ -31,3 +31,3 @@ * Note: [🚉] This is fully serializable as JSON

* For example: https://promptbook.studio/webgpt/write-website-content-cs.ptbk.md@1.0.0
- * Warning: Do not hash part of the URL, hash part is used for identification of the prompt template in the pipeline
+ * Warning: Do not hash part of the URL, hash part is used for identification of the template in the pipeline
*/

@@ -62,3 +62,3 @@ readonly pipelineUrl?: string_pipeline_url;

/**
- * Sequence of prompt templates that are chained together to form a pipeline
+ * Sequence of templates that are chained together to form a pipeline
*/

@@ -65,0 +65,0 @@ readonly templates: Array<TemplateJson>;
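The hunks above describe a pipeline as a sequence of templates, identified by an optional `pipelineUrl` (without a hash part) and carrying a `templates` array. A minimal hand-written object in that spirit, using only fields visible in this diff; the concrete values and the simplified typing are made up:

```typescript
// Hand-written illustration of the pipeline shape hinted at above; only fields that
// appear in this diff are used, and types are deliberately simplified.
interface MinimalPipeline {
    pipelineUrl?: string; // no hash part; the hash identifies a template within the pipeline
    templates: Array<{ name: string; resultingParameterName: string }>;
}

const pipeline: MinimalPipeline = {
    pipelineUrl: 'https://promptbook.studio/webgpt/write-website-content-cs.ptbk.md@1.0.0',
    templates: [
        { name: 'WriteDraft', resultingParameterName: 'draft' },
        { name: 'PolishDraft', resultingParameterName: 'websiteContent' },
    ],
};

// The templates form one chained "big template": each step's resulting parameter
// can feed the steps after it.
console.log(pipeline.templates.map((t) => t.resultingParameterName)); // [ 'draft', 'websiteContent' ]
```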

@@ -13,3 +13,3 @@ import type { BlockType } from '../../commands/BLOCK/BlockTypes';

/**
- * Common properties of all prompt templates
+ * Common properties of all templates
*/

@@ -21,7 +21,7 @@ export interface TemplateJsonCommon {

* - It should start uppercase and can contain letters and numbers
- * - The pipelineUrl together with hash and name are used to identify the prompt template in the pipeline
+ * - The pipelineUrl together with hash and name are used to identify the template in the pipeline
*/
readonly name: string_name;
/**
- * Title of the prompt template
+ * Title of the template
* It can use simple markdown formatting like **bold**, *italic*, [link](https://example.com), ... BUT not code blocks and structure

@@ -31,3 +31,3 @@ */

/**
- * Description of the prompt template
+ * Description of the template
* It can use multiple paragraphs of simple markdown formatting like **bold**, *italic*, [link](https://example.com), ... BUT not code blocks and structure

@@ -37,3 +37,3 @@ */

/**
- * List of parameter names that are used in the prompt template and must be defined before the prompt template is executed
+ * List of parameter names that are used in the template and must be defined before the template is executed
*

@@ -44,3 +44,3 @@ * Note: Joker is one of the dependent parameters

/**
- * If theese parameters meet the expectations requirements, they are used instead of executing this prompt template
+ * If theese parameters meet the expectations requirements, they are used instead of executing this template
*

@@ -52,3 +52,3 @@ * @see https://github.com/webgptorg/promptbook/discussions/66

* Type of the execution
- * This determines if the prompt template is send to LLM, user or some scripting evaluation
+ * This determines if the template is send to LLM, user or some scripting evaluation
*/

@@ -71,3 +71,3 @@ readonly blockType: BlockType;

/**
- * List of postprocessing steps that are executed after the prompt template
+ * List of postprocessing steps that are executed after the template
*

@@ -95,3 +95,3 @@ * @see https://github.com/webgptorg/promptbook/discussions/31

/**
- * Name of the parameter that is the result of the prompt template
+ * Name of the parameter that is the result of the template
*/

@@ -98,0 +98,0 @@ readonly resultingParameterName: string_name;
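The hunks above list the common properties of a template: `name`, title, description, dependent and joker parameters, `blockType`, postprocessing steps and `resultingParameterName`. A hand-written example object that follows those doc comments; the exact property names, types and values below are assumptions based only on what this diff shows:

```typescript
// Example object for the documented fields; property names follow the doc comments above
// where possible, but the exact names, types and the blockType value are assumptions.
const exampleTemplate = {
    name: 'SummarizeArticle',                                     // starts uppercase, letters and numbers
    title: 'Summarize the article',                               // simple markdown, no code blocks
    description: 'Produces a **short** summary of the given article text.',
    dependentParameterNames: ['articleText', 'existingSummary'],  // must be defined before execution; jokers are included here
    jokerParameterNames: ['existingSummary'],                     // used instead of execution if they meet the expectations
    blockType: 'PROMPT_TEMPLATE',                                 // assumed value; decides LLM vs user vs script execution
    postprocessing: ['trim'],                                     // assumed step name; steps run after the template
    resultingParameterName: 'summary',
};

console.log(exampleTemplate.resultingParameterName); // 'summary'
```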

/**
* Promptbook is the **core concept of this package**.
- * It represents a series of prompt templates chained together to form a pipeline / one big prompt template with input and result parameters.
+ * It represents a series of templates chained together to form a pipeline / one big template with input and result parameters.
*

@@ -5,0 +5,0 @@ * @see @@@ https://github.com/webgptorg/promptbook#promptbook

{
"name": "@promptbook/remote-client",
"version": "0.68.0-1",
"version": "0.68.0-3",
"description": "Supercharge your use of large language models",

@@ -50,3 +50,3 @@ "private": false,

"peerDependencies": {
"@promptbook/core": "0.68.0-1"
"@promptbook/core": "0.68.0-3"
},

@@ -53,0 +53,0 @@ "dependencies": {

@@ -287,3 +287,3 @@ <!-- ⚠️ WARNING: This code has been generated so that any manual changes will be overwritten -->

- - [More template samples](./samples/templates/)
+ - [More template samples](./samples/pipelines/)
- [Read more about `.ptbk.md` file format here](https://github.com/webgptorg/promptbook/discussions/categories/concepts?discussions_q=is%3Aopen+label%3A.ptbk.md+category%3AConcepts)

@@ -405,3 +405,3 @@

- Different levels of abstraction. OpenAI library is for direct use of OpenAI API. This library is for a higher level of abstraction. It is for creating prompt templates and promptbooks that are independent of the underlying library, LLM model, or even LLM provider.
+ Different levels of abstraction. OpenAI library is for direct use of OpenAI API. This library is for a higher level of abstraction. It define pipelines that are independent of the underlying library, LLM model, or even LLM provider.

@@ -408,0 +408,0 @@ ### How is it different from the Langchain library?

@@ -15,3 +15,3 @@ (function (global, factory) {

*/
- var PROMPTBOOK_VERSION = '0.68.0-0';
+ var PROMPTBOOK_VERSION = '0.68.0-2';
// TODO: !!!! List here all the versions and annotate + put into script

@@ -18,0 +18,0 @@
