@samchon/openapi


@samchon/openapi - npm Package Compare versions

Comparing version 1.0.0 to 1.0.1

lib/converters/HttpLlmConverter.mjs


lib/OpenApiV3_1.d.ts

@@ -187,3 +187,3 @@ /**

  interface IArray extends __ISignificant<"array"> {
-     items: IJsonSchema | IJsonSchema[];
+     items?: IJsonSchema | IJsonSchema[];
      prefixItems?: IJsonSchema[];

@@ -190,0 +190,0 @@ uniqueItems?: boolean;
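The one-line change above makes `items` optional in the emended array schema. A minimal sketch of why this matters, using hypothetical stand-in types rather than the package's real exports: JSON Schema 2020-12 (which OpenAPI v3.1 adopts) permits tuple typing through `prefixItems` alone, so an array schema that omits `items` entirely is valid and must type-check.

```typescript
// Stand-in types for illustration only (not the package's actual exports).
interface IJsonSchema {
  type?: string;
}
interface IArray {
  type: "array";
  items?: IJsonSchema | IJsonSchema[]; // optional, matching the diff above
  prefixItems?: IJsonSchema[];
  uniqueItems?: boolean;
}

// A tuple-style schema that omits `items` entirely: with the old required
// `items`, this assignment would not compile.
const pair: IArray = {
  type: "array",
  prefixItems: [{ type: "string" }, { type: "number" }],
};
console.log("items" in pair); // false
```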

@@ -8,3 +8,3 @@ import { OpenApi } from "../OpenApi";

  *
- * `IHttpLlmApplication` is a data structure representing collection of
+ * `IHttpLlmApplication` is a data structure representing a collection of
  * {@link IHttpLlmFunction LLM function calling schemas} composed from the

@@ -50,5 +50,5 @@ * {@link OpenApi.IDocument OpenAPI document} and its {@link OpenApi.IOperation operation}

  * By the way, there can be some parameters (or their nested properties) which must be
- * composed by human, not by LLM. File uploading feature or some sensitive information
+ * composed by Human, not by LLM. File uploading feature or some sensitive information
  * like secrety key (password) are the examples. In that case, you can separate the
- * function parameters to both LLM and human sides by configuring the
+ * function parameters to both LLM and Human sides by configuring the
  * {@link IHttpLlmApplication.IOptions.separate} property. The separated parameters are

@@ -64,3 +64,3 @@ * assigned to the {@link IHttpLlmFunction.separated} property.

  * Additionally, if you've configured {@link IHttpLlmApplication.IOptions.separate},
- * so that the parameters are separated to human and LLM sides, you can merge these
+ * so that the parameters are separated to Human and LLM sides, you can merge these
  * humand and LLM sides' parameters into one through {@link HttpLlm.mergeParameters}

@@ -171,18 +171,18 @@ * before the actual LLM function call execution.

  * When composing parameter arguments through LLM function call,
- * there can be a case that some parameters must be composed by human,
+ * there can be a case that some parameters must be composed by Human,
  * or LLM cannot understand the parameter. For example, if the
  * parameter type has configured
  * {@link ILlmSchema.IString.contentMediaType} which indicates file
- * uploading, it must be composed by human, not by LLM
+ * uploading, it must be composed by Human, not by LLM
  * (Large Language Model).
  *
  * In that case, if you configure this property with a function that
- * predicating whether the schema value must be composed by human or
+ * predicating whether the schema value must be composed by Human or
  * not, the parameters would be separated into two parts.
  *
  * - {@link IHttpLlmFunction.separated.llm}
- * - {@link IHttpLlmFunction.separated.human}
+ * - {@link IHttpLlmFunction.separated.Human}
  *
  * When writing the function, note that returning value `true` means
- * to be a human composing the value, and `false` means to LLM
+ * to be a Human composing the value, and `false` means to LLM
  * composing the value. Also, when predicating the schema, it would

@@ -192,3 +192,3 @@ * better to utilize the {@link LlmTypeChecker} features.

  * @param schema Schema to be separated.
- * @returns Whether the schema value must be composed by human or not.
+ * @returns Whether the schema value must be composed by Human or not.
  * @default null

@@ -195,0 +195,0 @@ */
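The separate-then-merge workflow described in the doc comments above can be sketched in plain TypeScript. This is a conceptual illustration with made-up types and helper names, not the library's actual implementation; it only mirrors the behavior attributed to `IHttpLlmApplication.IOptions.separate` and `HttpLlm.mergeParameters` in the text.

```typescript
// Hypothetical minimal schema shape for this sketch.
type Schema = { contentMediaType?: string };

// Per the docs above, returning `true` means a human must compose the
// value (e.g. file uploads flagged by contentMediaType), `false` means LLM.
const byHuman = (schema: Schema): boolean =>
  schema.contentMediaType !== undefined;

// Split parameter properties into LLM-side and human-side parts.
function separate(props: Record<string, Schema>) {
  const llm: Record<string, Schema> = {};
  const human: Record<string, Schema> = {};
  for (const [key, schema] of Object.entries(props))
    (byHuman(schema) ? human : llm)[key] = schema;
  return { llm, human };
}

// Merge argument values filled by both sides back into one call payload,
// as the docs say must happen before the actual function call execution.
function mergeParameters(
  llmArgs: Record<string, unknown>,
  humanArgs: Record<string, unknown>,
): Record<string, unknown> {
  return { ...llmArgs, ...humanArgs };
}

const { llm, human } = separate({
  title: {},
  attachment: { contentMediaType: "image/png" },
});
console.log(Object.keys(llm));   // ["title"]
console.log(Object.keys(human)); // ["attachment"]
```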

import { ILlmFunction } from "./ILlmFunction";
import { ILlmSchema } from "./ILlmSchema";
/**
* Application of LLM function calling.
*
* `ILlmApplication` is a data structure representing a collection of
* {@link ILlmFunction LLM function calling schemas}, composed from a native
* TypeScript class (or interface) type by the `typia.llm.application<App>()`
* function.
*
* By the way, the LLM function calling application composition, converting
* `ILlmApplication` instance from TypeScript interface (or class) type is not always
* successful. As LLM provider like OpenAI cannot understand the recursive reference
* type that is embodied by {@link OpenApi.IJsonSchema.IReference}, if there're some
* recursive types in the TypeScript interface (or class) type, the conversion would
* be failed.
*
* Also, there can be some parameters (or their nested properties) which must be
* composed by Human, not by LLM. File uploading feature or some sensitive information
* like secrety key (password) are the examples. In that case, you can separate the
* function parameters to both LLM and human sides by configuring the
* {@link ILlmApplication.IOptions.separate} property. The separated parameters are
* assigned to the {@link ILlmFunction.separated} property.
*
* For reference, when both LLM and Human filled parameter values to call, you can
* merge them by calling the {@link HttpLlm.mergeParameters} function. In other words,
* if you've configured the {@link ILlmApplication.IOptions.separate} property, you
* have to merge the separated parameters before the funtion call execution.
*
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export interface ILlmApplication<Schema extends ILlmSchema = ILlmSchema> {

@@ -4,0 +34,0 @@ /**
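The `ILlmApplication` doc above notes that conversion fails when the TypeScript type is recursive, because providers like OpenAI cannot resolve a `$ref` cycle. A sketch of that limitation, with hypothetical names (`ICategory`, `isRecursive`) and a simplified schema shape that is not the library's API:

```typescript
// Simplified schema union for illustration only.
type Schema =
  | { $ref: string }
  | { type: "object"; properties: Record<string, Schema> }
  | { type: "array"; items: Schema }
  | { type: "string" };

// ICategory holds an array of ICategory children: a reference cycle,
// the exact shape the doc comment says cannot be converted.
const components: Record<string, Schema> = {
  ICategory: {
    type: "object",
    properties: {
      name: { type: "string" },
      children: { type: "array", items: { $ref: "ICategory" } },
    },
  },
};

// Detect whether resolving `name` ever references itself, directly or not.
function isRecursive(name: string, visited: Set<string> = new Set()): boolean {
  if (visited.has(name)) return true;
  visited.add(name);
  const walk = (s: Schema): boolean => {
    if ("$ref" in s) return isRecursive(s.$ref, visited);
    if (s.type === "object") return Object.values(s.properties).some(walk);
    if (s.type === "array") return walk(s.items);
    return false;
  };
  return walk(components[name]);
}

console.log(isRecursive("ICategory")); // true
```

A converter facing such a cycle has no finite expansion to emit, which is why the doc comment says the composition "would be failed" rather than producing a partial schema.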

  {
    "name": "@samchon/openapi",
-   "version": "1.0.0",
+   "version": "1.0.1",
    "description": "OpenAPI definitions and converters for 'typia' and 'nestia'.",

@@ -56,3 +56,3 @@ "main": "./lib/index.js",

    "ts-patch": "^3.2.1",
-   "typescript": "^5.5.3",
+   "typescript": "5.5.4",
    "typescript-transform-paths": "^3.4.7",

@@ -62,2 +62,3 @@ "typia": "^6.9.0",

    },
+   "sideEffects": false,
    "files": [

@@ -64,0 +65,0 @@ "lib",

@@ -9,3 +9,3 @@ # `@samchon/openapi`

  end
- subgraph "Ecosystem"
+ subgraph "OpenAPI Generator"
  emended --normalizes--> migration[["Migration Schema"]]

@@ -21,3 +21,3 @@ migration --"Artificial Intelligence"--> lfc{{"LLM Function Calling Application"}}

- OpenAPI definitions, converters and utillity functions.
+ OpenAPI definitions, converters and LLM function calling application composer.

@@ -198,3 +198,3 @@ `@samchon/openapi` is a collection of OpenAPI types for every versions, and converters for them. In the OpenAPI types, there is an "emended" OpenAPI v3.1 specification, which has removed ambiguous and duplicated expressions for the clarity. Every conversions are based on the emended OpenAPI v3.1 specification.

  end
- subgraph "Ecosystem"
+ subgraph "OpenAPI Generator"
  emended --normalizes--> migration[["Migration Schema"]]

@@ -201,0 +201,0 @@ migration --"Artificial Intelligence"--> lfc{{"<b><u>LLM Function Calling Application</b></u>"}}

@@ -302,3 +302,3 @@ /**

  export interface IArray extends __ISignificant<"array"> {
-     items: IJsonSchema | IJsonSchema[];
+     items?: IJsonSchema | IJsonSchema[];
      prefixItems?: IJsonSchema[];

@@ -305,0 +305,0 @@ uniqueItems?: boolean;

@@ -9,3 +9,3 @@ import { OpenApi } from "../OpenApi";

  *
- * `IHttpLlmApplication` is a data structure representing collection of
+ * `IHttpLlmApplication` is a data structure representing a collection of
  * {@link IHttpLlmFunction LLM function calling schemas} composed from the

@@ -51,5 +51,5 @@ * {@link OpenApi.IDocument OpenAPI document} and its {@link OpenApi.IOperation operation}

  * By the way, there can be some parameters (or their nested properties) which must be
- * composed by human, not by LLM. File uploading feature or some sensitive information
+ * composed by Human, not by LLM. File uploading feature or some sensitive information
  * like secrety key (password) are the examples. In that case, you can separate the
- * function parameters to both LLM and human sides by configuring the
+ * function parameters to both LLM and Human sides by configuring the
  * {@link IHttpLlmApplication.IOptions.separate} property. The separated parameters are

@@ -65,3 +65,3 @@ * assigned to the {@link IHttpLlmFunction.separated} property.

  * Additionally, if you've configured {@link IHttpLlmApplication.IOptions.separate},
- * so that the parameters are separated to human and LLM sides, you can merge these
+ * so that the parameters are separated to Human and LLM sides, you can merge these
  * humand and LLM sides' parameters into one through {@link HttpLlm.mergeParameters}

@@ -188,18 +188,18 @@ * before the actual LLM function call execution.

  * When composing parameter arguments through LLM function call,
- * there can be a case that some parameters must be composed by human,
+ * there can be a case that some parameters must be composed by Human,
  * or LLM cannot understand the parameter. For example, if the
  * parameter type has configured
  * {@link ILlmSchema.IString.contentMediaType} which indicates file
- * uploading, it must be composed by human, not by LLM
+ * uploading, it must be composed by Human, not by LLM
  * (Large Language Model).
  *
  * In that case, if you configure this property with a function that
- * predicating whether the schema value must be composed by human or
+ * predicating whether the schema value must be composed by Human or
  * not, the parameters would be separated into two parts.
  *
  * - {@link IHttpLlmFunction.separated.llm}
- * - {@link IHttpLlmFunction.separated.human}
+ * - {@link IHttpLlmFunction.separated.Human}
  *
  * When writing the function, note that returning value `true` means
- * to be a human composing the value, and `false` means to LLM
+ * to be a Human composing the value, and `false` means to LLM
  * composing the value. Also, when predicating the schema, it would

@@ -209,3 +209,3 @@ * better to utilize the {@link LlmTypeChecker} features.

  * @param schema Schema to be separated.
- * @returns Whether the schema value must be composed by human or not.
+ * @returns Whether the schema value must be composed by Human or not.
  * @default null

@@ -212,0 +212,0 @@ */

import { ILlmFunction } from "./ILlmFunction";
import { ILlmSchema } from "./ILlmSchema";
/**
* Application of LLM function calling.
*
* `ILlmApplication` is a data structure representing a collection of
* {@link ILlmFunction LLM function calling schemas}, composed from a native
* TypeScript class (or interface) type by the `typia.llm.application<App>()`
* function.
*
* By the way, the LLM function calling application composition, converting
* `ILlmApplication` instance from TypeScript interface (or class) type is not always
* successful. As LLM provider like OpenAI cannot understand the recursive reference
* type that is embodied by {@link OpenApi.IJsonSchema.IReference}, if there're some
* recursive types in the TypeScript interface (or class) type, the conversion would
* be failed.
*
* Also, there can be some parameters (or their nested properties) which must be
* composed by Human, not by LLM. File uploading feature or some sensitive information
* like secrety key (password) are the examples. In that case, you can separate the
* function parameters to both LLM and human sides by configuring the
* {@link ILlmApplication.IOptions.separate} property. The separated parameters are
* assigned to the {@link ILlmFunction.separated} property.
*
* For reference, when both LLM and Human filled parameter values to call, you can
* merge them by calling the {@link HttpLlm.mergeParameters} function. In other words,
* if you've configured the {@link ILlmApplication.IOptions.separate} property, you
* have to merge the separated parameters before the funtion call execution.
*
* @reference https://platform.openai.com/docs/guides/function-calling
* @author Jeongho Nam - https://github.com/samchon
*/
export interface ILlmApplication<Schema extends ILlmSchema = ILlmSchema> {

@@ -5,0 +35,0 @@ /**

Sorry, the diff of this file is not supported yet

