@samchon/openapi - npm Package Compare versions

Comparing version 2.0.0-dev.20241120 to 2.0.0-dev.20241121

lib/converters/ChatGptConverter.js

```diff
@@ -175,2 +175,3 @@ "use strict";
     minItems: undefined,
+    uniqueItems: undefined,
   })),
   tags: props.options.constraint ? [] : getArrayTags(input),
```

lib/structures/IChatGptSchema.d.ts

```diff
@@ -31,3 +31,3 @@ /**
  * For reference, if you've composed the `IChatGptSchema` type with the
- * {@link ILlmApplication.IChatGptOptions.escape} `true` option, only the recursived
+ * {@link ILlmApplication.IChatGptOptions.reference} `false` option, only the recursived
  * named types would be archived into the {@link IChatGptSchema.IParameters.$defs},
  * and the others would be ecaped from the {@link IChatGptSchema.IReference} type.
```
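As an editorial illustration only (not part of the diff), the renamed option would be configured roughly as below, assuming `HttpLlm.application` accepts the `reference` flag through its chatgpt `options` as the doc comment above implies:

```typescript
import { HttpLlm, IHttpLlmApplication, OpenApi } from "@samchon/openapi";

// Hypothetical helper showing the renamed `reference` option in use.
const compose = (document: OpenApi.IDocument): IHttpLlmApplication<"chatgpt"> =>
  HttpLlm.application({
    model: "chatgpt",
    document,
    options: {
      // false (per the doc comment above): only the recursively referenced
      // named types are archived into IChatGptSchema.IParameters.$defs;
      // every other reference is escaped (inlined) instead of staying an
      // IChatGptSchema.IReference.
      reference: false,
    },
  });
```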

lib/structures/IGeminiSchema.d.ts

```diff
@@ -19,8 +19,8 @@ /**
  *
- * Also, by the documents of Gemini, these additional properties are not
- * supported, either. However, I can't sure that these additional properties
+ * Also, by the documents of Gemini, these constraint properties are not
+ * supported, either. However, I can't sure that these constraint properties
  * are really not supported in the Geimni, because the Gemini seems like
  * understanding them. Therefore, I've decided to keep them alive.
  *
- * - ex) additional properties
+ * - ex) constraint properties
  * - {@link IGeminiSchema.IString.default}
  * - {@link IGeminiSchema.__IAttribute.example}
```

package.json

```diff
 {
   "name": "@samchon/openapi",
-  "version": "2.0.0-dev.20241120",
+  "version": "2.0.0-dev.20241121",
   "description": "OpenAPI definitions and converters for 'typia' and 'nestia'.",
   "main": "./lib/index.js",
```

README.md

````diff
@@ -0,1 +1,7 @@
+> ## Next version is coming.
+>
+> This is the `next` version README document.
+>
+> If you wanna see the latest version, go to the [`v1.0` branch](https://github.com/samchon/openapi/tree/v1.0).
+
 # `@samchon/openapi`
@@ -16,3 +22,3 @@ ```mermaid
 [![GitHub license](https://img.shields.io/badge/license-MIT-blue.svg)](https://github.com/samchon/openapi/blob/master/LICENSE)
-[![npm version](https://img.shields.io/npm/v/@samchon/openapi.svg)](https://www.npmjs.com/package/@samchon/openapi)
+[![npm version](https://img.shields.io/npm/v/@samchon/openapi/next.svg)](https://www.npmjs.com/package/@samchon/openapi/next.svg)
 [![Downloads](https://img.shields.io/npm/dm/@samchon/openapi.svg)](https://www.npmjs.com/package/@samchon/openapi)
@@ -54,6 +60,6 @@ [![Build Status](https://github.com/samchon/openapi/workflows/build/badge.svg)](https://github.com/samchon/openapi/actions?query=workflow%3Abuild)
 ```bash
-npm install @samchon/openapi
+npm install @samchon/openapi --tag next
 ```
-Just install by `npm i @samchon/openapi` command.
+Just install by `npm i @samchon/openapi --tag next` command.
````

```diff
@@ -257,5 +263,5 @@ Here is an example code utilizing the `@samchon/openapi` for LLM function calling purpose.
   HttpLlm,
+  IChatGptSchema,
   IHttpLlmApplication,
   IHttpLlmFunction,
-  ILlmSchemaV3_1,
   OpenApi,
@@ -283,4 +289,4 @@ OpenApiV3,
   const document: OpenApi.IDocument = OpenApi.convert(swagger);
-  const application: IHttpLlmApplication<"3.1"> = HttpLlm.application({
-    model: "3.1",
+  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
+    model: "chatgpt",
     document,
@@ -290,6 +296,7 @@ });
   // Let's imagine that LLM has selected a function to call
-  const func: IHttpLlmFunction<ILlmSchemaV3_1> | undefined = application.functions.find(
-    // (f) => f.name === "llm_selected_fuction_name"
-    (f) => f.path === "/bbs/{section}/articles/{id}" && f.method === "put",
-  );
+  const func: IHttpLlmFunction<IChatGptSchema.IParameters> | undefined =
+    application.functions.find(
+      // (f) => f.name === "llm_selected_fuction_name"
+      (f) => f.path === "/bbs/{section}/articles/{id}" && f.method === "put",
+    );
   if (func === undefined) throw new Error("No matched function exists.");
```

````diff
@@ -304,6 +311,10 @@
     function: func,
-    arguments: [
-      "general",
-      v4(),
-      {
+    input: {
+      section: "general",
+      id: v4(),
+      query: {
+        language: "en-US",
+        format: "markdown",
+      },
+      body: {
         title: "Hello, world!",
@@ -313,83 +324,4 @@ body: "Let's imagine that this argument is composed by LLM.",
       },
-    ],
+    },
   });
   console.log("article", article);
 };
 main().catch(console.error);
 ```
-### Keyword Parameter
-Combine parameters into single object.
-If you configure `keyword` option when composing the LLM (Large Language Model) function calling appliation, every parameters of OpenAPI operations would be combined to a single object type in the LLM funtion calling schema. This strategy is loved in many A.I. Chatbot developers, because LLM tends to a little professional in the single parameter function case.
-Also, do not worry about the function call execution case. You don't need to resolve the keyworded parameter manually. The `HttpLlm.execute()` and `HttpLlm.propagate()` functions will resolve the keyworded parameter automatically by analyzing the `IHttpLlmApplication.options` property.
-```typescript
-import {
-  HttpLlm,
-  IHttpLlmApplication,
-  IHttpLlmFunction,
-  ILlmSchemaV3_1,
-  OpenApi,
-  OpenApiV3,
-  OpenApiV3_1,
-  SwaggerV2,
-} from "@samchon/openapi";
-import fs from "fs";
-import typia from "typia";
-import { v4 } from "uuid";
-const main = async (): Promise<void> => {
-  // read swagger document and validate it
-  const swagger:
-    | SwaggerV2.IDocument
-    | OpenApiV3.IDocument
-    | OpenApiV3_1.IDocument = JSON.parse(
-    await fs.promises.readFile("swagger.json", "utf8"),
-  );
-  typia.assert(swagger); // recommended
-  // convert to emended OpenAPI document,
-  // and compose LLM function calling application
-  const document: OpenApi.IDocument = OpenApi.convert(swagger);
-  const application: IHttpLlmApplication<"3.1"> = HttpLlm.application({
-    model: "3.1",
-    document,
-    options: {
-      keyword: true,
-    },
-  });
-  // Let's imagine that LLM has selected a function to call
-  const func: IHttpLlmFunction<ILlmSchemaV3_1> | undefined = application.functions.find(
-    // (f) => f.name === "llm_selected_fuction_name"
-    (f) => f.path === "/bbs/{section}/articles/{id}" && f.method === "put",
-  );
-  if (func === undefined) throw new Error("No matched function exists.");
-  // actual execution is by yourself
-  const article = await HttpLlm.execute({
-    connection: {
-      host: "http://localhost:3000",
-    },
-    application,
-    function: func,
-    arguments: [
-      // one single object with key-value paired
-      {
-        section: "general",
-        id: v4(),
-        query: {
-          language: "en-US",
-          format: "markdown",
-        },
-        body: {
-          title: "Hello, world!",
-          body: "Let's imagine that this argument is composed by LLM.",
-          thumbnail: null,
-        },
-      },
-    ],
-  });
-  console.log("article", article);
````
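Since the hunks above show the updated example only in fragments, here is a hedged reconstruction of the complete new-style code after this diff is applied. The surrounding boilerplate (reading `swagger.json`, `typia.assert`, the connection setup) and the `thumbnail: null` field are carried over from the removed keyword example and assumed unchanged:

```typescript
import {
  HttpLlm,
  IChatGptSchema,
  IHttpLlmApplication,
  IHttpLlmFunction,
  OpenApi,
  OpenApiV3,
  OpenApiV3_1,
  SwaggerV2,
} from "@samchon/openapi";
import fs from "fs";
import typia from "typia";
import { v4 } from "uuid";

const main = async (): Promise<void> => {
  // read swagger document and validate it
  const swagger:
    | SwaggerV2.IDocument
    | OpenApiV3.IDocument
    | OpenApiV3_1.IDocument = JSON.parse(
    await fs.promises.readFile("swagger.json", "utf8"),
  );
  typia.assert(swagger); // recommended

  // convert to emended OpenAPI document,
  // and compose LLM function calling application with the chatgpt model
  const document: OpenApi.IDocument = OpenApi.convert(swagger);
  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
    model: "chatgpt",
    document,
  });

  // Let's imagine that LLM has selected a function to call
  const func: IHttpLlmFunction<IChatGptSchema.IParameters> | undefined =
    application.functions.find(
      (f) => f.path === "/bbs/{section}/articles/{id}" && f.method === "put",
    );
  if (func === undefined) throw new Error("No matched function exists.");

  // actual execution is by yourself; parameters are now always a
  // single keyworded `input` object instead of a positional array
  const article = await HttpLlm.execute({
    connection: { host: "http://localhost:3000" },
    application,
    function: func,
    input: {
      section: "general",
      id: v4(),
      query: {
        language: "en-US",
        format: "markdown",
      },
      body: {
        title: "Hello, world!",
        body: "Let's imagine that this argument is composed by LLM.",
        thumbnail: null,
      },
    },
  });
  console.log("article", article);
};
main().catch(console.error);
```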

```diff
@@ -412,6 +344,6 @@ };
   HttpLlm,
+  ChatGptTypeChecker,
+  IChatGptSchema,
   IHttpLlmApplication,
   IHttpLlmFunction,
-  ILlmSchemaV3_1,
-  LlmTypeCheckerV3_1,
   OpenApi,
@@ -439,4 +371,4 @@ OpenApiV3,
   const document: OpenApi.IDocument = OpenApi.convert(swagger);
-  const application: IHttpLlmApplication<"3.1"> = HttpLlm.application({
-    model: "3.1",
+  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
+    model: "chatgpt",
     document,
@@ -446,3 +378,3 @@ options: {
     separate: (schema) =>
-      LlmTypeCheckerV3_1.isString(schema) && schema.contentMediaType !== undefined,
+      ChatGptTypeChecker.isString(schema) && schema.contentMediaType !== undefined,
   },
@@ -452,6 +384,7 @@ });
   // Let's imagine that LLM has selected a function to call
-  const func: IHttpLlmFunction<ILlmSchemaV3_1> | undefined = application.functions.find(
-    // (f) => f.name === "llm_selected_fuction_name"
-    (f) => f.path === "/bbs/articles/{id}" && f.method === "put",
-  );
+  const func: IHttpLlmFunction<IChatGptSchema.IParameters> | undefined =
+    application.functions.find(
+      // (f) => f.name === "llm_selected_fuction_name"
+      (f) => f.path === "/bbs/articles/{id}" && f.method === "put",
+    );
   if (func === undefined) throw new Error("No matched function exists.");
@@ -468,19 +401,21 @@
     function: func,
-    llm: [
+    llm: {
       // LLM composed parameter values
-      "general",
-      v4(),
-      {
+      section: "general",
+      id: v4(),
+      query: {
         language: "en-US",
         format: "markdown",
       },
-      {
+      body: {
         title: "Hello, world!",
         content: "Let's imagine that this argument is composed by LLM.",
       },
-    ],
-    human: [
+    },
+    human: {
       // Human composed parameter values
-      { thumbnail: "https://example.com/thumbnail.jpg" },
-    ],
+      body: {
+        thumbnail: "https://example.com/thumbnail.jpg",
+      },
+    },
   }),
   });
```
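Likewise for the separated mode, here is a consolidated sketch of the new fragments. It assumes `HttpLlm.mergeParameters` keeps its role of merging the LLM-composed and human-composed sides (the closing `}),` above appears to belong to such a call) and that the merged result feeds the `input` property; `document` is an emended `OpenApi.IDocument` as in the previous example:

```typescript
import {
  ChatGptTypeChecker,
  HttpLlm,
  IChatGptSchema,
  IHttpLlmApplication,
  IHttpLlmFunction,
  OpenApi,
} from "@samchon/openapi";
import { v4 } from "uuid";

const run = async (document: OpenApi.IDocument): Promise<void> => {
  const application: IHttpLlmApplication<"chatgpt"> = HttpLlm.application({
    model: "chatgpt",
    document,
    options: {
      // a human fills in string values carrying a content media type
      // (e.g. file URLs); the LLM composes everything else
      separate: (schema) =>
        ChatGptTypeChecker.isString(schema) &&
        schema.contentMediaType !== undefined,
    },
  });
  const func: IHttpLlmFunction<IChatGptSchema.IParameters> | undefined =
    application.functions.find(
      (f) => f.path === "/bbs/articles/{id}" && f.method === "put",
    );
  if (func === undefined) throw new Error("No matched function exists.");

  const article = await HttpLlm.execute({
    connection: { host: "http://localhost:3000" },
    application,
    function: func,
    // merge the separated halves back into one keyworded input object
    input: HttpLlm.mergeParameters({
      function: func,
      llm: {
        // LLM composed parameter values
        section: "general",
        id: v4(),
        query: {
          language: "en-US",
          format: "markdown",
        },
        body: {
          title: "Hello, world!",
          content: "Let's imagine that this argument is composed by LLM.",
        },
      },
      human: {
        // Human composed parameter values
        body: {
          thumbnail: "https://example.com/thumbnail.jpg",
        },
      },
    }),
  });
  console.log("article", article);
};
```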

src/converters/ChatGptConverter.ts

```diff
@@ -153,2 +153,3 @@ import { OpenApi } from "../OpenApi";
     minItems: undefined,
+    uniqueItems: undefined,
   }),
```

src/structures/IChatGptSchema.ts

```diff
@@ -31,3 +31,3 @@ /**
  * For reference, if you've composed the `IChatGptSchema` type with the
- * {@link ILlmApplication.IChatGptOptions.escape} `true` option, only the recursived
+ * {@link ILlmApplication.IChatGptOptions.reference} `false` option, only the recursived
  * named types would be archived into the {@link IChatGptSchema.IParameters.$defs},
  * and the others would be ecaped from the {@link IChatGptSchema.IReference} type.
```

src/structures/IGeminiSchema.ts

```diff
@@ -19,8 +19,8 @@ /**
  *
- * Also, by the documents of Gemini, these additional properties are not
- * supported, either. However, I can't sure that these additional properties
+ * Also, by the documents of Gemini, these constraint properties are not
+ * supported, either. However, I can't sure that these constraint properties
  * are really not supported in the Geimni, because the Gemini seems like
  * understanding them. Therefore, I've decided to keep them alive.
  *
- * - ex) additional properties
+ * - ex) constraint properties
  * - {@link IGeminiSchema.IString.default}
  * - {@link IGeminiSchema.__IAttribute.example}
```

