
Commit

Merge pull request #45 from samchon/feature/llm-description
Description comments of LLM function calling schemas.
samchon committed Sep 8, 2024
2 parents 5d5332a + 8217f7f commit e4b9015
Showing 16 changed files with 437 additions and 91 deletions.
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "@samchon/openapi",
"version": "0.5.0-dev.20240906-2",
"version": "1.0.0-dev.20240908",
"description": "OpenAPI definitions and converters for 'typia' and 'nestia'.",
"main": "./lib/index.js",
"module": "./lib/index.mjs",
169 changes: 168 additions & 1 deletion src/HttpLlm.ts
@@ -11,7 +11,57 @@ import { ILlmFunction } from "./structures/ILlmFunction";
import { ILlmSchema } from "./structures/ILlmSchema";
import { LlmDataMerger } from "./utils/LlmDataMerger";

/**
* LLM function calling application composer from OpenAPI document.
*
* `HttpLlm` is a module for composing LLM (Large Language Model) function calling
* application from the {@link OpenApi.IDocument OpenAPI document}, and also for
* LLM function call execution and parameter merging.
*
* At first, you can construct the LLM function calling application by the
* {@link HttpLlm.application HttpLlm.application()} function. Then, after the
* LLM has selected a {@link IHttpLlmFunction function} to call and composed
* its arguments, you can execute the function by
* {@link HttpLlm.execute HttpLlm.execute()} or
* {@link HttpLlm.propagate HttpLlm.propagate()}.
*
* By the way, if you have configured the {@link IHttpLlmApplication.IOptions.separate}
* option to separate the parameters into human and LLM sides, you can merge these
* human and LLM sides' parameters into one through
* {@link HttpLlm.mergeParameters HttpLlm.mergeParameters()} before the actual LLM
* function call execution.
*
* @author Jeongho Nam - https://github.com/samchon
*/
export namespace HttpLlm {
/* -----------------------------------------------------------
COMPOSERS
----------------------------------------------------------- */
/**
* Convert OpenAPI document to LLM function calling application.
*
* Converts {@link OpenApi.IDocument OpenAPI document} or
* {@link IHttpMigrateApplication migrated application} to the
* {@link IHttpLlmApplication LLM function calling application}. Every
* {@link OpenApi.IOperation API operation} in the OpenAPI document is converted
* to the {@link IHttpLlmFunction LLM function} type, and these functions would
* be used for the LLM function calling.
*
* If you have configured the {@link IHttpLlmApplication.IOptions.separate} option,
* every parameter in the {@link IHttpLlmFunction} would be separated into both
* human and LLM sides. In that case, you can merge these human and LLM sides'
* parameters into one through {@link HttpLlm.mergeParameters} before the actual
* LLM function call execution.
*
* Additionally, if you have configured the {@link IHttpLlmApplication.IOptions.keyword}
* as `true`, the number of {@link IHttpLlmFunction.parameters} is always 1 and the
* first parameter's type is always {@link ILlmSchema.IObject}. I recommend this
* option because the LLM can understand keyword arguments more easily.
*
* @param document Target OpenAPI document to convert (or migrate application)
* @param options Options for the LLM function calling application conversion
* @returns LLM function calling application
*/
export const application = <
Schema extends ILlmSchema,
Operation extends OpenApi.IOperation,
@@ -35,11 +85,30 @@ export namespace HttpLlm {
);
};

/**
* Convert JSON schema to LLM schema.
*
* Converts {@link OpenApi.IJsonSchema JSON schema} to {@link ILlmSchema LLM schema}.
*
* By the way, if the target JSON schema contains recursive references, the
* conversion fails and a `null` value is returned. It's because the LLM schema
* does not support the reference type embodied by the
* {@link OpenApi.IJsonSchema.IReference} type.
*
* @param props Schema to convert and components to refer to
* @returns LLM schema or `null` value
*/
export const schema = (props: {
components: OpenApi.IComponents;
schema: OpenApi.IJsonSchema;
}): ILlmSchema | null => HttpLlmConverter.schema(props);
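The recursion constraint above can be illustrated with a standalone cycle detector over a deliberately simplified schema shape — an assumption-laden sketch, not the converter's actual code:

```typescript
// Simplified stand-in for OpenApi.IJsonSchema, enough to show why a
// recursive reference cannot be inlined into an LLM schema.
type Schema =
  | { $ref: string }
  | { type: "object"; properties?: Record<string, Schema> }
  | { type: "array"; items?: Schema }
  | { type: string };

// Report whether the named component schema reaches itself through $ref chains.
function isRecursive(
  components: Record<string, Schema>,
  name: string,
  visiting: Set<string> = new Set(),
): boolean {
  if (visiting.has(name)) return true; // cycle found
  const schema = components[name];
  if (schema === undefined) return false;
  visiting.add(name);
  const result = reaches(components, schema, visiting);
  visiting.delete(name);
  return result;
}

function reaches(
  components: Record<string, Schema>,
  schema: Schema,
  visiting: Set<string>,
): boolean {
  if ("$ref" in schema)
    return isRecursive(components, schema.$ref.split("/").pop()!, visiting);
  if ("properties" in schema && schema.properties)
    return Object.values(schema.properties).some((s) =>
      reaches(components, s, visiting),
    );
  if ("items" in schema && schema.items)
    return reaches(components, schema.items, visiting);
  return false;
}

const components: Record<string, Schema> = {
  // Node references itself, so it can never be fully inlined.
  Node: {
    type: "object",
    properties: {
      children: { type: "array", items: { $ref: "#/components/schemas/Node" } },
    },
  },
  Point: {
    type: "object",
    properties: { x: { type: "number" }, y: { type: "number" } },
  },
};
```

A schema like `Node` above is exactly the case where this `schema()` function would return `null`.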

/* -----------------------------------------------------------
FETCHERS
----------------------------------------------------------- */
/**
* Properties for the LLM function call.
*/
export interface IFetchProps {
/**
* Document of the OpenAI function call schemas.
@@ -52,7 +121,7 @@ export namespace HttpLlm {
function: IHttpLlmFunction;

/**
* Connection info to the server.
* Connection info to the HTTP server.
*/
connection: IHttpConnection;

@@ -61,18 +130,116 @@
*/
arguments: any[];
}

/**
* Execute the LLM function call.
*
* `HttpLlm.execute()` is a function executing the target
* {@link OpenApi.IOperation API endpoint} with the connection information
* and arguments composed by a Large Language Model like OpenAI (+ human sometimes).
*
* By the way, if you've configured the {@link IHttpLlmApplication.IOptions.separate},
* so that the parameters are separated into human and LLM sides, you have to merge
* these human and LLM sides' parameters into one through the
* {@link HttpLlm.mergeParameters} function.
*
* About the {@link IHttpLlmApplication.IOptions.keyword} option, don't worry about
* anything. This `HttpLlm.execute()` function will automatically recognize the
* keyword arguments and convert them to the proper sequence.
*
* For reference, if the target API endpoint responds with a status other than
* 200/201, this is considered an error and an {@link HttpError} is thrown.
* If you don't want such a rule, you can use the {@link HttpLlm.propagate}
* function instead.
*
* @param props Properties for the LLM function call
* @returns Return value (response body) from the API endpoint
* @throws HttpError when the API endpoint responds with a status other than 200/201
*/
export const execute = (props: IFetchProps): Promise<unknown> =>
HttpLlmFunctionFetcher.execute(props);
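The keyword-argument recognition mentioned above can be sketched roughly like this, assuming route metadata exposes the positional parameter names; `toPositional` is a hypothetical helper invented for illustration, not part of the API:

```typescript
// Convert a keyworded argument list (one object keyed by parameter name)
// back into the positional sequence the endpoint expects. Positional
// argument lists pass through unchanged.
function toPositional(parameterNames: string[], args: unknown[]): unknown[] {
  // A keyworded call always has exactly one object argument.
  if (args.length === 1 && typeof args[0] === "object" && args[0] !== null) {
    const record = args[0] as Record<string, unknown>;
    if (parameterNames.every((name) => name in record))
      return parameterNames.map((name) => record[name]);
  }
  return args; // already positional
}
```

This mirrors why callers need not care whether the `keyword` option was enabled when invoking `execute()`.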

/**
* Propagate the LLM function call.
*
* `HttpLlm.propagate()` is a function propagating the target
* {@link OpenApi.IOperation API endpoint} with the connection information
* and arguments composed by a Large Language Model like OpenAI (+ human sometimes).
*
* By the way, if you've configured the {@link IHttpLlmApplication.IOptions.separate},
* so that the parameters are separated into human and LLM sides, you have to merge
* these human and LLM sides' parameters into one through the
* {@link HttpLlm.mergeParameters} function.
*
* About the {@link IHttpLlmApplication.IOptions.keyword} option, don't worry about
* anything. This `HttpLlm.propagate()` function will automatically recognize the
* keyword arguments and convert them to the proper sequence.
*
* For reference, propagation means always returning the response from the API
* endpoint, even if the status is not 200/201. This is useful when you want to
* handle the response by yourself.
*
* @param props Properties for the LLM function call
* @returns Response from the API endpoint
* @throws Error only when the connection fails
*/
export const propagate = (props: IFetchProps): Promise<IHttpResponse> =>
HttpLlmFunctionFetcher.propagate(props);

/* -----------------------------------------------------------
MERGERS
----------------------------------------------------------- */
/**
* Properties for the parameters' merging.
*/
export interface IMergeProps {
/**
* Metadata of the target function.
*/
function: ILlmFunction;

/**
* Arguments composed by the LLM.
*/
llm: unknown[];

/**
* Arguments composed by the human.
*/
human: unknown[];
}

/**
* Merge the parameters.
*
* If you've configured the {@link IHttpLlmApplication.IOptions.separate} option,
* so that the parameters are separated into human and LLM sides, you can merge these
* human and LLM sides' parameters into one through this `HttpLlm.mergeParameters()`
* function before the actual LLM function call execution.
*
* On the contrary, if you've not configured the
* {@link IHttpLlmApplication.IOptions.separate} option, this function would throw
* an error.
*
* @param props Properties for the parameters' merging
* @returns Merged parameter values
*/
export const mergeParameters = (props: IMergeProps): unknown[] =>
LlmDataMerger.parameters(props);
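Under the stated semantics, a minimal sketch of the positional merge might look like this — `mergeSeparated` is an invented name, and the real merger works through `LlmDataMerger` with a deeper per-slot merge:

```typescript
// Merge LLM-composed and human-composed positional argument arrays,
// assuming equal length and letting the human-side value win whenever
// it is present (mirroring the `y ?? x` rule of value merging).
function mergeSeparated(llm: unknown[], human: unknown[]): unknown[] {
  return llm.map((value, index) => human[index] ?? value);
}
```

For example, a secret the human supplied fills the slot the LLM left empty, while LLM-composed slots survive untouched.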

/**
* Merge two values.
*
* If both values are objects, then combines them at the property level.
*
* Otherwise, returns the latter value if it's not `null`, and the former value otherwise.
*
* - `return (y ?? x)`
*
* @param x Value X to merge
* @param y Value Y to merge
* @returns Merged value
*/
export const mergeValue = (x: unknown, y: unknown): unknown =>
LlmDataMerger.value(x, y);
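The `y ?? x` rule with property-level object merging can be sketched as follows — a standalone approximation of `LlmDataMerger.value`, not its actual source:

```typescript
// Narrow to plain objects: arrays and null are excluded from the
// property-level merge and fall through to the `y ?? x` rule.
function isPlainObject(v: unknown): v is Record<string, unknown> {
  return typeof v === "object" && v !== null && !Array.isArray(v);
}

// Combine two plain objects property by property (recursively);
// otherwise prefer the latter value unless it is null/undefined.
function mergeValue(x: unknown, y: unknown): unknown {
  if (isPlainObject(x) && isPlainObject(y)) {
    const merged: Record<string, unknown> = { ...x };
    for (const [key, value] of Object.entries(y))
      merged[key] = mergeValue(merged[key], value);
    return merged;
  }
  return y ?? x;
}
```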
}
131 changes: 129 additions & 2 deletions src/HttpMigration.ts
@@ -6,7 +6,72 @@ import { IHttpMigrateApplication } from "./structures/IHttpMigrateApplication";
import { IHttpMigrateRoute } from "./structures/IHttpMigrateRoute";
import { IHttpResponse } from "./structures/IHttpResponse";

/**
* HTTP migration application composer from OpenAPI document.
*
* `HttpMigration` is a module for composing HTTP migration application from the
* {@link OpenApi.IDocument OpenAPI document}. It is designed to help OpenAPI
* generator libraries, which convert {@link OpenApi.IOperation OpenAPI operations} to
* RPC (Remote Procedure Call) functions.
*
* The key feature of the `HttpMigration` module is the {@link HttpMigration.application}
* function. It converts the {@link OpenApi.IOperation OpenAPI operations} to
* {@link IHttpMigrateRoute HTTP migration routes}, normalizing the OpenAPI operations
* into a route structure suitable for RPC function calling.
*
* The other functions, {@link HttpMigration.execute} and {@link HttpMigration.propagate},
* are for executing the HTTP request to the HTTP server. The {@link HttpMigration.execute}
* function returns the response body from the API endpoint when the status code is `200`
* or `201`, and throws an {@link HttpError} otherwise. The
* {@link HttpMigration.propagate} function returns the response information
* from the API endpoint, including the status code, headers, and response body.
*
* The {@link HttpLlm} module is a good example of utilizing this `HttpMigration` module
* for composing an RPC function calling application. The {@link HttpLlm} module composes
* an LLM (Large Language Model) function calling application from the OpenAPI document
* by passing through the {@link IHttpLlmApplication} type.
*
* @author Jeongho Nam - https://github.com/samchon
*/
export namespace HttpMigration {
/* -----------------------------------------------------------
COMPOSER
----------------------------------------------------------- */
/**
* Convert OpenAPI document to HTTP migration application.
*
* `HttpMigration.application()` is a function converting the
* {@link OpenApi.IDocument OpenAPI document} and its {@link OpenApi.IOperation operations}
* to the {@link IHttpMigrateApplication HTTP migration application}.
*
* The HTTP migration application is designed to help OpenAPI generator
* libraries, which convert OpenAPI operations to RPC (Remote Procedure Call)
* functions. To support the OpenAPI generator libraries, {@link IHttpMigrateRoute}
* follows the normalization rules below:
*
* - Path parameters are separated to the atomic level.
* - Query parameters are bound into one object.
* - Header parameters are bound into one object.
* - Only the below HTTP methods are allowed:
* - `head`
* - `get`
* - `post`
* - `put`
* - `patch`
* - `delete`
* - Only the below content media types are allowed:
* - `application/json`
* - `application/x-www-form-urlencoded`
* - `multipart/form-data`
* - `text/plain`
*
* If there are some {@link OpenApi.IOperation API operations} which cannot follow
* the above rules, or which are logically nonsensical, these operations fail to
* migrate and are registered into {@link IHttpMigrateApplication.errors}.
*
* @param document OpenAPI document to migrate.
* @returns Migrated application.
*/
export const application = <
Schema extends OpenApi.IJsonSchema = OpenApi.IJsonSchema,
Operation extends OpenApi.IOperation<Schema> = OpenApi.IOperation<Schema>,
@@ -15,17 +80,79 @@
): IHttpMigrateApplication<Schema, Operation> =>
MigrateConverter.convert(document);

/**
* Properties for the request to the HTTP server.
*/
export interface IFetchProps {
/**
* Connection info to the HTTP server.
*/
connection: IHttpConnection;

/**
* Route information for the migration.
*/
route: IHttpMigrateRoute;

/**
* Path parameters.
*
* Path parameters, given as a sequenced array or a key-value paired object.
*/
parameters:
| Array<string | number | boolean | bigint | null>
| Record<string, string | number | boolean | bigint | null>;

/**
* Query parameters as a key-value paired object.
*/
query?: object | undefined;

/**
* Request body data.
*/
body?: object | undefined;
}
export const request = (props: IFetchProps): Promise<unknown> =>
HttpMigrateRouteFetcher.request(props);
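The two accepted path-parameter shapes can be illustrated by a small binder — `bindPath` is a hypothetical helper assuming OpenAPI-style `{name}` placeholders, not the fetcher's actual code:

```typescript
// The two shapes IFetchProps.parameters accepts: a sequenced array
// (consumed left to right) or a key-value record (looked up by name).
type PathParameters =
  | Array<string | number | boolean | bigint | null>
  | Record<string, string | number | boolean | bigint | null>;

// Substitute every `{name}` placeholder in an OpenAPI path template.
function bindPath(template: string, parameters: PathParameters): string {
  let index = 0;
  return template.replace(/\{([^}]+)\}/g, (_, name: string) =>
    encodeURIComponent(
      String(Array.isArray(parameters) ? parameters[index++] : parameters[name]),
    ),
  );
}
```

Both `bindPath("/users/{id}", ["abc"])` and `bindPath("/users/{id}", { id: "abc" })` bind the same route, which is why the property accepts either shape.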

/* -----------------------------------------------------------
FETCHERS
----------------------------------------------------------- */
/**
* Execute the HTTP request.
*
* `HttpMigration.execute()` is a function executing the HTTP request to the HTTP server.
*
* It returns the response body from the API endpoint when the status code is `200`
* or `201`. Otherwise, it throws an {@link HttpError}.
*
* If you want to get more information than the response body, or get the detailed
* response information even when the status code is `200` or `201`, use the
* {@link HttpMigration.propagate} function instead.
*
* @param props Properties for the request.
* @returns Return value (response body) from the API endpoint.
* @throws HttpError when the API endpoint responds with a status other than 200/201.
*/
export const execute = (props: IFetchProps): Promise<unknown> =>
HttpMigrateRouteFetcher.execute(props);
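The 200/201 rule separating `execute` from `propagate` can be sketched as a thin wrapper over a propagated response — the `IResponse` and `HttpError` shapes here are simplified stand-ins, not the library's types:

```typescript
// Simplified propagated response: status code plus body.
interface IResponse {
  status: number;
  body: unknown;
}

// Simplified stand-in for the library's HttpError.
class HttpError extends Error {
  public constructor(public readonly status: number) {
    super(`HTTP error: status ${status}`);
  }
}

// execute-style handling on top of a propagate-style response:
// only 200/201 count as success; everything else becomes an HttpError.
function executeFrom(response: IResponse): unknown {
  if (response.status !== 200 && response.status !== 201)
    throw new HttpError(response.status);
  return response.body;
}
```

`propagate` simply hands you the whole `IResponse` and leaves this decision to you.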

/**
* Propagate the HTTP request.
*
* `HttpMigration.propagate()` is a function propagating the request to the HTTP server.
*
* It returns the response information from the API endpoint, including the status code,
* headers, and response body.
*
* Even if the status code is not `200` or `201`, this function
* returns the response information. By the way, if the connection to the HTTP server
* fails, this function throws an {@link Error}.
*
* @param props Properties for the request.
* @returns Response from the API endpoint.
* @throws Error when the connection fails.
*/
export const propagate = (props: IFetchProps): Promise<IHttpResponse> =>
HttpMigrateRouteFetcher.propagate(props);
}
