application() function
export namespace llm {
  // LLM FUNCTION CALLING APPLICATION SCHEMA
  export function application<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<Pick<ILlmApplication.IOptions<Model>, "separate">>,
  ): ILlmApplication<Model>;

  // +VALIDATE FUNCTION EMBEDDED
  export function applicationOfValidate<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<
      Pick<ILlmApplicationOfValidate.IOptions<Model>, "separate">
    >,
  ): ILlmApplicationOfValidate<Model>;

  // STRUCTURED OUTPUT
  export function parameters<
    Parameters extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(): ILlmSchema.ModelParameters[Model];

  // TYPE SCHEMA
  export function schema<
    T,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    ...$defs: Extract<
      ILlmSchema.ModelSchema[Model],
      { $ref: string }
    > extends never
      ? []
      : [Record<string, ILlmSchema.ModelSchema[Model]>]
  ): ILlmSchema.ModelSchema[Model];
}
LLM function calling application schema from a native TypeScript class or interface type.

typia.llm.application<App, Model>() is a function that composes an LLM (Large Language Model) function calling application schema from a native TypeScript class or interface type. The function returns an ILlmApplication instance, a data structure representing a collection of LLM function calling schemas.

If you pass the LLM function schema instances registered in ILlmApplication.functions to an LLM provider like OpenAI ChatGPT, the LLM will select a proper function to call, with parameter values for the target function, from its conversation with the user. This is "LLM Function Calling".

You can specify the LLM provider model with the second Model template argument, because the detailed specification of the function schema differs by LLM provider model. Here is the list of LLM schema definitions for each model; pick one after carefully reading the schema definitions.

Once you've determined the model, let's make an A.I. chatbot super easily with the typia.llm.application<App, Model>() function.
- Supported schemas
  - IChatGptSchema: OpenAI ChatGPT
  - IClaudeSchema: Anthropic Claude
  - IGeminiSchema: Google Gemini
  - ILlamaSchema: Meta Llama
- Middle layer schemas
  - ILlmSchemaV3: middle layer based on the OpenAPI v3.0 specification
  - ILlmSchemaV3_1: middle layer based on the OpenAPI v3.1 specification
LLM Function Calling and Structured Output
The LLM selects a proper function and fills its arguments.

Nowadays, most LLMs (Large Language Models) like OpenAI's support the "function calling" feature. "LLM function calling" means that the LLM automatically selects a proper function and fills its parameter values from the conversation with the user (maybe by chatting text).

Structured output is another feature of LLMs. "Structured output" means that the LLM automatically transforms its output conversation into a structured data format like JSON.
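To make the two ideas concrete, here is a hand-written sketch (not produced by typia; the property values are illustrative) of what a single function calling schema entry looks like when handed to an LLM provider, followed by the structured output direction where the model's answer arrives as parseable JSON.

```typescript
// Hand-written sketch of one "function calling" schema entry, the kind
// of JSON an LLM provider receives. Property names follow the common
// JSON-schema-based shape; real entries are generated by
// typia.llm.application() rather than written by hand.
const createFunctionSchema = {
  name: "create",
  description: "Create a new article.",
  parameters: {
    type: "object",
    properties: {
      input: {
        type: "object",
        description: "Information of the article to create",
        properties: {
          title: { type: "string", description: "Title of the article." },
          body: { type: "string", description: "Content body." },
        },
        required: ["title", "body"],
      },
    },
    required: ["input"],
  },
} as const;

// "Structured output" is the reverse direction: the model's answer is
// forced into a JSON document matching a schema, so the application can
// parse it instead of scraping free text.
const structuredAnswer = '{"title": "Hello", "body": "World"}';
const parsed: { title: string; body: string } = JSON.parse(structuredAnswer);
console.log(createFunctionSchema.name, parsed.title);
```

The nested `required` arrays matter: they are what lets the provider refuse to call `create` without a complete `input` object.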
Description Comment
import { ILlmApplication } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication<"chatgpt"> = typia.llm.application<
  BbsArticleController,
  "chatgpt"
>();
console.log(app);

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param props Properties of create function
   * @returns Newly created article
   */
  create(props: {
    /**
     * Information of the article to create
     */
    input: IBbsArticle.ICreate;
  }): Promise<IBbsArticle>;

  /**
   * Update an article.
   *
   * Updates an article with new content.
   *
   * @param props Properties of update function
   */
  update(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
    /**
     * New content to update.
     */
    input: IBbsArticle.IUpdate;
  }): Promise<void>;

  /**
   * Erase an article.
   *
   * Erases an article from the DB.
   *
   * @param props Properties of erase function
   */
  erase(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
  }): Promise<void>;
}

/**
 * Article entity.
 *
 * `IBbsArticle` is an entity representing an article in the BBS
 * (Bulletin Board System).
 */
interface IBbsArticle extends IBbsArticle.ICreate {
  /**
   * Primary Key.
   */
  id: string & tags.Format<"uuid">;

  /**
   * Creation time of the article.
   */
  created_at: string & tags.Format<"date-time">;

  /**
   * Last updated time of the article.
   */
  updated_at: string & tags.Format<"date-time">;
}

namespace IBbsArticle {
  /**
   * Information of the article to create.
   */
  export interface ICreate {
    /**
     * Title of the article.
     *
     * Representative title of the article.
     */
    title: string;

    /**
     * Content body.
     *
     * Content body of the article written in the markdown format.
     */
    body: string;

    /**
     * Thumbnail image URI.
     *
     * Thumbnail image URI which can represent the article.
     *
     * If configured as `null`, it means that there is no thumbnail
     * image in the article.
     */
    thumbnail:
      | null
      | (string & tags.Format<"uri"> & tags.ContentMediaType<"image/*">);
  }

  /**
   * Information of the article to update.
   *
   * Only the filled properties will be updated.
   */
  export type IUpdate = Partial<ICreate>;
}
The code above is an example utilizing the typia.llm.application<App, Model>() function. As you can see, it writes detailed descriptions for every function and its parameter/return types. Such detailed descriptions are very important for teaching the purpose of each function to the LLM (Large Language Model); the LLM actually determines which function to call based on the description.

Therefore, don't forget to write detailed descriptions. They are a very important part of LLM function calling.

Also, the reason why every function defined in the BbsArticleController type has only one object-typed parameter is that this is the rule of LLM function calling. Defining exactly one object parameter with static key-value pairs is called "keyword arguments". If you violate the keyword arguments rule of LLM function calling, typia.llm.application<App, Model>() will throw a compilation error.
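The keyword-arguments rule can be illustrated with plain TypeScript (no typia involved; the names below are made up for the sketch): the provider returns the arguments as one JSON object, which maps directly onto the single props parameter.

```typescript
// The keyword-arguments rule: each function takes exactly one object
// parameter whose keys are static, so the LLM can fill it as one JSON
// object. (Names below are illustrative, not part of typia's API.)
interface ICreateProps {
  title: string;
  body: string;
}

// Compliant signature: a single object parameter with static keys.
function create(props: ICreateProps): string {
  return `created: ${props.title}`;
}

// An LLM provider returns the composed arguments as a JSON string;
// the caller parses it and passes the object straight through.
const rawArguments: string = '{"title": "Hello", "body": "World"}';
const result: string = create(JSON.parse(rawArguments) as ICreateProps);
console.log(result); // created: Hello
```

A signature like `create(title: string, body: string)` has no such one-to-one mapping from a JSON object, which is why it is rejected.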
Parameters’ Separation
Parameter values from both LLM and Human sides.
When composing parameter arguments through LLM (Large Language Model) function calling, there can be cases where some parameters (or nested properties) must be composed not by the LLM, but by a Human. File uploading features, or sensitive information like secret keys (passwords), are representative examples.

In those cases, you can configure the LLM function calling schemas to exclude such Human-side parameters (or nested properties) through the ILlmApplication.options.separate property. You then have to merge the Human-composed and LLM-composed parameters into one by calling HttpLlm.mergeParameters() before executing the LLM function call.

Here is an example separating the parameter schemas.
import {
  ClaudeTypeChecker,
  ILlmApplication,
  ILlmSchema,
} from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication<"claude"> = typia.llm.application<
  BbsArticleController,
  "claude"
>({
  separate: (schema: ILlmSchema<"claude">) =>
    ClaudeTypeChecker.isString(schema) && schema.contentMediaType !== undefined,
});
console.log(app);

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param props Properties of create function
   * @returns Newly created article
   */
  create(props: {
    /**
     * Information of the article to create
     */
    input: IBbsArticle.ICreate;
  }): Promise<IBbsArticle>;

  /**
   * Update an article.
   *
   * Updates an article with new content.
   *
   * @param props Properties of update function
   */
  update(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
    /**
     * New content to update.
     */
    input: IBbsArticle.IUpdate;
  }): Promise<void>;

  /**
   * Erase an article.
   *
   * Erases an article from the DB.
   *
   * @param props Properties of erase function
   */
  erase(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
  }): Promise<void>;
}
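The merge step can be sketched in plain TypeScript. This is only the concept: the real HttpLlm.mergeParameters() from @samchon/openapi works against the separated schema structure, while this toy version simply deep-merges two plain objects.

```typescript
// Conceptual sketch of merging LLM-composed and Human-composed
// parameters into one arguments object. Not the real
// HttpLlm.mergeParameters() implementation -- just the idea.
type Params = Record<string, unknown>;

function isPlainObject(value: unknown): value is Params {
  return typeof value === "object" && value !== null && !Array.isArray(value);
}

function mergeParameters(llm: Params, human: Params): Params {
  const merged: Params = { ...llm };
  for (const [key, value] of Object.entries(human)) {
    const prev = merged[key];
    merged[key] =
      isPlainObject(prev) && isPlainObject(value)
        ? mergeParameters(prev, value) // merge nested objects recursively
        : value; // the Human side wins at the leaf level
  }
  return merged;
}

// The LLM fills the textual content; the Human side supplies the file
// URI that was excluded from the LLM schema by the "separate" option.
const llmSide: Params = { input: { title: "My Article", body: "..." } };
const humanSide: Params = { input: { thumbnail: "https://example.com/a.png" } };
const complete = mergeParameters(llmSide, humanSide);
console.log(complete);
```

Only after this merge does the application have a complete arguments object to execute the target function with.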
Function Hiding
Hiding some functions with comment tags.

If you write a @hidden, @human or @internal tag in a function's description comment, the function will not participate in the LLM (Large Language Model) function calling application composition. Its ILlmFunction schema will not be generated in the ILlmApplication.functions collection.

It's a good feature for hiding internal functions, keeping them out of LLM function calling.
import { ILlmApplication } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication<"chatgpt"> = typia.llm.application<
  BbsArticleController,
  "chatgpt"
>();
console.log(app);

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param props Properties of create function
   * @returns Newly created article
   */
  create(props: {
    /**
     * Information of the article to create
     */
    input: IBbsArticle.ICreate;
  }): Promise<IBbsArticle>;

  /**
   * Read an article.
   *
   * Reads an article from the DB.
   *
   * @param props Properties of read function
   * @returns The article
   * @hidden
   */
  at(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
  }): Promise<IBbsArticle>;

  /**
   * Update an article.
   *
   * Updates an article with new content.
   *
   * @param props Properties of update function
   * @internal
   */
  update(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
    /**
     * New content to update.
     */
    input: IBbsArticle.IUpdate;
  }): Promise<void>;

  /**
   * Erase an article.
   *
   * Erases an article from the DB.
   *
   * @param props Properties of erase function
   * @human
   */
  erase(props: {
    /**
     * Target article's {@link IBbsArticle.id}.
     */
    id: string & tags.Format<"uuid">;
  }): Promise<void>;
}
Specialization
You can utilize type tags (or the validator's comment tags) to construct special fields of the JSON schema.

If you write any comment on a property, it fills the IJsonSchema.description value. Also, there are special comment tags used only for the JSON schema definition, different from the validator's comment tags, like below:

@deprecated
@hidden
@internal
@title {string}
@default {value}

Let's see how these type tags, comment tags and description comments work with the example code.
import { IClaudeSchema } from "@samchon/openapi";
import typia, { tags } from "typia";

export const schema: IClaudeSchema = typia.llm.schema<Special, "claude">();

interface Special {
  /**
   * Deprecated tags are just used for marking.
   *
   * @title Unsigned integer
   * @deprecated
   */
  type: number & tags.Type<"int32">;

  /**
   * Internal tagged property is never shown in the JSON schema.
   *
   * It is not even shown in other `typia` functions like `assert<T>()`.
   *
   * @internal
   */
  internal: number[];

  /**
   * Hidden tagged property is never shown in the JSON schema.
   *
   * However, it is still shown in other `typia` functions like
   * `stringify<T>()`.
   *
   * @hidden
   */
  hidden: boolean;

  /**
   * You can limit the range of a number.
   *
   * @exclusiveMinimum 19
   * @maximum 100
   * @default 30
   */
  number?: number;

  /**
   * You can limit the length of a string.
   *
   * Also, multiple range conditions are possible.
   */
  string: string &
    (
      | (tags.MinLength<3> & tags.MaxLength<24>)
      | (tags.MinLength<40> & tags.MaxLength<100>)
    );

  /**
   * You can limit the pattern of a string.
   *
   * @pattern ^[a-z]+$
   */
  pattern: string;

  /**
   * You can limit the format of a string.
   *
   * @format date-time
   */
  format: string | null;

  /**
   * In the Array case, it is possible to restrict its elements.
   */
  array: Array<string & tags.Format<"uuid">> & tags.MinItems<3>;
}
Customization

If what you want is not just filling the regular properties of the LLM schema specification, but adding custom properties into the JSON schema definition, you can do it through the tags.TagBase.schema property type or the tags.JsonSchemaPlugin type.

For reference, every custom property must start with the x- prefix. This is a rule of the LLM schema.
import typia, { tags } from "typia";

type Monetary<Value extends string> = tags.TagBase<{
  target: "number";
  kind: "monetary";
  value: Value;
  schema: {
    "x-monetary": Value;
  };
}>;

type Placeholder<Value extends string> = tags.JsonSchemaPlugin<{
  "x-placeholder": Value;
}>;

interface IAccount {
  code: string & Placeholder<"Write your account code please">;
  balance: number & Monetary<"dollar">;
}

typia.llm.schema<IAccount, "chatgpt">();
Restrictions
typia.llm.application<App, Model>() follows the restrictions below.

Regarding the function parameters type, it follows the restrictions of both the typia.llm.parameters<Params, Model>() and typia.llm.schema<T, Model>() functions. Therefore, the parameters must be a keyworded object type with static keys, without any dynamic keys. Also, the object type must not be nullable or optional.

Regarding the return value type, it follows the restriction of the typia.llm.schema<T, Model>() function. Furthermore, if the return type is a union type including undefined, it causes a compilation error, because the OpenAPI (JSON schema) specification does not support undefined-able union types.
import { ILlmApplication } from "@samchon/openapi";
import typia, { tags } from "typia";

const app: ILlmApplication<"chatgpt"> = typia.llm.application<
  BbsArticleController,
  "chatgpt"
>();
console.log(app);

interface BbsArticleController {
  /**
   * Create a new article.
   *
   * Writes a new article and archives it into the DB.
   *
   * @param props Properties of create function
   * @returns Newly created article
   */
  create(props: {
    /**
     * Information of the article to create
     */
    input: IBbsArticle.ICreate;
  }): Promise<IBbsArticle | undefined>; // compilation error: undefined-able return type

  erase(id: string & tags.Format<"uuid">): Promise<void>; // compilation error: violates the keyword arguments rule
}