typia.llm.parameters() function
export namespace llm {
  // LLM FUNCTION CALLING APPLICATION SCHEMA
  export function application<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<Pick<ILlmApplication.IOptions<Model>, "separate">>,
  ): ILlmApplication<Model>;

  // +VALIDATE FUNCTION EMBEDDED
  export function applicationOfValidate<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<Pick<ILlmApplicationOfValidate.IOptions<Model>, "separate">>,
  ): ILlmApplicationOfValidate<Model>;

  // STRUCTURED OUTPUT
  export function parameters<
    Parameters extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(): ILlmSchema.ModelParameters[Model];

  // TYPE SCHEMA
  export function schema<
    T,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    ...$defs: Extract<
      ILlmSchema.ModelSchema[Model],
      { $ref: string }
    > extends never
      ? []
      : [Record<string, ILlmSchema.ModelSchema[Model]>]
  ): ILlmSchema.ModelSchema[Model];
}
Structured output schema of LLM (Large Language Model).

typia.llm.parameters<Parameters, Model>()
is a function generating a structured output schema of LLM (Large Language Model) from a TypeScript object type. It is used for the LLM function calling or structured output features provided by LLM providers such as OpenAI.

The return value type ILlmSchema.IParameters
is similar to the JSON schema definition's object type. However, its detailed specification differs depending on the LLM provider model you've chosen. Here is the list of LLM schema definitions for each model. Choose one of them after carefully reading the LLM schema definitions.
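As a rough illustration of what the generated value looks like, here is a hedged sketch. IMember is a hypothetical type, and the object literal below is a hand-written approximation of a ChatGPT-style parameters schema, not typia's exact output:

```typescript
// Hypothetical usage (requires the typia transformer to compile):
//   interface IMember { name: string; age: number; }
//   const parameters = typia.llm.parameters<IMember, "chatgpt">();
//
// The generated value is a plain, JSON-schema-like object.
// This literal only approximates what typia would generate:
const parameters = {
  type: "object",
  properties: {
    name: { type: "string" },
    age: { type: "number" },
  },
  required: ["name", "age"],
  additionalProperties: false,
};

console.log(parameters.required); // [ 'name', 'age' ]
```

Note that the exact shape (for example, whether $defs references are used) varies by the chosen model; consult the schema definition of your target model.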
- Supported schemas
  - IChatGptSchema: OpenAI ChatGPT
  - IClaudeSchema: Anthropic Claude
  - IGeminiSchema: Google Gemini
  - ILlamaSchema: Meta Llama
- Middle layer schemas
  - ILlmSchemaV3: middle layer based on the OpenAPI v3.0 specification
  - ILlmSchemaV3_1: middle layer based on the OpenAPI v3.1 specification
LLM Function Calling and Structured Output
LLM selects a proper function and fills its arguments.

Nowadays, most LLM (Large Language Model) providers like OpenAI support the "function calling" feature. "LLM function calling" means that the LLM automatically selects a proper function and fills its parameter values from the conversation with the user (typically chat text).

Structured output is another LLM feature. "Structured output" means that the LLM automatically transforms its output conversation into a structured data format like JSON.
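As a minimal sketch of the consuming side, assuming the LLM was instructed to reply with JSON matching the parameters schema (parseStructuredOutput is a hypothetical helper, not part of typia):

```typescript
interface IMember {
  name: string;
  age: number;
}

// Hypothetical helper: parse a structured-output reply and
// sanity-check that it matches the expected shape.
function parseStructuredOutput(raw: string): IMember {
  const data = JSON.parse(raw);
  if (typeof data.name !== "string" || typeof data.age !== "number")
    throw new Error("LLM output does not match the expected schema");
  return data as IMember;
}

const member = parseStructuredOutput('{"name":"John Doe","age":30}');
console.log(member.name); // John Doe
```

In practice, typia's validate-embedded variants (e.g. applicationOfValidate above) are meant to take over this kind of runtime checking instead of hand-written guards.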
Specialization
Customization
Restrictions
typia.llm.parameters<Parameters, Model>()
follows the same restrictions as the typia.llm.schema<T, Model>()
function. It also has one additional restriction: the keyworded argument rule.

In LLM function calling and structured output, the parameters must be a keyworded object type with static keys only, without any dynamic keys. Also, the object type must not be nullable or optional.

If you don't follow the LLM's keyworded arguments rule, typia.llm.parameters<Parameters, Model>()
will throw a compilation error like below.
import typia from "typia";

// All of the calls below throw compilation errors:
typia.llm.parameters<string, "chatgpt">(); // not an object type
typia.llm.parameters<Record<string, boolean>, "chatgpt">(); // dynamic keys are not allowed
typia.llm.parameters<Array<number>, "chatgpt">(); // not a keyworded object type
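For contrast, a type satisfying the rule might look like the hypothetical interface below: every key is static, and the object itself is neither nullable nor optional:

```typescript
// Hypothetical parameters type: static keys only, non-nullable, non-optional.
interface IRegisterMemberParameters {
  email: string;
  name: string;
  age: number;
}

// A value of this shape is what the LLM would fill in when such a type
// is passed as the Parameters argument, e.g.
// typia.llm.parameters<IRegisterMemberParameters, "chatgpt">().
const example: IRegisterMemberParameters = {
  email: "john@example.com",
  name: "John Doe",
  age: 30,
};
console.log(example.email); // john@example.com
```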