parameters() function

typia
export namespace llm {
  // LLM FUNCTION CALLING APPLICATION SCHEMA
  export function application<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<Pick<ILlmApplication.IOptions<Model>, "separate">>,
  ): ILlmApplication<Model>;
 
  // +VALIDATE FUNCTION EMBEDDED
  export function applicationOfValidate<
    App extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    options?: Partial<Pick<ILlmApplicationOfValidate.IOptions<Model>, "separate">>,
  ): ILlmApplicationOfValidate<Model>;
 
  // STRUCTURED OUTPUT
  export function parameters<
    Parameters extends Record<string, any>,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(): ILlmSchema.ModelParameters[Model];
 
  // TYPE SCHEMA
  export function schema<
    T,
    Model extends ILlmSchema.Model,
    Config extends Partial<ILlmSchema.ModelConfig[Model]> = {},
  >(
    ...$defs: Extract<
      ILlmSchema.ModelSchema[Model],
      { $ref: string }
    > extends never
      ? []
      : [Record<string, ILlmSchema.ModelSchema[Model]>]
  ): ILlmSchema.ModelSchema[Model];
}

Structured output schema of LLM (Large Language Model).

typia.llm.parameters<Parameters, Model>() is a function that generates a structured output schema of LLM (Large Language Model) from a TypeScript object type. It is used for the LLM function calling or structured output features provided by LLM providers like OpenAI.

The return value type ILlmSchema.IParameters is similar to the JSON schema definition’s object type. However, its detailed specification differs depending on the LLM provider model you’ve chosen. Here is the list of LLM schema definitions for each model. Choose one of them after carefully reading its LLM schema definition.
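As a sketch of how the Model type argument selects the schema dialect (assuming typia's compiler transform is configured in your build), the same TypeScript type can yield structurally different schemas per provider:

```typescript
import typia, { tags } from "typia";

interface IMember {
  email: string & tags.Format<"email">;
  age: number;
}

// Each Model type argument selects a provider-specific schema dialect,
// so the same TypeScript type can produce different JSON schema shapes.
const chatgptSchema = typia.llm.parameters<IMember, "chatgpt">();
const geminiSchema = typia.llm.parameters<IMember, "gemini">();

console.log(chatgptSchema); // OpenAI-flavored parameters schema
console.log(geminiSchema); // Gemini-flavored parameters schema
```

Note that this snippet only compiles when the typia transformer plugin is enabled; the exact output shapes follow each model's schema definition linked above.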

LLM Function Calling and Structured Output

The LLM selects a proper function and fills its arguments.

Nowadays, most LLMs (Large Language Models) like OpenAI’s support the “function calling” feature. “LLM function calling” means that the LLM automatically selects a proper function and fills its parameter values from the conversation with the user (maybe by chatting text).

Structured output is another feature of LLMs. “Structured output” means that the LLM automatically transforms its output conversation into a structured data format like JSON.

Structured Output

src/examples/llm.parameters.ts
import OpenAI from "openai";
import typia, { tags } from "typia";
 
interface IMember {
  email: string & tags.Format<"email">;
  name: string;
  age: number;
  hobbies: string[];
  joined_at: string & tags.Format<"date">;
}
 
const main = async (): Promise<void> => {
  const client: OpenAI = new OpenAI({
    apiKey: "<YOUR_OPENAI_API_KEY>",
  });
  const completion: OpenAI.ChatCompletion =
    await client.chat.completions.create({
      model: "gpt-4o",
      messages: [
        {
          role: "user",
          content: [
            "I am a new member of the community.",
            "",
            "My name is John Doe, and I am 25 years old.",
            "I like playing basketball and reading books,",
            "and joined to this community at 2022-01-01.",
          ].join("\n"),
        },
      ],
      response_format: {
        type: "json_schema",
        json_schema: {
          name: "member",
          schema: typia.llm.parameters<IMember, "chatgpt">() as any,
        },
      },
    });
  console.log(JSON.parse(completion.choices[0].message.content!));
};
main().catch(console.error);
Terminal
{
  email: 'john.doe@example.com',
  name: 'John Doe',
  age: 25,
  hobbies: [ 'playing basketball', 'reading books' ],
  joined_at: '2022-01-01'
}

You can utilize the typia.llm.parameters<Parameters, Model>() function to generate structured output as above.

Just configure the output mode as JSON schema, and deliver the value returned by typia.llm.parameters<Parameters, Model>() to the LLM provider like OpenAI (ChatGPT). Then, the LLM provider will automatically transform the output conversation into the structured data format of the Parameters type.
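As a sketch (assuming a simplified IMember interface and the typia transform configured), you could also replace the bare JSON.parse call with typia.json.assertParse<T>(), which parses and validates the LLM's response text in one step:

```typescript
import typia, { tags } from "typia";

interface IMember {
  email: string & tags.Format<"email">;
  name: string;
  age: number;
}

// Suppose `raw` is the structured output text returned by the LLM
const raw: string = `{"email":"john.doe@example.com","name":"John Doe","age":25}`;

// typia.json.assertParse<T>() parses the JSON string and asserts that
// the parsed value conforms to IMember, throwing a TypeGuardError otherwise.
const member: IMember = typia.json.assertParse<IMember>(raw);
console.log(member.name); // "John Doe"
```

This way, a response that violates the schema fails loudly at the parsing boundary instead of propagating a wrongly typed value through your code.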

Validation Feedback

src/examples/llm.parameters.ts
import OpenAI from "openai";
import typia, { IValidation, tags } from "typia";
 
interface IMember {
  email: string & tags.Format<"email">;
  name: string;
  age: number;
  hobbies: string[];
  joined_at: string & tags.Format<"date">;
}
 
const step = async (
  failure?: IValidation.IFailure | undefined,
): Promise<IValidation<IMember>> => {
  const client: OpenAI = new OpenAI({
    apiKey: "<YOUR_OPENAI_API_KEY>",
  });
  const completion: OpenAI.ChatCompletion =
    await client.chat.completions.create({
      model: "gpt-4o",
      messages: [
        {
          role: "user",
          content: [
            "I am a new member of the community.",
            "",
            "My name is John Doe, and I am 25 years old.",
            "I like playing basketball and reading books,",
            "and joined to this community at 2022-01-01.",
          ].join("\n"),
        },
        ...(failure
          ? [
              {
                role: "system",
                content: [
                  "You A.I. agent had made a mistake of",
                  "returning wrongly typed structured data.",
                  "",
                  "Here is the detailed list of type errors.",
                  "Review and correct them at the next step.",
                  "",
                  "```json",
                  JSON.stringify(failure.errors, null, 2),
                  "```",
                ].join("\n"),
              } satisfies OpenAI.ChatCompletionSystemMessageParam,
            ]
          : []),
      ],
      response_format: {
        type: "json_schema",
        json_schema: {
          name: "member",
          schema: typia.llm.parameters<IMember, "chatgpt">() as any,
        },
      },
    });
  const member: IMember = JSON.parse(completion.choices[0].message.content!);
  return typia.validate(member);
};
 
const main = async (): Promise<void> => {
  let result: IValidation<IMember> | undefined = undefined;
  for (let i: number = 0; i < 2; ++i) {
    if (result && result.success === true) break;
    result = await step(result);
  }
  console.log(result);
};
 
main().catch(console.error);
Terminal
{
  email: 'john.doe@example.com',
  name: 'John Doe',
  age: 25,
  hobbies: [ 'playing basketball', 'reading books' ],
  joined_at: '2022-01-01'
}

Sometimes, the LLM makes a mistake and composes wrongly typed structured data.

In that case, you can guide the LLM (Large Language Model) to generate correctly typed structured data in the next step just by delivering the validation error message of the typia.validate<T>() function as a system prompt, like above.

Note that if you are developing an A.I. chatbot project, such a validation feedback strategy is essential for both the LLM function calling and structured output features. According to my experiments, even when the LLM composes wrongly typed structured data, it is always corrected by just one validation feedback step.

Restrictions

typia.llm.parameters<Parameters, Model>() follows the same restrictions as the typia.llm.schema<T, Model>() function. It also has one additional restriction: the keyworded argument.

In LLM function calling and structured output, the parameters must be a keyworded object type with static keys and no dynamic keys. Also, the object type must not be nullable or optional.

If you don’t follow the LLM’s keyworded-arguments rule, typia.llm.parameters<Parameters, Model>() will throw a compilation error like below.

src/examples/llm.parameters.violation.ts
import typia from "typia";
 
typia.llm.parameters<string, "chatgpt">(); // compilation error: not an object type
typia.llm.parameters<Record<string, boolean>, "chatgpt">(); // compilation error: dynamic keys are not allowed
typia.llm.parameters<Array<number>, "chatgpt">(); // compilation error: not an object type
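To satisfy the keyworded-arguments rule, wrap such values in a named object type with static keys. A minimal sketch (the IWrapper interface and its key names are hypothetical, chosen only for illustration):

```typescript
import typia from "typia";

// Wrap non-object values in an object type whose keys are all static.
interface IWrapper {
  value: string; // instead of a bare `string` parameters type
  flags: boolean[]; // a fixed key instead of Record<string, boolean>
  numbers: Array<number>; // instead of a bare `Array<number>`
}

// This compiles, because IWrapper is a non-nullable, non-optional
// object type with only static keys.
const schema = typia.llm.parameters<IWrapper, "chatgpt">();
console.log(schema);
```

The wrapper costs nothing at runtime; it merely gives the LLM named slots to fill, which is exactly what the function calling protocol expects.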