Super A.I. Chatbot


The demonstration video above shows a BBS chatbot built with the typia.llm.applicationOfValidate function.

As you can see, in the BBS A.I. chatbot application, the user can do everything defined in the TypeScript class through conversation alone: writing and reading articles requires nothing more than chat messages.

Just by delivering a TypeScript class type to the typia.llm.applicationOfValidate<App, Model>() function, a Super A.I. chatbot performing LLM (Large Language Model) function calling is automatically composed. The chatbot selects the proper functions defined in the TypeScript class type by analyzing its conversation with the user, asks the user in conversation for the arguments the selected functions need, and then actually calls those functions with the composed arguments. This is the key concept of the typia and nestia A.I. chatbot.
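Conceptually, the select-fill-execute cycle can be pictured as a small dispatch step. The sketch below is a toy illustration of that idea only; all names are hypothetical, and the real loop lives inside @nestia/agent:

```typescript
// Toy sketch of an LLM function-calling dispatch step: the model has
// already selected a function name and composed its arguments from the
// conversation; the agent looks the function up and executes it.
// All names here are hypothetical, not the @nestia/agent API.
interface IFunction {
  name: string;
  description: string;
  execute: (args: Record<string, unknown>) => unknown;
}

function dispatch(
  functions: IFunction[],
  selection: { name: string; arguments: Record<string, unknown> },
): unknown {
  const target = functions.find((f) => f.name === selection.name);
  if (target === undefined)
    throw new Error(`No such function: ${selection.name}`);
  // Actually call the selected function with the LLM-composed arguments
  return target.execute(selection.arguments);
}
```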

In other words, every TypeScript class can be converted into an A.I. chatbot. In the new A.I. era, you no longer need to develop GUI (Graphical User Interface) applications. Just prepare a TypeScript class with sufficient documentation, and let the A.I. chatbot do the rest. The A.I. chatbot can replace your next GUI application, and it can be more efficient and user-friendly than traditional GUI applications.

Application Setup

Terminal
npm create vite@latest bbs -- --template react-ts
cd bbs
 
npm install @nestia/agent @nestia/chat @samchon/openapi openai
npx typia setup
npm install -D @ryoppippi/unplugin-typia

I’ll describe how to set up the A.I. chatbot application within the vite framework.

First, create a vite + react + typescript project by executing the npm create vite@latest bbs -- --template react-ts command, then move into the created project directory bbs. Next, install @nestia/chat and its dependency packages. After that, run the npx typia setup and npm install -D @ryoppippi/unplugin-typia commands to set up the typia library with its transformer configuration.

Finally, configure the vite.config.ts file with the @ryoppippi/unplugin-typia plugin as shown below. Import the @ryoppippi/unplugin-typia/vite module and add it to the plugins array. You can then use typia in the vite frontend development environment.

vite.config.ts
import typia from "@ryoppippi/unplugin-typia/vite";
import react from "@vitejs/plugin-react";
import { defineConfig } from "vite";
 
export default defineConfig({
  base: "./",
  plugins: [
    react(), 
    typia(),
  ],
});
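For reference, the npx typia setup command also registers typia’s transformer plugin in tsconfig.json. The relevant fragment looks roughly like the following (your other compiler options will differ):

```json
{
  "compilerOptions": {
    "plugins": [
      { "transform": "typia/lib/transform" }
    ]
  }
}
```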

From now on, let’s start the A.I. chatbot development.

Create a NestiaAgent instance with the OpenAI API key, the model name, and the LLM function calling schema composed by the typia.llm.applicationOfValidate<BbsArticleService, "chatgpt">() function. Then render the NestiaChatApplication component with the NestiaAgent instance.

For reference, when developing the TypeScript controller class, the description comments written on each controller method and DTO schema type are very important. The A.I. chatbot uses them to understand the purpose of each controller method and the characteristics of the DTO schema types.

src/BbsChatApplication.tsx
import { NestiaAgent } from "@nestia/agent";
import { NestiaChatApplication } from "@nestia/chat";
import OpenAI from "openai";
import typia from "typia";
 
import { BbsArticleService } from "./BbsArticleService";
 
export const BbsChatApplication = (props: BbsChatApplication.IProps) => {
  const service: BbsArticleService = new BbsArticleService();
  const agent: NestiaAgent = new NestiaAgent({
    provider: {
      type: "chatgpt",
      api: new OpenAI({
        apiKey: props.apiKey,
        dangerouslyAllowBrowser: true,
      }),
      model: props.model ?? "gpt-4o-mini",
    },
    controllers: [
      {
        protocol: "class",
        name: "bbs",
        application: typia.llm.applicationOfValidate<
          BbsArticleService,
          "chatgpt"
        >(),
        execute: async (props) => {
          return (service as any)[props.function.name](props.arguments);
        },
      },
    ],
    config: {
      locale: props.locale,
      timezone: props.timezone,
    },
  });
  return <NestiaChatApplication agent={agent} />;
};
export namespace BbsChatApplication {
  export interface IProps {
    apiKey: string;
    model?: OpenAI.ChatModel;
    locale?: string;
    timezone?: string;
  }
}
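The BbsArticleService class imported above is not shown in this article; a minimal hypothetical sketch of what it might look like follows, to illustrate the documentation comments the chatbot relies on. The member names and DTO shape here are assumptions, not the actual implementation:

```typescript
// Hypothetical sketch of the BbsArticleService class assumed by the chat
// application above. The doc comments on each method and DTO property are
// what the LLM reads to pick a function and fill in its arguments.
export interface IBbsArticle {
  /** Primary key of the article. */
  id: number;
  /** Title shown in the article list. */
  title: string;
  /** Markdown body of the article. */
  body: string;
}

export class BbsArticleService {
  private articles: IBbsArticle[] = [];
  private sequence = 0;

  /**
   * List every article on the bulletin board.
   *
   * @returns All articles written so far
   */
  public index(): IBbsArticle[] {
    return this.articles;
  }

  /**
   * Write a new article on the bulletin board.
   *
   * @param props Title and body of the article to create
   * @returns The newly created article
   */
  public create(props: { title: string; body: string }): IBbsArticle {
    const article: IBbsArticle = { id: ++this.sequence, ...props };
    this.articles.push(article);
    return article;
  }
}
```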

Backend Development


You can also make a super A.I. chatbot from a Swagger document.

Until now, we’ve learned how to build an A.I. chatbot from a TypeScript class type. However, @nestia/agent and @nestia/chat support another way of building an A.I. chatbot: the Swagger document. If you have a backend server that provides a Swagger document, you can create a super A.I. chatbot from it as well.

Just deliver the Swagger document to @nestia/agent and @nestia/chat, and you can start a conversation with the Super A.I. chatbot. The agent automatically analyzes the Swagger document and converts it into LLM function calling schemas, so that the chatbot can select the proper functions to call based on the conversation context with the user.

If you want to learn how to build an A.I. chatbot from a Swagger document, read the document below. The most important point is that every backend server providing a Swagger document can be converted into an A.I. chatbot too. Therefore, in the new A.I. era, you no longer need to develop GUI (Graphical User Interface) applications. Just develop a TypeScript class or a backend server, and let the A.I. chatbot do the rest. The A.I. chatbot can replace GUI application development, and it can be more efficient and user-friendly than traditional GUI applications.
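The Swagger-to-function-schema conversion can be pictured as a mapping from API operations to LLM function calling schemas. The sketch below is a toy illustration of that idea only; it is not the actual @samchon/openapi or @nestia/agent API, and every name in it is hypothetical:

```typescript
// Toy illustration of flattening an OpenAPI operation into an LLM
// function-calling schema. The real conversion is performed inside the
// nestia libraries; the shapes and names here are simplified assumptions.
interface IOperation {
  path: string; // e.g. "/bbs/articles"
  method: string; // e.g. "post"
  description?: string; // operation description shown to the LLM
  parameters: Record<string, { type: string; description?: string }>;
}

interface IFunctionSchema {
  name: string;
  description?: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required: string[];
  };
}

function toFunctionSchema(op: IOperation): IFunctionSchema {
  return {
    // "post" + "/bbs/articles" -> "post_bbs_articles"
    name: `${op.method}${op.path.replace(/\//g, "_")}`,
    description: op.description,
    parameters: {
      type: "object",
      properties: op.parameters,
      required: Object.keys(op.parameters),
    },
  };
}
```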

Make your A.I. Chatbot

The @nestia/agent and @nestia/chat libraries above are just for testing and demonstration. I made them to prove a concept: that every TypeScript class can be converted into an A.I. chatbot, and that typia / nestia are especially efficient for A.I. chatbot development.

However, @nestia/agent supports only OpenAI and has not been optimized for any specific purpose. As it has not been tuned with any RAG (Retrieval Augmented Generation) models, it may consume more LLM cost than you expect. Therefore, use @nestia/agent for studying A.I. chatbot development, or for demonstrating your TypeScript class before production development.

Wrtn OS

Wrtn Logo

https://wrtnlabs.io

The new era of software development.

If you are not familiar with LLM (Large Language Model) development or RAG implementation, you can take another option. Prepare your swagger document file and visit the WrtnLabs homepage https://wrtnlabs.io. You can create your own A.I. chatbot with “Wrtn OS” and redistribute it as you want. The A.I. assistant in Wrtn OS is much more optimized and cost efficient than @nestia/agent, and it is fully open sourced.

Also, you can sell your swagger document (backend API functions) in the “Wrtn Store”, letting other users create their own A.I. chatbots with your backend API functions. Conversely, you can purchase the functions you need for an A.I. chatbot from the store. If you create an A.I. chatbot using only functions purchased from the Wrtn Store, that is no-code development.

I think this is a new way of developing and distributing software. It is a new era of software development, and I hope you will be a part of it.