
Step 7 — Next steps

You've built something real. The same Bookmark interface drove validation, JSON serialization, HTTP request handling, and LLM tool descriptions — without writing any of them twice. That's the whole pitch.

Here are good places to go from here.

Reference pages worth bookmarking

| If you want to… | Read |
| --- | --- |
| Decide between `is` / `assert` / `validate` for a specific call | Runtime Validators |
| Look up which tags exist (`Format`, `Minimum`, …) | Special Tags |
| Generate an OpenAPI schema | `json.schemas` |
| Speed up JSON output | `json.assertStringify` |
| Build LLM tools from a TypeScript class | `llm.application` |
| Build LLM tools from an OpenAPI document | `HttpLlm` |
| Understand the parse/coerce/validate harness | `LlmJson` |
| Build a chatbot on top of the tools | Agentica |
| Stream binary data (Protocol Buffers) | `protobuf.encode` |
| Generate test fixtures | `random` |

Playgrounds and demos

  • Playground — paste TypeScript in, see the compiled JavaScript come out. Great for quickly checking what typia emits for a type you're considering.
  • BBS chatbot — full Agentica chatbot built around a CRUD-style class, same architecture as step 5.
  • Shopping chatbot — Agentica driving a real REST API through Swagger / OpenAPI; same idea as step 5, but starting from HttpLlm.controller instead of typia.llm.controller.

Performance facts to keep in mind

  • Runtime validation: up to 20,000× faster than class-validator (benchmark)
  • JSON serialization: up to 200× faster than class-transformer (benchmark)
  • LLM function calling: 6.75% → 100% success rate on hard types when running the typia harness (AutoBe story)

The benchmarks are reproducible — run `pnpm benchmark` from the repo.

Common questions

**My tests pass but my service throws "no transform has been configured" in production.** You built with stock tsc (or your bundler ran SWC/Babel) instead of ttsc / ts-patch / @typia/unplugin. Re-read the verify-your-setup step in Install — the fix is always at the build-tool layer, not in your code.
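For reference, the ts-patch route comes down to registering typia's transformer in tsconfig.json. This is the standard typia setup (plugin path `typia/lib/transform`), though details vary by toolchain:

```json
{
  "compilerOptions": {
    "strict": true,
    "plugins": [
      { "transform": "typia/lib/transform" }
    ]
  }
}
```

With ts-patch installed (typically via a `"prepare": "ts-patch install"` script), plain `tsc` then picks the transformer up. Stock tsc silently ignores the `plugins` field, which is exactly the failure mode described above.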

**The bundler I use isn't listed.** typia covers the common ones via @ttsc/unplugin (modern) and @typia/unplugin (legacy). If yours isn't there, the generation mode — `npx typia generate --input src --output dist-typia` — lets typia emit .ts files ahead of time, which any tool can then compile normally.

**I want to validate against a runtime schema, not a TypeScript type.** That's LlmJson.validate(schema). It is slower than the AOT-compiled validator, but it works when the schema comes from a database or a remote registry.

**I'm building an MCP server / Vercel agent / LangChain agent and want to reuse one class for all three.** You can. `typia.llm.controller<Class>(name, instance)` produces a controller that all three adapters accept. See utilization/mcp, utilization/vercel, utilization/langchain.
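To see why one class can feed three adapters, it helps to picture what a controller is: a name plus a table of invocable functions derived from the class's methods. Below is a hand-rolled illustration of that shape — it is not typia's actual API (the real `typia.llm.controller` also attaches full parameter schemas compiled from the types), and the `BookmarkService` class is assumed for the example:

```typescript
// Illustration only: the "one class, many adapters" pattern that
// typia.llm.controller automates. BookmarkService is a made-up example.
class BookmarkService {
  private items: string[] = [];
  add(url: string): number {
    this.items.push(url);
    return this.items.length;
  }
  list(): string[] {
    return [...this.items];
  }
}

// A minimal controller: the class's methods, each wrapped as an
// invocable function, plus a name for the tool group.
const makeController = (name: string, instance: object) => ({
  name,
  functions: Object.getOwnPropertyNames(Object.getPrototypeOf(instance))
    .filter((k) => k !== "constructor")
    .map((k) => ({
      name: k,
      execute: (...args: unknown[]) => (instance as any)[k](...args),
    })),
});

const controller = makeController("bookmarks", new BookmarkService());
```

Each adapter then only has to translate its own tool-call format into `execute` invocations, which is why the same controller object can serve MCP, Vercel, and LangChain alike.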

Get involved

Thanks for following along.
