# Step 7 — Next steps
You've built something real. The same `Bookmark` interface drove validation, JSON serialization, HTTP request handling, and LLM tool descriptions — without writing any of them twice. That's the whole pitch.
Here are good places to go from here.
## Reference pages worth bookmarking
| If you want to… | Read |
|---|---|
| Decide between `is` / `assert` / `validate` for a specific call | Runtime Validators |
| Look up which tags exist (`Format`, `Minimum`, …) | Special Tags |
| Generate an OpenAPI schema | `json.schemas` |
| Speed up JSON output | `json.assertStringify` |
| Build LLM tools from a TypeScript class | `llm.application` |
| Build LLM tools from an OpenAPI document | `HttpLlm` |
| Understand the parse/coerce/validate harness | `LlmJson` |
| Build a chatbot on top of the tools | Agentica |
| Stream binary data (Protocol Buffer) | `protobuf.encode` |
| Generate test fixtures | `random` |
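If you're unsure which of `is` / `assert` / `validate` to reach for, the difference is in the calling convention, not the checks. The sketch below is a conceptual illustration only — typia generates this logic from the type at compile time, and the hand-written functions here merely mimic the three contracts:

```typescript
// Conceptual sketch only: typia generates these checks from the
// `Bookmark` type at compile time. The hand-written logic below just
// illustrates the three calling conventions, not typia's output.
interface Bookmark {
  url: string;
  tags: string[];
}

// `is`-style: a boolean type guard, no error details.
function isBookmark(input: unknown): input is Bookmark {
  const x = input as Bookmark;
  return (
    typeof x === "object" && x !== null &&
    typeof x.url === "string" &&
    Array.isArray(x.tags) && x.tags.every((t) => typeof t === "string")
  );
}

// `assert`-style: returns the value typed, or throws with a reason.
function assertBookmark(input: unknown): Bookmark {
  if (!isBookmark(input)) throw new Error("input is not a Bookmark");
  return input;
}

// `validate`-style: never throws; reports success plus collected errors.
function validateBookmark(input: unknown):
  | { success: true; data: Bookmark }
  | { success: false; errors: string[] } {
  return isBookmark(input)
    ? { success: true, data: input }
    : { success: false, errors: ["input is not a Bookmark"] };
}

console.log(isBookmark({ url: "https://example.com", tags: ["dev"] })); // true
console.log(validateBookmark(42).success); // false
```

Rule of thumb: `is` for branching, `assert` for trusted-or-die boundaries, `validate` when you need to report every problem back to the caller.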
## Playgrounds and demos
- Playground — paste TypeScript in, see the compiled JavaScript come out. Great for quickly checking what typia emits for a type you're considering.
- BBS chatbot — full Agentica chatbot built around a CRUD-style class, same architecture as step 5.
- Shopping chatbot — Agentica driving a real REST API through Swagger/OpenAPI; same idea as step 5, but starting from `HttpLlm.controller` instead of `typia.llm.controller`.
## Performance facts to keep in mind
- Runtime validation: up to 20,000× faster than `class-validator` (benchmark)
- JSON serialization: up to 200× faster than `class-transformer` (benchmark)
- LLM function calling: 6.75% → 100% success rate on hard types when running the typia harness (AutoBe story)

The benchmarks are reproducible — run `pnpm benchmark` from the repo.
## Common questions
**My tests pass but my service throws "no transform has been configured" in production.**

You built with stock `tsc` (or your bundler ran SWC/Babel) instead of `ttsc` / `ts-patch` / `@typia/unplugin`. Re-read the verify-your-setup step in Install — the fix is always at the build-tool layer, not in your code.
**The bundler I use isn't listed.**

typia covers the common ones via `@ttsc/unplugin` (modern) and `@typia/unplugin` (legacy). If yours isn't there, the generation mode — `npx typia generate --input src --output dist-typia` — lets typia emit `.ts` files ahead of time, which any tool can then compile normally.
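If you adopt the generation mode, one common wiring is an npm `prebuild` lifecycle script so the generated files always exist before compilation. This fragment is a sketch — the script names follow npm conventions, and pointing `tsc` at the generated output depends on your own tsconfig layout:

```json
{
  "scripts": {
    "prebuild": "typia generate --input src --output dist-typia",
    "build": "tsc"
  }
}
```

Because `prebuild` runs automatically before `build`, any tool that can compile ordinary TypeScript picks up the pre-transformed sources without needing a typia-aware plugin.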
**I want to validate against a runtime schema, not a TypeScript type.**

That's `LlmJson.validate(schema)`. Slower than the AOT-compiled validator, but it works when the schema comes from a database or a remote registry.
**I'm building an MCP server / Vercel agent / LangChain agent and want to reuse one class for all three.**

You can. `typia.llm.controller<Class>(name, instance)` produces a controller that all three adapters accept. See utilization/mcp, utilization/vercel, and utilization/langchain.
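The reuse pattern itself is simple to picture. In the sketch below, the adapter functions are placeholders standing in for the real MCP / Vercel / LangChain bridges (their names and the `ToolController` shape are invented for illustration, not real package APIs); the point is that one controller object feeds all three consumers:

```typescript
// Sketch of the reuse pattern. The adapter functions are placeholders,
// not real package APIs; `ToolController` is an invented shape.
interface ToolController {
  name: string;
  functions: { name: string; call: (args: unknown) => unknown }[];
}

class BookmarkService {
  create(url: string) { return { url }; }
}

// With typia this would be typia.llm.controller<BookmarkService>("bookmarks", service);
// here the same shape is built by hand purely for illustration.
function makeController(name: string, service: BookmarkService): ToolController {
  return {
    name,
    functions: [{ name: "create", call: (args) => service.create(String(args)) }],
  };
}

// Placeholder adapters: each consumes the same controller object.
const toMcpTools = (c: ToolController) => c.functions.map((f) => f.name);
const toVercelTools = (c: ToolController) => c.functions.map((f) => f.name);
const toLangchainTools = (c: ToolController) => c.functions.map((f) => f.name);

const controller = makeController("bookmarks", new BookmarkService());
console.log(toMcpTools(controller)); // [ 'create' ]
```

The class stays the single source of truth; each runtime only sees the controller's function list.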
## Get involved
- GitHub — github.com/samchon/typia (issues, PRs, discussions)
- Discord — discord.gg/E94XhzrUCZ
- Donate — opencollective.com/typia
- Bug reports — open an issue with a minimal reproduction (a single `.ts` file is plenty)
Thanks for following along.