Overview
Agents enable the development of agentic AI applications that can generate responses and execute tasks autonomously. An Agent uses the LLM SDK to talk to different language models and lets you define instructions, tools, and other language model parameters.
We provide Agent implementations in the following programming languages: JavaScript/TypeScript, Rust, and Go.
Below is an example of how to implement an agent in each language:
TypeScript:

```ts
import { Agent, tool, type AgentItem } from "@hoangvvo/llm-agent";
import { typeboxTool } from "@hoangvvo/llm-agent/typebox";
import { zodTool } from "@hoangvvo/llm-agent/zod";
import { Type } from "@sinclair/typebox";
import readline from "node:readline/promises";
import { z } from "zod";
import { getModel } from "./get-model.ts";

// Define the context interface that can be accessed in the instructions and tools
interface MyContext {
  userName: string;
}

// Define the model to use for the Agent
const model = getModel("openai", "gpt-4o");

// Define the agent tools
const getTimeTool = tool({
  name: "get_time",
  description: "Get the current time",
  parameters: {
    type: "object",
    properties: {},
    additionalProperties: false,
  },
  execute() {
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify({
            current_time: new Date().toISOString(),
          }),
        },
      ],
      is_error: false,
    };
  },
});

// Create an agent tool using @sinclair/typebox with type inference
// npm install @sinclair/typebox
const getWeatherTool = typeboxTool({
  name: "get_weather",
  description: "Get weather for a given city",
  parameters: Type.Object(
    {
      city: Type.String({ description: "The name of the city" }),
    },
    { additionalProperties: false },
  ),
  execute(params) {
    // inferred as { city: string }
    const { city } = params;
    console.log(`Getting weather for ${city}`);
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify({
            city,
            forecast: "Sunny",
            temperatureC: 25,
          }),
        },
      ],
      is_error: false,
    };
  },
});

// Create an agent tool using zod with type inference
// npm install zod zod-to-json-schema
const sendMessageTool = zodTool({
  name: "send_message",
  description: "Send a text message",
  parameters: z.object({
    message: z.string().min(1).max(500),
    phoneNumber: z.string(),
  }),
  execute(params) {
    // inferred as { message: string, phoneNumber: string }
    const { message, phoneNumber } = params;
    console.log(`Sending message to ${phoneNumber}: ${message}`);
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify({
            success: true,
          }),
        },
      ],
      is_error: false,
    };
  },
});

// Create the Agent
const myAssistant = new Agent<MyContext>({
  name: "Mai",
  model,
  instructions: [
    "You are Mai, a helpful assistant. Answer questions to the best of your ability.",
    // Dynamic instruction (context)
    (context) => `You are talking to ${context.userName}.`,
  ],
  tools: [getTimeTool, getWeatherTool, sendMessageTool],
});

// Implement the CLI to interact with the Agent
const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
});

const userName = await rl.question("Your name: ");

const context: MyContext = {
  userName,
};

console.log(`Type 'exit' to quit`);

const items: AgentItem[] = [];

let userInput = "";
while (userInput !== "exit") {
  userInput = (await rl.question("> ")).trim();
  if (!userInput) {
    continue;
  }

  if (userInput.toLowerCase() === "exit") {
    break;
  }

  // Add user message as the input
  items.push({
    type: "message",
    role: "user",
    content: [
      {
        type: "text",
        text: userInput,
      },
    ],
  });

  // Call assistant
  const response = await myAssistant.run({
    context,
    input: items,
  });

  // Append items with the output items
  items.push(...response.output);

  console.dir(response, { depth: null });
}
```
Rust:

```rust
use async_trait::async_trait;
use dotenvy::dotenv;
use llm_agent::{Agent, AgentItem, AgentRequest, AgentTool, AgentToolResult, RunState};
use llm_sdk::{
    openai::{OpenAIModel, OpenAIModelOptions},
    JSONSchema, Message, Part,
};
use schemars::JsonSchema;
use serde::Deserialize;
use serde_json::{json, Value};
use std::{
    env,
    error::Error,
    io::{self, Write},
    sync::Arc,
};

// Define the context interface that can be accessed in the instructions and
// tools
#[derive(Clone)]
struct MyContext {
    pub user_name: String,
}

#[derive(Debug, Deserialize)]
struct GetWeatherParams {
    city: String,
}

// Define the agent tools
struct GetWeatherTool;

#[async_trait]
impl AgentTool<MyContext> for GetWeatherTool {
    fn name(&self) -> String {
        "get_weather".to_string()
    }

    fn description(&self) -> String {
        "Get weather for a given city".to_string()
    }

    fn parameters(&self) -> JSONSchema {
        json!({
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "The city to get the weather for"
                }
            },
            "required": ["city"],
            "additionalProperties": false
        })
    }

    async fn execute(
        &self,
        args: Value,
        _context: &MyContext,
        _state: &RunState,
    ) -> Result<AgentToolResult, Box<dyn Error + Send + Sync>> {
        let params: GetWeatherParams = serde_json::from_value(args)?;
        println!("Getting weather for {}", params.city);

        Ok(AgentToolResult {
            content: vec![Part::text(
                json!({
                    "city": params.city,
                    "forecast": "Sunny",
                    "temperatureC": 25
                })
                .to_string(),
            )],
            is_error: false,
        })
    }
}

// Define the JSON schema using `schemars` crate
#[derive(Debug, Deserialize, JsonSchema)]
#[serde(deny_unknown_fields)]
struct SendMessageParams {
    #[schemars(description = "The message to send")]
    message: String,
    #[schemars(description = "The phone number to send the message to")]
    phone_number: String,
}

struct SendMessageTool;

#[async_trait]
impl AgentTool<MyContext> for SendMessageTool {
    fn name(&self) -> String {
        "send_message".to_string()
    }

    fn description(&self) -> String {
        "Send a text message to a phone number".to_string()
    }

    fn parameters(&self) -> JSONSchema {
        schemars::schema_for!(SendMessageParams).into()
    }

    async fn execute(
        &self,
        args: Value,
        _context: &MyContext,
        _state: &RunState,
    ) -> Result<AgentToolResult, Box<dyn Error + Send + Sync>> {
        let params: SendMessageParams = serde_json::from_value(args)?;
        println!(
            "Sending message to {}: {}",
            params.phone_number, params.message
        );

        Ok(AgentToolResult {
            content: vec![Part::text(
                json!({
                    "message": params.message,
                    "status": "sent"
                })
                .to_string(),
            )],
            is_error: false,
        })
    }
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    dotenv().ok();

    // Define the model to use for the Agent
    let model = Arc::new(OpenAIModel::new(
        "gpt-4o",
        OpenAIModelOptions {
            api_key: env::var("OPENAI_API_KEY")
                .expect("OPENAI_API_KEY environment variable must be set"),
            ..Default::default()
        },
    ));

    // Create the Agent
    let my_assistant = Agent::<MyContext>::builder("Mai", model)
        .add_instruction(
            "You are Mai, a helpful assistant. Answer questions to the best of your ability.",
        )
        .add_instruction(|ctx: &MyContext| Ok(format!("You are talking to {}", ctx.user_name)))
        .add_tool(GetWeatherTool)
        .add_tool(SendMessageTool)
        .build();

    // Implement the CLI to interact with the Agent
    let mut items = Vec::<AgentItem>::new();

    // Get user name
    let user_name = read_line("Your name: ")?;

    let context = MyContext { user_name };

    println!("Type 'exit' to quit");

    loop {
        let user_input = read_line("> ")?;

        if user_input.is_empty() {
            continue;
        }

        if user_input.to_lowercase() == "exit" {
            break;
        }

        // Add user message as the input
        items.push(AgentItem::Message(Message::user(vec![Part::text(
            user_input,
        )])));

        // Call assistant
        let response = my_assistant
            .run(AgentRequest {
                context: context.clone(),
                input: items.clone(),
            })
            .await?;

        // Append items with the output items
        items.extend(response.output.clone());

        println!("{response:#?}");
    }

    Ok(())
}

fn read_line(prompt: &str) -> io::Result<String> {
    print!("{prompt}");
    io::stdout().flush()?;

    let mut input = String::new();
    io::stdin().read_line(&mut input)?;
    Ok(input.trim().to_string())
}
```
Go:

```go
package main

import (
	"bufio"
	"context"
	"encoding/json"
	"fmt"
	"log"
	"os"
	"strings"

	llmagent "github.com/hoangvvo/llm-sdk/agent-go"
	llmsdk "github.com/hoangvvo/llm-sdk/sdk-go"
	"github.com/hoangvvo/llm-sdk/sdk-go/openai"
	"github.com/joho/godotenv"
	"github.com/sanity-io/litter"
)

// Define the context interface that can be accessed in the instructions and
// tools
type MyContext struct {
	UserName string
}

type GetWeatherParams struct {
	City string `json:"city"`
}

// Define the agent tools
type GetWeatherTool struct{}

func (t *GetWeatherTool) Name() string {
	return "get_weather"
}

func (t *GetWeatherTool) Description() string {
	return "Get weather for a given city"
}

func (t *GetWeatherTool) Parameters() llmsdk.JSONSchema {
	return llmsdk.JSONSchema{
		"type": "object",
		"properties": map[string]any{
			"city": map[string]any{
				"type":        "string",
				"description": "The city to get the weather for",
			},
		},
		"required":             []string{"city"},
		"additionalProperties": false,
	}
}

func (t *GetWeatherTool) Execute(ctx context.Context, paramsJSON json.RawMessage, contextVal MyContext, runState *llmagent.RunState) (llmagent.AgentToolResult, error) {
	var params GetWeatherParams
	if err := json.Unmarshal(paramsJSON, &params); err != nil {
		return llmagent.AgentToolResult{}, err
	}

	fmt.Printf("Getting weather for %s\n", params.City)

	result := map[string]any{
		"city":         params.City,
		"forecast":     "Sunny",
		"temperatureC": 25,
	}

	resultJSON, err := json.Marshal(result)
	if err != nil {
		return llmagent.AgentToolResult{}, err
	}

	return llmagent.AgentToolResult{
		Content: []llmsdk.Part{
			llmsdk.NewTextPart(string(resultJSON)),
		},
		IsError: false,
	}, nil
}

type SendMessageParams struct {
	Message     string `json:"message"`
	PhoneNumber string `json:"phone_number"`
}

type SendMessageTool struct{}

func (t *SendMessageTool) Name() string {
	return "send_message"
}

func (t *SendMessageTool) Description() string {
	return "Send a text message to a phone number"
}

func (t *SendMessageTool) Parameters() llmsdk.JSONSchema {
	return llmsdk.JSONSchema{
		"type": "object",
		"properties": map[string]any{
			"message": map[string]any{
				"type":        "string",
				"description": "The message to send",
			},
			"phone_number": map[string]any{
				"type":        "string",
				"description": "The phone number to send the message to",
			},
		},
		"required":             []string{"message", "phone_number"},
		"additionalProperties": false,
	}
}

func (t *SendMessageTool) Execute(ctx context.Context, paramsJSON json.RawMessage, contextVal MyContext, runState *llmagent.RunState) (llmagent.AgentToolResult, error) {
	var params SendMessageParams
	if err := json.Unmarshal(paramsJSON, &params); err != nil {
		return llmagent.AgentToolResult{}, err
	}

	fmt.Printf("Sending message to %s: %s\n", params.PhoneNumber, params.Message)

	result := map[string]any{
		"message": params.Message,
		"status":  "sent",
	}

	resultJSON, err := json.Marshal(result)
	if err != nil {
		return llmagent.AgentToolResult{}, err
	}

	return llmagent.AgentToolResult{
		Content: []llmsdk.Part{
			llmsdk.NewTextPart(string(resultJSON)),
		},
		IsError: false,
	}, nil
}

func main() {
	godotenv.Load("../.env")

	// Define the model to use for the Agent
	apiKey := os.Getenv("OPENAI_API_KEY")
	if apiKey == "" {
		log.Fatal("OPENAI_API_KEY environment variable must be set")
	}

	model := openai.NewOpenAIModel("gpt-4o", openai.OpenAIModelOptions{
		APIKey: apiKey,
	})

	// Get user name
	reader := bufio.NewReader(os.Stdin)
	fmt.Print("Your name: ")
	userName, _ := reader.ReadString('\n')
	userName = strings.TrimSpace(userName)

	myContext := MyContext{UserName: userName}

	// Create instruction params
	staticInstruction := "You are Mai, a helpful assistant. Answer questions to the best of your ability."
	dynamicInstruction := func(ctx context.Context, ctxVal MyContext) (string, error) {
		return fmt.Sprintf("You are talking to %s", ctxVal.UserName), nil
	}

	// Create the Agent
	myAssistant := llmagent.NewAgent("Mai", model,
		llmagent.WithInstructions(
			llmagent.InstructionParam[MyContext]{String: &staticInstruction},
			llmagent.InstructionParam[MyContext]{Func: dynamicInstruction},
		),
		llmagent.WithTools(
			&GetWeatherTool{},
			&SendMessageTool{},
		),
	)

	// Implement the CLI to interact with the Agent
	var items []llmagent.AgentItem

	fmt.Println("Type 'exit' to quit")

	ctx := context.Background()

	for {
		fmt.Print("> ")
		userInput, _ := reader.ReadString('\n')
		userInput = strings.TrimSpace(userInput)

		if userInput == "" {
			continue
		}

		if strings.ToLower(userInput) == "exit" {
			break
		}

		// Add user message as the input
		items = append(items, llmagent.NewAgentItemMessage(llmsdk.NewUserMessage(
			llmsdk.NewTextPart(userInput),
		)))

		// Call assistant
		response, err := myAssistant.Run(ctx, llmagent.AgentRequest[MyContext]{
			Context: myContext,
			Input:   items,
		})
		if err != nil {
			log.Printf("Error: %v\n", err)
			continue
		}

		// Append items with the output items
		items = append(items, response.Output...)

		litter.Dump(response)
	}
}
```
Agent Patterns
This agent library (not framework) is designed for transparency and control. Unlike many “agentic” frameworks, it ships with no hidden prompt templates or secret parsing rules, and that is deliberate:
- Nothing hidden – What you write is what runs. No secret prompts or “special sauce” behind the scenes, so your instructions aren’t quietly overridden.
- Works in any setting – Many frameworks bake in English-only prompts. Here, the model sees only your words, in whatever language or format you choose.
- Easy to tweak – Change prompts, parsing, or flow without fighting built-in defaults.
- Less to debug – Fewer layers mean you can trace exactly where things break.
- No complex abstraction – No new concepts or APIs to learn (e.g., “chains”, “graphs”, syntax with special meanings); just plain functions and data structures.
LLMs used to be far less capable than they are today, so frameworks had to do a lot of heavy lifting to get decent results. With modern LLMs, much of that complexity is no longer necessary.
Because we keep the core minimal (500 LOC!) and do not want to introduce such hidden magic, the library doesn’t bundle heavy agent patterns like hand-off, memory, or planners.
Instead, the `examples/` folder contains clean, working references you can copy or adapt, showing that complex use cases can still be built on top of the library.
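For instance, a hand-off can be nothing more than an ordinary tool whose `execute` runs a second agent and returns its reply. The sketch below shows the shape of that pattern in plain TypeScript: the specialist is stubbed out so the snippet is self-contained (in a real application it would be another `Agent` invoked via `run`, exactly as in the example above), and the names `runBillingAgent` and `handoffToBilling` are illustrative, not part of the library.

```typescript
// A tool result in the same shape used throughout the examples above
interface ToolResult {
  content: { type: "text"; text: string }[];
  is_error: boolean;
}

// Stub specialist "agent". In a real hand-off this would be a second
// Agent instance called via `billingAgent.run({ context, input })`.
async function runBillingAgent(question: string): Promise<string> {
  return `Billing answer for: ${question}`;
}

// The hand-off itself is just a tool: its execute() delegates the
// question to the specialist and passes the answer back to the caller.
const handoffToBilling = {
  name: "handoff_to_billing",
  description: "Delegate a billing question to the billing specialist",
  parameters: {
    type: "object",
    properties: {
      question: { type: "string", description: "The question to delegate" },
    },
    required: ["question"],
    additionalProperties: false,
  },
  async execute(params: { question: string }): Promise<ToolResult> {
    const answer = await runBillingAgent(params.question);
    return { content: [{ type: "text", text: answer }], is_error: false };
  },
};
```

Because the pattern is an ordinary function call, you keep full control over what is forwarded (the whole conversation, a summary, or a single question) without any framework-specific hand-off machinery.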
This philosophy is inspired by this blog post.