## High-level shape
hrns is a small composition of six packages:

- `main`: wires everything together
- `openai`: OpenAI-compatible request and streaming client
- `loop`: agent loop and tool execution
- `tools`: bundled tool implementations
- `skills`: skill discovery and the `load_skill` tool
- `tui`: interactive terminal UI
## Boot sequence

At startup, `main.go` does the following:
- creates a background context
- loads skills from the default global and local roots
- creates the `load_skill` tool from the discovered skills
- builds a hardcoded system prompt, optionally listing discovered skills
- creates `tui.TUIApp` with the built-in tools
- starts the TUI
Inside `tui.Run`, startup continues like this:
- load `~/.config/hrns/config.json`
- if the file is missing or has no providers, run onboarding
- pick `currentProvider`
- build `openai.Client` from that provider's `url`, `key`, and `skipVerify`
- create `loop.Loop` with the stored system prompt and tools
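The config-loading and provider-selection steps might look like the sketch below. The JSON keys are inferred from the field names above, not taken from the real `config.json` schema:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Provider and Config mirror the fields the text mentions; the exact
// JSON keys are assumptions, not the real schema.
type Provider struct {
	URL        string `json:"url"`
	Key        string `json:"key"`
	SkipVerify bool   `json:"skipVerify"`
}

type Config struct {
	CurrentProvider string              `json:"currentProvider"`
	Providers       map[string]Provider `json:"providers"`
}

// loadConfig parses the raw JSON; a missing file or empty provider map
// would trigger onboarding in the real TUI (not shown here).
func loadConfig(raw []byte) (Config, error) {
	var cfg Config
	err := json.Unmarshal(raw, &cfg)
	return cfg, err
}

func main() {
	raw := []byte(`{"currentProvider":"local","providers":{"local":{"url":"http://localhost:8080/v1","key":"sk-xxx","skipVerify":true}}}`)
	cfg, err := loadConfig(raw)
	if err != nil {
		panic(err)
	}
	p := cfg.Providers[cfg.CurrentProvider]
	fmt.Println(p.URL, p.SkipVerify)
}
```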
## Request lifecycle
For each user turn in the TUI:

- the TUI appends a `user` message to the current session
- it starts `RunLoop` with the current message history and chosen model
- `RunLoop` prepends the system message
- `RunLoop` converts registered tools into OpenAI-style function schemas
- `openai.Client.StreamChatCompletion` streams SSE events from `/chat/completions`
- `loop` emits chunks for assistant text, reasoning, and tool call events
- if the model called tools, `loop` executes them and appends `tool` messages
- the loop repeats until a streamed response finishes without tool calls
- the TUI updates its in-memory conversation from `agent.Messages()`
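The lifecycle above can be condensed into a sketch. The message types, the `fakeStream` stand-in, and `runLoop` are all simplified placeholders for the real `loop` package, not its API:

```go
package main

import "fmt"

// Minimal message/tool types standing in for the real loop package types.
type Message struct {
	Role    string
	Content string
}

type ToolCall struct {
	Name string
	Args string
}

// fakeStream stands in for openai.Client.StreamChatCompletion: it returns
// one tool call on the first turn and plain text on the second.
func fakeStream(turn int) (string, []ToolCall) {
	if turn == 0 {
		return "", []ToolCall{{Name: "echo", Args: `{"text":"hi"}`}}
	}
	return "done", nil
}

// runLoop sketches RunLoop: prepend the system message, stream, execute any
// tool calls, and repeat until a response finishes without tool calls.
func runLoop(system string, history []Message, tools map[string]func(string) string) []Message {
	msgs := append([]Message{{Role: "system", Content: system}}, history...)
	for turn := 0; ; turn++ {
		text, calls := fakeStream(turn)
		if len(calls) == 0 {
			// No tool calls: the turn is finished.
			msgs = append(msgs, Message{Role: "assistant", Content: text})
			return msgs
		}
		for _, c := range calls {
			result := tools[c.Name](c.Args)
			msgs = append(msgs, Message{Role: "tool", Content: result})
		}
	}
}

func main() {
	tools := map[string]func(string) string{
		"echo": func(args string) string { return "echo:" + args },
	}
	out := runLoop("system prompt", []Message{{Role: "user", Content: "hello"}}, tools)
	for _, m := range out {
		fmt.Println(m.Role, m.Content)
	}
}
```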
## Stream accumulation

The `openai.ChatCompletionAccumulator` is a key part of the runtime.
It merges partial streamed deltas into complete choices by:
- concatenating text content
- preserving structured content when text concatenation does not apply
- stitching together fragmented tool-call arguments
- preserving extra provider-specific fields
`RunLoop` waits until the stream ends and then executes the fully assembled tool calls.
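The argument-stitching part of that merge can be sketched with a simplified delta shape; the real accumulator also merges text content and extra provider fields, which this omits:

```go
package main

import "fmt"

// toolCallDelta is a simplified streamed fragment: the arguments of one
// tool call arrive split across multiple SSE chunks.
type toolCallDelta struct {
	Index int    // which tool call this fragment belongs to
	Name  string // set only on the first fragment
	Args  string // partial JSON to concatenate
}

// accumulate mirrors what the accumulator does for tool calls: concatenate
// argument fragments per index until each call's JSON is complete.
func accumulate(deltas []toolCallDelta) map[int]struct{ Name, Args string } {
	out := map[int]struct{ Name, Args string }{}
	for _, d := range deltas {
		cur := out[d.Index]
		if d.Name != "" {
			cur.Name = d.Name
		}
		cur.Args += d.Args
		out[d.Index] = cur
	}
	return out
}

func main() {
	deltas := []toolCallDelta{
		{Index: 0, Name: "read_file", Args: `{"path":`},
		{Index: 0, Args: `"notes.md"}`},
	}
	fmt.Println(accumulate(deltas)[0].Args)
}
```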
## Tool execution model

Tool execution is synchronous inside the loop:

- tools are looked up by name in a map
- arguments are parsed from the tool call’s JSON string
- the tool returns a string result
- the result is appended as a `tool` message
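A sketch of that dispatch path; the map-of-functions signature and `execTool` are assumptions for illustration, not the real `tools` API:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// execTool sketches the synchronous dispatch: look the tool up by name,
// parse the JSON arguments, and return the string result that becomes a
// tool message.
func execTool(tools map[string]func(map[string]any) string, name, rawArgs string) (string, error) {
	fn, ok := tools[name]
	if !ok {
		return "", fmt.Errorf("unknown tool: %s", name)
	}
	var args map[string]any
	if err := json.Unmarshal([]byte(rawArgs), &args); err != nil {
		return "", fmt.Errorf("bad arguments for %s: %w", name, err)
	}
	return fn(args), nil
}

func main() {
	tools := map[string]func(map[string]any) string{
		"greet": func(args map[string]any) string {
			return "hello, " + args["name"].(string)
		},
	}
	out, err := execTool(tools, "greet", `{"name":"hrns"}`)
	if err != nil {
		panic(err)
	}
	fmt.Println(out)
}
```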
## State model

Two kinds of state matter:

### Loop state

`loop.Loop` stores:
- the client
- the system prompt
- the tool map
- the last completed message history
- a chunk channel
### TUI state

The TUI stores:

- the system prompt
- the tool map
- the current conversation history
- the loaded config
- the active model string for the session
## Current design limits

The architecture is small on purpose, but that comes with hard edges:

- the system prompt is hardcoded in `main.go`
- tool schemas are flat and fully required
- tool results are plain strings
- provider connections are not switched live after `/connect`
- the bundled entrypoint is interactive only