Supported configuration

The bundled TUI stores provider settings in:
~/.config/hrns/config.json
If the file does not exist, startup falls back to an interactive onboarding flow.

First-run onboarding

On the first launch, hrns prompts for:
  • provider name
  • provider API URL
  • provider API key
  • default model
  • whether to skip TLS verification
The entered provider is saved to the config file and set as currentProvider.

Config shape

The current file format looks like this:
{
  "providers": {
    "openai": {
      "url": "https://api.openai.com/v1",
      "key": "your-api-key",
      "model": "your-model",
      "skipVerify": false
    }
  },
  "currentProvider": "openai"
}
For the active provider, tui.Run builds openai.NewClient(...) with:
  • url as the OpenAI-compatible base URL
  • key as the bearer token
  • skipVerify to control TLS verification
  • model as the startup model shown in the TUI

Typical setups

Default OpenAI-style base URL

Use onboarding and enter:
  • provider name: openai
  • provider API URL: https://api.openai.com/v1
  • provider API key: your API key
  • default model: a model your provider supports
  • skip TLS verify: n

Custom endpoint

Use onboarding and enter your endpoint's OpenAI-compatible base URL and a model that endpoint serves.

Local endpoint with self-signed TLS

Set:
  • provider API URL: https://localhost:8443/v1
  • provider API key: your development key
  • skip TLS verify: y

Important runtime detail

Provider configuration and model selection are separate concerns. You can change the saved model for the current provider at runtime:
/model <your-model>
You can also add more providers with:
/connect
The current implementation saves the new provider and marks it as current in the config file, but it does not rebuild the in-memory client for the running session. Restart the TUI to actually talk to the new provider.

When embedding instead of using the TUI

If you use the packages directly, you do not need the TUI config file at all. You can build the client explicitly:
client := openai.NewClient(
    openai.WithBaseURL("https://your-provider.example/v1"), // OpenAI-compatible base URL
    openai.WithAPIKey("your-api-key"),                      // sent as the bearer token
)
That is usually the cleaner path for real applications.