Documentation ¶
Index ¶
- Constants
- func BuildSQLPrompt(tables []TableInfo, now time.Time, columnHints string, extraContext string) string
- func BuildSummaryPrompt(question, sql, resultsTable string, now time.Time, extraContext string) string
- func BuildSystemPrompt(tables []TableInfo, dataSummary string, now time.Time, extraContext string) string
- func ExtractSQL(raw string) string
- func FormatResultsTable(columns []string, rows [][]string) string
- type ChatOption
- type Client
- func (c *Client) BaseURL() string
- func (c *Client) ChatComplete(ctx context.Context, messages []Message, opts ...ChatOption) (string, error)
- func (c *Client) ChatStream(ctx context.Context, messages []Message, opts ...ChatOption) (<-chan StreamChunk, error)
- func (c *Client) IsLocalServer() bool
- func (c *Client) ListModels(ctx context.Context) ([]string, error)
- func (c *Client) Model() string
- func (c *Client) Ping(ctx context.Context) error
- func (c *Client) ProviderName() string
- func (c *Client) SetModel(model string)
- func (c *Client) SetThinking(level string)
- func (c *Client) SupportsModelListing() bool
- func (c *Client) Timeout() time.Duration
- type ColumnInfo
- type Message
- type StreamChunk
- type TableInfo
Constants ¶
const QuickOpTimeout = 30 * time.Second
QuickOpTimeout is the context deadline for fast LLM server operations (ping, model listing, auto-detect). Not configurable.
Variables ¶
This section is empty.
Functions ¶
func BuildSQLPrompt ¶
func BuildSQLPrompt( tables []TableInfo, now time.Time, columnHints string, extraContext string, ) string
BuildSQLPrompt creates a system prompt that instructs the LLM to translate a natural-language question into a single SELECT statement. The prompt includes the current date, the full schema as DDL, and few-shot examples. If extraContext is non-empty, it's appended at the end.
func BuildSummaryPrompt ¶
func BuildSummaryPrompt( question, sql, resultsTable string, now time.Time, extraContext string, ) string
BuildSummaryPrompt creates a system prompt for the second stage: turning SQL results into a concise natural-language answer. If extraContext is non-empty, it's appended at the end.
func BuildSystemPrompt ¶
func BuildSystemPrompt( tables []TableInfo, dataSummary string, now time.Time, extraContext string, ) string
BuildSystemPrompt assembles the old single-stage system prompt, used as a fallback when the two-stage pipeline fails. If extraContext is non-empty, it's appended at the end.
func ExtractSQL ¶
func ExtractSQL(raw string) string
ExtractSQL pulls the SQL statement from the LLM's response, handling both bare SQL and fenced code blocks. Returns the trimmed SQL string.
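A minimal self-contained sketch of the kind of extraction ExtractSQL's doc comment describes (the package's actual implementation may differ): prefer the contents of a fenced code block, otherwise fall back to trimming the raw text.

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// extractSQL is a hypothetical re-implementation of the documented behavior:
// if the response contains a fenced block (```sql or plain ```), return its
// trimmed contents; otherwise return the whole input, trimmed.
func extractSQL(raw string) string {
	fence := regexp.MustCompile("(?s)```(?:sql)?\\s*(.*?)```")
	if m := fence.FindStringSubmatch(raw); m != nil {
		return strings.TrimSpace(m[1])
	}
	return strings.TrimSpace(raw)
}

func main() {
	fenced := "Here is the query:\n```sql\nSELECT * FROM orders;\n```"
	fmt.Println(extractSQL(fenced))          // SELECT * FROM orders;
	fmt.Println(extractSQL("  SELECT 1;  ")) // SELECT 1;
}
```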
func FormatResultsTable ¶
func FormatResultsTable(columns []string, rows [][]string) string
FormatResultsTable renders query results as a pipe-delimited text table, compact enough for an LLM context window.
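A sketch of the documented output shape, assuming a simple header-then-rows layout with " | " separators (the real function's exact formatting, padding, or truncation rules are not specified here):

```go
package main

import (
	"fmt"
	"strings"
)

// formatResultsTable is a hypothetical stand-in for FormatResultsTable:
// one header line, then one pipe-delimited line per row.
func formatResultsTable(columns []string, rows [][]string) string {
	var b strings.Builder
	b.WriteString(strings.Join(columns, " | "))
	b.WriteByte('\n')
	for _, row := range rows {
		b.WriteString(strings.Join(row, " | "))
		b.WriteByte('\n')
	}
	return b.String()
}

func main() {
	fmt.Print(formatResultsTable(
		[]string{"name", "total"},
		[][]string{{"alice", "42"}, {"bob", "7"}},
	))
	// name | total
	// alice | 42
	// bob | 7
}
```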
Types ¶
type ChatOption ¶ added in v1.49.0
type ChatOption func(*chatParams)
ChatOption configures a chat completion request.
func WithJSONSchema ¶ added in v1.49.0
func WithJSONSchema(name string, schema map[string]any) ChatOption
WithJSONSchema constrains the model output to match the given JSON Schema.
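ChatOption follows Go's functional-options pattern. The sketch below shows how such options compose; chatParams and its fields are assumptions standing in for the package's unexported struct, which this page does not show.

```go
package main

import "fmt"

// chatParams is a hypothetical stand-in for the package's unexported
// options struct mutated by each ChatOption.
type chatParams struct {
	jsonSchemaName string
	jsonSchema     map[string]any
}

// ChatOption mirrors the documented type: a function that mutates chatParams.
type ChatOption func(*chatParams)

// WithJSONSchema mirrors the documented constructor: it records a named
// JSON Schema that the request will use to constrain model output.
func WithJSONSchema(name string, schema map[string]any) ChatOption {
	return func(p *chatParams) {
		p.jsonSchemaName = name
		p.jsonSchema = schema
	}
}

func main() {
	var p chatParams
	opts := []ChatOption{
		WithJSONSchema("answer", map[string]any{"type": "object"}),
	}
	for _, opt := range opts {
		opt(&p) // apply options in order, as ChatComplete/ChatStream would
	}
	fmt.Println(p.jsonSchemaName) // answer
}
```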
type Client ¶
type Client struct {
// contains filtered or unexported fields
}
Client wraps an any-llm-go provider behind a stable API for the rest of the application.
func NewClient ¶
func NewClient( providerName, baseURL, model, apiKey string, timeout time.Duration, ) (*Client, error)
NewClient creates an LLM client for the named provider. Returns an error if the provider cannot be initialized.
func (*Client) ChatComplete ¶
func (c *Client) ChatComplete( ctx context.Context, messages []Message, opts ...ChatOption, ) (string, error)
ChatComplete sends a non-streaming chat completion request and returns the full response content.
func (*Client) ChatStream ¶
func (c *Client) ChatStream( ctx context.Context, messages []Message, opts ...ChatOption, ) (<-chan StreamChunk, error)
ChatStream sends a streaming chat completion request and returns a channel of StreamChunk values. The channel closes when the response completes or the context is cancelled. Callers must drain the channel.
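Because callers must drain the channel, the typical consumer is a `for range` loop that runs until close. The sketch below shows that pattern; StreamChunk's fields are not documented on this page, so Content and Err are assumptions.

```go
package main

import (
	"fmt"
	"strings"
)

// StreamChunk here is a local stand-in for the package's type; its real
// fields are not shown in this documentation.
type StreamChunk struct {
	Content string
	Err     error
}

// drain shows the pattern ChatStream callers must follow: range until the
// channel closes, accumulating content and stopping early on an error chunk.
func drain(ch <-chan StreamChunk) (string, error) {
	var b strings.Builder
	for chunk := range ch {
		if chunk.Err != nil {
			return b.String(), chunk.Err
		}
		b.WriteString(chunk.Content)
	}
	return b.String(), nil
}

func main() {
	// Simulate a short stream instead of calling ChatStream.
	ch := make(chan StreamChunk, 3)
	ch <- StreamChunk{Content: "Hello, "}
	ch <- StreamChunk{Content: "world"}
	close(ch)
	out, err := drain(ch)
	fmt.Println(out, err) // Hello, world <nil>
}
```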
func (*Client) IsLocalServer ¶ added in v1.58.0
func (c *Client) IsLocalServer() bool
IsLocalServer returns true for providers that run on the user's machine (ollama, llamacpp, llamafile).
func (*Client) ListModels ¶
func (c *Client) ListModels(ctx context.Context) ([]string, error)
ListModels fetches the available model IDs. Returns an error if the provider does not support model listing.
func (*Client) Ping ¶
func (c *Client) Ping(ctx context.Context) error
Ping checks whether the API is reachable and the configured model is available. For providers without model listing, it's a no-op.
func (*Client) ProviderName ¶ added in v1.58.0
func (c *Client) ProviderName() string
ProviderName returns the provider identifier (e.g. "ollama", "anthropic").
func (*Client) SetThinking ¶ added in v1.45.0
func (c *Client) SetThinking(level string)
SetThinking sets the reasoning effort level.
func (*Client) SupportsModelListing ¶ added in v1.58.0
func (c *Client) SupportsModelListing() bool
SupportsModelListing returns true if the provider implements the ModelLister interface. Cloud providers like Anthropic do not.
type ColumnInfo ¶
ColumnInfo describes a single column in a table.
type StreamChunk ¶
StreamChunk is a single piece of a streaming chat response.
type TableInfo ¶
type TableInfo struct {
Name string
Columns []ColumnInfo
}
TableInfo describes a database table for context injection.
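TableInfo values feed the prompt builders, which render the schema as DDL (BuildSQLPrompt "includes ... the full schema as DDL"). A sketch of that rendering; ColumnInfo's fields are not shown on this page, so Name and Type are assumptions, and toDDL is a hypothetical helper, not the package's actual one.

```go
package main

import (
	"fmt"
	"strings"
)

// ColumnInfo's real fields are not documented here; Name and Type are
// assumed for illustration.
type ColumnInfo struct {
	Name string
	Type string
}

// TableInfo matches the struct shown in the documentation.
type TableInfo struct {
	Name    string
	Columns []ColumnInfo
}

// toDDL renders one CREATE TABLE statement per table: the kind of schema
// text a prompt builder could inject into the system prompt.
func toDDL(tables []TableInfo) string {
	var b strings.Builder
	for _, t := range tables {
		cols := make([]string, len(t.Columns))
		for i, c := range t.Columns {
			cols[i] = "  " + c.Name + " " + c.Type
		}
		fmt.Fprintf(&b, "CREATE TABLE %s (\n%s\n);\n", t.Name, strings.Join(cols, ",\n"))
	}
	return b.String()
}

func main() {
	fmt.Print(toDDL([]TableInfo{{
		Name:    "orders",
		Columns: []ColumnInfo{{"id", "INTEGER"}, {"total", "REAL"}},
	}}))
	// CREATE TABLE orders (
	//   id INTEGER,
	//   total REAL
	// );
}
```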