llm

package
v1.70.0
Warning: This package is not in the latest version of its module.

Published: Mar 7, 2026 License: Apache-2.0 Imports: 19 Imported by: 0

Documentation

Index

Constants

const QuickOpTimeout = 30 * time.Second

QuickOpTimeout is the context deadline for fast LLM server operations (ping, model listing, auto-detect). Not configurable.

Variables

This section is empty.

Functions

func BuildSQLPrompt

func BuildSQLPrompt(
	tables []TableInfo,
	now time.Time,
	columnHints string,
	extraContext string,
) string

BuildSQLPrompt creates a system prompt that instructs the LLM to translate a natural-language question into a single SELECT statement. The prompt includes the current date, the full schema as DDL, and few-shot examples. If extraContext is non-empty, it's appended at the end.

func BuildSummaryPrompt

func BuildSummaryPrompt(
	question, sql, resultsTable string,
	now time.Time,
	extraContext string,
) string

BuildSummaryPrompt creates a system prompt for the second stage: turning SQL results into a concise natural-language answer. If extraContext is non-empty, it's appended at the end.

func BuildSystemPrompt

func BuildSystemPrompt(
	tables []TableInfo,
	dataSummary string,
	now time.Time,
	extraContext string,
) string

BuildSystemPrompt assembles the old single-stage system prompt, used as a fallback when the two-stage pipeline fails. If extraContext is non-empty, it's appended at the end.

func ExtractSQL

func ExtractSQL(raw string) string

ExtractSQL pulls the SQL statement from the LLM's response, handling both bare SQL and fenced code blocks. Returns the trimmed SQL string.

func FormatResultsTable

func FormatResultsTable(columns []string, rows [][]string) string

FormatResultsTable renders query results as a pipe-delimited text table, compact enough for an LLM context window.

Types

type ChatOption added in v1.49.0

type ChatOption func(*chatParams)

ChatOption configures a chat completion request.

func WithJSONSchema added in v1.49.0

func WithJSONSchema(name string, schema map[string]any) ChatOption

WithJSONSchema constrains the model output to match the given JSON Schema.

type Client

type Client struct {
	// contains filtered or unexported fields
}

Client wraps an any-llm-go provider behind a stable API for the rest of the application.

func NewClient

func NewClient(
	providerName, baseURL, model, apiKey string,
	timeout time.Duration,
) (*Client, error)

NewClient creates an LLM client for the named provider. Returns an error if the provider cannot be initialized.

func (*Client) BaseURL

func (c *Client) BaseURL() string

BaseURL returns the configured base URL.

func (*Client) ChatComplete

func (c *Client) ChatComplete(
	ctx context.Context,
	messages []Message,
	opts ...ChatOption,
) (string, error)

ChatComplete sends a non-streaming chat completion request and returns the full response content.

func (*Client) ChatStream

func (c *Client) ChatStream(
	ctx context.Context,
	messages []Message,
	opts ...ChatOption,
) (<-chan StreamChunk, error)

ChatStream sends a streaming chat completion request and returns a channel of StreamChunk values. The channel closes when the response completes or the context is cancelled. Callers must drain the channel.

func (*Client) IsLocalServer added in v1.58.0

func (c *Client) IsLocalServer() bool

IsLocalServer returns true for providers that run on the user's machine (ollama, llamacpp, llamafile).

func (*Client) ListModels

func (c *Client) ListModels(ctx context.Context) ([]string, error)

ListModels fetches the available model IDs. Returns an error if the provider does not support model listing.

func (*Client) Model

func (c *Client) Model() string

Model returns the configured model name.

func (*Client) Ping

func (c *Client) Ping(ctx context.Context) error

Ping checks whether the API is reachable and the configured model is available. For providers without model listing, it's a no-op.

func (*Client) ProviderName added in v1.58.0

func (c *Client) ProviderName() string

ProviderName returns the provider identifier (e.g. "ollama", "anthropic").

func (*Client) SetModel

func (c *Client) SetModel(model string)

SetModel switches the active model.

func (*Client) SetThinking added in v1.45.0

func (c *Client) SetThinking(level string)

SetThinking sets the reasoning effort level.

func (*Client) SupportsModelListing added in v1.58.0

func (c *Client) SupportsModelListing() bool

SupportsModelListing returns true if the provider implements the ModelLister interface. Cloud providers like Anthropic do not.

func (*Client) Timeout added in v1.32.0

func (c *Client) Timeout() time.Duration

Timeout returns the deadline for quick operations (ping, model listing).

type ColumnInfo

type ColumnInfo struct {
	Name    string
	Type    string
	NotNull bool
	PK      bool
}

ColumnInfo describes a single column in a table.

type Message

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

Message represents a single turn in the conversation.

type StreamChunk

type StreamChunk struct {
	Content string
	Done    bool
	Err     error
}

StreamChunk is a single piece of a streaming chat response.

type TableInfo

type TableInfo struct {
	Name    string
	Columns []ColumnInfo
}

TableInfo describes a database table for context injection.
