http

package
v0.0.0-...-c514e8c
Published: Feb 1, 2026 License: MIT Imports: 68 Imported by: 0

Documentation

Overview

Package http implements the HTTP server.

Index

Constants

const (
	// OpenaiMessageRoleSystem system message
	OpenaiMessageRoleSystem = "system"
	// OpenaiMessageRoleUser user message
	OpenaiMessageRoleUser = "user"
	// OpenaiMessageRoleAI ai message
	OpenaiMessageRoleAI = "assistant"
)
const VisionTokenPrice = 5000

VisionTokenPrice is the vision token price ($ per 500,000 tokens).

Variables

This section is empty.

Functions

func Call

func Call(name, args string) (string, error)

func Chat

func Chat(ctx *gin.Context)

Chat renders the chat page.

func ChatHandler

func ChatHandler(ctx *gin.Context)

ChatHandler handles the web UI chat endpoint.

It always talks to upstream using the OpenAI Responses API schema, injects enabled tools, executes tool calls (including MCP), and returns only the final assistant answer to the UI. Intermediate steps are streamed via delta.reasoning_content so the UI can render them inside the collapsible Thinking panel.

func ChatModel

func ChatModel() string

ChatModel returns the chat model name.

func ConvertImageToPNG

func ConvertImageToPNG(imgContent []byte) ([]byte, error)

ConvertImageToPNG converts an image to PNG.

func CopyHeader

func CopyHeader(to, from http.Header)

CopyHeader copies all headers from `from` into `to`.

func CountTextTokens

func CountTextTokens(text string) int

CountTextTokens returns the approximate token count for a string using tiktoken when available.
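When tiktoken is not loaded, counters like this commonly fall back to a rough characters-per-token heuristic. The sketch below shows only that degraded path with an assumed ratio of ~4 characters per token; the real function uses tiktoken when available:

```go
package main

import "fmt"

// countTextTokens estimates tokens as ceil(len/4), a common fallback
// heuristic for English text when no tokenizer is available.
// Hypothetical stand-in for the degraded path of CountTextTokens.
func countTextTokens(text string) int {
	return (len(text) + 3) / 4
}

func main() {
	fmt.Println(countTextTokens("hello world")) // 3
}
```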

func CountVisionImagePrice

func CountVisionImagePrice(width int, height int, resolution VisionImageResolution) (int, error)

CountVisionImagePrice counts the token cost of a vision image.

https://openai.com/pricing
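The commonly documented OpenAI rule is: "low" detail costs a flat 85 tokens; "high" detail scales the image to fit 2048x2048, scales the short side to 768px, then charges 85 + 170 per 512px tile. A sketch of that rule (the package's CountVisionImagePrice may differ in detail):

```go
package main

import (
	"fmt"
	"math"
)

// visionTokens estimates the token cost of an image under the
// documented OpenAI vision pricing rule. Illustrative only.
func visionTokens(width, height int, detail string) int {
	if detail == "low" {
		return 85
	}
	w, h := float64(width), float64(height)
	// Fit within 2048x2048.
	if scale := 2048 / math.Max(w, h); scale < 1 {
		w, h = w*scale, h*scale
	}
	// Scale the short side down to 768px.
	if scale := 768 / math.Min(w, h); scale < 1 {
		w, h = w*scale, h*scale
	}
	tiles := int(math.Ceil(w/512)) * int(math.Ceil(h/512))
	return 85 + 170*tiles
}

func main() {
	// 1024x1024 at high detail scales to 768x768 = 4 tiles.
	fmt.Println(visionTokens(1024, 1024, "high")) // 765
}
```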

func CreateDeepResearchHandler

func CreateDeepResearchHandler(c *gin.Context)

CreateDeepResearchHandler handles deep-research creation requests.

func DecodeBase64

func DecodeBase64(input string) ([]byte, error)

DecodeBase64 decodes a base64 string, handling an optional data:...;base64, prefix.
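The described behavior can be sketched as: strip the data-URI prefix if present, then decode the remainder. An illustrative helper (not the package's actual implementation):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

// decodeBase64 strips an optional "data:...;base64," prefix before
// decoding, mirroring the documented behavior of DecodeBase64.
func decodeBase64(input string) ([]byte, error) {
	if idx := strings.Index(input, ";base64,"); idx >= 0 && strings.HasPrefix(input, "data:") {
		input = input[idx+len(";base64,"):]
	}
	return base64.StdEncoding.DecodeString(input)
}

func main() {
	raw, err := decodeBase64("data:image/png;base64,aGVsbG8=")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(raw)) // hello
}
```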

func DownloadUserConfig

func DownloadUserConfig(ctx *gin.Context)

func DrawByDalleHandler

func DrawByDalleHandler(ctx *gin.Context)

func DrawByFlux

func DrawByFlux(ctx *gin.Context)

DrawByFlux draws an image with flux-pro.

func DrawByLcmHandler

func DrawByLcmHandler(ctx *gin.Context)

func DrawBySdxlturboHandlerByNvidia

func DrawBySdxlturboHandlerByNvidia(ctx *gin.Context)

func DrawBySdxlturboHandlerBySelfHosted

func DrawBySdxlturboHandlerBySelfHosted(ctx *gin.Context)

func EditImageHandler

func EditImageHandler(ctx *gin.Context)

func FetchURLContent

func FetchURLContent(gctx *gin.Context, url string) (content []byte, err error)

FetchURLContent fetches the content of a URL.

func GetCurrentUser

func GetCurrentUser(ctx *gin.Context)

GetCurrentUser returns the current user.

func GetDeepResearchStatusHandler

func GetDeepResearchStatusHandler(c *gin.Context)

GetDeepResearchStatusHandler returns the deep-research status.

func GetImageModelPrice

func GetImageModelPrice(model string) db.Price

GetImageModelPrice returns the price for an image model.

func GetUserInternalBill

func GetUserInternalBill(ctx context.Context,
	user *config.UserConfig, billType db.BillingType) (
	bill *db.Billing, err error)

GetUserInternalBill returns the user's internal bill.

func InpaitingByFlux

func InpaitingByFlux(ctx *gin.Context)

func IsModelAllowed

func IsModelAllowed(ctx context.Context,
	user *config.UserConfig,
	req *FrontendReq) error

IsModelAllowed checks whether the requested model is allowed for the user.

func OneShotChatHandler

func OneShotChatHandler(gctx *gin.Context)

OneShotChatHandler handles a one-shot chat request.

func OneapiProxyHandler

func OneapiProxyHandler(ctx *gin.Context)

OneapiProxyHandler proxies requests to the oneapi URL.

func PaymentHandler

func PaymentHandler(c *gin.Context)

PaymentHandler creates a Stripe PaymentIntent.

func RamjetProxyHandler

func RamjetProxyHandler(ctx *gin.Context)

RamjetProxyHandler proxies requests to the ramjet URL.

func RegisterStatic

func RegisterStatic(g gin.IRouter)

RegisterStatic registers routes for static files.

func SaveLlmConservationHandler

func SaveLlmConservationHandler(ctx *gin.Context)

SaveLlmConservationHandler saves an LLM conversation.

func SetupHTTPCli

func SetupHTTPCli() (err error)

SetupHTTPCli sets up the HTTP client.

func TTSHanler

func TTSHanler(ctx *gin.Context)

TTSHanler converts text to speech via Azure and returns an audio stream.

func Tiktoken

func Tiktoken() *tiktoken.Tiktoken

Tiktoken returns the tiktoken encoder; it may be nil if none is available.

func UploadFiles

func UploadFiles(ctx *gin.Context)

UploadFiles uploads files.

func UploadUserConfig

func UploadUserConfig(ctx *gin.Context)

Types

type AzureCreateImageResponse

type AzureCreateImageResponse struct {
	Created int64 `json:"created"`
	Data    []struct {
		RevisedPrompt string `json:"revised_prompt"`
		Url           string `json:"url"`
	} `json:"data"`
}

AzureCreateImageResponse is the response from the Azure image API.

type CreateDeepresearchRequest

type CreateDeepresearchRequest struct {
	Prompt string `json:"prompt" binding:"required,min=1"`
}

CreateDeepresearchRequest is a deep-research creation request.

type DrawImageByFluxProResponse

type DrawImageByFluxProResponse struct {
	CompletedAt time.Time                       `json:"completed_at"`
	CreatedAt   time.Time                       `json:"created_at"`
	DataRemoved bool                            `json:"data_removed"`
	Error       string                          `json:"error"`
	ID          string                          `json:"id"`
	Input       DrawImageByFluxReplicateRequest `json:"input"`
	Logs        string                          `json:"logs"`
	Metrics     FluxMetrics                     `json:"metrics"`
	// Output could be `string` or `[]string`
	Output    any       `json:"output"`
	StartedAt time.Time `json:"started_at"`
	Status    string    `json:"status"`
	URLs      FluxURLs  `json:"urls"`
	Version   string    `json:"version"`
}

DrawImageByFluxProResponse is the response to DrawImageByFluxReplicateRequest.

https://replicate.com/black-forest-labs/flux-pro?prediction=kg1krwsdf9rg80ch1sgsrgq7h8&output=json

func (*DrawImageByFluxProResponse) GetOutput

func (r *DrawImageByFluxProResponse) GetOutput() ([]string, error)

GetOutput returns the output as a list of URLs, handling both the string and []string forms.
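Since `Output` is declared as `any` and may arrive as either a string or a list of strings, normalizing it is a small type switch. A sketch of the pattern (the package's GetOutput may differ in detail):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// getOutput normalizes a Replicate `output` field that may be a single
// string or a list of strings. Illustrative stand-in for GetOutput.
func getOutput(raw any) ([]string, error) {
	switch v := raw.(type) {
	case string:
		return []string{v}, nil
	case []any: // json.Unmarshal decodes JSON arrays into []any
		out := make([]string, 0, len(v))
		for _, item := range v {
			s, ok := item.(string)
			if !ok {
				return nil, fmt.Errorf("unexpected element type %T", item)
			}
			out = append(out, s)
		}
		return out, nil
	default:
		return nil, fmt.Errorf("unexpected output type %T", raw)
	}
}

func main() {
	var resp struct {
		Output any `json:"output"`
	}
	_ = json.Unmarshal([]byte(`{"output":["https://a.png","https://b.png"]}`), &resp)
	urls, err := getOutput(resp.Output)
	if err != nil {
		panic(err)
	}
	fmt.Println(urls) // [https://a.png https://b.png]
}
```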

type DrawImageByFluxReplicateRequest

type DrawImageByFluxReplicateRequest struct {
	Input FluxInput `json:"input"`
}

DrawImageByFluxReplicateRequest is a request to draw an image with flux-pro.

https://replicate.com/black-forest-labs/flux-pro?prediction=kg1krwsdf9rg80ch1sgsrgq7h8&output=json

type DrawImageByFluxSegmind

type DrawImageByFluxSegmind struct {
	// Prompt is the text prompt for generating the image
	Prompt string `json:"prompt" binding:"required"`

	// Steps is the number of inference steps for image generation
	// min: 1, max: 100
	Steps int `json:"steps" binding:"required,min=1,max=100"`

	// Seed is the seed for random number generation
	Seed int `json:"seed"`

	// SamplerName is the sampler for the image generation process
	SamplerName string `json:"sampler_name" binding:"required"`

	// Scheduler is the scheduler for the image generation process
	Scheduler string `json:"scheduler" binding:"required"`

	// Samples is the number of samples to generate
	Samples int `json:"samples" binding:"required"`

	// Width is the image width, can be between 512 and 2048 in multiples of 8
	Width int `json:"width" binding:"required,min=512,max=2048"`

	// Height is the image height, can be between 512 and 2048 in multiples of 8
	Height int `json:"height" binding:"required,min=512,max=2048"`

	// Denoise is the denoise level for the generated image
	Denoise float64 `json:"denoise" binding:"required"`
}

DrawImageByFluxSegmind is a request to draw an image with flux-schnell.

https://www.segmind.com/models/flux-schnell/api

type DrawImageByImageRequest

type DrawImageByImageRequest struct {
	Prompt      string `json:"prompt" binding:"required,min=1"`
	Model       string `json:"model" binding:"required,min=1"`
	ImageBase64 string `json:"image_base64" binding:"required,min=1"`
}

DrawImageByImageRequest is a request to draw an image from an input image and a prompt.

type DrawImageByLcmRequest

type DrawImageByLcmRequest struct {
	// Data consist of 6 strings:
	//  1. prompt,
	//  2. base64 encoded image with fixed prefix "data:image/png;base64,"
	//  3. steps
	//  4. cfg
	//  5. sketch strength
	//  6. seed
	Data    [6]any `json:"data"`
	FnIndex int    `json:"fn_index"`
}

DrawImageByLcmRequest is a request to draw an image from an image and a prompt with LCM.

type DrawImageByLcmResponse

type DrawImageByLcmResponse struct {
	// Data base64 encoded image with fixed prefix "data:image/png;base64,"
	Data            []string `json:"data"`
	IsGenerating    bool     `json:"is_generating"`
	Duration        float64  `json:"duration"`
	AverageDuration float64  `json:"average_duration"`
}

DrawImageByLcmResponse is the response to DrawImageByLcmRequest.

type DrawImageBySdxlturboRequest

type DrawImageBySdxlturboRequest struct {
	Model string `json:"model" binding:"required,min=1"`
	// Text prompt
	Text           string `json:"text" binding:"required,min=1"`
	NegativePrompt string `json:"negative_prompt"`
	ImageB64       string `json:"image"`
	// N how many images to generate
	N int `json:"n"`
}

DrawImageBySdxlturboRequest is a request to draw an image from an image and a prompt with sdxl-turbo.

type DrawImageBySdxlturboResponse

type DrawImageBySdxlturboResponse struct {
	B64Images []string `json:"images"`
}

DrawImageBySdxlturboResponse is the response to DrawImageBySdxlturboRequest.

type DrawImageByTextRequest

type DrawImageByTextRequest struct {
	Prompt string `json:"prompt" binding:"required,min=1"`
	Model  string `json:"model" binding:"required,min=1"`
	N      int    `json:"n"`
	Size   string `json:"size"`
}

DrawImageByTextRequest is a request to draw an image from a text prompt.

type ExternalBillingUserResponse

type ExternalBillingUserResponse struct {
	Data struct {
		Status      ExternalBillingUserStatus `json:"status"`
		RemainQuota db.Price                  `json:"remain_quota"`
	} `json:"data"`
}

ExternalBillingUserResponse is the response from the external billing API.

type ExternalBillingUserStatus

type ExternalBillingUserStatus int

ExternalBillingUserStatus is the user status.

const (
	// ExternalBillingUserStatusActive active
	ExternalBillingUserStatusActive ExternalBillingUserStatus = 1
)

type FluxInpaintingInput

type FluxInpaintingInput struct {
	Mask             string `json:"mask" binding:"required"`
	Image            string `json:"image" binding:"required"`
	Seed             int    `json:"seed"`
	Steps            int    `json:"steps" binding:"required,min=1"`
	Prompt           string `json:"prompt" binding:"required,min=5"`
	Guidance         int    `json:"guidance" binding:"required,min=2,max=5"`
	OutputFormat     string `json:"output_format"`
	SafetyTolerance  int    `json:"safety_tolerance" binding:"required,min=1,max=5"`
	PromptUnsampling bool   `json:"prompt_unsampling"`
}

FluxInpaintingInput is the input of InpaintingImageByFlusReplicateRequest.

https://replicate.com/black-forest-labs/flux-fill-pro/api/schema

type FluxInput

type FluxInput struct {
	Steps  int    `json:"steps" binding:"required,min=1"`
	Prompt string `json:"prompt" binding:"required,min=1"`
	// ImagePrompt is the image prompt, only works for flux-1.1-pro
	ImagePrompt *string `json:"image_prompt,omitempty"`
	// InputImage is the input image, only works for flux-kontext-pro
	InputImage      *string `json:"input_image,omitempty"`
	Guidance        int     `json:"guidance" binding:"required,min=2,max=5"`
	Interval        int     `json:"interval" binding:"required,min=1,max=4"`
	AspectRatio     string  `json:"aspect_ratio" binding:"required,oneof=1:1 16:9 2:3 3:2 4:5 5:4 9:16"`
	SafetyTolerance int     `json:"safety_tolerance" binding:"required,min=1,max=5"`
	Seed            int     `json:"seed"`
	NImages         int     `json:"n_images" binding:"required,min=1,max=8"`
	Width           int     `json:"width" binding:"required,min=256,max=1440"`
	Height          int     `json:"height" binding:"required,min=256,max=1440"`
}

FluxInput is the input of DrawImageByFluxReplicateRequest.

https://replicate.com/black-forest-labs/flux-1.1-pro/api/schema

type FluxMetrics

type FluxMetrics struct {
	ImageCount  int     `json:"image_count"`
	PredictTime float64 `json:"predict_time"`
	TotalTime   float64 `json:"total_time"`
}

FluxMetrics holds the metrics of DrawImageByFluxProResponse.

type FluxURLs

type FluxURLs struct {
	Get    string `json:"get"`
	Cancel string `json:"cancel"`
}

FluxURLs holds the URLs of DrawImageByFluxProResponse.

type FrontendReq

type FrontendReq struct {
	Model            string               `json:"model"`
	MaxTokens        uint                 `json:"max_tokens"`
	Messages         []FrontendReqMessage `json:"messages,omitempty"`
	PresencePenalty  float64              `json:"presence_penalty"`
	FrequencyPenalty float64              `json:"frequency_penalty"`
	Stream           bool                 `json:"stream"`
	Temperature      float64              `json:"temperature"`
	TopP             float64              `json:"top_p"`
	N                int                  `json:"n"`
	Tools            []OpenaiChatReqTool  `json:"tools,omitempty"`
	ToolChoice       any                  `json:"tool_choice,omitempty"`
	EnableMCP        *bool                `json:"enable_mcp,omitempty"`
	MCPServers       []MCPServerConfig    `json:"mcp_servers,omitempty"`
	// ReasoningEffort constrains effort on reasoning for reasoning models, reasoning models only.
	ReasoningEffort string `json:"reasoning_effort,omitempty" binding:"omitempty,oneof=low medium high"`

	// -------------------------------------
	// Anthropic
	// -------------------------------------
	Thinking *Thinking `json:"thinking,omitempty"`

	// LaiskyExtra some special config for laisky
	LaiskyExtra *struct {
		ChatSwitch struct {
			// DisableHttpsCrawler disable https crawler
			DisableHttpsCrawler bool `json:"disable_https_crawler"`
			// EnableGoogleSearch enable google search
			EnableGoogleSearch bool `json:"enable_google_search"`
		} `json:"chat_switch"`
	} `json:"laisky_extra,omitempty"`
}

FrontendReq is a request from the frontend.

func (*FrontendReq) PromptTokens

func (r *FrontendReq) PromptTokens() (n int)

PromptTokens counts the prompt tokens.

type FrontendReqMessage

type FrontendReqMessage struct {
	Role    OpenaiMessageRole         `json:"role"`
	Content FrontendReqMessageContent `json:"content"`
	// Files send files with message
	Files []frontendReqMessageFiles `json:"files"`
}

FrontendReqMessage is a request message from the frontend.

type FrontendReqMessageContent

type FrontendReqMessageContent struct {
	StringContent string
	ArrayContent  []OpenaiVisionMessageContent
}

FrontendReqMessageContent is a custom type that can unmarshal from either a string or an array of OpenaiVisionMessageContent.

func (*FrontendReqMessageContent) Append

func (c *FrontendReqMessageContent) Append(s string)

Append appends a string to the content.

func (FrontendReqMessageContent) MarshalJSON

func (c FrontendReqMessageContent) MarshalJSON() ([]byte, error)

MarshalJSON marshals to either a string or an array of OpenaiVisionMessageContent.

func (FrontendReqMessageContent) String

func (c FrontendReqMessageContent) String() string

String returns the string content.

func (*FrontendReqMessageContent) UnmarshalJSON

func (c *FrontendReqMessageContent) UnmarshalJSON(data []byte) error

UnmarshalJSON unmarshals from either a string or an array of OpenaiVisionMessageContent.

type ImageUrl

type ImageUrl struct {
	Url string `json:"url"`
}

type InpaintingImageByFlusReplicateRequest

type InpaintingImageByFlusReplicateRequest struct {
	Input FluxInpaintingInput `json:"input"`
}

InpaintingImageByFlusReplicateRequest is a request to inpaint an image with flux-fill-pro.

https://replicate.com/black-forest-labs/flux-fill-pro/api/schema

type LLMConservationReq

type LLMConservationReq struct {
	Model     string               `json:"model" binding:"required,min=1"`
	MaxTokens uint                 `json:"max_tokens" binding:"required,min=1"`
	Messages  []FrontendReqMessage `json:"messages" binding:"required,min=1"`
	Response  string               `json:"response" binding:"required,min=1"`
	Reasoning string               `json:"reasoning,omitempty"`
}

type MCPCallOption

type MCPCallOption struct {
	// FallbackAPIKey is used when server.APIKey is empty.
	// Typically this is the session's API key (user's token).
	FallbackAPIKey string
}

MCPCallOption contains optional parameters for callMCPTool.

type MCPServerConfig

type MCPServerConfig struct {
	ID              string            `json:"id,omitempty"`
	Name            string            `json:"name,omitempty"`
	URL             string            `json:"url"`
	URLPrefix       string            `json:"url_prefix,omitempty"`
	APIKey          string            `json:"api_key,omitempty"`
	Enabled         bool              `json:"enabled"`
	EnabledToolName []string          `json:"enabled_tool_names,omitempty"`
	Tools           []json.RawMessage `json:"tools,omitempty"`

	// Session fields are optional and may be provided by the UI.
	MCPProtocolVersion string `json:"mcp_protocol_version,omitempty"`
	MCPSessionID       string `json:"mcp_session_id,omitempty"`
}

MCPServerConfig carries the MCP server configuration from the web UI. It is used by the backend to map tool names to servers and to execute MCP tool calls.

type NvidiaArtifact

type NvidiaArtifact struct {
	Base64       string `json:"base64"`
	FinishReason string `json:"finish_reason"`
	Seed         int    `json:"seed"`
}

NvidiaArtifact is a generated-image artifact.

type NvidiaDrawImageBySdxlturboRequest

type NvidiaDrawImageBySdxlturboRequest struct {
	TextPrompts []NvidiaTextPrompt `json:"text_prompts"`
	Seed        int                `json:"seed"`
	Sampler     string             `json:"sampler"`
	Steps       int                `json:"steps"`
}

NvidiaDrawImageBySdxlturboRequest is a request to draw an image with sdxl-turbo via NVIDIA.

https://build.nvidia.com/explore/discover?snippet_tab=Python#sdxl-turbo

func NewNvidiaDrawImageBySdxlturboRequest

func NewNvidiaDrawImageBySdxlturboRequest(prompt string) NvidiaDrawImageBySdxlturboRequest

NewNvidiaDrawImageBySdxlturboRequest creates a new request.

type NvidiaDrawImageBySdxlturboResponse

type NvidiaDrawImageBySdxlturboResponse struct {
	Artifacts []NvidiaArtifact `json:"artifacts"`
}

NvidiaDrawImageBySdxlturboResponse is the response to NvidiaDrawImageBySdxlturboRequest.

type NvidiaTextPrompt

type NvidiaTextPrompt struct {
	Text string `json:"text"`
}

NvidiaTextPrompt is a text prompt.

type OneShotChatRequest

type OneShotChatRequest struct {
	SystemPrompt string `json:"system_prompt"`
	UserPrompt   string `json:"user_prompt" binding:"required,min=1"`
}

OneShotChatRequest is a request to the one-shot chat API.

type OpenAIResponseReasoning

type OpenAIResponseReasoning struct {
	// Effort defines the reasoning effort level
	Effort *string `json:"effort,omitempty" binding:"omitempty,oneof=low medium high"`
	// Summary defines whether to include a summary of the reasoning
	Summary *string `json:"summary,omitempty" binding:"omitempty,oneof=auto concise detailed"`
}

OpenAIResponseReasoning defines reasoning options for the Responses API.

type OpenAIResponsesFunctionCall

type OpenAIResponsesFunctionCall struct {
	Type      string `json:"type"`
	ID        string `json:"id"`
	CallID    string `json:"call_id"`
	Name      string `json:"name"`
	Arguments string `json:"arguments"`
}

OpenAIResponsesFunctionCall is the Responses API function_call output item.

type OpenAIResponsesFunctionCallOutput

type OpenAIResponsesFunctionCallOutput struct {
	Type   string `json:"type"`
	CallID string `json:"call_id"`
	Output string `json:"output"`
}

OpenAIResponsesFunctionCallOutput is the Responses API function_call_output input item.

type OpenAIResponsesInputMessage

type OpenAIResponsesInputMessage struct {
	Role    string `json:"role"`
	Content any    `json:"content"`
}

OpenAIResponsesInputMessage is a Responses API input message item.

In practice the API accepts {"role":"user","content":"..."} as shown in docs.

type OpenAIResponsesItem

type OpenAIResponsesItem struct {
	Type string `json:"type"`
	// contains filtered or unexported fields
}

OpenAIResponsesItem is a generic output item with a type discriminator.

It keeps the raw JSON payload so we can unmarshal it into typed structs later.

func (OpenAIResponsesItem) Raw

Raw returns the raw JSON payload for this output item.

func (*OpenAIResponsesItem) UnmarshalJSON

func (i *OpenAIResponsesItem) UnmarshalJSON(data []byte) error

UnmarshalJSON implements json unmarshalling and preserves the full raw payload.

type OpenAIResponsesReq

type OpenAIResponsesReq struct {
	Model           string                   `json:"model"`
	Input           any                      `json:"input,omitempty"`
	MaxOutputTokens uint                     `json:"max_output_tokens,omitempty"`
	Reasoning       *OpenAIResponseReasoning `json:"reasoning,omitempty"` // Optional: Configuration options for reasoning models
	Stream          bool                     `json:"stream,omitempty"`
	Temperature     float64                  `json:"temperature,omitempty"`
	TopP            float64                  `json:"top_p,omitempty"`
	Tools           []OpenAIResponsesTool    `json:"tools,omitempty"`
	ToolChoice      any                      `json:"tool_choice,omitempty"`
	Store           *bool                    `json:"store,omitempty"`
}

OpenAIResponsesReq is a subset of the OpenAI Responses API request schema.

type OpenAIResponsesRequiredAction

type OpenAIResponsesRequiredAction struct {
	Type              string                            `json:"type"`
	SubmitToolOutputs *OpenAIResponsesSubmitToolOutputs `json:"submit_tool_outputs,omitempty"`
}

OpenAIResponsesRequiredAction is a subset of Responses API required_action.

OneAPI (and some upstreams) provide tool calls here even when output_text is empty.

type OpenAIResponsesRequiredFunction

type OpenAIResponsesRequiredFunction struct {
	Name      string `json:"name"`
	Arguments string `json:"arguments"`
}

OpenAIResponsesRequiredFunction contains tool name/args.

type OpenAIResponsesRequiredToolCall

type OpenAIResponsesRequiredToolCall struct {
	ID       string                          `json:"id"`
	Type     string                          `json:"type"`
	Function OpenAIResponsesRequiredFunction `json:"function"`
}

OpenAIResponsesRequiredToolCall is a tool call descriptor inside required_action.

type OpenAIResponsesResp

type OpenAIResponsesResp struct {
	ID             string                         `json:"id"`
	Output         []OpenAIResponsesItem          `json:"output"`
	OutputText     string                         `json:"output_text"`
	RequiredAction *OpenAIResponsesRequiredAction `json:"required_action,omitempty"`
	Error          map[string]any                 `json:"error,omitempty"`
	Metadata       map[string]string              `json:"metadata,omitempty"`
}

OpenAIResponsesResp is a subset of the OpenAI Responses API response schema.

type OpenAIResponsesSubmitToolOutputs

type OpenAIResponsesSubmitToolOutputs struct {
	ToolCalls []OpenAIResponsesRequiredToolCall `json:"tool_calls"`
}

OpenAIResponsesSubmitToolOutputs is the required_action payload.

type OpenAIResponsesTool

type OpenAIResponsesTool struct {
	Type        string             `json:"type"`
	Name        string             `json:"name"`
	Description string             `json:"description,omitempty"`
	Parameters  stdjson.RawMessage `json:"parameters,omitempty"`
	Strict      *bool              `json:"strict,omitempty"`
}

OpenAIResponsesTool defines a tool in the OpenAI Responses API schema.

It intentionally mirrors the documented format: {"type":"function","name":"...","description":"...","parameters":{...},"strict":true}

type OpenaiChatReq

type OpenaiChatReq[T string | []OpenaiVisionMessageContent] struct {
	Model            string                `json:"model"`
	MaxTokens        uint                  `json:"max_tokens"`
	Messages         []OpenaiReqMessage[T] `json:"messages,omitempty"`
	PresencePenalty  float64               `json:"presence_penalty"`
	FrequencyPenalty float64               `json:"frequency_penalty"`
	Stream           bool                  `json:"stream"`
	Temperature      float64               `json:"temperature"`
	TopP             float64               `json:"top_p,omitempty"`
	N                *int                  `json:"n,omitempty"`
	// ReasoningEffort constrains effort on reasoning for reasoning models, reasoning models only.
	ReasoningEffort string              `json:"reasoning_effort,omitempty" binding:"omitempty,oneof=low medium high"`
	Tools           []OpenaiChatReqTool `json:"tools,omitempty"`
	ToolChoice      any                 `json:"tool_choice,omitempty"`

	// -------------------------------------
	// Anthropic
	// -------------------------------------
	Thinking *Thinking `json:"thinking,omitempty"`
}

OpenaiChatReq is a request to the OpenAI chat API.

type OpenaiChatReqTool

type OpenaiChatReqTool struct {
	Type     string              `json:"type"`
	Function OpenaiChatReqToolFn `json:"function,omitempty"`
	Strict   *bool               `json:"strict,omitempty"`
}

OpenaiChatReqTool defines a tool, for example:

{
	"type": "function",
	"function": {
	  "name": "get_current_weather",
	  "description": "Get the current weather in a given location",
	  "parameters": {
		"type": "object",
		"properties": {
		  "location": {
			"type": "string",
			"description": "The city and state, e.g. San Francisco, CA"
		  },
		  "unit": {
			"type": "string",
			"enum": [
			  "celsius",
			  "fahrenheit"
			]
		  }
		},
		"required": [
		  "location"
		]
	  }
	}
}

func ToolsRequest

func ToolsRequest() []OpenaiChatReqTool

type OpenaiChatReqToolFn

type OpenaiChatReqToolFn struct {
	Name        string          `json:"name"`
	Description string          `json:"description,omitempty"`
	Parameters  json.RawMessage `json:"parameters,omitempty"`
}

OpenaiChatReqToolFn matches OpenAI chat-completions tool schema.

type OpenaiCompletionReq

type OpenaiCompletionReq struct {
	Model            string  `json:"model"`
	MaxTokens        uint    `json:"max_tokens"`
	PresencePenalty  float64 `json:"presence_penalty"`
	FrequencyPenalty float64 `json:"frequency_penalty"`
	Stream           bool    `json:"stream"`
	Temperature      float64 `json:"temperature"`
	TopP             float64 `json:"top_p"`
	N                int     `json:"n"`
	Prompt           string  `json:"prompt,omitempty"`
}

OpenaiCompletionReq is a request to the OpenAI completions API.

type OpenaiCompletionResp

type OpenaiCompletionResp struct {
	ID     string `json:"id"`
	Object string `json:"object"`
	Model  string `json:"model"`
	Usage  struct {
		PromptTokens     int `json:"prompt_tokens"`
		CompletionTokens int `json:"completion_tokens"`
		TotalTokens      int `json:"total_tokens"`
	} `json:"usage"`
	Choices []struct {
		Message struct {
			Role             string `json:"role"`
			Content          string `json:"content"`
			ReasoningContent string `json:"reasoning_content,omitempty"`
		} `json:"message"`
		FinishReason string `json:"finish_reason"`
		Index        int    `json:"index"`
	} `json:"choices"`
}

OpenaiCompletionResp is the response from the OpenAI chat API, for example:

https://platform.openai.com/docs/guides/chat/response-format

{
	"id": "chatcmpl-6p9XYPYSTTRi0xEviKjjilqrWU2Ve",
	"object": "chat.completion",
	"created": 1677649420,
	"model": "gpt-3.5-turbo",
	"usage": {"prompt_tokens": 56, "completion_tokens": 31, "total_tokens": 87},
	"choices": [
	  {
	   "message": {
		 "role": "assistant",
		 "content": "The 2020 World Series was played in Arlington, Texas at the Globe Life Field, which was the new home stadium for the Texas Rangers."},
	   "finish_reason": "stop",
	   "index": 0
	  }
	 ]
   }

type OpenaiCompletionStreamResp

type OpenaiCompletionStreamResp struct {
	ID      string                             `json:"id"`
	Object  string                             `json:"object"`
	Created int64                              `json:"created"`
	Model   string                             `json:"model"`
	Choices []OpenaiCompletionStreamRespChoice `json:"choices"`
}

OpenaiCompletionStreamResp is a stream chunk returned from the OpenAI chat API, for example:

{
    "id":"chatcmpl-6tCPrEY0j5l157mOIddZi4I0tIFhv",
    "object":"chat.completion.chunk",
    "created":1678613787,
    "model":"gpt-3.5-turbo-0301",
    "choices":[{"delta":{"role":"assistant"}, "index":0, "finish_reason":null}]
}

type OpenaiCompletionStreamRespChoice

type OpenaiCompletionStreamRespChoice struct {
	Delta        OpenaiCompletionStreamRespDelta `json:"delta"`
	Index        int                             `json:"index"`
	FinishReason string                          `json:"finish_reason"`
}

type OpenaiCompletionStreamRespDelta

type OpenaiCompletionStreamRespDelta struct {
	Role OpenaiMessageRole `json:"role"`
	// Content may be string or []StreamRespContent
	Content          any                                  `json:"content"`
	ReasoningContent string                               `json:"reasoning_content,omitempty"`
	Reasoning        string                               `json:"reasoning,omitempty"`
	ToolCalls        []OpenaiCompletionStreamRespToolCall `json:"tool_calls,omitempty"`
}

type OpenaiCompletionStreamRespToolCall

type OpenaiCompletionStreamRespToolCall struct {
	ID       string `json:"id"`
	Type     string `json:"type"`
	Function struct {
		Name      string `json:"name"`
		Arguments string `json:"arguments"`
	} `json:"function"`
}

OpenaiCompletionStreamRespToolCall is a tool call, for example:

{
	"id": "call_abc123",
	"type": "function",
	"function": {
	  "name": "get_current_weather",
	  "arguments": "{\n\"location\": \"Boston, MA\"\n}"
	}
}
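In streaming responses, the `arguments` string of a tool call typically arrives as JSON fragments spread across several delta chunks; concatenating the fragments in order yields the complete arguments. A minimal sketch of that accumulation (chunk boundaries here are invented for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Hypothetical argument fragments from successive stream deltas.
	chunks := []string{`{"loc`, `ation": "Bos`, `ton, MA"}`}
	var b strings.Builder
	for _, c := range chunks {
		b.WriteString(c) // append each fragment in arrival order
	}
	fmt.Println(b.String()) // {"location": "Boston, MA"}
}
```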

type OpenaiCreateImageEditRequest

type OpenaiCreateImageEditRequest struct {
	Model          string `json:"model,omitempty"`
	Prompt         string `json:"prompt"`
	N              int    `json:"n,omitempty"`
	Size           string `json:"size,omitempty"`
	ResponseFormat string `json:"response_format,omitempty"`
}

OpenaiCreateImageEditRequest is a request to the OpenAI image edit API.

type OpenaiCreateImageRequest

type OpenaiCreateImageRequest struct {
	Model          string `json:"model,omitempty"`
	Prompt         string `json:"prompt"`
	N              int    `json:"n"`
	Size           string `json:"size"`
	Quality        string `json:"quality,omitempty"`
	ResponseFormat string `json:"response_format,omitempty"`
	Style          string `json:"style,omitempty"`
}

OpenaiCreateImageRequest is a request to the OpenAI image API.

func NewOpenaiCreateImageRequest

func NewOpenaiCreateImageRequest(prompt string, n int) *OpenaiCreateImageRequest

NewOpenaiCreateImageRequest creates a new request.

type OpenaiCreateImageResponse

type OpenaiCreateImageResponse struct {
	Created int64 `json:"created"`
	Data    []struct {
		Url     string `json:"url"`
		B64Json string `json:"b64_json"`
	} `json:"data"`
}

OpenaiCreateImageResponse is the response from the OpenAI image API.

type OpenaiMessageRole

type OpenaiMessageRole string

OpenaiMessageRole is a message role.

func (OpenaiMessageRole) String

func (r OpenaiMessageRole) String() string

String returns the role as a string.

type OpenaiReqMessage

type OpenaiReqMessage[T string | []OpenaiVisionMessageContent] struct {
	Role    OpenaiMessageRole `json:"role"`
	Content T                 `json:"content"`
}

OpenaiReqMessage is a request message to the OpenAI chat API.

Chat completion messages and vision messages have different content types.

type OpenaiVisionMessageContent

type OpenaiVisionMessageContent struct {
	Type     OpenaiVisionMessageContentType      `json:"type"`
	Text     string                              `json:"text,omitempty"`
	ImageUrl *OpenaiVisionMessageContentImageUrl `json:"image_url,omitempty"`
}

OpenaiVisionMessageContent is vision message content.

type OpenaiVisionMessageContentImageUrl

type OpenaiVisionMessageContentImageUrl struct {
	URL    string                `json:"url"`
	Detail VisionImageResolution `json:"detail,omitempty"`
}

OpenaiVisionMessageContentImageUrl is an image URL.

type OpenaiVisionMessageContentType

type OpenaiVisionMessageContentType string

OpenaiVisionMessageContentType is the vision message content type.

const (
	// OpenaiVisionMessageContentTypeText text
	OpenaiVisionMessageContentTypeText OpenaiVisionMessageContentType = "text"
	// OpenaiVisionMessageContentTypeImageUrl image url
	OpenaiVisionMessageContentTypeImageUrl OpenaiVisionMessageContentType = "image_url"
)

type QuotaExceededError

type QuotaExceededError struct {
	Limit      int
	Used       int
	Remaining  int
	RetryAfter time.Duration
}

QuotaExceededError indicates the free-tier quota has been exhausted.

func (*QuotaExceededError) Error

func (e *QuotaExceededError) Error() string

type StreamRespContent

type StreamRespContent struct {
	Type     string   `json:"type"`
	ImageUrl ImageUrl `json:"image_url"`
}

type Thinking

type Thinking struct {
	Type         string `json:"type"`
	BudgetTokens int    `json:"budget_tokens" binding:"omitempty,min=1024"`
}

https://docs.anthropic.com/en/docs/build-with-claude/extended-thinking#implementing-extended-thinking

type TokenQuotaManager

type TokenQuotaManager struct {
	// contains filtered or unexported fields
}

TokenQuotaManager keeps track of per-user token usage within a rolling window.

type TokenReservation

type TokenReservation struct {
	// contains filtered or unexported fields
}

TokenReservation represents a temporary reservation of tokens for a request.

func ReserveTokens

func ReserveTokens(ctx *gin.Context, user *config.UserConfig, req *FrontendReq) (*TokenReservation, error)

ReserveTokens reserves tokens for a free-tier user. It returns nil when no reservation is needed.

func (*TokenReservation) EstimatedOutputTokens

func (r *TokenReservation) EstimatedOutputTokens() int

EstimatedOutputTokens returns the initially reserved completion token budget.

func (*TokenReservation) Finalize

func (r *TokenReservation) Finalize(ctx context.Context, actualOutputTokens int) error

Finalize updates the reservation to match the actual token usage.

type TranscriptRequest

type TranscriptRequest struct {
	File  multipart.File `form:"file" binding:"required"`
	Model string         `form:"model" binding:"required"`
}

TranscriptRequest is the request struct for speech to text

type TranscriptionResponse

type TranscriptionResponse struct {
	Task     string                 `json:"task"`
	Language string                 `json:"language"`
	Duration float64                `json:"duration"`
	Text     string                 `json:"text"`
	Segments []transcriptionSegment `json:"segments"`
	XGroq    xGroq                  `json:"x_groq"`
}

TranscriptionResponse is the response struct for speech to text.

func Transcript

func Transcript(ctx context.Context, user *config.UserConfig, req *TranscriptRequest) (respData *TranscriptionResponse, err error)

Transcript transcribes audio to text.

type VisionImageResolution

type VisionImageResolution string

VisionImageResolution is the image resolution.

const (
	// VisionImageResolutionLow low resolution
	VisionImageResolutionLow VisionImageResolution = "low"
	// VisionImageResolutionHigh high resolution
	VisionImageResolutionHigh VisionImageResolution = "high"
)
