Package lexer

v0.3.1
Published: Dec 28, 2025 · License: CC0-1.0 · Imports: 2 · Imported by: 0

Documentation

Overview

Package lexer provides tokenization for WGSL source code.

The lexer converts a WGSL source string into a sequence of tokens, handling:

- Keywords and reserved words
- Identifiers (including Unicode XID)
- Numeric literals (int, float, hex)
- Operators and punctuation
- Comments (line and block, with nesting)
- Template list disambiguation (<, >)
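The last point is the subtle one: in WGSL, `<` and `>` may delimit a template list (as in `vec2<f32>`) or act as comparison operators (as in `a < b`), and the lexer emits different token kinds for each. The sketch below illustrates this with the package's documented API; the import path is not shown on this page, so the path used here is an assumption.

```go
package main

import (
	"fmt"

	// Hypothetical import path; substitute the module's actual path.
	"example.com/wgsl/lexer"
)

func main() {
	// In "vec2<f32>" the angle brackets open a template list, so the
	// lexer should emit TokTemplateArgsStart / TokTemplateArgsEnd
	// rather than TokLt / TokGt.
	for _, tok := range lexer.New("vec2<f32>").Tokenize() {
		fmt.Println(tok.Kind)
	}

	// In "a < b" the same character is an ordinary less-than,
	// tokenized as TokLt.
	for _, tok := range lexer.New("a < b").Tokenize() {
		fmt.Println(tok.Kind)
	}
}
```

This mirrors the template-list discovery required by the WGSL specification, which is why the two template delimiters are listed as context-sensitive token kinds below.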

Index

Constants

This section is empty.

Variables

var Keywords = map[string]TokenKind{
	"alias":        TokAlias,
	"break":        TokBreak,
	"case":         TokCase,
	"const":        TokConst,
	"const_assert": TokConstAssert,
	"continue":     TokContinue,
	"continuing":   TokContinuing,
	"default":      TokDefault,
	"diagnostic":   TokDiagnostic,
	"discard":      TokDiscard,
	"else":         TokElse,
	"enable":       TokEnable,
	"false":        TokFalse,
	"fn":           TokFn,
	"for":          TokFor,
	"if":           TokIf,
	"let":          TokLet,
	"loop":         TokLoop,
	"override":     TokOverride,
	"requires":     TokRequires,
	"return":       TokReturn,
	"struct":       TokStruct,
	"switch":       TokSwitch,
	"true":         TokTrue,
	"var":          TokVar,
	"while":        TokWhile,
}

Keywords maps keyword strings to their token kinds.

var ReservedWords = map[string]bool{ /* 146 elements not displayed */ }

ReservedWords contains all WGSL reserved words that cannot be used as identifiers.

Functions

This section is empty.

Types

type Lexer

type Lexer struct {
	// contains filtered or unexported fields
}

Lexer tokenizes WGSL source code.

func New

func New(source string) *Lexer

New creates a new lexer for the given source.

func (*Lexer) Next

func (l *Lexer) Next() Token

Next returns the next token.

func (*Lexer) Tokenize

func (l *Lexer) Tokenize() []Token

Tokenize returns all tokens in the source.
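A minimal usage sketch of the two consumption styles. Again, the module's import path is not shown on this page, so the path below is an assumption; everything else uses only the identifiers documented here.

```go
package main

import (
	"fmt"

	// Hypothetical import path; replace with the module's real path.
	"example.com/wgsl/lexer"
)

func main() {
	source := "fn main() { return; }"

	// Tokenize collects every token in one slice.
	for _, tok := range lexer.New(source).Tokenize() {
		fmt.Printf("%v %q\n", tok.Kind, tok.Text(source))
	}

	// Next yields tokens one at a time; presumably a TokEOF token
	// marks the end of input, since TokEOF is a documented kind.
	l := lexer.New(source)
	for tok := l.Next(); tok.Kind != lexer.TokEOF; tok = l.Next() {
		fmt.Println(tok.Kind)
	}
}
```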

type Token

type Token struct {
	Kind  TokenKind
	Start int    // Byte offset in source
	End   int    // Byte offset of end (exclusive)
	Value string // For identifiers and literals
}

Token represents a lexical token.

func (Token) Text

func (t Token) Text(source string) string

Text returns the source text of the token.
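Because Start and End are byte offsets with End exclusive, Text is presumably equivalent to slicing the source string. The self-contained sketch below uses a local mirror of the documented struct (not the package itself) to show how the offsets map back to source text.

```go
package main

import "fmt"

// Token mirrors the documented offset fields; a local sketch,
// not the package's actual type.
type Token struct {
	Start int // byte offset of the token's first byte
	End   int // byte offset one past the last byte (exclusive)
}

// Text slices the original source by the token's byte offsets,
// which is what the documented Start/End semantics imply.
func (t Token) Text(source string) string {
	return source[t.Start:t.End]
}

func main() {
	source := "fn main() {}"
	tok := Token{Start: 3, End: 7} // spans the identifier "main"
	fmt.Println(tok.Text(source))  // prints: main
}
```

Keeping tokens as offset pairs rather than copied strings keeps Token small; the source string is only sliced on demand.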

type TokenKind

type TokenKind uint8

TokenKind represents the type of a token.

const (
	TokError TokenKind = iota
	TokEOF

	// Literals
	TokIntLiteral
	TokFloatLiteral
	TokTrue
	TokFalse

	// Identifiers
	TokIdent

	// Keywords
	TokAlias
	TokBreak
	TokCase
	TokConst
	TokConstAssert
	TokContinue
	TokContinuing
	TokDefault
	TokDiagnostic
	TokDiscard
	TokElse
	TokEnable
	TokFn
	TokFor
	TokIf
	TokLet
	TokLoop
	TokOverride
	TokRequires
	TokReturn
	TokStruct
	TokSwitch
	TokVar
	TokWhile

	// Operators
	TokPlus    // +
	TokMinus   // -
	TokStar    // *
	TokSlash   // /
	TokPercent // %
	TokAmp     // &
	TokPipe    // |
	TokCaret   // ^
	TokTilde   // ~
	TokBang    // !
	TokLt      // <
	TokGt      // >
	TokEq      // =
	TokDot     // .
	TokAt      // @

	// Multi-char operators
	TokPlusPlus   // ++
	TokMinusMinus // --
	TokAmpAmp     // &&
	TokPipePipe   // ||
	TokLtLt       // <<
	TokGtGt       // >>
	TokLtEq       // <=
	TokGtEq       // >=
	TokEqEq       // ==
	TokBangEq     // !=
	TokArrow      // ->
	TokPlusEq     // +=
	TokMinusEq    // -=
	TokStarEq     // *=
	TokSlashEq    // /=
	TokPercentEq  // %=
	TokAmpEq      // &=
	TokPipeEq     // |=
	TokCaretEq    // ^=
	TokLtLtEq     // <<=
	TokGtGtEq     // >>=

	// Delimiters
	TokLParen     // (
	TokRParen     // )
	TokLBrace     // {
	TokRBrace     // }
	TokLBracket   // [
	TokRBracket   // ]
	TokSemicolon  // ;
	TokColon      // :
	TokComma      // ,
	TokUnderscore // _ (as placeholder expression)

	// Template delimiters (context-sensitive)
	TokTemplateArgsStart // < in template context
	TokTemplateArgsEnd   // > in template context
)

func (TokenKind) String

func (k TokenKind) String() string

String returns the string representation of a token kind.
