
# leac - v0.6.0

## Table of contents

- Interfaces
- Type Aliases
- Functions

## Type Aliases

### Lexer

Ƭ **Lexer**: (`str`: `string`, `offset?`: `number`) => `LexerResult`

#### Type declaration

▸ (`str`, `offset?`): `LexerResult`

Lexer function.

##### Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `str` | `string` | A string to tokenize. |
| `offset?` | `number` | Initial offset. Used when composing lexers. |

##### Returns

`LexerResult`


### LexerResult

Ƭ **LexerResult**: `Object`

Result returned by a lexer function.

#### Type declaration

| Name | Type | Description |
| :--- | :--- | :--- |
| `complete` | `boolean` | `true` if the whole input string was processed. Check this to see whether any input was left untokenized. |
| `offset` | `number` | Final offset. |
| `tokens` | `Token[]` | Array of tokens. |

### Options

Ƭ **Options**: `Object`

Lexer options (not many so far).

#### Type declaration

| Name | Type | Description |
| :--- | :--- | :--- |
| `lineNumbers?` | `boolean` | Enable computation of line and column numbers. |
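When `lineNumbers` is enabled, each token carries 1-based `line` and `column` values. A minimal sketch of how such values can be derived from a token's offset (this is a hypothetical helper for illustration, not part of leac):

```typescript
// Derive 1-based line and column numbers from an offset into a string.
// Hypothetical helper, not part of the leac API.
function lineColumn(str: string, offset: number): { line: number; column: number } {
  let line = 1;
  let lineStart = 0; // offset where the current line begins
  for (let i = 0; i < offset; i++) {
    if (str[i] === '\n') {
      line++;
      lineStart = i + 1;
    }
  }
  return { line, column: offset - lineStart + 1 };
}

const pos = lineColumn('ab\ncd', 4); // offset of 'd'
// → { line: 2, column: 2 }
```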

### Rules

Ƭ **Rules**: [`Rule` \| `StringRule` \| `RegexRule`, ...(`Rule` \| `StringRule` \| `RegexRule`)[]]

Non-empty array of rules.

Rules are processed in the provided order; the first match is taken.

Rules can share a name. For example, you can have separate rules for various keywords and give them all the name `"keyword"`.
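The first-match-wins order described above can be sketched like this (a self-contained illustration of the semantics, not leac's implementation; the `SketchRule` type and `firstMatch` helper are invented for this example):

```typescript
// Sketch of first-match-wins rule processing: rules are tried in array
// order at the current offset, and the first rule that matches wins.
type SketchRule = { name: string; regex: RegExp };

function firstMatch(rules: SketchRule[], str: string, offset: number) {
  for (const rule of rules) {
    // Sticky flag: the regex must match exactly at `offset`.
    const re = new RegExp(rule.regex.source, 'y');
    re.lastIndex = offset;
    const m = re.exec(str);
    if (m) return { name: rule.name, text: m[0] };
  }
  return null; // no rule matched at this offset
}

const rules: SketchRule[] = [
  { name: 'keyword', regex: /let\b/ },   // several rules may share...
  { name: 'keyword', regex: /const\b/ }, // ...the same name
  { name: 'ident', regex: /[a-z]+/ },
];

firstMatch(rules, 'const x', 0); // → { name: 'keyword', text: 'const' }
firstMatch(rules, 'constant', 0); // keyword rules fail (\b), ident matches
```

Because matching is order-dependent, more specific rules (keywords) should come before more general ones (identifiers).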


### Token

Ƭ **Token**: `Object`

Token object, the result of matching an individual lexing rule.

#### Type declaration

| Name | Type | Description |
| :--- | :--- | :--- |
| `column` | `number` | Column number within the line in the source string (1-based). (Always zero if line numbers are not enabled in the lexer options.) |
| `len` | `number` | The length of the matched substring. (Can differ from the text length when a replace value is used in a RegexRule.) |
| `line` | `number` | Line number in the source string (1-based). (Always zero if line numbers are not enabled in the lexer options.) |
| `name` | `string` | Name of the rule that produced this token. |
| `offset` | `number` | Start index of the match in the input string. |
| `state` | `string` | Name of the lexer containing the rule that produced this token. |
| `text` | `string` | Text matched by the rule. (Unless a replace value was used by a RegexRule.) |
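Putting the fields together, a token for the input `'foo bar'` might look like the following (a hypothetical example: the rule name `word` and the input are invented, and it assumes the default empty state name with line numbers enabled):

```typescript
// Shape of a token as documented above.
interface Token {
  name: string;   // rule name
  state: string;  // lexer name
  text: string;   // matched text
  offset: number; // start index in the input
  len: number;    // length of the matched substring
  line: number;   // 1-based line number
  column: number; // 1-based column number
}

// Hypothetical token for 'bar' in the input 'foo bar'.
const sample: Token = {
  name: 'word',
  state: '',
  text: 'bar',
  offset: 4, // 'bar' starts at index 4 of 'foo bar'
  len: 3,
  line: 1,
  column: 5, // 1-based: offset 4 on line 1 → column 5
};
```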

## Functions

### createLexer

▸ **createLexer**(`rules`, `state?`, `options?`): `Lexer`

Create a lexer function.

#### Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `rules` | `Rules` | Non-empty array of lexing rules. Rules are processed in the provided order; the first match is taken. Rules can share a name: for example, you can have separate rules for various keywords and give them all the name `"keyword"`. |
| `state?` | `string` | The name of this lexer. Used when composing lexers. Empty string by default. |
| `options?` | `Options` | Lexer options object. |

#### Returns

`Lexer`

▸ **createLexer**(`rules`, `options?`): `Lexer`

Create a lexer function.

#### Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `rules` | `Rules` | Non-empty array of lexing rules. Rules are processed in the provided order; the first match is taken. Rules can share a name: for example, you can have separate rules for various keywords and give them all the name `"keyword"`. |
| `options?` | `Options` | Lexer options object. |

#### Returns

`Lexer`
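The overall contract — a lexer built from ordered rules that tokenizes from an optional initial offset and reports `tokens`, the final `offset`, and `complete` — can be sketched end to end. This is a self-contained miniature modeled on the documented behavior, not the leac implementation; the `SketchRule` type, the `discard` flag, and `sketchCreateLexer` are assumptions made for illustration:

```typescript
// Miniature lexer factory following the documented contract.
type SketchRule = { name: string; regex: RegExp; discard?: boolean };

function sketchCreateLexer(rules: SketchRule[], state = '') {
  return (str: string, offset = 0) => {
    const tokens: { name: string; state: string; text: string; offset: number; len: number }[] = [];
    outer: while (offset < str.length) {
      for (const rule of rules) {
        const re = new RegExp(rule.regex.source, 'y'); // match at offset only
        re.lastIndex = offset;
        const m = re.exec(str);
        if (m && m[0].length > 0) {
          if (!rule.discard) {
            tokens.push({ name: rule.name, state, text: m[0], offset, len: m[0].length });
          }
          offset += m[0].length;
          continue outer; // restart from the first rule at the new offset
        }
      }
      break; // no rule matched: stop, leaving `complete` false
    }
    return { tokens, offset, complete: offset === str.length };
  };
}

const lex = sketchCreateLexer([
  { name: 'ws', regex: /\s+/, discard: true },
  { name: 'number', regex: /\d+/ },
  { name: 'op', regex: /[+-]/ },
]);

lex('1 + 2'); // → complete: true, tokens 'number'('1'), 'op'('+'), 'number'('2')
```

A lexer produced this way also shows why `offset` and `complete` matter: on input it cannot fully tokenize, it stops at the first unmatchable position and reports `complete: false` with the final `offset`, which is what makes composing lexers via the `offset` parameter possible.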