Tokeniser (also lexer or scanner)

A program (or compiler phase) that performs lexical analysis: it groups the characters of the input into lexemes and classifies each lexeme as a token of some kind (e.g. identifier, keyword, number, operator).
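
As a minimal illustration, here is a sketch of a regex-based tokeniser in Python. The token kinds, pattern names, and the `tokenise` function are hypothetical choices for this example, not part of any particular language or course definition; the point is that each named pattern both groups characters into a lexeme and classifies it.

```python
import re

# Hypothetical token kinds for a tiny expression language.
TOKEN_SPEC = [
    ("NUMBER",  r"\d+"),           # integer literals
    ("IDENT",   r"[A-Za-z_]\w*"),  # identifiers (and keywords, if any)
    ("OP",      r"[+\-*/=<>]"),    # single-character operators
    ("LPAREN",  r"\("),
    ("RPAREN",  r"\)"),
    ("SKIP",    r"\s+"),           # whitespace: matched but discarded
    ("ERROR",   r"."),             # anything else becomes an error token
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenise(text):
    """Yield (kind, lexeme) pairs for the input string."""
    for m in TOKEN_RE.finditer(text):
        kind = m.lastgroup
        if kind != "SKIP":          # drop whitespace between tokens
            yield (kind, m.group())

if __name__ == "__main__":
    print(list(tokenise("x1 = 42 + y")))
    # [('IDENT', 'x1'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]
```

In practice a tokeniser is usually generated from such a specification by a lexer generator, or hand-written as a loop over the input with one branch per token kind; the output token stream is then consumed by the parser.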
