Lexing

Lexical analysis

Lexical analysis, also known as lexing or tokenization, is the process of converting a sequence of characters into a sequence of meaningful tokens. It is performed by a program called a lexer, tokenizer, or scanner, which is often paired with a parser that analyzes the syntax of programming languages, web pages, and similar inputs.
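To make the idea concrete, below is a minimal sketch of a regular-expression-based tokenizer for simple arithmetic expressions. The token names and rules are illustrative assumptions, not taken from any particular course or library.

```python
import re

# Illustrative token rules; real lexers typically have many more categories.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literal
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifier
    ("OP",     r"[+\-*/=]"),      # single-character operator
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),           # whitespace, discarded
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Yield (kind, lexeme) pairs; raise on characters no rule matches."""
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if not match:
            raise SyntaxError(f"unexpected character {text[pos]!r} at position {pos}")
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())
        pos = match.end()

print(list(tokenize("x = 3 + 42 * (y - 7)")))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '3'), ('OP', '+'), ('NUMBER', '42'),
#  ('OP', '*'), ('LPAREN', '('), ('IDENT', 'y'), ('OP', '-'), ('NUMBER', '7'),
#  ('RPAREN', ')')]
```

The output stream of (kind, lexeme) pairs is what a parser would then consume to build a syntax tree.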

1 course covers this concept

15-411 Compiler Design

Carnegie Mellon University

Fall 2020

Comprehensive study of compiler design and implementation, examining the interaction between language design and runtime organization. Topics include program analysis, code generation, optimization, and memory management.

