Lexical Analysis

Lexical analysis, also known as lexing or tokenization, is the process of converting a sequence of characters into a sequence of meaningful tokens. It is performed by a program called a lexer, tokenizer, or scanner, which is often paired with a parser to analyze the syntax of programming languages, web pages, and other structured text.
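
To make the idea concrete, below is a minimal tokenizer sketch in Python for a toy expression language. The token classes (NUMBER, IDENT, OP, SKIP) and their patterns are illustrative assumptions, not taken from any particular compiler or course.

import re

# A minimal tokenizer sketch for a toy expression language.
# Each entry pairs a token class name with a regular expression.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),           # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),  # identifiers
    ("OP",     r"[+\-*/=]"),      # single-character operators
    ("SKIP",   r"\s+"),           # whitespace, discarded
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Yield (token class, lexeme) pairs for each token in the input."""
    for match in MASTER_RE.finditer(text):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(tokenize("x = 42 + y")))
# [('IDENT', 'x'), ('OP', '='), ('NUMBER', '42'), ('OP', '+'), ('IDENT', 'y')]

Each regular expression describes one token class, and the scanner repeatedly matches the longest piece of input it can. A production scanner (hand-written, or generated by a tool such as flex) would additionally track source positions for error messages and report any character that matches no pattern, which this sketch silently skips.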

1 course covers this concept

CS 143 Compilers

Stanford University

Spring 2022

Combines theoretical and practical perspectives on compiler design. The course emphasizes correctness over performance, building a deep understanding of how compilers are structured and how they work.
