Lexical Analyzer
Lexical Analysis is the first phase of compiler design. It reads the source program character by character, groups the characters into lexemes, and produces a sequence of tokens. It also collects information about each token in its associated attributes.
- The tokens influence parsing decisions.
- The attributes influence how the tokens are translated.
Lexical Analysis has two stages:
- Scanning
- Tokenization
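The two stages above can be sketched in code. This is a minimal illustration, not a real compiler front end: the token names, patterns, and the sample input are all hypothetical, and scanning is reduced to stripping comments and surrounding whitespace.

```python
import re

# Hypothetical source line; "#" starts a comment in this toy language.
SOURCE = "x = 42  # the answer"

def scan(text):
    """Scanning: discard comments and surrounding whitespace."""
    return re.sub(r"#.*", "", text).strip()

# Illustrative token patterns (order matters: NUMBER is tried before IDENT).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("ASSIGN", r"="),
    ("SKIP",   r"\s+"),
]

def tokenize(text):
    """Tokenization: group characters into (token, lexeme) pairs."""
    pattern = "|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC)
    for m in re.finditer(pattern, text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize(scan(SOURCE))))
# → [('IDENT', 'x'), ('ASSIGN', '='), ('NUMBER', '42')]
```

Real lexers are usually generated from such pattern tables by tools like Lex or Flex rather than written by hand.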
What is a Lexeme?
It is a sequence of characters in the source program that matches the pattern for a token. In other words, a lexeme is an instance of a token.
What is a Token?
A token is a unit of information in the source program: a token name (such as keyword, identifier, operator, or number), often paired with an attribute value derived from the lexeme.
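The lexeme/token distinction can be made concrete with a small sketch. All names and patterns here are illustrative assumptions: the point is that one token name (such as ID or NUMBER) covers many different lexemes, and each emitted token carries an attribute computed from its lexeme.

```python
import re

def lex(source):
    """Map lexemes to (token name, attribute) pairs."""
    tokens = []
    # Unnamed whitespace alternative is matched but never emitted.
    for m in re.finditer(r"(?P<NUMBER>\d+)|(?P<ID>[A-Za-z_]\w*)|(?P<OP>[+*=-])|\s+", source):
        if m.lastgroup == "NUMBER":
            tokens.append(("NUMBER", int(m.group())))  # attribute: numeric value
        elif m.lastgroup == "ID":
            tokens.append(("ID", m.group()))           # attribute: the lexeme itself
        elif m.lastgroup == "OP":
            tokens.append(("OP", m.group()))
    return tokens

print(lex("count = count + 1"))
# → [('ID', 'count'), ('OP', '='), ('ID', 'count'), ('OP', '+'), ('NUMBER', 1)]
```

Note that the two occurrences of the lexeme "count" yield two separate tokens with the same name and attribute; in a full compiler the ID attribute would typically be a symbol-table entry rather than the raw lexeme.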