What does the term “maximal munch” refer to in lexical tokenization?

The term “maximal munch” (also called “longest match”) refers to consuming input characters one at a time until reaching a character that cannot extend the current token, so that each token is the longest possible match at that position.

For example, in the string “int125”, if both “int” (say, a keyword) and “int125” (say, an identifier) match valid token patterns, maximal munch selects the longer prefix, so the lexer emits “int125” as a single token rather than “int” followed by “125”.
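The rule can be sketched as a tiny tokenizer: at each position, try every token pattern and keep the longest match. The token names and patterns below (`KEYWORD`, `IDENT`, `NUMBER`) are illustrative, not taken from any particular language specification.

```python
import re

# Illustrative token patterns; a real lexer would have many more.
TOKEN_PATTERNS = [
    ("KEYWORD", r"int|if|while"),
    ("IDENT",   r"[A-Za-z_][A-Za-z0-9_]*"),
    ("NUMBER",  r"[0-9]+"),
]

def tokenize(text):
    pos, tokens = 0, []
    while pos < len(text):
        best = None  # (length, name, lexeme) of the longest match so far
        for name, pattern in TOKEN_PATTERNS:
            m = re.match(pattern, text[pos:])
            # Maximal munch: a strictly longer match always wins;
            # on a tie, the earlier-listed pattern is kept.
            if m and (best is None or len(m.group()) > best[0]):
                best = (len(m.group()), name, m.group())
        if best is None:
            raise ValueError(f"unexpected character at position {pos}")
        _, name, lexeme = best
        tokens.append((name, lexeme))
        pos += len(lexeme)
    return tokens
```

With this sketch, `tokenize("int125")` yields a single `("IDENT", "int125")` token, because the identifier pattern matches six characters while the keyword pattern matches only three; `tokenize("int")` alone still yields `("KEYWORD", "int")`, since both matches tie at three characters and the keyword rule is listed first.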