Tokenization
The process of splitting text into smaller units called tokens (typically subwords or characters). Because models read and are billed in tokens, tokenization determines how much text fits in the context window and directly affects context limits, latency, and cost.
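As a rough illustration, the sketch below uses the tiktoken library (an assumption; any BPE tokenizer behaves similarly) to encode a string into token IDs, count them, and estimate input cost from the count. The per-1K-token price is a placeholder, not a real rate.

```python
# Minimal sketch: counting tokens and estimating cost with a BPE tokenizer.
# Assumes the `tiktoken` package is installed; the price below is a placeholder.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a BPE encoding used by several OpenAI models

text = "Tokenization splits text into subword units."
token_ids = enc.encode(text)   # list of integer token IDs
print(len(token_ids), "tokens")

# Cost scales with token count (placeholder price per 1K input tokens).
price_per_1k = 0.0005
print(f"Estimated input cost: ${len(token_ids) / 1000 * price_per_1k:.6f}")

# Decoding maps the token IDs back to the original text.
assert enc.decode(token_ids) == text
```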
Related terms
Token
Context Window