Tokenization

The process of splitting text into tokens (typically subwords or characters). Tokenization affects context limits, latency, and cost, since models consume and are usually billed by token rather than by character.
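As a concrete illustration, the sketch below counts the tokens a short string produces; it assumes the tiktoken library and its cl100k_base encoding, neither of which is specified above, and any subword tokenizer with encode/decode would show the same idea. The token count, not the character count, is what counts against context limits and per-token pricing.

```python
# Minimal sketch, assuming the tiktoken library is installed (pip install tiktoken).
import tiktoken

# cl100k_base is one BPE vocabulary; the exact encoding is an assumption here.
encoder = tiktoken.get_encoding("cl100k_base")

text = "Tokenization splits text into subword units."
token_ids = encoder.encode(text)                    # text -> list of integer token ids
tokens = [encoder.decode([t]) for t in token_ids]   # decode each id back to its subword string

print(len(text), "characters ->", len(token_ids), "tokens")
print(tokens)  # subword pieces, e.g. 'Token' and 'ization' as separate tokens
```

The same string can yield different token counts under different tokenizers, which is why context-window and cost estimates must be computed with the tokenizer of the specific model being used.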

Related terms