Transformers

Plural alias for the Transformer architecture used in modern LLMs. In place of recurrence or convolution, Transformers rely on attention mechanisms, chiefly self-attention, to model long‑range dependencies in sequences.
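
As a rough illustration, here is a minimal NumPy sketch of the scaled dot-product attention at the heart of the architecture; the function name, shapes, and single-head setup are illustrative assumptions, not code from any particular library:

```python
# Minimal sketch of scaled dot-product attention (illustrative, single head).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled to stabilize the softmax.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted mix of all value vectors, so any
    # position can draw on any other in a single step, regardless of distance.
    return weights @ V

# Usage: self-attention over four positions with dimension 8.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

Because every position attends to every other directly, path length between any two tokens is constant, which is why attention handles long-range dependencies more readily than recurrent architectures.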