Multi-Head Attention
An attention mechanism that runs several attention operations ("heads") in parallel, each with its own learned query, key, and value projections, allowing the model to focus on different aspects of the input simultaneously. The heads' outputs are concatenated and projected back to the model dimension.
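A minimal NumPy sketch of this idea, following the standard scaled dot-product formulation: the function name, weight matrices, and dimensions below are illustrative assumptions, not a specific library API.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Multi-head self-attention over a sequence x of shape (seq_len, d_model)."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project the input into queries, keys, and values, then split into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split_heads(t):
        return t.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q = split_heads(x @ w_q)
    k = split_heads(x @ w_k)
    v = split_heads(x @ w_v)

    # Scaled dot-product attention, computed independently for each head
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)  # (num_heads, seq_len, seq_len)
    weights = softmax(scores, axis=-1)
    heads = weights @ v                                   # (num_heads, seq_len, d_head)

    # Concatenate the heads and apply the output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o

# Example usage with random weights (shapes are hypothetical)
rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 64, 8, 10
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v, w_o = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4))
out = multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads)
print(out.shape)  # (10, 64)
```

Because each head attends with its own projections, one head might track positional relationships while another tracks semantic similarity, and the final projection mixes these views back together.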
Related terms
Attention Mechanism
Transformer Architecture
Self-Attention