Self-Attention
An attention mechanism in which queries, keys, and values are all derived from the same sequence, so each element can attend to every other element in that sequence and build a context-aware representation.
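To make the definition concrete, here is a minimal NumPy sketch of scaled dot-product self-attention. The projection matrices `w_q`, `w_k`, `w_v` and all dimensions are illustrative assumptions for a single attention head, not part of the glossary entry.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over one sequence.

    x: (seq_len, d_model) input embeddings.
    w_q, w_k, w_v: (d_model, d_k) projection matrices (assumed shapes).
    """
    q = x @ w_q  # queries, keys, and values all come from the same sequence x
    k = x @ w_k
    v = x @ w_v
    d_k = q.shape[-1]
    # Each row i scores element i against every element j of the same sequence.
    scores = q @ k.T / np.sqrt(d_k)
    # Row-wise softmax (subtracting the max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # context-aware representation of each element

# Tiny usage example with random data; shapes are illustrative only.
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_k)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (4, 8)
```

Because the queries and keys come from the same sequence, the attention weights form a seq_len x seq_len matrix relating every element to every other; in cross-attention, by contrast, the queries come from one sequence and the keys and values from another.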
Related terms
Attention Mechanism
Cross-Attention
Transformer Architecture