Cross-Attention
An attention mechanism that connects two different sequences (e.g., source and target languages in translation), unlike self-attention, which operates within a single sequence. The queries come from one sequence while the keys and values come from the other, letting each position in the first sequence attend over the second.
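As a rough illustration, here is a minimal NumPy sketch of scaled dot-product cross-attention, where queries are projected from the target sequence and keys/values from the source sequence. The function name, shapes, and random projection matrices are illustrative assumptions, not a specific library's API.

```python
import numpy as np

def cross_attention(target, source, d_k=64):
    """Minimal scaled dot-product cross-attention sketch (illustrative only).

    target: (T_tgt, d_model) - sequence providing the queries
    source: (T_src, d_model) - sequence providing the keys and values
    The projection matrices below are random placeholders; a real layer
    would use learned weights.
    """
    d_model = target.shape[-1]
    rng = np.random.default_rng(0)
    W_q = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
    W_k = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
    W_v = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)

    Q = target @ W_q          # queries from the target sequence
    K = source @ W_k          # keys from the source sequence
    V = source @ W_v          # values from the source sequence

    scores = Q @ K.T / np.sqrt(d_k)                 # (T_tgt, T_src)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over source positions
    return weights @ V                              # (T_tgt, d_k)

# Example: 5 target tokens attending over 7 source tokens
out = cross_attention(np.random.randn(5, 512), np.random.randn(7, 512))
print(out.shape)  # (5, 64)
```

Each row of the attention weights sums to 1 over the source positions, so every target token receives a weighted mixture of source values.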
Related terms
Attention Mechanism
Self-Attention
Transformer Architecture