
Attention

Attention is the mechanism that lets a model decide which parts of the input are most relevant when generating each word. When the model processes a long passage, it doesn't treat every word equally — it learns to "attend" to the parts that matter most for the current prediction. Attention is what allows models to track context, resolve references, and handle long, complex texts.
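The idea above can be sketched numerically. Below is a minimal NumPy implementation of scaled dot-product attention, the standard formulation: each position's query is compared against every key, the scores are turned into a probability distribution with a softmax, and the output is the corresponding weighted average of the value vectors. The function name and toy data are illustrative, not taken from any particular library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return (output, weights) for query/key/value matrices of shape (seq_len, d)."""
    d = Q.shape[-1]
    # Similarity of each query to every key, scaled to keep the softmax stable.
    scores = Q @ K.T / np.sqrt(d)
    # Softmax over keys: each row becomes a distribution over input positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a weighted average of the value vectors.
    return weights @ V, weights

# Toy example: 3 positions, 4-dimensional vectors (self-attention, so Q = K = V).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)
```

Each row of `w` shows how much the model "attends" to every input position when producing that row's output; the rows sum to 1, so attention is a soft, differentiable selection over the input.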

