Model
A model is not a database of facts and not a search engine — it's a large file of numbers that encodes patterns absorbed from enormous amounts of text. Think of it as a very compressed snapshot of language: all the relationships, styles, and associations the model encountered during training, frozen into one file. When you use an AI, you're running that frozen snapshot.
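The "frozen snapshot" idea can be made concrete with a deliberately tiny sketch: below, "training" compresses a toy corpus into a table of counts, and "using the model" is nothing but reading that frozen table back. This is an illustrative bigram toy, not how a real LLM works internally, but the two-phase shape is the same.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for "enormous amounts of text".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Training: absorb patterns (which word follows which) into numbers.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

# The "model" is now a static table of numbers; training is over.
model = {word: dict(followers) for word, followers in counts.items()}

def predict_next(word):
    """Inference: no learning happens here, only a lookup in the snapshot."""
    followers = model.get(word)
    if not followers:
        return None
    return max(followers, key=followers.get)

print(predict_next("the"))  # "cat" follows "the" most often in the corpus
```

Note that once `model` is built, the corpus could be deleted: everything the "model" knows is frozen in that one structure, which is the point of the snapshot metaphor.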
Videos explaining this concept
E003 · Notes on AI
How AI Thinks
LLMs are neural networks trained on vast text datasets. They identify patterns in connections between words and topics. This training gives them one primary capability: Prediction. They predict the...
E005 · Notes on AI
What Is a Model, Really?
A model is a learned structure that predicts what comes next. The training process — analyzing massive amounts of text data — doesn't stay "alive" inside the model. It collapses and crystallizes in...
E006 · Notes on AI
Training vs Using a Model
Training and inference are the two distinct phases of an AI model's lifecycle.
E010 · Notes on AI
The 5-Sentence Mental Model of GenAI
This episode provides a checkpoint after the foundational episodes, compressing the key concepts into five memorable sentences that serve as a mental compass for AI.
E011 · Notes on AI
Tokens
AI models do not read words. They read tokens — the basic unit of text a model processes. A token is close to a word but not the same: one word can be one token, several tokens, or several words ca...
E012 · Notes on AI
Tokenization
Tokenization is the process of turning raw text into tokens before an AI model processes it. It is preprocessing, not thinking — the model only sees the resulting pieces.
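The preprocessing step described above can be sketched as a greedy longest-match tokenizer. The vocabulary here is hand-picked purely for illustration; real tokenizers (e.g. BPE-based ones) learn their vocabularies from data, but the output shape is the same: the model only ever sees the resulting pieces.

```python
# Hypothetical, hand-picked vocabulary for demonstration only.
VOCAB = {"token", "ization", "un", "believ", "able", "cat", "s"}

def tokenize(text):
    """Split text into tokens by greedily matching the longest vocab entry."""
    tokens = []
    for word in text.lower().split():
        i = 0
        while i < len(word):
            # Try the longest possible match first; fall back to a
            # single character when nothing in the vocab matches.
            for j in range(len(word), i, -1):
                if word[i:j] in VOCAB:
                    tokens.append(word[i:j])
                    i = j
                    break
            else:
                tokens.append(word[i])
                i += 1
    return tokens

print(tokenize("tokenization"))  # ['token', 'ization'] — one word, two tokens
print(tokenize("cats"))          # ['cat', 's']
```

This also shows why a token is "close to a word but not the same": depending on the vocabulary, a word may survive intact, split into several tokens, or dissolve into single characters.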