The spelled-out intro to language modeling: building makemore
Video Details & AI Summary
AI Analysis Summary
This video provides a detailed, spelled-out introduction to language modeling using the makemore project. It begins by building a character-level bigram language model through explicit counting and normalization, demonstrating how to sample new words and how to evaluate model quality with the negative log likelihood. The tutorial then reimplements the same bigram model as a neural network in PyTorch, explaining one-hot encoding, logits, softmax, and gradient-based optimization. Both approaches yield identical results, but the neural-network framing is far more flexible and scalable, paving the way for more complex models such as transformers.
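The two stages described above can be sketched end to end. This is a minimal NumPy sketch, not the video's PyTorch code: the tiny `words` list stands in for the large names dataset the video uses, and the manual softmax cross-entropy gradient replaces PyTorch's autograd. It builds the counting-based bigram table, samples a word, computes the average negative log likelihood, then trains the equivalent one-hot/logits/softmax model by gradient descent:

```python
import numpy as np

# Hypothetical tiny corpus; the video trains on a large list of names.
words = ["emma", "olivia", "ava", "isabella", "sophia"]

# Character vocabulary with '.' as the start/end token at index 0.
chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0
itos = {i: c for c, i in stoi.items()}
V = len(stoi)

# --- Stage 1: explicit counting and normalization ---
N = np.zeros((V, V))
for w in words:
    seq = ["."] + list(w) + ["."]
    for ch1, ch2 in zip(seq, seq[1:]):
        N[stoi[ch1], stoi[ch2]] += 1

# Rows normalized (with add-one smoothing) give P(next char | current char).
P = (N + 1) / (N + 1).sum(axis=1, keepdims=True)

# Sample a new word by walking the chain from the start token.
rng = np.random.default_rng(0)
ix, out = 0, []
while True:
    ix = rng.choice(V, p=P[ix])
    if ix == 0:
        break
    out.append(itos[ix])
print("sampled:", "".join(out))

# Model quality: average negative log likelihood over all bigrams.
log_likelihood, n = 0.0, 0
for w in words:
    seq = ["."] + list(w) + ["."]
    for ch1, ch2 in zip(seq, seq[1:]):
        log_likelihood += np.log(P[stoi[ch1], stoi[ch2]])
        n += 1
print("counting avg NLL:", -log_likelihood / n)

# --- Stage 2: the same model as a one-layer neural network ---
# logits = one_hot(x) @ W, which is just row selection W[x].
xs, ys = [], []
for w in words:
    seq = ["."] + list(w) + ["."]
    for ch1, ch2 in zip(seq, seq[1:]):
        xs.append(stoi[ch1])
        ys.append(stoi[ch2])
xs, ys = np.array(xs), np.array(ys)

W = np.zeros((V, V))
for _ in range(200):
    logits = W[xs]                                  # one-hot @ W
    logits = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)       # softmax
    loss = -np.log(probs[np.arange(len(xs)), ys]).mean()
    # Manual gradient of softmax cross-entropy (PyTorch autograd in the video).
    dlogits = probs.copy()
    dlogits[np.arange(len(xs)), ys] -= 1
    dlogits /= len(xs)
    dW = np.zeros_like(W)
    np.add.at(dW, xs, dlogits)
    W -= 50 * dW                                    # gradient step
print("neural-net avg NLL:", loss)
```

With enough steps the trained `W` converges toward the log of the count table, so the two losses agree (up to the smoothing term, which corresponds to weight regularization in the neural version).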
Summary generated by gemini-2.5-flash · Original Video