
Building makemore Part 5: Building a WaveNet

6 chapters with key takeaways
1. WaveNet Introduction & MLP Review · 0:00–2:33 (2m 33s) · Intro
2. PyTorch Module Refactoring & Sequential · 2:33–17:32 (14m 59s) · Architecture
3. Hierarchical Information Fusion for WaveNet · 17:32–28:19 (10m 47s) · Architecture
4. Custom Flattening & Hierarchical Model Build · 28:19–38:40 (10m 21s) · Architecture
5. BatchNorm1D Correction for Multi-Dim Inputs · 38:40–46:07 (7m 27s) · Limitation
6. Scaling, Convolutions, & Deep Learning Workflow · 46:07–56:22 (10m 15s) · Training
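The BatchNorm1D correction in chapter 5 is easy to illustrate: once activations become three-dimensional (batch, time, channels), normalization statistics must be reduced over both the batch and time axes, not the batch axis alone. A minimal numpy sketch of the shape difference (array shapes and variable names are illustrative, not taken from the video):

```python
import numpy as np

# Hypothetical 3-D activation: (batch, time, channels).
x = np.random.randn(32, 4, 68)

# Bug: reducing over the batch axis only yields one statistic
# per (time, channel) position instead of one per channel.
mean_wrong = x.mean(axis=0)        # shape (4, 68)

# Fix: reduce over batch AND time, giving one statistic per channel.
mean_right = x.mean(axis=(0, 1))   # shape (68,)
```

In PyTorch the same idea applies to the running mean and variance a batch-norm layer maintains; the per-channel reduction is what keeps the layer's behavior consistent as the time dimension changes.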

Video Details & AI Summary

Published Nov 21, 2022
Analyzed Jan 21, 2026

AI Analysis Summary

This video, Part 5 of the 'Building makemore' series, focuses on evolving a character-level language model from a simple Multi-Layer Perceptron (MLP) to an architecture resembling DeepMind's WaveNet. It details significant code refactoring in PyTorch, including creating custom Embedding, Flatten, and Sequential modules, and debugging a critical BatchNorm1D issue for multi-dimensional inputs. The core of the lecture involves implementing a hierarchical information fusion strategy, where characters are progressively combined in a tree-like structure, leading to improved model performance and setting the stage for more advanced convolutional neural network concepts.
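The hierarchical fusion described above hinges on a flatten module that merges pairs of consecutive time steps rather than collapsing the whole context at once. A minimal numpy sketch of that reshaping (class and variable names are illustrative; the video builds an equivalent module on PyTorch tensors):

```python
import numpy as np

class FlattenConsecutive:
    """Fuse n consecutive time steps by concatenating their channels:
    (B, T, C) -> (B, T // n, n * C), dropping the time axis once it reaches 1."""
    def __init__(self, n):
        self.n = n

    def __call__(self, x):
        B, T, C = x.shape
        x = x.reshape(B, T // self.n, C * self.n)
        if x.shape[1] == 1:  # final level: squeeze the spurious time dim
            x = np.squeeze(x, axis=1)
        return x

# Toy batch: 4 examples, context of 8 characters, 10-dim embeddings.
emb = np.random.randn(4, 8, 10)
flat = FlattenConsecutive(2)
h1 = flat(emb)   # (4, 4, 20): character pairs fused
h2 = flat(h1)    # (4, 2, 40): pairs of pairs
h3 = flat(h2)    # (4, 80): the whole context, combined tree-style
```

Interleaving a linear, batch-norm, and tanh block after each such flatten produces the tree-like, WaveNet-style architecture the summary refers to, in contrast to the original MLP, which squashed all eight embeddings in a single step.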

Title Accuracy Score
10/10 · Excellent
Processed in 58.9s
Model: gemini-2.5-flash