Interview Preparation
Math derivations, intuitions, and clean NumPy/Python implementations. Everything you need to walk through an ML interview with confidence.
6 artefacts
OLS derivation via MLE, gradient descent, normal equation, Ridge/Lasso regularisation.
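The closed-form pieces above can be sketched in a few lines of NumPy. This is a minimal illustration, not the artefact's own code; the function names are mine, and Lasso is omitted since it has no closed form:

```python
import numpy as np

def ols_normal_equation(X, y):
    # Normal equation: w = (X^T X)^{-1} X^T y.
    # Solve the linear system instead of inverting explicitly.
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, lam=1.0):
    # Ridge adds lam * I to X^T X, shrinking weights toward zero
    # and making the system solvable even when X^T X is singular.
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)
```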
Sigmoid derivation, binary cross-entropy from MLE, multi-class softmax.
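A compact NumPy sketch of the three ingredients, under my own naming; the `eps` clip and the max-shift are standard numerical-stability tricks, not part of the derivation itself:

```python
import numpy as np

def sigmoid(z):
    # Maps logits to (0, 1); the MLE of a Bernoulli model yields this link.
    return 1.0 / (1.0 + np.exp(-z))

def binary_cross_entropy(y_true, p, eps=1e-12):
    # Negative log-likelihood of Bernoulli labels; clip avoids log(0).
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def softmax(z):
    # Multi-class generalisation; subtracting the row max prevents overflow
    # without changing the result.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)
```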
Scaled dot-product attention, multi-head, positional encoding, encoder-decoder from scratch.
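The core attention operation can be sketched as follows; this is a single-head illustration with names of my choosing, leaving out masking and the multi-head split:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-1, -2) / np.sqrt(d_k)  # scale keeps dot products O(1)
    scores = scores - scores.max(axis=-1, keepdims=True)  # stable softmax
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```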
Chain rule, computational graphs, vanishing gradients - derive gradients for a 2-layer MLP by hand.
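The hand derivation can be checked in code. Below is a minimal sketch, assuming a ReLU hidden layer and MSE loss (my choices, not necessarily the artefact's), with each backward line annotated with the chain-rule step it implements:

```python
import numpy as np

def mlp_grads(X, y, W1, W2):
    # Forward: h = ReLU(X W1), yhat = h W2, L = mean((yhat - y)^2).
    z1 = X @ W1
    h = np.maximum(z1, 0.0)
    yhat = h @ W2
    loss = np.mean((yhat - y) ** 2)
    # Backward: apply the chain rule from the loss toward the input.
    dyhat = 2.0 * (yhat - y) / yhat.size  # dL/dyhat
    dW2 = h.T @ dyhat                     # dL/dW2 = h^T dL/dyhat
    dh = dyhat @ W2.T                     # dL/dh
    dz1 = dh * (z1 > 0)                   # ReLU gate: gradient passes only where z1 > 0
    dW1 = X.T @ dz1                       # dL/dW1 = X^T dL/dz1
    return loss, dW1, dW2
```

A finite-difference check against these gradients is the standard way to verify the derivation.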
SGD, Momentum, RMSProp, Adam - update rules derived, intuition explained.
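As a representative example, the Adam update rule (which combines Momentum's first moment with RMSProp's second moment) can be sketched as a single step function; the signature is mine:

```python
import numpy as np

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # First moment (Momentum-style EMA of gradients).
    m = b1 * m + (1 - b1) * g
    # Second moment (RMSProp-style EMA of squared gradients).
    v = b2 * v + (1 - b2) * g**2
    # Bias correction: EMAs start at zero, so early estimates are rescaled.
    m_hat = m / (1 - b1**t)
    v_hat = v / (1 - b2**t)
    # Per-parameter adaptive step.
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```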
Distance metrics, curse of dimensionality, KD-tree vs brute-force trade-offs.
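The brute-force side of that trade-off fits in a few lines; a minimal k-NN classifier sketch of my own naming, using squared Euclidean distance (an assumption, since the artefact covers several metrics):

```python
import numpy as np

def knn_predict(X_train, y_train, X_query, k=3):
    # Brute force: O(n * d) distances per query. KD-trees beat this only
    # in low dimensions; in high dimensions distances concentrate and
    # the tree degrades back to near-linear scans.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    # argpartition finds the k smallest distances without a full sort.
    idx = np.argpartition(d2, k, axis=1)[:, :k]
    # Majority vote over the k nearest labels.
    return np.array([np.bincount(votes).argmax() for votes in y_train[idx]])
```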