This course covers several topics in machine learning theory. The first half focuses on uniform convergence-based methods (e.g., covering, chaining) for establishing generalization bounds via complexity measures such as Rademacher complexity and VC dimension. The second half begins with reproducing kernel Hilbert spaces and demonstrates the double descent phenomenon in a kernel ridge regression setting. Finally, we discuss linearization (the neural tangent kernel, NTK) and feature learning in neural networks.