The course will discuss the technical side of statistical methods, focusing on two key aspects: optimization and implementation. The first part of the course will introduce the background needed to understand and devise algorithms for modern statistical methodology. It will cover core concepts and tools from convex optimization, such as convexity of sets and functions, the method of Lagrange multipliers, Newton's method, proximal gradient descent, coordinate descent, and the alternating direction method of multipliers (ADMM). It will also review key topics in linear algebra, including matrix and vector norms, quadratic forms and positive semidefinite matrices, matrix calculus (gradient, Hessian, and determinant), and matrix decompositions (QR, Cholesky, eigenvalue, and singular value).

The second part of the course will focus on topics from statistical methodology with an emphasis on their computational aspects. Topics will include model assessment and selection (bias-variance trade-off, cross-validation, and the bootstrap), feature selection (penalized generalized linear models, the elastic net, group and fused lasso, and least angle regression), dimension reduction (principal component analysis, independent component analysis, and factor analysis), and data compression via clustering (k-means, hierarchical, and spectral clustering).

The course will have a significant practical component: labs and coding assignments in which students will hone their skills in implementing statistical optimization algorithms.
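To give a flavor of how the two parts of the course connect, the following is a minimal sketch of the kind of routine the labs might ask students to implement: cyclic coordinate descent for the lasso, which pairs a penalized regression model from the second part with an optimization algorithm from the first. The choice of Python, the objective scaling, and the function names (soft_threshold, lasso_coordinate_descent) are illustrative assumptions, not prescribed by the course.

```python
# Illustrative sketch (not official course material): cyclic coordinate descent
# for the lasso objective
#   (1 / (2n)) * ||y - X @ beta||^2 + lam * ||beta||_1,
# updating one coordinate at a time via soft-thresholding.
import numpy as np


def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso (no intercept, illustrative settings)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature x_j'x_j / n
    residual = y - X @ beta
    for _ in range(n_iter):
        for j in range(p):
            # Form the partial residual that excludes coordinate j
            residual += X[:, j] * beta[j]
            rho = X[:, j] @ residual / n
            # Closed-form coordinate update: soft-threshold, then rescale
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
            residual -= X[:, j] * beta[j]
    return beta


if __name__ == "__main__":
    # Small synthetic example: sparse true coefficients plus noise
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    true_beta = np.zeros(10)
    true_beta[:3] = [3.0, -2.0, 1.5]
    y = X @ true_beta + 0.1 * rng.standard_normal(200)
    print(lasso_coordinate_descent(X, y, lam=0.1))
```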