This course covers the fundamentals of Machine Learning, i.e. the design of programs whose behavior is learned from data rather than implemented by hand.
The course offers a gentle introduction to the topic, but strives to give enough detail and intuition to explain state-of-the-art ML approaches: ensembles of Decision Trees (Boosted Trees, Random Forests) and Neural Networks.
Starting with simple linear and Bayesian models, we introduce the concepts of trainable models, data-driven model selection, practical and theoretical ways of estimating model performance on new data, and the difference between discriminative and generative training. The course presents mainstream algorithms for classification and regression, including linear models, Naive Bayes, decision trees, and ensembles. Practical sessions provide hands-on experience in building models and familiarize students with popular Python-based tools such as numpy, pandas, and pytorch.
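As an illustration of the kind of hands-on exercise the practical sessions involve (a minimal sketch, not actual course material; the synthetic data and hyperparameters below are assumptions), here is a tiny pytorch example that trains a linear model by gradient descent:

    # Illustrative sketch: fit y = w*x + b on synthetic data with pytorch.
    import torch

    torch.manual_seed(0)
    x = torch.linspace(-1, 1, 100).unsqueeze(1)          # 100 inputs in [-1, 1]
    y = 3 * x + 2 + 0.1 * torch.randn_like(x)            # noisy targets, true w=3, b=2

    model = torch.nn.Linear(1, 1)                        # trainable model: one weight, one bias
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = torch.nn.MSELoss()

    for epoch in range(200):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)                      # mean squared error on the training data
        loss.backward()                                  # gradients via autograd
        optimizer.step()                                 # gradient-descent update

    print(model.weight.item(), model.bias.item())        # estimates should approach 3 and 2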