This course will be mostly blackboard-based, focusing on the statistical foundations of Machine Learning. Many numerical examples will be given in Python notebooks, for illustration and for further use by the attendees, and two hands-on sessions are planned. The background required to make the most of these lectures is a good level of linear algebra, as well as familiarity with basic concepts of statistics (probability distributions, likelihood, Bayes' theorem, etc.).
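As a taste of the kind of numerical illustration the notebooks will contain, here is a minimal sketch of Bayes' theorem in Python (the medical-test scenario and all numbers below are hypothetical, chosen only for illustration; they are not taken from the course materials):

```python
# Bayes' theorem on a classic example: a test for a rare condition.
prior = 0.01        # P(condition): prevalence in the population
sensitivity = 0.95  # P(positive | condition)
false_pos = 0.05    # P(positive | no condition)

# P(condition | positive) = P(positive | condition) P(condition) / P(positive),
# where P(positive) is obtained by marginalizing over both hypotheses.
evidence = sensitivity * prior + false_pos * (1 - prior)
posterior = sensitivity * prior / evidence

print(f"P(condition | positive test) = {posterior:.3f}")
```

Despite the 95% sensitivity, the posterior comes out to roughly 0.161, a standard illustration of how a low prior dominates the inference.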
Preliminary outline (it may change slightly):
Lecture 1: review of ML + summary of statistics
Lecture 2: regression and overfitting control
Lecture 3: Bayesian learning
Lecture 4: classification
Lecture 5 (hands-on session): classification
Lecture 6: neural networks
Lecture 7 (hands-on session): neural networks
Lecture 8: non-parametric methods
Lecture 9: Gaussian processes
Lecture 10: unsupervised learning
Timetable:
Lecture 1: May 2nd (Tue), 15:00-17:00
Lectures 2 & 3: May 3rd (Wed), 11:00-13:00 & 15:00-17:00
Lectures 4 & 5: May 4th (Thu), 11:00-13:00 & 15:00-17:00
Lecture 6: May 5th (Fri), 15:00-17:00
Lectures 7 & 8: May 8th (Mon), 11:00-13:00 & 15:00-17:00
Lecture 9: May 9th (Tue), 15:00-17:00
Lecture 10: May 10th (Wed), 11:00-13:00
Bryan Zaldívar