statistics & machine learning

Similar to my notes on measure theory, I’ve decided to review some important methods and concepts in machine learning (and statistics). These notes will pull from a number of sources, but will focus mainly on Probabilistic Machine Learning by Kevin P. Murphy and The Elements of Statistical Learning by Hastie, Tibshirani, and Friedman, with supplementary material from Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein as well as Foundations of Machine Learning by Mohri, Rostamizadeh, and Talwalkar.

Rather than working cover to cover through any of the referenced texts, these notes will consist of posts dedicated to the concepts I find most foundational or interesting. Hopefully, each post will be thorough enough to give readers (and especially my future self) both an intuitive and an in-depth understanding of the subject matter. Although the ultimate goal is for each post to be relatively self-contained, I will assume a very basic familiarity with statistical learning (e.g. distributions, parameters, etc.).

Note: Not all of the proofs are finished or included. I am hoping to find the time to return to this post and complete them.