Foundations for Machine Learning
This course offers a practical and theoretical tour of the probability distributions most often used in modern machine learning and data science. We explore the fundamental building blocks for modeling discrete events (the Bernoulli, binomial, and multinomial distributions) and continuous quantities (the Gaussian distribution), and discuss the implications of Bayes' theorem. We also contrast two perspectives on estimating model parameters, the Bayesian and the frequentist, and learn how to reason about uncertainty in the model parameters themselves.
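To give a flavor of the Bayesian-versus-frequentist contrast the course covers, here is a minimal sketch (illustrative only, not course material) estimating a Bernoulli parameter p from coin flips. The function names and the choice of a Beta prior are assumptions made for this example.

```python
def mle_bernoulli(flips):
    """Frequentist point estimate: the maximum-likelihood p is the sample mean."""
    return sum(flips) / len(flips)

def bayes_bernoulli(flips, alpha=1.0, beta=1.0):
    """Bayesian estimate: with a Beta(alpha, beta) prior, the posterior is
    Beta(alpha + heads, beta + tails). Returns the posterior mean and
    variance; the variance quantifies uncertainty about p itself."""
    heads = sum(flips)
    tails = len(flips) - heads
    a, b = alpha + heads, beta + tails
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

flips = [1, 1, 1, 0]            # three heads, one tail
print(mle_bernoulli(flips))      # 0.75
mean, var = bayes_bernoulli(flips)
print(mean)                      # ~0.667, pulled toward the uniform prior
```

The frequentist estimate is a single number, while the Bayesian posterior carries a variance, illustrating what it means to reason about uncertainty in the parameters themselves.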
Watch on Coursera ↗
DeepCamp AI