Farzad Farnoud

ECE 6501/4502, CS 6501: Probabilistic Machine Learning (Fall 2025)

Welcome! In this course, we’ll study estimation and machine learning from a probabilistic point of view.

Why a probabilistic view?

Information and uncertainty, which underlie both statistical estimation and machine learning, can be represented via probability in a robust and versatile way. Unknown quantities can be cast as random variables and their relationships to each other and to known information as joint distributions. This provides a unifying framework for setting up estimation and machine learning problems, where we can state our assumptions clearly, design methods, and evaluate performance.
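As a small illustration of this framing, the disease example below casts the unknown quantity (disease status) as a random variable and its relationship to the observed data (a test result) as a joint distribution, then conditions on the observation via Bayes' rule. All numbers are made up for illustration.

```python
# Unknown quantity: disease status D. Observation: test result T.
# The numbers below are illustrative assumptions, not real statistics.
p_disease = 0.01          # prior P(D = 1)
p_pos_given_d = 0.95      # likelihood P(T = 1 | D = 1)
p_pos_given_not_d = 0.05  # likelihood P(T = 1 | D = 0)

# Build the relevant entries of the joint distribution over (D, T),
# then condition on T = 1 (Bayes' rule).
p_joint_d_pos = p_disease * p_pos_given_d
p_joint_notd_pos = (1 - p_disease) * p_pos_given_not_d
posterior = p_joint_d_pos / (p_joint_d_pos + p_joint_notd_pos)

print(f"P(disease | positive test) = {posterior:.3f}")  # about 0.161
```

Even with an accurate test, the posterior probability of disease is modest here because the prior is small; making such assumptions explicit is exactly what the probabilistic framework buys us.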

What topics will we study?

We will start with estimation, which can be defined as the problem of learning about the world from data (e.g., finding the chance of getting a disease given one’s genetic make-up) or drawing conclusions about relationships (e.g., what are the best predictors of academic success?). We will then learn about machine learning problems such as regression and classification, where the goal is to predict an unknown quantity, e.g., the price of a house, based on some relevant information. We will also learn how to deal with situations when part of the data is missing. Finally, we will discuss computational methods, which help tackle difficult problems via approximation.
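A minimal sketch of the estimation problem, using synthetic data: we observe coin flips generated from an unknown bias and recover that bias by maximum likelihood (one of the frequentist methods covered later).

```python
import random

random.seed(0)

# Synthetic data: 1000 flips of a coin with unknown bias (here, 0.7).
true_theta = 0.7
data = [1 if random.random() < true_theta else 0 for _ in range(1000)]

# For i.i.d. Bernoulli observations, the maximum-likelihood estimate
# of the bias is simply the sample mean.
theta_hat = sum(data) / len(data)
print(f"MLE estimate of the bias: {theta_hat:.3f}")
```

With 1000 samples the estimate lands close to the true bias; quantifying how close, and how that depends on the amount of data, is part of what the course addresses.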

Course objectives:

  1. Use joint distributions and graphical models to describe relationships between known and unknown quantities
  2. Describe, identify, and apply frequentist and Bayesian estimation methods
  3. Construct and apply learning models
  4. Apply computational methods such as expectation-maximization and Monte Carlo sampling
  5. Perform approximate inference using variational methods
  6. Quantify fundamental limits on estimation and learning given available data
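As a taste of objective 4, the sketch below uses Monte Carlo sampling to approximate an expectation by averaging over random samples, a strategy that scales to problems where exact computation is intractable. The target expectation here is chosen only because its exact value is known, so the approximation can be checked.

```python
import math
import random

random.seed(1)

# Estimate E[cos(X)] for X ~ N(0, 1) by averaging over samples.
# The exact value is exp(-1/2), which lets us check the approximation.
n = 100_000
samples = [random.gauss(0.0, 1.0) for _ in range(n)]
estimate = sum(math.cos(x) for x in samples) / n

print(f"Monte Carlo estimate: {estimate:.3f} (exact: {math.exp(-0.5):.3f})")
```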

Prerequisites:

What will help you excel:

The most important factor is staying engaged with the class. In particular, office hours are often underutilized. Ask questions in class as they arise; otherwise, you may lose the thread of the lecture and struggle with the material that follows. Reach out to the instructors and the TA when you need help.

Note to undergraduate students: You do not need instructor permission to enroll in this course. However, fluency in probability is an essential prerequisite; if your foundation in probability is not strong, you will not be able to fully benefit from the course.

Activities and Grading Scheme:

Tentative Grading Scheme:

HW/Labs = 50%; Quizzes/In-class activities = 20%; Exam = 20%; Project = 10%

Comments:

Course Notes:

Links to the complete notes and to individual chapters are listed below and are updated as needed; the chapter notes may be more up to date than the complete set. To ensure you always access the latest versions, view them via the links below rather than downloading local copies.

  • Estimation and Probabilistic Learning
    1. Review of Probability
    2. Probability, Inference, and Learning
    3. Frequentist Parameter Estimation
    4. Bayesian Parameter Estimation
    5. Multivariate Random Variables
    6. Linear Regression
    7. Linear Classification
    8. Expectation-Maximization
    9. Basics of Graphical Models
    10. Independence in Graphical Models
    11. Parameter Estimation in Graphical Models
    12. Inference in Graphical Models
    13. Inference in Hidden Markov Models
    14. Factor Graphs and Sum/Max-product Algorithms
    15. Markov Chains
    16. Sampling Methods
    17. Variational Inference
    18. Appendix

Textbooks and other resources:

The main resources are the lectures and the PDF notes posted on this page, but you may find the following useful:

  1. Probabilistic Machine Learning: An Introduction by Kevin P. Murphy, 2022.
  2. Deep Learning by Ian Goodfellow et al., 2016.
  3. Information Theory, Inference, and Learning Algorithms by David MacKay, 2003.
  4. Pattern Recognition and Machine Learning by Christopher M. Bishop, 2006.
  5. Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and Nir Friedman, 2009.