Machine Learning for Discrete Optimization
Course information
- Quarter: Spring 2024
- Lecture time: Mondays and Wednesdays 1:30-2:50pm
- Lecture location: 200-203
- Instructor: Ellen Vitercik
  - Office hours: Wednesdays, 3-4pm in Huang 250
- CAs:
  - Yanis Miraoui
    - Office hours: Mondays, 9-11am in Sequoia 207
  - Paul Woringer
    - Office hours: Tuesdays, 3-5pm in Huang B016
- Prerequisites: An introductory course in algorithms/optimization (e.g., CS 161 or MS&E 111/211) and an introductory course in machine learning (e.g., CS 221 or CS 229). Students should be familiar with basic feed-forward neural networks (check out pages 5-9 of these lecture notes for a refresher; a minimal sketch also appears below this list).
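
For students who want a quick refresher on the feed-forward network prerequisite, here is a minimal sketch of the forward pass of a fully connected network. The numpy implementation, ReLU activation, and layer sizes are illustrative assumptions, not course material; see the linked lecture notes for the authoritative treatment.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def feed_forward(x, weights, biases):
    """One forward pass through a fully connected network with ReLU hidden layers."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)               # hidden layers: affine map followed by ReLU
    return weights[-1] @ h + biases[-1]   # linear output layer (no activation)

# Toy example: a 3 -> 4 -> 2 network with random parameters (sizes chosen arbitrarily)
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
biases = [np.zeros(4), np.zeros(2)]
print(feed_forward(rng.standard_normal(3), weights, biases))
```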
Description
Machine learning has become a powerful tool for discrete optimization. This is because, in practice, we often have ample data about the application domain: data that can be used to optimize algorithmic performance, from runtime to solution quality. This course covers how machine learning can be used within the discrete optimization pipeline from many perspectives, including how to design novel combinatorial algorithms with machine-learned modules and how to configure existing algorithms' parameters to optimize performance. Topics will include both applied machinery (such as graph neural networks, reinforcement learning, transformers, and LLMs) and theoretical tools for providing provable guarantees.
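As a purely hypothetical illustration of the algorithm-configuration theme mentioned above, the sketch below selects, from a grid of candidate parameter settings, the one with the lowest average cost on a sample of training instances. The `simulated_runtime` function is a made-up stand-in for a real solver's measured performance, and the instance distribution and parameter grid are arbitrary.

```python
import random

def simulated_runtime(param, instance):
    # Hypothetical stand-in for a solver's measured runtime with parameter `param`
    # on a problem `instance`; a real pipeline would run an actual solver here.
    return (param - instance) ** 2 + 1.0

def best_parameter(candidates, training_instances):
    """Return the candidate parameter with the lowest average cost on the sample."""
    return min(
        candidates,
        key=lambda p: sum(simulated_runtime(p, inst) for inst in training_instances)
        / len(training_instances),
    )

random.seed(0)
instances = [random.uniform(0, 10) for _ in range(50)]  # sample of instances from the application domain
candidates = [i / 2 for i in range(21)]                 # grid of parameter settings 0.0, 0.5, ..., 10.0
print(best_parameter(candidates, instances))
```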
Course activities
- Lectures: Lectures will cover the key technical tools used to develop and analyze machine learning approaches to discrete optimization.
- Homework: There will be three homeworks that include both applied coding problems (in Python) and theoretical exercises.
- Project: Students will complete a course project, in a group or individually.
Grading
Grading will be out of 100 points as follows:
- Participation: 10 points
- Homeworks: 45 points
- Project: 45 points
Related courses
I taught a 300-level PhD seminar on these topics in 2023 [link].