News
 Apr 11: Assignment 3 is now posted
 Mar 7: Assignment 2 is now posted
 Jan 31: Assignment 1 is now posted
 Jan 24: Lecture recordings will be uploaded to Piazza → Resources
 Jan 23: Classes will be online over Zoom (939 9035 1363)
Information
 Lectures

 Mon 2:30–4:15pm (LSB C1 → Zoom)
 Tue 3:30–4:15pm (MMW 702 → Zoom)
 Zoom lectures
 No passcode, but CUHK-authenticated Zoom accounts are required to join
 Instructor
 Siu On CHAN Office hours: Tue 4:30–6:30pm (Zoom 984 7747 9788)
 Teaching assistant

Yichao LI
yichaoli21@cse
 Prerequisites
 Linear algebra, Probability, Discrete Math, Algorithm Design and Analysis
 References
 [BV] Convex Optimization, Stephen Boyd and Lieven Vandenberghe (available online)
 [GM] Approximation Algorithms and Semidefinite Programming, Bernd Gärtner and Jiří Matoušek (accessible online at the university library webpage)
 [S] Spectral and Algebraic Graph Theory, Daniel A. Spielman (available online)
 Additional references
 [WS] The Design of Approximation Algorithms, David P. Williamson and David B. Shmoys (available online)
 [G] Foundations of Optimization, Osman Güler (accessible online at the university library webpage)
 Forum
 Please sign up on Piazza
 Grading
 50% homework, 50% project
Topics
This course will study the design and analysis of approximation algorithms for various optimization problems. We will cover spectral algorithms, semidefinite programs, and related convex programs. Here are tentative topics:
 Semidefinite programs
 SDP duality
 Multiplicative weight update
 Graph spectrum
 Eigenvalue interlacing
 Cheeger–Alon–Milman inequality
 Random walks
 Local graph partitioning
 Expanders
 Laplacian solver
 Effective resistance
 Sparsification
 Matrix scaling
 Abstract simplicial complex
 Random spanning trees
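As a small taste of the spectral topics above, here is a minimal Python sketch (using numpy; not part of the course materials, and the 4-cycle graph is just an illustrative choice). It builds the normalized Laplacian of the 4-cycle, computes its second-smallest eigenvalue λ₂, and checks the easy direction of the Cheeger–Alon–Milman inequality, λ₂/2 ≤ φ(G) ≤ φ(S), against the conductance of one cut:

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4 (toy example).
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
], dtype=float)

d = A.sum(axis=1)                  # vertex degrees (all 2 here)
L = np.diag(d) - A                 # combinatorial Laplacian D - A
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
N = D_inv_sqrt @ L @ D_inv_sqrt    # normalized Laplacian D^{-1/2} L D^{-1/2}

eigs = np.sort(np.linalg.eigvalsh(N))
lam2 = eigs[1]                     # second-smallest eigenvalue

# Conductance of the cut S = {0, 1}: edges leaving S over volume of S.
S = [0, 1]
cut_edges = sum(A[i, j] for i in S for j in range(4) if j not in S)
phi = cut_edges / d[S].sum()

# Easy side of Cheeger–Alon–Milman: lam2 / 2 <= phi(G) <= phi(S).
assert lam2 / 2 <= phi + 1e-9
print(lam2, phi)
```

For C4 the bound is tight: λ₂ = 1 and the cut {0, 1} has conductance 1/2, matching λ₂/2 exactly.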
Lectures
Lecture recordings will be posted on Piazza → Resources after classes
Date  Topic  Readings  Notes  

1  Jan 10  Positive semidefinite matrices  [GM] Ch. 2  
2  Jan 11  Semidefinite programs and Goemans–Williamson  [GM] Ch. 1  
3  Jan 17  Separating hyperplane theorems, Polar sets  [GM] Ch. 4  
4  Jan 18  Conjugate function  [BV] §3.3  
5  Jan 24  Dual program  [BV] §4.1.1,§4.2,§5.1  
6  Jan 25  Strong duality  [BV] §5.1,§5.2,§5.3  
Jan 31  Lunar New Year Holiday  
Feb 1  Lunar New Year Holiday  
7  Feb 7  Complementary slackness; KKT conditions  [BV] §5.4, §5.5  
8  Feb 8  Multiplicative Weight Update  Arora–Hazan–Kale  
9  Feb 14  Graph spectrum and Laplacian  [S] Ch. 2.1, 3.1, 4.1–4.3  
10  Feb 15  Conductance, expansion, normalized Laplacian  [S] Ch. 20  
11  Feb 21  Cheeger–Alon–Milman inequality  [S] Ch. 21  
12  Feb 22  Cheeger–Alon–Milman inequality  [S] Ch. 21  
13  Feb 28  Random walk  [S] Ch. 10  
14  Mar 1  Random walk  [S] Ch. 10  
15  Mar 7  Local graph partitioning  [S] Ch. 22  
16  Mar 8  Local graph partitioning  [S] Ch. 22  
17  Mar 14  Expanders  [S] Ch. 27  
18  Mar 15  Electrical flow and Laplace equation  [S] Ch. 11  
19  Mar 21  Effective resistance  [S] Ch. 12  
20  Mar 22  Effective resistance  [S] Ch. 12  
21  Mar 28  Graph sparsification  [S] Ch. 32  
22  Mar 29  Sampling spanning trees  Lau’s lecture  
23  Apr 4  Sampling spanning trees  
Apr 5  Ching Ming Festival  
24  Apr 11  Sampling spanning trees  
25  Apr 12  High-dimensional expander  Lau’s lecture  
Apr 18  Easter Monday  
26  Apr 19  High-dimensional expander  
Homework assignment
There will be three to four homework assignments
Submit your answers as a single PDF file via Blackboard by 11:59pm on the due date
You are encouraged to either typeset your solution in LaTeX or Word, or scan or take photos of your handwritten solution (please bundle your scans/photos into a single PDF file)
Collaboration on homework and consulting references is encouraged, but you must write up your own solutions in your own words, and list your collaborators and your references on the solution sheet
 Assignment 1 due Fri Feb 18 23:59
 Assignment 2 due Fri Mar 25 23:59
 Assignment 3 due Fri Apr 29 23:59
Project
The course project involves writing a short report (5–10 pages) related to approximation algorithms in an area of your interest. The report may be your exposition of a paper from the list below (or any similar paper), or a summary of your new follow-up results. Focus on results with rigorous, provable guarantees.
You are encouraged to include in your report any illustrative, concrete examples that helped you understand the papers you read.
Please submit your project report online as a single PDF file at blackboard.cuhk.edu.hk by May 20 23:59
If your report is a summary of a paper you read, it should highlight the following:
 What is the main result? A typical paper contains one main result and several corollaries or extensions. Focus on the main result in your report, unless you find a particular extension interesting.
 If the main result is very general, can you illustrate it with interesting special cases?
 What are the key new ideas? Can you illustrate them with concrete examples? How did the authors come up with these ideas?
 What have you learned? How might the key ideas or new results be useful in your own research?
 Are key inequalities in the papers tight (perhaps up to constant factors)? If so, can you come up with tight examples? If not, what do you think are the correct bounds?
Write your summary as if you are explaining the main result and its key ideas to other students in this class, or to any graduate student with a solid background in math.
Explain in your own words, in language as simple as possible: the more you can do so, the better you understand the papers.
Paper suggestions:
 Online algorithms
 Optimal Algorithms for Online b-Matching with Variable Vertex Capacities, Susanne Albers, Sebastian Schubert
 Robust Online Convex Optimization in the Presence of Outliers, Tim van Erven, Sarah Sachs, Wouter M. Koolen, Wojciech Kotłowski
 Lazy OCO: Online Convex Optimization on a Switching Budget, Uri Sherman, Tomer Koren
 Efficient Methods for Online Multiclass Logistic Regression, Naman Agarwal, Satyen Kale, Julian Zimmert
 Implicit Parameter-free Online Learning with Truncated Linear Models, Keyi Chen, Ashok Cutkosky, Francesco Orabona
 Multiplicative weight update
 Fast Approximation Algorithms for Bounded Degree and Crossing Spanning Tree Problems, Chandra Chekuri, Kent Quanrud, Manuel R. Torres
 Other approximation algorithms
 MinSum Clustering (with Outliers), Sandip Banerjee, Rafail Ostrovsky, Yuval Rabani
 Approximation Algorithms for Socially Fair Clustering, Yury Makarychev, Ali Vakilian
 Learning
 A Fast Spectral Algorithm for Mean Estimation with Sub-Gaussian Rates, Zhixian Lei, Kyle Luh, Prayaag Venkat, Fred Zhang
 Near Optimal Distributed Learning of Halfspaces with Two Parties, Mark Braverman, Gillat Kol, Shay Moran, Raghuvansh R. Saxena
 Minimax Regret for Stochastic Shortest Path with Adversarial Costs and Known Transition, Liyu Chen, Haipeng Luo, Chen-Yu Wei
 Characterizing the implicit bias via a primaldual analysis, Ziwei Ji, Matus Telgarsky
 Algorithms for learning a mixture of linear classifiers, Aidao Chen, Anindya De, Aravindan Vijayaraghavan
 Concentration
 Scalar and Matrix Chernoff Bounds from ℓ∞-Independence, Tali Kaufman, Rasmus Kyng, Federico Soldà
 High-dimensional expander
 LogConcave Polynomials IV: Exchange Properties, Tight Mixing Times, and Faster Sampling of Spanning Trees, Nima Anari, Kuikui Liu, Shayan Oveis Gharan, Cynthia Vinzant
 Isotropy and LogConcave Polynomials: Accelerated Sampling and HighPrecision Counting of Matroid Bases, Nima Anari, Michał Dereziński
 Spectral Independence in HighDimensional Expanders and Applications to the Hardcore Model, Nima Anari, Kuikui Liu, Shayan Oveis Gharan
 From Coupling to Spectral Independence and Blackbox Comparison with the Down-Up Walk, Kuikui Liu
 A Matrix TrickleDown Theorem on Simplicial Complexes and Applications to Sampling Colorings, Dorna Abdolazimi, Kuikui Liu, Shayan Oveis Gharan
 Locally Testable Codes with Constant Rate, Distance, and Locality, Irit Dinur, Shai Evra, Ron Livne, Alexander Lubotzky, Shahar Mozes