CSCI5160 Approximation Algorithms — Spring 2017
Siu On CHAN  Office hours: Tuesday 3–5pm, SHB 911
Teaching assistant
 Yuzhe MA yzma@link
News
 Apr 19: Project report will be due May 12 at 23:59. Please submit your report electronically to siuon@cse.
 Apr 14: Homework 2 deadline is now postponed to Apr 23 (Sunday) 23:59.
 Apr 10: Homework 2 is now posted.
 Mar 4: Homework 1 is now posted.
 Jan 13: There will be makeup classes during the tutorial session time slots on Jan 25, Feb 8 and Feb 15
 Jan 13: Please sign up on Piazza to join our discussion forum
Information
 Lectures
 Wed 10:30–11:15am (LSB G34); Thu 10:30am–12:15pm (MMW 705)
 References
 [WS] The Design of Approximation Algorithms, David P. Williamson and David B. Shmoys (available online)
 [BV] Convex Optimization, Stephen Boyd and Lieven Vandenberghe (available online)
 [G] Foundations of Optimization, Osman Güler (accessible online at university library webpage)
 [GM] Approximation Algorithms and Semidefinite Programming, Bernd Gärtner and Jiří Matoušek (accessible online at university library webpage)
 Forum
 Please sign up on Piazza
 Grading
 50% homework, 50% project, 5% bonus for scribe notes
Topics
This course will study the design and analysis of approximation algorithms for various optimization problems. We will cover spectral algorithms, semidefinite programs, and related convex programs. Here are tentative topics:
 Semidefinite programs
 SDP duality
 Multiplicative weight update
 Graph spectrum
 Cheeger–Alon–Milman inequality
 Random walks
 Local graph partitioning
 Expanders
 Laplacian solver
 Effective resistance
 Sparsification
 Determinantal measures
 strong Rayleigh measures
 Random spanning trees
 Negative dependence
 Lasserre hierarchy
 Integrality gaps
 Extended formulations
 Hardness of approximation
 Parallel repetition
 Small-set expansion
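As a small taste of the graph-spectrum topics above, the sketch below (our own illustration, not part of the course materials) numerically checks the Cheeger–Alon–Milman inequality λ₂/2 ≤ φ(G) ≤ √(2λ₂) on a 6-cycle, where λ₂ is the second-smallest eigenvalue of the normalized Laplacian and φ(G) is the conductance; all function names are our own choices.

```python
import itertools
import numpy as np

def normalized_laplacian(A):
    """L = I - D^{-1/2} A D^{-1/2} for adjacency matrix A (no isolated vertices)."""
    d = A.sum(axis=1)
    Dinv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.eye(len(A)) - Dinv_sqrt @ A @ Dinv_sqrt

def conductance(A, S):
    """phi(S) = (edges leaving S) / min(vol(S), vol(complement of S))."""
    S = np.asarray(sorted(S))
    T = np.setdiff1d(np.arange(len(A)), S)
    cut = A[np.ix_(S, T)].sum()          # each crossing edge counted once
    return cut / min(A[S].sum(), A[T].sum())

# 6-cycle: lambda_2 = 1 - cos(2*pi/6) = 1/2; best cut is 3 consecutive
# vertices, with conductance 2/6 = 1/3.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

lam2 = np.sort(np.linalg.eigvalsh(normalized_laplacian(A)))[1]
phi = min(conductance(A, S)
          for k in range(1, n)
          for S in itertools.combinations(range(n), k))

assert lam2 / 2 <= phi <= np.sqrt(2 * lam2)
```

Here λ₂ = 1/2 and φ = 1/3, so both sides of the inequality hold with room to spare (1/4 ≤ 1/3 ≤ 1).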
Lectures
#  Date  Topic  Readings  Notes
1  Jan 11  Semidefinite programs  [GM] Ch. 2  Notes 1 
2  Jan 12  Goemans–Williamson rounding for MaxCut  [GM] Ch. 1  Notes 1 & 2 
Jan 18  No class (conference travel)  
Jan 19  No class (conference travel)  
3  Jan 25  Duality (makeup class during tutorial)  [GM] Ch. 4  Notes 3 
4  Jan 26  SDP duality  [GM] Ch. 4  
Feb 1  Lunar New Year Holiday  
Feb 2  Lunar New Year Holiday  
5  Feb 8  Multiplicative weights and minimizing regret (makeup class during tutorial)  Arora–Hazan–Kale  
6  Feb 9  Matrix multiplicative weights  Arora–Kale  
7  Feb 15  Graph spectrum and Laplacian (makeup class during tutorial)  Lau’s lecture  
8  Feb 16  Cheeger–Alon–Milman inequality  Lau’s lecture  
9  Feb 22  Spectral embedding  Spielman’s lecture  
10  Feb 23  Random walks  Lau’s lecture  
11  Mar 1  Random walks  
12  Mar 2  Expanders  Lau’s lecture  
13  Mar 8  Local graph partitioning  Lau’s lecture  
14  Mar 9  Local graph partitioning  
15  Mar 15  Small-set expanders and hypercontractivity  O’Donnell §9.4 & Cor. 8  
16  Mar 16  Effective resistance  Spielman’s lectures 7 & 8  
17  Mar 22  Effective resistance  Lau’s lecture  
18  Mar 23  Graph sparsification  Spielman’s lecture  
19  Mar 29  Laplacian solver  Lau’s lecture & Spielman’s lecture  
20  Mar 30  Random spanning trees  Oveis Gharan’s lecture  
21  Apr 5  Random spanning trees  
22  Apr 6  Determinantal measures and negative dependence  Oveis Gharan’s lecture  
23  Apr 12  Traveling salesperson  Oveis Gharan’s lecture  
24  Apr 13  John ellipsoid  Lau’s lecture  
25  Apr 21  John ellipsoid  
26  Apr 22  Largest simplex  Nikolov 
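The hyperplane rounding from lecture 2 can be sketched in a few lines. The toy below (our own illustration, not course material) skips the SDP solve: for the 5-cycle we plug in the known optimal SDP vectors, unit vectors with angle 4π/5 between neighbors, and round them with random hyperplanes; all names are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def hyperplane_round(V, edges, trials=200):
    """Goemans-Williamson rounding: cut by random hyperplanes, keep the best cut."""
    best = 0
    for _ in range(trials):
        g = rng.normal(size=V.shape[1])     # random hyperplane normal
        side = V @ g >= 0                   # which half-space each vertex falls in
        best = max(best, sum(side[u] != side[v] for u, v in edges))
    return best

# 5-cycle: Max-Cut = 4; the optimal SDP vectors lie on a circle,
# neighbors separated by angle 4*pi/5.
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
theta = 4 * np.pi * np.arange(n) / n
V = np.column_stack([np.cos(theta), np.sin(theta)])

best = hyperplane_round(V, edges)
```

On this instance every hyperplane splits the five vectors into two arcs of two and three vertices, each of which is already an optimal cut of value 4; in general one only gets the 0.878 approximation guarantee in expectation.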
Homeworks
There will be two to three homeworks.
Collaboration on homework and consulting references is encouraged, but you must write up your own solutions in your own words, and list your collaborators and your references on the solution sheet.
Homework 1 due Thu Mar 16 in class
Homework 2 due Sun Apr 23 at 23:59 (updated on Apr 17 to fix typos and a mistake in Q5(b)). Please submit to the assignment box on the 9th floor of the Ho Sin-Hang Engineering Building.
Project
The course project involves writing a short report (5–10 pages) related to approximation algorithms in an area of your interest. The report may be your exposition of a paper from the list below (or any similar paper), or a summary of your new follow-up results. You are encouraged to include in your report any illustrative, concrete examples that help you understand the papers you read.
 Fast Laplacian solver / maxflow
 A Framework for Analyzing Resparsification Algorithms, Rasmus Kyng, Jakub Pachocki, Richard Peng, Sushant Sachdeva
 Almost-Linear-Time Algorithms for Markov Chains and New Spectral Primitives for Directed Graphs, Michael B. Cohen, Jonathan Kelner, John Peebles, Richard Peng, Anup Rao, Aaron Sidford, Adrian Vladu
 The Expected Norm of a Sum of Independent Random Matrices: An Elementary Approach, Joel A. Tropp
 Approximate Undirected Maximum Flows in O(m polylog(n)) Time, Richard Peng
 Computing Maximum Flow with Augmenting Electrical Flows, Aleksander Mądry
 Sparsification / expanders
 Twice-Ramanujan Sparsifiers, Joshua Batson, Daniel A. Spielman, Nikhil Srivastava
 Spectral Sparsification and Regret Minimization Beyond Matrix Multiplicative Updates, Zeyuan Allen-Zhu, Zhenyu Liao, Lorenzo Orecchia
 Lifts, Discrepancy and Nearly Optimal Spectral Gap, Yonatan Bilu, Nathan Linial
 Ramanujan Graphs in Polynomial Time, Michael B. Cohen
 Graph partitioning
 Local Graph Partitioning using PageRank Vectors, Reid Andersen, Fan Chung, Kevin Lang
 Improved Cheeger’s Inequality: Analysis of Spectral Partitioning Algorithms through Higher Order Spectral Gap, Tsz Chiu Kwok, Lap Chi Lau, Yin Tat Lee, Shayan Oveis Gharan, Luca Trevisan
 Local Flow Partitioning for Faster Edge Connectivity, Monika Henzinger, Satish Rao, Di Wang
 Minimizing discrepancy
 Deterministic Discrepancy Minimization via the Multiplicative Weight Update Method, Avi Levy, Harishchandra Ramadas, Thomas Rothvoss
 Algorithmic Discrepancy Beyond Partial Coloring, Nikhil Bansal, Shashwat Garg
 A Logarithmic Additive Integrality Gap for Bin Packing, Rebecca Hoberg, Thomas Rothvoss
 Semidefinite programs and hierarchies
 Lecture Notes on the ARV Algorithm for Sparsest Cut, Thomas Rothvoss
 On the Lovász Theta function for Independent Sets in Sparse Graphs, Nikhil Bansal, Anupam Gupta, Guru Guruganesh
 How to refute a random CSP, Sarah R. Allen, Ryan O’Donnell, David Witmer
 Sum of squares lower bounds for refuting any CSP, Pravesh K. Kothari, Ryuhei Mori, Ryan O’Donnell, David Witmer
 Machine learning / sparse approximation
 Optimal Column-Based Low-Rank Matrix Reconstruction, Venkatesan Guruswami, Ali Kemal Sinop
 A Matrix Factorization Approach for Learning Semidefinite-Representable Regularizers, Yong Sheng Soh, Venkat Chandrasekaran
 Noisy Tensor Completion via the Sum-of-Squares Hierarchy, Boaz Barak, Ankur Moitra
 Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization, Alekh Agarwal, Peter L. Bartlett, Pradeep Ravikumar, Martin J. Wainwright
 Online algorithms
 A Primal-Dual Randomized Algorithm for Weighted Paging, Nikhil Bansal, Niv Buchbinder, Joseph (Seffi) Naor
 Online Algorithms for Covering and Packing Problems with Convex Objectives, Yossi Azar, Niv Buchbinder, T-H. Hubert Chan, Shahar Chen, Ilan Reuven Cohen, Anupam Gupta, Zhiyi Huang, Ning Kang, Viswanath Nagarajan, Joseph (Seffi) Naor, Debmalya Panigrahi
 Hypercontractive inequality
 Robust Optimality of Gaussian Noise Stability, Elchanan Mossel, Joe Neeman
 Hypercontractive inequalities via SOS, and the Frankl–Rödl graph, Manuel Kauers, Ryan O’Donnell, Li-Yang Tan, Yuan Zhou