Title: Better Algorithms and Generalization Performance for Structured Data
Date: February 27, 2019 (Wednesday)
Time: 11:15 am - 12:15 pm
Venue: Room 121, 1/F, Ho Sin-Hang Engineering Building, The Chinese University of Hong Kong, Shatin, N.T.
Speaker: Dr. Hongyang ZHANG
Stanford University


Dealing with large-scale data from modern social and Web systems has recently posed interesting challenges for algorithm design and machine learning. Formalizing such challenges often requires better modeling of the underlying data, as well as better modeling of the optimization paradigm used in practice. My research aims to provide new algorithms and better models for these settings.

This talk will present a few results. First, we study non-convex methods and their generalization performance (or sample efficiency) for common ML tasks. We consider over-parameterized models such as matrix and tensor factorizations. This is motivated by the curious observation that, in practice, neural networks are often trained with more parameters than the number of observations. We show that in this setting the generalization performance crucially depends on the initialization. Meanwhile, adding parameters helps optimization by avoiding bad local minima. Next, we consider the problem of predicting the missing entries of tensors. We show that understanding the generalization performance can inform the choice of tensor models for this task. Lastly, we revisit the distance sketching problem on large graphs. We provide new insight into this classic problem by formalizing the structures of social network data. Our results help explain the empirical success achieved by recent work.
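To make the over-parameterization setting concrete, the following is a minimal sketch (not the speaker's actual method or experiments): gradient descent fits a rank-r factorization U V^T to a low-rank matrix, where r exceeds the true rank and the factors start from a small random initialization. All dimensions, step sizes, and iteration counts here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth matrix of rank 2, normalized to unit spectral norm.
n, true_rank, r = 30, 2, 10  # r > true_rank: over-parameterized model
M = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, n))
M /= np.linalg.norm(M, 2)

# Small random initialization of both factors.
scale = 1e-3
U = scale * rng.standard_normal((n, r))
V = scale * rng.standard_normal((n, r))

# Plain gradient descent on 0.5 * ||U V^T - M||_F^2.
lr = 0.2
for _ in range(5000):
    R = U @ V.T - M                                  # residual
    U, V = U - lr * (R @ V), V - lr * (R.T @ U)

rel_err = np.linalg.norm(U @ V.T - M) / np.linalg.norm(M)
print(rel_err)  # relative reconstruction error
```

Despite having far more parameters than a rank-2 model needs, the small initialization biases gradient descent toward a solution that recovers the low-rank ground truth, illustrating the kind of initialization dependence the talk discusses.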


Hongyang Zhang is a Ph.D. candidate in Computer Science at Stanford University, co-advised by Ashish Goel and Greg Valiant. His research interests lie in machine learning and algorithms, including topics related to neural networks, matrix and tensor factorizations, non-convex optimization, social network analysis, and game theory. He is a co-author of the best paper at COLT'18.