09:00 - 09:50 Piotr Indyk
09:55 - 10:20 Chao Chen
10:20 - 10:45 Mickael Buchet, Tamal K. Dey, Jiayuan Wang, and Yusu Wang
10:45 - 11:10 Coffee break
11:10 - 12:10 Santosh Vempala
Abstract: In the first part of this talk, we show how simulated annealing can be used for both logconcave sampling and convex optimization by varying parameter settings. The resulting algorithm for optimization can be viewed as an interior-point method needing only a membership oracle and achieving the same worst-case iteration complexity as the best possible barriers. In the second part, we present a faster sampling algorithm for polytopes which achieves subquadratic mixing time. The advantage of the algorithm, called geodesic gliding, comes from using non-Euclidean geometries where effects of the boundary are milder. Roughly speaking, geodesic gliding is an efficient discrete-time simulation of a stochastic differential equation on a Riemannian manifold whose metric is defined by the Hessian of a convex function.
The talk is based on joint works with Ben Cousins, Adam Kalai, Yin Tat Lee and Laci Lovasz.
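The simulated-annealing idea in the abstract, sampling from densities proportional to exp(-f(x)/T) while cooling the temperature T, can be illustrated with a toy sketch. This is a minimal Metropolis-style version assuming only a membership oracle and function values; the names (`simulated_annealing`, `inside`) are illustrative, not the algorithm from the talk:

```python
import math
import random

def simulated_annealing(f, inside, x0, steps=20000, t_hi=10.0, t_lo=1e-3, step_size=0.1):
    """Toy annealing for convex minimization: approximately sample from
    densities ~ exp(-f(x)/T) while cooling T, using only a membership
    oracle `inside` and function values of f."""
    x = list(x0)
    best = list(x0)
    for i in range(steps):
        # geometric cooling schedule from t_hi down to t_lo
        t = t_hi * (t_lo / t_hi) ** (i / steps)
        # propose a small Gaussian move
        y = [xi + random.gauss(0.0, step_size) for xi in x]
        if not inside(y):
            continue  # membership oracle rejects points outside the body
        # Metropolis acceptance rule for the density exp(-f/T)
        if f(y) <= f(x) or random.random() < math.exp((f(x) - f(y)) / t):
            x = y
            if f(x) < f(best):
                best = list(x)
    return best
```

For instance, minimizing f(x) = ||x||^2 over the unit ball drives the iterate toward the origin as T decreases. The talk's actual algorithms (hit-and-run, geodesic gliding) use far more sophisticated walks with provable mixing times; this sketch only conveys the annealing schedule.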
12:10 - 14:00 Lunch
14:00 - 15:00 Timothy Chan
Abstract: Classical exact algorithms in computational geometry usually have running times that are exponential in the dimension. Recently, slightly subquadratic results have been found for some basic problems, such as offline orthogonal range search and offline Hamming nearest neighbor search, surprisingly even when the dimension grows beyond logarithmic. I will survey these recent findings (including some work in progress and a new joint work with Josh Alman and Ryan Williams).
Along the way, we will see how old geometric divide-and-conquer ideas (variants of range trees and kd trees) can help solve nongeometric problems, such as all-pairs shortest paths, Boolean matrix multiplication, and 0-1 integer linear programming. In the opposite direction, we will also see how nongeometric techniques (fast matrix multiplication, circuit complexity, probabilistic polynomials, and Chebyshev polynomials) can help computational geometry. The latter techniques also give new results on offline approximate nearest neighbor search.
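One of the geometric divide-and-conquer structures mentioned above, the kd-tree, can be sketched in a few lines. This is a minimal illustrative implementation (split on coordinates cyclically at the median, prune subtrees by distance to the splitting plane), not the variants from the talk; it needs Python 3.8+ for `math.dist`:

```python
import math

def build_kdtree(points, depth=0):
    """Build a simple kd-tree: cycle through coordinates, split at the median."""
    if not points:
        return None
    axis = depth % len(points[0])
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "axis": axis,
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
    }

def nearest(node, target, best=None):
    """Recursive nearest-neighbor query, pruning any subtree whose
    splitting plane is farther away than the current best distance."""
    if node is None:
        return best
    if best is None or math.dist(node["point"], target) < math.dist(best, target):
        best = node["point"]
    diff = target[node["axis"]] - node["point"][node["axis"]]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, target, best)
    # descend the far side only if the splitting plane could hide a closer point
    if abs(diff) < math.dist(best, target):
        best = nearest(far, target, best)
    return best
```

The pruning step is the divide-and-conquer payoff: large parts of the point set are never examined, which is the same spirit in which range-tree and kd-tree ideas are repurposed for the nongeometric problems above.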
15:00 - 15:30 Coffee break
15:30 - 16:20 Sanjoy Dasgupta
Abstract: In the usual setup of supervised learning, there is little interaction between human and machine: a human labels a data set and then vanishes from the picture; at some later time, a machine is started up, given that data, and told to find a good classifier.
"Interactive learning" refers to scenarios in which the human engages with the machine while learning is actually taking place. This can happen in many ways, for example:
This is largely an open area of research, but it is clear that the key determinant of "interaction complexity" is the geometry of the class of structures being learned. The quantities of interest are quite different from traditional parameters associated with learning, such as VC dimension.
16:25 - 16:50 Kush R. Varshney and Karthikeyan Natesan Ramamurthy
16:50 - 17:15 Justin Eldridge, Mikhail Belkin, and Yusu Wang
17:15 - 17:40 Hu Ding, Yu Liu, Lingxiao Huang, and Jian Li
17:40 - 18:05 Cyril J. Stark