STOC 2016 SoCG 2016

Program

09:00 - 09:50 Piotr Indyk
Invited Talk: New Algorithms for Similarity Search in High Dimensions

09:55 - 10:20 Chao Chen
High Dimensional Mode Estimation via Graphical Models

10:20 - 10:45 Mickael Buchet, Tamal K. Dey, Jiayuan Wang, and Yusu Wang
Declutter and Resample: Towards parameter free denoising

10:45 - 11:10 Coffee break

11:10 - 12:10 Santosh Vempala
STOC-SoCG Keynote: The Interplay of Sampling and Optimization in High Dimension

Abstract: In the first part of this talk, we show how simulated annealing can be used for both logconcave sampling and convex optimization by varying parameter settings. The resulting algorithm for optimization can be viewed as an interior-point method needing only a membership oracle and achieving the same worst-case iteration complexity as the best possible barriers. In the second part, we present a faster sampling algorithm for polytopes which achieves subquadratic mixing time. The advantage of the algorithm, called geodesic gliding, comes from using non-Euclidean geometries where effects of the boundary are milder. Roughly speaking, geodesic gliding is an efficient discrete-time simulation of a stochastic differential equation on a Riemannian manifold whose metric is defined by the Hessian of a convex function.

The talk is based on joint works with Ben Cousins, Adam Kalai, Yin Tat Lee and Laci Lovasz.
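As a concrete illustration of the annealing idea in the first part, here is a minimal, hedged Python sketch (an editor's toy example, not the talk's algorithm): it minimizes a linear function over a convex body given only a membership oracle, by running a hit-and-run-style walk on densities proportional to exp(-c·x/T) while the temperature T is cooled. All parameter settings (cooling rate, walk length, chord discretization) are placeholder choices for illustration, not the provably correct settings.

    import numpy as np

    def hit_and_run(x, c, T, inside, steps=100):
        """One annealing phase: approximately sample from exp(-c.x/T) on K."""
        for _ in range(steps):
            d = np.random.randn(len(x))
            d /= np.linalg.norm(d)                  # uniform random direction
            ts = np.linspace(-1.0, 1.0, 201)        # crude discretization of the chord
            feas = [t for t in ts if inside(x + t * d)]
            if not feas:
                continue
            logw = np.array([-c @ (x + t * d) / T for t in feas])
            w = np.exp(logw - logw.max())           # numerically stabilized weights
            t = np.random.choice(feas, p=w / w.sum())
            x = x + t * d
        return x

    def anneal_minimize(c, inside, x0, T0=1.0, cool=0.8, rounds=30):
        """Minimize c.x over K = {z : inside(z)} using only membership queries."""
        x, T = np.array(x0, dtype=float), T0
        c = np.asarray(c, dtype=float)
        for _ in range(rounds):
            x = hit_and_run(x, c, T, inside)
            T *= cool                               # cooling concentrates mass near the optimum
        return x

    # Toy usage: minimize x + y over the unit disk; optimum is near (-0.71, -0.71).
    x_min = anneal_minimize([1.0, 1.0], lambda z: np.linalg.norm(z) <= 1.0, [0.0, 0.0])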

12:10 - 14:00 Lunch

14:00 - 15:00 Timothy Chan
STOC-SoCG Keynote: Computational Geometry, from Low to High Dimensions

Abstract: Classical exact algorithms in computational geometry usually have run times that are exponential in the dimension. Recently, and surprisingly, slightly subquadratic results have been found for some basic problems, such as offline orthogonal range search and offline Hamming nearest neighbor search, even when the dimension goes above logarithmic. I will survey these recent findings (including some work in progress and a new joint work with Josh Alman and Ryan Williams).

Along the way, we will see how old geometric divide-and-conquer ideas (variants of range trees and kd trees) can help solve nongeometric problems, such as all-pairs shortest paths, Boolean matrix multiplication, and 0-1 integer linear programming. In the opposite direction, we will also see how nongeometric techniques (fast matrix multiplication, circuit complexity, probabilistic polynomials, and Chebyshev polynomials) can help computational geometry. The latter techniques also give new results on offline approximate nearest neighbor search.
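To make the divide-and-conquer theme concrete, here is a small, hedged Python sketch (an editor's toy example, not the talk's subquadratic machinery): the merge step of a range-tree-style recursion counts, in O(n log n) time, how many pairs of planar points dominate each other, the primitive underlying offline orthogonal range counting. Each additional dimension of the classical structure costs another logarithmic factor, which is why these methods degrade quickly as the dimension grows.

    def count_dominances(points):
        """Count pairs (p, q), with p before q in x-sorted order, whose
        y-coordinates satisfy p.y <= q.y, i.e., q dominates p."""
        ys = [y for _, y in sorted(points)]        # process points in x order

        def rec(arr):
            # Merge-sort recursion: returns (#dominating pairs, arr sorted by y).
            if len(arr) <= 1:
                return 0, arr
            mid = len(arr) // 2
            cl, left = rec(arr[:mid])
            cr, right = rec(arr[mid:])
            count, merged, i, j = cl + cr, [], 0, 0
            while i < len(left) or j < len(right):
                if j == len(right) or (i < len(left) and left[i] <= right[j]):
                    merged.append(left[i]); i += 1
                else:
                    count += i                     # left[0..i-1] are all dominated by right[j]
                    merged.append(right[j]); j += 1
            return count, merged

        return rec(ys)[0]

    # Toy usage: 4 points, 4 dominating pairs.
    print(count_dominances([(1, 3), (2, 1), (3, 2), (4, 4)]))  # -> 4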

15:00 - 15:30 Coffee break

15:30 - 16:20 Sanjoy Dasgupta
Invited Talk: The Geometry of Interactive Learning

Abstract: In the usual setup of supervised learning, there is little interaction between human and machine: a human labels a data set and then vanishes from the picture; at some later time, a machine is started up, given that data, and told to find a good classifier.

"Interactive learning" refers to scenarios in which the human engages with the machine while learning is actually taking place. This can happen in many ways, for example:
  1. The machine might ask for labels of specific, highly informative points that are chosen adaptively, rather than requiring everything to be labeled in advance.
  2. If prompted, the human might indicate relevant features, for instance by highlighting a few words within a document that are highly indicative of its label.
  3. For tasks that are traditionally considered unsupervised, such as clustering or embedding, an iterative refinement process can be designed in which the human keeps giving feedback on the current structure until it is finally satisfactory.
I will describe a general protocol for interactive learning that includes such scenarios and that admits generic learning algorithms. The central question I'll consider is: how much interaction is needed to learn well? Can interaction significantly reduce the total human effort in the learning process?

This is largely an open area of research, but it is clear that the key determinant of "interaction complexity" is the geometry of the class of structures being learned. The quantities of interest are quite different from traditional parameters associated with learning, such as VC dimension.
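As a simple, hedged illustration of how interaction can cut human effort (an editor's toy example, not taken from the talk): to learn a one-dimensional threshold classifier over n unlabeled points, a passive learner may need all n labels, while a learner that adaptively queries the midpoint of the current version space needs only about log2(n) label queries.

    def active_learn_threshold(xs, label):
        """Find i such that label(xs[j]) = 0 for j < i and 1 for j >= i.

        xs: sorted unlabeled pool; label(x): human oracle returning 0/1,
        assumed to be a noiseless threshold function. Binary search over
        the version space uses only O(log n) oracle calls."""
        lo, hi, queries = 0, len(xs), 0
        while lo < hi:
            mid = (lo + hi) // 2
            queries += 1
            if label(xs[mid]) == 1:
                hi = mid        # boundary is at mid or to its left
            else:
                lo = mid + 1    # boundary is strictly to the right
        return lo, queries

    # Toy usage: 1000 points, unknown threshold at 0.37.
    xs = sorted(i / 1000 for i in range(1000))
    boundary, q = active_learn_threshold(xs, lambda x: int(x >= 0.37))
    # q is about log2(1000) ~ 10 queries, versus 1000 labels passively.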

16:25 - 16:50 Kush R. Varshney and Karthikeyan Natesan Ramamurthy
Persistent Homology of Classifier Decision Boundaries

16:50 - 17:15 Justin Eldridge, Mikhail Belkin, and Yusu Wang
Consistent Estimation of the Graphon Cluster Tree

17:15 - 17:40 Hu Ding, Yu Liu, Lingxiao Huang, and Jian Li
A Geometric Approach for K-Means Clustering with Distributed Dimensions

17:40 - 18:05 Cyril J. Stark
Global completability of matrix factorizations