Sublinear Algorithms and Nearest-Neighbor Search
Dates: November 27-30, 2018
Location: Calvin Hall, Simons Institute, UC Berkeley
This training workshop is part of the Simons Institute's Foundations of Data Science Program, offered during the Fall 2018 semester. BIDS Senior Fellow Michael Mahoney is a co-organizer.
Many applications require extreme computational efficiency: the usage of resources such as runtime, storage, or data samples must be sublinear in the size of the input. The workshop will cover different areas where this topic is studied and explore connections between them. Within Theoretical Computer Science, this topic has received considerable attention, often motivated by advances in various application areas, and the workshop will review recent developments, such as algorithms for data streams, sketching, dimensionality reduction, graph sparsification, and property testing. In addition, the workshop will examine connections with linear and sublinear methods in Machine Learning, Statistics, Signal Processing, and related areas, such as the well-known connections with compressed sensing, sparse recovery, and nearest-neighbor search. Some more recent connections are to online bandit problems and to distributed optimization, where constraints on algorithmic resources are just starting to be considered; such problems may be amenable to techniques from data-stream algorithms.
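As a small illustration of the data-stream setting mentioned above (not taken from the workshop materials), the following Python sketch shows reservoir sampling, a classic streaming algorithm that maintains a uniform random sample using memory sublinear in the length of the stream:

```python
import random

def reservoir_sample(stream, k):
    """Maintain a uniform random sample of k items from a stream,
    using O(k) memory regardless of the stream's length."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Keep the new item with probability k / (i + 1),
            # replacing a uniformly chosen existing sample.
            j = random.randint(0, i)
            if j < k:
                reservoir[j] = item
    return reservoir
```

The algorithm never stores more than k items, yet every element of the stream ends up in the sample with equal probability, which is the kind of resource/accuracy trade-off studied in data-stream algorithms.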
Michael Mahoney works on algorithmic and statistical aspects of modern large-scale data analysis. Much of his recent research has focused on large-scale machine learning, including randomized matrix algorithms and randomized numerical linear algebra; geometric network analysis tools for structure extraction in large informatics graphs; scalable implicit regularization methods; and applications in genetics, astronomy, medical imaging, social network analysis, and internet data analysis.