
Comparison lemmas: from Slepian to non-smooth optimization

Babak Hassibi, Caltech


In the past couple of decades, non-smooth convex optimization has emerged as a powerful tool for the recovery of structured signals (sparse, low rank, etc.) from (possibly) noisy measurements in a variety of applications in statistics, signal processing, machine learning, etc. I will describe a fairly general theory for how to determine the performance (minimum number of measurements, mean-square error, etc.) of such methods for certain measurement ensembles (Gaussian, Haar, etc.). The genesis of the theory can be traced back to an inconspicuous 1962 lemma of Slepian (on comparing Gaussian processes).
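For reference, the comparison principle the abstract alludes to can be stated in its standard form (the notation below is mine, not the speaker's):

```latex
\text{Let } X, Y \in \mathbb{R}^n \text{ be centered Gaussian vectors with }
\mathbb{E}[X_i^2] = \mathbb{E}[Y_i^2] \text{ for all } i, \text{ and }
\mathbb{E}[X_i X_j] \ge \mathbb{E}[Y_i Y_j] \text{ for all } i \neq j.
\text{ Then for every } t, \quad
\mathbb{P}\Big(\max_i X_i > t\Big) \le \mathbb{P}\Big(\max_i Y_i > t\Big),
\quad \text{so in particular} \quad
\mathbb{E}\Big[\max_i X_i\Big] \le \mathbb{E}\Big[\max_i Y_i\Big].
```

Informally: making a Gaussian process more correlated (without changing the variances) can only shrink its expected maximum.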


Babak Hassibi is professor and executive officer of electrical engineering at the California Institute of Technology, where he has been since 2001. From 1998 to 2001 he was a member of the technical staff at the Mathematical Sciences Research Center at Bell Laboratories, Murray Hill, NJ, and prior to that he obtained his PhD in electrical engineering from Stanford University. His research interests span different aspects of communications, signal processing and control. Among other awards, he is a recipient of the David and Lucile Packard Foundation Fellowship and the Presidential Early Career Award for Scientists and Engineers (PECASE).

The next best code

Rudiger Urbanke, EPFL


We know how to construct long codes that allow reliable transmission close to capacity. But what if we are interested in modest blocklengths? This naturally leads to the idea of "finite-length scaling." I will discuss how quickly various code families (can) approach capacity and how we might improve the state of the art.


Rudiger L. Urbanke obtained his Dipl. Ing. degree from the Vienna University of Technology, Austria in 1990 and the M.Sc. and PhD degrees in Electrical Engineering from Washington University in St. Louis, MO, in 1992 and 1995, respectively.

He held a position at the Mathematics of Communications Department at Bell Labs from 1995 till 1999 before becoming a faculty member at the School of Computer & Communication Sciences (I&C) of EPFL. He is a member of the Information Processing Group.

He is principally interested in the analysis and design of iterative coding schemes, which allow reliable transmission close to theoretical limits at low complexities. Such schemes are part of most modern communications standards, including wireless transmission, optical communication and hard disk storage. More broadly, his research focuses on the analysis of graphical models and the application of methods from statistical physics to problems in communications.

From 2000 to 2004 he was an Associate Editor of the IEEE Transactions on Information Theory, and he is currently on the board of the series "Foundations and Trends in Communications and Information Theory." Since 2013 he has been a Member of the Board of the Information Theory Society as well as a Distinguished Speaker. From 2009 until 2012 he was the head of the I&C doctoral school, and in 2013 he served as Dean a.i. of I&C.

Dr. Urbanke is a recipient of a Fulbright Scholarship. He is a co-author of the book "Modern Coding Theory," published by Cambridge University Press, and a co-recipient of the 2002 and 2013 IEEE Information Theory Society Paper Awards, the 2011 IEEE Koji Kobayashi Award, and the 2014 IEEE Hamming Medal.

High-throughput cortex exploration

Christof Koch, The Allen Institute


The Allen Institute for Brain Science produced a series of brain atlases (www.brain-map.org). These are large (3 TB, >1 million slides) public resources, integrating genome-wide gene expression and neuroanatomical data across the entire brain for developing and adult humans, non-human primates and mice, complemented by high-resolution, cellular-based anatomical connectivity data in several thousand mice. It is the largest integrated neuroscience database worldwide. Anybody can freely access this data without any restrictions. We have now embarked on an ambitious 10-year initiative to understand the structure and function of the neocortex and associated satellite structures in humans and mice. We are setting up high-throughput pipelines to exhaustively characterize the morphology, electrophysiology and transcriptome of cell types, as well as their synaptic interconnections, in the human neocortex (via a combination of fetal, neurosurgical and post-mortem tissues and human stem cells differentiated into forebrain neurons) and in the laboratory mouse. We are building brain observatories to image the activities of neurons throughout the cortico-thalamic system in behaving mice, to record their electrical activities, and to analyze their connectivity at the ultra-structural level. We are constructing biophysically detailed as well as simplified computer simulations of these networks and of their information processing capabilities, focusing on how the neocortex, a 2+epsilon dimensional tissue, gives rise to perception, memory and intelligence.


Christof Koch joined the Allen Institute as Chief Scientific Officer in 2011. For the past 25 years, Koch has served on the faculty at the California Institute of Technology (Caltech), from his initial appointment as Assistant Professor, Division of Biology and Division of Engineering and Applied Sciences in 1986, to his most recent position as Lois and Victor Troendle Professor of Cognitive & Behavioral Biology. Previously, he spent four years as a postdoctoral fellow in the Artificial Intelligence Laboratory and the Brain and Cognitive Sciences Department at the Massachusetts Institute of Technology. He received his baccalaureate from the Lycée Descartes in Rabat, Morocco, his M.S. in physics from the University of Tübingen in Germany and his Ph.D. from the Max-Planck-Institut für Biologische Kybernetik, Tübingen.

Koch has published extensively, and his writings and interests integrate theoretical, computational and experimental neuroscience. His most recent book, Consciousness: Confessions of a Romantic Reductionist, blends science and memoir to explore the roots of consciousness. Stemming in part from a long-standing collaboration with the late Nobel Laureate Francis Crick, Koch authored the book The Quest for Consciousness: A Neurobiological Approach. He has also authored the technical books Biophysics of Computation: Information Processing in Single Neurons and Methods in Neuronal Modeling: From Ions to Networks, and served as editor for several books on neural modeling and information processing. Koch's research addresses scientific questions using a widely multidisciplinary approach.

His research interests include elucidating the biophysical mechanisms underlying neural computation, understanding the mechanisms and purpose of visual attention, and uncovering the neural basis of consciousness and the subjective mind. Koch maintains a part-time appointment and laboratory at Caltech.

Differential privacy and false discovery control

Cynthia Dwork, Microsoft


Throughout the scientific community there is a growing recognition that claims of statistical significance in published research are frequently invalid. There are many sources of false discovery, including the sheer volume of tests/analyses to be carried out (the data set is likely to be a “fluke” for at least some of the tests), and “data snooping” in which the tests are selected adaptively, based on the data themselves, leading to over-fitting. For the most part, the literature addresses only the first of these.

Differential privacy is a definition of privacy tailored to data analysis, and a wide, and still rapidly growing, field explores differentially private techniques for a rich variety of analytical tasks. We will discuss differential privacy in two different roles. For the static case, in which the hypotheses are fixed in advance, we give the first differentially private algorithm for controlling the false discovery rate (joint work with Weijie Su and Li Zhang). For the adaptive case, we show that differential privacy protects against over-fitting (joint work with Vitaly Feldman, Moritz Hardt, Toni Pitassi, Omer Reingold, and Aaron Roth). Combined with the rich algorithmic literature, this yields a set of tools and a methodology for ensuring generalizability, i.e., that results learned on the data set apply to the population at large, even in the case of adaptive analysis.
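As background for the abstract above, the simplest differentially private primitive is the Laplace mechanism. The sketch below is my own toy illustration of the definition (not the FDR-control algorithm from the talk): it releases a counting query with epsilon-differential privacy by adding Laplace noise calibrated to the query's sensitivity.

```python
import math
import random


def laplace_noise(scale):
    """Draw one Laplace(0, scale) variate by inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    # Symmetric two-sided exponential; the sign comes from u.
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)


def private_count(data, predicate, epsilon):
    """Release a count with epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing a single
    record changes the count by at most 1), so Laplace noise with
    scale 1/epsilon suffices.
    """
    true_count = sum(1 for x in data if predicate(x))
    return true_count + laplace_noise(1.0 / epsilon)


# Example: privately count how many of 100 records are below 50.
random.seed(1)
released = private_count(range(100), lambda x: x < 50, epsilon=1.0)
```

Smaller epsilon means stronger privacy but noisier answers; with epsilon = 1 the released count is typically within a few units of the true value of 50.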


Cynthia Dwork, Distinguished Scientist at Microsoft Research, is renowned for placing privacy-preserving data analysis on a mathematically rigorous foundation. A cornerstone of this work is differential privacy, a strong privacy guarantee frequently permitting highly accurate data analysis. In recent years she has expanded her efforts in formalizing social concepts to the problem of achieving fairness in classification systems. Dr. Dwork has also made seminal contributions in cryptography and distributed computing, and is a recipient of the Edsger W. Dijkstra Prize, recognizing some of her earliest work establishing the pillars on which every fault-tolerant system has been built for decades. She is a member of the National Academy of Sciences and the National Academy of Engineering, and a Fellow of the American Academy of Arts and Sciences.

Market design

Alvin Roth, Stanford


Market design constructs marketplaces that aggregate distributed private information to derive efficient outcomes. I will introduce some of market design's main themes and topics.


Alvin Roth received his Bachelor's degree from Columbia University and his Ph.D. from Stanford University. He held faculty positions at the University of Illinois at Urbana-Champaign and at the University of Pittsburgh. In 1998 he joined Harvard University as the Gund Professor of Economics and Business Administration, and in 2012 he returned to Stanford as the Craig and Susan McCaw Professor of Economics.

Prof. Roth has made significant contributions to the fields of game theory, market design and experimental economics, and is known for his emphasis on applying economic theory to solutions for "real-world" problems. In 2012, he won the Nobel Memorial Prize in Economic Sciences jointly with Lloyd Shapley "for the theory of stable allocations and the practice of market design."

Efficient optimal strategies for online prediction

Peter Bartlett, UC Berkeley and QUT


We consider prediction problems formulated as repeated games, without probabilistic assumptions. In each round of the prediction game, a strategy makes a decision, then observes an outcome and pays a loss. The aim is to minimize the regret, which is the amount by which the total loss incurred exceeds the total loss of the best decision in hindsight. We are interested in the minimax optimal strategy, which minimizes the regret. We focus on two cases where the optimal strategy is simple to compute. The first involves prediction with log loss, a formulation of sequential probability density estimation that is closely related to sequential compression, coding, gambling and investment problems. We present a simple characterization of problems for which the optimal strategy does not depend on the length of the game, and show that, for general parametric models, this occurs precisely when the optimal strategy is a Bayesian strategy. The second is the sequential least squares game, where decisions and outcomes lie in a subset of a Hilbert space, and loss is squared distance. When the outcomes are the vertices of a simplex, this is the "Brier game," studied for the calibration of sequential probability forecasts; when the outcome set is convex, the game is related to sequential Gaussian density estimation. We show that the value of the game depends only on the radius of the smallest ball that contains the convex subset, and that the minimax optimal strategy is a simple shrinkage strategy.

Based on joint work with Fares Hedayati, Wouter Koolen and Alan Malek.
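As a toy illustration of the regret quantity defined in the abstract (my own sketch, not the minimax strategies from the talk), the following computes the cumulative loss of an online strategy minus the loss of the best fixed decision in hindsight, using the squared-distance loss of the sequential least squares game:

```python
def regret(decisions, outcomes, comparators, loss):
    """Cumulative loss of the online decisions minus the loss of the
    single best fixed comparator chosen in hindsight."""
    total = sum(loss(d, y) for d, y in zip(decisions, outcomes))
    best = min(sum(loss(c, y) for y in outcomes) for c in comparators)
    return total - best


def sq_loss(p, y):
    """Squared-distance loss, as in the sequential least squares game."""
    return (p - y) ** 2


# A naive constant strategy against binary outcomes, compared with a
# small grid of fixed predictions (a hypothetical comparator class).
outcomes = [0, 1, 1, 0, 1]
decisions = [0.5] * len(outcomes)
comparators = [0.0, 0.5, 0.6, 1.0]
r = regret(decisions, outcomes, comparators, sq_loss)
```

Here the best fixed prediction in hindsight is 0.6 (the empirical mean), so the constant-0.5 strategy incurs a small positive regret; a minimax optimal strategy is the one whose worst-case regret over all outcome sequences is as small as possible.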


Peter Bartlett is a professor in Computer Science and Statistics at UC Berkeley and professor in Mathematics at the Queensland University of Technology. His research interests include machine learning, statistical learning theory, and adaptive control. He has been associate editor of Machine Learning, the Journal of Machine Learning Research, the Journal of Artificial Intelligence Research, the IEEE Transactions on Information Theory, Bernoulli, and Mathematics of Control, Signals, and Systems, and on the editorial boards of Machine Learning, JAIR, and Foundations and Trends in Machine Learning. He has been professor in the Research School of Information Sciences and Engineering at the Australian National University, Visiting Miller Professor at UC Berkeley, and honorary professor at the University of Queensland. He was awarded the Malcolm McIntosh Prize for Physical Scientist of the Year in Australia in 2001, was an IMS Medallion Lecturer in 2008, and is an Australian Laureate Fellow and a Fellow of the Institute of Mathematical Statistics.