
Latest Past Events

How to Trap a Gradient Flow

online

Abstract: In 1993, Stephen A. Vavasis proved that in any finite dimension there exists a method faster than gradient descent for finding stationary points of smooth non-convex functions. In dimension 2, he proved that 1/eps gradient queries are enough and that 1/sqrt(eps) queries are necessary. We close this gap by providing an algorithm based on a new local-to-global phenomenon for smooth non-convex functions. Some higher-dimensional results will also be discussed. I will also present an extension of the 1/sqrt(eps)…
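As a point of reference for the query counts in the abstract, here is a minimal sketch of the baseline being compared against: plain gradient descent run until it reaches an eps-stationary point, counting gradient queries along the way. The test function and step size are illustrative assumptions of mine, not from the talk, and the talk's improved algorithm is not shown.

```python
import numpy as np

def gradient_descent_until_stationary(grad, x0, eps, step=0.1, max_queries=100_000):
    """Run plain gradient descent until ||grad f(x)|| <= eps (an eps-stationary
    point) and report how many gradient queries were used. This is only the
    baseline the abstract refers to, not the talk's faster algorithm."""
    x = np.asarray(x0, dtype=float)
    for queries in range(1, max_queries + 1):
        g = grad(x)                      # one gradient query
        if np.linalg.norm(g) <= eps:     # eps-stationary: stop
            return x, queries
        x = x - step * g                 # standard descent step
    return x, max_queries

# Illustrative smooth non-convex function (my choice, not the talk's):
# f(x, y) = x^2 + y^2 + 3*sin(x)*sin(y), with its gradient below.
grad_f = lambda v: np.array([
    2 * v[0] + 3 * np.cos(v[0]) * np.sin(v[1]),
    2 * v[1] + 3 * np.sin(v[0]) * np.cos(v[1]),
])

x_star, n_queries = gradient_descent_until_stationary(grad_f, x0=[2.0, 1.5], eps=1e-3)
print(f"eps-stationary point {x_star} after {n_queries} gradient queries")
```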

On Using Graph Distances to Estimate Euclidean and Related Distances

online

Abstract: Graph distances have proven quite useful in machine learning and statistics, particularly in the estimation of Euclidean or geodesic distances. The talk will include a partial review of the literature, then present more recent developments on the estimation of curvature-constrained distances on a surface, as well as on the estimation of Euclidean distances based on an unweighted and noisy neighborhood graph.

About the Speaker: Ery Arias-Castro received his Ph.D. in Statistics from Stanford University in 2004. He then took…
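To make the unweighted-neighborhood-graph setting concrete, here is a small self-contained sketch (my own construction, not the speaker's method): sample points in the plane, connect pairs within radius r by unweighted edges, and use hop counts scaled by r as a proxy for Euclidean distance. The sample size, radius, and scaling are all illustrative assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(0)
n, r = 500, 0.15
X = rng.uniform(0.0, 1.0, size=(n, 2))   # points sampled in the unit square

# Pairwise Euclidean distances and the r-neighborhood adjacency matrix
# (unweighted: we keep only which pairs are connected, not how far apart).
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
A = csr_matrix((D > 0) & (D <= r))

# Hop distances in the unweighted graph; r * hops always over-estimates the
# Euclidean distance, by a factor that shrinks as the sample gets denser.
hops = shortest_path(A, method="auto", directed=False, unweighted=True)
estimate = r * hops

mask = np.isfinite(hops) & (hops > 0)    # ignore self and disconnected pairs
ratio = estimate[mask] / D[mask]
print(f"estimate/true ratio: median {np.median(ratio):.2f}, max {ratio.max():.2f}")
```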

[POSTPONED] The Blessings of Multiple Causes

E18-304

*Please note: this event has been POSTPONED until Fall 2020.* See MIT’s COVID-19 policies for more details.

Abstract: Causal inference from observational data is a vital problem, but it comes with strong assumptions. Most methods require that we observe all confounders: variables that affect both the causal variables and the outcome variables. But whether we have observed all confounders is a famously untestable assumption. We describe the deconfounder, a way to do causal…
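Here is a toy sketch of the deconfounder recipe as the abstract describes it, with PCA standing in for the probabilistic factor model used in the actual work: fit a factor model to the multiple causes, then use the inferred factors as a substitute confounder in the outcome regression. All data, dimensions, and model choices below are synthetic illustrations, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n, n_causes = 2000, 10

u = rng.normal(size=(n, 1))                        # unobserved confounder
causes = u @ rng.normal(size=(1, n_causes)) + rng.normal(size=(n, n_causes))
true_effects = rng.normal(size=n_causes)
outcome = causes @ true_effects + 3.0 * u[:, 0] + rng.normal(size=n)

# Step 1: factor model over the causes -> substitute confounder z_hat
# (PCA here is a stand-in for the paper's probabilistic factor model).
z_hat = PCA(n_components=1).fit_transform(causes)

# Step 2: outcome regression on the causes, adjusting for z_hat.
adjusted = LinearRegression().fit(np.hstack([causes, z_hat]), outcome)
naive = LinearRegression().fit(causes, outcome)

err = lambda m: np.abs(m.coef_[:n_causes] - true_effects).mean()
print(f"mean abs error: naive {err(naive):.3f}, deconfounder {err(adjusted):.3f}")
```

In this synthetic setup the shared variation across the causes recovers the hidden confounder, so the adjusted regression estimates the causal coefficients with less bias than the naive one.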

