1 Heat Flow and a Faster Algorithm to Compute the Surface Area of a Convex Body. Hariharan Narayanan, University of Chicago. Joint work with Mikhail Belkin (Ohio State University) and Partha Niyogi (University of Chicago).

2 Computing the Surface Area of a Convex Body. Posed as an open problem by Grötschel, Lovász, and Schrijver [GLS90]. Solved in randomized polynomial time by Dyer, Gritzmann, and Hufnagel [DGH98].

3 Clustering and Surface Area of Cuts. Semi-supervised classification uses both labelled and unlabelled data. In Low Density Separation (Chapelle, Zien [CZ05]), the weighted surface area of a cut, $\int_{\text{cut}} \rho \, d\mu$, is a measure of the quality of the cut (here $\rho$ is the probability density of the data and $\mu$ is the surface area measure on the cut).

4 Prior Work on Computing the Volume of Convex Bodies (n = dimension, c = a fixed constant). The volume cannot be approximated in deterministic polynomial time within a factor of $(cn/\log n)^n$ (Bárány, Füredi [BF88]). The volume can be approximated in randomized polynomial time within a factor of $(1+\varepsilon)$ for any fixed $\varepsilon > 0$ (Dyer, Frieze, Kannan [DFK89]). Numerous improvements in complexity followed; the best known is $O^*(n^4)$ oracle queries (Lovász, Vempala [LV04]).

5 The Model. Given: a membership oracle for the convex body K; the radius r and centre O of a ball contained in K; the radius R of a ball with centre O containing K.
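As a concrete illustration of this oracle model, here is a minimal Python sketch; the `EllipsoidOracle` class and its fields are hypothetical, chosen only to show what information the algorithm is allowed to see (oracle answers plus r, R, O).

```python
# A minimal sketch of the oracle model; names are illustrative, not from the talk.
import numpy as np

class EllipsoidOracle:
    """Membership oracle for the axis-aligned ellipsoid sum((x_i / a_i)^2) <= 1.

    The algorithm only ever sees answers to contains() queries, plus the
    guaranteed inner radius r and outer radius R about the centre O.
    """
    def __init__(self, semi_axes):
        self.a = np.asarray(semi_axes, dtype=float)
        self.centre = np.zeros(len(self.a))   # O
        self.r = float(self.a.min())          # a ball of radius r lies inside K
        self.R = float(self.a.max())          # a ball of radius R contains K

    def contains(self, x):
        """Single membership query: is x in K?"""
        return float(np.sum((np.asarray(x) / self.a) ** 2)) <= 1.0
```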

6 Complexity of Computing the Surface Area. Computing surface area is at least as hard as computing volume: let $C(K) = K \times [0, \delta]$ for a small $\delta > 0$. Then the surface area of C(K) is an approximation of twice the volume of K.
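The reduction can be made explicit with a short calculation, under the assumption (reconstructed, not read off the slide) that C(K) is the prism $K \times [0, \delta]$ over K:

```latex
% Surface area of the prism C(K) = K x [0, delta] over K in R^n:
% two parallel copies of K plus a lateral side of height delta.
\[
  \mathrm{area}\bigl(\partial C(K)\bigr)
    = 2\,\mathrm{vol}_n(K) + \delta\,\mathrm{area}(\partial K)
    \;\xrightarrow[\;\delta \to 0\;]{}\; 2\,\mathrm{vol}_n(K),
\]
% so any algorithm approximating surface area also approximates volume.
```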

7 Computing the Surface Area of a Convex Body. Previous approach: choose an appropriate $\varepsilon > 0$ and consider the convex body K, its $\varepsilon$-neighbourhood $K_\varepsilon$, and their difference $K_\varepsilon \setminus K$.

8 Computing the Surface Area of a Convex Body. Previous approach: compute the volumes of K and $K_\varepsilon$, and recover the surface area by interpolation from their difference.
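In effect this interpolation is a finite-difference form of the Minkowski-content definition of surface area; a sketch of that identity, with $K_\varepsilon$ the $\varepsilon$-neighbourhood of K:

```latex
% Surface area as Minkowski content: finite differences of the volumes of
% the epsilon-neighbourhoods K_eps = { x : dist(x, K) <= eps }.
\[
  \mathrm{area}(\partial K)
    \;=\; \lim_{\varepsilon \to 0}
      \frac{\mathrm{vol}(K_\varepsilon) - \mathrm{vol}(K)}{\varepsilon}
    \;\approx\; \frac{\mathrm{vol}(K_\varepsilon) - \mathrm{vol}(K)}{\varepsilon}
      \quad \text{for small } \varepsilon,
\]
% so two volume computations (for K and K_eps) yield a surface-area estimate.
```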

9 Computing the Surface Area of a Convex Body. The previous approach involves computing the volume of $K_\varepsilon$ given only a membership oracle for K, which appears costly with present technology: answering each oracle query to $K_\varepsilon$ itself takes polynomial time, and the volume computation then multiplies that per-query cost.

10 Heat Flow. (Figure: snapshots of heat diffusing out of a body at successive times t = 0, 0.05, and 0.075.)

11 Motivation and Terminology. Let $F_t(K)$ denote the amount of heat diffusing out of K in time t (heat initially spread uniformly over K).

12 Fact. For small t, the heat $F_t(K)$ diffusing out of K in time t is approximately proportional to $\sqrt{t}$ times the surface area of K.
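The $\sqrt{t}$ scaling can be checked in the half-space case; a back-of-the-envelope derivation, assuming the perturbation is an isotropic Gaussian with standard deviation $\sigma = \sqrt{t}$ in each coordinate (the exact constant depends on the variance convention used in the talk):

```latex
% Heat escaping a half-space H = { x_1 <= 0 } under a Gaussian perturbation
% of standard deviation sigma: a point at depth d escapes with probability
% Phi(-d / sigma), so the escaping mass per unit boundary area is
\[
  \int_0^\infty \Phi\!\left(-\tfrac{d}{\sigma}\right) dd
    \;=\; \frac{\sigma}{\sqrt{2\pi}},
\]
% and since the boundary of a convex body is nearly flat at scale sigma,
\[
  F_t(K) \;\approx\; \frac{\sqrt{t}}{\sqrt{2\pi}}\,\mathrm{area}(\partial K)
  \qquad (\sigma = \sqrt{t} \text{ small}).
\]
```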

13 Algorithm. Recall the Fact: for small t, the heat diffusing out of K in time t is approximately proportional to $\sqrt{t}$ times the surface area. Step 1: choose random points uniformly in K.

14 Algorithm. Choose random points uniformly in K. Perturb each by a random vector drawn from a multivariate Gaussian with variance t. Let $\hat F$ be the fraction of perturbed points landing outside K. Obtain an estimate $\hat V$ of the volume. Output $\hat F \cdot \hat V$, rescaled by the $\sqrt{t}$ factor from the Fact, as the estimate for the surface area (see the sketch below).
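A minimal Monte Carlo sketch of this algorithm, assuming uniform samples and a volume estimate are supplied by separate subroutines (placeholders here; the talk obtains both from standard random-walk machinery), and assuming the $\sqrt{2\pi/t}$ normalisation from the half-space calculation above. The function name and signature are illustrative, not from the paper.

```python
import numpy as np

def estimate_surface_area(contains, samples, volume_estimate, t, rng=None):
    """Estimate area(boundary of K) from uniform samples in K.

    contains        : membership oracle, contains(x) -> bool
    samples         : array of shape (N, n), points ~ uniform on K
    volume_estimate : an estimate of vol(K) from a separate subroutine
    t               : perturbation variance (chosen small, as on slide 15)
    """
    rng = np.random.default_rng() if rng is None else rng
    pts = np.asarray(samples, dtype=float)
    # Perturb each sample by an independent N(0, t I) vector.
    perturbed = pts + rng.normal(scale=np.sqrt(t), size=pts.shape)
    # Fraction of perturbed points that land outside K.
    fraction = np.array([not contains(y) for y in perturbed]).mean()
    # Invert the heat-flow fact: escaping fraction ~ sqrt(t / (2*pi)) * S / V.
    return fraction * volume_estimate * np.sqrt(2.0 * np.pi / t)
```

With the hypothetical `EllipsoidOracle` above, rejection sampling from a bounding box could supply `samples` and `volume_estimate` in low dimension; the actual algorithm replaces both with polynomial-time samplers.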

15 Choice of t. Find the radius $\rho$ of a ball contained in K that is large in the following sense: for a point chosen uniformly at random from K, the standard deviation of its projection onto some unit vector is comparable to $\rho$. Then set t to a suitable small multiple of $\rho^2$.

16 Finding $\rho$. Compute a linear transformation T that puts the body in 2-isotropic position: for every unit vector v, the second moment $\mathbb{E}\langle Tx, v\rangle^2$ (x uniform in K) lies between 1 and 2. Set $\rho$ using the smallest eigenvalue of the associated covariance matrix (see the sketch below).
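A rough sketch of the least-variance computation this slide alludes to, assuming $\rho$ is read off the empirical covariance of the uniform samples; the notation and the exact scaling of t are reconstructions, so treat them as illustrative only.

```python
import numpy as np

def smallest_direction_scale(samples):
    """Estimate the standard deviation of K along its thinnest direction.

    samples : array of shape (N, n), points ~ uniform on K.
    Returns rho, the square root of the smallest eigenvalue of the empirical
    covariance; the perturbation variance t is then chosen as a small
    multiple of rho**2 (the precise choice is in the paper).
    """
    pts = np.asarray(samples, dtype=float)
    centred = pts - pts.mean(axis=0)
    cov = centred.T @ centred / len(pts)
    eigenvalues = np.linalg.eigvalsh(cov)   # ascending order
    return float(np.sqrt(eigenvalues[0]))
```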

17 The Algorithm's Relation to Heat Flow. If the samples were generated exactly uniformly at random, the expected fraction of escaping points would be exactly $F_t(K)/\mathrm{vol}(K)$: perturbing a uniform point of K by a Gaussian is precisely heat diffusing out of K in time t.

18 Algorithm's Complexity. The total cost decomposes into the complexity of rounding the body (and finding $\rho$), the complexity of estimating the volume, and the complexity of generating random points.

19 Given a membership oracle and sufficiently many random samples from the body, the "fraction escaping" also approximates the Cheeger ratio (surface area divided by volume) of smooth non-convex bodies.

20 Analysis: Upper Bound on $F_t(K)$. Terminology: the heat flow $F_t(K)$ is the heat diffusing out of K in time t.

21 Analysis: Upper Bound on $F_t(K)$. Terminology: let $G_t$ denote the Gaussian kernel of the perturbation. Then the heat flow is $F_t(K) = \int_{x \in K} \int_{y \notin K} G_t(y - x)\, dy\, dx$, the heat mass that moves from K to its complement in time t.

22 Analysis: Upper Bound on $F_t(K)$. (Figure: plot of the heat-flow quantity for t = 1/4.)

23 Analysis: Upper Bound on $F_t(K)$. Terminology: S = Surface Area, V = Volume. The "Alexandrov-Fenchel inequalities" imply a bound on the higher-order boundary terms, which leads to an upper bound on $F_t(K)$ of order $\sqrt{t}\, S$.

24 Analysis: Lower Bound on $F_t(K)$. Terminology: heat flow $F_t(K)$ as above.

25 Analysis: Lower Bound on $F_t(K)$. Terminology: the heat flow $F_t(K)$ is defined with the same Gaussian kernel $G_t$ as before.

26 Analysis: Lower Bound on $F_t(K)$. (Figure: plot of the heat-flow quantity for t = 1/4.)

27 Analysis: Lower Bound on $F_t(K)$. For the upper bound we had $F_t(K) \lesssim \sqrt{t}\, S$; does a matching lower bound hold?

28 Analysis: Lower Bound on $F_t(K)$. Lemma: the surface area of convex bodies is monotonic, that is, if $K_1 \subseteq K_2$ are convex then the surface area of $K_1$ is at most that of $K_2$. The proof of the lower bound uses this monotonicity.
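The monotonicity invoked here is the classical fact for nested convex bodies; a one-line argument via the nearest-point projection (a standard proof, not necessarily the one on the slide):

```latex
% If K_1 \subseteq K_2 are convex, the nearest-point projection
% \pi : \mathbb{R}^n \to K_1 is 1-Lipschitz and maps \partial K_2 onto
% \partial K_1, so it cannot increase (n-1)-dimensional area:
\[
  K_1 \subseteq K_2 \ \text{convex}
  \;\Longrightarrow\;
  \mathrm{area}(\partial K_1)
    = \mathrm{area}\bigl(\pi(\partial K_2)\bigr)
    \le \mathrm{area}(\partial K_2).
\]
```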

29 Analysis: Lower Bound on $F_t(K)$. The monotonicity lemma implies the desired lower bound on the heat flow $F_t(K)$.

30 Other Considerations. We have the upper bound on $F_t(K)$ up to an error term, and this error term must itself be bounded above. The fraction of perturbed points that fall outside K has expectation $F_t(K)/\mathrm{vol}(K)$; this expectation must be bounded below to ensure that the empirical fraction is close to its expectation, since we use only polynomially many random samples (see the sketch below).
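The sample-size requirement behind "close to its expectation" is the standard multiplicative Chernoff bound; a sketch, with p the expected escaping fraction and $\hat p$ the empirical one:

```latex
% Multiplicative Chernoff bound for N i.i.d. indicator variables with mean p:
\[
  \Pr\bigl[\,|\hat{p} - p| > \epsilon p\,\bigr]
    \;\le\; 2\exp\!\left(-\tfrac{\epsilon^2 p N}{3}\right),
\]
% so N = O( log(1/\delta) / (\epsilon^2 p) ) samples suffice; this is
% polynomial only if p is bounded below by 1/poly, which is why the lower
% bound on the escaping fraction is needed.
```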

31 Other Considerations. We establish both bounds: the required upper bound (next slide) and the required lower bound on the expected escaping fraction (the slide after).

32 Upper bound. We show the required upper bound by an infinitesimal (rate-of-escape) argument.

33 Lower bound. We show the required lower bound. Method: prove the key inequality by a comparison argument.

34