
1 Progress in New Computer Science Research Directions John Hopcroft Cornell University Ithaca, New York CAS May 21, 2010

2 Time of change  The information age is a fundamental revolution that is changing all aspects of our lives.  Those individuals and nations who recognize this change and position themselves for the future will benefit enormously. CAS May 21, 2010

3 Drivers of change  Merging of computing and communications  Data available in digital form  Networked devices and sensors  Computers becoming ubiquitous CAS May 21, 2010

4 Internet search engines are changing. Query: When was Einstein born? Instead of a list of relevant web pages, the direct answer: "Einstein was born at Ulm, in Württemberg, Germany, on March 14, 1879." CAS May 21, 2010

5

6 Internet queries will be different  Which car should I buy?  What are the key papers in Theoretical Computer Science?  Construct an annotated bibliography on graph theory.  Where should I go to college?  How did the field of CS develop? CAS May 21, 2010

7 Which car should I buy? Search engine response: Which criteria below are important to you?  Fuel economy  Crash safety  Reliability  Performance  Etc. CAS May 21, 2010

8
 Make          Cost    Reliability  Fuel economy  Crash safety
 Toyota Prius  23,780  Excellent    44 mpg        Fair
 Honda Accord  28,695  Better       26 mpg        Excellent
 Toyota Camry  29,839  Average      24 mpg        Good
 Lexus 350     38,615  Excellent    23 mpg        Good
 Infiniti M35  47,650  Excellent    19 mpg        Good
 (each row also links to photos/articles)

9 CAS May 21, 2010

10 2010 Toyota Camry - Auto Shows Toyota sneaks the new Camry into the Detroit Auto Show. Usually, redesigns and facelifts of cars as significant as the hot-selling Toyota Camry are accompanied by a commensurate amount of fanfare. So we were surprised when, right about the time that we were walking by the Toyota booth, a chirp of our BlackBerrys brought us the press release announcing that the facelifted 2010 Toyota Camry and Camry Hybrid mid-sized sedans were appearing at the 2009 NAIAS in Detroit. We'd have hardly noticed if they hadn't told us: the headlamps are slightly larger, the grilles on the gas and hybrid models go their own way, and taillamps become primarily LED. Wheels are also new, but overall, the resemblance to the Corolla is downright uncanny. Let's hear it for brand consistency! Four-cylinder Camrys get Toyota's new 2.5-liter four-cylinder with a boost in horsepower to 169 for LE and XLE grades, 179 for the Camry SE, all of which are available with six-speed manual or automatic transmissions. Camry V-6 and Hybrid models are relatively unchanged under the skin. Inside, changes are likewise minimal: the options list has been shaken up a bit, but the only visible change on any Camry model is the Hybrid's new gauge cluster and softer seat fabrics. Pricing will be announced closer to the time it goes on sale this March.

11 Which are the key papers in Theoretical Computer Science?
 Hartmanis and Stearns, "On the computational complexity of algorithms"
 Blum, "A machine-independent theory of the complexity of recursive functions"
 Cook, "The complexity of theorem proving procedures"
 Karp, "Reducibility among combinatorial problems"
 Garey and Johnson, "Computers and Intractability: A Guide to the Theory of NP-Completeness"
 Yao, "Theory and Applications of Trapdoor Functions"
 Goldwasser, Micali, and Rackoff, "The Knowledge Complexity of Interactive Proof Systems"
 Arora, Lund, Motwani, Sudan, and Szegedy, "Proof Verification and the Hardness of Approximation Problems"
CAS May 21, 2010

12

13 FedEx package tracking CAS May 21, 2010

14

15

16

17

18

19 [Figure: NEXRAD radar, Binghamton; base reflectivity, 0.50 degree elevation, range 124 NMI]

20 Collective Inference on Markov Models for Modeling Bird Migration CAS May 21, 2010 [Figure: space vs. time]

21 CAS May 21, 2010 Daniel Sheldon, M. A. Saleh Elmohamed, Dexter Kozen

22 Science base to support activities Track flow of ideas in scientific literature Track evolution of communities in social networks Extract information from unstructured data sources. CAS May 21, 2010

23 Tracking the flow of ideas in scientific literature Yookyung Jo

24 CAS May 21, 2010 Tracking the flow of ideas in the scientific literature Yookyung Jo [Figure: clusters of topic words, e.g. "page rank, web, link, graph"; "retrieval, query, search, text"; "web page, search, rank"; "web, chord, usage, index, probabilistic"; "text, file, retrieve, index"; "discourse, word, centering, anaphora"]

25 CAS May 21, 2010 Original papers

26 CAS May 21, 2010 Original papers cleaned up

27 CAS May 21, 2010 Referenced papers

28 CAS May 21, 2010 Referenced papers cleaned up. Three distinct categories of papers

29 CAS May 21, 2010

30 Tracking communities in social networks CAS May 21, 2010 Liaoruo Wang

31 “Statistical Properties of Community Structure in Large Social and Information Networks”, Jure Leskovec; Kevin Lang; Anirban Dasgupta; Michael Mahoney Studied over 70 large sparse real-world networks. Best communities are of approximate size 100 to 150. CAS May 21, 2010

32 Our most striking finding is that in nearly every network dataset we examined, we observe tight but almost trivial communities at very small scales, and at larger size scales, the best possible communities gradually "blend in" with the rest of the network and thus become less "community-like." CAS May 21, 2010

33 [Figure: conductance versus size of community; best (lowest) conductance near size 100]

34 CAS May 21, 2010 Giant component

35 CAS May 21, 2010 Whisker: A component with v vertices connected by edges

36 Our view of a community CAS May 21, 2010 [Figure: nested circles around "Me": family and friends, classmates, colleagues at Cornell, TCS] More connections outside than inside

37 CAS May 21, 2010 Should we remove all whiskers and search for communities in the core? [Figure: core]

38 Should we remove whiskers? Does there exist a core in social networks? Yes; experimentally it appears at p = 1/n in the G(n,p) model. Is the core unique? Yes and no. In the G(n,p) model, should we require that a whisker have only a finite number of edges connecting it to the core? CAS May 21, 2010 Liaoruo Wang

39 Algorithms How do you find the core? Are there communities in the core of social networks? CAS May 21, 2010

40 How do we find whiskers? Finding them is NP-complete if the graph has a whisker. There exist graphs with whiskers for which neither the union of two whiskers nor the intersection of two whiskers is a whisker. CAS May 21, 2010

41 Graph with no unique core CAS May 21, 2010 [Figure: example graph with labels 3, 1, 1]

42 Graph with no unique core CAS May 21, 2010 [Figure: second example graph, label 1]

43 What is a community? How do you find them? CAS May 21, 2010

44 Communities  Conductance  Can we compress the graph?  Rosvall and Bergstrom, "An information-theoretic framework for resolving community structure in complex networks"  Hypothesis testing  Yookyung Jo CAS May 21, 2010

45 Description of graph with community structure Specify which vertices are in which communities. Specify the number of edges between each pair of communities. CAS May 21, 2010

46 Information necessary to specify the graph given the community structure: m = number of communities; n_i = number of vertices in the i-th community; l_ij = number of edges between the i-th and j-th communities. CAS May 21, 2010

47 The description of a graph consists of a description of the community structure plus a specification of the graph given that structure: specify the community for each vertex and the number of edges between each pair of communities. Can this method be used to specify more complex community structure where communities overlap? CAS May 21, 2010
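
To make the encoding concrete, here is a minimal sketch (the helper name and the two-community example are invented for illustration; this conveys the flavor of the idea, not the exact Rosvall-Bergstrom code):

```python
from math import comb, log2

def description_length(n, community_sizes, edge_counts):
    """Bits to describe a graph given a community structure.

    community_sizes: list of n_i, vertices per community.
    edge_counts: dict {(i, j): l_ij}, edges between communities i <= j.
    """
    m = len(community_sizes)
    bits = n * log2(m) if m > 1 else 0.0    # community label for each vertex
    for (i, j), l in edge_counts.items():
        if i == j:
            pairs = comb(community_sizes[i], 2)          # possible intra edges
        else:
            pairs = community_sizes[i] * community_sizes[j]
        bits += log2(comb(pairs, l))        # which possible edges are present
    return bits

# Two tight communities of 50 vertices: cheap to describe...
print(description_length(100, [50, 50], {(0, 0): 300, (1, 1): 300, (0, 1): 10}))
# ...versus one undifferentiated community with the same 610 edges.
print(description_length(100, [100], {(0, 0): 610}))
```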

48 Hypothesis testing Null hypothesis: all edges are generated with some probability p_0. Hypothesis: edges within communities are generated with probability p_1, other edges with probability p_0. CAS May 21, 2010
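
A sketch of the corresponding likelihood-ratio computation (function names and the example numbers are hypothetical; in practice p_0 and p_1 would be estimated from the data):

```python
from math import log

def log_likelihood(edges, pairs, p):
    """Log-likelihood of observing `edges` out of `pairs` possible edges,
    each present independently with probability p."""
    return edges * log(p) + (pairs - edges) * log(1 - p)

def community_llr(intra_edges, intra_pairs, other_edges, other_pairs, p0, p1):
    """Log-likelihood ratio of the community hypothesis vs. the null.
    Positive values favor 'intra-community edges are denser'."""
    null = log_likelihood(intra_edges + other_edges,
                          intra_pairs + other_pairs, p0)
    alt = (log_likelihood(intra_edges, intra_pairs, p1)
           + log_likelihood(other_edges, other_pairs, p0))
    return alt - null

# 100 intra-community pairs carrying 40 edges over a sparse background.
print(community_llr(intra_edges=40, intra_pairs=100,
                    other_edges=50, other_pairs=4900, p0=0.01, p1=0.4))
```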

49 Clustering social networks (Mishra, Schreiber, Stanton, and Tarjan): each member of the community is connected to at least a beta fraction of the community; no member outside the community is connected to more than an alpha fraction of the community; plus some connectivity constraint. CAS May 21, 2010
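
A direct check of this alpha-beta condition, assuming the networkx library and ignoring the connectivity constraint (the helper name and parameters are illustrative):

```python
import networkx as nx

def is_alpha_beta_community(G, S, alpha, beta):
    """Every vertex in S has edges to at least a beta fraction of S;
    no vertex outside S has edges to more than an alpha fraction of S.
    The connectivity constraint from the definition is omitted here."""
    S = set(S)
    k = len(S)
    for v in G.nodes():
        inside = sum(1 for u in G.neighbors(v) if u in S)
        if v in S and inside < beta * k:
            return False
        if v not in S and inside > alpha * k:
            return False
    return True

# A planted clique of 20 vertices inside a sparse random background.
G = nx.gnp_random_graph(200, 0.02, seed=0)
clique = list(range(20))
G.add_edges_from((u, v) for u in clique for v in clique if u < v)
print(is_alpha_beta_community(G, clique, alpha=0.3, beta=0.9))  # True
```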

50 In sparse graphs How do you find alpha-beta communities? What if each person in the community is connected to more members outside the community than inside? CAS May 21, 2010

52 Structure of social networks Different networks may have quite different structure. An algorithm for finding alpha-beta communities in sparse graphs. How many communities of a given size are there in a social network? Results on exploring a Twitter network of approximately 100,000 nodes. Liaoruo Wang, Jing He, Hongyu Liang CAS May 21, 2010

53 [Flowchart: 200 randomly chosen nodes → apply the alpha-beta algorithm → resulting alpha-beta community] How many alpha-beta communities are there? CAS May 21, 2010

54 Massively overlapping communities Are there a small number of massively overlapping communities that share a common core? Are there massively overlapping communities in which one can move from one community to a totally disjoint community? CAS May 21, 2010

55 Massively overlapping communities with a common core

56 CAS May 21, 2010 Massively overlapping communities

57 Define the core of a set of overlapping communities to be the intersection of the communities. There are a small number of cores in the Twitter data set. CAS May 21, 2010

58
 Size of initial set   Number of cores
 25                    221
 50                    94
 100                   19
 150                   8
 200                   4
 250                   4
 300                   4
 350                   3
 400                   3
 450                   3
CAS May 21, 2010

59
 Size of initial set   Surviving cores (by label)
 450                   1, 5, 6
 400                   1, 5, 6
 350                   1, 5, 6
 300                   1, 3, 5, 6
 250                   1, 3, 5, 6
 200                   1, 3, 5, 6
 150                   1, 2, 3, 4, 5, 6, 7, 8
CAS May 21, 2010

60 What is the graph structure that causes certain cores to merge and others to simply vanish? What is the structure of cores as they get larger? Do they consist of concentric layers that are less dense at the outside? CAS May 21, 2010

61 Sucheta Soundarajan Transmission paths for viruses, flow of ideas, or influence

62 CAS May 21, 2010 [Figure: trivial model of transmission; half of contacts vs. two thirds of contacts]

63 CAS May 21, 2010 [Figure: time of first item vs. time of second item]

64 Theory to support new directions CAS May 21, 2010  Random graphs  High dimensions and dimension reduction  Spectral analysis  Massive data sets and streams  Collaborative filtering  Extracting signal from noise  Learning and VC dimension

65 Science base What do we mean by science base? Example: High dimensions CAS May 21, 2010

66 High dimension is fundamentally different from two- or three-dimensional space CAS May 21, 2010

67 High dimensional data is inherently unstable: given n random points in d-dimensional space, essentially all n² pairwise distances are nearly equal. CAS May 21, 2010
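
A quick simulation of this instability (assuming numpy and scipy):

```python
import numpy as np
from scipy.spatial.distance import pdist

# n random points from a spherical Gaussian in d = 10,000 dimensions.
rng = np.random.default_rng(0)
n, d = 100, 10_000
points = rng.standard_normal((n, d))

dists = pdist(points)                 # all n(n-1)/2 pairwise distances
print(dists.mean(), np.sqrt(2 * d))   # both about 141.4
print(dists.std() / dists.mean())     # relative spread under 1%
```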

68 High Dimensions Intuition from two and three dimensions is not valid in high dimensions. The volume of the unit cube is one in all dimensions; the volume of the unit sphere goes to zero as the dimension grows.
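
The shrinking sphere is easy to verify from the closed-form volume of the unit ball, pi^(d/2) / Gamma(d/2 + 1):

```python
from math import pi, gamma

def unit_ball_volume(d):
    """Volume of the unit-radius ball in d dimensions."""
    return pi ** (d / 2) / gamma(d / 2 + 1)

for d in (2, 3, 10, 20, 50):
    print(d, unit_ball_volume(d))
# 2: 3.14..., 3: 4.19..., 10: 2.55, 20: 0.026, 50: ~1.7e-13
# while the unit cube has volume 1 in every dimension.
```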

69 Gaussian distribution CAS May 21, 2010 Probability mass concentrated between dotted lines

70 Gaussian in high dimensions CAS May 21, 2010

71 Two Gaussians CAS May 21, 2010

72

73 Distance between two random points from the same (unit-variance, d-dimensional) Gaussian  The points lie on a thin annulus of radius √d  Approximate the annulus by a sphere of radius √d  The average distance between two points is √(2d) (place one point at the North Pole, the other at random; almost surely the second point is near the equator). CAS May 21, 2010

74

75

76 Expected distance between points from two Gaussians whose centers are separated by δ: approximately √(2d + δ²).

77 Can separate points from two Gaussians if δ > c·d^(1/4): then δ² exceeds the O(√d) fluctuations in the squared distances. CAS May 21, 2010

78 Dimension reduction Project points onto subspace containing centers of Gaussians Reduce dimension from d to k, the number of Gaussians CAS May 21, 2010

79 Centers retain their separation; the average distance between points (the noise component) is reduced from about √(2d) to about √(2k).

80 Can separate Gaussians provided δ > some constant involving k and γ, independent of the dimension. CAS May 21, 2010

81 Finding centers of Gaussians CAS May 21, 2010 The first singular vector v_1 minimizes the sum of squared perpendicular distances to the points.

82 The first singular vector goes through the center of the Gaussian and minimizes the sum of squared distances to the points. CAS May 21, 2010

83 The best k-dimensional subspace for a single Gaussian is any subspace containing the line through the center of the Gaussian. CAS May 21, 2010

84 Given k Gaussians, the top k singular vectors define a k-dimensional space that contains the k lines through the centers of the Gaussians, and hence contains the centers themselves. CAS May 21, 2010
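
A small numerical sketch of this SVD-based separation (the dimension, separation, and sample sizes are arbitrary illustrative choices):

```python
import numpy as np

# Two spherical Gaussians in d = 1000 dimensions, centers delta = 8 apart.
rng = np.random.default_rng(1)
d, per, delta = 1000, 200, 8.0
center = np.zeros(d)
center[0] = delta
X = np.vstack([rng.standard_normal((per, d)),
               rng.standard_normal((per, d)) + center])

# The top k = 2 right singular vectors approximately span a subspace
# containing the centers; project the points onto it.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
proj = X @ Vt[:2].T

# In the projection the cluster means stay ~delta apart, while typical
# within-cluster distances shrink from ~sqrt(2d) ~ 45 down to ~2.
gap = np.linalg.norm(proj[:per].mean(axis=0) - proj[per:].mean(axis=0))
print(gap)   # close to 8
```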

85 Johnson-Lindenstrauss A random projection of n points to an O(log n)-dimensional space approximately preserves all pairwise distances with high probability. Clearly one cannot project n points to a line and approximately preserve all pairwise distances. CAS May 21, 2010
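
A quick empirical check with a plain Gaussian random projection (the target dimension k = 200 is an arbitrary illustrative choice, comfortably larger than the log n the theorem requires):

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
n, d = 500, 10_000
X = rng.standard_normal((n, d))

# Random projection to k dimensions, scaled so norms are preserved
# in expectation.
k = 200
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

ratios = pdist(Y) / pdist(X)
print(ratios.min(), ratios.max())   # all ratios close to 1, roughly 0.8-1.2
```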

86 We have just seen what a science base for high dimensional data might look like. What other areas do we need to develop a science base for?

87 CAS May 21, 2010  Extracting information from large data sources  Communities in social networks  Tracking the flow of ideas  Ranking is important  Restaurants, movies, books, web pages  A multi-billion-dollar industry  Collaborative filtering  When a customer buys a product, what else is he or she likely to buy?  Dimension reduction

88 Sparse vectors CAS May 21, 2010 There are a number of situations where sparse vectors are important:  Tracking the flow of ideas in scientific literature  Biological applications  Signal processing

89 Sparse vectors in biology CAS May 21, 2010 [Figure, plants: genotype = internal code; phenotype = observables, the outward manifestation]

90 Signal processing: reconstruction of a sparse vector CAS May 21, 2010 [Figure: y = A x, where A is a random orthonormal matrix (e.g., a Fourier transform) and x is sparse, e.g. x = (0, 13, 0, 9, 0, ...)]

91 CAS May 21, 2010 If x is a sparse vector, one can recover x from a few random samples of y. Let ŷ be the random samples of y. The set of x consistent with ŷ is an affine space V, and the sparse solution can be found by minimizing the 1-norm over V.
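
A minimal sketch of this 1-norm recovery as a linear program (assuming numpy and scipy; a Gaussian measurement matrix stands in for the random Fourier samples on the slide):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n, m, s = 100, 30, 4          # signal length, number of samples, sparsity

x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = A @ x                                      # m << n random samples

# Recover x as the minimum 1-norm point of the affine space {x : Ax = y}.
# Write x = u - v with u, v >= 0 and minimize sum(u) + sum(v) as an LP.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x)))   # essentially zero: exact recovery
```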

92 A vector and its Fourier transform cannot both be sparse. CAS May 21, 2010

93 Lemma: Let n_x be the number of non-zero elements of x and let n_y be the number of non-zero elements of its transform y. Then n_x · n_y ≥ n. Proof sketch: Let x have s non-zero elements. Let Z_s be the submatrix of the transform matrix Z consisting of the columns corresponding to the non-zero elements of x and any s consecutive rows of Z. Z_s is non-singular (it is essentially a Vandermonde matrix), so no s consecutive entries of y can all be zero; hence y has at least n/s non-zero entries.

94 CAS May 21, 2010

95 Lemma: The Vandermonde matrix is non-singular. Proof: its determinant is the product of (a_j − a_i) over all pairs i < j, which is non-zero when the a_i are distinct.

96 One can construct a polynomial of degree n−1 through any n points (with distinct x-coordinates). CAS May 21, 2010
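
A short check of this fact via the Vandermonde system (the four sample points are arbitrary):

```python
import numpy as np

# The Vandermonde system for n points (a_i, b_i) is nonsingular whenever
# the a_i are distinct, so the degree-(n-1) interpolating polynomial
# exists and is unique.
a = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 3.0, 2.0, 5.0])

V = np.vander(a, increasing=True)   # V[i, j] = a_i ** j
coeffs = np.linalg.solve(V, b)      # unique: det V = prod (a_j - a_i) != 0

print(np.polyval(coeffs[::-1], a))  # reproduces b exactly
```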

97

98 Representation in a composite basis: a vector cannot have two distinct sparse representations.

99 Data streams  Record the WWW on your laptop  Census data  Find the number of distinct elements in a sequence  Find all frequently occurring elements in a sequence CAS May 21, 2010

100 Multiplication of two very large sparse matrices: C = A·B. Sample columns of A and the corresponding rows of B. What if most rows of A are very light and a few are very heavy? Sample rows with probability proportional to the square of their weight.
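
A sketch of this length-squared sampling idea (the function and the heavy-column example are illustrative; B = A^T is used so heavy columns of A line up with heavy rows of B):

```python
import numpy as np

def sampled_matmul(A, B, k, p, rng):
    """Unbiased estimate of A @ B from k sampled column/row outer
    products: index i is drawn with probability p[i], and the outer
    product of column i of A with row i of B is rescaled by 1/(k p[i])."""
    idx = rng.choice(A.shape[1], size=k, p=p)
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in idx:
        C += np.outer(A[:, i], B[i, :]) / (k * p[i])
    return C

rng = np.random.default_rng(4)
A = rng.standard_normal((200, 1000))
A[:, :5] *= 20                        # a few very heavy columns
B = A.T
exact = A @ B

norms2 = np.linalg.norm(A, axis=0) ** 2
p_len2 = norms2 / norms2.sum()        # length-squared sampling
p_unif = np.full(1000, 1 / 1000)      # uniform sampling, for comparison

for p in (p_len2, p_unif):
    approx = sampled_matmul(A, B, k=300, p=p, rng=rng)
    print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
# length-squared sampling gives a much smaller error than uniform
```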

101 CAS May 21, 2010 Random graphs: the emergence of the giant component in the G(n,p) model. To compute the size of a connected component of G(n,p), search the component starting from a random vertex, generating an edge only when the search process needs to know whether it exists.

102 CAS May 21, 2010 Algorithm: Mark the start vertex discovered. Repeatedly select an unexplored vertex from the set of discovered vertices, add all vertices connected to it to the set of discovered vertices, and mark the vertex explored. Discovered but unexplored vertices are called the frontier. The process terminates when the frontier becomes empty.
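
A runnable sketch of this search with lazy edge generation (assuming numpy; n and the constants c are chosen small so it runs quickly):

```python
import numpy as np

def component_size(n, p, rng):
    """Size of the component of a random start vertex in G(n, p).
    The coin for a potential edge is flipped only when the search first
    needs it; edges among already-discovered vertices are never flipped,
    which does not affect the component size."""
    undiscovered = np.ones(n, dtype=bool)
    start = int(rng.integers(n))
    undiscovered[start] = False
    frontier = [start]                 # discovered but unexplored
    size = 1
    while frontier:
        frontier.pop()                 # explore one frontier vertex
        candidates = np.flatnonzero(undiscovered)
        newly = candidates[rng.random(candidates.size) < p]
        undiscovered[newly] = False
        frontier.extend(newly.tolist())
        size += newly.size
    return size

rng = np.random.default_rng(5)
n = 10_000
for c in (0.5, 1.5):                   # p = c/n: below vs. above threshold
    print(c, [component_size(n, c / n, rng) for _ in range(5)])
# c = 0.5: all components tiny; c = 1.5: most searches hit a giant component
```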

103 CAS May 21, 2010 Let z_i be the number of vertices discovered in the first i steps of the search. The distribution of z_i is binomial: the probability that a given vertex is not discovered in a given step is (1−p), so it is not discovered in i steps with probability (1−p)^i and is discovered within i steps with probability 1−(1−p)^i. Hence z_i is Binomial(n−1, 1−(1−p)^i).

104 CAS May 21, 2010

105 This argument used average values. Can we make it rigorous?

106 CAS May 21, 2010 Belief propagation: graphical models / Markov random fields

107 CAS May 21, 2010 Factor graph

108 CAS May 21, 2010 If the factor graph is a tree: send messages up the tree; this computes the correct marginals.
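
A compact sum-product sketch on a tiny tree (the tree, the vertex potentials phi, and the edge potential psi are invented for illustration; on a tree the BP marginals agree with brute-force enumeration):

```python
import numpy as np
from itertools import product

# Pairwise MRF on a tree: p(x) ~ prod_v phi[v][x_v] * prod_(u,v) psi[x_u, x_v]
edges = [(0, 1), (1, 2), (1, 3)]          # a small star-shaped tree
n = 4
phi = np.array([[1.0, 2.0], [1.0, 1.0], [3.0, 1.0], [1.0, 1.0]])
psi = np.array([[2.0, 1.0], [1.0, 2.0]])  # attractive coupling

nbrs = {v: [] for v in range(n)}
for u, v in edges:
    nbrs[u].append(v)
    nbrs[v].append(u)

def message(u, v):
    """Message from u to v: sum over x_u of phi[u], psi, and messages into u."""
    m = phi[u].copy()
    for w in nbrs[u]:
        if w != v:
            m *= message(w, u)
    return psi.T @ m        # entry [x_v] = sum_{x_u} psi[x_u, x_v] * m[x_u]

def marginal(v):
    b = phi[v].copy()
    for w in nbrs[v]:
        b *= message(w, v)
    return b / b.sum()

# Brute-force check: on a tree, the BP marginals are exact.
joint = np.zeros((2,) * n)
for x in product((0, 1), repeat=n):
    p = np.prod([phi[v][x[v]] for v in range(n)])
    for u, v in edges:
        p *= psi[x[u], x[v]]
    joint[x] = p
joint /= joint.sum()

for v in range(n):
    print(marginal(v), joint.sum(axis=tuple(i for i in range(n) if i != v)))
```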

109 If the factor graph is not a tree, the message-passing algorithm can still be used, but there is no guarantee of convergence and no guarantee of correct marginals even if the algorithm converges. [Figure: grid of + spins with one spin marked ?] Important issue: correlation of variables. CAS May 21, 2010

110 Phase transition in the correlation between the spins at the leaves and the spin at the root. [Figure: tree with parameters p and q]

111 CAS May 21, 2010 High temperature Low temperature

112 CAS May 21, 2010 High temperature Low Temperature

113 CAS May 21, 2010 High temperature Low Temperature Phase transition

114 CAS May 21, 2010 Learning theory and VC-dimension [Figure: census data, 250 million (age, salary) data points] Theorem: If a shape has finite VC-dimension, an estimate based on a random sample will not deviate from the true value by much.
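
A small illustration of the theorem's content using axis-aligned rectangles (VC dimension 4) over synthetic census-like data (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(6)

# "Population": many (age, salary) points; query shape: a rectangle,
# e.g. ages 30-40 with salary 50k-80k.
N = 5_000_000
age = rng.uniform(18, 80, N)
salary = rng.lognormal(mean=10.8, sigma=0.5, size=N)

def frac_in_rect(a, s, a0, a1, s0, s1):
    """Fraction of points falling inside the axis-aligned rectangle."""
    return np.mean((a >= a0) & (a <= a1) & (s >= s0) & (s <= s1))

true = frac_in_rect(age, salary, 30, 40, 50_000, 80_000)

# A small random sample gives nearly the same answer, and the finite
# VC dimension means this holds for all rectangles simultaneously.
idx = rng.choice(N, size=2_000, replace=False)
est = frac_in_rect(age[idx], salary[idx], 30, 40, 50_000, 80_000)
print(true, est)   # the two fractions differ by only a fraction of a percent
```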

115 CAS May 21, 2010 Discrepancy theory

116 Conclusions  We are in an exciting time of change.  Information technology is a big driver of that change.  Computer science theory needs to be developed to support this information age. CAS May 21, 2010

