Presentation on theme: "Ld,p ≡ (X−p)∘d = X∘d − p∘d. And letting Ld ≡ X∘d, Ld,p = Ld − p∘d" — Presentation transcript:

1 Ld,p  (X-p)od = Xod - pod And letting Ld  Xod, Ld,p = Ld - pod
FAUST Analytics. X = X(X1..Xn) ⊆ Rn, |X| = N. If X is a classified training set with classes C = {C1..CK}, then X = X(X1..Xn, C). d = (d1..dn), p = (p1..pn) ∈ Rn. Functionals F: Rn → R, F = L, S, R (in terms of bit columns, compressed or not, of mappings from a PTS to a SPTS).
Ld,p ≡ (X−p)∘d = X∘d − p∘d. And letting Ld ≡ X∘d, Ld,p = Ld − p∘d.
Sp ≡ (X−p)∘(X−p) = X∘X − 2X∘p + p∘p = L−2p + X∘X + p∘p
Rd,p ≡ Sp − L²d,p = X∘X + L−2p + p∘p − (Ld)² + 2(p∘d)(X∘d) − (p∘d)² = L−2p+(2p∘d)d + X∘X + p∘p − (Ld)² − (p∘d)²
Fmind,p,k ≡ min(Fd,p & Ck) = min Fd,p,k, where Fd,p,k = Fd,p & Ck. Fmaxd,p,k ≡ max(Fd,p & Ck) = max Fd,p,k.
FPCCd,p,k,j ≡ jth precipitous count change (left-to-right) of Fd,p,k. Same notation for PCIs and PCDs (precipitous count increases/decreases).
GAP (Gap Clusterer): if the DensityThreshold, DT, isn't reached, cut C mid-gap of Ld,p & C using the next (d,p) from dpSet.
PCC (Precipitous Count Change Clusterer): if DT isn't reached, cut C at the PCCs of Ld,p & C using the next (d,p) from dpSet. A fusion step may be required? Use density, proximity, or Pillar pkMeans (next slide).
TKO (Top K Outlier Detector): use rankn−1 Sx for the TopKOutlier slider.
LIN (Linear Classifier): y ∈ Ck iff y ∈ LHk ≡ {z | minLd,p,k ≤ Ld,p,k(z) ≤ maxLd,p,k ∀(d,p) ∈ dpSet}. LHk is a linear hull around Ck. dpSet is a set of (d,p) pairs, e.g., (Diag, DiagStartPt).
LSR (Linear Spherical Radial Classifier): y ∈ Ck iff y ∈ LSRHk ≡ {z | minFd,p,k ≤ Fd,p,k(z) ≤ maxFd,p,k ∀(d,p) ∈ dpSet, F = L, S, R}.
X∘X can be pre-computed, one time. What should we pre-compute besides X∘X? stats (min/avg/max/std); X∘p; p = class avg/median; X∘d; X∘x; d²(X,x); rank-k d²(X,x); Ld,p; Rd,p.
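The three functionals above can be sketched directly in NumPy. This is a minimal illustration of the definitions (L as the dot-product projection, S as squared distance to p, R as the spherical residual Sp − L²d,p), using toy data rather than a vertical/bit-column PTS representation:

```python
import numpy as np

# Sketch of the FAUST functionals L, S, R for a dataset X (rows = points),
# a direction d and a reference point p.  Names follow the slide's notation;
# the toy data is illustrative only.

def L(X, d, p):
    # L_{d,p} = (X - p) o d = X o d - p o d  (projection onto direction d)
    return X @ d - p @ d

def S(X, p):
    # S_p = (X - p) o (X - p)  (squared distance of each row to p)
    diff = X - p
    return np.einsum('ij,ij->i', diff, diff)

def R(X, d, p):
    # R_{d,p} = S_p - L_{d,p}^2  (squared radial distance from the
    # line through p in direction d, when d is a unit vector)
    return S(X, p) - L(X, d, p) ** 2

X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 1.0]])
d = np.array([1.0, 0.0])   # unit direction
p = np.array([0.0, 0.0])
print(L(X, d, p))          # projections: [1. 3. 5.]
print(S(X, p))             # squared lengths: [ 5. 25. 26.]
print(R(X, d, p))          # radial residuals: [ 4. 16.  1.]
```

As the slide notes, X∘X (here `S(X, 0)`) depends on neither d nor p, so it can be computed once and reused across all (d, p) pairs in dpSet.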

2 William Perrizo: Attached are today's notes
William Perrizo: Attached are today's notes. I hope to get some new Research Assistants in place (so far Maninder Singh) soon to do the following: 1. Sign a Treeminer NDA. 2. Get access to and learn to use the Treeminer development environment (Eclipse, Java, Git) to code up all the new algorithms we are looking at. 3. Test the new algorithms using the Treeminer real-world datasets. I hope to get these RAs going soon. Others can join this project group (e.g., Rajat Singh). The payoff will be great if this happens to a substantial degree. It will take a concentrated effort and maybe a little assistance (help-desk type) from Treeminer?
Mark Silverman, Tue 11/25/2014 5:26 PM: FYI, some updated results in text classification… a clear indication that FAUST in and of itself is capable of producing results as accurate as anything out there. This is a famous dataset (7,500 docs). Plain-Jane FAUST got 80%; extra boost by: eliminating any term that appears in 2 documents or fewer in the training set; using chi-squared to reduce attributes a further 20% (e.g., pick the 80% most important attributes from the training set). A key advantage we have is that by processing vertically, we can toss attributes easily before we expend a lot of CPU on them. If we can toss them intelligently, we can improve the accuracy of the results, as well as reduce classification time! In this case, we eliminated about 70% of the attributes from the test dataset and achieved better accuracy than the classifiers referenced on the Stanford Natural Language Processing site!! We're exploring other approaches to further identify the critical attributes. About to turn this loose on datasets approaching 1 TB in size.
William Perrizo, Wed 11/26/2014 8:10 AM: Great news!
If the classification setting is such that every test sample goes in some class (i.e., no "other" or "no class" samples), then FAUST Oblique using the midpoint of each gap as the cut point should be the best approach (i.e., the original FAUST Oblique). If the dataset has test samples which do not go in any of the classes (which is always the case when doing "one class" classification, for example), then by making two cuts for every gap, one at the beginning of the gap and the other at the end of the gap, we produce a "piecewise linear hull" around each class and thereby accommodate samples that do not belong to any class (namely those test samples that fall in the interior of a gap). That's really the only difference between the older FAUST Oblique (cutpoint = gap midpoint) method and the newer FAUST Oblique Hull method.
Mark Silverman, Wed 11/26/2014 9:06 AM: We are adjusting the midpoint as well, based on cluster deviation; this gives us an extra 4 percentage points or so of accuracy over the straight midpoint. The hull is an interesting case, as we are looking at situations like this: we are already able to predict which members are poor matches to a class. I will look more closely at that; this is a very interesting and very important case (multiclass even).
William Perrizo, Wed 11/26/: Yes, we have also discovered that one has to think about the quality of the training set. If it is very high quality (expected to fully encompass all borderline cases of all classes), then using exact gap endpoints is probably wise; but if there is reason to worry about the comprehensiveness of the training set (e.g., when there are very few training samples, which is often the case in medical expert systems where getting a sufficient number of training samples is difficult and expensive), then it is probably better to move the cutpoints toward the midpoint (reflecting the vagueness of training-set class boundaries). What does one use to decide how much to move away from the endpoints?
That's not an easy question to answer. Cluster deviation seems like a useful measure to employ. One last thought on how to decide whether to cut at the gap midpoint, at the endpoints, or to move the cut-points away from the endpoints toward the midpoint: if one has a time-stamp on training samples, one might assess the "class endpoint" change rate over time. As the training set gets larger and larger, if an endpoint stops moving much and isn't an outlier, then cutting at the endpoint seems wise. However, if an endpoint is still changing a lot, then moving away from that endpoint seems wise (maybe based on the rate of change of that endpoint, as well as other measures?).
Mark Silverman, Wed 11/26/: A related point: dominant attributes may exist in only some classes; this must be factored in when ascribing weight/value to an attribute.
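The cut-point policies discussed in this exchange (midpoint cut vs. two endpoint cuts per gap, optionally pulled toward the midpoint) can be sketched as follows. `gap_cuts` is a hypothetical helper name, not code from these notes; it assumes the two classes' 1-D projections (Ld,p values) are separated by a gap:

```python
import numpy as np

def gap_cuts(proj_a, proj_b, policy="midpoint", pull=0.0):
    """Place cut point(s) in the gap between two classes' 1-D projections.
    policy='midpoint' is the original FAUST Oblique; policy='hull' cuts at
    both gap endpoints (piecewise-linear hull).  0 <= pull <= 1 moves the
    endpoint cuts toward the midpoint (for low-quality training sets)."""
    lo, hi = max(proj_a), min(proj_b)   # assume class A lies left of class B
    if lo >= hi:
        return None                      # projections overlap: no gap to cut
    mid = (lo + hi) / 2
    if policy == "midpoint":
        return (mid,)
    # endpoint cuts, optionally pulled toward the midpoint
    return (lo + pull * (mid - lo), hi - pull * (hi - mid))

a = np.array([1.0, 2.0, 3.0])                    # class A projections
b = np.array([7.0, 8.0, 9.0])                    # class B projections
print(gap_cuts(a, b))                            # -> (5.0,)
print(gap_cuts(a, b, policy="hull"))             # -> (3.0, 7.0)
print(gap_cuts(a, b, policy="hull", pull=0.5))   # -> (4.0, 6.0)
```

The `pull` parameter is one way to encode the trade-off discussed above: `pull=0` trusts the training-set boundaries exactly; `pull=1` collapses to the original midpoint cut.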

3 Graph theory (Wikipedia)
A hyperedge is an edge with any number of vertices. A simple graph is a special case of the hypergraph (a 2-uniform hypergraph). Without qualification, an edge is assumed to consist of at most 2 vertices, and a graph is never confused with a hypergraph. A complete subgraph is a clique. A maximal clique is not a proper subset of any other clique. The clique number is the order of a largest clique in G. A graph is connected if there exists a path between any 2 vertices; otherwise, the graph is disconnected. A cut set (vertex cut, separating set) is a set of vertices whose removal disconnects the remaining subgraph. A bridge set is an analogous edge set. If a path exists between any 2 vertices even after removing any k−1 vertices, G is k-connected (iff it has k internally disjoint paths between any 2 vertices). The vertex (edge) connectivity of a graph G is the minimum number of vertices (edges) that need to be removed to disconnect G. The set of neighbors of v, that is, vertices adjacent to v not including v itself, is called the (open) neighborhood of v and denoted NG(v). When v is also included, it is called the closed neighborhood and denoted NG[v]. A graph with n vertices can be represented by its adjacency matrix: an n-by-n matrix whose entry in row i, column j is the number of edges from vertex i to vertex j. A graph with two disjoint vertex sets such that every edge runs from one set to the other is bipartite; 3 sets, tripartite; k sets, k-partite; multipartite. A complete multipartite graph is a graph in which vertices are adjacent if and only if they belong to different partite sets. A complete bipartite graph is also referred to as a biclique; if its partite sets contain n and m vertices, respectively, then the graph is denoted Kn,m. Let G = (X, Y, E) be a bipartite graph. A biclique (Sx, Sy) is a complete bipartite subgraph induced by the bipartite vertex set (Sx, Sy). The Consensus Set of Sx, Py(Sx) = ∩x∈Sx N(x), is the set of all y's that are adjacent (edge-connected) to every x in Sx.
Thm1: (Sx, Sy) is a maximal biclique iff Sy = Py(Sx) and Sx = Px(Sy).
Thm2: ∀ Sy ⊆ Y s.t. Px(Sy) ≠ ∅, (Px(Sy), Py(Px(Sy))) is maximal; and ∀ Sx ⊆ X s.t. Py(Sx) ≠ ∅, (Px(Py(Sx)), Py(Sx)) is maximal.
Find all bicliques starting with Sy = singletons. Then examine doubletons Sy1y2 s.t. Px(Sy1y2) ≠ ∅, i.e., N(y1) ∩ N(y2) ≠ ∅. Then examine tripletons Sy1y2y3 s.t. Px(Sy1y2y3) ≠ ∅, i.e., Px(Syiyj) ≠ ∅ ∀ i<j and Px(Syiyj) ∩ N(yk) ≠ ∅ for k not i or j. Then examine quadrupletons Sy1y2y3y4 s.t. Px(Sy1y2y3y4) ≠ ∅, i.e., Px(Syiyjyk) ≠ ∅ ∀ i<j<k and Px(Syiyjyk) ∩ N(yh) ≠ ∅ for h not i, j, or k... Will this find all bicliques, or do we need to also reverse x and y?
Examining MGRs (x = docs, y = words): all singleton wordsets, Sy, form a nonempty biclique. AND pairwise to find all nonempty doubleton-wordset bicliques, Sy1y2. AND those nonempty doubleton wordsets with each other singleton wordset to find all nonempty tripleton-wordset bicliques, Sy1y2y3... Start with singleton docs and include another... until the empty set. The last nonempty set is a max-biclique, and all its subsets are bicliques, so we can remove all of them and iterate.
[Slide table of example doc/word bicliques (w2, w25, w49, w52, w57, ... per doc) omitted: figure residue.]
So there are 12 non-single-word bicliques. Note also that the single-word bicliques are not necessarily the entire single-word query result either.
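The bottom-up enumeration described above (start from singleton wordsets, AND neighborhoods pairwise, then extend level by level, keeping only subsets with nonempty consensus) can be sketched with sets standing in for the vertical bit columns. The toy word→doc incidence data is illustrative, not the slide's dataset:

```python
from itertools import combinations

# Toy incidence: word -> set of docs containing it (a vertical column view).
N = {
    "w1": {1, 2, 3},
    "w2": {1, 2},
    "w3": {2, 3},
    "w4": {4},
}

def consensus(words):
    # Px(Sy) = intersection of N(y) over y in Sy: docs adjacent to every word.
    return set.intersection(*(N[w] for w in words))

def bicliques_up_to(size):
    # Level-wise search: every word subset with nonempty consensus doc set
    # induces a biclique (consensus docs, words).
    found = []
    for k in range(1, size + 1):
        for words in combinations(sorted(N), k):
            docs = consensus(words)
            if docs:                      # prune empty-consensus subsets
                found.append((frozenset(docs), frozenset(words)))
    return found

for docs, words in bicliques_up_to(2):
    print(sorted(words), "->", sorted(docs))
```

A real implementation would add the Thm2 closure step (take Py(Px(Sy)) to test maximality) and the apriori-style pruning the slide describes, where a k-subset is examined only if all its (k−1)-subsets had nonempty consensus.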

4 FAUST Clustering 1 (2^-1 separates 7, 50; 2^-2 separates the .27s)
L-Gap Clusterer: cut C mid-gap (of F&C) using the next (d,p) from dpSet, where F = L | S | R.
D=d35: [column of x∘D values for d1..d50 omitted: slide-table residue] 35, 7, 50 are outliers.
D=.27s: [value column omitted] {28, 30, 39, 41, 46} is a cluster.
D=.64s: [value column omitted] The 0's, .25s, and .51s are clusters; d10, d11, d17, d21 are outliers.
Going back to D=d35, how close does HOB come?
21, 20 separate 35.
C1 (.17 ≤ x∘d ≤ .25) = {2,3,6,16,18,22,42,43,49}
C2 (.34 ≤ x∘d ≤ .56) = {1,4,5,8,9,12,14,15,23,25,27,32,33,36,37,38,44,45,47,48}
C3 (.64 ≤ x∘d ≤ .86) = {10,11,13,17,21,26,28,29,30,39,41,50}
Singletons: 46 (x∘d=.99); 7 (=1.16); 35 (=1.47)
Next, on each Ck try D = sum of all Ck docs, Thres = .2:
D = sum of all C1 docs: C11 (x∘d=.42) = {2,3,16,22,42,43}; 6, 18, 49 are outliers.
D = sum of all C3 docs: C31 (.56 ≤ x∘d ≤ 1.03) = {10,11,13,17,21,26,28,29,30,41,50}; 39 is an outlier.
D = sum of all C31 docs: C311 (≤ .63) = {11,17,29}; C312 (.84) = {13,30,50}; C313 (.95) = {10,26,28,41}; 21 is an outlier.
[Columns of per-doc x∘D values (D=44docs, GT=.08, etc.) omitted: slide-table residue.] Other clustering methods later.
C11: 2. This little pig went to market. This little pig stayed at home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home. 3. Diddle diddle dumpling, my son John. Went to bed with his breeches on, one stocking off, and one stocking on. Diddle diddle dumpling, my son John. 16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string.
If you'll tell me this riddle, I will give you a ring. 22. Had a little husband no bigger than my thumb. I put him in a pint pot, and there I bid him drum. I bought a little handkerchief to wipe his little nose and a little garters to tie his little hose. 42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken. 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. C2: 1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 5. Humpty Dumpty sat on a wall. Humpty Dumpty had a great fall. All the Kings horses, and all the Kings men cannot put Humpty Dumpty together again. 8. Jack Sprat could eat no fat. His wife could eat no lean. And so between them both they licked the platter clean. 9. Hush baby. Daddy is near. Mamma is a lady and that is very clear. 12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France. 14. If all seas were one sea, what a great sea that would be! And if all the trees were one tree, what a great tree that would be! And if all the axes were one axe, what a great axe that would be! And if all the men were one man what a great man he would be! And if the great man took the great axe and cut down the great tree and let it fall into great sea, what a splish splash it would be! 15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho! 23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? Yes, and back again. 
If your heels are nimble and light, you may get there by candle light. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning. 38. If I had as much money as I could tell, I never would cry young lambs to sell. Young lambs to sell, young lambs to sell. I never would cry young lambs to sell. 44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will. 47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. For early to bed and early to rise, is the way to be healthy and wealthy and wise. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten, a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen, maids a waiting. Nineteen twenty, my plate is empty.
When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs. C312: 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon. 50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I! C313: 10. Jack and Jill went up the hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper. 26. Sleep baby sleep. Our cottage valley is deep. The little lamb is on the green with woolly fleece so soft and clean. Sleep baby sleep. Sleep baby sleep, down where the woodbines creep. Be always like the lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three.
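The recursive procedure traced on this slide (project every doc onto a direction D, sort the x∘D values, split wherever a gap exceeds the threshold, then recurse on each sub-cluster with a new D) can be sketched in a few lines. `gap_split` is a hypothetical helper name and the projection values are toy data, not the nursery-rhyme figures above:

```python
import numpy as np

def gap_split(vals, gap_thresh):
    """One L-gap clustering step: sort the 1-D projections x o D and cut
    mid-gap wherever consecutive sorted values differ by more than
    gap_thresh.  Returns a list of index clusters (singletons ~ outliers)."""
    order = np.argsort(vals)
    sv = vals[order]
    clusters, current = [], [order[0]]
    for i in range(1, len(sv)):
        if sv[i] - sv[i - 1] > gap_thresh:   # precipitous gap: cut here
            clusters.append(current)
            current = []
        current.append(order[i])
    clusters.append(current)
    return clusters

proj = np.array([0.17, 0.21, 0.25, 0.55, 0.56, 0.99, 1.47])  # toy x o D values
for c in gap_split(proj, gap_thresh=0.2):
    print(sorted(c))
```

Recursing would then re-project each non-singleton cluster onto a new D (e.g., the sum of that cluster's docs, as the slide does) until the density threshold is met.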

5 FAUST Cluster 1.2 real HOB Alternate WS0, DS0
DS1 | WS1 = 46 | DS2 = 46. OUTLIER: 46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street.
DS0 | WS1 = 35 | DS2 = 35. OUTLIER: 35. Sing a song of sixpence, a pocket full of rye. 4 and 20 blackbirds, baked in a pie. When the pie was opened, the birds began to sing. Was not that a dainty dish to set before the king? The king was in his counting house, counting out his money. Queen was in the parlor, eating bread and honey. The maid was in the garden, hanging out the clothes. When down came a blackbird and snapped off her nose.
WS0 = DS1 | WS1 = 42 (Mother) | DS2 = {7, 9, 27, 29, 45} | WS2 = WS1. C1: Mother theme. 7. Old Mother Hubbard went to the cupboard to give her poor dog a bone. When she got there cupboard was bare and so the poor dog had none. She went to baker to buy him some bread. When she came back dog was dead. 9. Hush baby. Daddy is near. Mamma is a lady and that is very clear. 27. Cry baby cry. Put your finger in your eye and tell your mother it was not I. 29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs. 45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in.
DS0 | WS1 = 10 | DS2 | WS3 | DS3 = 10. OUTLIER: 10. Jack and Jill went up hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper.
WS1 = {fiddle(32, 41), man(11, 32), old(11, 44)} | DS2 = {11, 32, 41}. C2: fiddle old man theme. 11. One misty moisty morning when cloudy was weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do How do you do? How do you do again 32.
Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I'd give my fiddle they will think I've gone mad. For many a joyous day my fiddle and I have had 41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three. DS0| WS1= 1 | DS1|WS2= 13 | 39 |DS2 14 | |39 OUTLIER: 39. A little cock sparrow sat on a green tree. He chirped and chirped, so merry was he. A naughty boy with his bow and arrow, determined to shoot this little cock sparrow. This little cock sparrow shall make me a stew, and his giblets shall make me a little pie. Oh no, says the sparrow I will not make a stew. So he flapped his wings\,away he flew WS DS1 WS1= 38 52 5 17 23 28 36 48 C3: men three 1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 5. Humpty Dumpty sat on a wall. Humpty Dumpty had a great fall. All the Kings horses, and all the Kings men cannot put Humpty Dumpty together again. 14. If all the seas were one sea, what a great sea that would be! And if all the trees were one tree, what a great tree that would be! And if all the axes were one axe, what a great axe that would be! And if all the men were one man what a great man he would be! And if the great man took the great axe and cut down the great tree and let it fall into the great sea, what a splish splash that would be! 17. Here sits the Lord Mayor. Here sit his two men. Here sits the cock. Here sits the hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! 23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? 
Yes, and back again. If your heels are nimble and light, you may get there by candle light. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten. a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen. maids a waiting. Nineteen twenty, my plate is empty. DS0|WS 13 |DS2|WS 14 |13 |DS3 13 OUTLIER: 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. C4: 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. 8. Jack Sprat could eat no fat. Wife could eat no lean. Between them both they licked platter clean. 12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France. 15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho! 18. I had two pigeons bright and gay. They flew from me the other day. What was the reason they did go? I can not tell, for I do not k 21. Lion and Unicorn were fighting for crown. Lion beat Unicorn all around town. Some gave them white bread and some gave them brown. Some gave them plum cake, and sent them out of town. 25. There was an old woman, and what do you think? She lived upon nothing but victuals, and drink. 
Victuals and drink were the chief of her diet, and yet this old woman could never be quiet. 26. Sleep baby sleep. Our cottage valley is deep.Little lamb is on green with woolly fleece so soft, clean. Sleep baby sleep. Sleep baby sleep, down where woodbines creep. Be always like lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon. 33. Buttons, a farthing a pair! Come, who will buy them of me? They are round and sound and pretty and fit for girls of the city. Come, who will buy them of me? Buttons, a farthing a pair! 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning. 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. 44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will. 47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. For early to bed and early to rise, is the way to be healthy and wealthy and wise. 49. There was a little girl who had a little curl right in the middle of her forehead. When she was good she was very very good and when she was bad she was horrid. 50. Little Jack Horner sat in the corner, eating of Christmas pie. 
He put in his thumb and pulled out a plum and said What a good boy am I! WS0 = DS1 | WS1 (17 wds) = 4 6 8 | DS2 = DS1. DS0 | WS1 = 2 3 | DS2 = DS1. Each of the 10 words occurs in 1 doc, so all 5 docs are outliers. OUTLIERS: 2. This little pig went to market. This little pig stayed home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home 3. Diddle diddle dumpling, my son John. Went to bed with his breeches on, one stocking off, and one stocking on. Diddle diddle dumpling, my son John. 16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string. If you'll tell me this riddle, I will give you a ring. 22. Had little husband no bigger than my thumb. Put him in a pint pot, there I bid him drum. Bought a little handkerchief to wipe his little nose, pair of little garters to tie little hose 42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken. OUTLIER: 38. If I had as much money as I could tell, I never would cry young lambs to sell. Young lambs to sell, young lambs to sell. I never would cry young lambs to sell. Notes: Using HOB, the final WordSet is the document cluster theme! When the theme is too long to be meaningful (C4) we can recurse on those (using the opposite DS0/WS0?). The other thing we can note is that DS0 almost always gave us outliers (except for C5) and WS0 almost always gave us clusters (except for the first one, 46). What happens if we reverse it? What happens if we just use WS0?
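The alternating WS0/DS0 procedure traced on this slide (from a seed word set, take the docs containing every word, then the words common to every such doc, and repeat until nothing changes) can be sketched as a fixed-point iteration. The toy doc→word incidence below is illustrative, not the nursery-rhyme word table, and `alternate` is a hypothetical name:

```python
# Sketch of the alternating doc-set / word-set (DS/WS) refinement:
# the fixed point yields a document cluster and its theme word set.

DOC_WORDS = {
    7:  {"mother", "dog", "bread"},
    9:  {"mother", "baby"},
    27: {"mother", "baby", "cry"},
    29: {"mother", "bed"},
    10: {"jack", "jill"},
}

def docs_of(words):
    # DS step: docs whose word set contains every word in `words`
    return {d for d, ws in DOC_WORDS.items() if words <= ws}

def words_of(docs):
    # WS step: words shared by every doc in `docs`
    return set.intersection(*(DOC_WORDS[d] for d in docs)) if docs else set()

def alternate(seed_words):
    ws = set(seed_words)
    while True:
        ds = docs_of(ws)
        nxt = words_of(ds)
        if nxt == ws:
            return ds, ws        # fixed point: (doc cluster, theme words)
        ws = nxt

ds, ws = alternate({"mother"})
print(sorted(ds), sorted(ws))    # the "Mother theme" cluster and its theme
```

As the slide observes, when the fixed-point word set is a single doc's vocabulary, the doc is an outlier; when it is shared, it is the cluster's theme.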

6 FAUST Cluster 1.2.1 real HOB Alternate WS0, DS0, recursing on C3 and C4
C: 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning. 47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. For early to bed and early to rise, is the way to be healthy and wealthy and wise. DS0|WS1= (on C4) 21|DS2 WS2=41(morn) 57(way) 26| 37 DS3=DS2 30| 47 . C4.2.1 word47(plum) 21. Lion & Unicorn were fighting for crown. Lion beat Unicorn all around town. Some gave them white bread and some gave them brown. Some gave them plum cake sent them out of town. 50. Little Jack Horner sat in corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I! DS0|WS1= 47 (plum) 21 DS2 WS2=WS1 26 21 30 50 50 WS0= DS1|WS1= 4 DS2 WS2= DS3 WS3=WS2 49 50 Final WordSet is too long. Recurse 4.2 WS0= DS1|WS1 = 4 |DS2 WS2= 8 |4 DS3 WS3= 12 |8 8 DS4 WS4= 25 | DS5 WS5 44 59 26 | DS6=DS5 30 | C word44(old) word59(woman) 12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France. 25. There was old woman. What do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet. Doc26 and doc30 have none of the 12 words in common, so these two will come out as outliers on the next recursion! OUTLIERS: 26. Sleep baby sleep.
Cottage valley is deep. Little lamb is on green with woolly fleece soft, clean. Sleep baby sleep. Sleep baby sleep, down where woodbines creep. Be always like lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 30. Hey diddle diddle! Cat and the fiddle. Cow jumped over moon. Little dog laughed to see such sport, and dish ran away with spoon. DS0|WS1= 26 |DS1=DS0 30 OUTLIER: 6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. WS0= DS WS1=5 22 DS2 6 C4.2.3 (day eat girl) 4. Little Miss Muffet sat on tuffet, eating curd, whey. Came big spider, sat down beside her, frightened Miss Muffet away 8. Jack Sprat could eat no fat. Wife could eat no lean. Between them both they licked platter clean. 15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho! 18. I had 2 pigeons bright and gay. They flew from me other day. What was the reason they did go? I can not tell, for I do not know. 33. Buttons, farthing pair! Come who will buy them? They are round, sound, pretty, fit for girls of city. Come, who will buy ? Buttons, farthing a pair 49. There was little girl had little curl right in the middle of her forehead. When she was good she was very good and when she was bad she was horrid. DS0|WS1= 4 DS2 =WS1 8 |4 8 15|15 18 Recursing 18|33 49 no change Doc43 and doc44 have none of the 6 words in common, so these two will come out as outliers on the next recursion! OUTLIERS: 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. 44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will. recurse on C3: C31 [21]cut [38]men [49]run 1. Three blind mice! See how run! All ran after farmer's wife, cut off tails with carving knife. Ever see such thing in life as 3 blind mice? 14.
If all seas were 1 sea, what a great sea that would be! And if all trees were 1 tree, what a great tree that would be! And if all axes were 1 axe, what a great axe that would be! if all men were 1 man what a great man he would be! And if great man took great axe and cut down great tree and let it fall into great sea, what a splish splash that would be! 17. Here sits Lord Mayor. Here sit his 2 men. Here sits the cock. Here sits hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! DS0=|WS1= 1 |DS1 |WS2= 14 |1 |DS3=DS2 17 |14 28 |17 C32: [38]men [52] three 5. Humpty Dumpty sat on wall. Humpty Dumpty had great fall. All Kings horses, all Kings men cannot put Humpty Dumpty together again. 23. How many miles to Babylon? 3 score miles and 10. Can I get there by candle light? Yes, back again. If your heels are nimble, light, you may get there by candle light. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten. a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen. maids a waiting. Nineteen twenty, my plate is empty. WS0=38 52 DS1|WS1=WS0 5 |

7 What do we want in bioinformatics? (cliques, strong clusters, ...???)
FAUST Cluster 1.2.2 HOB Alternate WS0, DS0 16
OUTLIERS: Categorize clusters (hub-spoke, cyclic, chain, disjoint...)? Separate disjoint sub-clusters? Each of the 3 C423 words gives a disjoint cluster! Each of the 2 C32 words gives a disjoint sub-cluster also.
C day
15. Great A. little a. This is pancake day. Toss ball high. Throw ball low. Those come after sing heigh ho!
18. I had 2 pigeons bright and gay. They flew from me other day. What was reason they go? I can not tell, I do not know.
15 18 day
C eat
4. Little Miss Muffet sat on tuffet, eat curd, whey. Came big spider, sat down beside her, frightened away.
8. Jack Sprat could eat no fat. Wife could eat no lean. Between them both they licked platter clean.
4 8 eat
C4233 girl
33. Buttons, farthing pair! Come who will buy them? They are round, sound, pretty, fit for girls of city. Come, who will buy? Buttons, farthing a pair.
49. There was little girl had little curl right in the middle of her forehead. When she was good she was very good and when she was bad she was horrid.
33 49 girl
C1: mother
7. Old Mother Hubbard went to cupboard to give her poor dog a bone. When she got there cupboard was bare, so poor dog had none. She went to baker to buy some bread. When she came back dog was dead.
9. Hush baby. Daddy is near. Mamma is a lady and that is very clear.
27. Cry baby cry. Put your finger in your eye and tell your mother it was not I.
29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs.
45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in.
11 32 41 men fiddle old
C2: fiddle old men {cyclic}
11. One misty moisty morning when cloudy was weather, chanced to meet old man clothed all leather. He began to compliment, I began to grin. How do you do How do? How do again
32. Jack come give me your fiddle, if ever you mean to thrive. 
No I'll not give fiddle to any man alive. If I'd give my fiddle they will think I've gone mad. For many joyous day fiddle and I've had 41. Old King Cole was merry old soul. Merry old soul was he. He called for his pipe, he called for his bowl, he called for his fiddlers 3. And every fiddler, had a fine fiddle, a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three. C11 cut men run {cyclic} 1. Three blind mice! See how run! All ran after farmer's wife, cut off tails with carving knife. Ever see such thing in life as 3 blind mice? 14. If all seas were 1 sea, what a great sea that would be! And if all trees were 1 tree, what a great tree that would be! And if all axes were 1 axe, what a great axe that would be! if all men were 1 man what a great man he would be! And if great man took great axe and cut down great tree and let it fall into great sea, what a splish splash that would be! 17. Here sits Lord Mayor. Here sit his 2 men. Here sits the cock. Here sits hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! 17 1 14 run cut men C321 men 5. Humpty Dumpty sat on wall. Humpty Dumpty had great fall. All Kings horses, all Kings men can't put Humpty together again. 36. Little Tommy Tittlemouse lived in little house. He caught fishes in other mens ditches. 5 36 men C322 three 23. How many miles to Babylon? 3 score 10. Can I get there by candle light? Yes, back again. If your heels are nimble, light, you may get there by candle light. 28. Baa baa black sheep, have any wool? Yes sir yes sir, 3 bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten. a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen. maids a waiting. 
Nineteen twenty, my plate is empty.
23 28 48 three
C4.1 morn way
37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on cold and frosty morn. This is way wash our hands, wash our hands, wash our hands. This is way wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning.
47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. For early to bed and early to rise, is the way to be healthy and wealthy and wise.
37 47 morn way
C421 plum
21. Lion & Unicorn were fighting for crown. Lion beat Unicorn all around town. Some gave them white bread and some gave them brown. Some gave them plum cake sent them out of town.
50. Little Jack Horner sat in corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I!
C422 old woman
12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France.
25. There was old woman. What do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet.
12 25 old woman
Let's pause and ask "What are we after?" Of course it depends upon the client. 3 main categories for relationship mining? Text corpora, market baskets (includes recommenders), bioinformatics? Others?
What do we want from text mining? (anomaly detection, cliques, bicliques?)
What do we want from market basket mining? (future purchase predictions, recommendations...) 
What do we want in bioinformatics? (cliques, strong clusters, ...???)
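The recursive cut step used throughout these slides (cut a cluster mid-gap of its Ld,p projection values) can be sketched in a few lines. This is a minimal illustration, not the Treeminer implementation; the names `gap_cut` and `min_gap` and the NumPy representation of X as a row matrix are my own choices:

```python
import numpy as np

def gap_cut(X, d, p, min_gap):
    """Cut a cluster at the midpoint of every gap wider than min_gap
    in its L_{d,p} projection values, where L = (X - p) . d.
    Returns a list of index arrays, one per sub-cluster."""
    L = (X - p) @ d
    order = np.argsort(L)
    vals = L[order]
    # boundaries where consecutive sorted projections differ by > min_gap
    cuts = np.flatnonzero(np.diff(vals) > min_gap) + 1
    return [order[chunk] for chunk in np.split(np.arange(len(vals)), cuts)]

# two well-separated groups on the x-axis
X = np.array([[0.0, 0.0], [0.2, 0.0], [5.0, 0.0], [5.3, 0.0]])
parts = gap_cut(X, d=np.array([1.0, 0.0]), p=np.zeros(2), min_gap=1.0)
print([sorted(c.tolist()) for c in parts])   # -> [[0, 1], [2, 3]]
```

In the full algorithm this step would recurse on each sub-cluster with the next (d, p) from dpSet until the density threshold DT is reached.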

8 FAUST Cluster 1.2.3 word-labeled document graph
26 always 29 4 away 30 39 46 9 baby 27 45 7 13 23 back 6 49 bad 50 28 boy 35 bake 3 bed 21 bread 18 44 bright 10 brown 33 buy back buy 42 12 child 8 clean 11 cloth 37 crown 47 cry 38 1 14 cut 15 day 32 dish dog 36 eat fall 5 fiddle 41 full girl green high hill 43 house king lady lamb maid men merry money morn way mother 22 nose old 25 pie 17 pig plum town plum 16 bag round cock 2 run sing son 48 three town tree two wife thumb woman
FAUST Cluster 1.2.3 word-labeled document graph
We have captured only a few of the salient sub-graphs. Can we capture more of them? Of course we can capture a sub-graph for each word, but that might be 100,000. Let's stare at what we got and try to see what we might wish we had gotten in addition.
17 1 14 run cut men
48 23 28 three
50 21 plum
36 5 men
49 33 girl
29 9 27 45 7 mother
men 32 fiddle 41 old 11
37 47 morn way
day
A bake-bread sub-corpus would have been strong. (docs{ ) There are many others.
eat 8 4
Using AVG+1 d d d d d
12 25 old woman 

9 HOB2 Alt (use other HOBs)
26 always 29 4 away 30 39 46 9 baby 27 7 13 23 back 6 49 bad 50 28 boy 35 bake 3 bed 21 bread 18 44 bright 10 brown 33 buy back buy 42 12 child 8 clean 11 cloth 37 crown 47 cry 38 1 14 cut 15 day 32 dish dog 36 eat fall 5 fiddle 41 full girl green high hill 43 house king lady lamb maid men merry money morn way mother 22 nose old 25 pie 17 pig plum town plum 16 bag round cock 2 run sing son 48 three town tree two wife thumb woman FAUST Cluster 1.2.4 HOB2 Alt (use other HOBs) wAvg+1, dAvg+1 a b b e p p w o r a i l a y e t e u y a m d d d d d d 35 21 bread plum 50 39 pie boy 46 away eat recurse: wAv+2,dAvg-1 e p a i t e d d d d 39 46 35 50 pie eat And if we want to pull out a particular word cluster, just turn the word-pTree into a list.: 12 child old woman w=boy a b w o a y 2 9 d d d w=baby a b w a b a y 2 3 d d d d For a particular doc cluster, just turn the doc-pTree into a list: 12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France. c o w h l o i d m l a d n d 45 baby 26 9 27 39 50 28 boy
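The slide's remark that pulling out a word or doc cluster is "just turning the pTree into a list" can be shown concretely. A rough sketch, assuming uncompressed pTrees represented as NumPy bit arrays (the helper names and the tiny 'old'/'woman' example are mine, not from the Treeminer code):

```python
import numpy as np

# An (uncompressed) pTree is a bit column: bit i is 1 iff item i
# satisfies the predicate.  Turning a pTree into a list of items is a
# single flatnonzero(); combining predicates is a bitwise AND.

def ptree_to_list(pt):
    """Indices of the items whose bit is set."""
    return np.flatnonzero(pt).tolist()

def ptree_and(*pts):
    """Bitwise AND of several pTrees (items satisfying all predicates)."""
    out = pts[0].copy()
    for pt in pts[1:]:
        out &= pt
    return out

# e.g. a doc-pTree per word: docs containing 'old' AND docs containing 'woman'
old   = np.array([0, 1, 0, 1, 1, 0], dtype=np.uint8)
woman = np.array([0, 1, 0, 0, 1, 1], dtype=np.uint8)
print(ptree_to_list(ptree_and(old, woman)))   # -> [1, 4]
```

The same two operations cover both directions on the slide: a word-pTree over docs gives a doc cluster, a doc-pTree over words gives its word set.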

10 FAUST HULL Classification 1
Using the clustering of FAUST Clustering1 as classes, we extract 80% from each class as TrainingSet (w class=cluster#). How accurate is FAUST Hull Classification on the remaining 20% plus the outliers (which should be "Other"). Use Lpd, Sp, Rpd with p=ClassAvg and d=unitized ClassSum. C11={2,3,16,22,42,43} C311= {11,17,29} C312={13,30,50} C313={10,26,28,41} C2 ={1,4,5,8,9,12,14,15,23,25,27,32,33,36,37,38,44,45,47,48} OUTLIERS {18,49} {6} {39} {21} {46} {7} {35} Full classes from slide: FAUST Clustering1 C11={2,16,22,42,43} C311= {11,17} C312={30,50} C313={10,28,41} C2 ={1,5,8,9,12,15,25,27,32,33,36,37,38,44,47,48} 80% Training Set C11={3} C311= {29} C312={13} C313={26} C2 ={4,14,23,45} O={ } 20% Test Set D1=TS p=avTS Lpd MIN MAX CLASS C11 C2 C311 C312 C313 C313 C11 C2 .572 C311 C312 D11=C11 p=avC11 L MIN MAX CLASS C11 0 .63 C2 C311 C312 0 .31 C313 .63 C11 C311 C313 C2 .31 C312 .66 C311 D2=C2 p=avC2 L MIN MAX CLS 0 .22 C11 C2 C311 C312 C313 C11 C312 C313 C2 D311=C311 p=avC311 L MN MX CLAS C11 C2 C311 C312 C313 C311 C11 0 .33 C312 C313 C2 1.58 C312 D312=C312 p=avC312 L MN MX CLAS C11 C2 C311 C312 C313 .31 C11 .31 C2 .31 C311 .31 C313 D313=C313 p=avC313 L MN MX CLAS C11 C2 C311 C312 C313 C11 C2 C311 C313 .22 C312 All 6 class hulls separated using Lpd, p=CLavg, D=CLsum. D311 separates C311, D312 separates C312 and D313 separates C313 from all others. D2 separates C11 and C2. Now, remove some false positives with S and R using the same p's and d's: D1=TS p=avTS Sp C 1.9 C C 2.4 C 4.6 C D11=C11 p=avC11 Sp [1.6]C [ ]C [ ]C313 [ ]C2 [5]C312 D2=C2 p=avC2 Sp [2 2.3]C [ ]C313 [ ]C [5 5.1]C312 [ ]C311 D313=C313 p=avC313 Sp [ ]C [6.5]C312 [ ]C2 [ ]C311 [ ]C313 D311=C311 p=avC311 Sp [1.2]C [4.2]C11 [ ]C312 [ ]C2 [ ]C313 D312=C312 p=avC312 Sp [ ]C11 [ ]C313 [ ]C2 [2.5]C [5.5]C311 Sp removes a lot of the potential for false positives. (Many of the classes lie a single distance from p.) 
D1=TS p=avTS Rpd [ ]C11 [ ]C2 [ ]C311 [2.1]C312 [ ]C313 D11=C11 p=avC11 Rpd [1.2]C11 [ ]C2 [ ]C311 [2.2 2.]]C312 [ ]C313 D2=C2 p=avC2 Rpd [ ]C11 [ ]C2 [ ]C311 [2.2]C312 [ ]C313 D311=C311 p=avC311 Rpd [1.4]C11 [ ]C2 [1.1]C311 [2.2]C312 [ ]C313 D312=C312 p=avC312 Rpd [ ]C11 [ ]C2 [ ]C311 [1.5]C312 [ ]C313 D313=C313 p=avC313 Rpd [ ]C11 [ ]C2 [ ]C311 [2.2]C312 [ ]C313 Rpd removes even more of the potential for false positives.

11 FAUST Hull Classification 2 (TESTING)
D1=TS p=avTS Lpd [ ]C313 [ ]C11 [ ]C2 [.57]C311 [ ]C312 D1=TS p=avTS Sp [ ]C313 [ ]C11 [ ]C2 [ ]C311 [ ]C312 D1=TS p=avTS Rpd [ ]C11 [ ]C2 [ ]C311 [2.1]C312 [ ]C313 C11={3} C311= {29} C312={13} C313={26} C2 ={4,14,23, 45} O={ } Test Set D11=C11 p=avC11 Lpd [.63]C11 [0]C311 [ ]C313 [ ]C2 [.31]C312 D11=C11 p=avC11 Sp [1.6]C [ ]C [ ]C313 [ ]C2 [5]C312 D11=C11 p=avC11 Rpd [1.2]C11 [ ]C2 [ ]C311 [2.2 2.]]C312 [ ]C313 [.66] C311 D2=C2 p=avC2 Lpd [ ]C11 [ ]C312 .[44 .66]C313 [ ]C2 D2=C2 p=avC2 Sp [2 2.3]C [ ]C313 [ ]C [5 5.1]C312 [ ]C311 D2=C2 p=avC2 Rpd [ ]C [ ]C313 [ ]C2 [ ]C311 [2.2]C312 D311=C311 p=avC311 Lpd [ ]C311 [0]C11 [0 .33]C312 [0 .33]C313 [ ]C2 D311=C311 p=avC311 Sp [1.2]C [4.2]C11 [ ]C312 [ ]C2 [ ]C313 D311=C311 p=avC311 Rpd [1.4]C11 [ ]C2 [1.1]C311 [2.2]C312 [ ]C313 1.58 C312 D312=C312 p=avC312 Lpd .31 C11 .31 C2 .31 C311 .31 C313 D312=C312 p=avC312 Sp [ ]C [ ]C313 [ ]C2 [2.5]C [5.5]C311 D312=C312 p=avC312 Rpd [ ]C [ ]C313 [ ]C2 [ ]C311 [1.5]C312 D313=C313 p=avC313 Lpd [0 .22]C11 [ ]C2 [ ]C311 [ ]C313 [.22]C312 D313=C313 p=avC313 Sp [ ]C [6.5]C312 [ ]C2 [ ]C311 [ ]C313 D313=C313 p=avC313 Rpd [ ]C11 [ ]C2 [ ]C311 [2.2]C312 [ ]C313 D=TS Rpd Sp Lpd trueCL Predicted____CLASS Final R S L predicted d Oth Other d Other d14 Oth Other d23 2| |11 Oth Other d29 Oth | Other d Other d26 Oth Oth Other d6 2| Other d Oth Other d18 2| |11 Oth Other d Oth Oth Other d35 Oth Oth Oth Other d39 Oth Oth Other d46 Oth Oth Other d Oth Other 8/15 = 53% correct just with D=TS p=AvgTS Note: It's likely to get worse as we consider more D's. ε=.8 predicted Class 11 2 311(all 311|2 all) 312(all 312|313 a Other . Let's think about TrainingSet quality resulting from clustering. This a poor quality TrainingSet (from clustering Mother Goose Rythmes. MGR is a difficult corpus to cluster since: 1., in MGR, almost every document is isolated (an outlier), so the clustering is vague (no 2 MGRs deal with the same topic so their word use is quite different.). 
Instead of tightening the class hulls by replacing CLASSmin and CLASSmax by CLASSfpci (fpci = first precipitous count increase) and CLASSlpcd (lpcd = last precipitous count decrease), we might loosen the class hulls (since we know the classes are somewhat arbitrary) by expanding the [CLASSmin, CLASSmax] interval as follows: Let A = Avg{CLASSmin, CLASSmax} and R (for radius) = A-CLASSmin (= CLASSmax-A also). Use [A-R-ε, A+R+ε], which is simply [CLASSmin-ε, CLASSmax+ε]. Setting ε=.8 increases accuracy to 100% (assuming all Other stay Other). Finally, it occurs to me that Clustering to produce a TrainingSet, then setting aside a TestSet, gives a good way to measure the quality of the clustering. If the TestSet part classifies well under the TrainingSet part, the clustering must have been high quality (it produced a good TrainingSet for classification). This clustering quality test method is probably not new (check the literature?). If it is new, we might have a paper here (discuss this quality measure and assess it using different ε's?)
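The ε-loosening and the proposed clustering-quality score can be sketched together: fit per-class intervals on the TrainingSet, widen each [mn, mx] to [mn-ε, mx+ε] (equivalently [A-R-ε, A+R+ε]), and score the held-out TestSet by exact-match classification. A minimal sketch over one functional; the function name, the toy intervals, and the requirement that exactly one class hull contain the point are my own assumptions:

```python
def hull_quality(train_iv, test_vals, test_labels, eps=0.0):
    """Score a clustering by how well a held-out TestSet classifies
    under per-class [min, max] intervals fit on the TrainingSet.
    Each interval (mn, mx) is loosened to [mn-eps, mx+eps], which
    equals [A-R-eps, A+R+eps] for A=(mn+mx)/2 and R=A-mn.
    A test point counts as correct only if its true class is the
    unique hull containing it (empty hit list means 'Other')."""
    correct = 0
    for f, true_k in zip(test_vals, test_labels):
        hits = [k for k, (mn, mx) in train_iv.items()
                if mn - eps <= f <= mx + eps]
        correct += (hits == [true_k])
    return correct / len(test_vals)

# toy intervals: a point at 0.7 just misses C11's tight hull
iv = {"C11": (0.0, 0.63), "C312": (0.9, 1.3)}
vals, labels = [0.7, 1.0], ["C11", "C312"]
print(hull_quality(iv, vals, labels, eps=0.0))   # -> 0.5
print(hull_quality(iv, vals, labels, eps=0.1))   # -> 1.0
```

Note the trade-off the slide hints at: too large an ε makes hulls overlap, so points hit multiple classes and accuracy can fall again; sweeping ε is exactly the assessment proposed above.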

12 APPENDIX FAUST Clustering 2
Other variations of the FAUST Clustering1 Algorithm
Functional Gap Cluster Dendrogram: D=sum of all docs in subcluster, but use all gaps!
0.17 d22 0.17 d49 0.21 d42 0.21 d2 0.21 d16 0.25 d18 0.25 d3 0.25 d43 0.25 d6 0.34 d23 0.34 d15 0.34 d44 0.34 d38 0.34 d25 0.34 d36 0.38 d33 0.38 d48 0.38 d8 0.43 d4 0.43 d12 0.47 d47 0.47 d9 0.47 d37 0.56 d1 0.56 d32 0.56 d45 0.56 d14 0.56 d27 0.64 d10 0.64 d17 0.64 d21 0.64 d29 0.64 d11 0.51 d5 0.73 d30 0.77 d28 0.82 d41 0.86 d39 0.99 d46 1.16 d7 1.47 d35 0.69 d26 0.69 d50 0.69 d13 0.47 d23 0.47 d25 0.47 d36 0.94 d15 1.17 d44 0.70 d38 0.63 d3 0.63 d43 0.94 d18 0.94 d6 0.89 d4 1.34 d12 0.77 d9 1.54 d47 1.54 d37 0.8 d32 1.2 d14 1 d1 1 d45 1 d27 0 d21 .2 d10 .6 d17 .4 d11 .4 d29 1.37 d50 1.37 d13 1.6 d26
22. I had a little husband no bigger than my thumb. I put him in a pint pot, and there I bid him drum. I bought a little handkerchief to wipe his little nose and a pair of little garters to tie his little hose.
49. There was a little girl who had a little curl right in the middle of her forehead. When she was good she was very very good and when she was bad she was horrid.
2. This little pig went to market. This little pig stayed at home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home.
16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string. If you'll tell me this riddle, I will give you a ring.
42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken.
3. Diddle diddle dumpling, my son John. Went to bed with his breeches on, one stocking off, and one stocking on. Diddle diddle dumpling, my son John.
43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns.
6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. 
18. I had two pigeons bright and gay. They flew from me the other day. What was the reason they did go? I can not tell, for I do not know.
23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? Yes, and back again. If your heels are nimble and light, you may get there by candle light.
25. There was an old woman, and what do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet.
36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other men's ditches.
8. Jack Sprat could eat no fat. His wife could eat no lean. And so between them both they licked the platter clean.
33. Buttons, a farthing a pair! Come, who will buy them of me? They are round and sound and pretty and fit for girls of the city. Come, who will buy them of me? Buttons, a farthing a pair!
48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten. a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen. maids a waiting. Nineteen twenty, my plate is empty.
37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning.
47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. 
For early to bed and early to rise, is the way to be healthy and wealthy and wise. 1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 27. Cry baby cry. Put your finger in your eye and tell your mother it was not I. 45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in. 11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do And how do you do? And how do you do again 29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs. 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I! Av: always away baby back bad bag bake bed boy bread bright brown buy cake child clean cloth cock crown cry cut day dish dog eat fall fiddle full girl green high word# df# min=2 hill house king lady lamb maid men merry money morn mother nose old pie pig plum round run sing son three thumb town tree two way wife woman wool word# df# max=6

13 FAUST Clustering3 (HOB clustering1)
Functional Gap Clusterer D=sum of all docs in subcluster but use HOB! 27. Cry baby cry. Put your finger in your eye and tell your mother it was not I. 45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in. 1 d45 1 d27 1 d1 d7 d35 3 4 6 8 9 23 25 33 36 38 43 47 2 16 22 42 49 1.. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 5. Humpty Dumpty sat on a wall. Humpty Dumpty had a great fall. All the Kings horses, and all the Kings men cannot put Humpty Dumpty together again. 10. Jack and Jill went up the hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper. 11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do And how do you do? And how do you do again 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 14. If all the seas were one sea, what a great sea that would be! And if all the trees were one tree, what a great tree that would be! And if all the axes were one axe, what a great axe that would be! And if all the men were one man what a great man he would be! And if the great man took the great axe and cut down the great tree and let it fall into the great sea, what a splish splash that would be! 17. Here sits the Lord Mayor. Here sit his two men. Here sits the cock. Here sits the hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! 21. The Lion and the Unicorn were fighting for the crown. 
The Lion beat the Unicorn all around the town. Some gave them white bread and some gave them brown. Some gave them plum cake, and sent them out of town.
26. Sleep baby sleep. Our cottage valley is deep. The little lamb is on the green with woolly fleece so soft and clean. Sleep baby sleep. Sleep baby sleep, down where the woodbines creep. Be always like the lamb so mild, a kind and sweet and gentle child. Sleep baby sleep.
28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane.
29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs.
30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon.
32. Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I should give my fiddle they will think that I've gone mad. For many a joyous day my fiddle and I have had
50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I!
d7 d35 1 5 10 11 13 14 17 21 26 27 28 29 30 32 39 41 45 46 50 .4 d11 .4 d29 1.37 d50 1.37 d13 3 4 8 9 33 38 47 6 23 25 36 43 47 3 4 8 9 33 38 39 46 27 45
15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho!
44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will.
1 5 10 11 13 14 17 21 26 28 29 30 32 50
12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France.
48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten. 
a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen. maids a waiting. Nineteen twenty, my plate is empty. 3 4 8 9 33 38 d48 d33 d8 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 9. Hush baby. Daddy is near. Mamma is a lady and that is very clear. 33. Buttons, a farthing a pair! Come, who will buy them of me? They are round and sound and pretty and fit for girls of the city. Come, who will buy them of me? Buttons, a farthing a pair! 38. If I had as much money as I could tell, I never would cry young lambs to sell. Young lambs to sell, young lambs to sell. I never would cry young lambs to sell. 8 4 9 33 38 6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. 23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? Yes, and back again. If your heels are nimble and light, you may get there by candle light. 25. There was an old woman, and what do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. 0.63 d3 0.63 d43 0.94 d18 0.94 d6 41 0.47 d23 0.47 d25 0.47 d36 12 15 18 37 44 48 2. This little pig went to market. This little pig stayed at home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home. 16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string. If you'll tell me this riddle, I will give you a ring. 22. 
I had a little husband no bigger than my thumb. I put him in a pint pot, and there I bid him drum. I bought a little handkerchief to wipe his little nose and a pair of little garters to tie his little hose. 42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken. 49. There was a little girl who had a little curl right in the middle of her forehead. When she was good she was very very good and when she was bad she was horrid. 0.17 d22 0.17 d49 0.21 d42 0.21 d2 0.21 d16
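The HOB variant above cuts on the High Order Bit of the functional values instead of searching all gaps, so it needs only one bit column rather than a sort. A rough sketch of the idea; the 8-bit scaling and the function name are my own assumptions, not the slides' exact procedure:

```python
import numpy as np

def hob_cut(vals):
    """Split items by the High Order Bit of their (scaled) functional
    values: items whose top bit is set go to one sub-cluster, the rest
    to the other.  A cheap stand-in for a mid-range cut: testing one
    bit column replaces sorting and scanning for gaps."""
    v = np.asarray(vals, dtype=float)
    ints = np.floor(v * 255 / v.max()).astype(int)    # scale into 8 bits
    hob = 1 << (int(ints.max()).bit_length() - 1)     # top bit actually used
    mask = (ints & hob) > 0
    return np.flatnonzero(~mask), np.flatnonzero(mask)

lower, upper = hob_cut([0.1, 0.2, 0.9, 1.0])
print(lower.tolist(), upper.tolist())   # -> [0, 1] [2, 3]
```

With real pTree columns the mask is simply the stored bit slice for that bit position, so the cut costs one AND rather than the scaling shown here.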

14 FAUST Clustering4 WS0=words wc(MG)>½max (>3)
DS1=all docs s.t. WS0wc(doc)>2 Converge using HOB WS DS1 |WS 7 | 27 |DS2 |WS 45 | 7 | 46 | 9 |DS3=C1 |27 | 7 |45 | 9 |27 |45 C1 (mother theme) 7. Old Mother Hubbard went to the cupboard to give her poor dog a bone. When she got there cupboard was bare and so the poor dog had none. She went to baker to buy him some bread. When she came back dog was dead. 9. Hush baby. Daddy is near. Mamma is a lady and that is very clear. 27. Cry baby cry. Put your finger in your eye and tell your mother it was not I. 45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in. Remove C1. WS0=words wc(MG')>½max (>3). DS1=all docs s.t. WS0wc(doc)>2. Converge using HOB C2 1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do And how do you do? And how do you do again 17. Here sits the Lord Mayor. Here sit his 2 men. Here sits the cock. Here sits the hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! 30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon. 32. Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I should give my fiddle they will think that I've gone mad. For many a joyous day my fiddle and I have had 41. Old King Cole was a merry old soul. And a merry old soul was he. 
He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three.
46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street.
WS DS1 | WS 1 | 4 | DS2 11 | 1 17 | 4 30 | 11 32 | 17 41 | 30 46 | 32 | 41 | 46
This is not as good a cluster as C1. Let's try starting with DS0=docs dc(MG')>½max (>6.5)
DS0 |WS1=30 45 26 | 35 |DS1 |WS2= 39 |26 | |35 |DS2|WS3= |39 |35 | |50 |39 |DS3 | |50 |35 | | |39 | | |50
C2 (pie theme)
35. Sing a song of sixpence, a pocket full of rye. Four and twenty blackbirds, baked in a pie. When the pie was opened, the birds began to sing. Was not that a dainty dish to set before the king? The king was in his counting house, counting out his money. The queen was in the parlor, eating bread and honey. The maid was in the garden, hanging out the clothes. When down came a blackbird and snapped off her nose.
39. A little cock sparrow sat on a green tree. And he chirped and chirped, so merry was he. A naughty boy with his bow and arrow, determined to shoot this little cock sparrow. This little cock sparrow shall make me a stew, and his giblets shall make me a little pie, too. Oh no, says sparrow, I'll not make a stew. So he flapped his wings and away he flew.
50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I!
Remove C2. WS0=words wc(MG')>½max (>3). DS1=all docs s.t. WS0wc(doc)>2. Converge using HOB
C3
1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice?
11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. 
How do you do And how do you do? And how do you do again
17. Here sits the Lord Mayor. Here sit his two men. Here sits the cock. Here sits the hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin!
30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon.
32. Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I should give my fiddle they will think that I've gone mad. For many a joyous day my fiddle and I have had
41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three.
46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street.
WS0= DS1 |WS1= 1 | 11 | DS2 17 | 1 30 | 11 32 | 17 41 | 30 46 | 32 | 41 | 46
This is not a good cluster! Let's try again, starting with DS0=docs dc(MG'')>½max (>3.5)
DS0 |WS1= 10 29 | 13 37 |DS1 |WS2=12 19 14 44 |10 | 21 47 |21 |DS2 26 | |10 28 | |21
C3 (crown and brown theme?)
10. Jack and Jill went up the hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper.
21. The Lion and the Unicorn were fighting for the crown. The Lion beat the Unicorn all around the town. Some gave them white bread and some gave them brown. Some gave them plum cake, and sent them out of town.
Remove C3. 
Start with DS0 = docs with dc(MG''') > ½max (> 3.5): DS0 = {13, 14, 26, 28, 29, 37, 44, 47}, narrowing over DS1, DS2, DS3 to DS4 = {37, 47}. C4 (morning theme): 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning. 47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. For early to bed and early to rise, is the way to be healthy and wealthy and wise. Remove C4. Start with DS0 = docs with dc(MG''') > ½max (> 3.5)

15 FAUST Clustering4 (continued)
Remove C3. Start with DS0 = docs with dc(MG''') > ½max (> 3.5): this gives the candidate C5 = {13, 26, 28} (sheep theme? But 13 is an internal class outlier!). Let's consider an alternative C5, starting with DS0 instead of WS0! 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 26. Sleep baby sleep. Our cottage valley is deep. The little lamb is on the green with woolly fleece so soft and clean. Sleep baby sleep. Sleep baby sleep, down where the woodbines creep. Be always like the lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. The alternative iteration: DS0 = {13, 26, 28, 29, 44}, WS1 = {1, 60}, converging to DS1 = {26, 28, 29}. C5 (sleep-lamb hub (26) and spokes (28, 29) theme?): 26. Sleep baby sleep. Our cottage valley is deep. The little lamb is on the green with woolly fleece so soft and clean. Sleep baby sleep. Sleep baby sleep, down where the woodbines creep. Be always like the lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs. Remove C5. Start with DS0 = docs with dc(MG''') > ½max (> 2.5): the iteration converges to DS2 = {5, 14}. C6 (fall (and men) theme): 5. Humpty Dumpty sat on a wall. Humpty Dumpty had a great fall. All the Kings horses, and all the Kings men cannot put Humpty Dumpty together again. 14. If all the seas were one sea, what a great sea that would be! And if all the trees were one tree, what a great tree that would be! And if all the axes were one axe, what a great axe that would be!
And if all the men were one man what a great man he would be! And if the great man took the great axe and cut down the great tree and let it fall into the great sea, what a splish splash that would be! Remove C6. Start with DS0 = docs with dc(MG''') > ½max (> 2.5): the iteration converges to DS1 = {13, 15, 33, 44}, WS2 = {13, 31}. C7 (hub (buy: 13, 33), spoke (high: 15, 44) theme): 13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho! 33. Buttons, a farthing a pair! Come, who will buy them of me? They are round and sound and pretty and fit for girls of the city. Come, who will buy them of me? Buttons, a farthing a pair! 44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will. Remove C7. Start with DS0 = docs with dc(MG''') > ½max (> 1.5): the iteration (WS2 = {44, 59}) converges. C8 (old people theme): 11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do? And how do you do? And how do you do again? 25. There was an old woman, and what do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet. Remove C8. Start with DS0 = docs with dc(MG''') > ½max (> 1.5): the iteration converges to DS2 = {4, 6, 18, 49}. C9 (theme?): 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. 18. I had two pigeons bright and gay. They flew from me the other day.
What was the reason they did go? I can not tell, for I do not know. 49. There was a little girl who had a little curl right in the middle of her forehead. When she was good she was very very good and when she was bad she was horrid. Remove C9. Start with DS0 = docs with dc(MG''') > ½max (> 1.5): everything remaining converges together. C10 (theme?): 2. This little pig went to market. This little pig stayed at home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home. 3. Diddle diddle dumpling, my son John. Went to bed with his breeches on, one stocking off, and one stocking on. Diddle diddle dumpling, my son John. 8. Jack Sprat could eat no fat. His wife could eat no lean. And so between them both they licked the platter clean. 16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string. If you'll tell me this riddle, I will give you a ring. 22. I had a little husband no bigger than my thumb. I put him in a pint pot, and there I bid him drum. I bought a little handkerchief to wipe his little nose and a pair of little garters to tie his little hose. 23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? Yes, and back again. If your heels are nimble and light, you may get there by candle light. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. 38. If I had as much money as I could tell, I never would cry young lambs to sell. Young lambs to sell, young lambs to sell. I never would cry young lambs to sell. 39. A little cock sparrow sat on a green tree. And he chirped and chirped, so merry was he. A naughty boy with his bow and arrow, determined to shoot this little cock sparrow.
This little cock sparrow shall make me a stew, and his giblets shall make me a little pie, too. Oh no, says the sparrow, I will not make a stew. So he flapped his wings and away he flew. 42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken. 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten, a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen, maids a waiting. Nineteen twenty, my plate is empty.
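The cluster passes above all follow the same alternating loop: qualify a document set, take its word set, re-qualify documents against it, and repeat until nothing changes. A minimal sketch on a toy corpus (the `converge` helper, the `min_shared` qualification, and the tiny doc/word sets are hypothetical stand-ins for the slides' dc/wc thresholds, not the MG44d60w data):

```python
# Alternate DS -> WS -> DS until the document set stabilizes.
# min_shared stands in for the slides' "WS0-wc(doc) > 2"-style qualification.
def converge(corpus, seed_docs, min_shared=2):
    ds = set(seed_docs)
    while True:
        ws = set().union(*(corpus[d] for d in ds))        # WS = words of current docs
        nxt = {d for d, words in corpus.items()
               if len(words & ws) >= min_shared}          # re-qualify the docs
        if nxt == ds:                                     # fixpoint reached
            return ds, ws
        ds = nxt

# Toy corpus: doc number -> set of (synonymized) words.
corpus = {
    1:  {"run", "cut", "wife", "three"},
    4:  {"eat", "away"},
    30: {"run", "dish", "dog", "fiddle"},
    41: {"merry", "old", "king", "fiddle", "three"},
    46: {"run", "pig", "eat", "son"},
}
ds, ws = converge(corpus, {41}, min_shared=1)
```

Seeded at doc 41, a one-shared-word qualification pulls in every doc reachable through overlapping vocabulary; raising `min_shared` keeps clusters tighter, which is the role the ½max and >2 thresholds play on the slides.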

16 FAUST ARM1 Any relationship (e.g., a text corpus) is a labeled bipartite graph
FAUST ARM1 Any relationship (e.g., a text corpus) is a labeled bipartite graph: docs (doc1, doc2, ..., docN) on one side and words (word1, word2, ..., wordn) on the other. term_frequency (tf) labels each edge with the number of times the word occurs in the document. doc_frequency (df) (or doc count (dc)) labels each word with the number of docs the word occurs in. word_count (wc) labels each doc with the number of words it contains. We typically lower-bound threshold each of these labels. First, transform the corpus to an existential corpus (either the word exists in the doc or it doesn't), using the lower bound tf ≥ 1. Second, lower bound [and/or upper bound] df (e.g., df ≥ 2 requires each word to occur in at least 2 docs). Third, lower bound wc (e.g., wc ≥ 2 requires each doc to contain at least 2 words). Describing the relationships graphically: the graphical metadata, or type, is an Entity-Relationship diagram (e.g., a Document entity with author and wc, a contains edge labeled tf, a Word entity with dc and part of speech; similarly a customer buys item edge labeled quantity, a customer rates item edge labeled rating, and a member befriends member edge labeled type in a social network). The incidence counts (as well as any other entity attribute) can be used to define sub-graphs, and then we can search for stable (convergent) sub-graphs under that definition. For the doc-word relationship we used wc ≥ 2 & dc ≥ 2. Next we will try wc ≥ 2 & dc ≥ 1. After that we will try wc ≥ 1 & dc ≥ 2. Graphical instances are labeled bipartite graphs (e.g., a doc-word edge labeled tf=8 with word labels dc=2, PoS=verb and doc labels wc=3, author=Bob; a market-basket edge labeled q=3; a recommender edge labeled rating=5; a social-network edge labeled type=spouse). In all cases, implement as two (redundant) pTreeSets, one for each entity, with a 1 iff there is an edge. One PTS is a rotation of the other. If there is a numeric edge label (e.g., tf), each SPTS is stored as its bit-slices; otherwise a single bit map.
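The three labels and the existential transform described above can be computed in a few lines (a sketch on a three-doc toy corpus; the thresholds tf ≥ 1, df ≥ 2, wc ≥ 2 are the ones named in the text):

```python
from collections import Counter

docs = {
    "d1": "sleep baby sleep the lamb is green".split(),
    "d2": "baa baa black sheep".split(),
    "d3": "sleep baby sleep".split(),
}

# tf labels each (doc, word) edge; df labels each word; wc labels each doc.
tf = {(d, w): c for d, words in docs.items() for w, c in Counter(words).items()}
df = Counter(w for d, w in tf)   # number of docs each word occurs in
wc = Counter(d for d, w in tf)   # number of distinct words per doc

# Existential corpus: keep an edge iff tf >= 1, the word has df >= 2,
# and the doc has wc >= 2 (the lower bounds discussed above).
edges = {(d, w) for (d, w), c in tf.items()
         if c >= 1 and df[w] >= 2 and wc[d] >= 2}
```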

17 01TBM FAUST ARM2 Suppose we have a corpus of 1.7 million documents, a vocabulary of 100,000 words and an average document size of 20 words (think e-mails). Vertical or horizontal data structuring? I.e., do we bitmap (both ways?) or just use a simple edge table in MySQL? We don't want to be accused of cutting a board with a hammer (I've actually done that ;-) just because we have a great hammer! We can grab a great saw when it's the right tool (e.g., MySQL). Horizontal: Edge(edge#, doc, word) has 1.7M*20 = 34M rows (each ~40 bits); that's 1,360,000,000 bits. Vertical (assuming we are capturing tf = term frequency, with a max of 7, so 3 bits): DocPTreeSet: 3*1,700,000 = 5,100,000 DocPTrees, each 100,000 bits deep, so 510,000,000,000 bits. WordPTreeSet: 3*100,000 = 300,000 WordPTrees, each 1,700,000 bits deep, so 510,000,000,000 bits. So it might not be a bad idea to have three versions of the corpus, Edge(Edge#, Doc, Word), DocPTreeSet, WordPTreeSet, or even four versions: EdgeD(Edge#, Doc, Word) and EdgeW(Edge#, Doc, Word), where EdgeD is ordered on Doc (same ordering as the doc ordering in DocPTreeSet) and EdgeW is ordered on Word (same ordering as the word ordering in WordPTreeSet). Let's assume we don't capture term frequency (just the existential data: word exists in doc); then SELECT Doc FROM EdgeD WHERE Word=W4 is just the list version of WordPTreeSet(W4), etc. (e.g., if W4 occurs in D2 and D3, the query returns D2, D3, and WordPTS(W4) is the bitmap with 1s at D2 and D3). Our main interest (and, it appears, Treeminer's) is in data mining large text corpuses such as e-mails, tweets, etc. Therefore we will use as our example dataset the following 44 Mother Goose Rhymes with a vocabulary of 60 synonymized content words.
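The bit-counting above is easy to check (same figures as the text; a rough sketch that ignores pTree compression, which would shrink the vertical numbers considerably):

```python
N_docs, vocab, avg_len, row_bits, tf_bits = 1_700_000, 100_000, 20, 40, 3

horizontal = N_docs * avg_len * row_bits   # 34M edge rows * ~40 bits each
doc_pts  = tf_bits * N_docs * vocab        # 5,100,000 DocPTrees * 100,000 bits
word_pts = tf_bits * vocab * N_docs        # 300,000 WordPTrees * 1,700,000 bits
```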

18 FAUST ARM3 The algorithm will be called CDSC(w=0%, d=15%, DS0=doc1)
WORD FAUST ARM3 Look for Convergent, Dense Sub-Corpuses (this is somewhat ARM-like data mining). The algorithm will be called CDSC(w=0%, d=15%, DS0=doc1): DS0 = {doc1}; WS1 = Voc(DS0) = {words in > 0% of DS0}; DS1 = {docs with > 15% of WS1}; WS2 = Voc(DS1) = {words in > 0% of DS1}; DS2 = {docs with > 15% of WS2}; ... CDSC(w=0%, d=15%, DS0=35SSS) converges to Sub-Corpus DS2 = {7, 35, 50} (iteration: {35SSS} → {07OMH, 35SSS, 50LJH} = DS2), WS2 = {4,7,9,10,13,17,23,24,25,28,33,34,37,40,42,43,44,45,47,50,53}. ED = 25/(3*21) = 25/63 = 39.7%, whereas the original MG corpus EdgeDensity was 167/(44*60) = 167/2640 = 6.3%. CDSC(w=0%, d=15%, DS0=07OMH) converges to Sub-Corpus DS3 = {7, 13, 35} (iteration: {07OMH} → {07OMH, 13RRS, 35SSS, 45BBB} → {07OMH, 13RRS, 35SSS}), DS3 vocab = {4,7,10,13,17,23,24,25,28,33,34,37,40,42,43,44,45,47,50,51,54}. ED = 25/(3*21) = 25/63 = 39.7%. Notes: We may need a HighDocumentCount requirement, since a singleton DocSet with its vocab has EdgeDensity = 100%. A doubleton DS with its vocab will have high EdgeDensity too (in some sense the EdgeDensity measures the vocab overlap of the two documents!). Lower the ED threshold for large DocSets, e.g., for DSsize > 2, ED = doubletonED/(DocSetSize*VocabSize)? Threshold examples: 15% of 13 = 1.95 (Vocab(DocSet1)); 15% of 21 = 3.15 (Vocab(DocSet2)); 15% of 7 = 1.05; 15% of 22 = 3.3; 15% of 21 = 3.15 (Vocab(DocSet3)).
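The CDSC loop just described can be sketched directly (a minimal, hypothetical implementation for the w = 0% case only, on a toy corpus; it assumes the iteration reaches a fixpoint rather than oscillating):

```python
def cdsc(corpus, seed, d=0.15):
    """Alternate WS = vocab of DS (w = 0%: any occurrence counts) and
    DS = docs containing > d * |WS| of those words, until a fixpoint;
    return the sub-corpus and its edge density ED = edges / (|DS| * |WS|)."""
    ds = set(seed)
    while True:
        ws = {t for doc in ds for t in corpus[doc]}
        nxt = {doc for doc, terms in corpus.items()
               if len(terms & ws) > d * len(ws)}
        if nxt == ds:
            edges = sum(len(corpus[doc] & ws) for doc in ds)
            return ds, ws, edges / (len(ds) * len(ws))
        ds = nxt

corpus = {7: {"a", "b", "c", "d"}, 35: {"a", "b", "c", "e"},
          50: {"a", "b", "f"}, 13: {"x", "y"}}
ds, ws, ed = cdsc(corpus, seed={35})
```

On this toy data the seed {35} pulls in 7 and 50 (they share most of its vocabulary) and excludes 13, mirroring the shape of the DS0 = 35SSS run above.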

19 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=10%, DS0=doc1)
WORD FAUST ARM4 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=10%, DS0=doc1): DS0 = {doc1}; WS1 = Voc(DS0) = {words in > 0% of DS0}; DS1 = {docs with > 10% of WS1}; WS2 = Voc(DS1) = {words in > 0% of DS1}; DS2 = {docs with > 10% of WS2}; ... Iteration from 35SSS: {35SSS} → {07OMH, 35SSS, 50LJH} → {07OMH, 13RRS, 35SSS, 50LJH} → {07OMH, 10JAJ, 13RRS, 21LAU, 35SSS, 50LJH}. CDSC(w=0, d=10, DS0=35SSS) converges to DS4 = {7, 10, 13, 21, 35, 50}, WS4 = {4,7,9,10,12,13,14,17,19,23,24,25,26,28,32,33,34,37,40,42,43,44,45,47,50,51,53,54}. ED = 41/(28*6) = 24.4%. Lowering the DS %-of-Vocab threshold from 15% to 10% decreases ED (because it increases DSSize from 3 to 6?). Threshold examples: 10% of 13 = 1.3; 10% of 21 = 2.1; 10% of 23 = 2.3; 10% of 26 = 2.6.

20 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=10%, DS0={7,35})
WORD FAUST ARM5 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=10%, DS0={7,35}): DS0 = {7, 35}; WS1 = Voc(DS0) = {words in > 0% of DS0}; DS1 = {docs with > 10% of WS1}; ... CDSC(w=0%, d=10%, DS0={7,35}) converges immediately to DS0 = {7, 35}, WS1 = {4,7,10,13,17,23,24,25,28,33,34,37,40,42,43,44,47,50}. ED = 20/(18*2) = 55.6%. So far, ED*DSSize = 55.6*2 = 111.2; 39.7*3 = 119; 24.4*6 = 146; 6.3*44 = 277. DSSize progression: 2, 3, 6, 44; ED*DSSize progression: 111, 119, 146, 277. DSSize deltas from 2: 1, 4, 42; times 8: 8, 32, 336. Subtracting from ED*DSSize: 111, 111, 114, -59. Using this (highly adjusted and odd) invariant, the 3 sub-corpuses measure out about the same, and higher than the MG corpus. Note: ED of a single document with its vocabulary is 100%. Lower-bound DocCount, or at least give DocCount along with the density (or maybe DocCount*EdgeDensity)? It is not yet clear what the x%-of-Vocab document qualification gives us, and what convergence under that condition gives us. Would it be best to start by finding large DSs with high ED and work downward using some downward-closure condition? Threshold example: 15% of 18 = 2.7.

21 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=15%, DS0={26}) then
WORD FAUST ARM 6 Convergent, Dense Sub-Corpuses: CDSC(w=0%, d=15%, DS0={26}), then CDSC(w=0%, d=10%, DS0={26}). Iteration: DS0 = {26SBS} → DS1 = {08JSC, 09HBD, 12OWF, 26SBS, 27CBC, 28BBB, 29LFW, 38YLS, 39LCS, 45BBB} → DS2 = {07OMH, 08JSC, 09HBD, 12OWF, 26SBS, 27CBC, 28BBB, 29LFW, 35SSS, 38YLS, 39LCS, 41OKC, 45BBB, 46TTP, 50LJH} → DS3 = DS4 = {07OMH, 26SBS, 28BBB, 30HDD, 35SSS, 39LCS, 41OKC, 46TTP, 50LJH}. Using 10%, it converges to DS4 = {7, 26, 28, 30, 35, 39, 41, 46, 50} with a 39-word vocab and an EdgeDensity of 58/(39*9) = 58/351 = 16.5%. So far EdgeDens*DSSize: 55.6*2 = 111.2; 39.7*3 = 119.1; 24.4*6 = 146.4; 16.5*9 = 148.5; 6.3*44 = 277.2. DSSizes: 2, 3, 6, 9, 44. The 4 deltas from DSSize=2 are 1, 4, 7, 42; multiplied by 8: 8, 32, 56, 336. Subtracting these 8*delta values from ED*DSSize, we get scores of 111, 111, 114, 93, -59. Threshold examples: 10% of 7 = 0.7 (Vocab(DS1)); 10% of 26 = 2.6 (Vocab(DocSet2)); additions: 10% of 43 = 4.3 (Vocab(DocSet3 − DocSet2)); additions: 10% of 39 = 3.9 (Vocab(DocSet4)).
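The ad-hoc score being tracked (ED·DSSize minus 8 times the DSSize delta from 2) can be recomputed directly from the five runs quoted above; small rounding differences account for 92.5 here vs. 93 on the slide:

```python
# (ED%, DSSize) for the five runs quoted in the text.
runs = [(55.6, 2), (39.7, 3), (24.4, 6), (16.5, 9), (6.3, 44)]
scores = [ed * n - 8 * (n - 2) for ed, n in runs]   # ED*DSS - 8*(DSS - 2)
```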

22 Convergent Dense Sub-Corpuses HOB
FAUST ARM 7 Convergent Dense Sub-Corpuses, HOB: CDSC(HOB). Start with the densest doc (35SSS, 13 words). Alternate between WSn = WS(DSn) and DSn+1 = DS(WSn), ORing the Count-SPTS bit-slices from the high side until RootCount > 1. Continue until stable (either the DS or the WS is unchanged). For example: OR the count SPTS slices from the high side until the result is a non-singleton; at SPTS RootCount = 4, stop ORing, convert to a list, and take those 4 word-pTrees as WS2; construct the SPTS CountWS2 and again OR from the high side until non-singleton. With DS1 = {35SSS}, HOB converges to DS2 = {7, 35, 50}, WS2 = {7, 10, 25, 45}, ED = 8/(3*4) = 66.7%. Incidentally, throwing out the densest docs/words gives increased density, e.g., (DS3 = {7, 35}, WS2, ED = 75%), (DS3, {7, 10, 25}, ED = 83.3%), (DS3, {7, 10, 51}, ED = 83.3%), (DS4 = {35, 50}, WS2, ED = 75%), etc. Next, 07OMH and 50LJH without pTree details. With DS1 = {07OMH}, the HOB algorithm converges to Sub-Corpus DS = {7, 13, 35, 45}, WS = {4, 7, 10, 13, 42}, ED = 11/(4*5) = 11/20 = 55%. Starting with DS1 = {50LJH}, the HOB algorithm converges to Sub-Corpus DS = {35, 39, 50}, WS = {9, 25, 45}, ED = 7/(3*3) = 7/9 = 77.8%. Conclusions: the convergent sub-corpuses appear to be very dense in general. Theorem: starting with each doc (from the densest), create all HOB-stable sub-corpuses; prove this gets all maximal dense sub-corpuses (doubt it's true). Maximal means up to downward closures. What downward closure is there? I find: (DS, WS) dense ⇒ (DS', WS') dense, where DS' = DS with any subset of the sparsest docs removed, and the same for WS'.
Done. Now consider the union of the 3 corpuses above: DS = { }, WS = { }, ED = 20/54 = 37%.
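The "OR the count-SPTS bit-slices from the high side until RootCount > 1" step can be sketched as follows (a hypothetical stand-in working on plain integers rather than real pTrees; `high_bit_docs` and the toy counts are illustrative only):

```python
def high_bit_docs(counts, bits=4):
    """counts: doc -> count. OR the vertical bit-slices of the counts from
    the most significant bit down; stop as soon as more than one doc is
    set (RootCount > 1) and return that doc set."""
    mask = set()
    for b in reversed(range(bits)):
        mask |= {d for d, c in counts.items() if (c >> b) & 1}  # OR in slice b
        if len(mask) > 1:     # RootCount > 1: stop ORing
            return mask
    return mask

docs = high_bit_docs({"d35": 13, "d7": 7, "d39": 6, "d28": 2})
```

Here slice 3 selects only d35; ORing in slice 2 brings in d7 and d39, so the scan stops with the three highest-count docs.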

23 FAUST ARM 8 CDSC(HOB)2 So far we have used 7 9 12 13 15 26 27 29 35 38
WORD FAUST ARM 8 CDSC(HOB)2. Doc counts, high to low: d35:13, d26:7, d7:7, d39:7, d28:6, d46:6, d21:6, d10:5, d50:5, d13:5, d41:5, d30:5, d37:4, d17:4, d44:4, d1:4, d14:4, d29:4, d47:4, d27:3. With DS1 = {26SBS}, the iteration passes through {09HBD, 12OWF, 15PCD, 26SBS, 27CBC, 29LFW, 35SSS, 38YLS, 39LCS, 46TTP}, and HOB converges to DS = {09HBD, 27CBC, 45BBB}, WS = {3, 42}, ED = 6/6 = 100%. Taking all docs and all words, DS = { }, WS = { }, ED = 23/96 = 23.9%. With DS1 = {28BBB}, HOB converges (over DS2, DS3, DS4) to DS = { }, WS = { }, ED = 12/20 = 60%. So far we have used 7, 9, 12, 13, 15, 26, 27, 29, 35, 38, 39, 45, 46, 50.

24 01TBM FAUST ARM 9 CDSC(HOB)3. DS1 = {35SSS}, WS1 = Voc(DS1). Using WS1Cbit3: DS = {35}, Voc(DS) = {7,10,17,23,25,28,33,34,37,40,43,45,50}, ED = 100%. DS2 = WS1Cbit3|2 (= WS1Cbit2) = {1,7,10,13,14,17,21,26,28,29,30,35,37,39,41,44,46,47,50}; WS2 = vocab(DS2) = all but 5, 22, 29, 59; ED = 105/(19*56) = 10%. Instead take WS2 = DS2bit2 = {2, 49}; ED = 9/(19*2) = 24%. WS1 = CDCbit2 = {2,3,13,20,22,25,38,42,44,49,52}; DS2 = WS1Cbit2 = {46}; ED = 4/(11*1) = 36.4%. WS1 = CDCbit2 = {2,3,13,20,22,25,38,42,44,49,52}; DS2 = WS1Cbit2|1 = {1,4,7,9,11,17,27,28,29,30,32,41,45,46}; ED = 14/(14*11) = 9%.

25 Frequency and Density are related.
01TBM FAUST ARM10 Frequent 1DocSets. Running one step from each singleton DS1: DS1 = {35SSS} gives WS1 = {7,10,17,23,25,28,33,34,37,40,43,45,50} and DS2 = {35}; the same step from each of the other singletons (26SBS, 39LCS, 07OMH, 21LAU, 28BBB, 46TTP, 10JAJ, 13RRS, 30HDD, 41OKC, 50LJH, 01TBM, 14ASO, 17FEC, 29LFW, 37MBB, 44HLH, 47CCM, 05HDS, 08JSC, 09HBD, 11OMM, 12OWF, 15PCD, 27CBC, 32JGF, 33BFP, 38YLS, 48OTB, 02TLP) gives a WS1 each; among the 2-word docs, 03DDD gives WS1 = {8, 51}, 02TLP gives WS1 = {46, 57}, and 04LMM gives WS1 = {2, 25}. Of the 2-word docs remaining, only the following are nonsingular 100%-dense subcorpuses: WS1 = {2, 25} gives D2 = {4, 46}, ED = 4/4 = 100%; and WS1 = {4, 59} gives D2 = {12, 25}, ED = 4/4 = 100%. So there are only 2 nontrivial 100% convergent subcorpuses, and both have 2 docs and 2 words only. And, in fact, no convergence steps were required (in each case the sub-corpus converged immediately). F1DocSets = { }. To find all frequent 2DocSets, AND pairwise, then calculate the counts. Easier way? Frequency and Density are related. Finding ALL frequent DSs is still hard, since we have to loop through all candidate frequent 2DocSets calculating the root count of the AND. Finding frequent sets is applying our CDSC algorithm once (not applying it until convergent!) and using the full count (100%) instead of a percentage like 15% or 10%. Thus it is CDSC(100%). If we were to take it all the way to convergence, we would end up with a DocSet and a WordSet with the property that the DocSet is frequent and the WordSet is frequent. Then we could look for confident DocSet rules AND confident WordSet rules. A confident DocSet rule, A→B, means: the set of words that occur in every B doc contains most of the words in the set of words that occur in every A doc.
A confident WordSet rule, U→V, means: the set of docs containing every V word contains most docs in the set of docs that contain every U word. That's a strong association condition! But it may almost never exist in large corpuses.
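The DocSet-rule confidence just defined can be made concrete (a sketch with a hypothetical three-doc corpus; `common_words` is the "words occurring in every doc of the set" operator from the definition):

```python
def common_words(corpus, docset):
    """Words that occur in every doc of the DocSet."""
    return set.intersection(*(corpus[d] for d in docset))

def docset_conf(corpus, A, B):
    """conf(A -> B): fraction of A's common words that are also B's."""
    ca = common_words(corpus, A)
    return len(ca & common_words(corpus, B)) / len(ca)

corpus = {1: {"king", "pie", "sing"}, 2: {"king", "pie"}, 3: {"king", "crown"}}
conf = docset_conf(corpus, A={1, 2}, B={2, 3})   # common({1,2}) = {king, pie}
```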

26 M G44d60w: 44 MOTHER GOOSE RHYMES with a synonymized vocabulary of 60 WORDS
1. Three blind mice! See how they run! They all ran after the farmer's wife, who cut off their tails with a carving knife. Did you ever see such a thing in your life as three blind mice? 2. This little pig went to market. This little pig stayed at home. This little pig had roast beef. This little pig had none. This little pig said Wee, wee. I can't find my way home. 3. Diddle diddle dumpling, my son John. Went to bed with his breeches on, one stocking off, and one stocking on. Diddle diddle dumpling, my son John. 4. Little Miss Muffet sat on a tuffet, eating of curds and whey. There came a big spider and sat down beside her and frightened Miss Muffet away. 5. Humpty Dumpty sat on a wall. Humpty Dumpty had a great fall. All the Kings horses, and all the Kings men cannot put Humpty Dumpty together again. 6. See a pin and pick it up. All the day you will have good luck. See a pin and let it lay. Bad luck you will have all the day. 7. Old Mother Hubbard went to the cupboard to give her poor dog a bone. When she got there cupboard was bare and so the poor dog had none. She went to baker to buy him some bread. When she came back dog was dead. 8. Jack Sprat could eat no fat. His wife could eat no lean. And so between them both they licked the platter clean. 9. Hush baby. Daddy is near. Mamma is a lady and that is very clear. 10. Jack and Jill went up the hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper. 11. One misty moisty morning when cloudy was the weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do And how do you do? And how do you do again 12. There came an old woman from France who taught grown-up children to dance. But they were so stiff she sent them home in a sniff. This sprightly old woman from France. 13. 
A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again. 14. If all the seas were one sea, what a great sea that would be! And if all the trees were one tree, what a great tree that would be! And if all the axes were one axe, what a great axe that would be! And if all the men were one man what a great man he would be! And if the great man took the great axe and cut down the great tree and let it fall into the great sea, what a splish splash that would be! 15. Great A. little a. This is pancake day. Toss the ball high. Throw the ball low. Those that come after may sing heigh ho! 16. Flour of England, fruit of Spain, met together in a shower of rain. Put in a bag tied round with a string. If you'll tell me this riddle, I will give you a ring. 17. Here sits the Lord Mayor. Here sit his two men. Here sits the cock. Here sits the hen. Here sit the little chickens. Here they run in. Chin chopper, chin chopper, chin chopper, chin! 18. I had two pigeons bright and gay. They flew from me the other day. What was the reason they did go? I can not tell, for I do not know. 21. The Lion and the Unicorn were fighting for the crown. The Lion beat the Unicorn all around the town. Some gave them white bread and some gave them brown. Some gave them plum cake, and sent them out of town. 22. I had a little husband no bigger than my thumb. I put him in a pint pot, and there I bid him drum. I bought a little handkerchief to wipe his little nose and a pair of little garters to tie his little hose. 23. How many miles is it to Babylon? Three score miles and ten. Can I get there by candle light? Yes, and back again. If your heels are nimble and light, you may get there by candle light. 25. There was an old woman, and what do you think? She lived upon nothing but victuals, and drink. Victuals and drink were the chief of her diet, and yet this old woman could never be quiet. 26. Sleep baby sleep. 
Our cottage valley is deep. The little lamb is on the green with woolly fleece so soft and clean. Sleep baby sleep. Sleep baby sleep, down where the woodbines creep. Be always like the lamb so mild, a kind and sweet and gentle child. Sleep baby sleep. 27. Cry baby cry. Put your finger in your eye and tell your mother it was not I. 28. Baa baa black sheep, have you any wool? Yes sir yes sir, three bags full. One for my master and one for my dame, but none for the little boy who cries in the lane. 29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs. 30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon. 32. Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I should give my fiddle they will think that I've gone mad. For many a joyous day my fiddle and I have had 33. Buttons, a farthing a pair! Come, who will buy them of me? They are round and sound and pretty and fit for girls of the city. Come, who will buy them of me? Buttons, a farthing a pair! 35. Sing a song of sixpence, a pocket full of rye. Four and twenty blackbirds, baked in a pie. When the pie was opened, the birds began to sing. Was not that a dainty dish to set before the king? The king was in his counting house, counting out his money. The queen was in the parlor, eating bread and honey. The maid was in the garden, hanging out the clothes. When down came a blackbird and snapped off her nose. 36. Little Tommy Tittlemouse lived in a little house. He caught fishes in other mens ditches. 37. Here we go round mulberry bush, mulberry bush, mulberry bush. Here we go round mulberry bush, on a cold and frosty morning. This is way we wash our hands, wash our hands, wash our hands. This is way we wash our hands, on a cold and frosty morning. 
This is way we wash our clothes, wash our clothes, wash our clothes. This is way we wash our clothes, on a cold and frosty morning. This is way we go to school, go to school, go to school. This is the way we go to school, on a cold and frosty morning. This is the way we come out of school, come out of school, come out of school. This is the way we come out of school, on a cold and frosty morning. 38. If I had as much money as I could tell, I never would cry young lambs to sell. Young lambs to sell, young lambs to sell. I never would cry young lambs to sell. 39. A little cock sparrow sat on a green tree. And he chirped and chirped, so merry was he. A naughty boy with his bow and arrow, determined to shoot this little cock sparrow. This little cock sparrow shall make me a stew, and his giblets shall make me a little pie, too. Oh no, says the sparrow, I will not make a stew. So he flapped his wings and away he flew. 41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three. 42. Bat bat, come under my hat and I will give you a slice of bacon. And when I bake I will give you a cake, if I am not mistaken. 43. Hark hark, the dogs do bark! Beggars are coming to town. Some in jags and some in rags and some in velvet gowns. 44. The hart he loves the high wood. The hare she loves the hill. The Knight he loves his bright sword. The Lady loves her will. 45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in. 46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street. 47. Cocks crow in the morn to tell us to rise and he who lies late will never be wise. 
For early to bed and early to rise, is the way to be healthy and wealthy and wise. 48. One two, buckle my shoe. Three four, knock at the door. Five six, pick up sticks. Seven eight, lay them straight. Nine ten, a good fat hen. Eleven twelve, dig and delve. Thirteen fourteen, maids a courting. Fifteen sixteen, maids in the kitchen. Seventeen eighteen, maids a waiting. Nineteen twenty, my plate is empty. 49. There was a little girl who had a little curl right in the middle of her forehead. When she was good she was very very good and when she was bad she was horrid. 50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I! Vocabulary (each word's df is between 2 and 6): always away baby back bad bag bake bed boy bread bright brown buy cake child clean cloth cock crown cry cut day dish dog eat fall fiddle full girl green high hill house king lady lamb maid men merry money morn mother nose old pie pig plum round run sing son three thumb town tree two way wife woman wool

27 Word-number key for the doc-by-word pTree table (docs 1-50 across the top): 1 always, 2 away, 3 baby, 4 back, 5 bad, 6 bag, 7 bake, 8 bed, 9 boy, 10 bread, 11 bright, 12 brown, 13 buy, 14 cake, 15 child, 16 clean, 17 cloth, 18 cock, 19 crown, 20 cry, 21 cut, 22 day, 23 dish, 24 dog, 25 eat, 26 fall, 27 fiddle, 28 full, 29 girl, 30 green, 31 high, 32 hill, 33 house, 34 king, 35 lady, 36 lamb, 37 maid, 38 men, 39 merry, 40 money, 41 morn, 42 mother, 43 nose, 44 old, 45 pie, 46 pig, 47 plum, 48 round, 49 run, 50 sing, 51 son, 52 three, 53 thumb, 54 town, 55 tree, 56 two, 57 way, 58 wife, 59 woman, 60 wool

28 WORD 01TBM 02TLP 03DDD 04LMM 05HDS 06SPP 07OMH 08JSC 09HBD 10JAJ 11OMM 12OWF 13RRS 14ASO 15PCD 16PPG 17FEC 18HTP 21LAU 22HLH 23MTB 25WOW 26SBS 27CBC 28BBB 29LFW 30HDD 32JGF 33BFP 35SSS 36LTT 37MBB 38YLS 39LCS 41OKC 42BBC 43HHD 44HLH 45BBB 46TTP 47CCM 48OTB 49WLG 50LJH

29 HOB2 DS1: Go down the HOBs of the countSPTSs one at a time with full vocabulary. Then try for a downward closure on sub-corpuses. Abbreviations: CDC = CorpusDocCnt, DSC = DocSetCount, DSCP = DSC Ptrees; CWC, WSC, WSCP are the corresponding word-side counts and Ptrees. [The count table (per-direction d columns, max/sum rows) on this slide did not survive extraction.]

30 A, C, I Relationships and ARM. [The customer-item relationship diagram on this slide did not survive extraction.]
In Market Basket Research (MBR), we introduced the relationship of cash-register transactions, T, between customers, C, and purchasable items, I, and briefly discussed what strong rules tell us in that context. In Software Engineering (SE), there is the relationship between Aspects, T, and Code Modules, I (t is related to i iff module i is part of aspect t). In Bioinformatics, there is the relationship between experiments, T, and genes, I (t is related to i iff gene i expresses at threshold level during experiment t). In Text Mining, there is the relationship between Documents, D, and Words, W (w is related to d iff w∈d). A strong D-rule means two things: the DSets A and C have many common words, and if a word occurs in every document of the DSet A, it occurs in every document of C with high probability. In any Entity Relationship diagram there is a "part of" relationship, in which i∈I is part of t∈T (t is related to i iff i is part of t), and an "ISA" relationship, in which i∈I ISA t∈T (t is related to i iff i IS A t).

Given any relationship between two entities (e.g., between customers and items) there are always two ARM problems to analyze. E.g., we analyzed Itemset rules, A→C (call them I-rules), using info recorded on which customer transactions contained those itemsets. With this, we can intelligently shelve items, accurately order items (Supply Chain Management), etc. There are also C-rules.

The support [ratio] of itemset A, supp(A), is the fraction of Ts such that A ⊆ T(I); e.g., if A={i1,i2} and C={i4}, then supp(A) = |{t2,t4}| / |{t1,t2,t3,t4,t5}| = 2/5 (| | means set size, the count of elements in the set). The support [ratio] of rule A→C, supp(A→C), is the support of A∪C = |{t2,t4}| / |{t1,t2,t3,t4,t5}| = 2/5. The confidence of rule A→C, conf(A→C), is supp(A→C) / supp(A) = (2/5) / (2/5) = 1. Data Miners typically want to find all STRONG RULES, A→C, with supp(A→C) ≥ minsupp and conf(A→C) ≥ minconf (minsupp, minconf are threshold levels).
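The support/confidence arithmetic above can be checked with a small sketch. Only the counts are given in the text, so the basket contents below are hypothetical (chosen so that exactly t2 and t4 contain A∪C, matching the 2/5 figures):

```python
# Support and confidence for the slide's example.  Baskets other than t2/t4
# are made-up filler consistent with the stated counts.
baskets = {
    "t1": {"i1"},
    "t2": {"i1", "i2", "i4"},
    "t3": {"i3"},
    "t4": {"i1", "i2", "i4"},
    "t5": {"i2", "i3"},
}

def supp(itemset):
    """Fraction of transactions whose basket contains every item in itemset."""
    return sum(itemset <= b for b in baskets.values()) / len(baskets)

def conf(antecedent, consequent):
    """Conditional probability supp(A ∪ C) / supp(A)."""
    return supp(antecedent | consequent) / supp(antecedent)

A, C = {"i1", "i2"}, {"i4"}
print(supp(A))      # 0.4  (= 2/5)
print(supp(A | C))  # 0.4
print(conf(A, C))   # 1.0
```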
A strong rule indicates two things: high support means it is non-trivial (A and C are found together in many market baskets at checkout), and high confidence means the implication is highly likely to be true. Note that conf(A→C) is also just the conditional probability of t being related to C given that t is related to A (e.g., the conditional probability that the market basket contents, T(I), contain C, given that T(I) contains A). Suppose the querier sets minsupp at 1/2 and minconf at 3/4 (minsupp and minconf can also be expressed as counts: with 4 transactions, minsupp=2, minconf=3).

Given a Transaction-Item Relationship, there are two ways to find frequent itemsets: (1) vertical processing of a Horizontal Transaction Table (HTT), or (2) horizontal processing of a Vertical Transaction Table (VTT). In (1), an HTT is processed through vertical scans for all Frequent I-sets (I-sets with support ≥ minsupp, i.e., I-sets "frequently" found in transaction market baskets). In (2), a VTT is processed through horizontal operations to find all Frequent I-sets. Then each Frequent I-set found is analyzed to determine whether it is the support set of a strong rule. Finding all Frequent I-sets is the hard part.

The APRIORI Algorithm takes advantage of the "downward closure" property of Frequent I-sets: if an I-set is frequent, then all its subsets are also frequent. E.g., in MBR, if A is an I-subset of B and all of B is in a basket, then certainly all of A is in that basket too. Therefore supp(A) ≥ supp(B) whenever A⊆B (downward closure). APRIORI iteratively finds Frequent k-itemsets, k=1,2,... (Ck = candidate k-itemsets, Fk = frequent k-itemsets), then finds all strong rules supported by each frequent itemset. Start by finding Frequent 1-ItemSets: scanning the 1-itemset supports (minsupp count 2), item 1 appears in 2 transactions, item 2 in 3, item 3 in 3, item 4 in 1, and item 5 in 3.
First, APRIORI scans to determine all Frequent 1-itemsets (they contain 1 item and are therefore called 1-Itemsets). Next, APRIORI uses downward closure to efficiently generate candidates for Frequent 2-Itemsets, then scans to determine which of those candidate 2-Itemsets are actually Frequent, and so on, until no candidates remain (on the next slide we walk through an example using both an HTT and a VTT).
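The 1-item counts quoted above (two 1s, three 2s, three 3s, one 4, three 5s) match the classic four-transaction example D = {1,3,4}, {2,3,5}, {1,2,3,5}, {2,5}; assuming that dataset, the candidate-generate/prune/count loop can be sketched as:

```python
from itertools import combinations

# Four-transaction example whose item counts match the slide; minsupp = 2.
D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]
minsupp = 2

def count(itemset):
    """Number of transactions containing every item of itemset."""
    return sum(itemset <= t for t in D)

# F1: frequent 1-itemsets.
Fk = [frozenset([i]) for i in sorted(set().union(*D)) if count({i}) >= minsupp]
frequent = [Fk]
k = 2
while Fk:
    # Candidate generation with downward-closure pruning: keep a k-set only
    # if every (k-1)-subset was frequent at the previous level.
    items = sorted(set().union(*Fk))
    Ck = {frozenset(c) for c in combinations(items, k)
          if all(frozenset(s) in Fk for s in combinations(c, k - 1))}
    Fk = [c for c in Ck if count(c) >= minsupp]
    if Fk:
        frequent.append(Fk)
    k += 1

for level in frequent:
    print(sorted(sorted(s) for s in level))
# [[1], [2], [3], [5]]
# [[1, 3], [2, 3], [2, 5], [3, 5]]
# [[2, 3, 5]]
```

This reproduces L1={1}{2}{3}{5}, L2={13}{23}{25}{35}, L3={235} from the next slide.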

31 Apriori example: C1 → F1 = L1 → C2 → F2 = L2 → C3 → F3 = L3, starting from L1 = {1}{2}{3}{5}. [The candidate/frequent itemset tables on this slide did not survive extraction; the next slide repeats the worked numbers.]
Other ARM methods: FP-Growth builds a linked data structure, precomputing counts. Hash-based itemset counting: a k-itemset whose corresponding hash bucket count is below threshold cannot be frequent. Transaction reduction: a transaction that does not contain any frequent k-itemset is useless in subsequent scans. Partitioning: any itemset that is potentially frequent in DB must be frequent in at least one of the partitions of DB. Sampling: mine a subset of the given data with a lowered support threshold, plus a method to determine completeness. Dynamic itemset counting: add new candidate itemsets only when all of their subsets are estimated to be frequent.

Core of Apriori: use only large (k-1)-itemsets to generate candidate large k-itemsets, and use database scans and pattern matching to collect counts for the candidate itemsets. Bottleneck of Apriori: candidate generation. Candidate sets can be huge: 10^4 large 1-itemsets may generate 10^7 candidate 2-itemsets, and to discover a large pattern of size 100, e.g., {a1…a100}, we would need to generate 2^100 ≈ 10^30 candidates. Apriori also requires multiple scans of the database (n+1 scans, where n = length of the longest pattern). A supplemental text document on ARM (with additional topics and discussions) is available.

Example ARM with uncompressed Ptrees (note: the 1-count is at the Ptree root), on transactions TID 100, 200, 300, 400. Build the item Ptrees by scanning D: P1=1010 (count 2), P2=0111 (3), P3=1110 (3), P4=1000 (1), P5=0111 (3), so L1={1}{2}{3}{5}. ANDing: P1^P2=0010 (1), P1^P3=1010 (2), P1^P5=0010 (1), P2^P3=0110 (2), P2^P5=0111 (3), P3^P5=0110 (2), so L2={13}{23}{25}{35}. {1,2,3} need not be scanned for, since {1,2} is not frequent, and {1,3,5} need not be scanned for, since {1,5} is not frequent; P2^P3^P5=0110 (2), so L3={235}.
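The vertical Ptree idea can be sketched with Python integers standing in for the bit columns. This is a simplification (uncompressed bit vectors, no tree compression): support of an itemset is the popcount of the AND of its item columns, which is exactly the "1-count at the Ptree root" in the example above.

```python
# Vertical (uncompressed, Ptree-style) itemset counting: one bit column per
# item, with bit j set iff transaction j contains that item.
D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]

# Build one integer bit vector per item.
P = {}
for item in set().union(*D):
    P[item] = sum(1 << j for j, t in enumerate(D) if item in t)

def supp_count(itemset):
    """AND the bit columns and popcount the result (the 'root count')."""
    v = (1 << len(D)) - 1          # all-ones mask
    for item in itemset:
        v &= P[item]
    return bin(v).count("1")

print(supp_count({1, 3}))      # 2   (P1 AND P3 = 1010)
print(supp_count({2, 3, 5}))   # 2   (P2 AND P3 AND P5 = 0110)
print(supp_count({1, 5}))      # 1
```

Vertical layout lets whole columns (attributes) be discarded before any per-row work, which is the advantage claimed for FAUST-style processing earlier in the deck.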

32 ARM-7: 1-ItemSets don't support Association Rules (they have either no antecedent or no consequent). 2-Itemsets do support ARs. Are there any Strong Rules supported by Frequent (Large) 2-ItemSets (at minconf=.75)?
{1,3}: conf({1}→{3}) = supp{1,3}/supp{1} = 2/2 = 1 ≥ .75 STRONG; conf({3}→{1}) = supp{1,3}/supp{3} = 2/3 = .67 < .75.
{2,3}: conf({2}→{3}) = supp{2,3}/supp{2} = 2/3 = .67 < .75; conf({3}→{2}) = supp{2,3}/supp{3} = 2/3 = .67 < .75.
{2,5}: conf({2}→{5}) = supp{2,5}/supp{2} = 3/3 = 1 ≥ .75 STRONG; conf({5}→{2}) = supp{2,5}/supp{5} = 3/3 = 1 ≥ .75 STRONG.
{3,5}: conf({3}→{5}) = supp{3,5}/supp{3} = 2/3 = .67 < .75; conf({5}→{3}) = supp{3,5}/supp{5} = 2/3 = .67 < .75.
Are there any Strong Rules supported by Frequent (Large) 3-ItemSets?
{2,3,5}: conf({2,3}→{5}) = supp{2,3,5}/supp{2,3} = 2/2 = 1 ≥ .75 STRONG; conf({2,5}→{3}) = supp{2,3,5}/supp{2,5} = 2/3 = .67 < .75. No subset antecedent of a failing rule can yield a strong rule either (no need to check conf({2}→{3,5}) or conf({5}→{2,3}), since both denominators are at least as large and therefore both confidences are at least as low). conf({3,5}→{2}) = supp{2,3,5}/supp{3,5} = 2/2 = 1 STRONG; conf({3}→{2,5}) = supp{2,3,5}/supp{3} = 2/3 = .67 < .75. DONE!
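The strong-rule check above can be reproduced mechanically: enumerate every rule A → (S \ A) over the frequent 2- and 3-itemsets and keep those with confidence ≥ .75 (supports counted from the same four-transaction example).

```python
from itertools import combinations

D = [{1, 3, 4}, {2, 3, 5}, {1, 2, 3, 5}, {2, 5}]

def supp(s):
    """Transaction count of itemset s."""
    return sum(set(s) <= t for t in D)

frequent = [{1, 3}, {2, 3}, {2, 5}, {3, 5}, {2, 3, 5}]
strong = []
for S in frequent:
    for r in range(1, len(S)):
        for A in combinations(sorted(S), r):
            if supp(S) / supp(A) >= 0.75:          # conf(A -> S\A)
                strong.append((set(A), S - set(A)))

for a, c in strong:
    print(sorted(a), "->", sorted(c))
# [1] -> [3],  [2] -> [5],  [5] -> [2],  [2, 3] -> [5],  [3, 5] -> [2]
```

This matches the five STRONG rules worked out by hand above (note the brute-force loop skips the subset-antecedent shortcut the slide uses).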

33 HOB-CDSC: Start with the densest doc (35SSS). Then always choose using the highest count (except when doing so results in a singleton, in which case include the 2nd-highest count also).
Starting with 35SSS, the algorithm converges to Sub-Corpus DS={7,35,50}, WS={7,10,25,45}, with EdgeDensity = 8/(3*4) = 8/12 = 66.7%.
Starting with 07OMH, the algorithm converges to Sub-Corpus DS={7,13,35,45}, WS={4,7,10,13,42}, with EdgeDensity = 11/(4*5) = 11/20 = 55%.
Starting with 50LJH, the algorithm converges to Sub-Corpus DS={35,39,50}, WS={9,25,45}, with EdgeDensity = 7/(3*3) = 7/9 = 77.8%.
[The intermediate DS0/DS1/DS2 and WS1/WS2 word tables on this slide did not survive extraction.]

34 FAUST Classification 1: Using the clustering of FAUST Clustering 1, we extract 80% from each cluster as the TrainingSet (with class = cluster#). How accurate is FAUST Hull classification on the remaining 20% plus the outliers (which should come out as "other")?
Full classes from slide 15: C11={2,3,16,22,42,43}; C311={11,17,29}; C312={13,30,50}; C313={10,26,28,41}; C2={1,4,5,8,9,12,14,15,23,25,27,32,33,36,37,38,44,45,47,48}; OUTLIERS {18,49} {6} {39} {21} {46} {7} {35}.
Training Set: C11={2,16,22,42,43}; C311={11,17}; C312={30,50}; C313={10,28,41}; C2={1,5,8,9,12,15,25,27,32,33,36,37,38,44,47,48}.
Test Set: C11={3}; C311={29}; C312={13}; C313={26}; C2={4,14,23,45}; O={}.
[The per-direction MIN/MAX hull tables (D=TrainSet, D=C11, D=C2, D=C311, D=C312, D=C313) on this slide did not survive extraction.]

35 FAUST Clustering 1.1: WS=WordSet is always defined by dc(DS) > Avg(dc(previous DS)); DS=DocSet is always defined by wc(WS) > Avg(wc(previous WS)). Converge using real HOB (high bit only). The first run converges to the DocSet {7, 13, 30, 35, 39, 41, 46, 50}:
7. Old Mother Hubbard went to cupboard to give her poor dog bone. When she got there cupboard was bare, poor dog had none. She went to baker to buy him some bread. When she came back dog was dead.
13. A robin and a robins son once went to town to buy a bun. They could not decide on plum or plain. And so they went back home again.
30. Hey diddle diddle! The cat and the fiddle. The cow jumped over the moon. The little dog laughed to see such sport, and the dish ran away with the spoon.
35. Sing a song of sixpence, a pocket full of rye. 4 and 20 blackbirds, baked in a pie. When the pie was opened, the birds began to sing. Was not that a dainty dish to set before the king? The king was in his counting house, counting out his money. The queen was in the parlor, eating bread and honey. The maid was in the garden, hanging out the clothes. When down came a blackbird and snapped off her nose.
39. A little cock sparrow sat on a green tree. And he chirped and chirped, so merry was he. A naughty boy with his bow and arrow, determined to shoot this little cock sparrow. This little cock sparrow shall make me a stew, and his giblets shall make me a little pie, too. Oh no, says the sparrow, I will not make a stew. So he flapped his wings and away he flew.
41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three.
46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street.
50. Little Jack Horner sat in the corner, eating of Christmas pie. He put in his thumb and pulled out a plum and said What a good boy am I!

Converging using just real HOB (high bit only), alternating WS0 and DS0, yields:
OUTLIER: 46. Tom Tom the piper's son, stole a pig and away he run. The pig was eat and Tom was beat and Tom ran crying down the street.
OUTLIER: 35. Sing a song of sixpence, a pocket full of rye. 4 and 20 blackbirds, baked in a pie. When the pie was opened, the birds began to sing. Was not that a dainty dish to set before the king? The king was in his counting house, counting out his money. The queen was in the parlor, eating bread and honey. The maid was in the garden, hanging out the clothes. When down came a blackbird and snapped off her nose.

C1 (Mother theme) = {7, 9, 27, 29, 45}:
7. Old Mother Hubbard went to the cupboard to give her poor dog a bone. When she got there cupboard was bare and so the poor dog had none. She went to baker to buy him some bread. When she came back dog was dead.
9. Hush baby. Daddy is near. Mamma is a lady and that is very clear.
27. Cry baby cry. Put your finger in your eye and tell your mother it was not I.
29. When little Fred went to bed, he always said his prayers. He kissed his mamma and then his papa, and straight away went upstairs.
45. Bye baby bunting. Father has gone hunting. Mother has gone milking. Sister has gone silking. And brother has gone to buy a skin to wrap the baby bunting in.

OUTLIER: 10. Jack and Jill went up hill to fetch a pail of water. Jack fell down, and broke his crown and Jill came tumbling after. When up Jack got and off did trot as fast as he could caper, to old Dame Dob who patched his nob with vinegar and brown paper.

C2 (fiddle/old/man theme) = {11, 32, 41}, with WS1 = {fiddle(32 41), man(11 32), old(11 44)}:
11. One misty moisty morning when cloudy was weather, I chanced to meet an old man clothed all in leather. He began to compliment and I began to grin. How do you do How do you do? How do you do again
32. Jack come and give me your fiddle, if ever you mean to thrive. No I will not give my fiddle to any man alive. If I'd give my fiddle they will think I've gone mad. For many a joyous day my fiddle and I have had
41. Old King Cole was a merry old soul. And a merry old soul was he. He called for his pipe and he called for his bowl and he called for his fiddlers three. And every fiddler, he had a fine fiddle and a very fine fiddle had he. There is none so rare as can compare with King Cole and his fiddlers three.

[The intermediate WS/DS iteration tables on this slide did not survive extraction.]

36 Graphs. A clique in an undirected graph is a subset of its vertices such that every 2 vertices in the subset are connected by an edge. Finding a clique of a given size (the clique problem) is NP-complete. The term "clique" comes from Luce & Perry (1949), who used complete subgraphs in social networks to model cliques of people (groups of people who know each other). Cliques have applications in bioinformatics. [Figure: a graph with 23 vertices (1-vertex cliques), 42 edges (2-vertex cliques, 6 of them maximal), 19 3-vertex cliques (light/dark blue), and 2 4-vertex cliques (dark blue areas); the 11 light-blue triangles are maximal cliques.] The 2 dark blue 4-cliques are both maximum and maximal, and the clique number of the graph is 4.
A complete graph is a simple undirected graph in which every pair of distinct vertices is connected by a unique edge. Simple means no loop edges and no more than 1 edge between any two different vertices (each edge = a distinct pair of vertices).
An independent set (or stable set) is a set of vertices in a graph, no two of which are adjacent. A maximum independent set is an independent set of largest possible size for a given graph G. This size is called the independence number of G, denoted α(G).[2] The problem of finding such a set is called the maximum independent set problem and is an NP-hard optimization problem; as such, it is unlikely that there exists an efficient algorithm for finding a maximum independent set of a graph. Every maximum independent set is also maximal, but the converse implication does not necessarily hold. A set is independent if and only if it is a clique in the graph's complement, so the two concepts are complementary. The complement (or inverse) of a graph G is a graph H on the same vertices such that
2 distinct vertices of H are adjacent iff they are not adjacent in G. A complete bipartite graph is a graph whose vertices can be partitioned into two subsets V1 and V2 such that no edge has both endpoints in the same subset, and every possible edge that could connect vertices in different subsets is part of the graph; i.e., a bipartite graph (V1, V2, E) s.t. for every 2 vertices v1 ∈ V1 and v2 ∈ V2, v1v2 is an edge in E. A maximal independent set is either an independent set s.t. adding any other vertex to the set forces the set to contain an edge, or the set of all vertices of the empty graph.
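For small graphs, the clique problem mentioned above can be attacked by brute force, whose exponential candidate space is exactly why the general problem is NP-complete. The 5-vertex edge list here is a made-up toy example, not the 23-vertex graph in the figure:

```python
from itertools import combinations

# Hypothetical small graph: a triangle 0-1-2, a second triangle 1-2-3,
# and a pendant edge 3-4.
edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4)}
adj = {frozenset(e) for e in edges}
vertices = {v for e in edges for v in e}

def is_clique(S):
    """Every pair of vertices in S must be joined by an edge."""
    return all(frozenset((u, v)) in adj for u, v in combinations(S, 2))

# Exhaustively test every vertex subset, keeping the largest clique found.
best = max((set(S) for k in range(1, len(vertices) + 1)
            for S in combinations(sorted(vertices), k) if is_clique(S)),
           key=len)
print(sorted(best))   # [0, 1, 2] -- a triangle; the clique number here is 3
```

An independent-set search would be the same loop run on the complement graph, per the complementarity noted above.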
A complete bipartite graph with partitions of size |V1|=m and |V2|=n is denoted Km,n;[1][2] every 2 graphs with the same notation are isomorphic. Every tree is bipartite. Cycle graphs with an even number of vertices are bipartite. Every planar graph whose faces all have even length is bipartite. Special cases of this are grid graphs and squaregraphs, in which every inner face consists of 4 edges and every inner vertex has four or more neighbors.[9] The complete bipartite graph on m and n vertices, denoted by Km,n, is the bipartite graph G = (U, V, E), where U and V are disjoint sets of size m and n, respectively, and E connects every vertex in U with all vertices in V. It follows that Km,n has mn edges.[10] Closely related to the complete bipartite graphs are the crown graphs, formed from complete bipartite graphs by removing the edges of a perfect matching.[11] Hypercube graphs, partial cubes, and median graphs are bipartite. In these graphs, vertices may be labeled by bitvectors, in such a way that 2 vertices are adjacent iff the corresponding bitvectors differ in a single position. A bipartition may be formed by separating the vertices whose bitvectors have an even number of ones from the vertices with an odd number of ones. Trees and squaregraphs form examples of median graphs, and every median graph is a partial cube.[12] A graph is bipartite iff it does not contain an odd cycle. Equivalently, a graph is bipartite if and only if it is 2-colorable (i.e., its chromatic number is less than or equal to 2). The biadjacency matrix of a bipartite graph is a (0,1)-matrix of size |U|×|V| that has a one for each pair of adjacent vertices and a zero for nonadjacent vertices.[20] Biadjacency matrices may be used to describe equivalences between bipartite graphs, hypergraphs, and directed graphs.
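The odd-cycle/2-colorability characterization gives a linear-time bipartiteness test: BFS the graph, alternating colors, and fail exactly when an edge joins two same-colored vertices (an odd cycle). A minimal sketch, using a 4-cycle (bipartite) and a triangle (odd cycle) as sample inputs:

```python
from collections import deque

def is_bipartite(adj):
    """BFS 2-coloring over an adjacency-list dict; handles disconnected graphs."""
    color = {}
    for start in adj:
        if start in color:
            continue
        color[start] = 0
        q = deque([start])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in color:
                    color[v] = 1 - color[u]   # alternate color classes
                    q.append(v)
                elif color[v] == color[u]:
                    return False              # odd cycle found
    return True

square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}     # even cycle
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}              # odd cycle
print(is_bipartite(square))    # True
print(is_bipartite(triangle))  # False
```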




