1
Building Finite-State Machines
Intro to NLP - J. Eisner
2
Finite-State Toolkits
In these slides, we’ll use Xerox’s regexp notation. Their tool is XFST; a free alternative is FOMA.
Usage: enter a regular expression and the tool builds an FSA or FST. Now type in an input string:
FSA: it tells you whether the string is accepted.
FST: it tells you all the output strings (if any). You can also invert the FST to let you map outputs to inputs.
You could hook it up to other NLP tools that need finite-state processing of their input or output.
There are other tools for weighted FSMs (Thrax, OpenFST).
3
Common Regular Expression Operators (in XFST notation)
        concatenation            E F
*  +    iteration                E*, E+
|       union                    E | F
&       intersection             E & F
~ \ -   complementation, minus   ~E, \x, F-E
.x.     crossproduct             E .x. F
.o.     composition              E .o. F
.u      upper (input) language   E.u   “domain”
.l      lower (output) language  E.l   “range”
4
Common Regular Expression Operators (in XFST notation)
concatenation: E F
EF = {ef : e ∈ E, f ∈ F}
ef denotes the concatenation of 2 strings; EF denotes the concatenation of 2 languages.
To pick a string in EF, pick e ∈ E and f ∈ F and concatenate them. To find out whether w ∈ EF, look for at least one way to split w into two “halves,” w = ef, such that e ∈ E and f ∈ F.
A language is a set of strings. It is a regular language if there exists an FSA that accepts all the strings in the language, and no other strings. If E and F denote regular languages, then so does EF. (We will have to prove this by finding the FSA for EF!)
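The set definition above can be mirrored directly with finite sets of strings; a minimal Python sketch (the example languages here are made up):

```python
def concat(E, F):
    """EF = {e + f : e in E, f in F} -- concatenation of two languages."""
    return {e + f for e in E for f in F}

E = {"a", "ab"}
F = {"c", "bc"}
print(sorted(concat(E, F)))  # ['abbc', 'abc', 'ac']
```

Note that "abc" arises from two different splits (a+bc and ab+c); membership in EF only requires at least one valid split.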
5
Common Regular Expression Operators (in XFST notation)
*  +    iteration: E*, E+
E* = {e1e2 … en : n ≥ 0, e1 ∈ E, …, en ∈ E}
To pick a string in E*, pick any number of strings in E and concatenate them. To find out whether w ∈ E*, look for at least one way to split w into 0 or more sections, e1e2 … en, all of which are in E.
E+ = {e1e2 … en : n > 0, e1 ∈ E, …, en ∈ E} = EE*
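A brute-force membership test for E* follows the definition, trying every way to peel a prefix in E off of w (illustrative only; real toolkits compile an FSA instead):

```python
def in_star(w, E):
    """Test w in E*: some split of w into 0 or more pieces, each in E."""
    if w == "":
        return True                      # the n = 0 case
    return any(e != "" and w.startswith(e) and in_star(w[len(e):], E)
               for e in E)

E = {"ab", "ba"}
print(in_star("abba", E))  # True: ab + ba
print(in_star("aab", E))   # False
```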
6
Common Regular Expression Operators (in XFST notation)
|       union: E | F
E | F = {w : w ∈ E or w ∈ F} = E ∪ F
To pick a string in E | F, pick a string from either E or F. To find out whether w ∈ E | F, check whether w ∈ E or w ∈ F.
7
Common Regular Expression Operators (in XFST notation)
&       intersection: E & F
E & F = {w : w ∈ E and w ∈ F} = E ∩ F
To pick a string in E & F, pick a string from E that is also in F. To find out whether w ∈ E & F, check whether w ∈ E and w ∈ F.
8
Common Regular Expression Operators (in XFST notation)
~ \ -   complementation, minus: ~E, \x, F-E
~E = {e : e ∉ E} = Σ* − E
E − F = {e : e ∈ E and e ∉ F} = E & ~F
\x = Σ − x (any single character not matching x)
Σ is the set of all letters; so Σ* is the set of all strings.
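Since ~E is defined relative to Σ*, a finite demo can only show minus and the single-symbol complement; the alphabet SIGMA below is an assumption:

```python
SIGMA = {"a", "b", "c"}              # assumed alphabet

def term_complement(x):
    """XFST \\x: any single symbol other than x."""
    return SIGMA - {x}

E = {"a", "ab"}
F = {"ab", "b"}
print(sorted(E - F))                 # ['a']
print(sorted(term_complement("a")))  # ['b', 'c']
```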
9
Regular Expressions
A language is a set of strings. It is a regular language if there exists an FSA that accepts all the strings in the language, and no other strings. If E and F denote regular languages, then so do EF, etc.
Regular expression: EF*|(F & G)+
Syntax: a parse tree with leaves E, F, G and internal nodes concat, *, &, +, | [tree figure omitted]
Semantics: denotes a regular language. As usual, we can build the semantics compositionally, bottom-up. E, F, G must denote regular languages. As a base case, e denotes {e} (a language containing a single string), so ef*|(f&g)+ is regular.
10
Regular Expressions for Regular Relations
A language is a set of strings. It is a regular language if there exists an FSA that accepts all the strings in the language, and no other strings. If E and F denote regular languages, then so do EF, etc.
A relation is a set of pairs, here pairs of strings. It is a regular relation if there exists an FST that accepts all the pairs in the relation, and no other pairs. If E and F denote regular relations, then so do EF, etc.
EF = {(ef, e′f′) : (e, e′) ∈ E, (f, f′) ∈ F}
Can you guess the definitions for E*, E+, E | F, E & F when E and F are regular relations?
Surprise: E & F isn’t necessarily regular in the case of relations, so it is not supported.
11
Common Regular Expression Operators (in XFST notation)
.x.     crossproduct: E .x. F
E .x. F = {(e, f) : e ∈ E, f ∈ F}
Combines two regular languages into a regular relation.
12
Common Regular Expression Operators (in XFST notation)
.o.     composition: E .o. F
E .o. F = {(e, f) : ∃m. (e, m) ∈ E, (m, f) ∈ F}
Composes two regular relations into a regular relation. As we’ve seen, this generalizes ordinary function composition.
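The definition translates line-for-line into Python over finite sets of pairs (the cat/chat/Katze pairs are invented for illustration):

```python
def compose(E, F):
    """E .o. F = {(e, f) : there is some m with (e, m) in E and (m, f) in F}."""
    return {(e, f) for (e, m1) in E for (m2, f) in F if m1 == m2}

E = {("cat", "chat")}    # hypothetical English-to-French pair
F = {("chat", "Katze")}  # hypothetical French-to-German pair
print(compose(E, F))     # {('cat', 'Katze')}
```

Ordinary function composition is the special case where each input string has at most one output.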
13
Common Regular Expression Operators (in XFST notation)
.u      upper (input) language: E.u   “domain”
E.u = {e : ∃m. (e, m) ∈ E}
14
Common Regular Expression Operators (in XFST notation)
        concatenation            E F
*  +    iteration                E*, E+
|       union                    E | F
&       intersection             E & F
~ \ -   complementation, minus   ~E, \x, F-E
.x.     crossproduct             E .x. F
.o.     composition              E .o. F
.u      upper (input) language   E.u   “domain”
.l      lower (output) language  E.l   “range”
15
Function from strings to ...
[figure: a 2×2 table of machine types]
Unweighted acceptors (FSAs): arcs like a, c, e; map each string to {false, true}.
Unweighted transducers (FSTs): arcs like a:x, c:z, e:y; map each string to a set of strings.
Weighted acceptors: arcs like a/.5, c/.7, e/.5 with final weight .3; map each string to a number.
Weighted transducers: arcs like a:x/.5, c:z/.7, e:y/.5 with final weight .3; map each string to (string, number) pairs.
16
How to implement?
slide courtesy of L. Karttunen (modified)
        concatenation            E F
*  +    iteration                E*, E+
|       union                    E | F
~ \ -   complementation, minus   ~E, \x, E-F
&       intersection             E & F
.x.     crossproduct             E .x. F
.o.     composition              E .o. F
.u      upper (input) language   E.u   “domain”
.l      lower (output) language  E.l   “range”
17
Concatenation [machine figure omitted; example courtesy of M. Mohri]
18
Union | [machine figure omitted; example courtesy of M. Mohri]
19
Closure (this example has outputs too)
example courtesy of M. Mohri
[machine figure omitted] The loop creates (red machine)+. Then we add a state to get ε | (red machine)+. Why do it this way? Why not just make state 0 final?
20
Upper language (domain)
example courtesy of M. Mohri
[machine figure omitted] .u extracts the upper language; similarly construct the lower language .l. These are also called the input and output languages.
21
Reversal .r [machine figure omitted; example courtesy of M. Mohri]
22
Inversion .i [machine figure omitted; example courtesy of M. Mohri]
23
Complementation
Given a machine M, represent all strings not accepted by M. Just change final states to non-final and vice versa. This works only if the machine has been determinized and completed first. (Why?)
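A sketch of why determinization and completeness matter: with a total transition function, every string ends in exactly one state, so swapping final and non-final states exactly flips acceptance (the tiny example DFA is made up):

```python
def accepts(dfa, w):
    """Run a deterministic, complete FSA on w."""
    states, start, finals, delta = dfa
    q = start
    for c in w:
        q = delta[(q, c)]   # completeness: delta is defined for every (state, symbol)
    return q in finals

def complement(dfa):
    """Swap final and non-final states. Correct only because the machine is
    deterministic (one run per string) and complete (no missing arcs)."""
    states, start, finals, delta = dfa
    return (states, start, states - finals, delta)

# DFA over {a, b} accepting strings containing at least one 'a'
M = ({0, 1}, 0, {1},
     {(0, "a"): 1, (0, "b"): 0, (1, "a"): 1, (1, "b"): 1})
print(accepts(M, "bab"))              # True
print(accepts(complement(M), "bab"))  # False
print(accepts(complement(M), "bbb"))  # True
```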
24
Intersection
example adapted from M. Mohri
[machine figures omitted] Machine 1 (states 0, 1, 2/0.8) has arcs fat/0.5, pig/0.3, eats/0, sleeps/0.6. Machine 2 (states 0, 1, 2/0.5) has arcs fat/0.2, pig/0.4, eats/0.6, sleeps/1.3. Their intersection (states 0,0  0,1  1,1  2,0/0.8  2,2/1.3) has arcs fat/0.7, pig/0.7, eats/0.6, sleeps/1.9.
25
Intersection (continued)
[machine figures omitted] Paths 0012 and 0110 both accept fat pig eats. So must the new machine: along path 0,0  0,1  1,1  2,0.
26
Intersection (continued)
[machine figures omitted] Paths 00 and 01 both accept fat (fat/0.5 & fat/0.2 = fat/0.7). So must the new machine: along path 0,0  0,1.
27
Intersection (continued)
[machine figures omitted] Paths 00 and 11 both accept pig (pig/0.3 & pig/0.4 = pig/0.7). So must the new machine: along path 0,1  1,1.
28
Intersection (continued)
[machine figures omitted] Two paths both accept sleeps (sleeps/0.6 & sleeps/1.3 = sleeps/1.9). So must the new machine: along path 1,1  2,2.
29
Intersection (completed)
[machine figures omitted] The finished machine: fat/0.7 from 0,0 to 0,1; pig/0.7 from 0,1 to 1,1; eats/0.6 from 1,1 to 2,0/0.8; sleeps/1.9 from 1,1 to 2,2/1.3.
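The product construction behind this example can be sketched in Python, using the arc labels and weights from the slides (the state topology below is an assumption, but the combined weights come out as shown: matching labels pair up states and the weights add, as in the tropical semiring):

```python
def intersect(arcs1, arcs2):
    """Weighted intersection by the product construction: arcs with the same
    label pair up their source and target states, and their weights add."""
    return {((q1, q2), (r1, r2), lab1, w1 + w2)
            for (q1, r1, lab1, w1) in arcs1
            for (q2, r2, lab2, w2) in arcs2
            if lab1 == lab2}

# arc weights from the slides; states here are an assumed topology
m1 = {(0, 1, "fat", 0.5), (1, 2, "pig", 0.3),
      (2, 0, "eats", 0.0), (2, 2, "sleeps", 0.6)}
m2 = {(0, 1, "fat", 0.2), (1, 1, "pig", 0.4),
      (1, 2, "eats", 0.6), (1, 2, "sleeps", 1.3)}

for arc in sorted(intersect(m1, m2), key=str):
    print(arc)   # combined weights: fat 0.7, pig 0.7, eats 0.6, sleeps 1.9
```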
30
What Composition Means
[figure omitted: transducer f maps ab?d to abcd, abed, abjd (weights 3, 2, 6); transducer g maps those middle strings on to abgd, abed, abd, … (weights 4, 2, 8)]
31
What Composition Means
[figure omitted] Relation composition f ∘ g maps ab?d directly to the final outputs abgd, abed, abd, …, adding the weights along each two-step path (3+4, 2+2, 6+8).
32
Relation = set of pairs
[figure omitted] f = {(ab?d, abcd), (ab?d, abed), (ab?d, abjd), …}; g = {(abcd, abgd), (abed, abed), (abed, abd), …}. g does not contain any pair of the form (abjd, …).
33
f ∘ g = {(x, z) : ∃y ((x, y) ∈ f and (y, z) ∈ g)}, where x, y, z are strings
Relation = set of pairs: f = {(ab?d, abcd), (ab?d, abed), (ab?d, abjd), …}; g = {(abcd, abgd), (abed, abed), (abed, abd), …}
f ∘ g = {(ab?d, abgd), (ab?d, abed), (ab?d, abd), …}
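The worked example can be checked directly with sets of pairs:

```python
f = {("ab?d", "abcd"), ("ab?d", "abed"), ("ab?d", "abjd")}
g = {("abcd", "abgd"), ("abed", "abed"), ("abed", "abd")}

# f o g keeps (x, z) whenever some middle string y links them
fog = {(x, z) for (x, y1) in f for (y2, z) in g if y1 == y2}
print(sorted(fog))  # [('ab?d', 'abd'), ('ab?d', 'abed'), ('ab?d', 'abgd')]
# (ab?d, abjd) contributes nothing: g has no pair whose upper string is abjd
```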
34
Intersection vs. Composition
[figures omitted] Intersection: pig/0.3 & pig/0.4 = pig/0.7 (an arc from 0,0 to 1,1 in the product machine).
Composition: Wilbur:pig/0.3 .o. pig:pink/0.4 = Wilbur:pink/0.7 (the intermediate symbol pig matches and disappears).
35
Intersection vs. Composition
[figures omitted] Intersection mismatch: pig/0.3 & elephant/0.4 — the labels differ, so no arc results.
Composition mismatch: Wilbur:pig/0.3 .o. elephant:gray/0.4 — the intermediate symbols (pig vs. elephant) differ, so no arc such as Wilbur:gray/0.7 results.
36
Composition .o.
example courtesy of M. Mohri
[machine figures omitted] FIRST MACHINE: an odd number of a’s (which become b’s), then a b that is left alone, then any number of a’s, the first of which is optionally replaced by b. SECOND MACHINE: one or more b’s (all but the first of which become a’s), then a:b, then an optional b:a.
37
Composition step: a:b .o. b:b = a:b
38
Composition step: a:b .o. b:a = a:a
39
Composition step: a:b .o. b:a = a:a
40
Composition step: b:b .o. b:a = b:a
41
Composition step: a:b .o. b:a = a:a
42
Composition step: a:a .o. a:b = a:b
43
Composition step: b:b .o. a:b = nothing (since the intermediate symbols don’t match)
44
Composition step: b:b .o. b:a = b:a
45
Composition step: a:b .o. a:b = a:b
46
Composition in Dyna

start = &pair( start1, start2 ).
final(&pair(Q1,Q2)) :- final1(Q1), final2(Q2).
edge(U, L, &pair(Q1,Q2), &pair(R1,R2)) min= edge1(U, Mid, Q1, R1) + edge2(Mid, L, Q2, R2).
47
f ∘ g = {(x, z) : ∃y ((x, y) ∈ f and (y, z) ∈ g)}, where x, y, z are strings
Relation = set of pairs: f = {(ab?d, abcd), (ab?d, abed), (ab?d, abjd), …}; g = {(abcd, abgd), (abed, abed), (abed, abd), …}
f ∘ g = {(ab?d, abgd), (ab?d, abed), (ab?d, abd), …}
48
3 Uses of Set Composition:
Feed a string into the Greek transducer: {(abed, abed)} .o. Greek = {(abed, abed), (abed, abd)}; [{abed} .o. Greek].l = {abed, abd}
Feed several strings in parallel: {abcd, abed} .o. Greek = {(abcd, abgd), (abed, abed), (abed, abd)}; [{abcd, abed} .o. Greek].l = {abgd, abed, abd}
Filter the result via No = {abgd, abd, …}: {abcd, abed} .o. Greek .o. No = {(abcd, abgd), (abed, abd)}
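The three uses can be replayed with finite sets, treating a language as the identity relation on its strings (a common convention when composing a language with a relation):

```python
Greek = {("abcd", "abgd"), ("abed", "abed"), ("abed", "abd")}
No = {"abgd", "abd"}

def compose(E, F):
    return {(x, z) for (x, y1) in E for (y2, z) in F if y1 == y2}

def lower(E):                 # the .l operator: project onto output strings
    return {z for (_, z) in E}

def ident(S):                 # treat a language as the identity relation
    return {(w, w) for w in S}

# 1. feed one string in
out = compose(ident({"abed"}), Greek)
print(sorted(lower(out)))     # ['abd', 'abed']

# 2. feed several strings in parallel
par = compose(ident({"abcd", "abed"}), Greek)
print(sorted(lower(par)))     # ['abd', 'abed', 'abgd']

# 3. filter the result through No
filtered = compose(par, ident(No))
print(sorted(filtered))       # [('abcd', 'abgd'), ('abed', 'abd')]
```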
49
What are the “basic” transducers?
The operations on the previous slides combine transducers into bigger ones. But where do we start?
a:ε for a ∈ Σ
ε:x for x ∈ Δ
Q: Do we also need a:x? How about ε:ε?
50
Some Xerox Extensions
slide courtesy of L. Karttunen (modified)
$    containment
=>   restriction
->   replacement
These make it easier to describe complex languages and relations without extending the formal power of finite-state systems.
51
Containment
$[ab*c]
“Must contain a substring that matches ab*c.” Accepts xxxacyy; rejects bcba.
Equivalent expression: ?* [ab*c] ?*
Warning: ? in regexps means “any character at all.” But ? in machines means “any character not explicitly mentioned anywhere in the machine.”
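Python’s re can play the same containment game if we remember that XFST’s ? corresponds to regex `.` here, and that re.search already supplies the implicit ?* on both sides:

```python
import re

contains = re.compile(r"ab*c")           # XFST $[a b* c], roughly

print(bool(contains.search("xxxacyy")))  # True  ('ac' matches with zero b's)
print(bool(contains.search("bcba")))     # False
```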
52
Restriction
slide courtesy of L. Karttunen (modified)
a => b _ c
“Any a must be preceded by b and followed by c.” Accepts bacbbacde; rejects baca.
Equivalent expression: ~[~[?* b] a ?*] & ~[?* a ~[c ?*]]
(The first conjunct forbids an a not preceded by b; the second forbids an a not followed by c.)
[machine figure omitted]
53
Replacement
slide courtesy of L. Karttunen (modified)
a b -> b a
“Replace ‘ab’ by ‘ba’.” Transduces abcdbaba to bacdbbaa.
Equivalent expression: [~$[a b] [[a b] .x. [b a]]]* ~$[a b]
[machine figure omitted]
54
Replacement is Nondeterministic
a b -> b a | x
“Replace ‘ab’ by ‘ba’ or ‘x’, nondeterministically.” Transduces abcdbaba to {bacdbbaa, bacdbxa, xcdbbaa, xcdbxa}.
55
Replacement is Nondeterministic
[ a b -> b a | x ] .o. [ x => _ c ]
Composing with the restriction x => _ c keeps only outputs in which every x is followed by c, filtering the nondeterministic outputs {bacdbbaa, bacdbxa, xcdbbaa, xcdbxa} down to {bacdbbaa, xcdbbaa}.
56
Replacement is Nondeterministic
slide courtesy of L. Karttunen (modified)
a b | b | b a | a b a -> x applied to “aba”
Four overlapping substrings match; we haven’t told it which one to replace, so it chooses nondeterministically: a b a becomes x a, a x a, a x, or x.
57
More Replace Operators
slide courtesy of L. Karttunen
Optional replacement: a b (->) b a
Directed replacement guarantees a unique result by constraining the factorization of the input string by: direction of the match (rightward or leftward), and length (longest or shortest).
58
@-> Left-to-right, Longest-match Replacement
slide courtesy of L. Karttunen
a b | b | b a | a b a @-> x applied to “aba” yields only x (the longest match at the leftmost position).
@->   left-to-right, longest match
@>    left-to-right, shortest match
->@   right-to-left, longest match
>@    right-to-left, shortest match
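Python’s re engine is leftmost-first rather than leftmost-longest, but ordering the alternation from longest to shortest mimics @-> on this example (a sketch, not a general equivalence):

```python
import re

# longest alternative first approximates left-to-right, longest match
print(re.sub(r"aba|ab|ba|b", "x", "aba"))  # 'x'
```

With shortest-match semantics (@>) the same rule would instead replace just the b, giving a x a.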
59
Using “…” for marking
slide courtesy of L. Karttunen (modified)
a|e|i|o|u -> [ ... ]
transduces p o t a t o into p[o]t[a]t[o].
Note: we actually have to write it as -> %[ ... %] or -> “[” ... “]”, since [ ] are parens in the regexp language.
[machine figure omitted: the transducer inserts 0:[ before and 0:] after each vowel]
60
Using “…” for marking (continued)
slide courtesy of L. Karttunen (modified)
a|e|i|o|u -> [ ... ] transduces p o t a t o into p[o]t[a]t[o].
Which way does the FST transduce potatoe: p o t a t o e into p[o]t[a]t[o][e], or into p[o]t[a]t[o e]? How would you change it to get the other answer?
61
Example: Finnish Syllabification
slide courtesy of L. Karttunen
define C [ b | c | d | f ... ];
define V [ a | e | i | o | u ];
C* V+ C* @-> ... "-" || _ [C V]
“Insert a hyphen after the longest instance of the C* V+ C* pattern in front of a C V pattern.” (Why longest-match?)
s t r u k t u r a l i s m i  becomes  s t r u k - t u - r a - l i s - m i
62
Conditional Replacement
slide courtesy of L. Karttunen
A -> B || L _ R
The relation that replaces A by B between L and R, leaving everything else unchanged. (A -> B is the replacement; L _ R is the context.)
Sources of complexity: replacements and contexts may overlap; there are alternative ways of interpreting “between left and right.”
63
Hand-Coded Example: Parsing Dates
slide courtesy of L. Karttunen
Best result: Today is [Tuesday, July 25, 2000].
Bad results:
Today is Tuesday, [July 25, 2000].
Today is [Tuesday, July 25], 2000.
Today is Tuesday, [July 25], 2000.
Today is [Tuesday], July 25, 2000.
Need left-to-right, longest-match constraints.
64
Source code: Language of Dates
slide courtesy of L. Karttunen
Day = Monday | Tuesday | ... | Sunday
Month = January | February | ... | December
Date = 1 | 2 | 3 | ... | 31
Year = 0To9 (0To9 (0To9 (0To9))) - [%0 ?*]     (from 1 to 9999, no leading zero)
AllDates = Day | (Day ", ") Month " " Date (", " Year)
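A rough Python re analogue of this grammar (the token sets are abbreviated, and the exact abbreviations are assumptions):

```python
import re

DAY = r"(?:Mon|Tues|Wednes|Thurs|Fri|Satur|Sun)day"
MONTH = (r"(?:January|February|March|April|May|June|July"
         r"|August|September|October|November|December)")
DATE = r"(?:3[01]|[12][0-9]|[1-9])"   # 1..31, no leading zero
YEAR = r"(?:[1-9][0-9]{0,3})"         # 1..9999, no leading zero
ALLDATES = re.compile(
    rf"(?:{DAY}|(?:{DAY}, )?{MONTH} {DATE}(?:, {YEAR})?)")

print(bool(ALLDATES.fullmatch("Tuesday, July 25, 2000")))  # True
print(bool(ALLDATES.fullmatch("July 25")))                 # True
print(bool(ALLDATES.fullmatch("July 32, 2000")))           # False
```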
65
Object code: All Dates from 1/1/1 to 12/31/9999
slide courtesy of L. Karttunen
[machine figure omitted: the compiled FSA for date expressions, 13 states and 96 arcs; a single drawn arc labeled by a string such as “Jan” actually represents 7 arcs, one per weekday, and so on]
66
Parser for Dates
slide courtesy of L. Karttunen (modified)
AllDates @-> “[DT ” ... “]”     (Xerox left-to-right replacement operator)
Compiles into an unambiguous transducer (23 states, 332 arcs).
Today is [DT Tuesday, July 25, 2000] because yesterday was [DT Monday] and it was [DT July 24] so tomorrow must be [DT Wednesday, July 26] and not [DT July 27] as it says on the program.
67
Problem of Reference
slide courtesy of L. Karttunen
Valid dates: Tuesday, July 25, 2000; Tuesday, February 29, 2000; Monday, September 16, 1996.
Invalid dates: Wednesday, April 31, 1996; Thursday, February 29, 1900; Tuesday, July 26, 2000.
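The “problem of reference” is exactly what a calendar library checks; a sketch using Python’s datetime:

```python
import datetime

def valid(day_name, month, date, year):
    """A date expression is valid iff the calendar date exists and
    falls on the named weekday."""
    try:
        d = datetime.date(year, month, date)
    except ValueError:          # e.g. April 31, or Feb 29 in a non-leap year
        return False
    return d.strftime("%A") == day_name

print(valid("Tuesday", 7, 25, 2000))    # True
print(valid("Tuesday", 2, 29, 2000))    # True  (2000 was a leap year)
print(valid("Wednesday", 4, 31, 1996))  # False (April has 30 days)
print(valid("Tuesday", 7, 26, 2000))    # False (it was a Wednesday)
```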
68
Refinement by Intersection
slide courtesy of L. Karttunen (modified)
Valid Dates = AllDates refined by intersection with:
LeapYears:       “Feb 29, ” => _ …     (Xerox contextual restriction operator)
MaxDaysInMonth:  “ 31” => Jan|Mar|May|… _    and    “ 30” => Jan|Mar|Apr|… _
WeekdayDate
Q: Why does the LeapYears rule end with a comma? Q: Can we write the whole rule? Q: Why do the MaxDaysInMonth rules start with spaces? (And is it enough?)
69
Defining Valid Dates
slide courtesy of L. Karttunen
AllDates & MaxDaysInMonth & LeapYears & WeekdayDates = ValidDates
AllDates (date expressions): 13 states, 96 arcs.
ValidDates (date expressions): 805 states, 6472 arcs.
70
Parser for Valid and Invalid Dates
slide courtesy of L. Karttunen
ValidDates @-> “[VD ” ... “]” , [AllDates - ValidDates] @-> “[ID ” ... “]”
(2688 states, 20439 arcs. The comma creates a single FST that does left-to-right longest match against either pattern.)
Today is [VD Tuesday, July 25, 2000] (valid date), not [ID Tuesday, July 26, 2000] (invalid date).