Jan 4th, 2013. Event Extraction Using Distant Supervision. Kevin Reschke.
Event Extraction. From a news corpus ("… Delta Flight 14 crashed in Mississippi killing 40 …") to a structured knowledge base.
Event Extraction
1) Generate candidates: run Named Entity Recognition on relevant documents. E.g., "Flight 14 crashed in Mississippi."
2) Classify mentions. Features: (Unigram:Mississippi) (NEType:Location) (PrevWord:in) (ObjectOf:crashed). Label: CrashSite.
3) Aggregate labels. Final label: CrashSite.
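Below is a hedged sketch of the mention-classification step. The Mention class, the extract_features helper, and its field names are illustrative assumptions rather than the system's actual code, but the feature templates mirror the ones listed above.

```python
from dataclasses import dataclass

@dataclass
class Mention:
    tokens: list           # sentence tokens, e.g. ["Flight", "14", "crashed", "in", "Mississippi", "."]
    start: int             # index of the mention's first token
    end: int               # index one past the mention's last token
    ne_type: str           # NER type, e.g. "Location"
    governing_verb: str    # verb the mention is an object of, e.g. "crashed"

def extract_features(m: Mention) -> list:
    """Turn one candidate mention into the sparse features from the slide."""
    feats = [f"Unigram:{tok}" for tok in m.tokens[m.start:m.end]]    # Unigram:Mississippi
    feats.append(f"NEType:{m.ne_type}")                              # NEType:Location
    if m.start > 0:
        feats.append(f"PrevWord:{m.tokens[m.start - 1]}")            # PrevWord:in
    if m.governing_verb:
        feats.append(f"ObjectOf:{m.governing_verb}")                 # ObjectOf:crashed
    return feats

m = Mention(["Flight", "14", "crashed", "in", "Mississippi", "."], 4, 5, "Location", "crashed")
print(extract_features(m))
# ['Unigram:Mississippi', 'NEType:Location', 'PrevWord:in', 'ObjectOf:crashed']
```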
Training a Mention Classifier. We need labeled training data. Problems: expensive; does not scale.
Example annotation: "One year after [USAir]_Operator [Flight 11]_FlightNumber crashed in [Toronto]_CrashSite, families of the [200]_Fatalities victims attended a memorial service in [Vancouver]_NIL."
Distant Supervision. Solution: use known events from a training knowledge base to automatically label training data.
"One year after [USAir]_Operator [Flight 11]_FlightNumber crashed in [Toronto]_CrashSite, families of the [200]_Fatalities victims attended a memorial service in [Vancouver]_NIL."
Distant Supervision (High Level). Begin with a set of known facts. Use this set to automatically label training instances from the corpus. Train and classify (handling noise).
Distant Supervision for Relation Extraction. Slot filling for named entity relations (Mintz et al., 2009, ACL; Surdeanu et al., 2011, TAC-KBP).
Example: slots for the Company entity type. Known relation: founder_of(Steve Jobs, Apple).
Noisy labeling rule: the slot value and the entity name must appear in the same sentence.
1. (+) Apple co-founder Steve Jobs passed away yesterday.
2. (-) Steve Jobs delivered the Stanford commencement address.
3. (+) Steve Jobs was fired from Apple in 1985.
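The sentence-level rule can be sketched roughly as follows. The label_sentences helper and the raw substring matching are simplifying assumptions; real sentence splitting and entity matching would be more careful.

```python
def label_sentences(sentences, entity, slot_value):
    """Mark a sentence as a positive example for (entity, slot_value)
    iff both strings occur in that sentence."""
    labeled = []
    for sent in sentences:
        positive = entity in sent and slot_value in sent
        labeled.append((sent, "+" if positive else "-"))
    return labeled

sents = [
    "Apple co-founder Steve Jobs passed away yesterday.",
    "Steve Jobs delivered the Stanford commencement address.",
    "Steve Jobs was fired from Apple in 1985.",
]
# founder_of(Steve Jobs, Apple): sentences 1 and 3 come out positive,
# even though only sentence 1 actually expresses the relation (hence "noisy").
print(label_sentences(sents, "Apple", "Steve Jobs"))
```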
Distant Supervision for Event Extraction. The sentence-level labeling rule doesn't work here:
1. Events lack proper names: "The crash of USAir Flight 11".
2. Slot values occur separately from names: "The plane went down in central Texas." "10 died and 30 were injured in yesterday's tragic incident."
Automatic Labeling for Event Extraction. Solution: a document-level noisy labeling rule.
Heuristic: use the flight number as a proxy for the event name.
Labeling rule: the slot value and the flight number must appear in the same document.
Training fact: {FlightNumber: Flight 11, CrashSite: Toronto}
"…Flight 11 crash Sunday…" "…The plane went down in [Toronto]_CrashSite…"
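A minimal sketch of this document-level rule, assuming a document is a list of sentences and a known fact is a dict of slot values; the label_document helper is illustrative, not the author's implementation.

```python
def label_document(doc_sentences, fact):
    """If the flight number occurs anywhere in the document, label every
    mention of a known slot value with that slot; other mentions stay NIL."""
    doc_text = " ".join(doc_sentences)
    if fact["FlightNumber"] not in doc_text:
        return []                        # document not linked to this event
    labels = []
    for sent in doc_sentences:
        for slot, value in fact.items():
            if value in sent:
                labels.append((value, slot, sent))
    return labels

doc = ["Flight 11 crash Sunday shocked the country.",
       "The plane went down in Toronto."]
fact = {"FlightNumber": "Flight 11", "CrashSite": "Toronto"}
# "Toronto" in the second sentence is labeled CrashSite even though
# "Flight 11" never appears in that sentence.
print(label_document(doc, fact))
```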
Evaluation: 80 plane crashes from Wikipedia infoboxes. Training set: 32; dev set: 8; test set: 40. Corpus: newswire data from 1989 to present.
Automatic Labeling. 38,000 training instances; 39% noise. Examples:
Good: "At least 52 people survived the crash of the Boeing 737."
Bad: "First envisioned in 1964, the Boeing 737 entered service in 1968."
Extraction Models
Local model: train and classify each mention independently.
Pipeline model: classify mentions sequentially, using the previous label as a feature. Captures dependencies between labels, e.g., Passengers and Crew go together: "4 crew and 200 passengers were on board."
Joint model: Searn algorithm (Daumé III et al., 2009); jointly models all mentions in a sentence.
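As a rough illustration of the pipeline model, the snippet below threads the previously predicted label into the next mention's features. classify_pipeline, the ToyClassifier, and its predict interface are placeholders standing in for a trained model, not the actual system.

```python
def classify_pipeline(mentions, classifier, feature_fn):
    """Classify mentions left to right, feeding each predicted label forward."""
    labels = []
    prev_label = "NONE"
    for m in mentions:
        feats = feature_fn(m) + [f"PrevLabel:{prev_label}"]   # label dependency
        label = classifier.predict(feats)
        labels.append(label)
        prev_label = label                                    # feeds the next mention
    return labels

class ToyClassifier:
    """Stand-in for a trained model; real features would come from extract_features."""
    def predict(self, feats):
        if "Unigram:4" in feats:
            return "Crew"
        if "PrevLabel:Crew" in feats and "Unigram:200" in feats:
            return "Passengers"
        return "NIL"

mentions = [["Unigram:4"], ["Unigram:200"]]            # pre-extracted feature lists
print(classify_pipeline(mentions, ToyClassifier(), lambda m: m))
# ['Crew', 'Passengers']  -- "200" benefits from the PrevLabel:Crew feature
```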
Results
Label Aggregation: Exhaustive Aggregation. (Diagram: repeated mentions of "Four" and their per-mention labels being aggregated into a final label.)
Label Aggregation: Noisy-OR. Key idea: the classifier gives us a distribution over labels for each mention (e.g., for the mention "Stockholm"). Compute the noisy-OR for each label across mentions; if the noisy-OR exceeds a threshold, use that label.
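A sketch of one standard noisy-OR formulation (the exact formula and threshold used in the talk are assumptions here): for each label, combine the per-mention probabilities and keep the label only if the combined score clears a threshold.

```python
def noisy_or(mention_probs, label):
    """mention_probs: one dict per mention, mapping label -> probability."""
    prod = 1.0
    for dist in mention_probs:
        prod *= 1.0 - dist.get(label, 0.0)   # P(this mention is NOT the label)
    return 1.0 - prod                        # P(at least one mention is the label)

def aggregate(mention_probs, labels, threshold=0.9):
    """Keep a label only if its noisy-OR across all mentions clears the threshold."""
    return [lab for lab in labels if noisy_or(mention_probs, lab) > threshold]

# Three mentions with CrashSite probabilities 0.6, 0.5, 0.4:
probs = [{"CrashSite": 0.6}, {"CrashSite": 0.5}, {"CrashSite": 0.4}]
print(noisy_or(probs, "CrashSite"))       # 1 - 0.4*0.5*0.6 = 0.88
print(aggregate(probs, ["CrashSite"]))    # [] with a 0.9 threshold
```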
Results: Noisy-OR
Next Step. Compare distant supervision with a state-of-the-art supervised approach (Huang & Riloff, ACL 2011).
Task: MUC-4 shared task on terrorist attacks. Slot template: …
Distant supervision source: http://en.wikipedia.org/wiki/List_of_terrorist_incidents (short summaries of several hundred terrorist attacks).