
Transformation-based Learning for Semantic Parsing
F. Jurcicek, M. Gasic, S. Keizer, F. Mairesse, B. Thomson, K. Yu, S. Young
Cambridge University Engineering Department | {fj228, mg436, sk561, f.mairesse, brmt2, ky219, sjy}@eng.cam.ac.uk

Goals

Develop a fast semantic decoder that is:
- robust to speech recognition noise
- trainable on different domains: tourist information (TownInfo) and the air travel information system (ATIS)

Motivation

Semantic parsing maps natural language to a formal language. Most of the mapping in current dialogue systems is fairly simple (cities, times, ...). To develop a fast semantic parser, we take advantage of this feature: we infer a compact set of transformation rules which transform an initial naive semantic hypothesis.

We compare semantic tuple classifiers with:
- a handcrafted Phoenix grammar
- the Hidden Vector State model (He & Young, 2006)
- Probabilistic Combinatory Categorial Grammar Induction (Zettlemoyer & Collins, 2007)
- Markov Logic Networks (Meza-Ruiz et al., 2008)

Evaluation metrics:
- dialogue act type accuracy (e.g. inform or request)
- precision, recall and F-measure of dialogue act items (e.g. food=Chinese, toloc.city=New York)

Conclusion: semantic tuple classifiers are robust to noise and competitive with the state of the art. They provide a simple yet efficient solution to the spoken language understanding problem!

Example of parsing

Example: "find all the flights between Toronto and San Diego that arrive on Saturday"
1. The parser assigns initial semantics to the input sentence.
2. Rules whose triggers match the sentence and the hypothesised semantics are applied sequentially.
3. The resulting semantics is the parser's output.

Open source

The code is available at http://code.google.com/p/tbed-parser

Acknowledgements

This research was partly funded by the UK EPSRC under grant agreement EP/F013930/1 and by the EU FP7 Programme under grant agreement 216594 (CLASSiC project: www.classic-project.org).
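The three parsing steps above can be sketched as a simple trigger-rule loop. This is only an illustration of the idea: the rule format, function names, and data structures are assumptions for this sketch, not the actual tbed-parser API (and plain substring triggers are used here, so wildcard triggers like "flights * from" would additionally need regular-expression matching).

```python
def parse(utterance, rules, initial_goal="flight"):
    """Assign a naive initial semantics, then sequentially apply every
    rule whose trigger matches the utterance and whose condition matches
    the current semantic hypothesis (hypothetical rule format)."""
    semantics = {"GOAL": initial_goal}
    for trigger, condition, action in rules:
        if trigger in utterance and all(
            semantics.get(slot) == value for slot, value in condition.items()
        ):
            action(semantics)
    return semantics

# Illustrative rules: (trigger phrase, condition on semantics, transformation).
rules = [
    ("between toronto", {}, lambda s: s.update({"from.city": "Toronto"})),
    ("and san diego", {}, lambda s: s.update({"to.city": "San Diego"})),
    ("saturday", {}, lambda s: s.update({"departure.day": "Saturday"})),
]

utterance = "find all the flights between toronto and san diego that arrive on saturday"
result = parse(utterance, rules)
# result == {'GOAL': 'flight', 'from.city': 'Toronto',
#            'to.city': 'San Diego', 'departure.day': 'Saturday'}
```

Because rules are applied in sequence, a later rule can rewrite the output of an earlier one, which is what lets a compact rule list correct the naive initial hypothesis step by step.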
Evaluation

Semantic decoder              Act accuracy   Item precision   Item recall   Item F

TownInfo dataset with transcribed utterances:
Semantic Tuple Classifiers    94.9           97.4             94.0          95.7
Phoenix grammar               94.8           96.3             94.2          95.3

TownInfo dataset with ASR output:
Semantic Tuple Classifiers    85.2           94.0             83.7          88.6
Phoenix grammar               74.7           90.3             79.5          85.5

ATIS dataset with transcribed utterances:
Semantic Tuple Classifiers    92.6           96.7             92.4          94.5
Hidden Vector State           -              -                -             90.3
PCCG Induction                -              95.1             96.7          95.9
Markov Logic Networks         -              93.4             89.8          91.6

Example decoder output:
"what are the lowest airfare from Washington DC to Boston"
GOAL=airfare, airfare.type=lowest, from.city=Washington, from.state=DC, to.city=Boston

Transformation parsing

Triggers                             Transformations
"tickets"                            replace the goal by "airfare"
"flights * from" & GOAL=airfare      replace the goal by "flight"
"Seattle"                            add the slot "to.city=Seattle"
"connecting"                         replace the slot "to.city=*" by "stop.city=*"

Resulting semantics:
GOAL=flight, from.city=Toronto, to.city=San Diego, departure.day=Saturday

#   Trigger              Transformation
1   "between toronto"    add the slot "from.city=Toronto"
2   "and san diego"      add the slot "to.city=San Diego"
3   "Saturday"           add the slot "departure.day=Saturday"
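The item precision, recall and F-measure figures in the table above are computed over dialogue act items, i.e. slot=value pairs. A minimal sketch of that computation (the function name and the example hypothesis are assumptions for illustration, not output of the actual decoder):

```python
def item_prf(reference, hypothesis):
    """Precision, recall and F-measure over dialogue act items
    (slot=value pairs), matched as exact set members."""
    ref, hyp = set(reference), set(hypothesis)
    correct = len(ref & hyp)
    precision = correct / len(hyp) if hyp else 0.0
    recall = correct / len(ref) if ref else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Reference items from the airfare example above; the hypothesis is a
# made-up decoder output that misses one item (from.state=DC).
ref = ["GOAL=airfare", "airfare.type=lowest", "from.city=Washington",
       "from.state=DC", "to.city=Boston"]
hyp = ["GOAL=airfare", "airfare.type=lowest", "from.city=Washington",
       "to.city=Boston"]
p, r, f = item_prf(ref, hyp)
# p = 4/4 = 1.0, r = 4/5 = 0.8, F = 2*1.0*0.8/1.8 ≈ 0.889
```

In the table these scores are aggregated over a whole test set rather than a single utterance, but the per-utterance definition is the same.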

