
Haitham Elmarakeby.  Speech recognition




1 Haitham Elmarakeby

2  Speech recognition http://nlp.stanford.edu/courses/lsa352/

3  Machine translation  Example: "Welcome to the deep learning class" → "مرحبا بكم في درس التعلم العميق" (Arabic for "Welcome to the deep learning class")

4  Question answering

5 Knight and Koehn 2003


7 Components  Translation model  Language Model  Decoding

8  Translation model  Learn P(f | e), the probability of the foreign sentence f given the English sentence e  Knight and Koehn 2003

9  Translation model  The input is segmented into phrases  Each phrase is translated into English  The phrases are reordered  Koehn 2004

10  Language Model  Goal of the language model: detect good English, i.e., estimate P(e)  Standard technique: trigram model  Knight and Koehn 2003
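The trigram model mentioned above can be sketched in a few lines. This is a minimal maximum-likelihood version (counts only, no smoothing or backoff, which a real system would need); the function names and the toy corpus are illustrative, not from the slides:

```python
from collections import defaultdict

def train_trigram(corpus):
    """Count trigrams and their bigram contexts from tokenized sentences."""
    tri, bi = defaultdict(int), defaultdict(int)
    for sent in corpus:
        toks = ["<s>", "<s>"] + sent + ["</s>"]
        for i in range(2, len(toks)):
            tri[(toks[i - 2], toks[i - 1], toks[i])] += 1
            bi[(toks[i - 2], toks[i - 1])] += 1
    return tri, bi

def p_trigram(tri, bi, w1, w2, w3):
    """MLE estimate P(w3 | w1, w2) = count(w1 w2 w3) / count(w1 w2)."""
    denom = bi[(w1, w2)]
    return tri[(w1, w2, w3)] / denom if denom else 0.0

corpus = [["the", "cat", "sat"], ["the", "cat", "ran"]]
tri, bi = train_trigram(corpus)
print(p_trigram(tri, bi, "the", "cat", "sat"))  # 0.5: "sat" follows "the cat" in 1 of 2 cases
```

P(e) for a whole sentence is then the product of these conditional probabilities, which is exactly the "detect good English" score the decoder multiplies with the translation model.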

11  Decoding Goal of the decoding algorithm: Put models to work, perform the actual translation Koehn 2004


17  Decoding  Goal of the decoding algorithm: put the models to work and perform the actual translation  Prune out the weakest hypotheses:  by absolute threshold (e.g., keep the 100 best)  by relative cutoff  Future cost estimation: compute the expected cost of the untranslated words
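The two pruning strategies on this slide can be sketched directly. This is a simplified stand-alone sketch, not Koehn's actual decoder: a hypothesis is just a (score, partial translation) pair, scores are assumed to be positive probabilities, and the default values are illustrative:

```python
def prune(hypotheses, beam_size=100, rel_cutoff=0.01):
    """Apply absolute-threshold and relative-cutoff pruning to a hypothesis stack.

    Absolute threshold: keep only the beam_size best hypotheses.
    Relative cutoff: drop any hypothesis scoring below rel_cutoff * best_score.
    """
    hypotheses = sorted(hypotheses, key=lambda h: h[0], reverse=True)
    hypotheses = hypotheses[:beam_size]                       # absolute threshold
    best_score = hypotheses[0][0]
    return [h for h in hypotheses if h[0] >= rel_cutoff * best_score]  # relative cutoff

stack = [(0.9, "the cat"), (0.5, "a cat"), (0.001, "cat the")]
print(prune(stack, beam_size=3))  # the 0.001 hypothesis falls below 0.01 * 0.9
```

Without future cost estimation, pruning like this is risky: a hypothesis may look strong only because it translated the easy words first, which is why the slide pairs pruning with an estimate of the cost of the words still untranslated.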

18 Sutskever et al., 2014: Sequence to Sequence Learning with Neural Networks

19  Model  The encoder reads the input sequence "A B C"; the decoder then emits the output sequence "W X Y Z"

20  Model Sutskever et al. 2014

21  Model- encoder Cho: From Sequence Modeling to Translation


26  Model- decoder Cho: From Sequence Modeling to Translation


29  RNN

30 Vanishing gradient Cho: From Sequence Modeling to Translation
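The vanishing-gradient problem can be seen with no machinery at all: backpropagating through T steps of a recurrence multiplies the gradient by the recurrent weight (times the activation derivative) T times, so with a factor below 1 the signal shrinks geometrically. A toy scalar illustration (the weight 0.5 and the 20 steps are arbitrary choices for the demo):

```python
# Backprop through T time steps of a scalar linear recurrence h_t = w * h_{t-1}
# multiplies the gradient by w at every step, so it scales like w ** T.
w = 0.5      # recurrent weight with |w| < 1  ->  vanishing
grad = 1.0   # gradient arriving at the last time step
for _ in range(20):
    grad *= w
print(grad)  # 0.5 ** 20 ≈ 9.5e-7: almost no learning signal reaches early steps
```

With |w| > 1 the same loop explodes instead, which is the mirror-image problem the LSTM slides turn to next.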

31  LSTM Graves 2013

32  LSTM Problem: Exploding gradient

33  LSTM  Problem: Exploding gradient  Solution: gradient clipping, i.e., rescaling the gradient when its norm exceeds a threshold
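Norm-based gradient clipping, the standard fix for exploding gradients, is a one-liner in spirit. A minimal sketch on a plain Python list (the threshold value 5.0 is an illustrative hyperparameter, not one the slides specify):

```python
import math

def clip_gradient(grad, threshold=5.0):
    """Rescale the gradient vector so its L2 norm never exceeds threshold.

    The direction is preserved; only the magnitude is capped.
    """
    norm = math.sqrt(sum(g * g for g in grad))
    if norm > threshold:
        scale = threshold / norm
        return [g * scale for g in grad]
    return grad

print(clip_gradient([6.0, 8.0]))  # norm 10 > 5, so scaled by 0.5 -> [3.0, 4.0]
```

Note the asymmetry with the previous slide: clipping tames exploding gradients, but it does nothing for vanishing ones, which is why the LSTM's gating architecture is needed as well.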

34  Reversing the Source Sentences Welcome to the deep learning class

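The trick from Sutskever et al. is purely a preprocessing step: reverse the token order of the source sentence only, leaving the target untouched, so the first source words end up close to the first target words and the optimization sees many short-range dependencies. As a sketch:

```python
def reverse_source(src_tokens):
    """Reverse only the source-side tokens; target sentences are left as-is."""
    return list(reversed(src_tokens))

print(reverse_source(["Welcome", "to", "the", "deep", "learning", "class"]))
# ['class', 'learning', 'deep', 'the', 'to', 'Welcome']
```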

36  Results  BLEU score (Bilingual Evaluation Understudy)  Candidate: the the the the the the the  Reference 1: the cat is on the mat  Reference 2: there is a cat on the mat  Naive unigram precision: P = m/w = 7/7 = 1  Papineni et al. 2002

37  Results  BLEU score (Bilingual Evaluation Understudy)  Candidate: the the the the the the the  Reference 1: the cat is on the mat  Reference 2: there is a cat on the mat  Modified unigram precision: each candidate word is counted at most as often as it appears in any one reference, so P = 2/7  Papineni et al. 2002
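The clipping rule that turns 7/7 into 2/7 is easy to implement. A minimal sketch of modified unigram precision (full BLEU also combines n-gram orders up to 4 and applies a brevity penalty, which this omits):

```python
from collections import Counter

def modified_precision(candidate, references):
    """Modified unigram precision from Papineni et al. 2002.

    Each candidate word's count is clipped by the maximum number of times
    it appears in any single reference, then divided by candidate length.
    """
    cand_counts = Counter(candidate)
    max_ref = Counter()
    for ref in references:
        for word, count in Counter(ref).items():
            max_ref[word] = max(max_ref[word], count)
    clipped = sum(min(count, max_ref[word]) for word, count in cand_counts.items())
    return clipped / sum(cand_counts.values())

cand = ["the"] * 7
refs = [["the", "cat", "is", "on", "the", "mat"],
        ["there", "is", "a", "cat", "on", "the", "mat"]]
print(modified_precision(cand, refs))  # 2/7: "the" occurs at most twice in one reference
```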

38  Results Sutskever et al. 2014

39  Results Sutskever et al. 2014

40  Model Analysis Sutskever et al. 2014

41  Long sentences Sutskever et al. 2014

42  Long sentences Cho et al. 2014

43 Bahdanau et al., 2014: Neural Machine Translation by Jointly Learning to Align and Translate

44  Long sentences  The fixed-length sentence representation may be the cause

45  Attention mechanism
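The mechanism sketched on the following slides replaces the single fixed-length vector with a per-step weighted sum over all encoder states. Below is a minimal sketch using dot-product scoring for brevity; Bahdanau et al. actually score with a small feed-forward network, and the vectors here are toy inputs:

```python
import math

def attention(decoder_state, encoder_states):
    """One attention step: score, softmax, and weighted-sum the encoder states.

    Returns (weights, context): the alignment weights over source positions
    and the context vector fed to the decoder at this step.
    """
    # 1. Score each encoder state against the current decoder state (dot product).
    scores = [sum(d * h for d, h in zip(decoder_state, hs)) for hs in encoder_states]
    # 2. Softmax the scores into weights that sum to 1 (shifted for stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # 3. The context vector is the weight-averaged encoder state.
    dim = len(encoder_states[0])
    context = [sum(w * hs[i] for w, hs in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

w, c = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
print(w)  # higher weight on the first encoder state, which matches the query
```

Because the context is recomputed at every decoder step, long sentences no longer have to squeeze through one fixed-length bottleneck, which is what flattens the BLEU-versus-length curve on the next slide.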


52  Long sentences Cho et al. 2014

53 Vinyals et al., 2015: Grammar as a Foreign Language

54 Parsing tree


58 John has a dog.

59 Converting tree to sequence
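Vinyals et al. linearize a parse tree by a depth-first traversal that emits an opening bracket with the label on the way down and a labeled closing bracket on the way up, turning the tree into a token sequence a seq2seq model can emit. A minimal sketch (the tuple-based tree encoding is an assumption of this example, not the paper's data format):

```python
def linearize(tree):
    """Depth-first linearization of a parse tree into '(LABEL ... )LABEL' tokens.

    A tree node is (label, children); a leaf is a plain word string.
    """
    if isinstance(tree, str):
        return [tree]
    label, children = tree
    out = ["(" + label]
    for child in children:
        out.extend(linearize(child))
    out.append(")" + label)
    return out

# The parse of "John has a dog." from the preceding slide:
tree = ("S", [("NP", ["John"]), ("VP", ["has", ("NP", ["a", "dog"])]), "."])
print(" ".join(linearize(tree)))
# (S (NP John )NP (VP has (NP a dog )NP )VP . )S
```

Labeling the closing brackets ()NP rather than a bare ")") keeps the sequence unambiguous, so the flat output can be parsed back into exactly one tree.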


61 Model

62 Results


