1
Chinese Poetry Generation with Planning based Neural Network
Yiding Wen, Jiefu Liang, Jiabao Zeng
2
Poetry Generation
More than text generation: a poem must follow specific structural, rhythmical, and tonal patterns.
Some approaches:
- Semantic and grammar templates
- Statistical machine translation methods
- Treating it as a sequence-to-sequence generation problem
- ...
3
Approaches: Planning-based Poetry Generation (PPG)
Two stages:
- Poem Planning: the user's writing intent (a word, sentence, or document) → N keywords
- Poem Generation: N keywords → an N-line poem
4
Approaches: Planning-based Poetry Generation (PPG)
PPG Framework
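To make the two-stage split concrete, here is a minimal Python sketch of the pipeline. The extract, expand, and gen_line callables are hypothetical stand-ins for the components described on the next slides, not the paper's actual interfaces.

```python
from typing import Callable

def plan_and_generate(
    intent: str,
    extract: Callable[[str], list[str]],   # Keyword Extraction (slide 6)
    expand: Callable[[list[str]], str],    # Keyword Expansion (slide 7)
    gen_line: Callable[[str, str], str],   # attention enc-dec (slides 8-11)
    n_lines: int = 4,
) -> list[str]:
    # Stage 1: Poem Planning -- writing intent -> N keywords
    keywords = extract(intent)
    while len(keywords) < n_lines:          # too few keywords? expand
        keywords.append(expand(keywords))
    keywords = keywords[:n_lines]
    # Stage 2: Poem Generation -- line i is generated from keyword i
    # plus all previously generated lines
    lines: list[str] = []
    for kw in keywords:
        lines.append(gen_line(kw, "".join(lines)))
    return lines
```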
5
Approaches: Poem Planning
Poem Planning turns the user's writing intent (a word, sentence, or document) into the N keywords passed to Poem Generation. What if we don't have enough keywords? Two sub-tasks:
- Keyword Extraction (TextRank algorithm)
- Keyword Expansion
6
Keyword Extraction: TextRank Algorithm
Keyword Extraction turns the user's writing intent into keywords (which Keyword Expansion completes if needed). The input text is modeled as a word graph:
- Vertex: a word
- Edge: co-occurrence between two words
The TextRank score S(V_i) is computed iteratively:

S(V_i) = (1 - d) + d * Σ_{V_j ∈ E(V_i)} ( w_ji / Σ_{V_k ∈ E(V_j)} w_jk ) * S(V_j)

where d is a damping factor (usually set to 0.85), w_ij is the weight of the edge between V_i and V_j, and E(V_i) is the set of vertices connected with V_i.
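A minimal Python sketch of the weighted TextRank iteration above, run on a toy co-occurrence graph; graph construction (segmentation, co-occurrence window) is left out and the example weights are made up.

```python
# Weighted TextRank over a word co-occurrence graph.
# `edges` maps each word to its neighbours with co-occurrence weights;
# the graph is assumed undirected, so edges[a][b] == edges[b][a].

def textrank(edges: dict[str, dict[str, float]],
             d: float = 0.85, iters: int = 50) -> dict[str, float]:
    scores = {v: 1.0 for v in edges}
    for _ in range(iters):
        new_scores = {}
        for vi in edges:
            rank = 0.0
            for vj, w_ij in edges[vi].items():        # V_j in E(V_i)
                out_weight = sum(edges[vj].values())  # sum over E(V_j)
                if out_weight > 0:
                    rank += w_ij / out_weight * scores[vj]
            new_scores[vi] = (1 - d) + d * rank       # S(V_i)
        scores = new_scores
    return scores

# Toy example: keywords are the top-scored vertices
graph = {
    "spring": {"river": 2.0, "flower": 1.0},
    "river":  {"spring": 2.0, "flower": 1.0},
    "flower": {"spring": 1.0, "river": 1.0},
}
print(sorted(textrank(graph).items(), key=lambda kv: -kv[1]))
```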
7
Keyword Expansion: Fewer than N Keywords
When Keyword Extraction yields fewer than N keywords, the missing ones are predicted (a sketch of the knowledge-based method follows):
- RNNLM-based model: predicts the subsequent keyword according to the preceding sequence of keywords. Only suitable for generating sub-topics covered by the collected poems.
- Knowledge-based method: draws extra knowledge from encyclopedias, lexical databases (WordNet), ...; generates candidate words and chooses the candidates with the highest TextRank scores.
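A minimal sketch of the knowledge-based expansion step. The related_words mapping stands in for the encyclopedia/WordNet lookup and textrank_score for precomputed corpus-wide TextRank scores; both names are hypothetical.

```python
def expand_keywords(keywords: list[str],
                    related_words: dict[str, list[str]],
                    textrank_score: dict[str, float],
                    n_target: int) -> list[str]:
    """Grow the keyword list to n_target entries by repeatedly picking
    the related candidate word with the highest TextRank score."""
    expanded = list(keywords)
    while len(expanded) < n_target:
        candidates = {
            w
            for kw in expanded
            for w in related_words.get(kw, [])
            if w not in expanded
        }
        if not candidates:
            break  # no knowledge available; fall back to the RNNLM model
        best = max(candidates, key=lambda w: textrank_score.get(w, 0.0))
        expanded.append(best)
    return expanded
```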
8
Approaches: Poem Generation
Poem Generation turns the N keywords from Poem Planning into an N-line poem, using the framework of an attention-based RNN encoder-decoder (RNN enc-dec).
9
An illustration of the poem generation model
- Encoder: bi-directional Gated Recurrent Unit (GRU)
- Decoder: another GRU
10
Approaches: Poem Generation
Encoder
Inputs: the corresponding keyword K = {a_1, a_2, ..., a_Tk} and all preceding text X = {x_1, x_2, ..., x_Tx}.
The keyword is encoded by the bi-directional GRU into [r_1, r_2, ..., r_Tk]; concatenating the final states of the forward and backward passes (the last forward state at r_Tk and the first backward state at r_1) gives the keyword vector h_0.
The preceding text is encoded into [h_1, h_2, ..., h_Tx].
The combined sequence h = [h_0, h_1, ..., h_Tx] is what the attention mechanism summarizes into the context vector c.
11
Approaches: Poem Generation
Decoder
At each step t, the GRU decoder updates its internal status vector s_t from the previous output y_{t-1}, the previous status s_{t-1}, and the context vector c_t obtained by attending over the encoded keyword and all preceding text:

s_t = f(s_{t-1}, y_{t-1}, c_t)

The next character is then predicted from y_{t-1}, s_t, and c_t.
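A compact PyTorch sketch of this encoder-decoder, assuming character-level integer inputs. Layer sizes and the exact attention scoring form are illustrative choices, not the paper's reported configuration.

```python
import torch
import torch.nn as nn

class PoemLineGenerator(nn.Module):
    def __init__(self, vocab_size: int, emb: int = 128, hid: int = 256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.kw_enc = nn.GRU(emb, hid, bidirectional=True, batch_first=True)
        self.txt_enc = nn.GRU(emb, 2 * hid, batch_first=True)
        self.attn = nn.Linear(2 * hid + 2 * hid, 1)   # scores (h_i, s_{t-1})
        self.dec = nn.GRUCell(emb + 2 * hid, 2 * hid)
        self.out = nn.Linear(2 * hid, vocab_size)

    def encode(self, keyword, text):
        # keyword: (B, Tk), text: (B, Tx) -- integer character ids
        r, _ = self.kw_enc(self.embed(keyword))           # (B, Tk, 2*hid)
        fwd_last = r[:, -1, :r.size(2) // 2]              # last forward state
        bwd_first = r[:, 0, r.size(2) // 2:]              # first backward state
        h0 = torch.cat([fwd_last, bwd_first], dim=-1)     # keyword vector h_0
        h_text, _ = self.txt_enc(self.embed(text))        # [h_1 .. h_Tx]
        return torch.cat([h0.unsqueeze(1), h_text], 1)    # h = [h_0, h_1..h_Tx]

    def decode_step(self, y_prev, s_prev, h):
        # Attention over h using the previous decoder status s_{t-1};
        # the initial status s_0 (e.g. derived from h_0) is left to the caller.
        s_exp = s_prev.unsqueeze(1).expand(-1, h.size(1), -1)
        alpha = torch.softmax(self.attn(torch.cat([h, s_exp], -1)), dim=1)
        c_t = (alpha * h).sum(dim=1)                      # context vector c_t
        # s_t = f(s_{t-1}, y_{t-1}, c_t)
        s_t = self.dec(torch.cat([self.embed(y_prev), c_t], -1), s_prev)
        return self.out(s_t), s_t                         # logits for y_t, s_t
```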
12
Experiment: Dataset
76,859 Chinese quatrains: 2,000 poems for validation, 2,000 poems for testing, and the rest for training.
Training corpus preprocessing: CRF-based word segmentation → keyword extraction and expansion → four training triples for every poem.
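A minimal sketch of the last preprocessing step, assuming each triple pairs a line's keyword with the preceding lines and the line itself; segmentation and keyword extraction/expansion are assumed done by the steps above.

```python
def make_triples(lines: list[str], keywords: list[str]):
    """Turn one quatrain (4 lines, one keyword per line) into four
    (keyword, preceding text, line) training triples."""
    assert len(lines) == len(keywords) == 4
    triples = []
    for i, (kw, line) in enumerate(zip(keywords, lines)):
        preceding = "".join(lines[:i])   # empty for the first line
        triples.append((kw, preceding, line))
    return triples
```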
13
Experiment: Evaluation
A human study to evaluate the poem generation models
14
Experiment: Results
Four baselines: SMT (Statistical Machine Translation), RNNLM (RNN Language Model), RNNPG (RNN-based Poem Generator), and ANMT (Attention-based Neural Machine Translation).
PPG (Planning-based Poetry Generation) outperforms all baseline models in average score, and the results are consistent across both the 5-character and 7-character settings.
In the human evaluation, PPG comes very close to ANMT in Poeticness and Fluency but scores much higher in Coherence and Meaning, which verifies the effectiveness of the sub-topic prediction model.
15
Experiment: Automatic Generation vs. Human Poet
40 evaluators: four professionals in Chinese literature formed the Expert Group; the other 36 formed the Normal Group.
Each evaluator chose from three options: 1) poem A is written by a human; 2) poem B is written by a human; 3) cannot distinguish which one is written by a human.
Two conclusions from the results: (1) by the standard of normal users, the quality of our machine-generated poems is very close to that of human poets; (2) from the view of professional experts, machine-generated poems still have obvious shortcomings compared with human-written poems.
16
Experiment: Generation Examples
The model can also generate poems from modern terms.
The title of the left poem in Table 7 is 啤酒 (beer); the keywords given by our poem planning model are 啤酒 (beer), 香醇 (aroma), 清爽 (cool), and 醉 (drunk).
The title of the right one is a named entity, 冰心 (Bing Xin), a famous writer. The poem planning system generates three keywords besides 冰心 (Bing Xin): 春水 (spring river), 繁星 (stars), and 往事 (the past), all related to the writer's works.
17
Conclusion and Future Work
PPG (Planning-based Poetry Generation): two stages, poem planning and poem generation.
Future work:
- Topic planning with other models such as pLSA, LDA, or word2vec
- Other literary genres: Song iambics, Yuan Qu, etc., or poems in other languages
18
Thank you! Enjoy the spring break and good luck on homework :)