Classical Chinese poems are distinguished by their conciseness and elegance, and generating them automatically is an interesting task. The problem is: given some hints as the first line, produce a quatrain in which every line contains the same number of Chinese characters and fits the rhythm. Many methods have been applied to this problem, and prior work can be divided into three main categories: (1) template-based methods, (2) statistical machine translation models, and (3) deep learning methods. To our knowledge, no published work has yet used a GPT model to generate classical Chinese poems.

In this paper, we fine-tune a pre-trained Chinese GPT-2 model on 100,000 four-line poems and evaluate the results with human evaluation and BLEU-1. The results show that although the average consistency and innovation of the generated poems still fall short of the average human poet, the model can occasionally generate genuinely good poems. Future work could develop an automatic criterion for identifying good poems, so that the efficiency of model-based generation can be exploited to collect many good poems.

In previous work, we tackled a binary classification problem in English. This project clearly illustrates the difficulties and challenges of doing natural language processing in a different language, as mentioned in the first lecture of our class. Beyond the wish that our "poet" surprise us with beautiful rhythms, research on poem generation can also benefit other natural language generation tasks.
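As a concrete illustration of the automatic metric, the following is a minimal sketch of BLEU-1 (clipped unigram precision times a brevity penalty). It assumes per-character tokenization, which is the natural choice for classical Chinese; the two example lines are illustrative placeholders, not poems from our dataset.

```python
from collections import Counter
import math

def bleu1(candidate, reference):
    """BLEU-1 between two token lists: clipped unigram precision
    multiplied by the brevity penalty."""
    cand_counts = Counter(candidate)
    ref_counts = Counter(reference)
    # Clip each candidate token's count by its count in the reference.
    clipped = sum(min(cnt, ref_counts[tok]) for tok, cnt in cand_counts.items())
    precision = clipped / len(candidate)
    # Brevity penalty: penalize candidates shorter than the reference.
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / len(candidate))
    return bp * precision

# Classical Chinese lines are tokenized character by character.
cand = list("床前明月光")
ref  = list("床前看月光")
print(bleu1(cand, ref))  # 4 of 5 characters match -> 0.8
```

In practice we average this score over the generated lines against their references; a corpus-level BLEU library would give the same unigram behavior.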
fangzheli/Classical_Chinese_Poem_Generator