Home

Poetry and Computation

This page presents two projects.

Engineering Metaphors

Using WordNet to generate metaphors and creating a framework for human-computer collaboration

This project was created together with Claire Dong (Princeton '21, ELE) for the class When Worlds Collide: Poetry and Computation, taught by Professor Effie Rentzou and Professor Brian Kernighan.

The inspiration for our project came when a need met a solution. The need was this: poets are tasked with coming up with life's great analogies, for example, "the sea is like wine." P. B. Shelley said that by creating new analogies, the poet literally sustains "the vitality of language." As for the solution, we considered what a computer 'does best' and what synergy could exist between computer and human. A computer is built to digest large amounts of data quickly, search and sort, perform recursive and mathematical computation, and satisfy constraints. It also has no bias: it is not accustomed or predisposed to using certain words or phrases.

This last 'strength' is what we hoped to capitalize on. We set out to program a metaphor generator: the computer would take a lexical database that links words semantically (like WordNet) and output pairs of words connected by a given number of semantic associations. We programmed our algorithm in Python.
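As a toy sketch of the idea (not our actual code), the search over semantic links can be phrased as a breadth-first search over a word graph. Here the graph is a small, hand-invented stand-in for WordNet; the edges are illustrative, not real WordNet relations:

```python
from collections import deque

# Toy semantic graph standing in for WordNet (hypothetical edges).
GRAPH = {
    "sea":    ["water", "wave"],
    "water":  ["sea", "liquid", "rain"],
    "wave":   ["sea", "motion"],
    "liquid": ["water", "wine"],
    "rain":   ["water", "cloud"],
    "wine":   ["liquid", "grape"],
    "motion": ["wave"],
    "cloud":  ["rain"],
    "grape":  ["wine"],
}

def words_at_distance(seed, k):
    """Return words exactly k semantic hops from `seed`, via BFS."""
    dist = {seed: 0}
    queue = deque([seed])
    while queue:
        word = queue.popleft()
        for neighbor in GRAPH.get(word, []):
            if neighbor not in dist:
                dist[neighbor] = dist[word] + 1
                queue.append(neighbor)
    return sorted(w for w, d in dist.items() if d == k)

# Words that are connected to the seed, but only distantly,
# are candidates for the other half of a metaphor.
print(words_at_distance("sea", 3))  # -> ['cloud', 'wine']
```

With this toy graph, "wine" sits three hops from "sea" (sea → water → liquid → wine): connected enough to resonate, distant enough to surprise.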

You may wonder why we thought the computer's lack of 'bias' would make it a good metaphor generator. The computer is not hampered by the word associations ingrained in human minds. The human mind, for example, had to fight against its nature to come up with a metaphor like "the sea is like wine": because these words are not commonly associated, the metaphor remains obscure to the human mind until it is coined.

More about this project can be read in our Slides presentation.

In the presentation above, we describe a framework for human-computer collaboration in poetry, in which collaboration is measured along multiple axes: Meaning, Mechanics, and Airtime.

One cool outcome of the project that lives online, readily at your fingertips, is a "Small Screen" iteration of the metaphor generator, which puts this theory to the test.

The product, Metaphorical, lives online here: bit.ly/phone-poetry
Please view on your phone.

Mimicking Kay Ryan using a Hidden Markov Model

In the fall of the 2019-2020 academic year, I took Computational Models of Psychology, taught by Professor Tom Griffiths in the Department of Psychology. This is the final project I completed under his supervision.

Abstract. Techniques for generating poetry with computers have become more and more sophisticated. In this paper, we home in on the potential of the probabilistic Hidden Markov Model, a relatively simple generation technique, for generating poetry. We test whether, in the eyes of human participants, a Hidden Markov Model with a “double-layer” design can capture syntactical style. We train our HMM on the poetry of Kay Ryan, a distinguished poet with a highly distinctive style. In addition, we explore whether, in the eyes of human participants, this Hidden Markov Model can output good poems. Specifically, we propose that feeding the Markov Model semantically close training data will improve its ability to output something semantically cohesive or meaningful. We find that poems written by an HMM trained on semantically close data were in fact perceived as lower quality than poems written by an HMM trained on semantically random data, and that poems written by the HMM sometimes captured the author’s style very well and sometimes did not. The apparent randomness of these findings can be attributed to the fact that the output of an HMM varies wildly from one moment to the next. Given this tendency to swing between “good” and “bad” sentences, we propose introducing a “judge” or “revision model” to edit the HMM’s work and improve its final product.

Read the paper (APA format)

The "double-layer" Hidden Markov Model proposed in this paper.
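As an illustrative sketch (not the trained model from the paper), a double-layer generator of this kind can be read as a Markov chain over hidden part-of-speech tags, with each tag emitting a word. The transition and emission tables below are invented toy probabilities, and the vocabulary is borrowed loosely from the sample poem further down:

```python
import random

# Hidden layer: a Markov chain over part-of-speech tags (toy probabilities).
TRANSITIONS = {
    "START": {"DET": 0.6, "NOUN": 0.4},
    "DET":   {"ADJ": 0.5, "NOUN": 0.5},
    "ADJ":   {"NOUN": 1.0},
    "NOUN":  {"VERB": 0.7, "END": 0.3},
    "VERB":  {"DET": 0.5, "END": 0.5},
}

# Observed layer: each tag emits a word (toy probabilities).
EMISSIONS = {
    "DET":  {"the": 0.7, "a": 0.3},
    "ADJ":  {"private": 0.5, "abandoned": 0.5},
    "NOUN": {"sea": 0.4, "eagle": 0.3, "irony": 0.3},
    "VERB": {"loosens": 0.5, "bends": 0.5},
}

def sample(dist, rng):
    """Draw one item from a {item: probability} distribution."""
    r, acc = rng.random(), 0.0
    for item, p in dist.items():
        acc += p
        if r < acc:
            return item
    return item  # guard against floating-point rounding

def generate(rng, max_len=12):
    """Walk the hidden tag chain, emitting one word per tag."""
    tag, words = "START", []
    while len(words) < max_len:
        tag = sample(TRANSITIONS[tag], rng)
        if tag == "END":
            break
        words.append(sample(EMISSIONS[tag], rng))
    return " ".join(words)

print(generate(random.Random(0)))
```

The two layers separate syntax from diction: the tag chain carries the syntactic habits of a poet's lines, while the emission tables carry her vocabulary, which is the intuition behind testing whether such a model can capture style.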

We performed a modified "Turing Test" on the computer output. This is how four experienced poets perceived the quality of the human-composed and computer-generated poems.

How poets perceived the quality and style adherence of poems generated by Kay Ryan and the computer.

A poem composed by this model:

Perhaps private bolts loosen the underside of people. So a dozen rains, the abandoned cars, an unborn irony with the mind in case of harmony, who bends of oceans. An eagle cannot sustain.