Text-Savvy AI Is Here to Write Fiction – WIRED

Posted: November 25, 2019 at 2:46 pm

Six years ago this month, Portland, Oregon, artist Darius Kazemi watched a flood of tweets from would-be novelists. November is National Novel Writing Month, a time when people hunker down to churn out 50,000 words in a span of weeks. To Kazemi, a computational artist whose preferred medium is the Twitter bot, the idea sounded mildly torturous. "I was thinking I would never do that," he says. "But if a computer could do it for me, I'd give it a shot."

Kazemi sent off a tweet to that effect, and a community of like-minded artists quickly leapt into action. They set up a repo on GitHub, where people could post their projects and swap ideas and tools, and a few dozen people set to work writing code that would write text. Kazemi didn't ordinarily produce work on the scale of a novel; he liked the pith of 140 characters. So he started there. He wrote a program that grabbed tweets fitting a certain template: some (often subtweets) posing questions, paired with plausible answers from elsewhere in the Twitterverse. It made for some interesting dialogue, but the weirdness didn't satisfy. So, for good measure, he had the program grab entries from online dream diaries and intersperse them between the conversations, as if the characters were slipping into a fugue state. He called it "Teens Wander Around a House." First novel accomplished.
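
Kazemi's actual code isn't shown here, but the approach the article describes is simple to sketch. The hypothetical Python snippet below (the function names and stand-in data are invented for illustration) pairs question tweets with unrelated answers and occasionally splices in a dream-diary entry:

    import random

    def build_novel(questions, answers, dreams, target_words=50000):
        """Pair question tweets with unrelated answer tweets, splicing in
        an occasional dream-diary entry, until the running text reaches
        the target word count."""
        lines, words = [], 0
        while words < target_words:
            lines.append(f'"{random.choice(questions)}"')
            lines.append(f'"{random.choice(answers)}"')
            if random.random() < 0.2:  # a character slips into a fugue state
                lines.append(random.choice(dreams))
            words = sum(len(line.split()) for line in lines)
        return "\n\n".join(lines)

    # Stand-in data; Kazemi's program pulled these from Twitter
    # and from online dream diaries.
    questions = ["Why does nobody text me back?", "Is it weird to eat cereal at night?"]
    answers = ["Honestly, same.", "It builds character."]
    dreams = ["I was in a house with endless staircases, and every door opened onto the sea."]

    print(build_novel(questions, answers, dreams, target_words=60))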

It's been six years since that first NaNoGenMo (that's "Generation" in place of "Writing"). Not much has changed in spirit, Kazemi says, though the event has expanded well beyond his circle of friends. The GitHub repo is filled with hundreds of projects. "Novel" is loosely defined. Some participants strike out for a classic narrative, a cohesive, human-readable tale, hard-coding formal structures into their programs. Most do not. Classic novels are algorithmically transformed into surreal pastiches; wiki articles and tweets are aggregated, arranged by sentiment, and mashed up in odd combinations. Some attempt visual word art. At least one person will inevitably do a variation on "meow, meow, meow..." 50,000 times over.

"That counts," Kazemi says. In fact, it's an example on the GitHub welcome page.

But one thing that has changed is the tools. New machine-learning models, trained on billions of words, have given computers the ability to generate text that sounds far more human-like than when Kazemi started out. The models are trained to follow statistical patterns in language, learning the basic structures of grammar. They generate sentences and even paragraphs that are perfectly readable (grammatically, at least), even if they lack intentional meaning. Earlier this month, OpenAI released GPT-2, among the most advanced of such models, for public consumption. You can even fine-tune the system to produce a specific style (Georgic poetry, New Yorker articles, Russian misinformation), leading to all sorts of interesting distortions.
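
The article doesn't say which tools participants use to run GPT-2. A minimal sketch, assuming the Hugging Face transformers library (one common way to load OpenAI's released weights, not something the article specifies), might look like this:

    # Minimal GPT-2 text generation; assumes: pip install transformers torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Seed the model with an opening line and sample a continuation.
    prompt = "It was a dark and stormy night"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_length=60,        # total tokens, prompt included
        do_sample=True,       # sample rather than decode greedily
        top_k=50,             # restrict sampling to the 50 likeliest tokens
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Fine-tuning for a specific style amounts to continuing training on a corpus of that style before generating.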

GPT-2 can't write a novel; not even the semblance of one, if you're thinking Austen or Franzen. It can barely get out a sentence before losing the thread. But it has still proven a popular choice among the 80 or so NaNoGenMo projects started so far this year. One guy generated a book of poetry on a six-hour flight from New York to Los Angeles. (The project also underlined the hefty carbon footprint involved in training such language models.) Janelle Shane, a programmer known for her creative experiments with cutting-edge AI, tweeted about the challenges she's run into. Some GPT-2 sentences were so well crafted that she wondered if they were plagiarized, plucked straight from the training dataset. Otherwise, the computer often journeyed into a realm of dull repetition or uncomprehending surrealism.

"No matter how much you're struggling with your novel, at least you can take comfort in the fact that AI is struggling even more," she writes.

"It's a fun trick to make text that has this outward appearance of verisimilitude," says Allison Parrish, who teaches computational creativity at New York University. But from an aesthetic perspective, GPT-2 didn't seem to have much more to say than older machine-learning techniques, she says, or even Markov chains, which have been used in text prediction since the 1940s, when Claude Shannon first declared that language was information. Since then, artists have been using those tools to make the assertion, Parrish says, that "language is nothing more than statistics."
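
A word-level Markov chain of the sort Parrish alludes to is little more than a table of which words follow which, sampled in proportion to their counts. A minimal sketch (the corpus here is a stand-in):

    import random
    from collections import defaultdict

    def train_markov(text):
        """Map each word to the list of words observed to follow it."""
        words = text.split()
        chain = defaultdict(list)
        for current, following in zip(words, words[1:]):
            chain[current].append(following)
        return chain

    def generate(chain, start, length=20):
        """Walk the chain, picking each next word in proportion to how
        often it followed the current word in the training corpus."""
        word, output = start, [start]
        for _ in range(length - 1):
            followers = chain.get(word)
            if not followers:
                break
            word = random.choice(followers)
            output.append(word)
        return " ".join(output)

    corpus = "the cat sat on the mat and the cat saw the dog on the mat"
    print(generate(train_markov(corpus), "the"))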
