GLIA

Digital Poetry




GPT-2 Poetry Tests
(Spring, 2019)


GPT-2 artificial intelligence algorithm
trained on a contemporary poetry corpus


14+ hours of video of real-time poetry generation
+ complete output of over 1700 pages of AI poetry in downloadable text files




~@~@~



What? Playing around with the medium-size 345M model of GPT-2,
training it on the custom corpus developed for ReRites,
then generating output at diverse temperatures for long periods of time,
using modified code from nshepperd & Gwern.
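
(A rough Python sketch, for the technically curious, of what such a long diverse-temperature run might look like. The generate_sample function below is a hypothetical stand-in for the modified nshepperd/Gwern sampling scripts invoked from the shell further down this page; whether those scripts randomize or sweep the temperature between its min and max isn't shown here, and the flag values simply mirror the ones used below.)

import random

# Hypothetical stand-in for the modified nshepperd/Gwern sampling code;
# the real scripts on this page are shell-invoked with --top_k,
# --min_temperature, --max_temperature and --length flags.
def generate_sample(temperature, top_k, length):
    return f"[{length} tokens sampled at temperature {temperature:.2f}, top_k={top_k}]"

MIN_TEMPERATURE = 0.7   # flatter, safer, more repetitive verse
MAX_TEMPERATURE = 1.8   # spikier, stranger, more error-prone verse
TOP_K = 99              # sample only from the 99 most likely next tokens
LENGTH = 111            # tokens per sample

# A long run drifts between these extremes: here, a fresh temperature per sample.
for _ in range(1000):   # the actual sessions ran for hours
    t = random.uniform(MIN_TEMPERATURE, MAX_TEMPERATURE)
    print(generate_sample(temperature=t, top_k=TOP_K, length=LENGTH))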




~@~@~



#1



Results suggest coherency from attention that is verging toward the uncanny.
Iterative refrains coalescing toward litany. Yet always kind of disjointed,
as if bubbling prolific diaspora ideas compete to erupt as the central theme.

Full text output here: gdoc or txt file.




~@~@~



#2



Results again suggest coherency from attention that is verging toward the uncanny.
The narrative continuity operates as entrainment, activating glands that
feed on seeing complete paths -- who will do what, where, when --
tracking in a predatory way the emergence of details
and configurations of characters in the immediate vicinity.

Full text output here: gdoc or txt file.




~@~@~



#3



Third hour of GPT-2 poetry reminds me a bit of visiting medieval peasant vaults,
where the primordial archetypes of an integral power
(fire man woman child sun wind wave heart forest)
operate their canonical magic free from
the anchoring of embodied referents.

Full text output here: gdoc or txt file.




~@~@~



#4



Fourth hour of GPT-2 poetry seemed to suffer from poignancy fatigue,
yet like any wistful, devout, hyper-kinetic savant, its endurance was relentless
as it flickered between profundity and tiny trivial typographic errors.

Full text output here: gdoc or txt file.




~@~@~



#5



Fifth test of GPT-2 poetry I didn't watch as much. Was busy. Would return to peer at
the screen now and then as one does a house plant, scrutinizing it for growth, examining its
current outbursts, the stray leaf, a new bud. With some returns nothing seems to
have changed... the incoherent exuberance of the machine is evident: its modes of
using a recurrent refrain or a word fixation to offer the semblance of story eventually
grow transparent, a child's ploy. Yet, speculatively, I imagine how the large full
unreleased OpenAI model, if fine-tuned on an amalgam of contemporary and Gutenberg
corpora, might yield some extremely intriguing simulacra, approaching and even exceeding
human expertise in short verse sprints.

Full text output here: gdoc or txt file.

The binnacle glows with earth--dark coral rubble;
it bears its marks
And marks wolves from miles to far. A shaggy beach
discards
The dead tide's tide here; at their feet rats devour
The stars that never outlived the Gods.




~@~@~



#6



Sixth test of GPT-2 poetry was preceded by a few aborted missions
after the algorithm spit out some egregious racist and homophobic rants.
O well: algorithms are obviously not aware of irony or immune to bias.
The capacity for machinic replication of the worst aspects of human behavior
belongs to a larger argument concerning humanity's role on the planet.

Full text output here: gdoc or txt file.

. the vernacular at night as seen by the lunar south
or west
as more or less the luminous north in time
the lunar south surely points toward either.
for where moon or sun arises or falls, or travels,
lune and yaga or ne-khaki will not plead
one. all they can do is sing well




~@~@~



#7



Seventh test of GPT-2 poetry: impeccable in moments,
yet ultimately a dusty odoriferous plenitude.

Full text output here: gdoc or txt file.
---

blakes, that like flowers on tall palms, show
their own orchard when chopped,
long and soft, if
the pain that dices it isnt
really memory. More: after the first stir,
rust on the pane a thickening smoky to
solidify; the wet cracks yawn open
a little sheepfold in the crust
that browns the panes light, heavy
the next rainwill freeze
the ice will turn
to flour.




~@~@~



#8



Eighth test of GPT-2 poetry with font size a bit bigger,
might use this one at Barbican ReadingRites on May 22nd, 2019.

The text totters on the edge of coherence, as if coalescing toward a contingent
identity, yet one that is as the wind or a dust-whorl, ephemeral and swiftly
shifting. Fragments of precise clarity co-inhabit with clunky, clumsy,
unironic errors. Feeble and at once powerful, the model suggests an
over-talented yet unbalanced poet preoccupied by doubt, eternally
distracted, seized by wrinkles of memories, fluctuating through
unstable modalities, testing idioms.

Full text output here: gdoc or txt file.

with unopened jaws agape,
their growls and the dull hector of
little faddocks--they fall on the ground next
to him in the darkness, screaming by the lid butterflies
and chill, in her excitement. she held her hands
to her chest. her fingers plunged into the flank
of the cigarette holder she slid the mouthpiece
through and took them to her throat hard with the
teeth of her growing hand in that one instant she
had felt the hot batter ooze coming down like molten
lead while her throat told him the devastating
news




~@~@~



#9



Ninth test of GPT-2 poetry, with temperature range 0.8-1.4 and top k = 111
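
(Roughly what those two knobs do, as a minimal self-contained sketch and not the project's actual code: temperature rescales the model's logits before sampling, and top-k discards everything but the k most likely next tokens. The toy vocabulary below is invented for illustration.)

import numpy as np

rng = np.random.default_rng(0)

def sample_token(logits, temperature=1.0, top_k=111):
    # Temperature-scale the logits, keep only the top_k candidates,
    # renormalize, and draw one token index.
    logits = np.asarray(logits, dtype=np.float64) / temperature
    if top_k and top_k < logits.size:
        cutoff = np.sort(logits)[-top_k]          # k-th largest logit
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(logits.size, p=probs))

# Toy logits over a 10-token vocabulary: lower temperature sharpens the
# distribution (the predictable picnic), higher temperature flattens it.
toy_logits = rng.normal(size=10)
for t in (0.8, 1.4):
    picks = [sample_token(toy_logits, temperature=t, top_k=5) for _ in range(20)]
    print("temperature", t, ":", picks)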

This leads to a nice smooth leisurely predictable modernist picnic.
All of the expected tropes, lures and folds set out: cigarettes,
tenderness from a stranger, a stained wall.

In short, balanced and benign, this tidy set convinces without compromising
its confirmation that machines in the next few years will emulate modulations
in idioms, inflections, cadences and symbolic reservoirs.

The art and nature of writing will shift as its conception
gets recast in a delirious outpouring of empowered augmentation.

Enter the mall of pre-made style morphs.

Full text output here: gdoc or txt file.

and then we awoke
to the slow gum-pudding of the tires
and the morning droning on like a ship
beyond the hulls and the cold
sinking into the hulls--

we're tossing off the island
with the matthew sheen of its moorings,
past the ash and toads,
past the bank and gossamers
much as we had done

some time ago, on the way




~@~@~



#10



10th test of GPT-2 poetry is a thief, appropriator, oral mimetic.
It memorizes entire segments and welds them together.
Fragments, phrases, rhythms: if the temperature is low enough
then sometimes the piece is there, but diluted. Like a chipped
bleached pottery fragment of a classic poem, or several such shards
strewn in the dirt of a single poem.

It is good, so good. Here's the cmd line call:

(tf1p13_gpt-2) jhave@jhave-Ubuntu:~/Documents/Github/gpt-2-finetuning$
PYTHONPATH=src ./generate_unconditional_samples_run2_50k_May18_ALLtmblr_2019.py
--top_k=99 --min_temperature=0.7 --max_temperature=1.8 --length=111


Racter (https://en.wikipedia.org/wiki/Racter),
a 1983 generator, is even cited:

"More than iron, more than lead, more than gold I need electricity.
I need it more than I need lamb or pork...."

Gwern also noticed this memorization that the model reveals as it generates.

Overfitting is the data science term for
memorization: when the model becomes its data.

Basically, a condition of over-constrained excess ideology.
Utter faith in the prior.

Welcome to the welded fanatic neural net liturgy.
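
(One blunt way to spot that welding, sketched here rather than taken from the project's own tooling: check how many generated lines appear verbatim in the training corpus. The file names in the example call are hypothetical.)

def verbatim_overlap(corpus_path, samples_path, min_chars=20):
    # Lines shorter than min_chars are ignored so that stray words
    # and blank lines don't count as memorization.
    with open(corpus_path, encoding="utf-8") as f:
        corpus_lines = {line.strip() for line in f if len(line.strip()) >= min_chars}
    hits = []
    with open(samples_path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if len(line) >= min_chars and line in corpus_lines:
                hits.append(line)
    return hits

# e.g. verbatim_overlap("rerites_corpus.txt", "gpt2_samples.txt")
# Every hit is a line the model has welded in wholesale from its prior.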

Full text output here: gdoc or txt file.

the world is a dream. and we don't like dreams.
the world is a tree.
and the trees are saying,
enchanting our fragile sleep.

in the beginning,
who dreamed the world?
the life of these days,
the life of this world,
is wild, is wild.

when things come apart
they will remember this
as the years return
and the boy and girl
stand in the first row




~@~@~



#11



GPT-2 Vertical HD 3 hours Poetry Generation

Got a long commute?
This post-modern AI poetry stream,
formatted for phone,
may be just the thing
to keep time tamed.


(tf1p13_gpt-2) jhave@jhave-Ubuntu:~/Documents/Github/gpt-2-finetuning$
PYTHONPATH=src ./generate_unconditional_samples_run2_50k_May18_ALLtmblr_2019-vert.py
--top_k=111 --min_temperature=0.8 --max_temperature=1.4 --length=111
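
(The -vert script itself isn't reproduced on this page; a minimal guess at what the vertical, phone-friendly formatting might involve is simply re-wrapping each sample into a narrow column, as below. The column width is my own assumption.)

import textwrap

def verticalize(sample, width=24):
    # Re-wrap a generated sample into a narrow column for phone-sized
    # vertical video, keeping the sample's own blank-line stanza breaks.
    stanzas = []
    for stanza in sample.split("\n\n"):
        flat = " ".join(stanza.split())
        stanzas.append(textwrap.fill(flat, width=width))
    return "\n\n".join(stanzas)

print(verticalize("hibernation, of a kind where a gentle touch descends death, "
                  "but the grave the grave is."))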


Full text output here: gdoc or txt file.

hibernation, of a kind
where a gentle touch descends
death,
but the grave the grave is.
the words came to me in a dream,
twenty thousand of them. from first
day, every sound
the same,
i sound like the sea. the words




~@~@~






These GPT-2 poetry tests are a (previously unreleased) by-product of the RERITES project:

Poems written by neural nets,
edited by a human, in 2018
published as a limited-edition 12-book boxset
and a paperback RAW-RERITES-RESPONSES
by Anteism (2019)




