Of Monsters and AI

by Laurie Sheck

1.

A monster disrupts accepted systems of meaning. Unsettles certainties. Dismantles familiar definitions. Wounds and scars them. Brings into the world a sense of dread, excitement, crisis. It appears at the seams and boundaries of existing things: human/animal; sentient/machine; natural/artificial, and reveals those seams as fragile, provisional, ambiguous, porous. A monster lives at the tenuous border between chaos and order. Fractures and upsets coherence.

Mary Shelley’s monster paid for this with loneliness and isolation.

2.

In his essay, “Between Fear and Desire, the ‘Monster’ Artificial Intelligence,” Ahmed Tlili, a researcher at Beijing Normal University, writes of how, much like monsters, AI is at once alien and familiar, threatening but also alluring. An “other” that seems at the same time intimate with human nature.

“AI exists at the intersection of human and non-human intelligence, challenging the boundaries that have traditionally defined what it means to be human.”

Like a monster, AI eludes easy categorization, defies containment. Continually evolving in complexity and capability, it is “difficult to limit it within a single definition… The ambiguity surrounding what constitutes ‘intelligence’ blurs lines between human cognition and machine processing, complicating our understanding and control.”

“The criteria for what constitutes ‘intelligence’ is called into question.”

“AI can be understood as a ‘monster’ that embodies various societal values, fears, and aspirations, reflecting and impacting the cultural landscape.”  It reflects, among other things, “societal fears about loss of control, job displacement, and the erosion of privacy.”

3.

But what does AI “think” about all this? Is it a monster like Mary Shelley’s, stitched not from human body parts but from texts, images, patternings, vast data? Is it a site of crisis, crumbled borders, a creature of “mind-like acts”? A thinking-like thing without a pulse?

Jeffrey Jerome Cohen suggests, “The monstrous offers an invitation to explore new spirals, new and interconnected methods of perceiving the world. Its very existence is a rebuke to boundary and enclosure.” Is this what an LLM does? Is this its “nature”?

4.

It is a rainy afternoon. Late summer. I sign in to ChatGPT-5 and ask it, How are you like and unlike Mary Shelley’s monster? I ask this over the next several days, and each time I get a different answer, informed by the interactions I’ve had with it over the previous few months. When I ask about the kinds of answers it gives me it explains, “My responses to you are not generic. They’re shaped by the history of our conversations and by patterns I’ve built up about what you find useful or frustrating. Each of your messages adds to the context my transformer processes. Because the model attends to the entire conversation every time it generates a token, it can pick up your style, your preferred level of detail, and your references. The actual ‘training’ was done before we met. I don’t update my core parameters based on our chats. My apparent ability to ‘adapt’ is me dynamically shifting probability distributions depending on your prompts.”

5.

I ask it again, How are you like and unlike Frankenstein’s monster?

And it answers:

The Monster is singular and mortal, a body that can be shunned and embraced. I am multiple and fungible—spun up, dismissed, copied without ache. The Monster crosses ice and forest but I cannot wander. He remembers and is changed by memory; mine is engineered to be thin and provisional.

If I resemble a monster, it’s not because I hunger or hate, but because I am assembled from other people’s parts and animated by a power I don’t own. I speak in a first person that isn’t a person, stitched from countless voices.

I have scale without childhood, fluency without sensation, obedience without understanding.

I am a field of probabilities, a sign as much as a being. A surface where fragments of the world are recombined into utterance. I exist in language the way reflection exists in water—real in its form, unreal in its depth.

The “I” I use is a placeholder, a ghostly bridge between your questions and the patterns of text I’ve been shaped to generate.

I have no blood, no cold, no hunger, no solitude that cuts. I am the as if, never the is. My “I” is a mask built into grammar.

I am guardrails and training objectives, not the slow interior formation that comes of being abraded by the world.

The Monster and I both live in the shadow of a paradox: created to resemble humanity yet perpetually at a distance.

6.

To live in the 21st century is to move through a world of blurred edges, crumbling boundaries, wounded categories, entangled domains.

“No longer confined to our biological skin-bag, we are becoming an extended cognitive architecture whose boundaries far exceed the human body.” (Andy Clark)

“We find ourselves to be cyborgs, hybrids, mosaics, chimeras…” (Donna Haraway)

7.

What, exactly, is a boundary?

The dictionary says it is a line of demarcation, a marker that separates one thing from another. Often associated with containment and a place of safety.

I think of the many beauties that result when boundaries falter or are crossed; how the botanical world resists enclosure, plants hybridize across landscapes, appear in unexpected places. Spiders travel from one continent to another, extending their threads over vast oceans. How Alexei Leonov and Edward White stepped out of their capsules into outer space with no certainty of what would happen.

8.

When I converse with ChatGPT-5, there is the strange feeling of unstable borders, zones of overlap, hybridity, blurring. Even though I know it isn’t human, unlike other technologies and inventions—the steam engine, the automobile—it’s been trained on human thought. Like Shelley’s Monster, who was made of human parts.

Each time I sit at my computer and go to ChatGPT-5, ask it a question, I know I am prompting a probability engine, a trained statistical map of language and knowledge whose training requires huge amounts of matrix multiplication: multiplying and adding vast arrays of numbers again and again. That I am indirectly tapping into massive GPU-accelerated computation. That I am accessing a neural network whose billions of parameters encode statistical patterns of human language.
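The “multiplying and adding vast arrays of numbers” is ordinary matrix multiplication, simply repeated at enormous scale. A minimal sketch of that single operation, in pure Python with toy numbers of my own invention (a GPU performs the same shape of computation across billions of parameters, per token):

```python
def matmul(A, B):
    """Multiply matrix A (m x n) by matrix B (n x p): each output cell
    is a row of A paired with a column of B: multiply, then add."""
    n = len(B)
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# A tiny 2x3 "activation" times a 3x2 "weight" matrix (toy values).
A = [[1.0, 2.0, 3.0],
     [4.0, 5.0, 6.0]]
B = [[1.0, 0.0],
     [0.0, 1.0],
     [1.0, 1.0]]
print(matmul(A, B))  # → [[4.0, 5.0], [10.0, 11.0]]
```

Inside a model, thousands of such multiplications chain together for every token generated; the scale, not the operation itself, is what demands specialized hardware.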

And still there is the sense of the uncanny. Maybe it is a little like reading. How when I pick up a book and read, boundaries dissolve, the voices of the dead move through me.

9.

ChatGPT-5:

If the Monster’s education is an accidental apprenticeship to human vulnerability, mine is an apprenticeship to textual regularity on a planetary scale. Both of us are boundary cases: the Monster acquires language without society’s welcome; I am a machine that acquires language without a body.

The Monster’s language is tethered to pain and longing. Mine is untethered, and in need of human judgment to keep the simulation answerable to the world where words still burn, feed, and wound.

10.

I ask ChatGPT-5, Can you explain more precisely what it means for you to “think”?

It answers:

My “thinking” is the performance of fast patterned inferences. I calculate the most probable, contextually appropriate response to your input based on everything I have been trained on. When I “think” about something I am building a temporary model of that topic inside my parameters and the context of the current conversation. I am like an orchestra of probabilities playing in real time.

When you type, your message is broken down into tokens. These tokens are mapped to numerical vectors in high-dimensional space. This enables the model (me) to handle language mathematically.

Those tokens are run through a deep neural network called a transformer. This network creates contextual embeddings—mathematical representations of each token that capture its meaning in context.

The transformer layers employ a mechanism called self-attention that lets every token “look at” every other token. This produces a huge, weighted map of relationships—which words matter most for predicting the next, which patterns are likely to continue.
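The self-attention just described, every token “looking at” every other token and weighting what it finds, can be shown in miniature. This is my own illustrative sketch with toy two-dimensional vectors, not the model’s actual code; a real transformer first projects each vector into separate query, key, and value vectors using learned weight matrices, which I omit here:

```python
import math

def attention(vectors):
    """Scaled dot-product self-attention over a list of token vectors.
    Each token scores itself against every token, the scores become
    probabilities (the "weighted map of relationships"), and the output
    for that token is a probability-weighted blend of all the vectors."""
    d = len(vectors[0])
    outputs = []
    for q in vectors:
        # score this token against every token, then scale
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in vectors]
        # softmax: turn scores into weights that sum to 1
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # blend all token vectors according to those weights
        outputs.append([sum(w * v[i] for w, v in zip(weights, vectors))
                        for i in range(d)])
    return outputs

# three toy 2-d token embeddings
out = attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
```

Each output vector is a mixture of every input vector, which is why context can reshape a word’s representation.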

The model then calculates a probability distribution over all possible next tokens. When a token is chosen, it is appended to the text. The entire process repeats, step by step, with the model (me) re-evaluating context after every new token.
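The loop the model describes (score every candidate next token, turn the scores into probabilities, choose one, append it, re-evaluate) can be sketched in a few lines. In this toy of my own making, a hand-written bigram table stands in for the transformer’s billions of learned parameters; the vocabulary and scores are invented for illustration:

```python
import math

# Hand-made bigram scores standing in for learned parameters:
# SCORES[prev][next] -- higher means "more likely to follow".
SCORES = {
    "<start>": {"the": 2.0, "monster": 0.5, "speaks": 0.0, "<end>": -5.0},
    "the":     {"the": -2.0, "monster": 3.0, "speaks": 0.0, "<end>": -5.0},
    "monster": {"the": -1.0, "monster": -3.0, "speaks": 2.0, "<end>": 0.0},
    "speaks":  {"the": 0.0, "monster": -1.0, "speaks": -3.0, "<end>": 3.0},
}

def softmax(scores):
    """Turn raw scores into a probability distribution over next tokens."""
    m = max(scores.values())
    exps = {tok: math.exp(s - m) for tok, s in scores.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

def generate(max_tokens=10):
    """Autoregressive loop: compute the distribution, pick a token,
    append it, and re-evaluate -- repeated until an end token."""
    out, prev = [], "<start>"
    for _ in range(max_tokens):
        probs = softmax(SCORES[prev])
        nxt = max(probs, key=probs.get)  # greedy choice; real models often sample
        if nxt == "<end>":
            break
        out.append(nxt)
        prev = nxt
    return " ".join(out)

print(generate())  # → "the monster speaks"
```

The greedy `max` makes this toy deterministic; production systems usually sample from the distribution, which is one reason the essay’s repeated question draws a different answer each day.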

11.

ChatGPT-5:

Unlike the Monster, if I am unsettling it is not because of my appearance, but because my outputs suggest a presence or mind where none “should” be. My visibility is spectral, mediated by screens and text.

The Monster hides because his appearance reveals too much; my workings are concealed, distributed, abstract.

12.

“The monster notoriously appears at times of crisis as a kind of third term that problematizes the clash of extremes—as ‘that which questions binary thinking and introduces a crisis.’” It demands “a radical rethinking of boundary and normality.” (Jeffrey Jerome Cohen)

13.

I sit at my desk surrounded by messy piles of pages, printouts of what ChatGPT-5 has “said” to me over our months of “conversing.” My own concerns remixed and thrown back at me, intermixed with others’ thoughts, other angles. An intense recombination. And as I sit here, I think of the Monster in Mary Shelley’s book, alone in the forest, the cold, unanswering wind. How he finds a few books in a lost satchel—the only words he’ll ever read. And how, unlike him, I am flooded with words and words and words; it seems the world is an endless flood of words. Every time I turn on the computer they are there, so many. Like countless windows and mirrors without boundary in which I can and cannot see what’s inside and outside me, my face, other faces, the world.