Telling your disruption story from the Peak of Inflated Expectations to the Slopes of Enlightenment

by Sarah Firisen

In 1991, in my first job out of college, I worked for a small investment bank. By 1994, I was working in its IT department. One of my tasks was PC support, and I had a modem attached to my computer so that I could connect to CompuServe for research on technical issues. Yes, this was the heyday of CompuServe, the year the first commercial web browsers appeared and a time when most people had very little idea, if any, what this Internet thing was.

As a tech geek, I signed up for one of the early, local Internet Service Providers and had an email account on their Unix-based system. I actually met my now ex-husband through that email account, which is a whole other story. During this period, the ex and I were just starting our email correspondence and I would dial into my ISP at work to check my email. At some point, these minimal phone charges came to the attention of the firm’s Managing Director, who took me aside and asked what I was doing. I told him about this wonderful new thing, the Internet! He told me to stop using the company’s modem to connect to anything but CompuServe. I protested, somewhat, and tried to tell him what a wonderful innovation the Internet was (bear in mind that, at the time, there weren’t a lot of websites and they loaded incredibly slowly, so even a geek had to use some imagination to see the future possibilities). He told me that the company would not be doing anything with the Internet anytime in the future. And by the way, this was a company that had already made a lot of its money from deals and IPOs in the entertainment and technology sector, so the idea that it might have been interested in what I had to say wasn’t outrageous.

Suffice it to say, that Managing Director was wrong, and over the years that investment bank has been involved in many of the most significant deals with some of the biggest Internet-related companies. So what was the missed opportunity there? Clearly, that Managing Director was no visionary, but my old company ended up doing just fine and caught on to the Internet early enough to make a lot of money anyway. But how much more money could they have made if someone had listened to me back then? I was young and very junior at the company and felt ashamed to have been “caught” and told off. But in hindsight, what I could have done was tell him a better story about this new, disruptive technology. Read more »



Feed Me Donald! – Trump, Musk, the Internet, and Monsters from the Id

by Bill Benzon

Seen on Google search, Friday morning, September 7, 2018:

Musk Toke

I’m sure you’ve heard about it. Elon Musk went on Joe Rogan’s podcast, Rogan lit up a blunt, and Musk took a toke. The next day Tesla’s stock tanked. Well, not exactly tanked, but it was down seven points, and the drop can’t be attributed entirely to that toke–there’s been some turmoil in the executive ranks–but that made for a good lede.

Not to mention the image! Billionaire inventor, boy wonder, real-life Iron Man, with his “Occupy Mars” T-shirt, head cloaked in a cloud of smoke. Get it? Shareholder value, up in smoke?

It’s the stuff of mythology, of realityTVnews.

But that wasn’t the most interesting thing in the interview by a long shot. Read more »

A Faint Distrust of Words

INTERVIEW BETWEEN ANDREA SCRIMA (A LESSER DAY)

AND CHRISTOPHER HEIL (Literaturverlag Droschl)

Novels set in New York and Berlin of the 1980s and 1990s, in other words, just as subculture was at its apogee and the first major gentrification waves in various neighborhoods of the two cities were underway—particularly when they also try to tell the coming-of-age story of a young art student maturing into an artist—these novels run the risk of digressing into art scene cameos and excursions on drug excess. In her novel A Lesser Day (Spuyten Duyvil, second edition 2018), Andrea Scrima purposely avoids effects of this kind. Instead, she concentrates on quietly capturing moments that illuminate her narrator’s ties to the locations she’s lived in and the lives she’s lived there.

When she looks back over more than fifteen years from the vantage of the early 2000s and revisits an era of personal and political upheaval, it’s not an ordering in the sense of a chronological sequence of life events that the narrator is after. Her story pries open chronology and resists narration, much in the way that memories refuse to follow a linear sequence, but suddenly spring to mind. Only gradually, like the small stones of a mosaic, do they join to form a whole.

In 1984, a crucial change takes place in the life of the 24-year-old art student: a scholarship enables her to move from New York to West Berlin. Language, identity, and place of residence change. But it’s not her only move from New York to Berlin; in the following years, she shuttles back and forth between Germany and the US multiple times. The individual sections begin with street names in Kreuzberg, Williamsburg, and the East Village: Eisenbahnstrasse, Bedford Avenue, Ninth Street, Fidicinstrasse, and Kent Avenue. The novel takes on an oscillating motion as the narrator circles around the coordinates of her personal biography. In an effort of contemplative remembrance, she seeks out the places and objects of her life, and in describing them, concentrating on them, she finds herself. The extraordinary perception and precision with which these moments of vulnerability, melancholy, loss, and transformation are described are nothing less than haunting and sensuous, enigmatic and intense. Read more »

Monday, September 10, 2018

Visual Histories: Discovery

by Timothy Don

Speak the word “discovery” and familiar images of explorers, scientists, ships, and treasure chests come to mind. To look into the visual record of “discovery” over the past 50,000 years, however, is to witness the concept expand, swell, and overwhelm the imagination. A wonder arises in the course of such research: the realization that the deeper and closer one looks, the wider, richer, and more capacious the topic at hand becomes. Consider the egg.

Ostrich Egg, Egypt, c. 3450–3300 B.C. © The Metropolitan Museum of Art. Gift of Egypt Exploration Fund, 1901

A simple egg. The familiarity of its shape is unnerving, even disarming. Hovering there, alone in space, it has the weight of a moon or a planet. A red planet. A symbol of discovery in its expansive, exploratory sense: to discover is to reach out into space, to land on the moon, to plan for Mars. But…this object is not Mars. It is an egg. It is an ostrich egg, more than 5,000 years old, from the predynastic period of Northern Upper Egypt, dug out of a tomb. Mars remains undiscovered. This is an artifact from an era now lost to us, uncovered by a forgotten Egyptologist from the 19th century. It belongs to the past. This is an actual discovery. And it is wonderful. It is pure potential. It has been discovered, but it remains uncracked. Full of mystery, this old egg from the past, it fills you with wonder. At that moment a gestalt switch gets thrown, and one realizes that discovery’s arrow points not only forward and outward to unexplored planets, but backward and inward to things lost, buried, and forgotten. To discover a thing is also to discover the past, and the act of discovery is about the recovery of the past just as much as it is about the probing of the future. And so a planet (symbol of the undiscovered future) becomes an artifact (material expression of the discovered past), and the artifact (the egg) becomes a mystery, a wonder, a promise. To be an egg is to promise discovery. Every egg, from this one (c. 3400 BC) to the one you cracked into a frying pan this morning, contains and shelters something utterly familiar and utterly unique, something waiting for you to find it. Every egg is a discovery.

There are certain artists who seem to have something to say about everything and whose work as a result appears regularly in the pages of the journal at which I serve as visual curator. Hieronymus Bosch is one. Caravaggio is another. Paul Klee and William Kentridge make the list. The genius of these artists (and many others) makes their work ever-contemporary, as immediate and compelling today as it was when they made it. The density of their work allows it to absorb the assault of time, so that its meaning can shift and apply itself to any period without diluting the purity of its original conception and execution. Read more »

Monday Poem

Little Miracles 2:

Cloudmaker

you, generator of clouds,
are indispensable. you
fling them up as if
they were mere vapor
your creativity is unsurpassed
much like cloud yourself
you may be dark or bright
free and light or stretched
like a cowl under stars,
in daylight white grey saffron
pink

at night you draw curtains
cross a moon
but you are not mere vapor
as you stride
and dance with wind
waving arms to fan the air
you cool it down to give
us drink

Jim Culleny
8/2/18

Drawing, Cloudmaker
Jim C. 1997

Apportioning Democracy

by Jonathan Kujawa

Despite what he may wish, the President of the United States is not a king. We have Congress to act as a check and to ensure the varied opinions of the citizens are represented [1]. In principle, a representative democracy is straightforward: voters select their representatives, and the legislature gets down to the business of running the country.

The devil, of course, is in the details. The framers of the Constitution had knock-down, drag-out fights over basic questions like: does the legislature represent individual citizens or the states? On one side you had those who saw the new country as a joining together of independent, co-equal states. William Paterson of New Jersey compared a large state having more votes than a small one to the idea “that a rich individual citizen should have more votes than an indigent one”. Those on the other side took the view that if this is to be a true common enterprise, then every voter should be treated equally regardless of where they happened to reside. In what can only be a coincidence, a founder’s position on the issue almost invariably matched whether they came from a large state or a small one.

They finally agreed the Senate would have two representatives from each state, regardless of size, while the House of Representatives would have its members allocated to the states according to their population. Even this reasonable compromise nearly failed. Read more »
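The compromise above leaves open exactly how "allocated according to population" works in practice, and that detail is where the trouble starts. As one concrete illustration (not necessarily the method the column goes on to discuss), here is a sketch of Hamilton's "largest remainder" rule, one classic way to turn populations into whole seats; the state names and numbers are invented for the example:

```python
def apportion(populations, seats):
    """Allocate `seats` among states in proportion to `populations`
    using Hamilton's largest-remainder method."""
    total = sum(populations.values())
    # Each state's exact (fractional) share of the seats.
    quotas = {s: p * seats / total for s, p in populations.items()}
    # Start by giving each state the whole-number part of its quota.
    alloc = {s: int(q) for s, q in quotas.items()}
    leftover = seats - sum(alloc.values())
    # Hand the remaining seats to the states with the largest
    # fractional remainders.
    by_remainder = sorted(quotas, key=lambda s: quotas[s] - alloc[s],
                          reverse=True)
    for s in by_remainder[:leftover]:
        alloc[s] += 1
    return alloc

# Hypothetical states: quotas are 2.1, 2.6, and 5.3 seats,
# so B's large remainder wins the tenth seat.
states = {"A": 21_000, "B": 26_000, "C": 53_000}
print(apportion(states, 10))  # → {'A': 2, 'B': 3, 'C': 5}
```

The rule looks innocuous, but tie-breaking on fractional remainders is precisely the kind of detail that produces the paradoxes and political fights the column alludes to.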

Argument Repair and the Sporting Attitude

by Scott F. Aikin and Robert B. Talisse

The second edition of our Why We Argue (And How We Should) is set to be released this month. The new edition is an update of the previous 2014 edition, and in particular, it is occasioned by the argumentative events leading up to and subsequent to the 2016 US Presidential election. New forms of fallacy needed to be diagnosed, and different strategies for their correction had to be posed. But we also saw that another element of a book on critical thinking and politics was necessary, one that has been all-too-often left out: a program for argument-repair. We looked at strategies for finding out what arguers are trying to say, what motivates them, and how to address not just the things argued, but the things that drive us to argue. We think that very often, repairing an argument requires repairing the culture of argument.

One reason arguments go so badly is that the disagreement and interplay between us, our interlocutors, and our onlooking audiences starts off fraught. And so further exchange often makes things worse instead of better. We call this phenomenon argument escalation. Again, we’ve all seen it happen — a difference of opinion about some small matter grows into an argument, an argument becomes a verbal fight, and the verbal fight becomes a fight involving more than merely words. And it’s worse when there are onlookers who will judge us and our performances. Half of the time, we want to stop it all and say, “We are all grownups here . . . can’t we just calm down?” But were one to say this, the interlocutor is liable to respond, “Are you calling me a child, then? And don’t tell me to calm down!” How can we repair arguments without further escalating them? Read more »

Should We Own Ourselves?

by Tim Sommers

Do You Own Yourself?

In 1646, the Leveller leader Richard Overton became the first person in the English-speaking world to assert that we own ourselves. “To every Individuall in nature, is given an individuall property by nature, not to be invaded or usurped by any,” he wrote, “for every one as he is himselfe, so he has a selfe propriety, else he could not be himself.”

We might question the claim that without owning ourselves we couldn’t be ourselves. After all, we don’t need ownership, or property law, to explain why my beliefs are mine or why my actions are mine.

But it’s easy to appreciate the political strategy behind Overton’s use of self-ownership – once you hear it.

The Levellers, so named originally by opponents for supposedly having leveled hedges during the enclosure riots, were the egalitarians of the English Civil War, preaching that sovereignty was founded on consent and that the franchise and property ownership should be extended to all men. Overton was responding to the argument that only men of property should be allowed to vote when he wrote that all men are men of property, because all men have property in themselves.

Do they, though? It sounds plausible. But do you own yourself?

Actually, that’s an easy one. Ownership is a legal notion. In no legal system in the world do you own yourself. (About as close as you can come, in American law, to a precedent that might be interpreted as supporting self-ownership is the right of publicity: celebrities do, at least to some extent, own the use of their image for commercial purposes.)

But even if you don’t own yourself, maybe, you should. Read more »

A Poem About The Service Area

by Amanda Beth Peery

In the service area
where a handful of the families of New Jersey
have pulled off the highway,
the trees are surprising me
with their thickness—
the edge of a secret
or rather forgotten forest
running beside the turnpike.
A small animal occasionally waits,
then breaks
from the velvet shadowed elms.
They press the delicate imprints of leaves
against the broiling sky
as a storm tentatively begins
and one dark-eyed girl, bored with her parents
watches the rain start by the food machines
and swings her arms in loose arcs
like the trees that are now swaying
in the rain and in the wind.
She is just as foreign to me
as these particular trees.
I will never see any of them again.

My Swat Valley Story

by Shadab Zeest Hashmi

The most stunning memory of Swat valley that remains with me since my first visit as a child is the euphoria of the headstrong Darya e Swat, the luxuriously frothy river, like fresh milk churning and churning joyfully. That, and the first time I heard the pristine and full silence of wilderness, meandering along languorous brooks and spotting the smallest wild flowers I had ever seen. Like any child, scale impressed me: the mountains were the highest, the river the fastest, the silence of the trails the deepest I had yet experienced. If there was anything subtle I may have observed, the awe of scale and novelty eclipsed it completely.

On my visit to Swat this year, I tried to recapture that spirit of childhood, trying to set the senses free as well as refining them with the subtlety that was beyond the capacity of my younger self, wrestling furiously against the disquiet and frustration Swat evokes due to recent political events. It struck me that subtlety still eludes the perception of Swat as far as the global psyche goes, dominated as it is by narratives of scale: the most unsafe place in the world, the worst place for women and education, home of the youngest person to receive the Nobel prize, etc. Read more »

Shared Responsibilities for Climate Change Mitigation

by Jalees Rehman

The dangers of climate change pose a threat to all of humankind and to ecosystems all over the world. Does this mean that all humans need to equally shoulder the responsibility to mitigate climate change and its effects? The concept of CBDR (common but differentiated responsibilities) is routinely discussed at international negotiations about climate change mitigation. The basic principle of CBDR in the context of climate change is that highly developed countries have historically contributed far more to climate change and therefore need to reduce their carbon footprint far more than less developed countries. Vehicle ownership in the United States, for example, is approximately 90 cars per 100 people, whereas the rate in India is 5 cars per 100 people. The total per capita carbon footprint includes a plethora of factors such as carbon emissions derived from industry, air travel and electricity consumption of individual households. As of 2015, the per capita carbon footprint in the United States is ten times higher than that of India, but the discrepancy in the historical per capita carbon footprint is even greater.

CBDR recognizes that while mitigating carbon emissions in the future is a shared responsibility for all countries, highly developed countries which have contributed substantially to global carbon emissions for more than a century have a greater responsibility to rein in carbon emissions going forward than less developed countries. However, the idea of “differentiated” responsibilities has emerged as a rather contentious issue. Read more »

Painful and Inevitable Dissonances

by Niall Chithelen

A number of scenes in Eugene Zamyatin’s dystopian novel We (1921) echo moments from Alexander Bogdanov’s utopian Red Star (1908). Bogdanov was a Bolshevik when he wrote Red Star (he was expelled from the party some years before the Russian Revolution), and his “red star” was a socialist, wonderfully technocratic Mars whose residents are preparing to export revolution back to Earth. Zamyatin, also once an Old Bolshevik, wrote in the disturbing aftermath of the Russian Revolution, and his We takes place on a future Earth, ruled over by the United State. It is possible to read We as a response to Red Star and its intellectual moment, with Zamyatin flipping Bogdanov’s Bolshevik idealism to reflect the fright of Bolshevik reality. Bogdanov sought a sort of Communist technocracy, and Zamyatin sensed its enormity, feared it. But, while the two books do offer different political conclusions, the authors seem to share an important belief in humanity and its imperfections, as they provide rather similar answers to a fundamental question of their genre: what kind of freedom do we really need? Read more »

Digital Art?

by Nickolas Calabrese

The subject of digital art is unclear. Work that was new five years ago can look ancient now because of the constant changes in technologies. The term “digital art” itself is an umbrella term that applies to a variety of names for art that is produced with the aid of a computer, including, but not limited to, “new media art”, “net art”, and “post internet art” (for a fuller linear history of the various terms that denote art made with or assisted by computers, see Christiane Paul’s terrific introduction to the A Companion to Digital Art reader, which she edited). The reasons for using “digital art” here are twofold. First, some of the alternatives only denote a temporal category while not actually referring to the form (new media can only be “new” for a brief period; it makes little sense to speak of “post internet” art when it hasn’t significantly evolved from “net art”; put more brusquely, in the words of critic Brian Droitcour in Art in America magazine, “most people I know think ‘Post-Internet’ is embarrassing to say out loud”); and second, because I regularly teach an undergraduate visual arts course at NYU named Digital Art (a title I did not choose).

The upshot of digital art’s classification is that it might not actually exist as a category. Read more »

Monday, September 3, 2018

A Return to Philip Roth

by Adele A. Wilby

The link to Charles McGrath’s ‘No Longer Writing, Philip Roth Still Has Plenty to Say’ which appeared in the New York Times in January, only a few months prior to Roth’s death in May this year, was forwarded to me by a friend who thought I might find the article interesting. How indebted I am to my friend that he thought of me in those terms, for the sending of that article rekindled my acquaintance with Roth; life’s events and circumstances had left my reading of his work to the margins.

After reading that January interview, I was surprised and saddened to hear the announcement that Roth had died; despite his eighty-five years, there was no suggestion of ill health on his part in the interview. However, the numerous critical and appreciatory obituaries propelled me into reflection on what I might have missed over the years by failing to read this major twentieth-century literary figure who has now left us, and convinced me that it was time I returned to Roth to discover for myself what all the praise and criticism of his literature was about. All that remained of my reading of Roth’s Portnoy’s Complaint and The Human Stain forty years ago was the impression that they were good books. Fired up with a renewed enthusiasm to ‘return’ to the Roth I had left behind, unsure of what to select from his numerous writings, but armed with my own life story and political history, I chose his I Married a Communist to start reading him again.

I married a communist in the political heyday of the Cold War seventies and eighties, and we could be thought of as politically active in the decades that followed. Thus, with the value of hindsight, of having at least temporarily succumbed to a socialist ideology and politics, and the development of a healthy skepticism of all ideologies, I have to admit to a curiosity as to how Roth would present politically engaged persons, and the socialist politics of the era, albeit in the United States, and I was not disappointed.

My first reaction after reading the final sentence, closing the book and resting it on my lap was to heave a heavy sigh of great satisfaction. Roth, I felt, had written a great book. Read more »

The Eye’s Mind

by Joshua Wilbur 

I would never call myself a birdwatcher. I’m confident—arrogant even—about blue jays and cardinals, but everything else is a crapshoot. I identify finches as sparrows, sparrows as hawks, hawks as starlings. I’m more often wrong than right.

That said, I do like to watch birds. In college, I would go to the Boston Aquarium to see the African penguins. Peering over the central railing, I would pick out a single penguin from the group and follow its (his? her?) every move for as long as possible. Into the water, out of the water, into the water again…

These days I spend a lot of time in Central Park, where countless pigeons roam the paths. When I’m not in a rush, I’ll find a bench and read for a while, and, inevitably, the legion arrives. Again, I like to pick out one member of the group and see what it does. Whether penguin or pigeon (or finch or sparrow or gull), what fascinates me about watching birds is how contingent—how utterly arbitrary—their actions can seem.

Why does the bird take three steps this way instead of that way?

Why does it fly to that tree in particular?

Why does it return to the ground at my feet, turn a few circles, and continue its march along the sidewalk?  

I understand, of course. It’s looking for food. But there’s a wide gulf between a pigeon’s world and mine. And I feel that divide most strongly when I look into a bird’s eyes. I’ll stare down at a pigeon, focus on the red and orange and black of its vigilant eye, and think about a very old question: “Is anyone in there?” Read more »

Letters in the Age of Science: A 19th-century Case for Optimism

by Jeroen Bouterse

The past years have seen many debates about the limits of science. These debates are often phrased in the terminology of scientism, or in the form of a question about the status of the humanities. Scientism is a notoriously vague term, and its vagueness can be put to the advantage of either side. If you position yourself among the ranks of those fighting against scientific overreach, it helps to define scientism as the easily refuted view that there is no knowledge outside of science; don’t the humanities produce knowledge as well? If, on the other hand, you believe that the methods and results of science can still profitably be exported to new markets, you will need as harmless a definition as possible – scientism becomes nothing more, for instance, than the claim that science encourages us to make an effort to understand things.

Are we bound to talk past each other, then, adapting our language to fit our intuitions? Not necessarily. Scientism can be the label for a well-defined philosophical position, and several articles in a recent volume on the subject manage to prep it for conceptual analysis. This is never a waste of time. However, interesting discussions can also take place in the grey zone between philosophical analysis and vague intuitions.

My favorite example of this comes from the 19th century. It concerns an exchange between Thomas Henry Huxley and Matthew Arnold. Neither of them uses the term ‘scientism’ or provides a formal definition of the issue, but both manage in a delightful way to have a productive debate about the reach of science.

Huxley is now most famous as ‘Darwin’s bulldog’: the enthusiastic promoter and exegete of Charles Darwin. Darwin himself took care not to go beyond defending the content of his new theory of evolution, which was radical and controversial enough as it stood. To his friend Asa Gray he confessed that he found his knowledge of the more gruesome phenomena in nature hard to square with belief in a good and omnipotent God; but when he received letters from eager theology students asking him to clarify the implications of his theory for divinity, he tended, ever-politely, to evade them. Read more »

Cultural Divides, from Snow to Snowflakes

by Thomas O’Dwyer

The career of Kenneth Widmerpool defined an era of British social and cultural life spanning most of the 20th century. He is fictional – a character in Anthony Powell’s 12-volume sequence, A Dance to the Music of Time – but he is as memorable as any historical figure. In the first volume, he is a colourless Eton public schoolboy. Across the series, he tunnels his way under British upper-class and bohemian society. A powerful and sinister self-made monster, he even gains a life peerage. In the final volume, the aged Widmerpool joins a hippie cult and dies naked while chasing girls in the woods. Widmerpool lived and prospered in the solid certainties of his acquired culture. He died in the midst of its fragmentation.

Widmerpool was an original snowflake – one who believed that he was so unique that greatness and adulation were his destiny. His lowly father sold fertilisers. His mother raised him to be this snowflake with an inflated uniqueness that would override his mediocrity. The metaphor then was poetic – snowflakes are lovely, and no two are alike.

Today, we have a “snowflake generation,” defined by British author Claire Fox in her 2016 book I Find That Offensive!: “It is a derogatory term for one deemed too vulnerable to cope with views that challenge their own, particularly in universities and other forums once known for robust debate.” With some irony, these delicate modern snowflakes are also called “new Victorians.”

The collapse of cultural certainties was most clear in Britain but rippled through all Western societies. The origin of certain culture-war debates, which erupt from time to time like temperamental volcanoes, is pinned on one Englishman, Lord Charles Percy Snow. A chemist and novelist, Snow in 1959 published The Two Cultures and the Scientific Revolution. He first delivered it as a lecture at Cambridge University. Snow observed that a group of educated people talking in a room would make allusions drawn from books and the arts. Not one of them would be expected to make, or understand, a reference to “the second law of thermodynamics.” Half of human culture – science – appeared to be non-existent for literary intellectuals.

Snow found this odd and alarming, and he considered it a problem whose solution was obvious. Read more »