You don’t turn to Thorn’s memoirs (this is her second; she is also a New Statesman columnist) for rock ’n’ roll name-dropping, but for someone who can – to quote her quoting Updike – “give the mundane its beautiful due”. My brother-in-law says of being a fan of the band: “The only difference between them and us was that we were listening to Everything but the Girl, while they were in Everything but the Girl.” The past is another planet and the diary twinkles with the arcane poetry of lost brand names – Aqua Manda, Green Shield Stamps.
Though it does turn out that Thorn was a bit more rock ’n’ roll than you might have thought. There’s a lot of underage drinking (“Southern Comfort, gin and orange. LOVELY!”) and sexual danger: “Creep asked me to dance again but I said no – found out he is a policeman. Yikes!” She was 13 when she wrote that, but the boys, she says, were “always older”.
The philosophy of white blood cells: this is self, this is non-self. The starry sky of non-self, perfectly mirrored deep inside. Immanuel Kant, perfectly mirrored deep inside.
And he knows nothing about it, he is only afraid of drafts. And he knows nothing about it, though this is the critique of pure reason.
Deep inside.
by Miroslav Holub, from The Vintage Book of Contemporary World Poetry (Vintage Books, 1996). Translated from the Czech by Dana Habova and David Young.
Sara Miller Llana in The Christian Science Monitor:
Last fall it was voted America’s best-loved book. This winter it made it to Broadway, grossing more at the box office in its first full week than any other play in history. Harper Lee’s “To Kill a Mockingbird” became a classic the moment it was published in 1960 – a tale of racial injustice set in Depression-era Alabama told through the eyes of 6-year-old Scout. It garnered the Pulitzer Prize in 1961, and its movie adaptation won three Oscars in 1963. That it has smashed theater records and risen to the top of a PBS nationwide popularity poll of American literature nearly 60 years later speaks to the lasting power of the narrative of a little girl making sense of racism and hypocrisy around her, as her father Atticus Finch defends Tom Robinson, a black man falsely accused of raping a white woman.
But its endurance rests on more than its clear-eyed depiction of a moment in time in the American South. It rests also on the book’s own evolution, from a groundbreaking text in its time to one that today raises complex questions about how the story is told – who tells it and, notably for some, who doesn’t. As “To Kill a Mockingbird” gets adapted for the stage, giving more voice to the black characters that were secondary or silent in the original novel, and gets re-examined in classrooms across North America, some are asking if a time comes when a book should be retired despite the impression it made on generations of students and the nostalgia many still feel about the work. That discussion gives it even more staying power.
“I don’t want to make somebody else,” Toni Morrison’s character Sula declares, when urged to get married and have kids, “I want to make myself.” Morrison herself might have understood this to be a false dichotomy—she was a single mother of two by the time she published “Sula,” her second novel, in 1973—but, in her fiction, she split the individualist impulse to make an artful life and the domestic drive to make a home between two characters: Sula and her best friend, Nel. The tensions between these two desires animate the body of fiction and nonfiction about the private lives of women and mothers. It’s a canon that has been dominated by the accounts of white, straight writers, but it now includes Michelle Obama’s blockbuster memoir, “Becoming.”
What Obama brings to this genre is, first, a powerful sense of self, which precedes and exceeds her domestic relationships—the book’s three sections are titled “Becoming Me,” “Becoming Us,” “Becoming More”—and, second, a conviction that the roles of wife and mother are themselves undefined. She makes and remakes her relationship to both throughout her adult life. In this, she draws on the literature of black women’s self-making that “Sula” represents. The modern matron saint of that tradition is Zora Neale Hurston, who, in a 1928 essay, describes “How It Feels to Be Colored Me”: a prismatic, mutable experience of being a loner, a spectacle, an ordinary woman, a goddess (“the eternal feminine with its string of beads”). Lucille Clifton shares Hurston’s sense of the need to invent oneself in a world without reliable mirrors or maps; as she writes in a poem, from 1992, “i had no model. / born in babylon / both nonwhite and woman / what did I see to be except myself? / i made it up…” Like these writers, Obama exposes the particular pressures and thrills of black women’s self-creation. But she also details the rather more modest creation of a stable domestic life. By bringing motherhood, marriage, and self-making together in “Becoming,” she combines the possibilities that Sula and Nel represent.
More here. (Note: Throughout February, we will publish at least one post dedicated to Black History Month)
In Diego Rivera’s La creación, Nahui Olin sits among a gathering of figures that represent “the fable,” draped in gold and blue. She appears as the figure of Erotic Poetry, fitting for what she was known for at the time: her erotic writing and sexual freedom.
Olin—a painter, poet, and artists’ model—was known throughout the 1920s and 30s for her intense beauty: huge green eyes, golden hair, and an all-engulfing stare. But her image has been practically erased from the lore of that post-revolutionary time. Her piercing eyes aren’t recognizable unless one is familiar with her face, and the myth that surrounds it: of a woman who reveled in her own beauty, who painted portraits of herself with her many lovers, and who was eventually shunned by society and died, decades later, in the house that she grew up in, alone. Now, a new generation is revalorizing her story.
A young evolutionary biologist, Barrett had come to Nebraska’s Sand Hills with a grand plan. He would build large outdoor enclosures in areas with light or dark soil, and fill them with captured mice. Over time, he would see how these rodents adapted to the different landscapes—a deliberate, real-world test of natural selection, on a scale that biologists rarely attempt.
But first, he had to find the right spots: flat terrain with the right color soil, an abundance of mice, and a willing owner. The last of these was proving especially elusive, Barrett bemoaned. Local farmers weren’t keen on giving up valuable agricultural land to some random out-of-towner. After knocking on door after door, he had come up empty.
When Erica Chenoweth started her predoctoral fellowship at the Belfer Center for Science and International Affairs in 2006, she believed in the strategic logic of armed resistance. She had studied terrorism, civil war, and major revolutions — Russian, French, Algerian, and American — and suspected that only violent force had achieved major social and political change. But then a workshop led her to consider proving that violent resistance was more successful than the nonviolent kind. Since the question had never been addressed systematically, she and colleague Maria J. Stephan began a research project.
For the next two years, Chenoweth and Stephan collected data on all violent and nonviolent campaigns from 1900 to 2006 that resulted in the overthrow of a government or in territorial liberation. They created a data set of 323 mass actions. Chenoweth analyzed nearly 160 variables related to success criteria, participant categories, state capacity, and more. The results turned her earlier paradigm on its head — in the aggregate, nonviolent civil resistance was far more effective in producing change.
And yet the most surprising aspect of the tiny-house phenomenon might be that, while the lifestyle has never been more visible, real-life tiny-house dwellers are hard to find. Fewer than ten thousand of the homes are estimated to exist nationwide. The impracticalities of tiny living—shoebox-sized sinks, closetless rooms—can be daunting, but it’s not only that. There are also legal obstacles. In most states, houses built on a foundation must be at least 400 square feet. To get around that, builders have taken to putting tiny houses on wheels and building them to the less restrictive codes that apply to RVs. But many states and cities bar RVs from being parked in one place year-round. Also, when you buy a tiny house, the house is all you get; you have to either buy the land to put it on, use someone else’s for free, or find a landlord willing to lease land to you.
Around the time of LaBarre and Fredell’s entrepreneurial gambit, the discrepancy between the realities of tiny living and its on-screen representations had generated a great deal of confusion. Inspired by HGTV and Pinterest, people would set out to buy tiny houses, only to learn that tiny living, in the real world, is supremely hard.
In June 2007, in Seville, Spain, a conference was held under the banner “New Fictioneers: The Spanish Literary Atlas.” Around forty writers and critics came together at the Andalusian Center for Contemporary Art to discuss the conservatism they felt to be suffocating their national literature. United in their belief that the Spanish novel in particular was in a bad state, they pointed to a disregard for the increasing centrality of digital media in people’s lives and a knee-jerk resistance to anything that smacked of formal experimentation. They were mostly of a similar age, born in the twilight of the Franco regime, committed to the DIY punk ethos of the fledgling blogosphere, and more likely to claim lineage to J. G. Ballard or Jean Baudrillard than any garlanded compatriots of their own. Nonetheless, the only true point of agreement on the day was that they were not part of a unified movement. The conference’s inaugural address itself rejected any suggestion of a coherent generation—a commonplace criticism familiar in Spain ever since the clumping together, at the end of the nineteenth century, of the Generation of ’98. Within a few weeks, however, an article appeared dubbing these writers “The Nocilla Generation”: the most significant literary phenomenon of Spain’s democratic era now had a label, and it stuck.
Complexities of interpretation are food and drink to Petrarchan scholars, and Christopher Celenza tucks into them with quiet determination in his short life-and-works overview, Petrarch: Everywhere a Wanderer. The subtitle is taken from one of Petrarch’s verse letters, where he describes himself as ‘peregrinus ubique’ (more precisely translated as ‘everywhere a pilgrim’). He was indeed a restless traveller. We find him in Avignon, Montpellier, Lombez, Paris, Ghent, Liège, Basel and Prague, as well as all over Italy. A learned marginalium suddenly becomes a snapshot when he apologises for his handwriting – he’s writing while on a boat. But the peregrination he describes is more a metaphysical restlessness, as he moves among different interests, disciplines, languages and political allegiances; between different versions of himself – a man of no fixed abode, ‘without earth or sky to call his own’. This also has a bearing on exile. His father was of the conservative or ‘white’ wing of the Guelph party, whose members opposed papal influence, and (like Dante, whom he knew) had been banished from Florence in 1302; Petrarch was born two years later in Arezzo and spent his boyhood and youth in Provence. ‘A sense of exile,’ Celenza says, ‘permeated Petrarch’s psychology’ and ‘inflected his writing’. It also, paradoxically, fostered in him an early sense of ‘Italianness’, in place of the regional identity denied him by exile.
I promised God and some other responsible characters, including a bench of bishops, that I was not going to part my lips concerning the U.S. Supreme Court decision on ending segregation in the public schools of the South. But since a lot of time has passed and no one seems to touch on what to me appears to be the most important point in the hassle, I break my silence just this once. Consider me as just thinking out loud.
The whole matter revolves around the self-respect of my people. How much satisfaction can I get from a court order for somebody to associate with me who does not wish me near them? The American Indian has never been spoken of as a minority and chiefly because there is no whine in the Indian. Certainly he fought, and valiantly for his lands, and rightfully so, but it is inconceivable of an Indian to seek forcible association with anyone. His well known pride and self-respect would save him from that. I take the Indian position.
Now a great clamor will arise in certain quarters that I seek to deny the Negro children of the South their rights, and therefore I am one of those “handkerchief-head niggers” who bow low before the white man and sell out my own people out of cowardice. However an analytical glance will show that that is not the case. If there are not adequate Negro schools in Florida, and there is some residual, some inherent and unchangeable quality in white schools, impossible to duplicate anywhere else, then I am the first to insist that Negro children of Florida be allowed to share this boon. But if there are adequate Negro schools and prepared instructors and instructions, then there is nothing different except the presence of white people.
More here. (Note: Throughout February, we will publish at least one post dedicated to Black History Month)
A contemporary truism, ironically enough, is that we now live in a “post-truth” era, as attested by a number of recent books with that or similar titles, related by their authors with varying degrees of chagrin. We all know the ultimate, or at least proximate, source of this concern: that fount of untruth which is Donald Trump’s Twitter feed (with a side of Brexitmania for those across the pond). Even among his supporters – and this is indeed, I think, the most charitable interpretation of the phenomenon in question – Mr. Trump is, as his aide Ms. Conway has put it, best taken “seriously but not literally.” That is, he is not particularly concerned with whether what he says is true, but instead with its effect on his readers and listeners, friendly or hostile.
I’m not going to defend this attitude toward truth-telling, which has become known, thanks to philosopher Harry Frankfurt, as “bullshitting,” and in fact dates back to the ancients, when Plato’s Socrates criticized what were called “sophists” for similar attitudes. However, it does seem that some of today’s self-appointed defenders of truth can paint their targets with an excessively broad brush. Naturally there is a lot of complaining about “postmodernist skepticism about truth,” but since most of our writers are not philosophers, their somewhat vague griping is hard to evaluate. (Michiko Kakutani, for example, clearly doesn’t like Derrida; but that’s just about all I got out of what she said.) But I’m not here to defend postmodernism any more than I am to defend “alternative facts.”
“He knew the American people better than they knew themselves, and his truth was based upon this knowledge.” —Frederick Douglass, Oration in Memory of Abraham Lincoln, April 14, 1876
In October of 1859, Abraham Lincoln received an invitation to come to New York to deliver a lecture at the Abolitionist minister Henry Ward Beecher’s Plymouth Congregational Church in Brooklyn.
Although Lincoln was, at this time, largely a regional figure among Republicans, and held no public office, the invitation was not accidental. The party was still a bit like a Rube Goldberg contraption, made up of pieces (former Whigs, Know-Nothings, disaffected Northern Democrats) that didn’t all quite fit together. People of influence, notably the newspaper publishers William Cullen Bryant and Horace Greeley, knew the new movement needed a coherent, inclusive platform, and an articulate, attractive man to lead it.
They weren’t, by any means, anointing Lincoln—in fact, they had no idea what he could do. He had acquitted himself well in his 1858 Senate race against Stephen Douglas, the presumed 1860 Democratic Presidential nominee, but he had also lost. There were other questions as well: Would a person some described as a “backwoodsman” play in front of a New York audience?
What was obvious was that the most likely Republican candidates for the nomination were either flawed or disliked, or flawed and disliked. The presumed frontrunner, William Seward of New York, was unquestionably competent, but had made some radical-sounding speeches and was seen by many as a captive of Thurlow Weed’s political machine. Seward’s support was also thin in lower North states like Pennsylvania, Indiana and Illinois—all of which Republicans had lost in 1856 and were essential to victory this time. Other potential candidates had different weaknesses: Ohio’s Salmon P. Chase was an able Governor, but lacked what we would now call retail political skills. Pennsylvania’s favorite, Simon Cameron, was undeniably corrupt. The fourth “first tier” candidate, Edward Bates of Missouri, was 66, barely a Republican, and almost certainly the most conservative. It was hard to see how he could have ignited a movement.
47-year-old Teburoro Tito stood at the head of his delegation on an island way out in the Pacific Ocean. At the stroke of midnight on January 1st, 2000, the President of Kiribati handed a torch to a young man, ceremonially passing the future to a new generation.
Nobody lived there. Nobody ever went out there. When the designers of the International Date Line put marker to map, they drew right through the 33 islands that wouldn’t become Kiribati for 95 more years. They were just trifling bits of land, of no concern to cartographers 9050 miles away in London.
But there the little group stood because in 1994 the I-Kiribati president “applied to have a slight loop inserted into the date line that included all its islands in one time zone…. The Greenwich Observatory sanctioned his proposal and thus it was that his little nation profited greatly from the millennial hoopla.”
Dancers in grass skirts played to the cameras and the Kiribati archipelago made its claim to be the first to welcome the new millennium. The US Navy submarine Topeka made its own claim, positioning itself 400 meters underwater straddling both the date line and the equator.
New Zealand claimed the new millennium’s first sunrise. Their Chatham Islands, 500 miles east of the rest of New Zealand, got about a forty-minute jump on the rest of the country.
The real first dawn over land was near remote Dibble Glacier in Antarctica, but it was midsummer there, and you couldn’t really call it dawn, because the sun had never even set. If anybody was there, they never mentioned it.
Amidst all of the disheartening immigration news, it was refreshing to see the recent D.C. district court decision in Grace v. Whitaker. The A.C.L.U. and the Center for Gender & Refugee Studies brought the case on behalf of twelve adults and children who fled domestic violence in their home countries and were denied entry by United States border officials. Judge Emmet Sullivan reviewed former Attorney General Jeff Sessions’ extraordinary decision in Matter of A-B- last summer, which imposed heightened requirements for asylum-seekers entering the U.S. and moreover stated that domestic violence and gang violence were “generally” not grounds for asylum. Judge Sullivan found that Sessions’ decision and the subsequent Policy Memorandum that the Department of Homeland Security issued were unlawful.
Asylum law in the U.S.
Asylum law in the U.S. recognizes refugees belonging to a few specific categories: political opinion, race (encompassing ethnicity), nationality, religion, and “membership in a particular social group.” People fleeing abusive domestic situations and gang violence have been able to gain asylum in the U.S. through the last category, social group. To qualify as a refugee, someone must have a “well-founded fear of persecution” either by governmental actors, or, what is often crucial for social-group applicants, by non-state actors that the government is “unable or unwilling” to control. This language will come up a little later, as Sessions’ decision attempted to morph it into something quite a bit more restrictive.
In his essay The Methodology of Positive Economics[1], first published in 1953 and often reprinted, Milton Friedman helped lay the foundation for mathematical economics by arguing against burdening models with the need for realistic assumptions. The virtue of a model, the essay argues, is a function of how much of reality it can ignore and still be predictive:
The reason is simple. A [model] is important if it explains much by little, that is, if it abstracts the common and crucial elements from the mass of complex and detailed circumstances surrounding the phenomena to be explained and permits valid predictions on the basis of them alone.
Agreement on how to allow predictive models into the canon of Economics, Friedman believed, would enable Positive Economics to become “… an Objective science, in precisely the same sense as any of the physical sciences.” What Friedman coveted can be found in a footnote:
The … prestige … of physical scientists … derives … from the success of their predictions … When economics seemed to provide such evidence of its worth, in Great Britain in the first half of the nineteenth century, the prestige … of … economics rivaled the physical sciences.
Friedman appreciated, to a degree, the implications of the investigator being part of the subject. “Of course,” he wrote, “the fact that economics deals with the interrelations of human beings, and that the investigator is himself part of the subject matter being investigated…raises special difficulties…”
But he loses the value of his observation to a spate of intellectual showboating:
The interaction between the observer and the process observed … [in] the social sciences … has a more subtle counterpart in the indeterminacy principle … And both have a counterpart in pure logic in Gödel’s theorem, asserting the impossibility of a comprehensive self-contained logic …
The absence of an ability to conduct controlled experiments, according to Friedman, was not a burden holding back progress or unique to the social sciences. “No experiment can be completely controlled,” he wrote and offered astronomers as an example of scientists denied the opportunity of controlled experiments while still enjoying the prestige he coveted.
But though he wanted to move economics forward as a positive science – one where predictions are formulated through math and then tested against alternative formulations – he did not want to see mathematics supplant economics. “Economic theory,” he wrote, “must be more than a structure of tautologies … if it is to be something different from disguised mathematics.”
When Friedman penned his article, the simplest mathematical formulations exhausted computational capacity.
Caricature of A.Z. Abushady by the Persian/Alexandrian cartoonist, Mohamed Fridon (ca. 1928)
Joy Amina Garnett is an Egyptian American artist and writer living in New York. Her work, which spans creative writing, painting, installation art, and social media-based projects, reflects how past, present, and future narratives can co-exist through ‘the archive’ in its various forms. Her work has been included in exhibitions at New York’s FLAG Art Foundation, MoMA–PS1, the James Gallery, the Milwaukee Art Museum, Museum of Contemporary Craft Portland, Boston University Art Gallery, and the Witte Zaal in Ghent, Belgium, and she has been awarded grants from Anonymous Was a Woman, the Lower Manhattan Cultural Council, Wellcome Trust, and the Chipstone Foundation. Joy’s paintings and writings have appeared, sometimes side-by-side, in an eclectic array of publications, including the Evergreen Review, Ibraaz, edible Brooklyn, C Magazine, Ping Pong, and The Artists’ and Writers’ Cookbook. She has been working on a memoir and several other projects around the life and work of her late grandfather, the Egyptian Romantic poet and bee scientist A.Z. Abushady (1892–1955). Her chapter on Abushady will appear in Cultural Entanglement in the Pre-Independence Arab World: Arts, Thought, and Literature, edited by Anthony Gorman and Sarah Irving, forthcoming from I.B. Tauris. An excerpt from her memoir-in-progress appears in the January 2019 issue of FULL BLEDE, edited by Sacha Baumann.
Andrea Scrima: Joy, you’re the sole steward of the effects of your famous grandfather—the Egyptian Romantic poet and bee scientist Ahmed Zaki Abushady [Abu Shadi]—and have been compiling an archive for several years. First of all, however, I’d like to ask you about your artistic approach to the material and the ways in which history and storytelling interweave in the work. You showed an earlier version of this work-in-progress at Smack Mellon around four years ago, and now, recently, I’ve seen a number of new installments of The Bee Kingdom on Facebook. It makes me think of a kind of novel of layered fragments.
Abushady in the back garden of his home and research laboratory ‘Rameses Villa,’ Ealing, London. September, 1917
Joy Garnett: I like that description, The Bee Kingdom as a novel of layered fragments, though sometimes it feels like I’m chasing a moving target. It’s been challenging to parlay so many fragments into an artwork or a sustained piece of creative writing, but that is what I’m doing. The source material is not only historically relevant, it’s close and personal, and this affects how I work. And while I want to know the history of what actually happened to my grandfather and my family, and so on, I’m aware of many co-existing unofficial and even secret histories that appear and disappear as I try to make sense of things.
There are other questions, such as what types of media I want to work with. I’ve been a painter all my life, but painting isn’t right for this project. Is The Bee Kingdom mostly writing? Yes and no. I’ve subordinated the visual to writing, but the writing depends heavily on images.
In the winter of 1927 James Joyce was in desperate need of a kind word. It didn’t seem to matter that he was a genius, the man who’d published Ulysses five years earlier, an artist of such magnitude that another Irish genius—a young Samuel Beckett—worshipped him and acted as his personal secretary. Joyce was completing a new novel under the working title Work in Progress (later published as Finnegans Wake), and nearly everyone who had read drafts hated it.
James Joyce in Paris.
His wife, Nora Joyce, badgered him: “Why don’t you write sensible books that people can understand?” while his longtime patron, the sophisticated Harriet Shaw Weaver, wrote him scathing letters. She found the work nearly indecipherable. “I am made in such a way that I do not care much for the output from your Wholesale Safety Pun Factory nor for the darkness and unintelligibilities of your deliberately-entangled language system,” she wrote. Joyce biographer Richard Ellmann, in his definitive chronicle James Joyce (1959), tells us that Joyce was so upset by this letter he “took to his bed.”
In Joyce’s three previous books he had explored and mastered the limits of the short story and the autobiographical novel, and then proceeded to write a maximalist “avant-garde” novel, Ulysses (1922), that was arguably three to four decades ahead of its time. In baseball terms, Ulysses remains the equivalent of Joe DiMaggio’s 56-game hitting streak in 1941: a record of human achievement that is unassailable and will forever remain a sacred Mount Sinai for writers across the globe.
Yet this great book burdened Joyce too. Like DiMaggio, he had to know his achievement was not repeatable. “In Dubliners he had explored the waking consciousness from outside, in A Portrait and Ulysses from inside,” Ellmann wrote. “He had begun to impinge, but gingerly, upon the mind of sleep…that the great psychological discovery of the century was the night world he was, of course, aware.”
Ellmann, we can’t forget, is referring to the writings and work of Sigmund Freud and his psychoanalytic theory, an intellectual bombshell of the early 20th century rivaled only by Einstein’s theory of general relativity. In Finnegans Wake Joyce was looking to create an entirely new language for the new territory of the unconscious, of sleep, of the dream world.