A Conversation about Criticism

William Giraldi and Anthony Domestico at Commonweal:

When I suggest a nexus between style and morality I mean always to keep the query centered on the book, to interrogate the book’s moral vision as it is activated in language. That vision, like everything else in the writer’s arsenal, is manifest in style. Language is our fullest, most accurate embodiment of mind. How you write is how you think, and there’s reciprocity there, a fertile feedback loop, because the writing in turn sharpens the thinking, which in turn sharpens the writing. This is what Goethe means by “a writer’s style is a true reflection of his inner life.” Remember, too, Nabokov’s oft-cited line: “Style is matter.” He means that style is not something gummed onto prose after the fact—style is the fact. Style is born of subject. Robert Penn Warren makes a similar observation: “The style of a writer represents his stance toward experience.” That’s what Auden means in his second consideration above.

more here.

The Life of Charles de Gaulle

Seamus Deane at the Dublin Review of Books:

The French people are constantly hauled on to a Corneille-like stage with and by de Gaulle, two characters in search of a destiny, with soliloquies, debates and monologues conducted in newspapers, on radio and on television, although, such is the nature of a biography, “the people” are really the audience that listens and is moulded, enchanted or aroused to sublimity by the suasion of that resonant, nasal, rhythmic voice. As was noticed on several occasions, de Gaulle was a traditional Catholic Christian; he rarely spoke of or even mentioned God, but rarely failed to speak instead of France, the great stained-glass rose window in which the divine light had glowed through the centuries in radiance or in sombre melancholy, picking out at irregular intervals the ranged silhouettes of a Clovis, a Charlemagne, an Henri IV, a Joan of Arc, a Louis XI, a Colbert, a Richelieu, a Louis XIV (or his great war minister Louvois), a Napoleon and, at last, a de Gaulle. Régis Debray, writing in 1990, is quoted: “In my dreams I am on terms of easy familiarity with Louis XI, with Lenin, with Edison and Lincoln. But I quail before de Gaulle. He is the Great Other, the inaccessible absolute … Napoleon was the great political myth of the nineteenth century; de Gaulle of the twentieth. The sublime, it seems, appears in France only once a century.” Debray had once regarded Mitterrand as a saviour (rather hard to believe now in any retrospective light), but this literary-political canonisation would have pleased de Gaulle, for he certainly believed it to be true, true as only a myth can be.

more here.

Frederick Douglass’s Moral Crusade

Eric Foner at The Nation:

Douglass’s current status as a national hero poses a challenge for the biographer, making it difficult to view him dispassionately. Moreover, those who seek to tell his story must compete with their subject’s own version of it. Douglass published three autobiographies, among the greatest works of this genre in American literature. They present not only a powerful indictment of slavery, but also a tale of extraordinary individual achievement (it is no accident that Douglass’s most frequently delivered lecture was titled “Self-Made Men”). Like all autobiographies, however, Douglass’s were simultaneously historical narratives and works of the imagination. As David Blight notes in his new book, Frederick Douglass: Prophet of Freedom, some passages in them—especially those relating to Douglass’s childhood—are “almost pure invention,” which means the biographer must resist the temptation to take these books entirely at face value.

more here.

Women, Weaving and History

Kassia St Clair at the TLS:

The cultural association between women and the manufacture of textiles runs deep. Many of the deities associated with spinning and weaving are female: Neith in pre-Dynastic Egypt; Grecian Athena; the Norse goddess Frigg and the Chinese Silkworm Goddess. Skill with needle, thread or a loom was seen in many cultures as an intrinsic part of a woman’s value. In Greece, the birth of a baby girl was sometimes marked by placing a tuft of wool by the door of a home. And Homer wrote that when his countrymen raided towns and villages they killed the men they encountered, but women were spared in order to spin and weave in captivity. The “distaff side” was a traditional English term for the maternal side of a family. Women the world over were buried with spindles and distaffs, the very tools they wielded so dexterously and for so many hours in life. Sarah Arderne, a member of the lesser gentry in northern England in the late eighteenth century, bought muslin for her husband’s cravats and handkerchief and supervised the laundering of all his linens. She was pretty typical: it wasn’t unusual for men of this class and period to live and die without ever taking responsibility for their own shirts.

more here.

Strange and intelligent

Christy Wampole in Aeon:

Simone Weil (1909-43) belonged to a species so rare, it had only one member. This peculiar French philosopher and mystic diagnosed the maladies and maledictions of her own age and place – Europe in the first war-torn half of the 20th century – and offered recommendations for how to forestall the repetition of its iniquities: totalitarianism, income inequality, restriction of free speech, political polarisation, the alienation of the modern subject, and more. Her combination of erudition, political and spiritual fervour, and commitment to her ideals adds weight to the distinctive diagnosis she offers of modernity. Weil has been dead now for 75 years but remains able to tell us much about ourselves. Born to a secular Jewish family in Paris, she was gifted from the beginning with a thirst for knowledge of other cultures and her own. Fluent in Ancient Greek by the age of 12, she taught herself Sanskrit, and took an interest in Hinduism and Buddhism. She excelled at the Lycée Henri IV and the École normale supérieure, where she studied philosophy. Plato was a lasting influence, and her interest in political philosophy led her to Karl Marx, whose thought she esteemed but did not blindly assimilate.

A Christian convert who criticised the Catholic Church, and a communist sympathiser who denounced Stalinism and confronted Trotsky over hazardous party developments, Weil displayed an independence of mind and a resistance to ideological conformity that are central to her philosophy. In addition to her intelligence, other aspects of her biography have captured the public’s imagination. As a child during the First World War, she refused sugar because soldiers on the front could have none. Diagnosed with tuberculosis, she died at 34 while working in London for the resistance government France libre, refusing to eat more than the rations allowed to citizens of her German-occupied France. Teachers and classmates called her the Martian and the Red Virgin, nicknames suggestive of her strangeness and asexuality. A philosopher who refused to cloister herself behind academia’s walls, she worked in factories and vineyards, and left France during the Spanish Civil War to fight alongside the Durruti Column anarchists, a failed mission in many respects.

Several mystical experiences, including Weil’s discovery of the poem ‘Love (III)’ by the 17th-century poet George Herbert, led her to embrace Christianity, and many have called for her canonisation as a saint. In her book Devotion (2017), the Francophile poet and punk-rock star Patti Smith described Weil as ‘an admirable model for a multitude of mindsets. Brilliant and privileged, she coursed through the great halls of higher learning, forfeiting all to embark on a difficult path of revolution, revelation, public service, and sacrifice.’ The French politician Charles de Gaulle thought Weil was mad, while the authors Albert Camus, André Gide and T S Eliot recognised her as one of the greatest minds of her time.

More here.

Where People Turn When Nobody Can Diagnose Their Illness

Shayla Love in Tonic:

When something in the body goes wrong, we see a doctor, get examined, get swabbed or have blood drawn, and usually leave with an answer. But for many patients and families, medical examination doesn’t lead to a diagnosis—and what do they do then? A paper published in the New England Journal of Medicine last week covers the progress of the Undiagnosed Diseases Network (UDN), a collection of sites around the country to which people can turn, when there are no more specialists to see and no more conventional tests to run, to find answers to their health mysteries. The UDN was inspired by a 2008 program called the Undiagnosed Diseases Program (UDP) at the National Institutes of Health (NIH) Clinical Center, which was quickly overwhelmed by patients seeking its services. The UDN was formed in 2014 and is made up of 12 sites across the country. The network also includes two sequencing centers; a metabolomics group, which studies the small molecules made by cells; and a model organisms core, a team that tests genes that might be causing disease in organisms such as flies and fish.

Since its inception, 1,519 patients have been referred to the UDN, and 601 have been accepted for evaluation. The largest share of them, 40 percent, had a neurological condition, like Quinn. Of the 382 patients who received a complete evaluation, 132 (about 35 percent) got a diagnosis; most of those diagnoses were found through genetic testing. Twenty-one percent of those who were diagnosed were able to make changes to their therapy specific to their condition. As of today, the UDN has defined 31 new syndromes.

More here.

The Empty Brain

Robert Epstein in Global Research:

No matter how hard they try, brain scientists and cognitive psychologists will never find a copy of Beethoven’s 5th Symphony in the brain – or copies of words, pictures, grammatical rules or any other kinds of environmental stimuli. The human brain isn’t really empty, of course. But it does not contain most of the things people think it does – not even simple things such as ‘memories’.

Our shoddy thinking about the brain has deep historical roots, but the invention of computers in the 1940s got us especially confused. For more than half a century now, psychologists, linguists, neuroscientists and other experts on human behaviour have been asserting that the human brain works like a computer. To see how vacuous this idea is, consider the brains of babies. Thanks to evolution, human neonates, like the newborns of all other mammalian species, enter the world prepared to interact with it effectively. A baby’s vision is blurry, but it pays special attention to faces, and is quickly able to identify its mother’s. It prefers the sound of voices to non-speech sounds, and can distinguish one basic speech sound from another. We are, without doubt, built to make social connections. A healthy newborn is also equipped with more than a dozen reflexes – ready-made reactions to certain stimuli that are important for its survival. It turns its head in the direction of something that brushes its cheek and then sucks whatever enters its mouth. It holds its breath when submerged in water. It grasps things placed in its hands so strongly it can nearly support its own weight. Perhaps most important, newborns come equipped with powerful learning mechanisms that allow them to change rapidly so they can interact increasingly effectively with their world, even if that world is unlike the one their distant ancestors faced.

Senses, reflexes and learning mechanisms – this is what we start with, and it is quite a lot, when you think about it. If we lacked any of these capabilities at birth, we would probably have trouble surviving.

But here is what we are not born with: information, data, rules, software, knowledge, lexicons, representations, algorithms, programs, models, memories, images, processors, subroutines, encoders, decoders, symbols, or buffers – design elements that allow digital computers to behave somewhat intelligently. Not only are we not born with such things, we also don’t develop them – ever.

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not. Computers, quite literally, process information – numbers, letters, words, formulas, images. The information first has to be encoded into a format computers can use, which means patterns of ones and zeroes (‘bits’) organised into small chunks (‘bytes’). On my computer, each byte contains 8 bits, and a certain pattern of those bits stands for the letter d, another for the letter o, and another for the letter g. Side by side, those three bytes form the word dog. One single image – say, the photograph of my cat Henry on my desktop – is represented by a very specific pattern of a million of these bytes (‘one megabyte’), surrounded by some special characters that tell the computer to expect an image, not a word.
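To make the byte-level encoding Epstein describes concrete, here is a minimal Python sketch (ours, not Epstein's; it simply prints the standard ASCII byte values and bit patterns for the three letters of "dog"):

    # Show how the three letters of "dog" are stored as 8-bit byte patterns (ASCII encoding).
    text = "dog"
    for letter, byte in zip(text, text.encode("ascii")):
        print(f"{letter} -> byte value {byte} -> bits {byte:08b}")

    # Prints:
    # d -> byte value 100 -> bits 01100100
    # o -> byte value 111 -> bits 01101111
    # g -> byte value 103 -> bits 01100111

Side by side, those three bit patterns are the "very specific pattern" of bytes that, in a computer, stands for the word dog.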

More here.

Friday Poem

Emergency Measures

I take Saturday’s unpopulated trains,
sitting at uncontagious distances,
change at junctions of low body count, in off-hours,
and on national holidays especially, shun stadia
and other zones of efficient kill ratio,
since there is no safety anymore in numbers.

I wear the dull colors of nesting birds,
invest modestly in diverse futures,
views and moods undiscovered by tourists,
buy nothing I can’t carry or would need to sell,
and since I must rest, maintain at several addresses
hardened electronics and three months of water.
And it is thus I favor this unspecific café,
choose the bitterest roast, and only the first sip
of your story, sweet but so long, and poignantly limited
by appointments neither can be late for, and why now
I will swim through the crowd to the place it is flowing away from,
my concerned look and Excuse me excuse me suggesting
I am hurrying back for my umbrella or glasses
or some thrilling truth they have all completely missed.

by James Richardson
from By the Numbers

Thursday, October 25, 2018

CLR James rejected the posturing of identity politics

Ralph Leonard in UnHerd:

“I denounce European colonialism”, wrote CLR James in 1980, “but I respect the learning and profound discoveries of Western civilisation.” A Marxist revolutionary and Pan-Africanist, a historian and novelist, an icon of black liberation and a die-hard cricket fan, a classicist and a lover of popular culture, Cyril Lionel Robert James, described by V. S. Naipaul as “the master of all topics”, was one of the great (yet grossly underrated) intellectuals of the 20th century.

He was one of the few Leftist intellectuals – as Christopher Hitchens once said about George Orwell – who was simultaneously on the right side of the three major questions of the 20th century: Fascism, Stalinism and Imperialism. But today his praise for ‘Western culture’ would probably be dismissed as a slightly embarrassing residue of a barely concealed ‘Eurocentrism’.

In a recent column for Varsity entitled “Not all literature is ‘universal’ – nor does it have to be”, Sophie Zhang writes:

“The study of English Literature… has often centred around texts that claim to explore ‘universal’ themes and experiences. Yet what such curricula fail to recognise is that in glorifying the universal, we neglect the particular, because to focus on the ‘Western’ canon would be ‘to centre whiteness and continually place non-white voices on the margins’”.

Implicit in this view is the idea that only “whiteness” could have access to the universal, that those outside “whiteness” are intrinsically on the margins, and that their views are necessarily “particular”.

Similarly, James’s admiration for Western culture and the Western canon is something many black radicals, who otherwise admire James for his opposition to colonialism, struggle to understand about him. It is rather fashionable, and almost expected, that to be a ‘proper’ black radical today is to be hostile to all that is designated as Western; it is to indiscriminately dismiss the Enlightenment as “white” and “racist”, and disparage the Western canon as not being “relevant” to black people.

More here.

The Cure For Racism Is Cancer

Tony Hoagland (who died two days ago at age 64) in The Sun:

The woman sitting next to me in the waiting room is wearing a blue dashiki, a sterile paper face mask to protect her from infection, and a black leather Oakland Raiders baseball cap. I look down at her brown, sandaled feet and see that her toenails are the color of green papaya, glossy and enameled.

This room at MD Anderson Cancer Center in Houston, Texas, is full of people of different ages, body types, skin colors, religious preferences, mother tongues, and cultural backgrounds. Standing along one wall, in work boots, denim overalls, and a hunter’s camouflage hat, is a white rancher in his forties. Nervously, he shifts from foot to foot, a styrofoam cup of coffee in his hand. An elderly Chinese couple sit side by side, silently studying their phones. The husband is watching a video. The wife is the sick one, pale and gaunt. Her head droops as if she is fighting sleep. An African American family occupies a corner. They are wearing church clothes; the older kids are supervising the younger ones while two grown women lean into their conversation and a man — fiftyish, in a gray sports coat — stares into space.

America, that old problem of yours? Racism? I have a cure for it: Get cancer. Come into these waiting rooms and clinics, the cold radiology units and the ICU cubicles. Take a walk down Leukemia Lane with a strange pain in your lower back and an uneasy sense of foreboding. Make an appointment for your CAT scan. Wonder what you are doing here among all these sick people: the retired telephone lineman, the grandmother, the junior-high-school soccer coach, the mother of three.

More here.

Notes on Neanderthal Aesthetics

Justin E. H. Smith in his blog:

Often, in the study of past human and natural processes, the realisation that we have no evidence of something, and therefore no positive knowledge, transforms too quickly into the conclusion that because we have no positive knowledge, we may therefore assume the negative. Sometimes this conclusion is correct, but it cannot be applied indiscriminately. To cite one interesting example of the correct and useful refusal to apply it, in recent probabilistic reasoning about extraterrestrials the absence of any direct evidence for their existence is taken as irrelevant to whether we should believe in them or not. Drake’s equation is more powerful than radio signals or space ships in shaping our beliefs.

In palaeontology and palaeoanthropology, the signals that do come down to us in the present are usually the result of contingent, singular events that could have failed to occur. Any individual object that survives erosion or decomposition from the distant past is an exception, and needs to be understood as such. A corollary principle worth adopting in these fields holds that the earliest found artefact of a given sort cannot be the earliest artefact that ever existed of that sort. As an individual object, it is exceptional, but it justifies the presumption of a large class of absent objects to which it belongs or belonged, and in relation to which it is not at all exceptional.

More here.

The Secret Lives of Central Bankers

Annelise Riles in the New York Times:

A few years ago, a senior Japanese central banker let me in on a secret side of his life: Like some others in his rarefied world, he is a passionate devotee of Sherlock Holmes. After formal meetings in capitals around the world, he joins the other Sherlock Holmes buffs over drinks or dinner for trivia competitions, to test their knowledge of obscure plot details, or to share amateur historical research into Victorian London.

It is all very casual, but the camaraderie is important to him. Through this informal fan club, the banker told me, he had made his closest professional friendships. “I feel closer to many of these people than to many of my countrymen,” he said.

As an anthropologist, I have spent 20 years studying the customs, beliefs and rituals of central bankers around the world. They see themselves as jacks-of-all-financial-trades who solve complex financial crises before they can damage the unsuspecting public. They are as clever as the extraordinarily wealthy banking executives whom they regulate, but motivated by higher ideals. So it made sense that the aloof and justifiably arrogant Sherlock Holmes might represent for them an ideal of masculine brilliance (they are mostly still men), rationality and self-control. Like Holmes, central bankers consider their detachment an asset.

But in the real world, this high-mindedness has come at a cost.

More here.

George Scialabba, Radical Democrat

Jedediah Purdy in The New Republic:

It may strike a reader new to George Scialabba’s writing as extraordinary that Slouching Toward Utopia, a new collection of his essays and reviews, is not a response to Donald Trump’s presidency. Although the president does not appear by name until a handful of very recent pieces toward the end—earlier he is decorously invoked, just once, as “a famous social parasite”—Scialabba has argued for years that the United States is a plutocracy, administered mainly for the convenience of those who control capital and jobs. His consistent themes have been the corruption of language, the coarsening of imagination, the colonization of attention by technology and commerce, and the seductions of power. The pathologies that the present moment throws into relief have always been the occasions of his warnings and laments. He writes lucidly about benightedness, vividly about purblindness, so that his essays and reviews show thought as a thing possible in a world that can seem a conspiracy against sense and reason.

It is up to Scialabba’s readers to observe this modest heroism in his work, because he will not claim it for himself. He has long insisted on the political irrelevance of criticism.

More here.

The Rise of Cancer Immunotherapy

Daniel M. Davis in Nautilus:

“Every time Jim meets a patient, he cries,” Padmanee said to The New York Times in 2016. “Well not every time,” Jim added. Jim Allison and Padmanee Sharma work together at the MD Anderson Cancer Center in Houston, Texas, having met in 2005 and married in 2014. A decade before they met, Allison and his lab team made a seminal discovery that led to a revolution in cancer medicine. The hype is deserved; cancer physicians agree that Allison’s idea is a game-changer, and it now sits alongside surgery, radiation, and chemotherapy as a mainstream option for the treatment of some types of cancer.

Take one example. In 2004, 22-year-old Sharon Belvin was diagnosed with stage IV melanoma—skin cancer that had already spread to her lungs—and was given a 50/50 chance of surviving the next six months. Chemotherapy didn’t work for her, and her prospects looked bleak. “I’ve never felt that kind of pain,” she later recalled, “ … you are lost, I mean you’re lost, you’re absolutely out of control, lost.” All other options exhausted, she signed up for an experimental clinical trial testing a new drug based on Allison’s idea. After just four injections over three months, the tumor in her left lung shrank by over 60 percent. Over the next few months, her tumors kept shrinking and eventually, after two and a half years of living with an intense fear of dying, she was told that she was in remission—her cancer could no longer be detected. The treatment doesn’t work for everyone but, Allison says, “We’re going to cure certain types of cancers. We’ve got a shot at it now.”

More here.

Thursday Poem

Reading Moby-Dick at 30,000 Feet

At this height, Kansas
is just a concept,
a checkerboard design of wheat and corn

no larger than the foldout section
of my neighbor’s travel magazine.
At this stage of the journey

I would estimate the distance
between myself and my own feelings
is roughly the same as the mileage

from Seattle to New York,
so I can lean back into the upholstered interval
between Muzak and lunch,

a little bored, a little old and strange.
I remember, as a dreamy
backyard kind of kid,

tilting up my head to watch
those planes engrave the sky
in lines so steady and so straight

they implied the enormous concentration
of good men,
but now my eyes flicker

from the in-flight movie
to the stewardess’s pantyline,
then back into my book,

where men throw harpoons at something
much bigger and probably
better than themselves,

wanting to kill it,
wanting to see great clouds of blood erupt
to prove that they exist.

Imagine being born and growing up,
rushing through the world for sixty years
at unimaginable speeds.

Imagine a century like a room so large,
a corridor so long
you could travel for a lifetime

and never find the door,
until you had forgotten
that such a thing as doors exist.

Better to be on board the Pequod,
with a mad one-legged captain
living for revenge.

Better to feel the salt wind
spitting in your face,
to hold your sharpened weapon high,

to see the glisten
of the beast beneath the waves.
What a relief it would be

to hear someone in the crew
cry out like a gull,
Oh Captain, Captain!
Where are we going now?

by Tony Hoagland
from Donkey Gospel
Graywolf Press, Saint Paul, Minnesota

Tony Hoagland
1953 – 2018

Wednesday, October 24, 2018

Looking at the world as a whole: Mary Midgley, 1919-2018

James Garvey in Prospect:

Just a few weeks before her death in October, Mary Midgley agreed to meet and discuss her new book, What Is Philosophy For? It seemed astonishing that someone about to celebrate her 99th birthday had a new book out, but I was less in awe of that than of the reputation of one of the most important British philosophers of the 20th century and beyond.

People who have encountered Midgley often use the word “formidable” to describe her. Journalist Andrew Brown called her “the most frightening philosopher in the country: the one before whom it is least pleasant to appear a fool.” During my email correspondence with her to set up a date to talk about those philosophical problems “which are exercising both me and the public,” she worried that publications like the ones I write for “occasionally give rather half-witted answers to large questions of this kind.”

A lot of people were on the receiving end of her sharp intellect. She made puncturing scientific pretension into an art form—going after Francis Crick, co-discoverer of DNA’s structure, for saying that human behavior can be explained simply by the interactions of brain cells, the physicist Lawrence Krauss for claiming that only science can solve philosophical problems, those theorists who insist we must look to machines for our salvation, and, most famously, Richard Dawkins for the idea that a gene could be selfish.

In person, though, Midgley was kind, generous with her time and as engaged as ever with philosophical ideas—even if her voice was soft and she had a little trouble hearing me. She sat in an armchair, sipping tea, surrounded by books. Having just celebrated her approaching birthday with friends and family, she had a kitchen full of cakes.

More here.

‘World’s oldest fossils’ may just be pretty rocks

Maya Wei-Haas in National Geographic:

In 2016, a series of unassuming stone shapes rocked the paleobiology world when they were declared the earliest fossilized life yet found. Standing up to 1.6 inches tall, the triangular forms line up like a string of inverted flags in an outcrop on the southwest coast of Greenland that dates back 3.7 billion years.

“If these are really the figurative tombstones of our earliest ancestors, the implications are staggering,” NASA astrobiologist Abigail Allwood wrote in a review article that accompanied the Nature study announcing the find. The microbes that made these fossils are over 200 million years older than the most widely accepted evidence of fossil life and would have lived a geologic blink of an eye after asteroids had blasted Earth’s early surface. Evidence of critters from this time would suggest that “life is not a fussy, reluctant, and unlikely thing,” Allwood wrote. “Give life half an opportunity, and it’ll run with it.”

But even as Allwood penned these words, she had a nagging sense that something was amiss.

More here.

History for a Post-Fact America

Alex Carp in the New York Review of Books:

What was America? The question is nearly as old as the republic itself. In 1789, the year George Washington began his first term, the South Carolina doctor and statesman David Ramsay set out to understand the new nation by looking to its short past. America’s histories at the time were local, stories of states or scattered tales of colonial lore; nations were tied together by bloodline, or religion, or ancestral soil. “The Americans knew but little of one another,” Ramsay wrote, delivering an accounting that both presented his contemporaries as a single people, despite their differences, and tossed aside the assumptions of what would be needed to hold them together. “When the war began, the Americans were a mass of husbandmen, merchants, mechanics and fishermen; but the necessities of the country gave a spring to the active powers of the inhabitants, and set them on thinking, speaking and acting in a line far beyond that to which they had been accustomed.” The Constitution had just been ratified at the time of Ramsay’s writing, the first system of national government submitted to its people for approval. “A vast expansion of the human mind speedily followed,” he wrote. It hashed out the nation as a set of principles. America was an idea. America was an argument.

The question has animated American history ever since. “For the last half century,” the historian and essayist Jill Lepore told an interviewer in 2011, academic historians have been trying “to write an integrated history of the United States, a history both black and white, a history that weaves together political history and social history, the history of presidents and the history of slavery.” Over the same period, a generation of Americans have had their imaginations narrowed, on one side by populist myths blind to the evidence of the past, and on the other by academic histories blind to the power of stories. Why, at a time when facts are more accessible than at any other point in human history, have they failed to provide us with a more broadly shared sense of objective truth?

More here.

The Art of Anni Albers

Lynne Cooke at Artforum:

IN A 1985 INTERVIEW, Anni Albers remarked, “I find that, when the work is made with threads, it’s considered a craft; when it’s on paper, it’s considered art.” This was her somewhat oblique explanation of why she hadn’t received “the longed-for pat on the shoulder,” i.e., recognition as an artist, until after she gave up weaving and immersed herself in printmaking—a transition that occurred when she was in her sixties. It’s hard to judge whether Albers’s tone was wry or rueful or (as one critic alleged) “somewhat bitter,” and therefore it’s unclear what her comment might indicate about the belatedness of this acknowledgment relative to her own sense of her achievement. After all, she had been making “pictorial weavings”—textiles designed expressly as art—since the late 1940s. Though the question might now seem moot, it isn’t, given the enduring debates about the hierarchical distinctions that separate fine art from craft, and given the still contested status of self-identified fiber artists who followed in Albers’s footsteps and claimed their woven forms as fine art, tout court.

more here.