The Uproar Over ‘Transracialism’

Rogers Brubaker in The New York Times:

The world of academic philosophy is ordinarily a rather esoteric one. But Rebecca Tuvel’s article “In Defense of Transracialism,” published in the feminist philosophy journal Hypatia this spring, has generated a broad public discussion.

Dr. Tuvel was prompted to write her article by the controversy that erupted when Rachel Dolezal, the former local N.A.A.C.P. official who had long presented herself as black, was revealed to have grown up white. The Dolezal story broke just 10 days after Caitlyn Jenner’s Vanity Fair debut, and the two discussions merged. If Ms. Jenner could identify as a woman, could Ms. Dolezal identify as black? If transgender was a legitimate social identity, might transracial be as well? Dr. Tuvel’s article subjected these public debates to philosophical scrutiny.

The idea of transracialism had been rejected out of hand by the cultural left. Some worried — as many cultural conservatives indeed hoped — that this seemingly absurd idea might undermine the legitimacy of transgender claims. Others argued that if self-identification were to replace ancestry or phenotype as the touchstone of racial identity, this would encourage “racial fraud” and cultural appropriation. Because race has always been first and foremost an externally imposed classification, it is understandable that the idea of people declaring themselves transracial struck many as offensively dismissive of the social realities of race.

More here.



Mud

Over at Noah Smith's site:

[T]he demand to "go read the vast literature" could also be eminently unreasonable. Just because a lot of papers have been written about something doesn't mean that anyone knows anything about it. There's no law of the Universe stating that a PDF with an abstract and a standard LaTeX font contains any new knowledge, any unique knowledge, or, in fact, any knowledge whatsoever. So the same is true of 100 such PDFs, or 1000.
There are actual examples of vast literatures that contain zero knowledge: Astrology, for instance. People have written so much about astrology that I bet you could spend decades reading what they've written and not even come close to the end. But at the end of the day, the only thing you'd know more about is the mindset of people who write about astrology. Because astrology is total and utter bunk.
But astrology generally isn't worth talking or thinking about, either. The real question is whether there are interesting, worthwhile topics where reading the vast literature would be counterproductive – in other words, where the vast literature actually contains more misinformation than information.
There are areas where I suspect this might be the case. Let's take the obvious example that everyone loves: Business cycles. Business cycles are obviously something worth talking about and worth knowing about. But suppose you were to go read all the stuff that economists had written about business cycles in the 1960s. A huge amount of it would be subject to the Lucas Critique. Everyone agrees now that a lot of that old stuff, probably most of it, has major flaws. It probably contains some real knowledge, but it contains so much wrong stuff that if you were to read it thinking "This vast literature contains a lot of useful information that I should know," you'd probably come out less informed than you went in.
More here.

Israel-Palestine: the real reason there’s still no peace

Nathan Thrall in The Guardian:

Scattered over the land between the Jordan river and the Mediterranean Sea lie the remnants of failed peace plans, international summits, secret negotiations, UN resolutions and state-building programmes, most of them designed to partition this long-contested territory into two independent states, Israel and Palestine. The collapse of these initiatives has been as predictable as the confidence with which US presidents have launched new ones, and the current administration is no exception.

In the quarter century since Israelis and Palestinians first started negotiating under US auspices in 1991, there has been no shortage of explanations for why each particular round of talks failed. The rationalisations appear and reappear in the speeches of presidents, the reports of thinktanks and the memoirs of former officials and negotiators: bad timing; artificial deadlines; insufficient preparation; scant attention from the US president; want of support from regional states; inadequate confidence-building measures; coalition politics; or leaders devoid of courage.

Among the most common refrains are that extremists were allowed to set the agenda and there was a neglect of bottom-up economic development and state-building. And then there are those who point at negative messaging, insurmountable scepticism or the absence of personal chemistry (a particularly fanciful explanation for anyone who has witnessed the warm familiarity of Palestinian and Israeli negotiators as they reunite in luxury hotels and reminisce about old jokes and ex-comrades over breakfast buffets and post-meeting toasts). If none of the above works, there is always the worst cliche of them all – lack of trust.

Postmortem accounts vary in their apportioning of blame. But nearly all of them share a deep-seated belief that both societies desire a two-state agreement, and therefore need only the right conditions – together with a bit of nudging, trust-building and perhaps a few more positive inducements – to take the final step.

More here.

Michel Foucault Investigates

Duncan Kelly at the Times Literary Supplement:

In 1970, after various appointments in France, Germany, Poland, Sweden and Tunisia, the French philosopher and epistemologist Michel Foucault took a Chair at the Collège de France in Paris. His job title was Professor of the History of Systems of Thought, and his inaugural lecture offered a retrospect and prospect of what that meant to him. Yet only by the end of the 1970s, in a recap of a course given on the birth of modern “biopolitics”, published in English as “History of Systems of Thought” (1979), did Foucault explain what this meant more explicitly. Asking how, from the eighteenth century onwards, governmental practices had sought to rationalize the attention they paid to their subjects and citizens, he considered the range of policies and systems of thought that justified them, targeting the practical problems of governing a population (health, hygiene, care and welfare, births, deaths, diseases, etc). These were forms of “gov­ernmentality” and, he continued, they were “inseparable” as systems of thought from the dominant form of “political rationality” that overlay them, namely, modern “liberalism”. The history of systems of thought, it turns out, covers it all.

If Foucault wanted to cover it all, that is also the ambition of the new two-volume Pléiade edition of his works. He has become part of the classic modern canon of French culture, fixed on pages tracing-paper thin in an eye-wateringly small font. Such enterprises more often than not kill their subjects on the page, sanctifying them as objects of devotion, rather than reviving their earlier words as the weapons they once were. The rather conventional retrospectives puffing the publication of these volumes in the French press indicated ambivalence about Foucault’s contemporary relevance.

More here.

Bullfighting with Picasso

Martin Gayford at The Spectator:

Picasso was an aficionado, an ardent devotee of the sport; similarly, Richardson himself is an aficionado of Picasso, combining vast knowledge of his work with an intimate day-to-day acquaintance with the artist, a sense of how Picasso thought and felt. It is now 37 years since he embarked on his immense Life of Picasso, of which three volumes have appeared to date (the third taking the narrative only up to 1932).

Richardson was drawn to the work even before he encountered the man. At 14 years old, still a schoolboy at Stowe, he saw a copy of ‘La Minotauromachie’ (1935), at Zwemmer’s bookshop on Charing Cross Road. Now regarded as Picasso’s greatest print, this etching was then recent and priced at £50. He asked his mother for an advance on his allowance to buy it, but instead she gave Mr Zwemmer a dressing-down: ‘It’s a disgrace trying to palm off this stuff on children!’

The minotaur, like the bullfight, was a frequent theme in Picasso’s art. Together these two interlinked subjects — the bull-man and matador killing the bull — make up the Gagosian exhibition. Both are deeply connected with Mediterranean culture, going back to ancient Crete, and also with Picasso’s psyche. As Richardson points out, the artist seems to empathise now with the bull, now with the matador, and often with the Minotaur. Did Picasso identify with this horned and hirsute monster? ‘God, yes!’

More here.

Bosch & Bruegel: From Enemy Painting to Everyday Life

Tim Smith-Laing at Literary Review:

They might seem an incongruous pair at first, but historically speaking Hieronymus Bosch and Pieter Bruegel the Elder are a natural duo for comparative study. When Bruegel entered the painters’ guild of Antwerp in 1551, Bosch, who had died in 1516, was still the most famous and imitated artist of the age. Antwerp, the centre of European art production at the time, was home to a whole mini-industry of Bosch imitation and forgery, and Bruegel himself cashed in on the continuing demand for his predecessor’s characteristic style. Look at the Boschian pastiches of his ‘Seven Deadly Sins’ series (1558) or 1557’s Big Fish Eat Little Fish, printed with the misleading inscription ‘Hieronymus Bos inventor’, and you can see why a contemporary dubbed Bruegel a ‘second Hieronymus’.

Imitation is not the same as kinship, though, as Koerner is quick to note in this rich and illuminating study of the two painters. Bruegel’s mature style, clear-eyed, intent on the human, temporal and mundane, is a world away from Bosch’s fantasias of the demonic, eternal and infernal. And more fundamentally still, Koerner argues, they belong to two different ages of art history and represent two distinctive conceptions of what art was for. Bosch was a devotional Catholic painter (despite what Koerner pithily calls the ‘rich body of delusional scholarship’ striving to make him a heretic or madman), and he belonged to an age in which artistic ‘subservience to the sacred’ was the norm. Bruegel marks the ‘watershed’ at which European painting ‘emancipated itself’ from that subservience – thanks in no small part to the démarches he himself made. Retrospectively at least, his ‘genre’ scenes of everyday life, his prints and his landscapes seem instrumental in art’s successive migrations from the church to the palace, then to the home and eventually to the museum.

More here.

The true history of fake news

Tom Standage in More Intelligent Life:

Giant man-bats that spent their days collecting fruit and holding animated conversations; goat-like creatures with blue skin; a temple made of polished sapphire. These were the astonishing sights witnessed by John Herschel, an eminent British astronomer, when, in 1835, he pointed a powerful telescope “of vast dimensions” towards the Moon from an observatory in South Africa. Or that, at least, was what readers of the New York Sun were told in a series of newspaper reports.

This caused a sensation. People flocked to buy each day’s edition of the Sun. The paper’s circulation shot up from 8,000 to over 19,000 copies, overtaking the Times of London to become the world’s bestselling daily newspaper. There was just one small hitch. The fantastical reports had in fact been concocted by Richard Adams Locke, the Sun’s editor. Herschel was conducting genuine astronomical observations in South Africa. But Locke knew it would take months for his deception to be revealed, because the only means of communication with the Cape was by letter. The whole thing was a giant hoax – or, as we would say today, “fake news”.

This classic of the genre illuminates the pros and cons of fake news as a commercial strategy – and helps explain why it has re-emerged in the internet era. That fake news shifted copies had been known since the earliest days of printing. In the 16th and 17th centuries, printers would crank out pamphlets, or newsbooks, offering detailed accounts of monstrous beasts or unusual occurrences. A newsbook published in Catalonia in 1654 reports the discovery of a monster with “goat’s legs, a human body, seven arms and seven heads”; an English pamphlet from 1611 tells of a Dutch woman who lived for 14 years without eating or drinking. So what if they weren’t true? Printers argued, as internet giants do today, that they were merely providing a means of distribution, and were not responsible for ensuring accuracy.

More here.

Can Plants Hear?

Marta Zaraska in Scientific American:

Pseudoscientific claims that music helps plants grow have been made for decades, despite evidence that is shaky at best. Yet new research suggests some flora may be capable of sensing sounds, such as the gurgle of water through a pipe or the buzzing of insects.

In a recent study, Monica Gagliano, an evolutionary biologist at the University of Western Australia, and her colleagues placed pea seedlings in pots shaped like an upside-down Y. One arm of each pot was placed in either a tray of water or a coiled plastic tube through which water flowed; the other arm had only soil. The roots grew toward the arm of the pipe with the fluid, regardless of whether it was easily accessible or hidden inside the tubing. “They just knew the water was there, even if the only thing to detect was the sound of it flowing inside the pipe,” Gagliano says. Yet when the seedlings were given a choice between the water tube and some moistened soil, their roots favored the latter. Gagliano hypothesizes that these plants use sound waves to detect water at a distance but follow moisture gradients to home in on their target when it is closer.

More here.

Wednesday, May 17, 2017

Who’s Afraid of the White Working Class?: On Joan C. Williams’s “White Working Class: Overcoming Class Cluelessness in America”

David Roediger in the LA Review of Books:

From its title onward, White Working Class suffers under problems with accuracy-in-labeling. The book is not about the working class in any meaningful sense. Its treatment of race is, at best, fleeting. Regarding the former, Williams arrives at a definition of the working class that is neither traditional and coherent nor usefully innovative. She expels the poor, wage earning or not, from the ranks of the working class and shuts the very rich out of the ranks of those holding it back. Income alone, not the more meaningful measure of wealth, defines her answer to the question “Who Is the Working Class?” The bottom third and top 20 percent are excluded, with an exception made for those making more but not having college degrees. The result is a “class” defined by making $41,005 to $131,962 annually (median: $75,144), and by holding values alternately seen as understandable or wonderful.

Calling this group the “white working class,” rather than the middle class, is strange given that its middle-ness is precisely what defines it. The major US scholar whose work most resembles Williams’s is the late-in-life and rightward-moving Christopher Lasch. But Lasch was careful to call the object of his romanticization and defense the “lower middle class.” Williams explains that she too preferred to use “middle class.” But the book’s editor objected that this was unclear, so Williams decided to use “working class.” Nevertheless, she invites readers to understand that her object of study is really the “true middle class,” shorn of its snobbish, college-educated professional-managerial eliteness. As a marketing ploy, White Working Class is also not a bad eye-catcher.

The level of confusion thus introduced is very high. At one point, casting about for areas of unity between the working class and the poor, Williams expresses her hope that restaurant owners will oppose Trump’s draconian border measures in order to better secure immigrant labor. For those still trying to keep score, the restaurant owners are somehow working class, while their immigrant laboring employees are somehow not. Nevertheless, at certain junctures Williams cannot resist taking up the cause of “white trash” who are maligned by elites but not, by her own definitions, working class. (In actual working-class families, of course, the lines between the deserving and undeserving, the too-honorable-for-welfare and the dissolute, and even the churched and unchurched are nothing like as clear as Williams supposes. Actual working-class lives usually change — hillbilly elegies and Charles Murray notwithstanding — for reasons having precious little to do with a worker’s character.)

More here.

It’s Not Just Profit Wrecking American Healthcare

Maggie Mahar in INET Economics:

Americans can be glad that the Affordable Care Act has brought medical care to millions of the previously uninsured and underinsured. Nevertheless, our healthcare system remains hugely expensive and wildly inefficient.

Too often, care isn’t well coordinated. Meanwhile, we don’t have enough unbiased research showing which treatments are most effective. Everyone just assumes that the newest, most expensive product or procedure must be the best. As a result, we now spend over $3.3 trillion a year, or roughly $10,000 per person, on healthcare — more than any other nation on the planet.

Why is our bill so high? It begins with overtreatment.

In Medicine, More is Not Better

In the laissez-faire chaos that we call a healthcare “system,” one out of three dollars is squandered on unnecessary tests, over-priced drugs, and treatments that provide little or no benefit to patients.

This may seem an outrageous statement, but nearly three decades of research done by doctors at Dartmouth’s Medical School demonstrate the waste. (Others, including McKinsey, the New England Institute of Medicine, and Dr. Donald Berwick, former acting director of the Centers for Medicare and Medicaid, confirm their estimate: 30 percent of the money we lay out for medical products and services does nothing to improve patients’ outcomes.) Part of the problem is that some providers simply order more tests and recommend more surgeries than others, yet research shows that their patients fare no better.

It doesn’t seem to matter whether a government-controlled program like Medicare or private insurers are paying the bills; in recent decades medical spending has headed toward the skies. From 1970 to 2006 private insurers’ reimbursements for care soared by an average of 9.7 percent each and every year. Medicare pays doctors and hospitals less, but even so, Medicare payments for medical services and products climbed by an average of 8.7 percent annually.

The point is this: Neither the public sector nor the private sector has found a way to cut the waste and rein in the underlying cost of care. Since the Affordable Care Act passed in 2010, health care inflation has slowed. But not enough: it still outpaces economic growth.

More here.

My Family’s Slave: She lived with us for 56 years. She raised me and my siblings without pay. I was 11, a typical American kid, before I realized who she was

Alex Tizon in The Atlantic:

Her name was Eudocia Tomas Pulido. We called her Lola. She was 4 foot 11, with mocha-brown skin and almond eyes that I can still see looking into mine—my first memory. She was 18 years old when my grandfather gave her to my mother as a gift, and when my family moved to the United States, we brought her with us. No other word but slave encompassed the life she lived. Her days began before everyone else woke and ended after we went to bed. She prepared three meals a day, cleaned the house, waited on my parents, and took care of my four siblings and me. My parents never paid her, and they scolded her constantly. She wasn’t kept in leg irons, but she might as well have been. So many nights, on my way to the bathroom, I’d spot her sleeping in a corner, slumped against a mound of laundry, her fingers clutching a garment she was in the middle of folding.

More here.

Brian Greene on How Science Became a Political Prisoner

Nick Stockton in Wired:

Brian Greene is one of those physicists. You know the type: Blessed with a brain capable of untangling the mysteries of the universe, and a knack for clearly explaining it all to the rest of us schlubs.

His enthusiasm for doing these things keeps him quite busy, what with the three best-selling physics books for grown-ups, a children’s book about time dilation(!), a few TV specials, and, of course, a TED talk. Oh, and he and his wife have, since 2008, spearheaded an annual science-themed takeover of New York. The World Science Festival runs from May 30 to June 4, with talks, performances, and interactive events in all five boroughs.

An ambitious schedule, to be sure. Opening night includes a performance of Greene’s next book (working title: Until the End of Time) exploring humanity’s place in the unfolding universe. Other events explore how the brain works, and how to use the scientific method in the kitchen. A biologist will lead a sailboat cruise of New York Harbor, and Mario Livio will set up some barrel-sized telescopes in a bid to disprove every New Yorker’s fervent belief that you can’t stargaze in the city.

Greene hopes the festival both sates and stirs the public’s appetite for science. No easy feat these days, when politics has shifted science from something people do to something they march for, argue over, and believe in. I gave Greene a call to ask how the joy of science, and the thrill of figuring things out, might prevail when it seems science is under siege.

Given the debates over science right now, I’d like to start by asking a basic question: What is science?

Science is our most powerful tool for evaluating what’s true in the world. It’s a perspective on reality that allows you to grasp what’s right and what’s not. And, in the best of cases, use that knowledge to manipulate and control the world to the betterment of everyone.

More here.

White supremacy is everywhere: How do we fight a concept that has so thoroughly permeated our politics and culture?

Anis Shivani in Salon:

In the first part of this series, I focused on some of the history of white supremacy, particularly its late 20th-century versions, which continue to have so much influence today upon the current alt-right movement. It’s important to understand this history — some of which enters into truly exotic terrain — to understand the continuity of ideas, and to realize that we are not facing anything really new in the current manifestation of white supremacy.

But there’s a more mundane side to white supremacy, which deserves to be studied with as much attention: the way in which white supremacy works in and through institutions that we otherwise think of as legitimate to the core, and even essential to the workings of liberal democracy. If we explore how this has occurred recently, then we can no longer push white supremacy aside as an ideology that can be prevented from infecting so-called “mainstream” institutions. I’m thinking primarily of political parties, but once we admit that white supremacy is a fundamental influence on how parties reinvent and calibrate themselves, then this necessarily sweeps the social organism as a whole into the indictment.

White supremacy implies a certain logic that is inimical to that of the Enlightenment (the foundation of modern democracy). It is no coincidence that much of contemporary white supremacy continues to focus on the Illuminati and Freemasons as the disseminators of “secular humanism” (i.e., the core values of the Enlightenment), or that conspiracy theory mines the same territory when it takes on “The Protocols of the Elders of Zion” (attacked as a worldwide conspiracy to bring about godless materialism) or such obsessions as the Bilderberg Group, the Trilateral Commission, the Council on Foreign Relations and the rest of the institutions associated with the New World Order (largely meaning the forces of globalization). Against the Enlightenment, which is said to lead to the weakening of the nation as an embodiment of the pure idea of race, the white supremacist insists on separation of races as his natural right. Against mongrelization, the white supremacist desires purity.

More here.

A Generation of Sociopaths – how Trump and other Baby Boomers ruined the world

Jane Smiley in The Guardian:

The day before I finished reading A Generation of Sociopaths, who should pop up to prove Bruce Cannon Gibney’s point, as if he had been paid to do so, but the notorious Joe Walsh (born 1961), former congressman and Obama denigrator. In answer to talkshow host Jimmy Kimmel’s plea for merciful health insurance, using his newborn son’s heart defect as an example, Walsh tweeted: “Sorry Jimmy Kimmel: your sad story doesn’t obligate me or anyone else to pay for somebody else’s health care.” Gibney’s essential point, thus proved, is that boomers are selfish to the core, among other failings, and as a boomer myself, I feel the “you got me” pain that we all ought to feel but so few of us do.

Gibney is about my daughter’s age – born in the late 1970s – and admits that one of his parents is a boomer. He has a wry, amusing style (“As the Boomers became Washington’s most lethal invasive species … ”) and plenty of well parsed statistics to back him up. His essential point is that by refusing to make the most basic (and fairly minimal) sacrifices to manage infrastructure, address climate change and provide decent education and healthcare, the boomers have bequeathed their children a mess of daunting proportions. Through such government programmes as social security and other entitlements, they have run up huge debts that the US government cannot pay except by, eventually, soaking the young.

One of his most affecting chapters is about how failing schools feed mostly African American youth into the huge for-profit prison system. Someday, they will get out. There will be no structures in place to employ or take care of them.

The boomers have made sure that they themselves will live long and prosper, but only at the expense of their offspring. That we are skating on thin ice is no solace: “Because the problems Boomers created, from entitlements on, grow not so much in linear as exponential terms, the crisis that feels distant today will, when it comes, seem to have arrived overnight.” As one who has been raging against the American right since the election of Ronald Reagan, as someone with plenty of boomer friends who have done the same, I would like to let myself off the hook, but Gibney points out that while “not all Boomers directly participated, almost all benefited; they are, as the law would have it, jointly and severally liable”.

More here.

Century-old tumours offer rare cancer clues

Heidi Ledford in Nature:

Deep in the basement archives of London's Great Ormond Street Hospital for Children reside the patient records that cancer researcher Sam Behjati hopes will put the hospital's past to work for the future. On 2 May, he and his colleagues published the result: DNA sequences from the genomes of three childhood tumour samples collected at the facility almost a century ago. Those historic cells help to address a modern problem: the small number of tumour samples from rare cancers that are available for researchers to sequence. Behjati knows this problem well. At the Wellcome Trust Sanger Institute in Hinxton, UK, he tracks the genomic miswiring that can lead to rare childhood cancers. And as someone who also treats patients, he has been frustrated by the paucity of evidence backing up much of his practice. “The treatment regimens for children with rare cancers are essentially made up,” Behjati says. “If you’ve got three or four patients nationally, how are you ever going to conduct a reasonable clinical trial?”

To expand the pool of samples that he could sequence, he decided in 2014 to harness advances in genome sequencing that had already made it possible to sequence DNA from pathology samples a few decades old. The hospital's 165-year archive of samples and patient records provided the opportunity to see how far back in time he could go.

The work highlights the wealth of material that is available in such archives, says Danielle Carrick, a programme director at the US National Cancer Institute in Rockville, Maryland. Mining such archives can expand the options for studying rare conditions and understudied ethnic populations, she notes, and make large, population-scale studies possible. Researchers have analysed DNA from much older specimens: fragments of genome sequence have been used to study ancient human populations from hundreds of thousands of years ago. But DNA tends to degrade over time, and cancer researchers need high-quality sequences to pinpoint the many individual mutations that can contribute to tumour growth.

More here.

Tuesday, May 16, 2017

Have You Ever Had an Intense Experience of Mystical Communion with the Universe, Life, God, etc?

Louis Kahn’s work is accessible, minimal, simple, solid, systematic, and self-evident. It is also the exact opposite.

Thomas de Monchaux in n + 1:

Here are two things to know about architects. First, they are fastidious and inventive with their names. Frank Lincoln Wright was never, unlike Sinatra, a Francis. He swapped in the Lloyd when he was 18—ostensibly in tribute to his mother’s surname on the occasion of her divorce, but also to avoid carrying around the name of a still more famous man, and for that nice three-beat meter, in full anticipation of seeing his full name in print. In 1917, Charles-Edouard Jeanneret-Gris—who is to modern architecture what Freud is to psychoanalysis—was given the byline Le Corbusier (after corbeau, crow) by his editor at a small journal, so that he could anonymously review his own buildings. The success of the sock puppet critic meant that after the critiques were collected into a book-length manifesto, the nom-de-plume eventually took over Jeanneret-Gris’ architect persona, as well. Ludwig Mies—the inventor of the glass-walled skyscraper—inherited an unfortunate surname that doubled as a German epithet for anything lousy or similarly defiled. He restyled himself Miës van der Rohe—vowel-bending heavy-metal umlaut and all—with the Dutch geographical tussenvoegsel “van” from his mother’s maiden name to add a simulation of the German nobiliary particle, von. Ephraim Owen Goldberg became Frank Gehry.

Second, all architects are older than you think. Or than they want you to think. Unlike the closely adjacent fields of music and mathematics, architecture has no prodigies. Design and construction take time. At 40, an architect is just starting out. Dying at 72 in architecture is like dying at 27 in rock and roll.

More here.

The Myth That Humans Have Poor Smell Is Nonscents

Ed Yong in The Atlantic:

For years, John McGann has been studying the science of smell by working with rats and mice at Rutgers University. But when he turned his attention to humans, he was in for a shock. The common wisdom is that our sense of smell stinks, compared to that of other mammals. McGann had always suspected that such claims were exaggerated, but even he wasn’t prepared for just how acute his volunteers’ noses were. “We started with an experiment that involved taking two odors that humans can’t tell apart—and we couldn’t find any,” he says. “We tried odors that mice can’t tell apart and humans were like: No, we’ve got this.”

In a new paper, McGann joins a growing list of scientists who argue that human olfaction is nothing to sniff at. We can follow smell trails. We discriminate between similar odors and detect a wide range of substances, sometimes more sensitively than rodents and dogs can. We live in a rich world of scents and sensibility, where odors deeply influence our emotions and behavior. “I was taught in school that human olfaction isn’t a great sense,” he says. “It’s taught in introductory psychology courses and it’s in the textbooks. But this whole thing is a crazy myth.”

For this crime against olfaction, McGann accuses Paul Broca, a 19th-century French neuroscientist. Broca was a materialist who argued that the mind arose from the brain—a position that brought vigorous opposition from the Catholic Church, which believed in a separate and disembodied soul. This intellectual battle colored Broca’s interpretation of the brain.

For example, he noted that the lobes at the front of the human brain, which had been linked to speech and thought, are relatively bigger than those of other animals. By contrast, he noticed that our olfactory bulbs—a pair of structures that govern our sense of smell—are relatively smaller, flatter, and positioned less prominently.

More here.