A North American’s Guide to the Use and Abuse of the Modern PhD

by Colin Eatock

You applied to the program, and you got in. Then you spent the next four, six, eight or more years stroking the capricious egos of professors, jockeying for position within your peer group and marking bad undergraduate essays for the minimum wage. You completed the research, the grant applications, the writing, the comprehensive exams, and finally the defence.

You got through it all, somehow – and now it's yours. You walked across the stage at a graduation ceremony, and an Important Person in a robe gave you the paper scroll that made it official. You are no longer a Mr. or a Ms. Now, you are a Doctor. You have a PhD.

A PhD isn't just something you've acquired, it's something you've become. It's part of who you are – and you're proud that you've transformed yourself in a way that's meaningful to you. Now that you can hold it in your hands, you feel you are someone special, and you want to tell the whole world.

But can you – or should you? And if so, how?

This is where it gets tricky. Indeed, knowing when it is professionally and socially acceptable to “use” your PhD – to call yourself Doctor, and to hope to be addressed as such in return – is a minefield where values, conventions and contexts intersect in fluid and intricate ways. And nowhere has the question ever been more perplexing than in North America today.

Ironically, this issue is often less troublesome in parts of Europe, Asia and Latin America. In many societies, scholarship and professional rank are highly respected – and terms of address are an art form, requiring subtlety and precision. It would be tantamount to an insult to fail to address any kind of doctor as Doctor.

But in North America – where traditions are discarded, hierarchies are flouted, and everything is supposed to be so much easier as a result – the rules surrounding the PhD designation are as clear as mud. Today's freshly minted scholars stand on shifting sands, and often have no idea when or where – or even if – it is acceptable to casually slip the initials Dr. in front of their name.

Monday Poem

fireflies
.

every time a new intersection’s built
around me
it winds up bristling with cameras

omniscient light poles sprout
a cornucopia of lenses spills out

a thousand robot retinas
recording everyday ephemera

the little things we do in cars
we might not want the world to know about

all snagged, digitized
and set in a binary log
for some bureaucratic lout
to survey, to scrutinize

to better get to know
the fireflies in his jar
.

by Jim Culleny
5/11/13

Prospect For Ending Poverty

by Maniza Naqvi

It's a place of darkness. People are poor and hail from tribes and clans. They live in basic shelters in remote villages, with no running water or electricity, and no access to clinics. Subsisting on seasonal work, hunting and fishing to stock up on food for the lean months, they worship nature's beauty. They consider themselves hardy, descendants of those who suffered war, famine, and religious persecution. They resent that their part of the earth gets attention only through the prism of movies or when natural or manmade disasters strike. Then oil is found and they are blessed.

Nope, this is not one of the 53 countries in Africa. It is not a “fragile state,” the term often used for the African countries that are richest in oil, gas and other mineral resources but have the poorest citizens, afflicted by the curse of resources: foreign meddling, conflict, war, corruption and autocratic dictators. This is Prudhoe Bay, Alaska, in the 1970s.

In 1975 the Alaska legislature asked itself: Was it morally acceptable or ethical for the generation whose presence in Alaska coincided with the oil boom to get all the benefits, leaving the following generations to deal with the decline and fall? No, said the majority, who thought the Alaskans of the future should have a nest egg and be allowed to share in a temporary windfall from the finite oil resource.

Alaska set up the Alaska Permanent Fund (APF). In 1987, the APF was worth US$11 billion, and by 1997 it was US$24 billion, exceeding total state oil and gas revenues. As of March 2013 it was US$45.5 billion (here). The lesson is that, managed professionally, a national asset can grow into the future, beyond the finite resource. You can read the whole case study by Steve Cowper, a former Alaska Governor (no, not that one), in a book edited by the World Bank's Jennifer Johnson Calari here. In neighboring Alberta, in Canada, the Alberta Heritage Fund had been set up a year earlier (here).

Iran's Citizen Income Scheme (here), along the lines of the APF, is the largest in the world, providing universal cash transfers to all its citizens from its oil revenues. Per capita, $500 is transferred to over 75.3 million citizens, costing about $45 billion a year (here), which will amount to 15 percent of national income, while Alaska's average is 3-4 percent (here).

If ever there was a notion of a perfect nationalization, this would be it: to give to the people of a country what belongs to the Nation – its wealth and earnings – while making sure that the earnings keep growing for future generations. This is redistribution and growth of wealth. Other examples of such funds: the Future Generations Fund in Kuwait (here and here); Norway (here); the Diamond Empowerment Fund in Botswana (here); and the Permanent Wyoming Mineral Trust Fund (here). A list of countries and their sovereign wealth funds is here.

The skies and scenes of Südtirol: Photographs

by S. Abbas Raza

The light is very dramatic in the mountains of the Eisacktal at sunrise and sunset and changes from minute to minute. Even at a given time the sky can often look completely different in different directions. Yesterday evening I took a walk from my house in Brixen to the nearby village of Neustift and took some photographs along the way. Here they are without further commentary:

[Three photographs taken on the walk from Brixen to Neustift.]

Assad’s Castaways

J. Malcolm Garcia in Guernica:

Moutasem and Sarah watch their breath in the frigid February air. We are in the principal’s office of Muhammad al-Fatih, a secondary school for teenagers of Syrian refugees in Antakya, Turkey. The school has no heat, but it is better to freeze here than to be in Syria right now, my Syrian translator Hazim tells me. There, the army patrols villages and cities, killing suspected activists. Men, women, children. No one is safe. If the army could arrest the air, he says, it would.

Hazim is a Sunni Muslim, as are the students in the school and the rebels in the Free Syrian Army (FSA). The rebels have been battling the troops of President Bashar al-Assad since March 2011.

Syria’s Alawite minority and Sunni majority have been at odds for hundreds of years. The minority Alawites, like Assad, dominate Syria’s government, hold key military positions, and enjoy a disproportionate share of the country’s wealth.

Moutasem and Sarah, and other children I will meet in the coming days, have had their lives upended by a war made more complicated by centuries of ethnic rivalry.

Moutasem, fifteen, wears black shoes, pressed blue jeans, and a red wool sweater, and slouches, relaxed, in his chair. His eyes stare intently. When he sees me shiver in the cold, he offers me his coat.

Without Concepts

Richard Marshall interviews Edouard Machery in 3:AM Magazine:

3:AM: I think one way that we can immediately see the importance of your approach to philosophy and cognitive science is by discussing your work on racism. Racism has traditionally been thought of as either a question of nature – roughly, the thought that we're born to think in racial terms – or nurture – roughly, our culture, upbringing, environment constructs races, and that they don't exist in nature. You took the two research traditions, the nature tradition and the nurture tradition, and combined them. Can you say something about why you thought this combined approach was important at the time and what difference such an approach has made on research into this? Has it been an approach that has been well received by those in the previously opposing camps?

EM: Many social phenomena, such as racism, have been studied by, on the one hand, cultural anthropologists, sociologists, and historians, and on the other hand, by biologists and by evolutionary-minded behavioral scientists (anthropologists and psychologists). Sadly, these two traditions have failed to engage with one another, and, as a result, our understanding of many social phenomena remains incomplete. In my opinion, it is uncontroversial that social and psychological phenomena like racism or morality result from evolved cognitive structures, whose understanding requires an evolutionary perspective, but that many of their properties are the product of contingent historical trajectories. Integrating the two explanatory traditions is what I called the “integration challenge.” In my view, the theory of cultural evolution provides a framework for this integration.


[Photo of Gil-White from his home page.]

Racism is a case in point. As I have argued, following in part Gil-White's groundbreaking work, we have evolved a sensitivity to “ethnic markers” (roughly, to markers such as clothes, accent, etc., that indicate what cultural group one belongs to) and a motivation to interact preferentially with members of our own cultural group. Racism is a by-product of this evolved sensitivity and motivation, and it emerges when skin color and other physical properties trigger our sensitivity to ethnic markers.

This hypothesis is useful to understand the unity of a large range of social and psychological phenomena, whose fundamental identity has often been ignored, or even denied, by historians and cultural anthropologists. On the other hand, research in history and cultural anthropology is needed to understand the peculiarities of racism in different historical contexts.

How did the conservative ideas of Friedrich Hayek and the Austrian school become our economic reality?

Corey Robin in The Nation:

In the last half-century of American politics, conservatism has hardened around the defense of economic privilege and rule. Whether it’s the libertarianism of the GOP or the neoliberalism of the Democrats, that defense has enabled an upward redistribution of rights and a downward redistribution of duties. The 1 percent possesses more than wealth and political influence; it wields direct and personal power over men and women. Capital governs labor, telling workers what to say, how to vote and when to pee. It has all the substance of noblesse and none of the style of oblige. That many of its most vocal defenders believe Barack Obama to be their mortal enemy—a socialist, no less—is a testament less to the reality about which they speak than to the resonance of the vocabulary they deploy.

The Nobel Prize–winning economist Friedrich Hayek is the leading theoretician of this movement, formulating the most genuinely political theory of capitalism on the right we've ever seen. The theory does not imagine a shift from government to the individual, as is often claimed by conservatives; nor does it imagine a simple shift from the state to the market or from society to the atomized self, as is sometimes claimed by the left. Rather, it recasts our understanding of politics and where it might be found. This may explain why the University of Chicago chose to reissue Hayek's The Constitution of Liberty two years ago, after the fiftieth anniversary of its publication. Like The Road to Serfdom (1944), which a swooning Glenn Beck catapulted to the bestseller list in 2010, The Constitution of Liberty is a text, as its publisher says, of “our present moment.”

But to understand that text and its influence, it’s necessary to turn away from contemporary America to fin de siècle Vienna.

More here.

Intricate Blueprints of Flowers

From Smithsonian:

The worlds of architecture and scientific illustration collided when Macoto Murayama was studying at Miyagi University in Japan. The two have a great deal in common, as far as the artist's eye could see; both architectural plans and scientific illustrations are, as he puts it, “explanatory figures” with meticulous attention paid to detail. “An image of a thing presented with massive and various information is not just visually beautiful, it is also possible to catch an elaborate operation involved in the process of construction of this thing,” Murayama once said in an interview. In a project he calls “Inorganic flora,” the 29-year-old Japanese artist diagrams flowers. He buys his specimens—sweetpeas (Lathyrus odoratus L.), Asiatic dayflowers (Commelina communis L.) and sulfur cosmos (Cosmos sulphureus Cav.), to name a few—from flower stands or collects them from the roadside. Murayama carefully dissects each flower, removing its petals, anther, stigma and ovaries with a scalpel. He studies the separate parts of the flower under a magnifying glass and then sketches and photographs them.

Using 3D computer graphics software, the artist then creates models of the full blossom as well as of the stigma, sepals and other parts of the bloom. He cleans up his composition in Photoshop and adds measurements and annotations in Illustrator, so that in the end, he has created nothing short of a botanical blueprint.

More here.

The Radical History Of Mother's Day

From The Huffington Post:

At first glance, Mother's Day appears a quaint and conservative holiday, a sort of greeting card moment, honoring 1950s values, a historical throwback to old-fashioned notions of hearth and home. Let's correct that impression by saying: Happy Radical Mother's Day. In May 1907, Anna Jarvis, a member of a Methodist congregation in Grafton, West Virginia, passed out 500 white carnations in church to commemorate the life of her mother. One year later, the same Methodist church created a special service to honor mothers. Many progressive and liberal Christian organizations – like the YMCA and the World Sunday School Association – picked up the cause and lobbied Congress to make Mother's Day a national holiday. And, in 1914, Democratic President Woodrow Wilson made it official and signed Mother's Day into law. Thus began the modern celebration of Mother's Day in the United States.

For some years, radical Protestant women had been agitating for a national Mother's Day, hoping that it would further a progressive political agenda that favored issues related to women's lives. In the late 19th century, Julia Ward Howe (better known for the “Battle Hymn of the Republic”) expressed this hope in her 1870 prose-poem, “A Mother's Day Proclamation,” calling women to pacifism and political resistance:

Arise then…women of this day!
Arise, all women who have hearts! Whether your baptism be of water or of tears!
Say firmly…
“Disarm! Disarm!
The sword of murder is not the balance of justice.”
Blood does not wipe out dishonor,
Nor violence indicate possession.
As men have often forsaken the plough and the anvil
At the summons of war,
Let women now leave all that may be left of home
For a great and earnest day of counsel.
Let them meet first, as women, to bewail and commemorate the dead.
Let them solemnly take counsel with each other as to the means
Whereby the great human family can live in peace…
Each bearing after his own time the sacred impress, not of Caesar, But of God –

Years later, Anna Jarvis intended the new holiday to honor all mothers beginning with her own–Anna Reeves Jarvis, who had died in 1905. Although now largely forgotten, Anna Reeves Jarvis was a social activist and community organizer who shared the political views of other progressive women like Julia Ward Howe.

More here.

the algerian camus

As the writings in “Algerian Chronicles” make clear, Camus’s position in “no man’s land” left him increasingly isolated: hated by the right for his condemnation of government policies, scorned by the left for his inability to imagine an independent Algeria from which the French would be absent. Kaplan’s introduction traces the evolution of Camus’s positions on the Algerian conflict, as well as the ups and downs of critics’ judgments of them. While Camus’s first readers saw him as a philosopher concerned with universal questions of human existence, some influential critics writing after the 1970s considered him a typical pied noir (the usual, sometimes pejorative designation for French people from Algeria), whose works present a colonialist perspective. In recent years, however, the pendulum has swung back; Kaplan notes that the bloody civil war of the 1990s in Algeria has made many Algerian intellectuals appreciate Camus’s steadfast rejection of violence, even when it is committed in the name of high principles. Obviously, this does not apply only to Algeria. Some of the most memorable pages here restate an argument Camus had already developed at length in “The Rebel”: not all means are acceptable, even when employed for noble ends; terrorism and torture destroy the very goals they are supposed to serve.

more from Susan Rubin Suleiman at the NY Times here.

everest

Since it was first climbed, Everest has generated a mountain of books that would take a determined character, backed by a team of hardy librarians, to conquer. From the first flurry by expedition members – John Hunt’s The Ascent of Everest (1953); Wilfrid Noyce’s South Col: One Man’s Adventure on the Ascent of Everest 1953 (1954); Edmund Hillary’s High Adventure (1955) and their embedded Times reporter Jan Morris’s Coronation Everest (1958) – to the current blizzard of books, what all have sought to do in their way is make sense of the achievement: why it matters, what it says about those who took part and what it means to the rest of us. Everest has long been a very British obsession. The deaths of George Mallory and Sandy Irvine near the top in 1924 had only made it more so: there was a corner of a foreign mountain that was forever England. As Wade Davis explored in Into the Silence (2011), his Samuel Johnson Prize-winning account of the expeditions of the early 1920s, those years saw climbers with a very militaristic mindset – an outlook born of their experiences in the trenches – attempt to conquer the mountain.

more from Carl Wilkinson at the FT here.

easy is back

When last we saw Walter Mosley’s detective Easy Rawlins, he had just lost control of a car he was driving on the Pacific Coast Highway north of Malibu. This was in the closing pages of the 11th (and apparently final) Rawlins book, “Blonde Faith,” published in 2007. “The back of my car hit something hard,” Easy tells us, “a boulder no doubt. Something clenched down on my left foot and pain lanced up my leg. I ignored this, though, realizing that in a few seconds, I’d be dead.” And yet, six years later, Easy is back, narrating a new novel, “Little Green” (Doubleday: 292 pp., $25.95), that picks up where “Blonde Faith” left off. He is, if not entirely alive, then at least present, navigating a 1967 Los Angeles he barely recognizes in the wake of both the Watts riots and the Summer of Love. “It was great,” Mosley enthuses, “because for all intents and purposes, Easy was dead. And when he came back to consciousness, he felt dead. … Most of my novels are about redemption. But ‘Little Green’ is about resurrection. And so, I naturally followed it, from having him wake up dead to, at the end of the book, actually being alive.”

more from David L. Ulin at the LA Times here.

After Catastrophe

From The Chronicle of Higher Education:

Consider what has hit us hardest in recent years, how some of these disruptions came from or led to other woes: September 11, 2001; the 2003 Northeast blackout; the oil shock of 2008; the mortgage crisis and the Great Recession; Deepwater Horizon; the intense droughts; Hurricanes Katrina, Irene, and Sandy. There are surely more disruptions to come. Stephen E. Flynn, a security expert and former military officer who is co-director of the George J. Kostas Research Institute for Homeland Security at Northeastern University, ticks off the most likely threats: a breakdown in the power grid; interruption of global supply chains, including those that provide our food; an accident at one of the many chemical factories in urban areas; or damage to the dams, locks, and waterways that shuttle agricultural products and other goods out to sea. The No. 1 threat, he says, is a terrorist attack that prompts lawmakers and a frightened public to shred the Bill of Rights or overreact in another way.

The tendency in government has been to focus intensely on these threats—or other problems, considering the wars on cancer, poverty, drugs, crime, and so on—and to try to eliminate them. “If you look at the post-World War II era,” Flynn says, “there is almost an overarching focus on reducing risk and bringing risk down to zero,” the idea that this could be done “if you brought enough science and enough resources and you applied enough muscle.” Since 9/11, that policy has meant spending vast sums to go after terrorists out there, but perhaps we aren't safer.

“Why do we have all this money to go after man-made terrorist attacks, and then we let our bridges fall down?” Flynn wonders.

He advocates a different approach. We should make American society more robust so that it can absorb shocks and carry on. Part of that shift includes reorienting people's attitudes so that they are more willing to deal with these uncertainties. The generation before World War II accepted risk as a matter of life, he says. “They had less ambition or hubris to believe that you would contain all of these things,” he says, “and a measure of character was how you would deal with adversity, how you overcame it.” Obama's administration has picked up the resilience approach—particularly in a policy directive in February to emphasize “critical infrastructure security and resilience”—and his budget directs $200-million to help communities develop resilience to extreme weather and other effects of climate change. That took political courage, Flynn says, because many other government officials, particularly in security, see resilience as defeatist. “They believe our job is to prevent these things from happening,” says Flynn. “What we have seen is that we keep having big events that are profoundly disruptive and that we are woefully underprepared to deal with. That can't continue.”

More here.

On the Muslim Question

From The Guardian:

Anne Norton thinks that the “Muslim question” is, if anything, a question about non-Muslims. She is straightforward in denying the claim that Islam and the west are involved in a “clash of civilisations”, castigating writers of various political persuasions who have, blatantly or inferentially, put forward this view. She thus criticises writers such as John Rawls (as well as those, such as Michael Walzer and Michael Ignatieff, who “have urged them on”) for saying that Muslims constantly seek empire and territory, for stereotyping Muslims' political orientation as the antithesis of liberalism, and for promoting a false history that conceals liberalism's own failings. In an effort to find more common ground, she underwrites Derrida's assertion that Islam is “the other of democracy” because Muslim states could retain their distinctiveness while recognising Israel and promoting democratic values. And she surprisingly lauds Sayyid Qutb, the Islamic theorist executed by Nasser in Egypt, because “even this intolerant, fanatic man has something to teach us about human rights, human dignity, and equality”, given his support for private property and women in the workplace.

In a series of chapters on sexuality, freedom of speech and democracy, Norton recognises that valid differences of orientation exist. But she does not always help her own case by making assertions that are variously vague, trivial or wrong. For example, she says that terrorism is the precursor to democracy (as if the course of the Arab spring was inevitable), that randomness is “terrifying” (so much for evolutionists), that “Germany has no neo-Nazis” (when they number upwards of 5,000), that the publishers of the Danish cartoons “intended to provoke” (and not just insult) Muslims, that the veil is “profoundly erotic” (for elderly women?), or that calling your sports team the Redskins “honours an old enemy” (tell that to Native Americans). But if the clash-of-civilisations approach is false, what options exist for addressing the differences presented by a Muslim minority in a western country?

More here.