Ending the forever war on drugs: 2016 election edition

by Dave Maier

I had not intended to return to the issue of marijuana legalization so soon after my last such post, but this will be my last post before the November election, and there are ballot initiatives about marijuana legalization in no fewer than nine states – four medical marijuana (Arkansas, Montana, Florida, and North Dakota), and five “recreational” (California, Nevada, Arizona, Maine, and Massachusetts) – so here we go again. I won’t give a general argument for or against, but just give a sense of the wide variety of relevant issues. If you live in any of these states, please be sure to read the particular initiative carefully before voting.

Marijuana users constitute a small minority of the population, but recent polls have shown consistent majorities in favor of legalization. Not surprisingly, older voters and Republicans are less likely to support it, although not by large margins. More surprising is the opposition to the various legalization initiatives by those otherwise in favor. Why would one want to legalize marijuana but oppose an initiative which does that very thing?

For an answer, let us direct our browsers to noon1.me (“No on [Maine’s ballot proposition] 1”). Again, one might expect a site with that name to argue that it would be horrible to allow our citizens to freely take drugs and get high, and no doubt there are such sites (for a refresher on prohibitionism, see my previous post). Instead, its focus is mainly on the various regulations involved in, as proponents of the initiative put it, “regulating marijuana like alcohol”. These regulations are necessary, proponents believe, in order to sell the idea to non-users, and indeed the only successful initiatives to date (in Colorado, Washington, Alaska, Oregon, and DC), as well as the various initiatives likely to do well this year, carry a whole laundry list of regulations on public use, home growing, DUI limits, possession limits, and so on. This is especially true in California, where the tag line for this year’s effort (“Let’s get it right”) alludes to the apparently overly lax nature of the failed legalization initiative there in 2010.

Read more »

Mad and Mythical Dogs

by Genese Sodikoff

For thousands of years, people on every continent (save for uninhabitable Antarctica) have recognized the behavior of rabid animals and seen the ravages that rabies inflicts on the human mind and body. While the biological symptoms of rabies are universal, it, like many global diseases, manifests in different places with unique cultural markers and histories. These include everyday etiologies, or the ways people trace the origin of a disease or condition. They include the specific images or emotions expressed by victims in a feverish state, or the treatments applied to rabid animal bites. Beyond the cultural ideas and practices that shape any illness, rabies' origins and unpredictable incubation period, which can range anywhere from a week to months (or years!) before symptoms appear, invite the human imagination to fill in the blank.

In Madagascar, where I do anthropological fieldwork, rabies has been around since at least 1896, when the French colonized the island. Historian Eric T. Jennings writes that by 1899, a Pasteur Institute was established to forcefully combat human rabies, known as hydrophobia, but the virus was never eradicated. Jennings writes that to French colonial scientists experienced in treating rabies, Madagascar appeared to have a particularly acute and fast-spreading strain, requiring “more frequent injections of more active virus.” Rabid dogs in Madagascar appeared more ferocious than elsewhere, aiming right for the face.

Given the prevalence and history of rabies in Madagascar, I was surprised to learn that many Malagasy people (including doctors and veterinarians) attribute the viral source to a wild species that has only recently appeared on the landscape: a creature they call “little big chest” (kelibetratra). I refer specifically to people in the region of Moramanga District, about a three-hour drive east of the capital, Antananarivo, but knowledge of the kelibetratra as the rabies source extends far beyond this district.

The creature was described to me as a furtive wild dog from the rain forest that only roams late at night. It is built like a pit bull, but with shorter legs and a bigger thorax. Because of deforestation, they said, the animal has been scared out of its natural habitat into villages and towns, where it attacks pet dogs and cats, infecting them with rabies.

Read more »

Throw Your Vote Away

by Akim Reinhardt

To say this has been an interesting presidential election season would be an understatement. Regardless of who is declared president after the polls close three weeks from tomorrow, this is almost certainly a tussle that historians will pick over and analyze for decades to come, if not centuries. They're apt to do that when an election reveals deep fissures in society, as has this one.

But of course there's more to it than that. Donald Trump's candidacy is not just about a political outsider emerging as the champion of ostensible insiders (mostly white males) who have come to see themselves as disenchanted, frustrated outsiders amid long term changes in the national economy, culture, and demography. Among other things, it's also about a startlingly unqualified person taking the reins of a major party against the wishes of that party's leadership; an unleashing of various bigotries that have forced comfortable Americans to stop pretending racism and sexism aren't real problems; and the dramatic erosion of lines separating entertainment and politics.

Amid this whirlwind of upheaval, Hillary Clinton now seems very likely to win. Our Lady of the Establishment looks ever more presidential, partly in contrast to Trump's glaring ineptitude, but mostly because so many people find The Donald to be utterly contemptible. And a victory which, under more banal circumstances, might have been most noteworthy for the United States electing its first female president nearly a century after the 19th Amendment guaranteed women the right to vote, will now largely be seen as a moment when simple sanity held sway over startling lunacy.

Read more »

Lechery in the White House

by Leanne Ogasawara

On the day of the second presidential debate, my mom and I decided that it would be just too lewd for my son to watch.

I suppose I should mention my son is 14!

Never in my life have I seen anything like the insane circus surrounding this presidential debate, have you? With 24-hour-a-day coverage and the wild reaches of the Internet, it feels like the election is going to take down the entire country with it. I mean, I was just walking my poodle this morning, and I heard two guys in spandex shouting about Trump's latest outrages as they screamed past me on their bikes.

You can't get away from it. Not even in the days of Bill Clinton was there this level of lechery.

And so I totally agree with John Oliver, when he said we have reached a point so low in this election that we are now breaking through the earth's crust, where drowning in boiling magma will come as sweet, sweet relief.

Yep.

Of course, Oliver had taped his show before the world had started gleefully repeating “that word” over and over again. All of a sudden, “that word” was everywhere, to the point that the detestable Trump surrogate Scottie Nell Hughes was seen demurely asking CNN's Ana Navarro, “Please stop saying that word, because,” she explained, “my daughter is listening…”

Suffice it to say this did not go over well with Navarro, who angrily responded,

“You know what Scottie? Don’t tell me you’re offended when I say ‘pussy,’ but you’re not offended when Donald Trump says it!” Navarro shouted at Hughes. “I’m not running for president. He is.”

The CNN panel – along with millions of viewers – sat there stunned, because TRULY, you just can't make this stuff up!

Read more »

We’re All in This Together: Life as Jamie Knows It

by Bill Benzon

Cover image

Jamie is a young man in his early twenties. He has Down syndrome and is the son of Michael Bérubé and Janet Lyon, who teach at Penn State. Michael has just published Life as Jamie Knows It: An Exceptional Child Grows Up (Beacon 2016). Here’s how Michael characterizes his book (p. 16):

In the following pages, Jamie and I will tell you about his experiences at school, his evolving relationship with his brother, his demeanor in sickness and health, and his career as a Special Olympics athlete. And we’ll tangle with bioethics, politicians, philosophers, and a wide array of people we believe to be mistaken about some very important questions, such as whether life is worth living with a significant disability and whether it would be better for all the world if we could cure Down syndrome. (Quick preview: Yes. No.) But we will not tell you that Jamie is a sweet angel/cherub whose plucky triumphs over disability inspire us all. We will not tell you that special-needs children are gifts sent to special parents. And we will definitely not tell you that God never gives someone more than he or she can handle, because as a matter of fact, God does that all the time – whether through malice or incompetence I cannot say.

That’s a fair characterization of the book. There are stories about Jamie, lots of them, and some stories by Jamie in the Afterword. But there is also philosophy, especially in the final chapter, and discussions of disability policy, health care, education, and job-related issues. The stories about Jamie, his family, and friends both illuminate and motivate the more abstract discussions. Here and there, as you might already have deduced, Michael slips in a zinger, sometimes mild, sometimes hot and spicy.

In the interests of full disclosure I should tell you that Michael is a friend. While I’ve only seen him face-to-face once, I’ve known him online for some time, interacting with him through his now-defunct blog, American Airspace, where Jamie was a frequent topic of conversation, and through email about this and that, most recently about Jamie’s art – a topic we’ll get to in due course. Thus this is not an arm’s-length review. It is simply a discussion of issues raised by a thought-provoking and well-written book.

Read more »

Strained Analogies Between Recently Released Films and Current Events: Deepwater Horizon and the Second Presidential Debate

by Matt McKenna

Having watched the second presidential debate three days after watching Deepwater Horizon, I found it difficult to know which ninety minutes of entertainment showcased the greater disaster. Sure, Deepwater Horizon depicts the worst human-caused environmental disaster in United States history, but then the debate was something of a disaster itself. While both Deepwater Horizon and the debate were compelling to watch in a glad-that’s-not-me-on-screen sort of way, isn’t it strange that a movie about an oil rig fire caused by greed and avoidable mistakes somehow inspires more confidence in humanity than a debate between two people vying for the most influential job in the world?

Deepwater Horizon follows Mike Williams (Mark Wahlberg) and Jimmy Harrell (Kurt Russell) as they chopper in to start a three-week rotation working on the eponymous oil rig. When the two men finally reach the work site, they’re greeted by a smug BP suit named Vidrine (John Malkovich) who sends home the safety-check crew before they can perform the tests that would have precluded the upcoming catastrophe. And thus, the film’s protagonists and antagonists are quickly established: Mike and Jimmy are the heroes just trying to do their jobs, and Vidrine and the BP stooges are the villains willing to risk the safety of the workers for money. A bit of Googling reveals that the lead-up to the disaster in real life wasn’t quite as simple as the film portrays it, but the depiction of the disaster itself nonetheless seems pretty accurate: something goes wrong on the Deepwater Horizon, and it explodes.

Read more »

ANTHROPOCENE AND EMPIRE

Stacey Balkan in Public Books:

In the autumn of 1839, an unusually strong tropical storm devastated coastal communities along the Bay of Bengal in what was then the English East India Company’s premier settlement. A decade later, Company merchant and sometime scientist Henry Piddington coined the term “cyclone” to describe this climatological phenomenon, taking a cue from the seaborne storm’s circular movement and eerily hollow center, or “eye.” So common are cyclones in that part of the world that when a tornado—a typically smaller, terrestrial storm—ravaged the land-locked city of New Delhi in 1978, local newspapers erroneously identified the storm as a cyclone.

Amitav Ghosh was a graduate student at the time of the tornado and recounts its aftermath in a new monograph entitled The Great Derangement: Climate Change and the Unthinkable. His first book-length work of nonfiction in decades, The Great Derangement began as a series of lectures delivered last autumn at the University of Chicago. Focused in part on fictional representations of climate change, the author begins by addressing the bewildering absence of such storms from what he calls the “mansions” of “serious” fiction—an egregious oversight, he argues, given the proliferation of similarly catastrophic storms like Hurricane Sandy. Ultimately he asks: “Is climate change [simply] too wild a stream to be navigated in the accustomed barques of narration?”

More here.

Our world is awash in bullshit health claims and scientists want to train kids to spot them

Julia Belluz in Vox:

Over my years in health journalism, I’ve debunked many dubious claims. I’ve discussed how to cover quacks like Dr. Oz and the Food Babe, and how to navigate a medical world so filled with hooey it can make your head spin.

But I wasn’t always fluent in the ways of detecting bull. My eyes were opened in my early 20s, when I met a group of researchers at McMaster University in Canada. They taught me about the limitations of different kinds of evidence, why anecdotes are often wildly misleading, and what a well-designed study looks like. This experience changed how I see the world.

I’ve often wondered why these concepts aren’t taught in schools. We are bombarded with health claims — in the news, on TV, in magazines, at the doctor’s office or the pharmacy — and many of us lack the basic skills to navigate them.

That’s why I found this giant new trial, which is just wrapping up now in Uganda, so compelling. Its mission, according to Sir Iain Chalmers, the Cochrane Collaboration co-founder who’s co-leading it, is to teach children to “detect bullshit when bullshit is being presented to them.”

More here.

Why policy needs philosophers as much as it needs science

Adam Briggle and Robert Frodeman in The Guardian:

In a widely-discussed recent essay for the New Atlantis, the policy scholar Daniel Sarewitz argues that science is in deep trouble. While modern research remains wondrously productive, its results are more ambiguous, contestable and dubious than ever before. This problem isn’t caused by a lack of funding or of scientific rigour. Rather, Sarewitz argues that we need to let go of a longstanding and cherished cultural belief – that science consists of uniquely objective knowledge that can put an end to political controversies. Science can inform our thinking; but there is no escaping politics.

Sarewitz, however, fails to note the corollary to his argument: that a change in our expectations concerning the use of science for policy implies the need to make something like philosophical deliberation more central to decision making.

Philosophy relevant? We had better hope so. Because the alternative is value fundamentalism, where rather than offering reasons for our values, we resort to dogmatically asserting them. This is a prescription for political dysfunction – a result increasingly common on both sides of the Atlantic.

More here.

Trump Coalition After Election Day

Ryan Lizza in The New Yorker:

The man behind this new message is Steve Bannon, who became the C.E.O. of the Trump campaign in August. Bannon is on leave from Breitbart, the right-wing news site where he served as executive chairman, and where he honed a view of international politics that Trump now parrots. Bannon, who is sixty-two, is new to right-wing rabble-rousing, compared to someone like Stone. Bannon was raised in a blue-collar Democratic family around Norfolk and Richmond, Virginia. He served in the Navy, went to the Harvard Business School, and became wealthy as a mergers-and-acquisition deal-maker for Goldman Sachs, in the nineteen-eighties. He made a fortune by buying a share of the royalties for “Seinfeld” back in 1993, and receives them to this day. Bannon met Andrew Breitbart, the founder of the news Web site, when Bannon was financing conservative documentaries in Los Angeles in the aughts. Breitbart, who previously worked with the Drudge Report, started Breitbart in 2005 as a conservative news aggregator, much like his former employer. In the fall of 2009, Bannon and Breitbart worked together on a business plan to launch a more ambitious version of the site, and Bannon joined its board in 2011, once the financing deal closed. When Andrew Breitbart died, in 2012, Bannon became executive chairman and took over the site. Back then Breitbart was a pugnacious but still recognizably conservative site, but, with Bannon in charge, its politics started to change.

Bannon embraced the growing populist movement in America, including the “alt-right,” a new term for white nationalists, who care little about traditional conservative economic ideas and instead stress the need to preserve America’s European heritage and keep out non-whites and non-Christians. Under Bannon, Breitbart promoted similar movements in Europe, including the United Kingdom Independence Party, the National Front in France, Alternative for Germany, and the Freedom Party in the Netherlands. Bannon likes to say that his goal is “to build a global, center-right, populist, anti-establishment news site.” After the election is over, Breitbart, which has offices in London and Rome, plans to open up new bureaus in France and Germany.

More here.

Matt Taibbi on The Fury and Failure of Donald Trump

Matt Taibbi in Rolling Stone:

The first symptom of a degraded aristocracy is a lack of capable candidates for the throne. After years of indulgence, ruling families become frail, inbred and isolated, with no one but mystics, impotents and children to put forward as kings. Think of Nikolai Romanov reading fortunes as his troops starved at the front. Weak princes lead to popular uprisings. Which brings us to this year's Republican field.

There wasn't one capable or inspiring person in the infamous “Clown Car” lineup. All 16 of the non-Trump entrants were dunces, religious zealots, wimps or tyrants, all equally out of touch with voters. Scott Walker was a lipless sadist who in centuries past would have worn a leather jerkin and thrown dogs off the castle walls for recreation. Marco Rubio was the young rake with debts. Jeb Bush was the last offering in a fast-diminishing hereditary line. Ted Cruz was the Zodiac Killer. And so on.

The party spent 50 years preaching rich people bromides like “trickle-down economics” and “picking yourself up by your bootstraps” as solutions to the growing alienation and financial privation of the ordinary voter. In place of jobs, exported overseas by the millions by their financial backers, Republicans glibly offered the flag, Jesus and Willie Horton.

More here.

A Day in the Life of the Brain

Steven Rose in The Guardian:

Yet another book about consciousness? These days it seems no self-respecting neuroscientist should be without at least one book-length stab at explaining how the brain enables that most central, if elusive, feature of what makes us human. This is Susan Greenfield’s second. Yet, as she reminds us, it has only been in the last few decades that consciousness studies, once regarded as the province of philosophers, and off-limits for neuroscience, has become a cottage industry for brain researchers, oblivious to the sceptics who joke that the initiator of this new wave was an anaesthesiologist, Stuart Hameroff, whose day job ought surely to be elucidating the processes through which people become unconscious.

This origin may help explain why many brain researchers have such a narrow definition of consciousness, understood by Greenfield, in common with her many peers, as what we retain while awake and lose while asleep or anaesthetised. Such a restricted description raises many questions about this protean term. Can there be consciousness in the abstract, distinct from being conscious of something? Awareness is only one of the several meanings the OED ascribes to consciousness, including self-knowledge and, to me the most important, “the totality of the impressions, thoughts, and feelings, which make up a person’s conscious being”. Neuroscientists are rarely trained in philosophy, but a little modesty might not go amiss. Some committed reductionists among them maintain that consciousness is merely a “user illusion” – that you may think you are making conscious decisions but in “reality” all the hard work is being done by the interactions of nerve cells within the brain. Most, however, are haunted by what their philosophical sympathisers call the “hard problem” of the relationship between objective measures – say of light of a particular wavelength – and qualia, the subjective experience of seeing red.

More here.

No Way to Say Goodbye


Julian Hanna reviews Stefany Anne Golberg and Morgan Meis's Dead People in 3:AM Magazine:

What makes a life noteworthy and important? What makes a good life? And when a life ends, what constitutes a good summing up, a worthy eulogy? How can a writer, pressed for time, do justice to a life – a great life, presumably – within the hackneyed confines of a two-page obituary? How does one attempt to revive such a dead form?

Or if not dead, then at least resting – unsung, taken for granted – like the manifesto before Marx and Marinetti, when it meant simply a straightforward declaration of intent, no “spectre haunting Europe”, no “courage, audacity, and revolt”. The standard obituary form is: so-and-so was born, rose (usually struggled) to greatness, and died. There is almost a sense that words fail in the face of death, so it is best just to state the facts. How do you breathe life into such a predictable story?

In Dead People, Stefany Anne Golberg and Morgan Meis show us one approach to reinvigorating the form. The collection of twenty-nine obituaries has a provocative cover that makes it great fun to read on the metro. The pieces were written mainly for The Smart Set over the past decade (there are exceptions drawn from n+1 and The New Yorker), and this almost-single venue contributes to the high degree of intimacy present in the telling of each life. The authors strike a tone of late-night candour, loose and flowing in the warm glow of the third or fourth drink. But the easy style belies a deeper engagement, honesty about the subject, and a willingness to deal with difficult themes. “We’ve chosen to take these lives personally”, the authors declare in the preface. What Golberg and Meis achieve for the most part is an effortless distillation, boiling down the essence of a public figure’s achievements. The big idea, the breakthrough, the one thing that makes an indelible mark on the culture – this is what we are shown in each brief life.

More here.

Increasing Support for the War on Yemen Is Obviously Insane


Daniel Larison in The American Conservative:

The Wall Street Journal publishes another shameless pro-Saudi editorial. This part stood out:

A Saudi air strike last week mistakenly killed civilians at a funeral in Yemen, and the White House is now leaking that Mr. Obama is rethinking U.S. support for the Yemen campaign. But the U.S. has made similar targeting errors in many conflicts, and Saudi bombing won’t get more precise if the U.S. bugs out. The U.S. ought to be helping the Saudis with enough support that they can win in Yemen.

Even by the WSJ‘s standards, this is an insane position to take. The funeral massacre last week was obviously not carried out by “mistake.” The coalition repeatedly hit the same target to maximize casualties, and it chose the target because many high-ranking political and military figures were in attendance. The coalition wanted to hit the target, and it did so several times in a row. They weren’t concerned about the hundreds of civilians killed and injured in the process, and it is absurd to claim that they were. When presented with an obvious atrocity committed by a U.S. client, the WSJ predictably ignores the evidence and insists on even more aggressive support for the offending government.

There is also no realistic prospect of a Saudi victory in Yemen. The coalition’s original goals were to drive the Houthis out of the capital and reinstate Hadi. Even if the coalition could somehow manage to do the former, it would come at an excruciating cost for the civilian population. The latter goal has always been hopeless. Hadi had scant domestic support when the war began, and now he has none. There is no chance that the coalition can “win,” and so it makes absolutely no sense for the U.S. to increase support for the war in the hopes that they do. Yemen has been wrecked and its people brought to the brink of famine because of the foolish belief that the Saudis could “win.”

More here.

Freedom from food


Nicola Twilley in Aeon:

Readers of Hacker News, a website popular with programmers and tech entrepreneurs, were the first to latch on to Rhinehart’s Soylent post, encouraging him to share the recipe online. When he did, it quickly spawned an animated Reddit thread in which DIY Soylent adopters reviewed recipes, discussed magnesium sourcing, and compared bowel movements. Within three months, Rhinehart decided that demand was sufficient for him to quit his tech start-up and form his own company in order to supply Soylent to the masses. By the time Soylent 1.0 started shipping in May 2014, the company had already accumulated a backlog of more than 20,000 pre-orders, adding up to more than $2 million in sales and – at a conservative estimate – a collective saving of 2,875 years.

What, one wonders, are people doing with all this extra time? Will we see a new Renaissance: a Soylent-fuelled flowering of novels, art or, at the very least, apps? It is perhaps too early to tell, but early signs are mixed. Rhinehart has ploughed his 90 minutes a day into launching his company, and says he still has ‘a long reading list, a long online course list, a lot of personal projects I’d like to do’. He is not against using the time for relaxation, of course, and tells me that he’s heard from other early adopters that they spend an extra hour and a half watching TV, hanging out with friends and family, or just catching up on our pervasive national sleep deficit.

‘Just giving people a little more time in general is something the United States really needs,’ he told me. ‘However you use that time is up to you.’

My own experience bodes less well. I lived on Soylent for five days (Rhinehart sent me a week’s supply, but I cracked early) and I was indeed painfully aware of vast open periods that I would have typically spent planning, shopping for, making, enjoying and cleaning up after meals. Much to my editor’s disappointment, I spent all that extra time joylessly clicking around on the internet, my brain resisting every effort to corral it into more productive activities as if it knew it was being cheated of an expected break. (My editor kindly pointed out that this might be more of a reflection of my own personal failings than a shortcoming inherent to Soylent.)

Of course, this is not the first time Americans have been promised relief from the time-suck of food preparation. Today’s Soylent craze has its roots in the post-Second World War embrace of convenience foods. And, then as now, the range of possible uses for that saved time ranged from the trivial to the substantial – but with a much more gendered twist.

More here.

How does Ian McEwan pull off Hamlet told by a fetus in ‘Nutshell’?

David Treuer in the Los Angeles Times:

Thesis: The idea behind Ian McEwan’s new novel, “Nutshell” – an imagining of the events leading up to “Hamlet” (Gertrude and Claudius plotting and then carrying out the killing of King Hamlet) set in modern London, with the three principal parts played by John and Trudy and Claude (poet, trophy wife and scheming real estate-dealing brother, respectively) and narrated by Trudy and John’s unborn child in utero – is quite possibly the worst idea for a novel ever.

Antithesis: In McEwan’s hands, the hugely improbable feat of convincingly narrating a novel in the first person from the point of view of a fetus comes alive because McEwan is such a good writer, someone who drives his prose toward the impossible (and has done so throughout his career) in ways that continue to surprise. And a fetus is the perfect spokesling for the as-yet-unexplored terrain leading up to the bloodbath that is “Hamlet”: How can two people, both of whom must have loved one another, conspire to kill another (a spouse), get away with it, and then how do they suffer for what their imaginations made them do? This is the stuff of very good literature.

Discussion: The novel (McEwan’s 17th book) begins with the unnamed fetus uncomfortably crammed in his mother’s uterus some weeks before his inevitable eviction in a state of agitated suspension. Nabokov observed that many novels lurch forward because of eavesdropping (young Marcel spies on the music teacher’s daughter Mademoiselle Vinteuil during a lesbian frolic; the hedges and closets and hallways and fountains of mountain spas are constructed, it seems, to provide Lermontov’s characters with ways of knowing that which they shouldn’t or can’t otherwise). What better place to eavesdrop on a dastardly plot than from the comfort of someone’s womb? That’s what the fetus does from the start: He wombdrops on his mother and her boyfriend as they discuss the fetus’ father, John.

More here.

Where Does Our Arrow Of Time Come From?

Ethan Siegel in Forbes:

Every moment that passes finds us traveling from the past to the present and into the future, with time always flowing in the same direction. At no point does it ever appear to either stand still or reverse; the “arrow of time” always points forwards for us. But if we look at the laws of physics — from Newton to Einstein, from Maxwell to Bohr, from Dirac to Feynman — they appear to be time-symmetric. In other words, the equations that govern reality don’t have a preference for which way time flows. The solutions that describe the behavior of any system obeying the laws of physics, as we understand them, are just as valid for time flowing into the past as they are for time flowing into the future. Yet we know from experience that time only flows one way: forwards. So where does the arrow of time come from?

Many people believe there might be a connection between the arrow of time and a quantity called entropy. While most people normally equate “disorder” with entropy, that’s a pretty lazy description that also isn’t particularly accurate. Instead, think about entropy as a measure of how much thermal (heat) energy could possibly be turned into useful, mechanical work. If you have a lot of this energy capable of potentially doing work, you have a low-entropy system, whereas if you have very little, you have a high-entropy system. The second law of thermodynamics is a very important relation in physics, and it states that the entropy of a closed (self-contained) system can only increase or stay the same over time; it can never go down. In other words, over time, the entropy of the entire Universe must increase. It’s the only law of physics that appears to have a preferred direction for time.
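As a minimal sketch of the relation Siegel is describing (stated in standard thermodynamic notation, which the excerpt itself does not spell out): the Clausius definition ties a change in entropy to heat exchanged reversibly at temperature T, and the second law says that for a closed system the total entropy can never decrease with time:

\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \frac{dS_{\mathrm{closed}}}{dt} \ge 0 .
\]

That inequality is the one basic physical law that singles out a direction for time, which is why it is the natural candidate for grounding the arrow of time.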

More here.