Social Media And The Training Of Our Minds

by Samir Chopra

One fine morning, as I walked along a Brooklyn sidewalk to my gym, heading for my 8 AM workout, I saw a young woman walking straight at me, her face turned away, attending to some matter of interest. She might have been paying attention to a smartphone, but it might have been kids or pets; the precise details of this encounter have slipped my mind. Unwilling to be run over, clotheslined, or head-butted by this rapidly approaching freight train, one insensible to my presence, I nimbly stepped aside, infuriated that yet again, on a New York sidewalk, I had been subjected to the tyranny of the inattentive pedestrian.

At that moment of induced irritation, my thoughts were not inchoate, not just an incoherent mess of unresolved frustration; instead, they seemed to arrange themselves into a sentence-long expression of aggravation immediately comprehensible to some imaginary intended audience: “My least favorite pedestrian is the kind that walks in one direction with his attention diverted elsewhere, whether it’s smartphones, kids, or pets.” Or perhaps, “That’s quite all right; you should barge ahead on this sidewalk, your head down, unseeing and uncaring.” This reaction was instinctive; I did not stop to deliberate and compose my response in sentence form; my brain responded like a trained machine, a well-primed one; a species of Pavlovian instinctive reaction had taken over my mind. It was not the first time that I had, on encountering something entirely workaday or quotidian, and yet not unworthy of a mental response, suffered a brief emotional tic and found myself formulating such a summation of my feelings at that instant. The verbal expression of my thoughts did not suggest the starting point of a letter to a newspaper or an essay; it had to be concise and succinct.

This was not your garden-variety introspection; it was clearly intended for future public consumption, for a pithy display of my thoughts about a matter of personal interest to those who might be interested. I sensed my audience would be sympathetic; some would chime in with empathetic responses; yet others would add embellishments in their comments and annotations. I did not think this sentiment of mine would be greeted with disapproval; I anticipated approval. Indeed, that is why I indulged in that little bout of composition and drafting in my mind, framing the written expression of my thought to make it appropriately irate or ironic. Maybe the ensuing conversation would feature some cantankerous rants about the smartphone generation, about over-indulgent parents and pet-owners, all too busy texting, fretting over children and dogs and cats; perhaps some of my interlocutors would add witty tales of how, one day, in an urban encounter for the ages, they had stopped one of these offenders and told them off with an artful blend of the scornful and the witty. Perhaps someone would add a ‘horror story’ about how coffee had been spilled on them by someone just like the young woman; and more outrage would ensue. A little chat corner would have developed.

I had been drafting a Facebook status, a tweet.

The folks at Facebook and Twitter have achieved something remarkable: they have made their users regard the world as a staging ground for inputs to their products. The world and its events and relations are, so to speak, so much raw material to be submitted to the formulation and framing of Facebook statuses and tweets. The world is not the world tout court; it is the provisioner of ‘content’ for our social media reports.

Read more »

Hidden Figure: Review of ‘The Forgotten Genius of Oliver Heaviside’ by Basil Mahon

by Ali Minai

A few years ago, while introducing my class of electrical engineering students to information theory, I said that we live today in a world created by Faraday, Maxwell, and Shannon. Even as I said this, I was aware that, in my zeal for effect, I was omitting the names of many who had made seminal contributions in the fields of electrical engineering and telecommunications, but one name that did not occur to me then was that of Oliver Heaviside. Basil Mahon’s book, ‘The Forgotten Genius of Oliver Heaviside: A Maverick of Electrical Science’, is a valiant – and, one hopes, successful – attempt to remedy this situation, in which even those immersed in the field of electrical engineering do not know the achievements of one of its founding figures. To be sure, Heaviside’s name does live on in the simple but surprisingly important Heaviside step function H(x), which takes the value 0 if x is less than 0 and 1 if it is greater. This function, along with Dirac’s delta function, allows the calculus of discrete variables to be unified with the classical calculus of continuous ones – a fact of great utility in an age where everything is increasingly digital and thus discrete. Forgotten in all this is the fact that Heaviside invented the step function as part of a larger enterprise: an operational calculus that sought to solve the problems of calculus in purely algebraic form. Though that calculus has left its imprint on many methods used by engineers to solve mathematical problems today, it is not taught explicitly in any curriculum and its name has mostly been forgotten by practitioners – a situation symbolic of the fate that has befallen Oliver Heaviside himself.
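Written out explicitly (this casewise form is my own gloss; the value assigned at x = 0 is a matter of convention, often taken to be 1 or 1/2), the step function mentioned above is:

H(x) = \begin{cases} 0, & x < 0 \\ 1, & x > 0 \end{cases}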

A vivid portrait of Heaviside emerges from the book. We see a brilliant and curmudgeonly character – willful but not unkind, except to those who challenged his well-founded theories with half-baked notions. After rather brief coverage of Heaviside’s background and childhood, the book moves to the beginning of his professional career as a technician in the telegraph service. Lacking a formal advanced education, Heaviside was fortunate to get this opportunity, in part through the efforts of his brother, Arthur, who was already employed in the service and – very importantly – the recommendation of the great inventor, Sir Charles Wheatstone, who was Heaviside’s uncle by marriage. For all the struggles that Heaviside had to go through to gain recognition of his genius, he was fortunate in one thing: he got into the field of electrical communication – the telegraph – at exactly the right time for a person of his aptitude. It was then a new technology, but one that had already established its utility. Connecting the world by telegraph was a major undertaking into which private investors and governments were willing to sink resources. And yet, everything in the field was being done by trial and error, without the benefit of established theory. The field was dominated by technicians rather than scientists, and the leading engineers of the time – such as Heaviside’s nemesis, William Preece – saw theoreticians as little more than ivory-tower wonks with little to contribute to engineering practice. Heaviside was the first person to bridge this divide.

Read more »

(Almost) no natural disasters are natural

by Thomas R. Wells

A natural disaster is a disaster because it involves a lot of human suffering, not because the event itself is especially big or spectacular. The destruction of an uninhabited island by a volcano is not a natural disaster, because it doesn't really matter to humans. A landslide doesn't matter, however enormous, unless there is a town at the bottom of it.

So what does the word ‘natural’ add? We use it to demarcate the edges of responsibility. We don't use it very well.

Man-made disasters, like Chernobyl or Deepwater Horizon or Bhopal or Grenfell Tower, are ones acknowledged to have been brought about by human decisions. These disasters could have been avoided if certain people had made different choices. The suffering of a man-made disaster is therefore the responsibility of particular persons and institutions. They can be held answerable for their decisions: required to justify them and judged – and punished – if they cannot. For example, the investigation into the Grenfell Tower fire will scrutinise in forensic detail the reasoning behind the key decisions that permitted a containable danger to be transformed into mass death, such as the installation of flammable cladding, the absence of sprinklers, and the refuge-in-place instructions given to residents. Some decision-makers may face criminal charges for negligence. They will certainly be vilified in the tabloid press and hate-mobbed on social media. Organisations like the local council and the company running the building will likely receive an official shaming, fines, and compulsory reorganisation or dissolution.

In contrast, natural disasters are supposed to have been caused entirely by forces outside human control. They were inevitable. No one can be held responsible.

However, very few disasters these days meet the requirements for a natural disaster. It's not enough that a natural event was necessary to the disaster, i.e. that the disaster wouldn't have happened without it (all those people couldn't have been killed by falling buildings if the ground hadn't been violently shaking). The natural event must also have been sufficient to bring about the disaster.

To understand the idea of a sufficient cause, it may help to think about imaginary but plausible worlds besides our own actual one. These are worlds in which everything operates by the same laws of physics, but particular histories may be different. For example, Hillary might be president instead. For one event to be the necessary cause of another, there must be no possible world in which event B (e.g. thousands of schools collapse) occurs without event A (e.g. a magnitude 8.0 earthquake in Sichuan) also occurring. For one event to be the sufficient cause of another, there must be no possible world in which event A (the 8.0 earthquake) occurs without event B (the school collapses) also occurring. Only if it is true that in all possible worlds, if event A happens then event B also happens, is the language of inevitability justified.
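For readers who want the logic spelled out, the two conditions can be written in standard possible-worlds notation (my gloss, not Wells's own formulation), where W is the set of possible worlds:

A is a necessary cause of B: \forall w \in W,\ B(w) \rightarrow A(w)

A is a sufficient cause of B: \forall w \in W,\ A(w) \rightarrow B(w)

Inevitability, in this sense, is just the second condition: in every world where A occurs, B follows.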

Let's put that abstract point in the context of a real disaster. If the earthquake in Haiti had been the sufficient cause of the 200,000 deaths and vast infrastructure destruction, that would mean that in any possible world in which an earthquake of that size had struck at that time and place, such massive destruction of human life would always occur.

Read more »

My teacher, mentor, and friend: Richard Harry Adler, 1922-2017

by Syed Tasnim Raza


Syed Tasnim Raza & Richard Harry Adler

It was late October 1971. My brother-in-law, Dr. Tariq Khan and I were interviewing together for residency training positions in Surgery. We finished our interview at the Downstate Medical Center in Brooklyn at 7:30 PM one night and then drove to Syracuse in heavy rain. We arrived at a friend's house at 2 AM and after sleeping a couple of hours we drove to our next interview at the Buffalo General Hospital in Buffalo, New York. There was heavy fog with very poor visibility, but we had to be at the Buffalo General before 9, so we sped west in the fog and made it there just in time.

The person to interview us first that morning was the acting director of the residency program in Buffalo, Dr. Richard H. Adler, a general thoracic surgeon. He had an angelic face and a lovely soft smile, and his presence immediately made us comfortable. The first question he asked us was why our eyes were bloodshot. We explained that we had driven all night after our interview in Brooklyn finished later than expected. He seemed impressed. After reviewing our application and reference letters he sent us to meet two other young faculty members, Dr. Jack Cudmore and Dr. Roger Dayer. And then we were given a tour of the hospital by Dr. Robert Milch, then the senior resident in surgery. After lunch, we met Dr. Adler again for the closing interview, where he offered both of us first-year residency positions in surgery. This was a pyramidal program: there would be 15 first-year residents, but they would be reduced to only six in the second year. Both Tariq and I were so excited that we accepted the offer on the spot. We would join the program in July 1972. Thus began my relationship with Dr. Adler, who would become my teacher, mentor, and friend for the next 45 years.

Even though there were many thoracic surgeons in Buffalo during the years Dr. Adler was active, he was the thoracic surgeon, performing over 80% of all thoracic surgery at the Buffalo General Hospital. He was Professor of Surgery and eventually Director of the Thoracic Surgery Residency program until his retirement in 1990. He was one of the best thoracic surgeons, both in the operating room and in his fund of knowledge about thoracic disease, which he always kept current. Dr. Adler trained at the University of Michigan in Ann Arbor under Dr. Herbert Sloan, one of the eminent thoracic surgeons of his time, who was also the Editor of the Annals of Thoracic Surgery. After his training, Dr. Adler came home to Buffalo and joined the surgical faculty at the Buffalo General Hospital under Dr. John Payne, the Chairman of Surgery. During the next decade, Dr. Adler spent a year of further training in England at one of the foremost thoracic surgery clinics. While in London, Dr. Adler was exposed to Norman Barrett, one of the premier thoracic surgeons of the day, who had described the mucosal changes in the lower esophagus caused by chronic reflux of acid from the stomach, now commonly known as Barrett's esophagus. On his return to Buffalo, Dr. Adler began methodically collecting patients with hiatal hernia and acid reflux and doing mucosal biopsies of the lower esophagus. When Dr. Adler presented the results of his studies at a thoracic surgical meeting, Norman Barrett was in the audience. He discussed the paper and congratulated Dr. Adler for presenting the largest series of well-documented cases of Barrett's esophagitis reported until then. Dr. Adler had similarly large case series in which he performed talc pleurodesis for malignant pleural effusion, for patients with a post-pneumonectomy space, and for other conditions.

His attention to detail and very methodical follow-up of any given thoracic disease was remarkable. I always enjoyed assisting him at the smallest procedure, because he made it into an art form.

Read more »

The US and North Korea: Posturing v pragmatism

by Emrys Westacott

On September 19, Donald Trump spoke before the UN general assembly. Addressing the issue of North Korea's nuclear weapons program, he said that the US "if it is forced to defend itself or its allies, . . . will have no choice but to totally destroy North Korea." And of the North Korean leader Kim Jong-un, he said, "Rocket Man is on a suicide mission for himself and his regime."

There is nothing new about a US president affirming a commitment to defend the country and its allies. What is noteworthy about Trump's remarks is his cavalier talk of totally destroying another country, which implicitly suggests the use of nuclear weapons, and his deliberately insulting–as opposed to just criticizing–Kim Jong-un. He seems to enjoy getting down in the gutter with the North Korean leader, who responded in kind by calling Trump a "frightened dog" and a "mentally deranged dotard." Critics have noted that Trump's language is closer to what one expects of a strutting schoolyard bully than of a national leader addressing an august assembly. And one could ask interesting questions about the psychological make-up of both men that leads them to speak the way they do. From a moral and political point of view, though, the only really important question regarding Trump's behavior is whether or not it is sensible. Is it a good idea to threaten and insult Kim Jong-un?

As a general rule, the best way to evaluate any action, including a speech act, is pragmatically: that is, by its likely effects. This is not always easy. Our predictions about the effects of an action are rarely certain, and they are often wrong. Moreover, even if we agree that one should think pragmatically, most of us find it hard to stick to this resolve. How many parents have nagged their teenage kids even though they know that such nagging will probably be counterproductive? How many of us have gone ahead and made an unnecessary critical comment to a partner that we know is likely to spark an unpleasant and unproductive row? And if one happens to be an ignorant, impulsive narcissist, the self-restraint required to act pragmatically is probably out of reach. Which is worrying when one considers how high the stakes are in the verbal cockfight between Trump and Kim Jong-un.

Read more »

STUCK IN THE MIDDLE WITH EU? “CENTRISM” IN THE UK AND BEYOND

by Richard King

When the writer Paul Mason was booked to appear at the annual conference of Progress earlier this year, he was more or less assured a rough reception. Progress, after all, is a Blairite "ginger group" within the British Labour Party – formed in 1996, one year before their boy won power – and Mason the quasi-Marxist author of the excellent Postcapitalism and a strong supporter of Jeremy Corbyn. But I doubt he was prepared for just how bitter and self-pitying the right wing of the party has become since Corbyn set about transforming Labour into a genuinely social democratic movement with broad appeal amongst the young and the poor. Referring to anti-Blairite tweets Mason had sent in the wake of the May election, one audience member complained how "intimidated" she now felt at Labour Party meetings. Another demanded Mason apologise. (He didn't.) But things got really interesting when the panel chair suggested that Mason had "entered the Labour Party behind Jeremy Corbyn" – a not-so-veiled reference to the Trotskyist tactic of "entryism" whereby radical groups affix themselves to larger mainstream organisations in order to influence policy. Mason reminded the assembled comrades that he'd joined the Labour Party at nineteen years of age, and that his grandfather was of the generation who'd founded the party in 1900. He then invited the Progress faithful to consider whether they wanted to remain in Corbyn's Labour Party at all. As he put it, to boos and jeers from the floor:

In case you're misunderstanding me, just listen. If you want a centrist party, this is not going to be it for the next ten years. If it's really important to you to have a pro-Remain party that is in favour of illegal war, in favour of privatisation, form your own party and get on with it!

There weren't many takers for that last proposition, and to an outsider Mason's peroration might sound like a triumphalist taunt. But the notion that a new party could emerge in the wake of the Brexit referendum is not entirely fanciful. Inspired by the example of Emmanuel Macron, Tony Blair himself has established an entity called the Institute for Global Change, a "policy platform" that aims to refill "the wide-open space in the middle of politics", while Paddy Ashdown, one-time leader of the centrist Liberal Democrats, has helped establish More United, a "political start-up" that raises funds for politicians of a centrist, pro-EU persuasion, regardless of party affiliation. Furthermore, in August, former Tory aide and political editor of the Daily Mail, James Chapman, suggested that a number of Conservative MPs had responded warmly to his idea for a new centrist party called The Democrats. "They are not saying they are going to quit their parties," Chapman told the BBC; "but they are saying they understand that there is an enormous gap in the centre now of British politics."

Read more »

The art of molting

Justin E. H. Smith:

Many animals, not just humans, generate objects that resemble their generators. In most cases these objects are not held to be works of art, however, since they are not made for the sake of resemblance to their makers. They are not made at all, in fact, but rather molted.

At its most masterful, nature gives us ecdysis, the variety of molting common to many invertebrates. Unlike lizards shedding their skin, birds their feathers, or mammals their fur, insects and other arthropods are outfitted with rigid outer casings, and so their molting involves something closer to a crawling out than a casting off.

Consider the scorpion as it sinks into apolysis, when the epidermal cells gradually separate from the hard old exoskeleton. A new cuticle begins to form, and the creature within agitates, thrusting back and forth until the old integumentary shell cracks. It squeezes out, reborn. Let us imagine that it then turns and regards—perhaps with admiration, perhaps with disgust—the scorpion-shaped, self-shaped monument it has, by nature’s necessity, cast off. The new creature appears neotenous, inexperienced, soft-shelled, while the outer casing it leaves behind takes on the appearance of a gutted and abandoned tank, dry and gray and dead, while still plainly retaining the figure of the life it once vehicled.

Can we easily distinguish between what the scorpion does when it molts and what we human beings do when we, say, sculpt the human form in stone? The most common means of distinguishing between the two sorts of production is that the human sculptings are representations of human forms, whereas molted exoskeletons or shells are not representations but rather the things themselves, or at least vestiges of the things.

More here.

New Theory Cracks Open the Black Box of Deep Learning

Natalie Wolchover in Quanta:

Even as machines known as “deep neural networks” have learned to converse, drive cars, beat video games and Go champions, dream, paint pictures and help make scientific discoveries, they have also confounded their human creators, who never expected so-called “deep-learning” algorithms to work so well. No underlying principle has guided the design of these learning systems, other than vague inspiration drawn from the architecture of the brain (and no one really understands how that operates either).

Like a brain, a deep neural network has layers of neurons — artificial ones that are figments of computer memory. When a neuron fires, it sends signals to connected neurons in the layer above. During deep learning, connections in the network are strengthened or weakened as needed to make the system better at sending signals from input data — the pixels of a photo of a dog, for instance — up through the layers to neurons associated with the right high-level concepts, such as “dog.” After a deep neural network has “learned” from thousands of sample dog photos, it can identify dogs in new photos as accurately as people can. The magic leap from special cases to general concepts during learning gives deep neural networks their power, just as it underlies human reasoning, creativity and the other faculties collectively termed “intelligence.” Experts wonder what it is about deep learning that enables generalization — and to what extent brains apprehend reality in the same way.
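To make the mechanism in the passage above concrete, here is a minimal sketch of my own (not code from the article; the toy data, layer sizes, and learning rate are all arbitrary choices): a tiny two-layer network in Python/NumPy whose connections are strengthened or weakened by gradient descent until inputs map to the right label.

```python
# A toy two-layer neural network trained by gradient descent.
# Illustrative sketch only: data, layer sizes, and learning rate are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# Fake "images": 4 pixel values each; label is 1 when the mean pixel is bright.
X = rng.random((200, 4))
y = (X.mean(axis=1) > 0.5).astype(float)

# Connection weights: input -> 8 hidden neurons -> 1 output neuron.
W1 = rng.normal(0.0, 0.5, (4, 8))
W2 = rng.normal(0.0, 0.5, (8, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass: signals flow from the input layer up through the network.
    h = sigmoid(X @ W1)             # hidden-layer activations
    p = sigmoid(h @ W2).ravel()     # predicted probability of the "bright" class

    # Backward pass: strengthen or weaken each connection in proportion
    # to how much it contributed to the prediction error.
    err = (p - y)[:, None]                  # error signal at the output
    grad_W2 = h.T @ err / len(X)
    grad_h = (err @ W2.T) * h * (1.0 - h)   # backpropagate through the hidden layer
    grad_W1 = X.T @ grad_h / len(X)
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1

print("training accuracy:", ((p > 0.5) == y).mean())
```

Nothing more is going on during "learning" than this repeated nudging of the weight matrices; the puzzle the article goes on to describe is why such nudging generalizes so well to new inputs.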

Last month, a YouTube video of a conference talk in Berlin, shared widely among artificial-intelligence researchers, offered a possible answer. In the talk, Naftali Tishby, a computer scientist and neuroscientist from the Hebrew University of Jerusalem, presented evidence in support of a new theory explaining how deep learning works.

More here.

Ernest Hemingway’s long-lost Los Angeles visit

David Kipen in the Los Angeles Times:

Lots happened in L.A. last night. Lives ended. Lives began. Couples fought, couples made up. A recently transplanted Manhattan-ite said, “All my friends are here!” I probably fell asleep with a book on my chest.

Eighty years from now, what record of these events will survive? Partly that depends on who keeps a diary, who writes to friends or family, who posts, who publishes a memoir and who doesn’t. For instance, 80 years ago this week, Ernest Hemingway, the author of “The Sun Also Rises” and “A Farewell to Arms,” grudgingly visited Los Angeles. He had once recommended the only way for a writer to deal with Hollywood: “You throw them your book, they throw you the money, then you jump into your car and drive like hell back the way you came.”

Why, then, did Hemingway make an exception in July 1937? It all had to do with a film that he and Dutch documentarian Joris Ivens had made about the Spanish Civil War called “Tierra de España,” or “The Spanish Earth.” He and a group calling itself “Contemporary Historians, Inc.,” including playwright Lillian Hellman; author of the U.S.A. trilogy (with its Hollywood-themed finale, “The Big Money”) John Dos Passos; poet Archibald MacLeish; and Dorothy Parker (who satisfied all three job descriptions and more), funded the picture out of their own pockets. The idea was to make a movie to raise money for the Loyalist cause. Every $1,000, they promised, would buy a new ambulance.

Fresh off a White House screening for the Roosevelts, Hemingway stayed only a few days in L.A. He made them count, fundraising for the cause everywhere he went.

More here.

The end of September marks fourteen years without Edward Said

Ivana Perić in H-Alter:

To commemorate Said and recall the magnitude of his works, we are in conversation with Judith Butler, Laleh Khalili, Avi Shlaim, and Ilan Pappé.

Judith Butler, philosopher and gender theorist, professor in the Department of Comparative Literature and the Program in Critical Theory at the University of California: Said understood the work of imagination:

"Said was able to imagine a world in which the legacy of colonialism could come to an end and a relation of equality in difference could take its place on the lands of Palestine. He understood the work of the imagination to be central to politics, for without an 'unrealistic' vision of the future, no movement could be made in the direction of peace based on a just and lasting solution.

He lived in the midst of conflict, and used the powers of art and literature, of the archive, testimony, and public appeal, to ask the world to imagine a future in which equality, justice, and freedom finally triumph over subordination, dispossession, and violence. Sometimes I think he was perhaps too good for this world, but that incommensurability between what he could imagine and what actually exists accounts in part for the power of his writing and his presence in the world."

More here.

30 Of The Funniest, Most Empowering Fran Lebowitz Quotes

Rachel Hoding in Thought Catalog:

Very few people possess true artistic ability. It is therefore both unseemly and unproductive to irritate the situation by making an effort. If you have a burning, restless urge to write or paint, simply eat something sweet and the feeling will pass.

When it comes to sports I am not particularly interested. Generally speaking, I look upon them as dangerous and tiring activities performed by people with whom I share nothing except the right to trial by jury.

All God’s children are not beautiful. Most of God’s children are, in fact, barely presentable.

There is no such thing as inner peace. There is only nervousness or death. Any attempt to prove otherwise constitutes unacceptable behavior.

My favorite way to wake up is to have a certain French movie star whisper to me softly at two-thirty in the afternoon that if I want to get to Sweden in time to pick up my Nobel Prize for Literature I had better ring for breakfast. This occurs rather less often than one might wish.

I wouldn’t say that I dislike the young. I’m simply not a fan of naiveté. I mean, unless you have an erotic interest in them, what other interest could you have? What are they going to possibly say that’s of interest? People ask me, Aren’t you interested in what they’re thinking? What could they be thinking? This is not a middle-aged curmudgeonly attitude; I didn’t like people that age even when I was that age.

More here.

Learning By Thinking

Tania Lombrozo in Edge:

Sometimes you think you understand something, and when you try to explain it to somebody else, you realize that maybe you gained some new insight that you didn't have before. Maybe you realize you didn't understand it as well as you thought you did. What I think is interesting about this process is that it’s a process of learning by thinking. When you're explaining to yourself or to somebody else without them providing feedback, insofar as you gain new insight or understanding, it isn't driven by that new information that they've provided. In some way, you've rearranged what was already in your head in order to get new insight. The process of trying to explain to yourself is a lot like a thought experiment in science. For the most part, the way that science progresses is by going out, conducting experiments, getting new empirical data, and so on. But occasionally in the history of science, there've been these important episodes—Galileo, Einstein, and so on—where somebody will get some genuinely new insight from engaging in a thought experiment.

The questions that motivate my research concern how we come to understand the social and physical world the way we do. Why are we so motivated to get an understanding of the world? What does that understanding do for us? Those are pretty broad questions that have been approached from lots of different disciplinary perspectives. My own work is most informed by a few different disciplines. One of them is psychology, where people have been interested in the learning mechanisms that allow us to understand aspects of the world; another is philosophy. Traditionally, epistemologists, philosophers of science, have been interested in how we can get a grip on what's going on in the world, how we can effectively interact with the world, and when we arrive at something that we might believe is justified, true, and so on. Those are very broad questions, and part of the way I've tried to get a grip on them empirically is to focus on the question of explanation. People are extremely motivated to explain. If you start eavesdropping on your friends and your neighbors, you'll notice that a lot of what they do is try to explain things that happened in their experience. They try to explain why someone was happy or upset, or why things happened the way that they did.

More here.

Sunday Poem

What It Was Like

If you want to know what
it was like, I'll tell you
what my tío told me:
There was a truck driver,
Antonio, who could handle a
rig as easily in reverse as
anybody else straight ahead:

Too bad me's a Mexican was
what my tío said the
Anglos had to say
about that.

Where do you begin if
you begin with if
you're too good
it's too bad?
.

by Leroy Quintana
from El Coro
University of Massachusetts Press, 1997
.

The Killing of History: A review of the documentary “The Vietnam War” directed by Ken Burns and Lynn Novick

John Pilger at the website of Vijay Prashad:

One of the most hyped “events” of American television, The Vietnam War, has started on the PBS network. The directors are Ken Burns and Lynn Novick. Acclaimed for his documentaries on the Civil War, the Great Depression and the history of jazz, Burns says of his Vietnam films, “They will inspire our country to begin to talk and think about the Vietnam war in an entirely new way”.

In a society often bereft of historical memory and in thrall to the propaganda of its “exceptionalism”, Burns’ “entirely new” Vietnam war is presented as “epic, historic work”. Its lavish advertising campaign promotes its biggest backer, Bank of America, which in 1971 was burned down by students in Santa Barbara, California, as a symbol of the hated war in Vietnam.

Burns says he is grateful to “the entire Bank of America family” which “has long supported our country’s veterans”. Bank of America was a corporate prop to an invasion that killed perhaps as many as four million Vietnamese and ravaged and poisoned a once bountiful land. More than 58,000 American soldiers were killed, and around the same number are estimated to have taken their own lives.

I watched the first episode in New York. It leaves you in no doubt of its intentions right from the start. The narrator says the war "was begun in good faith by decent people out of fateful misunderstandings, American overconfidence and Cold War miscalculation".

The dishonesty of this statement is not surprising. The cynical fabrication of “false flags” that led to the invasion of Vietnam is a matter of record. The Gulf of Tonkin “incident” in 1964, which Burns promotes as true, was just one. The lies litter a multitude of official documents, notably the Pentagon Papers, which the great whistleblower Daniel Ellsberg released in 1971.

There was no good faith.

More here.

Let’s Not Lose Our Minds


Carl Zimmer in Medium:

Trofim Lysenko was a little-known researcher at the time. He did his experiment in the early years of Stalin’s dictatorship, when Stalin was facing dangerous food shortages across the Soviet Union. Stalin had just responded by forcing peasants onto collectivized farms, a terrible decision that would lead over the next decade to the deaths of millions by starvation.

Stalin also demanded that Soviet scientists help fight the crisis by finding better crops, and find them fast. The Soviet Union at the time was home to a thriving community of geneticists who were doing pioneering work to understand the nature of genes in animals and plants. In response to the crisis, Soviet geneticists threw themselves into producing better crops through genetics. But their results were coming too slowly for Stalin.

And then came Lysenko. Lysenko had a great backstory that fit Stalinist ideology. He wasn’t one of those fussy cosmopolitan experts. He was from a peasant family. And despite having little advanced education, he was succeeding where mainstream scientists were failing. As soon as the agricultural ministry learned about Lysenko’s experiment on winter wheat, they began promoting him as a scientific hero.

In fact, when Lysenko first described his research at scientific conferences in early 1929, other Soviet scientists roundly dismissed it. For one thing, it was nothing new. Plant breeders had already been trying for centuries to use cold temperatures to improve plant growth. But they had little or no success.

So why should Lysenko suddenly be getting his amazing results? Lysenko’s critics said he was getting nothing of the sort. He was running experiments that were so small and sloppy that they couldn’t be trusted. Even in the early years of Stalin’s rule, Russian scientists were still having vigorous open exchanges. That’s one of the essential ingredients of science, because it allows scientists to hold each other to high standards.

More here.