Tuesday, May 31, 2016

Your brain is not a computer

Haven't you realized yet that the human brain did not evolve in order to understand itself? Do you really think that this organ, evolved to solve problems related to survival on the Pleistocene savanna, can figure out the workings of the most complex thing in the Universe? A little humility, please, neuroscientists.




Aeon
"Because neither ‘memory banks’ nor ‘representations’ of stimuli exist in the brain, and because all that is required for us to function in the world is for the brain to change in an orderly way as a result of our experiences, there is no reason to believe that any two of us are changed the same way by the same experience. If you and I attend the same concert, the changes that occur in my brain when I listen to Beethoven’s 5th will almost certainly be completely different from the changes that occur in your brain. Those changes, whatever they are, are built on the unique neural structure that already exists, each structure having developed over a lifetime of unique experiences.
This is why, as Sir Frederic Bartlett demonstrated in his book Remembering (1932), no two people will repeat a story they have heard the same way and why, over time, their recitations of the story will diverge more and more. No ‘copy’ of the story is ever made; rather, each individual, upon hearing the story, changes to some extent – enough so that when asked about the story later (in some cases, days, months or even years after Bartlett first read them the story) – they can re-experience hearing the story to some extent, although not very well (see the first drawing of the dollar bill, above).
This is inspirational, I suppose, because it means that each of us is truly unique, not just in our genetic makeup, but even in the way our brains change over time. It is also depressing, because it makes the task of the neuroscientist daunting almost beyond imagination. For any given experience, orderly change could involve a thousand neurons, a million neurons or even the entire brain, with the pattern of change different in every brain."






Monday, May 30, 2016

Scientists find that you don't know what you believe. Freud is not surprised.

"Seriously, it's taken you experimental scientists and moral philosophers a hundred years to catch up with me? Try reading a book once in a while. I wrote a lot of them -- mostly about how difficult it is know the contents of our own minds."




Aeon
"It is well established that people sometimes think they have beliefs that they don’t really have. For example, if offered a choice between several identical items, people tend to choose the one on the right. But when asked why they chose it, they confabulate a reason, saying they thought the item was a nicer colour or better quality. Similarly, if a person performs an action in response to an earlier (and now forgotten) hypnotic suggestion, they will confabulate a reason for performing it. What seems to be happening is that the subjects engage in unconscious self-interpretation. They don’t know the real explanation of their action (a bias towards the right, hypnotic suggestion), so they infer some plausible reason and ascribe it to themselves. They are not aware that they are interpreting, however, and make their reports as if they were directly aware of their reasons.
Many other studies support this explanation. For example, if people are instructed to nod their heads while listening to a tape (in order, they are told, to test the headphones), they express more agreement with what they hear than if they are asked to shake their heads. And if they are required to choose between two items they previously rated as equally desirable, they subsequently say that they prefer the one they had chosen. Again, it seems, they are unconsciously interpreting their own behaviour, taking their nodding to indicate agreement and their choice to reveal a preference.
...
The ISA theory has some startling consequences. One is that (with limited exceptions), we do not have conscious thoughts or make conscious decisions. For, if we did, we would be aware of them directly, not through interpretation. The conscious events we undergo are all sensory states of some kind, and what we take to be conscious thoughts and decisions are really sensory images – in particular, episodes of inner speech. These images might express thoughts, but they need to be interpreted.
Another consequence is that we might be sincerely mistaken about our own beliefs. Return to my question about racial stereotypes. I guess you said you think they are false. But if the ISA theory is correct, you can’t be sure you think that. Studies show that people who sincerely say that racial stereotypes are false often continue to behave as if they are true when not paying attention to what they are doing. Such behaviour is usually said to manifest an implicit bias, which conflicts with the person’s explicit beliefs. But the ISA theory offers a simpler explanation. People think that the stereotypes are true but also that it is not acceptable to admit this and therefore say they are false. Moreover, they say this to themselves too, in inner speech, and mistakenly interpret themselves as believing it. They are hypocrites but not conscious hypocrites. Maybe we all are."




Sunday, May 29, 2016

Ambulances -- Philip Larkin






Closed like confessionals, they thread
Loud noons of cities, giving back
None of the glances they absorb.
Light glossy grey, arms on a plaque,
They come to rest at any kerb:
All streets in time are visited.

Then children strewn on steps or road,
Or women coming from the shops
Past smells of different dinners, see
A wild white face that overtops
Red stretcher-blankets momently
As it is carried in and stowed,

And sense the solving emptiness
That lies just under all we do,
And for a second get it whole,
So permanent and blank and true.
The fastened doors recede. Poor soul,
They whisper at their own distress;

For borne away in deadened air
May go the sudden shut of loss
Round something nearly at an end,
And what cohered in it across
The years, the unique random blend
Of families and fashions, there

At last begin to loosen. Far
From the exchange of love to lie
Unreachable inside a room
The traffic parts to let go by
Brings closer what is left to come,
And dulls to distance all we are.





Saturday, May 28, 2016

Times of Your Life -- Paul Anka (1975)







Brilliant of Mad Men to use this song in the trailer for the series' final episode:








Did you realize that they were tapping into our nostalgia not for Paul Anka but for this 1977 Kodak commercial?







Good morning, yesterday
You wake up and time has slipped away
And suddenly it's hard to find
The memories you left behind
Remember, do you remember?

The laughter and the tears
The shadows of misty yesteryears
The good times and the bad you've seen
And all the others in between
Remember, do you remember
The times of your life? (do you remember?)

Reach out for the joy and the sorrow
Put them away in your mind
The mem'ries are time that you borrow
To spend when you get to tomorrow

Here comes the setting sun (comes the setting sun)
The seasons are passing one by one
So gather moments while you may
Collect the dreams you dream today
Remember, will you remember
The times of your life?

Gather moments while you may
Collect the dreams you dream today
Remember, will you remember
The times of your life?

Of your life
Of your life
Do you remember, baby
Do you remember the times of your life?


Do you remember, baby
Do you remember the times of your life?

Songwriters
ROGER NICHOLS, BILL LANE







Friday, May 27, 2016

Structural MRI study reveals little difference between the brains of people with autism and healthy controls

Those are some seriously enlarged ventricles. But most people with schizophrenia don't have brain scans that look like that. Always remember that brain scans are NEVER diagnostic in mental health -- that person on the right could have schizophrenia, or autism, or chronic alcoholism, or dementia, or be perfectly healthy.




Neuroskeptic
"A new paper threatens to turn the world of autism neuroscience upside down. Its title is Anatomical Abnormalities in Autism?, and it claims that, well, there aren’t very many.
Published in Cerebral Cortex by Israeli researchers Shlomi Haar and colleagues, the new research reports that there are virtually no differences in brain anatomy between people with autism and those without.
What makes Haar et al.’s essentially negative claims so powerful is that their study had a huge sample size: they included structural MRI scans from 539 people diagnosed with high-functioning autism spectrum disorder (ASD) and 573 controls. This makes the paper an order of magnitude bigger than a typical structural MRI anatomy study in this field. The age range was 6 to 35.
...
What did they find? Well… not much. First off, the ASD group had no differences in overall brain size (intracranial volume). Nor were there any group differences in the volumes of most brain areas; the only significant finding here was an increased ventricle volume in the ASD group, but even this had a small effect size (d = 0.34). Enlarged ventricles are not specific to ASD by any means – the same thing has been reported in schizophrenia, dementia, and many other brain disorders.
... 
I think this is an important paper and one that the autism field will need to take very seriously. There are hundreds of studies claiming to have found differences in brain structure in autism, many with small sample sizes, and Haar et al’s failure to replicate almost any of these claims, is sobering. It’s important to remember, however, that this paper only considered brain anatomy. It doesn’t contradict studies looking at brain function, nor does it relate to microanatomy or neuropathology (i.e. microscope work.)
As far as it goes, though, it’s a bit of an earthquake – and I’m not sure how much of the field is left standing."







Thursday, May 26, 2016

What really went down at Hiroshima and Nagasaki

"And here is one of the Hiroshima Gas Company and the Honkawa Elementary School. I think the latter really emphasizes the horror of “strategic” bombing, where burning elementary schools become acceptable as “collateral damage.” The famous dome at the upper right hand corner of the photo was directly underneath the explosion; the school was about 800 feet from there."


Scientific American blog


"Some iconic black and white photos of the Hiroshima and Nagasaki, Wellerstein notes, show the cities after bodies and rubble have been cleared and make them look like “abandoned cities on the moon.” He displays color photos that depict a messier reality. I urge you to check out his illustrated column, but here is the coda, which makes a point worth pondering on this somber anniversary. Bold type is in the original:
There are two ways you can go wrong in making sense of the scale of the bombings of Hiroshima and Nagasaki.


The first is to see the bombs as instant vaporizers, to see the bombs as Everything Killers that just zap cities out of existence. This isn’t the case. They kill by crushing and burning and irradiating. They don’t turn you to dust. They don’t freeze you and turn you into a stop-motion skeleton, like in The Day After. For some, death was instantaneous, but for a lot of others, it was a much more protracted affair.
The other way to misunderstand it is to downplay it. Ah, a number of large buildings survived! It’s not so bad, then, right? Maybe the whole nuke thing has been exaggerated! Well, unless you are, you know, not in one of those buildings, and even if you are, it’s a pretty awful thing. Yes, you can approximate the city-wide effects of early atomic bombs with a fleet of conventional bombers dropping napalm — which personally I consider just as much a weapon of mass destruction as anything else... But being napalmed is not exactly a walk in the park for those being bombed, either.
So what’s the right view? An ugly, troublesome, disturbing one; right between those extremes. The atomic bomb was a weapon used to inflict tremendous human suffering. (This is true whether you think its use was justified or not.) If an atomic bomb were to go off over your city, the damage would be horrifying, the death toll staggering. But it’s a level of destruction that people should try to appreciate for what it is — a realistic possibility, not a clean science-fiction ending or a blow to be shrugged off.    
A final point. As Wellerstein would be the first to acknowledge, the photos he displays do not really capture the ugliness of the atomic bombings, because they do not show the victims, dead and alive. You can find photographs of horribly disfigured casualties online. But the best way to appreciate the suffering caused by the atomic bombs is to read John Hersey’s classic work of journalism Hiroshima, published in 1946."


See also: Memoir by Survivor of Atomic Bombing of Nagasaki











Wednesday, May 25, 2016

A general State education is a mere contrivance for moulding people to be exactly like one another

How would this country change if 90% of the population attended private elementary and secondary schools (instead of the 10% today)?


Excerpt from On Liberty (1859), Ch. V: Applications

John Stuart Mill



WERE THE DUTY OF ENFORCING universal education once admitted, there would be an end to the difficulties about what the State should teach, and how it should teach, which now convert the subject into a mere battle-field for sects and parties, causing the time and labour which should have been spent in educating, to be wasted in quarrelling about education. If the government would make up its mind to require for every child a good education, it might save itself the trouble of providing one. It might leave to parents to obtain the education where and how they pleased, and content itself with helping to pay the school fees of the poorer classes of children, and defraying the entire school expenses of those who have no one else to pay for them. The objections which are urged with reason against State education, do not apply to the enforcement of education by the State, but to the State's taking upon itself to direct that education: which is a totally different thing. That the whole or any large part of the education of the people should be in State hands, I go as far as any one in deprecating. All that has been said of the importance of individuality of character, and diversity in opinions and modes of conduct, involves, as of the same unspeakable importance, diversity of education. A general State education is a mere contrivance for moulding people to be exactly like one another: and as the mould in which it casts them is that which pleases the predominant power in the government, whether this be a monarch, a priesthood, an aristocracy, or the majority of the existing generation, in proportion as it is efficient and successful, it establishes a despotism over the mind, leading by natural tendency to one over the body. An education established and controlled by the State should only exist, if it exist at all, as one among many competing experiments, carried on for the purpose of example and stimulus, to keep the others up to a certain standard of excellence. Unless, indeed, when society in general is in so backward a state that it could not or would not provide for itself any proper institutions of education, unless the government undertook the task; then, indeed, the government may, as the less of two great evils, take upon itself the business of schools and universities, as it may that of joint-stock companies, when private enterprise, in a shape fitted for undertaking great works of industry does not exist in the country. But in general, if the country contains a sufficient number of persons qualified to provide education under government auspices, the same persons would be able and willing to give an equally good education on the voluntary principle, under the assurance of remuneration afforded by a law rendering education compulsory, combined with State aid to those unable to defray the expense.









Tuesday, May 24, 2016

Does Free Will Exist? Science Can't Answer That Question.

"Man can do what he wills but he cannot will what he wills." -- Arthur Schopenhauer (a compatibilist, like David Hume, and the Stoics, and Freud)






The Atlantic
"Kathleen Vohs, then at the University of Utah, and Jonathan Schooler, of the University of Pittsburgh, asked one group of participants to read a passage arguing that free will was an illusion, and another group to read a passage that was neutral on the topic. Then they subjected the members of each group to a variety of temptations and observed their behavior. Would differences in abstract philosophical beliefs influence people’s decisions?
Yes, indeed. When asked to take a math test, with cheating made easy, the group primed to see free will as illusory proved more likely to take an illicit peek at the answers. When given an opportunity to steal—to take more money than they were due from an envelope of $1 coins—those whose belief in free will had been undermined pilfered more. On a range of measures, Vohs told me, she and Schooler found that “people who are induced to believe less in free will are more likely to behave immorally.”
It seems that when people stop believing they are free agents, they stop seeing themselves as blameworthy for their actions. Consequently, they act less responsibly and give in to their baser instincts. Vohs emphasized that this result is not limited to the contrived conditions of a lab experiment. “You see the same effects with people who naturally believe more or less in free will,” she said. In another study, for instance, Vohs and colleagues measured the extent to which a group of day laborers believed in free will, then examined their performance on the job by looking at their supervisor’s ratings. Those who believed more strongly that they were in control of their own actions showed up on time for work more frequently and were rated by supervisors as more capable. In fact, belief in free will turned out to be a better predictor of job performance than established measures such as self-professed work ethic.
Another pioneer of research into the psychology of free will, Roy Baumeister of Florida State University, has extended these findings. For example, he and colleagues found that students with a weaker belief in free will were less likely to volunteer their time to help a classmate than were those whose belief in free will was stronger. Likewise, those primed to hold a deterministic view by reading statements like “Science has demonstrated that free will is an illusion” were less likely to give money to a homeless person or lend someone a cellphone.
Further studies by Baumeister and colleagues have linked a diminished belief in free will to stress, unhappiness, and a lesser commitment to relationships. They found that when subjects were induced to believe that “all human actions follow from prior events and ultimately can be understood in terms of the movement of molecules,” those subjects came away with a lower sense of life’s meaningfulness. Early this year, other researchers published a study showing that a weaker belief in free will correlates with poor academic performance.
The list goes on: Believing that free will is an illusion has been shown to make people less creative, more likely to conform, less willing to learn from their mistakes, and less grateful toward one another. In every regard, it seems, when we embrace determinism, we indulge our dark side.
Few scholars are comfortable suggesting that people ought to believe an outright lie. Advocating the perpetuation of untruths would breach their integrity and violate a principle that philosophers have long held dear: the Platonic hope that the true and the good go hand in hand. Saul Smilansky, a philosophy professor at the University of Haifa, in Israel, has wrestled with this dilemma throughout his career and come to a painful conclusion: “We cannot afford for people to internalize the truth” about free will.
Smilansky is convinced that free will does not exist in the traditional sense—and that it would be very bad if most people realized this. “Imagine,” he told me, “that I’m deliberating whether to do my duty, such as to parachute into enemy territory, or something more mundane like to risk my job by reporting on some wrongdoing. If everyone accepts that there is no free will, then I’ll know that people will say, ‘Whatever he did, he had no choice—we can’t blame him.’ So I know I’m not going to be condemned for taking the selfish option.” This, he believes, is very dangerous for society, and “the more people accept the determinist picture, the worse things will get.”
Determinism not only undermines blame, Smilansky argues; it also undermines praise. Imagine I do risk my life by jumping into enemy territory to perform a daring mission. Afterward, people will say that I had no choice, that my feats were merely, in Smilansky’s phrase, “an unfolding of the given,” and therefore hardly praiseworthy. And just as undermining blame would remove an obstacle to acting wickedly, so undermining praise would remove an incentive to do good. Our heroes would seem less inspiring, he argues, our achievements less noteworthy, and soon we would sink into decadence and despondency.
Smilansky advocates a view he calls illusionism—the belief that free will is indeed an illusion, but one that society must defend. The idea of determinism, and the facts supporting it, must be kept confined within the ivory tower. Only the initiated, behind those walls, should dare to, as he put it to me, “look the dark truth in the face.” Smilansky says he realizes that there is something drastic, even terrible, about this idea—but if the choice is between the true and the good, then for the sake of society, the true must go."


This guy Smilansky is a hard-core determinist, a minority position among philosophers but a common one, even if they don't realize it, among those who have swallowed scientism whole. You see this nonsense all the time in forensic psych researchers who blame Charles Whitman's mass killing on a tumor "in the vicinity" of his amygdala, or who point to genetic or fMRI studies of psychopathy. Everything is determined, they believe, therefore everyone is blameless. B.F. Skinner, in Beyond Freedom and Dignity (1971), argued that the quicker we give up our false belief in free will, the better.










 

Monday, May 23, 2016

Psychiatrist Narendra Nagareddy: One-man opioid epidemic

This is really interesting. I wonder if they are going to get him on these murder charges. Because if they do, there are A LOT of physicians who should be very, very worried. You don't have to run a pill mill to have a patient overdose on the opioid meds you prescribed. And forget about the criminal charges -- it's the civil cases that have the tort lawyers salivating.






SkyNews
"A Georgia psychiatrist dubbed Dr Death has been charged with murder in the overdose deaths of three patients.
Thirty-six of Dr Narendra Nagareddy's patients died while he was prescribing them painkillers, court documents allege.
Twelve of them died of prescription drug overdoses, post-mortem examinations confirmed.
The doctor could face further charges for at least 30 other deaths, the Henry Herald reports. 
Nagareddy - who was arrested in January - allegedly ran a pill mill out of his office in an Atlanta suburb.
In one 11-month period ending in July 2015, he prescribed nearly 500 times the amount of oxycodone prescribed by any other colleague at the Southern Regional Medical Center.
...
Nagareddy prescribed hydrocodone, oxycodone, methadone, fentanyl and amphetamine salts to patients who were suffering addiction, anxiety and depression. [I wonder if he ever prescribed anything that might actually help their alleged anxiety or depression.]
He is charged in the deaths of Cheryl Pennington, 47, David Robinson, 49, and 29-year-old mother-of-two Audrey Austin.
...
A probation officer raised the alarm after noticing that three people in her caseload who had died were patients of Nagareddy."










Sunday, May 22, 2016

A Short Song of Congratulation -- Samuel Johnson (1709-1784)

Sir John Lade, as pictured around 1778




LONG-EXPECTED one and twenty
Ling'ring year at last has flown,
Pomp and pleasure, pride and plenty
Great Sir John, are all your own.

Loosen'd from the minor's tether,
Free to mortgage or to sell,
Wild as wind, and light as feather
Bid the slaves of thrift farewell.

Call the Bettys, Kates, and Jenneys
Ev'ry name that laughs at care,
Lavish of your Grandsire's guineas,
Show the spirit of an heir.

All that prey on vice and folly
Joy to see their quarry fly,
Here the gamester light and jolly
There the lender grave and sly.

Wealth, Sir John, was made to wander,
Let it wander as it will;
See the jocky, see the pander,
Bid them come, and take their fill.

When the bonny blade carouses,
Pockets full, and spirits high,
What are acres? What are houses?
Only dirt, or wet or dry.

If the Guardian or the Mother
Tell the woes of willful waste,
Scorn their counsel and their pother,
You can hang or drown at last.                         





Saturday, May 21, 2016

Something Stupid -- Frank (and Nancy) Sinatra (1967)

Who's the guy standing behind Yul Brynner?









I know I stand in line,
until you think you have the time
to spend an evening with me.
And if we go some place to dance,
I know that there's a chance
you won't be leaving with me.
And afterwards we drop into a quiet little place
and have a drink or two.
And then I go and spoil it all,
by saying something stupid
like: "I love you."
I can see it in your eyes,
that you despise the same old lies
you heard the night before.
And though it's just a line to you,
for me it's true,
it never seemed so right before.
I practice every day to find some clever lines to say,
to make the meaning come through.
But then I think I'll wait
until the evening gets late,
and I'm alone with you.
The time is right, your perfume fills my head,
the stars get red,
and oh, the night's so blue.
And then I go and spoil it all,
by saying something stupid
like: "I love you."

Songwriters
Carson Parks



Friday, May 20, 2016

Coming soon: Mass reproduction via IVF and Preimplantation Genetic Diagnosis (PGD)




It's interesting that it's only the World War II Axis Powers that have outlawed pre-implantation genetic diagnosis ("DPI interdit"). I suppose the Germans, given their track record, fear that they would misuse the technology if given half a chance.




The New Statesman
"It is already possible to avoid more than 250 grave genetic conditions by genetic screening of few-days-old embryos during in vitro fertilisation (IVF), so that embryos free from the genetic mutation responsible can be identified for implantation. But that usually works solely for diseases stemming from a single gene – of which there are many, though most are rare. The procedure is called pre-implantation genetic diagnosis (PGD), and it is generally used only by couples at risk of passing on a particularly nasty genetic disease. Otherwise, why go to all that discomfort, and possibly that expense, when the old-fashioned way of making babies is so simple and (on the whole) fun?
In The End of Sex, Henry Greely, a law professor and bioethicist at Stanford University, argues that this will change. Thanks to advances in reproductive and genetic technologies, he predicts that PGD will become the standard method of conception in a matter of several decades. (Recreational sex might nonetheless persist.)
If that doesn’t sound alarming enough, there will be all manner of other seemingly bizarre and alarming options on the menu for making children: using eggs and sperm both made from a single adult (the “uniparent”), or chromosomes tailor-made by chemistry, or IVF between siblings or pensioners, or IVF with the stolen biological detritus of celebrities.
 ...
Greely does a superb job in his book of explaining the science, as well as the law and politics (at least in the US context), that will make these things possible. At the root is the realisation that human tissue is far more malleable and protean than we had imagined. Every cell in your body – a flake of skin, say – could be a source not just of most or all other tissue types, but of other beings.
Central to these scenarios is the culturing and manipulation of stem cells, the ur-cells from which all others develop. The most versatile are human embryonic stem cells. Because these are “pluripotent” – able to grow into any tissue type – they might be used for regeneration of damaged tissues such as nerves, heart muscle and bone.
But it was first shown in 2014 that they can also be used to generate “gametes”: eggs and sperm. So far, such “artificial sperm” consists of immature “spermatids”, which lack tails for swimming. That is no obstacle, however. Using methods developed for IVF, the cells can be injected directly into eggs to produce apparently healthy offspring – in mice, at least. If we want to make babies this way, we’ll generally want them to have the parental genes. It is possible to create embryonic stem cells containing the genes of an adult by using methods involved in cloning Dolly the sheep, in which genetic material is transferred from an adult body (somatic) cell into an egg that has had its own chromosomes removed. The egg can then be used to grow an embryo – a clone – from which stem cells can be cultured. When will that happen? Two years ago."







 

Thursday, May 19, 2016

Are the Rich Really Lizard People?

I'm sure that the owner of this car is a good person.




The Economist, 1843 Magazine
"In some experiments Keltner and his collaborators put participants from a variety of income brackets to the test; in others, they “primed” subjects to feel less powerful or more powerful by asking them to think about people more or less powerful than themselves, or to think about times when they felt strong or weak. The results all stacked the same way. People who felt powerful were less likely to be empathetic; wealthy subjects were more likely to cheat in games involving small cash stakes and to dip their fists into a jar of sweets marked for the use of visiting children. When watching a video about childhood cancer they displayed fewer physiological signs of empathy.
Similar results occurred even when the privilege under observation had no meaning beyond the experiment room. Rigged games of Monopoly were set up in which one player took a double salary and rolled with two dice instead of one: winners failed to acknowledge their unfair advantage and reported that they had triumphed through merit. In another study, volunteers were divided into bosses and workers and set to work on an administrative task. When a plate of biscuits was brought into the room, the managers reached for twice as many as the managed. “Power tends to corrupt, and absolute power corrupts absolutely,” said Lord Acton in 1887. Here was the evidence, lab-tested, that it also awakened the Cookie Monster within.
Acton’s pronouncement on power was a response to a specific 19th-century event – the Vatican’s decision, in 1870, to adopt the doctrine of papal infallibility. When 20th-century social scientists began studying the moral conduct of powerful people, they did it in reaction to the absolutism of their own age. In 1956 the sociologist C. Wright Mills published “The Power Elite”, an account of American society that shocked a generation: partly because it suggested the country was controlled by self-sustaining cliques of military, political and corporate men; partly because Mills modelled his work on an earlier study of the social and political hierarchies of Nazi Germany. Three years later, Pitirim Sorokin, founder of Harvard’s sociology department and a refugee from Lenin’s Russia, published “Power and Morality”, which proposed that the individuals described by Mills were not just self-interested, but sick. “Taken as a whole,” he wrote, “the ruling groups are more talented intellectually and more deranged mentally than the ruled population.”
...

When Keltner and his colleagues published an influential paper on the subject in 2010, three European academics, Martin Korndörfer, Stefan Schmukle and Boris Egloff, wondered if it would be possible to reproduce the findings of small lab-based experiments using much larger sets of data from surveys carried out by the German state. The idea was to see whether this information, which documented what people said they did in everyday life, would offer the same picture of human behaviour as results produced in the lab. “We simply wanted to replicate their results,” says Boris Egloff, “which seemed very plausible to us and fine in every possible sense.” The crunched numbers, however, declined to fit the expected patterns. Taken cumulatively, they suggested the opposite. Privileged individuals, the data suggested, were proportionally more generous to charity than their poorer fellow citizens; more likely to volunteer; more likely to help a traveller struggling with a suitcase or to look after a neighbour’s cat.
Egloff and his colleagues wrote up their findings and sent them to the Journal of Personality and Social Psychology, which had also published Keltner’s work. “We thought,” says Egloff, “naive as we were, that this might be interesting for the scientific community.” The paper was rejected. They extended their analysis to data from America and other countries, and felt confident that they had identified several more pieces that didn’t fit the jigsaw being assembled by their American peers. They argued that psychology’s consensus view on social status and ethical behaviour did not exist in other disciplines, and concluded with a quiet plea for more research in this area. Their paper was rejected again. Last July, it eventually found a home in a peer-reviewed online journal.
Egloff has been doing research since 1993 and is used to the bloody process of peer review. But he was shocked by the hostility towards his work. “I am not on a crusade,” he says. “I am not rich. My family is not rich. My friends are not rich. We never received any money from any party for doing this research. Personally I would have loved the results of the Berkeley group to be true. That would be nice and would provide a better fit to my personal and political beliefs and my worldview. However, as a scientist…” The experience of going against this particular intellectual grain was so painful that Egloff vows never to study the topic of privilege and ethics again."






Wednesday, May 18, 2016

How many sex partners a year makes you the happiest?

See what it means? Promiscuous people are less happy than people who had only one sex partner over the past 12 months (but happier than those who had no sex at all in the past year).










Here is a different study that found the same thing:


This paper studies the links between income, sexual behavior and reported happiness. It uses recent data on a random sample of 16,000 adult Americans. The paper finds that sexual activity enters strongly positively in happiness equations. Greater income does not buy more sex, nor more sexual partners. The typical American has sexual intercourse 2-3 times a month. Married people have more sex than those who are single, divorced, widowed or separated. Sexual activity appears to have greater effects on the happiness of highly educated people than those with low levels of education. The happiness-maximizing number of sexual partners in the previous year is calculated to be 1.






 


Tuesday, May 17, 2016

Ketamine for suicidal ideation?

Hold your horses, ketamine enthusiasts. (It's a horse tranquilizer, as well as a hallucinogenic street drug.) This recently ballyhooed study had only 14 participants, 2 of whom dropped out (1 due to ketamine side effects). Of the remaining 12 participants, 7 had remission of suicidal thinking (after 3 weeks of twice-weekly intravenous ketamine sessions lasting 45 minutes or longer). But only 2 of those 7 were still in remission from suicidal thinking and depressive symptoms 3 months after treatment. Of course, there was no placebo-control group, but even worse, the researchers didn't have the guts to go head-to-head with ECT, which would kick ketamine's ass.





MassGen


"Repeat intravenous treatment with low doses of the anesthetic drug ketamine quickly reduced suicidal thoughts in a small group of patients with treatment-resistant depression. In their report receiving Online First publication in the Journal of Clinical Psychiatry, a team of Massachusetts General Hospital (MGH) investigators report the results of their study in depressed outpatients who had been experiencing suicidal thought for three months or longer.
“Our finding that low doses of ketamine, when added on to current antidepressant medications, quickly decreased suicidal thinking in depressed patients is critically important because we don’t have many safe, effective, and easily available treatments for these patients,” says Dawn Ionescu, MD, of the Depression Clinical and Research Program in the MGH Department of Psychiatry, lead and corresponding author of the paper. “While several previous studies have shown that ketamine quickly decreases symptoms of depression in patients with treatment-resistant depression, many of them excluded patients with current suicidal thinking.”
It is well known that having suicidal thoughts increases the risk that patients will attempt suicide, and the risk for suicide attempts is 20 times higher in patients with depression than the general population. The medications currently used to treat patients with suicidal thinking – including lithium and clozapine – can have serious side effects, requiring careful monitoring of blood levels; and while electroconvulsive therapy also can reduce suicidal thinking, its availability is limited and it can have significant side effects, including memory loss.
Primarily used as a general anesthetic, ketamine has been shown in several studies to provide rapid relief of symptoms of depression. In addition to excluding patients who reported current suicidal thinking, many of those studies involved only a single ketamine dose. The current study was designed not only to examine the antidepressant and antisuicidal effects of repeat, low-dose ketamine infusions in depressed outpatients with suicidal thinking that persisted in spite of antidepressant treatment, but also to examine the safety of increased ketamine dosage.
The study enrolled 14 patients with moderate to severe treatment-resistant depression who had suicidal thoughts for three months or longer. After meeting with the research team three times to insure that they met study criteria and were receiving stable antidepressant treatment, participants received two weekly ketamine infusions over a three-week period. The initial dosage administered was 0.5 mg/kg over a 45 minute period – about five times less than a typical anesthetic dose – and after the first three doses, it was increased to 0.75 mg/kg. During the three-month follow-up phase after the ketamine infusions, participants were assessed every other week.
The same assessment tools were used at each visit before, during and after the active treatment phase. At the treatment visits they were administered about 4 hours after the infusions were completed. The assessments included validated measures of suicidal thinking, in which patients were directly asked to rank whether they had specific suicide-related thoughts, their frequency and intensity.
While only 12 of the 14 enrolled participants completed all treatment visits – one dropped out because of ketamine side effects and one had a scheduling conflict – most of them experienced a decrease in suicidal thinking, and seven achieved complete remission of suicidal thoughts at the end of the treatment period. Of those seven participants, two maintained remission from both suicidal thinking and depression symptoms throughout the follow-up period. While there were no serious adverse events at either dose and no major differences in side effects between the two dosage levels, additional studies in larger groups of patients are required before any conclusions can be drawn."








Monday, May 16, 2016

Ultimate Psychology Reading List: The Psychologist magazine





The Psychologist, the British Psychological Society's magazine, asks every psychologist it interviews for a book recommendation. The list below is an edited version of that compilation. I'd save the James for last -- the percentage of psychologists who have read it is probably even lower than the percentage of historians who have read Gibbon's Decline and Fall of the Roman Empire. But I have either enjoyed the books on the list below or plan to read them soon.



The entire list




William James’s The Principles of Psychology (1890). "He shares his thinking and questioning with his reader so that one can enter his mind, and live with him as a friend. He combines being a superb communicator with insights of philosophy and science really worth communicating. I was struck by his breadth of mind, respecting the arts from the past as well as technologies for creating future science,"  said Richard Gregory, Jun 08.




"The Principles of Psychology by William James, of course. Not only is it brilliant and prescient, but the quality of the writing is humbling," said Daniel Gilbert, Jul 08.




"Having referred to it for many years in the context of social facilitation, I was intrigued when I finally read Norman Triplett’s original 1898 paper and realised how psychologists have misreported his methods, results and conclusions regarding the effects of coactors on performance," said Sandy Wolfson, Aug 08.




Mistakes Were Made (But Not by Me) by Carol Tavris and Elliot Aronson. "You’ll get to understand why hypocrites never see their own hypocrisy, why couples so often misremember their shared history, why many people persist in courses of action that lead straight into quicksand. It’s lucid and witty, and a delightful read," said Elizabeth Loftus, Oct 08.




"Impossible question. William James’s Principles of Psychology for writing style, prescience and insight; Elliot Aronson’s The Social Animal for its passionate, personal prose and introduction to the major concerns of social psychology; and Judith Rich Harris’s The Nurture Assumption for its brilliant, creative reassessment of the basic but incorrect assumptions of developmental psychology. Her book is a model of how psychologists need to let data supersede ideology and vested intellectual convictions, and change direction when the evidence demands," said Carol Tavris, Mar 09.




"Julian Jaynes’s 1976 cult classic The Origin of Consciousness in the Breakdown of the Bicameral Mind, which makes the startling claim that subjective consciousness (in the sense of internalised mind space) arose a mere 3000 years ago through the development of metaphorical language, a process itself driven by increasing social and cultural complexity. Some of the historical and classical scholarship may be dubious and the neuropsychology is sketchy, but the book is a wonderful imaginative achievement, a pioneering attempt to fuse ancient history, psychology and neuroscience," said Pauls Broks, Apr 09.




Attachment by John Bowlby. "We are much more likely to read critiques of Bowlby’s theories than the original work. Although his views had a negative impact on the lives of women after the Second World War by putting pressure on mothers to stay at home with their children, he writes beautifully and compellingly about the interactions between infants and their mother. This aspect of his work has been lost to those not closely involved with the study of attachment relationships," said Susan Golombok, Aug 09.




Semrad: The Heart of a Therapist. "This book is a collection of quotes and anecdotes from Elvin Semrad, a psychiatrist who practised in the United States in the second half of the 20th century. He consistently emphasised that the first and most important task of the trainee practitioner is to learn to sit with the patient, listen to and hear them, and help to stand the pain they could not bear alone. Semrad wouldn’t have agreed with my choice here as he believed ‘the patient is the only textbook we need’," said David Lavallee, Jun 10.




The Republic, Plato. "It covers so many aspects of social organisation, reminding us that the fundamental questions have been addressed, just as we go on addressing them," said Margaret McAllister, Feb 11.




"B.F. Skinner’s The operational analysis of psychological terms (Psychological Review 52, 270–277, 1945) is rarely read and even less often understood. Contrary to some misrepresentations of his position, Skinner never doubted that we can describe internal states such as thoughts or emotions, but he wondered how we are able to do this. His answer was surprising, relevant to the practice of psychotherapy, and a challenge to all those who (like some unsophisticated therapists) assume that we can know our own feelings by a simple process of self-inspection," said Richard Bentall, Apr 11.






"Michael Rutter’s Genes and Behaviour. It eloquently and effortlessly gives a comprehensive overview of behaviour genetics." Essi Viding, September 2011.




"I think The Selfish Gene by Richard Dawkins was probably the most influential book I read as a student that really shaped my thinking as a psychologist and a human." Bruce Hood, Jan 12



"The Man with a Shattered World by A.R Luria. It’s about a Russian soldier who sustained a severe brain injury in World War II. The effort, hope and commitment shown by this man had a big impact on me." Barbara Wilson, February 12



"The two volumes of William James’s The Principles of Psychology. Martin Conway, March 2012.
 
"The maxims of the Duc de la Rochefoucauld. Perhaps it’s best to read whatever excites your curiosity at the time. But it might well bore you later on." Richard Hallam June 2012



"Advice for a Young Investigator by Ramon y Cajal." Hugo Spiers September 2012



"Jonathan Haidt’s The Righteous Mind shows how you can tackle really important themes in a way that is both elegant and precise." Guy Claxton October 2012



"Michel Foucault, Madness and Civilisation: A History of Insanity in the Age of Reason (1967, Tavistock)." From Jane Ussher December 2012



"The Interpretation of Dreams by Freud. It was the first psychology book I read, having borrowed it from the library at the age of 15. By the end of it I was convinced dreams were the ‘royal road to the unconscious’ and that psychology was for me." Dame Glynis Breakwell January 2013



"William James’ Principles of Psychology because it shows how good psychologists can be outside a narrow box of thinking." Shivani Sharma March 2013



"Paul Johnson’s book The Intellectuals (1998) in which he looks at the difference between what the great thinkers – Marx, Rousseau, Ibsen, Russell and many more – said and what they actually did. Insightful, inspiring, an exemplar of good historical research and profound." Stephen Murgatroyd June 2013




"Man’s Search for Meaning. Viktor E. Frankl. A book that might not offer the answer – but it certainly offers an answer to the vexed question of how to live." Frank Tallis September 2013




"Thinking, Fast and Slow, a dual processing account of decision making from Nobel Prize laureate Daniel Kahneman. Accessible, a treasure trove of knowledge, and a real fun read." Shira Elqayam March 2014



"An essay written in 1946 by George Orwell called ‘Politics and the English language’. Like politics, psychology can sometimes fall victim to its own jargon and conceptual confusions. This essay helps you to write well." Valerie Curran September 2014



"The Superstition of the Pigeon by B.F. Skinner: a classic told delightfully." Aleks Krotoski January 2015



"Freud’s Civilisation and Its Discontents. I think this is one of the seminal books in applied psychology, although it is classified as political philosophy. I first read it as a sixth-former in Ceylon. Coming from a Buddhist background my world-view was not so different from Freud’s (‘eros’ and ‘thanatos’).  Like Freud, I became an atheist and freethinker." Migel Jayasinghe, February 2015


"One Flew Over the Cuckoo’s Nest, because it reminds us that the power of the ‘expert’ is easy to abuse." Jo Silvester, October 2015









Sunday, May 15, 2016

Lucifer in Starlight -- George Meredith (1828-1909)





On a starred night Prince Lucifer uprose.
Tired of his dark dominion swung the fiend
Above the rolling ball in cloud part screened,
Where sinners hugged their spectre of repose.
Poor prey to his hot fit of pride were those.
And now upon his western wing he leaned,
Now his huge bulk o'er Afric's sands careened,
Now the black planet shadowed Arctic snows.
Soaring through wider zones that pricked his scars
With memory of the old revolt from Awe,
He reached a middle height, and at the stars,
Which are the brain of heaven, he looked, and sank.
Around the ancient track marched, rank on rank,
The army of unalterable law.


Saturday, May 14, 2016

Maps -- Postmodern Jukebox, ft. Morgan James








I miss the taste of the sweet life
I miss the conversation
I'm searching for a song tonight
I'm changing all of the stations
I like to think that we had it all
We drew a map to a better place
But on that road I took a fall
Oh baby why did you run away?
I was there for you
In your darkest times
I was there for you
In your darkest nights
But I wonder where were you
When I was at my worst
Down on my knees
And you said you had my back
So I wonder where were you
All the roads you took came back to me
So I'm following the map that leads to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Following, following, following to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Following, following, following
I hear your voice in my sleep at night
Hard to resist temptation
'Cause something strange has come over me
Now I can't get over you
No I just can't get over you
I was there for you
In your darkest times
I was there for you
In your darkest nights
But I wonder where were you
When I was at my worst
Down on my knees
And you said you had my back
So I wonder where were you
All the roads you took came back to me
So I'm following the map that leads to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Following, following, following to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Oh oh oh
Oh oh oh
Yeah yeah yeah
Oh oh
Oh I was there for you
Oh In your darkest times
Oh I was there for you
Oh In your darkest nights
Oh I was there for you
Oh In your darkest times
Oh I was there for you
Oh In your darkest nights
But I wonder where were you
When I was at my worst
Down on my knees
And you said you had my back
So I wonder where were you
All the roads you took came back to me
So I'm following the map that leads to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Following, following, following to you
The map that leads to you
Ain't nothing I can do
The map that leads to you
Following, following, following
Songwriters
BENJAMIN JOSEPH LEVIN, ADAM LEVINE, AMMAR MALIK, RYAN B. TEDDER, NOEL PATRICK ZANCANELLA


Friday, May 13, 2016

Grit is just Conscientiousness, get over it

The real problem with Angela Duckworth's "grit" is that she implies it is "growable," that one can somehow become "grittier" through training, parenting, or life experience. There is absolutely no evidence to support that assertion. Personality traits are, by definition, stable characteristics: psychologists can't turn introverts into extroverts, and they can't turn low-conscientiousness people into high-conscientiousness people. And that's too bad, because except for general intelligence (as measured by IQ tests), nothing predicts academic, professional, or personal success as well as conscientiousness.




Slate
"By the 1980s and the 1990s, lumpers in psychology had embraced a grand unified theory of personality, which collapsed all the nuances that came before into a set of supertraits—the Big Five. Under this new system, grit and all its near and distant cousins—willpower, superego strength, industriousness, and so on—would fall under an umbrella factor known as “conscientiousness.” (The remaining four of the Big Five supertraits: extraversion, agreeableness, neuroticism, and openness to experience.) Like grit, conscientiousness could be measured with a survey: a set of statements, maybe several hundred, for a person to read and then assign himself a score. (There are other ways to measure personality: A psychologist might ask people, for example, whether they engage in specific behaviors such as making lists or showing up early for meetings.)
“[The Big Five] brought clarity to a true buzz of confusion,” Roberts says, and it allowed researchers to make bigger claims about the broad significance of character. A measure of someone’s conscientiousness, for example, could help predict her longevity and physical health, as well as her marital stability. It could also tell you how likely she would be to find success in high school, college, and the workplace. But if the adoption of the Big Five proved useful in the lab, it made the science of personality harder to explain to outsiders. “When I say, conscientiousness,” says Roberts, “people go, ‘Huh?’ ”
That’s why Duckworth worked so hard to give her measure a catchy name. “I came up with it over other terms like pluck, tenacity, persistence, perseverance,” she said during one interview. “It has the connotations that I wanted. It sounds good.” It’s true: Conscientiousness comes off as something weak—a nerdy way of playing by the rules; grit suggests a vigorous, old-fashioned form of virtue. Grit’s the antidote for an overpolished age, a return to rough-hewn authenticity. “It’s brilliant in terms of marketing,” says Roberts. “People understand it immediately.”
Grit the measure and Grit the book are clearly triumphs of rebranding. It’s not as easy to discern whether Duckworth has produced something more than that—a set of new and substantive ideas to match her innovative presentation. To put this another way: Is she the Alice Waters of psychology, the leader of a revolution, or is she the field’s Rick Mast, more a pioneer of pretty packaging?
A brand-new meta-analysis of the literature on grit—conducted by researchers Marcus Credé, Michael Tynan, and Peter Harms using 88 samples and 67,000 subjects—provides some clues. There isn’t much space between Duckworth’s measure and conscientiousness, the study argues. If you test a group of people for both traits, administering standard surveys to measure grit and conscientiousness, the results will end up very tightly linked; in some studies their relationship approaches 1-to-1. In Roberts’ view, grit corresponds very closely to a facet, or subtrait, of conscientiousness that has for many years been called industriousness."


See also: Grit - Angela Duckworth's TED talk 
















Thursday, May 12, 2016

The Law of Unintended Consequences: Microaggression Training











Why not give everyone on campus the Implicit Association Test and kick out those who score in the bottom quartile? Oh yeah, because the IAT is more of a game than a test, in that scores on the IAT have rarely been shown to correlate with real-world racist behavior. Which is why it can't be used to screen out racist police officer candidates, etc. (There is at least one study showing external validity, and it's really interesting.)




Jonathan Haidt & Lee Jussim, WSJ


"In the past few years, a new approach has gained attention and become a common demand of campus protesters: microaggression training. Microaggressions are defined as brief and commonplace daily indignities, whether intentional or not, that make people of color feel denigrated or insulted. The idea covers everything from asking someone where they are from to questioning the merits of affirmative action during a classroom discussion.
But microaggression training is likely to backfire and increase racial tensions. The term itself encourages moralistic responses to actions that are often unintentional and sometimes even well-meaning. Once something is labeled an act of aggression, it activates an oppressor-victim narrative, which calls out to members of the aggrieved group to rally around the victim. As the threshold for what counts as an offense falls ever lower, cross-racial interactions become more dangerous, and conflict increases.
Protesters also have demanded that microaggression training be coupled with anonymous reporting systems and “bias response teams.” Students are encouraged to report any instance when they witness or suffer a microaggression. It is the “see something, say something” mind-set, transferred from terrorism threats to conversational blunders and ambiguities.
But such systems make it far more important to keep track of everyone by race. How would your behavior change if anything you said could be misinterpreted, taken out of context and then reported—anonymously and with no verification—to a central authority with the power to punish you? Wouldn’t faculty and students of all races grow more anxious and guarded whenever students from other backgrounds were present?"
In other words, to paraphrase the authors of the article, in an environment in which everyone is hyper-alert for perceived microaggressions (and unforgiving/punitive regarding them), wouldn't it be safest for majority students and faculty to avoid interactions with minority students as much as possible? And since cross-group interactions are the best way to reduce inter-group conflict, isn't that self-defeating?