
Thursday, October 10, 2013

Average is Over: Morlocks and Eloi

From Michael Barone's review of Tyler Cowen's near-futurist Average is Over. Cowen is the author of the blog Marginal Revolution.

The big winners in the economy he foresees will be those who can work with and harness machine intelligence and those who can manage and market such people.
Such "hyperproductive" people, about 15 percent of the population, will be wealthier than ever before. Also doing well will be those providing them personal services. [College professors and psychotherapists?]
For jobs lower down on the ladder, there will be a premium on conscientiousness. That's good for women and bad for men, who are more likely to do things their own way. [Meaning that men are more likely to have ADHD, alcoholism, and authority problems.]
Middle-level jobs, Cowen says, are on the way out. He argues that many of those laid off after the financial crisis were "zero marginal product" workers. They weren't producing anything of value and employers won't replace them.
Upward mobility will still be possible, he says, thanks to machine-aided education, which can spot talent in unlikely places. But I think he overestimates how likely that will be. [The SAT was created to spot talent in unlikely places, e.g. Nebraska farm kids.]
Assortative mating (people marrying similar people) and the considerable heritability of intelligence mean that many or most of those with the talents to get to the top will start out there. A fair society, ironically, may have less social mobility.

 
How will this society handle the pending fiscal shortfall? Cowen's prediction: by raising taxes a bit (but it's hard to get more out of rich, clever people), cutting Medicaid (the poor are a weak constituency) [I suppose that means Obamacare, too], maintaining aid to the elderly (a strong constituency) and squeezing employees (by imposing mandates on employers that will reduce cash income).
Those at the bottom will move to cheaper places like Texas. He recommends beans and tortillas as a delicious and nutritious diet (as in "An Economist Gets Lunch").
Much of this is already happening to some extent. Our most liberal areas (New York, the Bay Area) have the greatest income disparities. Drive down Middlefield Road in Silicon Valley and in one mile you go from $4 million walled mansions to what looks like rural Mexico.
Barone digresses a bit here, for no apparent purpose. He seems to agree with Cowen's predictions but wants to temper them with an insouciant "don't worry about the bottom 85% -- they might be desperately poor, but they'll be happy with their families and tortillas."
Brookings's William Galston, writing in the Wall Street Journal, feels "justified revulsion" at this. He accuses Cowen of "moral indifference." I would accuse him of focusing too narrowly on economics.
People get satisfaction out of more than just earning money. They get satisfaction out of what American Enterprise Institute president Arthur Brooks calls earned success.
Earned success can come from high earnings or from simply doing a job well. It can come from raising children and meeting family obligations.
It can come from working with people in your community or your church, or with others with common interests. Even people of very limited abilities can earn success and live fulfilling lives.
Cowen predicts the masses won't revolt. They will have comfortable lives and good entertainment -- bread and circuses.
I suspect he's right.

Source: Real Clear Politics




Wednesday, September 25, 2013

Alison Gopnik -- Genes and Intelligence


Wall Street Journal
We all notice that some people are smarter than others. You might naturally wonder how much these differences in intelligence depend on genes or upbringing. But that question, it turns out, is impossible to answer.
That's because changes in our environment can actually transform the relationship among our traits, our upbringing and our genes.
Okay, gene x environment interactions (Genotype x Environment = Phenotype). But how much of the variance in a population can be attributed to genes versus environment is not an "impossible" question. What she is avoiding saying right off the bat is that most of the variation in intelligence observed in people from middle-class (or better) backgrounds is attributable to genetic variation.
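To make that concrete, here is a minimal sketch of the variance-partitioning logic behind heritability estimates. The numbers are hypothetical, chosen only to illustrate the bookkeeping, not to report any actual estimate for IQ.

```python
# Hypothetical variance components for an IQ-like trait (illustrative numbers only).
var_genetic = 0.6        # variance attributable to genetic differences
var_shared_env = 0.2     # shared (family) environment
var_nonshared_env = 0.2  # nonshared environment plus measurement error

var_phenotype = var_genetic + var_shared_env + var_nonshared_env

heritability = var_genetic / var_phenotype
print(f"h^2 = {heritability:.2f}")  # 0.60: 60% of the observed variation is genetic
```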


From Steve Hsu's blog
 
The textbook illustration of this is a dreadful disease called PKU. Some babies have a genetic mutation that makes them unable to process an amino acid in their food, and it leads to severe mental retardation. For centuries, PKU was incurable. Genetics determined whether someone suffered from the syndrome, which gave them a low IQ.
Then scientists discovered how PKU works. Now, we can immediately put babies with the mutation on a special diet. Whether a baby with PKU has a low IQ is now determined by the food they eat—by their environment.
We humans can figure out how our environment works and act to change it, as we did with PKU. So if you're trying to measure the relative influence of human nature and nurture, you have to consider not just the current environment but also all the possible environments that we can create.
Gotta love the PKU example. A great way to illustrate G x E interaction. However, it is one of the rarest causes of low intelligence (although, barring the dietary intervention, it will lead to mental retardation). Don't make the mistake of thinking that if you observe someone with low intelligence, they are a victim of PKU. The rate of babies born with PKU is about 1 in every 15,000 live births.

This doesn't just apply to obscure diseases. In the latest issue of Psychological Science, Timothy C. Bates of the University of Edinburgh and colleagues report a study of the relationship among genes, SES (socio-economic status, or how rich and educated you are) and IQ. They used statistics to analyze the differences between identical twins, who share all DNA, and fraternal twins, who share only some.
When psychologists first started studying twins, they found identical twins much more likely to have similar IQs than fraternal ones. They concluded that IQ was highly "heritable"—that is, due to genetic differences. But those were all high SES twins. Erik Turkheimer of the University of Virginia and his colleagues discovered that the picture was very different for poor, low-SES twins. For these children, there was very little difference between identical and fraternal twins: IQ was hardly heritable at all. Differences in the environment, like whether you lucked out with a good teacher, seemed to be much more important.

Identical twins are clones -- creepy, huh? 
In the new study, the Bates team found this was even true when those children grew up. IQ was much less heritable for people who had grown up poor. This might seem paradoxical: After all, your DNA stays the same no matter how you are raised. The explanation is that IQ is influenced by education. Historically, absolute IQ scores have risen substantially as we've changed our environment so that more people go to school longer.
I probably would have summarized this research differently. Among twins from decent backgrounds, there is a high heritability for intelligence. Among very poor twins, there is little to no heritability. The awfulness of their environment wipes out whatever chance there was for higher intelligence. This is not the same as saying that better schools yield higher IQ. It does suggest that the genes associated with higher intelligence cannot flourish in poor environments. We are talking about IQ being suppressed by an impoverished environment. [Another news report here, quoting Eric Turkheimer from UVa., but getting itself confused with "nature v. nurture".]
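One quick way to see how heritability can differ by SES is Falconer's formula, which estimates heritability from the gap between identical (MZ) and fraternal (DZ) twin correlations. The correlations below are invented for illustration; they are not the estimates reported by Bates or Turkheimer.

```python
def falconer_h2(r_mz, r_dz):
    """Rough heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

# Invented twin correlations, not figures from the studies discussed above.
print(falconer_h2(r_mz=0.80, r_dz=0.45))  # higher-SES twins -> h^2 around 0.70
print(falconer_h2(r_mz=0.60, r_dz=0.55))  # low-SES twins    -> h^2 around 0.10
```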

Richer children have similarly good educational opportunities, so genetic differences among them become more apparent. And since richer children have more educational choice, they (or their parents) can choose environments that accentuate and amplify their particular skills. A child who has genetic abilities that make her just slightly better at math may be more likely to take a math class, so she becomes even better at math.
But for poor children, haphazard differences in educational opportunity swamp genetic differences. Ending up in a terrible school or one a bit better can make a big difference. And poor children have fewer opportunities to tailor their education to their particular strengths.
How your genes shape your intelligence depends on whether you live in a world with no schooling at all, a world where you need good luck to get a good education or a world with rich educational possibilities. If we could change the world for the PKU babies, we can change it for the next generation of poor children, too.
Here's the kicker. What happens when we give everyone the same environment? Everyone gets a two-parent home in which at least one parent is employed and both are non-drug-using, non-criminal high school graduates. Everyone goes to schools of the same quality. We will still see variations in intelligence (that is, some people will still be smarter than others). But then 100% of the variation in intelligence will be attributable to genetic variation in the population. Control for environment and the only explanation left for observed differences is genetic.
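A toy simulation makes the same point: hold the genetic spread fixed, shrink the environmental spread, and the share of phenotypic variance explained by genes climbs toward 100%. The model and numbers are purely illustrative, not an empirical claim about IQ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
genes = rng.normal(0, 15, n)  # fixed genetic spread (IQ-point scale, illustrative)

for env_sd in (15, 8, 1):  # progressively more uniform environments
    environment = rng.normal(0, env_sd, n)
    phenotype = 100 + genes + environment
    genetic_share = np.var(genes) / np.var(phenotype)
    print(f"environment SD = {env_sd:>2}: genetic share of variance = {genetic_share:.2f}")
```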



Footnote:

A version of this article appeared September 21, 2013, on page C2 in the U.S. edition of The Wall Street Journal, with the headline: Good Genes Only Get You So Far With Intelligence. It was released online with a different, more accurate headline: Poverty Can Trump a Winning Hand of Genes.


Friday, June 7, 2013

Supreme Court allows warrantless collection of DNA


 
Gattaca: A soon-to-be True Story


Harvard Law professor Noah Feldman, writing on Bloomberg.com:

The day that DNA cheek swabs officially became the new fingerprints deserves to be marked and remembered -- and not just because of the inevitable march of technology.
No, the Supreme Court’s 5-4 holding today in Maryland v. King, that anyone arrested for a “serious crime” can have his or her DNA taken without any suspicion, is a landmark because it represents a major step toward a “Gattaca” world. This means that evidence of a crime can be collected without any particular suspicion, avoiding the pesky requirement of a warrant that the Founding Fathers thought would give us liberty and privacy.
Justice Anthony Kennedy’s majority opinion treats the standard collection of DNA samples from arrestees in Maryland as the logical outgrowth of the state’s interest in identifying the people it has arrested. This is a bit of a surprise from Kennedy, who can generally be counted on to embrace liberty. Yet in this case, he wrote, the state’s interest in keeping track of everyone it has arrested can be satisfied more accurately by DNA than by fingerprinting. And the swab of the cheek is, he said, little more invasive than a fingerprint.
If DNA sampling was actually like fingerprinting, this argument might be convincing. But of course it isn’t. Fingerprints are a phenotype that reveals nothing except a random pattern that no two individuals share. DNA, however, is your genotype: the blueprint for your entire physical person. If the government has my fingerprints, it’s like they have my randomly assigned Social Security number. If it has my DNA, it’s like they have the entire operating system.
 
 
 
 I had no memory of Gore Vidal being in this movie.
 
Full Blueprint
That DNA is a full blueprint matters in two major ways: The first and most basic is that when the state possesses genetic information, it can -- and in the future, almost certainly will -- know vast amounts about the person whose genes are typed. The court said this wasn’t a worry because Maryland law prohibits the use of DNA information beyond identification. But in a world where every arrestee is sampled, how long will that legal principle last?
Yet it was the second concern that exercised Justice Antonin Scalia in his pungent dissent. Ordinarily, Scalia explained, a search can be performed only on probable cause and with a warrant. Fingerprints are not a search. But DNA is a search, and for a very important reason: The DNA of the petitioner, Alonzo Jay King Jr., was used to link him to the rape he was accused of committing. In other words, said Scalia, the purpose of the search and the swab wasn’t to identify the accused with a unique marker. It was to solve a crime in question.
If Scalia’s distinction seems subtle, it shouldn’t. Fingerprints are generally used for bureaucratic identification and only occasionally to solve a crime -- when the criminal has been careless enough to leave them behind. DNA, by contrast, hasn’t, thus far, been used for bureaucratic identification. It is useful primarily for solving crimes, since it is almost impossible not to leave some DNA behind wherever we may go and whatever we may do. To prove the point, Scalia demonstrated that King’s DNA sat around for weeks before being analyzed -- and was eventually analyzed to solve a crime, not to keep track of the criminal.
What is remarkable, then, is that even Justice Scalia -- joined by the liberals Ruth Bader Ginsburg, Elena Kagan and Sonia Sotomayor -- thinks there would be nothing wrong with sampling every arrestee’s DNA if the purpose really were just to keep tabs on them. The constitutional objection focuses on what the DNA is actually used for. However, these two functions -- bureaucratic identification and crime solving -- can probably never be fully separated in the real world. As technology improves, the DNA database could be employed to solve crimes even if its primary purpose were just to be for bureaucratic classification. The reason, again, is the nature of DNA itself, which is not only unique but also oozes from our every pore.



The Supreme Court opinions (both majority and dissent) can be found right here. If you have never read a Supreme Court decision, give it a try; they are usually remarkably clear and even entertaining. Antonin Scalia is a fantastic writer. Here's another commentator, from the Cato Institute blog:

If there’s ever a time when Antonin Scalia really rises to the occasion, it’s when he serves as the Supreme Court’s liberal conscience….
[A]long with the good [from DNA testing] comes a new potential, warned against by civil libertarians, for the authorities to use DNA access to track citizens through life. Who was at the closed-door meeting of political dissidents? Swab the discarded drinking cups for traces of saliva, match it to a universal database, and there you’ve got your list of attendees. Want to escape a bad start and begin life over in a different community? Good luck with that once your origins are an open book to officialdom.
In his dissent, Scalia warns of such a “genetic panopticon.” (The reference is to Jeremy Bentham’s idea of a prison laid out so that inmates could be watched at every moment.) And it’s closer than you may think. Already fingerprint requirements have multiplied, as the dissent points out, “from convicted criminals, to arrestees, to civil servants, to immigrants, to everyone with a driver’s license” in some states. DNA sample requirements are now following a similar path, starting reasonably enough with convicts before expanding, under laws passed by more than half the states as well as Maryland, to arrestees. (“Nearly one-third of Americans will be arrested for some offense by age 23.”) Soon will come wider circles. How long before you’ll be asked to give a DNA swab before you can board a plane, work as a lawn contractor, join the football team at your high school, or drive?
With the confidence that once characterized liberals of the Earl Warren–William Brennan school, Scalia says we can’t make catching more bad guys the be-all and end-all of criminal process:
Solving unsolved crimes is a noble objective, but it occupies a lower place in the American pantheon of noble objectives than the protection of our people from suspicionless law-enforcement searches. The Fourth Amendment must prevail. … I doubt that the proud men who wrote the charter of our liberties would have been so eager to open their mouths for royal inspection.
 
 

Thursday, June 6, 2013

"I don't know why we are here..."

"...but I'm pretty sure that it is not in order to enjoy ourselves."

-- Ludwig Wittgenstein*


 


I wonder how much it would cost to have that epigram carved onto every high school in the United States?

I don't pretend to understand the work of Wittgenstein, who is generally regarded (perhaps on faith?) as the greatest philosopher of the 20th century. He was an intriguing person, regardless of his philosophical work; his family history of suicide and mental illness is remarkable. His father Karl was a speculative, innovative genius in business and industry and made himself one of the richest men in Europe. Ludwig had four older brothers, three of whom (Rudi, Hans, and Kurt) died by suicide. The surviving brother and Ludwig himself both had recurring episodes of suicidal depression. Ludwig said that he first thought about suicide when he was 10 years old. From an interesting New Yorker review of a family biography:

Sometime in 1901, Hans fled from his father and went to America, much as his own father had done thirty-six years earlier. In 1902, he disappeared, by most accounts, from a boat, which may have been in the Chesapeake Bay, perhaps on the Orinoco River in Venezuela, or in several other places. Wherever it was, no one doubted that he had committed suicide. Hans’s disappearance was a banned topic.
Rudi was a twenty-two-year-old chemistry student in Berlin when he walked into a bar on a May evening in 1904, requested a sentimental song from the pianist, and then mixed potassium cyanide into a glass of milk and died in agony. The suicide note left for his parents said that he had been grieving over the death of a friend. A more likely explanation is that he thought he was identifiable as the subject of a published case study about homosexuality. After Rudi’s funeral, Karl forbade the family to mention him ever again. Waugh thinks that this enforced silence, which the dutiful Mrs. Wittgenstein supported, created a permanent rift between parents and children. The exact circumstances of Kurt’s suicide, which took place on the Italian front in 1918, are unknown. He was generally regarded as cheerful, but Hermine recorded that he seemed to carry “the germ of disgust for life within himself.”
At one point in his life, Wittgenstein toyed with the idea of becoming a psychoanalyst. He thought highly of Freud's Interpretation of Dreams, Augustine's Confessions, and Joyce's Portrait of the Artist as a Young Man. I would contend that all three of those books are essential reading for aspiring clinical psychologists. Judging from the letter below, I suspect that Wittgenstein himself had some formidable instincts for the healing arts.

When [Wittgenstein's close friend] Drury was in his first period of hospital residence he was dismayed by his ignorance and clumsiness. He told Wittgenstein that perhaps it had been a mistake for him to become a doctor. The next day he received a letter:
You said in the Park yesterday that possibly you had made a mistake in having taken up medicine: you immediately added that probably it was wrong to think such a thing at all. I am sure it is. But not because being a doctor you may not go the wrong way, or go to the dogs, but because if you do, this has nothing to do with your choice of a profession being a mistake. For what human being can say what would have been the right thing if this is the wrong one? You didn’t make a mistake because there was nothing at the time you knew or ought to have known that you overlooked ... The thing now is to live in the world in which you are, not to think or dream about the world you would like to be in. Look at people’s sufferings, physical and mental, you have them close at hand, and this ought to be a good remedy for your troubles ... Look at your patients more closely as human beings in trouble and enjoy more the opportunity you have to say ‘good night’ to so many people. This alone is a gift from heaven which many people would envy you.
In 1941 Drury was posted to the Middle East. Wittgenstein came to Liverpool to say goodbye to him, and presented him with a silver drinking cup. Wittgenstein: ‘Water tastes so much nicer out of silver. There is only one condition attached to this gift: you are not to worry if it gets lost.’ Later in the war Drury was posted back to England to be a medical officer in a landing craft in the Normandy invasion. When he came to say goodbye one remark of Wittgenstein’s was: ‘If it ever happens that you get mixed up in hand to hand fighting, you must just stand aside and let yourself be massacred.’
(Source)

I'm still not sure that I fully comprehend that last statement. Was Wittgenstein making a general statement on ethics or was that directed to Drury personally? In any event, it has been stuck in my head for months. I am leaning towards the position that he might very well be right.






*As quoted in The Beginning of the End (2004) by Peter Hershey, p. 109





Monday, March 11, 2013

Marshmallow Test

This is probably one of the best known experiments in psychology, thanks to the viral video below. You put a four-year-old in a room alone, present him with a marshmallow (or pretzel, or cookie, depending on the kid's preferences), and tell him that you are going to step out of the room and that if the treat is still uneaten when you return, you will give him twice as many treats. The dependent variable here is how long the kid waits before eating the treat (the experimenter returned after 15 minutes; the kids were watched through a two-way observation mirror).

Here's the original research article.




Cute kids, right?

But the story takes a darker turn (as most good stories do).

Years later, the original researcher, Walter Mischel, tracked down the original participants and found that "teenagers who had waited longer for the marshmallows as preschoolers were more likely to score higher on the SAT, and their parents were more likely to rate them as having a greater ability to plan, handle stress, respond to reason, exhibit self-control in frustrating situations and concentrate without becoming distracted." [Source: APA]

"Recently, [researchers] tracked down 59 subjects, now in their 40s, who had participated in the marshmallow experiments as children. The researchers tested the subjects’ willpower strength with a laboratory task known to demonstrate self-control in adults.

Amazingly, the subjects’ willpower differences had largely held up over four decades. In general, children who were less successful at resisting the marshmallow all those years ago performed more poorly on the self-control task as adults.

Additionally, Casey and colleagues examined brain activity in some subjects using functional magnetic resonance imaging. When presented with tempting stimuli, individuals with low self-control showed brain patterns that differed from those with high self-control. The researchers found that the prefrontal cortex (a region that controls executive functions, such as making choices) was more active in subjects with higher self-control. And the ventral striatum (a region thought to process desires and rewards) showed boosted activity in those with lower self-control.

Research has yet to fully explain why some people are more sensitive to emotional triggers and temptations, and whether these patterns might be corrected."

...

"Amazingly"? Really? The American Psychological Association is "amazed" by these findings?

Adj. 1. amazed - filled with the emotional impact of overwhelming surprise or shock; "an amazed audience gave the magician a standing ovation"; "I stood enthralled, astonished by the vastness and majesty of the cathedral"; "astounded viewers wept at the pictures from the Oklahoma City bombing"; "stood in stunned silence"; "stunned scientists found not one but at least three viruses"
surprised - taken unawares or suddenly and feeling wonder or astonishment; "surprised by her student's ingenuity"; "surprised that he remembered my name"; "a surprised expression"


Why wouldn't you expect the 4-year-olds with higher willpower to have higher willpower as 40-year-olds? We have known for over a century that kids who measure in the top percentiles of intelligence at age 5 are likely to be found in the top percentiles at age 15 and again at age 50. For decades we have known that fearful toddlers are more likely to grow up to be shy and inhibited adults. Why would this trait (call it "impulse control") be any different?

By the way, please note that it would not be surprising if the differential brain functioning observed in adulthood was also noted in childhood. In other words, the brains of the two sets of kids (high impulse control versus low impulse control) were already different from each other at age 4 years.


This is kind of amazing, I admit:

"The child who could wait fifteen minutes had an S.A.T. score that was, on average, two hundred and ten points higher than that of the kid who could wait only thirty seconds."

So, the marshmallow test is an IQ test? We're not sure -- we would need to have cognitive ability scores at age 4, which were not collected. If the 4-year-olds' cognitive ability correlated very strongly with how long they delayed eating the treat, then we needn't talk about impulse control at age 4 predicting SAT scores, substance abuse, etc. We already know that IQ predicts those things. The conceptual question is: Is impulse control a facet of general intelligence? The practical question is: Can impulse control be modified? (We know that IQ can't be modified, except in the wrong direction -- wear a helmet when you snowboard!)
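In principle, the "is it just IQ?" question is a partial-correlation question: how much of the wait-time/SAT relationship survives once age-4 cognitive ability is held constant? Here is a sketch of that calculation on made-up data, since the age-4 ability scores were never collected.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation of x and y with the linear influence of z removed from both."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

# Entirely simulated data: wait time and SAT both driven partly by one latent ability.
rng = np.random.default_rng(1)
ability = rng.normal(size=500)
wait_time = 0.6 * ability + rng.normal(size=500)
sat = 0.7 * ability + rng.normal(size=500)

print(np.corrcoef(wait_time, sat)[0, 1])      # raw correlation looks impressive
print(partial_corr(wait_time, sat, ability))  # near zero once ability is controlled
```

If the real data looked like this simulation, the marshmallow test would just be a noisy IQ test; if the partial correlation held up, impulse control would be doing independent predictive work.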

Well, impulse control can be modified, by ingesting Ritalin, Adderall, Vyvanse, or any other ADHD medication. Can it be taught? We don't know.

Be careful when thinking about the "good parents" you know who "teach" their kids "self-control" through their parenting behaviors. A big part of why those kids end up with high self-control is because they are the biological offspring of people with high self-control.

Which brings us to this:

Anokhin, A.P., Golosheykin, S., Grant, J.D., & Heath, A.C. (2011). Heritability of delay discounting in adolescence: A longitudinal twin study. Behavior Genetics, 41(2), 175-183.

Abstract


Delay discounting (DD) refers to the preference for smaller immediate rewards over larger but delayed rewards, and is considered to be a distinct component of a broader “impulsivity” construct. Although greater propensity for discounting the value of delayed gratification has been associated with a range of problem behaviors and substance abuse, particularly in adolescents, the origins of individual differences in DD remain unclear. We examined genetic and environmental influences on a real-life behavioral measure of DD using a longitudinal twin design. Adolescent participants were asked to choose between a smaller ($7) reward available immediately and a larger ($10) reward to be received in 7 days. Biometrical genetic analysis using linear structural equation modeling showed significant heritability of DD at ages 12 and 14 (30 and 51%, respectively) and suggested that the same genetic factors influenced the trait at both ages. DD was significantly associated with symptoms of conduct disorder, attention deficit hyperactivity disorder, substance use, and with higher novelty seeking and poor self-regulation. This study provides the first evidence for heritability of DD in humans and suggests that DD can be a promising endophenotype for genetic studies of addiction and externalizing disorders.
 
"This study provides the first estimate of the heritability of DD in humans. Using a real-money choice paradigm, this study demonstrated that individual differences in the discounting of delayed gratification among adolescents are moderately to strongly influenced by genetic factors at ages 12 and 14 respectively. The results also suggest an increasing role of genetic factors with age. Given that DD is considered to be one of the key components of a broader impulsivity trait (; ), the demonstration of a significant genetic component in DD has important implications for genetic studies of psychopathological conditions characterized by increased impulsivity, particularly those involving impulsive decision-making, such as addictions, pathological gambling, externalizing disorders, and ADHD.

...

Taken together, the present findings and previous literature suggest that DD may represent a core neurocognitive dysfunction contributing to a range of problem behaviors characterized by reduced sensitivity to the delayed consequences of one’s decisions and actions. The preference for immediate but smaller rewards may be a behavioral marker of underlying genetic liability to impaired decision making and could serve as a useful endophenotype for genetic studies of the etiology of substance use disorders."
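The heritability figures in the abstract come from a biometrical (ACE) twin model fit by structural equation modeling. A crude back-of-the-envelope version uses the classic identities below; the twin correlations are placeholders I picked to land near the reported 50% figure, not values from Anokhin et al.

```python
def ace_estimates(r_mz, r_dz):
    """Crude ACE decomposition from MZ/DZ twin correlations."""
    a2 = 2 * (r_mz - r_dz)  # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz    # shared-environment variance
    e2 = 1 - r_mz           # nonshared environment plus measurement error
    return a2, c2, e2

# Placeholder correlations, chosen only to illustrate the arithmetic.
a2, c2, e2 = ace_estimates(r_mz=0.55, r_dz=0.30)
print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")  # A = 0.50, C = 0.05, E = 0.45
```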


So those cute kids who just couldn't resist eating the marshmallow in the YouTube video? You're laughing at a neurocognitive dysfunction with a strong genetic basis. Here's what those kids look like as teenagers. And here's what they look like in their 40s.


....


Walter Mischel

This is an interesting article from the New Yorker about Walter Mischel and the marshmallow test. It mentions the various ways in which Mischel retarded the advance of personality science, e.g., declaring that personality traits couldn't be measured accurately, and that context was more important than personality.

From Wikipedia:

"In 1968, Mischel published the now classic monograph, Personality and Assessment, which created a paradigm crisis in personality psychology that changed the agenda of the field for decades. The book touched upon the problem in trait assessment that was identified by Allport back in 1937. Mischel showed that study after study failed to support the fundamental traditional assumption of personality theory, that an individual’s behavior with regard to a trait (e.g. conscientiousness, sociability) is highly consistent across diverse situations. Instead, Mischel's analyses revealed that the individual’s behavior, when closely examined, was highly dependent upon situational cues, rather than expressed consistently across diverse situations that differed in meaning.Mischel maintained that behavior is shaped largely by the exigencies of a given situation. That people act in consistent ways across different situations, reflecting an underlying consistency of personality traits, is a myth."

The only problem with that view is that it is wrong, insofar as it is usually interpreted to mean that the measurement of personality traits cannot yield accurate predictions of future behavior. It is ironic that Mischel is famous for two things: 1) saying that personality isn't consistent and traits don't predict future behavior; and 2) developing a personality test (the marshmallow test) that predicts future behavior (SAT scores, ADHD, substance abuse) and appears to assess a stable trait that is consistently demonstrated across a wide variety of situations (e.g., school, home, work).

The New Yorker article mentions work by Mischel and Angela Duckworth that proposes to increase impulse control. My guess is that those efforts will turn out about as well as prior attempts to improve IQ.

...

Finally, here's a summary of another interesting study that uses the marshmallow paradigm. In this one, the experimenters either break or fulfill a promise made to the kids. Whether the kids eat the marshmallow depends on whether the earlier promise had been broken or fulfilled. The researcher suggests that the reason poor kids eat the marshmallow (i.e., display low self-control) is that they have a history of broken promises and therefore less reason to believe the promises of adult experimenters. It's not because they are less intelligent or have genetically-mediated neurocognitive dysfunction.

Well, if Walter Mischel can start collaborating with behavioral geneticists, I can certainly tip my hat to the influences of early childhood environment. But you also have to acknowledge that promise breakers probably have low impulse control, and impulse control is at least partially genetic, so these kids are getting a double dose of vulnerability from their parents: the parents pass along a vulnerability to impulse control disorders, and provide a less-than-ideal environment that interacts with that genetic vulnerability.

Where people go wrong is when they neglect either side of the Genotype x Environment = Phenotype interaction. So, both the "broken promises" study and the twins study are interesting and important, but we have to integrate them in order to understand what's really going on.
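To close, here is a small, purely illustrative simulation of the Genotype x Environment idea from the paragraph above: the same genetic liability is expressed much more strongly in a "broken promises" environment than in a supportive one. The model and coefficients are invented for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
liability = rng.normal(size=n)  # genetic liability to low impulse control
poor_env = rng.random(n) < 0.3  # 30% land in a "broken promises" environment

# Invented model: the liability is only fully expressed in a poor environment.
impulsivity = 0.3 * liability + 1.2 * liability * poor_env + rng.normal(scale=0.5, size=n)

for mask, label in ((~poor_env, "supportive environment"), (poor_env, "poor environment")):
    r = np.corrcoef(liability[mask], impulsivity[mask])[0, 1]
    print(f"{label}: correlation of genetic liability with impulsivity = {r:.2f}")
```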