Why We Don't Learn from Experience

We don't learn from experience because we refuse to accept the cause and effect loops that are there in plain sight.

Cognitive Dissonance - When there is a discrepancy between reality as observed and one's worldview -- the perception of reality as constructed over one's lifetime -- people experience a sense of discomfort. Social psychologists call this uncomfortable feeling "cognitive dissonance." Theorists hypothesize that people will tend to reconstruct their perceptions of reality in ways that reduce the dissonance. Cognitive dissonance is a motivating state of tension that occurs whenever a person holds two psychologically inconsistent cognitions in mind.

Leon Festinger coined the term after observing a cult whose members believed they were recipients of a "revelation" from God about the end of times. The date of the apocalypse had been revealed to them through a prophet in exact terms: December 21, 19XX. Festinger stayed close to the group as the fateful day approached. When the end of times did not arrive, members displayed a surprising conviction that their belief system had not really been disproven. Rather, they tended to change their memories about what the prophet had said, and to find ways in which the revelation had indeed come true. Their belief system remained intact. Festinger noted that members appeared to find that admitting they'd been mistaken was too great a cost to bear in light of the emotional and life investments each had made in the group. In a nutshell, cognitive dissonance appears to be such a powerful motivator that people will distort their perception of reality in order to lessen or mitigate the discomfort.

Confirmation Bias -- Aronson notes that CD often leads to a "confirmation bias," characterized by both selective perception and selective memory.

Selective Perception - First, we see that individuals faced with an array of disparate information will "select" and attend to the tidbits of data that support their worldview, or set of expectations, and will ignore information that does not fit their expectations.

Selective and Distorted Memory - Second, studies of memory have shown that, over time, people's memories tend to change in ways that best "fit" their self-concept and general worldview.

T&A quote: p. 6 - "self-serving distortions of memory kick in and we forget or distort past events." We gradually come to believe our own lies the more we tell them. In fact, obvious misstatements of truth don't begin as full-blown lies. First a little, then a little more, and then more still. Quicksand is a good metaphor for this: you get sucked in very gradually.

They cite LBJ, pg 7 - "a president who has justified his actions to himself, believing that he has the truth, becomes impervious to self-correction." LBJ "had a fantastic capacity to persuade himself that the 'truth' which was convenient for the present was the truth..."

Self-Justification -- CD can also be employed to explain the strong tendency for people to engage in "self-justification." Tavris & Aronson: p. 29: "Dissonance is bothersome under any circumstance, but it is most painful to people when an important element of their self-concept is threatened -- typically when they do something that is inconsistent with their view of themselves."

T&A note (p 20) that the "confirmation bias sees to it that no evidence -- the absence of evidence -- is evidence for what we believe." P 2 - they note that "throughout his presidency GWB was the epitome of a man for whom even irrefutable evidence could not pierce his mental armor of self-justification." They note he was wrong in his assertion of WMD, a Saddam-Al Qaeda link, his estimate of the cost of the war, and his expectation that Iraqis would welcome the arrival of American soldiers with a joyful reception.

Lord Molson, British politician: (p. 17) --- "I will look at any additional evidence to confirm the opinion to which I have already come."

Many polls showed that many Americans believed WMD had been found in Iraq, for months and even years after the military had concluded that there was nothing there to find. (Find examples.)

Hypocrisy: Aldous Huxley said "there is no such thing as a conscious hypocrite." (p 5). They note that Ted Haggard did not dwell on the hypocrisy of railing against homosexuality while participating in a sexual relationship with a male prostitute.

We now find that the brain itself appears to shut down in the presence of unexpected or dissonant information. See Drew Westen's article "The Neural Basis of Motivated Reasoning." It appears that biases in perception are built into the way the brain works.

Once a decision is made, there are multiple mechanisms available for people to use to justify the rightness of the choice.

T&A point poignantly to the end of Casablanca. Rick admonishes Ilsa that "maybe not today, maybe not tomorrow, but soon, and for the rest of your life" she would regret making the wrong choice of staying in Casablanca with him. A noble ending, but not a correct observation of human nature. T&A postulate that Ilsa would have come to believe strongly in the rightness of her choice, no matter which direction she'd chosen -- the drive for self-justification is just that strong.

Look through the lens of dissonance theory -- Several studies have shown that the judgments of "experts" in many fields are no more sound than those of randomly chosen people. The difference is that the experts are supremely confident in their view, while others admit to doubts.

Moreover, experts are far more susceptible to distortions of perception due to CD because, unlike the average person, their professional and personal reputations are at stake. This suggests, of course, that experts may be less likely than non-experts to admit mistakes and learn from experience.

T&A - p 36 - the story of Jeb Magruder, Watergate culprit, shows he was a good and decent person entering into his relationship with Liddy and the White House. Slowly, a little at a time, Magruder went along with dishonest and illegal actions, justifying each one as he did.

Milgram's subjects followed the same pattern. Most would have refused to deliver the maximum jolt when they entered the situation, but as the voltage was built up slowly, they seemed to justify each shock because it was not much more than the last. The Milgram experiment is widely cited as showing that ordinary people will do vile and despicable things when convinced that they are doing something for the greater good.

pg. 43 - "Democrats will endorse an extremely restrictive welfare proposal, one usually associated with Republicans, if they think it has been proposed by the Democratic Party." p 43 - we are as unaware of our blind spots as fish are unaware of the water they swim in. These biases lead to wrong decisions through the confirmation bias, and then to continued justification of those decisions in order to avoid cognitive dissonance.

Merck funded the research on its drug Vioxx. Their own scientists did not uncover dangers, despite available data. Only independent scientists, neither funded by Merck nor having their own reputations at stake, exposed evidence of risks associated with the drug. Likely, Merck's investigators did not lie, nor did they "knowingly" distort their findings. Rather, they fell prey to cognitive biases that beset us all.

T&A p. 51 - Similarly, a group of scientists paid by a group of parents of autistic children produced a study showing a positive correlation between childhood autism and childhood vaccines. Six years later, 10 of the 13 researchers retracted some of the results, citing a conflict of interest by the lead author. Since then, five studies have found no causal relationship between vaccines and autism.

Us versus not-us -- pg 59 - "When things are going well, people feel pretty tolerant of other cultures and religions ... but when they are angry, anxious, or threatened, the default position is to activate their blind spots..." in a manner known as ethnocentrism: the belief that our own culture, nation, or religion is superior to all others. Stereotypes are bolstered by the self-justification bias. Prejudice, once accepted even to a small degree, is difficult to dislodge: cherry-picked pieces of data justify the initial belief, and prejudice grows a little bit at a time.

Hitler's henchman Albert Speer wrote in his memoirs: "people who turn their backs on reality are soon set straight by the mockery and criticism of those around them, which makes them aware they have lost credibility. In the Third Reich there were no such correctives, especially for those who belonged to the upper stratum. To the contrary, every self-deception was multiplied as in a hall of distorting mirrors, becoming a repeatedly confirmed picture of a fantastical dream world which no longer bore any relationship to the grim outside world. In those mirrors I could see nothing but my own face reproduced many times over." T&A refer to memory as "the self-justifying historian." pg 69

p 69 - Most of us neither intend to lie nor intentionally deceive. Rather, we are self-justifying. "All of us, when we tell our stories, add details and omit inconvenient facts; we give the tale a small, self-enhancing spin." Reinforced for the story, we embellish it even more upon the next telling. "At the simplest level, memory smoothes out the wrinkles of dissonance by enabling the confirmation bias to hum along, selectively causing us to forget the discrepant information about beliefs we hold dear."

"If mistakes were made, memory helps us remember that they were made by someone else." pg 70 - Anthony Greenwald refers to "the totalitarian ego" that ruthlessly destroys information it doesn't want to hear. [Find this XXX]

pg 71: great quote: "Confabulation, distortion, and plain forgetting are the foot soldiers of memory, and they are summoned to the front lines when the totalitarian ego wants to protect us from the pain and embarrassment of actions we took that are dissonant with our core self-images."

Nietzsche: "'I have done that,' says my memory. 'I cannot have done that,' says my pride, and remains inexorable. Eventually -- memory yields."

T&A page 71 - Memory is reconstructive: pieces of experience are rebuilt from different parts of the brain, much like a message sent over the internet is reconstructed from packets that travel through disparate pathways. As we rebuild the core memory, we are subject to the bias of our own theories. Upon repeated rebuilding, our story begins to look more and more as we'd have liked it to be. This relates to the "source confusion" phenomenon.

When people learn that their memories are wrong, they are stunned.

We shape memories to fit our life story, rather than vice versa. See the Barbara Tversky and Elizabeth Marsh study -- they show we "spin the stories of our lives." Memories change to fit the story, and this happens gradually, over time. Generally, memories change in the direction of self-enhancement.

See the story of the book Fragments, by Binjamin Wilkomirski. He created a false biography, but seemed to believe in the truth of his story. The book tells of his experiences in the Nazi death camps. There was a major problem with the story, though: as far as historians know, there were no "orphanages" in the Nazi concentration camps.

From Nassim Nicholas Taleb's The Black Swan: The Impact of the Highly Improbable. Memory is "a self-serving dynamic revision machine: you remember the last time you remembered the event and, without realizing it, change the story at every subsequent remembrance" (italics by original author). "We pull memories along causative lines, revising them involuntarily and unconsciously. We continuously re-narrate past events in the light of what appears to make what we think of as logical sense..."

He calls this "reverberation." Memory corresponds to the strengthening of connections from an increase in brain activity in a given sector of the brain -- the more activity, the stronger the memory. The brain works this way: it creates narratives, and when a memory does not fit the narrative, we fix it so that it does. Dreams are forms of narrative.

Taleb says that the story of the Maginot Line is suggestive. The French did learn from their experiences of WW1 -- they just "learned too specifically." In the run-up to WW2, they readied themselves to protect their country from a threat similar to the one it had faced two decades before.

pg. 50 - We have a tendency to "tunnel": we focus on a few well-defined sources of uncertainty, on too specific a list of [possibilities], at the expense of unimagined possibilities. This is why we did not expect the 9/11-style attack.

pg. 55 - Taleb describes what he calls "naive empiricism" -- "we have a tendency to look for instances that confirm our story and our vision of the world-- these instances are always easy to find... You take past instances that corroborate your theories and you treat them as evidence."

pg 62 - The narrative fallacy: "our vulnerability to overinterpretation and our predilection for compact stories over raw truths." Note how truths are carried across generations in the form of mythology.

pg 84 - The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.

pg 119 - "... we are explanation-seeking animals who tend to think everything has an identifiable cause and grab the most apparent one as the explanation. Yet there may not be a visible because; to the contrary, frequently there is nothing, not even a spectrum of possible explanations."

Back to T&A: many of these false memories are not the result of calculated self-interest, but of self-persuasion. "The weakness of the relationship between accuracy and confidence is one of the best-documented phenomena in the 100-year history of eyewitness memory research." pg 108

The first interpretation of events is hard to break from; there is much evidence of this in the interrogation literature. In this day of DNA evidence showing past cases to have been mistakenly decided, a surprising number of prosecutors in these cases refuse to admit that the verdicts were wrong. "We impulsively decide we know what happened and then fit the evidence to support our conclusion, ignoring and discounting evidence that contradicts it." page 135 T&A

Many courtroom-based studies have shown that jurors will often make up their minds early in the process, and then selectively accept or reject the validity of evidence as it fits or contradicts their initial inclination. That is, when the evidence does not fit the story, they mitigate the importance of the evidence. In studies of police investigations: the confirmation bias sees to it that the prime suspect becomes the only suspect. pg 137

"Once we have placed our bets, we don't want to entertain any information that casts doubt on that decision."

Example of spiraling self-justification: the hostage crisis in Iran, 1979. Americans viewed their country as being attacked without provocation. The Iranians were mad at the shah; Americans reacted, "What does that have to do with us?"

What started the hostage crisis? Each side blames the other. To Iranians, the Americans started the process in 1953, when the US aided a coup that deposed a democratically elected leader, Mohammed Mossadegh, and installed the Shah. Iranians blamed America as the shah accumulated great wealth and used his secret police, the SAVAK, to put down dissent in reportedly brutal manner.

The engine of the back-and-forth downward spiral of blame and retaliation... self-justification.

General Westmoreland said during the Vietnam War: "The Oriental doesn't put the same high price on life as does the Westerner. Life is plentiful. Life is cheap in the Orient." More self-justification and stereotyping -- see Boston Globe 7-20-05.

Two ways to reduce cognitive dissonance. First is to say that if we do it, it must not be torture. "We do not torture," said GWB; "we use an alternative set of procedures." Second is to state that the victims of torture simply got what they deserved.

When George Bush announced that he was launching a "crusade" against terrorism, most Americans welcomed the metaphor. In the West, crusade has positive connotations, associated with the good guys -- think of the Billy Graham crusades, or Batman and Robin, the Caped Crusaders.

Not so for most Muslims. The First Crusade of 1095 is still remembered: at that time, an army of Christians slaughtered the inhabitants of Jerusalem. To Muslims, it "might just as well have occurred last month, it's that vivid in the collective memory." Pg 206

In the second presidential debate with John Kerry, October 8, 2004, Bush was asked for "three instances in which you came to realize you had made a wrong decision, and what you did to correct it." His response: "[When people ask about mistakes] they're trying to say 'Did you make a mistake in going to Iraq?' And the answer is 'Absolutely not.' It was the right decision.... Now, you asked what mistakes. I made some mistakes in appointing people, but I'm not going to name them. I don't want to hurt their feelings on national TV."

Lao Tzu:
A great nation is like a great man:
When he makes a mistake, he realizes it.
Having realized it, he admits it.
Having admitted it, he corrects it.
He considers those who point out his faults
as his most benevolent teachers.

Quotes about Strategy and Decision

"To lack intelligence is to be in the ring blindfolded."
General David M. Shoup, former Commandant of the US Marine Corps.

Three basic considerations in the threat-evaluation process:

Capabilities - What can the enemy do?
Intentions - What will the enemy do?
Vulnerabilities - What are the enemy's salient weaknesses?

In Rear Admiral J. C. Wylie's book, Military Strategy, he identifies two "elemental, perhaps irreducible strategies," which he entitled "sequential" and "cumulative."

Sequential strategies constitute successive steps, each contingent on the one preceding it, that lead to the final objective.

Cumulative strategies constitute a collection of individual, random actions which collectively and eventually provide an overwhelming or crushing result.

General Beaufre, quoted in Collins pg. 16: "the game of strategy can, like music, be played in two 'keys.' The major key is direct strategy, in which force is the essential factor. The minor key is indirect strategy, in which force recedes into the background and its place is taken by psychology and planning. Naturally, any strategy may make use of both these keys in varying degree and the result is a large number of 'patterns...'"

Like Sun Tzu: "to subdue the enemy without fighting is the acme of skill."

See Beaufre for another key strategist of history. Here is a clip from Wikipedia:

In his book 1940: The Fall of France, Beaufre writes that the collapse of the French Army was the most important event of the twentieth century. This may sound strange to American ears, but from a certain point of view this uchronia (counterfactual history) is pretty close to correct. Had the French Army held, the Hitler regime would almost certainly have fallen. There would have been no Nazi conquest of Western Europe, no Nazi assault on the Soviet Union, no Holocaust, and most likely no Communist takeover of Eastern Europe. He later gave his views on France's fall during interviews for the now-famous Thames Television production The World at War.

To understand the roots of this catastrophic defeat, one must study social history, political history, and military history. While the proximate causes are to be found in military factors (dispersion rather than concentration of armored forces, in particular), the root causes lie in social and political factors. Anyone reading about France in the 1930s will be struck by the deep divisions in its society and the extraordinarily vitriolic nature of its politics. Consider, for example, the matter of Léon Blum. In the late 1930s, the following phrase was popular among French elites: "Better Hitler than Blum."

From John M. Collins, Grand Strategy, pg. 2

Collins says: "strategy occupies two distinctive but inter-related planes, one abstract, the other concrete. The former is peopled with strategic philosophers and theoreticians, the latter with practical planners." pg. 14

"Grand strategy, which embraces such niceties as bluff, negotiation, economic skulduggery, and psychological warfare..." This is not like Clausewitz but more like Liddell Hart, who said "the true aim is not so much to seek battle but to seek a strategic situation so advantageous that if it does not of itself produce the decision, its continuation by a battle is sure to achieve this."

Me: Two stories should suffice to demonstrate the importance of strategic thinking. In WW2, strategy was exemplified not by the planning for the invasion of Normandy, but by the grand strategy of which Normandy was just a piece.

A second example is Stonewall Jackson's maneuvers in Virginia, focusing not so much on the immediate enemy in front of him as on pulling Union forces away from Lee's primary forces positioned to the east.

Collins says "only a handful of strategic pioneers, like Alexander, Machiavelli, Lenin, Liddell Hart, and Mao, have devised innovative ways to substitute subtleties for brute force." pg. 16

"In sum, strategy is the art and science of options." Collins, pg 19

"the principle of Flexibility recognizes the inevitability of change in purposes, policies, plans and procedures." p. 25

Wylie (in Collins, p. 25) says "no one can predict with certainty the pattern war will take."

Using the Wrong Frame - Poor Decisions from Good Analogies

Framing includes:

  • Establishing the question to be answered

  • Articulating Assumptions

  • Identifying criteria for decision-making

  • Identifying appropriate and sufficient options

Articulating Assumptions

Richard Neustadt and Ernest R. May (1986) suggest that decision making groups should make a standard practice of listing in three separate columns key elements of the immediate situation, namely those Known, Unclear, and Presumed.

They suggest keeping deliberation of "what to do" at bay until the situation surrounding the decision is characterized in this manner.

In corporate settings, we try to leave key assumptions in clear view on a white board or flipchart to remind decision-makers that their deliberations are built on a foundation of beliefs, which may or may not ultimately stand as facts. As the intelligence gathering process continues, the listing of assumptions can be changed with the ease of a white-board erasure, signaling to all that decisions should be tested against the latest set of assumptions.

Assumptions about key decisions are often evident as deliberants reference other decisions made in the past. Indeed, historical analogues can become so embedded in thinking that decision-makers are sometimes unaware of the extent to which the present decision process is flowing through a channel laid out like a template according to the way previous decisions and events unfolded.

The question of war with Iraq appears to have been influenced greatly by the World War II analogy. Rumsfeld's two key advisors, Paul Wolfowitz and Douglas Feith, shared a common background that was seminal in each man's choice of career and domain of interest, and that appears to have shaped the way the two men viewed and understood the situation in Iraq.

Feith's father grew up in pre-WW2 Poland and Germany and became a sailor, adopting what Feith describes as an "uncommon occupation for a Jew." The senior Feith took heroic action, helping to smuggle Jews off the Continent and onto the British Isles. Captured by the Germans, held in solitary confinement, and tortured, he escaped Germany, emigrated to America, and served in the U.S. merchant marine for the remainder of the war. Both of his parents, along with four of his sisters and all three of his brothers, were murdered in the Holocaust.

Partly as a result of his father's story, young Douglas Feith grew up with an abiding interest in history, and especially the circumstances of pre-WW2 Europe, in which British leaders sought to contain Hitler even as his father smuggled desperate Jews off the continent. Educating himself, Feith says he "read books on diplomacy, politics and government," and concluded that "nothing short of war could have stopped, let alone reversed, the Nazi aggression." Feith says that his study affected his views during the Vietnam War, as he began to question the prevailing view among his peers that war is never necessary. Indeed, he says, "the failures of appeasement in the 1930s made me skeptical about the promises of demonstrably bad actors -- tyrants, murderers, liars, terrorists and the like."

Paul Wolfowitz's family history is remarkably similar to Feith's, though Feith does not note the similarities of the factors that shaped their thinking in his War and Decision. Wolfowitz's father, too, was a Holocaust survivor.

Though he himself left Poland after WW1, the rest of his family perished in the Holocaust. [12]

Wikipedia notes that "As a boy, Wolfowitz devoured books about the Holocaust and Hiroshima—what he calls 'the polar horrors'".[3] Speaking of the influence of the Holocaust on his views, Wolfowitz said:

"That sense of what happened in Europe in World War II has shaped a lot of my views ... It's a very bad thing when people exterminate other people, and people persecute minorities. It doesn't mean you can prevent every such incident in the world, but it's also a mistake to dismiss that sort of concern as merely humanitarian and not related to real interest."[12]

As Wolfowitz observed the American policy of "containing" the Iraqi threat to peace as a post-Gulf War policy, he saw similarity to the British efforts to contain and appease Hitler's threats to pre-WW2 Europe (Ricks, pg. 16).

World War II as Analogy to Iraq

Strategic thinking is characterized by openness to new and different ideas. And one way to generate new and different perspectives on strategic situations is through the use of metaphor, or its close relative analogy, perhaps the most advanced form of human thinking. As Aristotle said in Poetics, “the greatest thing by far is to be a master of metaphor.” It is “a sign of genius, since a good metaphor implies an intuitive perception of the similarity in dissimilars.”

In their Harvard Business Review article entitled “How Strategists Really Think,” Giovanni Gavetti and Jan W. Rivkin show that reasoning by analogy plays a major role in the thinking of successful strategists. As an example, these writers point to Intel chairman Andy Grove’s story of how he came up with an important business strategy. Attending a management seminar, Grove heard the story of how fledgling “mini-mills” in the steel industry began in the 1970s to offer a low-end product—inexpensive concrete-reinforcing bars known as rebar. Establishing market share with the low-end products, these steel companies then began to migrate up the hierarchy of products toward the higher-end, more lucrative steel products. U.S. Steel, which had ceded the low-end products to the smaller and seemingly insignificant players, was caught unawares by the companies attacking the market for their core business and lost market share over a number of years.

An epiphany struck Andy Grove as he sat in that management seminar, thinking about the steel industry. Using what Gavetti and Rivkin call “analogical thinking,” Grove saw that Intel was sitting in a similar situation to that of U.S. Steel in the 1970s. Intel had theretofore leaned toward ceding low-end computer chips to niche players, a strategy that, Grove now realized, would put Intel in a dangerous situation. He began to see low-end computers as “digital rebar,” a metaphorical image that helped him in articulating his strategy to Intel management. “If we lose the low end today,” Grove said, “ we could lose the high end tomorrow.” As a result of this thinking, and the deliberations that followed, Intel redoubled its efforts to market the low-end “Celeron processor” for low-end personal computers.

Though a mental model—a hypothesis about cause and effect—provides a useful way of understanding the dynamics and working of the world around us, blind adherence to entrenched models can be dangerous. Once we close our eyes to disconfirming evidence, once we fail to see the weaknesses of our assumptions about cause and effect, we have failed as systems thinkers.

History, of course, is replete with examples of people adhering stubbornly to old paradigms despite overwhelming evidence that a new way of thinking has become necessary.

Mental models become the frames through which we view the world. We attend to what is inside our frame, oblivious sometimes to what occurs outside our frames, which can lead to dangerous blind spots. Frames can be useful insofar as they direct our attention toward the information we seek. But they can also constrict our peripheral vision, keeping us from noticing important information and, perhaps, opportunities. Once liberating, mental models can become shackles.

As an illustration of the way in which mental models and frames can get out of hand, consider Donald Schon’s concept of a generative metaphor. A generative metaphor is an “implicit metaphor that can cast a kind of spell on a community. All solutions are understood in terms of the implicit metaphor.” Some work cultures, for example, use a sports analogy as their generative metaphor, ubiquitously describing events in sports language and casting solutions as “game plans.” A generative metaphor like this can be healthy, but it can also restrict creativity and problem-solving, since the “team” may miss out on ideas and options not endemic to the metaphorical world at hand.

At times, an over-used generative metaphor can lead to a group dynamic known as groupthink. When cultural propensities like this become problematic, leaders can stimulate positive organizational change by introducing new and useful generative metaphors as they communicate with others. The new metaphor can provide people with a lens through which to see things anew and lead to positive change in the work atmosphere and business results.

Turning to history for guidance is the essence of wisdom. Thucydides, the Greek historian of (XXX BC), said that he "wrote for those who want to understand clearly the events which happened in the past and which (human nature being what it is) will at some time or other and in much the same ways be repeated in the future."

N & M suggest "boarding" the Likenesses and Differences between the present situation and a given analogy as a way of finding useful ways of thinking while limiting over-use of any particular guiding metaphor. Had DOD decision-makers used such a process, they might have produced a chart like the following:

Likenesses:

  • Saddam, like Hitler, was a tyrannical leader who controlled his minions through intimidation.

  • Saddam had once tried, and succeeded at, over-running a neighboring country (Kuwait) with the use of conventional armored force, much as Hitler's armed forces overwhelmed, say, the Netherlands in 1940.

  • Saddam did not hesitate to use torture or maiming in controlling his own people.

  • Henchmen like Saddam's sons (XX) displayed ruthlessness reminiscent of (XXX).

Differences:

  • Unlike the liberation of WW2 France, an American-led victory did not free an otherwise united people.

  • Unlike the vanquished WW2 Germany, surviving Iraqis were divided into multiple factions, some with an interest in continued strife in the country as battles heated up for control of the future.

  • The liberated France of WW2 reestablished a country with a strong sense of national identity, culture and language. Iraq had been cobbled together (when XXX)

Assumption 1: There is a nexus between Saddam Hussein's Iraq and the acts committed by al Qaeda terrorists on September 11, 2001.

Years later, a memo written by Wolfowitz surfaced during congressional investigations. The memo appears to imply an assumption by DOD officials that a link between Iraq and al Qaeda exists, and simply needs to be found.

LA Times article by Peter Spiegel, April 06, 2007:

Just four months after the Sept. 11 attacks, then-Deputy Defense Secretary Paul D. Wolfowitz dashed off a memo to a senior Pentagon colleague, demanding action to identify connections between Iraqi dictator Saddam Hussein’s regime and Al Qaeda.

“We don’t seem to be making much progress pulling together intelligence on links between Iraq and Al Qaeda,” Wolfowitz wrote in the Jan. 22, 2002, memo to Douglas J. Feith, the department’s No. 3 official.

Using Pentagon jargon for the secretary of Defense, Donald H. Rumsfeld, he added: “We owe SecDef some analysis of this subject. Please give me a recommendation on how best to proceed. Appreciate the short turn-around.”

Wolfowitz’s memo, released Thursday, is included in a recently declassified report by the Pentagon’s inspector general. The memo marked the beginnings of what would become a controversial yearlong Pentagon project supervised by Feith to convince the most senior members of the Bush administration that Hussein and Al Qaeda were linked – a conclusion that was hotly disputed by U.S. intelligence agencies at the time and has been discredited in the years since.

Hear Feith defend his role in pre-Iraq decision-making here:

http://www.npr.org/templates/player/mediaPlayer.html?action=1&t=1&islist=false&id=7309878&m=7309879

Eventually, the decision was framed from the perspective of the DOD "neo-cons." Bush would make a choice to go to war in the interest of spreading democracy. The world was told, though, that war was necessary to defend the region from weapons of mass destruction. Defending the decision process years later, Feith insists that war was necessary regardless of the presence of WMD. The key error made by the Bush administration, he suggests, was not in assuming the presence of WMD, but in creating that impression on the global audience. The Administration, he says, should have been forthright in portraying the war as a matter of standing up for freedom.

Group Process

As a counterpoint to the complicated Groupthink hypothesis, historians Neustadt & May suggest in their book, Thinking in Time, that the Bay of Pigs debacle can be explained by a simple and common root cause: Kennedy and his advisers just did not devote enough time to decision-making and planning. After analyzing schedules of key administration executives, they found that key members of the decision-making group "could never give [Bay of Pigs planning] sustained attention for more than forty-five minutes at a time" (N&M, 1986, p. 1).

Motivation and the Sense of Urgency

Two brain parts appear to work in tandem for purposeful behavior to occur: the frontal cortex, part of the cerebral area, and the basal ganglia, which is part of the limbic system. In this article from Scientific American, a man sinks to the bottom of a pool and cannot seem to muster the motivation to swim back to the top. He is content to drown, and after being saved from drowning by his daughter, reports remembering being aware he was in grave danger, but he was simply unable to will himself to do anything about it. Click on the link below for the article.

http://www.sciam.com/article.cfm?id=drowning-mr-m


The Universe Large and Small

“How many particles are there in the known universe?,” you ask. Particles… protons, neutrons, electrons… that make up the atoms that make up the molecules in the uh, known universe. Billions of galaxies holding billions of stars and, evidently, planetary systems.

Well, there are apparently about 10 to the 87th particles out there. Ten to the 87th power. Ten multiplied by itself 87 times. That doesn’t seem like such a large number at all, does it? Yes, if I used a small typeface I could type the number onto one line of this page. Check me on this if you’d like. Google it. Subject the question to Wikipedia. Ask a smart friend. I know it’s quite a thought to think.
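If you'd rather not take my word for it, the claim that this number fits on one line is easy to check with a few lines of Python (the 10 to the 87th figure is the estimate quoted above; the code just writes it out):

```python
# Toy check of the claim above: 10 to the 87th power written out
# is just a 1 followed by 87 zeros -- 88 characters in all, short
# enough to fit on a single line of this page.
particles = 10 ** 87
digits = str(particles)

print(digits)        # the full number, written out
print(len(digits))   # 88 characters
```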

"Oh the thinks you can think!

Think and wonder and dream

Far and wide as you dare.

When your thinks have run dry,

In the blink of an eye

There's another think there!

If you open your mind

Oh, the thinks you will find

Lining up to get loose

Oh, the thinks you can think.... "


--Dr. Seuss


Thoughts, by the way, are now observable to the naked eye using a new machine called the fMRI. We can watch as circuits of neurons in the brain fire in an instant corresponding to a cognition, image, smell or feeling. Since we have a hundred billion neurons in our brains, with each neuron capable of connecting via little connectors called dendrites to a thousand or so other neurons at a time, there is a large number of possible "Hebbian nets" -- neural circuits -- that a person might experience. How large a number? A recent estimate puts the number of possible neural circuits in one person's brain at, listen to this now, ten to the millionth power. That's a 1 followed by a million zeros. Now, just imagine the number of possible thinks that can be thunk when people multiply themselves as members of groups, teams, and organizations!
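To get a feel for how numbers like ten to the millionth arise, here is a back-of-envelope sketch in Python. It is not the cited estimate's actual method; it simply treats a "circuit" as any subset of the possible pairwise connections among a hypothetical patch of neurons, and shows that even a tiny patch gets you there:

```python
import math

# Back-of-envelope sketch (not the cited estimate's actual derivation):
# if a "neural circuit" is any subset of the possible pairwise
# connections among n neurons, then there are 2**pairs subsets.
# Even ~2,600 neurons -- a speck of brain tissue -- already yield
# on the order of ten to the millionth possible circuits.
n = 2578                        # hypothetical neuron count, chosen for illustration
pairs = n * (n - 1) // 2        # possible pairwise connections
digits = pairs * math.log10(2)  # log10 of 2**pairs, i.e. the exponent on 10

print(f"{n} neurons -> roughly 10**{digits:,.0f} possible circuits")
```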

Yes, in a way of thinking, our brains are larger than the known universe. But we as individuals, as you know, are quite small. Here is a picture of a portion of the universe measuring roughly 2 billion light years in the shape of a cube.

The points of light are galaxies. Scientists could not find anyone willing to stand back far enough to snap this picture, so they roughed it out using a computer model. This is about what it would look like, though, as I understand it.

Now look at the picture below. What do you guess this is?

No, this is not the known universe looking from the other side. This is a picture of a neuron making connections to other neurons in a human brain. Remember, neurons are tiny. Each of our brains contains a hundred billion of these.

Both the big and small slices of the universe shown above are examples of systems, or networks. Obviously, the universe of galaxies and stars bears more than a passing likeness to the inner world we all carry around with us. We have only just begun understanding how these marvelous Hebbian nets work. A few axioms have become clear, though. Here are a few attributes of neurons and neural networks:

A neuron is built for connectivity.

The interesting things that happen in the brain are characterized by vast webs of neurons firing simultaneously or in sequence. As noted, each neuron has numerous dendrites capable of hooking up with other neurons. A neuron working on its own could behave something like a light switch or a thermostat, but would otherwise seem fairly unintelligent and unimpressive to us. A network of neurons, though, provides us with the magic of our human experience. The depiction to the left shows the way neurons reach out to one another.

Individual neurons serve as hubs. Thousands of neighboring neurons may connect through a neuron as a means of reaching thousands more.

The picture to the left is familiar to us as a hubbing system. This, of course, is the air traffic system of the North American continent. Bright sections in the picture, analogous to neurons, are the “hub cities” through which people may connect as they travel from their originating city to their airport of destination.

Neural networks are constantly being updated and renewed.

Neural connections show plasticity. New connections are constantly being forged. Much of this reorganization and renewal is accomplished by the active brain while our bodies sleep.

Redundancy is a natural characteristic of Hebbian nets. An impulse may travel through a myriad of possible routes or pathways in order to connect one area of the brain with another. This avoids the bottle-necking that is observed in more limited systems. For example, there is really only one road I can take to reach the airport from my house. If there is a traffic tie-up, I’m sunk. A more natural network of roads, by which I might reach the airport from a variety of routes, would allow me to leave my house later when I travel. I could rest assured that if there is blockage somewhere in the system, I can get through some other way.

The internet works this way, by the way. As I send you an email, small packets of digital information leave my computer, depart from my house, travel separately through a varied web of connections, and are recombined at your house.

The image just above, by the way, is a depiction of the system, or network, that we know as the internet. You can see from this that there are indeed a lot of pathways that might connect my house with yours.
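The rerouting idea above can be sketched in a few lines of Python. The road network here is hypothetical (the place names are made up for illustration), but it captures the point: with more than one route available, a blocked road is an inconvenience, not a dead end.

```python
from collections import deque

# A small hypothetical road network with two ways to the airport.
# Each location lists its directly connected neighbors.
roads = {
    "house":    ["main_st", "back_rd"],
    "main_st":  ["house", "highway"],
    "back_rd":  ["house", "river_rd"],
    "highway":  ["main_st", "airport"],
    "river_rd": ["back_rd", "airport"],
    "airport":  ["highway", "river_rd"],
}

def find_route(graph, start, goal, blocked=frozenset()):
    """Breadth-first search for a route, skipping blocked locations."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen and nxt not in blocked:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no route exists at all

print(find_route(roads, "house", "airport"))
# With the highway tied up, the search reroutes the other way:
print(find_route(roads, "house", "airport", blocked={"highway"}))
```

This is the same logic a packet-switched network uses in spirit: when one path is congested or down, traffic simply flows through another.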

Any given neuron may become a part of any number of “networks” or “webs” that we experience as mental events.

Typically, neurons that fire together (in close temporal proximity) wire together. It's thought that networks (representing memories, beliefs, mental events) are represented by such "Hebbian nets," as they're called. Of course, that's just the current thinking, and people are still debating it all the time.
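The "fire together, wire together" rule can be sketched as a toy computation. This is an illustration of the principle, not a model of real neural tissue: units that are active at the same moment have the connection between them strengthened.

```python
# Minimal sketch of the Hebbian "fire together, wire together" idea
# (an illustration only, not a model of real neurons): when two units
# are active at the same time, the weight between them grows.
def hebbian_step(weights, activity, rate=0.1):
    n = len(activity)
    for i in range(n):
        for j in range(n):
            if i != j:
                # co-activity strengthens the connection
                weights[i][j] += rate * activity[i] * activity[j]
    return weights

n = 3
w = [[0.0] * n for _ in range(n)]
# Units 0 and 1 repeatedly fire together; unit 2 stays quiet.
for _ in range(5):
    w = hebbian_step(w, [1.0, 1.0, 0.0])

print(w[0][1])  # strengthened by repeated co-firing: 0.5
print(w[0][2])  # never co-fired, so unchanged: 0.0
```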

The graphic to the left shows four fMRI views of a single mental event. A brainstorm of sorts.

The flavor of a given mental experience appears to be determined by the parts of the brain through which the Hebbian net maps itself. For example, neuroscientist Jessica Payne confirms that "most lasting memories probably do have limbic connections that give them emotional flavor." (We'll examine the nature of the limbic system in a future post.)

Use it or lose it.

We'll make this the final principle, for the moment. My initial hypothesis was that if a neuron doesn’t get much “action,” its effectiveness diminishes and it will not be available for future networks. Dr. Payne writes, though, that it is "not the neuron itself necessarily, but any given concept probably. If you've learned some bit of obscure knowledge and never think about it, its availability will become diminished and it will be unable to be woven into other networks (i.e. you won't be able to build on the knowledge)."

* * * * *

Remember, the purpose of this blog is to examine the workings of the brain as a vehicle for understanding, in an analogical or metaphorical sense, the workings of complex organizations. Future posts will flesh this out. As a preview, though, I'll leave you with the following graphic showing a human network.


Why Do We Have Left and Right Brains?

Mark: Can you think of a reason why the evolutionary development process would leave us with these two brains?

John Batson, Furman University: I think the key is just as you suggest, evolution.

Ancient ancestors that were the first vertebrates displayed bilateral symmetry -- the right side is more or less the mirror image of the left. And that held and holds true for most internal organs, including the brain.

So long before there was anything resembling a primate (much less a human), our ancestors were selected for a bilaterally symmetrical body plan. By the time the first primate-like creatures evolved, the bilateral brain was almost certainly symmetrical in function as well. But with recent evolution (the last few million years or so), some functions became lateralized (right brain better at some tasks, left brain better at others). So the present human brain is a relic of ancient selection that has adapted to more current selection pressures, including especially language and bipedalism.

Why do we have two brains? Certainly, if we designed an efficient brain today from scratch, we probably would not end up with the existing brain... it is not completely intelligently designed. We simply inherited what worked long ago, and what got modified along the way.

And by the way, the corpus callosum is not the only fiber system connecting left and right brains. There are several other smaller "commissures" (connecting pathways) joining various parts of left and right.

A Brain in the State of Reverie

Mark: Aware of the treasures of the intellect just before sleep, artists and intellectuals alike have found ways to tap into their theta state in manners both benign and dangerous. Not so long ago, people like Timothy Leary were touting hallucinogenic drugs as windows into the subconscious. On a healthier track, the artist Salvador Dali—he of the melting clocks and strangely shaped human figures—used to sit in a comfortable armchair holding a serving spoon in his hand. As he would begin to drift through the theta state toward sleep, the spoon would fall to the floor with a clang. Alarmed back to a wakeful state, Dali would immediately grab a pencil or brush and sketch the things he had just “seen” during his theta state.

Question: What does science, and in particular the study of different "brain waves," tell us about the creative process?

Jess: Alpha waves do correspond to relaxed states, but they are usually observed with eyes closed. Mostly they correspond to the reduction of sensory input, as they appear immediately in most people upon closing the eyes with instructions to relax (8-12 Hz). Beta waves are higher frequency waves (> 12 Hz) and characterize active wakefulness; theta is between 4-8 Hz and is still in the process of being understood.

It's a characteristic wave of REM sleep, and is not typically seen in healthy awake adults (except in the case of meditation! See Aftanas L, Golosheykin S (2005) Impact of regular meditation practice on EEG activity at rest and during evoked negative emotions. Int J Neurosci. 2005 Jun;115(6):893-909) - so we really could think of it, organizationally speaking, as a quiet, reflective, creative brain wave. But of course, it has a zillion other proposed roles as well, from sensori-motor integration, to short-term memory function, etc, etc. Delta waves are very slow (1-4Hz) and characterize the deepest stage of sleep (stages 3-4). So, what you learned wasn't entirely inaccurate, but it's not quite that simple, since all of these waves are seen in sleep except for beta.