Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts
List Price: $15.00
Our Price: $5.50
You Save: $9.50 (63%)
Why do people dodge responsibility when things fall apart? Why the parade of public figures unable to own up when they screw up? Why the endless marital quarrels over who is right? Why can we see hypocrisy in others but not in ourselves? Are we all liars? Or do we really believe the stories we tell? Backed by years of research and delivered in lively, energetic prose, Mistakes Were Made (But Not by Me) offers a fascinating explanation of self-deception—how it works, the harm it can cause, and how we can overcome it.
The Perversity of Self-justification When the question of our human nature is raised, the answers, depending on whom you ask, range from descriptions of ourselves as intrinsically peace-loving and altruistic to innately disposed to selfishness and violence. Some see us progressing toward higher levels of consciousness and empathy, others as inexorably headed toward self-destruction. Whatever our innate proclivities, they are complicated by our need--barely within our awareness--to justify our actions. In "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts," Tavris and Aronson show the extent to which self-justification pervades our beliefs and attitudes and misdirects us, both in our personal lives and in the broader domain of social, legal, and political affairs. While the cognitive process of self-justification at work in reducing cognitive dissonance is well known to psychologists, those in the field of mental health are no more immune than others to its subversive influence in making a mockery of truth. Building on the solid foundation of cognitive dissonance research in social psychology, Tavris and Aronson show how the process of self-justification, covertly operating in the service of bolstering one's self-esteem and saving face, has led to serious errors of reasoning and judgment. They give numerous examples of how this covert process perpetuates marital discord, how it has destroyed families who were victims of overzealous mental health professionals, how it has kept nations at loggerheads, unable to reconcile their differences, and how it has contributed to egregious injustices in law enforcement, resulting in the imprisonment of innocent persons.
At some time or other we are all inclined to deceive ourselves. Those occasions when we feel the greatest need to justify ourselves are probably the times when we should most carefully examine our motives. Tavris and Aronson's book aims to leave the reader more attuned to the process of self-justification that underlies many of our beliefs and actions, and the harm that can result. The irony, of course, is that the mental machinery of self-justification will be hard at work protecting the sincere reader from looking too closely at himself, while righteously condemning the exposed folly of others. Even so, the world would be a better place if we all took the message in this book to heart.
At first slowly, then quickly Or so say Tavris and Aronson on how we lose our ethical grip---we make a small slip, tell ourselves it is not that bad, and our minds rationalize the next slip. From lunch with a lobbyist to a golf outing in Europe is not---when the mind puts its mind to it---that big a leap. Their discussion of confirmation bias, one of the worst breeders of bad decisions, is outstanding and understandable. And the chapter on how the police get the innocent to confess is chilling. There are all sorts of useful tips. Want to co-opt an enemy? Get her to do a favor for you; her mind will say, "I do not do favors for jerks, and since I did one for him, he must not be that big a jerk." The mind cannot hold two contradictory thoughts at once, so it bridges the dissonance. At 236 pages, the book is long enough to be worthwhile, but short enough to read on a vacation. Anyone interested in persuasion and how our minds work will find the read a useful one.
It made me see the world differently What a wonderful book. I actively recommend it to all my students and all my friends. As opposed to much popular social science, this is written by the experts themselves. The book explores some of the effects of cognitive dissonance reduction, with telling examples. The book reminded me of Tom Schelling's great book, Micromotives and Macrobehavior. That book, written by an economist, explains how very small differences in individual behavior can lead to very large differences in societal outcomes. This book, written by psychologists, explains how small differences in two people's initial positions can lead to very large differences over time because of the way we weed out contrary information.
Great Read, Very Enjoyable, Very Insightful I'll confess, I've had moments in my life where I thought that the woman I had fallen hopelessly in love with was part of a foretold prophecy and the fact she had rejected me was sure evidence that she was a recovering sex addict having intercourse with any third rate bass player who would cross her path on a Tuesday night at the Whiskey, and that syphilis was sure to follow. To say I was raised to have strong opinions is a gross understatement.
However, admitting mistakes for those strong opinions was somehow left out of the guidebook. Is someone slow to get the point? They're obviously a moron!! Did someone forget to clean up dog pee? A sure sign of narcissistic personality disorder!! My family does not "suffer fools gladly".
Thankfully, we are not the exception to the rule, as "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts" by Carol Tavris and Elliot Aronson makes clear. This book analyzes the strong currents that flow from self-justification and confirmation bias.
This is not a self-help book! Self-help focuses on construction; this book's purpose is observation, more akin to books like Blink, Predictably Irrational, and The Tipping Point. In fact, if you have read Blink and enjoyed the correlation between divorce and contempt, this book explores that theme from a self-justification point of view. Clinical psychology, the justice system, international policy, and social and marital situations are all discussed with these issues in mind. The book can be found in the general psychology section of the bookstore, and I think that is quite appropriate.
I think one of the first things this book showed me is that we as Americans are terrified of making mistakes. The main difference between the educational systems that surpass the U.S. is that they view mistakes as a natural part of learning, while the American grading system views mistakes as something unpleasant and to be avoided. This can lead to adults who mistake strong feelings for intelligence. I can say I happened to be one of those people.
The final chapter gently offers some ideas about what to do with self-justification. Its main focus is getting people to separate the relationship with the person from the mistake made; to identify their personal feelings and isolate them from the problem at hand; to take the "yes, but" out of the explanation of a mistake.
In the spirit of this book, I will gladly admit some mistakes I have made. I blogged and left comments that hurt other people's feelings. Whether I was right or wrong doesn't excuse the fact that I hurt somebody. I took the easier response of saying "I don't care," and that too I regret. There were a couple of relationships that could have been deepened, and that those opportunities were missed is regrettable.
See! Easy peasy! If you are looking for a good nonfiction read, and are willing to look at yourself with a sense of humor if you identify with any of the examples, check it out~!
C'mon everybody! Let's sing the song to reading rainbow!!!
pwood One of the best books I've ever read. It will be of interest to anyone in any occupation.
How humans justify bad decisions and foolish beliefs: The power of cognitive dissonance This is a well written, snappy book that addresses an important issue, best described by the book's title and subtitle: "Mistakes Were Made (but not by me): Why we justify foolish beliefs, bad decisions, and hurtful acts."
The two authors, both well reputed psychologists, use the theory of cognitive dissonance as their starting point. Leon Festinger was one of the major theorists of this approach. The authors of this book simply define the perspective thus (page 13): "Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as 'Smoking is a dumb thing to do because it could kill me' and 'I smoke two packs a day.'" How does one deal with this? By adopting one of the positions and then downgrading or rejecting the other. The end result is self-justification, self-deception, seeking out evidence to support the choice that we have made while rejecting evidence that does not fit with our choice.
The brain itself shows evidence of the operation of cognitive dissonance. The example on page 19 of functional Magnetic Resonance Imaging (fMRI) and processing information about presidential candidates is telling. The end result is "blind spots," in which people (page 42) "fail to notice vital events and information that might make them question their behavior or their convictions." As such, the authors note that cognitive dissonance makes mincemeat of such theoretical views as rational actor theory and psychoanalytic theory. One result of cognitive dissonance is what is called "confirmation bias," the attending to evidence that supports our views and the rejection/suppression of evidence that does not support our views.
Many examples are advanced to illustrate the case that the authors make. Issues include: moral lapses (e.g., Watergate participants), "made up" memories (raising serious questions about the whole idea of repressed memories), criminal justice system decisions on guilt or innocence, and so on. Much is at stake with cognitive dissonance as it operates.
In the closing chapter, the authors try to indicate how understanding cognitive dissonance might help us to limit the damage that may occur as a result of its operation. Convincing? I'm not so sure, but this discussion does get one thinking about how we might address the harmful side effects of cognitive dissonance.
A readable book that raises important issues. I think that more use of neuroscientific research could have strengthened this book that much more. Also, the work of cognitive psychologists like Kahneman and Tversky could have spoken to key points as well. This book might also profitably be read in tandem with another recent book on a similar subject, Cordelia Fine's "A Mind of Its Own." In addition, Linden's "The Accidental Mind" provides a perspective on related issues from a neuroscience viewpoint.
A bit overdone Good explanation and application of Festinger's cognitive dissonance theory. It does, however, share the fault of many psychological theory books in that the authors naively attempt to overapply their theories to cover present and past historical/political realities. Here, the authors show their overextension and lack of knowledge of history. Their attempts to cover the history of Islam and its collision course with Christianity/Judaism range from foolish to downright embarrassing. That said, this is a valuable and insightful book.
The authors make a dry subject come alive! Renowned social psychologists Carol Tavris and Elliot
Aronson have written a truly fascinating book, MISTAKES
WERE MADE (BUT NOT BY ME). . . its subtitle made me want
to read it even more: WHY WE JUSTIFY FOOLISH BELIEFS,
BAD DECISIONS AND HURTFUL ACTS because I have long observed
this tendency--even in my own life.
The authors make what could be a dry subject come alive
by the use of many examples . . . in addition, I liked how
they incorporated much research--cited in nearly 40 pages
of endnotes--but made it come alive via a lively writing style.
When they explained how our memories tell more about
what we believe now than what really happened then, I had
to laugh . . . and recall the story of how I once took Risa,
my daughter, to my first home . . . from there, I proceeded
to take her to my elementary school, which I could have
sworn was nearly a mile away . . . in reality, it turned out
to be less than two short blocks away!
MISTAKES WERE MADE further shows how couples can
break out of the "he said, she said" spiral of blame and
defensiveness, and perhaps most importantly, how all of
us can learn to own up and let go of the need to be right.
There were many memorable passages in the book; among
those that most caught my attention were the following:
* The same DNA that exonerates an innocent person can be used
to identify the guilty one, but this rarely happens. Of all the convictions
the Innocence Project has succeeded in overturning so far, there
is not a single instance in which the police later tried to find the
actual perpetrator of the crime. The police and prosecutors just
close the books on the case completely, as if to obliterate its
silent accusation of the mistake they made.
* De Klerk, who had been elected president in 1989, knew that a
violent revolution was all but inevitable. The fight against
apartheid was escalating; sanctions imposed by other countries
were having a significant impact on the nation's economy;
supporters of the banned African National Congress were
becoming increasingly violent, killing and torturing people whom
they believed were collaborating with the white regime. De Klerk
could have tightened the noose by instituting even more repressive
policies in the desperate hope of preserving white power. Instead,
he revoked the ban on the ANC and freed Mandela from the prison
in which he had spent twenty-seven years. For his part, Mandela
could have nursed a grievance that was entirely legitimate.
Instead, he relinquished anger for the sake of the goal to
which he had devoted his life.
"If you want to make peace with your enemy, you have to work with
your enemy," said Mandela. "Then he becomes your partner." In
1993, both men shared the Nobel Peace Prize, and the following
year Mandela was elected president of South Africa.
* Making mistakes is central to the education of budding scientists
and artists of all kinds, who must have the freedom to experiment,
try this idea, flop, try another idea, take a risk, be willing to get the
wrong answer. One classic example, once taught to American
schoolchildren and still on many inspirational Web sites in various
versions, is Thomas Edison's reply to his assistant (or to a reporter),
who was lamenting Edison's ten thousand experimental failures in
his effort to create the first incandescent light bulb. "I have not failed,"
he told the assistant (or reporter). "I successfully discovered 10,000
elements that don't work." Most American children, however, are
denied the freedom to noodle around, experiment, and be wrong in
ten ways, let alone ten thousand. The focus on constant testing,
which grew out of a reasonable desire to measure and standardize children's
accomplishments, has intensified their fear of failure. It is
certainly important for children to learn to succeed; but it is just
as important for them to learn not to fear failure. When children or
adults fear failure, they fear risk. They can't afford to be wrong.
That said, you won't go wrong by reading MISTAKES WERE
MADE . . . I was so impressed by it that I now plan to get
copies of the book for many of my colleagues at my college,
in that they will be able to relate to much of it . . . so will you.
Help Yourself, Help Us All I got what I was after when I ordered this book. Enhanced with stellar examples ranging in severity and repercussion, from thousands dead to marriages failed, this is a psychology book that shows more than it tells, letting you do the math... but also showing how it adds up. Or rather, with dissonance theory, which it does adequately explain, it shows how sometimes our cognitive dissonance makes it so that our decisions don't add up, and why it's so hard for us to be fair when faced with dissonance. This book helps us realize how humans, although hard-wired to skew the facts in our perceived favor, are able to get around the tragedies of dissonant thinking.
It's a psychology book with a touch of self-help because, as it so fairly points out, we are all guilty of mistakes. Yet it's those of us who can admit to them who tend to recover from them, learn from them, be more valued and trusted by society, and be less likely to make the same mistake again.
One of the most interesting ideas it poses, through numerous examples, is that people acclimate themselves step by step to immoral behavior, escalating in severity. Thankfully, no one's all bad, no one's all good; we are all capable of vast wrongs. Our first mistake being inherent human fallibility, this book examines how we can come to understand our cognitive flaws and own up before things escalate. It also gives us some courage to admit fault, even if things have gone too far.
Highly recommended Other reviews synopsize this book, and I won't repeat that effort. I will, however, chime in to say that it's remarkably readable and pretty much universally relevant. One anecdote in the book relates that one of the authors, a professional in the study of cognitive dissonance, falls prey to it themselves - thus underlining the point at hand, that we're all susceptible to this error.
Cogent, clear, and engagingly written, it makes those obtuse errors of others so much easier to understand - and easier to spot in ourselves.
Startling Insights I had read positive reviews of this book, but was still startled at the array of situations where I had never thought of a frozen mindset controlling action. I was particularly struck with the behavior of district attorneys who "cannot" change their mind, even after DNA evidence clearly proves the innocence of a person they had convicted.
Should be required reading! An amazing book that brings insight into politics and social issues in light of our desire to remain "right." A must read for anyone dedicated to social change.
Excellent, comprehensive, disturbing We make mistakes. We do not admit them. We justify them. We rationalize them. But of course, we do not know we are in denial. We have opinions about people, formed on the basis of their gender, race, color, religion, ethnicity, sexual preferences, and much more. These are stereotypes, which can degenerate into prejudices and worse. Memories can deceive us. Experts, especially self-styled ones, can do more harm than good. This is the crux of the book, and the authors spend the bulk of the book describing this process in a variety of situations.
The book is well written. It is well organized. Persuasive, passionate, well-researched. The con, if you have to pick one, is that the book does not go deep enough into any of the topics covered in its chapters, so you will necessarily have to look someplace else afterward for more on the areas it dwells on. A minor quibble that some people may have is with its brief mention of political denials and self-justifications.
Gays, blacks, Jews, Chinese (immigrant workers in the US), Japanese (interned during the Second World War), parents, children, spouses, students--no stereotype, no denomination, no group is left out.
Impressive as the initial material in the book is, I believe that the real value of this book comes through in the later chapters: Chapter 3 ("Memory, the Self-justifying Historian"), Chapter 4 ("Good Intentions, Bad Science"), Chapter 5 ("Law and Disorder"), and Chapter 6 ("Love's Assassin: Self-justification in Marriage").
Excerpts from the book:
"Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent..... people don't rest easy until they find a way to reduce it...." [page 13]
"Children learn to justify their aggressive actions early: .... Aggression begets self-justification, which begets more aggression." [page 27]
"Dissonance is bothersome under any circumstance, but it is most painful to people when an important element of their self-concept is threatened - typically when they do something that is inconsistent with their view of themselves." [page 29]
The longest journey begins with the first step. Well... "How do you get an honest man to lose his ethical compass? You get him to take one step at a time, and self-justification will do the rest." [page 37]
"In a sense, dissonance theory is a theory of blind spots." [page 42]
"Prejudices emerge from the disposition of the human mind to perceive and process information in categories. 'Categories' is a nicer, more neutral word than 'stereotypes', but it's the same thing." [page 57]
Stereotypes can be useful at times, because they act as "energy-saving devices that allow us to make efficient decisions on the basis of past experience .... quickly process new information ... often with considerable accuracy." [page 57] but "the downside is that stereotypes flatten out differences within the category we are looking at and exaggerate differences between categories" ... "a stereotype might bend or even shatter under the weight of disconfirming evidence, but the hallmark of prejudice is that it is impervious to reason, experience, and counterexample." [page 60]
Why we continue to be prejudiced is also part of our selves and "so hard to eradicate... they allow people to justify and defend their most important identities - their race, their religion, their sexuality - while reducing the dissonance between 'I am a good person' and 'I really don't like those people'." [page 65]
Memory plays a very important role in the whole process of self-justification... "because memory is reconstructive, it is subject to confabulation. .... In reconstructing a memory, people draw on many sources." which can lead to "source confusion." [page 73]
"Memories create our stories, but our stories also create our memories. Once we have a narrative, we shape our memories to fit into it." [page 77]
"False memories allow us to forgive ourselves and justify our mistakes, but sometimes at a high price: an inability to take responsibility for our lives." [page 93]
False memories can also lead to tragic consequences, as in the case of Holly Ramona, who at the insistence and suggestion of her therapist came to 'remember' that she had been sexually abused by her father. After a court found the therapist guilty of planting false memories and the father innocent, Holly experienced dissonance, and chose instead to continue believing in these false memories, even to the extent of becoming a psychotherapist and "encouraging some of her clients to recover childhood memories of their own sexual abuse".
"We give ourselves credit for our good actions but let the situation excuse the bad ones." [page 169]
"... but the evidence shows clearly that while inebriation makes it easier for people to reveal their prejudices, it doesn't put those attitudes in their minds in the first place." [page 63] So the next time you find yourself saying, "but I was angry, drunk, tired, etc.," keep this in mind.
How do perpetrators go about "minimizing their moral culpability"? In several ways, as you would expect.
"The first, naturally, was to say that they did nothing wrong at all." [page 193]
"The second strategy was to admit wrongdoing but excuse or minimize it." [page 194]
"The third strategy, when perpetrators .... could not deny or minimize responsibility, was to admit they had done something wrong and hurtful, and then try to get rid of the episode as fast as possible." ... and to get "a reassuring sense of closure." [page 194]
This whole exercise of self-justification and denial extends, as one would expect, to torture also.
"The universal justification for torture is the ticking-time-bomb excuse." [page 202]
"The trouble is that those circumstances are very rare, so the 'saving lives' excuse starts being used even when there is no ticking and there is no bomb." .... "Once torture is justified in rare cases, it is easier to justify it in others;" [page 203]
Wonderful book I was very happy with 75% of the book; there were sections that didn't do much for me, but I am sure they will for others. I have been able to have discussions about irrational thoughts with my clients, and those conversations have led to more open and productive meetings. The book has given me the tools to understand people on a much higher level... more importantly, myself. I now have insight into where my feelings come from, and that has allowed me to evaluate the healthy and unhealthy ones. Must read.
Uncommon sense When you have passed Psych 101, you close the book and move on, little knowing that you have learned and forgotten valuable information. Carol Tavris brings us back to that textbook and teaches us how we can be better thinkers and critics by taking a simple concept to heart: cognitive dissonance.
But beware. If you want to avoid facing your mean little self, you might want to avoid this book. If it doesn't make you squirm, you missed the point.
Has some good points but... Makes some valid points, but the primary purpose of this book in my opinion is to bash conservatives and their views. Examples of poor liberal decisions are few. I would assume liberals make mistakes too... If you are looking for something that gives good analysis without the severe leftward slant, look elsewhere.
Awesome Book I was truly enjoying it and so was someone else at work because the darn thing went missing! I'm ordering another.
Proof That Victory Has Many Fathers but Defeat Is an Orphan Authors Tavris and Aronson present a wonderful explanation of how we "justify foolish beliefs, bad decisions and hurtful acts" in this fascinating and easy-to-read book that will make you smile or shake your head as you recognize the mental gymnastics on the balance beam of your brain and the brains of others.
They show us how our minds overcome cognitive dissonance through self-justification: we create blind spots for our pride and prejudice, and we keep editing our memories until they provide a recall we are comfortable with. They show how "good" people lose their ethical compasses, how we justify our biases and prejudices, and how the "us mentality" fulfills the paramount need for belonging and a sense of superiority, which allows us to do bad things to good people--our spouses, our co-workers, and even strangers who pose no threat or insult.
They explain this using a pyramid where every one of us starts at its apex. One story, act or event leads us down one side of the pyramid that will define our beliefs, character and philosophy while the same experience or a different one will lead someone else down another side of the pyramid. To avoid dissonance, the feelings of being wrong, stupid or weak, we will seek confirmation of our new belief so that we tell ourselves we have not gone down the wrong side, until we find ourselves at the base, unwilling to acknowledge even the most irrefutable evidence that might contradict that belief.
When we are confronted with such evidence, the tension we feel is dissonance. It can be so threatening that we find a number of ways of projecting or rationalizing our previous action so we don't have to face the embarrassing possibility of having been wrong. When prosecutors feel they have imprisoned the right man even after DNA proves he didn't commit the crime, they have become entrenched in their belief, which is now rooted at the base of the pyramid. They will experience dissonance even when the victims of the eight people they successfully prosecuted for murder turn out to be very much alive after all. Self-justification in the form of a rationalization will save the ego from the enormous guilt of having put the wrong man in prison.
Tavris and Aronson also take aim at Freudian psychoanalytic theory, especially repression and recovered-memory therapy, through the tragic experience of Holly Ramona and the daycare-center cases of the 1980s--stories that were chilling in the lives they destroyed because of the overconfidence of therapists, police, and district attorneys whose ability to pick out the molester, abuser, or criminal was, despite their experience, no better than chance. Their experience gave them confidence but little insight. The authors make an even stronger case against the theory with Holocaust survivors, who suffered unspeakable misery of exactly the kind the theory says should be repressed, yet were able to remember almost every detail of the depositions they had given forty years earlier when they were liberated. Those who clung to repression theory found validation if a survivor couldn't remember every single detail.
What drew my attention to this book was an intuition that no power of intellect, knowledge, or persuasion would influence people whose political opinions differed from mine in a substantial number of Amazon reviews and comments under dozens of titles. I was fascinated with the contradictory message of being told to read extremist authors (who believed that criticism of national policy was an act of treason), with the admonishment that it should be read by, or was only for "those with an open mind." Equally odd was the saying "the truth really hurts" from people who were reading propagandists. As the authors explained, people see themselves as open-minded and fair with reasonable opinions. Therefore, since they are reasonable, fair and just, and find the book "factual," those who don't share their opinion must be unreasonable, biased, and unable to face the truth, at least as they see it. In other words, it's the other guy who lacks an "open mind."
The more I read, the more I found applications from the book in other comments and reviews. One commentator could not understand how I admired the book "The Greatest Generation," by Tom Brokaw, because of my liberal leanings. According to her, I was one of the people "on [my] side of the fence," that "actually despise every aspect of American life." (It was the authors' example of the "us mentality" that provides the person with the sense of belonging). Membership in a particular group is a must, as is the perception to view our group as being more intelligent, open-minded, or moral than those in a different group. This allows us to see traits in the other group that are undesirable, traits that members are unable to see in their own group, unless it is with justification. The stereotype tends to serve as a defining line where there is only one possible rational explanation for things--hers. Although she felt my respect and politics were incompatible, I could be explained as an aberration, in her mind. (There was hope for me). Conflict resolved, dissonance spared, and paradigm and belief remain intact.
The authors demonstrated the power of rationalization and denial that was all so clear here in discussion threads. If people believed there were WMD even after an administration acknowledged that there weren't any after all, they would still believe that they were driven away by trucks to Syria while still not knowing where in Syria they actually were. Some had their belief systems so tied to the previous administration that criticism could only be an expression of irrational hatred or an act of treason. Travis and Aronson's message explained the rationale of not wanting to understand our enemies. It would be much more convenient to believe they hated us for no good reason, and anyone wanting to understand could be simply explained as a terrorist sympathizer. Righteousness without proof of being right, wins.
In actuality, questioning our beliefs or admitting our mistakes doesn't make us look weak or stupid. It creates the opposite impression. Yet our society promotes the idea that this is a weakness, and it is instilled in our children early on, turning them into human beings who can never be wrong at their jobs, in their relationships, or about their personal beliefs. The fear of ridicule, failure, and retribution is too strong and difficult for us to face. Few have risen to the occasion. President Kennedy was the last president to admit having made a mistake on a grand scale. Richard Clarke was the only one in the previous administration to state flatly that he was responsible for the attacks of 9/11, and the only one to ask for forgiveness. (No one else came close.) N. Wayne Hale Jr. took full responsibility for the Columbia shuttle disaster; he has since been promoted to manager of the Space Shuttle Program. General Eisenhower had a statement ready in case the invasion of Normandy on June 6, 1944 had failed. He changed a key phrase in it from "The troops were withdrawn" to "I withdrew the troops."
I cannot recommend this book enough, because what we believe depends upon how we think, and upon how we think about those who hold opposing viewpoints. It adds tremendous insight into the human condition and our need to protect our own egos. The authors offer some hope, believing that we have the ability to recognize our self-destructive thought patterns and change for the better. As they wryly state, the body might want sugar, but we have learned to eat vegetables.
This book is proof that victory has many fathers, but defeat is an orphan.
When asked by a reporter what three mistakes he made as president, George W. Bush replied: "[When people ask about mistakes] they're trying to say, 'Did you make a mistake going into Iraq?' And the answer is, 'Absolutely not.' It was the right decision... Now, you asked what mistakes. I made... some mistakes in appointing people, but I'm not going to name them. I don't want to hurt their feelings on national television." (Page 235)
Just as the authors said, mistakes were made--but not by me.
Jackson, Brooks, and Kathleen Hall Jamieson, "unSpun: Finding Facts in a World of Disinformation," Random House, 2007.
OK More psychology than I was looking for in a book with this title, but interesting.
A new way of introspection The authors, in an interesting and entertaining way, draw attention to our "blind spots" and self-justifications. It is easy to be outraged at the hypocrisy all around us--but their cogent arguments about our own bad decisions are a real eye-opener. The book has made me view all the current news in a deeper and more meaningful way, and it has really drawn my attention to my own prejudices and self-justification. A wonderful book, based on scientific studies, arguing that we need to be just as aware of our mental blind spots as we are of our visual blind spots when driving. I highly recommend it.
You'll Never Look at Your Behavior the Same Way Again Elliot Aronson was chosen by his peers as one of the 100 most influential psychologists of the twentieth century for good reason. His pioneering work in the field of cognitive dissonance theory revolutionized our understanding of how people unconsciously smother their failings under a blanket of self-justification. Now he's teamed with Carol Tavris in another brilliant work exposing the power of the mind to rationalize our mistakes.
Using interesting, real-life examples from a variety of areas (law, science, history, even domestic relations), Aronson and Tavris explore how "hypocrisy theory" allows us to engage in stupid, immoral and wrong conduct, yet remain convinced that we are smart, moral and right.
Mistakes Were Made is fascinating, insightful, and eye-opening. The authors' ability to explain complicated theories of social psychology so entertainingly and interestingly sets this book apart from other academically oriented tomes.
Product warning: Read this book and you'll never be able to look at your behavior the same way again.
keep this book close by at all times Written with wit, wisdom, and cogent, memorable examples, this book should be mandatory reading for everyone in any kind of relationship, personal or business, and more than mandatory for those in political office in Washington (if only they could read!).
Why people rationalize and justify obviously bad actions After hearing an NPR interview with Carol Tavris, I sought out this book and was delighted with it. With ample basis in scholarly research, it is solid and has practical applications in my own life.
I have often been fascinated by why seemingly good people commit bad actions, and then go to great lengths to rationalize and justify them, at the expense and to the detriment of others. Politicians and bureaucrats are those from whom I expect this.
But when it enters your own life, understanding the basis for it becomes necessary and vital to your own well-being.
The concept of cognitive dissonance, and how it drives people to distort their perception of reality so that the gap between their ideals and their behavior disappears, enters our daily lives. The problem, of course, is that this reinforces the behavior that caused the dissonance in the first place, and it starts you down a road of deceit and lies built upon more lies. Having lived the nightmare of dealing with a "cognitive dissonant" of the nth degree, I found that Tavris helped open my eyes.
Tavris's political discussions will no doubt displease conservative readers, and they were brave of her to tackle. This book is strong, and I highly recommend it.
Simply excellent, a window into rationalization, justification and reducing cognitive dissonance This is a great book. I heard Carol Tavris interviewed on the Point of Inquiry podcast and thought she was interesting. I looked her up and decided to pick this up.
This book is like an owner's manual for your own reasoning and critical thinking; it is just plain smart. Essentially, Tavris and Aronson explain how and why we all come to believe we are right and reasonable, in spite of overwhelming evidence contrary to our firmly held beliefs.
I found this book well written, expertly cited, and very easy to read. While I am not a psychology professional or academic, I am a skeptic, and I can say the book has a scientific, scholarly feel of legitimacy.
This book will likely force you to do quite a bit of reflection. Highly recommended.
Very good book. Bought two copies for my children. As a retired trial lawyer, I think it is required reading for everyone. I first encountered the theory of cognitive dissonance in undergraduate school in the early '60s. Since then the theory seems to have proved its worth in continued scientific studies. About fifty years of scientific scrutiny lends great credibility.
I remember the "recovered memory" sex accusations as they unfolded starting twenty years ago or so, and I remember telling my wife and teenage children that I thought the "evidence" was not credible. This book does an excellent job of stating the case for the inaccuracy of memory. I hope this leads to more informed discussion of the subject.
A MUST-read for everyone What we don't know WILL hurt us.
Well researched. Well written. This book gives a different perspective when evaluating situations and people.
We're really not as smart as we think we are (or would want others to believe...).
Cognitive dissonance I think the concept of psychological faddism is addressed by these authors in a particularly insightful way. It's a breath of fresh air to find psychologists who admit there is such a thing, let alone specify particular instances.
A psychological eye-opener. I have to admit: I read the book twice. The first time, I bogged down after every other chapter because I needed to reconcile what I was reading with what I regarded as true. Many times the book talked about me and how I justified some aspects of my life. It actually portrays scenarios very close to my own circumstances!
So, the first time I read it, I felt defensive, because the renowned psychologists and authors, Carol Tavris and Elliot Aronson, wrote chapters clearly explaining how I had self-justified my decisions in life, and how I made myself believe all the stories I told. I also read chapters filled with references to historical, as well as current events, supporting the authors' theories of cognitive dissonance, prejudice, and hypocrisies in our governments and societies. Every page was an eye-opener that required some serious reflection.
When I reached the end, when all the angles of self-justification and self-deception were finally exhausted, I took a long pause. Then I read the book again, and this time I was much more open to a better understanding of the principles the authors shared. Only then did I appreciate the nuances of this mental phenomenon of believing only what we want to believe. "Believing is seeing."
Tavris and Aronson do a marvelous and professional job explaining the self-justifying mechanisms of memory, law enforcement, marriage, and war: how we manipulate our own memories to validate our bad decisions; how officers of the law engage in "testilying" to back up their preconceived notions; how husbands and wives rationalize divorce; and how heads of state convince the people, and themselves, that they never make mistakes.
What really impressed me about Mistakes Were Made (but not by me) are the countless quotes and references to the words and actions of well-known personalities, celebrities, and politicians. The allusions could be construed as bold and audacious, but they are all public knowledge--quoted from news items, scientific journals, and research papers--and serve well to prove the authors' theories.
So, if you're curious to know how crooks, criminals, and evildoers can sleep at night, and how bitter couples and warring nations can live with themselves, grab this book. And, yes, read it twice. - Ruby Bayan, OurSimpleJoys.com
Research & Polemic I loved this entertaining and easily read overview of research on cognitive dissonance. The book lost some of its gloss for me when it used the research to make a political statement about the current administration, even though I certainly agree with that statement.
The key is self-awareness, and the book makes a significant contribution to that goal.
interesting, but has flaws About: How people self-justify and reduce cognitive dissonance in their thoughts and actions.
Pros: Will make you think about how you think. Lots of examples. Footnotes.
Cons: Dry writing, very academic, examples tend to focus on politics and more variety would have been nice.
TRULY Great I must admit, I was almost swayed by the reviewer who called this book "almost great" but was so offended by the use of Bush as an example of the dangers of unchecked self-justification. Like Mr. Almost Great, I don't much like books with a heavy political tilt either. But because I was intrigued by the accolades from some of my favorite authors on the dust jacket, I scanned Almost Great's many reviews on Amazon (including 5 stars for Ann Coulter's liberal-hating books--oy vey!). That decided it for me; I bought the book and read it in an evening.
I LOVE LOVE LOVED Mistakes Were Made! It is TRULY Great.
Reading it, you will learn about your own life, about psychology research, and yes, about politics, but it is not a political book in my opinion. It's a psychological detective story linking all sorts of puzzling, hilarious, and downright tragic human behavior to a simple, elegant theory. Moreover, it is written with humor, clarity, and wisdom, and is based on fifty years of research, much of it the work of Aronson, a giant in the field of psychology. And despite what some have said, I found it exceedingly fair and balanced--it points out the errors and virtues of both Republicans and Democrats--unlike books by, say, Ann Coulter, which are anything but fair, much less well researched.
For example, it explains with crystal clarity why both Bush and LBJ wouldn't budge from a stay-the-course mentality when in both cases it is/was clear to most outsiders that staying the course is/was insane. And it relates these monumental insanities to the kinds of decisions and screw-ups and intransigences we entangle ourselves in every day.
I'm a huge fan of Malcolm Gladwell's books and articles, and of Daniel Gilbert's "Stumbling on Happiness," for the way they illuminate how our minds work in an entertaining way. Like those books, this one is a joy to read. But unlike those books, which describe the dynamics and then say "isn't that interesting," Mistakes Were Made gives you insight and concrete steps for dealing with the hobgoblins in our own minds and in those of the rationalizing animals--which is everybody--with whom we interact every day. The section on marriage may be the best treatment I have ever read of how to get out of annoying spirals of defensive stupidity with one's spouse. And it's not written in an annoying self-help-bookish way.
So, if you are like the "Almost Great" reviewer and get upset hearing about the errors made by individuals from your favored political party, then you definitely NEED this book, and you need to take its lessons to heart, which apparently Mr. "Almost Great" did not. And even if you don't, at least you'll understand why it's so damn hard to. In other words, it will open your eyes to the psychological dynamics underlying partisanship--including being offended by books or ideas that don't confirm your strongly held political leanings.
I cannot recommend this book strongly enough. It deserves to be a best-seller, read by lots of people and reread over and over. If it were, I think the world would be a better place.
Insights that sting I'm in two book clubs...and to my delight, both of them (operating independently) have chosen MISTAKES WERE MADE as the next book.
It's terrific -- charmingly written and full of perceptive insights. But it is also sobering. It would surely be a hoot to watch as Tavris and Aronson depict the power of self-justification to enable people to make complete fools of themselves....that is, it would be fun if the bell didn't toll very loudly for you and me as well. Happily, the authors give some sound advice about how to escape from the traps that cognitive dissonance sets for us, such as "confirmation bias" -- listening to what you agree with and finding ways to dismiss the rest.
I make plenty of mistakes, but reading this book was NOT one of them... This book was enlightening and disturbing on several points. The most frightening chapter for me discussed police interrogations. Those of you who have been around long enough to remember the stories of the McMartin preschool will be especially horrified to learn of the tactics interrogators use to get you to confess to a crime you never committed.
You would also be amazed by the authors' revelations about the theory of repression. It was most disturbing to me that trained "professional" psychiatrists still attempt to explain a patient's underlying problems through some repression of a traumatic experience. As stated in this book, the problem for most people who have suffered traumatic experiences is not that they forget them but that they cannot forget them: the experiences keep intruding. Yet encouraging some patients to "remember" past traumas has led to the destruction of family relationships. For clinicians to admit they were incorrect in bringing forth the "repressed event," they would also have to admit that their faulty theory resulted in the destruction of the patient's relationships.
Apparently only about 27% of Americans now support our president (or perhaps the person who is really running things... the vice president.) That seems an amazing statistic to me. I mean, how on earth could 27% of the people still think that George W. Bush is doing a worthy job?
Two words: Cognitive dissonance.
Cognitive dissonance is a psychological term which describes the uncomfortable tension that may result from having two conflicting thoughts at the same time, or from engaging in behavior that conflicts with one's beliefs. If you voted for Bush in 2000, and maybe even again in 2004, you would have to admit that you made a mistake in electing a man that has been possibly this nation's worst president.
When I read this book, it all became very clear to me. We all do what we can to appear to be doing the right thing. We all want to look wise and knowledgeable. And when we make a mistake, we attempt to justify what we have done. No one wants to appear clueless; not even the Bush administration. That's why we have several different justifications for why we are in Iraq. They keep changing the reasons, because to admit a mistake would be to look unwise and foolish to the American people.
I told everyone I know that they absolutely need to read this book. But, they'll have to get their own copy because I want to reread mine.
The Milgram experiment in a new light The Milgram experiment, in which volunteers in a study were "commanded" by the head of the experiment to administer shocks of gradually increasing voltage to other volunteers, even as the recipients screamed in pain, is often used as an example of how sheepish people are in the face of authority. (The shockers continued shocking after the simple statement by the head scientist that "the experiment requires that you continue." Talk about The Shock Doctrine!)
This book has a different, more pragmatic take on that experiment, to wit: humans drift off into unethical behavior by taking gradual steps, each of which erodes their resistance to taking the next step toward eventual criminal behavior. Once a person takes that first step toward corruption, the following steps become nearly irresistible. Rationalization is therefore a "gateway drug" of which there is an unlimited supply. Let "The War on Rationalization" begin!
The mechanism of rationalization is the subject of this book, and it is described by dissonance theory. This theory (which I suppose we should call a hypothesis until further notice) offers a convincing explanation for why people "blame the victim" so often and so readily. If I do something unethical (which we, as social animals, are more or less hard-wired to recognize, at least on some level), I can either recognize that lapse and atone for it or, as most people opt to do, rationalize the unethical behavior in order to defend myself and my social status, at least in my own eyes and in those of my "underlings." Once I choose rationalization to minimize the discomfort I feel at the lack of congruence between my image of myself as an ethical person and the actions I know were unethical, I slide down the slippery slope toward projection. Projection is the device by which I further alleviate my discomfort by projecting my own unethical behavior onto the victim of my unethical action. This has been the M.O. of Rove and company: out a CIA agent investigating WMD in Iran because she's married to the guy who cast doubt on your story about Iraq's WMD (nope, no WMD here!), then blame the CIA agent and her husband for betraying the country. Works like magic! Except, of course, that there were no WMDs, and the revelation of the agent's name by the press, encouraged by Rove, Cheney, Libby, et al., put all those associated with that agent in mortal danger. Yeah, that's how dissonance, rationalization, and projection roll.
This book does an excellent job of describing the mechanism and showing how we all are subject to its "wonder working ways." I highly recommend it.
This book could have been written in 15 pages This book is about cognitive dissonance. The authors spend one chapter (the first) and 29 pages describing what it is and what impact it has on our behavior, and then spend the rest of the book on a series of "pop culture" examples of cognitive dissonance at work. A variety of application areas are addressed--politics, science, love, law, medicine, etc.--but nothing is done to flesh out the topic in greater detail or to prescribe how we can overcome the liabilities that cognitive dissonance can create. My cognitive dissonance is that I wasted $20 on this book and a few hours reading it, but writing this review makes me feel better if it saves someone else the trouble.
Important for all of us I applaud Tavris and Aronson for such a "needed" work, especially for our current times.
Read the many other excellent reviews for the actual content of the book, as I don't have anything more to add to them except a "thank you" for posting them. My purchase of the book came from these excellent reviews.
I found it to be less of a pointed or preachy "political book" than some here on this forum would say.
Having read this book, I should probably never be so certain about my position on an issue or my memory of an event. These are indeed hard "habits" to break, in the sense that the tendency seems to be "hard-wired" into each of us.
It seems to me to be a lesson on compromise, listening, dialogue, consideration of another's thoughts, and selflessness, and on the imperative not to feel that one needs to be "right" about an issue; instead, we might consider that we should only feel the need to be "understood"--with the contingency that we try to reciprocally "understand."
Regarding some of the criticisms of the book: leaving out current political or religious issues would miss a valuable example and lesson in how we collectively become self-righteous as a political or religious body. One only needs to read today's headlines on the MSNBC site to see the cognitive dissonance: "Waterboarding: 'probably saved lives'"--to which I'd ask, "Really?"--or "Israeli tanks enter Gaza"--to which I wonder, "Will this back-and-forth never end?"--or "Dozens killed as blasts rock Algiers." My hope for this world is an end to the need to be "right," during this season and always.
Problem Without a Solution A well-written discussion of self-justification as a defense against admitting you were wrong or made a mistake. Having described the problem, however, it offers no solutions.
The book also ignores external contributions to creating a toxic social environment, for which you should read Phil Zimbardo's The Lucifer Effect and Machiavelli's The Prince.
Interesting but has a definite liberal bias The book is interesting. Its subject is our "blind spots." Unfortunately, the authors seem to have a few blind spots of their own, which is not surprising, but it makes the book annoying to those of us who don't buy into the liberal political view. Specifically, almost every time the authors pick a public figure as an example of bad behavior, they pick a conservative or Republican: Dick Cheney, George W. Bush, Antonin Scalia, etc. Lyndon Johnson and Bill Clinton get mentioned, but in a much softer light. It seems liberals just don't make as many mistakes as us conservatives. I expected better, especially since the authors spend a lot of time talking about the importance of unbiased psychological experiments.
One Woman's Voice A fascinating look at the machinations our minds go through to keep from giving us the bad news. It is surprising, in fact, that "really bad news" even exists--at least as far as our own actions are concerned. I thought a small portion of the book (just short of the middle) was redundant, but luckily the authors moved on and provided more unique food for thought. Although the information was scientific and scholarly, it was easily accessible and interesting. It's a fun book to talk about, too.
Scary and hopeful - everyone should read this I was impressed by the wide variety of topics in which this self-justifying thinking can occur. Examples range from US politics and war, to prosecutors and detectives, to the average husband and wife, to the elderly taken in by scams, to education, all showing how we self-justify and coerce others into believing "our side" is right.
I would consider this essential reading, especially for US citizens. Our culture has been duped time and time again by media and politicians who become so engrossed in this behavior of deceptive emotional rationalization that we accept it as normal and do it ourselves. We live in a "hot potato" culture that quickly throws the hot potato (the blame) into someone else's lap and then justifies it. This must stop.
I felt the writing was easy to read and interesting. I didn't have to "strain my brain" to get the point. There are plenty of footnotes to support what is being said. Others have talked about a liberal slant; I didn't see it. When speaking of Christian/Islamic relations, negatives were shown on both sides. How is that liberal? It seems rather unbiased to point out the flaws of both sides in order to show how the escalation of violence and war has occurred over the past millennium. Likewise, US political leaders were picked from both the Democratic and Republican parties as examples. Where is the slant in that?
I give it 4 stars, as I noted some grammatical errors, and some of the flow could have used better editing. But hey, "mistakes were made"--let's move on. I would definitely recommend this book to anyone in a position to dramatically influence others' lives: lawyers, doctors, politicians, journalists, judges, teachers, etc. But change starts at home, and so everyone should give this book consideration in order to create a society that is more harmonious and less dissonant.
Understanding and motivation This is an extremely readable, perceptive and important book. It explains clearly and undoubtedly accurately how many people think and act.
I bought the book after reading a friend's copy, just so I could reread it and make notes all over it.
Good information, interesting subject, poorly written If these authors ever write a novel, I'll make sure that I DON'T buy it. The writing isn't very good, but the subject matter is interesting and can be recommended for everyone.
Great overview of cognitive dissonance Ready for a whirlwind tour through time and space, from the Crusades and the Holocaust to the war in Iraq, from recovered memories and the fallacies of clinical judgment to false confessions, wrongful convictions, and failed marriages? Then this is the book for you.
What ties these disparate topics together, according to tour guides Carol Tavris and Elliot Aronson, is the notion of "cognitive dissonance," which has been creeping into popular awareness in recent years. Cognitive dissonance is the uncomfortable feeling created when you experience a conflict between your behavior and your beliefs, most specifically about who you are as a person. ("I'm a good person, I couldn't do this bad thing.") To reduce dissonance, people engage in a variety of cognitive maneuvers, including self-serving justifications and confirmation bias (paying attention to information that confirms our beliefs while discounting contrary data).
Tavris and Aronson, both top social psychologists and excellent writers to boot, make their point through the repeated use of a pyramid image. Two people can be standing at the top of an imaginary pyramid and undergo the same dissonance-inducing experience. Person A processes the experience accurately, which leads him down one side of the pyramid. Person B engages in a series of defensive maneuvers to reduce cognitive dissonance that eventually lands him at the opposite side of the pyramid. Once at these opposite poles, the two can no longer recognize their initial similarities, and they see each other as unfathomable and even dangerous. A particularly compelling, real-life example is two men who experienced a terrifying episode of sleep paralysis in which they saw demons attacking them. One recognized it for what it was; the other became convinced that he had been abducted by aliens and had even fathered a set of twins with an alien partner.
The book could have been called, "Cognitive Dissonance: What It Is and How to Combat It," but then it wouldn't be selling like hotcakes. It provides a thorough overview of the social psychology research on this topic, much of it quite interesting and all of it engagingly presented.
The authors conclude by offering suggestions for reducing the impact of cognitive dissonance on individuals and cultures. One remedy is greater oversight, such as mandatory videotaping of all police interviews of suspects, independent commissions to investigate prosecutorial misconduct, and greater transparency in the academic review process. Another is attention to Americans' cultural fear of making mistakes. Intelligence is acquired, not innate, the authors argue, and mistakes are a necessary part of learning. I particularly enjoyed their examples of prominent individuals who forthrightly owned up to mistakes, including a therapist who had engaged in recovered memory treatment, a prosecutor who had obtained the conviction of an innocent man, and - last but not least - Oprah Winfrey.
Almost a Great Read This book covers some compelling subject matter. The concept of cognitive dissonance is very interesting and very relevant.
However, the authors do themselves and their book a disservice by overusing politically charged anecdotes to demonstrate instances of cognitive dissonance. In doing so, they tend to annoy or even alienate the reader. Another byproduct of this mistake is that they find themselves "reaching" with some stories that don't really make the point they are trying to make.
Although I came away from the book agreeing with the basic premise, it was somewhat difficult for me to embrace the concepts completely knowing that the authors were unable to set aside their ideological biases (given the subject matter, this is humorously ironic).
Politics aside, the core material is compelling. Reading this book is like listening to an engaging speaker with really annoying idiosyncrasies that you have to mentally block out in order to enjoy the speech.