
Why Facts Don’t Change Our Minds

shunyadragon

Premium Member
As many as ~50-60% of Americans believe in some form of literal Genesis as the account of human origins and history.

From: Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?
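A minimal simulation of the design described above (a sketch with illustrative numbers, not the study's actual data) shows why the fictitious feedback carried no information about real skill:

```python
# Sketch of the 1975 design: students guess at chance, feedback is
# assigned arbitrarily. All numbers here are illustrative assumptions.
import random

random.seed(0)
N_STUDENTS, N_PAIRS = 200, 25

def actual_score():
    # A student with no genuine ability to spot the real notes
    # gets each of the 25 pairs right with probability 0.5.
    return sum(random.random() < 0.5 for _ in range(N_PAIRS))

scores = [actual_score() for _ in range(N_STUDENTS)]

# Half are told they scored 24/25, half 10/25 -- independent of
# how they actually did, just as in the study.
high_feedback = scores[:N_STUDENTS // 2]
low_feedback = scores[N_STUDENTS // 2:]

print(f"told 24/25 -> mean actual score {sum(high_feedback)/len(high_feedback):.1f}/25")
print(f"told 10/25 -> mean actual score {sum(low_feedback)/len(low_feedback):.1f}/25")
# Both groups land near 12.5/25 (chance). The feedback carries zero
# information, so any post-debrief gap in self-assessment tracks the
# discredited impression, not the evidence.
```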


 

Revoltingest

Pragmatic Libertarian
Premium Member
Perhaps certainty had great value in early man's evolution, even when wrong.
It also applies to political party success.

Oh, come on....you knew I had to go there.
 

Sunstone

De Diablo Del Fora
Premium Member
It seems to me that a lot of "irrational" reasoning isn't wholly irrational in certain contexts.

Not every stranger is a danger -- but some strangers are. If you have no reasonable grounds to judge whether or not a stranger is dangerous, you are safer assuming they are dangerous than you are assuming they are benign.

So, in evolutionary terms, the person who assumed that all strangers were dangerous was better off (more likely to survive to reproduce) than the person who assumed that all -- or at least some -- strangers were benign. Moreover, the person who assumed all strangers were dangerous was no less likely to reproduce than the person who took the more rational position of assuming only that all strangers could be dangerous until proven otherwise.
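A rough expected-payoff sketch (in Python, with made-up numbers) makes the asymmetry concrete:

```python
# Back-of-the-envelope payoff comparison for the two strategies above.
# Every number is an illustrative assumption, not measured data.
P_DANGEROUS = 0.05       # assumed share of strangers who are dangerous
LOSS_ATTACKED = -100.0   # payoff for trusting a dangerous stranger
LOSS_SHUNNING = -1.0     # small payoff forgone by avoiding a benign one
GAIN_COOPERATION = 2.0   # payoff for trusting a benign stranger

# Strategy A: treat every stranger as dangerous (always avoid).
avoid_all = P_DANGEROUS * 0.0 + (1 - P_DANGEROUS) * LOSS_SHUNNING

# Strategy B: treat every stranger as benign (always trust).
trust_all = P_DANGEROUS * LOSS_ATTACKED + (1 - P_DANGEROUS) * GAIN_COOPERATION

print(f"avoid everyone: {avoid_all:+.2f} per encounter")
print(f"trust everyone: {trust_all:+.2f} per encounter")
# -0.95 vs -3.10: when one kind of error is rare but catastrophic,
# blanket suspicion beats blanket trust even though neither policy
# weighs the evidence about any individual stranger.
```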
 

Sunstone

De Diablo Del Fora
Premium Member
Reason is not, and never has been, more important than telling a good story. Bards know this well.

And -- perhaps to complete your thought -- there are both systemic and non-systemic ways in which a good story tends to differ from reality. As they say, "Truth is stranger than fiction".
 

sun rise

The world is on fire
Premium Member
I could just be having some fun, but what about the idea that college students, especially ones from elite universities, have a worse problem with cognitive dissonance than the general populace?

The serious point is that confounding variables need to be examined exhaustively.
 

Brickjectivity

wind and rain touch not this brain
Staff member
Premium Member
...Still, an essential puzzle remains: How did we come to be this way?
This seems to tie into something else I encountered on this site -- studies indicating that our choices are affected by recent decisions and conditions. These studies could have benefited from an extra measure: varying the amount of time after subjects had been told the stories were fictitious. It could be that their reasoning was affected by the immediate choice they had just made, and that, given some time to think, they might have reconsidered more easily.
 

beenherebeforeagain

Rogue Animist
Premium Member
I'm curious... I don't specifically recall encountering these studies, but I've seen similar stories over the years... anyhow...

How long was it between the students being told they had been misled and the next steps in the test? A few minutes? A few hours? Days? Long enough to really absorb and deal with the original information, and the revelation of the deception?
 

Evangelicalhumanist

"Truth" isn't a thing...
Premium Member
Reason is not, and never has been, more important than telling a good story. Bards know this well.
But if I also know that well, can I not then give myself the necessary feedback to think again? While I admit I'm as human as the next person, at the very least I know that I make a very serious and concerted effort to bring my reason to bear even over my own preconceptions. I suspect I might succeed some of the time, perhaps even more often than others who can't be bothered.
 

David T

Well-Known Member
Premium Member
As many as ~50-60% of Americans believe in some form of literal Genesis as the account of human origins and history.

From: Why Facts Don’t Change Our Minds

New discoveries about the human mind show the limitations of reason.

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

Well, I would say your phrasing, "How did we come to be this way?", might be inverted!

No, I know "factually" that my lady friend is magical, in spite of all factualism to the contrary!

Now, the reason for this is extremely complex, starting with the fact that facts reside in the youngest, newest, most immature, undeveloped part of the brain, which has labeled itself "higher functioning." Zero actual evidence for that, in spite of all the facts! It's more like a dumb thumb, really dumb, factually!!

So how we became convinced that everything is fact-based is a myth of a small, tiny, handicapped region of the brain and nothing more.

That said, factually, my lady is magical, which has zero to do with facts or the intellect, the last I checked.

 
Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational.

One of the best examples of how facts don't change our minds is that many people reading this will fail to accept that this applies to all people, including them. Instead it will be taken as evidence for why people they disagree with can be so stupid :D

Still, an essential puzzle remains: How did we come to be this way?

It's not exactly a profound insight to say that being emotionally detached, unbiased, and ruthlessly logical in pursuit of objective truth is not particularly useful for tribal apes in the jungle.

You could say that an essential puzzle is why so many of us think we should have evolved to give primacy to such things.
 
It seems to me that a lot of "irrational" reasoning isn't wholly irrational in certain contexts.

Not every stranger is a danger -- but some strangers are. If you have no reasonable grounds to judge whether or not a stranger is dangerous, you are safer assuming they are dangerous than you are assuming they are benign.

So, in evolutionary terms, the person who assumed that all strangers were dangerous was better off (more likely to survive to reproduce) than the person who assumed that all -- or at least some -- strangers were benign. Moreover, the person who assumed all strangers were dangerous was no less likely to reproduce than the person who took the more rational position of assuming only that all strangers could be dangerous until proven otherwise.

Also, seeing as nurture is very important for our species, a bias towards things we learn should be expected. If we learn from those around us (family, mostly), then, to some extent, we should be biased towards putting more weight on their knowledge, derived from experience, than on our own naive interpretations of events. This aids the transmission of cultural knowledge, group norms, and knowledge-based practices, which are obviously beneficial.

Once we believe something, it very much makes sense for us to be biased against revising beliefs.
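A small Bayesian sketch (illustrative numbers only) shows why that bias falls out of the arithmetic: the stronger the prior, the less a single piece of contrary evidence moves it.

```python
# Posterior belief after one observation that is 4x more likely
# if the belief is false than if it is true. The likelihoods are
# illustrative assumptions, not fitted to any data.
def update(prior, p_evidence_if_true=0.2, p_evidence_if_false=0.8):
    num = prior * p_evidence_if_true
    return num / (num + (1 - prior) * p_evidence_if_false)

for prior in (0.60, 0.90, 0.99):
    print(f"prior {prior:.2f} -> posterior {update(prior):.2f}")
# prior 0.60 -> posterior 0.27  (weak belief: big revision)
# prior 0.90 -> posterior 0.69  (confident belief: smaller revision)
# prior 0.99 -> posterior 0.96  (near-certain belief: barely moves)
```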
 
I could just be having some fun, but what about the idea that college students, especially ones from elite universities, have a worse problem with cognitive dissonance than the general populace?

The serious point is that confounding variables need to be examined exhaustively.

Some studies have shown that more intelligent* people are often less likely to change their minds on things like politics, presumably because, when we are exposed to contradictory evidence about things we are emotionally attached to, the natural response is to look for reasons why we don't have to believe the new evidence.

People with better reasoning abilities are better placed to find justifications for dismissing that which they do not want to be true.

*Based on some metric which substitutes for intelligence
 

DustyFeet

पैर है| outlaw kosher care-bear | Tribe of Dan
A theory: people are no longer "impressed" or "swayed" by facts. We are moving to a new "age", the age of "feelings". Maybe, in order to convince people, it is more effective to approach things from an emotional/spiritual angle? Maybe it's, ummm, more efficient to appeal to people's feelings instead of using facts?

edit: maybe it's more efficient... but waaaaaay less predictable?
edit edit: ok... I'm arguing with my "self"... hello, "self". "If it's less predictable, how is it efficient?" "Good point... forget the word efficient..." "Jolly good, self."
 

GoodbyeDave

Well-Known Member
People with better reasoning abilities are better placed to find justifications for dismissing that which they do not want to be true.
Remember that there is always an alternative explanation. It may not convince anyone but you, but if you want to explain away some inconvenient fact, you don't have to be a genius to come up with a suitable theory.
 

shunyadragon

Premium Member
My primary concern, in reflection on this type of research, is why people from different 'belief systems' persist in rejecting science in favor of their view of the authority of ancient scripture. There is a strong tendency for people to give priority to their 'sense of belonging' and their 'sense of identity' in justifying ancient views of the nature and history of our physical existence over the knowledge of science.
 

Terrywoodenpic

Oldest Heretic
Why do we not accept new scriptural stories,
but have greater faith in them the older they are?
We believe explanations by ancient prophets,
but not the teachings and interpretations of modern man based on the same information.
 

DustyFeet

पैर है| outlaw kosher care-bear | Tribe of Dan
feelings.... nothing more than feelings...

People don't care about facts anymore. Evidence: American politics. It doesn't matter which side you're on; each side claims to have the facts. Facts aren't convincing anymore... sad, but true.

(just my opinion)

The OP asks "why". I don't know why.
 