Focus on E-Hod: Overcoming Hard-Wired Programming

Articles like the one below can be very misleading, especially in their Tone of Hopelessness.

Yes, our BELIEFS can be hard-wired into our Brain, making it very hard to see 1 + 1 = 2 when the 2 will challenge our Beliefs. What this Article does is just replace one Programming with another. It uses “Knowledge” to show that the other side is guilty of faulty Reasoning, but it fails to look in the Mirror of its own Beliefs.

The Article fails to mention that we can overcome this with a Spiritual Discipline like Xoting: not by replacing one set of Beliefs with another, but by being able to reach at least an E-Hod (state of consciousness) that is without Thought and therefore without Belief. From this E-Hod or O-Hod, we can clearly see our Biases and Programming. When we can move from one Hod to another Hod, we are moving from the Ground of “Open-To-Be”. This can be called “Enlightenment”, “Buddha Hood” (Buddha Hod), “GodHood” (Xot Hod), etc.
Hods are ossifications of “Open-To-Be”. They are Projections that become solidified into what we call Reality.
U-Hod (Physical Reality) is an ossification of “Open-To-Be”
A-Hod (Emotional Reality) is an ossification of “Open-To-Be”
I-Hod (Robot Reality) is an ossification of “Open-To-Be”
Æ-Hod (Intellect Reality) is an ossification of “Open-To-Be”
E-Hod (Intuit Reality) is an ossification of “Open-To-Be”
O-Hod (Holy Reality) is an ossification of “Open-To-Be”

 

EXCERPTS FROM: The Most Depressing Discovery About the Brain, Ever

http://www.alternet.org/comments/media/most-depressing-discovery-about-brain-ever#disqus_thread

AlterNet / By Marty Kaplan

September 16, 2013

Kahan conducted some ingenious experiments about the impact of political passion on people’s ability to think clearly.  His conclusion, in Mooney’s words: partisanship “can even undermine our very basic reasoning skills…. [People] who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.”

…  The hurdle is how our minds work, no matter how smart we think we are.  We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.

For years my go-to source for downer studies of how our hard-wiring makes democracy hopeless has been Brendan Nyhan, an assistant professor of government at Dartmouth.

Nyhan and his collaborators have been running experiments trying to answer this terrifying question about American voters: Do facts matter?

The answer, basically, is no.  When people are misinformed, giving them facts to correct those errors only makes them cling to their beliefs more tenaciously.
Here’s some of what Nyhan found:

* People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.
* People who thought George W. Bush banned all stem cell research kept thinking he did that even after they were shown an article saying that only some federally funded stem cell work was stopped.
* People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs.  They were asked whether the number of people with jobs had gone up, down or stayed about the same.  Many, looking straight at the graph, said down.
* But if, before they were shown the graph, they were asked to write a few sentences about an experience that made them feel good about themselves, a significant number of them changed their minds about the economy.  If you spend a few minutes affirming your self-worth, you’re more likely to say that the number of jobs increased.
In Kahan’s experiment, some people were asked to interpret a table of numbers about whether a skin cream reduced rashes, and some people were asked to interpret a different table – containing the same numbers – about whether a law banning private citizens from carrying concealed handguns reduced crime.  Kahan found that when the numbers in the table conflicted with people’s positions on gun control, they couldn’t do the math right, though they could when the subject was skin cream.  The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem.
I hate what this implies – not only about gun control, but also about other contentious issues, like climate change.  I’m not completely ready to give up on the idea that disputes over facts can be resolved by evidence, but you have to admit that things aren’t looking so good for reason.  I keep hoping that one more photo of an iceberg the size of Manhattan calving off of Greenland, one more stretch of record-breaking heat and drought and fires, one more graph of how atmospheric carbon dioxide has risen in the past century, will do the trick.  But what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.

… Denial is business-as-usual for our brains.  More and better facts don’t turn low-information voters into well-equipped citizens.  They just make them more committed to their misperceptions.  In the entire history of the universe, no Fox News viewer ever changed their mind because some new data upended their thinking.  When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win.  The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.