The Human Mind


If you want to design anything involving carrots, or rabbits, or oak trees, you would first set out to understand as fully as you can how carrots, rabbits or oak trees 'work': what their nature is, what they need, what tends to go wrong with them when they get sick, and so on.

So it makes sense to try to figure out how people 'work' if we are to design any systems that involve people, and people are involved in all of our designs.

A New Model of the Mind

This model is a real-life, very successful application of the permaculture principle of "learning from nature".

It asks (and answers!) profound questions about how on earth Nature could design something as (apparently) randomly defective (& randomly brilliant) as the human mind. 

Implicit in our "learning from nature" principle is the belief that if something seems not to work well (or takes a lot of effort to work & displays many 'problems'), it is more likely that we haven't understood how it's supposed to work than that Nature has made some mistake, or produced some defective creation.

For example, in agriculture we have traditionally treated soil as a thing to be mined, with many resulting struggles & problems that later created even bigger problems.
A simple change of model (understanding soil as a complex organism designed to self-regenerate, which affects all levels of Life on Earth, up to the global climate) made it possible to see the much larger brilliance that Nature had in fact evolved, and that we had systematically destroyed in our ignorance.

A very similar thing is happening in our (mis)understanding of the human mind, as practically all models of the mind in existence today (echoed by all of popular culture) imply that "there is something wrong" with human nature (specifically with our minds) which has to be 'put right', and that what comes instinctively to us (as amply visible in all infants) has to be controlled & 'civilized away'.

This model is revolutionary in that it firmly states that there isn't anything wrong with the human mind .. we simply took a very early 'wrong turn' in learning how to operate it to its full potential, and this early mistake was passed down the generations, with disastrous consequences (much like our culture's 'wrong turn' in stripping soil of its mulch, and pretending that it still functions at its nature-designed optimum).

But each new generation is still born with all systems fully operational; we just have to re-learn to notice and use them correctly .. and at the very least avoid disabling them in our youngsters.

Two Summaries

A very good summary of the theory can be found here (taken from an ecological agriculture site!): see Attached: RC Theory Summary.pdf

Micheline Mason's illustrated introduction can be downloaded here >  rc3-1.pdf

and this is her personal story of what RC did to her life:

"I have always hated counselling, therapy, and do-gooders.   As a disabled person it has made me angry when non-disabled people have implied that we are miserable because of the impairments themselves and not because of the way we are treated by the oppressive society. However, when I read ‘The Human Side of Human Beings’ by Harvey Jackins, and then met the man and saw him in action, I knew this was something different. To put it bluntly, it gave me an explanation of why people are so intelligent one minute, and so fucking barmy the next." ... continue reading on

And the full 'official' theory can be studied here  >>>

Video: How the Brain Works

According to official modern science (not the RC Model)

A mind-change?

Being clear about what model of how the mind works we are operating from (our basic assumptions) is a logical first step in figuring out how to change the way humans think, something that is clearly needed in order to change society.   

These are some other resources (from people who do not know about the RC model) in which the issues of mind-change are explored, and which interestingly pose the very same questions the RC pioneers did, but fail to explain some of the observations they have noted:

18th Century Mind

Video - Why you can't understand 21st Century politics with an 18th Century mind

Neuroscientist and linguist George Lakoff describes his insights after decades of studying how people think, which turns out to be quite different from how most people think they think. The video below is just one of many of his talks which deepen our understanding of how we think and why we can't grasp 21st century thinking with an 18th century mind. This also lends credence to the spiral dynamics model, and explains why communication between the different memes can be so problematic. Important stuff which could help us communicate the message of permaculture to a world which is increasingly ready to listen. How do we help people break out of old patterns of thought?


Depressing Discovery about the Brain

Yale law school professor Dan Kahan’s new research paper is called “Motivated Numeracy and Enlightened Self-Government,” ...

...  It turns out that in the public realm, a lack of information isn’t the real problem.  The hurdle is how our minds work, no matter how smart we think we are.  We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.


Nyhan and his collaborators have been running experiments trying to answer this terrifying question about American voters: Do facts matter?

The answer, basically, is no.  When people are misinformed, giving them facts to correct those errors only makes them cling to their beliefs more tenaciously.

Here’s some of what Nyhan found:
  • People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.
  • People who thought George W. Bush banned all stem cell research kept thinking he did that even after they were shown an article saying that only some federally funded stem cell work was stopped.
  • People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs.  They were asked whether the number of people with jobs had gone up, down or stayed about the same.  Many, looking straight at the graph, said down.
  • But if, before they were shown the graph, they were asked to write a few sentences about an experience that made them feel good about themselves, a significant number of them changed their minds about the economy.  If you spend a few minutes affirming your self-worth, you’re more likely to say that the number of jobs increased.

When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win.  The power of emotion over reason isn’t a bug in our human operating systems, it’s a feature.

Why it's hard to Change

Why It's Hard to Change People's Minds

By Sean Gonsalves, AlterNet. Posted 7 Oct 2008

A new study shows that after being exposed to information contradicting their ideas, most people still cling to their prejudices.
A long time ago, Mark Twain told us: "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

Entwined in Twain's train of thought, is an implicit -- and important -- distinction: the difference between being uninformed and being misinformed.

Today, there's scholarship to back up Twain's theory that being ignorant isn't as troublesome as being certain about something that "just ain't so."

Ignorance can be educated. But what's the antidote to misinformation? Correct information?

Not exactly -- according to political scientists Brendan Nyhan and Jason Reifler, co-authors of one of the few academic studies on the subject, "When Corrections Fail: The persistence of political misperceptions."

While it may seem like common sense to think misinformation can be countered by giving people the real 411, Nyhan and Reifler's research indicates that correct information often fails to reduce misperceptions among the ideologically-committed, particularly doctrinaire conservatives.

That's something many readers of this column understand intuitively, after having seen false claims like Obama-is-a-Muslim refuted over and over again and yet, unbelievably, somehow persist.

There's lots of research on citizen ignorance but there's only a handful of studies that focus on misinformation and the effect it has on political opinions. Nyhan and Reifler's work adds to what Yale University political scientist Robert Bullock has found: it's possible to correct and change misinformed political opinions, but the truth (small 't') ain't enough.

In Bullock's experimental study, participants were shown the transcript for an ad created by a pro-choice group opposing the Supreme Court nomination of John Roberts. The ad falsely accused Roberts of "supporting violent fringe groups and a convicted clinic bomber."

What Bullock found was that 56 percent of the Democratic participants disapproved of Roberts before hearing the misinformation. After seeing the attack ad, it jumped to 80 percent.

When they were shown an ad that refuted the misinformation and were also told the pro-choice group had withdrawn the original ad, the disapproval rating didn't drop back down to 56 percent but to 72 percent.

Nyhan and Reifler conducted a series of studies where subjects were presented with mock news articles on "hot button" issues that included demonstrably false assertions like: Iraq possessed WMD immediately before the U.S. invasion. Tax cuts lead to economic growth. Bush banned stem cell research, as Sens. Kerry and Kennedy claimed during the 2004 presidential campaign.

With the Iraq-possessed-WMD-immediately-before-the-invasion assertion, participants were shown mock news articles supporting the unfounded Bush administration claim and then provided the refutation by way of the Duelfer Report, which authoritatively details the documented lack of WMD, or even an active production program, in Iraq just before the invasion.

But instead of changing the minds of ideologically-committed war-backers, Nyhan and Reifler found a "backfire effect," in which Iraq invasion-supporters only slightly modified their view without letting go of the misinformation by saying "Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived." Sigh.

Nyhan and Reifler attribute that kind of "thinking" to the effects of "motivated reasoning," which can distort how people process information.

"As a result (of motivated reasoning), the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups."

Now you know why those back-and-forth on-line debates so often prove to be fruitless. Unfortunately, neither Bullock, Nyhan, nor Reifler suggests a way to successfully counter misinformation clung to by those who hold their political opinions with an air of certitude.

Washington Post columnist Shankar Vedantam suggests wrapping refutations in language that enhances the self-esteem of the misinformed.

Whatever you do, just don't forget Twain's timeless advice: "tell the truth or trump -- but get the trick."

Sean Gonsalves is a syndicated columnist and news editor with the Cape Cod Times.

Self Delusions

Three Self-Delusions That Influence Your Decisions And Productivity

By Kevin Purdy, Sun Aug 21, 2011
Why do you put things off, buy over-priced items, and stick with decisions that aren't paying off? Your strangely wired brain, silly. The author of You Are Not So Smart shows us a few common mental defects, fallacies, and traps to watch out for.

David McRaney spends a lot of time thinking about all the ways thinking doesn't work. He catalogues delusions, fallacies of thinking, and the psychological short-circuits that cause procrastination, groupthink, and poor decisions. But McRaney swears his index of common mental shortcomings actually inspires him--and could inspire you to know thy working self.

You Are Not So Smart, McRaney's blog and forthcoming book, is intentionally labeled as a "Celebration of Self-Delusion." Sure, topics like the bystander effect, showing how bigger crowds encourage less help for people in trouble, and the backfire effect, where people learn to reject science when it questions their beliefs, are likely to get under anyone's skin after some reflection. But McRaney says that understanding our mental malfunctions should inspire us.

"(It's) an appeal to be more humble and recognize we can't always overcome these things, so we should factor them into our lives, our business practices, our politics," McRaney wrote in an email exchange. "If we know we are all equally susceptible to certain fallacies, biases, heuristics, prejudices, manipulations--we can use that knowledge to appeal to our better angels."

Here are a few of the self-delusions McRaney writes about that are most apt to throw you off during those 40 hours you're paid to think straight and make decisions.

The Sunk Cost Fallacy

You'd like to believe that you can evaluate the future worth of a project, an investment, or just a laptop with the stoic gaze of a Wall Street lifer. But you tend to favor those things you've already "invested" in, because otherwise--horror of horrors--you'd have made a mistake in your past.

That's the sunk cost fallacy. Another short version: The pain of losing something is twice as strong as the joy in gaining the same exact thing. McRaney wrote that this single understanding has made the biggest change in his life. "There are a lot of applications, like ejecting from a career path, a degree, or a relationship instead of staying the course, just because you've already invested a lot of time and effort into it. It's a silly thing we all do, and I used to fall prey to that one every day."

The Anchoring Effect

When you've chosen from a set of options, be they shirts, work bids, or employees, you like to tell yourself that you found the sweet spot between price and value. In reality, the first option you saw--the white oxford number, the lowball offer, the woman with the non-profit experience--has a significant impact on what you end up choosing.

McRaney illustrates the anchoring effect with an experiment in which researchers described an item, like a bottle of wine or a cordless trackball pointer, and then had volunteers write down the last two digits of their Social Security number--just as a joke, ha, ha, now let's actually bid. In the end, people with higher Social Security digits paid up to 346 percent more than those with lower numbers.


Procrastination

What more is there to learn about the nearly universal vice of putting things off? Plenty. McRaney states that writing on the topic of procrastination was "telling my life story."

In exploring the science behind that ever-growing pile of dishes in the sink, you learn about present bias, our conscious inability to notice that our tastes change over time. He presents a glimpse at the struggle for delayed gratification through the "marshmallow test." And another term you probably understand as well as anybody else, but maybe didn't have a name for: hyperbolic discounting. In other words, you learn that being stronger in the face of your every mental instinct requires that you be "adept at thinking about thinking," as McRaney writes:

You must realize there is the you who sits there now reading this, and there is a you sometime in the future who will be influenced by a different set of ideas and desires, a you in a different setting where an alternate palette of brain functions will be available for painting reality.
… This is why food plans like Nutrisystem work for many people. Now-you commits to spending a lot of money on a giant box of food which future-you will have to deal with. People who get this concept use programs like Freedom, which disables Internet access on a computer for up to eight hours, a tool allowing now-you to make it impossible for future-you to sabotage your work.


If you are interested in dialoguing about this with other permaculture designers, visit here:

There's a dialogue related to this page in the Integral Permaculture FB group (click icon to go there)


RC Theory

A theory about how the mind and society work.

and these are the individual chapters, for reference:


1) About

2) Human Nature

3) Intelligence

4) Distress Hurts

5) Human Irrationality 

6) Recovery from Distress 

7) On Reality

8) Feelings

9) Freedom of Choice

10) Communication

11) Learning

12) Human Relationships

13) Leadership

14) Theory & Policy

15) Organization

16) Human Societies

17) Oppression

18) Human Liberation 

19) Social Change

See the 3 Strands of Science page for objections to this theory.

Also see the 
in this e-book