Thursday 16 October 2014

Dispensing with the Moral/Thought Dictate

In my earlier post today (I'm feeling quite brain-fart-y this morning), I described the emotional reward we get for learning lessons that have no physical nature. The very act of 'recognising' a pattern already triggers a reward in the brain, but seeing this recognition in another human seems to add a secondary, stronger effect: a negative reaction from that other human moves us to adapt our emotional reaction ('this doesn't work here'), while a positive reaction doubly rewards and confirms that pattern (as 'acceptance' also means 'survival' to the human brain). This is what we'd call 'morals', but it could just as well be called 'adapting to new environments'.

As a human first has to rely on its existing memory/emotion cocktail to judge a new environment, what it does next is based on its need for survival: if the situation is dire and doesn't match any existing patterns, it has to use those patterns anyway to do its best to determine a course of action in order to survive; if the situation is not life-threatening or challenging, it has the privilege of choice: either remember the situation or problem for later analysis, or simply dismiss it and forget about it. In a real-life, alone-against-nature situation, the brain has no choice but to rely on reason: it will decide whether a situation is good for it or not, and adjust its neurone network accordingly; yet if the brain is presented with an idea not represented in reality, and that idea is not dismissed, it will try to 'make' it real by imagining it, and will attempt to make the same adjustments.

No matter how we name them or talk about them, our basest instincts revolve around self-preservation. In situations that involve physical things, decisions are empirical and obvious, and 'working methods' are rewarded and categorised as 'good for survival'; but when it comes to untested new non-physical ideas, how does the brain decide whether one 'works' or not? The choice here is simple: trust your own judgement, or trust that of a more experienced someone else; but first, trust your own ability to choose between the two.

The last choice is everything. If you don't think yourself able to 'know' or to 'decide', then you have no choice but to rely on someone else's judgement, meaning that you are dependent on that 'trust tree' I mentioned in my last post. And if an experience doesn't 'match' something 'taught' to you by anyone in your trust tree, then, depending on the level of danger, your brain will either dismiss the experience or remember it as something 'dangerous' (only 'confirming' the earlier mimicked 'lesson').

So for anyone who can't or won't rely on their own judgement, anything, anybody, or any idea originating from 'outside' that programmed sphere falls into a sometimes very scary 'unknown, can't judge, beware' category linked directly to our emotional, instinctive sense of self-preservation.

This is what both religion and totalitarianism tap into. Both divorce a human from its innate ability to judge the world for itself, make it impossible for a developing human to take its first steps in that self-sufficient state, and present (or impose) themselves as the sole recourse for any and all judgement in things both real and imagined, as though the two were equally 'true'. To the untrained and non-rational brain, they might as well be.

This not only limits humans in their education and decision-making abilities, it makes entire populations rely on one source for all their decisions and guidance, instead of measuring themselves against reality and each other. Since that central source's judgement also encompasses what is or isn't dangerous, the people it 'leads' have no choice but to refer to their 'trust tree' for a 'measure' of safety, and to view anything not originating from it as potentially dangerous. If their 'moral guide' dictates that something or someone 'not of theirs' is dangerous, the brain will adjust its emotional and neurone networks accordingly, as though it were a real danger.

Now take two (or more) 'moral guides' competing for control over a single population trained to believe that it 'needs' guidance: no wonder Christianity and Communism hated each other (and 'because atheism' my ass!). Now take two competing 'moral guides' who know that they can never persuade the other's following to follow them, so they paint each other as 'the enemy'. Now imagine that that central dictate is educated only in its own indoctrination methods and inept at everything else. Now imagine that the central dictate starts adjusting its dictates and descriptions of 'the enemy' only to accommodate its own wealth and survival... oh, and throw some nuclear weapons in there, too. Scared yet?

But totalitarian regimes and religion aren't the only ones guilty of this: it exists to a lesser degree in advertising, Fox 'news' and politics, and we all aid and abet the system by 'confirming' (or rejecting) each other's choice of dictate.

If we want to fix this, we first need to tell humans that they are able to make their own decisions and that their brains are wired for it from birth. We no longer have to face nature to 'prove' that ability, but that doesn't mean we don't have it or shouldn't use it, because, after a ~10,000-year-long 'rationality-free' holiday, our very survival depends on it today.

PS: I just thought about someone's 'concentric circles of extremism' argument, the idea that the 'moderates' don't support what the 'extremists' do... f*ck that: ALL of them follow the same dictate, so it's the dictate's support or condemnation of anyone's actions that speaks for all of them, since the dictate is the sole moral guide for them all.
