What to Do if You or Someone You Love is in a Cult
Hey there world! This series has been somewhat of a surprise to me – not in terms of content, but in terms of readership. Ordinarily, I would have moved on to other topics by now, but since this has been picked up by another blog, I need to finish it where I said I would.
I’m doing it via video…eeeeek! Sometimes you need to see whether the person on the other end of the article has a friendly face. I hope you see mine as friendly! Click the link to check out the vid! The video is the whole blog post, so like – go on. Click it, if you are in a safe place to listen to this stuff without someone asking questions. Here for you. xo
I forgot to raise one thing in the video (because – nerves! I like to be behind the camera, not in front of it!), and that is that institutional religion can offer the benefit of solid policies and grievance procedures, which can and should exist to protect the people inside it. I meant to say that…but completely forgot. Anyway! Go watch the video, and don’t forget this isn’t a tell-all – you’ll need to read the rest of the series if you missed it. Thanks for tuning in. I really hope this helps, or at least raises awareness and compassion for what some people may be going through. Cults/high demand groups/toxic groups still happen. They didn’t go out of fashion in the ’70s. Okay! Go watch if you haven’t already. And if you need more info, don’t be afraid to consult an expert.
Be safe. Be happy. Be free.
Kit K.
Thinking Errors and Thought Stoppers in Cults
Welcome to part 6 of the “Cults and Unhealthy Groups” series. I thought I’d be done by now, but as it turns out, a few more things popped up on my radar that made me go “Hmmm. Now that needs a little more awareness.” So we need to talk about cognitive distortions and thought-stopping clichés. A couple of slightly tidier terms for these are “thinking errors” and “thought-stoppers.”
I’m going to try and keep this light, but if you are exiting or have recently exited a cult/unhealthy group, you may find some examples triggering. If you do, make sure you contact one of the helplines listed at the bottom of the article. Look after yourself, friend. You are worth it.
When it comes to cognitive distortions (thinking errors) – anyone can have them. To me, that seems pretty much human. We can be healthy in some areas of our thinking while others are thoroughly unhelpful. But there are counsellors, journals or wise friends to help us recognise these and work through them. Whether you are in a cult or not, you can have a couple of these going on. No biggie. But there’s a big difference between, say, the cognitive distortions of a teenager with an eating disorder and those of a cult leader – and the effect the latter can have on their followers.
If you are going to pick up a book on this topic, then “Captive Hearts, Captive Minds” by Tobias and Lalich is about as good as it gets. The topic of cognitive distortions in cults is not a newie, and all of these terms have come from experts (i.e. not me!). But! When it comes to cults/unhealthy groups, the thing I find interesting is the way thinking errors can feed into the power and control structures of the group. I’m not an expert and this is a step outside my comfort zone – I’d rather just lay out the research. But I feel like examples will help make this more relatable. So it’s research writer meets fiction writer today. I hope it helps.
We need to know what these things are before we can be empowered to recognise them. We can’t always help others see their own thinking errors, especially in cults. But we can free our own minds from their negative effects.
Cognitive Distortions/Thinking Errors
There’s a list of 15 at Psych Central, and another list of 50 at Psychology Today, but I thought I’d pick out the ones that may be most applicable to cults. Psychologist Linda Tilgner and her co-authors flagged the first five of these when writing on the topic of institutional abuse. The last two are added by yours truly, based on other reading. If you’ve read the first few pieces in this series, you’ll know by now that cult leaders can wield a whole lot of control over their followers – potentially in every area of life. So let’s jump right into what these thinking errors are and how they might be used to further entrench the ideas and control structures of the group.
1 – Mental filtering. This occurs when you filter out some of the details of a story or situation and fixate (usually) on the negatives. A person may ignore all the positives in a situation and focus only on the down-sides, thus allowing their thinking to spiral toward a dark or doomy conclusion.
How might a cult leader do this? They may ignore or filter out the good progress society is making in many areas and use a single crime to paint a picture of the world becoming a more depraved or dangerous place. Rumination on this single event can then be linked with unrelated events to create a dark picture of the future. This may cause followers to fear and distance themselves from the outside world, and to devote themselves further to the teachings and lifestyle of the group – which they may believe will be the thing that ‘saves’ them. Their commitment to the group, and the leader’s control over them, becomes more entrenched.
2 – Polarised thinking. This one is also called “Black and white” thinking, and basically it removes all grey areas or middle ground. You are either a complete success or an utter failure. People are either for you (team/pseudo-family/covenant) or against you (enemies/people who cannot be trusted). There is no middle ground like a neutral acquaintance or just steady progress in life with a minor setback here and there.
In a cult/unhealthy group setting, the most obvious way this might play out is the ‘us vs them’ mindset where people are either allies or enemies and nothing in between. This has the potential to create deep divisions between family/friends and the cult member, or even fuel a persecution complex.
But on a more personal level, the demand for purity (which Robert Lifton talks about in his work) means that this could result in the “complete success or utter failure” version of polarised thinking – if I mess up even once, it’s a disaster; I’m a complete failure. (This links with another thinking error called “catastrophisation,” but time doesn’t permit me to go there.) Given the control structures within the cult, and the impact of failure on a person’s standing in the group, the pressure to conform can be loaded with consequences.
3 – Overgeneralisation. This is when we come to a general conclusion based on a single incident or a single piece of evidence. In a cult setting, the leader may use this tendency towards overgeneralisation to make sweeping statements based on single events, further entrenching their teachings, labels or view of the world, whilst also filtering out the details that don’t suit their narrative and using polarised thinking to further separate followers from “the rest of the world.” The world is getting darker. This group is a problem. This person has an agenda. This person or thing is a threat. That sort of thing.
4 – Emotional reasoning. Emotional reasoning is when you feel something and therefore believe it is true. Lalich and Tobias, in “Captive Hearts, Captive Minds” (cited here), explain it this way: “In groups that place emphasis on feeling over thinking, members learn to make choices and judge reality solely based on what they feel. This is true of all New Age groups and many transformational and psychology cults. Interpreting reality through feelings is a form of wishful thinking. If it really worked, we would all be wealthy and the world would be a safe and happy place. When this type of thinking turns negative, it can be a shortcut to depression and withdrawal: ‘I feel bad, worthless, and so on, therefore I am bad, worthless, and so on.’”
Side note: One of the reasons cult leaders have such power over their people is that their followers believe they have special enlightenment, or a special ability to hear God or be God. So they’re not going to call emotional reasoning what it is. It would take someone pretty darn ballsy (or unconcerned with their reputation or standing in the group) to say “I don’t think that’s based on fact. I think it’s emotional reasoning.”
What might emotional reasoning look like in a cult or high demand group? Imagine an extended meditation session guided by a guru, or perhaps an extended praying in tongues session, or intense music that leads you into an altered state of consciousness, one where you are emotionally heightened, open-minded and impressionable. Things said at the peak of such an experience may resonate very strongly because of the emotional state you are in. It can be very easy to take them as truth during this time even if the evidence in your life is contradictory. “I must change.” “God will only love me if I change.” “I must become better.” “I must work harder.” Or “A particular thing in my life is about to change. I just know it. I’m going to make life decisions based on this feeling.”
Experiences during such heightened states may not be recognised as emotional reasoning, but that is often exactly what they are.
5 – Labelling. This happens when we take one occurrence or characteristic and apply a label to someone. It’s unhealthy when we do it to ourselves or other people, but when a group dynamic is added, and a single person has the power to label someone and have a whole group agree without questioning – that’s powerful. In Scientology, dissenters are branded “suppressive persons.” In other groups, there are other words. “Jezebel” is one I’ve heard thrown around a bit. Labels can damage a person’s self-esteem, but also their relationships and standing within the group. A person could be labelled anything by a cult leader, and because followers lack the ability to question him/her, the label can be difficult or impossible to shift.
The “us vs them” mindset is also an example of labelling in the cult/unhealthy group context. This has the power to make followers disregard the care and input of entire groups of people because they’ve been labelled as an enemy.
6 – Mind reading. “I just knew what they were thinking.” If a guru/central person in a cult tells us this, we may whole-heartedly believe they actually knew what someone was thinking. We may even react pre-emptively based on this assumption. Guess what: no one can read minds. So if you are told “they were thinking this,” and it didn’t come straight from the mouth of the person who was supposedly thinking it, it’s a cognitive distortion. No one can read minds – not even your guru/pastor. This links with another cognitive distortion called “fortune telling,” where people claim to know what happens next. “They’re thinking this…they’ll do this next.” That sort of thing. No one can know. They can only guess.
How might this look in a cult/unhealthy group setting? “She’s thinking this. She’ll do this next” is said by the guru/central person and the group goes into damage control mode. All over something that may not have been thought, and hasn’t been done.
7 – Cognitive Conformity (also known as Group Think). Psychology Today explains cognitive conformity (or group-think) as “Seeing things the way people around you view them. Research has shown that this often happens at an unconscious level.”
A cult or unhealthy/high demand group may exhibit this group-think through unquestioning agreement with the central person in the group. This culture of “don’t question the leader” may result in unconscious agreement with everything he/she says, even if it means suppressing one’s own misgivings.
If you look at a typical group of, say, 20 people, there are likely to be clusters of opinions on a particular issue, and not all of them will be in agreement. But if a leader makes a statement and then says “Do you agree?” to a group of 20, and everyone agrees or adds more so-called evidence (which may be based on emotional reasoning or mind-reading) to the leader’s statement, you may have a thinking error in action. This should be especially concerning (in my opinion) if you feel fear about voicing a differing opinion. My belief is that this potentially links with another thinking error, the in-group bias, where we tend to trust people from our own circle, background or experience more than others. If you can only trust people in your group, if you can only listen to opinions from an echo chamber, you may have a problem.
How do these cognitive distortions work together in a cult or unhealthy group setting?
Individually, they look a little different. They may make it difficult for a person to be happy or at peace in life. They may screw with personal relationships. But they don’t have the same controlling or coercive power they have in a cult/unhealthy group. As I mentioned further up, it’s the interplay that I find interesting. I’m not an expert. But here’s how it could play out.
You’ve been trained not to question the leader. You believe he possesses a level of enlightenment you’ve not seen equalled. You’ve gone through the programs and have learned to see things his way, to look through his eyes. So you haven’t even noticed that you are subject to group-think or an in-group bias. You start to filter out uncomfortable details – details that mess with your group’s way of seeing the world. People who disagree with you are distanced, disregarded or cut off from your life. If they aren’t for you, they’re against you. You are seeing things through the polarised thinking that is part of the group. No one picks you up on this, because everyone thinks the same way.
You go to one of those meditation sessions. After hours of pushing yourself to the limits and stilling your mind so you can receive enlightenment, the guru speaks. It’s truth. You know it. You feel it. You don’t pick up on the fact that he used overgeneralisation and mental filtering to lead you to the point of unquestioning belief. You start to think about implementing this new truth into your life. You don’t realise it’s emotional reasoning, because you don’t check it against anything. What you do know is that you must succeed at it. This is vital. This leader has a unique level of enlightenment. You won’t question it because you don’t want to miss out. Group think meets emotional reasoning, meets polarised thinking, influenced by overgeneralisation and mental filtering. If anyone challenges you, you know to shut them down. You have something precious, even if it doesn’t make sense to anyone else. Even if you can’t explain it. If there was ever a conflict, you know you would agree with the guru. He has the keys to your life. You feel it. Therefore it is true.
Now for a little more on Thought-Stoppers
I know the thought-stopping cliché was mentioned in part two of this series, but I felt it warranted a little more of an example, hence its mention here. Thought-stopping clichés are used in cults and unhealthy groups to kill dialogue or healthy, investigative thought. Nathan Dial (writing on Medium) put it this way: “It is an essential tool of totalitarians and would-be brainwashers. Be aware of it, and guard against it. A very large amount of both everyday and politically-oriented (and at times, religious oriented as well) rhetoric consists of these.”
Robert Lifton also talks about it in his discussion of “Loading the Language” which you can read here.
Some everyday examples include:
It is what it is
S@*t happens
God works in mysterious ways
It’s all good
Just choose joy (actually I wrote a whole blog post on that here).
All of these could cause someone to silence their misgivings or questions in order to just accept something and move on without thinking about it. But there are other examples. “It’s apples and oranges” may cause someone in a cult/unhealthy group to stop comparing their own group’s thoughts/doctrines with others, when in fact they can be compared. Another one is “Damned if you do, damned if you don’t,” which can reduce a tricky situation to a double bind that can’t be solved.
I just wanted to illustrate how thought-stoppers can affect someone in a group. I agonised over what kind of an example to use here. I decided to use one regarding women, because it’s a story I’ve heard a lot. (Trigger warning: deals with disclosures of abuse.)
Person A discloses abuse to Person B. The story goes around the group and a few people now know about it. The guru/central person needs to shut it down. Guru says to Person B “She’s just not seeing things straight. Her perception is altered. You know, she’s been through so much.”
Person B doesn’t investigate further. “She’s not seeing straight. She’s been through so much” was enough to make them nod, agree, and not look any further. Person B now assumes Person A is mentally affected, and unable to see things straight. The thought process has been stopped. This may be repeated throughout the group to discredit the person who disclosed the abuse. (Abhorrent behaviour, in my opinion! All victims should be given the benefit of our compassion and belief. Anyway..!)
Let’s think this through. Follow-on questions could probe beyond the thought-stopper and arrive at the conclusion that the victim is actually seeing things straight and should be believed.
“What has she been through, then?” (Ahhh, so she has faced abuse? So she is seeing things straight).
“And who did that to her?” (And there’s the whole ballgame. Don’t get me wrong, abusive people will dodge, weave, gaslight and lie to avoid getting caught. But eventually the truth comes out, because the more lies that are required to cover the tracks, the more likely it is that one will end up in a “gotcha” moment.)
Asking questions beyond the thought-stopper is powerful. Whether we are allowed to do it out loud (without putting ourselves at risk) or whether we do it in the silence of our own thoughts until we can get to a place of freedom, it can be liberating. It’s important.
Too often, we stop at “They’ve been through so much” as a means of discrediting someone. Too often, victims end up defending themselves because they have a right and need to be heard but they’re in an establishment that’s not listening. Next time someone tries this on you, ask what they’ve been through. Ask who did it. Don’t fall for the thought stoppers.
If any of these rang a bell for you, and if you feel you need support right now, please call either Lifeline on 13 11 14 (for mental health support) or the Cult Information and Family Support Service. You aren’t alone. You can get through this. There are people here who get it, and are here to help.
BIBLIOGRAPHY:
Grohol J, Psych Central, “15 Common Cognitive Distortions,” https://psychcentral.com/lib/15-common-cognitive-distortions/
Boyes A, Psychology Today, “50 Common Cognitive Distortions,” https://www.psychologytoday.com/au/blog/in-practice/201301/50-common-cognitive-distortions
Tilgner L, Dowie T, and Denning N, Integrative Psychology, “Recovery from church, institutional and cult abuse: A review of theory and treatment perspectives,” https://integrativepsychology.net.au/wp-content/uploads/2017/11/institutional-abuse.pdf
Tobias M and Lalich J (excerpt cited on International Cultic Studies Association), “The Role of Cognitive Distortion,” http://www.icsahome.com/articles/the-role-of-cognitive-distortion-tobias
Dial N, Medium, “Beware the Thought Stopping Cliche,” https://medium.com/@nathandial/thinking-friends-please-be-informed-about-the-thought-stopping-cliché-ea6b0d9510d8
And the rest of the articles in this series, if you missed them:
How do I know if I’m in a cult – https://kitkennedy.com/2018/09/06/how-do-i-know-if-im-in-a-cult/
8 Key Characteristics of cults – https://kitkennedy.com/2018/09/06/8-key-characteristics-of-cults/
What cults have in common – https://kitkennedy.com/2018/09/08/cult-commonalities/
What is gaslighting in cults and high demand groups? – https://kitkennedy.com/2018/09/12/what-is-gaslighting/
What’s the difference between a cult and a healthy church: the Kit Kennedy Opinion – https://kitkennedy.com/2018/09/16/difference-between-cult-and-church/