Friday, August 6, 2010

35. Cults & Brainwashing File, Pt. 4 (Coleman, pt. 1) (Was: Responses to Articles, Cont'd.)

When I've been at the computer today it's just been to write in this blog... at some point I need to check my e-mail too. But for now...

on with the show!

***

This next quote is from an opinion piece in a professional journal:

Coleman, L. (1984). New religions and the myth of mind control. American Journal of Orthopsychiatry, 54(2), pp. 322-325.

"People who conform to behavioral expectations of a group have made a choice to do so. Even when imprisoned (as, for example, during a 'deprogramming') an individual who conforms is demonstrating behavior control, not mind control. Behavior control is not difficult to achieve with a person who is confined and subjected to sufficient pressure to overcome any resistance. When an individual in such circumstances chooses to conform, it is not a free choice; it is coerced, it is unethical, and it may be illegal. But an individual who has been intimidated has not thereby lost his mind." (p. 323)

In this section Coleman differentiates between mind control and behavior control. Although I'm not a professional in the field of psychology, this does seem like a reasonable distinction. However, I would like to add that from my standpoint, as the recipient of this kind of thing, it sure felt like mind control. When I get into the details of what happened in Vienna I have an especially poignant example of this.

I'm not sure exactly what Coleman would include as "behavior" in "behavior control"; my understanding is that it's a pretty broad word in that field. In Vienna, attitudes were definitely part of the whole induction process, and I think it's pretty difficult to mask attitudes, unless one is an especially gifted actor, perhaps. But if attitudes are included in the term "behavior" here, then is there a differentiation between apparent attitudes and what's actually going on in the person's head? If the group is able to effectively change apparent attitudes, but not what's actually going on in the person's head, then perhaps that really is "behavioral control". BUT, if it's virtually impossible to differentiate the two, or if the organization succeeds in molding the actual attitudes, is that "mind control"? Or is that part of the non-free choices made by the individual?

Then, if the individual knows that these are non-free, coerced behavior modifications, does that create a dissonance in the individual's head that might result in his "losing his mind"? I imagine it might be easier for the person who doesn't appreciate what's going on, as they might not have that kind of dissonance and would therefore be none the wiser for it, perhaps along the lines of Charlie in the book "Flowers for Algernon" when he understood how people were responding to him as his intelligence temporarily increased and then decreased again.

I think that if Coleman's position is taken seriously, I can't imagine how mind control would be possible, outside of some science fiction notion of it. In the very first sentence of this position paper, Coleman seems to equate brainwashing and mind control ("The controversy about 'cults' centers around the accusation that they use brainwashing or mind control to win converts." p. 322). I'm not sure I agree with him that brainwashing and mind control are the same thing.

Of course I say this as a layperson, not a professional in the field, but my impression of mind control is, as Coleman suggests, something like robotic control, or the image of a mad hypnotist sending people off in trances to do dastardly deeds. In contrast, I think of brainwashing as a process that confuses a person and induces new ideas that s/he might not otherwise freely adhere to. In such cases, people might still be sent off to do dastardly deeds, not in a trance, but as part of a process in which they made decisions or accepted tenets more or less under duress. I don't think of brainwashing as a mundane thing, however, but as one that is fairly well engineered and intentional, carried out in circumstances that are extraordinary by societal standards, such as isolation.

Well, this is all pretty cerebral, but in short: I think that brainwashing is not mind control but does involve behavior control (although I'm not sure exactly where the bounds of "behavior" lie), and that it is unethical. So I partially agree with Coleman.

***

This next source, a book chapter, is only partially relevant to my experiences in Vienna, because that organization wasn't a cult in the standard sense of the word; it just had some cult-like characteristics.

Galanter, M. (1989). Cults: Faith, Healing & Coercion. New York: Oxford University Press. Chapter 6: The Cult as a Social System, pp. 98-116.

"Since an intensive mobilization of a charismatic group's psychological and material resources may be directed at the conversion of new members, they can create deep turmoil in the individual convert. On the one hand, the group is intensely seductive in its attempt to attract new members; on the other,l it demands a disruption of antecedent social ties and a metamorphosis in the convert's world view. Thus, when the full resources of the group are focused on a recruit, the potential for tearing the fabric of that individual's psychological stability is considerable. The result may be psychiatric symptoms in people with no history of mental disorder or psychological instability. The genesis of these symptoms may lie more in the conflict between the convert's needs and the group's demands than in an underlying psychological impairment of the convert... Among the illnesses to be considered from [the Diagnostic and Statistical Manual of the American Psychiatric Association] are dissociative disorders, pathologic adjustment reactions, major depressive disorders, brief reactive psychoses, and paranoid disorders of a psychotic nature. Each can be generated by the charismatic group as its forces are mobilized to implement the transformative function." (p, 99-100)

It's clear that the recruitment part of this quote isn't directly applicable to the mission, but I think that if we substitute the socialization of new members for recruitment, then we'll find a fair amount of relevant information in this text. I think that the socialization practices at the mission in Vienna caused some of these reactions in new inductees (myself included), and it would have been very interesting to have a neutral psychologist study that issue there. Of course, the vast majority of people there had been there quite some time before I arrived and so had presumably long since completed this initiation process.

***

"Suppression of concerns that might detract from the primary task of an intensely committed social system is actually quite common. In time of battle, for example, an army may be mobilitzed to achieve its immediate military objectives, and its primary task is therefore the transformation of all personnel and material into a fighting force. The psychology of the troops is bent to this mission to the exclusion of all else since victory in battle is paramount. Concern for the needs of the wounded may be secondary, since this could detract from the thrust into battle. In a similar way, mobilization for the transformation process in the charismatic sect cannot be deflected by the difficulties experienced by individual converts because the usual constraints on exerting social pressure are suppressed." (p. 103).

I think this is a reasonably good fit for my experiences in Vienna. This is along the lines of what I said in an earlier post about how leaders, even though they may be charismatic, are expendable - the work would continue without the participation of any specific individual in it. So the choice for the newly arrived was to succumb to their way of thinking and doing things, or just become a casualty.

The other thing here that I'd like to point out is right at the end of this quote, about the usual constraints on social pressure being suppressed. The mission was so far away from the potentially scrutinizing eyes of supporters back home, and it could so easily pull the wool over visitors' eyes about the day-in, day-out operations, that there were few constraints on the social pressure it could exert on members.

***

"For a social system to regulate its functioning effectively, it must have the capacity to suppress members' deviation from its implicit or explicit goals. In charismatic groups, the penalty for those who deviate from norms is psychological distress; overt coercion usually is not necessary to induce compliance. How is this penalty exacted? We have seen that there is a decrease in psychological well-being among those who felt less closely associated with the group. Furthermore, members who were considering leaving the sect had attitudes most clearly at variance with those of the group, and their scores fell into the clinically depressed range. Indeed, 36% of those who dropped out of the Unification Church reported the emergence of "serious emotional problems" in the period following their departure." (p. 107)

I think this is a good depiction of some of the activities I witnessed and experienced in Vienna. Those most in tune with the group were happier and enjoyed the privilege of more meaningful work, greater access to privileged knowledge, and the good favor of the leadership in general. Unfortunately, I wasn't one of those people.

***

I finished my evening stimulator session and it's time for me to do more chores.

Have a pleasant evening.

~ Meg