by Mark McKergow and Jenny Clarke
"It is vain to do with more what can be done with fewer." - Occam
Philosophers since recorded time began have been struggling with the concept of "reality" and whether it exists, either in objective form or indeed anywhere outside the thinker's thoughts. William of Occam lived from 1290 to 1349, a period when philosophy was dominated by the Scholastics, whose aim was to integrate knowledge derived from human reason with the understanding granted by divine (Christian) revelation. Occam used extreme rigour in his logic, arguing that many of the received Christian beliefs (for example, that God is One, indivisible and the Creator of all things, and that the human soul is immortal) could not be demonstrated by reason, but only by revelation. His lasting contribution to philosophical thought is the principle that "it is vain to do with more what can be done with fewer" - in other words, one should cut away assumptions as if with a razor (hence Occam's Razor) and strive for simplicity.
What has Occam to say to a modern student of NLP? Like Occam, NLP makes a virtue of distinguishing between what we can detect with our own senses and what we deduce from a variety of sources - experience, reading, generalising, rationalising and so on (as well as the twentieth-century equivalents of divine revelation, which add the theories of the myriad schools of psychology to the older theological traditions).
In their seminal book "The Structure of Magic", published in 1975, Richard Bandler and John Grinder make the point this way:
" ...there is an irreducibledifference between the world and our experience of it. Each of uscreates a representation of the world in which we live - that is, we create a map or model of the worldwhich we use to generate our behaviour. Ourrepresentation of the world determines to a large degree what our experience of the worldwill be, how we will perceive the world, what choice we willsee available to us as we live in the world." (page 7)
This idea is usually encapsulated in the NLP world as "the map is not the territory".
In the 1970s, Bandler and Grinder introduced Neuro-Linguistic Programming to the world as an activity, a way of perceiving and doing things that produce desired results. Since those early days, interest in NLP as a topic has flourished: the number of training organisations accredited by the ANLP rose from 5 in 1991 to 27 at the end of 1995; bookshelves are filling up with books about it, brand differentiation is emerging and rows and court cases about its identity and ownership are becoming commonplace.
In adopting the noun acronym "NLP", we have lost the gerund form of the verb and NLP has itself become a nominalisation. How much of what today passes for NLP is truly useful in helping people to articulate and achieve their objectives? How much of it is necessary for the process of Neuro-Linguistic Programming?
In the very first edition of NLP World, NLP is defined as the study of human experience (Vol 1, No 1, page 9). It has been described by Richard Bandler as "an attitude and a methodology that leaves behind a trail of techniques". The attitude is above all one of flexibility: an understanding that if our map of the world is not producing the results we want, then using a different map might help. Certainly, change is in order; the more flexible one is, the more choices of how to change are available. The methodology is modelling: by studying and replicating the behaviours of successful people, their success can be reproduced. The techniques are particular patterns which have proved effective - often they are the first aspects of NLP which people come across.
The presuppositions of NLP, the commonly accepted principles or axioms governing it, are summarised by Steve Andreas and Charles Faulkner in their new book (pages 35-37). In brief:
The map is not the territory.
Experience has a structure.
If one person can do something, anyone can learn to do it.
Mind and body are parts of the same system.
People have all the resources they need.
You cannot not communicate.
The meaning of your communication is the response you get.
Underlying every behaviour there is a positive intention.
People always make the best choices available to them at the time.
If what you are doing isn't working, do something else.
So, these are the starting points. If we accept these as at least useful (if not true!) by definition, then how do various aspects of NLP sit alongside them?
In their book "Introducing Neuro-Linguistic Programming: The new psychology of personal excellence", Joseph O'Connorand John Seymour present NLP in a Three Minute Seminar:
" ... to be successful in life, you need only rememberthree things. Firstly, know what you want; have a clear idea of your outcome in any situation. Secondly, be alert and keep your senses open so that you know what you are getting. Thirdly, have the flexibility to keep changing what you do until you get what you want." (p27)
This seems to us to be a useful brief description of a process which, if followed, leads to the outcome being attained by definition. Fine so far. We hear about the importance of choice, to add choice, not to take choice away. Fine too. So what are we to make of occasions when people use NLP-talk to justify their perception that they have no choice?
We get this feeling when we hear folks who hold Practitioner certificates say something like "Well, I'm very kinaesthetic, so there's no point in asking me to make pictures in my head or listen to remembered voices". Yet we have rarely experienced anyone untrained in NLP who is not at least competent to some degree in all representation systems. We have also been asked, when setting up an exercise in an NLP practice group for practitioners and above, to describe what experience participants should expect, "because we don't want to do things wrong". We have found folks with NLP Practitioner certificates who hold back from carrying out as simple a task as asking a colleague some questions because "I haven't dealt with something at the Identity level". Here we see NLP making skilled people less adaptable than they were before; something is going wrong.
Much of the "NLP syllabus" arises from the reification of examples of products of NLP's attitudes and processes so that they become confused with NLP as an activity. Many of the "techniques" we find out about in our early training come into this category - things that have been found tobe effective in some situations, but yet are presented as if they were "true". We then have a large number of patterns with names that can be bandied about, which seem to act to remove flexibility by having the NLPer place their faith/competence/attention on the technique, as opposed to their client/self/whoever.
Let us examine a couple of these reified processes, to see how they might be reduced by applying Occam's razor.
Eye Accessing Cues
This topic is introduced right at the beginning of the Bandler and Grinder seminar recorded in Frogs into Princes in Chapter 1, entitled Sensory Experience. Even here, we learn that it is not quite as simple as the well-known diagrams of smiley faces with eyes at 45, 90, 135, 225, 270 and 315 degrees (approximately?) may suggest. We are warned that we may be confused by the rapidity of response, so that observers may not distinguish between accessing and processing; between lead systems and the strategic sequence as a whole; between people who conform to generalisations and people who organise their experience differently. Of course, it helps to be shown the kind of details to look for when learning the skills of sensory acuity, the skills needed to notice whether or not we are achieving our objectives. But why not learn to calibrate each person's experience directly rather than introducing a generalised model against which individuals must first be calibrated? Is this just another introduction to the land of self-fulfilling predictions? Occam would have a fine time with this structure. Noticing what happens in the case of each individual would surely suffice for him.
Visual-Auditory-Kinesthetic
Just a moment - where did "VAK" come from? It isn't in the presuppositions that distinguishing between these representational systems is a useful thing to do. Sure, people see things and say things and do things and feel things, and it's probably useful to notice if each of these is going on, and not to confuse, for example, what someone says they do with what they actually do. But there isn't anything in the fundamentals of NLP that says that everyone has a preferred representational system. Or indeed that VAK is of any relevance at all. It is quite possible to operate entirely within the presuppositions and never come anywhere near VAK at all. Would such an operation be Neuro-Linguistic Programming? We think it would.
Time Lines
The use of time lines can be a useful metaphor or frame for working with people who cannot let go of their past or who want to explore their future. Not surprisingly, time line therapy has become a separate discipline, with its own maps and set of nominalisations and its own accreditation systems. What are the essential elements of this new discipline? We are not certificated practitioners, and so our perspective comes from the NLP training we have had over the years. However, cut to the bones with Occam's razor, it seems that what is important is to access real or imagined scenarios or meanings so that we can "do" or "view" what we like. When dealing with the past, we may not be able to change the facts, but we can change the meaning we give to those facts; when dealing with the future, we can do something and notice the effect it has. It may not be necessary to delineate time lines in imagined three-dimensional space and to walk the lines to achieve the desired objective.
"The" Logical Levels
Logical levels started life as a mathematical concept, introduced by Bertrand Russell and Alfred North Whitehead in their Principia Mathematica (1910) to get around paradoxes in their attempts to formulate a rigorous formal basis for mathematics and number theory. This concept is an abstract one, and refers to distinctions between "things" and "classes of things" - the key point being that an item on one level cannot exist on another level. Gregory Bateson (1972) extended the idea to classes of learning and communication, and influenced Bandler and Grinder as well as other members of his Palo Alto research group (Watzlawick, Weakland and Fisch, 1974). Then, Robert Dilts produced his set of "neurological levels", which will be familiar to students of NLP. This very useful way of examining a situation and finding guides to action has, we fear, suffered a similar fate to many other NLP patterns - it has taken on a life of its own, has been set in concrete, and can now be used to invent limitations and concerns rather than to generate options and shed light on complex experiences. Many NLP practitioners now seem to refer to Dilts's model as "The Logical Levels", as if there were no others.
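To illustrate Russell and Whitehead's original point (this is our own sketch, in modern set notation rather than that of Principia): the troublesome paradox concerns the class of all classes that are not members of themselves,

R = { x : x ∉ x }, whence R ∈ R if and only if R ∉ R.

Assigning R to a higher logical level than its members rules the question "is R a member of itself?" out of order, and the paradox never gets started: a class of things is simply not the same kind of item as the things it contains. It is exactly this "things versus classes of things" distinction that Bateson, and later Dilts, carried over into learning and change.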
Now, the concept of neurological levels is a fine model, potentially useful and well thought-out. We have used it ourselves many times. We are disturbed, however, that the idea has become reified into a different kind of existence - where these "levels" themselves are the objects of interest, rather than the experience they seek to describe. They are a way (and there are many, of course) of cutting up the cake of experience into more manageable slices. Can it make epistemological sense to talk about someone's Identity in the same way as we talk about their Hair, or their Car, or their Toenails?
It seems to us that many recent developments in NLP are adding to the potential confusion. Recent papers on "meta-states", for example (Hall, 1995, 1996), put us in mind of the philosophers of the middle ages, inventing ever more complicated hypotheses to outwit each other and explain the same worldly phenomena. A recent offer to attend a training covering "Meta-Strategy for Directionalisation" seems to lean the same way. It was against philosophy such as this that Occam first came to prominence. He pointed out that to construct a more complicated explanation for a phenomenon is an exercise in vanity. The much more difficult enterprise, he contended, was to create a simpler model to account for the same phenomena. We have chosen these specific examples from amongst many to illustrate our point - our criteria being that they are recent and that they exemplify for us the complications that can be made by postulating particular relationships amongst a number of nominalisations. While we are sure that these are sincere endeavours, representing work which is effective for their authors, the sheer complexity of the language and models makes them less likely to be useful in a general way to lots of people.
In our early studies of NLP we noticed and enjoyed the ways in which more traditional forms of therapy were held up to question on the grounds that they had replaced references to processes with nominalisations. A client coming along with a diagnosis of "depression" is then asked about how they do being depressed, or what they'd rather be doing instead. Now we find NLP proceeding down the same route, with more modern nominalisations and more complex models which require (surprise, surprise!) more and more training. And the field of NLP becomes more mysterious, as we seek the words of the Great Ones about the number of submodalities which can dance on the head of an anchor...
There are of course many good things about NLP, especially its attitude and its presuppositions. To practise sensory acuity is a good way of taking yourself outside your own map and theories of what is going on, and making yourself concentrate on the outside manifestations of the world, not the inner world. In their book Shifting Contexts, O'Hanlon and Wilk make a distinction between "facts" and "meanings" (page 15). Video descriptions apply to the facts that can be observed with the senses, with universal agreement. Thus, we might say "She is wrinkling her brow" as a video description. To claim that she is angry, puzzled, curious, short-sighted or experiencing an unpleasant smell is to introduce interpretation and give the action meaning. As NLPers know, reframing can uncover as many interpretations as there are observers.
In the discussion of anchoring above, we referred to the tool of language. Used with precision, this can do all sorts of things to help us identify and achieve our desires. The "linguistic" elements of NLP are particularly useful in showing us how to achieve the precision needed. The specific details of an individual's map of the world can be uncovered using the meta-model. At the other end of the spectrum, the vagueness of the Milton model enables us to have an effect without any particular understanding of content. Given the tools of language, it is enough to have the ability to be aware of what is happening and the adaptability to change our thinking and/or behaviour.
There are many futures for NLP. We would like to put in a word for a simple one: using language and sensory acuity to help people model what works for them. Not what works for the people who contributed to some modelling project, written up neatly and passed down for the Next Generation to digest and inflict on others. Some people (for example Bill O'Hanlon, the founding editor of the NLP Newsletter, and Steve de Shazer, leading minimalist and developer of Solution Focussed therapy) have already set off down this road, working within our NLP presuppositions, and seem to be finding ways of working and being without the aid (or indeed the worry) of meta-states, or VAKs, or meta-strategies. This seems to be at the heart of what William of Occam would have had in mind if he had been around today.
We are concerned about another future: one where the latest edition of the DSM could contain a condition called "NLPer", defined as "Compulsive training attendee; talks in polysyllabic words about arcane processes; doesn't use these processes other than to infect other NLPers (they call this 'training'). Uses the word 'meta' a lot, sometimes in twos or even threes. Strong focus on accreditation and committees (in the UK in particular) - liable to demonstrate their competence by waving pieces of paper. Talks a lot about respecting everyone's opinions, in an edgy and disrespectful way. Seems unable to take a joke."
We return to O'Connor and Seymour's Three Minute Seminar: outcome, acuity, flexibility. For us, this is Neuro-Linguistic Programming; the rest is no more than examples of how others have used it.
References
Andreas, S. and Faulkner, C. NLP: The New Technology of Achievement. Positive Paperbacks, Nicholas Brealey Publishing, London, 1996.
Bandler, R. and Grinder, J. The Structure of Magic. Science and Behavior Books Inc., 1975.
Bandler, R. and Grinder, J. Frogs into Princes. Eden Grove Editions, 1979.
Bateson, G. Steps to an Ecology of Mind. Ballantine Books, 1972.
Hall, M. The New Domain of Meta-States in the History of NLP. NLP World, Vol 2, No 3 (1995).
Hall, M. Meta-States as correlated to "Core" states. NLP World, Vol 3, No 1 (1996).
O'Hanlon, W. and Wilk, J. Shifting Contexts. Guilford Press, 1987.
O'Connor, J. and Seymour, J. Introducing Neuro-Linguistic Programming. Mandala, 1990.
Russell, B. and Whitehead, A.N. Principia Mathematica. 1910.
Watzlawick, P., Weakland, J.H. and Fisch, R. Change: Problem Formation and Problem Resolution. Norton, 1974.
"Occam's Razor in the NLP Toolbox", NLP World 3,No 3, pp 47 - 56 (1996)