The Human Brain and Psyche as Complex Systems
by Gerald Schueler, Ph.D. © 1997
This paper discusses the human brain as a complex self-regulating system. The psyche, as defined in Jungian psychology, is also addressed as a complex self-regulating system. Some of the primary findings from modern chaos theory, synergetics, and information theory are presented and related to the psyche. Major brain models are described. The most promising of these is the complex systems model, which uses the findings of chaos theory. Just as our brain is a macroscopic collection of microscopic subsystems, so our psyche is a collection of subsystems. These include complexes, instincts, and archetypes. It is highly likely that both the brain and psyche possess global properties, such as memory and consciousness, not found in any of their component subsystems.
The brain is the portion of the vertebrate central nervous system that is enclosed within the cranium, continuous with the spinal cord, and composed of both gray matter and white matter. It is the primary center for the regulation and control of bodily activities, receiving and interpreting sensory impulses, and transmitting information to the muscles and body organs. The main parts of the human brain are shown in figure 1.
Figure 1. The Human Brain.
The nervous system is a
"network of specialized tissue that controls actions and reactions of the body, enabling it to adjust to its environment. The system functions by receiving signals from all parts of the body, relaying them to the brain and spinal cord, and then sending appropriate return signals to muscles and body organs." (Microsoft Bookshelf)
The central nervous system consists of the brain and spinal cord. It is connected to the outside world through the peripheral nervous system. The peripheral nervous system consists of the somatic nervous system (under our conscious control) and the autonomic nervous system (not under our conscious control). The autonomic nervous system is divided into the sympathetic division, dealing with emergency responses, and the parasympathetic division, dealing with internal monitoring and regulation of various functions (Zimbardo, 1985).
The human brain and central nervous system together form a very complex and dynamic system. For example, there are at least ten billion nerve cells (neurons) in the brain and each sends (via the axons) and receives (via the synapses) messages with the others (Mainzer, 1994).
Modern thermodynamics teaches that there are three basic kinds of systems: isolated, closed, and open (Çambel, 1993). Isolated systems are those which are totally independent of their environment (these exist only in the laboratory). Closed systems are closed to matter (no matter may pass through the boundaries of the system) but are open to energy and information. Open systems are dependent on environment. Matter, energy, and information may pass through the boundaries of open systems.
Most dynamic systems, and all living systems, are open. Our body, for example, is an open system (Atkins, 1984). Modern chaos theory addresses complex systems, which are systems with a large number of interrelated parts. It also addresses dynamic systems, of which there are two main types: discrete and continuous. Every complex system, and especially every living system (living systems are usually referred to as self-organizing systems), is also a dissipative structure. Ilya Prigogine won the Nobel Prize for chemistry in 1977 for his work on dissipative structures, which he defined as any structure that takes on and dissipates energy as it interacts with its environment. A dissipative system, unlike one that conserves energy, gives rise to irreversible processes (Nicolis & Prigogine, 1989). All systems that exhibit disequilibrium and self-organization are dissipative and have a dissipative structure (Briggs & Peat, 1989). Thus not only the physical body itself but every organ, including the brain, is such a structure. The term itself expresses a paradox, because dissipative suggests falling apart or chaos while structure suggests organization and order. Dissipative systems are those which are able to maintain identity only because they are open to flows of energy, matter, or information from their environments (Prigogine & Stengers, 1984). For example, living systems dissipate entropy by taking in low-entropy energy in the form of food and oxygen, and giving off high-entropy energy in the form of heat, carbon dioxide, and excreta (Penrose, 1989).
Not only is our body a dissipative system, but our ego as well. The Swiss psychologist, C.G. Jung, designated the ego as an ego-complex because of the numerous components and processes of which it is composed. He taught that the components of the ego-complex are held together by the gravitational force of their relation to consciousness (Pascal, 1992).
The ego, the subject of consciousness, comes into existence as a complex quantity which is constituted partly by the inherited disposition (character constituents) and partly by unconsciously acquired impressions and their attendant phenomena ... Analytical psychology differs from experimental psychology in that ... it is far more concerned with the total manifestation of the psyche as a natural phenomenon - a highly complex structure. (Jung, 1991, pp. 91-92)
For Jung, the structure of the psyche, of which the ego is but one component, is not static but dynamic (Jacobi, 1973). Kast (1992) writes that, "To Jung, the psyche, like the living body, is a self-regulating system" (p. 5). From the foregoing, it seems reasonable to assume that the principles of chaos theory, that relate to complex dynamical systems like the brain, should be applicable to the ego and psyche as described by Jung. Abraham (1994) would go even further. He asserts that chaos theory is applicable to the holistic unity of the mind, brain, behavior, and environment, and that none should be examined as a separate entity but rather "their mutually interactive and complex processes comprise an organic entity" (p. 85).
1. Equilibrium. One of the findings of chaos theory is that complex systems that seem to be in equilibrium are not really at equilibrium. Systems damped by friction, and driven by some kind of energy input, while appearing to be at an equilibrium state, are not really at equilibrium at all. Tiny variations are present which can send the system into chaos at any time. Complex systems, and especially living systems, require far-from-equilibrium conditions in order to maintain self-organization or growth (Cohen & Stewart, 1994; Gleick, 1987; Kellert, 1993; Prigogine & Stengers, 1984).
2. Bifurcation. Another finding of chaos theory is bifurcation theory. A bifurcation is a crisis point in the life of a system, in which the future of that system is uncertain. It is usually depicted as a fork in the time sequence of a system, in which a system can take two possible branches, one or both leading to chaos. All dynamic systems go through bifurcations, most of which are irreversible (Cohen & Stewart, 1994; Gleick, 1987; Kellert, 1993; Peitgen, Jürgens & Saupe, 1992; Prigogine & Stengers, 1984).
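The period-doubling route to chaos can be made concrete with the logistic map, a standard textbook illustration (the map and the parameter values below are my illustrative choices, not taken from the paper). As the control parameter r is raised past successive bifurcation points, the system's long-term behavior forks from a fixed point to a 2-cycle, then a 4-cycle, and finally chaos:

```python
# Sketch: period-doubling bifurcations in the logistic map
# x_{n+1} = r * x * (1 - x). Each bifurcation doubles the number of
# states the system visits in the long run, until behavior is chaotic.

def long_term_states(r, x0=0.5, transient=1000, keep=64, tol=1e-6):
    """Iterate past the transient, then collect the distinct states
    the trajectory visits (within a small tolerance)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):
        x = r * x * (1 - x)
        if not any(abs(x - s) < tol for s in seen):
            seen.append(x)
    return seen

if __name__ == "__main__":
    for r in (2.8, 3.2, 3.5, 3.9):
        states = long_term_states(r)
        label = "chaotic" if len(states) > 8 else f"{len(states)}-cycle"
        print(f"r = {r}: {label}")
```

Note how each fork is visible only in the system's asymptotic behavior; the map's rule never changes, only its control parameter.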
3. Irreversibility. According to the second law of thermodynamics, the entropy (a measure of disorder or chaos) of an isolated system (one that does not exchange matter, energy, or information with its environment) will remain unchanged for reversible processes but will increase for irreversible processes (Angrist & Hepler, 1967; Atkins, 1984). Thus irreversible systems, including all living systems, tend to become chaotic as their entropy increases. Nicolis and Prigogine (1989) speak of a "universal role of irreversibility in nature" and point out that irreversibility of complex systems is not the result of "the complexity of the collective behavior of intrinsically simply objects" but rather is "due to the very structure of the dynamical systems" (pp. 214-215).
4. Entropy. Entropy is a measure of chaos (Nicolis & Prigogine, 1989). The second law of thermodynamics suggests that our entire universe is slowing down, because its entropy, a measure of the energy no longer available for useful work, is increasing. One of the results of this law is the prediction that there can be no perpetual motion machine. All systems wear down; energy is lost and cannot be totally recovered by a system. We can also consider entropy to be a measure of internal randomness, or molecular chaos. As entropy increases, chaos increases (Angrist & Hepler, 1967; Atkins, 1984).
Prigogine's entropy (the entropy for open systems developed by the Nobel prize winner, Ilya Prigogine) implies that as systems become more complex, a threshold of complexity will be reached such that the system will begin functioning in unpredictable directions; such a system will lose its initial conditions and these can never be reversed or recovered (Briggs & Peat, 1989; Çambel, 1993). For open systems, if the entropy change due to exchange with the environment is negative enough to exceed the system's internal entropy production, then total entropy will decrease (order will increase) over time. This helps to explain the thermodynamics of dissipative systems (those that require energy from external sources) and of self-organizing systems such as all living systems (Nicolis & Prigogine, 1989).
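The entropy bookkeeping for an open system can be sketched in a few lines (the function name and the numbers are illustrative only): total entropy change is the sum of an internal production term, which the second law keeps non-negative, and an exchange term with the environment, which may be negative:

```python
# Sketch: entropy balance for an open system,
# dS_total = dS_internal + dS_exchange. Internal entropy production
# is always >= 0 (second law), but a sufficiently negative exchange
# term -- the system exporting entropy to its environment -- makes
# total entropy fall, so order can increase, as in a living system.

def entropy_change(ds_internal, ds_exchange):
    if ds_internal < 0:
        raise ValueError("internal entropy production cannot be negative")
    return ds_internal + ds_exchange

if __name__ == "__main__":
    # A dissipative system producing 2.0 units of entropy internally
    # while exporting 3.5 units: net entropy decreases.
    print(entropy_change(2.0, -3.5))   # -1.5
```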
If we think of complex systems as being composed of millions of tiny subsystems (for example, the cells in our body, the citizens of a country, and the molecules in an object) then we will discover that each subsystem can act randomly while the overall system itself is in equilibrium and is relatively predictable (Mainzer, 1994). The theory of statistical mechanics, invented at the end of the last century, is one way of dealing with such subsystems. In this theory, the system itself functions on the averages or probabilistic actions of its subsystems (Çambel, 1993). This is true for dissipative structures that are also autopoietic or self-organizing structures, which is to say, for living systems. Living systems maintain their dissipative structure by dissipating entropy before it has a chance to build up (Prigogine & Stengers, 1984). Statistical entropy was introduced by the Austrian physicist Ludwig Boltzmann. Boltzmann's entropy indicates that entropy will always tend towards a state of maximum probability (Lebowitz, 1993). In order to apply this equation, all of the accessible states must have the same probability of occurring (Fast, 1962).
When entropy is viewed as a measure of chaos, it can be stated that the probability of accessible states for any complex system is a measure of that system's uncertainty. Ludwig Boltzmann was the first to note that entropy is a measure of molecular disorder, and he concluded that increasing entropy implied increasing disorder (Prigogine, 1980). Irreversible thermodynamics deals with systems that change over time, but it addresses only systems that are near equilibrium (Angrist & Hepler, 1967).
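As a rough numerical sketch of Boltzmann's statistical entropy (the function names and the choice of W = 6 states are my own, for illustration): when all W accessible states are equiprobable, the general Gibbs form -k Σ p ln p reduces to Boltzmann's k ln W, and any biased distribution yields a lower value, i.e., less "molecular chaos":

```python
import math

# Sketch: Boltzmann's statistical entropy. For W equally probable
# microstates, S = k * ln(W). The more general Gibbs form,
# S = -k * sum(p_i * ln(p_i)), reduces to Boltzmann's expression
# when every state has probability p_i = 1/W -- matching the text's
# requirement that all accessible states be equiprobable.

def boltzmann_entropy(w, k=1.0):
    return k * math.log(w)

def gibbs_entropy(probs, k=1.0):
    return -k * sum(p * math.log(p) for p in probs if p > 0)

if __name__ == "__main__":
    w = 6
    uniform = [1.0 / w] * w
    print(boltzmann_entropy(w))            # ln 6, about 1.79
    print(gibbs_entropy(uniform))          # the same value
    # A biased distribution is more ordered: lower entropy.
    print(gibbs_entropy([0.9] + [0.02] * 5))
```

(k is set to 1 here; using the physical Boltzmann constant would only rescale the numbers.)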
5. Attractors. In chaos theory, attractors are states towards which a system may evolve, when starting from certain initial conditions (Kellert, 1993; Nicolis & Prigogine, 1989). Attractors can be unique states, called fixed point attractors. But they can also be a whole range of states in the case of periodic or quasi-periodic attractors (Briggs & Peat, 1989). Sometimes the specific condition or system state at an attractor is entirely unpredictable. Such attractors are called chaotic, and their general surrounding region is called a basin. Within a basin, a system is under far-from-equilibrium conditions (Baker & Gollub, 1991; Peitgen, Jürgens & Saupe, 1992).
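The distinction between a fixed-point attractor and a chaotic one can be sketched with two standard toy systems (my choices, not discussed in the paper): the iteration x → cos(x), which settles onto a single point from any start, and the Hénon map, whose trajectories stay bounded yet never repeat, with nearby starting points diverging rapidly:

```python
import math

# Sketch: two kinds of attractors. Iterating x -> cos(x) converges
# to a single fixed-point attractor (x ~ 0.739). The Hénon map, by
# contrast, settles onto a chaotic attractor: bounded, aperiodic,
# and sensitive to initial conditions.

def cosine_fixed_point(x0=2.0, steps=200):
    x = x0
    for _ in range(steps):
        x = math.cos(x)
    return x

def henon(x, y, a=1.4, b=0.3):
    return 1 - a * x * x + y, b * x

def divergence_after(n, d0=1e-9):
    """Distance between two Hénon trajectories started d0 apart."""
    x1, y1 = 0.1, 0.1
    x2, y2 = 0.1 + d0, 0.1
    for _ in range(n):
        x1, y1 = henon(x1, y1)
        x2, y2 = henon(x2, y2)
    return math.hypot(x1 - x2, y1 - y2)

if __name__ == "__main__":
    print(cosine_fixed_point())      # same limit from any start
    print(divergence_after(50))      # the tiny gap has grown enormously
```

The second function illustrates why the state at a chaotic attractor is "entirely unpredictable" in practice: an initial uncertainty of a billionth is amplified to order one within a few dozen iterations.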
Mainzer (1994) states, "Synergetic principles (among others) provide a heuristic scheme to construct models of nonlinear complex systems in the natural sciences and the humanities" (p. 13). Probably chief among the reasons why synergetics seems so well suited to this task, is its use of the slaving principle and the ordering parameter (Haken, 1988; Mainzer, 1994).
When a ferromagnet is heated above a certain threshold temperature, it suddenly loses its magnetization. When the temperature is lowered below that threshold temperature, it regains its magnetism. Such a transformation is called a phase transition (Haken, 1983).
According to synergetics, as complex systems undergo phase transitions, a special type of ordering occurs at the microscopic level (with ferromagnets, for example, the atomic magnetic moments lose their alignment at high temperatures and point randomly, and then return to their directional order at low temperatures). Haken (1983) has shown mathematically that instead of addressing each of countless atoms in a complex system undergoing a phase transition, we can address their modes by means of an ordering parameter. This has the mathematical result of drastically lowering the degrees of freedom to only a few parameters. Haken (1987) also demonstrates how these ordering parameters guide complex processes in self-organizing systems. When an ordering parameter guides one or more subsystems, it is said to slave the subsystems, and this slaving principle is the key to understanding self-organizing systems. Complex systems organize and generate themselves at far-from-equilibrium conditions, where
"in general just a few collective modes become unstable and serve as "ordering parameters" which describe the macroscopic pattern. At the same time the macroscopic variables, i.e., the order parameters, govern the behavior of the microscopic parts by the "slaving principle. In this way, the occurrence of order parameters and their ability to enslave allows the system to find its own structure." (Haken, 1988, p. 13)
Because the human brain has many atoms, comprising two distinct nervous systems, it has many degrees of freedom. By addressing one or two ordering parameters (that which describes the macroscopic order and simultaneously "orders" or slaves the atoms) the degrees of freedom are reduced to one or two, allowing for mathematical and especially statistical approaches. In this way, Haken (1987) concludes "In general, the behavior of the total system is governed by only a few order parameters that prescribe the newly evolving order of the system" (p. 425). A circular causality is formed when the subsystems collectively determine the order parameters and the order parameters determine the behavior of the subsystems. Haken (1987) also proposes that brain models must include an interplay between function and structure. Structures are formed through the process of receiving information. The incoming signals or patterns from the senses can be addressed as ordering parameters which can either cooperate or compete with one another.
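A minimal two-variable sketch of the slaving principle (an illustrative toy model in the spirit of Haken's examples, not an equation from the paper): a strongly damped fast mode s is enslaved by a slow order parameter u, so that s simply tracks u²/γ and the system's effective degrees of freedom collapse onto u alone:

```python
# Sketch of the slaving principle with a toy two-mode system:
#   du/dt = lam * u - u * s      (slow order parameter)
#   ds/dt = -gamma * s + u**2    (fast, strongly damped mode)
# With gamma large, the fast mode relaxes almost instantly, so
# adiabatic elimination gives s ~ u**2 / gamma: the slaved mode is
# fully determined by the order parameter, and the circular
# causality closes (u drives s, and s feeds back on u).

def simulate(gamma=50.0, lam=1.0, dt=1e-4, steps=200_000):
    u, s = 0.1, 0.0
    for _ in range(steps):
        du = lam * u - u * s
        ds = -gamma * s + u * u
        u, s = u + dt * du, s + dt * ds
    return u, s

if __name__ == "__main__":
    u, s = simulate()
    print(u, s, u * u / 50.0)   # s closely tracks u**2 / gamma
```

After the transient, the single order parameter u describes the whole system; the enslaved mode contributes no independent degree of freedom.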
Entropy is a measure of disorder, and disorder is essentially the same thing as ignorance. This is how entropy is related to information theory (Angrist & Hepler, 1967; Gell-Mann, 1994; Gleick, 1987).
When we toss a coin, we have two possible outcomes. When we throw a die, we have six possible outcomes. According to Claude Shannon (1948), the founder of information theory, information refers simply to the number of possibilities, Z, so that the information, I, is expressed as:
I = log2 Z
Shannon (1948) used the logarithm to the base 2 because modern communication works with binary numbers or bits. When the letters of a good novel are each counted up, the resultant totals for each letter are the letter frequencies for that particular book. Shannon let j = 1 for a, 2 for b, 3 for c, and so on, so that Nj would be the number of occurrences of the letter labeled j in a particular book. He then calculated that the probability of finding any letter labeled j out of a total of N letters would be
Pj = Nj/N
From this he showed that the average information per letter contained in that book will be
I = -Σj pj log2 pj
With one more step, in which he added a constant, K, he arrived at the formula for Shannon uncertainty, Shannon entropy, or simply information entropy (Çambel, 1993):
HS = -K Σj pj log2 pj
This equation provides a measure of the disorder or ignorance that may exist in a quantity of information. However, Shannon information does not concern itself with meaning, and it only applies to closed systems. Shannon used his concept to study the capacity of a communication channel to transfer information even under the impact of noise (Haken, 1988).
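Shannon's letter-frequency entropy above is easy to compute directly (with K = 1 the result is in bits per letter; the function name is my own):

```python
import math
from collections import Counter

# Sketch: Shannon's information entropy applied to letter
# frequencies, as in the book example above. HS = -K * sum of
# pj * log2(pj); with K = 1 the units are bits per letter. A text
# that repeats one symbol carries no information per letter, while
# equally likely symbols give the maximum.

def shannon_entropy(text, k=1.0):
    letters = [c for c in text.lower() if c.isalpha()]
    n = len(letters)
    return -k * sum((c / n) * math.log2(c / n)
                    for c in Counter(letters).values())

if __name__ == "__main__":
    print(shannon_entropy("aaaaaaaa"))            # 0.0 bits: no surprise
    print(shannon_entropy("abcdefgh"))            # 3.0 bits: 8 equal states
    print(shannon_entropy("to be or not to be"))  # somewhere in between
```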
Information theory has been found useful in detecting and correcting errors as well as in data compression. The brain, in order to be a detector or receiver of information, must be able to account for the length of the message, as well as determine any hidden or encoded information in the message. In addition, in order for the brain to obtain meaning from any message, a shared interpretation of all information symbols used must first be established (Cohen & Stewart, 1994).
Models of the Human Brain
There are numerous working models today that attempt to simulate how consciousness arises from the human brain. In general, brain models can be divided into four main areas or types:
a. Models of physical and neurophysiological systems, such as the reticular activating system.
b. Models of biochemical and neurochemical systems (bottom up).
c. Cognitive models (top down).
d. Complex systems models using chaos theory.
Early brain researchers attempted to establish how the brain functioned by how it was put together and what parts controlled various activities and functions. Many thought that the reticular formation, which controls arousal, alerting, and attention, was the physical seat of consciousness. Models based on the physiology of the brain, such as those described by Jasper, Proctor, Knighton, Noshay, & Costello (1958) on the reticular formation, have largely been discarded. It was soon recognized that the human brain is probably the most complex system in the world. For example, it is composed of 10^10 or more nerve cells whose cooperation allows us to recognize patterns, speak, and perform mental functions (Haken, 1988). However, Arbib (1987) acknowledges that the work done by these pioneers of brain research formed the basis of most of today's more advanced models. He points out that the top-down approach, which is based upon a functional analysis of cognitive processes, and the bottom-up approach, which is based on analysis of the dynamics of neural nets, are the most prevalent today, and he suggests the two be united. He argues that there is a "need for a healthy interaction between cognitive studies (top-down) and neuroscience (bottom-up)" (p. 280). He concludes that "The very richness of current research on the brain guarantees that any single view must be incomplete" (p. 305).
Most work on brain theory today is focused on the neuron and its parallel networks. Aertsen, Gerstein & Johannesma (1986) state that
A central paradigm in the study of the sensory nervous system is that meaningful information regarding its principles of operation can be obtained from experimental investigation of the functional characteristics of its elementary components, i.e., the single neurons. (p. 7)
This paradigm is central to the bottom up approach. However, the nonlinearity of brain processes was also noted during the same period by Johannesma, Aertsen, Van Den Boogaard, Eggermont, and Epping (1987) who wrote, "The brain must be considered as a multi-input/multi-output system composed of nonlinear stochastic elements" (p. 32). Also, the work of Freeman and Viana di Prisco (1987) using chaos theory to explain the functioning of the olfactory bulb was published in the same proceedings. Work done by Kaneko (1993) suggests that complexity can arise from a feedback process of clustering (a synchronization by chaotic instability) which he calls homeochaos. The findings of chaos theory and synergetics are used in the new complex systems model eloquently described by Mainzer (1994).
The top-down approach is championed by the advocates and practitioners of cognitive psychology. Cognitive psychology began during the 1960s with George Miller, who linked mental processes with the functioning of computers. Mathematicians such as John von Neumann and Claude Shannon demonstrated that mathematics was a language, much like English or any other language. Their premise was that, by using rules to convert algebraic relationships into words and back again, a computer could perform operations analogous to some kinds of thinking and reasoning. This idea has led to the concept of artificial intelligence or AI, with the ultimate goal being the construction of intelligent robots (Hunt, 1993).
The work of Claude Shannon, the developer of Shannon's entropy and information theory, was instrumental in formalizing cognitive psychology (Medlin & Ross, 1992). Shannon's mathematical theories of communication were used by Ulric Neisser (1967) and others to develop cognitive psychology which views the brain as a computer. Analogies between computers and the brain were used to establish theories of human cognition. Cognitive psychology is an interdisciplinary approach to the study of the mind employing theories from artificial intelligence, philosophy, anthropology, and linguistics (Medlin & Ross, 1992).
Restak (1994) describes the new modular brain model in which there is no central repository of experience or memory. According to this theory, the brain functions with multiple connections all operating simultaneously and in parallel.
Crick (1995) conducted a literature search of the modern scientific findings on the human brain and consciousness. He concludes that human beings are nothing more than interacting neurons.
Although a great amount of important work on brain modeling has been done, there is to date no comprehensive or universally accepted model of exactly how the human brain operates, or how consciousness is produced. Penrose (1989) suggests that modern physics is still too immature to describe how the brain functions. Briggs & Peat (1989) conclude that "the brain is the nonlinear product of a nonlinear evolution on a nonlinear planet" (p. 166) and suggest that the brain is primarily a nonlinear feedback device.
The Complex Systems Model of the Brain
Mainzer (1994) writes:
The complex system approach is an interdisciplinary methodology to deal with nonlinear complex systems like the cellular organ known as the brain. The emergence of mental states (for instance pattern recognition, feelings, thoughts) is explained by the evolution of (macroscopic) order parameters of cerebral assemblies which are caused by nonlinear (microscopic) interactions of neural cells in learning strategies far from thermal equilibrium. Cell assemblies with mental states are interpreted as attractors (fixed points, periodic, quasi-periodic, or chaotic) of phase transitions. (p. 7)
By addressing the brain as a complex system of neural cells, we can assume that its dynamics follow the nonlinear mathematics of neural networks. For example, the standard evolution equations used for pattern emergence in physics, chemistry, and biology should carry over to the brain's ability to compare and recognize patterns. Although neurons are the microscopic parts of the brain, the complex systems model suggests that the whole is more than the sum of its parts. The nonlinear interactions of cells and molecules cause phase transitions when conditions are far from equilibrium. Due to the slaving principle of macroscopic ordering parameters, the (healthy) brain as a whole changes over time in an orderly and relatively predictable fashion. The complex systems model "does not explain what life is, but it can model how forms of life can arise under certain conditions" (Mainzer, 1994, p. 73).
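The idea of a cell assembly as an attractor can be sketched with a minimal Hopfield-style network (a standard illustration of attractor neural networks; the tiny 8-unit pattern below is my own). A stored pattern becomes a fixed point of the dynamics, and a noisy version of it relaxes back onto that attractor:

```python
# Sketch: pattern recognition as relaxation onto an attractor, using
# a minimal Hopfield network. The stored "cell assembly" is a
# fixed-point attractor; a corrupted input lying in its basin is
# pulled back to the stored pattern.

def train(patterns):
    """Hebbian weights: units active together become coupled."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, sweeps=10):
    """Repeatedly update each unit toward its local field's sign."""
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            h = sum(w[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if h >= 0 else -1
    return s

if __name__ == "__main__":
    stored = [1, 1, 1, 1, -1, -1, -1, -1]
    w = train([stored])
    noisy = [1, -1, 1, 1, -1, -1, -1, 1]   # two units flipped
    print(recall(w, noisy) == stored)      # the assembly is recovered
```

The point of the sketch is structural, not biological: recognition here is not a lookup but a trajectory in state space falling into a basin of attraction.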
Figure 3 shows seven levels of complexity within the human central nervous system (CNS). There is a hierarchy of anatomical organization varying from molecules through the entire CNS. The entire CNS is shown on the left. On the right are three figures; at the bottom is a chemical synapse. In the middle is a network model showing how ganglion cells could be connected together in the visual cortex. At the top is a subset of visual areas in the visual cortex (Mainzer, 1994).
Open systems, such as the brain, have both internal entropy production and external entropy production associated with energy or mass transfers to or from the environment. When the brain exchanges energy and matter with its environment, it maintains itself for periods of time in a state far from thermal equilibrium as well as at a locally reduced entropy state. Under these conditions, small fluctuations lead to irreversible bifurcations, and thus to increasing complexity of possible behavior. In this way, the brain increases its entropy production so that its control parameter can maintain a certain threshold level. When production goes too far, feedback mechanisms allow the control parameter to change. This induces instabilities which cause increased dissipation which, in turn, influences the threshold level. In short, the brain is a self-regulating system where,
A mental disposition is understood as a global state of a complex system which is caused by the local nonlinear interactions of its parts, but which cannot be reduced to its parts. (Mainzer, 1994, p. 107)
Thus global (overall system) effects of the brain cannot be reduced to the actions of the neurons or individual cells. Because of this, the chaos model predicts that "a purely bottom-up-strategy of exploring the brain functions must fail" (Mainzer, 1994, p. 118). It also predicts failure for the top-down approach.
The complex system approach offers the possibility for modeling the neural interactions of brain processes on the microscopic scale and the emergence of cognitive structures on the macroscopic scale. Thus, it seems to be possible to bridge the gap between the neurobiology of the brain and the cognitive sciences of the mind, which traditionally has been considered as an unsolvable problem. (Mainzer, 1994, p. 144)
The basic premise of the complex systems model is given by Mainzer (1994) as:
During neural instability, different modes of collective excitations evolve to coherent macroscopic patterns which are neurophysiologically based on certain cell assemblies and psychologically expressed as certain feelings or thoughts. (pp. 150-151)
The complex systems model offers promise, but much work still needs to be done. Mental states must be modeled by state vectors in complex state spaces. A working phase portrait of brain dynamics, showing trajectories that accurately describe the brain's activities, has yet to be determined.
The Psyche of Carl Jung
The Swiss psychologist, C. G. Jung, taught that the human mind or psyche is complex and is composed of parts, much like the physical body. He coined the word "complexes" for various unconscious parts of the psyche. Complexes are the focal and nodal points of psychic life (Jacobi, 1973, p. 37). He also divided the unconscious into two distinct regions, the personal and the collective. "Whereas the personal unconscious consists for the most part of complexes, the content of the collective unconscious is made up essentially of archetypes" (Jung, 1990, p. 42).
In Jungian phraseology, the ego itself is a complex. It is the complex that is the subject of consciousness (Jacobi, 1973, p. 7). Jung also taught that the stability of the ego is relative, and far-reaching changes of personality can and do occur. These need not be pathological; they are sometimes developmental (Jung, 1978, p. 6).
The unconscious, the inner 'environment' of the psyche, is a different medium from the conscious. The near-to-conscious areas are in constant flux, marked by a rapid alternation between light and shadow. Jung (1973) calls this fluid area a "no man's land" and designates it as the personal unconscious (p. 97).
A simplified model of the ego and the unconscious is shown in Figure 4. Behind the personal unconscious lies the collective unconscious, which contains the archetypes. The archetypes represent the structure of a "psychic world" whose reality is seen through its effects on the conscious mind (Jacobi, 1973, p. 37). From the foregoing we can draw the correspondence:
archetypes = psychic attractors
Phase space is the state space of a system, a mathematical abstract space used to visualize the evolution of a dynamic system (Nicolis & Prigogine, 1989). The phase space of a human being has not yet been defined mathematically, but life itself can be envisioned as a human phase space with time plotted along the x axis. Such a plot would begin at birth and end with death as a fixed-point attractor.
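A trajectory ending at a fixed-point attractor, of the kind such a plot envisions, can be sketched with a damped oscillator in its (position, velocity) phase plane (an illustrative toy system, not from the paper):

```python
import math

# Sketch: a phase-space trajectory ending at a fixed-point
# attractor. A damped oscillator, viewed in the (position, velocity)
# phase plane, spirals inward until it comes to rest at the origin.

def damped_oscillator(x=1.0, v=0.0, k=1.0, damping=0.3,
                      dt=0.01, steps=5000):
    """Euler integration of x'' = -k*x - damping*x'."""
    trajectory = [(x, v)]
    for _ in range(steps):
        a = -k * x - damping * v
        x, v = x + dt * v, v + dt * a
        trajectory.append((x, v))
    return trajectory

if __name__ == "__main__":
    traj = damped_oscillator()
    x_end, v_end = traj[-1]
    # Distance from the attractor at the origin shrinks toward zero.
    print(math.hypot(x_end, v_end))
```

Every point of the trajectory is a complete state of the system at one instant; the attractor is simply the state the trajectory can no longer leave.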
The structure of the psyche is similar to that of the physical body. According to Jung, "the archetypes are the organs of the prerational psyche" (Jacobi, 1973, p. 46). Archetypes are structures, not images. They allow for the periodic creation and dissolution of images. The archetypes have a hierarchical order. The "primary" archetypes are those that cannot be further reduced. The next in line are the "children" or "secondary" archetypes. Then come the "grandchildren" or "tertiary" until we come to those which are closest to consciousness and which have the least intensity, meaning, and numinosity or energy charge (Jacobi, 1973, p. 56).
The psyche, as a macroscopic system, can remain predictable and stable even when its main subsystem, the ego, is unstable. When the ego enters basins of instability, its trajectory through phase space becomes uncertain and multiple possibilities or accessible states become available to it. In this way, the healthy ego grows and matures in individual ways over time by learning from personal experience. If an archetypal (chaotic) attractor encountered in phase space cannot be assimilated, unhealthy states can develop. According to Jung (1990), various forms of insanity can result from the failure to assimilate such encounters.
Figure 5 shows a more complicated model. Here the center of the psyche is the self, balanced by the ego and shadow. This model illustrates the open nature of Jung's view of the psyche. At the conscious end, the persona acts as a filter for the ego to the external world, while at the unconscious end, the archetype of the anima-animus acts as a filter to the collective unconscious. Figures 4 and 5 are elaborations of models presented by Jacobi (1973) and illustrate the complex dynamic nature of the psyche as defined by Jung.
Jung's psyche functions with circular causality. The central archetype of the psyche is the self which, together with the ego, determines the order parameters of the entire psyche (biases, dispositions, likes and dislikes, values, and so on). These parameters then determine the behavior of the ego and self. The behavior of the ego can be determined from personality characteristics and traits (Jung, 1971). The behavior of the self can be determined from dreams (Jung, 1974).
Jung (1973) published his essay On Psychic Energy in 1928. In a footnote he writes "A system is absolutely closed when no energy from outside can be fed into it. Only in such a system can entropy occur" (p. 26). This was the prevailing understanding at that time. Today we have Prigogine's entropy which addresses open systems, and Shannon's entropy which addresses the exchange of information.
The psyche, like the brain, is an open system. The brain exchanges both mass and energy with its environment. The psyche exchanges energy (Jung called this libido) and information with its environment.
Jung also equated the will with "disposable energy," implying that energy can be stored within and dispensed from the psyche (Jung, 1973). According to information theory, in order for information to have meaning, there must be a sharing of symbolic coding. Jung's collective unconscious allows psyches to share archetypal meanings. The archetypes, serving in their role as strange attractors, create unpredictability and raise entropy. They attempt to balance the exchange of information, which must be assimilated by the ego in order to be meaningful.
Consciousness may have a direct effect on the subatomic particles of the body, especially those within the brain. A tiny change within the open system of the brain can result in a vast change to the overall health of the body because of amplification through feedback loops. Nonlinearity exists at many scales throughout the brain, which increases the likelihood that bifurcation and amplification will take place at some point. Brain activity is unpredictable in its details, but it does have tendencies. New thoughts and stimuli are at first chaotic but become orderly with repetition; through the feedback coupling of neurons, this settling into order is what we call recall, or memory.
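The amplification of a tiny change through nonlinear feedback can be illustrated with the logistic map, a standard toy model from chaos theory (chosen here only as an illustration, not as a model of the brain; the parameter r = 3.9 places the map in its chaotic regime).

```python
def logistic(x, r=3.9):
    """One iteration of the logistic map, a minimal nonlinear feedback loop."""
    return r * x * (1.0 - x)

# Two initial states differing by one part in a million.
x, y = 0.500000, 0.500001
gap = 0.0
for _ in range(60):
    x, y = logistic(x), logistic(y)
    gap = max(gap, abs(x - y))
# The microscopic difference is amplified to a macroscopic one (gap >> 1e-6),
# even though each trajectory individually remains bounded between 0 and 1.
```

This is exactly the pattern described above: detailed behavior that is unpredictable, within bounds (tendencies) that are not.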
Figure 6 shows a greatly simplified model of brain activity. The brain has three basic input channels and one output channel. Information enters the brain by way of the five physical senses and/or memory, as shown on the left under All Input. At the top of the brain, a second input (orderly or rational) is shown coming through the cortex, while at the bottom, a third input (chaotic or irrational) is shown coming through the brain stem. Essentially, the inputs of order and chaos represent the sense of order and disorder present in virtually all brain activity. Together they create a tension throughout the brain that is essential for proper and healthy growth. Without the input from order, the brain would fall into insanity and irrationality; it would dream too much. Without the input from chaos, the brain would settle into stereotyped "grooves" or habitual modes of thinking, functioning like an automaton or robot, without any real creativity; it would no longer dream at all. The output, our thoughts, feelings, and behaviors, depends upon a proper blending of chaos and order. "The psyche is made up of processes whose energy springs from the equilibration of all kinds of opposites" (Jung, 1973, p. 117).
Jung (1973) wrote that "all knowledge is the result of imposing some kind of order upon the reactions of the psychic system as they flow into our consciousness" (p. 81). Thus the imposition of order upon the chaotic flow of our sensory impressions gives rise to meaningful information.
A Science and Psychology Interface
Chaos theory, synergetics, and information theory can interface with psychology by viewing the psyche as a complex dynamic system.
The psyche's gateway to the external world is the brain, which is also a complex dynamic system. The complex systems model of the brain is an attempt to define the psyche as the overall evolutionary effect of macroscopic ordering parameters operating within the brain.
Chaos theory geometrically describes complex systems in terms of their trajectories through a suitable phase space. Often mass or momentum can be used for a phase space axis. However, the momentum of the psyche has yet to be measured. Measurable parameters for the brain and psyche have not yet been determined. When these are selected and agreed upon, it is likely that phase space maps of the psyche will show the presence of a chaotic attractor. Bütz (1992) shows that such an attractor could be compared to the Eastern mandala, a Jungian symbol for the archetypal self. In this sense, the self can be viewed as a mandala-like attractor drawing psychic energy toward its evolutionary goal as part of Jung's individuation process.
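What a phase-space map with a chaotic attractor looks like can be sketched with the Lorenz system, the textbook example of a strange attractor (used here purely as an illustration; it is not a model of the psyche). Each step moves the state through a three-dimensional phase space, and the resulting trajectory wanders forever on a bounded, never-repeating orbit.

```python
def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

# Trace a trajectory through (x, y, z) phase space.
state = (1.0, 1.0, 1.0)
trajectory = []
for _ in range(8000):
    state = lorenz_step(state)
    trajectory.append(state)
# The orbit neither settles to a point nor escapes to infinity:
# it circles irregularly around the two lobes of the strange attractor.
```

A plot of such a trajectory produces the familiar butterfly-shaped figure, which gives some intuition for Bütz's comparison of a chaotic attractor to a mandala-like center drawing trajectories toward itself.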
Just as the brain is composed of a host of interrelated components, so Jung's psyche is composed of numerous complexes, instincts, and archetypes. These parts, together with the libido, work together to form macroscopic ordering parameters: the feelings, thoughts, and memories that make up the personality or ego-complex--that part of the psyche which is conscious.
Both energy and information are continually exchanged between the brain and psyche. Jung (1973) states that psychic energy or libido generally remains conserved over time, which implies that its exchanges with the brain tend to balance out. Information, however, is not conserved: incoming information must be given meaning before it is useful. This could be explained as a synergetic process that creates order out of apparent chaos.
The search for an understanding of the human brain and consciousness has led to a wide variety of possible models. Currently, the model with the most promise of success is the complex systems model, but more work needs to be done to develop this model into a useful tool that fully models brain functioning. For example, a suitable phase space for both the brain and consciousness needs to be defined.
Although brain cells are not conscious, and electric currents do not feel, the brain could possess a mind, feeling, and consciousness naturally as a whole complex system. Complex systems can possess properties that are completely irrelevant and meaningless for their components (Rosen, 1991; Mainzer, 1994; Peat, 1987).
Just as our brain is a macroscopic collection of microscopic subsystems, so our psyche is a collection of subsystems. These include complexes, instincts, and archetypes. It is highly likely that both the brain and psyche possess global properties, such as memory and consciousness, not found in any of their component subsystems.
References

Abraham, F. D. (1994, Fall/Winter). Chaos, bifurcations, and self-organization: Dynamical extensions of neurological positivism & ecological psychology. Psychoscience, 1 (2).
Aertsen, A., Gerstein, G., and Johannesma, P. (1986). From neuron to assembly: Neuronal organization and stimulus representation. In G. Palm and A. Aertsen (Eds.), Brain theory: Proceedings of the First Trieste Meeting on Brain Theory, October 1-4, 1984. Berlin: Springer-Verlag.
Angrist, S. W. and Hepler, L. G. (1967). Order and chaos: Laws of energy and entropy. New York: Basic Books.
Arbib, M. A. (1987). A view of brain theory. In F. E. Yates (Ed.), Self-organizing systems: The emergence of order. New York: Plenum Press.
Atkins, P. W. (1984). The second law. New York: Scientific American.
Baker, G. L. and Gollub, J. P. (1991). Chaotic dynamics: An introduction. New York: Cambridge University Press. First published in 1990.
Briggs, J. and Peat, F. D. (1989). The turbulent mirror: An illustrated guide to chaos theory and the science of wholeness. New York: Harper & Row.
Bütz, M. R. (1992). The fractal nature of the development of the self. Psychological Reports, 71, 1043-1063.
Çambel, A. B. (1993). Applied chaos theory: A paradigm for complexity. Boston: Academic Press.
Cohen, J. and Stewart, I. (1994). The collapse of chaos: Discovering simplicity in a complex world. New York: Viking.
Crick, F. (1995). The astonishing hypothesis: The scientific search for the soul. New York: Touchstone.
Fast, J. D. (1962). Entropy: The significance of the concept of entropy and its applications in science and technology. New York: McGraw Hill.

Freeman, W. J. and Viana Di Prisco, G. (1987). EEG spatial pattern differences with discriminated odors manifest chaotic and limit cycle attractors in olfactory bulbs of rabbits. In G. Palm and A. Aertsen (Eds.), Brain theory: Proceedings of the First Trieste Meeting on Brain Theory, October 1-4, 1984. Berlin: Springer-Verlag.
Gell-Mann, M. (1994). The quark and the jaguar: Adventures in the simple and the complex. New York: W. H. Freeman.
Gleick, J. (1987). Chaos: Making a new science. New York: Penguin.
Haken, H. (1983). Synergetics: An introduction: Nonequilibrium phase transitions and self-organization in physics, chemistry and biology. Berlin: Springer-Verlag. First published in 1977.
Haken, H. (1987). Synergetics. In F. E. Yates (Ed.), Self-organizing systems: The emergence of order. New York: Plenum Press.
Haken, H. (1988). Information and self-organization: A macroscopic approach to complex systems. London: Springer-Verlag.
Hunt, M. (1993). The story of psychology. New York: Doubleday.
Jacobi, J. (1973). The psychology of C. G. Jung: An introduction with illustrations. New Haven: Yale University Press. First published in 1942.
Jasper, H. H., Proctor, L. D., Knighton, R. S., Noshay, W. C., and Costello, R. T. (Eds). (1958). Reticular formation of the brain: Henry Ford Hospital international symposium. Boston: Little, Brown.
Johannesma, P., Aertsen, A., Van Den Boogaard, H., Eggermont, J., and Epping, W. (1987). In G. Palm and A. Aertsen (Eds.), Brain theory: Proceedings of the First Trieste Meeting on Brain Theory, October 1-4, 1984. Berlin: Springer-Verlag.
Jung, C. G. (1991). The development of personality: Papers on child psychology, education, and related subjects. Hull, R. F. C. (Trans). Bollingen Series XX. The Collected Works of C.G. Jung. 17. Princeton, NJ: Princeton University Press. First published in 1954.
Jung, C.G. (1990). The archetypes of the collective unconscious. Hull, R. F. C. (Trans). Bollingen Series XX. The Collected Works of C.G. Jung. 9 (1). Princeton, NJ: Princeton University Press. First published in 1959.
Jung, C. G. (1978). Aion: Researches into the phenomenology of the self. Hull, R. F. C. (Trans). Bollingen Series XX. The Collected Works of C.G. Jung. 9 (2). Princeton, NJ: Princeton University Press. First published in 1959.
Jung, C. G. (1973). On the nature of the psyche. Hull, R. F. C. (Trans). from Bollingen Series XX. The Collected Works of C.G. Jung. 8. Princeton, NJ: Princeton University Press. First published in 1960.
Jung, C. G. (1974). Dreams. Hull, R. F. C. (Trans). From Bollingen Series XX. The Collected Works of C. G. Jung. 4, 8, 12, 16. Princeton, NJ: Princeton University Press.
Kaneko, K. (1993, November). Relevance of dynamic clustering to biological networks. Tokyo, Japan: University of Tokyo. Downloaded from Internet.
Kast, V. (1992). Schwarz, S. A. (Trans). The dynamics of symbols: Fundamentals of Jungian psychotherapy. New York: Fromm International.
Kellert, S. H. (1993). In the wake of chaos: Unpredictable order in dynamical systems. Chicago: The University of Chicago Press.
Lebowitz, J. L. (1993, September). Boltzmann's entropy and time's arrow. Physics Today, 32-38.
Mainzer, K. (1994). Thinking in complexity: The complex dynamics of matter, mind, and mankind. Berlin: Springer-Verlag.
Medin, D. and Ross, B. H. (1992). Cognitive psychology. Fort Worth: Harcourt Brace Jovanovich.
Microsoft Corp., Microsoft Bookshelf, CD-ROM. 1995 edition. Redmond, WA: Author.
Neisser, U. (1967). Cognitive psychology. Englewood Cliffs, NJ: Prentice-Hall.
Nicolis G. and Prigogine, I. (1989). Exploring complexity: An introduction. New York: W. H. Freeman.
Pascal, E. (1992). Jung to live by. New York: Warner.
Peat, D. F. (1987). Synchronicity: The bridge between matter and mind. Toronto: Bantam.
Peitgen, H., Jürgens, H., and Saupe, D. (1992). Chaos and fractals: New frontiers of science. New York: Springer-Verlag.
Penrose, R. (1989). The emperor's new mind: Concerning computers, minds, and the laws of physics. Oxford: Oxford University Press.
Prigogine, I. (1980). From being to becoming: Time and complexity in the physical sciences. San Francisco: W. H. Freeman.
Prigogine, I. and Stengers, I. (1984). Order out of chaos: Man's new dialogue with nature. Toronto: Bantam.
Restak, R. (1994). The modular brain: How new discoveries in neuroscience are answering age-old questions about memory, free will, consciousness, and personal identity. New York: Touchstone.
Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379-423, 623-656.
Zimbardo, P. G. (1985). Psychology and life. 12th Ed. Glenview, IL: Scott, Foresman.