CHAPTER 9 – ORDER, ENERGY, INFORMATION AND SYSTEMS

A Natural Sequel to the Second Law

The relationship between order, energy and information follows naturally from discussions of the second law of thermodynamics. In this chapter we will examine information theory and what it tells us about order and energy, and then turn to general system theory, which is closely linked to information theory. Together, the two theories form a necessary bridge to understanding order and energy in living systems, from cells to human beings and civilization.

Dots and Dashes and Bits

A good place to begin this discussion is with Morse Code, invented in the 19th century by Samuel F. B. Morse. Morse Code consists of two signals: a short click of electricity signifying a “dot” and a longer “buzz” signifying a “dash.” Every letter of the alphabet in Morse Code is symbolized by a combination of just these two signals, which are really ordered energy (electricity or sound). The pauses between the groups of signals signifying letters, and between groups of letters signifying words, are also essential to understanding a Morse Code message.
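To make the encoding concrete, here is a minimal sketch in Python; the handful of letters shown, and the use of single spaces and “/” for the pauses, are illustrative choices rather than Morse’s full table:

    MORSE = {
        "E": ".",   "T": "-",   "A": ".-",  "N": "-.",
        "S": "...", "O": "---", "M": "--",  "I": "..",
    }

    # Single spaces stand for the pauses between letters; " / " stands
    # for the longer pause between words.
    def encode(text):
        words = text.upper().split()
        return " / ".join(" ".join(MORSE[ch] for ch in word) for word in words)

    print(encode("some notes"))  # ... --- -- . / -. --- - . ...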

Binary code, which came into widespread use with 20th-century computing, is even simpler. It consists of just two symbols, 1 and 0. When thought of in terms of an electric switch (which is how binary code functions in computers and many other devices), 1 means the switch is closed or active and can mean “yes.” Zero (0) means the switch is open or inactive and can mean “no.” All numbers and letters can be converted into “binary digits,” a term shortened to “bits” by combining the beginning of “binary” with the end of “digit.” In this manner, 1 signifies 1 in binary code, 2 becomes 10, 3 becomes 11, 4 becomes 100, 5 becomes 101, and so on.
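A short Python sketch can reproduce this counting pattern using the familiar method of repeated division by two, collecting the remainders as bits:

    def to_binary(n):
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits  # each remainder becomes the next bit
            n //= 2
        return bits or "0"

    for n in range(1, 6):
        print(n, "->", to_binary(n))  # 1->1, 2->10, 3->11, 4->100, 5->101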

The Foundation of Information Theory

Claude Shannon, an engineer at Bell Telephone Laboratories, published articles in the Bell System Technical Journal in 1948 that became the foundation of information theory. Although he intended them primarily for improving telephone and radio transmission, his concepts were sufficiently abstract and general that they have been applied far more widely.

In Grammatical Man, Jeremy Campbell writes:

“Scientists are still actively exploring the riddle of why nature’s products are so improbable, why they display so much order, when the most probable state for them to be in is one of muddle and error, a surrender to the forces of disorder in the universe that seem so overwhelming and natural. This is still thought of as being one of the disturbing paradoxes in science…. In his 1948 papers, Shannon proved that, contrary to what we might expect, … a message can persist in the midst of … a haphazard disorder, or noise.

“Most striking of all, Shannon’s expression for the amount of information, the first precise, scientific measure … was of the same form as the equation devised many years earlier, in the nineteenth century, for that most peculiar and fugitive of physical laws, the entropy principle.”

Order, Energy and Information

Shannon saw the similarity between the tendency of order in a physical system to degrade into disorder and the tendency of a message in communication to become degraded by noise. The mathematics describing the two is virtually the same.

Think of the similarity between a jumble of molecules inside a sealed container, which is in an equilibrium state of maximum entropy, and a jumble of sounds or letters which we cannot understand. Restoring order among the molecules and restoring order, or meaning, to the sounds or letters both require energy. High entropy means little or no information; high order, or ordered energy, means information is available. This may also be stated in terms of probability: high entropy is the most probable condition of a closed system; low entropy, or high order, is the most improbable. In this way, both what we know and what we don’t know about a system or a message may be expressed in terms of probability. In information theory, the information content (I) of an event rises as its probability (P) falls; formally, I = log2(1/P). And instead of regular whole numbers, information theory measures I in bits, the binary digits we discussed earlier: an event with a probability of 1/8 carries log2(8) = 3 bits of information.
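As a worked illustration of that formula (a sketch in Python, not Shannon’s own notation):

    import math

    # Shannon's measure of the information carried by a single event:
    # I = log2(1 / P) bits. The rarer the event, the more bits it carries.
    def information_bits(p):
        return math.log2(1 / p)

    print(information_bits(1 / 2))  # 1.0 bit  (a fair coin toss)
    print(information_bits(1 / 8))  # 3.0 bits (a one-in-eight event)
    print(information_bits(1.0))    # 0.0 bits (a certainty tells us nothing)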

Redundancy and Rules

Another important aspect of information theory for the understanding of how order and energy are related in living things is the concept of redundancy. Campbell writes:

“In nearly all forms of communication, more messages are sent than are strictly necessary to convey the information intended by the sender. Such additional messages diminish the unexpectedness, the surprise effect, of the information itself, making it more predictable. This extra ration of predictability is called redundancy, and it is one of the most important concepts in information theory. Redundancy is essentially a constraint,… reducing the number of ways in which the various parts of a system can be arranged. A message conveys no information unless some prior uncertainty exists in the mind of the receiver about what the message will contain. And the greater the uncertainty, the larger the amount of information conveyed when that uncertainty is resolved. Herein lies the profound relationship between information and probability.”

In the English language, as in many others, redundancy reduces uncertainty in several ways. First of all, there are rules of spelling and rules of grammar. If I wrote the preceding sentence, “Foist uv awl, they is rools uv spellin and rools uv grammer,” you probably could still understand me. Even though I violated several rules of spelling and grammar, I followed others, such as the rules of syntax, which enable you to reconstruct the correct meaning of the sentence. This is why rules are considered a form of redundancy.

Had I written, “Rools uv uv uv foist spellin they awl, grammer rools and is,” breaking virtually all the rules (order), your reconstruction would have been much more difficult. The order of words in sentences is extremely important in English.

Another form of redundancy relates to the order in which items of information such as letters spelling words follow one another. If I sent you a message containing the word “spellin_,” and a smudge prevented you from reading the last letter, you would be very likely to supply the missing letter. “Lipslen” would have been more difficult, even though the same letters are used in each case.
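A toy sketch in Python shows this kind of redundancy at work; the three-word lexicon is a hypothetical stand-in for a reader’s full vocabulary:

    import string

    WORDS = {"spelling", "grammar", "syntax"}  # illustrative stand-in lexicon

    # Try every letter in place of the smudge and keep only completions
    # that the lexicon allows.
    def complete(fragment):
        return [
            fragment[:-1] + letter
            for letter in string.ascii_lowercase
            if fragment[:-1] + letter in WORDS
        ]

    print(complete("spellin_"))  # ['spelling'] -- the constraints leave one choice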

Redundancy also allows for more complex forms of order. John von Neumann, the noted mathematician, pointed out “that the structure of living organisms and of machines is dictated to a great extent by the way in which they fail and become unreliable,” Campbell notes. “Failure, von Neumann said, must not be thought of as an aberration, but as an essential, independent part of the logic of complex systems. The more complex the system, the more likely it is that one of its parts will malfunction. Redundancy is a means of keeping the system running in the presence of malfunction.”
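Some back-of-the-envelope arithmetic, sketched in Python, shows why redundancy keeps a system running; the failure rate and the assumption of independent failures are illustrative simplifications:

    # With k redundant copies of a part that each fail with probability p,
    # the function is lost only if every copy fails at once: p ** k.
    def failure_probability(p, k):
        return p ** k

    p = 0.01  # suppose each copy fails 1% of the time
    for k in (1, 2, 3):
        print(k, "copies:", failure_probability(p, k))
    # roughly 0.01, 0.0001 and 0.000001 -- each copy buys a hundredfold margin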

Predictability and Accident

As you will see when we discuss living organisms and evolution in more detail, the difference between predictability and accident matters greatly. It matters to scientists, who seek theories that allow a high degree of predictability. And according to evolutionary theory, all advances in life forms over millions of years are the result of accidental genetic changes occurring during the reproduction of cells. Although we do not accept the theory of evolution at face value, it serves here to demonstrate the link between information theory and biology.

Rupert Riedl has put all this in perspective in his remarkable book, Order in Living Organisms: A Systems Analysis of Evolution. He points out that the ability to predict accurately the information content of a message requires five conditions:

1. “The source must repeat its transmissions,” because only through repetition can the receiver learn that future transmissions will resemble earlier ones and predict them accordingly.

2. “The receiver must have a memory.” Without a memory, the receiver cannot compare transmissions or make predictions.

3. “The receiver must be able to compare,” which relates to point 2.

4. “The programmes of a large number of sources must be so organized that the receiver can learn to distinguish between individual events…and series of events.” What this means is that if the series of numbers ‘1 2 3 4 5’ were repeated over and over, the receiver could predict its recurrence as a series. However, if ‘1 2 3 4 5’ appears only once within a long sequence of numbers, predictability will be virtually impossible. Suppose, for example, the message was: 1 3 7 4 9 3 7 8 0 3 6 5 3 1 2 3 4 5 9 6 3 4 2 1 5… The receiver would be very unlikely to separate ‘1 2 3 4 5’ from all the other individual numbers as a meaningful series. (A small sketch of this idea in code follows the list.)

5. “The programme of a source must remain within the same limits long enough for the receiver to appreciate these limits…For only a large number of comparisons make it improbable that the limits in a series of natural events have remained the same by accident alone.”4 In other words, if the sender limits each message to a series of five numbers with a break between series, the receiver will be able to perceive these limits and make predictions accordingly.
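The following Python sketch illustrates how a receiver with a memory (conditions 2 and 3) can pick a repeated series out of a stream (conditions 4 and 5). The block size and repetition threshold are illustrative choices, not Riedl’s:

    from collections import Counter

    # Count every block of `size` consecutive symbols the receiver has
    # seen, then keep only blocks that recur often enough that accident
    # becomes implausible.
    def recurring_blocks(stream, size, min_count=4):
        blocks = Counter(
            tuple(stream[i : i + size]) for i in range(len(stream) - size + 1)
        )
        return {block: n for block, n in blocks.items() if n >= min_count}

    signal = [1, 2, 3, 4, 5] * 4             # the same series, repeated
    noise = [1, 3, 7, 4, 9, 3, 7, 8, 0, 3]   # isolated events only
    print(recurring_blocks(signal, 5))       # {(1, 2, 3, 4, 5): 4}
    print(recurring_blocks(noise, 5))        # {} -- nothing predictable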

If these five conditions make predictability possible, what leads us to believe that an event or series of events is the result of accident rather than necessity? Riedl gives the example of tossing a coin. Suppose someone says he can predict which way a coin will fall when tossed. The probability of being right on the first toss is relatively good, 1 out of 2, so we might ascribe his prediction to luck or accident. But the probability of his correctly predicting a whole series of tosses drops rapidly: by the second, third, fifth, tenth and hundredth toss it is 1/4, 1/8, 1/32, 1/1024 and about 1 in 1.3 × 10^30. The point at which each of us, as observers of this process, would switch our perception from “accidental” to “something is causing this to happen” varies from person to person. However, most of us would become very suspicious that the coin-tossing was “rigged” long before the 100th accurate prediction!
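The arithmetic behind Riedl’s example is easy to verify; a fair coin and independent tosses are assumed:

    from fractions import Fraction

    # The chance of calling n fair tosses in a row correctly is (1/2)^n.
    def chance_of_n_correct_calls(n):
        return Fraction(1, 2) ** n

    for n in (2, 3, 5, 10, 100):
        print(n, float(chance_of_n_correct_calls(n)))
    # 100 correct calls: about 7.9e-31, i.e. 1 chance in roughly 1.3 x 10^30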

“In the world of accident and necessity we must then assume the reign of necessity,” Riedl says of a series like the 100 predictable coin tosses. He calls the rule of necessity “determinacy” (actually he uses a German word translated as determinacy). Determinacy is his measure of the degree of order in an event or series. He notes that we may refer to the probability of accident, or indeterminacy, in an event just as we may refer to the probability of order, or determinacy. Since one is the complement of the other, we need to be clear which one we are talking about. When the probability of determinacy, or order, is 100%, we have what Riedl calls “law.”5

This all may seem irrelevant until you realize that Riedl has developed a means of mathematically measuring and specifying the degree of order in any situation, natural or unnatural. He is thus able to “prove” that many events in nature which were once thought “accidental” are so unlikely that they must be “determined” by law or other forms of order. In fact, he defines order as “conformity to law.” The degree of order in an event or system is equal to its law content multiplied by the number of instances where the law applies. If a “law” applied only once, we could barely perceive the order. But if a “law” applies to a vast number of instances, as the law of gravity does, we consider the amount of order to be virtually total, 100%.

Before we move on to the fascinating world of order in living things, we need to understand one more dimension of order: the order of systems.

General System Theory

A system is obviously a form of order, an orderly arrangement of parts which function as a whole. You may recall that the second law of thermodynamics referred to the tendency toward increasing disorder and entropy within a closed system. A closed system, like the molecules in the airtight container which we used as an example, is isolated from the rest of the world. Technically the earth is not a closed system, because it receives energy and cosmic particles from the sun and outer space and radiates energy back into space. We may speak of the universe as a closed system if we believe that it is finite, as some astrophysicists maintain. But all living things, which we will discuss in subsequent chapters, are open systems. They must take in food, gases such as oxygen or carbon dioxide, and other nutrients in order to survive, and they expel wastes back into their environment.

In 1968, Ludwig von Bertalanffy sought to bring all these system concepts into one consistent understanding with his book General System Theory.5 A number of other authors expounded on his theory in a book published in 1975, General Systems Theory and Human Communication.6 From these two books I have extracted 10 principles that apply to closed and open systems, as follows:

1. A system is a dynamically interacting and interdependent group of members or components and their attributes, forming a whole. A system must therefore consist of two or more members or components; a single thing is not a system unless it is composed of multiple interacting parts. For example, when we speak of interpersonal systems, an individual person does not constitute a system; but in terms of living organisms composed of component parts, a human being is most definitely a system. Whenever we discuss something as a system, we are by definition referring to it as a group of interdependent parts.

“Dynamically interacting” means that there is movement and change within a system. A static crystal such as a diamond is not considered a system because its parts do not move or change, although technically its subatomic particles are in motion. “Interdependent” is also an important term because what happens to one part of the system affects the other parts. Remove the roots from a tree and it will die. Break some of the electrical connections in a computer and it will not function. A serious accident to one member of a family will affect the other members of the family, and so on.

The attributes of components are their characteristics, and they matter because of the components’ interdependence. If the roots of a tree are dry, if the connections of a computer are broken, if a member of a family is injured, then the dryness, the brokenness and the injury are all attributes that belong to the system and affect its other components.

When we speak of a system forming a whole, we are making a very significant statement about the order of the system. This wholeness has several important aspects. One is that the system functions as a whole and can be recognized as a whole. A family, a tree, a computer can each be recognized as a whole made up of parts. Another aspect is that removing one or more components undermines or destroys the wholeness. A branch is not a whole tree, a keyboard is not a whole computer, a mother is not a whole family. This aspect of wholeness is one of the most profound dimensions of order. We may take it for granted, but it seems almost mystical or magical at times. The way a tree grows from a seed, or in some cases can be propagated from rooted cuttings, involves complex living processes that scientists still do not fully understand. The bonds between members of a family, though not physically visible, are so strong that they can motivate sacrifices even of life itself. The loosening of those bonds, when a child grows up and leaves home or when a divorce occurs, can be painful. The forces that make a system whole tend to keep it whole and to resist the break-up of the system. (Perhaps this cannot be said of an inanimate object such as a computer, but it is very true of living systems.)

2. The environment of a system consists of all objects and forces external to the system such that a change in the environment’s attributes or actions affects the system and vice versa. A pond is an environment for a fish. A home is an environment for a family. Von Bertalanffy and other systems thinkers define environment as external things which have a relationship with or impact on the system. Thus the environment for each system can change over time.

If you work for an American company, much of what goes on in Japan is not part of your environment. But when a Japanese competitor starts taking your customers’ business, your environment changes dramatically. In fact, many of the forces that are shrinking our world into a closely linked, interactive economy are forcing more and more people to think of the whole earth as their environment. From another perspective, the world economy is becoming a system in itself; and when we discuss subjects such as global weather and protection of the environment, the earth itself becomes a system of interacting parts! We may summarize all this by saying that “environment” is sometimes arbitrary and always relative to how we define the system we are focusing on.

3. All living systems are open systems. Open systems maintain themselves in a continuous interchange with their environments, importing and exporting matter and energy. One essential aspect of life is that every living organism depends on importing energy from its environment in order to survive. Some of that energy may be warmth from the sun or fire. Some of it is always in the form of food or nutrients, which the organism processes to release chemical energy or form new chemicals essential to life. And all living systems give off by-products, ranging from animal wastes to oxygen released by plants. The by-products almost always have lower-order energy than the ingested nutrients for that organism, although other organisms may still take energy from the wastes or inhale the oxygen from plants.

A closed system, such as a computer that is switched off, can exist and persist without importing energy. But any system in which there is physical, electrical or other motion must at some point have energy from the outside to power its motion, such as the electricity which runs the computer. Actually a closed system is in many ways a product of the imagination of scientists, who need to isolate experiments from their environments in order to observe the results, such as heating an enclosed gas to observe the increase in pressure. A closed system is an unnatural device, of little value to anyone except to observe how it functions when energy is applied. Even with closed systems, then, the relationship between order and energy is essential.

4. Open systems are always acting and changing. However, they have a strong tendency to reach and maintain a balance known as a steady state. For example, human beings are always acting and changing, yet our bodies have a strong tendency to maintain a balance. When we use up our energy reserves, we become hungry. When we get overheated, we perspire. When we become tired, we desire rest. The steady state often represents an optimum condition that the system seeks to return to again and again, even though the system is in almost constant motion or change.

5. In open systems, the same steady state may be reached from different initial conditions and pathways. This is called equifinality. For example, if a person is hungry, he can eat all sorts of foods in all sorts of locations in order to restore his steady state. Equifinality is another way of saying that open systems are adaptable; if there were not different pathways to the steady state, the survival of the open system would be very much at risk.

6. Living systems tend to evolve toward higher levels of order in terms of differentiation and organization. This is one of the most profound aspects of living systems, and it is the exact opposite of the entropy principle, the tendency to evolve toward greater disorder and uniformity. It is very important to understand that the second law of thermodynamics applies not to everything in the universe but only to closed or inert systems such as machines. In living systems there is what seems to be an innate drive for order. And how is this higher order achieved? Through processing energy! Whether the energy comes from sunlight or food, it is essential to living systems, human beings included. This principle again demonstrates the profound significance of the relationship between order and energy in all living things.

7. In human systems, the primary means of evolving toward higher levels of order, differentiation and organization is the communication of information, especially in the form of decisions. Get a group of people together for the first time and they may just mill around in a state of very low order. But once the group begins making decisions as to what it will do collectively, or how labor will be divided, it begins evolving toward a higher level of order. (The tension between achieving higher levels of order and succumbing to more animal impulses was graphically illustrated in Lord of the Flies.)

8. Some order is brought about by the dynamic interaction of the components. Order may also be achieved by feedback, through which the effects of actions are transmitted back to the source of the actions to allow self-regulation. A thermostat is a classic example of a feedback loop in a system – when the temperature drops to a certain level, the thermostat sends a signal to the heater to send more heat. Feedback is essential to the success of any living system, and modern management consultants stress the value of feedback from customers to keep an organization functioning at peak performance.
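A minimal sketch of that feedback loop in Python, with illustrative numbers for the setpoint, the heat lost to the environment, and the heater’s strength:

    SETPOINT = 20.0    # desired room temperature, degrees C
    HEAT_LOSS = 0.5    # degrees lost to the environment each step
    HEATER_GAIN = 1.5  # degrees added per step while the heater runs

    temperature = 16.0
    for step in range(12):
        heater_on = temperature < SETPOINT   # the feedback decision
        temperature -= HEAT_LOSS             # the environment acts on the system
        if heater_on:
            temperature += HEATER_GAIN       # the system corrects itself
        print(f"step {step:2d}: {temperature:5.1f} C, heater {'on' if heater_on else 'off'}")
    # The temperature climbs, then hovers near the setpoint: a steady state.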

9. A high level of order tends to make a living system and its parts function more efficiently, but it also tends to restrict or abolish the equality of power among the parts. For example, in most organizations a high level of order means different people have different power. There are typically a chairman, president, vice presidents, assistant vice presidents, managers, etc., each with a particular level of power.

10. Adaptive systems try different means to their ends or goals and eventually settle into a pattern of interaction which minimizes conflict with critical factors in the environment. The tendency of nations to join together through the United Nations to support world peace is a good example. So is the way a husband and wife tend, over the years, to adapt to each other more and more, resulting in fewer conflicts.

Order and Energy in Systems

Order and energy are absolutely essential for all living systems, all forms of life. Energy is either taken in directly, in the form of sunlight or heat, or released within the system by a biochemical process such as the oxidation of carbohydrates, making the maintenance of order possible. This is what some researchers call dissipative structure: energy dissipates through the living system. The living system must have energy to survive. And what makes it alive is its order: its division, for example, into nucleus and membrane within each individual cell. Order and energy are not just necessary for life; they are the very essence of life. Think about that for a moment before moving on to the next chapter.
