  • Brillouin Science And Information Theory Pdf Merge
    Uncategorized · 2020. 2. 19. 05:28

    Abstract: Brillouin sees order as generated by tapping negentropy sources existing upstream, while Prigogine sees it as generated by dumping entropy downstream. Joining the two ideas yields a picture of the computer closely paralleling that of Carnot's heat engine. The difference is that the one delivers information and the other, work.

    In either case the loss that is irretrievable (by definition) occurs at the last step. Bennett and Landauer very rightly emphasize this, but their fixation on the condenser blinds them to the necessity of the furnace; thus they are led to believe in the possibility of “perpetual duplication of the second kind,” which Brillouin explicitly denies.

    In an important 1949 article entitled 'Life, Thermodynamics, and Cybernetics,' Brillouin was inspired by Norbert Wiener's new book Cybernetics and its connection of the new information theory with entropy and intelligence. One of the most interesting parts in Wiener's Cybernetics is the discussion on 'Time series, information, and communication,' in which he specifies that a certain 'amount of information is the negative of the quantity usually defined as entropy in similar situations.' This is a very remarkable point of view, and it opens the way for some important generalizations of the notion of entropy. Wiener introduces a precise mathematical definition of this new negative entropy for a certain number of problems of communication, and discusses the question of time prediction: when we possess a certain number of data about the behavior of a system in the past, how much can we predict of the behavior of that system in the future? In addition to these brilliant considerations, Wiener definitely indicates the need for an extension of the notion of entropy. 'Information represents negative entropy'; but if we adopt this point of view, how can we avoid its extension to all types of intelligence? We certainly must be prepared to discuss the extension of entropy to scientific knowledge, technical know-how, and all forms of intelligent thinking. Some examples may illustrate this new problem.

    Take an issue of the New York Times, the book on Cybernetics, and an equal weight of scrap paper. Do they have the same entropy? According to the usual physical definition, the answer is 'yes.' But for an intelligent reader, the amount of information contained in the three bunches of paper is very different. If 'information means negative entropy,' as suggested by Wiener, how are we going to measure this new contribution to entropy? Wiener suggests some practical and numerical definitions that may apply to the simplest possible problem of this kind. This represents an entirely new field for investigation and a most revolutionary idea.

    ('Life, Thermodynamics, and Cybernetics,' American Scientist, 37, p. 554) In his 1956 book Science and Information Theory, Leon Brillouin coined the term 'negentropy' for the negative entropy (a characteristic of free or available energy, as opposed to heat energy in equilibrium). He then connected it to information in what he called the 'negentropy principle of information.' Brillouin described his principle as a generalization of Carnot's principle that in the normal evolution of any system the change in entropy is greater than or equal to zero (ΔS ≥ 0); counting information I as negative entropy, the generalized principle reads

    Δ(S − I) ≥ 0

    New information can only be obtained at the expense of the negentropy of some other system.
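
    Written out (our rearrangement of the inequality above, not a quotation from the book): Δ(S − I) ≥ 0 is equivalent to ΔI ≤ ΔS, so the information gained in any process can never exceed the entropy increase that pays for it.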

    The principal source of negentropy for terrestrial life is the sun, which acquired its low entropy state from the expanding universe followed by the collapse of material particles under the force of gravity. Brillouin summarizes his ideas:

    Acquisition of information about a physical system corresponds to a lower state of entropy for this system. Low entropy implies an unstable situation that will sooner or later follow its normal evolution toward stability and high entropy. The second principle does not tell us anything about the time required, and hence we do not know how long the system will remember the information. But, if classical thermodynamics fails to answer this very important question, we can obtain the answer from a discussion of the molecular or atomic model, with the help of kinetic theory: the rate of attenuation of all sorts of waves, the rate of diffusion, the speed of chemical reactions, etc., can be computed from suitable models, and may vary from small fractions of a second to years or centuries. These delays are used in all practical applications: it does not take very long for a system of pulses (representing dots and dashes, for instance) to be attenuated and forgotten when sent along an electric cable, but this short time interval is long enough for transmission even over a long distance, and makes telecommunications possible.
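
    These memory times can be illustrated with a toy model (a sketch under assumed numbers, not a calculation from Brillouin's text): treat a stored signal as decaying exponentially with relaxation time τ and ask how long it remains readable above the noise.

        # A minimal sketch: a stored signal decays as exp(-t/tau); the
        # information it carries is readable only while the amplitude stays
        # above the noise floor. All numbers are illustrative assumptions.
        import math

        def memory_time(amplitude, noise_floor, tau):
            """Time until an exponentially decaying signal falls to the noise floor."""
            return tau * math.log(amplitude / noise_floor)

        # A pulse on a cable: forgotten within milliseconds, yet long enough
        # for transmission over a long distance.
        print(memory_time(1.0, 1e-3, tau=1e-3))   # ~0.0069 s
        # A slowly relaxing system (tau of about a year) remembers for years.
        print(memory_time(1.0, 1e-3, tau=3.2e7))  # ~2.2e8 s, roughly 7 years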

    A system capable of retaining information for some time can be used as a memory device in a computing machine. The examples discussed in the preceding section are not only interesting from a theoretical point of view, but they also show how to attack a practical problem. Let us consider, for instance, the problems of diffusion and spin distribution. The information stored in such a system corresponds to a decrease in entropy.

    Our discussion shows how this situation is progressively destroyed by diffusion and collisions that increase the entropy and erase the information. Entropy is usually described as measuring the amount of disorder in a physical system. A more precise statement is that entropy measures the lack of information about the actual structure of the system. This lack of information introduces the possibility of a great variety of microscopically distinct structures, which we are, in practice, unable to distinguish from one another. Since any one of these different microstructures can actually be realized at any given time, the lack of information corresponds to actual disorder in the hidden degrees of freedom. This picture is clearly illustrated in the case of the ideal gas. When we specify the total number n of atoms, their mass m, their degeneracy factor g, and the total energy E, we do not state the positions and velocities of each individual atom. Since we do not specify the positions and velocities of the atoms, we are unable to distinguish between two different samples of the gas when the difference consists only in different positions and velocities for the atoms.
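
    The quantitative bridge between 'lack of information' and entropy is Boltzmann's relation (standard statistical mechanics, stated here for concreteness): if P is the number of microscopically distinct structures compatible with what we do specify (here n, m, g, and E), then S = k ln P, and information that cuts the number of admissible microstructures from P₀ to P₁ is worth I = k ln(P₀/P₁) in entropy units; the unspecified positions and velocities are precisely the hidden degrees of freedom.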

    Hence we can describe the situation as one of disordered atomic motion. The origin of our modern ideas about entropy and information can be found in an old paper by Szilard [5], who did the pioneering work but was not well understood at the time. The connection between entropy and information was rediscovered by Shannon [6], but he defined entropy with a sign just opposite to that of the standard thermodynamical definition. Hence what Shannon calls entropy of information actually represents negentropy. This can be seen clearly in two examples (pages 27 and 61 of Shannon's book) where Shannon proves that in some irreversible processes (an irreversible transducer or a filter) his entropy of information is decreased. To obtain agreement with our conventions, reverse the sign and read negentropy. The connection between entropy and information has been clearly discussed in some recent papers by Rothstein [7], in complete agreement with the point of view presented in this chapter.
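
    The sign convention can be made concrete in a few lines of code (an illustrative sketch; the function name is ours):

        # Shannon's H = -sum p*log2(p) for a discrete source. Shannon calls H
        # the "entropy of information"; in the convention of this chapter the
        # same quantity measures negentropy, so an irreversible filter that
        # lowers H is destroying negentropy, not entropy.
        import math

        def shannon_H(probs):
            """Shannon entropy in bits of a discrete probability distribution."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        print(shannon_H([0.5, 0.5]))  # 1.0 bit: two equally likely symbols
        print(shannon_H([0.9, 0.1]))  # ~0.47 bit: after an irreversible bias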

    On Measurement Errors and Determinism

    Brillouin emphasizes that experimental errors are inevitable and that it is unscientific to think of infinite accuracy in any measurement, a point earlier physicists already knew to be the case. Brillouin says that this makes strict determinism impossible in scientific predictions. Laplace's demon cannot acquire the infinite information needed to predict the future perfectly, just as Maxwell's demon cannot acquire the information needed to violate the second law, without destroying an equivalent amount of negentropy. The natural evolution of any closed system involves a loss of information. Mechanical laws are supposed to be reversible in time; the same is said of the unitary evolution of the Schrödinger equation in quantum mechanics, but this is true only if errors and experimental uncertainties are ignored. The theory of information provides us with a possibility to define the amount of information obtained from a certain experiment, and to measure it in a precise way.

    We only need to know the field of uncertainty before and after the observation. The logarithm of the ratio of these two uncertainties yields the amount of information.
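
    In symbols (our transcription of this rule, in the notation used above): if the quantity measured is known beforehand only to lie within an uncertainty Δ₀, and the observation narrows this to Δ₁, the information gained is I = k ln(Δ₀/Δ₁), or log₂(Δ₀/Δ₁) bits. Narrowing the uncertainty tenfold, for instance, yields k ln 10 ≈ 2.3 k, about 3.3 bits.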

    If the final uncertainty is very small (a very accurate measurement), the information obtained is very large. The mathematician dreams of measurements of infinite accuracy, defining for instance the position of a point without any possible error. This would mean an experiment yielding an infinite amount of information, and this is physically impossible. One of the most important results of the theory is known as the 'negentropy principle of information.' It states that any information obtained from an experiment must be paid for in negentropy. A very large amount of information will cost a very high price in negentropy.
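
    In ordinary units (a standard conversion added for concreteness, not a passage from the book): one bit of information corresponds to k ln 2 ≈ 0.96 × 10⁻²³ J/K of entropy, so acquiring one bit costs at least k ln 2 of entropy increase in the measuring apparatus, and hence at least kT ln 2 ≈ 2.9 × 10⁻²¹ J of degraded energy at room temperature. The price rises with every additional bit.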


    An infinite amount of information is unattainable. An infinitely short distance cannot be measured, and a physical continuum in space and time is impossible to define physically. The role of experimental errors has been known for a very long time and was recognized by all scientists; but it was usually considered a secondary effect, a source of nuisance that could be neglected on most occasions and should be ignored by the theory. The assumption was that errors could be made 'as small as might be desired' by careful instrumentation, and played no essential role.

    This was the point of view of mathematicians discussing the axioms of geometry, and most physicists accepted, implicitly or explicitly, this kind of idealization. Modern physics had to get rid of these unrealistic schemes, and it was indispensable to recognize the fundamental importance of errors, together with the unpleasant fact that they cannot be made 'as small as desired' and must be included in the theory. The first instance was found in connection with statistical thermodynamics, but it was usually toned down and led to many (in our opinion often meaningless) discussions such as: how is it possible to obtain irreversible thermodynamics from strictly reversible mechanical laws?

    We shall come back to this problem when discussing the exact meaning of determinism, and show that it corresponds to a metaphysical creed, not to a physical law. With Heisenberg's uncertainty principle, the fundamental role of experimental errors became a basic feature of physics. An additional law, called the 'negentropy principle of information,' was stated in Chapters 12 and 16. It states that an observation yields a certain amount of information ΔI, and that this information can be quantitatively measured and compared with the entropy increase ΔS during the experimental measurement. The net result is Δ(S − I) ≥ 0, that is, ΔS ≥ ΔI (in entropy units).

    The Problem of Determinism

    The laws of classical mechanics represent a mathematical idealization and should not be assumed to correspond to the real laws of nature.

    In many problems (astronomy, for instance) they yield wonderful results that agree with observation within experimental errors. In other fields they had to be amended (relativity, quantum mechanics). The classical viewpoint was to ignore the actual role and importance of experimental errors. Errors were assumed to be accidental; hence, it was always imagined that they could be made as small as one wished and finally ignored. This oversimplified picture led to the assumption of complete determinism in classical mechanics. We now have to realize that experimental errors are inevitable, a discovery that makes strict determinism impossible.

    Errors are an essential part of the world's picture and must be included in the theory. Causality must be replaced by statistical probabilities. A scientist may or may not believe in determinism; it is a matter of faith, and belongs to metaphysics.

    Physical discussions are unable to prove or to disprove it. This general viewpoint may be called the 'matter of fact' position.

    Born states the situation very clearly. He quotes Einstein as saying that before quantum mechanics, it was assumed that 'everything was to be reduced to objects situated in space-time, and to strict relations between these objects. Nothing appeared to refer to our empirical knowledge about these objects. This is what was meant by a physical description of a real external world.' This position appears untenable in modern physics. We have no way to prove the existence of such a real external world, and it is very dangerous to speak of something we cannot observe. If we restrict our thinking to observable facts, we can only speak of possible relations between one experiment and another, but we should never discuss what happens while we are not making any observation; we must candidly admit that we do not know (any more than we know what happens on the other side of the moon).

    The position defined in this way is taken by M. Born and agrees with the philosophy of science stated by the Vienna school.

    Is such a viewpoint accepted by all physicists? The answer is far from clear. Pure mathematicians have great difficulty in agreeing with this inclusion of errors within the theory, and many theoretical physicists are still mathematicians at heart.

    The uncertainty relations of Bohr and Heisenberg are based upon the kind of thinking we have tried to define. But when one looks at the further expansion of quantum theories, one is amazed at the many fancy visualizations describing physics in terms of unobservable entities. The language of physicists is loaded with a jargon understandable only to specialists; special names have been coined for terms in a series of approximations, as if each isolated term had a meaning (exchange terms, pair creation, virtual creation and absorption of particles, etc.). Actually, only the final sum matters. Wise men know where and how to use these figures of language, and they are aware of their complete lack of reality.

    They realize that the jargon represents no more than an artificial way of describing complicated equations; but many physicists may be misled by such methods, which are really dangerous. In brief, quantum theory pays lip service to the sound principle of matter-of-fact descriptions, but soon forgets about it and uses a very careless language. Besides mathematicians and quantum theoreticians, many scientists feel very reluctant to face the situation described above and to abandon old-fashioned ideas.

    They still believe in a real physical world following its own unperturbed evolution, whether we observe it or not. In order to reconcile this view with recent physical discoveries, they have to invent the existence of a number of 'hidden variables' that we are unable to observe at present. In our opinion these hidden variables may do more harm than good. If we cannot observe them, let us admit that they have no reality and may exist only in the imagination of their authors. This is not meant as sarcasm. Imagination is absolutely needed in scientific research, and many important discoveries were, at the beginning, pure works of imagination; they became important only later, when experimental proof was obtained and checked against the results predicted by pure imagination. Finally, the new experimental discoveries became the scientific basis for the part that had been verified by experiment.

    (Note that it is Brillouin, not Borel, who suggests Sirius.) Borel, for instance, computed that a displacement of 1 cm of a mass of 1 gram, located on a not too distant star (say, Sirius), would change the gravitational field on the earth by a fraction of 10⁻¹⁰⁰. The present author went further and proved that any information obtained from an experiment must be paid for by a corresponding increase of entropy in the measuring device: infinite accuracy would cost an infinite amount of entropy increase and require infinite energy!
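
    The divergence is easy to exhibit numerically (our sketch of the argument; the accuracy factors are assumptions chosen for illustration):

        # Improving a measurement's accuracy by a factor 10**n yields
        # I = k*n*ln(10) of information, which by the negentropy principle
        # must be paid for by at least that much entropy increase. The cost
        # grows without bound as the accuracy improves: infinite accuracy
        # would cost infinite entropy.
        import math

        k = 1.380649e-23  # Boltzmann constant, J/K

        def entropy_cost(n):
            """Minimum entropy increase (J/K) to gain a factor 10**n in accuracy."""
            return k * n * math.log(10)

        for n in (1, 10, 100, 1000):
            print(n, entropy_cost(n))  # rises linearly in n, without limit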

    This is absolutely unthinkable. Let us simplify the problem, and assume that the laws of mechanics are rigorous, while experimental errors appear only in the determination of initial conditions. In the bundle of trajectories defined by these conditions, some may be 'nondegenerate' while others may be 'degenerate.'

    And the bundle may soon explode, dividing into a variety of smaller bundles forging ahead in different directions. This is the case for a model corresponding to the kinetic theory of gases. Borel computes that errors of 10⁻¹⁰⁰ in the initial conditions will enable one to predict molecular collisions for a split second and no more. It is not only 'very difficult' but actually impossible to predict exactly the future behavior of such a model. The present considerations lead directly to Boltzmann's statistical mechanics and the so-called 'ergodic' theorem.
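
    Borel's horizon can be sketched numerically (the amplification factor and collision rate below are assumed orders of magnitude, not Borel's own figures):

        # An initial fractional error eps is roughly multiplied by a fixed
        # factor at every molecular collision, so it reaches order 1 after
        # about log(1/eps)/log(growth) collisions.
        import math

        eps = 1e-100   # Borel's error in the initial conditions
        growth = 1e2   # assumed error amplification per collision
        rate = 1e9     # assumed collisions per molecule per second

        n_collisions = math.log(1 / eps) / math.log(growth)  # = 50 collisions
        print(n_collisions / rate)  # ~5e-8 s: prediction fails almost at once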
