HEAT

                                    A Brief Overview of Various Interpretations

                                               

The Conservation of Energy

      During the historical period of the wave and corpuscular theories of light and the theories of the ether, the properties of heat were also subject to intense research. The most important law of all physics, the First Law of Thermodynamics, was discovered. Few scientific theories have withstood the passage of time without modification as the First Law of Thermodynamics has. It states that energy cannot be produced or destroyed; it can only be converted. It is also called the Law of the Conservation of Energy. This law simply expresses that whenever, and in whatever manner, energy is converted into useful work, some part of it is converted into heat.
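
      In modern notation, not used in the historical sources discussed here, the First Law is commonly summarized by the balance

            \Delta U = Q - W

where \Delta U is the change in a system’s internal energy, Q the heat supplied to it, and W the work it performs. It is given only as a present-day reference point for the statements above.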

     The laws of mechanical motion and the conservation of energy and momentum were well understood in Newton’s time. The nature of heat was not. The theories of heat were not specifically connected with the theories of the ether. Many of the developers of the various theories of light assumed that heat is an oscillatory motion of the smallest parts of bodies. Around the middle of the nineteenth century this presumption was extended by the notion that heat is a radiated form of energy. In 1841 Joule summarized the earlier findings of Faraday, Carnot, W. Thomson and others, and supplemented them with his own finding that the seemingly lost electrical energy in a circuit is not lost, but converted into heat. Gradually it became known that mechanical energy can be converted into heat and, conversely, that the energy of heat can be converted into mechanical work.

    In 1847, Helmholtz published the first formal assertion that the conservation of energy is a universal principle. Concurrently with the recognition that all energy that appears to be lost is ultimately converted into heat, other researchers worked on the problem of how the energy of heat leaves a body and is ‘radiated’ through space. Since Maxwell’s (1831-79) theory and equations it has been well known that radiated heat is electromagnetic radiation, part of the same spectrum as light, radio waves, x-rays, and gamma rays. Maxwell developed his famous, and still valid, equations of electromagnetic radiation upon the basis of his mechanical model of the ether. A review of the various interpretations associated with heat is instructive.

 

Heat as a Weightless Fluid

      The theory of heat as a weightless fluid was the first explanation. It was observed that the combined weight of a heated object immersed in colder water remained the same after their temperatures had equalized. This experiment seemed to prove that heat is a weightless, invisible fluid. Heat seemed to be ‘poured’, by analogy to a fluid, from a higher-temperature substance to a lower-temperature one. It is an analogue of mechanical motion, like water flowing from a higher elevation to a lower one and never in the reverse direction. The quantity of the energy of this invisible fluid had to be measured and expressed in some way.

      This led to the 19th-century idea of ‘calorique’ (caloric), the quantity of heat measurable experimentally by physical means. First a standard of measurement had to be selected and a quantitative scale devised. Historically, the expansion of heated air was the first property chosen to measure temperature. The Celsius scale arbitrarily marked the temperature of boiling water as 100° C and that of melting ice as 0° C. The notion of an ‘absolute zero’ temperature first occurred to Amontons, Boyle, and Mariotte. Knowing that there are temperatures colder than 0° C, the thought occurred that no weightless fluid would remain in substances at an absolute zero temperature. By simple extrapolation of the Celsius scale, the absolute zero temperature was expressed (-273.15° C, -459.67° F). Absolute zero temperature is very important regarding the fundamental properties of matter, but it will not be pursued further here.
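
      The extrapolation can be sketched with Gay-Lussac’s (Charles’s) law in its modern form; the figure 273.15 is the present-day value and is used here only for illustration:

            V(t) = V_0 \left( 1 + \frac{t}{273.15\,^{\circ}\mathrm{C}} \right)

Setting V(t) = 0 gives t = -273.15° C, the temperature at which an ideal gas, and with it the supposed weightless fluid, would vanish entirely.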

 

Corpuscular Theories

      A second interpretation equated heat with mechanical energy, which was well understood by the 19th century. Heat energy was seen as due to the rapid mechanical motion of molecules. The first molecular theories of heat were based upon the investigation of gases. The impacts of rapidly moving gas molecules were thought to create pressure in a closed vessel. Boyle and Gay-Lussac found mathematical laws expressing the relationship between the pressure, temperature, and volume of gases based upon actual experimental data. The findings corresponded with Newton’s laws of motion. The mechanical law of the conservation of momentum corresponded with the conservation of heat energy. This finding strongly supported the molecular theory of heat. The corpuscular theory of gases replaced the invisible fluid theory. This also reflected the philosophical views of the rationalist science of that era, which was dominated throughout by Newtonian mechanics.
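
      In present-day notation, not the historical one, the laws referred to above are usually summarized as follows:

            PV = \text{const.} \quad (\text{fixed } T,\ \text{Boyle}), \qquad \frac{P}{T} = \text{const.} \quad (\text{fixed } V,\ \text{Gay-Lussac}), \qquad PV = nRT \quad (\text{combined ideal gas law}),

where n is the amount of gas and R the universal gas constant.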

      In the latter part of the 19th century Maxwell created a mathematical theory of heat based upon a statistical model of gas molecules. The probability distribution of gas molecules was based upon a large number of observations. From Maxwell’s statistical theory Boltzmann derived a mathematical function of heat and entropy. The statistical model and theory further extended the supposition that the heat, volume, and pressure of gases are due to the random motions, impacts, and kinetic energy of corpuscular gas molecules.
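
      The distribution Maxwell derived is usually written today as the Maxwell–Boltzmann speed distribution; the notation is modern and given only for orientation:

            f(v) = 4\pi \left( \frac{m}{2\pi k T} \right)^{3/2} v^{2} e^{-\frac{m v^{2}}{2 k T}}

where m is the molecular mass, T the absolute temperature, k Boltzmann’s constant, and f(v) the probability density of molecular speeds.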

 

Entropy

      There is no single physical interpretation of the word entropy. The equation expressing entropy quantitatively is the Second Law of Thermodynamics, which was originally based upon overwhelming empirical evidence: heat cannot pass by itself from a colder to a warmer body, and heat is always lost when one type of energy is converted into another. In the middle of the nineteenth century, Clausius first defined the Second Law of Thermodynamics: “The entropy of an isolated system never diminishes.” There are no perfectly isolated systems in experimental settings. Applied to an open system, the same law means that the system’s entropy always diminishes. Its meaning therefore is that in an open system an invisible part of the total energy escapes in the form of heat. Consequently, Clausius defined entropy, the lost energy, as being negative.

     In reality, when a quantity of heat energy is utilized as mechanical work, or in other useful ways, a part of the total energy is always lost. Clausius saw this diminishing part as the dissipated part of the total entropy. The entropy of a system was seen as the ‘transformation’ of part of the mechanical, chemical, or electrical energy into a massless medium which leaves the system.

      A simple mathematical equation states the Second Law: the change in entropy is equal to the quantity of heat divided by the absolute temperature. In a closed system, meaning a system theoretically completely isolated from the outside world, heat cannot escape and entropy is constant. This must be so to obey the First Law: all energy is conserved, regardless of the internal changes. However, in the real world there are no isolated systems, and the useful work obtainable is less than the total energy converted. The puzzle is that energy seems to disappear in an invisible, weightless form. This invisible, unrecovered part has had several interpretations. The Second Law is simple, but it does not explain in physical terms what entropy means.
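
      In its standard modern form, Clausius’s quantitative statement reads

            dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S \ge 0 \ \text{for an isolated system},

where \delta Q_{\mathrm{rev}} is the heat reversibly exchanged and T the absolute temperature at which the exchange occurs; the inequality is the Second Law restated for the idealized isolated case discussed above.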

      Subsequent to Clausius, Maxwell interpreted entropy as a degree of ‘disorder’ of a system. His model was based upon the presumption of two chambers containing randomly moving gas molecules at different pressures. The chambers were connected through a small orifice; therefore the pressures of the gas in the two chambers eventually become equal. The notion of entropy was further analyzed by Boltzmann, who interpreted it to mean that a closed system tends toward an equilibrium state of maximum probability. With respect to gases, that means the equalization of pressure, temperature, and other variables. His interpretation of entropy was that orderly forms of energy degenerate into disordered ones as part of the available energy is converted into useful work. The idea of order versus disorder is not clear unless related to physical referents; consequently, not all members of the scientific community share this meaning of entropy.
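
      Boltzmann’s statistical reading of entropy is commonly summarized, in modern notation, by the relation

            S = k \ln W

where W is the number of microscopic configurations compatible with the observed macroscopic state and k is Boltzmann’s constant; the state of maximum probability, that is, the largest W, then corresponds to maximum entropy.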

            Clausius’s interpretation of entropy led him to the belief that the universe will end in a “thermal death”. He meant that all physical changes will come to a standstill because all useful energies will be dissipated into space in the form of heat. This may be viewed as ‘universal order’, just the opposite of Boltzmann’s interpretation of entropy as a degree of disorder.

            Milne challenged Clausius’ widely shared interpretation in 1931. He believed that the law of entropy applies only to closed systems, which have an inside and an outside in which measurements are possible. Theoretically, he said, we have no way to measure the entropy of the entire universe, which has no outside. Milne’s view of the universe, like the other interpretations, is void of physical meaning. Interpretations of the meaning of entropy have diversified and to a degree obscured its original explanation by Clausius. Loschmidt argued against the irreversibility of increasing entropy, based on the symmetry of dynamics with respect to time.

            Later, around the turn of the 20th century, Zermelo argued that the finite motion of a molecular system must be cyclical. His opinion was based on Poincaré’s dynamics. In his opinion, under certain general conditions, the initial state of a molecular system recurs infinitely in cycles. Such arguments are difficult to understand in physical terms, being increasingly based upon mathematical reasoning in place of physical explanation.

            The Ehrenfest couple tried to resolve the opposing views. The statistical interpretation of entropy is based on averages of disorder and thus does not preclude the possibility of decreases and increases of entropy occurring with equal frequency within a closed system. To Whitrow, Maxwell’s imaginary experiment was responsible for the introduction of a “sorting demon”, who selectively allowed the entrance of fast and slow gas particles into the two different sides of a divided vessel through a small opening. Maxwell viewed this as a contradiction of the Second Law of Thermodynamics.

            Leo Szilard, one of the inventors of the atomic bomb, viewed the demon as an active part of the closed system, which converts information into negative entropy. The interpretations of entropy have become more and more obscure for those who prefer to see the descriptions of the laws of the universe in meaningful physical terms.

            Shannon created a mathematical system of information theory in 1948. He found that the random motion of atoms and molecules created by heat in an electronic circuit, commonly known as noise, sets the ultimate limit to the electronic communication of information. He also found that Boltzmann’s equation satisfied the general requirements set by his theory. Whitrow connects Shannon’s theory of electronic communication of information with the notion of entropy. He writes, “The fact that negative entropy is identified with information is due to Clausius’ unfortunate choice of the term ‘entropy’ to denote the negative of the ‘availability for work’ of the heat in a given system.”
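
            The formulas usually associated with these claims are Shannon’s entropy of a message source and his capacity limit for a noisy channel, both given here in their standard modern form:

            H = -\sum_{i} p_{i} \log_{2} p_{i}, \qquad C = B \log_{2}\!\left( 1 + \frac{S}{N} \right),

where p_i are the probabilities of the possible symbols, B is the channel bandwidth, and S/N the ratio of signal power to noise power. It is the formal resemblance of H to Boltzmann’s expression that invites the comparison with thermodynamic entropy.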

            Such a connection is an individual interpretation of the works of others, having little to do with the explanation of entropy. Shannon simply used, as many other scientists do, Boltzmann’s equation of heat to express the energy of electronic noise in circuits. The separation of the intelligence-carrying part of an electronic communication from its obscuring noise is a difficult task. Shannon’s theory sets a theoretical limit on such separation in terms of the heat generated in the active and passive components of amplifiers, and of the noise that is also part of the received and amplified electromagnetic waves carrying the intelligence.

            The successful utilization of a quantitatively correct equation obtained by one model of reality and its associated theory does not necessarily justify the same interpretations in connection with an unrelated or distantly related theory. Hull, a practicing engineer, applied the Second Law in rheology and compared the measured and calculated properties of a few dozen materials. He poses an interesting question: “Does the fact that the thermal and statistical entropy both obey the second law prove that they are identical qualities?” His blunt answer is, “No! For one, entropy could increase while the other remained the same. This problem of the identity of the two definitions of entropy bothered me.” Hull found no solution in the literature, and his opinion was based on his own tests.

            Mendelssohn comments on the Second Law: “Admittedly, its thermodynamic formulation, while perfectly clear and unambiguous, does not present a concept which can be grasped.” “This law simply says that only those processes can take place in which the entropy increases or, at best, remains constant.” (pp. 95-96) The above references show that no agreement exists regarding the physical nature and meaning of entropy. The words closed and open system, order, disorder, weightless fluid, etc., all sound very familiar, but they are meaningless when used as a general characterization of something real. Meaningful interpretations require factual referents.

            Scientists and engineers, those who actually apply the equation of entropy and test the results, clearly discriminate between useful and lost energies. If one applies the First and the Second Laws of Thermodynamics consistently, then entropy remains constant when the system is idealized as completely ‘closed’. But in practice there is no isolated system from which no energy can escape. A ‘system’ in the real world consists of the ‘internal’ apparatus, such as, in practical terms, an automobile engine, which produces the energy humans utilize. The total internal energy has another, unutilized ‘external’ part: the energy the system loses to the environment, and ultimately to the universe. The total system consists of the apparatus and the universe.

The problem of interpretation arises clearly from the fact that entropy was originally given by a mathematical formula, and as such, its symbols are always subject to interpretation. The above cases provide an excellent illustration of the influence of human ‘conduct’. Different experimental conditions were chosen to examine the nature of heat, and rival theories and interpretations were their logical ‘consequences’, as they also were in connection with the nature of light. The ideal of knowing cause-and-effect relationships is tainted by human conduct. Philosophical and logical consistency demands seeing inanimate nature as various forms and compositions of matter, down to the smallest invisible corpuscles; that is seen, and therefore defined, as ‘the primary essence of matter’.

            Maxwell’s model of randomly moving gas particles is a physical model, but it does not account for radiated heat, which ultimately becomes part of a seemingly empty space. Logical consistency demands the notion that a void cannot carry energy. Consequently, my conjecture is that radiated heat and forces acting through distance, forms of energy so far not accounted for, must be the manifestations of some massless material entity that carries energy over the vast distances of space, such as the ether was once thought to be before it was abandoned.