[Figure 3.9: Gibbs paradox.]

[Figure 3.10]
3.6 Entropy and Statistical Probability

Using the macroscopic approach of thermodynamics, we have found that the equilibrium state of an isolated system is that state in which the entropy has its maximum value. From a microscopic point of view, we might expect that the equilibrium state of an isolated system would be the state with the maximum statistical probability. For simple systems we can use the molecular point of view to calculate the statistical probabilities of different final states. For example, assume that one mole of an ideal gas is in a container that is connected to a container of equal volume through a stopcock; this expansion has already been discussed in Example 3.3. Actually, it is a little easier to think about the reverse process, and so we will do that. As shown in Fig. 3.10, we will start with an opening between the two chambers and ask, what is the statistical probability that all of the molecules will be in the original chamber? The probability that a particular molecule will be in the original chamber is 1/2. The probability that two particular molecules will be in the original chamber is (1/2)^2, and the probability that all the molecules will be in the original chamber is (1/2)^N, where N is the number of molecules in the system. If the system contains a mole of gas molecules, the statistical probability that all the molecules will be in the original chamber is

    (1/2)^(6.022 × 10^23) = e^(−4.174 × 10^23)                    (3.45)

Boltzmann postulated that

    S = k ln Ω                                                    (3.46)

where k is Boltzmann's constant (R/N_A) and Ω is the number of equally probable microscopic arrangements for the system. This relation can be used to calculate ΔS for the transformation from an initial state with entropy S1 and Ω1 equally probable arrangements to a final state with entropy S2 and Ω2 equally probable arrangements:

    ΔS = S2 − S1 = k ln(Ω2/Ω1)                                    (3.47)

It is often difficult to count the number of equally probable arrangements Ω2 in the final state and Ω1 in the initial state, but the ratio Ω2/Ω1 in this case is equal to the ratio of the probability that all the molecules are in one chamber to the probability that they are in one chamber or the other. The probability that all the molecules are in the original chamber or the other chamber is, of course, unity. Thus, for the system in the preceding paragraph, we have already calculated the ratio of the probabilities that is equal to Ω2/Ω1. The change in entropy in going from the state with the gas distributed between the two chambers to the state with all the molecules in the original chamber is

    ΔS = k ln(Ω2/Ω1) = k ln e^(−4.174 × 10^23)
       = (1.381 × 10^−23 J K^−1)(−4.174 × 10^23)
       = −5.76 J K^−1                                             (3.48)

Since we are considering one mole of an ideal gas, the change in entropy for the expansion process in Example 3.3 is 5.76 J K^−1 mol^−1, in agreement with the result using Boltzmann's hypothesis. This confirms that the Boltzmann constant k is indeed given by R/N_A.
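The arithmetic in Eqs. 3.45–3.48 can be checked with a few lines of code. This is only a numerical verification of the constants quoted above (k = 1.381 × 10^−23 J K^−1, N_A = 6.022 × 10^23 mol^−1); it is not part of the original derivation.

```python
import math

k = 1.381e-23    # Boltzmann constant, J/K
NA = 6.022e23    # Avogadro constant, 1/mol

# Exponent in Eq. 3.45: ln[(1/2)^NA] = NA * ln(1/2), about -4.174e23
exponent = NA * math.log(0.5)

# Eq. 3.48: Delta S = k ln(Omega2/Omega1) = k * exponent, about -5.76 J/K
delta_S = k * exponent

print(f"ln(1/2)^NA = {exponent:.4g}")
print(f"Delta S    = {delta_S:.2f} J/K")
print(f"R ln 2     = {k * NA * math.log(2):.2f} J/(K mol)")
```

The last line shows the same magnitude arrived at macroscopically in Example 3.3, ΔS = R ln 2 for doubling the volume of one mole of ideal gas.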
If the gas molecules were all to be found in one chamber after having been distributed between the two chambers, we would say that the second law had been violated. We have just seen that the probability that such a thing might happen is not zero. It is, however, so small that we could never expect to be able to observe all the molecules in one chamber, even for systems containing much, much less than one mole of gas. If, however, we considered a system of only two molecules, then we could find both molecules in one chamber with reasonable probability.
This shows that the laws of thermodynamics are based on the fact that macro- scopic systems contain very large numbers of molecules.
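How fast the probability (1/2)^N collapses as the number of molecules grows can be seen in a minimal sketch (the particular values of N are arbitrary choices for illustration):

```python
# Probability that all N molecules happen to be in the original chamber.
# For N = 2 the event is quite likely; by N = 1000 it is already
# unobservably rare, long before N approaches a mole (~6e23).
for N in (2, 10, 100, 1000):
    p = 0.5 ** N
    print(f"N = {N:>5}: p = {p:.3e}")
```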
The equation S = k ln Ω embodies an important concept, but it is not used very often because it is difficult to calculate Ω. In Chapter 16 on statistical mechanics we will use other equations to calculate the entropy.
The collision of two gas molecules is reversible in the sense that the reverse process can also happen. If, after the molecules are moving away from each other, we could simply reverse the direction of the velocity vectors, the molecules would move along the same trajectories in the reverse direction. In short, the movie of the reverse process is just as reasonable as the movie of the forward process. This is true for both classical mechanics and quantum mechanics. If molecular collisions are reversible, then why is the expansion of a gas into a vacuum, or the mixing of two gases, irreversible? If we could take a movie of the expansion of a gas into a vacuum that would show the locations of all the molecules, we could tell whether the movie was being run forward or backward. However, if we were to look at each molecular collision, we would find that each followed the laws of mechanics and was reversible. If we were to look at the movie being run backward, we would feel that it was depicting something that could not happen. But why couldn't it happen? As a matter of fact, it could, but only if we could give all the molecules the positions they have at the end of the movie but then reverse their velocity vectors. If we then looked at the individual collisions, we would find that they would take all the molecules back to the region from which they had expanded. The reason this does not happen in real life is that it requires an extraordinarily special set of molecular coordinates and velocities. This set of coordinates and velocity vectors is so unlikely that thermodynamics says that the reverse process can never happen.

Comment:
The entropy of mixing of ideal gases brings up a very important idea, namely, that some processes happen spontaneously even though they do not reduce the energy of a system. Our experience with mechanics leads us to expect that if something happens spontaneously, there is necessarily a decrease in energy. Now we know that is not true, and we can expect to find chemical reactions that occur because of the contribution of a positive ΔS.

*F. L. Lambert, J. Chem. Educ. 79:187–192 (2002).
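The velocity-reversal argument can be illustrated with a toy one-dimensional simulation: non-interacting particles bounce elastically between two walls, every velocity is reversed, and the dynamics is run for the same number of steps; each particle retraces its trajectory back to its starting position. The box length, time step, and particle count here are arbitrary choices for illustration, not values from the text.

```python
import random

L = 1.0        # box length (arbitrary units)
dt = 0.01      # time step, small enough for at most one wall bounce per step
steps = 500

random.seed(1)
xs = [random.uniform(0.0, L) for _ in range(5)]    # positions
vs = [random.uniform(-1.0, 1.0) for _ in range(5)] # velocities
x0 = xs[:]                                         # remember the start

def step(xs, vs):
    # Free flight plus elastic (specular) reflection at the walls.
    for i in range(len(xs)):
        xs[i] += vs[i] * dt
        if xs[i] < 0.0:
            xs[i] = -xs[i]
            vs[i] = -vs[i]
        elif xs[i] > L:
            xs[i] = 2.0 * L - xs[i]
            vs[i] = -vs[i]

for _ in range(steps):
    step(xs, vs)
vs = [-v for v in vs]          # reverse every velocity vector
for _ in range(steps):
    step(xs, vs)

# Maximum deviation from the initial positions: essentially zero,
# showing that the microscopic dynamics is time-reversible.
print(max(abs(a - b) for a, b in zip(xs, x0)))
```

With ~10^23 interacting molecules the same reversal would in principle work, but preparing that exact set of positions and reversed velocities is the "extraordinarily special" condition the text describes.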
Since dS = dq_rev/T, the entropy is a measure of the flow of heat between a system and its environment. When heat is absorbed by the system from its surroundings, dq is positive and the entropy of the system increases. The energy flowing into the system is "dispersed" in the sense that it goes into increasing the energy of various molecular motions in the system. This concept of the dispersal of energy also applies to the expansion of an ideal gas into a vacuum. In this case q is zero, but the total energy of the gas is dispersed over a larger volume. Thus entropy is a measure of the dispersal of energy among the possible microstates of molecules in a system.
Sometimes entropy is referred to as "disorder," and a messy desk is referred to as a state of high entropy. Or shuffling a deck of playing cards is said to result in an increase in entropy of the cards. But this is misleading from a scientific viewpoint because moving macroscopic objects around does not involve an increase in entropy.* Another source of confusion about entropy comes from the use of this term in information theory, which was introduced by Shannon in 1948. The quantity entropy in information theory is not the entropy of thermodynamics because it does not deal with the transfer of heat and the dispersal of energy among the microstates of a system.
The concept of temperature is necessarily involved in understanding ther- modynamic entropy because it indicates the thermal environment of the par- ticles in a system. These particles are involved in the ever-present thermal motion that makes spontaneous change possible because it is the mechanism by which molecules can occupy new microstates when the external conditions are altered.
[Figure: Heat capacity of sulfur dioxide at a constant pressure of 1 bar at different temperatures: (a) C_P/J K^−1 mol^−1 for the solid, liquid, and gas plotted against T/K; (b) the same data plotted against log(T/K), with the melting point (M.P.) and boiling point (B.P.) marked. Graph redrawn from W. F. Giauque and C. C. Stephenson, J. Am. Chem. Soc. 60:1389 (1938).]

The entropy change on heating at constant pressure is

    ΔS = ∫ from T1 to T2 of (C_P/T) dT = ∫ from T1 to T2 of C_P d ln T

and each phase change at its equilibrium temperature contributes ΔH/T. Thus, with T_m and T_b the melting and boiling points,

    S°(T) = ∫ from 0 to T_m of [C_P(s)/T] dT + Δ_fus H/T_m
          + ∫ from T_m to T_b of [C_P(l)/T] dT + Δ_vap H/T_b
          + ∫ from T_b to T of [C_P(g)/T] dT

Entropy of Sulfur Dioxide

T/K              Method of Calculation                     ΔS/J K^−1 mol^−1
0–15             Debye function (C_P = constant × T^3)        1.26
15–197.64        Graphical, solid                            84.18
197.64           Fusion, 7402/197.64                         37.45
197.64–263.08    Graphical, liquid                           24.94
263.08           Vaporization, 24 937/263.08                 94.79
263.08–298.15    From C_P of gas                              5.23
                 S°(298.15 K) − S°(0 K)                     247.85
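The entries in the table can be checked by summing them, and the two phase-change entries are simply ΔH/T at the transition temperature. A quick verification using only the numbers quoted in the table (all entropies in J K^−1 mol^−1):

```python
# Phase-change contributions: Delta_trs H / T_trs, from the table's own ratios
fusion = 7402 / 197.64        # Delta_fus H / T_m, about 37.45
vapor = 24937 / 263.08        # Delta_vap H / T_b, about 94.79

contributions = {
    "0-15 K (Debye extrapolation)":  1.26,
    "15-197.64 K (solid, graphical)": 84.18,
    "fusion at 197.64 K":            round(fusion, 2),
    "197.64-263.08 K (liquid)":      24.94,
    "vaporization at 263.08 K":      round(vapor, 2),
    "263.08-298.15 K (gas)":         5.23,
}
total = sum(contributions.values())
print(f"S(298.15 K) - S(0 K) = {total:.2f} J/(K mol)")  # 247.85
```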