Lecture 8 Part IV Entropy


Part IV: Entropy

Lecture 8:
• Introduction
• The Clausius Inequality
• Entropy
• The Increase in Entropy Principle
• Entropy Balance
• What is Entropy?
• Calculation of Entropy Changes
• The T-S Diagram

Introduction
In Part III, we introduced the second law of thermodynamics and applied it to
cycles and cyclic devices. In this part, we apply the second law to processes.
• The first law of thermodynamics deals with the property energy and the
conservation of it.
• The second law leads to the definition of a new property called entropy.
Entropy is a somewhat abstract property, and it is difficult to give a
physical description of it. Entropy is best understood and appreciated by
studying its uses in commonly encountered processes.

Here we start with a discussion of the Clausius inequality, which forms the
basis for the definition of entropy, and continue with the increase of entropy
principle. Unlike energy, entropy is a nonconserved property, and there is no
such thing as conservation of entropy principle. Next, the entropy changes
that take place during processes for pure substances, and ideal gases are
discussed, and a special class of idealized processes, called isentropic
processes, are examined.

THE CLAUSIUS INEQUALITY
The second law of thermodynamics often leads to expressions that involve inequalities.
An irreversible (i.e., actual) heat engine, for example, is less efficient than a reversible
one operating between the same two thermal energy reservoirs. Likewise, an irreversible
refrigerator or a heat pump has a lower coefficient of performance (COP) than a
reversible one operating between the same temperature limits. Another important
inequality that has major consequences in thermodynamics is the Clausius inequality. It
was first stated by the German physicist R. J. E. Clausius (1822-1888), one of the
founders of thermodynamics, and is expressed as

$$\oint \frac{\delta Q}{T} \le 0 \qquad (4\text{-}1)$$

That is, the cyclic integral of δQ /T is always less than or equal to zero. This inequality
is valid for all cycles, reversible or irreversible.

To demonstrate the validity of the Clausius inequality, consider a system connected to a
thermal energy reservoir at a constant absolute temperature TR through a reversible
cyclic device (see figure below).

The cyclic device receives heat δQR from the reservoir and supplies heat δQ to the
system whose absolute temperature at that part of the boundary is T (a variable) while
producing work δWrev. The system produces work δWsys as a result of this heat transfer.
Applying the conservation of energy principle to the combined system identified by
dashed lines yields

$$dE_C = \delta W_C + \delta Q_R \qquad (4\text{-}2a)$$

where δWC is the total work of the combined system (δWsys + δWrev) and dEC is the
change in the total energy of the combined system. Considering that the cyclic device is
a reversible one, we have
$$\frac{\delta Q_R}{T_R} = \frac{\delta Q}{T} \qquad (4\text{-}2b)$$

(see Eq. 3-28: $\left(\frac{Q_H}{Q_L}\right)_{rev} = \frac{T_H}{T_L}$)

where the sign of δQ is determined with respect to the system (positive if to the system
and negative if from the system) and the sign of δQR is determined with respect to the
reversible cyclic device. Eliminating δQR from 4-2a and 4-2b yields

$$dE_C = \delta W_C + T_R \frac{\delta Q}{T}$$

We now let the system undergo a cycle while the cyclic device undergoes an integral
number of cycles. Then the relation above becomes

$$W_C = -T_R \oint \frac{\delta Q}{T}$$
since the cyclic integral of energy (the net change in the energy, which is a property,
during a cycle) is zero. Here WC is the cyclic integral of δWC, and it represents the net
work for the combined cycle.

It appears that the combined system is exchanging heat with a single thermal energy
reservoir while involving (producing or consuming) work WC during a cycle. On the basis
of the Kelvin-Planck statement of the second law, which states that no system can
produce a net amount of work while operating in a cycle and exchanging heat with a
single thermal energy reservoir, we reason that WC cannot be a work output, and thus it

cannot be a negative quantity. Considering that TR is an absolute temperature and thus a
positive quantity, we must have

$$\oint \frac{\delta Q}{T} \le 0$$

which is the Clausius inequality. This inequality is valid for all thermodynamic cycles,
reversible or irreversible, including the refrigeration cycles.

If no irreversibilities occur within the system or within the reversible cyclic device,
then the cycle undergone by the combined system is internally reversible. As such, it
can be reversed. In the reversed cycle case, all the quantities will have the same
magnitude but the opposite sign. Therefore, the work WC, which could not be a negative
quantity in the regular case, cannot be a positive quantity in the reversed case. Then it
follows that WC, int rev = 0 since it cannot be a positive or negative quantity, and therefore

$$\oint \left(\frac{\delta Q}{T}\right)_{int\,rev} = 0 \qquad (4\text{-}3)$$

for internally reversible cycles. Thus we conclude that the equality in the Clausius
inequality (Eq.4-1) holds for totally or just internally reversible cycles and the inequality
for the irreversible ones.

ENTROPY

The Clausius inequality discussed above forms the basis for the definition of a new
property called entropy.

To develop a relation for the definition of entropy, let us examine Eq. 4-3 more closely.
Here we have a quantity whose cyclic integral is zero. Let us think for a moment what
kind of quantities can have this characteristic. We know that the cyclic integral of work
is not zero.

Now consider the volume occupied by a gas in a piston-cylinder device undergoing a
cycle, as shown below.

When the piston returns to its initial position at the end of a cycle, the volume of the
gas also returns to its initial value. Thus the net change in volume during a cycle is zero.
This is also expressed as

$$\oint dV = 0 \qquad (4\text{-}4)$$

That is, the cyclic integral of volume (or any other property) is zero.

Conversely, a quantity whose cyclic integral is zero depends on the state only and not
the process path, and thus it is a property. Therefore the quantity (δQ / T)int rev must
represent a property in the differential form.

Clausius realized in 1865 that he had discovered a new thermodynamic property, and he
chose to name this property entropy. It is designated S and is defined as

$$dS = \left(\frac{\delta Q}{T}\right)_{int\,rev} \qquad \text{(kJ/K)} \qquad (4\text{-}5)$$

Entropy is an extensive property of a system and sometimes is referred to as total

entropy. Entropy per unit mass, designated s, is an intensive property and has the unit
kJ/(kg·K). The term entropy is generally used to refer to both total entropy and entropy
per unit mass since the context usually clarifies which one is meant.

The entropy change of a system during a process can be determined by integrating
Eq. 4-5 between the initial and the final states:

$$\Delta S = S_2 - S_1 = \int_1^2 \left(\frac{\delta Q}{T}\right)_{int\,rev} \qquad (4\text{-}6)$$

Notice that we have actually defined the change in entropy instead of entropy itself, just
as we defined the change in energy instead of energy when we developed the first-law
relation for closed systems.

Absolute values of entropy are determined on the basis of the third law of
thermodynamics, which is discussed later.

Engineers are usually concerned with the changes in entropy. Therefore, the entropy of a
substance can be assigned a zero value at some arbitrarily selected reference state, and
the entropy values at other states can be determined from Eq.4-6 by choosing state 1 to
be the reference state (S = 0) and state 2 to be the state at which entropy is to be
determined.

So, what is entropy? To answer this, let us ask: what is energy? The point is, do we
exactly know what energy is? Perhaps not; we do not need to know what energy is, but
we find it satisfactory to interpret internal energy, on the basis of a kinetic molecular
hypothesis, as the kinetic and potential energies of atoms and molecules. Similarly, we
do not need to know what entropy is, but we find it satisfactory to interpret entropy, on
the basis of a kinetic-molecular hypothesis, in terms of the randomness of the
distribution of atoms and molecules in space and in energy states.

To perform the integration in Eq. 4-6, one needs to know the relation between Q and
T during a process. This relation is often not available, and the integral can be performed
for a few cases only. For the majority of cases we have to rely on tabulated data for
entropy.

Note that entropy is a property, and like all other properties, it has fixed values at
fixed states. Therefore, the entropy change ∆S between two specified states is the same
no matter what path, reversible or irreversible, is followed during a process.

• Also note that the integral of δQ / T will give us the value of entropy change only if
the integration is carried out along an internally reversible path between the two
states.

• The integral of δQ / T along an irreversible path is not a property, and in general,
different values will be obtained when the integration is carried out along different
irreversible paths. Therefore, even for irreversible processes, the entropy change
should be determined by carrying out this integration along some convenient
imaginary internally reversible path between the specified states.

THE INCREASE IN ENTROPY PRINCIPLE

Consider a cycle that is made up of two processes: process 1-2, which is arbitrary
(reversible or irreversible), and process 2-1, which is internally reversible, as shown.

From the Clausius inequality,

$$\oint \frac{\delta Q}{T} \le 0$$
or

$$\int_1^2 \frac{\delta Q}{T} + \int_2^1 \left(\frac{\delta Q}{T}\right)_{int\,rev} \le 0$$

The second integral in the above relation is readily recognized as the entropy change S1 − S2.
Therefore
$$\int_1^2 \frac{\delta Q}{T} + S_1 - S_2 \le 0 \qquad (4\text{-}7)$$

which can be rearranged as


$$\Delta S = S_2 - S_1 \ge \int_1^2 \frac{\delta Q}{T} \qquad (4\text{-}8)$$

Equation 4-8 can be viewed as a mathematical statement of the second law of
thermodynamics for a closed mass. It can also be expressed in differential form as

$$dS \ge \frac{\delta Q}{T} \qquad (4\text{-}9)$$
where the equality holds for an internally reversible process and the inequality for an
irreversible process.

We may conclude from these equations that the entropy change of a closed system during
an irreversible process is greater than the integral of δQ/T evaluated for that process. In
the limiting case of a reversible process, these two quantities become equal. We again
emphasize that T in the above relations is the absolute temperature at the boundary where
the differential heat δQ is transferred between the system and the surroundings.

The quantity ∆S = S2 − S1 represents the entropy change of the system which, for a
reversible process, becomes equal to $\int_1^2 \delta Q/T$, the entropy transfer with
heat.

The inequality sign in the relations above is a constant reminder that the entropy change
of a closed system during an irreversible process is always greater than the entropy
transfer. That is, some entropy is generated or created during an irreversible process, and
this generation is due entirely to the presence of irreversibilities. The entropy generated
during a process is called entropy generation, and is denoted by Sgen. Noting that the
difference between the entropy change of a closed system and the entropy transfer is
equal to entropy generation, Eq. 4-8 can be rewritten as an equality as

$$\Delta S = S_2 - S_1 = \int_1^2 \frac{\delta Q}{T} + S_{gen} \qquad (4\text{-}10)$$

Note that entropy generation Sgen is always a positive quantity or zero. Its value depends
on the process, and thus it is not a property of the system.

Equation 4-8 has far-reaching implications in thermodynamics. For an isolated system
(or just an adiabatic closed system), the heat transfer is zero, and Eq. 4-8 reduces to

$$\Delta S_{isolated} \ge 0 \qquad (4\text{-}11)$$

This equation can be expressed as the entropy of an isolated system during a process
always increases or, in the limiting case of a reversible process, remains constant. In
other words, it never decreases. This is known as the increase of entropy principle.

• Note that in the absence of any heat transfer, entropy change is due to
irreversibilities only, and their effect is always to increase the entropy.

• Since no actual process is truly reversible, we can conclude that some entropy is

generated during a process, and therefore the entropy of the universe, which can
be considered to be an isolated system, is continuously increasing. The more
irreversible a process is, the larger the entropy generated during that process.

• No entropy is generated during reversible processes (Sgen = 0).

Entropy increase of the universe is a major concern not only to engineers but also to
philosophers and theologians since entropy is viewed as a measure of the disorder (or
"mixed-up-ness") in the universe.

The increase of entropy principle does not imply that the entropy of a system or the
surroundings cannot decrease. The entropy change of a system or its surroundings can be
negative during a process (see figure); but entropy generation cannot.

The increase of entropy principle can be summarized as follows:

$$S_{gen} \begin{cases} > 0 & \text{irreversible process} \\ = 0 & \text{reversible process} \\ < 0 & \text{impossible process} \end{cases}$$

This relation serves as a criterion in determining whether a process is reversible,
irreversible, or impossible.
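As a minimal numeric sketch of this criterion (the tolerance and the sample values below are assumptions for illustration, not from the text):

```python
def classify_process(s_gen, tol=1e-9):
    """Classify a process from its entropy generation S_gen (kJ/K).

    The tolerance stands in for 'exactly zero', since measured or
    computed values are never exact; its size is an assumption.
    """
    if s_gen > tol:
        return "irreversible process"
    if s_gen < -tol:
        return "impossible process"
    return "reversible process"

print(classify_process(0.31))    # irreversible process
print(classify_process(0.0))     # reversible process
print(classify_process(-0.05))   # impossible process
```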

Things in nature tend to change until they attain a state of equilibrium. The
increase of entropy principle dictates that the entropy of an isolated system will
increase until the entropy of the system reaches a maximum value. At that point the
system is said to have reached an equilibrium state since the increase in entropy principle
prohibits the system from undergoing any change of state that will result in a decrease in
entropy.

Some Remarks about Entropy

In the light of the preceding discussions, we can draw the following conclusions:

1. Processes can occur in a certain direction only, not in any direction. A process must
proceed in the direction that complies with the increase of entropy principle, that is, Sgen
≥0. A process that violates this principle is impossible. This principle often forces
chemical reactions to come to a halt before reaching completion.

2. Entropy is a nonconserved property, and there is no such thing as the conservation of
entropy principle. Entropy is conserved during the idealized reversible processes only
and increases during all actual processes. Therefore, the entropy of the universe is
continuously increasing.

3. The performance of engineering systems is degraded by the presence of
irreversibilities, and the entropy generation is a measure of the magnitudes of the
irreversibilities present during that process. The greater the extent of irreversibilities, the
greater the entropy generation. Therefore, entropy can be used as a quantitative measure
of irreversibilities associated with a process. It is also used to establish criteria for the
performance of engineering devices.

ENTROPY BALANCE

The property entropy is a measure of molecular disorder or randomness of a system, and
the second law of thermodynamics states that entropy can be created but it cannot be
destroyed. Therefore, the entropy change of a system during a process is greater than the
entropy transfer by an amount equal to the entropy generated during the process within
the system, and the increase of entropy principle is expressed as

Entropy change = Entropy transfer + Entropy generation

or

$$\Delta S_{system} = S_{transfer} + S_{gen} \qquad (4\text{-}12)$$

which is a verbal statement of Eq. 4-10. This relation is often referred to as the entropy
balance, and is applicable to any kind of system undergoing any kind of process. The
entropy balance relation above can be stated as the entropy change of a system during a
process is equal to the sum of the entropy transfer through the system boundary and the
entropy generated.

1. Entropy Change

Entropy balance is actually easier to deal with than energy balance since, unlike energy,
entropy does not exist in various forms. Therefore, the determination of entropy change
of a system during a process involves the evaluation of the entropy of the system at the
beginning and at the end of the process, and taking their difference. That is,

Entropy change = Entropy at final state - Entropy at initial state

or

$$\Delta S_{system} = S_{final} - S_{initial} \qquad (4\text{-}13)$$

Note that entropy is a property, and the value of a property does not change unless the
state of the system changes. Therefore, the entropy change of a system is zero if the state
of the system does not change during the process.

2. Mechanisms of Entropy Transfer

Entropy can be transferred to or from a system in two forms: heat transfer and mass flow
(in contrast, energy is transferred by work also).

Entropy transfer is recognized at the system boundary as entropy crosses the boundary,
and it represents the entropy gained or lost by a system during a process. The only form
of entropy interaction associated with a fixed mass or closed system is heat transfer, and
thus the entropy transfer for an adiabatic closed system is zero.

Heat Transfer Heat is, in essence, a form of disorganized energy, and some
disorganization (entropy) will flow with heat. Heat transfer to a system increases the
entropy of that system and thus the level of molecular disorder or randomness, and heat

transfer from a system decreases it. In fact, heat rejection is the only way the entropy of a
fixed mass can be decreased. The ratio of the heat transfer Q at a location to the absolute
temperature T at that location is called the entropy flow or entropy transfer, and is
expressed as
Entropy transfer with heat:

$$S_{heat} = \frac{Q}{T} \qquad (4\text{-}14)$$

The quantity Q/T represents the entropy transfer accompanied by heat transfer, and the
direction of entropy transfer is the same as the direction of heat transfer since absolute
temperature T is always a positive quantity. Therefore, the sign of entropy transfer is the
same as the sign of heat transfer: positive if into the system, and negative if out of the
system.

• When two systems are in contact, the entropy transfer from the warmer system
is equal to the entropy transfer into the cooler one at the point of contact. That
is, no entropy can be created or destroyed at the boundary since the boundary
has no thickness and occupies no volume.

• Note that work is entropy-free, and no entropy is transferred with work. Energy
is transferred with both heat and work whereas entropy is transferred only with
heat.

• The first law of thermodynamics makes no distinction between heat transfer and
work; it considers them as equals.

• The distinction between heat transfer and work is brought out by the second
law: an energy interaction which is accompanied by entropy transfer is heat
transfer, and an energy interaction which is not accompanied by entropy
transfer is work. That is, no entropy is exchanged during a work interaction
between a system and its surroundings. Thus only energy is exchanged during
work interaction whereas both energy and entropy are exchanged during heat
transfer (see figure).

3. Entropy Generation

Irreversibilities such as friction, mixing, chemical reactions, heat transfer through a finite
temperature difference, unrestrained expansion, and non-quasiequilibrium compression or
expansion always cause the entropy of a system to increase, and entropy generation Sgen
is a measure of the entropy created by such effects during a process.

For a reversible process (a process that involves no irreversibilities), the entropy
generation is zero and thus the entropy change of a system is equal to the entropy
transfer. Therefore, the entropy balance relation in the reversible case becomes analogous
to the energy balance relation, which states that energy change of a system during a
process is equal to the energy transfer during that process. However, note that the energy
change of a system equals the energy transfer for any process, but the entropy change of a
system equals the entropy transfer only for a reversible process.

Entropy Balance for Closed Systems

A closed system involves no mass flow across its boundaries, and its entropy change is
simply the difference between the initial and final entropies of the system. The entropy
change of a closed system is due to the entropy transfer accompanying heat transfer and
the entropy generation within the system boundaries, and Eq. 4-10 is an expression for
the entropy balance of a closed system.

$$\Delta S = S_2 - S_1 = \int_1^2 \frac{\delta Q}{T} + S_{gen} \qquad (4\text{-}10)$$

When heat in the amounts of Qk is transferred through the boundary at constant
temperatures Tk at several locations, the entropy transfer term can be expressed more
conveniently as a sum instead of an integral to give


$$\Delta S = S_2 - S_1 = \sum_k \frac{Q_k}{T_k} + S_{gen} \qquad \text{(kJ/K)} \qquad (4\text{-}15)$$

Here the left term is the entropy change of the system, and the sum is the entropy transfer
with heat.

The entropy balance relation above can be stated as the entropy change of a closed
system during a process is equal to the sum of the entropy transferred through the system
boundary by heat transfer and the entropy generated within the system boundaries. It can
also be expressed in rate form as

$$\frac{dS}{dt} = \sum_k \frac{\dot{Q}_k}{T_k} + \dot{S}_{gen} \qquad \text{(kW/K)} \qquad (4\text{-}16)$$


where dS/dt is the rate of change of the entropy of the system, and $\dot{Q}_k$ is the rate of heat
transfer through the boundary at temperature Tk. For an adiabatic process (Q = 0), the
entropy transfer terms in the above relations drop out and entropy change of the closed
system becomes equal to the entropy generation within the system. That is,

$$\Delta S_{adiabatic} = S_{gen} \qquad (4\text{-}17)$$

Note that Sgen represents the entropy generation within the system boundary only, and not
the entropy generation that may occur outside the system boundary during the process as
a result of external irreversibilities. Therefore, a process for which Sgen = 0 is internally
reversible, but it is not necessarily totally reversible.
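A minimal sketch of the closed-system entropy balance, Eq. 4-15, rearranged to solve for the entropy generation (all numbers below are assumed for illustration):

```python
def entropy_generation(delta_s, boundary_heats):
    """S_gen from Eq. 4-15: S_gen = dS - sum(Q_k / T_k).

    delta_s: entropy change of the system, kJ/K.
    boundary_heats: list of (Q_k in kJ, T_k in K); Q_k > 0 into the system.
    """
    s_transfer = sum(q / t for q, t in boundary_heats)
    return delta_s - s_transfer

# A system whose entropy rises by 0.4 kJ/K while receiving 100 kJ at a
# 400 K boundary and rejecting 50 kJ at a 300 K boundary:
s_gen = entropy_generation(0.4, [(100.0, 400.0), (-50.0, 300.0)])
print(f"S_gen = {s_gen:.3f} kJ/K")   # 0.4 - (0.250 - 0.167) = 0.317 kJ/K
```

Since S_gen comes out positive, such a process would be irreversible but possible.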

WHAT IS ENTROPY?

It is clear from the previous discussion that entropy is a useful property and serves as a
valuable tool in the second-law analysis of engineering devices. But this does not mean
that we know and understand entropy well, because we do not. In fact, we cannot even
give an adequate answer to the question, What is entropy? Not being able to describe
entropy fully, however, does not take anything away from its usefulness.
The discussion below will shed some light on the physical meaning of entropy by
considering the microscopic nature of matter.

Entropy can be viewed as a measure of molecular disorder, or molecular randomness.
As a system becomes more disordered, the positions of the molecules become less
predictable and the entropy increases. Thus, it is not surprising that the entropy of a
substance is lowest in the solid phase and highest in the gas phase.

In the solid phase, the molecules of a substance continually oscillate about their
equilibrium positions, but they cannot move relative to each other, and their position at
any instant can be predicted with good certainty. In the gas phase, however, the
molecules move about at random, collide with each other, and change direction, making
it extremely difficult to predict accurately the microscopic state of a system at any
instant. Associated with this molecular chaos is a high value of entropy.

When viewed microscopically (from a statistical thermodynamics point of view), an
isolated system that appears to be at a state of equilibrium may exhibit a high level of
activity because of the continual motion of the molecules. To each state of macroscopic
equilibrium there corresponds a large number of possible microscopic states or molecular
configurations.

The entropy of a system is related to the total number of possible microscopic states
of that system, called thermodynamic probability Ω, by the Boltzmann relation
expressed as
$$S = k \ln \Omega \qquad (4\text{-}18)$$

where k is the Boltzmann constant. Therefore, from a microscopic point of view, the
entropy of a system increases whenever the molecular randomness or uncertainty (i.e.,
molecular probability) of a system increases. Thus, entropy is a measure of molecular
disorder, and the molecular disorder of an isolated system increases anytime it undergoes
a process.

Let’s justify Eq. 4-18

Consider two isolated systems 1 and 2 with entropies S1 and S2 and configurations
associated with them Ω1 and Ω2 respectively. If these systems are combined into one
composite system, each of the Ω1 configurations may be combined with any one of the Ω2
configurations of the second system, to give a possible configuration of the state of the
composite system. Thus

$$\Omega = \Omega_1 \Omega_2 \qquad (4\text{-}19a)$$

The entropy of the new system becomes

$$S = S_1 + S_2 \qquad (4\text{-}19b)$$

However, according to Eq. 4-18, S1 = k ln Ω1 and S2 = k ln Ω2. Thus

$$S = S_1 + S_2 = k \ln \Omega_1 + k \ln \Omega_2 = k \ln(\Omega_1 \Omega_2)$$

or, according to Eq. 4-19a,

$$S = S_1 + S_2 = k \ln \Omega$$

This justifies the appropriateness of setting k ln Ω equal to the entropy.

Example: What is the order of magnitude of Ω for a system having entropy
41.84 J/K?

Answer: $\Omega \sim e^{3.03 \times 10^{24}}$
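The exponent follows directly from the Boltzmann relation: ln Ω = S/k. A short Python check (Ω itself is far too large to evaluate directly, so only its logarithm is computed):

```python
k = 1.380649e-23   # Boltzmann constant, J/K
S = 41.84          # entropy, J/K

ln_omega = S / k   # Eq. 4-18 inverted: ln(Omega) = S / k
print(f"ln(Omega) = {ln_omega:.3e}")   # ~3.03e24, so Omega ~ e^(3.03e24)
```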

We have just seen that entropy is a measure of molecular disorder. An increase in
entropy indicates an increase in disorder. We can, therefore, say that the second law of
thermodynamics is a matter of probability.

In terms of probability, the second law, which tells us that entropy increases in any
process, states that those processes occur which are most probable. Stated this way, the
law does not strictly exclude a decrease in entropy; it only makes it extremely
improbable. Since an increase of entropy is only the most probable outcome, there is
always a chance that the second law might be violated. That chance can be calculated,
and it is so small for any macroscopic object that the possibility can be ruled out.

More comments

Molecules in the gas phase possess a considerable amount of kinetic energy. But we
know that no matter how large their kinetic energies are, the gas molecules will not rotate
a paddle wheel inserted into the container and produce work. This is because the gas
molecules, and the energy they carry with them, are disorganized. Probably the number
of molecules trying to rotate the wheel in one direction at any instant is equal to the
number of molecules that are trying to rotate it in the opposite direction, causing the
wheel to remain motionless. Therefore, we cannot extract any useful work directly from
disorganized energy.

Now consider a rotating shaft shown below. This time, the energy of the molecules is
completely organized since the molecules of the shaft are rotating in the same direction
together. This organized energy can readily be used to perform useful tasks such as
raising a weight or generating electricity.

Being an organized form of energy, work is free of disorder or randomness and thus
free of entropy.

There is no entropy transfer associated with energy transfer as work. Therefore, in
the absence of any friction, the process of raising a weight by a rotating shaft (or a
flywheel) will not produce any entropy. Any process that does not produce net entropy is
reversible, and thus the process described above can be reversed by lowering the weight.
Therefore, energy is not degraded during this process, and no potential to do work is lost.

Instead of raising a weight, let us operate the paddle wheel in a container filled with a
gas, as shown below.

The paddle-wheel work in this case will be converted to the internal energy of the
gas, as evidenced by a rise in gas temperature, creating a higher level of molecular chaos
and disorder in the container.

• This process is quite different from raising a weight since the organized paddle
wheel energy is now converted to a highly disorganized form of energy, which

cannot be converted back to the paddle wheel as the rotational kinetic energy.

• Only a portion of this energy can be converted to work by partially
reorganizing it through the use of a heat engine. Therefore, energy is degraded
during this process, the ability to do work is reduced, molecular disorder is
produced, and associated with all this is an increase in entropy.

The quantity of energy is always preserved during an actual process (the first law), but
the quality is bound to decrease (the second law). This decrease in quality is always
accompanied by an increase in entropy.

As an example, consider the transfer of 10 kJ of energy as heat from a hot medium to
a cold one. At the end of the process, we will still have the 10 kJ of energy, but at a lower
temperature and thus at a lower quality.

Heat is, in essence, a form of disorganized energy, and some disorganization (entropy)
will flow with heat. As a result, the entropy and the level of molecular disorder or
randomness of the hot body will decrease, while the entropy and the level of molecular
disorder of the cold body will increase. The second law requires that the increase in entropy of
the cold body be greater than the decrease in entropy of the hot body, and thus the net
entropy of the combined system (the cold body and the hot body) increases. That is, the
combined system is at a state of greater disorder at the final state. Thus we can conclude
that processes can occur only in the direction of increased overall entropy or molecular
disorder.
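To put numbers on this (the temperatures below are assumptions for illustration; only the 10 kJ figure comes from the text), a sketch treating each body as a reservoir at fixed temperature:

```python
Q = 10.0          # heat transferred, kJ (from the text)
T_hot = 500.0     # assumed hot-body temperature, K
T_cold = 300.0    # assumed cold-body temperature, K

dS_hot = -Q / T_hot    # hot body loses entropy
dS_cold = Q / T_cold   # cold body gains more entropy than the hot body loses
dS_net = dS_hot + dS_cold

print(f"dS_hot  = {dS_hot:+.4f} kJ/K")
print(f"dS_cold = {dS_cold:+.4f} kJ/K")
print(f"dS_net  = {dS_net:+.4f} kJ/K (> 0, as the second law requires)")
```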

From a statistical point of view, entropy is a measure of molecular randomness, i.e.,
the uncertainty about the positions of molecules at any instant. Even in the solid phase,
the molecules of a substance continually oscillate, creating an uncertainty about their
position. These oscillations, however, fade as the temperature is decreased, and the
molecules become completely motionless at absolute zero. This represents a state of
ultimate molecular order (and minimum energy). Therefore, the entropy of a pure
crystalline substance at absolute zero temperature is zero since there is no uncertainty
about the state of the molecules at that instant. This statement is known as the third law
of thermodynamics.

• The third law of thermodynamics provides an absolute reference point for the
determination of entropy. The entropy determined relative to this point is called
absolute entropy, and it is extremely useful in the thermodynamic analysis of
chemical reactions.

• Notice that the entropy of a substance that is not pure crystalline (such as a solid
solution) is not zero at absolute zero temperature. This is because more than one
molecular configuration exists for such substances, which introduces some
uncertainty about the microscopic state of the substance.

CALCULATION OF ENTROPY CHANGES

As shown earlier, the change in entropy depends only on the initial and final states of a
system, and not on the path taken between the states. If a given process is irreversible
(e.g., expansion of a gas into a vacuum), we obtain no value of Qrev from which to
determine ∆S with the help of ∆S = Qrev/T; we then imagine some path whereby
the same final state is achieved reversibly, and Σ(Qrev/T) for this path gives the value of
∆S for the irreversible process. We shall discuss here the methods of calculating entropy
changes under different conditions.

Isothermal expansion of an ideal gas into a vacuum

In passing from an initial volume V1 to a final volume V2, n mol of the gas change
entirely irreversibly without doing any work or absorbing any heat. Nevertheless, as S is
a function of state, the value of ∆S for this expansion is exactly the same as the value of
∆S for the reversible isothermal expansion of n mol of the ideal gas from V1 to V2.
We know that for a reversible isothermal expansion

$$Q_{rev} = nRT \ln\left(\frac{V_2}{V_1}\right) = nRT \ln\left(\frac{P_1}{P_2}\right)$$

so

$$\Delta S = \frac{Q_{rev}}{T} = nR \ln\left(\frac{V_2}{V_1}\right) = nR \ln\left(\frac{P_1}{P_2}\right) \qquad (4\text{-}20)$$

Were the expansion in any degree irreversible, both the work done and the heat Qirrev
absorbed would to that degree be less than those for the reversible expansion; thus

$$\frac{Q_{rev}}{T} = \Delta S > \frac{Q_{irrev}}{T}$$
Example 4-1
3 mol of an ideal gas is expanded isothermally to five times its initial volume. Calculate
the change in entropy.

Answer
∆S ≅ 40 J/K
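A quick check of this answer with Eq. 4-20 (V2/V1 = 5, n = 3 mol; a sketch, not part of the original notes):

```python
import math

n = 3.0     # mol
R = 8.314   # gas constant, J/(mol·K)

dS = n * R * math.log(5)     # Eq. 4-20 with V2/V1 = 5
print(f"dS = {dS:.1f} J/K")  # ~40.1 J/K, matching the stated answer
```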

Entropy changes at constant volume and at constant pressure

The equation dU = δQ − P dV applies to any system. For a reversible change

$$dS = \frac{\delta Q_{rev}}{T} = \frac{dU + P\,dV}{T} \qquad (4\text{-}21)$$

For a constant-volume process, dV = 0, so

$$dS = \frac{dU}{T}$$

or

$$\left(\frac{\partial S}{\partial U}\right)_V = \frac{1}{T}$$

Since $C_V = (\partial U/\partial T)_V$, it follows that $C_V = T(\partial S/\partial T)_V$.

Integrating this equation between the temperature limits T1 to T2, we get the change in
entropy at constant volume due to a change in temperature from T1 to T2.

$$\Delta S = \int_{T_1}^{T_2} C_V \frac{dT}{T} = \int_{T_1}^{T_2} C_V \, d(\ln T) \qquad (4\text{-}22)$$

Similarly, starting with H = U + PV: at constant pressure,

$$dH = dU + P\,dV$$

or, from Eq. 4-21,

$$dS = \frac{dH}{T} = \frac{dU + P\,dV}{T}$$

Thus,

$$\left(\frac{\partial S}{\partial H}\right)_P = \frac{1}{T}$$
Since for any system $C_P = (\partial H/\partial T)_P$, we get $C_P = T(\partial S/\partial T)_P$.

For a change in temperature from T1 to T2 at constant pressure, the entropy change is
given by

$$\Delta S = \int_{T_1}^{T_2} C_P \frac{dT}{T} = \int_{T_1}^{T_2} C_P \, d(\ln T) \qquad (4\text{-}23)$$

Gibbs paradox
Let us consider an interesting result. Suppose that 1 mol of an ideal gas A at standard
temperature and pressure is allowed to mix isothermally with 1 mol of another ideal gas
B at standard temperature and pressure, both having the same initial volume. As the gases
are assumed to be ideal, during mixing they move independently of each other.

Therefore, the total change in entropy is the sum of the change in entropy that each gas
undergoes individually in the expansion (assuming that the gases were originally
confined in two separate containers and allowed to mix together by joining the
containers, thus there is an expansion).
Equation 4-20 gives the change in entropy for an isothermal expansion.

$$\Delta S = R \ln\left(\frac{V_2}{V_1}\right) = R \ln\left(\frac{P_1}{P_2}\right)$$

According to this equation, gas A has a change in entropy R ln 2 and gas B has a
change in entropy R ln 2. Hence the total change in entropy is 2R ln 2.
J. Willard Gibbs noticed that ∆S = 2R ln 2 irrespective of how closely identical the
gases are, provided that they are not the same. Of course, if they are the same, then ∆S
equals zero. This result is often called the Gibbs paradox. The above result holds for any
two different ideal gases mixing together. It should be noted that the change in entropy is
associated with change in partial pressure at constant total pressure. If two gases
consisting of the same substance mix together, the partial pressure equals the total
pressure and remains constant. Hence there is no change in entropy.
Gibbs' paradox is connected with the principle of 'complete indistinguishability of
identical fundamental particles' which plays an important role in quantum and statistical
mechanics. From his paradox, Gibbs inferred the statistical nature of the entropy.
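A minimal sketch of the mixing-entropy calculation above (it follows the text's 1 mol + 1 mol case; each gas doubles its volume on mixing, so V2/V1 = 2 in Eq. 4-20):

```python
import math

R = 8.314   # gas constant, J/(mol·K)

# Each gas expands from its own container into the combined volume,
# doubling its volume: dS per gas = n R ln(2) with n = 1 mol.
dS_A = R * math.log(2)
dS_B = R * math.log(2)
dS_mix = dS_A + dS_B     # total: 2 R ln 2

print(f"dS_mix = {dS_mix:.2f} J/K")  # ~11.53 J/K, whatever A and B are
```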

Phase change

Phase changes occur at constant temperature and pressure. If equilibrium conditions are
maintained, the process is reversible and

$$Q_{rev} = Q_P = (\Delta H)_{phase\ change}$$

Then, from the relation ∆S = Qrev/T, we get

$$\Delta S = \frac{(\Delta H)_{phase\ change}}{T} \qquad (4\text{-}24)$$

which is valid only for phase changes under reversible conditions.

Example 4-2
The heat of vaporization of a compound at 27°C was found to be 29.288 kJ/mol.
Calculate the molar entropy of vaporization at 27°C.
Answer: ∆S ≅ 98 J/(mol·K)
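Checking this with Eq. 4-24 (a sketch; 27 °C is converted to kelvins):

```python
dH_vap = 29288.0       # heat of vaporization, J/mol (from the example)
T = 27 + 273.15        # 27 °C in K

dS_vap = dH_vap / T    # Eq. 4-24
print(f"dS_vap = {dS_vap:.1f} J/(mol·K)")  # ~97.6, i.e. about 98
```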

Temperature change
When the temperature of a system is changed, it leads to a change in entropy of that
system. To obtain the total entropy change ∆S, we have to sum over an infinite number of
infinitesimal steps, in each of which an amount of heat δQrev is absorbed. We know that,
for a constant pressure process,

$$Q_P = nC_P \Delta T$$

and this for each infinitesimal step of the process can be written as

$$\delta Q_P = nC_P \, dT = \delta Q_{rev}$$

By combining with ∆S = Qrev/T, we get

$$\Delta S = \int_{T_1}^{T_2} nC_P \frac{dT}{T} \qquad (4\text{-}25)$$

This equation can be used to calculate the absolute entropy of a substance.

a. If CP remains constant over the temperature range where there is no phase change,
then Equation 4-25 can be written as

$$\Delta S = nC_P \int_{T_1}^{T_2} \frac{dT}{T} = nC_P \ln\left(\frac{T_2}{T_1}\right) \qquad (4\text{-}26)$$
Note: Although this equation is derived for a reversible path, the same entropy change
will result from the transition along any path.

b. If the heat capacity of a system changes with temperature and is given as

$$C = a + bT + cT^2 + \ldots$$

the entropy change for heating such a system from a temperature T1 to a temperature T2 is
given by (from Eq. 4-25)

$$\Delta S = a \ln\left(\frac{T_2}{T_1}\right) + b(T_2 - T_1) + \frac{c}{2}\left(T_2^2 - T_1^2\right) + \ldots \qquad (4\text{-}27)$$
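As an illustration of Eqs. 4-26 and 4-27 (the heat-capacity coefficients below are assumed, made-up values, not data from the text):

```python
import math

def dS_constant_cp(n, cp, T1, T2):
    """Eq. 4-26: entropy change for n mol with constant Cp in J/(mol·K)."""
    return n * cp * math.log(T2 / T1)

def dS_variable_c(a, b, c, T1, T2):
    """Eq. 4-27: entropy change for C = a + b*T + c*T**2 (total heat capacity)."""
    return (a * math.log(T2 / T1)
            + b * (T2 - T1)
            + (c / 2.0) * (T2**2 - T1**2))

# Heating 1 mol from 300 K to 400 K (illustrative numbers):
print(f"{dS_constant_cp(1.0, 29.1, 300.0, 400.0):.2f} J/K")             # ~8.37
print(f"{dS_variable_c(28.6, 3.8e-3, -0.5e-6, 300.0, 400.0):.2f} J/K")  # ~8.59
```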

Phase changes under non-equilibrium conditions

For a transition or phase change under non-equilibrium conditions, the entropy change is
not given by Equation 4-24. In this case, the entropy change must be calculated by
integrating dQ/T for some reversible means of bringing about the same change in state.

Example: Let us consider the isothermal freezing of supercooled water at -5°C. The
overall change in state for this can be represented by

H2O (l) → H2O (s) (at -5oC)

Note that supercooled liquid water cannot be reversibly converted, by direct freezing,
into a solid at the same temperature. We, therefore, must find some indirect reversible
means of bringing about the stated change. One way of doing it is (in three steps):
1. to heat the liquid to 0oC,
2. reversibly freezing it to solid at that temperature, and finally
3. cooling the solid back to - 5°C.

Thus, the overall entropy change is

$$\Delta S = \int_{268\,K}^{273\,K} \frac{C_{liquid}}{T}\,dT - \frac{(\Delta H)_{fusion}}{273} + \int_{273\,K}^{268\,K} \frac{C_{solid}}{T}\,dT = \int_{268\,K}^{273\,K} \frac{C_{liquid} - C_{solid}}{T}\,dT - \frac{\Delta H_{fusion}}{273}$$
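A numeric sketch of this three-step path (the heat capacities and heat of fusion below are typical handbook values for water, assumed constant over the small temperature range; they are not given in the text):

```python
import math

C_liq = 75.3      # molar heat capacity of liquid water, J/(mol·K), assumed
C_sol = 37.7      # molar heat capacity of ice, J/(mol·K), assumed
dH_fus = 6008.0   # heat of fusion at 273 K, J/mol, assumed
T1, T2 = 268.0, 273.0   # K

dS = (C_liq * math.log(T2 / T1)      # step 1: heat the liquid 268 -> 273 K
      - dH_fus / T2                  # step 2: freeze reversibly at 273 K
      + C_sol * math.log(T1 / T2))   # step 3: cool the solid 273 -> 268 K

print(f"dS = {dS:.1f} J/(mol·K)")    # ~ -21.3: the water's entropy decreases
# The surroundings gain more entropy than this, so the net S_gen is positive.
```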
THE T-S DIAGRAM

In the second-law analysis, it is very helpful to plot the processes on diagrams for which
one of the coordinates is entropy. The two diagrams used most extensively in the
second-law analysis are the temperature-entropy and the enthalpy-entropy diagrams.

Consider the defining equation of entropy (Eq. 4-5). It can be rearranged as

$$\delta Q_{int\ rev} = T\,dS \qquad (4\text{-}28)$$

As shown in the figure below, δQint rev corresponds to a differential area on a T-S diagram.

The total heat transfer during an internally reversible process is determined by integration
to be
$$Q_{int,rev} = \int_1^2 T\,dS \qquad (4\text{-}29)$$

which corresponds to the area under the process curve on a T-S diagram. Therefore, we
conclude that the area under the process curve on a T-S diagram represents the
internally reversible heat transfer.

This is somewhat analogous to reversible boundary work being represented by the area
under the process curve on a P-V diagram. Note that the area under the process curve
represents heat transfer for processes that are internally (or totally) reversible. It has no

meaning for irreversible processes.

To perform the integration in Eq. 4-29, one needs to know the relationship between
T and S during a process. One special case for which these integrations can be performed
easily is the internally reversible isothermal process. It yields

$$Q_{int\ rev} = T_0 \Delta S \qquad (4\text{-}30)$$

where T0 is the constant temperature and ∆S is the entropy change of the system during
the process. In the relations above, T is the absolute temperature, which is always
positive. Therefore, heat transfer during internally reversible processes is positive when
entropy increases and negative when entropy decreases.
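A one-line sketch of Eq. 4-30 for an internally reversible isothermal process (the numbers are assumed for illustration):

```python
T0 = 350.0    # constant process temperature, K (assumed)
dS = 0.25     # entropy change of the system, kJ/K (assumed)

Q = T0 * dS   # Eq. 4-30: the rectangular area under the line on the T-S diagram
print(f"Q = {Q:.1f} kJ")   # 87.5 kJ
```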

An isentropic process on a T-S diagram is easily recognized as a vertical line segment.
This is expected since an isentropic process involves no heat transfer, and therefore the
area under the process path must be zero (see figure).

T-S diagrams serve as valuable tools for visualizing the second-law aspects of
processes and cycles, and thus they are frequently used in thermodynamics. Below are
the T-S diagrams for the Carnot, Otto, and Diesel cycles.

Entropy change in the four steps of a Carnot cycle (see pages 166-170 of the text).

Otto cycle

Diesel cycle
Thermodynamic Definition of Pressure (see pages 155-172 of the text)

