Microcanonical Ensemble Unit 8


The microcanonical ensemble

Finding the probability distribution


We consider an isolated system, in the sense that the energy is a constant of motion. For an $N$-particle system with Hamiltonian $H(p,q)$, the energy lies in a narrow shell:

$E \le H(p,q) \le E + \Delta$

We are not able to derive the probability distribution $\rho$ from first principles.


Two typical alternative approaches:

1. Postulate of Equal a Priori Probability:
   construct an entropy expression from it and show that the result is consistent with thermodynamics.

2. Use (information) entropy as the starting concept:
   derive $\rho$ from the maximum entropy principle.

Information entropy*
Introduced by Shannon. See textbook and E.T. Jaynes, Phys. Rev. 106, 620 (1957)

Idea: find a constructive, least biased criterion for setting up probability distributions on the basis of only partial knowledge (missing information).

What do we mean by that? Let's start with an old mathematical problem.

Consider a quantity $x$ with discrete random values $x_1, x_2, \dots, x_n$.
Assume we do not know the probabilities $p_1, p_2, \dots, p_n$.
We have some information, however, namely the normalization

$\sum_{i=1}^{n} p_i = 1$

and we know the average of the function $f(x)$ (we will also consider cases where we know more than one average):

$\langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$
With this information, can we calculate an average of the function $g(x)$?

To do so we need all the $p_1, p_2, \dots, p_n$, but we have only the 2 equations

$\sum_{i=1}^{n} p_i = 1$  and  $\langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$

We are lacking $(n-2)$ additional equations.

What can we do with this underdetermined problem?


There may be more than one probability distribution reproducing $\langle f(x) \rangle = \sum_{i=1}^{n} p_i f(x_i)$ (see the numerical illustration below).

We want the one which requires no further assumptions.
We do not want to prefer any $p_i$ if there is no reason to do so.

Information theory tells us how to find this unbiased distribution.
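As a minimal numerical illustration of the underdetermination (Python with numpy; the two distributions are hypothetical examples): both reproduce the same die average $\langle x \rangle = 3.5$, so the average alone does not fix the $p_i$.

    import numpy as np

    x = np.arange(1, 7)                                   # faces of a die
    p_uniform = np.full(6, 1 / 6)                         # the unbiased choice
    p_biased = np.array([0.25, 0, 0.25, 0.25, 0, 0.25])   # another valid choice

    for p in (p_uniform, p_biased):
        assert np.isclose(p.sum(), 1.0)                   # normalization sum p_i = 1
        print("<x> =", np.dot(p, x))                      # both give 3.5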


(we call the probabilities now $\rho$ rather than $p$)

Shannon defined the information entropy $S_i$:

$S_i = -k \sum_n \rho_n \ln \rho_n$

or, for a continuous distribution,

$S_i = -k \int dp\, dq\, \rho(p,q) \ln \rho(p,q)$

with normalization

$\sum_n \rho_n = 1$  or  $\int dp\, dq\, \rho(p,q) = 1$

We make plausible that:

- $S_i$ is a measure of our ignorance of the microstate of the system.
More quantitatively:
- $S_i$ is a measure of the width of the distribution of the $\rho_n$.

Let's consider an extreme case: an experiment with $N$ potential outcomes (such as rolling dice), where however:

Outcome 1 has the probability $\rho_1 = 1$.
Outcomes $2, 3, \dots, N$ have $\rho_n = 0$.

$S_i = -k \sum_n \rho_n \ln \rho_n = 0$

(using $1 \cdot \ln 1 = 0$ and $\lim_{\rho \to 0} \rho \ln \rho = 0$)

- Our ignorance regarding the outcome of the experiment is zero.
- We know precisely what will happen.
- The probability distribution is sharp (a delta peak).

Let's make the distribution broader:

Outcome 1 has the probability $\rho_1 = 1/2$.
Outcome 2 has the probability $\rho_2 = 1/2$.
Outcomes $3, 4, \dots, N$ have $\rho_n = 0$.

$S_i = -k \sum_n \rho_n \ln \rho_n = -k \left( \frac{1}{2} \ln \frac{1}{2} + \frac{1}{2} \ln \frac{1}{2} + 0 + \dots + 0 \right) = k \left( \frac{1}{2} \ln 2 + \frac{1}{2} \ln 2 \right) = k \ln 2$
Let's consider the more general case:

Outcome 1 has the probability $\rho_1 = 1/M$.
Outcome 2 has the probability $\rho_2 = 1/M$.
...
Outcome $M$ has the probability $\rho_M = 1/M$.
Outcomes $M+1, \dots, N$ have $\rho_n = 0$.

$S_i = -k \sum_n \rho_n \ln \rho_n = -k \left( \frac{1}{M} \ln \frac{1}{M} + \frac{1}{M} \ln \frac{1}{M} + \dots + \frac{1}{M} \ln \frac{1}{M} + 0 + \dots \right) = k \ln M$
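A short sketch checking the three cases above numerically (Python with numpy; $k$ set to 1 and the sizes $N$, $M$ are arbitrary example values):

    import numpy as np

    def shannon_entropy(rho, k=1.0):
        # S_i = -k * sum_n rho_n ln rho_n, with the convention 0 ln 0 = 0
        rho = np.asarray(rho, dtype=float)
        nz = rho[rho > 0]
        return -k * np.sum(nz * np.log(nz))

    N, M = 10, 4
    print(shannon_entropy([1.0] + [0.0] * (N - 1)))        # sharp peak: 0
    print(shannon_entropy([0.5, 0.5] + [0.0] * (N - 2)))   # k ln 2 ~ 0.693
    print(shannon_entropy([1 / M] * M + [0.0] * (N - M)))  # k ln M ~ 1.386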

So far our considerations suggest:

Ignorance/information entropy increases with increasing width of the distribution.
Which distribution brings the information entropy to a maximum?
For simplicity, let's start with an experiment with 2 outcomes: a binary distribution with $\rho_1$, $\rho_2$ and $\rho_1 + \rho_2 = 1$.

$S_i = -k (\rho_1 \ln \rho_1 + \rho_2 \ln \rho_2)$

With $\rho_2 = 1 - \rho_1$:

$S_i = -k \left[ \rho_1 \ln \rho_1 + (1 - \rho_1) \ln(1 - \rho_1) \right]$

$\frac{dS_i}{d\rho_1} = -k \left[ \ln \rho_1 + 1 - \ln(1 - \rho_1) - 1 \right] = -k \ln \frac{\rho_1}{1 - \rho_1}$

At the maximum:

$\frac{dS_i}{d\rho_1} = -k \ln \frac{\rho_1}{1 - \rho_1} = 0 \quad \Rightarrow \quad \rho_1 = 1/2 = \rho_2$

the uniform distribution.

[Figure: $S_i/k$ versus $\rho_1$ on $[0,1]$; the curve reaches its maximum $\ln 2 \approx 0.69$ at $\rho_1 = 1/2$.]
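A quick numerical check of this result (Python with numpy; $k = 1$): scanning $S_i(\rho_1)$ on a fine grid confirms the maximum $k \ln 2$ at $\rho_1 = 1/2$.

    import numpy as np

    rho1 = np.linspace(1e-6, 1 - 1e-6, 100001)
    Si = -(rho1 * np.log(rho1) + (1 - rho1) * np.log(1 - rho1))

    imax = np.argmax(Si)
    print(rho1[imax])               # ~0.5 -> uniform distribution
    print(Si[imax], np.log(2))      # both ~0.6931 = k ln 2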

Lagrange multiplier technique


Once again:

$S_i = -k (\rho_1 \ln \rho_1 + \rho_2 \ln \rho_2)$ at maximum

with the constraint $\rho_1 + \rho_2 = 1$.

$F(\rho_1, \rho_2, \lambda) = -k (\rho_1 \ln \rho_1 + \rho_2 \ln \rho_2) + \lambda (\rho_1 + \rho_2 - 1)$

$\frac{\partial F}{\partial \rho_1} = -k (\ln \rho_1 + 1) + \lambda = 0$

$\frac{\partial F}{\partial \rho_2} = -k (\ln \rho_2 + 1) + \lambda = 0$

$\frac{\partial F}{\partial \lambda} = \rho_1 + \rho_2 - 1 = 0$

The first two equations give $\ln \rho_1 = \ln \rho_2$, hence $\rho_1 = \rho_2$; with the constraint,

$\rho_1 = 1/2 = \rho_2$

the uniform distribution.

[Figure: finding an extremum of $f(x,y)$ under a constraint; from http://en.wikipedia.org/wiki/File:LagrangeMultiplier]

Let's use the Lagrange multiplier technique to find the distribution that maximizes

$S_i = -k \sum_{n=1}^{M} \rho_n \ln \rho_n$  with  $\sum_{n=1}^{M} \rho_n = 1$

$F(\rho_1, \rho_2, \dots, \rho_M, \lambda) = -k \sum_{n=1}^{M} \rho_n \ln \rho_n + \lambda \left( \sum_{n=1}^{M} \rho_n - 1 \right)$

$\frac{\partial F}{\partial \rho_j} = -k (\ln \rho_j + 1) + \lambda = 0$ for every $j$

$\Rightarrow \rho_1 = \rho_2 = \dots = \rho_M = e^{\lambda / k - 1}$

With the normalization this yields

$\rho_1 = \rho_2 = \dots = \rho_M = \frac{1}{M}$

The uniform distribution maximizes the entropy:

$S_i \big|_{max} = k \ln M$
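The same conclusion can be reached by direct numerical optimization; a minimal sketch using scipy's constrained minimizer (assuming scipy is available; $M = 5$ and the random starting point are arbitrary choices, $k = 1$):

    import numpy as np
    from scipy.optimize import minimize

    M = 5
    neg_entropy = lambda rho: np.sum(rho * np.log(rho))   # -S_i with k = 1
    normalization = {"type": "eq", "fun": lambda rho: np.sum(rho) - 1.0}

    rho0 = np.random.dirichlet(np.ones(M))                # random normalized start
    res = minimize(neg_entropy, rho0, bounds=[(1e-9, 1.0)] * M,
                   constraints=[normalization])

    print(res.x)                  # ~[0.2 0.2 0.2 0.2 0.2], i.e. rho_n = 1/M
    print(-res.fun, np.log(M))    # both ~1.609 = ln M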

In a microcanonical ensemble, where each system has $N$ particles, volume $V$, and fixed energy between $E$ and $E + \Delta$, the entropy is at a maximum in equilibrium.

Distribution function (identifying information entropy with thermodynamic entropy):

$\rho(p,q) = \begin{cases} \frac{1}{Z(E)} = const. & \text{if } E \le H(p,q) \le E + \Delta \\ 0 & \text{otherwise} \end{cases}$

where $Z(E)$ = # of microstates with energy in $[E, E+\Delta]$, called the partition function of the microcanonical ensemble.

Information entropy and thermodynamic entropy


When identifying $k = k_B$,

$S = -k_B \sum_n \rho_n \ln \rho_n$

has all the properties we expect from the thermodynamic entropy (for details see textbook).

We show here that $S$ is additive:

$S(1+2) = S(1) + S(2)$

$\rho_n^{(1)}$: probability distribution for system 1, with entropy $S_1$
$\rho_m^{(2)}$: probability distribution for system 2, with entropy $S_2$

Statistical independence of systems 1 and 2 means the probability of finding system 1 in state $n$ and system 2 in state $m$ is the product $\rho_n^{(1)} \rho_m^{(2)}$:

$S(1+2) = -k_B \sum_{n,m} \rho_n^{(1)} \rho_m^{(2)} \ln \left( \rho_n^{(1)} \rho_m^{(2)} \right) = -k_B \sum_{n,m} \rho_n^{(1)} \rho_m^{(2)} \left[ \ln \rho_n^{(1)} + \ln \rho_m^{(2)} \right]$

$= -k_B \sum_m \rho_m^{(2)} \sum_n \rho_n^{(1)} \ln \rho_n^{(1)} - k_B \sum_n \rho_n^{(1)} \sum_m \rho_m^{(2)} \ln \rho_m^{(2)}$

Using the normalizations $\sum_n \rho_n^{(1)} = 1$ and $\sum_m \rho_m^{(2)} = 1$:

$S(1+2) = -k_B \sum_n \rho_n^{(1)} \ln \rho_n^{(1)} - k_B \sum_m \rho_m^{(2)} \ln \rho_m^{(2)} = S(1) + S(2)$
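A numerical spot-check of the additivity (Python with numpy; $k_B = 1$; the two random distributions and their sizes are hypothetical examples):

    import numpy as np

    def S(rho):
        # S = -sum rho ln rho (k_B = 1); entries are strictly positive here
        return -np.sum(rho * np.log(rho))

    rng = np.random.default_rng(0)
    rho1 = rng.dirichlet(np.ones(4))     # distribution of system 1
    rho2 = rng.dirichlet(np.ones(7))     # distribution of system 2

    rho12 = np.outer(rho1, rho2)         # joint distribution, independent systems
    print(S(rho12))                      # equals ...
    print(S(rho1) + S(rho2))             # ... this sum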

Relation between entropy and the partition function Z(E)

$S = -k_B \sum_n \rho_n \ln \rho_n = -k_B \sum_{n=1}^{Z(E)} \frac{1}{Z(E)} \ln \frac{1}{Z(E)} = k_B \, \frac{\ln Z(E)}{Z(E)} \sum_{n=1}^{Z(E)} 1$

since the sum runs over the $Z(E)$ microstates in the shell, each with probability $1/Z(E)$:

$S = k_B \ln Z(E)$
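Numerically, with $Z(E)$ equally probable microstates ($Z = 1000$ is an arbitrary example value, $k_B = 1$):

    import numpy as np

    Z = 1000
    rho = np.full(Z, 1.0 / Z)            # microcanonical: each state 1/Z
    print(-np.sum(rho * np.log(rho)))    # ~6.9078 ...
    print(np.log(Z))                     # ... = ln Z, so S = k_B ln Z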

Derivation of thermodynamics in the microcanonical ensemble

Where is the temperature?

In the microcanonical ensemble the energy, $E$, is fixed:

$E(S,V) = U(S,V)$  with  $dU = T\,dS - P\,dV$

$\frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_V$  and  $\frac{P}{T} = \left( \frac{\partial S}{\partial V} \right)_U$

Let's derive the ideal gas equation of state from the microcanonical ensemble, despite the fact that there are easier ways to do so.

Major task: find $Z(U)$ := # of states in the energy shell $[U, U+\Delta]$

$Z(U) = \frac{1}{N! \, h^{3N}} \int_{U \le H(p,q) \le U+\Delta} d^{3N}p \; d^{3N}q$

- $h^{3N}$: another leftover from quantum mechanics (phase space quantization); it makes $Z$ a dimensionless number.
- $N!$: correct Boltzmann counting.
  It solves the Gibbs paradox and requires the quantum-mechanical origin of the indistinguishability of atoms.
  We derive it when discussing the classical limit of the quantum gas.
For a gas of $N$ non-interacting particles we have

$H = \sum_{i=1}^{N} \frac{p_i^2}{2m}$

$Z(U) = \frac{1}{N! \, h^{3N}} \int_{U \le \sum_i p_i^2/2m \le U+\Delta} d^{3N}p \; d^{3N}q = \frac{V^N}{N! \, h^{3N}} \int_{U \le \sum_i p_i^2/2m \le U+\Delta} d^3p_1 \, d^3p_2 \cdots d^3p_N$

The remaining momentum integral runs over the shell $2mU \le \sum_i p_i^2 \le 2m(U+\Delta)$ between two 3N-dimensional spheres in momentum space.

$Z(U) = \frac{V^N}{N! \, h^{3N}} \int_{2mU \le \sum_i p_i^2 \le 2m(U+\Delta)} d^3p_1 \, d^3p_2 \cdots d^3p_N = \frac{V^N}{N! \, h^{3N}} \left[ V_{3N}\!\left( p \le \sqrt{2m(U+\Delta)} \right) - V_{3N}\!\left( p \le \sqrt{2mU} \right) \right]$

$= \frac{V^N C_{3N}}{N! \, h^{3N}} \left[ \left( 2m(U+\Delta) \right)^{3N/2} - \left( 2mU \right)^{3N/2} \right]$

Remember:

$V_{2\,dim} = \int S_{2\,dim} \, dr = \int 2\pi r \, dr = \pi r^2$
$V_{3\,dim} = \int S_{3\,dim} \, dr = \int 4\pi r^2 \, dr = \frac{4}{3} \pi r^3$
...
$V_{3N\,dim} = \int S_{3N\,dim} \, dr = \int \cdots \, r^{3N-1} \, dr = C_{3N} \, r^{3N}$
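The constant $C_{3N}$ is left unspecified here; the standard closed form for the volume coefficient of an $n$-dimensional ball is $C_n = \pi^{n/2} / \Gamma(n/2 + 1)$. A sketch checking that this assumed formula reproduces the low-dimensional cases above (Python):

    import math

    def C(n):
        # Volume coefficient of the n-dim ball: V_n(r) = C_n * r^n
        return math.pi ** (n / 2) / math.gamma(n / 2 + 1)

    print(C(2))        # pi      -> V_2 = pi r^2
    print(C(3))        # 4 pi/3  -> V_3 = (4/3) pi r^3
    print(C(6))        # pi^3/6, and so on for higher dimensions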

$S = k_B \ln Z(U) = k_B \left[ N \ln V + \frac{3N}{2} \ln U + \ln \left( \left( 1 + \frac{\Delta}{U} \right)^{3N/2} - 1 \right) + \ln(const) \right]$

In the thermodynamic limit $N \to \infty$, $V \to \infty$, $U \to \infty$ with $N/V = const$, the $\Delta$-dependent term is negligible compared to the terms proportional to $N$ (using $\lim_{n \to \infty} a^n = 0$ for $0 \le a < 1$ and $\ln 1 = 0$; see http://en.wikipedia.org/wiki/Exponentiation), so

$S = k_B \left[ N \ln V + \frac{3}{2} N \ln U + \ln(const) \right]$

With

$\frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_V = \frac{3}{2} \frac{N k_B}{U}$  and  $\frac{P}{T} = \left( \frac{\partial S}{\partial V} \right)_U = \frac{N k_B}{V}$

we obtain

$PV = N k_B T$  and  $U = \frac{3}{2} N k_B T$
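The two derivatives can be checked symbolically; a minimal sympy sketch (the additive constant in $S$ is dropped since it does not affect the derivatives):

    import sympy as sp

    N, V, U, kB, T, P = sp.symbols("N V U k_B T P", positive=True)
    S = kB * (N * sp.log(V) + sp.Rational(3, 2) * N * sp.log(U))

    invT = sp.diff(S, U)            # (dS/dU)_V = 3 N k_B / (2 U) = 1/T
    P_over_T = sp.diff(S, V)        # (dS/dV)_U = N k_B / V       = P/T

    print(sp.solve(sp.Eq(invT, 1 / T), U))      # [3*N*T*k_B/2] -> U = (3/2) N k_B T
    print(sp.solve(sp.Eq(P_over_T, P / T), P))  # [N*T*k_B/V]   -> P V = N k_B T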
