Ensemble Theory (Classical)-II (Ensemble, Liouville’s Theorem and Ergodicity)

P.K. Ahluwalia


1. Learning Outcomes

After studying this module, you shall be able to:

  • Appreciate the statistical approach, called the ensemble method, for understanding the properties of a given macroscopic system, and the need for such an approach.
  • Define an ensemble characterized by thermodynamic variables such as (N, V, E), (N, V, T) and (μ, V, T).
  • Understand the concept of the statistical density function, or statistical distribution function, ρ(q, p; t), link it to the ensemble average of a physical quantity, and see its advantage over the equivalent time averaging.
  • See how knowledge of the probability function allows us to calculate average values of a physical quantity.
  • Derive Liouville’s theorem, which states that the statistical distribution function is constant along the phase trajectory of the system, and discuss its significance in statistical physics.
  • Interpret Liouville’s theorem as conservation of phase-space volume.
  • Show that the ensemble average of any physical quantity is equal to the value one expects to obtain on making an appropriate measurement, i.e. that the time average of a physical quantity equals its ensemble average, a statement called the ergodic hypothesis.
  • Understand the concepts of fluctuation, root mean square fluctuation and relative fluctuation for a given physical quantity.
  • Prove that the relative fluctuation of a physical quantity varies inversely as the square root of the number of particles in the system.

2. Introduction

We have already learnt the concept of phase space and its properties in module 10. In this module we introduce an altogether new concept, that of an ensemble, introduced by J. W. Gibbs in his famous book Elementary Principles in Statistical Mechanics. Realizing fully well that the study of macroscopic systems cannot be approached from the point of view of the motion of their huge number of particles, Gibbs introduced this idea which, without improving on the inherent lack of information about the system, paved the way for a statistical approach to the mean value of a physical quantity via the concept of the statistical distribution function, or statistical probability density function. To begin with, the ensemble sounds an obscure, abstract concept, but it turns out to be an extremely useful one, providing the statistical scaffolding needed to understand many-particle physical systems. In this module we state and prove Liouville’s theorem pertaining to the probability density function and explore the properties embodied in the theorem. We also explore the statistical meaning of the mean value, root mean square fluctuation and relative fluctuation. We wind up our discussion of the foundations of ensemble theory by looking at the meaning of ergodicity.

3. Ensemble: An Abstract Useful Concept

Ensemble is a French word meaning a group. In day-to-day conversation we are familiar with this term applied to a group of musicians in a band, forming, say, a Philharmonic Ensemble. In this module we look at the concept as applied to a group of phase points in phase space, representing the different microstates of a given macrostate, taken together as members of an ensemble of systems, as shown in figure 1 below.

Figure 1: Dots in the picture represent various microstates of a given macrostate, forming the members of an ensemble

To define it precisely, a statistical ensemble is a collection of a large number of exact thought replicas of a given macrostate; each replica has the same macroscopic state parameters, but may find itself in any one of all the possible microstates, each represented by a point in phase space.

So, in the Gibbs visualization, the different phase points evolving in time and exploring the allowed phase space can be viewed together, at the same instant, as members of a group called an ensemble, confined to a region of phase space. As time flows, each member of this ensemble can acquire any one of all the possible microstates.

4. Statistical (Probability) Distribution Function

What is the characteristic function that can describe this ensemble? First note that, since we are dealing with a very large number of these systems, the points in phase space are very dense, and we can define their density distribution function ρ(q, p; t): a function which describes how the various members of the ensemble are distributed over all the allowed microstates at different points of time. The number of points, i.e. macrosystems, in the phase-space volume element d^{3N}q d^{3N}p around the point (q, p) of phase space is then ρ(q, p; t) d^{3N}q d^{3N}p. Therefore, the number of systems in the ensemble is given by

N = ∫ ρ(q, p; t) d^{3N}q d^{3N}p                      (1)

The density distribution function also helps us to define the ensemble average ⟨f⟩ of a physical quantity f(q, p), whose value may differ over the different members of the ensemble. The ensemble average ⟨f⟩ is defined as

⟨f⟩ = ∫ f(q, p) ρ(q, p; t) d^{3N}q d^{3N}p / ∫ ρ(q, p; t) d^{3N}q d^{3N}p                      (2)
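As a minimal numerical sketch of these definitions (added here for illustration; the single-particle oscillator and the weight ρ ∝ e^{−H} are hypothetical choices, not part of the module), the ensemble average ⟨f⟩ = ∫ f ρ dq dp / ∫ ρ dq dp can be approximated by sums over a grid of phase points:

```python
import numpy as np

# Hypothetical 1D example: one particle with H = p^2/2m + k q^2/2 (m = k = 1).
# Phase space is discretized on a grid, and an illustrative density
# rho ∝ exp(-H) (a canonical-like weight, chosen only for demonstration) is used.
q, p = np.meshgrid(np.linspace(-5, 5, 201), np.linspace(-5, 5, 201))
H = 0.5 * p**2 + 0.5 * q**2           # energy at each phase point
rho = np.exp(-H)                      # unnormalized distribution function

def ensemble_average(f):
    """<f> = sum(f * rho) / sum(rho), a grid version of the phase-space integrals."""
    return np.sum(f * rho) / np.sum(rho)

print(ensemble_average(H))  # ≈ 1.0 here (kT/2 per quadratic term with kT = 1)
```

The same function applies to any quantity defined on the grid, e.g. `ensemble_average(q**2)`.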

5. Proof of Liouville’s Theorem

6. Significance of Liouville’s Theorem and Types of Ensembles

Liouville’s theorem, as seen in equation (40) or (41), has some interesting consequences for arriving at the functional dependence of the probability distribution functions, which we shall encounter in later modules, and it helps define different types of ensembles. We have already commented on the significance of the first term: ∂ρ/∂t becoming zero implies a stationary ensemble. For (41) to be zero then implies that [ρ, H] = 0, which can be achieved in two possible ways:
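In brief (restated here as a sketch from the general theory), [ρ, H] = 0 holds either when ρ is constant over the accessible region of phase space, or, more generally, when ρ depends on (q, p) only through constants of the motion, such as the Hamiltonian itself:

```latex
[\rho, H] = 0 \quad \text{if} \quad \rho = \text{const},
\qquad \text{or} \qquad
\rho = \rho\big(H(q,p)\big) \;\Rightarrow\; [\rho, H] = \frac{d\rho}{dH}\,[H, H] = 0 .
```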

7. Ergodic Hypothesis: Time Average Versus Ensemble Average

To appreciate this theorem, we must note that for a given thermodynamic system there are two ways of calculating averages. One way is to study the dynamics of the system over a long period of time; the other is to find the average through a statistical description. The first type of average is called the time average, and the second the ensemble average. The ergodic hypothesis requires that these two averages be equal.

The dynamics of the thermodynamic system is deterministic, governed by Hamilton’s equations. When we start observing the system at the initial time t = 0 and let it evolve to time t, the state of the system traces out a phase trajectory, and Hamilton’s equations can be integrated to provide (q, p) = {q_i(t), p_i(t)} at every instant of time. If we consider a physical quantity f(q, p), we can average it over this trajectory and calculate the time average, defined as

f̄ = lim_{T→∞} (1/T) ∫_0^T f(q(t), p(t)) dt

Geometrically speaking, ergodicity means that the phase trajectory passes through every point of the region of interest defined by the surface H = E. A system such as this is called ergodic. The problem with this assertion is that, topologically, no single phase trajectory can fill the entire energy surface. This led to the idea of the quasi-ergodic hypothesis, according to which, over a sufficiently long time, the phase trajectory of a closed system defined by H = E comes arbitrarily close to every point on the energy surface, and the time average can then be replaced by the ensemble average.

The importance of ergodic hypothesis lies in the fact that it resolves a difficult problem of calculating the mean value of a physical quantity over time by allowing its calculation over a set of exact replicas of the system at a single instant, called ensemble average.
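The following sketch (added here as an illustration; the one-dimensional harmonic oscillator with m = k = 1 is a hypothetical choice, convenient because its energy surface H = E is simply a circle in phase space, which the trajectory covers completely) compares the time average and the ensemble average of f = q²:

```python
import numpy as np

# 1D harmonic oscillator (m = k = 1): q(t) = A cos t, p(t) = -A sin t,
# a trajectory that traverses the entire energy "surface" H = E = A^2/2.
A = 2.0                                   # amplitude, so E = 2.0

# Time average of f = q^2 over a long stretch of the trajectory.
t = np.linspace(0.0, 1000.0, 2_000_001)
time_avg = np.mean((A * np.cos(t))**2)

# Ensemble average of q^2 over the energy surface: phase points
# distributed uniformly in the angle variable along the circle.
theta = np.linspace(0.0, 2 * np.pi, 100_000, endpoint=False)
ens_avg = np.mean((A * np.cos(theta))**2)

print(time_avg, ens_avg)  # both ≈ A^2 / 2 = 2.0
```

The two numbers agree, as the (quasi-)ergodic hypothesis asserts for systems whose trajectory explores the whole energy surface.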

Proof of the ergodic hypothesis:

To prove (47), let us follow plausibility arguments using Liouville’s theorem as given below:

According to the ergodic hypothesis, since all points on the H = E surface are reached by the phase trajectory of the system, any point on the surface can be taken as a starting point.

8. Fluctuation, Root Mean Square Fluctuation and Relative Fluctuation

Furthermore, we must now be aware of the fact that, while averages of physical quantities are what we expect from measurements, there is always the possibility of deviations, or fluctuations, from these average values. The study of these fluctuations is of great importance in many phenomena occurring in nature, for example critical opalescence, Brownian motion, etc. Let us consider a physical quantity f corresponding to a physical system. As time progresses, its value varies about the average value as Δf = f − ⟨f⟩. This variation about the average, or mean, value can be both positive and negative, and its mean value is ⟨Δf⟩ = 0; it is therefore not of any significance. A better quantity to consider is (Δf)², which, being non-negative, has an average that vanishes only when Δf itself vanishes. (Δf)² can be written as

(Δf)² = (f − ⟨f⟩)² = f² − 2f⟨f⟩ + ⟨f⟩²                                      (50)

Taking the average of both sides, we get the mean square fluctuation as

⟨(Δf)²⟩ = ⟨f²⟩ − 2⟨f⟩⟨f⟩ + ⟨f⟩² = ⟨f²⟩ − ⟨f⟩²                            (51)

which is the mean of the square minus the square of the mean.

The ratio √⟨(Δf)²⟩ / ⟨f⟩ is called the relative fluctuation. The smaller the relative fluctuation, the smaller the proportion of time for which the system stays away from its mean value.
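Equation (51) and the relative fluctuation can be checked numerically (a sketch added for illustration; the exponential sample is an arbitrary stand-in for a fluctuating quantity):

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.exponential(scale=3.0, size=1_000_000)   # sample values of a fluctuating quantity

lhs = np.mean((f - f.mean())**2)                 # <(Δf)^2> computed directly
rhs = np.mean(f**2) - f.mean()**2                # <f^2> - <f>^2, as in (51)
rel_fluct = np.sqrt(lhs) / f.mean()              # relative fluctuation sqrt(<(Δf)^2>)/<f>

print(lhs, rhs)     # the two agree
print(rel_fluct)    # ≈ 1 for an exponential distribution (std. dev. = mean)
```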

Now we shall see that the relative fluctuation decreases as the size of the system (i.e. the number of particles) increases, varying inversely as the square root of the number of particles.

For this, suppose we have a system of N particles. We are interested in a physical quantity F (say the kinetic energy of the system) which is additive in the corresponding single-particle quantity f_i (say the kinetic energy of particle i), i.e.

F = Σ_{i=1}^{N} f_i
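The 1/√N law can be seen in a quick numerical sketch (added here; exponential random numbers stand in for single-particle quantities such as kinetic energies, an illustrative assumption). For a sum of N independent contributions, the standard deviation grows as √N while the mean grows as N, so their ratio falls off as 1/√N:

```python
import numpy as np

rng = np.random.default_rng(1)

def relative_fluctuation(N, trials=4000):
    """Sample the additive quantity F = sum_i f_i over many replicas and
    return its relative fluctuation sqrt(<(ΔF)^2>) / <F>."""
    F = rng.exponential(scale=1.0, size=(trials, N)).sum(axis=1)
    return F.std() / F.mean()

for N in (100, 400, 1600):
    # sqrt(N) * relative fluctuation stays ≈ constant (≈ 1 for this choice),
    # confirming the 1/sqrt(N) scaling.
    print(N, np.sqrt(N) * relative_fluctuation(N))
```

Quadrupling N halves the relative fluctuation, which is why fluctuations are utterly negligible for macroscopic systems with N ~ 10²³.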

9. Summary

