
caraher

(6,308 posts)
9. It's all about counting - a physicist's take
Sat Dec 16, 2023, 03:01 AM


The number of particles is indeed related to the number of microstates accessible to a system with a given energy, but the number of microstates depends on the total energy of the system as well as the number of particles. Hyperphysics has a nice introduction. Indeed, the key concepts are really all on that one page.

Statistical mechanics textbooks typically call Omega/W the "multiplicity" and often begin by considering systems for which it is easy to count the ways a known, fixed total amount of energy can be distributed among the elements of the system. Most modern approaches to the subject therefore tend to begin with "Einstein solids," which have a number of features that simplify the treatment. The basic idea is that an Einstein solid (a solid modeled with atoms as identical balls connected by identical springs) consists of some number of quantum harmonic oscillators, among which the total thermal energy of the system (simply the sum of the energies of all the oscillators) can be distributed at random. A key feature of the quantum harmonic oscillator is that the energy of one oscillator can only take one of a set of discrete values separated by hf (where h is a universal constant known as Planck's constant and f is the frequency of the oscillator). (We can ignore the zero-point energy, which is intrinsic to each oscillator and cannot be "transferred" from one element of the system to another.)

So to give a concrete example, suppose you have 3 oscillators and a total energy of 1*hf. To find Omega, you simply count the number of ways to distribute the one unit of energy among the three oscillators. The answer is plainly 3 - the energy can be in oscillator 1 (that's one microstate), or in oscillator 2 (that's a second microstate), or in oscillator 3 (that's the third and final microstate). In this instance your intuition is basically correct - the number of microstates would equal the number of oscillators:

100 (1st microstate)
010 (2nd microstate)
001 (3rd microstate)

So Omega = 3.

But the picture changes... and this is all just combinatorics... if you have more energy. Suppose the total energy is instead 2*hf. There are three ways to distribute the energy in which one oscillator has all this energy and the other two have none, and there are three ways to have two oscillators have one unit of energy each with the third having zero, for a total of six possible microstates:

200
020
002
110
101
011

So here, Omega=6
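
If it helps to see the counting done mechanically, here's a little Python sketch (the function name is mine, just for illustration) that brute-forces the same enumeration and reproduces both results:

def enumerate_microstates(n_oscillators, energy_units):
    """List every way to place `energy_units` indistinguishable quanta
    on `n_oscillators` distinguishable oscillators (an Einstein solid)."""
    from itertools import product
    return [s for s in product(range(energy_units + 1), repeat=n_oscillators)
            if sum(s) == energy_units]

# One unit of energy among three oscillators: 100, 010, 001
print(enumerate_microstates(3, 1), len(enumerate_microstates(3, 1)))  # Omega = 3

# Two units among three oscillators: 200, 020, 002, 110, 101, 011
print(enumerate_microstates(3, 2), len(enumerate_microstates(3, 2)))  # Omega = 6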

Of course, listing and counting in this fashion becomes impracticable for systems of any significant size. Applying combinatorics to this results in a nice formula you can find on this Hyperphysics explanation page.
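
The formula there is the standard "stars and bars" count for q units of energy among N oscillators, Omega = (q + N - 1)! / [q! (N - 1)!]. As a quick Python sketch:

from math import comb

def multiplicity(n_oscillators, energy_units):
    """Omega(N, q) = (q + N - 1)! / (q! (N - 1)!): the number of ways to
    distribute q indistinguishable energy quanta among N oscillators."""
    return comb(energy_units + n_oscillators - 1, energy_units)

print(multiplicity(3, 1))      # 3, matching the list above
print(multiplicity(3, 2))      # 6
print(multiplicity(300, 200))  # already an astronomically large number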

One problem with multiplicity is that the numerical values you obtain from it become insanely large rather quickly. Taking the natural log tames their behavior and has other side benefits for thermodynamic analysis; the entropy is defined as S = k*ln(Omega), where k is Boltzmann's constant. In particular, S, unlike Omega, behaves as you would expect an extensive property to behave when you combine two small systems into one larger system - the entropies of the subsystems simply add to give the total entropy of the system.
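To make that concrete, here's a rough Python sketch (the sizes are arbitrary illustrative numbers) that works with ln(Omega) directly via the log-gamma function, so the enormous factorials never have to be evaluated, and that shows the logs of the subsystem multiplicities simply adding:

from math import lgamma

def ln_multiplicity(N, q):
    """ln(Omega) for an Einstein solid with N oscillators and q energy quanta,
    using ln(n!) = lgamma(n + 1) so huge factorials are never formed:
    ln Omega = ln[(q + N - 1)!] - ln[q!] - ln[(N - 1)!]"""
    return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

# Two subsystems in a particular macrostate (arbitrary toy sizes):
lnA = ln_multiplicity(10_000, 5_000)   # a comfortably small number...
lnB = ln_multiplicity(20_000, 8_000)   # ...even though Omega itself is astronomical

# The combined macrostate has Omega = Omega_A * Omega_B, so the logs simply add -
# exactly the additivity we want from an extensive entropy S = k*ln(Omega).
print(lnA, lnB, lnA + lnB)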

Now a lot of your questions pertain to gases. Counting microstates becomes much more challenging than for an Einstein solid, partly because there are subtleties in properly accounting for the quantum properties of an ideal gas in a box, and partly because even a single particle has multiple degrees of freedom. Rather than energies associated with oscillation, the thermal energy of a gas is just the sum of the mechanical energies of the molecules. If we limit ourselves to a monatomic gas (to evade thinking about molecular rotations and vibrations, which also contribute to the thermal energy of a gas) we have three degrees of freedom, arising from the x, y, and z components of the atom's motion. If you had a one-atom "gas" there would then be essentially three ways to divide a given amount of kinetic energy, one for each component of its velocity. I don't remember offhand exactly how the ideal gas model treats the division of that kinetic energy into countable "chunks," but conceptually one does something similar to the oscillator case: counting the number of ways the total thermal energy can be divided among the various atoms and the various directions of travel they might have.

The result is given by the somewhat complicated-looking Sackur-Tetrode Equation. You can see that there is an almost-direct proportionality with N, the number of atoms in the gas, but not quite (as N also occurs inside the natural log term). So yes, the entropy does depend on the number of gas molecules present, for a given container and total thermal energy, and rises with the number of molecules.
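For what it's worth, here's a hedged Python sketch of that equation in its usual textbook form. The gas, container volume, and thermal energy below are arbitrary illustrative values and the atomic mass is only approximate; the point is just that doubling N at fixed volume and energy raises S, but by somewhat less than a factor of two:

from math import log, pi

k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(N, V, U, m):
    """Entropy of a monatomic ideal gas (Sackur-Tetrode equation):
    S = N k { ln[ (V/N) * (4 pi m U / (3 N h^2))^(3/2) ] + 5/2 }"""
    return N * k * (log((V / N) * (4 * pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# Illustrative numbers only: helium-like atoms in a 1-litre box holding 100 J of thermal energy.
m_He = 6.65e-27          # kg, approximate mass of a helium atom
V, U = 1.0e-3, 100.0     # m^3, J
N = 1.0e22

print(sackur_tetrode(N, V, U, m_He))       # entropy for N atoms
print(sackur_tetrode(2 * N, V, U, m_He))   # doubling N (same V, U): S rises, but not quite 2x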

Finally, you ask whether Omega is somehow a probability distribution. It is not - it is just a number. But combined with a key postulate of statistical mechanics, it is linked to probability in important ways. The relevant postulate is that in equilibrium, every microstate is exactly as probable as any other microstate. Combined with the concept of a macrostate (explained below), this allows us to make predictions about the probability of observing a system in a given macrostate.

Returning to the Einstein solid, imagine two such systems, A and B, in thermal contact with one another (free to exchange energy), such that the total energy of the complete system is U. A and B can, in general, have different numbers of oscillators, and the energy U can be split between them in many ways. Energy conservation demands only that U_A + U_B = U - that the thermal energy of the full system equals the sum of the thermal energies of A and B.

So let's think about solid A. It has some fixed number of oscillators N_A, but could have any amount of thermal energy ranging from 0 to U (in steps of hf). A "macrostate" in this example, for solid A, is any one of the allowed values of U_A, and given a value of U_A we can calculate Omega_A - the number of microstates associated with the macrostate U_A.

Since we have postulated that all microstates are equally probable, we should expect the most probable distribution of energy between solids A and B to be the one that corresponds to the largest number of microstates (and thus the maximum entropy). There's a nifty calculator at the Six Ideas web site that produces tables and graphs of multiplicities and probabilities for smallish Einstein solids. What you'll consistently see is that the combination of macrostates for A and B with the largest number of microstates is always the one that most evenly distributes energy among the individual oscillators - which feels right intuitively for equilibrium. And that, in turn, is the configuration that maximizes entropy for a given amount of thermal energy.
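
You can mimic what that calculator does with a few lines of Python. The sizes below are made-up toy values, but the pattern is the same: the product Omega_A * Omega_B peaks at the macrostate where the energy per oscillator is the same in both solids:

from math import comb

def multiplicity(N, q):
    """Omega(N, q) for an Einstein solid with N oscillators and q energy quanta."""
    return comb(q + N - 1, q)

# Toy composite system: solid A has 30 oscillators, solid B has 60, with 45 quanta shared.
N_A, N_B, q_total = 30, 60, 45

# One entry per macrostate U_A = q_A * hf: the number of microstates of the whole system
# in which A happens to hold q_A quanta.
omegas = [multiplicity(N_A, q_A) * multiplicity(N_B, q_total - q_A)
          for q_A in range(q_total + 1)]
total = sum(omegas)

# With every microstate equally probable, the probability of a macrostate is just
# its share of the total number of microstates.
probabilities = [w / total for w in omegas]
most_probable_qA = max(range(q_total + 1), key=lambda qA: omegas[qA])

print(most_probable_qA)                 # ~15: A holds ~1/3 of the energy, matching its 1/3 of the oscillators
print(probabilities[most_probable_qA])  # probability of that single most likely macrostate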

