Bouncing Balls and the Boltzmann Distribution

### Temperature

Most things that physicists study have some number of degrees of freedom – aspects of the system that are free to vary over time as it interacts with its surroundings. For example, an electron orbiting an atomic nucleus can be in one of many discrete orbitals allowed by quantum mechanics. A rubber bouncy ball has a position and a velocity at any moment in time, each of which is a vector with three independent components, totalling six degrees of freedom (in fact, it also has spin, which we ignore here).

A branch of physics called statistical mechanics studies the probability of finding such a system in any particular state. For example, it can be used to ask how likely it is for an electron to be found in the lowest-energy orbital (the ground state) of an atom, rather than a higher-energy (excited) state. It can also be used to ask about the relative probability of finding a rubber bouncy ball travelling at 1 m/s versus 2 m/s.

This tutorial demonstrates how statistical mechanics applies to a tank containing a large number of bouncy balls. In many ways, these are analogous to the seething mass of air molecules in the Earth's gaseous atmosphere.

The tank of balls is shown above. The controls above the tank can be used to
change the number of balls; click on *Restart simulation* for your changes
of settings to take effect.

The balls feel a gravitational force pulling them downwards. When they collide with the sides of the container, they bounce elastically – that is to say, in a way that conserves energy. Similarly, they bounce elastically when they collide with each other. Naturally, they also obey Newton's laws and conserve momentum.

The colors of the balls are assigned randomly if all of the balls are set to have the same mass (the default). If the balls have different masses, then heavier balls are given redder colors, while lighter balls are bluer.

The *temperature* control on the right hand side can be used to change how
much kinetic energy the balls have, set in arbitrary units – a value of
one corresponds to the default amount of energy.

The red line next to the *temperature* control indicates how much kinetic
energy the balls currently have. When the slider is moved, the change is
implemented in the simulation by briefly introducing a little friction into
the dynamics, which is removed once the balls have reached their target
kinetic energy.

### Applying statistical mechanics to the tank

In this system, the balls explore a four-dimensional parameter space of position and velocity – two components of each, since the simulation is two-dimensional – by means of chance collisions, which exchange energy and momentum between the balls.

The exact nature of each individual ball's motion and each collision is determined by Newton's laws of motion. But statistical mechanics is concerned not with individual balls, but with the bulk behavior of the whole tank. The individual motions of the balls are a level of detail beyond what is needed to understand their time-averaged bulk behaviour. It would be theoretically possible, for example, to calculate the time-averaged pressure that the balls exert on the sides of the tank by modelling every ball's individual motion. But it would be very time-consuming to do so.

This is analogous to saying that when a coin is tossed 100 times, it is likely to come up tails roughly half the time. The outcome of each individual toss is determined by the exact mechanics of how it is thrown. But you do not need to model the exact trajectory and spin of every throw to expect around 50 tails among 100 tosses.

### The Boltzmann Distribution

The Boltzmann distribution states that the probability of finding a physical system in a particular state is related to the energy of that state, by the formula \[ P(\text{state}) \propto e^{-E/kT}. \]

Here, \(k\) is a numerical constant, the Boltzmann constant, and \(T\) is thermodynamic temperature – a measure of how much energy the system has, by analogy with how the amount of energy a gas has is measured by its temperature.
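As a quick illustration of how this formula is used (with made-up energies in arbitrary units, not values for any real system), the relative probability of two states depends only on their energy difference and the temperature:

```python
import math

# Unnormalised Boltzmann weight e^(-E/kT) for a state of energy E.
# All quantities here are in arbitrary, illustrative units.
def boltzmann_weight(energy, kT):
    return math.exp(-energy / kT)

# Compare a state one unit of energy above the ground state, at kT = 1:
ratio = boltzmann_weight(1.0, 1.0) / boltzmann_weight(0.0, 1.0)
# ratio equals e^(-1), so the excited state is about 37% as likely.
```

Raising `kT` brings the two weights closer together: at high temperature, higher-energy states become almost as likely as the ground state.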

### The distribution of the heights of the balls

The Boltzmann distribution can be used to work out the probability of a ball being at a particular height off the ground in the tank above. The height of a ball is directly related to its potential energy, via the equation \[ \text{Potential Energy}=mgh, \] where \(m\) is the mass of the ball, \(g\) is the acceleration due to gravity, and \(h\) is the height of the ball. Substituting this into the Boltzmann distribution, we find that the probability of a ball being at some height \(h\) is given by \[ P(h)\propto e^{-mgh/kT}. \]

For a set of balls that are not exchanging any energy with the outside world – i.e. that are at a fixed temperature – the exponent is simply proportional to height, so in suitable units this simplifies to \[ P(h)\propto e^{-h}. \] In other words, the most likely place for a ball to be is on the ground, at the lowest possible value of \(h\), and the chance of it being a certain distance off the ground decreases exponentially with height.
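A minimal numerical sketch of this result (not the simulation's own code, and using arbitrary units in which \(m\), \(g\) and \(kT\) are all 1):

```python
import math

# Relative probability P(h)/P(0) = exp(-m*g*h / kT) of finding a ball at
# height h. The default values m = g = kT = 1 are arbitrary illustrative units.
def relative_probability(h, m=1.0, g=1.0, kT=1.0):
    return math.exp(-m * g * h / kT)

# In these units the probability falls by a factor of e per unit of height.
print(relative_probability(1.0))  # ≈ 0.368
```

Increasing `kT` in this sketch makes the decay with height shallower, which is the behaviour of the curve described when the temperature control is raised.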

The graph below shows the total mass distribution of all of the balls as a function of height. Specifically, it is constructed by sorting the balls into a series of bins depending on their heights, and then drawing a graph of how much mass there is in each height bin, represented by a red point. We expect the position of each ball to represent a sample from the probability distribution above, and so we expect the mass distribution of all the balls to map out the same exponential curve.

One striking feature of this plot is that it's rather jittery. This is to be expected as it's based on the pseudo-random motions of a relatively small number of balls.

Statistical mechanics only describes the most likely configuration of a system;
random fluctuations will cause any real system to deviate slightly from that
configuration, in the same way that a coin tossed 100 times is unlikely to
come up tails *exactly* 50 times.

In fact, the amount of randomness in the motion of only a few hundred balls is much larger than is shown above. To make the graph more readable, the jitter is artificially reduced by time-averaging the height distribution over a period of around five seconds. This smoothing is also applied to all of the other plots below.

The number of balls in the tank can be changed using the configuration options above it. Adding more balls will reduce the amount of jitter in the height distribution, simply because an average is then being taken over the random motions of a larger number of balls.
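The effect of the ball count on the jitter can be sketched numerically. The snippet below is an idealised stand-in for the simulation, assuming heights drawn from a unit-scale exponential distribution; it bins random heights just as the plot does, and larger `n_balls` gives smoother, more cleanly exponential bin counts:

```python
import random

# Sort n_balls randomly-drawn heights into bins, mimicking the height
# histogram above. Heights are drawn from P(h) ∝ exp(-h) (arbitrary units).
def height_histogram(n_balls, n_bins=10, bin_width=0.5, seed=0):
    rng = random.Random(seed)
    counts = [0] * n_bins
    for _ in range(n_balls):
        h = rng.expovariate(1.0)  # one sample from the exponential distribution
        b = int(h / bin_width)
        if b < n_bins:
            counts[b] += 1
    return counts

counts = height_histogram(10_000)
# Each bin holds roughly exp(-0.5) ≈ 61% of the mass of the bin below it,
# and the relative jitter in each bin shrinks as roughly 1/sqrt(n_balls).
```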

Most of the real-world systems that statistical mechanics is used to describe have vastly more particles than the number of balls used here – each gram of air contains well over a billion billion molecules. And so statistical mechanics is a much better description of how air molecules behave than it is of the tank of balls shown here. But this example shows how even a small number of balls follow the trends that are predicted.

Aside from the jitter, the graph above does follow the curve of an exponential decay. According to the Boltzmann distribution, the rate of decay of the curve with height should be slower at higher temperatures, since \(T\) appears in the denominator of the exponent. This change in the rate of decay is indeed apparent if the kinetic energy of the balls is increased or decreased using the temperature control to the right of the tank.

### Availability of states

One feature of the height distribution shown above doesn't quite follow the Boltzmann distribution. On the far left of the graph, it reaches a plateau as it approaches \(h=0\). This is especially apparent when there are a large number of balls in the tank. The Boltzmann distribution, meanwhile, rises ever more steeply as \(h\) gets smaller.

Visually, it's quite easy to see what's going on. Balls are solid objects that can't occupy the same space as other balls. Towards the bottom of the tank they become closely packed together. There is a certain maximum density of balls which cannot be exceeded, reached when they are in physical contact, and it is this maximum density that produces the plateau in the mass distribution curve.

This phenomenon can also be explained in terms of the Boltzmann distribution.
Technically speaking, we have been slightly lax in ignoring something called
*degeneracy*, which is the number of states that a system can be in at any
particular energy. The Boltzmann distribution describes the probability of a
particular ball being in *one particular state*.

In the case of a bouncing ball, a *state* refers to one particular point
in the four-dimensional parameter space of all the possible positions and
velocities that that ball can have.

If there are many states at one particular energy level, these each have a certain probability of being occupied, and the total probability of being in any state of energy \(E\) is found by multiplying the exponential Boltzmann factor by the number of states.

Suppose for a moment that the tank were to be twice as wide at some height
\(h_2\) as compared to some other height \(h_1\). It would have twice as much
horizontal parameter space open to it at \(h_2\) as at \(h_1\). The probability
of a ball being found at any *particular* horizontal position at \(h_2\)
as compared to \(h_1\) is given by the Boltzmann distribution:
\[ \frac{P(h_2)}{P(h_1)} = \frac{\text{exp}\left(-mgh_2/kT\right)}{\text{exp}\left(-mgh_1/kT\right)} =
\text{exp}\left(\frac{-mg(h_2-h_1)}{kT}\right). \]
But the probability of finding it at *any* horizontal position at \(h_2\)
versus *any* horizontal position at \(h_1\) is twice this, because there
are twice as many horizontal positions available to the ball at height \(h_2\).
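Putting numbers to this hypothetical doubled-width tank (in arbitrary units chosen so that \(mg/kT = 1\)):

```python
import math

# Probability of finding the ball at *any* horizontal position at height h2
# versus h1: the Boltzmann factor multiplied by the degeneracy (width) ratio.
# Units are chosen so that m*g/kT = 1; the factor of 2 is the assumed
# doubling of the tank's width at h2.
def prob_ratio(h2, h1, width_ratio=2.0):
    return width_ratio * math.exp(-(h2 - h1))

# If h2 - h1 = ln(2), the doubled width exactly cancels the Boltzmann
# suppression: the ball is equally likely to be at either height.
print(prob_ratio(math.log(2.0), 0.0))  # ≈ 1.0
```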

Once some of the balls in our tank have become close-packed at the bottom of the tank, solid body forces – the fact that balls are solid and cannot occupy the same space as other balls – mean that there are very few horizontal positions left available for other balls at the bottom of the tank, and so the chance of them being near the bottom of the tank is reduced.

### The Maxwell–Boltzmann distribution

The distribution of speeds \(v\) of particles in an ideal gas follows the Maxwell–Boltzmann distribution, given by \[ f(v) = \left(\frac{m}{2\pi kT}\right)^{\frac{3}{2}}\, 4\pi v^2\, \text{exp}\left(\frac{-mv^2}{2kT}\right). \] Our tank of balls is similar to an ideal gas, but differs in that the balls are only free to move in two dimensions rather than three. As a result, rather than following the Maxwell–Boltzmann distribution, the speed distribution instead follows \[ f(v) = \frac{mv}{kT}\, \text{exp}\left(\frac{-mv^2}{2kT}\right). \]

The plot below shows this distribution of the speeds of the balls in the tank, albeit with some random jitter as before:

To understand how these distributions arise, it is useful to look at each term in turn. The right-most term in each distribution looks very much like a Boltzmann distribution, with the familiar expression for kinetic energy, \(mv^2/2\), substituted for the energy of each available state. But there is also a new term in the middle, which is proportional to \( v^2 \) for a three-dimensional gas and proportional to \( v \) for our two-dimensional simulation.

This arises because velocity is a vector, with three components for real-world objects, and with two components in our simulation. Speed, meanwhile, is a scalar: it is the magnitude of this vector.

The velocity of a real-world ball encompasses three degrees of freedom, represented by the three numbers in a vector. Such a ball's velocity can be represented as a point in a three-dimensional parameter space – the ball's so-called 'velocity space'. The ball's speed is the distance of that point from the origin, \(\text{Speed}=\sqrt{v_\text{x}^2+v_\text{y}^2+v_\text{z}^2}\).

However, many different points in three-dimensional velocity space correspond to the same speed. In fact there is a whole spherical shell of velocity states, centred on the origin, all corresponding to the same speed \(v\), but to different directions of travel. These are degenerate states as far as a speed distribution is concerned.

More precisely, the volume of this spherical shell of degenerate states, which has radius \(v\) and thickness \(\text{d}v\), is given by \[ \text{Volume} = 4\pi v^2 \text{d}v. \] It is as a result of summing over the probabilities of a ball being in any of the velocity states within this shell that the term \( 4\pi v^2 \) arises in the speed distribution.

In the case of balls which can only move in two dimensions, the sum is instead over a circular ring of states in a two-dimensional velocity space, which has an area of \[ \text{Area} = 2\pi v \text{d}v. \]

In summary, larger volumes (or areas) of velocity space correspond to higher speeds than to lower speeds. Once account is taken of this, a ball is less likely to be found at random in the relatively small part of velocity space corresponding to slow speeds than in the much larger volume corresponding to higher speeds.
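This argument can be checked numerically. In the sketch below (assuming arbitrary units with \(m = kT = 1\); this is not part of the simulation itself), each velocity component is drawn from a thermal Gaussian distribution of variance \(kT/m\), as the Boltzmann factor dictates, and the mean speed of the resulting sample matches the two-dimensional prediction \(\sqrt{\pi kT/2m}\):

```python
import math
import random

# Draw 2-D thermal velocities: each component is Gaussian with variance
# kT/m, following the Boltzmann factor exp(-mv^2/2kT). Units: m = kT = 1.
rng = random.Random(1)
kT, m, n = 1.0, 1.0, 100_000
sigma = math.sqrt(kT / m)
speeds = [math.hypot(rng.gauss(0.0, sigma), rng.gauss(0.0, sigma))
          for _ in range(n)]

mean_speed = sum(speeds) / n
# Mean of f(v) = (mv/kT) exp(-mv^2/2kT), integrated over all speeds:
predicted = math.sqrt(math.pi * kT / (2.0 * m))
```

Note that no speeds were ever sampled directly: the ring-of-states degeneracy emerges automatically from combining the two Gaussian components.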

### The principle of equipartition

The principle of equipartition states that when a system is able to distribute energy between a number of different degrees of freedom, it will tend to spread the energy equally among those degrees of freedom. Each degree of freedom has an average energy of \[ \text{Mean energy} = \frac{kT}{2}. \]
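A quick numerical illustration of equipartition (an idealised sketch in arbitrary units with \(kT = 1\), not the simulation's own code): drawing each velocity component from its thermal distribution, the average kinetic energy per degree of freedom comes out at \(kT/2\) regardless of mass:

```python
import math
import random

# Mean kinetic energy per degree of freedom for particles of a given mass,
# with each velocity component drawn from a thermal Gaussian of variance
# kT/m. Arbitrary units with kT = 1 by default.
def mean_ke_per_dof(mass, kT=1.0, n=200_000, seed=2):
    rng = random.Random(seed)
    sigma = math.sqrt(kT / mass)
    return sum(0.5 * mass * rng.gauss(0.0, sigma) ** 2 for _ in range(n)) / n

light = mean_ke_per_dof(mass=1.0)
heavy = mean_ke_per_dof(mass=4.0)
# Both are close to kT/2 = 0.5, despite the factor-of-four mass difference:
# heavier particles simply move more slowly on average.
```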

This means that in the tank above, all the balls should have the same time-averaged kinetic energy, even if they have different masses. In the plot below, the balls are sorted into a series of bins according to their masses, and their average kinetic energy is plotted.

Note that by default all of the balls have the same mass, and so a spread across the mass bins will only be seen if the balls are set to have differing masses.

### The role of collisions

A final option above allows collisions between balls in the tank to be enabled (default) or disabled. If collisions are disabled, the balls pass straight through one another without interacting.

Disabling collisions means that the balls have no means of exchanging energy or momentum. Looking at the graphs above, you will see that they no longer follow the Boltzmann and Maxwell–Boltzmann distributions.

The simulation starts with the balls spread randomly through the box: a piece of computer code assigns them random positions and speeds. These random initial conditions do not match the Boltzmann distribution. But statistical mechanics tells us that it doesn't matter how the balls start out – random exchanges of energy when they collide will quickly distribute energy between them according to the Boltzmann distribution.

However, if there are no collisions between the balls, there is no mechanism by which energy can be exchanged and shared between them. Instead, they retain the initial energy distribution with which they started the simulation.

In statistical mechanics, a system in which energy has been evenly shared out
amongst its components is said to be *relaxed*. The *relaxation
timescale* is how long it takes for a system to smooth out any temperature
imbalances within it. The only mechanism by which relaxation occurs in our
tank of balls is through collisions between the balls. Without any collisions,
the relaxation timescale becomes infinite, and thermodynamic equilibrium is
never reached.

### Entropy

Entropy is a measure of how ordered or disordered a state is. A state in which energy is evenly spread between all available degrees of freedom is disordered, while a state in which some degrees of freedom have much more energy than others is more ordered, because energy appears to have been preferentially piled into some degrees of freedom (or balls) rather than others.

An alternative statement, therefore, is that the balls are started in a low entropy (unusually ordered) state by the computer, but rapidly evolve towards a higher entropy (more disordered) state. Without collisions, however, the entropy of the simulation remains low.