Atmosphere and entropy
I recently learned an abstract mathematical theorem, and stumbled across a remarkably direct physical measurement of it. I’ll give some background before introducing the theorem, then I’ll show its direct measurement with physical data.
This theorem has to do with entropy, a concept shrouded in mystery. There are several types of entropy and, when Claude Shannon was naming his measure in 1948, John von Neumann suggested the name “entropy” because
In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.
Entropy fundamentally measures the uncertainty in a system. In information theory it formalizes the “randomness” of a random variable. If a random variable is uncertain, it has entropy. If a random variable is deterministic, or takes only one value, it has 0 entropy.
Entropy is fundamentally related to the flow of information, which is studied in information theory. If a message can take only one state, no information can be transmitted; how would communication happen if everything is static? But if it can take two states, it’s possible to communicate one bit of information (i.e., an answer to “is the light on?”).
We care about maximizing entropy because we want to receive information quickly. If a message can take 4 states instead of 2, it’s possible to transmit twice as much information.
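To make this concrete, here’s a small sketch (in Python, with hypothetical probabilities) computing Shannon entropy in bits: a 2-state uniform message carries one bit, a 4-state one carries two, and a deterministic message carries none.

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

h1 = entropy_bits([0.5, 0.5])   # one bit: "is the light on?"
h2 = entropy_bits([0.25] * 4)   # two bits: four equally likely states
h0 = entropy_bits([1.0])        # zero: a message that never varies
```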
A typical statement about entropy maximization looks like:
For a positive random variable $X$ with fixed mean $\mu$, the entropy of $X$ is maximized when $X$ is an exponential random variable with mean $\mu$.
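As a quick sanity check on this theorem, we can compare the closed-form differential entropies (in nats) of two positive distributions with the same mean. The formulas below are standard, and $\mu = 8$ is an arbitrary choice:

```python
import math

mu = 8.0  # an arbitrary fixed mean

# Exponential with mean mu: differential entropy is 1 + ln(mu).
h_exponential = 1 + math.log(mu)

# Uniform on (0, 2*mu), which also has mean mu: entropy is ln(2*mu).
h_uniform = math.log(2 * mu)

# The exponential has strictly more entropy, as the theorem predicts:
# 1 + ln(mu) > ln(2) + ln(mu), since 1 > ln(2).
```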
It doesn’t seem like there’d be a direct physical example that supports this theorem. But, there is, and it has to do with air pressure as a function of height.
Let’s take a column of the Earth’s atmosphere, and ignore any weather or temperature effects. How does the air pressure in this column vary with height? The air pressure at sea level is very different from the air pressure where the ISS orbits.
An air particle’s height is the random variable we’ll study. Height is a positive variable, and a column of air will have a mean height $\mu$. We can apply the statement above if we can assume air particles maximize entropy. When this example was presented in lecture^{1}, I somewhat incredulously asked if this could be applied to Earth’s air pressure.
Air particles maximizing entropy is a fair assumption. An air particle’s position is uniformly random when it’s contained in a small region. Given the well-known fact that uniform random variables have maximum entropy, this seems like a safe assumption.
So by the statement above, we’d expect an air molecule’s height to follow an exponential distribution. Pressure is a proxy for how many air particles are present, so we’d expect pressure as a function of height $h$ to look like the barometric formula:

$$P(h) = c_1 e^{-c_2 h}$$

where $c_1$ and $c_2$ are constants that hide weather/temperature effects.
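Here’s a minimal numeric sketch of the barometric formula, assuming round textbook values (sea-level pressure of 101,325 Pa and a scale height of 8.4 km, so $c_1 = P_0$ and $c_2 = 1/H$):

```python
import math

P0 = 101_325.0  # assumed sea-level pressure, Pa (standard atmosphere)
H = 8.4e3       # assumed scale height, meters

def pressure(h):
    """Barometric formula P(h) = c1 * exp(-c2 * h) with c1 = P0, c2 = 1/H."""
    return P0 * math.exp(-h / H)

# Pressure falls by a factor of e with every scale height of altitude.
sea_level = pressure(0.0)
one_scale_height = pressure(H)
```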
NOAA collects the data required to test this theory. They launch weather balloons through the NOAA IGRA program. These weather balloons collect atmospheric data at different altitudes, and these data are available online^{2}. They record monthly averages of pressure at different heights at each station from 1909 to 2017^{3}. These data come from at least 600,000 weather balloon launches across a total of 1,532 stations.
We use these data to visualize pressure at different heights. We know that this curve is characterized by $\mu$, the average height of an air molecule. I’ve calculated this value from these data and have plotted the expected pressure at any height, given by $P(h) = \frac{1}{\mu} \exp\left(-\frac{h}{\mu}\right)$.
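Here’s a sketch of that calculation with hypothetical (height, pressure) pairs standing in for the NOAA records. Since pressure is a proxy for particle count, $\mu$ is the pressure-weighted mean height:

```python
import math

# Hypothetical balloon readings: height (km) and pressure (hPa).
# The real IGRA data are far denser; these are synthetic stand-ins.
heights = [0, 2, 4, 6, 8, 10, 12]
pressures = [1013 * math.exp(-h / 8.0) for h in heights]

# Pressure-weighted mean height: the average height of an air particle.
mu = sum(p * h for p, h in zip(pressures, heights)) / sum(pressures)

def expected_pressure(h):
    """Max-entropy density (1/mu) * exp(-h/mu), up to scaling."""
    return math.exp(-h / mu) / mu
```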
I show the measured pressures at different heights using the 314 million NOAA data points, alongside an appropriately scaled version of $P(h)$ given the average air particle height $\mu$.
The theorem as stated earlier holds true for an air particle’s height: air pressure^{4} at different heights follows an exponential distribution.
The plot for this post was generated at stsievert/airpressureheight

1. In ECE 729: Information Theory, taught by Varun Jog. ↩
2. Summarized at igra2monthlyformat.txt and available in monthlypor as ghgt_*.txt.zip. ↩
3. Before information theory entropy was even defined! ↩
4. A proxy for the number of air particles. ↩