
# What is entropy?

Entropy is a measure of the number of microstates available to a system. There have been two widely used definitions of entropy, suggested by Ludwig Boltzmann and J. Willard Gibbs. We'll just look at the one given by Boltzmann, since it's a little more straightforward to understand. Boltzmann's definition of entropy is:

S = kB ln(W)

In this equation, kB is Boltzmann's constant, and W is the number of microstates accessible to the system.

To get an idea of how entropy works, consider the example of rolling one or more six-sided dice. On the first roll, there are six possible outcomes, so the entropy associated with rolling one die is kB ln(6). If we roll two dice, there are 6² = 36 possible outcomes, and the associated entropy is kB ln(36). For three, there are 6³ = 216 possible outcomes, and the associated entropy is kB ln(216). As you can see, the number of outcomes for statistically independent events grows very rapidly (exponentially) with the size of our system, which is also true for molecules. By taking the logarithm of the number of outcomes, we make the entropy scale linearly with system size: while the number of possible configurations grows exponentially, the entropy grows linearly, so doubling the system size doubles the entropy. This property places entropy in a category of variables known as extensive variables, which just means that they scale with the size of a system in this simple way.
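The dice calculation above can be sketched in a few lines of code. This is just an illustrative sketch (the function name `dice_entropy` is our own, and the entropy is reported in units of kB so no physical constant is needed):

```python
import math

def dice_entropy(n_dice):
    """Entropy of rolling n_dice fair six-sided dice, in units of kB.

    Each die has 6 outcomes, so n_dice independent dice have
    6**n_dice microstates; S/kB = ln(6**n_dice) = n_dice * ln(6).
    """
    microstates = 6 ** n_dice
    return math.log(microstates)

# The microstate count grows exponentially (6, 36, 1296, ...),
# but the entropy grows linearly: doubling the dice doubles S.
s_one = dice_entropy(1)   # ln(6)
s_two = dice_entropy(2)   # ln(36) = 2 * ln(6)
s_four = dice_entropy(4)  # ln(1296) = 4 * ln(6)
```

Because the logarithm turns products of outcome counts into sums, the entropy of independent subsystems simply adds, which is exactly the extensive behavior described above.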


This is a web preview of "The Handy Chemistry Answer Book" app.