Physics

Chapter 15: 7-12

Web Lecture

Thermodynamics on the Large Scale: Entropy

Introduction

The entropy of a system is defined this way:

  1. If heat ΔQ is added reversibly to a system at temperature T, the increase in entropy of the system is ΔS=ΔQ/T.
  2. At T=0, S=0 (third law).

In a reversible change, the total entropy of all parts of the system (including reservoirs) does not change.
In an irreversible change, the total entropy of the system always increases.

— The Feynman Lectures on Physics, Vol. I, Chapter 44

Outline

Thermodynamics in the Universe "System"

What are the implications of thermodynamics for the universe as a whole? Energy is used to move things around, to change the relationships between objects. For any set of real, physical objects in a particular volume, there are a finite number of possible arrangements. The number of "random" arrangements generally far outweighs the number of "ordered" arrangements, so a random change will most likely produce a random arrangement: order tends to decrease.

A Definition of Entropy

Entropy is a measure of the disorder of a system. Every system or group of objects has various states of position and motion that to our human eyes represent order or disorder. Usually, in a given system, there are many fewer states that we identify as "ordered" than there are disordered states. A random change in the position or motion of any one element of the system will normally result in a state of equal or greater disorder than the previous state. It is possible, though unlikely, for a sequence of random changes to result in an ordered state.

Consider the mathematical toy phenomenon of the 1970s: Rubik's cube. In a standard 3x3 cube, there are 9 colored faces on each side, or 54 faces total on all six sides. Even taking into account that some faces are tied to others (each of the eight corner pieces carries three colors), there are still over 4.3 × 10¹⁹ possible positions. Only one of these has all the faces in the right place and correctly oriented (if there is printing on each of the center faces). The chance that a random sequence of moves will produce this particular arrangement is vanishingly small. But the distance to "order" is actually quite short: it has been proved by computer search that every position can be solved in 20 or fewer face turns. The number of positions only one or two turns away from the solution is non-trivial, so a single move in a "disordered" state could produce a completely ordered state, but it is much more likely to produce a state that is "more" disordered, that is, one that requires more twists to return to the solved state.
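The count of about 4.3 × 10¹⁹ positions can be checked with the standard combinatorial argument for the cube's reachable states; the short sketch below works it out (variable names are our own, not from the lecture):

```python
from math import factorial

# Reachable states of a standard 3x3x3 Rubik's cube:
#   8!  ways to permute the corner pieces
#   3^7 corner orientations (the last corner's twist is fixed by the others)
#   12!/2 edge permutations (edge parity is tied to corner parity)
#   2^11 edge orientations (the last edge's flip is fixed by the others)
corner_perms = factorial(8)        # 40320
corner_orients = 3 ** 7            # 2187
edge_perms = factorial(12) // 2    # 239500800
edge_orients = 2 ** 11             # 2048

states = corner_perms * corner_orients * edge_perms * edge_orients
print(states)   # 43252003274489856000, i.e. about 4.3e19
```

One solved state out of roughly 4.3 × 10¹⁹ is what makes a random walk back to "order" so improbable.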

In some physical situations, like the Rubik's cube or a deck of cards, the amount of disorder is easily discerned. In others, it must be carefully calculated, and several statistical methods (based on probability theory) have been used to calculate the increase in entropy.

In general, a change in entropy is ΔS = ΔQ/T: that is, the change in entropy is directly related to the heat flowing into or out of a system, and inversely related to the temperature at which the flow occurs.
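As a quick numeric illustration of ΔS = ΔQ/T, consider melting ice, where heat flows in at an essentially constant temperature. This is only a sketch; the mass is arbitrary and the latent heat of fusion (about 3.34 × 10⁵ J/kg) is an assumed textbook value:

```python
# Entropy change for a reversible heat flow at constant temperature: dS = dQ/T.
# Example: melting 0.5 kg of ice at 0 degrees C (assumed L_f = 3.34e5 J/kg).
L_fusion = 3.34e5    # J/kg, latent heat of fusion of water (assumed value)
m = 0.5              # kg of ice
T = 273.15           # K, melting point of ice
Q = m * L_fusion     # heat absorbed by the ice while melting
dS = Q / T
print(f"dS = {dS:.0f} J/K")   # about +611 J/K
```

Heat flows in, so the entropy of the ice/water increases.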

Practice with the Concepts

What is the change in entropy as 1.0 kg of steam at 100°C condenses to water?

Remember that Q will depend on the latent heat of vaporization of water for this phase change!
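One possible worked solution, assuming the usual textbook value for the latent heat of vaporization (about 2.26 × 10⁶ J/kg):

```python
# Condensation releases heat, so Q is negative for the steam/water system
# and its entropy decreases.
L_vap = 2.26e6   # J/kg, latent heat of vaporization of water (assumed value)
m = 1.0          # kg of steam
T = 373.15       # K (100 degrees C)
Q = -m * L_vap   # heat leaves the system during condensation
dS = Q / T
print(f"dS = {dS:.0f} J/K")   # about -6057 J/K
```

The entropy of the surroundings that absorb this heat increases by at least as much, so the total entropy does not decrease.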

One of the most difficult things to keep in mind about entropy is that we have entered a situation where statistics tells us the most likely outcome. This is not a "law" in the sense that F = Gm₁m₂/r² is a law. Gravity invariably works, so far as we can tell, the same way throughout the universe, with the same effects on any mass of whatever composition. We can predict fairly precisely what the effects of one mass will be on the motion of another nearby mass. With entropy, we can only estimate the likely outcome of a given action. Unlikely outcomes are possible, and do occur. They are also difficult to examine, since they are unique, unrepeatable phenomena.

Discussion Points