# Thermodynamics

Thermodynamics looks for statistical patterns in groups of particles. A group of particles is called a system. Systems have properties that single particles don't, like temperature, pressure, volume, density, and entropy.

Unlike other physics topics we've learned, thermodynamics is statistical. Its predictions are only probable, never 100% certain. Yet the accuracy increases with the number of particles, and most systems contain a very large number of particles.

Example: Estimate the number of gas particles in a room (10 m × 10 m × 4 m).
strategy

Calculate the volume in cubic meters (width × height × depth).
Convert the cubic meters into moles with a conversion fraction.
Convert the moles into particles with another conversion fraction.

mol conversions

1 mol = 0.0224 m³ of gas at STP
1 mol = 6.02 × 10²³ particles
0.0224 m³ of gas at STP = 6.02 × 10²³ particles

solution $$\text{volume} = 10\,\mathrm{m} \times 10\,\mathrm{m} \times 4\,\mathrm{m}= 400 \, \mathrm{m^3}$$ $$400 \, \mathrm{m^3} \left( \frac{1 \, \mathrm{mol}}{0.0224 \, \mathrm{m^3}} \right) = 18\,000 \, \mathrm{mol}$$ $$18\,000 \, \mathrm{mol} \left( \frac{ 6.02 \times 10^{23}}{1 \, \mathrm{mol}} \right) = 1.08 \times 10^{28}$$ $$\small 10\,000\,000\,000\,000\,000\,000\,000\,000\,000\,\,\, \normalsize\text{gas particles}$$
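The conversion chain above is a two-step unit conversion, easy to check in a few lines. This minimal sketch assumes the room is at STP, where one mole of gas occupies 0.0224 m³:

```python
AVOGADRO = 6.02e23          # particles per mole
MOLAR_VOLUME_STP = 0.0224   # m^3 per mole of gas at STP

volume = 10 * 10 * 4                  # room volume [m^3]
moles = volume / MOLAR_VOLUME_STP     # ~ 18 000 mol
particles = moles * AVOGADRO          # ~ 1.08e28 particles

print(f"{moles:.0f} mol")
print(f"{particles:.2e} particles")
```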

Temperature is a statistical property of a system of particles. We can measure temperature by looking at the expansion of fluids. At high temperatures fluids take up more space. This increases the level of the alcohol in a thermometer.

Temperature is often defined as proportional to the average kinetic energy of a system of particles. The kinetic energies that contribute to temperature can be stored in a particle's spin, vibrations, and motion.

$$\text{temperature} \propto \text{average kinetic energy per particle}$$
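For a monatomic ideal gas this proportionality becomes exact: the average translational kinetic energy per particle equals $$\tfrac{3}{2}k_BT$$. A minimal sketch, using a made-up sample of helium-atom speeds:

```python
K_B = 1.38e-23   # Boltzmann constant [J/K]

# For a monatomic ideal gas, <KE> = (3/2) k_B T, so T = (2/3) <KE> / k_B.
def temperature_from_kinetic_energies(kinetic_energies):
    avg_ke = sum(kinetic_energies) / len(kinetic_energies)
    return (2 / 3) * avg_ke / K_B

# Helium atoms (mass 6.64e-27 kg) with speeds near 1350 m/s
# (hypothetical sample) correspond to roughly room temperature.
m = 6.64e-27
speeds = [1200, 1300, 1350, 1400, 1500]          # m/s
kes = [0.5 * m * v**2 for v in speeds]           # kinetic energies [J]
print(f"{temperature_from_kinetic_energies(kes):.0f} K")   # ~ 294 K
```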

# Heat

Heat is energy transferred from one system to another, often in the form of thermal energy. A system doesn't "have" heat; it has internal energy, which is called heat only while it transfers between systems.

When two systems with different temperatures are put in contact they will exchange heat until they reach the same temperature. Heat flows from high temperature systems into low temperature systems.

An example of heat exiting a system is friction. If you throw a foam ball, heat will leave the ball and enter the air. The ball's kinetic energy converts into the thermal energy of the air.

The concept of heat is important in understanding machines, like engines and refrigerators.

Question: If a room is left at the same temperature for a while, all the materials in the room should reach the same temperature. Yet if you touch metal it feels much colder than wood. Why?

Metal is a conductor of both electricity and heat. This means that if you are hotter than the metal, heat will flow out of your body into the metal at a very fast rate.

Wood is a poor conductor of heat so it doesn't pull as much heat from your body.

As the ball in the simulation falls, its energy transfers from gravitational potential to kinetic. As it rises, its energy transfers from kinetic back to gravitational potential.

Energy is conserved. This means that if we add up the total energy of a system it will always be the same value, unless some energy enters or leaves the system we are tracking.

Turn friction on and observe what happens.

The kinetic energy gradually leaves the ball. We describe energy that leaves a system as heat. Heat can leave in many forms, often heat leaves a system as thermal energy. Thermal energy is the kinetic energy of many particles at a microscopic scale.

The internal energy of a system is always conserved, unless energy leaves or enters. Energy can leave or enter as work or heat.

The first law of thermodynamics says that the change in internal energy of a system is equal to the heat flowing into the system minus the work done by the system.

# $$\Delta E = Q - W$$

$$E$$ = internal energy of the system [J]
$$Q$$ = heat: energy added (+) to or removed (−) from the system [J]
$$W$$ = work done by the system [J]
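A quick numeric check of the first law; the 500 J and 200 J figures here are made-up values for illustration:

```python
# First law of thermodynamics: delta_E = Q - W
# Q: heat added to the system [J] (negative if removed)
# W: work done by the system [J] (negative if done on the system)
def internal_energy_change(heat_in, work_by_system):
    return heat_in - work_by_system

# A gas absorbs 500 J of heat while doing 200 J of work on a piston
# (hypothetical numbers), so its internal energy rises by 300 J.
print(internal_energy_change(500, 200))   # 300
```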

We can apply the concept of heat to our conservation of energy equations. If heat is leaving our system, we put Q on the right to track the loss.

$$K_i + U_i = K_f + U_f + Q_f$$ Example: A 0.43 kg soccer ball, kicked at 10 m/s, rolls down a 30 m tall hill. If there was no energy loss, the ball would have a final velocity of 26.2 m/s.

When actually performing the experiment we found the ball only has a speed of 20.0 m/s at the bottom. Calculate the heat that left the ball as it rolled down the hill.
solution $$K_i + U_i = K_f + Q_f$$ $$\tfrac{1}{2}mu^2 + mgh_i = \tfrac{1}{2}mv^2 + Q$$ $$\tfrac{1}{2}(0.43)(10.0)^2 + (0.43)(9.8)(30) = \tfrac{1}{2}(0.43)(20)^2 + Q$$ $$61.9 \,\mathrm{J} = Q$$
Example: A 2000 kg car moving at 30 m/s skids to a stop on level ground. How much heat left the car's system?
solution $$K_i + U_i = K_f + U_f + Q_f$$ $$K = Q$$ $$\tfrac{1}{2}mv^2 = Q$$ $$\tfrac{1}{2}(2000)(30)^2 = Q$$ $$900\,000 \, \mathrm{J} = Q$$

Example: A 1000 kg cart is at rest at point A. The cart loses 57000 J of energy as it rolls to point C. How fast is the cart moving at point C?
(Each grid square is 10 m × 10 m.)
solution $$E_{\mathrm{point \, A}} = E_{\mathrm{point \, C}}$$ $$U_g = K + U_g + Q$$ $$mgh_A = \tfrac{1}{2}mv^2 + mgh_C + Q$$ $$(1000)(9.8)(100) = \tfrac{1}{2}(1000)v^2 + (1000)(9.8)(47) + 57\,000$$ $$980\,000 = 500v^2 + 460\,600 + 57\,000$$ $$462\,400 = 500v^2$$ $$30.4\, \mathrm{\tfrac{m}{s}} = v$$
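The worked examples above all rearrange the same bookkeeping equation, $$K_i + U_i = K_f + U_f + Q$$. A small sketch that reproduces the soccer ball and skidding car answers:

```python
G = 9.8  # gravitational acceleration [m/s^2]

# Heat that leaves a system, from K_i + U_i = K_f + U_f + Q:
# Q = (K_i + U_i) - (K_f + U_f)
def heat_lost(m, v_i, v_f, h_i=0.0, h_f=0.0):
    initial = 0.5 * m * v_i**2 + m * G * h_i
    final = 0.5 * m * v_f**2 + m * G * h_f
    return initial - final

print(heat_lost(0.43, 10.0, 20.0, h_i=30))   # soccer ball: ~ 61.9 J
print(heat_lost(2000, 30.0, 0.0))            # skidding car: 900 000 J
```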

# Entropy

Entropy is hard to define because different fields of study use the term to describe slightly different ideas. The definitions of entropy are all loosely centered around disorder. A high entropy system of particles is not organized. A low entropy system is orderly, but how do you measure order?

A common definition of order is a concentration of something measurable, like energy, temperature, or pressure. This means a concentration of energy has lower entropy than an evenly spread out distribution.

We can also define entropy in terms of information: the amount of computer memory needed to record a system's state. A high entropy system requires more memory because its state contains more information. Information that takes more memory to store is less concentrated, which loosely fits the other definitions of entropy.

Entropy plays a role in how engines convert concentrations of energy into work. The simulation below shows a simple engine that is designed to turn a rotor. What state of the system would make the rotor turn?

time rate

Simulation: First, click to remove the particles. Next, click inside the simulation a few times to add some particles. Try to place the particles in a state that will turn the rotor.
results

The rotor turns when you release a high concentration of particles on one side. Concentrated particles are a lower entropy state. As the engine does the work of rotating, particles spread out, and entropy increases.

Once the particles are evenly spread out, the rotor will stop turning. If your system gets in this high entropy state, adjust the slider on the right to power the rotor from an external power source. The powered rotor lowers entropy by reconcentrating the particles.

rotor external power

We can view a car engine as a thermodynamic system. An engine takes in concentrated, low entropy energy in the form of gasoline. Combustion produces a difference in energy concentration. That difference is then transformed into rotational energy, which makes the car move.

Energy is only useful when it is in a lower entropy state than its surroundings.

Question: You can think about a human body as a thermodynamic system. As we do work, the entropy in our body increases. What keeps our entropy low?

We add low entropy substances to our system, like food, oxygen, and water.

We also lower our entropy by removing high entropy substances, like carbon dioxide, urine, and feces.

Question: The planet Earth and all its life can be viewed as a thermodynamic system. As time goes forward the energy density of Earth spreads out and loses the ability to do work. What keeps the entropy of the Earth low?

Light from the Sun is added to our system. Sunlight lowers our entropy on the global scale by creating differences in temperature. On the atomic scale photons of sunlight reduce entropy through photosynthesis.

# The Arrow of Time

There is no arrow of space. Space is generally uniform in every direction. Yet, time does seem to have an arrow. The past is clearly different from the future. So we say there is an arrow of time. Time has a direction.

If you watched a video of a glass of water falling off a table, you would know if the video was in reverse. It would be strange to see broken glass and spilled water spontaneously come together and jump up onto a table.

Yet, sometimes the direction of time isn't clear. A video of the Earth orbiting the Sun looks similar forwards and backwards. It has no clear arrow of time.

Question: Imagine each of these situations happening in reverse. Which ones would look obviously backwards?
• a dog eating food
• a falling rock
• a dog sneezing
• a pendulum oscillating
• a dog running
• a tennis ball at rest
• light reflecting off a mirror
• a mirror shattering
• a stick resting on the ground
• a living cell dividing
• the Earth orbiting the Sun
• a dog barking

Situations with a clear arrow of time are in **bold**.

• **a dog eating food**
• a rock falling
• **a dog sneezing**
• a pendulum oscillating
• **a dog running**
• a tennis ball at rest
• light reflecting off a mirror
• **a mirror shattering**
• a stick resting on the ground
• **a living cell dividing**
• the Earth orbiting the Sun
• **a dog barking**

Question: What do the situations with a clear arrow of time have in common?

The situations with a clear arrow of time are complex with many moving parts. These might include systems with friction, or with living creatures.

The systems with no clear arrow of time are all simple with few moving parts.

The laws of physics are time reversible. Nature behaves the same moving forwards or backwards in time. The past doesn't produce the future. There is no cause and effect, just patterns.

At least, this is true for simple systems. For systems with many particles there can be a clear direction of time. This is because as complex systems change they tend to get more chaotic. The entropy of a complex system increases over time. This is true for all the various definitions of entropy.

So, why doesn't entropy have to increase for simple systems? This is because simple systems don't have enough possible states to progress in a direction. It is easy to clean up 3 misplaced socks, but it is harder to clean up 10 000 000 grains of sand.

The second law of thermodynamics states that the entropy of an isolated system doesn't decrease over time.

As time moves forward, a system will either stay the same or get messier but never more orderly.

Why is the second law of thermodynamics true? Because there are many ways for the energy of a system to be evenly distributed, but few ways to have energy concentrated. As a system randomly progresses forwards in time, the probability of evolving into a concentrated low entropy state is low.

The entropy of a complex system always increases. Our universe began in its lowest entropy state, and the far future will be its highest.

Diffusion is a good example of the second law of thermodynamics. The simulation below will become more evenly mixed as time progresses forward. It would be extremely unlikely to see the reverse where the system separates as time progresses forward. This means there is a clear difference between directions in time.

The simulation below has 150 particles on each side. As time progresses forward, the particles quickly mix together. Yet, the odds of the 300 particles unmixing is so unlikely that it would take longer than the age of the universe.

time rate
possible states
Simulation: Reset the simulation, and click to add 10 particles on each side. Guess how long we would have to wait for the colors to separate, with the red and blue particles on opposite sides.
Investigation

Let's say that each particle can be in only 2 states: left or right.
We can count the total number ways to put particles in one of 2 states.

| number of particles | total possible states |
| --- | --- |
| $$0$$ | $$1$$ |
| $$1$$ | $$2$$ |
| $$2$$ | $$2 \times 2 = 4$$ |
| $$3$$ | $$2 \times 2 \times 2 = 8$$ |
| $$4$$ | $$2 \times 2 \times 2 \times 2 = 16$$ |
| $$5$$ | $$2 \times 2 \times 2 \times 2 \times 2 = 32$$ |
| $$\text{\color{red}n}$$ | $$2^{\color{red}n}$$ |
| $$20$$ | $$2^{20} = 1\,048\,576$$ |

Trying random configurations would take on average half the number of total states to reach our unique unmixed state.

$$\frac{2^{n}}{2}$$

I wrote a program that counts how often a particle in the simulation switches states at the fastest time rate. For ten particles on each side there are about 18.75 switches per second, or one switch every 1/18.75 seconds.

$$T_{avg} \approx \left(\frac{2^{20}}{2}\right) \left(\frac{1}{18.75}\right) \mathrm{s}$$ $$T_{avg} \approx 27\,960 \, \mathrm{s} = 7.767 \, \mathrm{hr}$$
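The estimate above is a one-line formula; the 18.75 switches-per-second rate is the measured value quoted in the text:

```python
# Average wait for n two-state particles to land in the one
# "separated" configuration: half the total states, divided by
# the rate at which new random states are tried.
def average_wait_seconds(n, switches_per_second):
    total_states = 2 ** n
    return (total_states / 2) / switches_per_second

# 20 particles (10 per side) at the measured 18.75 switches/second:
t = average_wait_seconds(20, 18.75)
print(f"{t:.0f} s = {t / 3600:.3f} hr")   # ~ 27 962 s ~ 7.767 hr
```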

The graphs below assume that each particle switches states once a second. The time to separate increases quickly as the number of particles goes up.

For up to about n = 14 particles, the estimated time to separate takes seconds. Waiting for n = 35 particles to spontaneously separate takes years.

How long would it take n = 300 particles to randomly separate?
How long would it take all the air molecules in a typical room?
(n = 10 000 000 000 000 000 000 000 000 000 air molecules)
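We can put rough numbers on both questions, under the same one-switch-per-second assumption the graphs use. Python's arbitrary-precision integers handle $$2^{300}$$ directly; for $$n = 10^{28}$$ even a float overflows, so we compare logarithms instead:

```python
import math

SECONDS_PER_YEAR = 3.156e7        # ~ 365.25 days
AGE_OF_UNIVERSE_YEARS = 1.38e10   # rough accepted value

# n = 300 particles, one switch per second (assumption from the graphs):
wait_s = 2 ** 300 / 2
print(f"{wait_s / SECONDS_PER_YEAR:.1e} years")   # ~ 3.2e82 years

# n = 1e28 (air in a room): 2**n is far too large even for a float,
# so work with the base-10 logarithm of the wait time in seconds.
n = 1e28
log10_wait = n * math.log10(2)    # exponent of the wait time ~ 3.0e27
print(f"about 10^({log10_wait:.2e}) seconds")
```

Either way, the wait dwarfs the ~1.4 × 10¹⁰ year age of the universe.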

Thinking about time is exciting, but confusing. Let's review.

• Time doesn't have a fundamental direction, just like space doesn't.
• We observe time's direction because our universe started out in an orderly state and it is progressing towards a disorderly state.
• The trend towards disorder occurs because there are many ways to be disordered and few ways to be ordered.
• The progress towards disorder is only a statistical probability.