Forces of Nature Weblecture
After examining the crystallization experiments from the previous lecture, Faraday starts on new demonstrations of chemical forces.
Engines change heat energy (motion of individual molecules) into mechanical energy (coherent motion to move an object where we want it to go). James Watt's invention of an efficient steam engine in the 18th century revolutionized life in Europe. It made it possible to set up a factory anywhere one could ship coal to heat the engine, rather than tying the location of a mill to the fitful actions of the local stream. It also stimulated the study of work efficiency: what happens to energy when it is converted from one form to another? Sadi Carnot's work in this area led to the realization that the efficiency of a heat engine depended on how much heat energy was converted to work. The heat exhausted by the engine had a lower temperature because some of the kinetic energy of the original material (steam) was converted to energy that did work (turning a shaft to drive a motor or a spinning jenny). The ratio of heat exhausted to a location (the sink) to the initial amount of heat from the source had the same value as the ratio of the temperature of the sink to the temperature of the source, which means that the heat energy ratios could be determined from the easily measured temperatures at the sink and source points.
Carnot's work led to a further recognition. The efficiency of the engine, that is, its ability to convert heat energy to mechanical energy, was limited to a value given by the formula:
Efficiency_max = (T_source - T_sink) / T_source
Notice that efficiency = 1 (or 100%) only if T_sink = absolute zero. We've already seen that this condition is hard to reach: some energy will always wind up as heat energy that doesn't do any work but only increases random motion.
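To get a feel for the formula, here is a short Python sketch (the function name and the 373 K / 293 K example temperatures are illustrative, not from the lecture):

```python
def carnot_efficiency(t_source_k: float, t_sink_k: float) -> float:
    """Maximum possible efficiency of a heat engine.

    Temperatures must be absolute (kelvin); the result is a
    fraction between 0 and 1.
    """
    if t_sink_k <= 0 or t_source_k <= t_sink_k:
        raise ValueError("require 0 < T_sink < T_source (in kelvin)")
    return (t_source_k - t_sink_k) / t_source_k

# A boiling-water source (373 K) exhausting to room temperature (293 K):
print(carnot_efficiency(373.0, 293.0))  # about 0.214, i.e. at best ~21%
```

Even with an ideal engine, most of the heat from a low-temperature source like steam is unavoidably exhausted to the sink rather than converted to work.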
The study of this conversion of useful energy to random motion led to the development of a theory of "entropy". Entropy is a measure of the disorder in a system. We sometimes run into the idea that energy is "lost" to entropy, which may seem like a contradiction of the rule that energy is always conserved, so it behooves us to be careful in our terminology. Within an isolated system, where neither energy nor matter can cross the boundary of the system, the total amount of energy does not change (unless there is a matter-energy conversion). It may become useless for normal purposes and increase entropy, but the total amount of energy remains constant.
Consider a situation where some energy is released by a process, such as a chemical reaction. The state of matter at the end of the conversion has less energy than the state of matter had prior to the reaction. This released energy is the enthalpy change of the system, often symbolized by ΔH. Not all of this energy is available to do work, because some of it will dissipate as heat energy, sound, or other random-motion energy. The energy available to do work, the Gibbs free energy ΔG, is the difference between the energy released in the reaction and the energy dissipated to increase the entropy ΔS, scaled by the absolute temperature T:
ΔG = ΔH - TΔS
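The relationship is simple enough to sketch in Python. The numeric values in the example are commonly tabulated figures for the formation of liquid water at 298 K, quoted from memory as an illustration rather than taken from the lecture:

```python
def gibbs_free_energy(delta_h: float, temperature_k: float, delta_s: float) -> float:
    """ΔG = ΔH - TΔS: the energy available to do useful work.

    delta_h in kJ/mol, temperature in kelvin, delta_s in kJ/(mol*K).
    A negative ΔG means the reaction can proceed and do work.
    """
    return delta_h - temperature_k * delta_s

# Forming liquid water from hydrogen and oxygen (illustrative textbook values):
# ΔH ≈ -285.8 kJ/mol, ΔS ≈ -0.1633 kJ/(mol·K), T = 298 K
print(gibbs_free_energy(-285.8, 298.0, -0.1633))  # about -237 kJ/mol
```

Note that because ΔS is negative here (the product is more ordered than the gases), the free energy available is smaller in magnitude than the enthalpy released: the TΔS term is the "entropy tax" on the reaction.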
Over time, chemists and physicists generalized their experience with energy exchange into the "laws" of thermodynamics. It is important to realize that these laws are statistical in nature, rather than absolute. Unlike the law of gravity, which is assumed to hold in all places at all times, the laws of thermodynamics can only predict probabilities. Instead of "in every case, the objects in a system will behave this way", which is what gravity laws tell us, with thermodynamics we can only say that "under these conditions, the objects in a system are this likely to behave in a particular way". There is a non-zero chance, however small, that the objects won't behave the way we expect.
To get a sense for how this works, consider what happens if I take a new deck of cards out of its box, drop the cards on the floor, and pick them up at random. The deck started in a given order: Ace to King, Spades-Hearts-Diamonds-Clubs. There are 13 cards in a suit, 4 suits in the deck, 52 cards in all: 52! (52 factorial) combinations in all, or about 8.07 × 10^67 ways to arrange the cards. The chance that I will pick up the deck in the original order is 1 in 52! -- a very small but still non-zero chance for the event to occur.
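Python can compute this number exactly, which makes the "small but non-zero" point concrete:

```python
import math

# Number of distinct orderings of a 52-card deck.
arrangements = math.factorial(52)
print(arrangements)      # an exact 68-digit integer, about 8.07 × 10^67

# Probability of picking the deck up in its original factory order.
print(1 / arrangements)  # roughly 1.24 × 10^-68: tiny, but not zero
```

This is the statistical heart of the second law: disordered arrangements so vastly outnumber ordered ones that we never expect to see the ordered state arise by chance, even though nothing forbids it.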
The first of these laws expresses conservation of energy: the change in a system's internal energy equals the heat added to the system plus the work done on it.
ΔE = q + W
where q is the heat energy added to the system and W is the work done on the system.
So far, we've looked at matter and its response to forces that depend on the mass of objects, one of the fundamental characteristics of matter. We turn now to another fundamental characteristic of matter: charge.
Both mass and charge challenge the limits of what physics can tell us about the universe. We don't know why these characteristics exist, or why the fundamental constants which help us measure the forces that arise have the values they do in our universe. Questions like "why is the gravitational constant 6.67 × 10^-11?" belong to the realm of metaphysics, an area where science crosses into philosophy, and experimental evidence can only tell us how and how much but not why.
Any object with mass responds to gravitational forces exerted by other masses. Any object with charge responds to electrical forces exerted by other objects which also possess a net charge. But unlike mass, not all objects have a net electrical charge, and some materials are good at conducting or moving electrical charges along, while others can block this motion entirely.
In studying the properties of charged matter, we start by looking at charged objects that are not moving: static charges, and investigating the electrical fields they generate. Electrical field generation is more complicated than gravitational field generation. All gravitational forces are attractive (as far as we know), but electric charge produces fields that are either attractive or repulsive depending on the types of charge involved. Positive charges attract negative charges, and vice versa, but positive charges repel other positive charges, and negative charges repel other negative charges. This gives rise to the maxim "opposites attract, similarities repel".
We can trace the fundamental units of charge to components of atoms. The three primary components of every atom are its protons, neutrons (which may be missing in hydrogen atoms), and electrons. Protons have one positive unit of charge while electrons have one negative unit of charge. Neutrons have no net charge although, as we shall see, they appear to be a combination of protons and electrons that can be split during radiation events, such as the radioactivity of uranium.
Electrons are much less massive than protons, and so they move more easily through materials. A flow of electrons, or indeed of any charged particle, which includes polyatomic ions, creates electrical current. We measure current flow as the amount of charge moving past a given point in a particular period of time: amps equal coulombs of charge per second. We can generate flow in a number of ways, but one of the most common in modern technology is by using chemical reactions that drive charged molecules or ions through a fluid. Ions moving this way in a fluid are called electrolytes. Batteries are usually containers with more than one cell in which chemical reactions generate this flow. Metal rods or electrodes submerged in the liquid pick up charges and free electrons, driving them along the wire to create electrical current we can use to power flashlights, car lights, cell phones, and computers.
As we've already seen, moving objects possess kinetic energy, and this is true of electrically charged particles as well. We measure power in watts as the amount of energy delivered by an energy source per unit of time: joules/second. If the energy is delivered by electric current, we can measure this with voltage, or the amount of power delivered per unit of current: volts = watts/amp.
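Rearranging volts = watts/amp gives the familiar power relation P = V × I. A quick Python illustration (the 12 V / 2 A figures are illustrative):

```python
def power_watts(voltage_v: float, current_a: float) -> float:
    """P = V * I: power in watts from voltage in volts and current in amps.

    Follows from the definitions above: volts are watts per amp,
    so multiplying by the current recovers the power delivered.
    """
    return voltage_v * current_a

# A 12 V battery pushing 2 A through a headlight delivers 24 W:
print(power_watts(12.0, 2.0))  # 24.0
```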
Transmitting or delivering current from a battery to a device we want to power requires the use of conductors, materials that can easily transmit flowing electrons with little resistance to the flow. We can measure the resistance as the amount of voltage divided by the current. This relationship is often written as V = IR (voltage in volts equals current in amps multiplied by resistance in ohms). This relationship allows us to create circuits from standard battery output (voltage) by placing different resistors in the circuit to control the current (amps) traveling through our various devices.
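Choosing a resistor to set the current is just Ohm's law rearranged to I = V / R. A short sketch (the 9 V battery and 450-ohm resistor are illustrative values, not from the lecture):

```python
def current_amps(voltage_v: float, resistance_ohms: float) -> float:
    """Ohm's law rearranged: I = V / R.

    Given a fixed battery voltage, a larger resistance yields
    a smaller current through the circuit.
    """
    return voltage_v / resistance_ohms

# A 9 V battery across a 450-ohm resistor draws 20 mA:
print(current_amps(9.0, 450.0))   # 0.02 A

# Halving the resistance doubles the current:
print(current_amps(9.0, 225.0))   # 0.04 A
```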
In the mid-nineteenth century, Michael Faraday discovered that electricity and magnetism have a close relationship: magnetic fields could be generated by moving electrical charges, which themselves give rise to a changing electrical field. Likewise, a changing magnetic field could create a changing electrical field, which would in turn give rise to electrical current in conductive materials within the changing magnetic field. His discoveries led to the inventions of both the electric motor, which converts electrical energy to mechanical energy, and the electric generator, which converts mechanical energy to electrical energy. By combining electric current in wires and magnets, we can create transformers to control the amount of electrical energy available in the circuit.
Finally, we need to consider what happens when electrons flow through loops in a circuit arrangement. We distinguish between series circuits where all the current flows through a single pathway, and parallel circuits where the current divides to flow along different pathways.
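One standard way to quantify this distinction, sketched here as a supplement to the lecture (the resistance values are illustrative): in a series circuit resistances simply add, while in a parallel circuit the reciprocals of the resistances add.

```python
def series_resistance(resistances: list[float]) -> float:
    """Series circuit: the same current flows through every resistor,
    so the total resistance is the sum of the individual resistances."""
    return sum(resistances)

def parallel_resistance(resistances: list[float]) -> float:
    """Parallel circuit: the current divides among the pathways,
    so the reciprocals of the resistances add."""
    return 1.0 / sum(1.0 / r for r in resistances)

print(series_resistance([100.0, 200.0]))    # 300.0 ohms
print(parallel_resistance([100.0, 100.0]))  # 50.0 ohms
```

Note that adding a resistor in series always increases the total resistance, while adding a pathway in parallel always decreases it, because the current has more routes to take.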
By combining different arrangements of circuits, we can send electricity where we need to, in the amount we need, to power all our electrical and electronic devices.
It's time to look at some of the excellent sources on the web for our physics topics.
© 2005 - 2019 This course is offered through Scholars Online, a non-profit organization supporting classical Christian education through online courses. Permission to copy course content (lessons and labs) for personal study is granted to students currently or formerly enrolled in the course through Scholars Online. Reproduction for any other purpose, without the express written consent of the author, is prohibited.