Core Concepts
In this article, we discuss the meaning of entropy and its importance in thermodynamics, both for the universe as a whole and for an individual system.
Topics Covered in Other Articles
- What is Specific Heat?
- What is Thermochemistry?
- Calculating Enthalpy
- Hess’s Law Equation
- Endothermic vs. Exothermic reactions
Definition of Entropy
Entropy is a measure of all the possible configurations (or microstates) of a system. Entropy is commonly described as the amount of disorder in a system. Ordered systems have fewer available configurations, and thus have lower entropy. Importantly, entropy is a state function, like temperature or pressure, as opposed to a path function, like heat or work. This means that as a system changes in entropy, the change only depends on the entropies of the initial and final states, rather than the sequence (“path”) taken between the states.
The letter “S” is used to represent entropy.
As we’ll find out in a later section, entropy is extremely useful to chemists and physicists for determining whether a process is spontaneous.
High Vs. Low Entropy
A system low in entropy involves ordered particles and directed motion. For this, think of a house. The mass that makes up the house is ordered and precisely arranged to form the walls and furniture. Any mechanical energy of motion, such as water and gas moving through pipes, remains tightly controlled and directed. Any heat energy remains controlled as well, with distinct cold pockets, like the refrigerator, and hot pockets, like the oven, whose temperatures don’t spread to the rest of the house.

In chemistry, a solid mass of crystal provides another good example of an entropically low system. The lattice energy of the crystal limits the motion of its particles, resulting in a perfectly ordered, geometric arrangement.
A system high in entropy, by contrast, involves widely dispersed mass and energy. For this, think of a forest. The mass of the trees, plants, rocks, and animals remains random and widely dispersed. Any motion and heat, similarly, are dispersed, resulting in a relatively consistent temperature and unpredictable movements of trees and animals.

In chemistry, a gas provides another good example of an entropically-high system. The relatively low attraction between gas particles allows each molecule to move freely, resulting in a random dispersion.
Mathematical Representations of Entropy
Statistical Definition
The main way to quantify the orderliness of matter and energy involves counting the microstates a given system can have. Chemists define a microstate as one specific arrangement of a system’s matter and energy. Naturally, ordered, entropically lower systems have fewer possible microstates than disordered, entropically higher systems.
This statistical approach involves the following natural log formula relating entropy to microstates:

S = k_B ln(W)

- S = entropy (J/K)
- k_B = Boltzmann’s constant (1.381 × 10⁻²³ J/K)
- W = number of possible microstates

Notice that entropy comes in units of joules per kelvin (J/K).
However, counting individual microstates remains impossible in most systems. Thus, this definition is most useful for calculating a system’s number of microstates from a known entropy value. To measure entropy itself, chemists usually turn to the thermodynamic definition instead.
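As a minimal sketch of how this relationship can be used in both directions, here is a short Python example. The microstate counts are made-up, illustrative numbers, not values for any real chemical system.

```python
import math

k_B = 1.381e-23          # Boltzmann's constant, J/K

def entropy_from_microstates(W):
    """Statistical entropy S = k_B * ln(W), in J/K."""
    return k_B * math.log(W)

def microstates_from_entropy(S):
    """Invert the formula: W = e^(S / k_B)."""
    return math.exp(S / k_B)

# Illustrative numbers only: doubling the number of available microstates
# adds k_B * ln(2) to the entropy, no matter how large W already is.
S1 = entropy_from_microstates(1.0e24)
S2 = entropy_from_microstates(2.0e24)
print(f"S2 - S1 = {S2 - S1:.3e} J/K")   # ~9.57e-24 J/K = k_B * ln(2)
```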
Thermodynamic Definition
Rather than deal with microstates, most chemists measure entropic values using calorimetry instead. Thus, chemists can define entropy thermodynamically, using the flow of heat and the temperature of the system:

dS = δq_rev / T

- dS = small change in entropy (J/K)
- δq_rev = small amount of heat transferred reversibly (J)
- T = temperature (K)
This formula for entropy tends to have the most use when measuring the change between two states:

ΔS = S_final − S_initial = ∫ δq_rev / T
Importantly, the heat used to calculate entropy is the heat that would be given off or absorbed if the change were carried out reversibly. While heat is usually a path function, only one reversible path exists between two states, which makes the reversible heat behave like a state function, just as entropy does. Even when the actual change between the two states is irreversible, we still use the reversible heat to calculate the entropy change.
We can further simplify the above equation depending on whether the temperature changes between the two states:

- If the temperature remains constant: ΔS = q_rev / T
- If the temperature changes (assuming a constant heat capacity C): ΔS = nC ln(T_final / T_initial)
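Here is a short sketch of both cases in Python. The heat, temperature, and heat capacity values are assumed, illustrative numbers (roughly matching liquid water), not values from any particular problem.

```python
import math

# Case 1: temperature stays constant (e.g. heat absorbed during a phase change).
q_rev = 6010.0          # illustrative: reversible heat absorbed, J
T = 273.15              # constant temperature, K
dS_isothermal = q_rev / T
print(f"Isothermal: dS = {dS_isothermal:.1f} J/K")   # ~22.0 J/K

# Case 2: temperature changes, assuming a constant heat capacity C.
n = 1.0                 # moles
C = 75.3                # illustrative molar heat capacity, J/(mol*K)
T_initial, T_final = 298.15, 348.15
dS_heating = n * C * math.log(T_final / T_initial)
print(f"Heating: dS = {dS_heating:.1f} J/K")         # ~11.7 J/K
```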
Universal Entropy
Entropy matters to chemists and physicists because it determines the spontaneity of processes. To better understand what this means, we need to look at the Second Law of Thermodynamics.
The Second Law states that the entropy of the universe always increases. Any physical change must either increase or not change the universe’s overall disorder. No process is possible that has the overall effect of ordering and directing the mass and energy of the universe.
With this in mind, let’s say we have two pieces of metal, one hot and one cold. Now, let’s place a conductive metal bridge between them, allowing heat to flow between the two. Heat will then flow from the hot metal to the cold one spontaneously, or without external intervention.

Why wouldn’t heat flow from the cold metal to the hot metal, further raising the hot metal’s temperature? After all, this would still comply with the First Law, since no energy is created or destroyed.
The Second Law explains why this does not occur. The system’s entropy would decrease if heat became further concentrated in the hot metal, but it increases when the thermal energy distributes evenly throughout both metals. Thus, heat only flows from the hot metal to the cold one, allowing the universe’s entropy to continue increasing.
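As a quick worked example, with assumed temperatures and an assumed amount of heat (not values from the original figure): suppose q = 100 J of heat flows from a metal at 400 K to a metal at 300 K.

ΔS_hot = −q/T_hot = −100 J / 400 K = −0.250 J/K
ΔS_cold = +q/T_cold = +100 J / 300 K = +0.333 J/K
ΔS_universe = ΔS_hot + ΔS_cold ≈ +0.083 J/K > 0

Since the total change is positive, this direction of heat flow satisfies the Second Law. The reverse direction would make ΔS_universe negative, so it never occurs spontaneously.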
Entropy of the System
Importantly, while the universe’s entropy must increase without exception, that of the system may decrease spontaneously.
For instance, take a heat engine. In general, a heat engine works by taking heat energy and converting it to work. This conversion involves an entropic decrease in the system, since the disordered energy of heat becomes ordered motion. Despite this, heat engines still function spontaneously in the real world. The reason is that some amount of heat from the engine is released into the surroundings (a cold sink). This heat, as it disperses into the universe, produces an increase in entropy that counteracts the decrease from the engine’s work.




As mandated by the Second Law, this heat must be released by the engine to prevent the decrease in universal entropy.
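Here is a minimal sketch of this entropy bookkeeping in Python, using assumed reservoir temperatures and an assumed heat input rather than values from any real engine.

```python
# Entropy bookkeeping for a heat engine (assumed, illustrative values).
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures, K
Q_hot = 1000.0                 # heat absorbed from the hot reservoir, J

# Entropy lost by the hot reservoir as it gives up Q_hot
dS_hot = -Q_hot / T_hot        # -2.0 J/K

# To keep the universe's entropy from decreasing, the cold sink must gain
# at least +2.0 J/K, which fixes the minimum heat the engine must reject.
Q_cold_min = T_cold * (Q_hot / T_hot)   # 600 J

# Whatever heat is not rejected can become work.
W_max = Q_hot - Q_cold_min              # 400 J
print(f"Minimum rejected heat: {Q_cold_min:.0f} J, maximum work: {W_max:.0f} J")
print(f"Maximum (Carnot) efficiency: {W_max / Q_hot:.0%}")   # 40%
```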
Returning to our first examples, the Second Law explains why forests don’t spontaneously become houses, while, given enough time, houses spontaneously degrade back into forest. Forests still become houses in the real world, of course. As we now know, this is because the human bodies performing the necessary labor spend chemical energy, releasing heat into the universe.
Entropy Vs. Enthalpy
Chemists use the term “enthalpy” to describe the heat given off or absorbed by a process, such as the heat released in our previous examples by processes that decrease the system’s entropy. Enthalpy is an important, but distinct, thermodynamic concept in determining spontaneity. If you’d like to learn more about enthalpy, check out this article.
Gibbs Free Energy
So we’ve covered that negative entropic changes in a system can happen spontaneously, but only if heat, in the form of enthalpy, is released to raise the universal entropy (S). To mathematically understand the relationship between spontaneity and these two variables, we must understand a third: Gibbs free energy (G). The following equation illustrates the relationship between entropy (S), enthalpy (H), Gibbs free energy, and temperature:

ΔG = ΔH − TΔS
- ΔG = change in Gibbs free energy (kJ/mol)
- ΔH = change in enthalpy (kJ/mol)
- ΔS = change in entropy (kJ/(mol·K))
- T = temperature (K)
If a chemical reaction has ΔG < 0, the reaction is spontaneous. Thus, if ΔS < 0, the reaction can only proceed spontaneously if ΔH < 0 and |ΔH| > |TΔS|, meaning the reaction must give off enough heat to compensate for the entropy decrease. To learn more about Gibbs free energy, check out this article.
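The sketch below evaluates this spontaneity test in Python. The ΔH and ΔS values are hypothetical, illustrative numbers for a reaction with a negative entropy change, not data from a specific reaction in this article.

```python
# Hypothetical, illustrative values for a reaction with a negative entropy change.
dH = -92.0      # enthalpy change, kJ/mol (exothermic)
dS = -0.199     # entropy change, kJ/(mol*K)
T = 298.0       # temperature, K

dG = dH - T * dS        # Gibbs free energy change, kJ/mol
print(f"dG = {dG:.1f} kJ/mol -> {'spontaneous' if dG < 0 else 'non-spontaneous'}")
# dG = -32.7 kJ/mol -> spontaneous: the released heat (dH < 0) outweighs T*dS
```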
Practice Problems
Problem 1
System 1 has an entropy value ___ higher than System 2. What is the ratio of microstates between System 1 and System 2?
Problem 2
A system at ___ releases ___ of heat through a reversible process, lowering its temperature to ___. What is the change in entropy?
Solutions
1:
2:
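For reference, here is a minimal sketch of how problems of these two types are solved, using placeholder values rather than the problems’ original numbers.

```python
import math

# Problem 1 sketch: from S = k_B * ln(W), the difference in entropy gives
# S1 - S2 = k_B * ln(W1 / W2), so W1 / W2 = e^((S1 - S2) / k_B).
k_B = 1.381e-23              # Boltzmann's constant, J/K
dS = 1.0e-22                 # placeholder entropy difference (S1 - S2), J/K
ratio = math.exp(dS / k_B)   # ~1.4e3 for this placeholder value
print(f"W1/W2 = {ratio:.1f}")

# Problem 2 sketch: for a reversible process, dS = integral of dq_rev / T.
# If the temperature change is small, dS ~ q_rev / T_avg is a common approximation.
q_rev = -500.0               # placeholder heat released, J (negative = released)
T_avg = 300.0                # placeholder average temperature, K
dS2 = q_rev / T_avg
print(f"Delta S ~ {dS2:.2f} J/K")
```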