Calorimetry, derived from the Latin calor, meaning heat, and the Greek metron, meaning to measure, is the science of measuring heat. All calorimetric techniques are therefore based on the measurement of heat that may be generated (an exothermic process), consumed (an endothermic process), or simply dissipated by a sample. There are numerous methods to measure such heat, and since calorimetry's advent in the late 18th century, a large number of techniques have been developed. Initially, techniques were based on simple thermometric (temperature measurement) methods, but more recently, advances in electronics and control have added a new dimension to calorimetry, enabling users to collect data and maintain samples under conditions that were previously not possible.
Any process that results in heat being generated and exchanged with the environment is a candidate for a calorimetric study. Hence it is not surprising to discover that calorimetry has a very broad range of applicability, with examples ranging from drug design in the pharmaceutical industry, to quality control of process streams in the chemical industry, to the study of metabolic rates in biological systems (people included). Indeed, if the full range of applications were to be mentioned, the allocated disk space on this site would soon be used up.
We discuss the basics of two types of calorimetry: measurements at constant pressure and measurements at constant volume. The former involves pressure-volume work, whereas the latter involves no pressure-volume work.
A calorimeter is a device used to measure the heat of a reaction. It can be sophisticated and expensive or simple and cheap. In CHEM120 Labs, a styrofoam cup is used as a calorimeter, because it is a container with well-insulated walls that prevent heat exchange with the environment. In order to measure heats of reactions, we often enclose the reactants in a calorimeter, initiate the reaction, and measure the temperature difference before and after the reaction. The temperature difference enables us to evaluate the heat released in the reaction. This page gives the basic theory for this technique.
A calorimeter may be operated under constant (atmospheric) pressure or constant volume. Whichever kind is used, we first need to know its heat capacity. The heat capacity is the amount of heat required to raise the temperature of the entire calorimeter by 1 K, and it is usually determined experimentally before or after the actual measurements of heat of reaction.
The heat capacity of the calorimeter is determined by transferring a known amount of heat into it and measuring its temperature increase. Because the temperature differences are very small, extremely sensitive thermometers are required for these measurements. Example 1 shows how it is done.
Dividing the amount of energy by the temperature increase yields the heat capacity, C:

C = q / dT
We often compare the heat capacity of a calorimeter to that of a definite amount of water. The heat capacity of 75.2 J/K for the calorimeter is equivalent to the heat capacity of 1 mole (18 g) of water (18 g mol-1 * 1 cal (g K)-1 * 4.184 J cal-1 = 75.3 J (K mol)-1).
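The water-equivalence arithmetic above can be checked with a short script. This is just a sketch of the unit conversion; the 75.2 J/K calorimeter value is taken from the text.

```python
# Molar heat capacity of liquid water from the conversion quoted above:
# 18 g/mol * 1 cal/(g K) * 4.184 J/cal
molar_mass_water = 18.0      # g/mol
specific_heat_cal = 1.0      # cal/(g K), liquid water
joules_per_cal = 4.184       # J/cal

molar_heat_capacity = molar_mass_water * specific_heat_cal * joules_per_cal
print(round(molar_heat_capacity, 3))   # 75.312 J/(K mol)

# Water equivalent of a calorimeter with C = 75.2 J/K:
moles_water = 75.2 / molar_heat_capacity
print(round(moles_water, 2))           # 1.0 -- roughly one mole of water
```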
Recall that electric energy = q * V = i * t * V, where q is the charge; V, the voltage; i, the current; and t, the time.
By definition, dH is the energy (heat) released at constant pressure, whereas dE is the energy released at constant volume. These two quantities are related by the equation

dH = dE + (dn) R T

where dn is the change in the number of moles of gas in the reaction, R is the gas constant, and T is the temperature.
The P-V work, which depends on the change in the amount of gas (dn moles), must be taken into consideration when calculating dH if the calorimetry is performed at constant volume in a bomb calorimeter. A cross-section diagram of the bomb is shown here. The wires are for electric ignition, and the sample in the sample holder is in contact with the resistance wire. The bomb's diameter is 10 cm, and its height is 15 cm.
The picture shows reading the sensitive thermometer while working with the P6310 Bomb Calorimeter. The bomb is inside the tank.
Since volume does not change, a bomb calorimeter measures the heat evolved under constant volume, qv,
dE = qv = C * dT
Energy used = 10 * 5 * 60 = 3000 J.
Thus, C = 3000 / 3.0 = 1000 J/K,
This is equivalent to (1000 J/K) / (75.2 J/(K mol))
= 13.3 moles of water.
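The calibration arithmetic above can be collected into a few lines. This sketch assumes, as the numbers in the example imply, that 3000 J of electric energy produced a 3.0 K temperature rise.

```python
# Electrical calibration of a calorimeter, using the figures from the example above.
energy_supplied = 10 * 5 * 60       # J, as given in the worked example
delta_T = 3.0                       # K, temperature rise (implied by C = 3000 / 3.0)

C = energy_supplied / delta_T       # heat capacity of the calorimeter
print(C)                            # 1000.0 J/K

# Express the heat capacity as an equivalent amount of water, 75.2 J/(K mol):
moles_water_equiv = C / 75.2
print(round(moles_water_equiv, 1))  # 13.3 moles of water
```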
The following related examples illustrate the application of a bomb calorimeter for the measurement of dE and the derivation of dH.
dE = qv = s * m * dT
where s is the specific heat and m is the mass.
Heat released, qv,
The balanced chemical reaction equation is
This is the ideal amount of energy released when a mole of sugar is utilized by a living creature such as a person.
Reinterpreting the problem, we have
More heat is given off if the reaction is carried out at constant pressure, since the P-V work (1.5 R T) due to the compression of 1.5 moles of gaseous reactants would contribute to dH.
If 1.0 mole of water is decomposed by electrolysis at constant pressure, we must supply an amount of energy equivalent to the enthalpy change, dH, which is a little more than the internal energy change, dE. The extra energy is needed for the P-V work done by the products (H2 and O2) against the atmosphere.
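The size of that extra P-V work term can be estimated from dn * R * T. A minimal sketch, assuming ideal-gas behaviour at 298.15 K and the 1.5 mol change in gas moles mentioned above (1 mol H2 plus 0.5 mol O2 produced per mole of liquid water decomposed):

```python
# P-V work term for H2O(l) -> H2(g) + 1/2 O2(g), assuming ideal gases at 298.15 K.
R = 8.314      # J/(K mol), gas constant
T = 298.15     # K, assumed temperature
dn = 1.5       # mol of gas produced per mole of water decomposed

pv_work = dn * R * T             # dH - dE = dn * R * T
print(round(pv_work / 1000, 2))  # 3.72 -- about 3.72 kJ extra at constant pressure
```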
The heat capacity of the calorimeter can also be determined by burning an exactly known amount of a standard substance, whose enthalpy of combustion has been determined. Benzoic acid, C7H6O2, is one such standard. The problem below illustrates the calculations.
The equation for the combustion is

C7H6O2(s) + 7.5 O2(g) -> 7 CO2(g) + 3 H2O(l)
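A calibration of this kind can be sketched in a few lines. The combustion energy of benzoic acid (about 26.43 kJ/g) is the commonly quoted standard value; the sample mass and temperature rise below are illustrative assumptions, not figures from the text.

```python
# Calibrating a bomb calorimeter by burning a known mass of benzoic acid.
energy_per_gram = 26.43e3   # J/g, combustion energy of benzoic acid (approx. standard value)
sample_mass = 1.000         # g, assumed sample mass
delta_T = 2.643             # K, assumed observed temperature rise

heat_released = energy_per_gram * sample_mass   # J released by the combustion
C = heat_released / delta_T                     # heat capacity of the calorimeter
print(round(C))                                 # 10000 J/K for this assumed data
```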
The balanced equation and various quantities calculated are given in a logical order below:
Can the standard enthalpy of formation dH°f for oxalic acid be calculated? What additional data are required?
The following values were looked up in thermodynamic data tables:
It's important to know what additional data are required for problem solving. It's equally important to know where to look for them.
We have given an introduction to calorimetry here. For detailed laboratory instructions, consult the CHEM120L manual and the Oxygen Bomb Calorimetry pages, which give the details of the measurement. In particular, how to measure the temperature difference is explained in Calorimetry Data Analysis.