Whether it's wine, beer, spirits, cider or hard seltzer, every alcoholic beverage label is required to show how much alcohol it contains. What can be confusing is the multitude of ways that information can be written.
The two main methods for indicating the alcoholic content of a beverage are alcohol-by-volume (abv) and proof. In the United States, a spirit’s proof is simply double the abv. This means the liquid in a bottle of 90-proof bourbon is 45% abv, while a bottle of 151-proof rum is 75.5% abv.
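For readers who want the arithmetic spelled out, the U.S. conversion is a single doubling or halving. A minimal sketch (the function names here are our own, purely for illustration):

```python
def abv_from_proof(proof: float) -> float:
    """In the U.S., proof is double the ABV, so ABV is half the proof."""
    return proof / 2

def proof_from_abv(abv: float) -> float:
    """The reverse: double the ABV to get the U.S. proof."""
    return abv * 2

# The article's examples:
print(abv_from_proof(90))   # 45.0 (% abv, 90-proof bourbon)
print(abv_from_proof(151))  # 75.5 (% abv, 151-proof rum)
```

Note that this simple 2:1 relationship applies only to the U.S. system; historical British proof used a different scale.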
So, where does proof come from and why do we still use it?
Most sources point to 16th century England, where higher taxes were levied on spirits above a certain strength. Without the tools to measure the exact alcohol level of a spirit easily and accurately, its strength was tested by a much simpler method: Will it catch fire? If the liquid was strong enough to burn (or ignite a gunpowder pellet soaked in it), it was considered proof that the spirit was strong enough to warrant the extra tax.
A scale was created in which the number 100 was chosen as the “proof” at which a spirit would burn. Anything lower was exempt from the elevated tax.
This, of course, is a generally poor way to measure the amount of alcohol in a spirit. Combustibility depends on factors beyond abv, as anyone who's ever tried to light a 90-proof whiskey on fire can attest. The temperature of the liquid plays an important part, too. Warmer liquid and a warmer ambient room temperature allow more alcohol vapor to escape, increasing flammability. Absent a more scientific process, the "proof" of a liquid could change on a warm day versus a cold one.
It does, however, explain why drinks like wine and beer were historically never referred to by their proof. They would never ignite and thus were never subject to the extra tax, so there was nothing to prove.