More than seventy-five years ago, at a remote site in New Mexico, the first nuclear bomb was detonated in a test that produced a massive explosion. The test, which presaged the atomic bombs dropped on Hiroshima and Nagasaki, Japan, in August 1945, forever changed the course of world affairs. Subsequent nuclear explosions, and the radioactive fallout they produced, quickly gave rise to worries over the dangers of radiation.
But what does “radiation” mean? And how have attitudes toward radiation changed over time?
The technical definition aside, for most Americans today, it means something like this: energies, often man-made, usually undetectable, that have strange effects on living things. We connect the abstract, physical concept with a personal, biological one. We take special notice when we are exposed to those energies, even briefly.
The early days: a glowing reception
In that sense, the age of radiation began in 1895 with the discovery of X-rays. In the half-century that followed, Americans indulged in optimistic fantasies about the miracles these energies could perform for better health. But they also quickly learned to fear them. On balance, the anxieties have had greater staying power.
Such reactions grew out of the many direct, personal experiences Americans had with irradiation in the early 20th century, when radium and X-ray machines were icons of scientific modernity. They were hailed as the wonders of the age, presented simultaneously as poisons and cure-alls, perpetual motion machines and planet-busting explosives. Radioactive substances (or plausible fakes thereof) were added to dozens of everyday consumer products, including toothpaste and lipstick, to enhance them with the mysterious energies of the atom. X-rays were tools of portraiture at the beauty salon (for