
Cocaine in Your Cough Syrup: The Surprisingly Recent History of American Medicine

Mar 13, 2026

Next time you're waiting at a pharmacy counter, watching the pharmacist verify your prescription before handing over a bottle of antibiotics, consider this: for most of American history, that entire process simply didn't exist. No prescription. No verification. No government agency checking whether what was in the bottle matched what was on the label. You walked up, pointed at what you wanted, paid your money, and walked out.

And what was on the shelves would make your head spin.

The Golden Age of Patent Medicines

From roughly the 1850s through the early 1900s, the American drugstore was a place of remarkable creative freedom — for the manufacturers, at least. The patent medicine industry sold products with names like Mrs. Winslow's Soothing Syrup and Cocaine Toothache Drops directly to consumers with almost no regulatory oversight and even less obligation to disclose ingredients.

These weren't fringe products sold in back alleys. They were advertised in major newspapers, endorsed by clergymen and politicians, and stocked on the shelves of respectable pharmacies in every city and small town in America. The active ingredients, however, were something else entirely.

Morphine appeared in remedies marketed to calm colicky infants. Cocaine was a standard ingredient in tonics sold for everything from hay fever to fatigue. Heroin — yes, that heroin — was introduced by the Bayer pharmaceutical company in 1898 and marketed as a "non-addictive" substitute for morphine. It was sold over the counter in the United States for years as a cough suppressant. Arsenic showed up in complexion treatments. Alcohol was the base ingredient in an enormous number of remedies, often at concentrations that would qualify them as hard liquor by today's standards.

The people buying these products weren't reckless or naive by the standards of their time. There was simply no framework — no FDA, no required ingredient labeling, no clinical trial process — to tell them anything different. If a product claimed to cure rheumatism and the label looked official, that was largely the extent of the vetting available to a consumer.

The First Cracks in the System

The shift didn't happen all at once. It came in pieces, usually pushed forward by specific disasters or public scandals that made the status quo impossible to defend.

The first significant federal intervention was the Pure Food and Drug Act of 1906, which required that certain drugs — including alcohol, morphine, and cocaine — be accurately labeled if present in a product. It didn't ban these substances from over-the-counter sale. It just said you had to admit they were in there. For the first time, manufacturers couldn't hide what they were selling.

The Harrison Narcotics Tax Act of 1914 went further, effectively beginning the regulation of opiates and cocaine by requiring prescriptions for their distribution. This wasn't framed as a health measure, technically — it was structured as a tax and registration requirement — but the practical effect was that cocaine and opiates started moving behind the pharmacy counter.

Then came 1937, and a tragedy that genuinely shocked the country.

The Sulfanilamide Disaster

A drug company in Tennessee wanted to create a liquid version of sulfanilamide, an early sulfa antibacterial that had proven effective in treating streptococcal infections. Their chemist dissolved it in diethylene glycol — a sweet-tasting solvent that, as it turned out, is a close chemical relative of antifreeze and lethally toxic to the kidneys. No animal testing was conducted. The product tasted fine and looked fine, so it went to market.

More than 100 people died, many of them children.

The chemist responsible took his own life. And the public outcry that followed gave Congress the political momentum it needed to pass the Federal Food, Drug, and Cosmetic Act of 1938 — the law that, for the first time, required pharmaceutical companies to demonstrate that a new drug was safe before selling it to the public.

It's worth pausing on how recent that requirement is. The idea that a drug must be proven safe before it reaches consumers only became federal law in 1938. That's within the lifetime of people who are still alive today.

The Prescription Requirement Is Even Newer Than That

Here's where it gets even more surprising. The formal distinction between prescription-only and over-the-counter drugs — the system that defines how Americans access medication today — wasn't clearly established until the Durham-Humphrey Amendment of 1951.

Before that amendment, the line between what required a doctor's authorization and what didn't was murky and inconsistently enforced. The 1951 law created the two-tier system that modern Americans take entirely for granted: some drugs require a prescription, some don't, and a licensed pharmacist is the gatekeeper between them.

The requirement that a drug be proven effective — not just safe — didn't arrive until the Kefauver-Harris Amendment of 1962, passed in the wake of the thalidomide crisis in Europe, where a sedative prescribed to pregnant women caused severe birth defects in thousands of babies. The United States largely avoided that particular disaster because an FDA medical officer named Frances Kelsey refused to approve thalidomide for the American market, citing insufficient safety data. She was right. The near-miss accelerated the push for stricter efficacy standards that culminated in the 1962 amendment.

So to put a timeline on it: the requirement that your medication actually works the way the label claims? That's been federal law for just over 60 years.

From Then to Now

The distance between a bottle of Bayer Heroin on a 1900 pharmacy shelf and a modern prescription drug with documented clinical trials behind it is hard to conceptualize. The entire regulatory architecture that makes today's pharmaceutical system function — the FDA's approval process, the prescription requirement, the mandate for accurate labeling, the proof of both safety and efficacy — was built almost entirely within the twentieth century, most of it within living memory.

That doesn't mean the current system is perfect. Debates about drug pricing, opioid prescribing practices, and the speed of approval processes are very much ongoing. But the baseline expectation that a medication won't poison you, that it's been tested, and that someone with medical training has decided you actually need it? That's a genuinely modern invention.

The next time a pharmacist asks for your prescription, it might be worth remembering: that question has only been routine for about 70 years. Before that, the answer was already in your hand.