Doctors Used to Hand Out Cigarettes. A Look at Medical Advice That Aged Terribly.
In the early 1950s, if you walked into a doctor's office complaining of a sore throat or general stress, there was a reasonable chance your physician would suggest a cigarette. Not as a joke. Not ironically. As legitimate medical guidance.
Physician-endorsed tobacco advertising was everywhere in postwar America. Campaigns for brands like Camel famously claimed that "more doctors smoke Camels than any other cigarette." Chesterfield ran ads in medical journals. Some cigarette companies sponsored medical radio programs. The American Medical Association, for a time, accepted tobacco advertising in its own publications.
This wasn't fringe quackery. This was the mainstream. And the doctors who recommended smoking weren't villains or shills — most of them genuinely believed what the evidence available to them seemed to show. Which is precisely what makes the story so unsettling.
The Bed Rest Paradox
Cigarettes are the most dramatic example, but they're far from the only one. Consider what happened when someone suffered a heart attack in the mid-20th century.
The standard treatment was strict, extended bed rest — sometimes six weeks or more. The logic seemed sound: the heart had been damaged, so the patient should rest and allow it to heal. Activity was considered dangerous, even potentially fatal. Patients were instructed not to climb stairs, not to exert themselves, not to rush their recovery.
We now know this was almost exactly backwards. Extended immobility after a cardiac event dramatically increases the risk of blood clots, pneumonia, and muscle deterioration. Modern cardiac rehabilitation is built around controlled movement — getting patients up and walking as soon as safely possible. The evidence for early mobilization is overwhelming.
But for decades, the confident medical consensus pointed the other way. And patients followed that advice, because why wouldn't they? Their doctor told them to.
The Butter vs. Margarine Wars
Few nutritional reversals have been as publicly embarrassing — or as consequential — as the fat debate of the latter half of the 20th century.
Starting in the 1960s and accelerating through the '70s and '80s, the medical and nutritional establishment became increasingly convinced that saturated fat was a primary driver of heart disease. Butter, eggs, red meat — these were cast as dietary villains. Margarine, made from partially hydrogenated vegetable oils, was positioned as the heart-healthy alternative. It was lower in saturated fat. It was modern. Cardiologists recommended it. Government dietary guidelines endorsed it.
The problem was the trans fats created by the hydrogenation process — the very thing that made margarine solid at room temperature. It took decades of accumulating research before the scientific consensus shifted: trans fats were not just unhealthy, they were more damaging to cardiovascular health than the saturated fats they were meant to replace. The FDA eventually moved to eliminate partially hydrogenated oils from the US food supply, a process largely completed by 2020.
Meanwhile, butter has been quietly rehabilitated. It's not a health food, but it's no longer the villain it was made out to be for thirty years. Eggs, similarly, have gone from dietary pariahs to essentially fine in moderation. The confidently delivered advice of one generation became the cautionary tale of the next.
Lobotomies, Thalidomide, and the Confidence Problem
The pattern runs deeper than diet and lifestyle. The prefrontal lobotomy — a surgical procedure involving the severing of connections in the brain's frontal lobe — was considered a legitimate psychiatric treatment for conditions ranging from depression to schizophrenia. Its inventor, António Egas Moniz, won the Nobel Prize in Physiology or Medicine in 1949. Tens of thousands of procedures were performed in the United States alone before the approach was largely abandoned.
Thalidomide, marketed in the late 1950s as a safe sedative and treatment for morning sickness, caused severe birth defects in thousands of children before being pulled from the market. It had been approved across Europe and was only kept out of wide US distribution by the skepticism of one FDA reviewer, Frances Kelsey, who refused to approve it without more data.
These aren't stories about ignorance. They're stories about the limits of the knowledge available at a given moment — and about the human tendency to be more certain than the evidence warrants.
So What Does That Mean for Right Now?
Here's the part that's worth sitting with: every generation believes it has finally gotten things right. The physicians recommending cigarettes in 1952 weren't idiots. The cardiologists pushing margarine in 1978 weren't corrupt. The surgeons performing lobotomies in 1949 weren't monsters. They were working with the best information they had, filtered through the assumptions and blind spots of their era.
Which raises an obvious question: what are we getting confidently wrong right now?
The honest answer is that we don't know — and can't know, by definition. That's not a reason to distrust medicine or dismiss current guidance. The overall trajectory of medical knowledge has been genuinely extraordinary. We've eliminated diseases that once killed millions. We've extended healthy life expectancy by decades. The system, imperfect as it is, works.
But a little epistemic humility probably isn't a bad thing. Some of what today's doctors are recommending — some dietary guideline, some treatment protocol, some confident consensus — will look different in thirty years. Future patients will shake their heads and wonder how we could have been so sure.
They'll feel exactly the way we feel now when we look at those old cigarette ads. The ones where the doctor is smiling, holding a cigarette, and telling you it's going to be just fine.