Sunday, January 9, 2011

Two articles for the Manhattan Institute's Medical Progress Today, on C.P. Snow's "The Two Cultures," and on FDA Reform.

Over the last century, the dominant impulse of American healthcare policy has shifted decisively--from cure to care.  Whereas once we sought to eliminate disease through science, now we seek to finance disease through insurance.  Why did this huge shift occur?   The answer can be found in the changing power relations between two cultures in today’s society, one focused on dynamic science, the other on static literary, political, and legal worldviews.   

In the early 1900s, President Theodore Roosevelt decided to dig the Panama Canal, but he knew he would never succeed if malaria and yellow fever decimated the workforce, as had happened in the French canal-building effort two decades earlier.   Medical scientists of TR’s day had not yet developed effective treatments for those diseases, but they had learned that mosquitoes were the vector, or transmission agent.  So President Roosevelt simply ordered mosquito habitats cleared away from the canal worksite; as a result, infection and death rates were minimized, and the path between the seas was completed.  

Using today’s terminology, we could say that TR “bent the curve” on healthcare costs, and yet he did so not by crimping down on treatment, but by crimping down on the disease itself.  We might note, to be sure, that TR’s environment-changing solution was in keeping with those disease-attacking times; back home in the US, progressives were using the same methods to improve hygiene and sanitation, dramatically reducing rates of infectious disease.  Indeed, all through the early 20th century, forward-looking public health advocates--untroubled by “wetland” protectors and NIMBYs--drained swamps and launched other mosquito-abatement measures.   As a result, malaria and yellow fever virtually disappeared from the US.  

Later, another President Roosevelt, Franklin, also chose to fight disease.  In 1938, FDR established the National Foundation for Infantile Paralysis, soon to become known, after a groundswell of public support--“Send your dime to President Roosevelt!”--as the March of Dimes.   Instead of proposing national health insurance as part of his Social Security retirement plan, FDR gave Americans something more precious: health itself.   In 1955, the announcement of the successful Salk polio vaccine cheered a relieved and grateful public.   

The dominant idea then was not to provide insurance for disease; it was to do away with disease.   And the added advantage of such an approach--beat, don’t treat--was not only its humanitarian impact, but also that it was cheaper in the long run.  The expense of developing the polio vaccine was minuscule, compared to the expense of providing multiple wheelchairs and iron lungs, to say nothing of the cost of lost productivity from those afflicted.   

Now we can fast-forward to the end of the 20th century.  In the 1990s, Bill Clinton proposed a national healthcare plan that focused entirely on insurance, as opposed to the actual science of health.   According to science-policy historian Daniel Greenberg, in Clinton’s 1993 speech outlining his healthcare plan to Congress, “The President made no mention of medical research or the role of improved scientific understanding of disease in protecting the health of the American people.”  Instead, he focused solely on “universal access to medical care and cost containment.”

And Obamacare, of course, subsequently followed the same formula: a huge increase in “coverage,” even as the pipeline for new drugs and cures--and hope--continued to dry up.   

So what changed over the last century?  How did we go from trying to eliminate dreaded disease to being content merely to finance its ravages?   How could it be, for example, that today we spend $172 billion a year treating Alzheimer’s Disease, but only about $500 million a year researching the malady?

And the problem will get worse in the decades to come:  Do we really think we can manage the Alzheimer’s epidemic through either budget cuts or “financial innovation”?  Unfortunately, no presidential March-of-Dimes-style mobilization against Alzheimer’s Disease--or against any other costly killer--is in sight.  

One reason for this dire policy reversal might be found in a 1959 lecture, “The Two Cultures and the Scientific Revolution,” delivered by the Englishman C.P. Snow at Cambridge University, later turned into a still-in-print book.  “The intellectual life of the whole of western society,” Snow declared, “is increasingly being split into two polar groups.”  One of these groups is the literary, or traditional culture; the other is the scientific culture.   “Between the two,” Snow continued, stretches “a gulf . . . of hostility and dislike, but most of all lack of understanding.”   

Snow had served as technical director for the Ministry of Labour during World War Two; he had seen, up close, the struggle to advance vital scientific research--most notably, the tide-turning technology of radar.  It was the scientific culture, Snow argued, that had devised the tools for defeating Hitler; yet, after the war, it was the literary culture that had triumphed, submerging the future progress of science in a languorous bath of aesthetic styles and judgments.  Snow, a notable author, well-versed in the humanities, decried the excessive influence of “the literary intellectuals,” who, he added, “while no one was looking took to referring to themselves as ‘intellectuals’ as though there were no others.”  As a result, science, as well as the positive transformations that science can bring, was being pushed out of contemporary political and policy equations.  

This dethronement of science was a huge loss, declared Snow.  “Scientists have the future in their bones,” he added, and yet “the traditional culture responds by wishing the future did not exist.”  That is, scientists seek to advance and change the future, while the literary culture is content to muse over the present.   And yet presently, Snow lamented, “It is the traditional culture . . .  which manages the western world.”

Predictably, Snow’s broadside against the humanities attracted immediate counter-fire.   The eminent literary critic F.R. Leavis denounced Snow as a “public relations man” for science, suffering from “complete ignorance” about the humanities--ignoring the fact that Snow, in addition to his government service, had published 17 works of fiction, as well as a biography of Anthony Trollope.  Yet Snow was adamant: Science should take the lead, as opposed to those he termed “natural Luddites.”

Updating Snow, we can observe that in our time, the literary/humanities culture has been fused with the political/legal/social science culture, creating the current policymaking juggernaut, which we might dub the “literary-legal complex.”  We might further say that the literary-legal complex is inherently oriented to predictability, according to its non-scientific, even anti-scientific, prejudices.  And so the literary-legal worldview focuses on transactions and rule-making--creating durable, but inherently limited, routines in its own image.  By contrast, science is unpredictable and open-ended: the work of scientists disrupts predictable and familiar routines, remaking the world in heretofore unimagined ways.  So people of a scientific turn of mind, for example, produce vaccines, unsentimentally disrupting painful--but at least familiar--patterns of sickness and death.   By contrast, in past eras, the literary-legal turn of mind debated whether it was a sin to use vaccines, thereby thwarting God’s will.   Today, many of those who retain that predictability-oriented turn of mind are still happy to see science restrained--by custom, law, or lawyers. 

Meanwhile, in politics today, we see endless ideological battles within the left and right factions of the literary-legal complex over the financial metaphysics of health insurance, pro and con--and little consideration of where health actually comes from.  Thought-leaders of the two parties, sharing a common background in the humanities and the law, naturally end up with similar conclusions about the primacy of predictable transactions and rules.   And so their policies, dueling as they might be, end up mirroring each other.  

The left, inspired by theorists and social critics--leavened with a little Green pseudo-science--produces policies in its own image: regulations, consent decrees, sweeping theories of legal liability.   The right, on the other hand, re-reads the Constitution and studies economics, while seeking further inspiration in the novels of Ayn Rand.  That is, both sides ignore science, because they don’t know science.   So if the left says that the government should provide health insurance, the right says, no, the market should provide it.  And in the midst of that ideological rumble, the idea of actual improvements in health--improvements that come from medical science--is sadly forgotten. 

As we are currently seeing, ideology is an endless fight that can never be resolved.  By contrast, technology is a fight that can be resolved.   Polio, for example, was resolved.   And we can resolve other diseases as well, if we want to.  But first, we will have to realize that our current exaltation of the literary-legal complex is not the solution but, in fact, the problem. 


And here's an earlier article, also for The Manhattan Institute's Medical Progress Today, published in early December: 

A consensus is developing that Alzheimer’s Disease (AD) is the next epidemic to worry about, both medically and fiscally.  But unfortunately, there’s nothing close to a consensus on what to do about it.  
Considerations of AD have been subsumed by the debate over spending and the deficit--although Washington mavens don’t yet seem to grasp that a fiscal solution is not possible without a medical solution.   Instead, policy chatterers have zeroed in on cutting entitlement spending, ignoring the medical problems of aging in favor of a strictly fiscal solution.  Such an approach has a soundbite-worthy appeal, but it disregards outside-the-Beltway political reality: If the underlying problem of the disease itself is ignored--the incidence of AD is expected to triple in the next 40 years--then popular pressure to spend commensurately will not be ignored by politicians.   Indeed, we can note that if AD triples, it won’t matter much whether Obamacare lives or dies; either way, by mid-century, healthcare will be ruinously expensive.   

In the meantime, others seem to believe we should simply have a partisan brawl, aimed at gaining maximum political advantage going into the 2012 elections.  And of course, as soon as the ’12 elections are over, fighting over the ’14 elections will commence.   Yet amidst such see-sawing political opportunism, we will continue to spend money on AD care--already more than one percent of GDP, and rising fast--without any real prospect for a cost-curve-bending cure.   

In August 2010, The New York Times reported on the work of a medical “jury” convened by the National Institutes of Health to evaluate the various treatments for AD.  The “verdict” of the NIH panel was discouraging in the extreme: “Currently, no evidence of even moderate scientific quality exists to support the association of any modifiable factor (such as nutritional supplements, herbal preparations, dietary factors, prescription or nonprescription drugs, social or economic factors, medical conditions, toxins or environmental exposures) with reduced risk of Alzheimer’s disease.”

To sum up: There's “no evidence” that anything we are doing to forestall or treat Alzheimer's is working.  The chair of the NIH panel, Dr. Martha L. Daviglus of Northwestern University, summed up the current state of research as “primitive.”

Yet we do see stirrings of a counteroffensive against the AD onslaught.  In October, former Supreme Court justice Sandra Day O’Connor, joined by Nobel medical laureate Stanley Prusiner and geriatric expert Ken Dychtwald, argued on the op-ed page of The New York Times for a more proactive strategy against AD and its costs:

As things stand today, for each penny the National Institutes of Health spends on Alzheimer’s research, we spend more than $3.50 on caring for people with the condition. This explains why the financial cost of not conducting adequate research is so high. The United States spends $172 billion a year to care for people with Alzheimer’s. By 2020 the cumulative price tag, in current dollars, will be $2 trillion, and by 2050, $20 trillion.
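As a rough sanity check, the op-ed’s penny-to-dollars comparison can be worked out directly from the round figures cited in this article--roughly $172 billion a year on Alzheimer’s care and about $500 million a year on research.  These inputs are the article’s own numbers, not independently verified, so the sketch below only tests whether the stated ratio hangs together:

```python
# Rough arithmetic check of the care-vs-research spending ratio.
# Inputs are the round figures cited in this article (assumptions,
# not independently verified).
care_per_year = 172e9       # annual U.S. spending on AD care, dollars
research_per_year = 500e6   # approximate annual AD research spending, dollars

ratio = care_per_year / research_per_year   # care dollars per research dollar
care_per_research_penny = ratio * 0.01      # care dollars per research penny

print(f"care dollars per research dollar: {ratio:.0f}")
print(f"care spending per penny of research: ${care_per_research_penny:.2f}")
```

On these round numbers the ratio comes out to roughly 344-to-1, or about $3.44 of care per penny of research--the same order of magnitude as the op-ed’s “more than $3.50.”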

In addition, O’Connor, Prusiner, and Dychtwald brought up the benefits of an effective AD treatment: 

If we could simply postpone the onset of Alzheimer’s disease by five years, a large share of nursing home beds in the United States would empty. And if we could eliminate it, as Jonas Salk wiped out polio with his vaccine, we would greatly expand the potential of all Americans to live long, healthy and productive lives--and save trillions of dollars doing it.

That same month, another leading figure, California first lady Maria Shriver, made an overlapping argument: The goal should not be treating AD; the goal should be beating AD.  Speaking to ABC News’ Diane Sawyer, Shriver invoked the ambitious vision of her famous uncle: “We can launch an expedition on the brain, much like President Kennedy launched an expedition to the moon.”

Once again, the obvious wisdom: A cure for any malady is cheaper than palliative care.   That medical-fiscal reality was true for polio, as well as other diseases that have been mostly or completely eliminated--so why couldn’t it be true for AD?  

And yet the political establishment has paid little heed to O’Connor and Shriver, nor has it thought constructively about the polio precedent.  In the weeks since, two different “blue chip” deficit commissions have released weighty reports; both focused entirely on a “cut” strategy for healthcare, as opposed to a cure strategy.  

Why the neglect of the proactive cure approach?  Perhaps Washington officialdom is simply incapable of a complicated “two-cushion shot”--that is, hitting the billiard ball of medical research in order to hit the ball of lower costs in the long run.  Or perhaps the idea of scientific research, as opposed to writing checks and offering bailouts, is simply out of fashion in policy circles.   Or maybe Washington has quietly concluded that medical research--and just as crucially, the translation of medical research into actual medications in the marketplace--is hitting a dead end.  Why spend more money on, say, the NIH if nothing tangible is achieved?  Why have faith in the pharmaceutical companies when a dozen anti-AD efforts have failed in mid- to late-stage testing since 2003?

Yet if nothing can be done to rekindle medical progress as a public-policy tool, then we are, in fact, doomed both to go gray and to go broke by the middle of this century--unless, of course, the death panels are called in.  

So is there any hope for a better outcome?  An outcome that’s both more compassionate--and less ruinous?   If there is such a hope, it will have to come from medical research.  Financial transactions won’t get us there; only scientific transformation can do the job.  

For their part, politicians can help, not by fighting each other, but by clearing away the legal and regulatory roadblocks to medical progress.  MI’s Paul Howard is correct in stating that the FDA is facing a “crisis of confidence”; critics on all sides agree that the agency is “broken.”   So the answer, of course, is to fix the agency, as part of an overall cure strategy. 

Will such an effort succeed?  There’s no way to know for sure, but our long national track record on public-private mobilization--from building the railroads to building the Interstates to building the Internet--should give us considerable hope.  If, that is, we can distill the right leadership lessons from those past large-scale successes. 

The political and economic reward for medical success is monumental.  With an effective treatment for AD, we could, for example, begin to think about raising the retirement age for Medicare and Social Security, thus solving much of the deficit problem.  

In addition, an AD breakthrough would shift cost calculations on just about every fiscal and economic variable.  Today, all those tens of millions--soon to be hundreds of millions--of people around the world suffering from AD are seen as a huge burden.  But with a real AD cure, they could become paying customers for whichever country is making the medicine, able to continue working and producing--for the betterment of each country, for the betterment of mankind.  

1 comment:

  1. What's the last cure you heard of? The only one I can think of is the identification of H. pylori as a major contributor to stomach ulcers. That discovery was (and is) the last thing doctors seem to want to leverage when Prilosec and other proton pump inhibitors make a nice profit and give immediate relief (even if they make the condition worse). Does GERD pay for too many doctors' children's college tuitions?