Monday, November 28, 2011

DARPA to the rescue! If civilians have lost interest in developing cures, let the military do it.


Wired magazine reports that the Defense Advanced Research Projects Agency (DARPA) is working on a new approach to antibiotics--or post-antibiotics.  As Wired's Katie Drummond writes, DARPA is seeking proposals that could completely replace traditional antibiotics with a whole new kind of bacteria killer:

Darpa wants researchers to use nanoparticles--tiny, autonomous drug delivery systems that can carry molecules of medication anywhere in the body, and get them right into a targeted cell. Darpa would like to see nanoparticles loaded with "small interfering RNA (siRNA)" -- a class of molecules that can target and shut down specific genes. If siRNA could be reprogrammed "on-the-fly" and applied to different pathogens, then the nanoparticles could be loaded up with the right siRNA molecules and sent directly to cells responsible for the infection.

Drummond allows that it might seem hard to believe that DARPA could pull off something like this, but in fact, the theory has already been proven.  Last year, she notes, researchers were able to engineer siRNA, load it into nanoparticles, and inject those nanoparticles into four primates infected with the Ebola virus, thereby arresting the killer disease.

But wait--there's more.  Not only does DARPA seek to bring about this whole new approach, skipping past familiar modes and mechanisms, it is also seeking ways to compress the timeline of new cures from years down to mere days.

So it's a daunting, if enticing, prospect.  DARPA does, indeed, have a big vision.  At a time when most healthcare "experts" talk only of finance and bean-counting and rationing--that is, the demand side of medicine--DARPA wants to jump in on the supply side of medicine: the creation of actual cures.  It's the Pentagon, not the Department of Health and Human Services, that wants to decisively intervene in the course of disease and save lives.  Audacious?  Sure.  Impractical?  Maybe.  Popular?  Absolutely, if it works.  But as Drummond concludes:

If anybody can design a new paradigm for medicine, and a new way to mass-produce it, our money's on the military. After all, we've got them to thank for figuring out how to manufacture the medication that got us into this mess in the first place: penicillin.

Indeed, the military, British and American, was the impetus for the serious development of antibiotics.  The medicinal qualities of the blue-green mould Penicillium notatum had been observed as far back as the Middle Ages, but those positive properties were not recorded in a scientific treatise until 1875.  Even then, serious scientific inquiry did not begin for another five decades.  Alexander Fleming had been a British military doctor during World War One, working in the mud and filth of the trenches and observing firsthand the lethality of infected wounds.  For the next decade, Fleming kept seeking a remedy for infection--until that fortuitous moment in 1928, when he noticed that a stray mould was inhibiting bacteria growing in a petri dish.  He named the antibacterial substance penicillin.

Yet in the following years, progress remained slow, as Fleming and others at St. Mary's Hospital in London struggled for a decade to purify and extract the antibiotic agent and turn it into a usable drug.  It was not until World War Two that urgent military necessity led to increased funding for penicillin research--and to its rapid acceleration and development, mostly in the US.  This heroic story was ably told by Lauren Belfer in her 2010 novel, A Fierce Radiance.

Vannevar Bush, the director of the Office of Scientific Research and Development--an institutional forerunner of DARPA--ordered that penicillin research be a national priority second only to the atom-bomb project.  And it worked.  By 1944, penicillin was being produced in the millions of doses by Pfizer, working on a government contract.  As a result of this public-private partnership--this medical-industrial complex, if one prefers--more gains were made in the battle against deadly infection than in all the previous years of human history.

Fleming and two fellow researchers were awarded the Nobel Prize in Physiology or Medicine in 1945.  As Bush observed that same year:

The death rate for all diseases in the Army, including the overseas forces, has been reduced from 14.1 per thousand in the last war to 0.6 per thousand in this war.  Such ravaging diseases as yellow fever, dysentery, typhus, tetanus, pneumonia, and meningitis have been all but conquered by penicillin and the sulfa drugs, the insecticide DDT, better vaccines, and improved hygienic measures. Malaria has been controlled. There has been dramatic progress in surgery.

So while we don't yet know if DARPA's new plan for siRNA will truly work, history tells us that if the military really puts its mind to work on a challenge, that challenge can often be overcome.  Why?   Because the military has a strong claim on national resources--and not just tax revenue.  In the past, to achieve an urgent objective, the military has black-boxed its budgets, dragooned brain power, and bulldozed any and all obstacles.

To cite one germane non-medical example, Gen. Leslie Groves, leader of the Manhattan Project, did not pause over Environmental Impact Statements when he occupied Oak Ridge, Tennessee, and set up a nuclear bomb factory that brought in 75,000 people, and he certainly did not hold public hearings in advance of the 1945 atomic test at the Trinity site in New Mexico.  Such military mobilization of resources is a hard and Hobbesian process, but it has one virtue: It works.  If the goal is important--and theoretically, at least, the wartime military has no goals that are not important--then the Manhattan Project shows how the process can work to shorten a war, reduce casualties, and guarantee victory.

Similar tales could be told about the wartime (including the Cold War) invention or acceleration of such 20th-century technologies as radar, synthetic rubber, aviation, electronics, nuclear power, the internet, and GPS.  As an aside, the fact that each of these inventions contributed not only to military victory but also to civilian wealth is yet another bonus of constructive public-private partnerships, and a reminder that the US military has been one of the principal drivers of the American economy throughout our history.  And so, too, in the case of DARPA's siRNA project: if it works, we will all owe those defense nerds yet another huge debt.

By contrast, the results for innovation and the economy in the absence of military mobilization can be painfully slow--even deadly slow.  In a free and pluralistic society, every economic activity is eventually surrounded by claimants and rent-seekers of various kinds; these claimants and rent-seekers can be variously described as remoras, barnacles, or lampreys.   That is, they can be mildly symbiotic, a slight burden--or they can be lethally parasitic.

The dismal economic consequences of runaway pluralism were ably described by the economist Mancur Olson in his 1982 book, The Rise and Decline of Nations; Olson went so far as to suggest it was more economically beneficial to lose a war than to suffer the endlessly cumulated sedimentations of special-interest encrustation.  The non-catastrophic solutions to such "demosclerosis"--to recall Jonathan Rauch's encapsulation of the Olson argument--are relatively straightforward; such solutions include deregulation and an overall opening up of clotted economic arteries.  But as we have seen in our time, it's easier to prescribe those solutions than to implement them.

Typically, what's needed is at least some kind of crisis--some wake-up call; a default, if not a  defeat.  Civilian leaders can sometimes make the most of a sense of urgency and crisis--but military leaders always can.

As we have seen in recent decades, bad news on the medical front has not been in any way galvanic--the situation gets worse and worse. Indeed, the worsening seems to be part of a deliberate policy of looting the medical industry to achieve other governmental goals.  

No wonder, then, that we have been losing the war against infection for some time now, and nobody in the US government, other than DARPA, seems to have noticed.  Yes, it might seem a strange world when all the agencies and committees that have the word "health" in their title have been allowing the problem to worsen, to the point where the number of new antibiotics has fallen by more than 80 percent over the last quarter century, even amid louder warnings about the rise of deadly "superbugs."  Yet as the historical record shows, even well-meaning civilians have not been able to overcome the cumulative blockages of the trial lawyers, the FDA, and the overall brain-drain and capital-drain out of the pharma sector.

Enter the Pentagon and DARPA, coming from a different world, pursuing different goals.  By no means is the military always a paragon of efficiency, but mission-focused command-and-control does have its bottom-line virtues.  For the most part, the military is able to fend off civilian predations and Olsonian sclerosis, because generals and admirals can invoke national security--and, at a more gut level, the well-being of our fighting forces--in order to push their projects through.

"Compared to war," General George S. Patton said during World War Two, "all other forms of human endeavor shrink to insignificance."   War is, indeed, catalytic; it does unleash vast amounts of public exertion and public forbearance.   But war, of course, is also tragic, even if, as in World War Two, the larger benefits of improved medicine save lives during and after the war.

In a better world, advocates for Serious Medicine, such as a new kind of instantaneous bacteria-killer, would be able to act just as decisively in the fight against microbes as generals can in the fight against men.   That is, we would enjoy the benefits of saving lives without predicating the effort on taking lives.   Until then, however, we can conclude that those generals and admirals care more about the well-being of their men and women than our elected politicians care about the well-being of us civilians.

So yes, someday, we should have a MARPA, for Medical Advanced Research Projects Agency, as a more mission-focused version of the NIH.   We should mimic the military's sense of purpose on the civilian side, without firing a shot.

But until that happens, we should be thankful that we have a DARPA.  

Saturday, November 19, 2011

The FDA’s Rejection of Avastin: Not Part of the Solution, Part of the Problem

The Food and Drug Administration’s decision to restrict the use of Avastin for breast cancer has attracted some cautious supporters in unexpected places.  MPT’s own Paul Howard, for example--not generally regarded as a fan of the contemporary FDA--writes, “This is one case where I think the FDA did the right thing.”

Well, here’s another perspective: This is a case where the FDA did the wrong thing.  It’s wrong for patients, wrong for the country, and wrong even for the long-term cause of saving money.   We need to do more against cancer, not less.  And paradoxical as it may seem, if we do more, we will not only save more lives, but we will ultimately spend less money.  Indeed, medical history tells us that only when we do more--that is, increase innovation and productivity--do we end up spending less.  That’s the lesson of polio from the 50s, of AIDS in the 80s and 90s, and of heart disease over the last half-century.  And it could be the lesson of breast cancer, too--but only if we take the same dynamic pro-science, pro-innovation approach.

Today, the FDA, echoing the thinking of the larger federal government, seems content to fight mere skirmishes in the war on cancer.   Yet absent any sort of strategy for victory, the casualty toll will continue to mount.   Last summer, at an FDA hearing in Washington, one woman, Priscilla Howard, declared, “Despite the potential side effects from Avastin, metastatic breast cancer has only one--death.” She added that Avastin had controlled her cancer for 32 months: “I want every available weapon in my arsenal as I fight this devastating disease.”  But now, thanks to the FDA’s action against Avastin, that arsenal has been depleted.  Indeed, it’s a safe bet that the future arsenal will be depleted even more; Uncle Sam has just sent a clear signal to researchers and developers: Don’t assume that the government is interested in financing future progress against cancer.  If you develop a new drug, the burden is all on you.  In addition, you will confront both implicit and explicit price controls.  

In fact, the FDA’s Avastin decision should be seen in the context of overall public policy in the last few decades, which can be summed up in three points:

First, the dominant healthcare policy elites, influenced by the environmental movement, have adopted a generally skeptical view of technological advancement in medicine.  Since the 60s, technology has been seen by many as a source of alienation, pollution, and even, in a metaphorical sense, mutilation.   In 1984, Dick Lamm--who had led the fight against the proposed Denver Olympics before going on to serve two terms as Colorado’s Democratic governor--struck an elitist chord when he applied the same limits-to-growth ethos to healthcare.  Older Americans should pass from the scene sooner, rather than later, he said, for the sake of future generations: “We’ve got a duty to die and get out of the way with all of our machines and artificial hearts and everything else like that and let the other society, our kids, build a reasonable life.”   With the conspicuous exception of the fight against AIDS--which was treated as an all-out war, thanks to the intervention of figures from the popular culture, as opposed to the policy culture--this go-slow approach has dominated the chattering classes.   Indeed, the Kaiser Family Foundation has noted this gulf between the elites and the masses; the elites want less healthcare as a matter of national policy, and the public, by contrast, wants more.

Second, policy makers see the need to control healthcare costs as a way of making national health insurance more acceptable and affordable.  To put it bluntly, if people die, that’s cheaper for the system, at least in the short run.   Such sentiments are rarely articulated in public, of course, but the public is nevertheless suspicious of what the elites are up to.   And so, for example, when a panel within the Obama Department of Health and Human Services put forth new and more restrictive guidelines calling for fewer mammograms, the public rose up and the new rules were withdrawn, although not before the “death panel” meme was born.   Interestingly, the same panel put forth similarly restrictive guidelines on prostate cancer screening, and those new rules have not been withdrawn--perhaps a  reminder that prostate-cancer-minded men are not as organized and energized as breast-cancer-minded women.  Meanwhile, the cost-controlling effect of the Independent Payment Advisory Board, part of the Affordable Care Act of 2010, remains to be seen.  But here’s a prediction: IPAB will be much more effective at controlling abstract costs, defined as future speculative research, than it will be at controlling tangible costs, defined as money flowing directly to patients and caregivers.  In other words, IPAB will impose “savings” in exactly the sort of research that could ultimately save lives.   In the past, the federal government has been good at making long-term investments, e.g., the railroads, aviation, and the Internet.  But in the current political environment, the healthcare imperative is for immediate savings--in time for the next fiscal year, or the next election.

Third, we now see the additional pressure of the “deficit hawks,” culminating in the so-called Super Committee, which has raised the static-analysis view of deficit-reduction to the pinnacle of national thinking.   Official Washington will be happy if there’s a deal in the next few days or weeks--any deal.  It’s not hard, of course, to find skeptics who believe that the spending restrictions will not be meaningful, but it would appear that the Establishment has settled on the idea that an agreement of some kind is desperately needed--if only, some might say, to save the same Establishment from losing face.  Yet if and when those possible spending caps are broken, it’s more likely that immediate costs--say, increasing payments to doctors or hospitals--will be accommodated, as opposed to longer-term research.  So once again, cancer researchers and developers are on notice; the real money will be in treating cancer, not in beating cancer.  And the same will hold true for other diseases, such as Alzheimer’s.   The care may ultimately cost more than the cure, but the feds are interested in paying only for the care.  And as always, we get what we pay for.

Back to Avastin: If the drug is used less, that’s a savings to the government, in the short run.    Yet as the population ages, diseases such as cancer--as well as other illnesses, such as Alzheimer’s--seem destined to become more prevalent, and the nation will have to bear the  expense.   So while the price-controlling approach to cancer research is likely to “work” in terms of restricting cancer drugs, it is ultimately doomed to fail as a means of controlling costs.   Caring for increasing numbers of sick people for long periods of time is costly--and those people, by the way, are voters.

So how can prices for healthcare be lowered?  The answer is the same for medicine as for everything else--improved productivity, getting more for less.  That’s been the secret of the Scientific Revolution over the last four centuries, and of the Industrial Revolution over the last three.  As Adam Smith explained in The Wealth of Nations, developing a more efficient way to make something as simple as a pin could increase overall output by a factor of 240--that’s 24,000 percent.  Such gains have been routine over these hundreds of years, accounting for the material abundance that we enjoy today.  So it’s perverse that all the aforementioned policy elites are following a different policy path when it comes to medicine.  Instead of saying, “Push ahead, so that we can have more for less,” the elites have taken an anti-Smithian stand; they have taken a neo-Malthusian stand, arguing for rationing and scarcity.  And such neo-Malthusianism is the ultimate animating philosophy behind the FDA’s decision against Avastin.  If everybody “knows” that we need to cut back and make do with less, here is the FDA’s opportunity to be on “the right side of history.”
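For readers who want to see where that factor of 240 comes from, here is a quick back-of-the-envelope sketch, assuming the figures Smith himself reports in Book I, Chapter 1 of The Wealth of Nations--ten specialized workers turning out roughly 48,000 pins a day, versus perhaps 20 pins a day for a lone, unspecialized worker:

```latex
% Back-of-the-envelope check of the pin-factory arithmetic, assuming Smith's
% own figures: 48,000 pins/day from 10 specialized workers, vs. about 20
% pins/day for a lone worker.
\[
\frac{48{,}000 \div 10}{20} \;=\; \frac{4{,}800}{20} \;=\; 240
\quad\Longrightarrow\quad
240 \times 100\% \;=\; 24{,}000\% \text{ of a lone worker's output.}
\]
```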

So the challenge for the rest of us is to rediscover Smith, and to reject Malthus yet again.  We must apply Smithian wisdom to the systematized research, and mass production, of medicine--that is, take the time-tested scientific and industrial principles of growth and insist that they be applied to medicine.  And if we do that, Avastin will be seen in a new light.  The drug may or may not prove to be a great cancer treatment, but surely, at minimum, the use of the drug will save some lives, as well as help teach us what works against cancer.  Edison didn’t get the lightbulb right the first time he tried, nor did Einstein develop the theory of relativity in the first draft.  The process of discovery can be lengthy--and expensive.  But as we have seen, the cost of non-discovery is even greater.

This further point--that we learn by doing, as millions of actors set in motion a Hayekian process of discovery that no bureaucrat could plan for, or account for--is worth pausing over, because it speaks to what saves lives in medicine.  

A powerful illustration of discovery in action comes from Harvard economist David Cutler, who describes the process by which heart disease has become vastly more survivable and vastly less expensive on a per-patient basis.  Cutler recalls that in 1955, President Dwight D. Eisenhower suffered a heart attack.  His doctors prescribed . . . bed rest.  That was the best they could do, even for the commander-in-chief, the leader of the free world.  The remedy was certainly low-cost, although in Ike’s case no expense would have been spared.  In fact, Cutler comments, the treatment Ike received was counter-productive: “We know today that bed rest is ineffective.  It does not prevent further heart damage, and it can lead to other complications, such as blood clots in the veins and lungs.”  In other words, the treatment for the heart attack was making the president’s condition worse.  Early failure is a familiar enough phenomenon in any scientific inquiry, and medicine is no exception.  The challenge, therefore, is to keep pushing forward, figuring it out as one goes along.  Such problem-solving is the basic method of all science and all engineering.

By the 1970s, Cutler continues, open heart surgery had become common.  Such procedures were an improvement, albeit with huge drawbacks; any patient who spends time in a hospital runs the risk, for example, of nosocomial infections--that is, infections acquired in the hospital.  Such infections are estimated to occur in five percent of all acute-care hospital stays, causing perhaps 70,000 deaths a year.  But even as progress was being made on such surgeries, the development of alternatives continued.  In the 1970s, the first angioplasties were attempted, and in the following decade, coronary stents emerged.  Drugs emerged, too, such as statins.  Meanwhile, science became more aware of dietary and lifestyle issues as they affect heart disease, giving people new tools to improve their own health and longevity.  In addition, that old medicine-chest standby, aspirin, was now seen in a new light.  So we can see that for many, the advance of science has led to some surprisingly simple and elegant solutions, based not on faith or superstition, but on a century of accumulated scientific wisdom.  When a basic problem is solved, it stays solved, at minimal cost; for example, for as long as people want to use the wheel, the wheel will work, and for as long as people wish to avoid rickets, Vitamin D will work.  And at the same time, we have developed the sort of heavy scientific machinery, including the pacemaker, that is keeping, for example, Dick Cheney alive.  The cumulative wisdom of simple solutions, together with complex solutions, has worked: as Cutler observes, mortality from heart disease has fallen by roughly three-fourths since Eisenhower’s time.  And that’s been a huge boost to our society and economy; unfortunately, the federal bean-counters have chosen not to notice, and so the positive-feedback impact of cures has never been factored into national budgeting.

So that means, unfortunately, that progressive scientific health solutions--as opposed to redistributive bureaucratic health semi-solutions--have never been taken seriously by the budget “experts.”  And so, absent that policy support, we haven’t made as much progress on some other diseases.   If the healthcare policy elites could forget their training and bring themselves to see medical progress as a fiscal winner, of course they would demand the sorts of changes in the legal and regulatory environment that would foster more and better medicine.  But at the rate we are going, they won’t change, and so the inhibitory environment won’t change.  

So the Avastin decision is a sign of the times, a part of the problem--and certainly not part of the solution.

This piece was cross-posted at the Manhattan Institute's Medical Progress Today.