Shafts of sunlight filter into the darkened interiors of long-abandoned homes on Waco Street, in the heart of Houston’s Fifth Ward, an impoverished neighborhood of mostly black and Latino residents two miles from downtown. Ash-coated vines snake through front yards. And in nearby empty lots, islands of trash—old clothes, disemboweled couches, cardboard boxes—float in fields of weeds.
The uncollected trash of residents combines with the garbage that spills over from the rest of the city. As the environmental-justice pioneer Robert Bullard noted in a seminal 1983 report, Houston’s solid-waste facilities are disproportionately clustered in poor, predominantly black neighborhoods like the Fifth Ward (at the time, they were home to five of the city’s six landfills). The garbage sprawls alongside unlined ditches choked with debris. While the prosperous areas of the city enjoy storm sewers with curbs and gutters, in 88 percent of Houston’s poor, minority neighborhoods, unlined ditches are the sole form of drainage.
When it rains—about 50 inches fall on Houston every year—water collects in the ditches and in the nooks and crannies of garbage-strewn lots and empty homes, providing ample breeding sites for mosquitoes. Armies of insects hatch every few days, the females fanning out into the subtropical night in search of blood to nourish their incubating eggs. People slumbering in dilapidated houses with broken window screens make for easy pickings.
As the mosquitoes pierce the skin of their victims, whatever pathogenic microbes roost inside the insects’ bodies slip in, too. In 1964, those microbes included the St. Louis encephalitis virus, which causes deadly brain infections in older people. Roughly 10 percent of those infected died. In 2002, the mosquitoes brought the West Nile virus, sickening its victims at 20 times the rate that the SLE virus did. The following year came dengue fever, which threatens children in particular with an Ebola-like hemorrhagic disease. Then, in 2014, the local mosquitoes were found to be harboring the chikungunya virus, which can cause debilitating joint pain for months.
Finally, in February 2016, with the mosquito-borne Zika virus looming over Houston and the rest of the Southern United States (and with the country in the midst of a fractious presidential election), the Obama administration asked Congress for $1.9 billion in emergency funds for a national effort to fight the mosquitoes that had long plagued places like the Fifth Ward. When the Republican-controlled Congress refused, public-health experts and others railed at their intransigence. “Republicans are going to look back on this time that they’ve had to act on the Zika virus and deeply regret it,” a White House spokesperson warned.
In truth, a Zika epidemic was already a fait accompli in scores of neglected Gulf Coast neighborhoods. Bullard had warned that conditions in the Fifth Ward were a public-health disaster in the making back in 1983. Residents, too, had long known that their clogged ditches and dilapidated houses made them vulnerable to mosquito-borne diseases. “With all this water standing,” one community activist said in 2002, “they have this mosquito virus going around with this [West] Nile thing. It’s unhealthy. We need new streets, we need drainage.” The city of Houston itself admitted, in mid-2015, that the open ditches common in minority neighborhoods were “inadequate” for draining water away.
By 2016, it was already decades too late to prevent an outbreak. With the virus set to arrive this summer, the best that can be done is to hunker down in the face of a wave of untreatable infections.
* * *
Over the past 70 years, more than 300 new infectious pathogens have emerged in territory where they’ve never been seen before. In the last two years alone, two ghastly viruses—Zika and Ebola—have burst their geographic borders, surging into vulnerable new populations. And just this past April, a strain of E. coli impervious to last-resort antibiotics was discovered in a patient in Pennsylvania and in the tissues of a slaughtered pig, heralding a potential new era of untreatable infections that could transform modern medicine. Experts are bracing for a catastrophic pandemic—one that will sicken 1 billion, kill 165 million, and cost the global economy trillions of dollars. The conventional wisdom is that this rising threat is primarily a biomedical problem. But with newly emerged pathogens, some of our most important defenses are political.
To protect populations from epidemics for which no treatments exist, we need to address the underlying conditions that fuel their eruptions. In neglected communities like the Fifth Ward, that means the garbage crisis, poor drainage, and substandard housing, among other things. But for decades, we’ve refused to take the necessary action.
The failure originated in the 1970s rise of neoliberalism and the $300 billion pharmaceutical industry, and the resulting stranglehold that private interests have placed on the independence and integrity of our public-health institutions. That stranglehold has left us vulnerable to Zika, as well as to new kinds of drug-resistant superbugs, avian influenza, and other emerging pathogens.
Many of these pathogens are being driven into human populations through the global expansion of commercial activities. Mining, logging, agriculture, and urban development in the last refuges of biodiversity are introducing new pathogens into human populations. When wildlife and humans are forced into novel, intimate contact, the microbes that live in animals’ bodies jump over to ours. Ebola is one example, carried by fruit bats that once teemed in vast expanses of forests across sub-Saharan Africa, but which are now restricted to much smaller remnant forests, often in close proximity to human settlements. Avian-influenza viruses are another example, carried by migratory waterfowl that once flew over undeveloped tracts of Asia, but which now fly over and infect rapidly growing factory farms, making the strains much more virulent. Airlines and international-shipping companies, as well as the opportunities afforded by climate change, spread these pathogens to susceptible populations across the globe. And the kind of rapid economic growth that leads to crowded, unplanned urban centers provides ample fuel to spark epidemics.
Public-health regulations that target these conditions could deprive pathogens of the opportunity to erupt and spread. This approach would be far more effective than waiting for epidemics to develop and then scrambling to devise drugs and vaccines to tame them. But in many cases, it would require that the public-health sector confront corporate interests—and decades of corporate influence on public-health agencies have made that increasingly unlikely.
What’s happened at the World Health Organization is emblematic of the way private interests have commandeered the public-health agenda. The WHO is a United Nations agency governed by top health officials from the UN’s 194 member nations. Since its founding in 1948, its signal achievements include coordinating the global eradication of smallpox and the international vaccination campaigns that have slashed deaths from measles, tetanus, pertussis, diphtheria, and polio by 60 to 98 percent. But when, in the 1970s and 1980s, the WHO targeted certain corporate practices, such as the marketing of infant formula, pesticides, and tobacco, the reprisals were swift. Wealthy industrialized countries accused the WHO, along with other UN agencies like UNESCO and the International Labor Organization, of being “politicized.” Over the following decades, the UN system was slowly starved of public funding. In 1980, the major United Nations donors introduced a policy of zero real growth in UN budgets; in 1993, the policy became one of zero nominal growth. The WHO’s ability to stanch epidemics in the face of rising private power steadily declined.
The most recent example concerns the agency’s handling of the 2014 Ebola epidemic in West Africa. One of the WHO’s functions is to promptly alert the international community about epidemics that could spread across borders. That was crucial in the first days of the Ebola outbreak: There are no cures for the disease, but an epidemic can be contained if each infected patient is isolated to ensure the virus doesn’t spread to others. However, sounding the alarm about an outbreak can suppress trade, tourism, and travel, damaging corporate bottom lines. For months, the WHO delayed declaring a global health emergency. As leaked e-mails acquired by the Associated Press revealed, officials were reluctant to “frighten investors” in the region’s multinational mining industry, as one aid doctor put it. Local WHO officials failed to send reports on Ebola to WHO headquarters; WHO officials in Guinea refused to get visas for Ebola experts to visit. Top officials at the WHO, including the director general, Dr. Margaret Chan, expressed similar reluctance. Chan declined to call an emergency, fearing that commercial interests might perceive such a declaration as a “hostile act.”
While the WHO dithered, Ebola infected scores, doubling its toll every couple of weeks. By the time some of the most well-equipped responders arrived, it was too late: The epidemic had already killed over 10,000 people.
Less severe but similarly punitive budget cuts befell the Centers for Disease Control and Prevention, the US government’s premier health-protection agency, when it dared to challenge entrenched private interests. In 1993, research funded by the CDC found that the presence of guns in the home increased the risk of homicide. The logical result would have been public-health regulations on gun ownership. But before such regulations could come to pass, concerted NRA lobbying efforts pressured legislators to slash the CDC’s budget by $2.6 million—the precise sum that the agency had spent on researching gun-violence prevention the previous year. The cut had a “dramatic chilling effect,” as the nation’s top medical associations wrote in a recent letter to Congress, leading to a de facto ban on gun-violence research that persists today.
The lesson to the public-health community has been clear, if unstated: Challenge private interests and watch your funding vanish.
* * *
In response to budget cuts, many public-health agencies now secure funding directly from corporations, including those whose business activities contribute to and profit from disease. In 1983, the CDC started accepting “gifts” from private parties to fund its work. In 1992, Congress created the CDC Foundation, which actively raises money from the private sector for the agency’s public-health projects, with occasionally dubious results. When agrochemicals and harsh working conditions were suspected as factors in the chronic kidney disease suffered by sugarcane-field workers in Central America, the sugar industry financed CDC research into whether the workers’ genes were to blame instead. In 2012, the biotech company Genentech paid the CDC $600,000 to promote its products aimed at diagnosing and treating hepatitis C.
The professional interests of the public-health establishment and the pharmaceutical industry are now so aligned that top officials enjoy a “revolving door” between the two. The president of Merck’s vaccine division, for example, was formerly a director at the CDC; the head of Sanofi-Aventis’s research labs is a former director of the National Institutes of Health.
The WHO has similarly hitched its budget to the coffers of private interests. To make up for declining dues from member nations, the WHO started accepting “voluntary contributions” from private philanthropies, companies, NGOs, and others. These donors, not the WHO, decide how the funds are allocated. In 1970, such voluntary contributions accounted for a quarter of the agency’s spending; by 2015, they made up more than three-quarters of its $3.98 billion budget. As Dr. Chan admitted in an interview with The New York Times, the WHO’s activities are no longer decided by the agency’s public-health experts. They are “driven by…donor interests.”
As a result, drug companies that stand to lose billions from the use of cheaper generic medicines now help to determine the WHO’s policies on access to medicine. Similarly, insecticide manufacturers help set the WHO’s malaria policy—even though the market for their products would vanish if the disease were actually eradicated.
These outside donors have introduced a pronounced mismatch in the agency’s work. While the WHO’s regular budget is allocated to illnesses in proportion to their global-health burden, not so its voluntary contributions. According to an analysis of the agency’s 2004–05 budget, 91 percent of its voluntary contributions were earmarked for diseases that account for 11 percent of global mortality; only 8 percent went to noncommunicable conditions like heart disease and diabetes, which cause almost half of all deaths worldwide, but also implicate industry interests.
Health research is similarly biased toward industry-friendly solutions. The National Cancer Moonshot Initiative, a $1 billion research effort announced in January 2016, is a fitting recent example. Two-thirds of cancers in the United States are the result of obesity, smoking, and alcohol consumption—areas in which the fast-food, tobacco, and liquor industries are heavily invested. The cancer toll could be cut in half by reducing Americans’ consumption of tobacco products alone, and a range of public-health research could help explain how. But Moonshot, when it was first announced, included “virtually no mention of the importance of cancer prevention,” STAT News reported. Instead, the campaign, directed by a former Pfizer executive, will primarily focus on finding new treatment drugs. When outraged public-health experts complained, two additional research areas were added: vaccines and diagnostic tests. Neither takes on the industry activities that have helped create the cancer epidemic.
* * *
Even when the industrial drivers of deadly epidemics are clear, the public-health establishment has been dangerously slow to act. The accelerating crisis caused by antibiotic-resistant pathogens is a dramatic example. It has long been known that the medically unnecessary use of antibiotics leads to the emergence of drug-resistant pathogens. The danger was first outlined by Alexander Fleming, the scientist who discovered penicillin in 1928. But in the years since, “Fleming’s warning has fallen on ears deafened by the sound of falling money,” as one microbiologist put it. Today, 80 percent of antibiotics in the United States are used for commercial reasons with no medical purpose—for example, to fatten livestock for the market faster. And just as Fleming predicted, microbes have evolved a resistance to them.
For years, colistin, a drug seldom used in humans, has served as the antibiotic of last resort to tame the most highly drug-resistant infections. Last November, a gene called MCR-1, which allows bacteria to resist colistin, was discovered in pigs in China, where 12,000 tons of colistin are used in agriculture every year. It’s since been found in over a dozen countries, including the United States, where it recently turned up in a Pennsylvania woman with a urinary-tract infection.
The strain that infected the Pennsylvania woman was endowed with MCR-1, along with the ability to produce enzymes that break down another class of antibiotics called cephalosporins. Fortunately, it was still susceptible to carbapenems, which are also used in hospitals to treat multi-drug-resistant bacteria. But probably not for long: These genes are mobile and can mix with other bacterial strains. And carbapenem-resistant strains are already present in the US. Once a colistin-resistant strain exchanges genes with a carbapenem-resistant one, the likely result is a “nightmare bacteria” that no antibiotic can tame, the CDC says.
It’s difficult to overstate the effect such a strain would have on health and the practice of medicine. Even minor injuries and common infections would kill. Few medical procedures, from knee-replacement surgeries to bone-marrow transplants, would be worth the risk of infection. “All medical feats will come to a stop,” the medical microbiologist Chand Wattal says.
This is a manufactured epidemic that could be effectively controlled through regulation. Countries that restrict the use of antibiotics to medical purposes suffer few if any problems with drug-resistant superbugs. But such regulations would cut into the profit margins of the livestock and pharmaceutical industries. Even as the toll rises in this country, the attempts by public-health agencies to slow the commercial use of antibiotics have foundered.
The FDA first proposed removing antibiotics from use in livestock for growth promotion in 1977. However, Congress, under the influence of the farm lobby, shelved the proposal. By 2002, the FDA said it would only regulate use of the drugs in livestock if such use could be proven to cause drug-resistant infections in people. Even experts who believe that it does admit the connection is nearly impossible to prove. Finally, in 2012, in response to a lawsuit filed by a coalition of NGOs, a federal court ordered the FDA to regulate the practice anyway. In December 2013, the agency issued a set of voluntary guidelines on antibiotic use in livestock—but it was so full of loopholes that one activist called it “an early holiday gift to industry.”
A series of guidelines released by the Obama administration in September 2014 fell roughly into two categories: those that would restrict the use of antibiotics, and those that would address the problem by fostering the development of new antibiotics and diagnostic tests. Tellingly, the former were delayed until 2020, pending the actions of a new advisory council and task force, while the latter were fast-tracked: The administration promptly announced plans to provide the pharmaceutical industry with a $20 million prize for the development of a rapid diagnostic test to identify highly antibiotic-resistant bacteria.
The reason for the hefty incentive is that under normal market conditions, the industry has little interest in developing new products to fight antibiotic-resistant bacteria. The market value of a brand-new antibiotic is just $50 million—a paltry sum considering the billion-dollar jackpot that a company can reap by selling drugs that patients must consume for decades. Despite the desperate need for new antibiotics, 15 of the largest 18 drug companies have dropped out of the antibiotics market altogether.
In the meantime, at least 23,000 people in this country die of antibiotic-resistant infections every year. Many more suffer dangerous infections against which only a select few antibiotics work. A post-antibiotic era, when infections can no longer be cured by such drugs, approaches.
The coming wave of Zika infections is a manufactured epidemic, too, in a way. One of the mosquitoes that can spread the virus, Aedes albopictus, was brought to this country via the used-tire trade with Asia, which took off in the 1970s. Had those shipments been quarantined and inspected, that species might never have settled here. But government regulations on invasive species are limited to those that threaten agribusiness, not human health. Vigorous public-health regulations could also have targeted the developers, waste-management companies, and others who turned neighborhoods like the Fifth Ward into vast mosquito hatcheries. Such regulations on industrial and military development had been successfully used in the past to prevent malaria outbreaks. But that was before biomedical products like insecticides were widely available.
And so today, public-health agencies have little choice but to wait for Zika to strike. There’s much that can and should be done to minimize its damage. But each prong of the federal strategy to contain the disease—protecting pregnant women from mosquito bites, reducing mosquito populations, and developing a vaccine—is highly inadequate. It’s not possible to provide 100 percent protection against mosquito bites, nor to reduce mosquito populations permanently. The best we can do with our toxic, short-acting insecticides is temporarily knock down mosquito populations. A Zika vaccine would help, but it could take three years to develop, in the best-case scenario. By then, millions will have been infected, and most of us will already be immune.
In mid-April, a massive storm hit Houston. Eighteen inches of rain fell in less than 24 hours—the city’s fourth major flood in the past year. The neglected corners of Houston were inundated by stagnant waters—and the mosquitoes they bred—for weeks after. The governor of Texas declared a state of emergency. “The floodwaters,” one meteorologist noted, “won’t disappear anytime soon.” Nor will the threat of Zika and the other pathogens lurking in its wake.