In the mid-1800s, when cholera was killing people by the thousands in South Asia, British historian and member of the Indian Civil Service W. W. Hunter reflected on the pilgrims who flocked to the Jagannath temple in Puri, India: “the squalid pilgrim army of Jagannath, with its rags and hair and skin freighted with vermin and impregnated with infection, may any year slay thousands of the most talented and beautiful of our age in Vienna, London, or Washington.”
This statement reflects more than the pervasive racism of the time; it shows a basic understanding of disease as potentially infectious, something that can spread from person to person. Breakthroughs in Western science during the colonial era expanded this understanding. European doctors studied anatomy and germs, using the scientific method to develop cures that could combat disease. This created the field of biomedicine, which focuses more on fighting diseases than on promoting overall health and wellbeing. Biomedicine, which dominates today’s health care, seeks to provide people with immunity to diseases through vaccination and to identify the microorganisms responsible for illness.
Biomedical developments in Western science generated power, in the form of knowledge, for European colonizers: this knowledge enabled them to decide which behaviors and conditions were normal and which were pathological, and it gave them a superior understanding of how to counteract injury or disease whenever they judged it beneficial to do so. Colonizers held immensely powerful leverage over populations that did not share their knowledge of the body and its anatomy. Colonial medicine projects involved mass vaccinations and forced quarantines targeted at the kinds of diseases that could spread through populations, such as cholera and smallpox.
European colonizers would have understood the harmful consequences of diseases like cholera, but they would also have known about the effects of starvation, malnutrition, and dehydration that resulted from their subjects’ inadequate access to food, water, and other essential resources. These ailments, however, were never the focus of health campaigns in the colonial era. If an indigenous person were dying of dehydration, the health consequences would fall only upon that person. If, on the other hand, an indigenous person were dying of smallpox, there existed the possibility of transmitting the disease to a European body.
Colonial medicine was only one of the tools of empire used to justify colonial presence: the violence that occurred as a result of its implementation was collateral damage in the civilizing mission. Western law, like Western medicine, was also employed in colonial states to organize and establish colonial presence. In her book Juridical Humanity, Samera Esmeir draws on British rule in Egypt to argue that colonialism did not simply dehumanize subjects, but rather established a transformative process through which the colonized were required to adapt to their oppressors’ strict definition of ‘being human.’ The British created a juridical category of “human” and incorporated Egyptian people into this category, using legal language of human rights to legitimize colonial oppression. This allowed the British to exert violence that Esmeir describes as “properly measured, administrated, and instrumentalized... Only pain that serves as an end is admitted. Useless, non-instrumental pain is rejected.”
As Sokhieng Au describes in Mixed Medicines, the colonizers “exploited both by imposing Western medicine and by withholding it.” The decision to impose Western medicine upon the person with smallpox but to withhold it from the person with dehydration is a form of violence akin to the selective humanization process that Esmeir describes in Juridical Humanity. Medicine allowed colonizers to decide whose body needed caring for and whose body was disposable, much as the law allowed the colonizers in Esmeir’s examples to decide who was human and who was not.
The end of World War II ushered in a new era, marked by the dissolution of European empires and the birth of many new nation-states. Health was universally declared to be a human right, and standards of health care were designated as major developmental concerns. Alongside the creation of many international development programs, the UN founded the World Health Organization (WHO) in 1948 as a specialized agency to deal with problems in international public health. The WHO is dedicated to ensuring that the human right to health is upheld around the world, stating in its constitution that “the highest attainable standard of health [is] a fundamental right of every human being.”
How does one fulfill this goal of ensuring universal access to high standards of health care? Medical anthropologists such as Paul Farmer advocate a biosocial approach to dealing with disease. Under this approach, diseases are not seen simply as biological phenomena. The likelihood of developing an illness is strongly correlated with social factors such as poverty and underdevelopment, as these conditions dictate a person’s access to clean water, adequate nutrition, and hygienic surroundings. Factors such as gender, race, and political environment are also strong determinants of susceptibility to illness. Biomedicine and advancements in biotechnology thus cannot be seen as “magic bullets” that save the lives of people suffering from disease. Medicine might help individuals, but as long as social and developmental problems are left unaddressed, for every person cured with biomedicine another person will fall sick. The social and political environment itself facilitates sickness.
Despite the strong body of knowledge indicating the potential effectiveness of a holistic, biosocial approach to treating disease worldwide, national and especially international health efforts after WWII have rarely attempted to solve global health problems in such a manner. Biomedicine and other immediate provisions of medical services are cheaper than long-term investments in infrastructure (like sustainable health systems), so most aid agencies rely on “cost-effective” strategies such as vaccinations and other immediate interventions to promote health worldwide. These strategies remain plagued by the legacies of colonial medicine.
A classic example of an international health campaign that perpetuated the colonial mentality under the guise of cost-effectiveness is the Smallpox Eradication Programme (SEP), which the WHO led in the 1960s and ’70s to combat smallpox in India and Bangladesh. At the time, India had the highest incidence of smallpox outbreaks in the world. In response, the WHO sent Western physicians to “supervise” medical professionals in India and try to contain smallpox in the country. Their real mission was clear: eradicate the disease in a cost-effective way. To that end, the SEP organized mass vaccinations across the country and set up quarantines to contain people who were carrying the disease.
Accounts of the Smallpox Eradication Programme collected by the historian Paul Greenough feature narratives of American and British physicians breaking down the doors of houses and pinning people to the floor in order to vaccinate them. Crowds of medical staff, led by Western physicians, surrounded entire villages so that their inhabitants could be vaccinated with or without their consent. There was a far greater sense of urgency on the part of the West to eradicate smallpox in India than there was on the part of Indians themselves. This sense of urgency, of course, confused the Indian population, who were facing myriad problems aside from smallpox. At the time, there were non-infectious health problems such as soaring child and maternal mortality rates, alongside social, political, and developmental challenges such as an influx of Bangladeshi refugees fleeing war in their home country, starvation, lack of access to clean water, and rural isolation.
The motivations of the WHO to eradicate smallpox in India were, at least to some extent, rooted in the principles that drove colonial medicine. Smallpox is an infectious disease that had the potential to spread to the developed world through ever-increasing globalization and trade. There was nowhere near as much attention given to the other problems India was facing, health-related or otherwise, as there was to the eradication of smallpox. Greenough’s accounts of the SEP contain one anecdote of an elderly woman who refused vaccination unless she was given food: “Why do you care if I die of smallpox? I am already dying of starvation. Give me food instead.”
After WWII, post-colonial states inherited the cultural logic that lay at the foundation of colonial medicine. Medical anthropologist Steve Ferzacca writes in the Encyclopedia of Medical Anthropology that the binary European colonizers drew between the dirty, wild, and magical societies of the colonized and their own clean, modern, science-driven civilization was absorbed and appropriated by post-colonial governments: “Ironically, the very categories of rule imposed by colonial regimes became measures of progress and development for post-colonial states.” The language of this binary was revised: no longer was there native versus European; now there was traditional versus modern, rural versus urban, developing versus developed, and Third World versus First World.
The use of cost-effective strategies to deal with global incidences of disease can be viewed as a façade covering up the fact that international health campaigns continue to perpetuate the legacies of colonial medicine by operating along these colonial binaries. The most significant targets of international health agencies like USAID, The Global Fund, and The Clinton Fund (all of which are headquartered in the US and Europe, with American and European administrators) are diseases such as malaria, HIV, and neglected tropical diseases in the developing world. It seems naïve to assume that the work done by these organizations is motivated solely by the desire to improve the health and wellbeing of people around the world. Their efforts are dedicated to containing infectious diseases and preventing their spread from rural populations into urban populations, from developing countries into developed countries, from the Third World to the First World. What they call “aid,” others might call “security measures.”
The human right of access to health does not mean access to shipments of vaccines during an outbreak of an infectious disease; it means being allowed to live in conditions where the risk of disease is actively minimized. The trend has been observed time and time again: there is an infectious disease outbreak in a country prone to such incidents because of underdevelopment and instability; the Western world jumps in to help the people of the afflicted country; the disease is eradicated through biomedical interventions; then the West pulls out its funding and resources once the threat of infectious disease has been minimized.
A decade ago, for every $0.50 it spent per death by non-infectious disease, the WHO spent $7.50 per death by infectious disease. This money was spent on programs such as vaccination campaigns, quarantines, and the provision of preventive technology such as water filters and mosquito nets. Much controversy has arisen over the decisions of global health bodies and international aid agencies to focus their efforts almost entirely on eradicating infectious diseases in the developing world. This approach ignores the double burden of disease that developing countries face: low-income countries (even the poorest ones) have more deaths from non-infectious diseases than from infectious diseases. Additionally, low-income countries have greater incidences of non-infectious diseases than developed countries do. Non-infectious diseases require far more attention and resource allocation than what was being devoted to them at the time. Even as the WHO tries to reduce this skewed ratio at present, 44.4 percent of its budget is dedicated to combating infectious diseases, while only 8 percent is dedicated to combating noncommunicable diseases and just 13.4 percent is set aside for helping create stable health systems.
Colonial medicine was used to protect the health of the European colonizers, and the conduct of current global health campaigns still seems to follow a similar principle. Victor G. Heiser, director of health in the Philippines during the American occupation at the start of the 20th century, stated, “As long as the Oriental was allowed to remain disease-ridden, he was a constant threat to the Occidental.” More than 100 years later, Gayle Smith, a director at the White House National Security Council, stressed the importance of taking action to fight the spread of Ebola by referring to it as “not just an African disease,” evoking the same language as Heiser in calling it a “threat to humanity.”
The WHO has declared the current “global emergencies” to be HIV/AIDS, tuberculosis, and malaria. Hysteria and public outcry have erupted over several infectious disease outbreaks in recent years, such as SARS, bird flu, swine flu, and (more recently) Zika. The fact that people in the developing world are dying from non-infectious diseases like cancer, or from broken health care systems and structural violence such as terrorism and ethnic conflict, does not constitute a global emergency. The emergency arises only when people can transmit the threat of death to the developed world in the form of infectious disease.
Rani Chumbak B’16 is using a pen name.