Friday, September 4, 2015

They say that there are many cases of malaria here in DC...



DATA AND MALARIA RISK

The analyses in this article are based on historical data collected by the “Early Indicators of Later Work Levels, Disease, and Death” project.7 The primary sample for this project consists of 35,570 white males mustered into the Union Army during the U.S. Civil War, who were chosen randomly from the military company books stored at the National Archives in Washington, D.C.8 The veterans are linked not only to recruitment, military, and medical records while in service, but also to pension and surgeon’s certificates data if they entered the pension system after being discharged. Because the data set provides detailed individual health information, we can determine how stressful their wartime experiences were, what specific diseases and injuries they suffered from over the life cycle, and what caused their deaths. With respect to the two main health outcomes considered, veterans’ height at enlistment was obtained from the recruitment records and the specific diseases that each veteran suffered from during the Civil War were provided by the military records.
The key ecological variable is the risk of contracting malarial fever (hereafter referred to as “malaria risk”) before enlistment. In order to control for the extent to which Union Army veterans were exposed to malaria risk in childhood and at young ages, I have searched for vital statistics showing county- or township-level malaria mortality or morbidity around 1850. However, I found that no reliable data are available for the period and that using malaria mortality data could be misleading. To overcome this limitation, I estimate county-level malaria risk, using epidemiological theories that say that malaria risk is determined primarily by environmental factors. The principal data for risk estimation are found in surgeons’ reports on the annual incidence of malarial fevers among soldiers at 143 U.S. forts in the mid-nineteenth century. I first estimate the correlates between malaria incidence rates at forts and environmental factors around those forts such as temperature, rainfall, and elevation. Then, using these correlates and county-level environmental factors, I estimate the malaria risk for U.S. counties that do not have reliable statistics.
The estimation results are depicted in Figure 1, which shows the malaria risk in mid-nineteenth century America. Risks were high in the Southern and Southeast regions and in the area along the Mississippi and Missouri Valleys and up through the Old Northwestern regions. I discuss the idea behind the risk estimation, the data sources, the estimation procedures, and their results in Appendix 1. Some issues regarding measurement error and selection bias in the risk estimation are discussed in Appendix 2.
FIGURE 1
ESTIMATED MALARIA RISK OF U.S. COUNTIES IN THE 1850S
In short, this study investigates Union Army veterans’ height at enlistment and susceptibility to other infections during the Civil War, controlling for the estimated malaria risk of the counties where veterans lived in 1850.9 To obtain veterans’ residence information and socio-economic status in early life, I searched for those who are listed in the 1850 federal census records. In the next section, where I estimate the impact of malaria on height and nutritional status in childhood, I limit my analysis to the 1,067 U.S.-born veterans who were under age five in 1850.10 In the analysis estimating the effect of malaria on susceptibility to infections during the Civil War, I use all available 6,855 veterans who have a complete set of variables.
As all the Union Army veterans were enlisted from northern states, the sampling could under-represent people from certain malarial counties, a large number of which were located in the South. Table 1 presents the distribution of Union Army veterans and the general population by county of enlistment or residence in 1850 among five categories of malaria risk. Compared with the general population’s distribution among malarial areas, calculated from residence data of the 1850 Integrated Public Use Microdata Series (IPUMS) samples, the proportions of Union Army veterans who were enlisted or who lived in high malaria-risk counties (risk of 0.3 and over) prior to the Civil War were lower by approximately 12 percent. However, both samples are similarly distributed over the counties where malaria risk is below 0.3. Additionally, the distribution of Union Army veterans linked to the censuses is not significantly different from that of the entire Union Army sample, which is computed by malaria risk at place of enlistment. The samples used in the estimations of this article over-represent white males in the less malarial counties of the Northern States, but they represent the entire Union Army sample well in terms of malaria risk at county of residence. Consequently, this study’s results should be interpreted in light of these sampling considerations. But these selection problems are unlikely to seriously impair the results regarding the effect of exposure to malarial environments on lifetime health because the Union Army veterans lived in counties covering the full range of malaria risk.
TABLE 1
DISTRIBUTION OF UA VETERANS AND GENERAL POPULATION BY MALARIA RISK OF COUNTY OF RESIDENCE

CHILDHOOD EXPOSURE TO MALARIAL ENVIRONMENTS AND MALNUTRITION

A large number of studies have provided convincing evidence that malaria infection among pregnant women and children is a matter of grave concern. However, these concerns are not limited to infant malaria mortality. Much attention has been paid to the tremendous consequences that malaria infection in these groups has on children’s subsequent health and physical development. Various medical and epidemiological studies have reported that malaria infection during pregnancy increases the risk of maternal anemia and the probability of delivering a low-birth-weight baby by about 20 percent.11 As is well established, low birth weight is a leading cause of infant mortality. It can lead to serious problems related to body composition and musculoskeletal development in early life and may increase the likelihood of chronic conditions, such as cardiovascular diseases, diabetes, and kidney failure in late life.12 In addition, malaria is currently one of the most important causes of malnutrition in children. The infection significantly increases energy consumption for the immune response and causes anemia in children.13 Fever and recurrent episodes of malaria are also reported to reduce appetite and exacerbate malnutrition. Some studies argue that malaria may produce substantial global mortality through malnutrition.14 It is widely accepted that the influence of malaria on mortality, development, and nutrition in childhood is a critical issue today.
Mortality statistics in the U.S. censuses suggest that malaria was also a severe problem among infants in mid-nineteenth-century America. For instance, 36.5 percent of total malaria deaths in 1860 occurred among children under age five, while the population share of that age group was about 15.4 percent.15 Roughly speaking, this implies that malaria mortality risk in children aged less than five years was about three times higher than that of older age groups.16 As discussed, exposure to high malaria risk is connected to early health problems of malnutrition and impaired physical development. This section estimates the nutritional impact of malaria during the nineteenth century, using the records of Union Army veterans’ height at enlistment.
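A back-of-the-envelope calculation using only the two census shares just quoted illustrates the “about three times higher” statement:

```latex
\[
\frac{\text{malaria death rate, ages } < 5}{\text{malaria death rate, ages } \geq 5}
\;=\; \frac{0.365/0.154}{0.635/0.846}
\;\approx\; \frac{2.37}{0.75}
\;\approx\; 3.2 .
\]
```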
Final height has been widely accepted as a measure of nutritional status, especially of nutritional status during childhood. It has been well established that the reduction of malnutrition and increased standards of living have driven a dramatic growth in heights during the past two centuries.17 Many studies have revealed the connections between height and economic variables, such as agricultural shocks and real wage changes.18 But a decline in the height of the population was observed during the period of industrialization, even though industrialization could have enhanced the nutritional status of the population by promoting an improvement in quantitative standards of living. Another line of research has emphasized environmental impacts on height, offering evidence that substantial exposure to infectious diseases and unhealthy conditions driven by industrialization, urbanization, and immigration resulted in the deterioration of the final height of the population in mid-nineteenth-century America.19
The estimations provided in this study are based on U.S.-born veterans who were linked to the 1850 census and were under age five in that census year. As shown, this age group is the most vulnerable to malaria infection, and linkage to the 1850 census can be utilized to obtain the veterans’ family socioeconomic status in early childhood, which would greatly affect their nutritional status. As a crucial measure of disease environment, the estimated malaria risk of the county where veterans spent their childhood around 1850 will be included. Age at enlistment will capture the age-height relationship, and maternal nativity is obtained from the recruitment records. Using information provided by the 1850 census, family socioeconomic status is represented by total household wealth and a dummy variable accounting for whether the household head was a farmer. As variables reflecting environmental characteristics, the regression model contains both the dummy indicating whether the veterans lived in large cities and the occupation of the household head, because most farmers lived in rural areas.
There could be community characteristics favorable to children’s growth that are highly correlated with county malaria risk. In particular, the New England and Middle Atlantic regions were less malarial than the North Central and Southern regions, as shown in Figure 1, but the New England and Middle Atlantic regions possessed local features unfavorable to children’s height. First, although the rate of migration into the North Central and Southern regions had been increasing in the mid-nineteenth century, the ratio of immigrants among the white population was higher in the New England and Middle Atlantic regions and the average number of years since migration was lower.20 Second, these regions were more industrialized and populated, whereas the North Central and Southern regions were less developed during the time period studied. Thus, the shortest recruits were born in the New England and Middle Atlantic regions mainly because of immigration and urbanization impacts.21 Third, Michael Haines, Lee Craig, and Thomas Weiss have shown that local nutrient production determined disparities in the height of Union Army veterans, though access to nutrients would be affected by wealth and income as well.22 Malarial areas usually have high temperatures and sufficient rainfall for mosquito breeding. These climatic features could be favorable to farming as well as to malaria transmission. Accordingly, on average, children in the North Central and Southern regions could have been better nourished. In order to control for this mixture of local characteristics that are favorable and unfavorable to children’s nourishment, I include dummy variables for regions in 1850, based on five regional categories.
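A minimal sketch of this region-fixed-effects specification is given below. It assumes a linked veteran-level data set with hypothetical file and column names (malaria_risk_1850, height_inches, and so on) and illustrates the structure of model (1) in Table 2, not the article’s actual estimation files.

```python
# Sketch of the childhood-height regression with region fixed effects (hypothetical names).
import pandas as pd
import statsmodels.formula.api as smf

veterans = pd.read_csv("ua_veterans_linked_1850.csv")  # hypothetical input file

model = smf.ols(
    "height_inches ~ malaria_risk_1850 + age_enlist + mother_foreign_born"
    " + household_wealth + farmer_head + large_city + C(region_1850)",
    data=veterans,
).fit(cov_type="HC1")  # robust standard errors, one reasonable choice for a sketch

print(model.summary())
```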
The region-fixed-effects model provides a significant, negative coefficient of county malaria risk on height at enlistment, as presented in model (1) of Table 2. Veterans’ height increases with age at enlistment. Maternal nativity did not have a considerable impact on veterans’ height in this analysis, but this information rarely appears in the recruitment records. Household wealth significantly affected growth during childhood, which means that veterans who had grown up in more affluent families would have been better nourished. Those from farm households were better nourished than were children of nonfarmers and urban veterans. As discussed, the estimates of the regional dummies show that veterans in the New England and Middle Atlantic regions were shorter, on average, than were those in other regions.
TABLE 2
OLS REGRESSIONS: CHILDHOOD EXPOSURE TO MALARIAL ENVIRONMENT AND HEIGHT (dependent variable: height at enlistment (inches), mean = 67.07)
We can also think of other county factors that affect children’s nutritional status and may be correlated with county malaria risk. In particular, if malarial counties were poorer counties with unhealthy epidemiological environments, the actual effect of malaria on height could be smaller than the estimate in model (1) of Table 2. To explore this possibility, I include county wealth per capita in model (2) and county population mortality rates, reflecting general epidemiological environments, in model (3). As another measure of county environments, I include county population density in model (4) to examine the relationship between urbanization and malaria risk, and the effect of urbanization on height.23 But these new county variables do not diminish the effect of malaria risk on height, and no significant correlations between these county variables and veterans’ height at enlistment are found. Regarding the magnitude of the effect, the coefficient in model (5) implies that veterans who spent their childhood in the most malarial county were shorter by about 1.1 inches at enlistment than were those who grew up in the least malarial county.24
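Using the coefficient and the risk range reported in footnote 24, the magnitude works out as follows:

```latex
\[
\Delta\text{height} \;=\; \hat{\beta}_{\text{malaria}} \times (\text{risk}_{\min} - \text{risk}_{\max})
\;=\; -2.5268 \times (0.0138 - 0.4616) \;\approx\; 1.13 \text{ inches.}
\]
```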

PRIOR EXPOSURE TO MALARIAL ENVIRONMENTS AND IMMUNE DISORDERS

Another principal factor relating to malaria’s impact is the secondary infections that occur as a result of immune disorder. It has been well established that malnutrition caused by an infection plays a key role in increasing susceptibility to another infection by compromising immunity.25 This mechanism applies to various infections besides malaria. During infection, sustenance and stimulation of immune response requires increased energy consumption. This increased energy consumption causes protein energy malnutrition (PEM), which is a critical factor in susceptibility to infection. PEM also undermines the linear growth of children, leading to a further reduction in food intake and nutrient absorption, to catabolic nutrient losses, and to increased metabolic requirements. Subsequently, PEM and deficiencies in nutrients diminish and impair immune functions—both acquired immunity and innate host defense mechanisms—by reducing leptin concentrations and through various other mechanisms. In the end, this increases susceptibility to major human infectious diseases, particularly among children and those in low-income countries.
A large number of studies have made various findings supporting the association among infections, malnutrition, and immune disorder. Regarding the effect of malaria infection, these studies suggest that malaria can increase the risk of pneumonia and diarrhea by considerably suppressing host resistance to bacteria and by exacerbating nutritional status.26 Some studies reveal that malnutrition is generally understood to be a major risk factor in the onset of active tuberculosis.27 Other studies show that nutrient deficiency increases the risk of measles and all other causes of mortality by deteriorating the immune system.28
Compared with the contemporary period, individuals in nineteenth-century America were living under environmental conditions where various infectious diseases were more rampant.29 The pathogens of most contagious diseases had not yet been discovered and the sudden reduction in immunity and nutrition caused by an infection increased vulnerability to other contagions. The purpose of this section is to test the hypothesis that veterans who lived in malaria-endemic counties before enlistment were more likely to be susceptible to infections during the Civil War.
Because the war forced young people from heterogeneous environments into unhealthy and stressful conditions, the impact of early health-related events before enlistment on wartime health is complex. As Chulhee Lee discovered, veterans who grew up in healthy conditions could be more susceptible to infections during the Civil War because they likely experienced fewer infectious diseases at young ages and therefore were not resistant or immune to various diseases.30 Similarly, veterans from unhealthy counties were less susceptible to infections during the war, primarily because they may have already developed resistances to contagious diseases. Local malaria risk may or may not represent the general epidemiological conditions of the counties in which recruits lived before enlistment. Malaria itself rarely confers immunity, but deteriorates nutritional status and the general immune system.31 Thus, the results in this section provide new evidence that early exposure to malarial environments could have unique effects on susceptibility to other infections that differed from those created by exposure to general unhealthy conditions.
In the first analysis, I examine the overall effect of having grown up in environments with a higher incidence of malaria, using a dummy variable indicating whether veterans contracted any diseases during the Civil War as the dependent variable in the probit regressions. The key control variable in this analysis is the estimated malaria risk of the county where veterans lived before enlistment, particularly in 1850. To examine the role of accumulated health status in childhood and during growth periods, I include veterans’ height at enlistment. Socioeconomic status is controlled for by nativity, age at enlistment, occupation before enlistment in the form of dummy variables, and total household wealth, recorded in the 1850 census. The dummy showing whether they resided in large cities indicates the impact of exposure to unhealthy conditions such as poor sanitation. A variable indicating whether veterans’ initial rank in the Union Army was that of private is included to control for military position while in service. Finally, year of enlistment reflects variations in the severity of military missions, environments, and length of service.
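A minimal sketch of this probit specification, with hypothetical column names, is given below; the marginal-effects line makes the coefficients comparable to the percentage differences discussed next.

```python
# Sketch of the wartime-infection probit (hypothetical file and column names).
import pandas as pd
import statsmodels.formula.api as smf

sample = pd.read_csv("ua_veterans_wartime.csv")  # hypothetical input file

probit = smf.probit(
    "any_wartime_disease ~ malaria_risk_1850 + height_inches + us_born"
    " + age_enlist + C(occupation) + household_wealth + large_city"
    " + rank_private + C(enlist_year)",
    data=sample,
).fit()

print(probit.summary())
# Average marginal effects, roughly comparable to the percentage-point
# differences quoted in the text.
print(probit.get_margeff(at="overall").summary())
```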
The result of the probit regression is presented in model (1) of Table 3, which shows a strong effect of early exposure to high malaria risk on susceptibility to infections during the war. Although all the veterans were accepted into the Union Army based on physical examinations, weakened immune systems and malnutrition caused by prior exposure to high malaria risk—which might be difficult to detect—made affected soldiers substantially susceptible to various diseases under unhealthy wartime environments. However, the results regarding the general epidemiological environments at residence prior to enlistment have contrary implications. U.S.-born recruits were about 4 percent more susceptible to infections than were foreign-born soldiers. Farmers, mostly in rural areas, were also more likely than nonfarmers in rural areas and even individuals from large cities to be infected with all sorts of diseases—by 8 and 12 percent, respectively. In addition, household wealth level had no significant effect on the likelihood of wartime infections. These results imply that veterans who had lived in unhealthy conditions prior to the war would have developed more immunity and resistance to contagious diseases, because they had already been exposed to various kinds of infections at an earlier age. For the opposite reason, those from healthy counties, such as U.S.-born and rural veterans, might be more vulnerable to infections under wartime disease environments, as Lee suggests.32
TABLE 3
PROBIT REGRESSIONS: MALARIA RISK BEFORE ENLISTMENT AND SUSCEPTIBILITY TO OTHER INFECTIONS DURING THE CIVIL WAR (dependent variable: dummy = 1 if the veteran contracted any diseases during the Civil War (mean = 0.7185))
For the other control variables, older veterans were more likely to be infected during the war because they had less resistance to diseases in general. Good nutritional status in childhood, represented by height at enlistment, seems to decrease the probability of later infections, but its significance is low. Individuals’ socioeconomic status also influenced wartime health. This is especially suggested by the result regarding the initial rank of veterans. Rank or military position was generally determined by socioeconomic status before enlistment, measured by educational level, income, height, and so on. In this respect, the coefficients of the rank variable in Table 3 imply not only that veterans with a lower rank were unhealthier because they were assigned to duties with greater exposure to infection, but also that this effect partially originated from the low socioeconomic status that caused them to hold a low rank. Finally, those recruited in the first year of the Civil War suffered the most from infections, and their incidence rates declined greatly after 1863.
In general, the level of acquired immunity and resistance to a certain infectious disease would depend on how prevalent the disease had been in the counties where veterans lived before the Civil War. If the counties with high malaria risk indexes were not only malarial, but also infectious and unsanitary, the result outlined above would underestimate the actual effect of malaria on the likelihood of wartime infections, because prior infections could lead veterans to develop resistance or immunity to some infections. In addition to this potential relationship with general epidemiological conditions, malaria-endemic counties could also be poorer regions. In that case, the omission of variables regarding community wealth could overestimate the effect of malaria, though only if the wealth effect is favorable for wartime health. In order to control for these potential problems, I conducted additional regressions, as I did in the previous section. In model (2) of Table 3, including a county wealth variable as a regressor reduced the effect of malaria considerably, though the effect remained strong. I also found that veterans from rich counties were significantly less susceptible to infections during the war. In models (3) and (4) of Table 3, I utilize county population mortality rate and population density, respectively, as general environment indexes. They made little difference relative to the model that excluded these county environmental variables, but the results imply that veterans who grew up in generally unhealthy areas were more resistant to wartime infections. Finally, the marginal effect of county malaria risk at 1850 residence in model (5) implies that veterans from the most malarial county were about 13 percent more susceptible to infections during the Civil War compared to those from the least malarial county.33
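Using the marginal effect and the risk range reported in footnote 33, the calculation is:

```latex
\[
\Delta\Pr(\text{infection}) \;=\; \widehat{ME}_{\text{malaria}} \times (\text{risk}_{\max} - \text{risk}_{\min})
\;=\; 0.2902 \times (0.4788 - 0.0138) \;\approx\; 0.135,
\]
```

that is, approximately the 13 percent difference cited in the text.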
These findings can be summarized as presenting three potential implications. First, exposure to a high risk of malarial fevers before enlistment significantly increased the probability of being infected with diseases during the war due to the weakening of the host’s immune system, resistance, and nutritional status. Second, with regard to nonspecific and universal environments, living under unhealthy conditions could reduce the probability of contracting additional infections because early exposure could develop the immune system and resistances. Third, a high level of socioeconomic status measured by military rank had a positive impact on wartime health.
In the second analysis, I examine the effects of prior exposure to malarial environments on susceptibility during the Civil War to specific infections, including diarrhea, pneumonia, malaria, measles, and typhoid, which were the most prevalent infections among Union Army troops. Supporting the results for overall diseases, veterans from malarial counties had higher probabilities of contracting these specific infections. The largest effect is found for diarrhea in model (1) of Table 4. As discussed at the beginning of this section, malnutrition resulting from an infection is reported to cause severe diarrhea. The result of model (2) suggests that pneumonia incidence during the Civil War has a positive correlation with living in malarial counties in the earlier period, though the effect is insignificant and very small. The result of model (3) also implies that prior exposure to high malaria risk could induce relapses of malaria. Because individuals, once infected, rarely develop immunity to malaria, frail veterans remained vulnerable to insults of malaria parasites during the war.34
TABLE 4
PROBIT REGRESSIONS: MALARIA RISK BEFORE ENLISTMENT AND SUSCEPTIBILITY TO SPECIFIC INFECTIONS DURING THE CIVIL WAR (dependent variable: dummy = 1 if the veteran contracted the disease during the Civil War)
Measles is an immunity-conferring disease that is known to be contagious through physical contact or airborne exposure. Thus, a prior infection would be less prevalent among recruits from less populated, malarial counties. Part of the effect on measles in model (4) can be explained through this mechanism, as well as by the direct results of malaria infections: a weakened immune system and malnutrition. Finally, typhoid is a common immunity-conferring disease caused by the intake of contaminated food and water. Although an insignificant coefficient is estimated, its negative sign can be interpreted in two ways. First, early exposure to malaria risk could cause more typhoid fevers before enlistment, as the main argument of this section—immune disorders and malnutrition—suggests. In this case, the veterans from malarial counties would have a lower incidence of typhoid fever because they had acquired immunity to typhoid. Second, this result can be read as evidence of misdiagnoses between malaria and typhoid, because typhoid fever was frequently confused with malarial fever in the nineteenth century. If a considerable number of typhoid fevers were mistakenly counted as malarial fevers in the fort data, the malaria risk used in this study could be overestimated.35 This would mean that counties with high estimated malaria risk were also high-typhoid areas. Veterans from these areas likely had experienced typhoid fever before enlistment, and so contracted it less often during the war. But given the limited data availability, it is impractical to identify which interpretation the result supports.36
It has been disputed whether or not early insults from disease or injury weaken individuals, making them more susceptible to infections in the future and so resulting in increased mortality.37 The findings in this section offer two implications for this debate. Some findings imply that prior exposure to generally unhealthy living conditions could be favorable to later health status under certain infectious situations, such as those of wartime. However, the principal finding indicates that prior exposure to a specific disease that undermines the general immune system and nutritional status, such as malaria, could make people frailer under those same wartime conditions.38

CONCLUDING REMARKS

In this article, I have explored the health burdens of malaria over the early life course in the United States during the mid-nineteenth century, using longitudinal data on Union Army veterans. As a result of malnutrition driven by childhood malaria infections, Union Army recruits who spent their childhood in malaria-endemic counties were significantly shorter at enlistment, by up to 1.1 inches, than were those from malaria-free counties. Because of an immune system disrupted by malaria infections, recruits from high malaria-risk counties were 13 percent more susceptible to various infections during the Civil War, compared with those from nonmalarial counties. Previous studies using Union Army veterans have revealed that wartime infection was a major factor in developing chronic conditions in later life.39 Focusing on malaria, which is more critical in children and pregnant women, the findings of this study suggest that these strong impacts on health at older ages resulted from health events that occurred at younger ages, during childhood, and even prenatally.
The effect of early malaria infection extends far beyond the direct measures of malnutrition and secondary infections. For children, malaria could reduce attendance at school and deteriorate learning ability by impairing cognitive development, performance, and behavior.40 For adults, it could reduce productivity at work and hinder economic development through its impact on wages and profits.41 In other words, malaria infection would have been one of the major factors impeding public health improvement and economic growth in mid-nineteenth-century America, as observed in countries currently suffering from malaria. Therefore, this study of the health burden of malaria infection should broaden our view of human capital accumulation and economic development in the mid-nineteenth century.

Acknowledgments

I have benefited from comments and suggestions by Robert W. Fogel, Gary S. Becker, Hoyt Bleakley, Kwang-Sun Lee, Joseph Ferrie, Louis Cain, Dora Costa, Chulhee Lee, James Smith, Tommy Bengtsson, David E. Bloom, David Canning, Jere Behrman, Eileen Crimmins, Price Fishback, Veronica Wald, Mark Guglielmo, Joseph Burton, A. J. Aiseirithe, Ki Young Park, three anonymous referees, and the editor of this JOURNAL, and by the participants in the workshop on the Economics and Biodemography of Aging and Health Care at the University of Chicago, at the 2006 Economic History Association Meetings, and at the Illinois Economics Association Meetings. I am very grateful to Alex Gendlin, Carlos Villarreal, and Jaehee Choi for their assistance in data collection and the development of GIS data and to Melissa Ptacek for her editorial assistance. Research reported in this article was supported by NIH grant number P01 AG10120.

Appendix 1: Estimating Malaria Risk

A major contribution of this article is the estimation of the risk of contracting malarial fevers for counties of the United States during the mid-nineteenth century. In order to measure the local risk of malaria infections, I have sought county-level malaria statistics for periods prior to the Civil War. Although the U.S. Census Bureau began to record mortality schedules with the 1850 census, cause-specific mortality statistics by county, including malaria mortality, were first published in 1880. However, as a considerable amount of agricultural drainage had been conducted throughout the 1880s in many areas that were previously malarial, malaria incidence and mortality had been declining, especially in the Old Northwestern regions and the Upper Mississippi Valley.42 Consequently, the 1880 malaria mortality rates may not represent the true “malaria risk”—the risk of contracting malarial fevers—of the early and mid-nineteenth century.43
In addition, malaria mortality varies by parasite type. Malaria is caused by four different parasites: falciparum, vivax, malariae, and ovale. Falciparum and vivax parasites were widespread throughout the United States from the colonial era onward. Because vivax malaria can continue to develop even at low temperatures, it thrived in the northern states.44 However, because it did not cause excessively high mortality rates, it was less of a burden than falciparum malaria, which was imported with African slaves in the late seventeenth century and was prevalent in states south of the thirty-fifth parallel.45 As a result, the use of malaria mortality statistics for measuring malaria risk could be very misleading.46
In this article, I use a risk-estimation method based on epidemiological theories that say that malaria is a disease whose transmission is determined primarily by environmental factors such as weather patterns and geographical features. This estimation method has been employed by epidemiologists in forecasting global malaria risk.47

CORRELATES BETWEEN MALARIA INCIDENCE AND ENVIRONMENTAL FACTORS

Temperature, rainfall, elevation, and geographical features, such as the presence of wetlands and swamps, influence mosquito breeding, vector densities, and vector survival, and thus malaria incidence. Above all, temperature and precipitation primarily determine whether vectors of malaria would be active, and are also involved in the development and reproduction of malaria parasites.48 Altitude is essential as well, because the mosquito is inactive in areas that exceed a certain elevation, and temperature and rainfall are also influenced by altitude.49 Moreover, variations in altitude primarily determine land cover and geographical features. As is well known, the availability of suitable sources of surface water for vector breeding is a prerequisite for malaria transmission. Frequent floods and heavy rains would convert flat areas near rivers into swamps and wetlands in the mid-nineteenth century, when draining technology was insufficient. Consequently, settlement in these regions was also inhibited by the presence of endemic fevers, which continued to plague residents until the drainage movement of the late nineteenth century. However, contrary to the effect of draining, water resource developments for agriculture, such as irrigation and canals, could unintentionally provide new mosquito breeding sites.50 Finally, some potential vectors breed in fresh and saltwater marshes, but most species that are abundant in the United States commonly lay their eggs in stagnant fresh water.51 Thus, regions adjacent to oceans may be exposed to a lower level of malaria risk.52
The principal data used in my estimation are the result of surgeons’ reports on annual incidence of malarial fevers among soldiers at U.S. forts (see Appendix Table 1). I first estimate the correlates between malaria incidence rates and environmental factors around forts, described in the above. Then, using these correlates and county-level environmental factors, I estimate the malaria risk for U.S. counties during various periods.
APPENDIX TABLE 1
SAMPLE STATISTICS OF FORT CHARACTERISTICS
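The two-step procedure can be summarized in a short sketch. The code below is a minimal illustration under assumed file and column names (fort_malaria_environment.csv, county_environment_1850.csv, warm_season_rain, flat_county, and so on); it mirrors the structure of the estimation, not the article’s actual programs.

```python
# Minimal sketch of the two-step malaria-risk estimation (assumed file/column names).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

forts = pd.read_csv("fort_malaria_environment.csv")    # hypothetical fort-level data
counties = pd.read_csv("county_environment_1850.csv")  # hypothetical county-level data

# Step 1: correlate log malarial-fever incidence at forts with the surrounding
# environment (temperature and its square, warm-season rainfall, terrain, land use,
# and ocean adjacency), in the spirit of Appendix Table 2.
step1 = smf.ols(
    "log_malaria_incidence ~ mean_temp + I(mean_temp**2) + warm_season_rain"
    " + sd_altitude + land_improved * flat_county + ocean_adjacent",
    data=forts,
).fit()

# Step 2: predict log incidence for counties from the same environmental variables,
# then exponentiate to obtain the risk index mapped in Figure 1.
counties["malaria_risk"] = np.exp(step1.predict(counties))
```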
The U.S. Surgeon General’s Office compiled reports on the health statistics of the U.S. Army forts from 1829 to 1874. Army surgeons and civilian physicians recorded annual or quarterly death rates and cases of various diseases and injuries for each fort.53 Surgeons and physicians also documented living conditions and fort information, such as habitation, food, geographical location, and climate. However, fort-level statistics are available only for two periods: 1829–1838 and 1871–1874. For the other periods, aggregated statistics were published. Thus, I derived the logarithm of malarial fever incidence rate—the dependent variable in my regression model—from two reports on sickness and mortality at U.S. Army forts that cover the above two periods.54 In particular, I used intermittent and remittent fevers recorded in the fort reports to obtain incidence of malarial fevers.55 The issue of misdiagnosis is discussed in Appendix 2.
I also searched for the various environmental variables discussed previously for the surroundings of U.S. forts. First, monthly weather data at forts were obtained from the two U.S. Army reports on sickness and mortality and, in part, from U.S. Department of Agriculture Weather Bureau statistics.56 Annual mean temperature was included in the estimation, and its squared term shows whether extreme temperatures could slow malaria transmission. Rather than annual accumulated precipitation, I used accumulated rainfall during the months when mean temperatures were higher than 59 degrees Fahrenheit.57 Although there may have been enough precipitation for vector breeding at lower temperatures, mosquitoes would be inactive in those conditions.
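A minimal sketch of how this temperature-gated rainfall variable could be constructed from monthly weather records; the file and column names (fort_id, mean_temp_f, rain_inches) are hypothetical.

```python
# Build annual mean temperature and warm-season rainfall per fort from monthly data.
import pandas as pd

monthly = pd.read_csv("fort_monthly_weather.csv")  # hypothetical monthly weather file

# Accumulated rainfall only in months warm enough for active transmission (> 59 F).
warm_rain = (
    monthly[monthly["mean_temp_f"] > 59.0]
    .groupby("fort_id")["rain_inches"].sum()
    .rename("warm_season_rain")
)
annual_temp = monthly.groupby("fort_id")["mean_temp_f"].mean().rename("mean_temp")

fort_climate = pd.concat([annual_temp, warm_rain], axis=1)
fort_climate["warm_season_rain"] = fort_climate["warm_season_rain"].fillna(0.0)
```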
In order to control for whether or not the fort was located around a potential wetland, the standard deviation of altitude of the counties where the forts were located was employed in the estimation.58 The variable of county land improvement ratios will control for the effects of land use and water source development on malaria risk.59 In particular, I looked at the different effects of land use in two types of regions defined by county geographical features: flat and nonflat counties.60 Flatter counties were more likely to be developed into agricultural land. Water source development, settlement, and migration influxes might have contributed to malaria outbreaks in those areas. Finally, I included the dummy variable showing whether a fort county was adjacent to the ocean, in order to reflect the fact that vector breeding in salt water is uncommon.61
Among the approximately 200 forts found in the United States in the periods 1829–1838 and 1871–1874, 143 forts with a complete set of variables have been selected.62 As depicted in Appendix Figure 1, the forts used in the estimation are widespread enough across the country to reflect regional differences in malaria incidence.63 Appendix Table 2 presents the result of estimating the correlates between malaria incidence and environmental factors around the forts studied. According to model (1), which uses all the available forts, the most crucial factor determining malaria risk is temperature. Malaria incidence increases significantly with the annual mean temperature of the forts studied, but the effect tapers off as temperature rises, with a threshold annual mean temperature of about 65 degrees Fahrenheit. This result supports the current theory that extreme temperatures can deter malaria transmission. In addition, increased precipitation levels provide environments favorable for mosquito breeding, though this effect is not statistically significant.64
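The threshold follows from the quadratic temperature specification: with a coefficient on temperature and another on its square, the marginal effect of temperature changes sign at the turning point below, whose numerical value of roughly 65 degrees Fahrenheit is the one reported in the text.

```latex
\[
\frac{\partial \ln(\text{incidence})}{\partial T} \;=\; \beta_{1} + 2\beta_{2}T \;=\; 0
\quad\Longrightarrow\quad
T^{*} \;=\; -\frac{\beta_{1}}{2\beta_{2}} \;\approx\; 65\ ^{\circ}\mathrm{F}.
\]
```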
APPENDIX FIGURE 1
LOCATIONS OF U.S. ARMY FORTS IN THE STUDY
APPENDIX TABLE 2
OLS REGRESSIONS: CORRELATES OF MALARIA INCIDENCE AND ENVIRONMENTAL FACTORS (dependent variable: ln(fort malaria incidence rate))
Geographical features around forts were also influential in determining malaria incidence. Forts located around flatter counties showed significantly higher malaria incidence. This implies that potential wetlands and swampy areas provided favorable environments for malaria transmission. Additionally, the effects of land improvements depended on whether or not the area of the county was flat. More land improvement in flat counties led to significantly higher malaria incidence, while it deterred malaria transmission in nonflat counties. This suggests that though land improvement can deter malaria through the effects of draining, the adverse effects created by water source development and the corresponding migration influx and settlement are also substantial. Finally, as theories support, malaria was less prevalent at forts that were close to oceans and so had environments unfavorable to mosquito breeding.65

ESTIMATED MALARIA RISK IN MID-NINETEENTH-CENTURY AMERICA

On the basis of the correlates from model (1) in Appendix Table 2, I estimate malaria risk for the counties for which no records of malaria mortality and morbidity in the mid-nineteenth century exist. For estimation, the same kinds of environmental variables have been collected and calculated at the county level. Specifically, county climate information was obtained from the “Nineteenth-Century U.S. Climate Data Set Project” developed by the National Climate Data Center (NCDC).66 As previously utilized, county elevation and its standard deviation were calculated from USGS digital elevation data using GIS. County land improvement ratios were gathered from census records.
Figure 1 presents the estimated malaria risk in the 1850s.67 To obtain the general malaria risk index, I take the exponential of the original fitted values. Roughly speaking, the estimated risk index can be interpreted as the annual probability of contracting malarial fevers within the county, or the proportion of the county’s population infected with malaria during that year.68 It has been discussed that highly malarial areas can be characterized as warmer, wetter, and flatter regions. Reflecting this view, Figure 1 accurately displays what the distribution of malaria risk looked like in the 1850s, indicating that malaria was endemic in the Southern and Southeast regions and by the Gulf. More importantly, the result also indicates that malaria was prevalent along the Mississippi and Missouri Valleys and up through the Old Northwestern regions, including Ohio, Indiana, Illinois, Iowa, and Missouri, as Erwin Ackerknecht and, later, Daniel Drake described.69
Because county malaria morbidity and mortality data are not available for the 1850s, a strict evaluation of the estimates might seem impossible. Although some public interventions, such as draining and the reduction in the price of quinine, should be considered, one reasonable approach is to compare county malaria mortality rates in the 1880 census with the malaria risk estimated from 1880 county environmental variables. As presented in Appendix Figure 2A, 1880 county malaria deaths per 1,000 population appear to be highly correlated with the estimated risk. To show this, I calculate the correlates between the reported and estimated indexes in Appendix Table 3. The results of model (1) indicate a significant positive relationship, with estimated risk accounting for about 39 percent of the variation in reported mortality. It may be difficult to state with assurance that reported mortality is caused by estimated risk, either because other infectious diseases could be widespread in counties with high malaria risk or because the estimated risk might also represent poorer counties. This can be partly corrected for by including other county socioeconomic characteristics.70 As model (2) shows, nevertheless, this attempt increases the explanatory power by only 1 percent. Similarly, I repeated the exercise with county malaria mortality from the 1890 and 1920 census records. From model (2) to model (4), the coefficient of estimated malaria risk declines greatly, implying that my estimation does not work well for later periods. This suggests that more substantial malaria-eradication efforts had been undertaken, especially in the early twentieth century, as is also observed in Appendix Figures 2B and 2C.
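A minimal sketch of this validation regression, again under hypothetical file and column names, would look like the following.

```python
# Regress reported 1880 county malaria mortality (log) on the estimated risk index.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

val = pd.read_csv("county_malaria_validation_1880.csv")  # hypothetical file

# Drop counties with zero reported deaths before taking logs (a sketch-level choice).
val = val[val["malaria_deaths_per_1000"] > 0].copy()
val["log_mortality"] = np.log(val["malaria_deaths_per_1000"])

check = smf.ols("log_mortality ~ estimated_risk_1880", data=val).fit()
print(check.rsquared)  # about 0.39 in model (1) of Appendix Table 3
```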
APPENDIX FIGURE 2
PLOTS OF REPORTED MALARIA MORTALITY IN 1880, 1890, AND 1920 AGAINST ESTIMATED RISK
APPENDIX TABLE 3
OLS REGRESSIONS: CORRELATES BETWEEN ESTIMATED MALARIA RISK AND REPORTED MALARIA MORTALITY IN 1880, 1890, AND 1920 (dependent variable: ln(malaria mortality per 1,000 population))

Appendix 2: Issues in Risk Estimation

Without sufficient knowledge of the pathogen, the diagnosis of malarial fever depended mainly on its symptoms, and so was very problematic. During the nineteenth century, surgeons generally classified malarial fever into two forms: intermittent and remittent. But malaria had been easily confused with several other diseases that had similar symptoms. In particular, some physicians considered remittent fever to be typhus or believed remittent fever could develop into typhoid. Physicians were not able in all cases to determine from symptoms alone whether a fever was malarial fever, typhoid fever, or typhus.71 After the Civil War, the term “typho-malarial fever” was used because of the common symptoms associated with typhoid and malaria. It has been widely accepted that the measure of malarial fever during the nineteenth century is biased upward. The possible measurement error should be considered throughout this article. The results of risk estimation and health impacts should be cautiously interpreted in light of this possibility.
Without additional individual health records, it is not possible to determine how much the fort malaria incidence used in this study would be biased upward. But some evidence found in the fort data suggests that the measurement error was not critical. First, remittent fever, mostly confused with typhoid, was less reported in the fort data, as shown in Appendix Table 1 and Appendix Table 4. This would reduce the size of measurement error. Second, the incidence rate of malarial fever is more seasonal than is that of typhoid and typhus fevers because malaria cannot be transmitted without mosquitoes, which are not active in winter. If typhoid and typhus fevers were counted as remittent fever throughout the year, the incidence pattern by season between intermittent and remittent fevers would be quite different. But in Appendix Table 4, the fort data recorded in the 1829–1838 period indicate that the seasonal pattern of the two types of fevers was very similar. In both cases, the rate was highest in the third quarter of the year, which is the best season for mosquitoes, and it was lowest in the first quarter. Third, actual incidence rates of typhoid and typhus fevers seem to have been so low that they would not cause serious measurement error. In particular, Appendix Table 4 reports the incidence rate at forts of typho-malarial fevers between 1871 and 1874, which is much smaller than the rate summing remittent and typhoid fevers. This implies that the cases of misdiagnoses may be small.
APPENDIX TABLE 4
INCIDENCE RATE OF FEVERS BY TYPE, SEASON, AND YEAR
In Appendix Table 5, I report the result of regression analyses using two measures of malaria risk which are estimated by including and excluding remittent fever. The result says that including remittent fever in risk estimation does not critically change the effect of malaria on height at enlistment and susceptibility to infections during the Civil War.
APPENDIX TABLE 5
REGRESSION RESULTS BY THE MEASURE OF MALARIA RISK
On the other hand, it could be problematic to use the fort data from the 1829–1838 and 1871–1874 periods for estimating malaria risk in 1850, the main control variable in the article. Regression model (2) in Appendix Table 2 uses only forts existing during the 1829–1838 period and model (3) uses forts existing in 1871–1874. For most environmental variables, implications similar to those suggested by model (1) are produced. But some considerable differences in coefficients are observed between models (2) and (3). This is mainly because newly established forts in the 1871–1874 period were largely located in the Western States, such as New Mexico, Nevada, Arizona, and California. The climate of these areas was generally drier and hotter than that around the forts established in the 1830s, while the malaria incidence rates of these forts were lower than those of forts in the Southern States. As a result, the coefficient of temperature became smaller and the effect of rainfall was magnified in model (3). Similarly, many forts with lower malaria incidence rates and lower land improvement ratios were newly added to the sample in the later period. This changed the sign of the coefficient of land improvement in both types of land, indicating an adverse effect of water source development. These results imply that adding the 1871–1874 forts better captures the variation of malaria incidence across different environments, allowing a more reliable estimate of the correlation between environmental factors and malaria risk.
If there were considerable environmental changes between the two periods that could reduce malaria risk, such as drainage, adding forts from the later period could lead to an underestimate of malaria risk in 1850. But in the early 1870s, drainage efforts were in their beginning stages nationally, with significant efforts underway in only a limited number of states.72 Additionally, most of the forts newly added to the 1871–1874 data were located on the frontier, and so were less affected by increasing drainage efforts in the Northern and Eastern States. From the early nineteenth century to the early 1870s, medical knowledge regarding malarial fever had not improved, and no truly effective efforts to reduce malaria risk were undertaken. Therefore, the advantages of including the 1871–1874 forts outweigh the disadvantages.
In models (4) and (5) of Appendix Table 5, I also report the result of regression analyses controlling for two different risk measures that are estimated with fort data for each period. The coefficients using the risk measure from 1829–1838 fort data are smaller and less significant compared with those based on forts in the later period. But considering the ranges of each estimated malaria measure, the result indicates that the impacts on later health were quite similar to each other.
Finally, most forts in the study were located in less populated areas and on the frontier, but some were located in large cities. Not all forts should necessarily be weighted equally in the risk estimation. In order to see how much this issue can affect the previous results, I re-estimated the correlates between malaria incidence and environmental factors at forts, using the 1850 population of the counties where the forts were located as weights—a population-weighted regression analysis. This new risk measure was then used to estimate the effects on height and susceptibility to infections. The result in model (6) of Appendix Table 5 shows that the effects on later health outcomes remain strong.
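A minimal sketch of the population-weighted step, again with hypothetical names, replaces the OLS fit with weighted least squares:

```python
# Weight each fort by the 1850 population of its county when fitting the incidence model.
import pandas as pd
import statsmodels.formula.api as smf

forts = pd.read_csv("fort_malaria_environment.csv")  # hypothetical file, as in the earlier sketch

weighted = smf.wls(
    "log_malaria_incidence ~ mean_temp + I(mean_temp**2) + warm_season_rain"
    " + sd_altitude + land_improved * flat_county + ocean_adjacent",
    data=forts,
    weights=forts["county_pop_1850"],  # hypothetical 1850 county population column
).fit()
print(weighted.params)
```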

Footnotes

3In the southern states, including Kentucky, Tennessee, Alabama, Mississippi, Arkansas, Louisiana, and Texas, 7.8 percent of deaths in 1850 resulted from malarial fevers. In the top ten countries for malaria mortality in 2000, about 14.9 percent of deaths were caused by malaria. (; and )
4It is reported that about 1,300,000 recruits contracted malarial fevers during the Civil War. The number of cases of other major infections and deaths is reported as follows: diarrhea (1,739,135 cases / 44,558 deaths), typhoid or typho-malarial (148,631 / 34,833), pneumonia (77,335 / 19,971), and measles (76,318 / 5,177) ().
7The project is sponsored by the National Bureau of Economic Research, the National Institutes of Health, the Center for Population Economics at the University of Chicago, and Brigham Young University.
9A limitation of this study is that it cannot determine who in the sample was actually infected with malaria in this year. A blood test survey at a southern town in the early twentieth century shows that three-quarters of the population was infected with malaria parasites, though some of them were believed to have been cured completely (). This means that a substantial proportion of people in counties denoted by high malaria risk—which is the key variable indicating the extent to which individuals in the county were potentially infected with malaria in this study—would suffer from malaria and its health consequences. But the ecological fallacy may not be avoided without additional information on Union Army veterans’ health events before enlistment. The findings in this article should be interpreted in light of this problem.
10As will be discussed, this age group—under age five—is the most vulnerable to malaria death. The subsequent health impact of malaria for this group is a matter of grave concern today.
15The 1860 census reports cause-specific mortality at the state level.
16This does not mean that children get malaria more frequently than do adults. Adults may have more chances of being bitten by mosquitoes, for example, at their workplaces. But because children are less resistant to infections, their fatality is much higher than that of adults and they are more likely to suffer from severe forms of malaria.
23The variable of urbanization can be closely related to malaria risk. It is generally believed that urbanization deters malaria transmission by eliminating mosquitoes and their breeding sites. But according to recent studies, urbanization can also increase malaria risk because more migration into dense areas can initiate malaria transmission (). In the estimation of malaria risk reported in Appendix 1, the variable of land improvement ratio partly represents the level of urbanization. Its coefficient in Appendix Table 2 implies that more land improvement causes higher malaria incidence at the forts, especially in flat counties and in the 1871–1874 period. More land improvement may be interpreted as more urbanization. But the correlation between urbanization indexes and estimated malaria risk is quite low. The correlation coefficients between estimated malaria risk and county population in 1850, and its population density, are calculated as −0.0767 and −0.0550, respectively.
24From the first column of Table 1, the maximum malaria risk of the counties where the sample in this section lived is 0.4616. The minimum risk is 0.0138. The difference in height between two counties is then calculated as 1.1315 (inches) = −2.5268 * (0.0138 – 0.4616).
27Cegielski and McMurray, “Tuberculosis.”
29From the colonial era to the turn of the twentieth century, people were exposed to and died from various endemic diseases such as smallpox, scarlet fever, measles, whooping cough, typhoid fever, tuberculosis, diarrhea, dysentery, and malaria ().
31It is reported that a person who survives malaria infections may be immune to subsequent infection, but only with high intensity and frequency of previous infections. Sickle-cell mutation is a typical example of malaria immunity gene that is found among people in some areas of Africa ().
33From the second column of Table 1, the maximum malaria risk of the counties where the sample in this section lived is 0.4788. The minimum risk is 0.0138. The difference in susceptibility to infections between two counties is thus calculated as −13.4943 (%) = 0.2902 * (0.0138 – 0.4788) * 100.
34This possibility is also inferred from the fort data. Malaria was so recurrent that incidence rates at some forts—which counted re-infections additionally in their calculation—were often higher than one. See Appendix 1 and Appendix Table 1. For a discussion of immunity to malaria infections, see footnote 31.
35For detailed discussion of issues on misdiagnosis, see Appendix 2.
36The age effect also depended on types of infections. For immunity-conferring diseases, such as typhoid and measles, older veterans had lower probabilities of infection during wartime. This implies that, compared to younger veterans, a large proportion of older veterans had developed immunities to these infections before enlistment. Although more explanations for some exceptions are necessary, veterans with unhealthy factors such as having been born in foreign countries and residing in an unsanitary city generally had increased resistance to these specific infections. Following the previous indications, veterans with low socioeconomic status who were initially ranked as private were more susceptible to these infections. Enlistment year was also a considerable factor affecting the likelihood of being infected with diseases under wartime conditions.
37See  for a discussion of “insult accumulation model.”
38Union Army veterans in this study were acceptably healthy at enlistment. They survived various infections and unhealthy environments as children. But the Union Army sample excludes many other persons who suffered more severely from early malaria infections and so were much more impaired. In this respect, the actual impact of early exposure to malaria risk on later health could be much higher than what this article has reported.
39In her examination of Union Army veterans’ data, Dora L. Costa argued that exposure to various infections during the Civil War played a major role in veterans’ development of chronic diseases in old age. In another study of the same data, Robert W. Fogel and Costa argued that reduced exposure to infectious environments in early life led to a significant decline in the age-specific prevalence of chronic diseases in late life during the twentieth century (). Based on modern data, a large number of studies have shown the negative effect of exposure to many infectious diseases both in utero and during infancy and childhood on the rates of mortality, of chronic disease prevalence, and of disability at middle and late ages ( and; and ).
41  reveals that U.S. cohorts born after hookworm and malaria eradication had higher income as adults than did the preceding generation.
42; and . Northwestern regions include Ohio, Indiana, Illinois, Iowa, and Missouri.
43Although there have been several efforts to collect mortality schedules from the censuses of 1850, 1860, and 1870, such as the work done by  and a project on “Federal Census Mortality Schedules, 1850–1880” at Ancestry.com, these works are incomplete or in progress.
45Although we know that falciparum was rare in the Northern States, it is impractical to determine what the distribution of both types of malaria parasites looked like in the Southern States. This partly depended on season. The fever and chills of vivax malaria struck during spring planting season; falciparum came along during harvest season, mainly August and September (). Currently, most of the malaria in African countries is in the form of falciparum, while vivax is dominant in non-African countries (). Thus, morbidity largely depends on regional climate and geographical features and on what types of parasites are dominant. On the other hand, although vivax is milder than falciparum, it can be fatal without proper treatment. It is also known that the vivax parasite can stay in a patient’s blood for months and years, causing relapses and resulting in subsequent health problems over a long period ().
46Another problem with using malaria mortality comes from the fact that historically quinine—the prescribed treatment for malaria—rarely cured vivax malaria, though it temporarily suppressed the disease and enabled people to go back to work (). Malaria mortality could also be influenced by various compounding factors such as the level of personal wealth and the price of quinine.
47See  and  to look at how epidemiologists and scientists forecast current global malaria risk. See  for another study using climatic and geographic data to estimate the prevalence of another parasitical infection (hookworm) in mid-nineteenth-century America.
48Malaria vectors require a minimum precipitation of 10 mm, or 0.394 inches, per month for breeding, mainly in temporary water. The malaria parasite remains inactive when the mean monthly temperature drops below 59 degrees Fahrenheit, and it also stops developing in high heat—a temperature persistently above 85 degrees Fahrenheit (; and ). This implies that there could be a nonlinear, or quadratic, relationship between malaria risk and temperature.
49Malaria vectors can cease transmission at altitudes above 3,300 meters, though most vectors are active at low altitude levels ().
50Irrigation in the dry season could cause another annual peak of mosquito abundance in addition to that caused after periods of rain. Similarly, risk for malaria transmission can be increased by proximity to dams and impounded water construction created to address the seasonal scarcity of water ().
52Although this study does not consider factors other than the above environmental ones, some studies suggest that soil type, deforestation, and urbanization are also relevant to malaria transmission (; and ).
53These fort soldiers, especially before the Civil War, may be substantially different from soldiers who served in the Civil War, in that many Union Army recruits were conscripted. Their socioeconomic background would be higher, and consequently their health status at enlistment would be better, than that of fort soldiers. In this case, if unhealthy persons are more susceptible to malaria infections, the incidence rates of malarial fever among fort soldiers could overestimate the extent to which Civil War soldiers suffered from malaria.
54 and . On the other hand, the use of fort records may raise the issue as to whether soldiers’ health at forts is well representative of the general population health during the periods studied. Because malaria incidence in the general population is not available, a direct comparison is impractical. Given the limited data on malaria morbidity, this issue can be partly checked by calculating the correlation coefficient between fort malaria incidence rate and malaria mortality among the general population of the county where the forts were located. I used county malaria mortality in the 1880 and 1890 census records. The correlation coefficients are calculated as 0.5593 for 1880 and as 0.2763 for 1890. This implies that malaria incidence among fort soldiers would well represent that of the general population at least up to 1880.
55During the nineteenth century, surgeons generally classified malarial fever into two forms: intermittent and remittent. Intermittent fevers were those that followed a regular pattern of fever alternating with a return to normal temperatures every 48 or 72 hours. In the case of remittent fever, temperature was observed to fluctuate, but it did not quite return to normal.
57Footnote 48 discusses the relationship between malaria transmission and temperature, and precipitation.
58County elevation values were calculated using ten-foot contours, which were converted from the U.S. Geological Survey (USGS) National Elevation Data (NED) by Geographical Information System (GIS). In particular, I assume that altitude has not changed significantly over the centuries.
59“Land improvement ratio” is defined by the ratio of the number of acres of improved land to that of total land available for farming and has been obtained from the U.S. Census Bureau.
60I define a county as flat if its standard deviation of altitude is less than 30 feet. For the entirety of U.S. counties, the mean of standard deviation of altitude is 60 feet and its median is 27 feet. About 55 percent of counties belong to the flat county group by this definition. The estimation result does not change much for various definitions of flat counties, especially for the cutoff value (σ) between 20 and 70.
