The fate of migrants moving to cities in 17th- and 18th-century England demonstrates how a single pathogen can dramatically alter the risks of migration – with lessons for migratory patterns today.
Cities have always been a magnet to migrants. In 2010, according to the World Health Organization, a tipping point was reached: for the first time, the majority of the world’s population lived in cities. By 2050, seven out of 10 people will have been born in – or migrated to – a city. One hundred years ago, that figure was two out of 10.
Today, cities are generally the safest places to live. If you live in one, you’re likely to be richer than someone living in a rural environment. If you’re richer, you’re likely to live longer. If you live in a city, you have better access to hospitals and healthcare, and you’re more likely to be immunised.
But that was not always the case. In 17th- and 18th-century England, city life was lethal – disproportionately so for those migrating from the countryside.
Dr Romola Davenport is studying the effects of migration on the health of those living in London and Manchester from 1750 to 1850, with a particular focus on the lethality of smallpox – the single most deadly disease in 18th-century England. In the century before 1750, England’s population had failed to grow. Cities and towns sucked in tens of thousands of migratory men, women and children – then killed them. It’s estimated that half of the natural growth of the English population was consumed by London deaths during this period. Burials often outstripped baptisms.
In 2013, cities are no longer the death traps they once were, even accounting for the millions of migrants who live in poor, often slum-like conditions. But will cities always be better places to live? What could eliminate the ‘urban advantage’ and what might the future of our cities look like if antibiotics stop working?
By looking at the past – and trying to make sense of the sudden, vast improvement in survival rates after 1750 – Davenport and the University of Newcastle’s Professor Jeremy Boulton hope to understand more about city life and mortality.
“For modern migrants to urban areas there is no necessary trade-off of health for wealth,” said Davenport. “Historically, however, migrants often took substantial risks in moving from rural to urban areas because cities were characterised by substantially higher death rates than rural areas, and wealth appears to have conferred little survival advantage.”
The intensity of the infectious disease environment overwhelmed any advantages of the wealthy – such as better housing, food and heating. Although cities and towns offered unparalleled economic opportunities for migrants, wealth could not compensate for the higher health risks exacted by urban living.
“Urban populations are large and dense, which facilitates the transmission of infectious diseases from person to person or via animals or sewage. Towns functioned as trading posts not only for ideas and goods but also for pathogens. Therefore, growing an urban population relied upon substantial immigration from rural areas,” explained Davenport.
“After 1750, cities no longer functioned as ‘demographic sinks’ because there was a rapid improvement in urban mortality rates in Britain. By the mid-19th century, even the most notorious industrial cities such as Liverpool and Manchester were capable of a natural increase, with the number of births exceeding deaths.”
Davenport has been studying the processes of urban mortality improvement and changing migrant risks using extremely rich source material from the large London parish of St Martin-in-the-Fields. The research, funded by the Wellcome Trust and the Economic and Social Research Council, is now being augmented with abundant demographic archives from Manchester, funded by the Leverhulme Trust.
For both cities, Davenport and colleagues have access to detailed records of the individual burials underlying the Bills of Mortality, which were the main source of urban mortality statistics from the 17th to the 18th century. These give age at death, cause of death, street address and the fee paid for burial, which enables them to study the age and sex distribution of deaths by disease. In addition, baptismal data allow them to ‘reconstitute’ families as well as to measure the mortality rates of infants by social status.
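The ‘reconstitution’ step can be sketched in code. The snippet below is a minimal, hypothetical illustration of linking baptismal and burial entries to estimate infant mortality by social status; the field names, the matching rule (same name, death at age 0 within a year of baptism) and the status labels are invented for illustration and are not the project’s actual method.

```python
# Hypothetical sketch of family 'reconstitution' from parish registers.
# Exact name-matching is deliberately crude; real record linkage must
# handle spelling variants, common names and missing entries.
from dataclasses import dataclass

@dataclass
class Baptism:
    surname: str
    forename: str
    year: int
    status: str          # invented social-status label (e.g. from burial fees)

@dataclass
class Burial:
    surname: str
    forename: str
    year: int
    age_at_death: int    # in years; 0 = died in infancy

def infant_mortality_by_status(baptisms, burials):
    """Share of baptised children matched to a burial at age 0
    within a year of baptism, grouped by social status."""
    infant_burials = {(b.surname, b.forename): b
                      for b in burials if b.age_at_death == 0}
    counts, deaths = {}, {}
    for bap in baptisms:
        counts[bap.status] = counts.get(bap.status, 0) + 1
        bur = infant_burials.get((bap.surname, bap.forename))
        if bur and 0 <= bur.year - bap.year <= 1:
            deaths[bap.status] = deaths.get(bap.status, 0) + 1
    return {s: deaths.get(s, 0) / n for s, n in counts.items()}
```

On real registers, the interesting output is exactly the contrast the researchers are after: infant mortality broken down by markers of social status such as the fee paid for burial.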
“The records themselves give only a bald account of death,” said Davenport. “But sometimes we can link them to workhouse records and personal accounts, especially among the migrant poor, which really bring home the realities of life and death in early modern London.
“Smallpox was deadly. At its height, it accounted for 10% of all burials in London and an astonishing 20% in Manchester. Children were worst affected, but 20% of London’s smallpox victims were adults – likely to be migrants who had never been exposed to, and survived, the disease in childhood. However, in Manchester – a town that grew from 20,000 to 250,000 in a century – 95% of smallpox burials were children in the mid-18th century, implying a high level of endemicity not only in Manchester but also in the rural areas that supplied migrants to the city.
“So studying urban populations can tell us not only about conditions in cities but also about the circulation of diseases in the rest of the population.”
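The endemicity pattern Davenport describes – smallpox constantly circulating in a large, dense population – can be illustrated with a toy compartmental model. The sketch below is a minimal discrete-time SIR simulation with invented parameters (it does not describe historical smallpox): an outbreak takes off only when each case produces more than one new case, i.e. when β/γ > 1, a threshold that denser contact patterns (a higher effective β) make easier to cross.

```python
# Toy discrete-time SIR model: s, i, r are counts of susceptible,
# infectious and recovered people in a population of size n.
def sir_step(s, i, r, beta, gamma, n):
    new_inf = beta * s * i / n   # new infections this step
    new_rec = gamma * i          # recoveries this step
    return s - new_inf, i + new_inf - new_rec, r + new_rec

def attack_rate(n, beta, gamma, i0=1, steps=1000):
    """Final share of the population ever infected."""
    s, i, r = n - i0, i0, 0.0
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, n)
    return r / n
```

With β/γ = 2 the toy epidemic infects most of the population; halving β so that β/γ = 0.5 makes it fizzle out – a crude analogue of why a crowd disease could remain endemic in a dense, constantly replenished town while burning out in small settlements.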
The greater lethality of smallpox in Manchester remains, for the moment, a mystery to researchers; but evidence suggests the potential importance of transmission via clothing or other means – as opposed to the person-to-person transmission assumed in mathematical models of smallpox spread in bioterrorism scenarios. Although smallpox was eradicated in the late 1970s, both the USA and Russia retain stockpiles of the virus – which has led to fears of its use by terrorists should it ever fall into the wrong hands. Data on smallpox epidemics before the introduction of vaccination in the late 1790s are very valuable to bioterrorism researchers because they provide insights into how the virus might spread in an unvaccinated population (only a small proportion of the world’s population is vaccinated against smallpox).
From 1770 onwards, adult smallpox deaths declined rapidly in both London and Manchester, which Davenport believes could be attributable either to an upsurge in the use of smallpox inoculation (a precursor of vaccination) by would-be migrants, or to a change in the transmissibility and potency of the disease. By the mid-19th century, towns and cities appear to have been relatively healthy destinations for young adult migrants, although still deadly for children.
“Smallpox was probably the major cause of the peculiar lethality of even small urban settlements in the 17th and 18th centuries,” said Davenport, “and this highlights how a single pathogen, like plague or HIV, can dramatically alter the risks associated with migration and migratory patterns.”
“The close relationship between wealth and health that explains much of the current ‘urban advantage’ is not a constant but emerged in England in the 19th century,” added Davenport. “While wealth can now buy better access to medical treatment, as well as better food and housing, it remains an open question as to whether this relationship will persist indefinitely in the face of emerging threats such as microbial drug resistance.”
Header Image : An 1802 cartoon of the early controversy surrounding Edward Jenner’s vaccination theory, showing his cowpox-derived smallpox vaccine causing cattle to emerge from patients. Image: Wikipedia
Contributing Source : University of Cambridge
© Copyright 2014 HeritageDaily – Heritage & Archaeology News
Megacities’ Expansive Growth
For the first time in human history, more of the world’s 6.8 billion people live in cities than in rural areas. That is an incredible demographic and geographic shift since 1950 when only 30 percent of the world’s 2.5 billion inhabitants lived in urban environments.
The world’s largest cities, particularly in developing countries, are growing at phenomenal rates. As a growing landless class is attracted by urban opportunities, meager as they might be, these cities’ populations are ballooning to incredible numbers.
A May 2010 Christian Science Monitor article on “megacities” predicted that by 2050, almost 70 percent of the world’s estimated 10 billion people—more than the number of people living today—will reside in urban areas. The social, economic and environmental problems associated with a predominantly urbanized population are considerably different from those of the mostly rural world population of the past.
A megacity is an urban agglomeration (accumulation) with more than 10 million inhabitants. Sixty years ago in 1950, there were only two megacities—New York-Newark and Tokyo. In 1995, 14 megacities existed. Today, there are 22, mostly in the developing countries of Asia, Africa and Latin America. By 2025, there will probably be 30 or more.
Urbanization has been occurring in the developed countries of the West for 200 years. Since the Industrial Revolution, a period from the 18th to 19th century in which machine-based manufacturing grew tremendously, cities have grown rapidly. As technological innovations flourished, economies previously dependent on manual labor and draft-animals began to change. People moved into the cities to find work and relatively quickly, cities began to grow exponentially.
Today, the most rapid megacity growth is occurring in the world’s least developed and poorest countries—those least able to handle the political, social, economic and environmental problems associated with rapid urbanization.
In the most modern industrialized countries, on average, three out of four people already live within an urban area. In contrast, in the least-developed regions of the world, more than two out of three people still reside in a rural area. But that statistic is changing rapidly.
For people in developing countries, even the slums of cities like Mumbai, India, can offer more opportunities than their poor subsistence-based villages can. People gravitate to the cities because the potential for making money is greater there. While most of the economies in rural areas are agriculture-based with little cash flow, in the cities, people may be able to earn cash for work or retail sales.
The 10 largest cities in the world in 2010, with their populations projected for 2025 (in millions), are Tokyo, Japan (37.1); Delhi, India (28.6); São Paulo, Brazil (21.7); Mumbai, India (25.8); Mexico City, Mexico (20.7); New York-Newark (20.6); Shanghai, China (20.0); Calcutta, India (20.1); Dhaka, Bangladesh (20.9); and Karachi, Pakistan (18.7).
According to the Christian Science Monitor, along with the masses come problems associated with providing necessary services like clean water, sanitation systems to remove the megatons of garbage and human waste and transportation systems to ferry workers. In addition, many cities have difficult times providing electrical networks, health care facilities and police protection.
Urbanization is not all bad news. According to the Christian Science Monitor, some see great promise in the trend, especially those companies that build roads and buildings. If a city is efficient, energy consumption can decrease by 20 percent. Transportation costs for goods and labor can fall considerably in cities because markets and workers are all close together. In essence, cities are where cash flows—they are where economic growth takes place.
As the world’s population increases at the rate of 134 million per year, the urbanization process is pushing more and more people into the cities. Such frenetic rates of urbanization and intense poverty of large urban populations strain resources. Nonetheless, to poverty-stricken, landless people, cities offer visions of opportunity. The resulting massive urban underclass, particularly in developing countries, represents one of the world’s greatest social and economic challenges.
The real question is, “What are the limits to urban growth?”
And that is Geography in the News.
Sources: GITN #1049, “Growing Megacities,” June 28, 2010; GITN #844, “Megacities: 10 Million or More People,” Aug. 4, 2006; and Bruinius, Harry, “March of the Megacities,” The Christian Science Monitor, May 10, 2010.
Co-authors are Neal Lineback, Appalachian State University Professor Emeritus of Geography, and Geographer Mandy Lineback Gritzner. University News Director Jane Nicholson serves as technical editor. Geography in the NewsTM is solely owned and operated by Neal Lineback for the purpose of providing geographic education to readers worldwide.
Courtesy Benjamin M. Schmidt
It’s no secret that our subway maps distort the geographies of the metropoles they claim to represent. When we traverse a city every day with New York’s MTA (Metropolitan Transportation Authority) map or Washington’s WMATA (Washington Metropolitan Area Transit Authority) map, our conception of the city – its boundaries, its expanses – easily becomes scrambled. For instance, Washington’s dense inner core and its spread-out outskirts are shown at the same scale. In a grid city like Manhattan there might be some semblance of similarity, but in most other cities the reality on the ground is completely different.
A new project developed at Northeastern University tackles these problems head on. Benjamin M. Schmidt, a professor of history, has designed interactive digital maps of Boston, New York and Washington that superimpose each city’s respective subway route map onto a geographically accurate map made to scale. One can adjust the opacity or transparency between the geographic maps and the overlaid subway routes, and can zoom in and out as well.
With research that joins the fields of cultural history and digital humanities, Schmidt’s maps feed into the larger project at Northeastern University’s history department, which uses maps to investigate the urban and social changes in the city. The new maps are rectified, annotated, and aligned with historical maps to track the changes over time. Schmidt’s maps were designed to help explain the concept of “geo-rectification” to his students. “I made these because I was interested in the collision of two different views we have of our cities: the Google maps version that we use more and more, and the subway maps, which are just as important in making us think about the layout of our cities but have a totally different perspective,” explains Schmidt.
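The ‘geo-rectification’ idea can be sketched as a control-point problem: choose stations whose positions are known on both the schematic map and the geographic map, then fit a transform between the two coordinate systems. The snippet below assumes a simple affine (scale/rotate/translate) model and invented coordinates; real georeferencing tools usually also offer higher-order polynomial or rubber-sheet transforms.

```python
# Minimal affine geo-rectification sketch: fit dst ≈ [src, 1] @ A
# by least squares from matching control points, then reproject.
import numpy as np

def fit_affine(src, dst):
    """src, dst: (n, 2) arrays of matching control points, n >= 3."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    # Append a column of ones so the translation is solved jointly.
    X = np.hstack([src, np.ones((len(src), 1))])
    coeffs, *_ = np.linalg.lstsq(X, dst, rcond=None)   # shape (3, 2)
    return coeffs

def apply_affine(coeffs, pts):
    """Reproject (n, 2) points through the fitted transform."""
    pts = np.asarray(pts, dtype=float)
    X = np.hstack([pts, np.ones((len(pts), 1))])
    return X @ coeffs
```

With three or more well-spread control points the fit is exact for a true affine distortion; adding more points averages out the local warping that a schematic map introduces.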
The London Underground Map, 1908. See how the map more or less accurately plots the subway lines according to their geographical placement.
Courtesy London Transport Museum
This “rectification” comes some eighty years after the first subway maps began trading geographical accuracy for abstract clarity. In 1931, when Harry Beck, an English draftsman, first came up with the design of the London Underground Tube map, it was rejected as “too revolutionary”. Beck had removed all semblance of geography from the map and “cleaned it up” into a proportionate, rectilinear diagram with horizontal, vertical and angled lines for train routes and evenly spaced dots for subway stops. A trial run of 500 maps proved hugely popular among commuters, and by 1933 Beck’s diagrammatic map was in full print. It has been the template for subway, train and transport maps in cities across the world ever since.
The London Underground Map by Harry Beck, 1933. Subsequent editions have more or less left Beck’s schematic intact.
Courtesy London Transport Museum
It’s a classic case of choosing coherence over geographic accuracy. Beck, a commuter himself, understood that for the average passenger the point of the map was getting from one station to another with a quick glance; geographical accuracy didn’t figure into it. The maps that came before, by comparison, were often described as being “as legible as spaghetti in a bowl”. Beck, who had worked as an engineering draftsman at the London Underground Signals Office, modelled his version on the electrical circuit diagrams he drew for his day job. By designing for comprehensibility rather than topographic exactitude, Beck solved a universal problem of growing cities, and that sound design is probably the reason for the map’s longevity.
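Beck’s core constraint – route segments running only horizontally, vertically or at 45 degrees – can be mimicked with a toy ‘octilinear snapping’ pass. The function below is an invented simplification, not a real schematic-layout algorithm (which must also keep lines from overlapping and space stations evenly): it snaps each leg of a route polyline to the nearest multiple of 45° while preserving the leg’s length.

```python
import math

def snap_octilinear(points):
    """Snap each segment of a polyline to the nearest multiple of 45
    degrees, keeping segment lengths; points is a list of (x, y)."""
    out = [points[0]]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        length = math.hypot(x1 - x0, y1 - y0)
        angle = math.atan2(y1 - y0, x1 - x0)
        # Round the segment's direction to the nearest 45° increment.
        snapped = round(angle / (math.pi / 4)) * (math.pi / 4)
        px, py = out[-1]
        out.append((px + length * math.cos(snapped),
                    py + length * math.sin(snapped)))
    return out
```

Run over a wobbly real-world route, this straightens nearly horizontal legs to horizontal and near-diagonal legs to exact 45° lines – the visual vocabulary of Beck’s diagram.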
One of Schmidt’s “geo-rectified” maps that shows how the “real” DC Metro conforms to the city’s geography.
Courtesy Benjamin M. Schmidt
Unlike Beck, Schmidt makes it clear that his maps are not for commuters’ benefit. “I definitely don’t think these maps are useful, per se; there’s a place for accurate subway maps, but not these twisted versions. But I thought it would be fun to show how these examples of good design that we all live with become distorted if you try to ‘fix’ them. I definitely wouldn’t want to see them actually changed along these lines,” he says.
While Beck’s classic design persists in its relevance and needs no fixing, the digital medium does offer us a new kind of opportunity. It allows us to see a dimension of reality that we didn’t have access to before. Yet it emerges that the truth is quite twisted, in more ways than one.
Esri, the maker of GIS mapping software, has launched an interactive site that allows users to compare and contrast map data between 16 major cities (so far). Themes that can be compared include:
- Commercial/Industrial zones
- Housing density
- Population density
- Senior population
- Youth population
- Public space
- Urban Footprint
- New development