This map shows the outlines of modern Siberia (left) and Alaska (right) with dashed lines. The broader area in a darker shade of green, which is now covered by ocean, represents the Bering land bridge as it existed about 18,000 years ago.
Anthropologists say that the ancestors of Native Americans started making their way from Siberia to the Americas 25,000 years ago, across a land bridge that once stood where the Bering Sea now lies. But there are gaps in that story: Why didn’t those migrants leave behind any archaeological traces until 10,000 years later?
Now scientists are homing in on an explanation: During all those millennia, the first Americans were isolated on the land bridge itself. When the land bridge vanished, so did the evidence of that Beringian culture.
The “Beringian Standstill” hypothesis was first proposed by Latin American geneticists in 1997, as a way to explain the genetic evidence indicating that Native Americans started diverging from Siberians 25,000 years ago. In contrast, the archaeological evidence for the first Americans goes back only 15,000 years, to the end of the ice age known as the Last Glacial Maximum.
In this week’s issue of the journal Science, three researchers report new clues that support the claims for Beringia’s lost world. They say fossilized insects, plants and pollen extracted from Bering Sea sediment cores show that central Beringia was once covered by shrub tundra. That would have made it one of the few regions in the Arctic where wood was available for fuel.
Thousands of Siberian migrants might have found refuge in central Beringia until the climate warmed up enough for glaciers to recede, letting them continue their movement into the Americas, the researchers say. “This work fills in a 10,000-year missing link in the story of the peopling of the New World,” Scott Elias, a geography professor at Royal Holloway, University of London, said in a news release.
A photo of Alaska’s shrub tundra environment today shows birch shrubs in the foreground and spruce trees scattered around Eight Mile Lake in the foothills of the Alaska Range. (Photo: Nancy Bigelow / Univ. of Alaska Fairbanks)
In addition to Elias, the authors of “Out of Beringia?” include lead author John Hoffecker and Dennis O’Rourke. For more about the “Beringian Standstill” concept, check out the reports from the University of Utah and the University of Colorado, plus this online animation and PDF presentation. For an alternate explanation of the spread of the first Americans, check out this archived story.
First published February 27th 2014, 1:37 pm
Alan Boyle is the science editor for NBC News Digital. He joined MSNBC.com at its inception in July 1996.
The fate of migrants moving to cities in 17th- and 18th-century England demonstrates how a single pathogen could dramatically alter the risks associated with migration and migratory patterns today.
Cities have always been a magnet to migrants. In 2010, a tipping point was reached: for the first time, according to the World Health Organization, the majority of the world’s population lived in cities. By 2050, seven out of 10 people will have been born in, or migrated to, a city. One hundred years ago, that figure was two out of 10.
Today, cities are generally the safest places to live. If you live in one, you’re likely to be richer than someone living in a rural environment. If you’re richer, you’re likely to live longer. If you live in a city, you have better access to hospitals and healthcare, and you’re more likely to be immunised.
But that was not always the case. In 17th- and 18th-century England, city life was lethal – disproportionately so for those migrating from the countryside.
Dr Romola Davenport is studying the effects of migration on the health of those living in London and Manchester from 1750 to 1850, with a particular focus on the lethality of smallpox – the single most deadly disease in 18th-century England. In the century before 1750, England’s population had failed to grow. Cities and towns sucked in tens of thousands of migratory men, women and children – then killed them. It’s estimated that half of the natural growth of the English population was consumed by London deaths during this period. Burials often outstripped baptisms.
In 2013, cities are no longer the death traps they once were, even accounting for the millions of migrants who live in poor, often slum-like conditions. But will cities always be better places to live? What could eliminate the ‘urban advantage’ and what might the future of our cities look like if antibiotics stop working?
By looking at the past – and trying to make sense of the sudden, vast improvement in survival rates after 1750 – Davenport and the University of Newcastle’s Professor Jeremy Boulton hope to understand more about city life and mortality.
“For modern migrants to urban areas there is no necessary trade-off of health for wealth,” said Davenport. “Historically, however, migrants often took substantial risks in moving from rural to urban areas because cities were characterised by substantially higher death rates than rural areas, and wealth appears to have conferred little survival advantage.”
The intensity of the infectious disease environment overwhelmed any advantages of the wealthy – such as better housing, food and heating. Although cities and towns offered unparalleled economic opportunities for migrants, wealth could not compensate for the higher health risks exacted by urban living.
“Urban populations are large and dense, which facilitates the transmission of infectious diseases from person to person or via animals or sewage. Towns functioned as trading posts not only for ideas and goods but also for pathogens. Therefore, growing an urban population relied upon substantial immigration from rural areas,” explained Davenport.
“After 1750, cities no longer functioned as ‘demographic sinks’ because there was a rapid improvement in urban mortality rates in Britain. By the mid-19th century, even the most notorious industrial cities such as Liverpool and Manchester were capable of a natural increase, with the number of births exceeding deaths.”
Davenport has been studying the processes of urban mortality improvement and changing migrant risks using extremely rich source material from the large London parish of St Martin-in-the-Fields. The research, funded by the Wellcome Trust and the Economic and Social Research Council, is now being augmented with abundant demographic archives from Manchester, funded by the Leverhulme Trust.
For both cities, Davenport and colleagues have access to detailed records of the individual burials underlying the Bills of Mortality, which were the main source of urban mortality statistics from the 17th to the 18th century. These give age at death, cause of death, street address and the fee paid for burial, which enables them to study the age and sex distribution of deaths by disease. In addition, baptismal data allow them to ‘reconstitute’ families as well as to measure the mortality rates of infants by social status.
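The kind of tabulation this record linkage enables can be sketched in a few lines. The records below are invented placeholders, not actual entries from the St Martin-in-the-Fields registers, and the field names are assumptions:

```python
from collections import Counter

# Hypothetical, simplified burial records in the style described above:
# age at death, sex, cause of death, and the fee paid for burial.
burials = [
    {"age": 2,  "sex": "F", "cause": "smallpox",    "fee_pence": 18},
    {"age": 24, "sex": "M", "cause": "smallpox",    "fee_pence": 36},
    {"age": 67, "sex": "F", "cause": "consumption", "fee_pence": 60},
    {"age": 0,  "sex": "M", "cause": "convulsions", "fee_pence": 12},
    {"age": 31, "sex": "F", "cause": "smallpox",    "fee_pence": 24},
]

def age_band(age):
    """Coarse bands separating child from adult mortality."""
    return "child (0-15)" if age <= 15 else "adult (16+)"

# Distribution of deaths by disease and age band.
by_cause_and_age = Counter((r["cause"], age_band(r["age"])) for r in burials)

# Share of smallpox burials that were adults -- the kind of figure used
# below to infer that adult victims were likely non-immune rural migrants.
smallpox = [r for r in burials if r["cause"] == "smallpox"]
adult_share = sum(r["age"] > 15 for r in smallpox) / len(smallpox)

print(by_cause_and_age)
print(f"adult share of smallpox burials: {adult_share:.0%}")
```

Scaled up to tens of thousands of real register entries, the same counts yield the age and sex distributions of deaths by disease that the researchers describe.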
“The records themselves give only a bald account of death,” said Davenport. “But sometimes we can link them to workhouse records and personal accounts, especially among the migrant poor, which really bring home the realities of life and death in early modern London.
“Smallpox was deadly. At its height, it accounted for 10% of all burials in London and an astonishing 20% in Manchester. Children were worst affected, but 20% of London’s smallpox victims were adults – likely to be migrants who had never been exposed to, and survived, the disease in childhood. However, in Manchester – a town that grew from 20,000 to 250,000 in a century – 95% of smallpox burials were children in the mid-18th century, implying a high level of endemicity not only in Manchester but also in the rural areas that supplied migrants to the city.
“So studying urban populations can tell us not only about conditions in cities but also about the circulation of diseases in the rest of the population.”
The greater lethality of smallpox in Manchester is, for the moment, still a mystery to researchers; but evidence suggests the potential importance of transmission via clothing or other means – as opposed to the person-to-person transmission assumed in mathematical models of smallpox transmission in bioterrorism scenarios. Although smallpox was eradicated in the late 1970s, both the USA and Russia have stockpiles of the virus – which has led to fears of their use by terrorists should the virus ever fall into the wrong hands. Data on smallpox epidemics before the introduction of vaccination in the late 1790s are very valuable to bioterrorism researchers because they provide insights into how the virus might spread in an unvaccinated population (only a small proportion of the world’s population is vaccinated against smallpox).
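The person-to-person assumption mentioned above is typically encoded in compartmental models. A minimal discrete-time SIR (susceptible-infectious-removed) sketch of spread in an unvaccinated population looks like the following; the parameters and population size are illustrative assumptions, not estimates from the historical data discussed here:

```python
# Minimal discrete-time SIR model of person-to-person spread in a
# wholly unvaccinated population. beta (transmission rate per day) and
# gamma (removal rate per day) are assumed, illustrative values.

def sir_epidemic(population=100_000, initial_infected=10,
                 beta=0.5, gamma=0.2, days=365):
    """Return daily (susceptible, infectious, removed) counts."""
    s = float(population - initial_infected)
    i = float(initial_infected)
    r = 0.0
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i / population  # mass-action mixing
        new_removals = gamma * i
        s -= new_infections
        i += new_infections - new_removals
        r += new_removals
        history.append((s, i, r))
    return history

history = sir_epidemic()
peak_day = max(range(len(history)), key=lambda t: history[t][1])
print(f"epidemic peaks on day {peak_day} "
      f"with {history[peak_day][1]:.0f} infectious")
```

Transmission via clothing or other fomites, as the Manchester evidence suggests, would add routes of infection that this simple person-to-person structure does not capture.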
From 1770 onwards, there was a rapid decline in adult smallpox victims in both London and Manchester, which Davenport believes could be attributable to a rapid upsurge in the use of smallpox inoculation (a precursor of vaccination) by would-be migrants or a change in the transmissibility and potency of the disease. By the mid-19th century, towns and cities appear to have been relatively healthy destinations for young adult migrants, although still deadly for children.
“Smallpox was probably the major cause of the peculiar lethality of even small urban settlements in the 17th and 18th centuries,” said Davenport, “and this highlights how a single pathogen, like plague or HIV, can dramatically alter the risks associated with migration and migratory patterns.”
“The close relationship between wealth and health that explains much of the current ‘urban advantage’ is not a constant but emerged in England in the 19th century,” added Davenport. “While wealth can now buy better access to medical treatment, as well as better food and housing, it remains an open question as to whether this relationship will persist indefinitely in the face of emerging threats such as microbial drug resistance.”
Header Image: An 1802 cartoon of the early controversy surrounding Edward Jenner’s vaccination theory, depicting his cowpox-derived smallpox vaccine causing cattle to emerge from patients. (Image: Wikipedia)
Contributing Source: University of Cambridge
© Copyright 2014 HeritageDaily – Heritage & Archaeology News
Today, Atlantic Cities points out that the U.S. Geological Survey (USGS) has published a map of lightning strikes in the U.S. According to the description of the map in the Google Maps Gallery, the map is organized by county and represents incidents over “the years 1995-2000 and 2001-2009.” The darker the shade of red, the more “events” have occurred, and the map breaks down each county’s data in terms of total number of injuries, fatalities, cost of property damage, and cost of crop damage.
Read more: Lightning Map by USGS Shows Where You’re Most Likely to Get Struck | TIME.com http://newsfeed.time.com/2014/02/28/this-map-shows-where-in-the-u-s-you-have-the-highest-chance-of-getting-struck-by-lightning/
VIRGINIA: The US State of Virginia has recently voted to include the name “East Sea” in its history and geography textbooks, alongside what is now marked only as the “Sea of Japan”.
The House of Delegates passed the bill in a vote of 81 to 15, and once it is signed into law, Virginia will be the first American state to include “East Sea”, South Korea’s name for the stretch of water.
The move appears to show that Korean Americans have been mobilised by their country’s increasingly strained relationship with Tokyo.
Peter Y Kim, a Korean American lawyer living in Annandale in Virginia, was shocked when he caught a glimpse of his son’s fifth grade geography textbook recently.
“We found out that the actual textbook, the World Civilisations, only says ‘Sea of Japan’ (for the sea between Korea and Japan),” he said.
Mr Kim, who is the president of the Voice of Korean Americans, was upset that the name he grew up learning was not being passed on to his children.
“So I got really frustrated, I got upset, I told them, that’s not true, this particular sea is called the East Sea,” he said.
It is a source of bitterness for the community that the name “Sea of Japan” became the worldwide standard back in the 1920s, while Korea was under Japanese colonial rule.
So Mr Kim and other Korean activists decided to do something about it on behalf of the 82,000 Koreans in Virginia.
They persuaded Virginia State Senator Chap Petersen, who is married to a Korean American and received significant Korean American support in winning his seat, to push for a law to revise the books.
Japan hired a team of lobbyists to defend its position, stressing that “Sea of Japan” was the only internationally recognised name, and was in use from the 19th century, before Japanese colonial rule.
But when it came to a vote, Seoul won by a wide margin — 81 to 15.
Mr Petersen thinks it is a sign that Korean Americans are becoming more politically active.
“I think the Korean population has become much more organised and much more sophisticated.
“And I’ve had people that have supported me, and again, my wife’s Korean so there’s a natural link for me, but people who supported me said ‘you’ve got to stand with us on this issue. And we expect you to stand with us,’” he said.
The Obama administration is also clearly well aware of the growing importance of the Korean American vote — last summer, South Korean President Park Geun-hye addressed a joint session of Congress, a rare honour even for America’s closest allies.
Around 6 per cent of Mr Petersen’s constituents are Korean American, but Mr Petersen said the latest move will not affect any bigger, national ties with Tokyo.
“I’ve made the point that America and Japan are great allies and they have been for almost 70 years.
“People are still going to buy Toyotas and buy Hondas and buy Sony televisions and that’s not going to change. This has nothing to do with any sort of antipathy towards the Japanese. This is a local issue.”
Virginia’s governor is expected to sign the bill into law within the next few weeks — a sign that the Korean American community is now very firmly on the map.
How the north ended up on top of the map
Why do maps always show the north as up? For those who don’t just take it for granted, the common answer is that Europeans made the maps and they wanted to be on top. But there’s really no good reason for the north to claim top-notch cartographic real estate over any other bearing, as an examination of old maps from different places and periods can confirm.
The profound arbitrariness of our current cartographic conventions was made evident by McArthur’s Universal Corrective Map of the World, an iconic “upside down” view of the world that recently celebrated its 35th anniversary. Launched by Australian Stuart McArthur on Jan. 26, 1979 (Australia Day, naturally), this map is supposed to challenge our casual acceptance of European perspectives as global norms. But seen today with the title “Australia: No Longer Down Under,” it’s hard not to wonder why the upside-down map, for all its subversiveness, wasn’t called “Botswana: Back Where It Belongs” or perhaps “Paraguay Paramount!”
The McArthur map also makes us wonder why we are so quick to assume that Northern Europeans were the ones who invented the modern map — and decided which way to hold it — in the first place. As is so often the case, our eagerness to invoke Eurocentrism displays a certain bias of its own, since in fact, the north’s elite cartographic status owes more to Byzantine monks and Majorcan Jews than it does to any Englishman.
There is nothing inevitable or intrinsically correct — not in geographic, cartographic or even philosophical terms — about the north being represented as up, because up on a map is a human construction, not a natural one. Some of the very earliest Egyptian maps show the south as up, presumably equating the Nile’s northward flow with the force of gravity. And there was a long stretch in the medieval era when most European maps were drawn with the east on the top. If there was any doubt about this move’s religious significance, they eliminated it with their maps’ pious illustrations, whether of Adam and Eve or Christ enthroned. In the same period, Arab map makers often drew maps with the south facing up, possibly because this was how the Chinese did it.
Things changed with the age of exploration. Like the Renaissance, this era didn’t start in Northern Europe. It began in the Mediterranean, somewhere between Europe and the Arab world. In the 14th and 15th centuries, increasingly precise navigational maps of the Mediterranean Sea and its many ports, known as Portolan charts, appeared. They were designed for use by mariners navigating the sea’s trade routes with the help of a recently adopted technology, the compass. These maps had no real up or down — pictures and words faced in all sorts of directions, generally pointing inward from the edge of the map — but they all included a compass rose with north clearly distinguished from the other directions.
Members of the Italian Cartographic School preferred to mark north with a hat or embellished arrow, while their equally influential colleagues from the Spanish-ruled island of Majorca used an elaborate rendering of Polaris, the North Star. These men, who formed the Majorcan Cartographic School, also established a number of other crucial mapping conventions of the era, including coloring in the Red Sea bright red and drawing the Alps as a giant chicken foot. Among other hints of the school’s predominantly Jewish membership was the nickname of one of its more prominent members: “el jueu de les bruixoles,” or “the Compass Jew.”
But this is only part of the explanation. The arrow of the compass can just as easily point south, since the magnetized metal needle simply aligns with the earth’s magnetic field, with a pole at each end. Indeed, the Chinese supposedly referred to their first compass magnets as south-pointing stones. Crucially, the Chinese developed this convention before they began to use compasses for navigation at sea. By the time Europeans adopted the compass, though, they were already experienced in navigating with reference to the North Star, the one point in the heavens that remains fixed anywhere in the Northern Hemisphere. Many mariners saw the compass as an artificial replacement for the star on cloudy nights and even assumed it was the pull of the star itself that drew the needle north.
Yet even as this north-pointing compass became essential to navigation and navigational charts in the 15th century, less precise land maps showing the entire known Old World continued to offer a disorienting array of perspectives. Some had the east on top, in keeping with European tradition, while others preferred the south, in keeping with Arab tradition, and others went with the north, in keeping with the point on the compass rose. Among other things that stand out in these maps is that, given the extent of the known world, the location of the Mediterranean and a bit of uncertainty about the equator, Italy was more or less centered between the north and the south — meaning that whichever way you turned the map, Italy remained more or less halfway between the top and bottom. Conveniently, Italy was at roughly the same latitude as Jerusalem, which through most of the century map makers assumed was at the center of the known world. In fact, the first blow to this pious assumption came with the discovery of just how much of the Old World lies to the east of Jerusalem. Only later did it become apparent just how far north of the equator Jerusalem — and by extension, Italy — really was.
The north’s position was ultimately secured by the beginning of the 16th century, thanks to Ptolemy, with another European discovery that, like the New World, others had known about for quite some time. Ptolemy was a Hellenic cartographer from Egypt whose work in the second century A.D. laid out a systematic approach to mapping the world, complete with intersecting lines of longitude and latitude on a half-eaten-doughnut-shaped projection that reflected the curvature of the earth. The cartographers who made the first big, beautiful maps of the entire world, Old and New — men like Gerardus Mercator, Henricus Martellus Germanus and Martin Waldseemuller — were obsessed with Ptolemy. They turned out copies of Ptolemy’s Geography on the newly invented printing press, put his portrait in the corners of their maps and used his writings to fill in places they had never been, even as their own discoveries were revealing the limitations of his work.
For reasons that have been lost to history, Ptolemy put the north up. Or at least that’s the way it appears from the only remaining copies of his work, made by 13th century Byzantine monks. On the one hand, Ptolemy realized that, sitting in Alexandria, he was in the northern half of a very large globe, whose size had been fairly accurately calculated by the ancient Greeks. On the other hand, a south-up orientation would have put Alexandria, along with all the main civilizational centers of the Greco-Roman Mediterranean, at the very bottom of the inhabited world as Ptolemy knew it.
Even if compasses and Ptolemy had both pointed to the south, northerners could still have come along and flipped things around. In fact, with north seemingly settled at the top of the page in the 16th century, there were still some squabbles over who in the Northern Hemisphere would end up left, right or center. The politics of reorientation are anything but simple. For Americans, it’s easy to think that our position, at the top-left of most maps, is the intrinsically preferable one; it certainly seems that way if you happen to be from a culture that reads from left to right. But it’s unclear why Arabs or Israelis, who read from right to left, would necessarily think so. And while map makers usually like to design maps with the edges running through one of the world’s major oceans, it is certainly possible to put North America in the very center by splitting the world in half through Asia.
As the United States was just beginning to emerge on the world stage in the 19th century, American cartographers made some earnest efforts to give the U.S. pride of place. While there is something endearing about the idea of an Indiana map maker in 1871 preparing an atlas with Indiana squarely in the center of the world, the unfortunate side effect was that most of the Midwest disappeared into the gaping crease between atlas pages. Nepal, of course, gets a bit cut off on the sides, but that is nothing compared with what happens to Nebraska. And ironically, accepting the United States’ position in the top left leaves Africa at the very center of the map, which is hardly in line with the politics of the time. Though this puts Africa in what was once considered the map’s prime real estate, it also reduces the continent’s relative size on the standard Mercator projection — another source of complaint for carto-critics.
The orientation of our maps, like so many other features of the modern world, arose from the interplay of chance, technology and politics in a way that defies our desire to impose easy or satisfying narratives. But at a time when the global south continues to suffer more than its share of violence and poverty, let’s not dismiss McArthur’s Universal Corrective Map of the World too quickly. It continues to symbolize a noble wish: that we could overturn the unjust political and economic relationships in our world as easily as we can flip the maps on our walls.
Nick Danforth is a PhD candidate at Georgetown University. He writes about Middle East maps, history and politics at Midafternoon Map.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera America’s editorial policy.
Shotgun geography: the history behind the famous New Orleans elongated house
Few elements of the New Orleans cityscape speak to the intersection of architecture, sociology and geography so well as the shotgun house. Once scorned, now cherished, shotguns shed light on patterns of cultural diffusion, class and residential settlement, social preferences and construction methods.
The shotgun house is not an architectural style; rather, it is a structural typology — what folklorist John Michael Vlach described as “a philosophy of space, a culturally determined sense of dimension.”
A typology, or type, may be draped in any fashion. Thus we have shotgun houses adorned in Italianate, Eastlake and other styles, just as there are Creole and Federalist style townhouses, and Spanish colonial and Greek revival cottages.
Tradition holds that the name “shotgun” derives from the notion of firing bird shot through the front door and out the rear without touching a wall. The term itself postdates the shotgun’s late-19th-century heyday, not appearing in print until the early 20th century.
According to some theories, cultures that produced shotgun houses (and other residences without hallways, such as Creole cottages) tended to be more gregarious, or at least unwilling to sacrifice valuable living space for the purpose of occasional passage.
Cultures that valued privacy, on the other hand, were willing to make this trade-off. When they arrived in New Orleans in the early 19th century, for example, privacy-conscious peoples of Anglo-Saxon descent brought with them the American center-hall cottage and side-hall townhouse, in preference over local Creole designs.
In the 1930s, LSU geographer Fred B. Kniffen studied shotguns as part of his field research on Louisiana folk housing. He and other researchers proposed a number of hypotheses explaining the origin and distribution of this distinctive house type.
One theory, popular with tour guides and amateur house-watchers, holds that shotgun houses were designed in New Orleans in response to a real estate tax based on frontage rather than square footage, motivating narrow structures. There’s one major problem with this theory. No one can seem to find that tax code.
Could the shotgun be an architectural response to narrow urban lots? Indeed, you can squeeze in more structures with a slender design. But why then do we see shotguns in rural fields with no such limits?
Could it have evolved from indigenous palmetto houses or Choctaw huts? Unlikely, given their appearance in the Caribbean and beyond.
Could it have been independently invented? Roberts & Company, a New Orleans sash and door fabricator formed in 1856, developed blueprints for prefabricated shotgun-like houses in the 1860s to 1870s and even won awards for them at international expositions. But then why do we see “long houses” in the rear of the French Quarter and in Faubourg Treme as early as the 1810s?
Or, alternately, did the shotgun diffuse from the Old World as peoples moved across the Atlantic and brought with them their building culture, just as they brought their language, religion and foodways? Vlach noted the abundance of shotgun-like long houses in the West Indies, and traced their essential form to the enslaved populations of St. Domingue (now Haiti) who had been removed from the western and central African regions of Guinea and Angola.
His research identified a gable-roofed housing stock indigenous to the Yoruba peoples, which he linked to similar structures in modern Haiti with comparable rectangular shapes, room juxtapositions and ceiling heights.
Vlach hypothesizes that the 1809 exodus of Haitians to New Orleans after the St. Domingue slave insurrection of 1791 to 1803 brought this vernacular house type to the banks of the Mississippi. “Haitian migrants had only to continue in Louisiana the same life they had known in St. Domingue,” he wrote. “The shotgun house of Port-au-Prince became, quite directly, the shotgun house of New Orleans.”
The distribution of shotgun houses throughout Louisiana gives indirect support to the diffusion argument. Kniffen showed in the 1930s that shotguns generally occurred along waterways in areas that tended to be more Francophone in their culture, higher in their proportions of people of African and Creole ancestry, and older in their historical development.
Beyond state boundaries, shotguns occur throughout the lower Mississippi Valley, correlated with antebellum plantation regions and with areas that host large black populations. They also appear in interior Southern cities, most notably Louisville, Ky., which comes a distant second to New Orleans in terms of numbers and stylistic variety.
If in fact the shotgun diffused from Africa to Haiti through New Orleans and up the Mississippi and Ohio valleys, this is the distribution we would expect to see.
Clearly, poverty abets cultural factors in explaining this pattern. Simplicity of construction and conservation of resources (building materials, space) probably made the shotgun house equally attractive to poorer classes in many areas.
Indeed, it is possible that we may be artificially yoking together a wide variety of house types, unrelated in their provenance but similar in their appearance, by means of a catchy moniker coined after their historical moment.
Whatever their origins, shotgun singles and doubles came to dominate the turn-of-the-century housing stock of New Orleans’ working-class neighborhoods. Yet they were also erected as owner-occupied homes in wealthier areas, including the Garden District.
New Orleans shotguns in particular exhibited numerous variations: with hip, gable or apron roofs; with “camelbacks” to increase living space; with grand classical facades or elaborate Victorian gingerbread. The variety can be explained as a strategy to address market demand with a multitude of options in terms of space needs, fiscal constraints and stylistic preferences.
New Orleanians by the 20th century, as part of their gradual Americanization, desired more privacy than their ancestors, and increasing affluence and new technologies — such as mechanized kitchens, indoor plumbing, air conditioning, automobiles and municipal drainage — helped form new philosophies about residential space.
Professional home builders responded accordingly, some adding hallways or ells or side entrances to the shotgun, others morphing it into the bungalow form. House-buyers came to disdain the original shotgun, and it faded from new construction during the 1910s and 1920s.
A Times-Picayune writer captured the prevailing sentiment in a 1926 column: “Long, slender, shotgun houses,” he sighed, “row upon row(,) street upon street…all alike… unpainted, slick-stooped, steep-roofed, jammed up together, like lumber in a pile.”
Architectural historians also rolled their eyes at prosaic shotguns, and did not protest their demolition, even in the French Quarter, as late as the 1960s.
In recent decades, however, New Orleanians have come to appreciate the sturdy construction and exuberant embellishment of their shotgun housing stock, and now value them as a key element of the cityscape.
Thousands have since been renovated, and the shotgun has experienced a recent revival. Some homes in the Make It Right project in the Lower 9th Ward, for example, were inspired by the shotgun (although rendered in modernist style), and some pre-fabricated “Katrina Cottages” and New Urbanist homes in recently rebuilt public housing complexes are made to look like the shotguns of old.
It’s revealing to note, however, that among the renovations New Orleanians now make to their shotguns is something completely alien to their original form.
They add a hallway.
Richard Campanella, a geographer with the Tulane School of Architecture and a Monroe Fellow with the New Orleans Center for the Gulf South, is the author of the forthcoming “Bourbon Street: A History” as well as “Bienville’s Dilemma,” “Geographies of New Orleans,” and other books. He may be reached through his website, email@example.com or @nolacampanella on Twitter.
These Maps Show Which Areas Of The Country Have The Biggest Carbon Footprints
It’s no secret that the U.S. is one of the biggest carbon emitters around. U.S. households alone are responsible for 20% of the world’s greenhouse gas emissions, even though Americans make up just over 4% of the global population.
But which areas in the U.S. are contributing the most? These interactive maps from the University of California, Berkeley show where the U.S. has the biggest carbon footprint. You can even calculate your city’s carbon footprint on their site.
The carbon footprint measurement equals the total greenhouse gas emissions of the zip code in question. An area’s carbon footprint includes things like energy people use at home, energy used by businesses, and transportation. The biggest source of emissions depends on the area. For example, the suburbs have a higher percentage of emissions coming from individual vehicles than big cities do.
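The accounting described above amounts to summing per-category emissions for each ZIP code. A minimal sketch of that idea follows; the category names and all the numbers are hypothetical placeholders for illustration, not data from the study.

```python
# Toy model of per-ZIP-code carbon accounting: the total footprint is the
# sum over emission categories (home energy, business energy, transportation).
# All category names and values below are made-up illustrations, not study data.

def zip_code_footprint(emissions_by_category):
    """Total footprint (metric tons CO2e) as a sum over emission categories."""
    return sum(emissions_by_category.values())

# A hypothetical suburban ZIP code: vehicle emissions dominate.
suburb = {"home_energy": 12.0, "business_energy": 8.0, "vehicles": 25.0}
# A hypothetical dense-city ZIP code: less driving, more shared resources.
city = {"home_energy": 7.0, "business_energy": 10.0, "vehicles": 6.0}

for name, area in [("suburb", suburb), ("city", city)]:
    total = zip_code_footprint(area)
    share = area["vehicles"] / total
    print(f"{name}: {total:.1f} t CO2e total, vehicles = {share:.0%}")
```

With these toy numbers, vehicles make up a much larger share of the suburban total than of the city total, mirroring the pattern the article describes.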
The maps use data from the Residential Energy Consumption Survey. The full study was published in December in Environmental Science & Technology.
Here you can see the average annual carbon footprint across the U.S. — green areas have lower emissions, while orange and red areas have higher ones. The white areas on the map show where survey data was unavailable.
Most areas range from 40 to 80 metric tons of carbon dioxide. The Midwest and parts of the Northeast are the worst areas.
Zooming in on New York City, a “mega city” in the study’s terms, you can see that its carbon footprint is low relative to its population density.
Unlike the map above, this map shows only the emissions the average home produces, such as emissions from electricity and commutes. It excludes emissions from things like goods, food, and services.
It’s easy to see the worst regions in the dark red areas on the map:
Here you can see the average vehicle miles traveled by zip code. A lot of driving increases the size of a person’s carbon footprint. The purple areas represent the highest number of miles.
City v. Suburb
Intuitively it makes sense to assume that as the population density of an area increases, emissions per person decrease; when people and businesses are closer together, there’s less commuting, and more resources are shared between people.
But this new research shows that the relationship is more complex. The study suggests there is no simple linear relationship between population density and greenhouse gas emissions.
Emissions actually increase as population density increases until an area hits about 3,000 people per square mile.
In mega cities, like New York and Los Angeles, the emissions start decreasing again as the population density climbs. This creates an upside down “U” shape when comparing carbon footprint and population density of an area.
This is visible in the plot to the left, which has population density on the x-axis and carbon footprint on the y-axis.
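The inverted-U pattern described above can be sketched with a toy piecewise rule. The 3,000 people per square mile turning point comes from the article, but the footprint values and slopes below are purely illustrative placeholders, not the study’s fitted curve.

```python
# Toy illustration of the inverted-U relationship between population density
# and per-person carbon footprint. Only the ~3,000 people/sq-mile turning
# point comes from the article; all footprint values are made-up placeholders.

def toy_footprint(density):
    """Illustrative per-person footprint (t CO2e) vs density (people/sq mi)."""
    peak_density = 3000  # approximate turning point cited in the article
    if density <= peak_density:
        # Footprint rises with density up to the peak.
        return 40 + 10 * (density / peak_density)
    # Beyond the peak (mega cities), footprint falls again, with a floor.
    return max(30, 50 - 5 * (density / peak_density - 1))

for d in [500, 3000, 30000]:  # rural, peak, mega-city densities
    print(d, round(toy_footprint(d), 1))
```

Evaluating the toy rule at a rural density, the peak density, and a mega-city density reproduces the rise-then-fall shape: the footprint is highest near 3,000 people per square mile and lower on either side.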
Further, even though these dense metropolitan areas have a small carbon footprint relative to the number of people they hold, the surrounding suburbs have a much bigger carbon footprint, “more than offsetting the benefit of low carbon areas in city centres,” the researchers say.
A changing landscape
If more people move into the suburbs, there could be a significant increase in the country’s carbon footprint. Suburbs already account for 50% of the total household carbon footprint in the U.S.
“Increasing rents would also likely further contribute to pressures to suburbanize the suburbs, leading to a possible net increase in emissions,” the researchers write in the paper.
The new insight into how population density impacts carbon footprint shows there is no one-size-fits-all strategy for reducing carbon emissions across the country: areas with different population densities produce different amounts of carbon dioxide, and they also have different main sources of CO2.
For example, transportation accounts for 50% of all emissions in suburban areas. But, in big cities like New York, one of the largest emission contributors is food services. The optimal strategy to reduce emissions in both of these areas would be different: the suburbs should focus on ways to reduce transportation emissions, and big cities should focus on ways to reduce food industry emissions.