School of Social Science

Environmental Turn in the Human Sciences

By Sverker Sörlin 

Will It Become Decisive Enough?

What do the humanities have to do with the environment? As they are commonly understood, environmental problems are issues that manifest themselves primarily in the environment itself. Natural scientists research these problems and suggest solutions, aided by technology, economics, and policy. It was scientists who defined the modern usage of the concept of “the environment” after World War II. Ecologist William Vogt famously used it in his 1948 volume Road to Survival: “We live in one world in an ecological—an environmental—sense.” He and others at the time thought of “the environment” as a composite of issues that had been in the making for some time—most prominently, population growth, which had been much discussed since the World Population Conference in Geneva in 1927, but also soil erosion, desertification (observed by Paul Sears in his famous 1935 book Deserts on the March), pollution, food, poverty, and starvation.

The Anthropocene: What Is It?

by Sverker Sörlin 

Are Humans a Major and Defining Force on the Geological Scale?

The word “Anthropocene” has had a formidable career in the last few years and is often heard among global change scientists and scholars, in policy circles, green popular movements, and think tanks, and in all spheres where environmental and climate issues are discussed. In the literal, and limited, sense it is a geological concept, on a par with other periods or epochs of the Cenozoic era, such as the Holocene (“wholly recent,” the epoch since the last glaciation, ca. eleven thousand years ago). The root anthropos (Greek for “human”) indicates that humans, as a collectivity across time, serve as a major and defining force on the geological scale.

Whether this is so is a matter of definition, and the question remains open. The Geological Society of London handles these kinds of issues through its Stratigraphy Commission, which expects to be able to present its view on the matter to the Society by 2016. The chief criterion in its search for evidence is whether there will be enough lasting and significant traces left in the “strata” of the Anthropocene to merit its designation as a distinct geological period, or epoch (Zalasiewicz et al. 2011). This is less a philosophical or judgmental issue than an empirical one. Are the assembled impacts and remnants of human activities in the lithosphere, biosphere, atmosphere, pedosphere (the layer of soils), and cryosphere (the layer of ice) so overwhelming that we can be certain that the “deep future” will still be able to register the strata of humanity embedded in Earth itself?

Neurophilosophy and Its Discontents

by Gabrielle Benette Jackson 


How Do We Understand Consciousness Without Becoming Complicit in That Understanding?

What is consciousness? “It is being awake,” “being responsive,” “acting,” “being aware,” “being self-aware,” “paying attention,” “perceiving,” “feeling emotions,” “feeling feelings,” “having thoughts,” “thinking about thoughts,” “it is like this!”

Who is conscious? “We humans, surely!” Well, maybe not all the time. “Animals!” Debatable. “Computers?” No—at least, not yet. “Other machines?” Only in fiction. “Plants?” Absolutely not, right?

Nearly twenty-five years ago, we lived through the “Decade of the Brain,” a governmental initiative set forth by President George H. W. Bush.1 Presidential Proclamation 6158 begins, “The human brain, a three-pound mass of interwoven nerve cells that controls our activity, is one of the most magnificent—and mysterious—wonders of creation. The seat of human intelligence, interpreter of senses, and controller of movement, this incredible organ continues to intrigue scientists and laymen alike. Over the years, our understanding of the brain—how it works, what goes wrong when it is injured or diseased—has increased dramatically. However, we still have much more to learn.” And it concludes, “Now, Therefore, I, George Bush, President of the United States of America, do hereby proclaim the decade beginning January 1, 1990, as the Decade of the Brain. I call upon all public officials and the people of the United States to observe that decade with appropriate programs, ceremonies, and activities.”

What the former President did not say—what is perhaps understood by his readers—is that the brain is quite different from other body parts that have come under scientific investigation. We might be grateful to receive a donated kidney, or to have an artificial heart. But the brain is unlike every other body part: without my brain, there may be no I. Our sense of self, of awareness, of life is profoundly connected to a working brain. A philosopher once said, “in a brain transplant, one wants to be the donor, not the recipient.” Indeed, saying that the brain is the seat of mentality is like saying that the sun is a source of light.
 

Conspiring with the Enemy and Cooperating in Warfare

by Yvonne Chiu 

A wounded German soldier lighting a cigarette for a wounded British soldier at a British field hospital during the Battle of Épehy, near the end of the First World War (1918) (Photo: Lt. Thomas K. Aitken, British Army photographer/Imperial War Museums)

‘Live and Let Live’ as a Representative Element of War

Images that convey the essence of war are more likely to resemble the frenzied, merciless, mutual slaughter between the Achaeans and the Trojans as told in The Iliad, the rapes depicted in Goya’s The Disasters of War, the torture portrayed in The Battle of Algiers, or the indiscriminate napalm bombing in Vietnam dramatized in Apocalypse Now. It is commonly believed—and for good reason—that morality and civilization are inevitably forgotten in war, as participants become desperate to survive, get caught up in the bloodlust, or lose touch with their humanity. There is truth to that, so it might be surprising to think of banning hollow-point bullets (Hague Convention, 1899) or regulating prisoner-of-war treatment (from the 1648 Peace of Westphalia through the 1949 Geneva Conventions) as capturing an essential element of warfare. Yet such rules represent a significant component of war: cooperation between enemies.

Some of the more amazing stories of cooperation in warfare come from the trenches of World War I. During the Christmas truces in 1914, and to a lesser extent in 1915, not only did some 100,000 British and German soldiers unofficially stop fighting, but in some places in Belgium, German soldiers who decorated their trenches with candles and trees and sang carols were answered by British soldiers singing in kind; eventually, the two sides mingled in No Man’s Land, exchanging gifts, food, and souvenirs, and even playing short, casual games of football.

In addition to ad hoc cooperation on a shared holy day, opposing trenches spontaneously developed a longer-lived system of timed shellings that allowed the other side to anticipate and avoid their impact. While trench warfare was a large part of the WWI experience, it is not particularly interesting militarily. Rather, it is noteworthy for the fighting that did not happen. This “live and let live” system has been recounted in marvelous detail by Tony Ashworth (Trench Warfare 1914–1918). That reciprocal exchange—a mutual minimization of injury and death—took different forms during the war: truces lasted anywhere from a few minutes to several months; some were explicit agreements between fraternizing soldiers in close quarters, while others were indirect (explicit fraternization risked legal sanction), conducted over long distances and involving large numbers of people. There were numerous reports of soldiers walking openly above the trenches; of unrestricted movement in and out of the trenches; of Germans frying sausages, and photographs of Britons frying bacon, in the trenches, despite the fact that smoke from the fires would have attracted gunfire on active fronts; and of “quiet” fronts even where there were no ammunition shortages. In some trenches, soldiers hunted and retrieved small game, harvested vegetables, kept cows for fresh milk, and had pianos and books.

A Declaration of Freedom and Equality

Exploring the Arguments of Independence 

The following text is excerpted from Our Declaration: A Reading of the Declaration of Independence in Defense of Equality (Liveright Publishing Corporation, 2014) by Danielle Allen, UPS Foundation Professor in the School of Social Science.

The Declaration of Independence matters because it helps us see that we cannot have freedom without equality. It is out of an egalitarian commitment that a people grows—a people that is capable of protecting us all collectively, and each of us individually, from domination. If the Declaration can stake a claim to freedom, it is only because it is so clear-eyed about the fact that the people’s strength resides in its equality.

The Declaration also conveys another lesson of paramount importance. It is this: language is one of the most potent resources each of us has for achieving our own political empowerment. The men who wrote the Declaration of Independence grasped the power of words. This reveals itself in the laborious processes by which they brought the Declaration, and their revolution, into being. It shows itself forcefully, of course, in the text’s own eloquence.

When we think about how to achieve political equality, we have to attend to things like voting rights and the right to hold office. We have to foster economic opportunity and understand when excessive material inequality undermines broad democratic political participation. But we also have to cultivate the capacity of citizens to use language effectively enough to influence the choices we make together.
 
