Articles from the Institute Letter

Additional articles from new and past issues of the Institute Letter will continue to be posted as they become available.

By Graham Farmelo

Franklin Roosevelt (front left) and Winston Churchill (front center) in Quebec, September 1944; both leaders ignored Bohr’s warnings.
Winston Churchill doubted whether politicians would be equal to the challenge of such powerful weapons.

In the early evening of March 15, 1933, a group of London socialites gathered in a Westminster mansion to hear a special lecture on the latest developments in nuclear science. The talk was chaired by Winston Churchill. The speaker—Churchill’s friend Frederick Lindemann, a friend of Einstein’s and a professor of physics at Oxford University—discussed John Cockcroft and Ernest Walton’s recent artificial splitting of the atom and James Chadwick’s discovery of the neutron. Churchill had foreseen an important role for this subatomic particle fifteen months earlier in his essay “Fifty Years Hence,” which was read widely in Britain and North America. He had told his readers in this article that scientists were looking for “the match to set the [nuclear] bonfire alight.”

“Fifty Years Hence” was by no means the only article in which Churchill looked forward to the nuclear age. He first did so in 1927 in another popular article, “Shall We All Commit Suicide?,” in which he alluded to the weapon envisaged by his friend H. G. Wells in the novel The World Set Free, where the term “atomic bomb” first appears. A decade later, Churchill warned four million readers of the News of the World in Britain that nuclear energy might soon be harnessed. He was right: Otto Hahn and Fritz Strassmann, working in Hitler’s capital, discovered nuclear fission eight weeks after Churchill’s piece appeared.

READ MORE>

By Piet Hut

Would something resembling life originate in other systems that are large enough in terms of space and time, starting with a few simple building blocks and some simple rules governing their interactions?

Equilibrium thermodynamics explains the nature of human-made engines, but what will explain the nature of living matter? Professor Piet Hut explores the origins of life in this IL article.

READ MORE>

By Kim Lane Scheppele

What happened when the United Nations Security Council passed Resolution 1373 to fight terrorism but failed to define it?

On December 11, 2003, when asked in a press conference whether his Iraq policy was consistent with international law, President George W. Bush joked, “International law? I better call my lawyer; he didn’t bring that up to me.”

But, in fact, since the 9/11 attacks, the United States government had aggressively constructed a new body of international law: global security law. While the Bush administration is probably best known for its CIA black sites, extraordinary rendition, and defense of torture, those policies were rather short-lived, lasting a handful of years at most. By contrast, global security law not only still exists but is becoming ever more entrenched. More than a decade after the attacks, global security law remains one of the most persistent legacies of 9/11.

On September 28, 2001, the United Nations Security Council passed Resolution 1373. Operating under Chapter VII of the UN Charter, which makes resolutions binding on all member states (noncompliance is at least theoretically subject to sanctions), the Security Council required states to change their domestic law in parallel ways to fight terrorism. Previously, the Security Council had typically directed states’ actions or urged states to sign treaties, but it had not directed changes in countries’ domestic laws. With Resolution 1373, the Security Council required states to alter some of the most sensitive areas of national law, like criminal law and domestic intelligence law.

READ MORE>

By Ruben Enikolopov

The mandating of female participation by the NSP has resulted in increased male acceptance of women in public life as well as broad-based improvements in women’s lives, although there is no evidence that it has affected the position of women within the family. Image courtesy of Fotini Christia.

How has Afghanistan's largest development program affected democratic processes, counterinsurgency, and the position of women?

Each year, billions of dollars in foreign aid are directed to the developing world. Assistance comes in a variety of forms, but one method of delivery has become especially popular: community-driven development (CDD), which arose in response to large-scale top-down initiatives criticized for failing to empower aid recipients. This approach emphasizes the involvement of local communities in planning decisions and in controlling the investment of resources. Beyond these direct benefits, CDD is intended to encourage sustained participation through local representative institutions, thus improving social capital and local governance.

The CDD approach is particularly popular in the context of weak or fragile states, in which government bureaucracy often fails to provide public goods and services. From 1996 to 2003, World Bank lending alone for such projects rose from $325 million to $2 billion per year, reaching $30 billion in total as of 2012 in support of four hundred programs with CDD components in ninety-four countries. Yet rigorous empirical evidence of CDD’s value remains limited.

My work with Andrew Beath from the World Bank and Fotini Christia from the Massachusetts Institute of Technology contributes to understanding the effectiveness of this approach by assessing the impact of a large-scale CDD program in Afghanistan known as the National Solidarity Program (NSP). NSP is the country’s largest development program, implemented in over thirty-two thousand of Afghanistan’s thirty-nine thousand villages at a cost of over $1 billion. Funded by a consortium of international donors and administered by the Afghan national government, it aims both to improve the access of rural villagers to critical services and to create a structure for village governance centered on democratic processes and female participation.

READ MORE>

By André Dombrowski

Claude Monet’s La Grenouillère (1869) captures the “now” as a unit in time and the ever-changing experience of modernity. Metropolitan Museum of Art.

How quickening brushwork arose from the industrialization of time

It is often said that impressionism sought to make represented time and the time of representation coterminous. With its seemingly quick and unpolished touch, it gave the modern cultures of speed their first appropriately modernist forms. But art historians have rarely, if ever, interrogated the concrete histories and technologies of time (and timekeeping) that underwrote this seismic stylistic shift, or inquired into the links between quickening brushwork and the nineteenth century’s industrialization of time. This is especially remarkable given that two key scientific events in the measuring of modern time—the advent of quantifiable nervous reaction time in France around 1865 and the standardization of universal time in 1884—overlap so precisely with the history of impressionism, its rise in the mid-1860s, and the turn toward postimpressionism around the mid-1880s.

READ MORE>
