# Biology

## The Prisoner's Dilemma

### By Freeman Dyson

 Groups lacking cooperation are like dodoes, losing the battle for survival collectively rather than individually.

The Evolution of Cooperation is the title of a book by Robert Axelrod. It was published by Basic Books in 1984, and became an instant classic. It set the style in which modern scientists think about biological evolution, reducing the complicated and messy drama of the real world to a simple mathematical model that can be run on a computer. The model that Axelrod chose to describe evolution is called “The Prisoner’s Dilemma.” It is a game for two players, Alice and Bob. They are supposed to be interrogated separately by the police after they have committed a crime together. Each independently has the choice, either to remain silent or to say the other did it. The dilemma consists in the fact that each individually does better by testifying against the other, but they would collectively do better if they could both remain silent. When the game is played repeatedly by the same two players, it is called Iterated Prisoner’s Dilemma. In the iterated game, each player does better in the short run by talking, but does better in the long run by remaining silent. The switch from short-term selfishness to long-term altruism is supposed to be a model for the evolution of cooperation in social animals such as ants and humans.

Mathematics is always full of surprises. The Prisoner’s Dilemma appears to be an absurdly simple game, but Axelrod collected an amazing variety of strategies for playing it. He organized a tournament in which each of the strategies plays the iterated game against each of the others. The results of the tournament show that this game has a deep and subtle mathematical structure. There is no optimum strategy. No matter what Bob does, Alice can do better if she has a “Theory of Mind,” reconstructing Bob’s mental processes from her observation of his behavior.
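A tournament like Axelrod’s is easy to sketch in miniature. The following is an illustrative sketch, not Axelrod’s actual tournament code; the function and strategy names are my own, and it uses the conventional payoff values (5 for defecting against a cooperator, 3 for mutual silence, 1 for mutual defection, 0 for being betrayed) to pit the famously successful tit-for-tat strategy against a pure defector.

```python
# Minimal iterated Prisoner's Dilemma sketch, with the conventional payoffs.
# "C" = remain silent (cooperate), "D" = testify against the other (defect).

PAYOFF = {  # (my move, opponent's move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def always_defect(history):
    return "D"

def tit_for_tat(history):
    # Cooperate first, then copy the opponent's previous move.
    return history[-1] if history else "C"

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b = [], []   # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(hist_a), strategy_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# Two tit-for-tat players cooperate throughout: 3 points per round each.
print(play(tit_for_tat, tit_for_tat))      # (30, 30)
# Against always-defect, tit-for-tat loses only the opening round.
print(play(tit_for_tat, always_defect))    # (9, 14)
```

Tit-for-tat mirrors the opponent’s last move, so it sustains cooperation with cooperators while conceding only the first round to a pure defector; that balance of short-term loss and long-term gain is exactly the switch the paragraph above describes.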

## Studying the Shape of Data Using Topology

### By Michael Lesnick

 A still-developing branch of statistics called topological data analysis seeks to extract useful information from big data sets. In the last fifteen years, there have been applications to several areas of science and engineering, including oncology, astronomy, neuroscience, image processing, and biophysics.

The story of the “data explosion” is by now a familiar one: throughout science, engineering, commerce, and government, we are collecting and storing data at an ever-increasing rate. We can hardly read the news or turn on a computer without encountering reminders of the ubiquity of big data sets in the many corners of our modern world and the important implications of this for our lives and society.

Our data often encodes extremely valuable information, but is typically large, noisy, and complex, so that extracting useful information from the data can be a real challenge. I am one of several researchers who worked at the Institute this year in a relatively new and still-developing branch of statistics called topological data analysis (TDA), which seeks to address aspects of this challenge.

In the last fifteen years, there has been a surge of interest and activity in TDA, yielding not only practical new tools for studying data, but also some pleasant mathematical surprises. There have been applications of TDA to several areas of science and engineering, including oncology, astronomy, neuroscience, image processing, and biophysics.

The basic goal of TDA is to apply topology, one of the major branches of mathematics, to develop tools for studying geometric features of data. In what follows, I’ll make clear what we mean by “geometric features of data,” explain what topology is, and discuss how we use topology to study geometric features of data. To finish, I’ll describe one application of TDA to oncology, where insight into the geometric features of data offered by TDA led researchers to the discovery of a new subtype of breast cancer.
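To make “geometric features of data” concrete, here is a self-contained sketch of the simplest such feature: the number of connected components of a point cloud, tracked as a distance scale grows (the zero-dimensional case of persistent homology). The point cloud and scales are invented for illustration, and real TDA software also tracks higher-dimensional features such as loops and voids.

```python
from itertools import combinations
from math import dist  # Euclidean distance (Python 3.8+)

def connected_components(points, scale):
    """Number of clusters when points within `scale` of each other are linked."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    # Union every pair of points closer than the current scale.
    for i, j in combinations(range(len(points)), 2):
        if dist(points[i], points[j]) <= scale:
            parent[find(i)] = find(j)
    return len({find(i) for i in range(len(points))})

# Two well-separated clusters of points in the plane.
cloud = [(0, 0), (0.5, 0.1), (0.2, 0.4), (5, 5), (5.3, 4.9)]
for s in (0.1, 1.0, 10.0):
    print(s, connected_components(cloud, s))   # 5, then 2, then 1 components
```

Features that persist across a wide range of scales (here, the two clusters visible between scales 1.0 and 10.0) are taken as real structure in the data; features that appear and vanish quickly are treated as noise.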

## Life on Other Planets

### By David S. Spiegel

 Sighting a deer outside of Fuld Hall; how long did Earth have to wait before life "walked into sight"?

Until a couple of decades ago, the only planets we knew existed were the nine in our Solar System. In the last twenty-five years, we’ve lost one of the local ones (Pluto, now classified as a “minor planet”) and gained about three thousand candidate planets around other stars, dubbed exoplanets. The new field of exoplanetary science is perhaps the fastest growing subfield of astrophysics, and will remain a core discipline for the foreseeable future.

The fact that any biology beyond Earth seems likely to live on such a planet is among the many reasons why the study of exoplanets is so compelling. In short, planets are not merely astrophysical objects but also (at least some of them) potential abodes for life.

The highly successful Kepler mission involves a satellite with a sensitive telescope/camera that stares at a patch of sky in the direction of the constellation Cygnus. The goal of the mission is to find what fraction of Sun-like stars have Earth-sized planets with a similar Earth-Sun separation (about 150 million kilometers, or the distance light travels in eight minutes).
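The eight-minute figure quoted above is a simple unit conversion, worth checking: 150 million kilometers divided by the speed of light.

```python
# Sanity check on the Earth-Sun distance quoted above:
# light crosses ~150 million km in roughly eight minutes.
c_km_per_s = 299_792.458          # speed of light in km/s
au_km = 150_000_000               # approximate Earth-Sun separation in km
minutes = au_km / c_km_per_s / 60
print(f"{minutes:.1f} minutes")   # ~8.3 minutes
```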

## How Our Brains Operate: Questions Essential to Our Humanness

### By John Hopfield

 From Rudyard Kipling's "Just So" story "The Elephant's Child"

All of us who have watched as a friend or relative has disappeared into the fog of Alzheimer’s arrive at the same truth. Although we recognize people by their visual appearance, what we really are as individual humans is determined by how our brains operate. The brain is certainly the least understood organ in the human body. If you ask a cardiologist how the heart works, she will give an engineering description of a pump based on muscle contraction and valves between chambers. If you ask a neurologist how the brain works, how thinking takes place, well . . . Do you remember Rudyard Kipling’s Just So Stories, full of fantastical evolutionary explanations, such as the one about how the elephant got its trunk? They are remarkably similar to a medical description of how the brain works.

The annual meeting of the Society for Neuroscience attracts over thirty thousand registrants. It is not for lack of effort that we understand so little of how the brain functions. The problem is one of the size, complexity, and individuality of the human brain. Size: the human brain has approximately one hundred billion nerve cells, each connecting to one thousand others. Complexity: there are one hundred different types of nerve cells, each with its own detailed properties. Individuality: all humans are similar, but the operation of each brain is critically dependent on its individual details. Your particular pattern of connections between nerve cells contains your personality, your language skills, your knowledge of family, your college education, and your golf swing.
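The size figures quoted above multiply out to a striking total, a quick back-of-envelope check:

```python
# Scale of the human brain, from the figures in the text.
neurons = 100_000_000_000          # ~10^11 nerve cells
connections_per_neuron = 1_000     # each connecting to ~10^3 others
synapses = neurons * connections_per_neuron
print(f"~{synapses:.0e} connections")   # ~1e+14
```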

## 'An Artificially Created Universe': The Electronic Computer Project at IAS

### By George Dyson

 In this 1953 diagnostic photograph from the maintenance logs of the IAS Electronic Computer Project (ECP), a 32-by-32 array of charged spots (serving as working memory, not display) is visible on the face of a Williams cathode-ray memory tube. Starting in late 1945, John von Neumann, Professor in the School of Mathematics, and a group of engineers worked at the Institute to design, build, and program an electronic digital computer.

“I am thinking about something much more important than bombs. I am thinking about computers.” (John von Neumann, 1946)

There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.

In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian-American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
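The two memory figures in this paragraph, five kilobytes and a 32-by-32-by-40-bit nucleus, agree with each other, as a quick check shows (the machine used forty Williams tubes, each storing a 32-by-32 grid of charged spots):

```python
# The ECP's working memory: 40 tubes x 32 x 32 one-bit spots.
bits = 32 * 32 * 40
bytes_ = bits // 8
print(bits, "bits =", bytes_ // 1024, "kilobytes")   # 40960 bits = 5 kilobytes
```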

Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.

## Identifying Novel Genes Associated with Autism

### By Chang S. Chan, Suzanne Christen, and Asad Naqvi

 This plot of genetic data from an individual with autism shows a deletion in the gene NCAM2, one of four genes that researchers in the Simons Center for Systems Biology found to be associated with autism.

Autism is a common childhood neurodevelopmental disorder affecting one in 180 children. It is characterized by impaired social interaction and communication, and by restricted interests and repetitive behavior. Autism is a complex disease exhibiting strong genetic liability with a twenty-five-fold increased risk for individuals having affected first-degree relatives. Moreover, the concordance for developing autism is over 90 percent in identical twins, but only 5–10 percent for fraternal twins. Recent advances in genetics show that autism is associated with many diverse genes, with each gene accounting for only a few percent of cases, as well as complicated multigenic effects.

Researchers at the Simons Center for Systems Biology have been studying autism for the past two years. We have identified novel genes associated with autism. Our approach is to use single nucleotide polymorphism (SNP) genotyping chips that measure differences between individuals and can uncover candidate genes or regulatory elements (which control gene activity) associated with the disease.

Most individuals differ very little from one another across the human genome. SNPs are the largest class of DNA sequence variation among individuals. A SNP occurs when one base out of the four bases used in DNA is exchanged for another base at the same locus, such that the minor allele frequency is at least 1 percent in a given population. SNPs are found at the rate of roughly one out of every 1,000 base pairs of the human genome. These SNPs provide the best chance of detecting genetic variation, both normal and otherwise, between people.
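The quoted rate of one SNP per roughly 1,000 base pairs implies a large number of variable sites genome-wide. The sketch below assumes the standard figure of about three billion base pairs for the human genome, which is not stated in the text.

```python
# Rough implication of the quoted SNP rate (one per ~1,000 base pairs),
# assuming a ~3 billion base-pair human genome (standard figure, assumed here).
genome_bp = 3_000_000_000
snp_rate = 1 / 1_000
expected_snps = int(genome_bp * snp_rate)
print(f"~{expected_snps:,} SNP sites")   # ~3,000,000
```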

## DNA, History, and Archaeology

### By Nicola Di Cosmo

 A lecture on archaeological perspectives on ethnicity in ancient China, delivered by Lothar von Falkenhausen, Professor at the University of California, Los Angeles, was part of the workshop “DNA, History, and Archaeology” organized by Nicola Di Cosmo in October 2010.

Historians today can hardly answer the question: when does history begin? Traditional boundaries between history, protohistory, and prehistory have been blurred if not completely erased by the rise of concepts such as “Big History” and “macrohistory.” If even the Big Bang is history, connected to human evolution and social development through a chain of geological, biological, and ecological events, then the realm of history, while remaining firmly anthropocentric, becomes all-embracing.

An expanding historical horizon that, from antiquity to recent times, attempts to include places far beyond the sights of literate civilizations and traditional caesuras between a history illuminated by written sources and a prehistory of stone, copper, and pots has forced history and prehistory to coexist in a rather inelegant embrace. Such a blurring of the boundaries between those human pasts that left us more or less vivid and abundant written records, and other pasts, which, on the contrary, are knowable only through the spadework and fieldwork of enterprising archaeologists, ethnographers, and anthropologists, has also changed (or is at least threatening to change) the nature of the work of professional historians.

Technological advances, scientific instrumentation, statistical analyses, and laboratory tests are today producing historical knowledge that aims to find new ways of answering questions that have long exercised specialists of the ancient world. Should historians, then, try to make these pieces of highly technical evidence relevant to their own work? Or should they ignore them? The dilemma is not entirely new. Archaeology, material culture, and historical linguistics have already forced historians to come out of the “comfort zone” of written sources. Archaeologists have by and large wrested themselves free from the fastnesses of the classical texts, and much of their work cannot be regarded as ancillary to the authority of the written word. Satellite photography, remote sensing, archaeo-GIS, C14 dating, dendrochronology (tree-ring dating), and chemical analysis have become standard tools of the archaeologist that coexist with the trowel and the shovel. But the palaeosciences and ancient DNA studies pose challenges of a different order, directly correlated to the greater distance that exists between scientific and historical research in terms of training and knowledge base.