An editorial about bees and bee-pollinated crops
Too-Busy Bees
By MARCELO AIZEN and LAWRENCE HARDER
IN the past five years, as the phenomenon known as colony-collapse disorder has spread across the United States and Europe, causing the disappearance of whole colonies of domesticated honeybees, many people have come to fear that our food supply is in peril. The news on Wednesday that a Department of Agriculture survey found that American honeybees had died in great numbers this winter can only add to such fears.
The truth, fortunately, is not nearly so dire. But it is more complicated.
There is good news: While some areas are seeing a shortage of bees, globally the number of domesticated honeybee colonies is increasing. The bad news is that this increase can’t keep up with our growing appetite for luxury foods that depend heavily on bee pollination. The domesticated honeybee isn’t the only pollinator that agriculture relies on — wild bees also play a significant role, and we seem intent on destroying their habitats.
To understand the problem, we need to understand the extent of the honeybee’s role in agriculture. Humans certainly benefit from the way bees — and to a lesser extent, other pollinators like flies, beetles and butterflies — help plants produce fruits and seeds. Agriculture, however, is not as dependent on pollinators as one might think. It’s true that some crops like raspberries, cashews, cranberries and mangoes cannot reproduce without pollinators. But crops like sugar cane and potatoes, grown for their stems or tubers, can be propagated without pollination. And the crops that provide our staple carbohydrates — wheat, rice and corn — are either wind-pollinated or self-pollinated. These don’t need bees at all.
Overall, about one-third of our worldwide agricultural production depends to some extent on bee pollination, but less than 10 percent of the 100 most productive crop species depend entirely on it. If pollinators were to vanish, it would reduce total food production by only about 6 percent.
This wouldn’t mean the end of human existence, but if we want to continue eating foods like apples and avocados, we need to understand that bees and other pollinators can’t keep up with the current growth in production of these foods.
The reason is that fruit and seed crops that are most dependent on pollinators yield relatively little food per acre, and therefore take up an inordinate, and increasing, amount of land. The fraction of agriculture dependent on pollination has increased by 300 percent in half a century.
The paradox is that our demand for these foods endangers the wild bees that help make their cultivation possible. The expansion of farmland destroys wild bees’ nesting sites and also wipes out the wildflowers that the bees depend on when food crops aren’t in blossom. Researchers in Britain and the Netherlands have found that the diversity of wild bee species in most regions in those countries has declined since 1980. This decrease was mostly due to the loss of bees that require very particular habitats — bees that couldn’t adapt after losing their homes and food sources to cultivation. Similarly, between 1940 and 1960, as land increasingly came under cultivation in the American Midwest, several bumblebee species disappeared from the area. It is difficult to count and keep track of wild bee populations globally, but their numbers are probably declining overall as a result of such human activity.
Even if the number of wild pollinators remained stable, it would not be sufficient to meet the increasing demand for agricultural pollination. Could domesticated bees take up the slack? By looking at data from the Food and Agriculture Organization of the United Nations, we found that the number of managed honeybee hives increased by 45 percent during the past five decades.
Unfortunately, this increase cannot counteract the growing demand for pollination or the shortage of wild pollinators. Domesticated bees mainly produce honey; any contribution they make to crop pollination is usually a secondary benefit. In most parts of the world, they provide pollination only locally and not necessarily where it is needed most.
Thus a vicious cycle: Fewer pollinating bees reduce yield per acre — and lower yield requires cultivation of more land to produce the same amount of food.
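The feedback the authors describe can be sketched as a toy model. The loop below is purely illustrative; every starting value and rate in it is invented for the example and is not a figure from the editorial.

```python
# Toy model of the vicious cycle: fewer wild bees lower per-acre yield,
# so more land is cultivated, which destroys more bee habitat.
# All numbers are invented for illustration; none come from the editorial.

food_needed = 100.0     # fixed demand for a pollinator-dependent crop
wild_bees = 1.0         # relative abundance of wild pollinators
yield_per_acre = 1.0    # relative yield, which depends on pollination

for year in range(1, 6):
    acres = food_needed / yield_per_acre        # land needed to meet demand
    wild_bees *= max(0.0, 1.0 - 0.0005 * acres) # expansion erodes habitat
    yield_per_acre = 0.5 + 0.5 * wild_bees      # fewer bees, lower yield
    print(f"year {year}: {acres:6.1f} acres cultivated, "
          f"wild-bee index {wild_bees:.3f}")
```

Each pass through the loop needs more land to produce the same amount of food, which is exactly the cycle the editorial warns about.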
Eventually, a growing shortage of pollinators will limit what foods farmers can produce. If we want to continue to enjoy almonds, apples and avocados, we have to cultivate fewer of them, more sustainably, and protect the wild bees that help make their production possible.
An article about morals, sharing, and the world
Moral Lessons, Down Aisle 9
By JOHN TIERNEY
Like Diogenes with his lamp, researchers have traversed the world looking for an honest man — or, more precisely, for people who act in the same fair, unselfish way toward everyone. If you wish to learn to follow this golden rule, which of these strategies is best?
a) Move to a village in the Amazon and go foraging with the indigenous Tsimane people.
b) Move to a Dolgan and Nganasan settlement on the Siberian tundra, herd reindeer and join the Russian Orthodox Church.
c) Visit a Himalayan monastery and follow instructions to “gaze within” and “follow your bliss.”
d) Join a camp of nomadic Hadza hunter-gatherers sharing giraffe meat and honey on the Serengeti savanna.
e) Join a throng of Wal-Mart shoppers buying groceries on the Missouri prairie.
Well, the Siberian church might impart some moral lessons, but your best bet is to go shopping, at least by my reading of the experiments reported in the current issue of Science. It doesn’t have to be Wal-Mart, by the way — any kind of grocery store seems to have an effect. Wal-Mart just happens to be popular with the exceptionally fair-minded residents of Hamilton, a small rural town in northwestern Missouri. They scored higher in a test of fairness toward strangers than did any of the less-modern communities in Fiji, Papua New Guinea, Africa, Asia and Latin America.
The study doesn’t prove the moral superiority of Missourians, because traditional societies emphasize different virtues, like providing food and comfort to relatives. But the results do help explain a central mystery of civilization: How did small family clans evolve into large cities of cooperative strangers? Why are New Yorkers sometimes nice even to tourists?
Being nice made evolutionary sense when we lived in small bands surrounded by relatives, because helping them helped our genes survive. And we had a direct incentive to be fair to people who would later reciprocate kindness or punish selfishness. But why even consider returning a stranger’s wallet you find in a taxicab? Why leave a tip in a restaurant you’ll never visit again?
Some evolutionary psychologists have suggested that we have an innate sense of fairness left over from our days of living in small clans. According to this theory, our inherited instincts cause us to be nice to strangers even when we’re hurting our interests, just as our ancient taste for fat and sugar causes us now to eat more calories than are good for us.
But there’s more to it than just inherited instinct, says Joseph Henrich of the University of British Columbia, who led the study’s team of anthropologists, psychologists and economists. They found wide cultural variations by observing more than 2,000 people in 15 small communities participate in a two-player game, called Dictator, with a prize equal to the local pay for a day’s work.
One player, the dictator, was given the authority to keep the entire prize or share part of it with the other, unseen player, whose identity remained secret. Along with this power came the assurance that the dictator’s identity would also remain secret, so that no one except the researcher would ever know how selfish the dictator had been.
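For readers who want the game's structure concrete, here is a minimal sketch of one anonymous Dictator round. The function name and the example offers are mine, not part of the study; the only rule taken from the article is that the dictator unilaterally splits a fixed prize with an unseen partner.

```python
def dictator_round(prize, share_fraction):
    """One anonymous Dictator round: the dictator unilaterally decides
    what part of the prize to give an unseen partner and keeps the rest."""
    offer = prize * share_fraction
    return prize - offer, offer  # (dictator's payoff, partner's payoff)

# A day's pay as the prize, in arbitrary units. The 0.45 mirrors the
# average Missourian share reported in the article; 0.25 mirrors the
# roughly one-quarter shared by the Hadza and Tsimane players.
print(dictator_round(100, 0.45))  # (55.0, 45.0)
print(dictator_round(100, 0.25))  # (75.0, 25.0)
```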
The most lucrative option, of course, was to keep the whole prize and stiff the anonymous partner. But the Missourians on average shared more than 45 percent of the prize, and some other societies were nearly as generous, like the Ghanaians living in the city of Accra and the Sanquianga fishermen on the coast of Colombia.
But most of the hunter-gatherers, foragers and subsistence farmers were less inclined to share. The Hadza nomads in the Serengeti and the Tsimane Indians in the Amazon gave away only a quarter of the prize. They also reacted differently when given a chance, in variations of the game, to punish another player for hogging the prize.
Selfishness offended the Missourians so much that they would punish the player even though it cost them money. But the members of traditional societies showed little inclination to punish others at their own expense. “There are lots of norms in these small-scale societies for how to treat one another and share food,” says Dr. Henrich. “But these rules don’t apply in unusual situations when you don’t know anything about the kinship or status of the other person. You don’t feel the same sense of responsibility, and you act more out of self-interest.”
The researchers found that people in small communities like the Hadza camp (population about 50) were less willing to inflict punishment than people in larger communities like Hamilton (about 1,800). That makes practical sense: the more strangers there are, the greater the need to keep them from exploiting one another. But what enabled those larger societies to grow in the first place?
Dr. Henrich and his colleagues identified two distinguishing factors.
People belonging to a modern “world religion,” like the Islamic faith of the Orma cattle herders in Kenya or the Christian faith of the Dolgan reindeer herders in Siberia, tended to share more of their prize than did adherents of local religions. As larger communities became possible after the invention of agriculture, the researchers write in Science, “intersocietal competition may have favored those religious systems that galvanize pro-social behavior in broader communities, perhaps using both supernatural incentives (for example, hell) and recurrent rituals that intensify group solidarity.”
But a second factor seemed even more important. In explaining attitudes toward fairness, Dr. Henrich and his colleagues found that the strongest predictor was the community’s level of “market integration,” which was measured by the percentage of the diet that was purchased. The people who got all or most of their food by hunting, fishing, foraging or growing it themselves were less inclined to share a prize equally.
Grocery shopping may seem an unlikely form of moral education, but the researchers argue in Science that the development of “market norms” promotes general levels of “trust, fairness and cooperation” with strangers. (You can debate that point at nytimes.com/tierneylab.)
“Markets don’t work very efficiently if everyone acts selfishly and believes everyone else will do the same,” Dr. Henrich says. “You end up with high transaction costs because you have to have all these protections to cover every loophole. But if you develop norms to be fair and trusting with people beyond your social sphere, that provides enormous economic advantages and allows a society to grow.”
One such dynamic society was ancient Greece, whose ethical norms spread widely as it grew, and perhaps it was no coincidence that those ethics were developed by philosophers debating alongside merchants at the central marketplace called the agora. In retrospect, maybe Diogenes and his lamp didn’t really have all that far to go.
"Want to Use My Suit? Then Throw Me Something, Mister!"
Want to Use My Suit? Then Throw Me Something
By CAMPBELL ROBERTSON
NEW ORLEANS — Just after dusk on Friday night, Tyrone Yancy was strutting through one of the more uncertain parts of town in a $6,000 custom-made suit.
He was concerned about being robbed, but not by the neighborhood teenagers who trotted out in the street to join him. The real potential for theft, as Mr. Yancy sees it, came from the strangers darting around him and his well-appointed colleagues in a hectic orbit: photographers.
Mr. Yancy, 44, is a nursing assistant by profession. His calling, however, is as one of the Mardi Gras Indians — a member of the Yellow Pocahontas tribe, to be exact — the largely working-class black New Orleanians who create and wear ornate, enormous feathered costumes and come out three times a year to show them off.
He is also one of a number of Indians who have become fed up with seeing their photographs on calendars, posters and expensive prints, without getting anything in return.
Knowing that there are few legal protections for a person who is photographed in public — particularly one who stops and poses every few feet — some Mardi Gras Indians have begun filing for copyright protection for their suits, which account for thousands of dollars in glass beads, rhinestones, feathers and velvet, and hundreds of hours of late-night sewing.
Anyone could still take their pictures, but the Indians, many of whom live at the economic margins, would have some recourse if they saw the pictures being sold, or used in advertising. (News photographs, like the ones illustrating this article, are not at issue.)
“It’s not the old way of doing things, but the old way of doing things was conducive to exploitation,” said Ashlye M. Keaton, a lawyer who represents Indians in her private practice and also works with them through two pro bono legal programs, Sweet Home New Orleans legal services, and the Entertainment Law Legal Assistance Project.
The legal grounding of the strategy is debatable, the ability to enforce it even more so. But what may be most tricky of all is pushing the Indians themselves to start thinking about the legal and financial dimensions of something they have always done out of tradition.
Mardi Gras Indians have been around for more than a century — more than two, some say — and are generally thought to have originated as a way to pay homage to the American Indians who harbored runaway slaves and started families with them.
The Indians come out and parade in full dress on Mardi Gras; on St. Joseph’s Night, March 19; and on a Sunday close to St. Joseph’s — a tradition that arose out of the affinity between blacks and Sicilians in the city’s working-class precincts.
The 30 or so Indian tribes are representatives of their neighborhoods, and starting from home turf they venture out in their shimmering suits to meet other tribes in procession in the streets. Time was, these run-ins would often end with somebody in the hospital, or worse.
But over the past few decades, encouraged by the legendary Chief of Chiefs, Tootie Montana, the showdowns became primarily about the suits, and whose suit could out-prettify all the others.
Indian suits, which in the old days were occasionally burned at the end of a season, have become stunningly elaborate and stunningly expensive, costing upwards of $10,000. For many Indians, it is a matter of principle that they make a new suit from scratch each year.
The copyright idea has been floating around for a while — several of Mr. Montana’s suits were registered years ago — but Ms. Keaton began pursuing it more vigorously in 2006, when she was approached by John Ellison, a 52-year-old detailer in an auto body shop and a member of the Wild Tchoupitoulas.
Any photograph that focused on a suit protected by a copyright could arguably be considered a derivative work. The sale of such a picture (or its use in tourism ads, for example) would trade on the merits of the suit rather than of the photograph itself, and if the person selling it did not have permission, he could be sued.
But the idea is not so easy to put into practice. In American copyright law, clothing designs generally cannot be protected because they are more functional than aesthetic. Ms. Keaton argues that the suits, which can weigh well over 100 pounds, should be considered works of sculpture, not outfits.
The Sweet Home organization held a workshop for Indians on the topic last fall, and is pressing them to fill out copyright forms for this year’s suits. But there has not yet been a test case for the legal theory and it is unclear how one would fare.
“The Mardi Gras Indian costumes are pretty wild and not functional in the ordinary sense of the word, so that suggests that they might be copyrightable,” Kal Raustiala, a professor at the law school of the University of California, Los Angeles, wrote in an e-mail message.
“That said,” he added, “lots of runway fashion is also way out there and not likely to fit anyone’s ordinary idea of usefulness, yet it doesn’t receive copyright protection.”
Mr. Ellison filled out his copyright registration form on the spot, but later lost it, a testament to the difficulties of changing a culture.
Christopher Porché West, who has been photographing Mardi Gras Indians since 1979, said he had heard these kinds of complaints for years. They are counterproductive, he said, given the relatively small amount of money he and other photographers earn from Indian portraits.
“What they really need to do is self-exploit,” he said. If they want to make money from their culture, he said, “they should find a way to commodify it and bring that to the market.”
But words like “commodify” are foreign and even a little distasteful for many in this city, rather like finding tofu sausage in a gumbo. Indians do make a few hundred dollars here and there showing up at parties and concerts, and a few have tried, with disappointing results, to sell last year’s suits on eBay.
“Indian culture was never, ever meant to make any money,” said Howard Miller, Big Chief of the Creole Wild West, the city’s oldest tribe, and president of the Mardi Gras Indian Council. But neither, he said, should the culture be exploited by others.
“We have a beef,” he said, “with anybody who takes us for granted.”
Finding a possibly new (old) human species
Bone May Reveal a New Human Group
By NICHOLAS WADE
A previously unknown kind of human group vanished from the world so completely that it has left behind the merest wisp of evidence that it ever existed — a single bone from the little finger of a child, buried in a cave in the Altai mountains of southern Siberia.
Researchers extracted DNA from the bone and reported Wednesday that it differed conspicuously from that of both modern humans and Neanderthals, the archaic human species that inhabited Europe until the arrival of modern humans on the continent some 44,000 years ago.
The child who carried the DNA lineage was probably 5 to 7 years old, but it is not yet known if it was a boy or a girl. The finger bone was excavated by Russian archaeologists in 2008 from a place known as the Denisova cave.
The researchers, led by Johannes Krause and Svante Paabo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, are careful not to call the Denisova child a new human species, though it may prove to be so, because the evidence is preliminary.
But they say the genetic material extracted from the bone, an element called mitochondrial DNA, belonged to a distinct human lineage that migrated out of Africa at a different time from the two known archaic human species. Homo erectus, found in East Asia, left Africa two million years ago, and the ancestor of Neanderthals emigrated some 500,000 years ago. The number of differences found in the child’s DNA indicates that its ancestors left Africa about one million years ago, the researchers say. Their report is published online in the journal Nature.
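The million-year figure comes from molecular-clock reasoning: the more differences two mitochondrial sequences have accumulated, the longer since their lineages split. The article gives neither the count of differences nor the substitution rate, so the numbers below are placeholders chosen only to show the form of the calculation.

```python
# Molecular-clock arithmetic of the kind behind the ~1-million-year estimate.
# These inputs are placeholders, not values from the Nature paper.

differences = 385               # hypothetical mtDNA differences between lineages
mtdna_length = 16_500           # approximate length of human mtDNA, in base pairs
rate_per_site_per_year = 1.2e-8 # assumed substitution rate (illustrative)

# Substitutions accumulate along both diverging branches, hence the factor 2.
years = (differences / mtdna_length) / (2 * rate_per_site_per_year)
print(f"estimated divergence: about {years:,.0f} years ago")
# With these placeholder inputs the estimate lands near one million years.
```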
Dr. Paabo, a pioneer in decoding ancient human DNA, said at a news conference that before asserting that the Denisova child was a new species, he needed to rule out the possibility that it belonged to a population formed by interbreeding between the new lineage and a known species. He said he was analyzing the rest of the child’s DNA, from the main or nuclear genome, to test this possibility.
“Back at the time this lineage came out of Africa, it had to have been a distinct group, perhaps a distinct species,” he said. “But whether or not this individual was a distinct species, we have to wait for the nuclear DNA.”
The finger bone was found in a layer laid down on the cave floor between 48,000 and 30,000 years ago, according to radiocarbon dating. At that time, toward the end of the Pleistocene Ice Age, which ended 10,000 years ago, the climate was probably much colder. The people of the new lineage presumably wore clothes, Dr. Krause said, because chimpanzees and gorillas cannot withstand much cold, suggesting that fur alone is inadequate protection.
The artifacts found in the cave in the same layer as the finger bone include ornaments and a bracelet that are typical of modern human sites from the Upper Paleolithic age in Europe. These are puzzling artifacts to be found with a nonmodern human species. But bones can move up and down in archaeological sites, and it is hard to know if the finger bone is truly associated with these artifacts, Dr. Krause said, even though there is little sign of mixing in the cave’s layers.
The valley beneath the Denisova cave 30,000 years ago would have been mostly a steppe, or treeless grassland, according to pollen analysis, and it was roamed by ice-age species like the woolly mammoth and woolly rhino, Dr. Krause said.
The region was inhabited by both Neanderthals and modern humans at that time. Counting the new human lineage, three human species may have lived together in proximity. “So the picture of the humans around in the late Pleistocene gets a lot more complex and a lot more interesting,” Dr. Paabo said.
The standard view has long been that there were three human migrations out of Africa — those of Homo erectus; of the ancestor of Neanderthals; and finally, some 50,000 years ago, of modern humans. But in 2004, archaeologists reported that they had found the bones of miniature humans who lived on the Indonesian island of Flores until 13,000 years ago, posing a serious problem for this view. The new lineage is the second such challenge, and it suggests that human migrations out of Africa, though far from continuous, were more frequent than supposed.
“We are learning more and more what a luxuriant evolutionary tree humans have had,” said Ian Tattersall, a paleoanthropologist at the American Museum of Natural History in New York. Throughout evolutionary time, the tree has kept sprouting new branches, all but one of which die off before the process repeats.
As recently as 30,000 years ago, it now appears, there were five human species in the world: Homo erectus, the little Floresians, Neanderthals, modern humans and the new lineage from the Denisova cave. This is similar to the situation two million years ago, when four hominid species are known to have lived in the Turkana Basin of Kenya, Dr. Tattersall said.
“We think it’s normal to be alone in the world as we are today,” Dr. Tattersall said, and to see human evolution as a long trend leading to Homo sapiens. In fact, the tree has kept generating new branches that get cut off, presumably by the sole survivor. “The fossil record is very eloquent about this, and it’s telling us we are an insuperable competitor,” Dr. Tattersall said. Modern humans’ edge over other species probably emerged from their ability to process information: “We can invent alternatives in our heads instead of accepting nature as it is,” Dr. Tattersall said.
If the nuclear DNA of the Denisova child should differ as much as its mitochondrial DNA does from that of Neanderthals and modern humans, the case for declaring it a new species would be strengthened. But it would be unusual, if not unprecedented, for a new species to be recognized on the basis of DNA alone.
In new excavations starting this summer, archaeologists will look for remains more diagnostic than the finger bone. Researchers will also begin re-examining the fossil collections in museums to see if any wrongly assigned bones might belong instead to the new lineage, Dr. Krause said.
"The hidden histories that shape the way we live now" (I'm not going to hide this one behind a cut)