By PAUL KRUGMAN
NY Times Published: March 21, 2013
A couple of years ago, the journalist Nicholas Shaxson published a fascinating, chilling book titled “Treasure Islands,” which explained how international tax havens — which are also, as the author pointed out, “secrecy jurisdictions” where many rules don’t apply — undermine economies around the world. Not only do they bleed revenues from cash-strapped governments and enable corruption; they distort the flow of capital, helping to feed ever-bigger financial crises.
One question Mr. Shaxson didn’t get into much, however, is what happens when a secrecy jurisdiction itself goes bust. That’s the story of Cyprus right now. And whatever the outcome for Cyprus itself (hint: it’s not likely to be happy), the Cyprus mess shows just how unreformed the world banking system remains, almost five years after the global financial crisis began.
So, about Cyprus: You might wonder why anyone cares about a tiny nation with an economy not much bigger than that of metropolitan Scranton, Pa. Cyprus is, however, a member of the euro zone, so events there could trigger contagion (for example, bank runs) in larger nations. And there’s something else: While the Cypriot economy may be tiny, it’s a surprisingly large financial player, with a banking sector four or five times as big as you might expect given the size of its economy.
Why are Cypriot banks so big? Because the country is a tax haven where corporations and wealthy foreigners stash their money. Officially, 37 percent of the deposits in Cypriot banks come from nonresidents; the true number, once you take into account wealthy expatriates and people who are only nominally resident in Cyprus, is surely much higher. Basically, Cyprus is a place where people, especially but not only Russians, hide their wealth from both the taxmen and the regulators. Whatever gloss you put on it, it’s basically about money-laundering.
And the truth is that much of the wealth never moved at all; it just became invisible. On paper, for example, Cyprus became a huge investor in Russia — much bigger than Germany, whose economy is hundreds of times larger. In reality, of course, this was just “roundtripping” by Russians using the island as a tax shelter.
Unfortunately for the Cypriots, enough real money came in to finance some seriously bad investments, as their banks bought Greek debt and lent into a vast real estate bubble. Sooner or later, things were bound to go wrong. And now they have.
Now what? There are some strong similarities between Cyprus now and Iceland (a similar-size economy) a few years back. Like Cyprus now, Iceland had a huge banking sector, swollen by foreign deposits, that was simply too big to bail out. Iceland’s response was essentially to let its banks go bust, wiping out those foreign investors, while protecting domestic depositors — and the results weren’t too bad. Indeed, Iceland, with a far lower unemployment rate than most of Europe, has weathered the crisis surprisingly well.
Unfortunately, Cyprus’s response to its crisis has been a hopeless muddle. In part, this reflects the fact that it no longer has its own currency, which makes it dependent on decision makers in Brussels and Berlin — decision makers who haven’t been willing to let banks openly fail.
But it also reflects Cyprus’s own reluctance to accept the end of its money-laundering business; its leaders are still trying to limit losses to foreign depositors in the vain hope that business as usual can resume, and they were so anxious to protect the big money that they tried to limit foreigners’ losses by expropriating small domestic depositors. As it turned out, however, ordinary Cypriots were outraged, the plan was rejected, and, at this point, nobody knows what will happen.
My guess is that, in the end, Cyprus will adopt something like the Icelandic solution, but unless it ends up being forced off the euro in the next few days — a real possibility — it may first waste a lot of time and money on half-measures, trying to avoid facing up to reality while running up huge debts to wealthier nations. We’ll see.
But step back for a minute and consider the incredible fact that tax havens like Cyprus, the Cayman Islands, and many more are still operating pretty much the same way that they did before the global financial crisis. Everyone has seen the damage that runaway bankers can inflict, yet much of the world’s financial business is still routed through jurisdictions that let bankers sidestep even the mild regulations we’ve put in place. Everyone is crying about budget deficits, yet corporations and the wealthy are still freely using tax havens to avoid paying taxes like the little people.
So don’t cry for Cyprus; cry for all of us, living in a world whose leaders seem determined not to learn from disaster.
March 23rd, 2013
From “3 part composition, variation 2,” 2012. Watercolor on Seisochan Japanese handmade paper. Through March 23, 2013. Thanks to RS.
March 22nd, 2013
By Ethan Watters
Pacific Standard
February 25, 2013
IN THE SUMMER of 1995, a young graduate student in anthropology at UCLA named Joe Henrich traveled to Peru to carry out some fieldwork among the Machiguenga, an indigenous people who live north of Machu Picchu in the Amazon basin. The Machiguenga had traditionally been horticulturalists who lived in single-family, thatch-roofed houses in small hamlets composed of clusters of extended families. For sustenance, they relied on local game and produce from small-scale farming. They shared with their kin but rarely traded with outside groups.
While the setting was fairly typical for an anthropologist, Henrich’s research was not. Rather than practice traditional ethnography, he decided to run a behavioral experiment that had been developed by economists. Henrich used a “game”—along the lines of the famous prisoner’s dilemma—to see whether isolated cultures shared with the West the same basic instinct for fairness. In doing so, Henrich expected to confirm one of the foundational assumptions underlying such experiments, and indeed underpinning the entire fields of economics and psychology: that humans all share the same cognitive machinery—the same evolved rational and psychological hardwiring.
The test that Henrich introduced to the Machiguenga was called the ultimatum game. The rules are simple: in each game there are two players who remain anonymous to each other. The first player is given an amount of money, say $100, and told that he has to offer some of the cash, in an amount of his choosing, to the other subject. The second player can accept or refuse the split. But there’s a hitch: players know that if the recipient refuses the offer, both leave empty-handed. North Americans, who are the most common subjects for such experiments, usually offer a 50-50 split when on the giving end. When on the receiving end, they show an eagerness to punish the other player for uneven splits at their own expense. In short, Americans show the tendency to be equitable with strangers—and to punish those who are not.
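For readers who think in code, the game’s payoff logic can be captured in a few lines. The Python sketch below is purely illustrative: the stake, the 30 percent rejection threshold, and the two responder strategies are hypothetical stand-ins, not parameters or data from Henrich’s experiments.

```python
# Illustrative sketch of a single ultimatum-game round. All numbers and
# strategies here are hypothetical, not Henrich's protocol or results.

def ultimatum_round(stake, offer, responder_accepts):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if not 0 <= offer <= stake:
        raise ValueError("offer must lie between 0 and the stake")
    if responder_accepts(offer, stake):
        return stake - offer, offer   # the proposed split stands
    return 0, 0                       # a rejection leaves both empty-handed

# A stylized "fairness-enforcing" responder (rejects offers below an
# assumed 30 percent of the stake) versus a pure income maximizer
# (takes any free money, as the Machiguenga players tended to).
def reject_low(offer, stake):
    return offer >= 0.3 * stake

def accept_any(offer, stake):
    return offer > 0

print(ultimatum_round(100, 15, reject_low))   # -> (0, 0): low offer punished
print(ultimatum_round(100, 15, accept_any))   # -> (85, 15): free money taken
```

The design choice worth noticing is that rejection destroys both payoffs: a responder who turns down a lowball offer is paying real money to punish the proposer, which is exactly the behavior that varies so much across cultures.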
Among the Machiguenga, word quickly spread of the young, square-jawed visitor from America giving away money. The stakes Henrich used in the game with the Machiguenga were not insubstantial—roughly equivalent to the few days’ wages they sometimes earned from episodic work with logging or oil companies. So Henrich had no problem finding volunteers. What he had great difficulty with, however, was explaining the rules, as the game struck the Machiguenga as deeply odd.
When he began to run the game it became immediately clear that Machiguengan behavior was dramatically different from that of the average North American. To begin with, the offers from the first player were much lower. In addition, when on the receiving end of the game, the Machiguenga rarely refused even the lowest possible amount. “It just seemed ridiculous to the Machiguenga that you would reject an offer of free money,” says Henrich. “They just didn’t understand why anyone would sacrifice money to punish someone who had the good luck of getting to play the other role in the game.”
The potential implications of the unexpected results were quickly apparent to Henrich. He knew that a vast amount of scholarly literature in the social sciences—particularly in economics and psychology—relied on the ultimatum game and similar experiments. At the heart of most of that research was the implicit assumption that the results revealed evolved psychological traits common to all humans, never mind that the test subjects were nearly always from the industrialized West. Henrich realized that if the Machiguenga results stood up, and if similar differences could be measured across other populations, this assumption of universality would have to be challenged.
Henrich had thought he would be adding a small branch to an established tree of knowledge. It turned out he was sawing at the very trunk. He began to wonder: What other certainties about “human nature” in social science research would need to be reconsidered when tested across diverse populations?
Henrich soon landed a grant from the MacArthur Foundation to take his fairness games on the road. With the help of a dozen other colleagues he led a study of 14 other small-scale societies, in locales from Tanzania to Indonesia. Differences abounded in the behavior of both players in the ultimatum game. In no society did he find people who were purely selfish (that is, who always offered the lowest amount, and never refused a split), but average offers from place to place varied widely and, in some societies—ones where gift-giving is heavily used to curry favor or gain allegiance—the first player would often make overly generous offers in excess of 60 percent, and the second player would often reject them, behaviors almost never observed among Americans.
The research established Henrich as an up-and-coming scholar. In 2004, he was given the U.S. Presidential Early Career Award for young scientists at the White House. But his work also made him a controversial figure. When he presented his research to the anthropology department at the University of British Columbia during a job interview a year later, he recalls a hostile reception. Anthropology is the social science most interested in cultural differences, but the young scholar’s methods of using games and statistics to test and compare cultures with the West seemed heavy-handed and invasive to some. “Professors from the anthropology department suggested it was a bad thing that I was doing,” Henrich remembers. “The word ‘unethical’ came up.”
So instead of toeing the line, he switched teams. A few well-placed people at the University of British Columbia saw great promise in Henrich’s work and created a position for him, split between the economics department and the psychology department. It was in the psychology department that he found two kindred spirits in Steven Heine and Ara Norenzayan. Together the three set about writing a paper that they hoped would fundamentally challenge the way social scientists thought about human behavior, cognition, and culture.
A MODERN LIBERAL ARTS education gives lots of lip service to the idea of cultural diversity. It’s generally agreed that all of us see the world in ways that are sometimes socially and culturally constructed, that pluralism is good, and that ethnocentrism is bad. But beyond that the ideas get muddy. That we should welcome and celebrate people of all backgrounds seems obvious, but the implied corollary—that people from different ethno-cultural origins have particular attributes that add spice to the body politic—becomes more problematic. To avoid stereotyping, it is rarely stated bluntly just exactly what those culturally derived qualities might be. Challenge liberal arts graduates on their appreciation of cultural diversity and you’ll often find them retreating to the anodyne notion that under the skin everyone is really alike.
If you take a broad look at the social science curriculum of the last few decades, it becomes a little more clear why modern graduates are so unmoored. The last generation or two of undergraduates have largely been taught by a cohort of social scientists busily doing penance for the racism and Eurocentrism of their predecessors, albeit in different ways. Many anthropologists took to the navel gazing of postmodernism and swore off attempts at rationality and science, which were disparaged as weapons of cultural imperialism.
Economists and psychologists, for their part, did an end run around the issue with the convenient assumption that their job was to study the human mind stripped of culture. The human brain is genetically comparable around the globe, it was agreed, so human hardwiring for much behavior, perception, and cognition should be similarly universal. No need, in that case, to look beyond the convenient population of undergraduates for test subjects. A 2008 survey of the top six psychology journals dramatically shows how common that assumption was: more than 96 percent of the subjects tested in psychological studies from 2003 to 2007 were Westerners—with nearly 70 percent from the United States alone. Put another way: 96 percent of human subjects in these studies came from countries that represent only 12 percent of the world’s population.
Henrich’s work with the ultimatum game was an example of a small but growing countertrend in the social sciences, one in which researchers look straight at the question of how deeply culture shapes human cognition. His new colleagues in the psychology department, Heine and Norenzayan, were also part of this trend. Heine focused on the different ways people in Western and Eastern cultures perceived the world, reasoned, and understood themselves in relationship to others. Norenzayan’s research focused on the ways religious belief influenced bonding and behavior. The three began to compile examples of cross-cultural research that, like Henrich’s work with the Machiguenga, challenged long-held assumptions of human psychological universality.
Some of that research went back a generation. It was in the 1960s, for instance, that researchers discovered that aspects of visual perception were different from place to place. One of the classics of the literature, the Müller-Lyer illusion, showed that where you grew up would determine to what degree you would fall prey to the illusion that two lines of equal length, one capped with arrow tips and the other with outward-feathered ends, are different in length.
Researchers found that Americans perceive the line with the ends feathered outward as being longer than the line with the arrow tips. San foragers of the Kalahari, on the other hand, were more likely to see the lines as they are: equal in length. Subjects from more than a dozen cultures were tested, and Americans were at the far end of the distribution—seeing the illusion more dramatically than all others.
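Since the article’s original figure does not survive here, a rough reconstruction is easy to generate. The matplotlib sketch below draws the two stimuli as two shafts of identical length, one capped with inward-pointing arrow tips and one with outward-feathered fins; the dimensions are arbitrary choices, not the calibrated stimuli used in the cross-cultural studies.

```python
# Rough reconstruction of the Muller-Lyer stimuli (dimensions arbitrary).
import matplotlib.pyplot as plt

LENGTH = 4.0  # both shafts are exactly this long
FIN = 0.5     # extent of each fin, horizontally and vertically

def draw_stimulus(ax, y, fin_dx, label):
    """Draw one shaft at height y; fin_dx > 0 angles the fins inward
    (arrow tips), fin_dx < 0 angles them outward (feathered ends)."""
    ax.plot([0, LENGTH], [y, y], color="black")
    for x_end, direction in ((0, 1), (LENGTH, -1)):
        dx = direction * fin_dx
        ax.plot([x_end, x_end + dx], [y, y + FIN], color="black")
        ax.plot([x_end, x_end + dx], [y, y - FIN], color="black")
    ax.text(LENGTH / 2, y + 0.7, label, ha="center")

fig, ax = plt.subplots(figsize=(6, 3))
draw_stimulus(ax, 2.0, +FIN, "arrow tips")       # usually judged shorter
draw_stimulus(ax, 0.0, -FIN, "feathered ends")   # usually judged longer
ax.set_xlim(-1, LENGTH + 1)
ax.set_ylim(-1, 3.2)
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```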
More recently psychologists had challenged the universality of research done in the 1950s by pioneering social psychologist Solomon Asch. Asch had discovered that test subjects were often willing to make incorrect judgments on simple perception tests to conform with group pressure. When the test was performed across 17 societies, however, it turned out that group pressure had a range of influence. Americans were again at the far end of the scale, in this case showing the least tendency to conform to group belief.
As Heine, Norenzayan, and Henrich furthered their search, they began to find research suggesting wide cultural differences almost everywhere they looked: in spatial reasoning, the way we infer the motivations of others, categorization, moral reasoning, the boundaries between the self and others, and other arenas. These differences, they believed, were not genetic. The distinct ways Americans and Machiguengans played the ultimatum game, for instance, didn’t stem from differently evolved brains. Rather, Americans, without fully realizing it, were manifesting a psychological tendency shared with people in other industrialized countries that had been refined and handed down through thousands of generations in ever more complex market economies. When people are constantly doing business with strangers, it helps when they have the desire to go out of their way (with a lawsuit, a call to the Better Business Bureau, or a bad Yelp review) to punish those who cheat them. Because Machiguengan culture had a different history, their gut feeling about what was fair was distinctly their own. In the small-scale societies with a strong culture of gift-giving, yet another conception of fairness prevailed. There, generous financial offers were turned down because people’s minds had been shaped by a cultural norm that taught them that the acceptance of generous gifts brought burdensome obligations. Our economies hadn’t been shaped by our sense of fairness; it was the other way around.
The growing body of cross-cultural research that the three researchers were compiling suggested that the mind’s capacity to mold itself to cultural and environmental settings was far greater than had been assumed. The most interesting thing about cultures may not be in the observable things they do—the rituals, eating preferences, codes of behavior, and the like—but in the way they mold our most fundamental conscious and unconscious thinking and perception.
For instance, the different ways people perceive the Müller-Lyer illusion likely reflect lifetimes spent in different physical environments. American children, for the most part, grow up in box-shaped rooms of varying dimensions. Surrounded by carpentered corners, their visual systems adapt to this strange new environment (strange and new in terms of human history, that is) by learning to perceive converging lines in three dimensions.
When unconsciously translated into three dimensions, the line with the outward-feathered ends appears farther away, and the brain therefore judges it to be longer. The more time one spends in natural environments, where there are no carpentered corners, the less one sees the illusion.
As the three continued their work, they noticed something else that was remarkable: again and again one group of people appeared to be particularly unusual when compared to other populations—with perceptions, behaviors, and motivations that were almost always sliding down one end of the human bell curve.
In the end they titled their paper “The Weirdest People in the World?” By “weird” they meant both unusual and Western, Educated, Industrialized, Rich, and Democratic. It is not just our Western habits and cultural preferences that are different from the rest of the world, it appears. The very way we think about ourselves and others—and even the way we perceive reality—makes us distinct from other humans on the planet, not to mention from the vast majority of our ancestors. Among Westerners, the data showed that Americans were often the most unusual, leading the researchers to conclude that “American participants are exceptional even within the unusual population of Westerners—outliers among outliers.”
Given the data, they concluded that social scientists could not possibly have picked a worse population from which to draw broad generalizations. Researchers had been doing the equivalent of studying penguins while believing that they were learning insights applicable to all birds.
NOT LONG AGO I met Henrich, Heine, and Norenzayan for dinner at a small French restaurant in Vancouver, British Columbia, to hear about the reception of their weird paper, which was published in the prestigious journal Behavioral and Brain Sciences in 2010. The trio of researchers are young—as professors go—good-humored family men. They recalled that they were nervous as the publication time approached. The paper basically suggested that much of what social scientists thought they knew about fundamental aspects of human cognition was likely only true of one small slice of humanity. They were making such a broadside challenge to whole libraries of research that they steeled themselves to the possibility of becoming outcasts in their own fields.
“We were scared,” admitted Henrich. “We were warned that a lot of people were going to be upset.”
“We were told we were going to get spit on,” interjected Norenzayan.
“Yes,” Henrich said. “That we’d go to conferences and no one was going to sit next to us at lunchtime.”
Interestingly, they seemed much less concerned that they had used the pejorative acronym WEIRD to describe a significant slice of humanity, although they did admit that they could only have done so to describe their own group. “Really,” said Henrich, “the only people we could have called weird are represented right here at this table.”
Still, I had to wonder whether describing the Western mind, and the American mind in particular, as weird suggested that our cognition is not just different but somehow malformed or twisted. In their paper the trio pointed out cross-cultural studies that suggest that the “weird” Western mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals versus advancing as a group. WEIRD minds are also more analytic, possessing the tendency to telescope in on an object of interest rather than understanding that object in the context of what is around it.
The WEIRD mind also appears to be unique in terms of how it comes to understand and interact with the natural world. Studies show that Western urban children grow up so closed off in man-made environments that their brains never form a deep or complex connection to the natural world. While studying children from the U.S., researchers have suggested a developmental timeline for what is called “folkbiological reasoning.” These studies posit that it is not until children are around 7 years old that they stop projecting human qualities onto animals and begin to understand that humans are one animal among many. Compared with children from Yucatec Maya communities in Mexico, however, Western urban children appear developmentally delayed in this regard. Children who grow up constantly interacting with the natural world are much less likely to anthropomorphize other living things into late childhood.
Given that people living in WEIRD societies don’t routinely encounter or interact with animals other than humans or pets, it’s not surprising that they end up with a rather cartoonish understanding of the natural world. “Indeed,” the report concluded, “studying the cognitive development of folkbiology in urban children would seem the equivalent of studying ‘normal’ physical growth in malnourished children.”
During our dinner, I admitted to Heine, Henrich, and Norenzayan that the idea that I can only perceive reality through a distorted cultural lens was unnerving. For me the notion raised all sorts of metaphysical questions: Is my thinking so strange that I have little hope of understanding people from other cultures? Can I mold my own psyche or the psyches of my children to be less WEIRD and more able to think like the rest of the world? If I did, would I be happier?
Henrich reacted with mild concern that I was taking this research so personally. He had not intended, he told me, for his work to be read as postmodern self-help advice. “I think we’re really interested in these questions for the questions’ sake,” he said.
The three insisted that their goal was not to say that one culturally shaped psychology was better or worse than another—only that we’ll never truly understand human behavior and cognition until we expand the sample pool beyond its current small slice of humanity. Despite these assurances, however, I found it hard not to read a message between the lines of their research. When they write, for example, that weird children develop their understanding of the natural world in a “culturally and experientially impoverished environment” and that they are in this way the equivalent of “malnourished children,” it’s difficult to see this as a good thing.
THE TURN THAT HENRICH, Heine, and Norenzayan are asking social scientists to make is not an easy one: accounting for the influence of culture on cognition will be a herculean task. Cultures are not monolithic; they can be endlessly parsed. Ethnic backgrounds, religious beliefs, economic status, parenting styles, rural upbringing versus urban or suburban—there are hundreds of cultural differences that individually and in endless combinations influence our conceptions of fairness, how we categorize things, our method of judging and decision making, and our deeply held beliefs about the nature of the self, among other aspects of our psychological makeup.
We are just at the beginning of learning how these fine-grained cultural differences affect our thinking. Recent research has shown that people in “tight” cultures, those with strong norms and low tolerance for deviant behavior (think India, Malaysia, and Pakistan), develop higher impulse control and more self-monitoring abilities than those from other places. Men raised in the honor culture of the American South have been shown to experience much larger surges of testosterone after insults than do Northerners. Research published late last year suggested psychological differences at the city level too. Compared to San Franciscans, Bostonians’ internal sense of self-worth is more dependent on community status and financial and educational achievement. “A cultural difference doesn’t have to be big to be important,” Norenzayan said. “We’re not just talking about comparing New York yuppies to the Dani tribesmen of Papua New Guinea.”
As Norenzayan sees it, the last few generations of psychologists have suffered from “physics envy,” and they need to get over it. The job, experimental psychologists often assumed, was to push past the content of people’s thoughts and see the underlying universal hardware at work. “This is a deeply flawed way of studying human nature,” Norenzayan told me, “because the content of our thoughts and their process are intertwined.” In other words, if human cognition is shaped by cultural ideas and behavior, it can’t be studied without taking into account what those ideas and behaviors are and how they are different from place to place.
This new approach suggests the possibility of reverse-engineering psychological research: look at cultural content first; cognition and behavior second. Norenzayan’s recent work on religious belief is perhaps the best example of the intellectual landscape that is now open for study. When Norenzayan became a student of psychology in 1994, four years after his family had moved from Lebanon to America, he was excited to study the effect of religion on human psychology. “I remember opening textbook after textbook and turning to the index and looking for the word ‘religion,’ ” he told me. “Again and again the very word wouldn’t be listed. This was shocking. How could psychology be the science of human behavior and have nothing to say about religion? Where I grew up you’d have to be in a coma not to notice the importance of religion on how people perceive themselves and the world around them.”
Norenzayan became interested in how certain religious beliefs, handed down through generations, may have shaped human psychology to make possible the creation of large-scale societies. He has suggested that there may be a connection between the growth of religions that believe in “morally concerned deities”—that is, a god or gods who care if people are good or bad—and the evolution of large cities and nations. To be cooperative in large groups of relative strangers, in other words, might have required the shared belief that an all-powerful being was forever watching over your shoulder.
If religion was necessary in the development of large-scale societies, can large-scale societies survive without religion? Norenzayan points to parts of Scandinavia with atheist majorities that seem to be doing just fine. They may have climbed the ladder of religion and effectively kicked it away. Or perhaps, after a thousand years of religious belief, the idea of an unseen entity always watching your behavior remains in our culturally shaped thinking even after the belief in God dissipates or disappears.
Why, I asked Norenzayan, if religion might have been so central to human psychology, have researchers not delved into the topic? “Experimental psychologists are the weirdest of the weird,” said Norenzayan. “They are almost the least religious academics, next to biologists. And because academics mostly talk amongst themselves, they could look around and say, ‘No one who is important to me is religious, so this must not be very important.’” Indeed, almost every major theorist on human behavior in the last 100 years predicted that it was just a matter of time before religion was a vestige of the past. But the world persists in being a very religious place.
HENRICH, HEINE, AND NORENZAYAN’S FEAR of being ostracized after the publication of the WEIRD paper turned out to be misplaced. Response to the paper, both published and otherwise, has been nearly universally positive, with more than a few of their colleagues suggesting that the work will spark fundamental changes. “I have no doubt that this paper is going to change the social sciences,” said Richard Nisbett, an eminent psychologist at the University of Michigan. “It just puts it all in one place and makes such a bold statement.”
More remarkable still, after reading the paper, academics from other disciplines began to come forward with their own mea culpas. Commenting on the paper, two brain researchers from Northwestern University argued that the nascent field of neuroimaging had made the same mistake as psychologists, noting that 90 percent of neuroimaging studies were performed in Western countries. Researchers in motor development similarly suggested that their discipline’s body of research ignored how different child-rearing practices around the world can dramatically influence states of development. Two psycholinguistics professors suggested that their colleagues had also made the same mistake: blithely assuming human homogeneity while focusing their research primarily on one rather small slice of humanity.
At its heart, the challenge of the WEIRD paper is not simply to the field of experimental human research (do more cross-cultural studies!); it is a challenge to our Western conception of human nature. For some time now, the most widely accepted answer to the question of why humans, among all animals, have so successfully adapted to environments across the globe is that we have big brains with the ability to learn, improvise, and problem-solve.
Henrich has challenged this “cognitive niche” hypothesis with the “cultural niche” hypothesis. He notes that the amount of knowledge in any culture is far greater than the capacity of individuals to learn or figure it all out on their own. He suggests that individuals tap that cultural storehouse of knowledge simply by mimicking (often unconsciously) the behavior and ways of thinking of those around them. We shape a tool in a certain manner, adhere to a food taboo, or think about fairness in a particular way, not because we individually have figured out that behavior’s adaptive value, but because we instinctively trust our culture to show us the way. When Henrich asked Fijian women why they avoided certain potentially toxic fish during pregnancy and breastfeeding, he found that many didn’t know or had fanciful reasons. Regardless of their personal understanding, by mimicking this culturally adaptive behavior they were protecting their offspring. The unique trick of human psychology, these researchers suggest, might be this: our big brains are evolved to let local culture lead us in life’s dance.
The applications of this new way of looking at the human mind are still in the offing. Henrich suggests that his research about fairness might first be applied to anyone working in international relations or development. People are not “plug and play,” as he puts it, and you cannot expect to drop a Western court system or form of government into another culture and expect it to work as it does back home. Those trying to use economic incentives to encourage sustainable land use will similarly need to understand local notions of fairness to have any chance of influencing behavior in predictable ways.
Because of our peculiarly Western way of thinking of ourselves as independent of others, this idea of the culturally shaped mind doesn’t go down very easily. Perhaps the richest and most established vein of cultural psychology—that which compares Western and Eastern concepts of the self—goes to the heart of this problem. Heine has spent much of his career following the lead of a seminal paper published in 1991 by Hazel Rose Markus, of Stanford University, and Shinobu Kitayama, who is now at the University of Michigan. Markus and Kitayama suggested that different cultures foster strikingly different views of the self, particularly along one axis: some cultures regard the self as independent from others; others see the self as interdependent. The interdependent self—which is more the norm in East Asian countries, including Japan and China—connects itself with others in a social group and favors social harmony over self-expression. The independent self—which is most prominent in America—focuses on individual attributes and preferences and thinks of the self as existing apart from the group.
The classic “rod and frame” task: Is the line in the center vertical?
That we in the West develop brains that are wired to see ourselves as separate from others may also be connected to differences in how we reason, Heine argues. Unlike the vast majority of the world, Westerners (and Americans in particular) tend to reason analytically as opposed to holistically. That is, the American mind strives to figure out the world by taking it apart and examining its pieces. Show a Japanese and an American the same cartoon of an aquarium, and the American will remember details mostly about the moving fish while the Japanese observer will likely later be able to describe the seaweed, the bubbles, and other objects in the background. The same tendency shows up in a different test: analytic Americans do better on something called the “rod and frame” task, where one has to judge whether a line is vertical even though the frame around it is skewed. Americans see the line as apart from the frame, just as they see themselves as apart from the group.
Heine and others suggest that such differences may be the echoes of cultural activities and trends going back thousands of years. Whether you think of yourself as interdependent or independent may depend on whether your distant ancestors farmed rice (which required a great deal of shared labor and group cooperation) or herded animals (which rewarded individualism and aggression). Heine points to Nisbett at Michigan, who has argued that the analytic/holistic dichotomy in reasoning styles can be clearly seen, respectively, in Greek and Chinese philosophical writing dating back 2,500 years. These psychological trends and tendencies may echo down generations, hundreds of years after the activity or situation that brought them into existence has disappeared or fundamentally changed.
And here is the rub: the culturally shaped analytic/individualistic mind-sets may partly explain why Western researchers have so dramatically failed to take into account the interplay between culture and cognition. In the end, the goal of boiling down human psychology to hardwiring is not surprising given the type of mind that has been designing the studies. Taking an object (in this case the human mind) out of its context is, after all, what distinguishes the analytic reasoning style prevalent in the West. Similarly, we may have underestimated the impact of culture because the very ideas of being subject to the will of larger historical currents and of unconsciously mimicking the cognition of those around us challenge our Western conception of the self as independent and self-determined. The historical missteps of Western researchers, in other words, have been the predictable consequences of the WEIRD mind doing the thinking.
Kodak Black, 2012. Oil on canvas, 63 x 78 3/4 inches. Through April 6, 2013.
March 19th, 2013
Mark Makela for The New York Times
Discarded televisions and computers in Philadelphia.
By IAN URBINA
NY Times Published: March 18, 2013
Last year, two inspectors from California’s hazardous waste agency were visiting an electronics recycling company near Fresno for a routine review of paperwork when they came across a warehouse the size of a football field, packed with tens of thousands of old computer monitors and televisions.
The crumbling cardboard boxes, stacked in teetering rows, 9 feet high and 14 feet deep, were so sprawling that the inspectors needed cellphones to keep track of each other. The layer of broken glass on the floor and the lead-laden dust in the air were so thick that the inspectors soon left over safety concerns. Weeks later, the owner of the recycling company disappeared, abandoning the waste and leaving behind a toxic hazard and a costly cleanup for the state and the warehouse’s owner.
As recently as a few years ago, broken monitors and televisions like those piled in the warehouse were being recycled profitably. The big, glassy funnels inside these machines — known as cathode ray tubes, or CRTs — were melted down and turned into new ones.
But flat-screen technology has made those monitors and televisions obsolete, decimating the demand for the recycled tube glass used in them and creating what industry experts call a “glass tsunami” as stockpiles of the useless material accumulate across the country.
The predicament has highlighted how small changes in the marketplace can suddenly transform a product into a liability and demonstrates the difficulties that federal and state environmental regulators face in keeping up with these rapid shifts.
“Lots of smaller recyclers are in over their heads, and the risk that they might abandon their stockpiles is very real,” said Jason Linnell of the Electronics Recycling Coordination Clearinghouse, an organization that represents state environmental regulators, electronics manufacturers and recyclers. In February, the group sent a letter to the Environmental Protection Agency asking for immediate help dealing with the rapidly growing stockpiles of the glass, much of which contains lead.
With so few buyers of the leaded glass from the old monitors and televisions, recyclers have collected payments from states and electronics companies to get rid of the old machines. A small number of recyclers have developed new technology for cleaning the lead from the tube glass, but the bulk of this waste is being stored, sent to landfills or smelters, or disposed of in other ways that experts say are environmentally destructive.
In 2004, recyclers were paid more than $200 a ton to provide glass from these monitors for use in new cathode ray tubes. The same companies now have to pay more than $200 a ton to get anyone to take the glass off their hands, a swing of more than $400 a ton.
So instead of recycling the waste, many recyclers have been storing millions of the monitors in warehouses, according to industry officials and experts. The practice is sometimes illegal, since there are federal limits on how long a company can house the tubes, which are environmentally dangerous. Each one can contain up to eight pounds of lead.
The scrap metal industry estimates that the amount of electronic waste has more than doubled in the past five years.
A little over a decade ago, there were at least 12 plants in the United States and 13 more worldwide that were taking these old televisions and monitors and using the cathode ray tube glass to produce new tubes. But now, there are only two plants in India doing this work.
In 2009, after television broadcasters turned off their analog signals nationwide in favor of digital, millions of people threw away their old televisions and replaced them with sleeker flat-screen models. Since then, thousands of pounds of old televisions and other electronic waste have been surreptitiously unloaded at landfills in Nevada and Ohio and on roadsides in California and Maine.
Most experts say that the larger solution to the growing electronic waste problem is for technology companies to design products that last longer, use fewer toxic components and are more easily recycled. Much of the industry, however, seems to be heading in the opposite direction.
Cathode ray tubes have been largely replaced by flat panels that use fluorescent lights with highly toxic mercury in them, said Jim Puckett, director of Basel Action Network, an environmental advocacy group. Used panel screens from LCD televisions and monitors, for example, do not have much recycling value, so many recyclers are sending them to landfills.
State and federal environmental policies have also become victims of their own success. Over the past decade, environmental regulators have promoted “take-back” programs to persuade people to hand in the more than 200 million old televisions and broken computer monitors that Americans are thought to have stored away in closets, garages and basements.
The same programs have courted businesses to divert their electronic waste away from landfills, to keep the hazardous chemicals in this toxic trash from leaching into groundwater. More than 290,000 tons of the high-tech castoffs are now directed away from landfills and toward recyclers each year.
“The problem now is that the collection of this waste has never been higher, but demand for the glass that comes from it has never been lower,” said Neil Peters-Michaud, the chief executive of Cascade Asset Management, a recycling company.
Roughly 660 million pounds of the glass is being stored in warehouses across the country, and it will cost $85 million to $360 million (roughly 13 to 55 cents a pound) to responsibly recycle it, according to a report released in December by TransparentPlanet, an organization focused on electronic waste research.
The stockpiling problem is especially worrisome to electronics companies and to state and federal officials since they might have to pick up part of the tab if the stockpiles were abandoned and declared federal Superfund sites.
At least 22 states have laws that make electronics manufacturers like Sony, Toshiba and Apple financially responsible for recycling their old products. But lack of oversight of these programs has led to rampant fraud. In one tactic, quietly known in the industry as “paper transactions,” recyclers buy paperwork to indicate that they collected a certain amount of electronic waste that they never actually collected.
The Obama administration, more than any of its predecessors, has strengthened oversight of electronic waste. In 2012, the General Services Administration enacted rules discouraging all agencies and federal contractors from disposing of it in landfills. The federal government, which is among the world’s largest producers of electronic waste, disposes of more than 10,000 computers a week on average.
Federal agencies are failing to sufficiently track their electronic waste, and large amounts of it are still being disposed of through public or online auctions, according to a Government Accountability Office report last year. In these auctions, the waste is often sold to a first layer of contractors who promise to handle it appropriately, only to have the most toxic portion subsequently sold to subcontractors who move it around as they wish.
Some of this waste is dumped illegally in developing countries, the G.A.O. found. Congress is considering legislation to ban certain types of unprocessed and nonworking electronics and electronic waste from being exported to developing countries from the United States.
Recyclers say there is still money to be made on processing the old monitors and televisions if companies charge a price that more genuinely reflects the expense of disposing of the glass properly. But practices like “greenwashing,” whereby companies pretend to engage in environmentally responsible disposal practices, hinder such progress.
“They’re skimming off the computers, cellphones and printers that can be recycled profitably because they have more precious metals,” said Karrie Gibson, the chief executive of Vintage Tech Recyclers. “Then they stockpile the CRTs, or dump it in landfills or abroad.”
The sheer quantity of the glass accumulating at some recycling plants has contributed to environmental and workplace safety problems. In Yuma, Ariz., for example, Dlubak Glass, one of the country’s largest recyclers of glass from televisions and monitors, found itself overwhelmed.
When state regulators visited the site in 2009, they found a mountain of the lead-rich glass, several stories tall. Dust from the shimmering mound of recycled glass had contaminated the surrounding soil, including a nearby orchard, with lead at 75 times the federal limit, according to state documents.
“We have it entirely under control now,” said Herb Schall, a Dlubak plant manager.
In September, California passed an emergency measure allowing companies to send monitors and televisions to hazardous landfills for the next two years.
Charlotte Fadipe, a spokeswoman for the California Department of Toxic Substances Control, said her office’s investigation of the abandoned warehouse near Fresno is continuing, and investigators are still trying to locate Charles Li, the owner of the company, TRI Products.
Over the past four years, TRI has been paid more than $1 million by the state to recycle electronic waste from local schools, hospitals and federal agencies, including the F.B.I., the I.R.S. and Immigration and Customs Enforcement, according to state and company documents.
A reporter found Mr. Li to be running another electronic waste disposal company, but he did not respond to requests for comment. When he was contacted online by another recycler and asked whether he was still looking to buy electronic waste, however, he immediately replied yes, with one caveat.
“Right now, we can take PC, server, telephone, printer and household e-waste,” he wrote. “I cannot take your CRT/TV as e-waste because we don’t have equipment to recycle the tubes.”
March 19th, 2013
By PAUL KRUGMAN
NY Times Published: March 17, 2013
Ten years ago, America invaded Iraq; somehow, our political class decided that we should respond to a terrorist attack by making war on a regime that, however vile, had nothing to do with that attack.
Some voices warned that we were making a terrible mistake — that the case for war was weak and possibly fraudulent, and that far from yielding the promised easy victory, the venture was all too likely to end in costly grief. And those warnings were, of course, right.
There were, it turned out, no weapons of mass destruction; it was obvious in retrospect that the Bush administration deliberately misled the nation into war. And the war — having cost thousands of American lives and scores of thousands of Iraqi lives, having imposed financial costs vastly higher than the war’s boosters predicted — left America weaker, not stronger, and ended up creating an Iraqi regime that is closer to Tehran than it is to Washington.
So did our political elite and our news media learn from this experience? It sure doesn’t look like it.
The really striking thing, during the run-up to the war, was the illusion of consensus. To this day, pundits who got it wrong excuse themselves on the grounds that “everyone” thought that there was a solid case for war. Of course, they acknowledge, there were war opponents — but they were out of the mainstream.
The trouble with this argument is that it was and is circular: support for the war became part of the definition of what it meant to hold a mainstream opinion. Anyone who dissented, no matter how qualified, was ipso facto labeled as unworthy of consideration. This was true in political circles; it was equally true of much of the press, which effectively took sides and joined the war party.
CNN’s Howard Kurtz, who was at The Washington Post at the time, recently wrote about how this process worked, how skeptical reporting, no matter how solid, was discouraged and rejected. “Pieces questioning the evidence or rationale for war,” he wrote, “were frequently buried, minimized or spiked.”
Closely associated with this taking of sides was an exaggerated and inappropriate reverence for authority. Only people in positions of power were considered worthy of respect. Mr. Kurtz tells us, for example, that The Post killed a piece on war doubts by its own senior defense reporter on the grounds that it relied on retired military officials and outside experts — “in other words, those with sufficient independence to question the rationale for war.”
All in all, it was an object lesson in the dangers of groupthink, a demonstration of how important it is to listen to skeptical voices and separate reporting from advocacy. But as I said, it’s a lesson that doesn’t seem to have been learned. Consider, as evidence, the deficit obsession that has dominated our political scene for the past three years.
Now, I don’t want to push the analogy too far. Bad economic policy isn’t the moral equivalent of a war fought on false pretenses, and while the predictions of deficit scolds have been wrong time and again, there hasn’t been any development either as decisive or as shocking as the complete failure to find weapons of mass destruction. Best of all, these days dissenters don’t operate in the atmosphere of menace, the sense that raising doubts could have devastating personal and career consequences, that was so pervasive in 2002 and 2003. (Remember the hate campaign against the Dixie Chicks?)
But now as then we have the illusion of consensus, an illusion based on a process in which anyone questioning the preferred narrative is immediately marginalized, no matter how strong his or her credentials. And now as then the press often seems to have taken sides. It has been especially striking how often questionable assertions are reported as fact. How many times, for example, have you seen news articles simply asserting that the United States has a “debt crisis,” even though many economists would argue that it faces no such thing?
In fact, in some ways the line between news and opinion has been even more blurred on fiscal issues than it was in the march to war. As The Post’s Ezra Klein noted last month, it seems that “the rules of reportorial neutrality don’t apply when it comes to the deficit.”
What we should have learned from the Iraq debacle was that you should always be skeptical and that you should never rely on supposed authority. If you hear that “everyone” supports a policy, whether it’s a war of choice or fiscal austerity, you should ask whether “everyone” has been defined to exclude anyone expressing a different opinion. And policy arguments should be evaluated on the merits, not by who expresses them; remember when Colin Powell assured us about those Iraqi W.M.D.’s?
Unfortunately, as I said, we don’t seem to have learned those lessons. Will we ever?
March 18th, 2013
The Sendai Mediatheque, designed by Toyo Ito, survived Japan’s devastating 2011 earthquake. Photograph by Tomio Ohashi
By ROBIN POGREBIN
NY Times Published: March 17, 2013
Toyo Ito, a Japanese architect who broke from Modernism and designed a library that survived his country’s catastrophic 2011 earthquake, was awarded his profession’s top honor, the Pritzker Architecture Prize, on Sunday.
“Toyo Ito is a creator of timeless buildings, who at the same time boldly charts new paths,” the Pritzker jury said in its citation. “His architecture projects an air of optimism, lightness and joy and is infused with both a sense of uniqueness and universality.”
In a telephone interview Mr. Ito, 71, said he was gratified by the honor, especially because it represents an acceptance of his position as an iconoclast who has challenged the past 100 years of Modernism.
“I’ve been thinking that Modernism has already reached to the limit or a dead end,” Mr. Ito said through an interpreter. “I didn’t expect this surprising news, and I’m very happy about it.”
Nicolai Ouroussoff, then the architecture critic of The New York Times, remarked in 2009 that Mr. Ito had repeatedly been passed over for the Pritzker “in favor of designers with much thinner résumés.”
Mr. Ito will receive the award at the John F. Kennedy Presidential Library and Museum in Boston on May 29.
Looking back over his career Mr. Ito said he is particularly proud of the Sendai Mediatheque, his library completed in Sendai, Japan, in 2001. The building’s design is dominated by structural tubes that support the floor plates and provide circulation, tubes which the Pritzker jury said “permitted new interior spatial qualities.”
But Mr. Ito is also proud of the building’s significance as a project that was meant to withstand an earthquake. (It won a Golden Lion Award at the 2012 Venice Architecture Biennale.) A video of the inside of the building taken by someone under a table during the earthquake in 2011 went viral.
“The building shook and swayed violently; everything cascaded from shelves and desks onto the floor,” the architecture critic Ada Louise Huxtable wrote in The Wall Street Journal. “Ceiling panels appeared to swing drunkenly overhead. But the Mediatheque did not collapse. It stood firm against the massive seismic forces that were tearing other buildings apart; the basic structure did not fail.”
Mr. Ito has been active in the recovery effort. He recruited three young architects to help him develop the concept of Home-for-All, communal space for survivors. In his book “Toyo Ito: Forces of Nature,” edited by Jesse Turnbull and published last year by Princeton Architectural Press, Mr. Ito writes, “An architect is someone who can make such places for meager meals show a little more humanity, make them a little more beautiful, a little more comfortable.”
The citation said Mr. Ito consistently couples his personal creative agenda with a sense of public responsibility. “It is far more complex and riskier to innovate while working on buildings where the public is concerned,” the jury said, “but this has not deterred him.”
Though perhaps not as well known as architects like Rem Koolhaas or Frank Gehry, Mr. Ito rose to great prominence with the completion of his stadium in Kaohsiung, Taiwan, built for the World Games in 2009.
And he has received his share of awards, including, in 2010, the Praemium Imperiale, which recognizes lifetime achievement in areas of the arts not covered by the Nobel Prizes.
But Mr. Ito said he doesn’t worry about status or architecture competitions. “We cannot predict what we will win or we won’t win,” he said.
He said he just needs to be able to do the work he wants to do. These days that includes flatware, called Mu, introduced in Paris by the Italian company Alessi. Mu means hexagon in Japanese and refers to the six-sided shape of the handles, which resembles chopsticks. The pattern complements Ku, the porcelain service Mr. Ito created for Alessi in 2006.
He has also been drawn to practical retail projects like a building for Tod’s, the Italian shoe and handbag company, and the facade of the Mikimoto Ginza 2 flagship store — both in Tokyo. And he continues to design ambitious public projects like the Taichung opera house, whose porous exterior has been likened to a gigantic sponge, and the Tama Art University Library, an irregular grid of concrete arches.
Born to Japanese parents in Keijo — now Seoul — in 1941, Mr. Ito moved to Tokyo in junior high school and then attended the University of Tokyo, where architecture became his main interest. He went on to graduate in 1965 and began working at the firm of Kiyonori Kikutake & Associates. In 1971 he left to start his own studio, calling it Urban Robot (Urbot), which in 1979 became Toyo Ito & Associates, Architects.
Many of his early works were residences — including one in a Tokyo suburb called “Aluminum House,” which consisted of a wooden frame completely covered in aluminum, and a home for his sister called “White U,” which generated considerable interest in his work.
Throughout his career, Mr. Ito said, he has tried to establish a connection between inside and the outside conditions, an effort evident in his lightweight structures that use materials like mesh, perforated aluminum and permeable fabrics.
That fluidity, critics said, pervades projects like his World Games stadium, which do not conform to conventional definitions of modern architecture.
“It reflects his longstanding belief that architecture, to be human, must somehow embrace seemingly contradictory values,” Mr. Ouroussoff wrote in his review of the building. “Instead of a self-contained utopia, he offers us multiple worlds, drifting in and out of focus like a dream.”
March 17th, 2013
NATCHEZ, MISS., FEB. 28, 1967: The scene of a car bombing that killed Wharlest Jackson, one of many racially motivated and unsolved crimes from the era.
By DAN BARRY, CAMPBELL ROBERTSON and ROBBIE BROWN
NY Times Published: March 16, 2013
FERRIDAY, La. — In the spring of 1965, the Federal Bureau of Investigation in Washington received a letter from Concordia Parish in northeastern Louisiana. Addressed to the bureau’s director, J. Edgar Hoover, the letter pleaded for justice in the killing of a well-respected black merchant.
A few months earlier, the businessman, Frank Morris, had come upon two white men early one morning at the front of his shoe-repair shop, one pointing a shotgun at him, the other holding a canister of gas. A match was ignited, a conflagration begun, and Morris died four days later of his burns without naming the men, perhaps fearing retribution against his family.
The letter expressed grave concern that the crime would go unpunished because the local police were probably complicit. “Your office is our only hope so don’t fail us,” it concluded. It was signed:
“Yours truly, The Colored People of Concordia Parish.”
Nearly five decades later, the Justice Department has written back — not directly to the family of Mr. Morris or to the black community of Concordia Parish, but to dozens of other families who lost loved ones during this country’s tumultuous and violent civil rights era.
Several years ago, the F.B.I. began reopening cold cases from that era — 112 at last count — raising hopes among some for justice. In all but about 20, though, the families of the long dead have received letters, often hand-delivered by F.B.I. agents, that say their cases have been closed, there is nothing more to be done — and please accept our condolences.
Simultaneously intimate and bureaucratic, these letters serve as epistolary echoes of an increasingly distant time. To some, they reflect the elusiveness of resolution in cases decades old; to others, they represent another missed opportunity for a full accounting of what happened, and why.
Grace Hall Miller, a retired school board member in Newton, Ga., received one of these letters two years ago. It recounted a day in March 1965 that she hardly could have forgotten, when a man named Cal Hall Jr. fatally shot her husband, Hosie Miller, a farmer and church deacon, in a dispute over cows. Mr. Hall, who was white, shot Mr. Miller, who was black, in the back.
The letter recounted the case’s notorious course, from repeated failures by grand juries to indict Mr. Hall to a “disproportionately white” jury’s finding against the Miller family in a wrongful-death lawsuit. It also summed up what the F.B.I. had done after reopening the case: interviews with Ms. Miller and a family friend, a check with the local sheriff’s office, and a search of county death records.
“After careful review of this incident, we have concluded that the now deceased Cal Hall Jr. acted alone when he shot and killed your husband, and therefore, we have no choice but to close our investigation,” the letter said. “We regret that we cannot be of further assistance to you. Again, please accept our sincere condolences for the loss of your husband” — a man who had died nearly a half-century earlier.
Ms. Miller, now 80, said that she had placed judgment in God’s hands long ago, and could not understand why the F.B.I. had reopened the case to give it only a cursory review. “I guess they were just trying to make a show,” she said.
One of Ms. Miller’s daughters is Shirley Sherrod, who was forced to resign from her job with the federal Agriculture Department in 2010 after a conservative blogger edited a video of a public appearance to make her appear racist. The White House later apologized, and she was offered a new job.
“It’s very unsatisfying,” said Ms. Sherrod, one of six children. “Even after all these years, the system wouldn’t do anything. We didn’t get justice.”
Adam S. Lee, the chief of the F.B.I. section overseeing civil rights, said that the bureau had been rigorous in pursuing these cold cases, following any evidence that might lead to prosecutions. But, he added, “That doesn’t bring the emotional closure that the public wants, or needs, in cases like this.”
Unsolved Crimes
In 2006, the F.B.I. began a cold-case initiative that it described as a comprehensive effort to investigate racially motivated murders from the civil rights era. That effort became a mandate two years later when Congress passed the Emmett Till Unsolved Civil Rights Crime Act, named after the 14-year-old black boy who was tortured and killed in Mississippi in 1955, for supposedly flirting with a white woman.
From the outset, the government cited the formidable challenges it faced: the limited federal jurisdiction in some cases, the statute of limitations in others, and, of course, time’s passage. Suspects and witnesses die. Evidence is lost. Memories dull.
Champions of the cold-case initiative debated its primary purpose. Was it justice, which would mean chasing those few cases with the highest likelihood of prosecution? Or was it truth, which would mean pursuing the facts in scores of cases, no matter that the principal players might be dead?
“Truth was the more realistic goal,” said Alvin Sykes, a human rights worker and an early champion of what became the Till Act, who envisioned a broad, aggressive effort to identify and investigate old civil rights cases. But he lowered his expectations, he said, once the narrower scope of the initiative became clear.
The law authorized tens of millions of dollars for the project, but so far only $2.8 million has come through. In addition, some critics say that the expansive list of cases compiled by the F.B.I. — ranging from the well known to the more obscure, culled from old news clippings — resulted in the quick closing of most cases and the draining of resources from the few cases in which a prosecution might have been achieved.
Richard Cohen, the president of the Southern Poverty Law Center, which provided the F.B.I. with some possible cases, recalled some unease at the time about publicizing so many cases with very low chances of resolution. “I was worried about giving people false hope,” he said.
Limited Results
Families who held out hope for prosecution have been disappointed. In a report to Congress in October, the Justice Department acknowledged the low yield from what it always considered to be long-shot efforts to develop cases worthy of prosecution.
The report said the F.B.I.’s cold-case initiative had resulted in one successful federal prosecution, of James Ford Seale, who was convicted in 2007 in connection with the deaths of two young black men in 1964. It also said the F.B.I. had assisted in the 2010 state prosecution of a former Alabama trooper, James Bonard Fowler, in the shooting death of Jimmie Lee Jackson, a 26-year-old civil rights marcher who died after a confrontation with the police in 1965.
Left unsaid in the report was that these prosecutions were prompted in large part by the work of investigative journalists like Jerry Mitchell of The Clarion-Ledger in Jackson, Miss., John Fleming of The Anniston Star in Alabama, and David Ridgen, a Canadian documentary filmmaker.
Also left unsaid was how the Justice Department’s approach and commitment to the endeavor have been questioned by some of those who have been toiling in the same troubled fields of history.
“That’s what we’ve been struggling with for the last six years,” said Janis L. McDonald, a law professor at Syracuse University and a co-director of the law school’s Cold Case Justice Initiative. “We’ve been asking for a regional task force, or for them to go out and say how many people really were killed. Neither has been the focus of the Justice Department.”
But F.B.I. officials said that in many cases people were interviewed, often repeatedly, during the original murder investigations. With memories fading as the decades pass, they said, those initial interviews often determine whom to interview or reinterview in the current investigation.
“I have not seen a perfunctory approach in any of these cases,” said Mr. Lee, the F.B.I. section chief.
Retellings, and Condolences
Then there are the letters.
As the Justice Department closed cases, F.B.I. agents were sent to hand-deliver letters of explanation to the next of kin. Sometimes, relatives could not be found or simply “didn’t want to hear from us,” Mr. Lee said.
Copies of dozens of these letters were given to The New York Times by James Shelledy, a journalist in residence at the Manship School of Mass Communication at Louisiana State University, and a team of journalism students who have been working on a project involving unsolved killings from the civil rights era. They obtained the letters through the Freedom of Information Act.
Here is the case of Clinton Melton, a gas station attendant in Glendora, Miss., who found himself in a dispute with a customer, Elmer Otis Kimbell, over the amount of gas he had just pumped. Mr. Kimbell vowed to kill Mr. Melton, and did just that, shooting the unarmed Mr. Melton three times with a shotgun.
“On March 13, 1956, an all-white, all-male jury acquitted Mr. Kimbell, despite the weight of the testimonial and physical evidence contradicting Mr. Kimbell’s claim that he acted in self-defense,” the government wrote in a letter to Mr. Melton’s family. “Mr. Kimbell died in February 1985.”
Here is the case of Jasper Greenwood, whose decomposed body was found next to his car in Vicksburg, Miss. According to the F.B.I., it turned out that Mr. Greenwood was with a married woman in a “lover’s lane” when he suffered a fatal heart attack — and she fled the scene.
Here is the case of John Earl Reese, who was dancing with friends in a cafe in Longview, Tex., when gunfire sprayed from a passing car, killing him and injuring two cousins. Two young white men, Perry Dean Ross and Joe Simpson, were eventually indicted, although charges against Mr. Simpson were dropped when he agreed to testify against Mr. Ross — who was given a suspended sentence on a murder conviction.
The F.B.I. closed its resurrected investigation because it concluded that Mr. Ross and Mr. Simpson, both long dead, had acted alone. Still, Joyce Nelson Crockett, one of the cousins wounded that night, expressed gratitude for the government’s interest.
“It did more good than no good,” Ms. Crockett, who is 70 and in poor health, said of the letter. “To know it was important enough to look into, because what they did was wrong. They had no reason to do it. They were youngsters, just as I was.”
Here, too, is the case of Herbert Orsby, a New Orleans boy of 14 whose body was found in 1964 in the Big Black River near rural Pickens, Miss., where he had been visiting his grandmother. Some civil rights workers said they had seen a young man matching Herbert’s description being forced into a pickup truck at gunpoint, and there were rumors that Herbert had been wearing a T-shirt bearing the acronym for the Congress of Racial Equality: C.O.R.E.
But an F.B.I. investigation at the time found no evidence of foul play and concluded that the boy had drowned. For years afterward, lingering suspicions were rarely discussed within the Orsby family.
“Back in those days in Mississippi, you didn’t stir up stuff,” said Roy Orsby, one of Herbert’s brothers, now 60 years old.
After 46 years of not stirring up stuff, Mr. Orsby received a visit from an F.B.I. agent who said the bureau was looking into the case. A few weeks later, Mr. Orsby said, he received a letter.
It said agents had retrieved the files from the original 1964 investigation, contacted Mississippi officials and two civil rights organizations, conducted searches of historical records at libraries and on the Internet, and “solicited information about the case via a press release.”
The conclusion: “After careful review of this incident, there is insufficient evidence to indicate that your brother’s death constitutes a racially motivated homicide.”
The experience was both brief and strange. “I figure, well there it goes again, back under the bridge,” Mr. Orsby said. “Never did find out what happened.”
Still, Margaret A. Burnham, a law professor and the founder of the Civil Rights and Restorative Justice Program at the Northeastern University School of Law, said these letters mattered, even if devoid of resolution.
“Setting aside whether the F.B.I. could have done more, I respect the dignity with which they are accepting responsibility for letting families know how justice failed them,” Ms. Burnham said. “Whether it’s a sufficient thing, I don’t know. But it’s a necessary thing.”
Still Unresolved
As difficult as it might be to receive a letter saying that the investigation into your loved one’s death is being closed without a satisfying conclusion, there is the different pain of having no resolution at all, as in the 20 or so cold cases that remain open.
Three of them concern killings that took place on or near a 12-mile stretch of Highway 84, from here in Ferriday to the Mississippi city of Natchez, across the Mississippi River. All three have been extensively investigated by Stanley Nelson, a reporter for a local weekly newspaper, The Concordia Sentinel, who said the F.B.I. had failed to give the cold cases the proper commitment because it had not assigned agents to investigate them full time.
“There’s no way they can devote the time that’s really necessary to resolve these cases,” said Mr. Nelson, who believes that the cases are open in part because he has kept publicizing them. “They’re not out in the communities canvassing, talking to people, knocking on doors, running people down.”
This part of the South is haunted by unanswered questions.
In Ferriday, all that remains of the shoe store that was burned down in 1964 — with its black owner, Frank Morris, held inside at gunpoint — is a concrete slab in an empty lot. No one has ever been arrested, but in recent years there have been grand jury investigations, most likely prompted by Mr. Nelson’s reports that a suspect is still alive.
A few miles southeast on Highway 84, in a place called Vidalia, is the Budget Inn motel. A half-century ago, it was called the Shamrock Inn, where the Silver Dollar Group — an extreme faction of the Ku Klux Klan — used to meet in the coffee shop, and where, in 1964, a black porter named Joe Ed Edwards was said to have kissed the motel’s night clerk, a white woman.
Two days later, Mr. Edwards disappeared, for good.
And a few miles farther on, across the Natchez-Vidalia Bridge in Mississippi, a plaque marks the spot where, in 1967, a Chevy pickup driven by Wharlest Jackson, a Korean War veteran, blew up. He had just accepted a promotion to a supervisory — that is, white — position at a tire plant.
F.B.I. agents soon swarmed the area, infiltrating the Silver Dollar Group and forcing an end to its campaign of terror, though without any arrests. The children of Mr. Jackson do not expect the cold-case initiative to change the cold absence of resolution, and have come to see the government’s initiative as worse than nothing.
“Why they open it and why they open up wounds, I don’t know,” said Debra Sylvester, one of Mr. Jackson’s daughters. “My mama always said all you can do is pray. One day they’re going to have to give an account. One day they’re going to have to give an account to their maker.”
Meanwhile, Mr. Lee, of the F.B.I., said the bureau would probably close more cases by June. When asked whether criminal charges might be filed, he said, “Likely not.”
March 17th, 2013
“Instrumentals,” 2013, Epson print, 75 x 42.5 inches
Opens March 16, 2013
Thanks to RS
March 16th, 2013
By Christopher Knight
The Los Angeles Times
March 16, 2013
A proposed five-year collaboration deal between the troubled Museum of Contemporary Art in downtown Los Angeles and the National Gallery of Art in Washington, D.C., is, to use a critically astute technical term, a big, fat nothing-burger.
The plan sidesteps the most pressing needs facing MOCA, which are financial health and managerial competence, while needlessly embarrassing both museums.
It’s a symptom of, not a solution to, MOCA’s problems.
A possible relationship between the two — located on opposite coasts and with vastly different missions, structures and reputations — had been talked about in museum circles for weeks. But when the news finally broke Wednesday, the vacuousness of the idea was there for all to see.
The fantasy is that the federal museum will be lending its expertise in exhibitions, research and programming to MOCA. The reality is the reverse. When it comes to the art of the recent past, MOCA is a bona fide star and the Washington institution a bit player. MOCA would be lending its prestige to the National Gallery, not the other way around.
The collaborative idea, which includes zero financial support, apparently means to stick a Good Housekeeping seal of approval on the struggling contemporary museum. Donors are always hesitant to commit to an institution about whose future they are uncertain, which applies to MOCA in the wake of its lingering fiscal woes but not to a federally funded museum.
But rather than bolstering confidence in MOCA’s leadership, a foolish plan like this actually lessens it. If MOCA’s leaders are so confused about the relative merits of the two institutions and what is needed to move forward, no wonder its troubles have dragged on for the last five years.
The partnership idea was initiated by MOCA board member Eli Broad, the billionaire collector who is opening his own art museum downtown in 2014, across the street from MOCA. His National Gallery counterpart, board chairman John Wilmerding, told The Times, “The hope is that our name, our programming, our expertise gives [MOCA] a sense of backbone and stability.”
Backbone? With all due respect to Wilmerding, it is probably a poor idea to begin a partnership by publicly dissing the other party as effectively spineless. Especially since it’s absurd to imagine that the National Gallery has any useful expertise in contemporary art to offer.
The National Gallery is a wonderful museum — if Old Master European, 19th century American or classic Modern painting and sculpture of the early 20th century are what you want. But the last substantial show of contemporary art I felt compelled to travel there to see was a Willem de Kooning painting survey — in 1994. Art of the last half-century is a minor diversion at the NGA.
Partly that’s because the Smithsonian’s Hirshhorn Museum and Sculpture Garden stands right across the National Mall, and international contemporary art is the Hirshhorn’s chief concern. The NGA does next to nothing with recent art, except for photographs and the occasional retrospective of an established, blue-chip American painter or sculptor from the 1950s or 1960s.
Take a look at the museum’s online tour of its Modern and contemporary collection, which features one 1971 sculpture but nothing else from after 1962. Like that De Kooning show, mounted just before the artist’s death, the tour follows the museum’s Old Master profile.
Frankly, I can’t think of a single contemporary thematic or group exhibition of any consequence to have come from there. The National Gallery has a department of Modern art, which begins in the 19th century, but nothing remotely close to a staff specializing in the diversity of global art in our time. Perhaps it’s appropriate for the federal institution to hold back in that area, but it doesn’t mean the museum has any capacity for leadership in a field about which it has virtually no demonstrable expertise.
MOCA, meanwhile, has built an international reputation on a savvy post-1945 exhibition program that uncovers forgotten histories or revises established ones with fresh scholarship and surprising insights. That was a specialty of former chief curator Paul Schimmel, deposed last summer. It’s also the reason two of Europe’s leading contemporary museums — the Stedelijk in Amsterdam and the Ludwig in Cologne — are now directed by former MOCA curators, who left as the institutional problems mounted.
The arrangement with the National Gallery, which might be formally announced next week, is actually rather sad — a throwback to the early 1980s, when MOCA was just a novel idea still being birthed. (Broad was chairman from 1979 to 1984.) The scheme tries to borrow gravitas by association.
Back then, Pontus Hulten was hired away from Paris’ Centre Georges Pompidou as inaugural director, and a major chunk of the coveted collection of Giuseppe Panza di Biumo, Europe’s first great collector of contemporary American art, was acquired. Now, America’s National Gallery is signing on as a partner.
But who cares? Perhaps to a general audience the partnership looks like a big deal, but to the engaged art public it’s just embarrassing — a desperation move, grasping at straws. In the process, MOCA’s real achievements get misunderstood.
And remember: Hulten soon quit, frustrated with program meddling by MOCA’s board, while Panza went ballistic when moves were made to try to sell off part of his incomparable collection. That MOCA overcame those ’80s blunders is a testament to its growth — to its own hard-earned artistic gravitas.
In order to begin rebuilding the stature it lost through fiscal mismanagement in the run-up to 2008, MOCA needs just two essential things: smart nonprofit museum leadership at the top, in both the boardroom and the director’s office, and substantial endowment funds — at least triple its $38-million peak a dozen years ago.
A Good Housekeeping Seal from an august but indifferent partner in Washington is just a distraction, unlikely to do anything to fix either gaping deficiency.
March 16th, 2013
By Mike Boehm
The Los Angeles Times
March 15, 2013
Michael Govan came to the Los Angeles County Museum of Art seven years ago with a mission to make it one of the most prestigious institutions in the country, one worth mentioning alongside New York City’s Metropolitan Museum of Art and Museum of Modern Art.
Now he’s trying to seize an opportunity to gain ground on them in a single stroke. Govan and LACMA’s trustees have proposed a takeover of L.A.’s financially adrift Museum of Contemporary Art and its crown jewels: a 6,000-piece collection that’s one of the world’s most admired troves of post-World War II art.
But Govan has an imposing rival in billionaire Eli Broad, L.A.’s eminence grise of art philanthropy. And Broad has cards of his own to play.
Broad is brokering a possible MOCA partnership with the National Gallery of Art in Washington, D.C., whose board chairman, John Wilmerding, said this week that the federally funded museum is eager to provide curatorial expertise but not money.
Broad has repeatedly declined comment on the possible LACMA-MOCA combination, and so have MOCA leaders who ultimately will have to decide the question.
But art world experts speculate that Broad could be motivated by rivalry with Govan and a desire to ensure a successful launch of his namesake Broad Collection museum, which is set to open across the street from MOCA’s Grand Avenue base next year.
“It strikes me as purely personal,” said Bruce Robertson, a former curator of American art at LACMA who teaches art history at UC Santa Barbara. “Eli has worked very hard to have the position as most important philanthropist in town,” while “Michael Govan has demonstrated emphatically that he doesn’t need Eli Broad to be successful.”
But that theory is discounted by Thomas Lawson, dean of the School of Art at California Institute of the Arts in Valencia.
“That’s an easy way of explaining things, to find a slightly dark personal angle,” he said.
Lawson believes Broad’s resistance to a LACMA takeover could be motivated more by concerns that LACMA control could make operations at MOCA more distant and bureaucratic, less collaborative with the Broad Collection and less likely to generate the diverse and dynamic programming vital for its success.
If MOCA, whose holdings are more extensive, doesn’t offer frequently changing exhibitions that can generate repeat visits, the Broad Collection could also stagnate, Lawson said. He thinks Broad may be banking on symbiotic programming between the two museums to generate a vibrant Grand Avenue cultural scene; the concern would be that LACMA control might prevent effective teamwork.
Robertson and Lawson agreed that the best scenario would be for MOCA to remain independent and capable of asserting its own distinctive approach.
“It’s not because Michael wouldn’t do a great job, but because L.A. as an art capital needs as much diversity as it can get,” Robertson said. “MOCA’s vision has always been different from LACMA’s.”
For his part, Govan said that LACMA has promised to keep the MOCA name and its downtown locations, and would raise $100 million to ensure that it thrives. He envisions strength in unity.
“If you put the two collections together … it’s not too shabby,” Govan said this week. What’s more, he said the combined collection could be a magnet for even more strong art, since collectors who want their holdings to wind up in museums often seek out the most prestigious destinations.
“If you put this whole thing together, it’s potentially more attractive to collectors,” he said.
When he saw Broad recently at a trustee’s party, “I gave him a hug,” Govan said. “I said we should work together.”
Broad’s response? “He smiled,” Govan said, “and Edye [Broad’s wife] and I talked about something else.”
For those laying odds on who will prevail, it’s worth noting that Broad has already frustrated LACMA twice.
The 79-year-old billionaire announced in 2008 that he would not give his own impressive collection of contemporary art to the county museum, even though he gave it $56 million for a building named for him that was about to open on its campus. Later that year, he thwarted LACMA and Govan’s bid to acquire MOCA at a time when the downtown museum was all but destitute. The museum board accepted a Broad bailout of MOCA instead.
It is because that bailout didn’t really work that Govan and LACMA have a second chance.
Despite Broad’s infusion of $3 million a year in exhibition support — a pledge that will run out at the end of this year — and an additional $6.25 million for its endowment, MOCA has not achieved a stable fiscal footing. In fact, its capacity for generating art exhibitions and other important museum programming appears to have fallen considerably over the last few years, with the curatorial staff dwindling since last spring from five to two.
But LACMA’s second chance to absorb MOCA could be an even longer shot than the first. That’s because Broad’s 2008 deal to keep MOCA independent includes a provision expressly designed to make it harder for Govan to successfully come calling again.
Broad doesn’t have absolute veto power over a LACMA takeover, but he can make such an acquisition much more complicated while buying time for other suitors to emerge who might be more to his liking.
The 16-page bailout agreement, obtained by The Times, includes a four-page section concerning “significant actions” that MOCA trustees can’t take with a simple vote. One of them is agreeing to be subsumed under another local museum — such as LACMA — before 2019.
The review process would begin with the appointment of an independent three-member “special committee” that could overrule Broad’s opposition and allow LACMA and MOCA to consummate a union — but not before the Broad Foundation had a chance to make a counteroffer.
The committee’s first task would be to determine whether “exigent circumstances” exist that would justify breaking the prohibition against merging with a nearby museum. The agreement defines “exigent circumstances” as “the inability of the museum … to operate without a … deficit for a period of two consecutive fiscal years,” or a situation in which it was at risk of “bankruptcy, dissolution … or violations of law or contractual obligations.”
MOCA has declined requests for up-to-date financial information, so it’s unclear whether it ran a deficit in the 2011-12 fiscal year or whether deficits are projected for the current fiscal year, which ends June 30.
If at least two of the three special committee members reviewing a possible merger decided that such circumstances existed, the committee would then have to decide whether LACMA’s offer “would be in the best interests of [MOCA] in light of all of the then-current facts and circumstances.”
If the committee gave the deal its green light, MOCA trustees could accept LACMA’s offer “without any consent or approval of the Broad Foundation.”
But once the review process began, it could take unexpected turns. The special committee can propose — but not impose — its own alternate solutions to a MOCA emergency, after reaching out to “any person or entity” who might be able to help.
The Broad Foundation would have a right to submit a counterproposal, but the agreement says it “is not entitled to any special privileges or priorities” as to how that proposal is considered. Broad conceivably could even make a bid to subsume MOCA under his Broad Collection museum.
Last summer, as MOCA made headlines with the forced resignation of longtime chief curator Paul Schimmel and the layoff of seven other staff members, fears arose — including from some MOCA trustees — that Broad might be angling for the museum’s art collection. He firmly denied it.
The MOCA board would have the final vote on whether to adopt any proposal OKd by the special committee. But if board members tried to circumvent the process and strike a union with LACMA without the special committee’s approval, the agreement says that the Broad Foundation would be “entitled to seek … remedies such as injunctive relief” — a court order stopping the deal from going through.
March 16th, 2013
Mathias Poledna
Schoenberg cabinet, 2013
Courtesy Galerie Meyer Kainer, Vienna
Rodney Graham (CAN), Paweł Książek (PL), Marcel Odenbach (D), Silke Otto-Knapp (D), Mathias Poledna (A), Stephen Prina (US), Florian Pumhösl (A), Marina Rosenfeld (US), Simon Starling (GB), Hong-Kai Wang (TW)
Through Jun 30, 2013
March 15th, 2013
Anders Nilsen
By JEFFREY P. KAHN
NY Times Published: March 15, 2013
HUMAN beings are social animals. But just as important, we are socially constrained as well.
We can probably thank the latter trait for keeping our fledgling species alive at the dawn of man. Five core social instincts, I have argued, gave structure and strength to our primeval herds. They kept us safely codependent with our fellow clan members, assigned us a rank in the pecking order, made sure we all did our chores, discouraged us from offending others, and removed us from this social coil when we became a drag on shared resources.
Thus could our ancient forebears cooperate, prosper, multiply — and pass along their DNA to later generations.
But then, these same lifesaving social instincts didn’t readily lend themselves to exploration, artistic expression, romance, inventiveness and experimentation — the other human drives that make for a vibrant civilization.
To free up those, we needed something that would suppress the rigid social codes that kept our clans safe and alive. We needed something that, on occasion, would let us break free from our biological herd imperative — or at least let us suppress our angst when we did.
We needed beer.
Luckily, from time to time, our ancestors, like other animals, would run across fermented fruit or grain and sample it. How this accidental discovery evolved into the first keg party, of course, is still unknown. But evolve it did, perhaps as early as 10,000 years ago.
Current theory has it that grain was first domesticated for food. But since the 1950s, many scholars have found circumstantial evidence that supports the idea that some early humans grew and stored grain for beer, even before they cultivated it for bread.
Brian Hayden and colleagues at Simon Fraser University in Canada provide new support for this theory in an article published this month (and online last year) in the Journal of Archaeological Method and Theory. Examining potential beer-brewing tools in archaeological remains from the Natufian culture in the Eastern Mediterranean, the team concludes that “brewing of beer was an important aspect of feasting and society in the Late Epipaleolithic” era.
Anthropological studies in Mexico suggest a similar conclusion: there, the ancestral grass of modern maize, teosinte, was well suited for making beer — but was much less so for making corn flour for bread or tortillas. It took generations for Mexican farmers to domesticate this grass into maize, which then became a staple of the local diet.
Once the effects of these early brews were discovered, the value of beer (as well as wine and other fermented potions) must have become immediately apparent. With the help of the new psychopharmacological brew, humans could quell the angst of defying those herd instincts. Conversations around the campfire, no doubt, took on a new dimension: the painfully shy, their angst suddenly quelled, could now speak their minds.
But the alcohol would have had more far-ranging effects, too, reducing the strong herd instincts to maintain a rigid social structure. In time, humans became more expansive in their thinking, as well as more collaborative and creative. A night of modest tippling may have ushered in these feelings of freedom — though, the morning after, instincts to conform and submit would have kicked back in to restore the social order.
Some evidence suggests that these early brews (or wines) were also considered aids in deliberation. In long-ago Germany and Persia, collective decisions of state were made after a few warm ones, then double-checked when sober. Elsewhere, they did it the other way around.
Beer was thought to be so important in many bygone civilizations that the Code of Urukagina, often cited as the first legal code, even prescribed it as a central unit of payment and penance.
Part of beer’s virtue in ancient times was that its alcohol content would have been sharply limited. As far as the research has shown, distillation of alcohol to higher concentrations began only about 2,000 years ago.
Today, many people drink too much because they have more than average social anxiety or panic anxiety to quell — disorders that may result, in fact, from those primeval herd instincts kicking into overdrive. But getting drunk, unfortunately, only compounds the problem: it can lead to decivilizing behaviors and encounters, and harm the body over time. For those with anxiety and depressive disorders, indeed, there are much safer and more effective drugs than alcohol — and together with psychotherapy, these newfangled improvements on beer can ease the angst.
But beer’s place in the development of civilization deserves at least a raising of the glass. As the ever rational Ben Franklin supposedly said, “Beer is living proof that God loves us and wants us to be happy.”
Several thousand years before Franklin, I’m guessing, some Neolithic fellow probably made the same toast.
March 15th, 2013
The Words of Tuck Tuck Tuck
March 14 – April 15, 2013
Tuck Tuck Tuck was the name of the artist Richard Aldrich’s solo music project, active between 1999 and 2001, with records released in 2002 and 2003 on his own Skul record label. The Words of Tuck Tuck Tuck compiles exactly that: all use of printed language surrounding Tuck Tuck Tuck. What is ostensibly a lyric book contains press releases, insert texts and record reviews in addition to the lyrics proper. In this way the book mimics the concerns of Aldrich’s painting practice: objects that contain different tones, purposes and functions, collected together to create a larger, multifaceted understanding of a body.
This exhibition will inaugurate KARMA’s new space at 39 Great Jones St.
March 13th, 2013