By Kathleen Hale
Elle Published: March 24, 2015
“You are calling me from a cell phone,” Fran Lebowitz, the cultural critic, writer, and sometime actress, announces, mere seconds after I say hello. She says it’s the worst connection in the history of the world, and she’ll only continue with the interview if I call her from a landline.
Not having one of those, I suggest we meet at Burger Heaven, where I find her waiting behind a fountain soda the size of a lectern. Eventually she forgives me for being four minutes late, and launches into a series of mini-lectures on fashion—including topics such as: “Yoga pants are ruining women,” “This most recent revival of platform shoes embodies everything that’s wrong with young people,” and “Men in shorts are disgusting”—as well as many other sartorial tirades.
Kathleen Hale: You don’t have a uniform, per se, but you wear a jacket, a men’s shirt with cufflinks, Levi’s jeans, cowboy boots, two gold rings, and tortoiseshell glasses every single day.
Fran Lebowitz: Yes.
Walk me through your outfit.
This jacket is from Anderson and Sheppard in London. I don’t go there, they come to me. Or they did. Now they have a dummy made of me.
What people don’t know is: Clothes don’t really fit you unless they’re made for you. Especially when you wear men’s clothes, like I do. American women think that clothes fit them if they can fit into them. But that’s not at all what fit means.
These cufflinks, actually, were a gift. They’re made from a die that was cut in half.
I get all my shirts made at Hilditch and Key. There’s one in Paris and one in London. There’s not one here, I don’t know why. They’re men’s shirts—they don’t really fit—but I don’t really care if shirts fit perfectly. I have all my suits and jackets made, but I’ve never had a shirt made. I’ll have them shortened, so that there’s not three yards of cloth hanging down. But it’s not as important to me that they fit perfectly.
I used to buy all my shirts at Brooks [Brothers], but that was completely ruined about 20 years ago. They discontinued the shirt I liked. If I had only known this—I mean, if you’re going to discontinue an item that thousands and thousands of people buy, announce it. Say, ‘We will no longer be making our excellent Brooks Brothers cotton shirts that we made for 5,000 years. We’re going to change them in some awful way. We’re alerting you so you can buy a lifetime supply.’ Shirts don’t go bad, they’re not peaches.
I feel very strongly that almost the entire city has copied my glasses. I went to a fashion show during fashion week, and everyone there had on my eyeglasses. Warby Parker has also copied my eyeglasses.
Here’s what started happening: A few years ago, kids—by which I mean, my friends’ kids—started coming up to me and saying, ‘Fran, where’d you get those vintage glasses?’
And I said, ‘They’re not vintage. I’ve just owned them for a long time. They are vintage in the way I am.’
I’m not unhappy that everyone has copied me. There was a period when everyone was wearing those black, oblong glasses. These are better.
As with my perfect white shirts, it never occurred to me that they’d stop making my original tortoiseshell eyeglasses—the ones I started with—but then they did. So now I have glasses that are like the originals, sort of like the originals, kind of like the originals…I have made several attempts to recover what I once had.
The ones I’m wearing right now, I had them made. Now, for someone who didn’t grow up in the Depression, but who basically behaves as if I did (because I was raised by people who did), it’s crazy to me that I didn’t ask up front how much it would cost. They cost so much that I never did it again. I was traumatized by it.
Would you say how much they cost?
I wouldn’t. I’m mortified.
But like, maybe in comparison to something? Like, “My eyeglass frames were about as much as…”
A real tragedy.
So we’ve covered most of what you’re wearing…except for your jeans, and your boots.
I’m wearing my very old cowboy boots because it’s going to pour. But when I was young, shoes were made in New England and I used to wear Bass penny loafers. Or before that I wore Old Maine Trotters, which aren’t made anymore.
But then I got something called a bone spur on my ankle, which there’s no cure for, and I couldn’t walk (it was very painful). And the doctor told me that what would help was wearing shoes with a heel, like cowboy boots. And I was like, great, because I’m 5’6″ in cowboy boots.
So I had my cowboy boots made. It’s very hard to find this man who makes them. (And I’m not going to give out his name because I don’t want you to know what they cost.) I have one pair of very good ones, and two pairs of ones that are ripped to shreds—the ones I have on are ones that I save for the rain.
Why did you have them made? What don’t you like about regular cowboy boots?
I don’t like the pointy toe or the square toe. I have the only pairs I know of that are wingtip cowboy boots.
Have you always worn the same kind of jeans?
I always wore 501 Levi’s. They used to make them in San Francisco. Every size was the same size, which sounds obvious, but you would be surprised—and then, I don’t know, at some point during globalization they started making them in Mexico, and like every other thing they branched out to places you’d never heard of. So now every single size of Levi’s is a different size. They cost less, too, which doesn’t make any sense. I wish that real estate were cheaper and clothes were more expensive. But that’s what young people want: $2 T-shirts that fall apart in the wash.
People care more about trends now than they do about style. They get so wrapped up in what’s happening that they forget how to dress, and they never learn who they are because they never learn how to take care of anything. So much of what my generation was taught regarding clothes was how to make them last. How to wash and care for them.
I take very good care of my clothes. When I get home, I instantly hang up my jacket. If it’s hot outside, I’ll hang it on the shower rod so that it can air out a bit before I put it away. That’s the first thing I do. Then I’ll hang up my shirt if I’m going to wear it again that night, and I change into another shirt that I just wear around the house. It’s from high school and has holes in it. I love it because it’s mine and because nobody sees me in it, ever. I put my cufflinks in their little box. I shoeshine once a week. My jeans go in the washing machine, my shirts go out (they’re starched), and my clothes that need to be dry-cleaned go to the most expensive dry-cleaner. I dry-clean as infrequently as possible—not only because it’s psychotically expensive, but also because who knows what it does to the clothes? Dry…clean. These words don’t go together. Wet clean—that is how you clean. I can’t even imagine the things they do at the drycleaner. I don’t want to know.
How did you arrive at your current style?
I think I always had it, but my tastes just became more expensive.
The biggest difference is that when I was young, I wore sweaters. Crewneck sweaters, with button-down shirts and jeans, every single day. And I think at a certain point in my twenties, I decided that was childish. So I gave away all my beautiful sweaters.
Blue jeans are childish too, obviously. But luckily everyone my age kept wearing them. It used to be that adults did not wear jeans—not men, unless they were construction workers—only teenagers wore them. But I guess my generation just said, “We’re going to keep wearing them until we die, because we’re almost there.”
Besides your sweater phase, was there ever a period in your life where you indulged a style other than your current one—perhaps something you look back on and think, “I can’t believe I did that?”
No. For instance, in the ’60s I never wore bellbottoms. I thought they were ridiculous. It’s a horrible line. I never wore tie-dye, either. If it comes from the ’60s I never wore it.
That’s not true. The army jackets. We used to buy clothes ironically at the Army Navy stores. So I had one of those khaki jackets, which was, you know, covered in anti-war pins and stuff.
But that’s it. Everything else I hated. The first time I ever saw platform shoes in the ’70s, I knew they’d been revived from the ’40s, and I felt sickened. And for whatever reason, they keep getting revived. They’ve come back four times. I wish we could let them die. They want to die. They were horrible then, they’re horrible now. The lines on bellbottoms don’t flatter, and neither do the lines on platforms.
But I guess they have to keep making them because teenagers see them and go, ‘Wow, that’s edgy.’ If you’re 18 right now, you think you invented platform shoes. You think you’re doing something new. You think you’ve invented something so ugly that it’s beautiful.
When we were young, we knew things. We knew basic history, even as it related to fashion. Now, when something reappears, an 18-year-old has no clue that it’s a revival. Despite the fact that they’re almost always online, they don’t get references.
I think that’s part of why visual things are becoming so derivative. Designers now, they all have these things called mood boards. I suppose they think a sense of discovery equals invention. It would be as if every writer had a board with paragraphs of other writers—’Oh, I’ll take a little bit of this, and that, he was really good.’ Yes, he was really good! And that is not a mood board, it is a stealing board.
What’s the point of being young if you’re not going to make new things, I wonder? It’s their job to innovate. That’s the entire point. It’s why they’re here. But I feel like they’re not holding up their end. I think someone my age should look at what young people are wearing and think, ‘What the hell is that?’ Instead I’m totally bored.
To me, the main difference between young people now and the people I was young with isn’t so much style, it’s the relationships they have with their parents. Their parents like them much more than ours liked us. Our parents weren’t our friends. They disapproved of us. All our parents cared about was how we behaved, not how we felt, not what we wanted. But now I see my friends on the phone with their, what, 30-year-old kids? And they’re talking about feelings. You would think this kind of relationship would make these adult children more relaxed, but instead they’re more concerned. Parent-child relationships have become so collegiate. And so when these grown children go into the world, they expect a certain amount of attention. And they’re very disappointed.
Do you think what you’re describing influences the fashion choices of that generation?
Because of the Internet everybody sees the same stuff. You can buy the clothes of New York, even if you’re not living there. So I think that the accessibility, in this case, drives buying choices more than anything else.
More people should be dressing like we dress in New York anyway. Not everyone in New York looks great, but you have a higher chance.
I have to say that one of the biggest changes in my lifetime is the phenomenon of men wearing shorts. Men never wore shorts when I was young. There are few things I would rather see less, to tell you the truth. I’d just as soon see someone coming toward me with a hand grenade. This is one of the worst changes, by far. It’s disgusting. To have to sit next to grown men on the subway in the summer, and they’re wearing shorts? It’s repulsive. They look ridiculous, like children, and I can’t take them seriously.
It’s like any other sort of revealing clothing, in that the people you’d most like to see them on aren’t wearing them. And if they are, it’s probably their job to wear them. My fashion advice, particularly to men wearing shorts: Ask yourself, ‘Could I make a living modeling these shorts?’ If the answer is no, then change your clothes. Put on a pair of pants.
You know when George Plimpton died, someone told me, ‘He was so eccentric. He used to ride his bike in a suit and tie!’ and it drove me crazy. I said, ‘What’s eccentric is the bicycle. Everyone here used to wear suits and it was lovely! But only children rode bicycles.’ The trademark of New York City fashion used to be that we dressed more seriously here. More formally. Now people need special costumes to ride bicycles. I mean, a helmet, what, are you an astronaut??
To a certain extent people still dress formally. Of course, more people should wear overcoats than those damned down jackets. Please. Are you skiing, or are you walking across the street? If you’re not an arctic explorer, dress like a human being.
All these clothes that you see people wearing, the yoga clothes—even men wear them!—it’s just another way of being in pajamas. You need more natural beauty to get away with things like that. The great thing about clothes is that they’re artificial—you can lie, you can choose the way you look, which is not true of natural beauty. So if you’re naturally beautiful, wear what you want, but that’s .01% of people. Most people just aren’t good looking enough to wear what they have on. They should change. They should get some slacks and a nice overcoat.
For instance: remember when the style was incredibly messy hair? That’s great if you’re a model. But if you’re not a model, you would look better if you washed your hair, because you are not beautiful.
On the one hand I think it’s hilarious that so many people think they look fantastic, because they’re wearing clothes that you should only wear if you look fantastic. If you walked around New York you would think there was a terrible mirror famine. There might be drought here, a wheat famine there, but in New York you have a mirror famine. Because everything people wear, you have to assume they bought it.
Where was the mirror? I sometimes feel like handing out citations.
Let’s do it. I’m going to name some names, and you let me know if one deserves a citation.
I danced with Michael Jackson. After Social Studies came out. Andy [Warhol] invited him. My best friend Lisa Robinson knew him from the time he was a child. So she introduced me to him, and he asked me to dance. And I danced. I was a great dancer. Not as great as Michael Jackson, but good. I don’t remember what he was wearing. I don’t remember caring.
What about Dolly Parton?
I know Dolly. What about her?
What do you think about her style?
It’s great because she invented it for herself and she can wear it. It never caught on because you have to be Dolly. The extreme, exaggerated femininity is, for most people, not so great a look. Except for drag queens, because that’s what drag queens do.
I mean, I always thought it would be much wittier for drag queens to dress in this very drab way. You know, the yoga pants? Well, what if drag queens just really let themselves go, pretending not to try, like most women?
But there are no drag queens like that, because drag queens know how to wear clothes. Can you imagine if women tried as hard as drag queens? We’d be a much more attractive culture. I wouldn’t have to give out so many yoga pants citations.
What about Hillary Clinton?
I think her lack of style comes naturally. I do, I really do. She has no style, zero. Of course there’s millions of women like this, it’s just that not everyone’s looking at them constantly.
But I would not say her look (I won’t even call it style) is so imposed on her. Yes, there’s a narrow parameter for a woman that public, but I don’t feel that inside of Hillary Clinton there’s a Jane Birkin waiting to get out. I don’t think she cares. I don’t think she is interested in how her house looks, where her furniture is from—I don’t think she has any visual interests. And there’s nothing wrong in not caring. A man who doesn’t care about what he looks like, he’s applauded. We say, ‘Oh, he’s not superficial!’
I, myself, am deeply superficial.
What do you look for in a woman’s outfit?
I notice her clothes if she knows how to wear clothes. It’s a trait, not a talent. A person who actually knows how to wear clothes…they would look good in any clothes. You see this especially at the Academy Awards. Even if the dresses are beautiful and expensive and important, the actresses can’t always carry them. Sometimes I feel like saying to them, ‘Act! You know how to act, you’re an actor. You’re about to win an award for (I don’t know) convincingly playing that Venezuelan nun who went to war. Now act like you can wear this dress.’
Maybe it’s superficial to exude a sense of confidence in one’s clothes. But it’s also integral. Yes, if you cover a man’s eyes, he legitimately might not remember what he has on. But is that really worth celebrating, or imitating? Personally I don’t think we need to emulate that level of stupidity. Because look, we have an appearance. Not all of us are beautiful. But we can appear fine looking. So we should. Feeling good about an outfit is the point at which that outfit finally becomes good.
Thanks to DK

March 26th, 2015
SolarCraft workers Joel Overly, left, and Craig Powell install a solar panel on Feb. 26 in San Rafael, Calif. (Justin Sullivan/Getty Images)
By Joby Warrick
The Washington Post Published: March 7, 2015
Three years ago, the nation’s top utility executives gathered at a Colorado resort to hear warnings about a grave new threat to operators of America’s electric grid: not superstorms or cyberattacks, but rooftop solar panels.
If demand for residential solar continued to soar, traditional utilities could soon face serious problems, from “declining retail sales” and a “loss of customers” to “potential obsolescence,” according to a presentation prepared for the group. “Industry must prepare an action plan to address the challenges,” it said.
The warning, delivered to a private meeting of the utility industry’s main trade association, became a call to arms for electricity providers in nearly every corner of the nation. Three years later, the industry and its fossil-fuel supporters are waging a determined campaign to stop a home-solar insurgency that is rattling the boardrooms of the country’s government-regulated electric monopolies.
The campaign’s first phase—an industry push for state laws raising prices for solar customers—failed spectacularly in legislatures around the country, due in part to surprisingly strong support for solar energy from conservatives and evangelicals in traditionally “red states.” But more recently, the battle has shifted to public utility commissions, where industry backers have mounted a more successful push for fee hikes that could put solar panels out of reach for many potential customers.
In a closely watched case last month, an Arizona utility voted to impose a monthly surcharge of about $50 for “net metering,” a common practice that allows solar customers to earn credit for the surplus electricity they provide to the electric grid. Net metering makes home solar affordable by sharply lowering electric bills to offset the $10,000 to $30,000 cost of rooftop panels.
A Wisconsin utilities commission approved a similar surcharge for solar users last year, and a New Mexico regulator also is considering raising fees. In some states, industry officials have enlisted the help of minority groups in arguing that solar panels hurt the poor by driving up electricity rates for everyone else.
“The utilities are fighting tooth and nail,” said Scott Peterson, director of the Checks and Balances Project, a Virginia nonprofit that investigates lobbyists’ ties to regulatory agencies. Peterson, who has tracked the industry’s two-year legislative fight, said the pivot to public utility commissions moves the battle to friendlier terrain for utilities. The commissions, usually made up of political appointees, “have enormous power, and no one really watches them,” Peterson said.
Industry officials say they support their customers’ right to generate electricity on their own property, but they say rooftop solar’s new popularity is creating a serious cost imbalance. While homeowners with solar panels usually see dramatic reductions in their electric bills, they still rely on the grid for electricity at night and on cloudy days. The utility collects less revenue, even though the infrastructure costs — from expensive power plants to transmission lines and maintenance crews — remain the same.
Ultimately, someone pays those costs, said David K. Owens, an executive vice president for Edison Electric Institute, the trade association that represents the nation’s investor-owned utilities.
“It’s not about profits; it’s about protecting customers,” Owens said. “There are unreasonable cost shifts that do occur [with solar]. There is a grid that everyone relies on, and you have to pay for that grid and pay for that infrastructure.”
Whether home-solar systems add significant costs to electric grids is the subject of intense debate. A Louisiana study last month concluded that solar roofs had resulted in cost shifts of more than $2 million that must be borne by Louisiana customers who lack solar panels. That study was immediately disputed by clean energy groups that pointed to extensive ties between the report’s authors and the fossil-fuel lobby.
Other studies commissioned by state regulators in Nevada and Mississippi found that any costs are generally outweighed by benefits. For one thing, researchers found, the excess energy generated by solar panels helps reduce the strain on electric grids on summer days when demand soars and utilities are forced to buy additional power at high rates. Other experts note that the shift to solar energy is helping states meet new federal requirements to reduce greenhouse gas emissions while also producing thousands of new jobs. The residential solar industry currently employs about 174,000 people nationwide, or twice as many as the number of coal miners.
“Independent studies show that distributed solar benefits all ratepayers by preventing the need to build new, expensive power plants or transmission lines,” said Matthew Kasper, a fellow at the Energy & Policy Institute, a pro-solar think tank. “Utilities make their money by building big, new infrastructure projects and then sending ratepayers the bill, which is exactly why utilities want to eliminate solar.”
Residential solar panels have been widely available since the 1970s, but advances in the past decade have transformed home solar energy in many areas from an expensive novelty to a cost-competitive alternative to traditional power.
The average price of photovoltaic cells has plummeted 60 percent since 2010, thanks to lower production costs and more-efficient designs. Solar’s share of global energy production is climbing steadily, and a study last week by researchers from Cambridge University concluded that photovoltaics will soon be able to out-compete fossil fuels, even if oil prices drop to as low as $10 a barrel.
In the United States, utilities have embraced solar projects of their own making, building large solar farms that produce nearly 60 percent of the electricity that comes from the sun’s rays.
“We are pro-solar,” said Edison’s Owens. “We are putting in more solar than any other industry.”
But the arrival of cheaper solar technology has also brought an unexpected challenge to the industry’s bottom line: As millions of residential and business customers opt for solar, revenue for utilities is beginning to decline. Industry-sponsored studies have warned the trend could eventually lead to a radical restructuring of energy markets, similar to earlier upheavals with phone-company monopolies.
“One can imagine a day when battery-storage technology or micro turbines could allow customers to be electric grid independent,” said a 2013 Edison study. “To put this into perspective, who would have believed 10 years ago that traditional wire line telephone customers could economically ‘cut the cord’?”
Support from conservatives
The utility industry’s playbook for slowing the growth of residential solar is laid out in a few frames of the computer slide show presented at an Edison-sponsored retreat in September 2012, in a lakeside resort hotel in Colorado Springs, Colo. Despite a bland title—“Facing the Challenges of a Distribution System in Transition”—the Edison document portrays solar systems as a serious, long-term threat to the survival of traditional electricity providers.
Throughout the country, it noted, lawmakers and regulatory agencies were “promoting policies that are accelerating this transition — subsidies are growing.” The document, provided to The Washington Post by the Energy & Policy Institute, called for a campaign of “focused outreach” targeting key groups that could influence the debate: state legislatures, regulatory agencies and sympathetic consumer-advocacy groups.
Two-and-a-half years later, evidence of the “action plan” envisioned by Edison officials can be seen in states across the country. Legislation to make net metering illegal or more costly has been introduced in nearly two dozen state houses since 2013. Some of the proposals were virtual copies of model legislation drafted two years ago by the American Legislative Exchange Council, or ALEC, a nonprofit organization with financial ties to billionaire industrialists Charles and David Koch.
SolarCraft worker Craig Powell carries a solar panel on the roof of a home in San Rafael, Calif. The average price of photovoltaic cells has plummeted 60 percent since 2010. (Justin Sullivan/Getty Images)
Most of the bills that have been considered so far have been either rejected or vetoed, with the most-striking defeats coming in Republican strongholds, such as Indiana and Utah. There, anti-solar legislation came under a surprisingly fierce attack from free-market conservatives and even evangelical groups, many of which have installed solar panels on their churches.
“Conservatives support solar — they support it even more than progressives do,” said Bryan Miller, co-chairman of the Alliance for Solar Choice and a vice president of public policy for Sunrun, a California solar provider. “It’s about competition in its most basic form. The idea that you should be forced to buy power from a state-sponsored monopoly and not have an option is about the least conservative thing you can imagine.”
Where legislatures failed to deliver, power companies have sought help from regulatory agencies, chiefly the public utility commissions that set rates and fees that can be charged by electricity providers. Here, the results have been more encouraging for power companies.
Last month’s decision to slap monthly surcharges on solar customers in south-central Arizona was hailed as a breakthrough for the utilities in a state that has turned back several similar attempts in the past two years. The Salt River Project, based in Tempe, Ariz., and one of Arizona’s largest utilities, approved the new fee despite furious opposition from solar users, including about 500 people who packed the commission’s hearing room for the Feb. 26 vote.
Solar companies already have filed suit to stop a similar fee increase approved last year by Wisconsin commissioners, and others are watching closely to see if New Mexico’s Public Service Co. will adopt a proposal to impose a monthly surcharge of up to $35 on solar customers there.
Regulators in each of the three states have cited fairness as the reason for the proposed increases. But solar advocates say the real injustice is the ability of electric monopolies to destroy a competitor that offers potential benefits both to consumers and to society.
“It’s really about utilities’ fear that solar customers are taking away demand,” said Angela Navarro, an energy expert with the Southern Environmental Law Center. “These customers are installing solar at their own cost and providing a valuable resource: additional electricity for the grid at the times when the utilities need it most. And it’s all carbon-free.”

March 23rd, 2015
Frank Gehry’s Winton Guest House will be offered at auction. A work of art forced to relocate, this Giorgio Morandi-inspired structure stands as a piece of sculpture in its own right. Wright will find a new steward for this important structure.
May 19, 2015. 12 PM
Thanks to RS

March 20th, 2015
Big Sur, 1986
melamine, foam, wool upholstery
37.8 x 82.7 x 37.8 inches
Opening Reception: Friday, March 20, 2015. 6:30PM
Through May 30, 2015
By George Yancy and Noam Chomsky
NY Times Published: March 18, 2015
George Yancy: When I think about the title of your book “On Western Terrorism,” I’m reminded of the fact that many black people in the United States have had a long history of being terrorized by white racism, from random beatings to the lynching of more than 3,000 black people (including women) between 1882 and 1968. This is why in 2003, when I read about the dehumanizing acts committed at Abu Ghraib prison, I wasn’t surprised. I recall that after the photos appeared President George W. Bush said that “This is not the America I know.” But isn’t this the America black people have always known?
Noam Chomsky: The America that “black people have always known” is not an attractive one. The first black slaves were brought to the colonies 400 years ago. We cannot allow ourselves to forget that during this long period there have been only a few decades when African-Americans, apart from a few, had some limited possibilities for entering the mainstream of American society.
We also cannot allow ourselves to forget that the hideous slave labor camps of the new “empire of liberty” were a primary source for the wealth and privilege of American society, as well as England and the continent. The industrial revolution was based on cotton, produced primarily in the slave labor camps of the United States.
As is now known, they were highly efficient. Productivity increased even faster than in industry, thanks to the technology of the bullwhip and pistol, and the efficient practice of brutal torture, as Edward E. Baptist demonstrates in his recent study, “The Half Has Never Been Told.” The achievement includes not only the great wealth of the planter aristocracy but also American and British manufacturing, commerce and the financial institutions of modern state capitalism.
It is, or should be, well-known that the United States developed by flatly rejecting the principles of “sound economics” preached to it by the leading economists of the day, and familiar in today’s sober instructions to latecomers in development. Instead, the newly liberated colonies followed the model of England with radical state intervention in the economy, including high tariffs to protect infant industry, first textiles, later steel and others.
There was also another “virtual tariff.” In 1807, President Jefferson signed a bill banning the importation of slaves from abroad. His state of Virginia was the richest and most powerful of the states, and had exhausted its need for slaves. Rather, it was beginning to produce this valuable commodity for the expanding slave territories of the South. Banning import of these cotton-picking machines was thus a considerable boost to the Virginia economy. That was understood. Speaking for the slave importers, Charles Pinckney charged that “Virginia will gain by stopping the importations. Her slaves will rise in value, and she has more than she wants.” And Virginia indeed became a major exporter of slaves to the expanding slave society.
Some of the slave-owners, like Jefferson, appreciated the moral turpitude on which the economy relied. But he feared the liberation of slaves, who have “ten thousand recollections” of the crimes to which they were subjected. Fears that the victims might rise up and take revenge are deeply rooted in American culture, with reverberations to the present.
The Thirteenth Amendment formally ended slavery, but a decade later “slavery by another name” (also the title of an important study by Douglas A. Blackmon) was introduced. Black life was criminalized by overly harsh codes that targeted black people. Soon an even more valuable form of slavery was available for agribusiness, mining, steel — more valuable because the state, not the capitalist, was responsible for sustaining the enslaved labor force, meaning that blacks were arrested without real cause and prisoners were put to work for these business interests. The system provided a major contribution to the rapid industrial development from the late 19th century.
That system remained pretty much in place until World War II led to a need for free labor for the war industry. Then followed a few decades of rapid and relatively egalitarian growth, with the state playing an even more critical role in economic development than before. A black man might get a decent job in a unionized factory, buy a house, send his children to college, along with other opportunities. The civil rights movement opened other doors, though in limited ways. One illustration was the fate of Martin Luther King’s efforts to confront northern racism and develop a movement of the poor, which was effectively blocked.
The neoliberal reaction that set in from the late ‘70s, escalating under Reagan and his successors, hit the poorest and most oppressed sectors of society even more than the large majority, who have suffered relative stagnation or decline while wealth accumulates in very few hands. Reagan’s drug war, deeply racist in conception and execution, initiated a new Jim Crow, Michelle Alexander’s apt term for the revived criminalization of black life, evident in the shocking incarceration rates and the devastating impact on black society.
Reality is of course more complex than any simple recapitulation, but this is, unfortunately, a reasonably accurate first approximation to one of the two founding crimes of American society, alongside of the expulsion or extermination of the indigenous nations and destruction of their complex and rich civilizations.
G.Y.: While Jefferson may have understood the moral turpitude upon which slavery was based, in his “Notes on the State of Virginia” he says that black people are dull in imagination, inferior in reasoning to whites, and that male orangutans even prefer black women over their own. These myths, along with the Black Codes following the Civil War, functioned to continue to oppress and police black people. What would you say are the contemporary myths and codes that are enacted to continue to oppress and police black people today?
N.C.: Unfortunately, Jefferson was far from alone. No need to review the shocking racism in otherwise enlightened circles until all too recently. On “contemporary myths and codes,” I would rather defer to the many eloquent voices of those who observe and often experience these bitter residues of a disgraceful past.
Perhaps the most appalling contemporary myth is that none of this happened. The title of Baptist’s book is all too apt, and the aftermath is much too little known and understood.
There is also a common variant of what has sometimes been called “intentional ignorance” of what it is inconvenient to know: “Yes, bad things happened in the past, but let us put all of that behind us and march on to a glorious future, all sharing equally in the rights and opportunities of citizenry.” The appalling statistics of today’s circumstances of African-American life can be confronted by other bitter residues of a shameful past, laments about black cultural inferiority, or worse, forgetting how our wealth and privilege was created in no small part by the centuries of torture and degradation of which we are the beneficiaries and they remain the victims. As for the very partial and hopelessly inadequate compensation that decency would require — that lies somewhere between the memory hole and anathema.
Jefferson, to his credit, at least recognized that the slavery in which he participated was “the most unremitting despotism on the one part, and degrading submissions on the other.” And the Jefferson Memorial in Washington displays his words that “Indeed I tremble for my country when I reflect that God is just: that his justice cannot sleep forever.” Words that should stand in our consciousness alongside of John Quincy Adams’s reflections on the parallel founding crime over centuries, the fate of “that hapless race of native Americans, which we are exterminating with such merciless and perfidious cruelty…among the heinous sins of this nation, for which I believe God will one day bring [it] to judgment.”
What matters is our judgment, too long and too deeply suppressed, and the just reaction to it that is as yet barely contemplated.
G.Y.: This “intentional ignorance” regarding inconvenient truths about the suffering of African-Americans can also be used to frame the genocide of Native Americans. It was the 18th-century Swedish taxonomist Carolus Linnaeus who argued that Native Americans were governed by traits such as being “prone to anger,” a convenient myth for justifying the need for Native Americans to be “civilized” by whites. So, there are myths here as well. How does North America’s “amnesia” contribute to forms of racism directed uniquely toward Native Americans in our present moment and to their continual genocide?
N.C.: The useful myths began early on, and continue to the present. One of the first myths was formally established right after the King of England granted a Charter to the Massachusetts Bay Colony in 1629, declaring that conversion of the Indians to Christianity is “the principal end of this plantation.” The colonists at once created the Great Seal of the Colony, which depicts an Indian holding a spear pointing downward in a sign of peace, with a scroll coming from his mouth pleading with the colonists to “Come over and help us.” This may have been the first case of “humanitarian intervention” — and, curiously, it turned out like so many others.
Years later Supreme Court Justice Joseph Story mused about “the wisdom of Providence” that caused the natives to disappear like “the withered leaves of autumn” even though the colonists had “constantly respected” them. Needless to say, the colonists who did not choose “intentional ignorance” knew much better, and the most knowledgeable, like Gen. Henry Knox, the first secretary of war of the United States, described “the utter extirpation of all the Indians in most populous parts of the Union [by means] more destructive to the Indian natives than the conduct of the conquerors of Mexico and Peru.”
Knox went on to warn that “a future historian may mark the causes of this destruction of the human race in sable colors.” There were a few — very few — who did so, like the heroic Helen Jackson, who in 1880 provided a detailed account of that “sad revelation of broken faith, of violated treaties, and of inhuman acts of violence [that] will bring a flush of shame to the cheeks of those who love their country.” Jackson’s important book barely sold. She was neglected and dismissed in favor of the version presented by Theodore Roosevelt, who explained that “The expansion of the peoples of white, or European, blood during the past four centuries…has been fraught with lasting benefit to most of the peoples already dwelling in the lands over which the expansion took place,” notably those who had been “extirpated” or expelled to destitution and misery.
The national poet, Walt Whitman, captured the general understanding when he wrote that “The nigger, like the Injun, will be eliminated; it is the law of the races, history… A superior grade of rats come and then all the minor rats are cleared out.” It wasn’t until the 1960s that the scale of the atrocities and their character began to enter even scholarship, and to some extent popular consciousness, though there is a long way to go.
That’s only a bare beginning of the shocking record of the Anglosphere and its settler-colonial version of imperialism, a form of imperialism that leads quite naturally to the “utter extirpation” of the indigenous population — and to “intentional ignorance” on the part of beneficiaries of the crimes.
G.Y.: Your response raises the issue of colonization as a form of occupation. James Baldwin, in his 1966 essay, “A Report from Occupied Territory,” wrote, “Harlem is policed like occupied territory.” This quote made me think of Ferguson, Mo. Some of the protesters in Ferguson even compared what they were seeing to the Gaza Strip. Can you speak to this comparative discourse of occupation?
N.C.: All kinds of comparisons are possible. When I went to the Gaza Strip a few years ago, what came to mind very quickly was the experience of being in jail (for civil disobedience, many times): the feeling, very strange to people who have had privileged lives, that you are totally under the control of some external authority, arbitrary and if it so chooses, cruel. But the differences between the two cases are, of course, vast.
More generally, I’m somewhat skeptical about the value of comparisons of the kind mentioned. There will of course be features common to the many diverse kinds of illegitimate authority, repression and violence. Sometimes they can be illuminating; for example, Michelle Alexander’s analogy of a new Jim Crow, mentioned earlier. Often they may efface crucial distinctions. I don’t frankly see anything general to say of much value. Each comparison has to be evaluated on its own.
G.Y.: These differences are vast and I certainly don’t want to conflate them. The post-9/11 era seems to have ushered in an important space for making some comparisons. Some seem to think that Muslims of Arab descent have replaced African-Americans as the pariah in the United States. What are your views on this?
N.C.: Anti-Arab/Muslim racism has a long history, and there’s been a fair amount of literature about it. Jack Shaheen’s studies of stereotyping in visual media, for example. And there’s no doubt that it’s increased in recent years. To give just one vivid current example, audiences flocked in record-breaking numbers to a film, described in The New York Times Arts section as “a patriotic, pro-family picture,” about a sniper who claims to hold the championship in killing Iraqis during the United States invasion, and proudly describes his targets as “savage, despicable, evil … really no other way to describe what we encountered there.” This was referring specifically to his first kill, a woman holding a grenade when under attack by United States forces.
What’s important is not just the mentality of the sniper, but the reaction to such exploits at home when we invade and destroy a foreign country, hardly distinguishing one “raghead” from another. These attitudes go back to the “merciless Indian savages” of the Declaration of Independence and the savagery and fiendishness of others who have been in the way ever since, particularly when some “racial” element can be invoked — as when Lyndon Johnson lamented that if we let down our guard, we’ll be at the mercy of “every yellow dwarf with a pocket knife.” But within the United States, though there have been deplorable incidents, anti-Arab/Muslim racism among the public has been fairly restrained, I think.
G.Y.: Lastly, the reality of racism (whether it’s anti-black, anti-Arab, anti-Jewish, etc.) is toxic. While there is no single solution to racism, especially in terms of its various manifestations, what do you see as some of the necessary requirements for ending racist hatred?
N.C.: It’s easy to rattle off the usual answers: education, exploring and addressing the sources of the malady, joining together in common enterprises — labor struggles have been an important case — and so on. The answers are right, and have achieved a lot. Racism is far from eradicated, but it is not what it was not very long ago, thanks to such efforts. It’s a long, hard road. No magic wand, as far as I know.

March 19th, 2015
Untitled (Alice Neel and John Rothschild in the Bathroom), 1935
Watercolor and pencil on paper
11 7/8 x 8 7/8 inches (30.2 x 22.5 cm)
Drawings and Watercolors 1927-1978
Through April 18, 2015

March 17th, 2015
Slab City, which lies between the Salton Sea and a military bombing range, has lured those trying to escape society.
Credit: Monica Almeida/The New York Times
By ERIK ECKHOLM
NY Times Published: March 11, 2015
NILAND, Calif. — Over the decades, the legend of Slab City as “The Last Free Place in America” has grown, fed by portrayals in magazines and film as an endless Woodstock.
An unregulated squatter settlement, Slab City is home to perhaps 150 year-round residents — refugees from mortgages and bill collectors, former hippies, rebels and self-identified misfits — who live in personal camps made from old trailers, truck campers and crude lean-tos, and call themselves Slabbers. From October to April, the population swells to perhaps 2,000 as snowbirds, attracted by the guaranteed sunshine and zero fees, arrive in sometimes majestic motor homes.
“This is the only place I have ever lived where I feel I belong,” said Christina Swistak, who goes by the nickname Dreamcatcher and moved to “The Slabs” three years ago after drifting from Arizona.
But now, the denizens of this bleak stretch of desert between the Salton Sea and a military bombing range are bitterly divided. After the notion spread that the California State Land Commission might sell the land, the Slabbers started debating what to do: Should they try to buy the place that they occupy illegally? Should they form a residents’ association to save the anarchistic soul of Slab City, or would that spawn the type of bureaucracy that people came here to escape?
“Some people come here just to disappear, and we don’t want leaders,” said Ms. Swistak, 41, part of the faction that says that residents must organize and seek title to the land before someone else does. “But sadly, yes, we have to appease The Beast,” she said, using the label here for the looming outside world.
The debate has tested friendships, with some accusing the leaders of the recently incorporated Slab City Community Group of seeking personal power. Opponents of the group say that the state is unlikely to sell the land and that its efforts will lead to the dreaded zoning and building codes, health and sanitation rules.
“We might as well go back to the suburbs,” wrote one opponent, Michael “Solar Mike” Gohl, a seller here of solar panels, on the community group’s website.
Named for the abandoned foundations of a 1940s Marine base, Slab City has no electric lines, no water, no sewage, no taxes and no rules beyond “live and let live.” Yet it is not truly off the grid: Cellular service is decent, many residents have laptops, and everyone drives into nearby Niland or other towns for propane gas, generator fuel and water.
Those with Social Security or pensions live well; those without scrounge for odd jobs and are sometimes arrested collecting scrap metal from the nearby Chocolate Mountain Aerial Gunnery Range. Some residents demarcate their plots with chain-link fences or old tires, and many have installed solar panels.
There is no prototypical resident, but Morgan Wolf, 66, and her partner, Cody Parsons, 63, tell a familiar story. Ms. Wolf, who was on her bike on a recent day leading three dogs for a walk, lost her job as a social worker in Lancaster, Calif., in 2010. The couple lost their house and decided to give Slab City a try.
They built a fenced compound with old campers, a trailer and a tent of plywood and plastic. They have stayed here through the scorching summers because it is hard to move all their dogs and cats, she said. But she has learned to love the place.
“I’ll live here for the rest of my life if I can,” she said. “It filters out the world.”
The hippie gloss was burnished by scenes in the 2007 movie “Into the Wild,” directed by Sean Penn. In the film, actors mingle with real Slabbers at the weekly music fest at an outdoor theater called The Range. They also visit Salvation Mountain, a gloriously painted monument to Jesus built from mud and hay at the entrance to Slab City, and speak on camera with its infectious creator, Leonard Knight, who died in 2014.
How the land discussions even started is in dispute, but the serious illness of Mr. Knight seems to have set things in motion.
A group of Mr. Knight’s admirers formed a nonprofit group, Salvation Mountain, and began talking with state land officials about buying a 160-acre lot and running it as a cultural shrine. Then another group organized the Chasterus Foundation, which hopes to buy 30 acres and nurture the East Jesus arts center, an artists’ colony and garden of fanciful sculptures made from discarded materials. (“East Jesus” is meant as a synonym for “middle of nowhere.”)
William “Builder Bill” Ammon, 66, who moved here 16 years ago after becoming homeless in San Diego, said he grew worried after a 2013 meeting with the California State Lands Commission, which was considering Salvation Mountain’s proposal for a purchase. The commission is supposed to use or sell its lands for the benefit of the teachers’ retirement fund, but it has never figured out what to do with this remote patch, which may well be polluted with military waste.
“I came out of that meeting thinking that all of Slab City is in danger,” Mr. Ammon said. “I had the impression that the state saw this land as a liability.”
Mr. Ammon, who runs The Range, and others formed the Slab City Community Group, elected a board early last year and have inquired about obtaining 450 acres in a trust. The group’s current president is “Pastor Dave,” Joel David Huntington, 53, who runs a ministry from his trailer with boxes of free clothes out front.
“You can’t remain static, or you go backward,” said Mr. Huntington, an 11-year resident, describing his vision of Slab City as a “living laboratory” with lessons for the world on the frugal life.
But Gary Brown, 63, a vociferous opponent of the community group, has a less lofty vision of Slab City’s purpose.
“I see the necessity to have a place where people can hit rock bottom,” he said, recalling his own arrival six years ago after he went broke in Denver. The state has put up with the squatters for more than 40 years, he said. Now, he fears, residents are going to stir up trouble and force the state to take actions that will destroy the settlement.
State officials, for their part, stress the complex issues they face and say that no sale is imminent. The lands commission is in the process of having the land appraised and surveying possible needs for chemical cleanup or even disposal of unexploded ordnance.
“We have not yet decided whether to sell it,” said Jim Porter, a land management specialist with the commission.
For now, at least, Slab City, with all its factions and characters, seems to work, even if it was never quite the Eden that some portrayed. The Imperial County Sheriff’s office, which patrols the area, said it had received 62 calls for services around Slab City in January and February, often in response to heated arguments over camp boundaries and sometimes for burglaries. Methamphetamine use has been a recurrent problem.
Mr. Ammon, who sees organizing as a necessity, expresses nostalgia for a Slab City golden age.
“People communicated on CB radio, so it was like a party line,” he said. “You always knew who was looking for a job, who was selling something, who was mad at someone.
“And everyone had some little gig — one was a welder, one fixed cars, one made pies,” Mr. Ammon said. “Nowadays, a lot of people are just coming to hang out for the party.”

March 12th, 2015
Cattle graze on farmland owned by Terry McAlister, near Electra, Tex. Mr. McAlister converted to no-till farming for its apparent economic benefits. Credit: Brandon Thibodeaux for The New York Times
By ERICA GOODE
NY Times Published: March 9, 2015
FORT WORTH — Gabe Brown is in such demand as a speaker that for every invitation he accepts, he turns down 10 more. At conferences, like the one held here at a Best Western hotel recently, people line up to seek his advice.
“The greatest roadblock to solving a problem is the human mind,” he tells audiences.
Mr. Brown, a balding North Dakota farmer who favors baseball caps and red-striped polo shirts, is not talking about disruptive technology start-ups, political causes, or the latest self-help fad.
He is talking about farming, specifically soil-conservation farming, a movement that promotes leaving fields untilled, “green manures” and other soil-enhancing methods with an almost evangelistic fervor.
Such farming methods, which mimic the biology of virgin land, can revive degenerated earth, minimize erosion, encourage plant growth and increase farmers’ profits, their proponents say. And by using them, Mr. Brown told more than 250 farmers and ranchers who gathered at the hotel for the first Southern Soil Health Conference, he has produced crops that thrive on his 5,000-acre farm outside of Bismarck, N.D., even during droughts or flooding.
He no longer needs to use nitrogen fertilizer or fungicide, he said, and he produces yields that are above the county average with less labor and lower costs. “Nature can heal if we give her the chance,” Mr. Brown said.
Neatly tilled fields have long been a hallmark of American agriculture and its farmers, by and large traditionalists who often distrust practices that diverge from time-honored methods.
But soil-conservation farming is gaining converts as growers increasingly face extreme weather, high production costs, a shortage of labor and the threat of government regulation of agricultural pollution.
Farmers like Mr. Brown travel the country telling their stories, and organizations like No-Till on the Plains — a Kansas-based nonprofit devoted to educating growers about “agricultural production systems that model nature” — attract thousands.
“It’s a massive paradigm shift,” said Ray Archuleta, an agronomist at the Natural Resources Conservation Service, part of the federal Agriculture Department, which endorses the soil-conservation approach.
Government surveys suggest that the use of no-tillage farming has grown sharply over the last decade, accounting for about 35 percent of cropland in the United States.
For some crops, no-tillage acreage has nearly doubled in the last 15 years. For soybeans, for example, it rose to 30 million acres in 2012 from 16.5 million acres in 1996. The planting of cover crops — legumes and other species that are rotated with cash crops to blanket the soil year-round and act as green manure — has also risen in acreage about 30 percent a year, according to surveys, though the total remains small.
Farmers till the land to ready it for sowing and to churn weeds and crop residue back into the earth. Tilling also helps mix in fertilizers and manure and loosens the top layer of the soil.
But repeated plowing exacts a price. It degrades soil, killing off its biology, including beneficial fungi and earthworms, and leaving it, as Mr. Archuleta puts it, “naked, thirsty, hungry and running a fever.”
Degraded soil requires heavy applications of synthetic fertilizer to produce high yields. And because its structure has broken down, the soil washes away easily in heavy rain, taking nitrogen and other pollutants with it into rivers and streams.
Soil health proponents say that by leaving fields unplowed and using cover crops, which act as sinks for nitrogen and other nutrients, growers can increase the amount of organic matter in their soil, making it better able to absorb and retain water.
Mr. McAlister uses cover crops, like this white turnip, to preserve water and prevent erosion on his farm. Credit: Brandon Thibodeaux for The New York Times
“Each 1 percent increase in soil organic matter helps soil hold 20,000 gallons more water per acre,” said Claire O’Connor, a staff lawyer and agriculture specialist at the Natural Resources Defense Council.
In turn, more absorbent soil is less vulnerable to runoff and more resistant to droughts and floods. Cover crops also help suppress weeds. Environmental groups like the Defense Council have long been fans of soil-conservation techniques because they help protect waterways and increase the ability of soil to store carbon dioxide, rather than releasing it into the air, where it contributes to climate change.
One recent study led by the Environmental Defense Fund suggested that the widespread use of cover crops and other soil-health practices could reduce nitrogen pollution in the Upper Mississippi and Ohio River basins by 30 percent, helping to shrink the giant “dead zone” of oxygen-depleted water in the Gulf of Mexico. The Defense Council, Ms. O’Connor said, has proposed that the government offer a “good driver” discount on federal crop insurance for growers who incorporate the practices.
But the movement also has critics, who argue that no-tillage and other methods are impractical and too expensive for many growers. A farmer who wants to shift to no-tillage, for example, must purchase new equipment, like a no-till seeder.
Tony J. Vyn, a professor of agronomy at Purdue, said the reasons growers cite for preferring to fully till their fields vary depending on geography, the types of crops they grow and the conditions of their soil. But they include the perception that weed control is harder using no-tillage; that the method, which reduces water evaporation, places limits on how early in the year crops can be planted; and that the residue left by no-tilling is too difficult to deal with, especially when corn is the primary cash crop.
Even farmers who enthusiastically adopt no-till and other soil-conservation methods rarely do so for environmental reasons; their motivation is more pragmatic.
“My goal is to improve my soil so I can grow a better crop so I can make more money,” said Terry McAlister, who farms 6,000 acres of drought-stricken cropland in North Texas. “If I can help the environment in the process, fine, but that’s not my goal.”
For years, Mr. McAlister plowed his fields, working with his father, who began farming outside the town of Electra in the 1950s. But he began having doubts about the effects of constant tilling on the soil.
“We were farming cotton like the West Texas guys were, just plow, plow, plow,” he said. “And if you got a rain, it just washed it and eroded it.
“It made me sick,” he said. “You’re asking yourself, ‘Is there not a better way?’ But at the time, we didn’t know.”
Mr. McAlister said that he switched to no-tillage in 2005, when an agricultural economist calculated that the method offered a $15-per-acre advantage over full tilling.
Now he is a convert. Standing in a field of winter wheat, he pointed proudly at the thick blanket of stubble sprinkled with decaying radishes and turnips.
“One of the toughest things about learning to do no-till is having to unlearn all the things that you thought were true,” he said.
Mr. McAlister grows cotton, wheat, hay, grain sorghum and some canola as cash crops, using a GPS-guided no-till seeder that drills through residue, allowing him to plant precisely and effectively.
He credits no-tillage for one of his biggest wheat crops, in 2012, when extreme drought left farmers throughout the region struggling to salvage any harvest. His healthier soil, he believes, made better use of the tiny amount of rain that fell than did the fully tilled fields of other farmers.
But few growers go as far as Mr. Brown in North Dakota, who produces grass-fed beef and has given up most agricultural chemicals. Mr. McAlister, for example, still uses nitrogen fertilizer. He plants seeds that are genetically modified for drought or herbicide resistance. And he depends on herbicides like Roundup to kill off his cover crops before sowing the crops he grows for cash.
The philanthropist Howard G. Buffett, a proponent of soil-conservation practices, said that the drought and flooding that have plagued much of the country in recent years have drawn more farmers to no-till.
“When you get into a drought, that gets everybody’s attention,” said Mr. Buffett, the middle son of Warren E. Buffett, the billionaire investor. “Farmers don’t really change their behavior until they see that they have to, which is pretty much human nature.”
The Environmental Protection Agency’s regulation of nutrient pollution in the Chesapeake Bay under the Clean Water Act in 2010, Mr. Buffett said, should also be “a wake-up call that the E.P.A. is coming soon” and if farmers do not address fertilizer runoff, the government will do it for them.
Still, he said, reaping the benefits of no-tillage farming demands patience, given that it may take several years for deadened soil to recover. Some farmers try no-tilling for one season and then get discouraged. And there is no one-size-fits-all solution: Farmers must adapt what they have learned to their own land and crops.
Mr. McAlister and other no-till farmers said that perhaps the biggest barrier to the spread of no-till is the mind-set that farmers must do things the same way as earlier generations did them.
“We have a saying in our area: ‘You can’t no-till because you haven’t buried your father yet,’” Mr. McAlister said.
“You can’t take on an endeavor like this with someone leaning over your shoulder every day telling you you’re wrong and it’s not going to work,” he said.
Thanks to Jonathan Maghen

March 10th, 2015
Wayne Gonzales, Untitled, 2009
Acrylic on canvas
84 x 84 inches
(213.4 x 213.4 cm)
ORGANIZED BY BOB NICKAS
THROUGH APRIL 11, 2015
NATHANIEL AXEL, LISA BECK, SADIE BENNING, SASCHA BRAUNIG, ALEX BROWN, MATHEW CERLETTY, WAYNE GONZALES, JOANNE GREENBAUM, DANIEL HESIDENCE, MAMIE HOLST, CANNON HUDSON, CHIP HUGHES, XYLOR JANE, ROBERT JANITZ, ERIK LINDMAN, NIKHOLIS PLANCK, DAVID RATCLIFF, NICOLAS ROGGY, IVAN SEAL, RICHARD TINKLER, STANLEY WHITNEY

March 9th, 2015
Members of the Wrecking Crew in a recording session with producer Phil Spector
BY PETER GILSTRAP
LA Weekly Published: March 3, 2015
If you think — and want to continue to think — that Brian Wilson played that signature roller-rink organ on “California Girls,” read no further. If you’ve labored under the illusion that Karen Carpenter tapped out that delicate drum part on “Close to You,” or that Papa John Phillips strummed the sweeping intro to “California Dreamin’,” prepare for a rude awakening.
On hundreds of hits from the late 1950s through the mid-’70s by acts such as The Byrds, The Mamas and the Papas, Elvis Presley, Harry Nilsson, The Beach Boys, Sam Cooke, The Carpenters, The Ronettes, Simon and Garfunkel, Frank and Nancy Sinatra and many, many more, the backing band was a group of faceless studio musicians.
The jazz-trained instrumentalists were L.A.’s first-call players for pop, TV and movie work. They were the consummate pros, the fixers, the one-takers, the guys (and gal) behind the guys.
They were the Wrecking Crew.
Back when L.A.’s recording scene was a hit-minting machine that ruled the airwaves, they worked up to four three-hour sessions a day. Some say they slept in the studio. Huge money was made. Family lives suffered. Marriages crumbled.
Yet they clocked in and out, somehow always sounding inspired for the big names and pretty faces on the record covers, creating what has become the soundtrack to two decades of American life.
But who were these deft, anonymous masters?
Director Denny Tedesco tackles that question with The Wrecking Crew, a heartfelt, engrossing documentary 19 years in the making, which finally sees theatrical release in New York and Los Angeles on March 13. It joins the formidable ranks of behind-the-scenes music docs including Muscle Shoals, Standing in the Shadows of Motown and 20 Feet From Stardom, and it’s a story Tedesco is singularly qualified to helm.
His late father, guitarist Tommy Tedesco, was one of the core members of that integral session group, and a man whose sense of humor was as big as his six-string talent. The guitar intros to TV’s The Twilight Zone, Green Acres, Bonanza, M*A*S*H and Batman? That’s Tommy.
Tommy Tedesco, originally from Niagara Falls, N.Y., succumbed to cancer in 1997 after a decades-long, three-pack-a-day smoking habit. His illness was the catalyst for the documentary.
“When they said he had a year to live, my concern was, if I don’t do it, it’s going to be the biggest regret of my life,” says Denny, who had worked in Hollywood as a grip and set decorator but, in terms of directing, “had no idea” what he was doing. “It wasn’t going to be just about my dad; it was going to be about the group of them.”
“Them” is a bit tough to define. It was not “a set group of musicians,” Tedesco explains. Depending on who you talk to, it’s “15, 20, 35 players,” but the core group included the bassist extraordinaire Carol Kaye, drummers Hal Blaine and Earl Palmer, guitarists Al Casey, Tommy Tedesco and Glen Campbell (later of “Rhinestone Cowboy” fame), keyboardists Don Randi and Leon Russell and sax player Plas Johnson.
Tedesco began shooting in 1996, embarking on a job that would see him interview 76 musicians, producers, writers, arrangers and engineers; 29 made the final cut. He shot “on 16mm film, 8mm, 3/4[-inch] tape and Beta tape,” he says. “Everything but IMAX.”
He went into debt. His wife, Susie, footed the family bills. Friends donated. He used Kickstarter. His biggest financial hurdle was licensing 110 songs, including some of the biggest hits known to man.
“We had a $750,000 bill before we could even release this film theatrically,” Tedesco explains, “so no one was touching us. We still had this thing around our neck. Documentaries don’t sell, and music docs are the worst.”
Interviews and images were available, but archival film, not so much. There’s likely more footage of Bigfoot sashaying through the woods than there is of the Crew actually playing sessions. In fact, film of the group would be nonexistent were it not for Hal Blaine.
Blaine, now 86, started drumming behind strippers in mob-run Chicago clubs in the late 1940s. In the studios of L.A., he played on 40 No. 1 singles and 150 Top 10 cuts. He supplied the beat behind eight Record of the Year Grammys and was the rock in Phil Spector’s Wall of Sound.
Think of the Ronettes’ “Be My Baby.” Boom. Boom-Boom. Crack. Boom. Boom-Boom. Crack. That’s Hal Blaine.
He retired to Palm Desert years ago, to a nice pad in a gated community. You approach the house and hear a dog growling and barking. You see pictures of a Doberman taped in the window by the front door. It opens, and there’s Hal. No Doberman, however.
“That’s Otto,” Blaine says, pointing to a speaker in the entry hall. He shuts it off and chuckles.
As to that rare Crew footage…
“Somehow I got hold of an 8mm silent pornographic film,” explains the most prolific studio drummer in history. “It was a nasty, filthy thing. Probably was done some time in the ’20s or ’30s.
“I had a camera, and I took it to work and I became a director of sorts. And I’d tell people like Tommy, ‘Hey, Tom, do me a favor. I’m gonna take a film of you. Just come walking into the studio, and all of a sudden pretend you’ve walked into a great big orgy going on here. There’s all these naked women and guys.’ And we’re laughing about it. I did that with Glen Campbell, all the guys. They were happy to do it. I stayed up all night and edited it together. When Denny needed some film footage, I thought of this ridiculous thing.”
Unlike their strictly-business predecessors of the ’40s and ’50s, the Wrecking Crew were cooler, hipper, downright casual.
“They looked down on us and this filthy new rock & roll. We were in Levi’s and T-shirts. These older guys in their ties and blue blazers, carrying around their little ashtrays, said, ‘These kids are going to wreck the business,’” recalls Blaine, who takes credit for coining the moniker Wrecking Crew.
Carol Kaye reportedly disputes the name, insisting that, back in the day, they were called the Clique.
Don Randi has a slightly different take as well. “We were [called] the Wall of Sound. We started with Spector,” says the affable keyboard titan. You may recall the holy keyboard pulse on a little Beach Boys number called “God Only Knows.” That’s Don Randi.
“The Wrecking Crew came later on. It’s an iconic phrase and people love it. But people would call us the Wrecking Crew because we could wreck a [session]. If you were a stupid producer, we could take you on a ride that you’ll never forget,” Randi says.
Whatever you want to call them, the group’s musical contributions are indelible, and Tedesco’s film is a long-overdue homage that puts these familiar strangers into perspective.
“It’s important,” Randi says. “It’s almost a piece of history. It’s a time that won’t be repeated again because the technology has taken all of that away, that liveness that we had. Although now some of the bands are starting to come back to it again. You know, let’s all get in a room and kill one another.”
“You’re only as good as your last hit,” Blaine says, “and no one had more hits than we did.”
March 7th, 2015
By Paul Krugman
NY Times Published: March 6, 2015
If you want to know what a political party really stands for, follow the money. Pundits and the public are often deceived; remember when George W. Bush was a moderate, and Chris Christie a reasonable guy who could reach out to Democrats? Major donors, however, generally have a very good idea of what they are buying, so tracking their spending tells you a lot.
So what do contributions in the last election cycle say? The Democrats are, not too surprisingly, the party of Big Labor (or what’s left of it) and Big Law: unions and lawyers are the most pro-Democratic major interest groups. Republicans are the party of Big Energy and Big Food: they dominate contributions from extractive industries and agribusiness. And they are, in particular, the party of Big Pizza.
No, really. A recent Bloomberg report noted that major pizza companies have become intensely, aggressively partisan. Pizza Hut gives a remarkable 99 percent of its money to Republicans. Other industry players serve Democrats a somewhat larger slice of the pie (sorry, couldn’t help myself), but, over all, the politics of pizza these days resemble those of, say, coal or tobacco. And pizza partisanship tells you a lot about what is happening to American politics as a whole.
Why should pizza, of all things, be a divisive issue? The immediate answer is that it has been caught up in the nutrition wars. America’s body politic has gotten a lot heavier over the past half-century, and, while there is dispute about the causes, an unhealthy diet — fast food in particular — is surely a prime suspect. As Bloomberg notes, some parts of the food industry have responded to pressure from government agencies and food activists by trying to offer healthier options, but the pizza sector has chosen instead to take a stand for the right to add extra cheese.
The rhetoric of this fight is familiar. The pizza lobby portrays itself as the defender of personal choice and personal responsibility. It’s up to the consumer, so the argument goes, to decide what he or she wants to eat, and we don’t need a nanny state telling us what to do.
It’s an argument many people find persuasive, but it doesn’t hold up too well once you look at what’s actually at stake in the pizza disputes. Nobody is proposing a ban on pizza, or indeed any limitation on what informed adults should be allowed to eat. Instead, the fights involve things like labeling requirements — giving consumers the information to make informed choices — and the nutritional content of school lunches, that is, food decisions that aren’t made by responsible adults but are instead made on behalf of children.
Beyond that, anyone who has struggled with weight issues — which means, surely, the majority of American adults — knows that this is a domain where the easy rhetoric of “free to choose” rings hollow. Even if you know very well that you will soon regret that extra slice, it’s extremely hard to act on that knowledge. Nutrition, where increased choice can be a bad thing, because it all too often leads to bad choices despite the best of intentions, is one of those areas — like smoking — where there’s a lot to be said for a nanny state.
Oh, and diet isn’t purely a personal choice, either; obesity imposes large costs on the economy as a whole.
But you shouldn’t expect such arguments to gain much traction. For one thing, free-market fundamentalists don’t want to hear about qualifications to their doctrine. Also, with big corporations involved, the Upton Sinclair principle applies: It’s difficult to get a man to understand something when his salary depends on his not understanding it. And beyond all that, it turns out that nutritional partisanship taps into deeper cultural issues.
At one level, there is a clear correlation between lifestyles and partisan orientation: heavier states tend to vote Republican, and the G.O.P. lean is especially pronounced in what the Centers for Disease Control and Prevention call the “diabetes belt” of counties, mostly in the South, that suffer most from that particular health problem. Not coincidentally, officials from that region have led the pushback against efforts to make school lunches healthier.
At a still deeper level, health experts may say that we need to change how we eat, pointing to scientific evidence, but the Republican base doesn’t much like experts, science, or evidence. Debates about nutrition policy bring out a kind of venomous anger — much of it now directed at Michelle Obama, who has been championing school lunch reforms — that is all too familiar if you’ve been following the debate over climate change.
Pizza partisanship, then, sounds like a joke, but it isn’t. It is, instead, a case study in the toxic mix of big money, blind ideology, and popular prejudices that is making America ever less governable.
March 6th, 2015
A government building in Goshen, N.Y., designed by the noted architect Paul Rudolph, will be demolished unless legislators act this week. Credit Fred R. Conrad/The New York Times
By MICHAEL KIMMELMAN
NY Times Published: MARCH 3, 2015
This week, lawmakers in Goshen, N.Y., have a last chance to save an archetype of midcentury modernist architecture — and themselves from going down as reckless stewards of the nation’s heritage.
The plan is to gut Paul Rudolph’s Orange County Government Center, strip away much of its distinctive, corrugated concrete and glass exterior and demolish one of its three pavilions, replacing it with a big, soulless glass box. Rudolph, who died in 1997, at 78, was a leading light of American architecture when this building, one of his best and most idealistic, opened nearly half a century ago. Like Rudolph, the center suffered abuse over the years but is now being championed by new fans who recognize his genius, and the latest plan as vandalism.
Thursday is the deadline for legislators to override a veto by Steven Neuhaus, the Orange County executive, who seems hell-bent on demolition. In January, Mr. Neuhaus vetoed a bill that would have allowed county officials to consider selling the disputed building to a Manhattan architect who wants to preserve it. The architect, Gene Kaufman, would turn the center into an artists’ residence and exhibition space, a no-brainer.
Mr. Kaufman has also proposed designing a new government center, just next door, for millions less than what the county’s present plan is projected to cost. His offer would accomplish everything legislators say they want, and even add Rudolph’s building to the county tax rolls. The center has been closed since 2011: having allowed it to deteriorate for years, county leaders shuttered it after Hurricane Irene, citing leaks.
This past weekend, The Times Herald-Record, a leading newspaper in the region, pleaded with legislators to reconsider Mr. Kaufman’s proposal.
“Legislators owe it to the people of the county to listen to his plan, to test the assumptions and to compare it to the plan they are in such a hurry to implement,” the editorial argued.
The lawmakers should have second thoughts. Bids for demolition came in last week at nearly twice the price estimated by the design firm, Clark Patterson Lee, that Mr. Neuhaus and his allies have enlisted. Instead of $3.9 million, as Clark Patterson predicted, the two bids came in at $7.4 million and $7.7 million, The Times Herald-Record reported on Saturday.
Officials backing demolition say that debates over the Rudolph center have gone on too long. Entertaining an alternative now would mean more delays. It’s a curious argument, since county legislators themselves are the ones who have the power to expedite, or drag out, consideration of Mr. Kaufman’s plan. As the newspaper’s editorial also noted, “This urge to move on has surfaced repeatedly.” Each time, local officials have “resisted, and each time they avoided doing something irrevocable and more costly than necessary,” it said. “This time is no different.”
So what’s the real problem?
The building, now on the World Monuments Fund’s global watch list, along with Machu Picchu and the Great Wall of China, is rigorous and abstract, beautiful but unlike what’s around it. It wasn’t designed to win a popularity contest.
After I first wrote in defense of Rudolph and the building, a former legislator from Goshen, Rich Baum, reached out to me. Mr. Baum was a minority leader in the county during the 1990s. He believes the current fight is about more than aesthetics — that Rudolph’s architecture makes concrete certain values that irritate lawmakers desperate to demolish it. Mr. Baum gave a few examples.
The building’s atrium, he told me, was where “people interacted with county government, including the Department of Motor Vehicles, the records office and the passport office; a balcony above the main floor led to the legislature, the county executive and the primary county government decision-makers,” he said. “What this meant was that, as the leaders of county government went about their business, there was always the din of people coming in and out and doing their business. Critics said this was impractical. I think it was a purposeful and an inspired idea by Rudolph.”
The legislative chamber was designed so that lawmakers sat in rows facing each other, as in Britain’s House of Commons, not facing in the same direction. The consequence, Mr. Baum said, was that “as the leader of a disempowered minority, my only real opportunity to effect change was to force my colleagues literally to face arguments against their actions.” He added: “The setup of the chamber was constructed to maximize the discomfort and awkwardness of strong disagreements. The building reminded leaders of democratic ideals and fostered tough debate.”
In other words, Rudolph’s design was about openness, transparency, accountability. It was thereby a daily rebuke to how legislators “now run the county,” Mr. Baum said. “That’s why they really hate it.”
This is a provocative theory. Legislators can try to prove it wrong.
They can do the right thing Thursday. They can overturn the veto and reconsider demolition.
March 4th, 2015
March 5 – 8th, 2015
Herald St at Independent
March 4th, 2015
Curated by Ricky Swallow
Organized in collaboration with Arts Project Australia
FRIDAY, MARCH 6 / 6 – 8 PM
March 6 – April 18, 2015
March 4th, 2015
FRIDAY, MARCH 6 / 6 – 8 PM
March 6 – April 18, 2015
March 3rd, 2015
Published: February 28, 2015
Lupita Nyong’o’s stunning pearl-encrusted Oscars dress, which was jacked from her hotel room, has been returned, but it turns out the 6,000 pearls on the dress are all fake.
TMZ received a call on Friday afternoon around 2:30 p.m. from a person who claimed he stole the Calvin Klein dress from Nyong’o’s hotel room at The London West Hollywood. The thief says he left the dress in a bathroom on the second floor of the same hotel, which is where Los Angeles County Sheriff’s deputies recovered the dress after TMZ passed on the tip.
The caller said he noticed the actress’ hotel room door was slightly ajar on Tuesday, and seized the opportunity to move in and take the dress. He and a few accomplices took two pearls off the dress and brought them to the Garment District where they were told they were fake. Although the dress had been reported to be worth $150,000, the thief said the dress was practically worthless and returned the dress to expose “Hollywood’s fake bullshit.”
Whatever it is that’s sewn onto the dress, authorities are still treating it as any other crime. “It doesn’t change anything in our investigation,” said Lieutenant Michael White of the LASD to the L.A. Times.
So who even said the dress was worth six figures and encrusted with thousands of real pearls in the first place? According to TMZ, the only person ever quoted making those claims was Nyong’o’s stylist. A source told the website that Calvin Klein never made any claims about the dress and added, “Do they really make dresses out of real jewels since Cleopatra died?”
March 1st, 2015
Untitled # 1, 2015
8 X 9 X 6 inches
February 28 through March 31, 2015
February 27th, 2015
Kelly Marie Conder, Untitled, 2015
fiber dye, cyanotype, inko dye, flasche on cotton and linen
42 x 52 inches
Saturday, February 28. 3-6PM
Kelly Marie Conder
Richard Aldrich, Two Dancers with Haze in Their Heart Waves Atop a Remake of “One Page, Two Pages, Two Paintings,” 2010.
BY David Salle
ArtNews Posted: 02/23/15
“The Forever Now: Contemporary Painting in an Atemporal World” is MoMA’s first survey of recent painting in over 30 years. In the museum’s crowded sixth-floor galleries, curator Laura Hoptman has corralled 17 artists who have come to notice in the last decade or so, and collectively they give off a synaptic charge. There are a fair number of clunkers, but the majority of the painters here display an honestly arrived-at complexity, expressed through a rigorous series of choices made at what feels like a granularly visual level. Their work rewards hard looking.
The good artists in the show are very good indeed. Charline von Heyl, Josh Smith, Richard Aldrich, Amy Sillman, Mark Grotjahn, Nicole Eisenman, Rashid Johnson, Joe Bradley, and Mary Weatherford have all developed tenacious and highly individual styles. Each makes work that engages the viewer on the paintings’ own terms and that shakes free whatever journalistic shorthand might, in passing, get stuck on them. What drives these artists is resolved in works that are self-reliant and unassailable while remaining open and undogmatic—it’s the ebullience of secular art freed of any ideological task.
Two words one should probably avoid using in exhibition titles are “forever” and “now,” and Hoptman uses both. “Atemporal” comes from a William Gibson story, and Hoptman worked it into a youthful-sounding phrase, but it’s just distracting, like someone talking too loudly while you’re trying to think. She wants to make a point about painting in the Internet age, but the conceit is a red herring—the Web’s frenetic sprawl is opposite to the type of focus required to make a painting, or, for that matter, to look at one.
What does “atemporal” mean, in the context of painting? Judging from Hoptman’s catalogue essay, it’s the confidence, or panache, to take what one likes from the vast storehouse of style, without being overly concerned with the idea of progress or with what something means as a sign. Today, “all eras co-exist at once,” Hoptman writes. She goes on to say that this atemporality is a “wholly unique phenomenon in Western culture.” Big news. The free-agent status accorded the artists in her show is something I take as a good thing—maybe “minding one’s own business” would be a better way of putting it—but her claim for its uniqueness is harder to swallow; it’s more or less what I’ve been advocating for the last 35 years. Not that I take any credit for the idea; within a certain milieu it’s just common knowledge.
In her desire to connect everything to a narrative of the digital future, Hoptman misses the salient difference between the best work here and its immediate antecedents: a sense of structure. By structure I don’t mean only relational composition—though that plays a part—but more generally the sense of a painting’s internal rationale, its “inside energy,” as Alex Katz would say, that alignment of intention, talent, and form. Hoptman wants to make a clean break for her crew from the mores of “appropriation,” but again, the emphasis seems misplaced. Appropriation—as a style—had a tendency to stop short, visually speaking. The primary concern was with “presentation” itself, and the work that resulted was often an analog for the screen, or field, something upon which images composed themselves into some public/private drama. Appropriation pointed to something—some psychological or cultural condition outside of the work itself—that was the basis of its claim to criticality and, at its best, excavated something deep in the psyche. But there are other things in life. At present, painting is focused on structure, discovering and molding pictorial form for its own sake.
Atemporality, then, is nothing new. Most if not all art reaches backward to earlier models in some way; every rupture is also a continuity. The “reaching back” might be to unexpected sources, but imprints of earlier achievements are what give art its gristle and grit. What’s different is the mode of seeing. As an example, Weatherford places tubes of colored neon in front of fields of paint-stained canvas. In the old, appropriationist mind-set, one might get hung up on a list of signifiers along the lines of, say, Mario Merz or Gilberto Zorio meets Helen Frankenthaler; this reductiveness was, from the beginning, an unsatisfying way to see. Pleasantly, reassuringly, more like an old friend showing up after a long absence, arte povera echoes through Weatherford’s work, but it doesn’t feel like a self-conscious reference. Her works clear a space where they can be taken on their own terms. They do, as Ben Jonson said in a somewhat different context, “win themselves a kind of grace-like newness.”
In a related, refreshing development, Warhol’s gloomy, vampiric fatalism is no longer dragging down the party. Duchamp, too, is absent. What a relief. Nothing against the two masters as far as their own work is concerned, but they have exerted such an outsize gravitational pull on generations of artists that finally being out from under them feels like waking from a lurid dream. There is camp in “The Forever Now,” to be sure, and imagery, and irony, and “presentation,” but they are not the main event.
Painting also seems to have shed its preoccupation with photography; here you will find only the faintest nod to “the age of mechanical reproduction.” Even for Laura Owens, who blithely tries on the visual conundrums of the digital world, photography isn’t really part of her DNA. It turns out that much of the art-historical hand-wringing of the last 40 years over Walter Benjamin’s famous prophecy was either misplaced or just plain wrong. Painting is not competing with the Internet, even when making use of its proliferative effects.
Imagery is present to varying degrees in many of these artists’ works. It’s front and center in Eisenman’s paintings, exuberantly evident in Smith’s, lambent in Bradley’s. Drawn forms, some with a goofy, cartoony quality, are often the basis of Sillman’s muscular lyricism. Sillman is a great picture builder; her evocative and gemütlich paintings give the show some real gravitas. Representation even shows up in the trenchant cerebral complexities of von Heyl, but none of these artists is involved with the tradition of realism. They are not translating what can be seen into what can be painted. While everything, even abstraction, is an image in the ontological sense, and there are snatches of imagery in most of these paintings, these artists are simply not imagists; their images are more like the folk melodies in Bartók—present as understructure, there but not there.
The overall tone of “The Forever Now” has a West Coast casual feel about it. Five of the artists in the exhibition—Grotjahn, Weatherford, Owens, Dianna Molzan, and Matt Connors—are based in Southern California, and their work has some of Los Angeles’s take-it-or-leave-it attitude toward materiality. It’s a feeling I remember from living in L.A. in the ’70s: a slightly secondhand relationship to the New York School pieties. The alternative to sober, grown-up painting was an emphasis on materials, often industrial or non-art materials, and on the idea of process itself. The work embodies a youthful vigor without visible strain—in a word, cool. When combined with an internal structural core, the result has a kind of multiplier effect; it wins you over.
(The situation in literature today is not so different; while still avoiding straight realism, the parodists, inventors, miniaturists, and tinkerers are now coming into prominence, taking over from the arid metafictionists. Writers like George Saunders, Ben Marcus, Sam Lipsyte, Sheila Heti, Ben Lerner, and Chris Kraus have clear parallels with painters von Heyl, Weatherford, Bradley, Aldrich, Chris Martin, et al. Painting and advanced writing are now closer in spirit than at any time in living memory.)
But I want to return to that quality that sets apart certain painters in this show—that sense of structure. Like diamonds, Grotjahn’s paintings are the result of great pressure brought to bear on a malleable material over a protracted period of time. His work is a good example of the way in which many artists today are using imagery and history—which is to say, the way that artists mainly always have. Grotjahn manages to simultaneously invoke Cubism, Futurism, Surrealism, and Abstract Expressionism—everyone from Malevich to Victor Brauner—and translate those impulses into an intensely focused, schematic composition that leaves just enough room for his hand to do its stuff.
Much has been made of Grotjahn’s Picassoid heads, but the overall looping structure of his paintings produces an effect closer to Joseph Stella’s 1920s paintings of the Brooklyn Bridge. Grotjahn reimagines Stella’s swooping catenaries into arched ribbons of impasto paint. Because the chunks of color are small and contiguous, they tend to blend together in the viewer’s eye, giving the paintings an alternating current of macro and micro focus. His colors are dark red and burgundy, forest green, warm white, cobalt blue—the colors of silk neckties. They are preppy in a nice way, with a whiff of the 1940s. More importantly, Grotjahn’s color intervals are exacting. They put the painting in a major key. Their simple, clear visual forms—arcs, circles, lozenge and ovoid shapes, like segments of an orange—sometimes overlap and cut into one another, creating a space of increasing, sobering complexity. Grotjahn’s paintings do a funny thing: they achieve great scale through the linear arrangement of small areas of paint, and their structural and imagistic concatenations are in good alignment with the color and paint application. The what and the how are in productive sync. These paintings are tight, shipshape, and very satisfying to look at. At 46, Grotjahn is close on to a modernist master.
Aldrich has been making interesting and surprising paintings for a while, and one of his works here shows great panache. Two Dancers with Haze in Their Heart Waves Atop a Remake of “One Page, Two Pages, Two Paintings,” from 2010, is Aldrich at his least gimmicky and most in tune with the spirit of abstract painting as deconstruction. The painting’s success lies in its loose-limbed sense of structure: a grid- or ladder-like armature along which an array of painted shapes and brush-drawn lines alternate with the interstitial white spaces to form a syncopated rhythm. Its painterly touch calls to mind Joan Mitchell and Philip Guston, and also Robert Rauschenberg’s Winter Pool from 1959—two canvases joined in the middle by a ladder—as well as Rauschenberg’s later Combines. Aldrich’s palette here is sophisticated, just shy of decorator-ish; he takes eight or nine hues and nudges them into perfectly tuned intervals of cream, white, Pompeii red, burnt umber, and a grayed cobalt green—colors that feel at once Mediterranean and Nordic. This particular painting touches on a number of visual cues without leaning too heavily on any of them; the four irregular black rectangles framed by cream-colored bands suggest darkened windows in a cracked plaster wall.
That Aldrich’s painting is reminiscent of earlier paintings while maintaining a clear sense of contemporaneity is perhaps what Hoptman means by “atemporal.” But this is what painting is always about, in one way or another. Rauschenberg’s work of the late ’50s and early ’60s was itself a deconstruction and reconstruction of Abstract Expressionism, freed from its self-importance. Aldrich has taken a lot from that period in Rauschenberg’s work, but his tone is lighter; it has Rauschenberg’s insouciance, without the urgent nervousness. The stakes are different. This is now. Though informal, at times almost flippant, Aldrich’s work is sturdier and more tough-minded than it first appears. His painting says, “Lean on me.”
Susan Sontag observed nearly 50 years ago, in her essay “On Style,” that no self-respecting critic would want to be seen separating form from content, and yet most seem drawn to do just that, after first offering a disclaimer to the contrary. Make that double for curators. The real problem with “The Forever Now” is that it’s two shows: there are the painters who make stand-alone paintings—we don’t need no backstory—and those who use a rectangular-ish surface to do something else. The artists in the former group are the raison d’être for the show; their work has formal inventiveness and pictorial intelligence; it lives in the moment. As for the latter, they are artists who make tip-of-the-iceberg art. What’s on the canvas is the evidence, or residue, of what happens offstage. There’s nothing at all wrong with this in principle, of course, but it can result in an arid busyness that masks a core indecisiveness or, worse, emptiness.
Here is another way to see this: there are pictures that repay our attention with interest and others that simply use it up. The qualities we admire in people—resourcefulness, intelligence, decisiveness, wit, the ability to bring others into the emotional, substantive self—are often the same ones that we feel in art that holds our attention. Less-than-admirable qualities—waffling, self-aggrandizement, stridency, self-absorption—color our experience of work that, for one reason or another, remains unconvincing. By “unconvincing” I mean the feeling you get when the gap between what a work purports to be and what it actually looks like is too big to be papered over.
Such is the case with several of the most celebrated artists included in “The Forever Now.” The problem of grade inflation has been with us since at least the 1920s, when H. L. Mencken, in his American Mercury magazine, coined the term “American boob” to mean our national variant of philistinism. The flip side of “boob-ism,” in Mencken’s formulation, was the wholesale enthusiasm for everything cultural, lest one be thought a philistine. It has created a hell of a lot of confusion ever since.
George Balanchine once complained that the praise had been laid on a little thick. “Everyone’s overrated,” said the greatest choreographer in history. “Picasso’s overrated. I’m overrated. Even Jack Benny’s overrated.” He meant that once it’s decided that someone is great, a misty halo of reverence surrounds everything he or she does. The reality is more prosaic: some things, or some parts of things, will be great and others not. It’s annoying to be overpraised; it’s like showing your work to your parents. The lack of criticality is one of the things that give our current art milieu the feeling of the political sphere (I don’t mean political art). Politics, as a job, is the place where the truth can never be told; it would bring the merry-go-round to a halt.
I decided a long time ago not to write about things I don’t care for. So much work is deeply and movingly realized, and so many artists of real talent are working today that it’s just not worth the time to take an individual clunker to task. There’s an audience for everything—who cares? Besides, one can always be wrong. However, I’m compelled to make an exception in the case of 27-year-old Oscar Murillo. While it’s not his fault for being shot out of the canon too early, I feel one has to say something lest perception be allowed to irretrievably swamp reality. There have always been artists who were taken up by collectors, curators, or journalists; artists who fit a certain narrative but are of little interest to other artists. So why get worked up over it now? Of course it’s not just him. The problem is really one of what constitutes interpretation; it’s the fault line of a deepening divide between how artists and curators see the world. Though it may seem unfair to single out Murillo, the best way to explain why the distinction matters is to describe his work.
Murillo seems to want to say something with his work about palimpsest and memory and being an outsider, but he lacks, to my eye, most of what is needed to make a convincing picture of that type. His grasp of the elements that engage people who paint—like scale, color, surface, image, and line—is journeyman-like at best. His sense of composition is strictly rectilinear; he doesn’t seem to have discovered the diagonal or the arabesque. Worse, he can’t seem to generate any sense of internal pictorial rhythm.
Murillo’s paintings lack personality. He uses plenty of dark colors, scraping, rubbing, dripping, graffiti marks, and dirty tarpaulins—run-of-the-mill stuff, signifiers all. The work looks like something made by an art director; it’s meant to look gritty and “real” but comes across as fainthearted. This is painting for people who don’t have much interest in looking, who prefer the backstory to what is in front of their eyes. Murillo is in so far over his head that even a cabal of powerful dealers won’t be able to save him. He must on some level know this, and so he tries to make up for what’s missing by adding on other effects. One piece in “The Forever Now” is a pile of canvases crumpled up on the floor that viewers can move about as they choose. It’s interactive—get it? MoMA visitors with a long memory will recognize this as a variation on early work by Allan Kaprow, the inventor of Happenings, who wished to mimic the “expressionist” impulses in ’50s paintings and channel them into little games that invited viewer participation with the result that what had once been pictorially alive became pure tedium. To quote Fairfield Porter, writing at the time, “[Kaprow] uses art and he makes clichés….If he wants to prove that certain things can’t be done again because they already have been done, he couldn’t be more convincing.” You can kick Murillo’s canvases around from here to Tuesday—there is no way to bring them to life, because they never lived in the first place.
The real news from “The Forever Now,” the good news, is that painting didn’t die. The argument that tried to make painting obsolete was always a category mistake; that historically determinist line has itself expired, and painting is doing just fine. Painting may no longer be dominant, but that has had, if anything, a salutary effect: not everyone can paint, or needs to. While art audiences have gone their distracted way, painting, like a truffle growing under cover of leaves, has developed flavors both rich and deep, though perhaps not for everyone. Not having to spend so much energy defending one’s decision to paint has given painters the freedom to think about what painting can be. For those who make paintings, or who find in them a compass point, this is a time of enormous vitality.
Juan O’Gorman’s house outside Mexico City (1953–56, demolished 1969)
NY Times Published: FEB. 23, 2015
By Paul Krugman
Regular readers know that I sometimes mock “very serious people” — politicians and pundits who solemnly repeat conventional wisdom that sounds tough-minded and realistic. The trouble is that sounding serious and being serious are by no means the same thing, and some of those seemingly tough-minded positions are actually ways to dodge the truly hard issues.
The prime example of recent years was, of course, Bowles-Simpsonism — the diversion of elite discourse away from the ongoing tragedy of high unemployment and into the supposedly crucial issue of how, exactly, we will pay for social insurance programs a couple of decades from now. That particular obsession, I’m happy to say, seems to be on the wane. But my sense is that there’s a new form of issue-dodging packaged as seriousness on the rise. This time, the evasion involves trying to divert our national discourse about inequality into a discussion of alleged problems with education.
And the reason this is an evasion is that whatever serious people may want to believe, soaring inequality isn’t about education; it’s about power.
Just to be clear: I’m in favor of better education. Education is a friend of mine. And it should be available and affordable for all. But what I keep seeing is people insisting that educational failings are at the root of still-weak job creation, stagnating wages and rising inequality. This sounds serious and thoughtful. But it’s actually a view very much at odds with the evidence, not to mention a way to hide from the real, unavoidably partisan debate.
The education-centric story of our problems runs like this: We live in a period of unprecedented technological change, and too many American workers lack the skills to cope with that change. This “skills gap” is holding back growth, because businesses can’t find the workers they need. It also feeds inequality, as wages soar for workers with the right skills but stagnate or decline for the less educated. So what we need is more and better education.
My guess is that this sounds familiar — it’s what you hear from the talking heads on Sunday morning TV, in opinion articles from business leaders like Jamie Dimon of JPMorgan Chase, in “framing papers” from the Brookings Institution’s centrist Hamilton Project. It’s repeated so widely that many people probably assume it’s unquestionably true. But it isn’t.
For one thing, is the pace of technological change really that fast? “We wanted flying cars, instead we got 140 characters,” the venture capitalist Peter Thiel has snarked. Productivity growth, which surged briefly after 1995, seems to have slowed sharply.
Furthermore, there’s no evidence that a skills gap is holding back employment. After all, if businesses were desperate for workers with certain skills, they would presumably be offering premium wages to attract such workers. So where are these fortunate professions? You can find some examples here and there. Interestingly, some of the biggest recent wage gains are for skilled manual labor — sewing machine operators, boilermakers — as some manufacturing production moves back to America. But the notion that highly skilled workers are generally in demand is just false.
Finally, while the education/inequality story may once have seemed plausible, it hasn’t tracked reality for a long time. “The wages of the highest-skilled and highest-paid individuals have continued to increase steadily,” the Hamilton Project says. Actually, the inflation-adjusted earnings of highly educated Americans have gone nowhere since the late 1990s.
So what is really going on? Corporate profits have soared as a share of national income, but there is no sign of a rise in the rate of return on investment. How is that possible? Well, it’s what you would expect if rising profits reflect monopoly power rather than returns to capital.
As for wages and salaries, never mind college degrees — all the big gains are going to a tiny group of individuals holding strategic positions in corporate suites or astride the crossroads of finance. Rising inequality isn’t about who has the knowledge; it’s about who has the power.
Now, there’s a lot we could do to redress this inequality of power. We could levy higher taxes on corporations and the wealthy, and invest the proceeds in programs that help working families. We could raise the minimum wage and make it easier for workers to organize. It’s not hard to imagine a truly serious effort to make America less unequal.
But given the determination of one major party to move policy in exactly the opposite direction, advocating such an effort makes you sound partisan. Hence the desire to see the whole thing as an education problem instead. But we should recognize that popular evasion for what it is: a deeply unserious fantasy.
Opening Reception: Saturday, February 21st from 6 – 8 pm
Barrett Art Gallery at Santa Monica College
1310 11th St. Santa Monica, CA 90401
Credit Mat Brinkman
By STEPHEN MARCHE
NY Times Published: FEB. 14, 2015
A PART-TIME delivery driver named Peter Nunn was recently sentenced to 18 weeks in a British prison for tweeting and retweeting violent messages to Stella Creasy, a member of Parliament. He never saw his victim but the consequences of his virtual crime were real enough. In a statement, Ms. Creasy described fears for her physical safety, going so far as to install a panic button in her home. Mr. Nunn has been physically separated from the rest of society for posting abusive words on a social media site.
The fact that the case ended up in court is rare; the viciousness it represents is not. Everyone in the digital space is, at one point or another, exposed to online monstrosity, one of the consequences of the uniquely contemporary condition of facelessness.
Every month brings fresh figuration to the sprawling, shifting Hieronymus Bosch canvas of faceless 21st-century contempt. Faceless contempt is not merely topical. It is increasingly the defining trait of topicality itself. Every day online provides its measure of empty outrage.
When the police come to the doors of the young men and women who send notes telling strangers that they want to rape them, they and their parents are almost always shocked, genuinely surprised that anyone would take what they said seriously, that anyone would take anything said online seriously. There is a vast dissonance between virtual communication and an actual police officer at the door. It is a dissonance we are all running up against more and more, the dissonance between the world of faces and the world without faces. And the world without faces is coming to dominate.
Recently Dick Costolo, chief executive of Twitter, lamented his company’s failures to deal with the trolls that infested it: “I’m frankly ashamed of how poorly we’ve dealt with this issue during my tenure as CEO,” he said in a leaked memo. It’s commendable of him to admit the torrents of abuse, but it’s also no mere technical error on the part of Twitter; faceless rage is inherent to its technology.
It’s not Twitter’s fault that human beings use it. But the faceless communication social media creates, the linked distances between people, both provokes and mitigates the inherent capacity for monstrosity.
The Gyges effect, the well-noted disinhibition created by communications over the distances of the Internet, in which all speech and image are muted and at arm’s reach, produces an inevitable reaction — the desire for impact at any cost, the desire to reach through the screen, to make somebody feel something, anything. A simple comment can so easily be ignored. Rape threat? Not so much. Or, as Mr. Nunn so succinctly put it on Twitter: “If you can’t threaten to rape a celebrity, what is the point in having them?”
The challenge of our moment is that the face has been at the root of justice and ethics for 2,000 years. The right to face an accuser is one of the very first principles of the law, described in the “confrontation clause” of the Sixth Amendment of the United States Constitution, but reaching back through English common law to ancient Rome. In Roman courts no man could be sentenced to death without first seeing his accuser. The precondition of any trial, of any attempt to reconcile competing claims, is that the victim and the accused look each other in the face.
For the great French-Jewish philosopher Emmanuel Levinas, the encounter with another’s face was the origin of identity — the reality of the other preceding the formation of the self. The face is the substance, not just the reflection, of the infinity of another person. And from the infinity of the face comes the sense of inevitable obligation, the possibility of discourse, the origin of the ethical impulse.
The connection between the face and ethical behavior is one of the exceedingly rare instances in which French phenomenology and contemporary neuroscience coincide in their conclusions. A 2009 study by Marco Iacoboni, a neuroscientist at the Ahmanson-Lovelace Brain Mapping Center at the University of California, Los Angeles, explained the connection: “Through imitation and mimicry, we are able to feel what other people feel. By being able to feel what other people feel, we are also able to respond compassionately to other people’s emotional states.” The face is the key to the sense of intersubjectivity, linking mimicry and empathy through mirror neurons — the brain mechanism that creates imitation even in nonhuman primates.
The connection goes the other way, too. Inability to see a face is, in the most direct way, inability to recognize shared humanity with another. In a metastudy of antisocial populations, the inability to sense the emotions on other people’s faces was a key correlation. There is “a consistent, robust link between antisocial behavior and impaired recognition of fearful facial affect. Relative to comparison groups, antisocial populations showed significant impairments in recognizing fearful, sad and surprised expressions.” A recent study in the Journal of Vision showed that babies between the ages of 4 months and 6 months recognized human faces at the same level as grown adults, an ability which they did not possess for other objects.
Without a face, the self can form only with the rejection of all otherness, with a generalized, all-purpose contempt — a contempt that is so vacuous because it is so vague, and so ferocious because it is so vacuous. A world stripped of faces is a world stripped, not merely of ethics, but of the biological and cultural foundations of ethics.
For the great existentialist Martin Heidegger, the spirit of homelessness defined the 20th century, a disconnected drifting in a world of groundless artificiality. The spirit of facelessness is coming to define the 21st. Facelessness is not a trend; it is a social phase we are entering that we have not yet figured out how to navigate.
As exchange and communication come at a remove, the flight back to the face takes on new urgency. Google recently reported that on Android alone, which has more than a billion active users, people take 93 million selfies a day. The selfie has become not a single act but a continuous process of self-portraiture. On the phones that are so much of our lives, no individual self-image is adequate; instead a rapid progression of self-images mimics the changeability and the variety of real human presence.
Emojis are an explicit attempt to replicate the emotional context that facial expression provides. Intriguingly, emojis express emotion, often negative emotions, but you cannot troll with them. You cannot send a message of faceless contempt with icons of faces. The mere desire to imitate a face humanizes.
But all these attempts to provide a digital face run counter to the main current of our era’s essential facelessness. The volume of digital threats appears to be too large for police forces to adequately deal with. But cases of trolls’ following through on their online threat of murder and rape are extremely rare. The closest most trolling comes to actual violence is “swatting,” or sending ambulances or SWAT teams to an enemy’s house. Again, neither victim nor perpetrator sees the other.
What do we do with the trolls? It is one of the questions of the age. There are those who argue that we have a social responsibility to confront them. Mary Beard, the British historian, not only confronted a troll who sent her misogynistic messages, she befriended him and ended up writing him letters of reference. One young video game reviewer, Alanah Pearce, sent Facebook messages to the mothers of young boys who had sent her rape threats. These stories have the flavor of the heroic, a resistance to an assumed condition: giving face to the faceless.
The more established wisdom about trolls, at this point, is to disengage. Obviously, in many cases, actual crimes are being committed, crimes that demand confrontation, by victims and by law enforcement officials, but in everyday digital life engaging with the trolls “is like trying to drown a vampire with your own blood,” as the comedian Andy Richter put it. Ironically, the Anonymous collective, a pioneer of facelessness, has offered more or less the same advice.
Rule 14 of their “Rules of the Internet” is, “Do not argue with trolls — it means that they win.”
Rule 19 is, “The more you hate it the stronger it gets.”
Ultimately, neither solution — confrontation or avoidance — satisfies. Even if confrontation were the correct strategy, those who are hounded by trolls do not have the time to confront them. To leave the faceless to their facelessness is also unacceptable — why should they own the digital space simply because of the anonymity of their cruelty?
There is a third way, distinct from confrontation or avoidance: compassion. The original trolls, Scandinavian monsters who haunted the Vikings, inhabited graveyards or mountains, which is why adventurers would always run into them on the road or at night. They were dull. They possessed monstrous force but only a dim sense of the reality of others. They were mystical nature-forces that lived in the distant, dark places between human habitations. The problem of contemporary trolls is a subset of a larger crisis, which is itself a consequence of the transformation of our modes of communication. Trolls breed under the shadows of the bridges we build.
In a world without faces, compassion is a practice that requires discipline, even imagination. Social media seems so easy; the whole point of its pleasure is its sense of casual familiarity. But we need a new art of conversation for the new conversations we are having — and the first rule of that art must be to remember that we are talking to human beings: “Never say anything online that you wouldn’t say to somebody’s face.” But also: “Don’t listen to what people wouldn’t say to your face.”
The neurological research demonstrates that empathy, far from being an artificial construct of civilization, is integral to our biology. And when biological intersubjectivity disappears, when the face is removed from life, empathy and compassion can no longer be taken for granted.
The new facelessness hides the humanity of monsters and of victims both. Behind the angry tangles of wires, the question is, how do we see their faces again?
February 14 – April 5, 2015
Saturday, February 14, 5-7pm
By MICHAEL CHRISTIE
NY Times Published: FEBRUARY 12, 2015
I have broken my wrists, fingers, a tibia, a fibula, chipped a handful of teeth, cracked a vertebra and snapped a collarbone. I have concussed myself in Tallahassee, Fla., and Portland, Ore. I’ve skittered across the sooty hoods of New York cabs and bombed down many of San Francisco’s steepest avenues.
For many years I was a professional skateboarder. I first stepped on a skateboard at 11. The nomenclature — switch-stance frontside tailslide, kickflip to nose manual — was the language of my first friendships, with wild, strange boys who were as ill suited for school and team sports as I was. They were from broken homes. Poor homes. Group homes. We were like little cement mixers, keeping ourselves in constant motion, our skateboard’s movement the only thing preventing us from hardening into blocks of pure rage.
It was through those friends that I first realized the oddity of my own home. Skateboarding gave my mother panic attacks. She bought me helmets and pads (which I never wore), and gasped at my scars and bruises. She would have forbidden me to skateboard at all if she believed for a second that I would comply.
This might sound like typical parental anxiety. But with my mother, it was something deeper.
My mother was agoraphobic, which means she was often housebound, terrified of stores, cars and crowds. Vacations were impossible. As were jobs, and simple errands. She cut our hair, made us clothes, prepared us complex meals. Together we painted and drew, watched movies and read. She taught me to build a bookshelf, reattach a button and piece together a quilt. She worried that school stifled my creativity. So she encouraged me to stay home whenever I wanted. Which was often. School couldn’t compare with the bright spotlight of her attention, and besides, I knew she needed me close by.
I felt uneasy around other kids until that day when I was 11, and I saw a boy outside my house perform an ollie (that magical, clacking leap by which skateboarders temporarily glue their board to their feet and vault into the air). To my mother’s horror, I rushed outside and begged him to let me try. From then on, I realized I needed to be skateboarding in the streets as much as she needed to be safe in the house. I stopped coming home except to shower and sleep.
At 17, I left for good, and spent a decade and a half on the other edge of the continent. I seldom called her. When we talked it felt as if she was trying to siphon something vital from my cells, so I parried her inquiries about my welfare with sharp, monosyllabic replies. It was a time of great anger and resentment, a time I’m not proud of.
But in 2008, when I was 32, my mother got sick with Stage 4 lung cancer. I went home to care for her during a last-ditch chemotherapy regime, and found her in the process of throwing away everything she owned. To prevent some artifact of our family history from being accidentally lofted into the trash, I presented her with boxes and three options: keep, donate or trash. There was plenty of clutter to sort. Mostly it was liquor boxes of paperbacks and her artwork and crafts — the accumulation of a life lived predominantly indoors.
Eventually I dragged a box from her closet and found it was stuffed with skateboard magazines, bookmarks peeking out from their pages. I leafed through one and discovered a picture of myself, five years younger, atop a skateboard mid-tailslide on a wooden handrail, my mouth open and my eyes fixed wide with both terror and joy.
“I didn’t think you could look at these,” I said.
“I took out subscriptions,” she said, avoiding my eyes. “You never sent any photos over the years. These were the only ones of you I could get.” Then she sighed, “Keep.”
A few weeks after her chemotherapy ended and I returned to Vancouver, I woke late one night to a call from my father. My mother had died. I remember sitting at my kitchen table, weeping, clutching my stomach. I had to get outside. I grabbed my skateboard and rolled for hours in the orange streetlights, aimlessly ollieing manholes and weaving between parked cars. Fresh-faced people were power-walking to work by the time I got home.
Five months later, my first son was born. My love for him was instant, soul-flooding. I had trouble taking my eyes off him. We went on long walks while my wife slept. Out with his stroller in the midday traffic, I found that I had suddenly become attuned to the world’s menace, to the human being’s naked vulnerability in the face of it. The city throbbed with dangers that I’d long been insensate to: veering cars, potential kidnappers, toxic exhausts, carelessly discarded needles. It was as though the world had turned double agent and become my enemy.
I developed a Border collie-like attentiveness when it came to my son’s safety. When he stood at the coffee table like a cute little drunk bellying up to a bar, I’d be hovering there, his personal safety net. After my long acquaintance with the physics of crashing, I knew exactly what whap his forehead would make if it hit the lamp, what thunk his cerebellum would issue on the hardwood if he tipped back.
Or perhaps, I worried, it was because of my mother. My inherited brain chemistry, my angst-ridden genes taking over.
Things got worse. I kicked a dog at the park that looked as if it was going to bite him. I complained to my wife that his day care workers were inattentive. If my son choked on something at the table, even momentarily, it would take me an hour and a few drinks to smooth out my nerves. Sleep became impossible.
I never imagined that parenthood meant learning to live with this unrelenting, impaling fear. With the question of when to catch your children and when to let them fall. To date I’ve watched my son’s precious body bounce off concrete, wood and brick. We had another son last year, this one more fearless than his brother. Someday I may be forced to hear their bones snap and see their blood gush. And then, after all that growing and falling, they might move away, far beyond my protective reach. My mother was ill, but she was also right: It is terrifying to be a parent.
It is a cliché to say children teach us about ourselves and about our parents, but it’s true. My sons are teaching me to calm down. I’ve seen pain shape them for the better. I’ve watched a trip to the ground leave them incrementally stronger. I even recently bought them both skateboards, which have yet to interest them.
I’m learning to forgive my mother, for her life lived inside, for her inability to cope. My mother was afraid of everything, yet she was brave. I used to fear nothing, but parenthood has rendered me a coward. I wish I could tell her that now.
But when I picture her leafing through those skateboard magazines she’d collected over the years, skipping over the interviews and advertisements, searching for her reckless, angry son, only to find me falling from the sky in some place she couldn’t follow, I’m certain she understood how I felt then, how I’d feel now.
A skateboard is the most basic ambulatory machine. It has no gears, offers no assistance. It will protect you from nothing. It is a tool for falling. For failure. But also for freedom. For living. On a skateboard you must stay balanced in a tempest of forces beyond your control. The key is to be brave, get low, stay up and keep rolling.
February 20 – March 28, 2015
Opening reception: Friday, February 20, 2015
Washing silk in a river. Artisans on Amami Oshima use the island’s iron-rich mud to turn silk a rich shade of chocolate brown. Credit Kentaro Takahashi for The New York Times
By MARTIN FACKLER
NY Times Published: FEB. 9, 2015
AMAMI OSHIMA, Japan — Kazuhiko Kanai uses the traditional method to dye the elegant kimonos for which this small, semitropical island is renowned: he carries a bundle of pure white silk to a nearby rice paddy and hurls it into the mud.
Mr. Kanai is one of the last practitioners of a method known as “dorozome,” or “mud-dyeing,” which uses the island’s iron-rich soil to turn silk the color of the darkest chocolate. This is just one step in an elaborate production process that can take a year to produce a kimono with the glossiest silk and most intricately woven designs in all Japan. In a nation that esteems its traditional form of dress as high art, Amami Oshima’s kimonos became some of the most prized of them all, once capable of fetching more than $10,000 apiece.
But those heady days are over, as a shift to Western fashions and Japan’s long economic squeeze have led to plummeting demand, especially for high-end kimonos.
On Amami Oshima, production has fallen so far in the last two decades that only 500 people on an island with 73,000 residents remain employed full-time in kimono production, and many of them are in their 70s or 80s. That’s down from 20,000 people a generation ago, according to the Authentic Amami Oshima Tsumugi Association, the island’s union of kimono producers.
Amami Oshima has fallen harder than most of Japan’s famous kimono production centers, dragged down by a complex web of wholesalers, dealers and specialized retailers who distribute and sell the island’s kimonos. While this antiquated system once benefited the remote southern island near Okinawa by spreading its kimonos to the rest of Japan, islanders say it has now become a burden, keeping the kimonos prohibitively expensive while driving down wages.
Yet the old ways have proven hard to discard, despite a growing sense of crisis. Many fret that there will soon be too few islanders left with the skills to sustain each of the 30 separate steps needed to produce one of the kimonos.
“If we lose one link in the chain, we lose our ability to make kimonos,” said Mr. Kanai, 56, who owns a dirt-floored wooden workshop where silk is dyed in bubbling iron caldrons and then hung from the ceiling to dry. “If we cannot make kimonos any more, what will be left here?”
Mr. Kanai says the mud-dyeing process alone takes more than a month, as the silk is first colored a burgundy hue with natural dye made from the pulp of a local plum tree. Getting the right shade of red requires repeating the cycle of staining and drying the silk 30 times, he said. Only then is the silk ready to be immersed in the black mud, whose iron reacts with tannins in the tree dye to create the coveted dark brown color.
That is not the most elaborate step. Even before the silk arrives at Mr. Kanai’s workshop, it is first woven into a temporary fabric as part of a unique method that the islanders have devised for creating minutely detailed patterns.
After this temporary fabric has been mud-dyed, it is unraveled back into its original silk threads. Each colored thread now has thousands of tiny white stripes where it overlapped with another thread, blocking the mud from touching it at that point.
As the threads are rewoven into new fabric by nimble-fingered island women, they slowly reveal perfectly formed patterns, ranging from starkly minimalist shapes to elaborate scenes of bamboo groves and flying storks.
“The weaver has a tremendous responsibility,” said Mifuko Iwasaki, 70, who has been teaching young islanders how to weave these perfectly aligned patterns on hand looms for 35 years. “If we make a mistake, we undo all the hard work of those who spent so much time preparing this thread.”
Ms. Iwasaki says that when she began teaching her yearlong classes, she typically had 40 students, who were drawn by the fact that weaving offered higher wages than fishing, farming and logging, the island’s other industries at the time.
These days, she says she is lucky to get more than two or three students, because weaving no longer pays as well. The myriad of middlemen in the cumbersome distribution system each take a cut, making it hard to reduce prices at the same rate as other items in deflationary Japan. Worse, the brunt of what price cuts have been made inevitably falls on the island’s dyers and weavers. As a result, while a new Oshima kimono can still cost $3,000 to $6,000 in Tokyo, weavers say they are lucky to get more than $400 for a month’s exacting work. Other craftsmen in the production process get even less.
Nonetheless, islanders say they are reluctant to bypass the antiquated distribution system, saying they feel bound by generations-old obligations and a fear of change. This makes them a microcosm of Japan as a whole, which has been slow to give up its outdated postwar economic model despite years of stagnation.
“It is ironic that we can no longer make ends meet producing something so expensive,” said Shigehiko Furuta, 67, who uses colored pens and graph paper to design the minutely detailed patterns.
Shinichiro Yamada, 83, the head of the producers’ union, said the island’s ornately woven patterns have their roots in the colorful culture of the Kingdom of the Ryukyus, centered in current-day Okinawa. The kingdom ruled Amami Oshima until the early 17th century, when the island was conquered by Japanese samurai, who claimed the island’s kimonos as tribute.
Mud-dyeing started when disobedient islanders buried kimonos in the ground to hide them, only to discover on digging them up again that the fabrics had turned a beautiful dark color, said Mr. Kanai, who owns the mud-dyeing workshop.
His son, Yukihito, now uses those same centuries-old dyeing techniques to color new types of items, including T-shirts, jeans and even guitar bodies. He is experimenting with selling these over the Internet, to avoid the onerous distribution system.
“We need to become more like artisans in Europe or artists in New York,” said the younger Mr. Kanai, 35, who said he is one of the few “young successors” in the island’s kimono industry. “Even traditions have to evolve.”