Clock Ticks for Paul Rudolph

A government building in Goshen, N.Y., designed by the noted architect Paul Rudolph, will be demolished unless legislators act this week. Credit Fred R. Conrad/The New York Times

By MICHAEL KIMMELMAN
NY Times Published: MARCH 3, 2015

This week, lawmakers in Goshen, N.Y., have a last chance to save an archetype of midcentury modernist architecture — and themselves from going down as reckless stewards of the nation’s heritage.

The plan is to gut Paul Rudolph’s Orange County Government Center, strip away much of its distinctive, corrugated concrete and glass exterior and demolish one of its three pavilions, replacing it with a big, soulless glass box. Rudolph, who died in 1997, at 78, was a leading light of American architecture when this building, one of his best and most idealistic, opened nearly half a century ago. Like Rudolph, the center suffered abuse over the years, but it is now being championed by new fans who recognize his genius and see the latest plan as vandalism.

Thursday is the deadline for legislators to override a veto by Steven Neuhaus, the Orange County executive, who seems hell-bent on demolition. In January, Mr. Neuhaus vetoed a bill that would have allowed county officials to consider selling the disputed building to a Manhattan architect who wants to preserve it. The architect, Gene Kaufman, would turn the center into an artists’ residence and exhibition space, a no-brainer.

Mr. Kaufman has also proposed designing a new government center, just next door, for millions less than what the county’s present plan is projected to cost. His offer would accomplish everything legislators say they want, and even add Rudolph’s building to the county tax rolls. The center has been closed since 2011: after allowing the building to deteriorate for years, county leaders shut it in the wake of Hurricane Irene, citing leaks.

This past weekend, The Times Herald-Record, a leading newspaper in the region, pleaded with legislators to reconsider Mr. Kaufman’s proposal.

“Legislators owe it to the people of the county to listen to his plan, to test the assumptions and to compare it to the plan they are in such a hurry to implement,” the editorial argued.

The lawmakers should have second thoughts. Bids for demolition came in last week at nearly twice the price estimated by Clark Patterson Lee, the design firm that Mr. Neuhaus and his allies have enlisted. Instead of $3.9 million, as Clark Patterson predicted, the two bids topped $7.4 million and $7.7 million, The Times Herald-Record reported on Saturday.

Officials backing demolition say that debates over the Rudolph center have gone on too long. Entertaining an alternative now would mean more delays. It’s a curious argument, since county legislators themselves are the ones who have the power to expedite, or drag out, consideration of Mr. Kaufman’s plan. As the newspaper’s editorial also noted, “This urge to move on has surfaced repeatedly.” Each time, local officials have “resisted, and each time they avoided doing something irrevocable and more costly than necessary,” it said. “This time is no different.”

So what’s the real problem?

The building, now on the World Monuments Fund’s global watch list, along with Machu Picchu and the Great Wall of China, is rigorous and abstract, beautiful but unlike what’s around it. It wasn’t designed to win a popularity contest.

After I first wrote in defense of Rudolph and the building, a former legislator from Goshen, Rich Baum, reached out to me. Mr. Baum was the minority leader of the county legislature during the 1990s. He believes the current fight is about more than aesthetics — that Rudolph’s architecture makes concrete certain values that irritate lawmakers desperate to demolish it. Mr. Baum gave a few examples.

The building’s atrium was where “people interacted with county government, including the Department of Motor Vehicles, the records office and the passport office; a balcony above the main floor led to the legislature, the county executive and the primary county government decision-makers,” he told me. “What this meant was that, as the leaders of county government went about their business, there was always the din of people coming in and out and doing their business. Critics said this was impractical. I think it was a purposeful and an inspired idea by Rudolph.”

The legislative chamber was designed so that lawmakers sat in rows facing each other, as in Britain’s House of Commons, not facing in the same direction. The consequence, Mr. Baum said, was that “as the leader of a disempowered minority, my only real opportunity to effect change was to force my colleagues literally to face arguments against their actions.” He added: “The setup of the chamber was constructed to maximize the discomfort and awkwardness of strong disagreements. The building reminded leaders of democratic ideals and fostered tough debate.”

In other words, Rudolph’s design was about openness, transparency, accountability. It was thereby a daily rebuke to how legislators “now run the county,” Mr. Baum said. “That’s why they really hate it.”

This is a provocative theory. Legislators can try to prove it wrong.

They can do the right thing Thursday. They can overturn the veto and reconsider demolition.

March 4th, 2015
matt paweski


March 5 – 8th, 2015

Herald St at Independent

March 4th, 2015
Terry Williams


Curated by Ricky Swallow
Organized in collaboration with Arts Project Australia

OPENING RECEPTION:
FRIDAY, MARCH 6 / 6 – 8 PM

March 6 – April 18, 2015

White Columns

March 4th, 2015
bruce m. sherman


OPENING RECEPTION:
FRIDAY, MARCH 6 / 6 – 8 PM

March 6 – April 18, 2015

White Columns

March 2nd, 2015
Six Figure Dress Returned, But It Turns Out The Pearls Were Fake


LAist
Published: February 28, 2015

Lupita Nyong’o’s stunning pearl-encrusted Oscars dress that was jacked from her hotel room has been returned, but it turns out the 6,000 pearls on the dress are all fake.

TMZ received a call on Friday around 2:30 p.m. from a person who claimed he stole the Calvin Klein dress from Nyong’o’s hotel room at The London West Hollywood. The thief says he left the dress in a bathroom on the second floor of the same hotel, which is where Los Angeles County Sheriff’s deputies recovered it after TMZ passed on the tip.

The caller said he noticed the actress’ hotel room door was slightly ajar on Tuesday, and seized the opportunity to move in and take the dress. He and a few accomplices took two pearls off the dress and brought them to the Garment District, where they were told they were fake. Although the dress had been reported to be worth $150,000, the thief said it was practically worthless and returned it to expose “Hollywood’s fake bullshit.”

Whatever it is that’s sewn onto the dress, authorities are still treating the theft as they would any other crime. “It doesn’t change anything in our investigation,” Lieutenant Michael White of the LASD told the L.A. Times.

So who even said the dress was worth six figures and encrusted with thousands of real pearls in the first place? According to TMZ, the only person ever quoted making those claims was Nyong’o’s stylist. A source connected to Calvin Klein told the website that the brand never made any claims about the dress and added, “Do they really make dresses out of real jewels since Cleopatra died?”

March 1st, 2015
David Korty | New Ceramics

Untitled # 1, 2015
Glazed Stoneware
8 x 9 x 6 inches

February 28 through March 31, 2015

South Willard Shop Exhibit

February 27th, 2015
Kelly Marie Conder | Matt Paweski

Kelly Marie Conder, Untitled, 2015
fiber dye, cyanotype, inko dye, flasche on cotton and linen
42 x 52 inches

Saturday, February 28. 3-6PM

Kelly Marie Conder
Matt Paweski

Either Way

February 27th, 2015
STRUCTURE RISING

Richard Aldrich, Two Dancers with Haze in Their Heart Waves Atop a Remake of “One Page, Two Pages, Two Paintings,” 2010.

BY David Salle
ArtNews Posted: 02/23/15

“The Forever Now: Contemporary Painting in an Atemporal World” is MoMA’s first survey of recent painting in over 30 years. In the museum’s crowded sixth-floor galleries, curator Laura Hoptman has corralled 17 artists who have come to notice in the last decade or so, and collectively they give off a synaptic charge. There are a fair number of clunkers, but the majority of the painters here display an honestly arrived-at complexity, expressed through a rigorous series of choices made at what feels like a granularly visual level. Their work rewards hard looking.

The good artists in the show are very good indeed. Charline von Heyl, Josh Smith, Richard Aldrich, Amy Sillman, Mark Grotjahn, Nicole Eisenman, Rashid Johnson, Joe Bradley, and Mary Weatherford have all developed tenacious and highly individual styles. Each makes work that engages the viewer on the paintings’ own terms and that shakes free whatever journalistic shorthand might, in passing, get stuck on them. What drives these artists is resolved in works that are self-reliant and unassailable while remaining open and undogmatic—it’s the ebullience of secular art freed of any ideological task.

Two words one should probably avoid using in exhibition titles are “forever” and “now,” and Hoptman uses both. “Atemporal” comes from a William Gibson story, and Hoptman worked it into a youthful-sounding phrase, but it’s just distracting, like someone talking too loudly while you’re trying to think. She wants to make a point about painting in the Internet age, but the conceit is a red herring—the Web’s frenetic sprawl is opposite to the type of focus required to make a painting, or, for that matter, to look at one.

What does “atemporal” mean, in the context of painting? Judging from Hoptman’s catalogue essay, it’s the confidence, or panache, to take what one likes from the vast storehouse of style, without being overly concerned with the idea of progress or with what something means as a sign. Today, “all eras co-exist at once,” Hoptman writes. She goes on to say that this atemporality is a “wholly unique phenomenon in Western culture.” Big news. The free-agent status accorded the artists in her show is something I take as a good thing—maybe “minding one’s own business” would be a better way of putting it—but her claim for its uniqueness is harder to swallow; it’s more or less what I’ve been advocating for the last 35 years. Not that I take any credit for the idea; within a certain milieu it’s just common knowledge.

In her desire to connect everything to a narrative of the digital future, Hoptman misses the salient difference between the best work here and its immediate antecedents: a sense of structure. By structure I don’t mean only relational composition—though that plays a part—but more generally the sense of a painting’s internal rationale, its “inside energy,” as Alex Katz would say, that alignment of intention, talent, and form. Hoptman wants to make a clean break for her crew from the mores of “appropriation,” but again, the emphasis seems misplaced. Appropriation—as a style—had a tendency to stop short, visually speaking. The primary concern was with “presentation” itself, and the work that resulted was often an analog for the screen, or field, something upon which images composed themselves into some public/private drama. Appropriation pointed to something—some psychological or cultural condition outside of the work itself—that was the basis of its claim to criticality and, at its best, excavated something deep in the psyche. But there are other things in life. At present, painting is focused on structure, discovering and molding pictorial form for its own sake.

Atemporality, then, is nothing new. Most if not all art reaches backward to earlier models in some way; every rupture is also a continuity. The “reaching back” might be to unexpected sources, but imprints of earlier achievements are what give art its gristle and grit. What’s different is the mode of seeing. As an example, Weatherford places tubes of colored neon in front of fields of paint-stained canvas. In the old, appropriationist mind-set, one might get hung up on a list of signifiers along the lines of, say, Mario Merz or Gilberto Zorio meets Helen Frankenthaler; this reductiveness was, from the beginning, an unsatisfying way to see. Pleasantly, reassuringly, more like an old friend showing up after a long absence, arte povera echoes through Weatherford’s work, but it doesn’t feel like a self-conscious reference. Her works clear a space where they can be taken on their own terms. They do, as Ben Jonson said in a somewhat different context, “win themselves a kind of grace-like newness.”

In a related, refreshing development, Warhol’s gloomy, vampiric fatalism is no longer dragging down the party. Duchamp, too, is absent. What a relief. Nothing against the two masters as far as their own work is concerned, but they have exerted such an outsize gravitational pull on generations of artists that finally being out from under them feels like waking from a lurid dream. There is camp in “The Forever Now,” to be sure, and imagery, and irony, and “presentation,” but they are not the main event.

Painting also seems to have shed its preoccupation with photography; here you will find only the faintest nod to “the age of mechanical reproduction.” Even for Laura Owens, who blithely tries on the visual conundrums of the digital world, photography isn’t really part of her DNA. It turns out that much of the art-historical hand-wringing of the last 40 years over Walter Benjamin’s famous prophecy was either misplaced or just plain wrong. Painting is not competing with the Internet, even when making use of its proliferative effects.

Imagery is present to varying degrees in many of these artists’ works. It’s front and center in Eisenman’s paintings, exuberantly evident in Smith’s, lambent in Bradley’s. Drawn forms, some with a goofy, cartoony quality, are often the basis of Sillman’s muscular lyricism. Sillman is a great picture builder; her evocative and gemütlich paintings give the show some real gravitas. Representation even shows up in the trenchant cerebral complexities of von Heyl, but none of these artists is involved with the tradition of realism. They are not translating what can be seen into what can be painted. While everything, even abstraction, is an image in the ontological sense, and there are snatches of imagery in most of these paintings, these artists are simply not imagists; their images are more like the folk melodies in Bartók—present as understructure, there but not there.

The overall tone of “The Forever Now” has a West Coast casual feel about it. Five of the artists in the exhibition—Grotjahn, Weatherford, Owens, Dianna Molzan, and Matt Connors—are based in Southern California, and their work has some of Los Angeles’s take-it-or-leave-it attitude toward materiality. It’s a feeling I remember from living in L.A. in the ’70s: a slightly secondhand relationship to the New York School pieties. The alternative to sober, grown-up painting was an emphasis on materials, often industrial or non-art materials, and on the idea of process itself. The work embodies a youthful vigor without visible strain—in a word, cool. When combined with an internal structural core, the result has a kind of multiplier effect; it wins you over.

(The situation in literature today is not so different; while still avoiding straight realism, the parodists, inventors, miniaturists, and tinkerers are now coming into prominence, taking over from the arid metafictionists. Writers like George Saunders, Ben Marcus, Sam Lipsyte, Sheila Heti, Ben Lerner, and Chris Kraus have clear parallels with painters von Heyl, Weatherford, Bradley, Aldrich, Chris Martin, et al. Painting and advanced writing are now closer in spirit than at any time in living memory.)

But I want to return to that quality that sets apart certain painters in this show—that sense of structure. Like diamonds, Grotjahn’s paintings are the result of great pressure brought to bear on a malleable material over a protracted period of time. His work is a good example of the way in which many artists today are using imagery and history—which is to say, the way that artists mainly always have. Grotjahn manages to simultaneously invoke Cubism, Futurism, Surrealism, and Abstract Expressionism—everyone from Malevich to Victor Brauner—and translate those impulses into an intensely focused, schematic composition that leaves just enough room for his hand to do its stuff.

Much has been made of Grotjahn’s Picassoid heads, but the overall looping structure of his paintings produces an effect closer to Joseph Stella’s 1920s paintings of the Brooklyn Bridge. Grotjahn reimagines Stella’s swooping catenaries into arched ribbons of impasto paint. Because the chunks of color are small and contiguous, they tend to blend together in the viewer’s eye, giving the paintings an alternating current of macro and micro focus. His colors are dark red and burgundy, forest green, warm white, cobalt blue—the colors of silk neckties. They are preppy in a nice way, with a whiff of the 1940s. More importantly, Grotjahn’s color intervals are exacting. They put the painting in a major key. Their simple, clear visual forms—arcs, circles, lozenge and ovoid shapes, like segments of an orange—sometimes overlap and cut into one another, creating a space of increasing, sobering complexity. Grotjahn’s paintings do a funny thing: they achieve great scale through the linear arrangement of small areas of paint, and their structural and imagistic concatenations are in good alignment with the color and paint application. The what and the how are in productive sync. These paintings are tight, shipshape, and very satisfying to look at. At 46, Grotjahn is close on to a modernist master.

Aldrich has been making interesting and surprising paintings for a while, and one of his works here shows great panache. Two Dancers with Haze in Their Heart Waves Atop a Remake of “One Page, Two Pages, Two Paintings,” from 2010, is Aldrich at his least gimmicky and most in tune with the spirit of abstract painting as deconstruction. The painting’s success lies in its loose-limbed sense of structure: a grid- or ladder-like armature along which an array of painted shapes and brush-drawn lines alternate with the interstitial white spaces to form a syncopated rhythm. Its painterly touch calls to mind Joan Mitchell and Philip Guston, and also Robert Rauschenberg’s Winter Pool from 1959—two canvases joined in the middle by a ladder—as well as Rauschenberg’s later Combines. Aldrich’s palette here is sophisticated, just shy of decorator-ish; he takes eight or nine hues and nudges them into perfectly tuned intervals of cream, white, Pompeii red, burnt umber, and a grayed cobalt green—colors that feel at once Mediterranean and Nordic. This particular painting touches on a number of visual cues without leaning too heavily on any of them; the four irregular black rectangles framed by cream-colored bands suggest darkened windows in a cracked plaster wall.

That Aldrich’s painting is reminiscent of earlier paintings while maintaining a clear sense of contemporaneity is perhaps what Hoptman means by “atemporal.” But this is what painting is always about, in one way or another. Rauschenberg’s work of the late ’50s and early ’60s was itself a deconstruction and reconstruction of Abstract Expressionism, freed from its self-importance. Aldrich has taken a lot from that period in Rauschenberg’s work, but his tone is lighter; it has Rauschenberg’s insouciance, without the urgent nervousness. The stakes are different. This is now. Though informal, at times almost flippant, Aldrich’s work is sturdier and more tough-minded than it first appears. His painting says, “Lean on me.”

Susan Sontag observed nearly 50 years ago, in her essay “On Style,” that no self-respecting critic would want to be seen separating form from content, and yet most seem drawn to do just that, after first offering a disclaimer to the contrary. Make that double for curators. The real problem with “The Forever Now” is that it’s two shows: there are the painters who make stand-alone paintings—we don’t need no backstory—and those who use a rectangular-ish surface to do something else. The artists in the former group are the raison d’être for the show; their work has formal inventiveness and pictorial intelligence; it lives in the moment. As for the latter, they are artists who make tip-of-the-iceberg art. What’s on the canvas is the evidence, or residue, of what happens offstage. There’s nothing at all wrong with this in principle, of course, but it can result in an arid busyness that masks a core indecisiveness or, worse, emptiness.

Here is another way to see this: there are pictures that repay our attention with interest and others that simply use it up. The qualities we admire in people—resourcefulness, intelligence, decisiveness, wit, the ability to bring others into the emotional, substantive self—are often the same ones that we feel in art that holds our attention. Less-than-admirable qualities—waffling, self-aggrandizement, stridency, self-absorption—color our experience of work that, for one reason or another, remains unconvincing. By “unconvincing” I mean the feeling you get when the gap between what a work purports to be and what it actually looks like is too big to be papered over.

Such is the case with several of the most celebrated artists included in “The Forever Now.” The problem of grade inflation has been with us since at least the 1920s, when H. L. Mencken, in his American Mercury magazine, coined the term “American boob” to mean our national variant of philistinism. The flip side of “boob-ism,” in Mencken’s formulation, was the wholesale enthusiasm for everything cultural, lest one be thought a philistine. It has created a hell of a lot of confusion ever since.

George Balanchine once complained that the praise had been laid on a little thick. “Everyone’s overrated,” said the greatest choreographer in history. “Picasso’s overrated. I’m overrated. Even Jack Benny’s overrated.” He meant that once it’s decided that someone is great, a misty halo of reverence surrounds everything he or she does. The reality is more prosaic: some things, or some parts of things, will be great and others not. It’s annoying to be overpraised; it’s like showing your work to your parents. The lack of criticality is one of the things that give our current art milieu the feeling of the political sphere (I don’t mean political art). Politics, as a job, is the place where the truth can never be told; it would bring the merry-go-round to a halt.

I decided a long time ago not to write about things I don’t care for. So much work is deeply and movingly realized, and so many artists of real talent are working today that it’s just not worth the time to take an individual clunker to task. There’s an audience for everything—who cares? Besides, one can always be wrong. However, I’m compelled to make an exception in the case of 27-year-old Oscar Murillo. While it’s not his fault for being shot out of the canon too early, I feel one has to say something lest perception be allowed to irretrievably swamp reality. There have always been artists who were taken up by collectors, curators, or journalists; artists who fit a certain narrative but are of little interest to other artists. So why get worked up over it now? Of course it’s not just him. The problem is really one of what constitutes interpretation; it’s the fault line of a deepening divide between how artists and curators see the world. Though it may seem unfair to single out Murillo, the best way to explain why the distinction matters is to describe his work.

Murillo seems to want to say something with his work about palimpsest and memory and being an outsider, but he lacks, to my eye, most of what is needed to make a convincing picture of that type. His grasp of the elements that engage people who paint—like scale, color, surface, image, and line—is journeyman-like at best. His sense of composition is strictly rectilinear; he doesn’t seem to have discovered the diagonal or the arabesque. Worse, he can’t seem to generate any sense of internal pictorial rhythm.

Murillo’s paintings lack personality. He uses plenty of dark colors, scraping, rubbing, dripping, graffiti marks, and dirty tarpaulins—run-of-the-mill stuff, signifiers all. The work looks like something made by an art director; it’s meant to look gritty and “real” but comes across as fainthearted. This is painting for people who don’t have much interest in looking, who prefer the backstory to what is in front of their eyes. Murillo is in so far over his head that even a cabal of powerful dealers won’t be able to save him. He must on some level know this, and so he tries to make up for what’s missing by adding on other effects. One piece in “The Forever Now” is a pile of canvases crumpled up on the floor that viewers can move about as they choose. It’s interactive—get it? MoMA visitors with a long memory will recognize this as a variation on early work by Allan Kaprow, the inventor of Happenings, who wished to mimic the “expressionist” impulses in ’50s paintings and channel them into little games that invited viewer participation with the result that what had once been pictorially alive became pure tedium. To quote Fairfield Porter, writing at the time, “[Kaprow] uses art and he makes clichés….If he wants to prove that certain things can’t be done again because they already have been done, he couldn’t be more convincing.” You can kick Murillo’s canvases around from here to Tuesday—there is no way to bring them to life, because they never lived in the first place.

The real news from “The Forever Now,” the good news, is that painting didn’t die. The argument that tried to make painting obsolete was always a category mistake; that historically determinist line has itself expired, and painting is doing just fine. Painting may no longer be dominant, but that has had, if anything, a salutary effect: not everyone can paint, or needs to. While art audiences have gone their distracted way, painting, like a truffle growing under cover of leaves, has developed flavors both rich and deep, though perhaps not for everyone. Not having to spend so much energy defending one’s decision to paint has given painters the freedom to think about what painting can be. For those who make paintings, or who find in them a compass point, this is a time of enormous vitality.

Thanks to RS

February 26th, 2015
painting box

Juan O’Gorman’s house outside Mexico City (1953–56, demolished 1969)

Painting Box

February 23rd, 2015
Knowledge Isn’t Power

NY Times Published: FEB. 23, 2015
By Paul Krugman

Regular readers know that I sometimes mock “very serious people” — politicians and pundits who solemnly repeat conventional wisdom that sounds tough-minded and realistic. The trouble is that sounding serious and being serious are by no means the same thing, and some of those seemingly tough-minded positions are actually ways to dodge the truly hard issues.

The prime example of recent years was, of course, Bowles-Simpsonism — the diversion of elite discourse away from the ongoing tragedy of high unemployment and into the supposedly crucial issue of how, exactly, we will pay for social insurance programs a couple of decades from now. That particular obsession, I’m happy to say, seems to be on the wane. But my sense is that there’s a new form of issue-dodging packaged as seriousness on the rise. This time, the evasion involves trying to divert our national discourse about inequality into a discussion of alleged problems with education.

And the reason this is an evasion is that whatever serious people may want to believe, soaring inequality isn’t about education; it’s about power.

Just to be clear: I’m in favor of better education. Education is a friend of mine. And it should be available and affordable for all. But what I keep seeing is people insisting that educational failings are at the root of still-weak job creation, stagnating wages and rising inequality. This sounds serious and thoughtful. But it’s actually a view very much at odds with the evidence, not to mention a way to hide from the real, unavoidably partisan debate.

The education-centric story of our problems runs like this: We live in a period of unprecedented technological change, and too many American workers lack the skills to cope with that change. This “skills gap” is holding back growth, because businesses can’t find the workers they need. It also feeds inequality, as wages soar for workers with the right skills but stagnate or decline for the less educated. So what we need is more and better education.

My guess is that this sounds familiar — it’s what you hear from the talking heads on Sunday morning TV, in opinion articles from business leaders like Jamie Dimon of JPMorgan Chase, in “framing papers” from the Brookings Institution’s centrist Hamilton Project. It’s repeated so widely that many people probably assume it’s unquestionably true. But it isn’t.

For one thing, is the pace of technological change really that fast? “We wanted flying cars, instead we got 140 characters,” the venture capitalist Peter Thiel has snarked. Productivity growth, which surged briefly after 1995, seems to have slowed sharply.

Furthermore, there’s no evidence that a skills gap is holding back employment. After all, if businesses were desperate for workers with certain skills, they would presumably be offering premium wages to attract such workers. So where are these fortunate professions? You can find some examples here and there. Interestingly, some of the biggest recent wage gains are for skilled manual labor — sewing machine operators, boilermakers — as some manufacturing production moves back to America. But the notion that highly skilled workers are generally in demand is just false.

Finally, while the education/inequality story may once have seemed plausible, it hasn’t tracked reality for a long time. “The wages of the highest-skilled and highest-paid individuals have continued to increase steadily,” the Hamilton Project says. Actually, the inflation-adjusted earnings of highly educated Americans have gone nowhere since the late 1990s.

So what is really going on? Corporate profits have soared as a share of national income, but there is no sign of a rise in the rate of return on investment. How is that possible? Well, it’s what you would expect if rising profits reflect monopoly power rather than returns to capital.

As for wages and salaries, never mind college degrees — all the big gains are going to a tiny group of individuals holding strategic positions in corporate suites or astride the crossroads of finance. Rising inequality isn’t about who has the knowledge; it’s about who has the power.

Now, there’s a lot we could do to redress this inequality of power. We could levy higher taxes on corporations and the wealthy, and invest the proceeds in programs that help working families. We could raise the minimum wage and make it easier for workers to organize. It’s not hard to imagine a truly serious effort to make America less unequal.

But given the determination of one major party to move policy in exactly the opposite direction, advocating such an effort makes you sound partisan. Hence the desire to see the whole thing as an education problem instead. But we should recognize that popular evasion for what it is: a deeply unserious fantasy.

February 23rd, 2015
Fondamenta


Matt Connors
Peter Harkawik
Matt Paweski
Peter Shire

Opening Reception: Saturday, February 21st from 6 – 8 pm

Barrett Art Gallery at Santa Monica College
1310 11th St. Santa Monica, CA 90401

February 17th, 2015
The Epidemic of Facelessness

Credit Mat Brinkman

By STEPHEN MARCHE
NY Times Published: FEB. 14, 2015

A PART-TIME delivery driver named Peter Nunn was recently sentenced to 18 weeks in a British prison for tweeting and retweeting violent messages to Stella Creasy, a member of Parliament. He never saw his victim, but the consequences of his virtual crime were real enough. In a statement, Ms. Creasy described fears for her physical safety, going so far as to install a panic button in her home. Mr. Nunn has been physically separated from the rest of society for posting abusive words on a social media site.

The fact that the case ended up in court is rare; the viciousness it represents is not. Everyone in the digital space is, at one point or another, exposed to online monstrosity, one of the consequences of the uniquely contemporary condition of facelessness.

Every month brings fresh figuration to the sprawling, shifting Hieronymus Bosch canvas of faceless 21st-century contempt. Faceless contempt is not merely topical. It is increasingly the defining trait of topicality itself. Every day online provides its measure of empty outrage.

When the police come to the doors of the young men and women who send notes telling strangers that they want to rape them, they and their parents are almost always shocked, genuinely surprised that anyone would take what they said seriously, that anyone would take anything said online seriously. There is a vast dissonance between virtual communication and an actual police officer at the door. It is a dissonance we are all running up against more and more, the dissonance between the world of faces and the world without faces. And the world without faces is coming to dominate.

Recently Dick Costolo, chief executive of Twitter, lamented his company’s failures to deal with the trolls that infested it: “I’m frankly ashamed of how poorly we’ve dealt with this issue during my tenure as CEO,” he said in a leaked memo. It’s commendable of him to admit the torrents of abuse, but it’s also no mere technical error on the part of Twitter; faceless rage is inherent to its technology.

It’s not Twitter’s fault that human beings use it. But the faceless communication social media creates, the linked distances between people, both provokes and mitigates the inherent capacity for monstrosity.

The Gyges effect, the well-noted disinhibition created by communications over the distances of the Internet, in which all speech and image are muted and at arm’s reach, produces an inevitable reaction — the desire for impact at any cost, the desire to reach through the screen, to make somebody feel something, anything. A simple comment can so easily be ignored. Rape threat? Not so much. Or, as Mr. Nunn so succinctly put it on Twitter: “If you can’t threaten to rape a celebrity, what is the point in having them?”

The challenge of our moment is that the face has been at the root of justice and ethics for 2,000 years. The right to face an accuser is one of the very first principles of the law, described in the “confrontation clause” of the Sixth Amendment of the United States Constitution, but reaching back through English common law to ancient Rome. In Roman courts no man could be sentenced to death without first seeing his accuser. The precondition of any trial, of any attempt to reconcile competing claims, is that the victim and the accused look each other in the face.

For the great French-Jewish philosopher Emmanuel Levinas, the encounter with another’s face was the origin of identity — the reality of the other preceding the formation of the self. The face is the substance, not just the reflection, of the infinity of another person. And from the infinity of the face comes the sense of inevitable obligation, the possibility of discourse, the origin of the ethical impulse.

The connection between the face and ethical behavior is one of the exceedingly rare instances in which French phenomenology and contemporary neuroscience coincide in their conclusions. A 2009 study by Marco Iacoboni, a neuroscientist at the Ahmanson-Lovelace Brain Mapping Center at the University of California, Los Angeles, explained the connection: “Through imitation and mimicry, we are able to feel what other people feel. By being able to feel what other people feel, we are also able to respond compassionately to other people’s emotional states.” The face is the key to the sense of intersubjectivity, linking mimicry and empathy through mirror neurons — the brain mechanism that creates imitation even in nonhuman primates.

The connection goes the other way, too. Inability to see a face is, in the most direct way, inability to recognize shared humanity with another. In a metastudy of antisocial populations, the inability to sense the emotions on other people’s faces was a key correlation. There is “a consistent, robust link between antisocial behavior and impaired recognition of fearful facial affect. Relative to comparison groups, antisocial populations showed significant impairments in recognizing fearful, sad and surprised expressions.” A recent study in the Journal of Vision showed that babies between the ages of 4 months and 6 months recognized human faces at the same level as grown adults, an ability which they did not possess for other objects.

Without a face, the self can form only with the rejection of all otherness, with a generalized, all-purpose contempt — a contempt that is so vacuous because it is so vague, and so ferocious because it is so vacuous. A world stripped of faces is a world stripped, not merely of ethics, but of the biological and cultural foundations of ethics.

For the great existentialist Martin Heidegger, the spirit of homelessness defined the 20th century, a disconnected drifting in a world of groundless artificiality. The spirit of facelessness is coming to define the 21st. Facelessness is not a trend; it is a social phase we are entering that we have not yet figured out how to navigate.

As exchange and communication come at a remove, the flight back to the face takes on new urgency. Google recently reported that on Android alone, which has more than a billion active users, people take 93 million selfies a day. The selfie has become not a single act but a continuous process of self-portraiture. On the phones that are so much of our lives, no individual self-image is adequate; instead a rapid progression of self-images mimics the changeability and the variety of real human presence.

Emojis are an explicit attempt to replicate the emotional context that facial expression provides. Intriguingly, emojis express emotion, often negative emotions, but you cannot troll with them. You cannot send a message of faceless contempt with icons of faces. The mere desire to imitate a face humanizes.

But all these attempts to provide a digital face run counter to the main current of our era’s essential facelessness. The volume of digital threats appears to be too large for police forces to adequately deal with. But cases of trolls’ following through on their online threat of murder and rape are extremely rare. The closest most trolling comes to actual violence is “swatting,” or sending ambulances or SWAT teams to an enemy’s house. Again, neither victim nor perpetrator sees the other.

What do we do with the trolls? It is one of the questions of the age. There are those who argue that we have a social responsibility to confront them. Mary Beard, the British historian, not only confronted a troll who sent her misogynistic messages, she befriended him and ended up writing him letters of reference. One young video game reviewer, Alanah Pearce, sent Facebook messages to the mothers of young boys who had sent her rape threats. These stories have the flavor of the heroic, a resistance to an assumed condition: giving face to the faceless.

The more established wisdom about trolls, at this point, is to disengage. Obviously, in many cases, actual crimes are being committed, crimes that demand confrontation, by victims and by law enforcement officials, but in everyday digital life engaging with the trolls “is like trying to drown a vampire with your own blood,” as the comedian Andy Richter put it. Ironically, the Anonymous collective, a pioneer of facelessness, has offered more or less the same advice.

Rule 14 of their “Rules of the Internet” is, “Do not argue with trolls — it means that they win.”

Rule 19 is, “The more you hate it the stronger it gets.”

Ultimately, neither solution — confrontation or avoidance — satisfies. Even if confrontation were the correct strategy, those who are hounded by trolls do not have the time to confront them. To leave the faceless to their facelessness is also unacceptable — why should they own the digital space simply because of the anonymity of their cruelty?

There is a third way, distinct from confrontation or avoidance: compassion. The original trolls, Scandinavian monsters who haunted the Vikings, inhabited graveyards or mountains, which is why adventurers would always run into them on the road or at night. They were dull. They possessed monstrous force but only a dim sense of the reality of others. They were mystical nature-forces that lived in the distant, dark places between human habitations. The problem of contemporary trolls is a subset of a larger crisis, which is itself a consequence of the transformation of our modes of communication. Trolls breed under the shadows of the bridges we build.

In a world without faces, compassion is a practice that requires discipline, even imagination. Social media seems so easy; the whole point of its pleasure is its sense of casual familiarity. But we need a new art of conversation for the new conversations we are having — and the first rule of that art must be to remember that we are talking to human beings: “Never say anything online that you wouldn’t say to somebody’s face.” But also: “Don’t listen to what people wouldn’t say to your face.”

The neurological research demonstrates that empathy, far from being an artificial construct of civilization, is integral to our biology. And when biological intersubjectivity disappears, when the face is removed from life, empathy and compassion can no longer be taken for granted.

The new facelessness hides the humanity of monsters and of victims both. Behind the angry tangles of wires, the question is, how do we see their faces again?

February 15th, 2015
Jenny Monick and Gedi Sibony


February 14 – April 5, 2015

Opening Reception:

Saturday, February 14, 5-7pm

Reserve Ames

February 14th, 2015
All Parents Are Cowards

By MICHAEL CHRISTIE
NY Times Published: FEBRUARY 12, 2015

I have broken my wrists, fingers, a tibia, a fibula, chipped a handful of teeth, cracked a vertebra and snapped a collarbone. I have concussed myself in Tallahassee, Fla., and Portland, Ore. I’ve skittered across the sooty hoods of New York cabs and bombed down many of San Francisco’s steepest avenues.

For many years I was a professional skateboarder. I first stepped on a skateboard at 11. The nomenclature — switch-stance frontside tailslide, kickflip to nose manual — was the language of my first friendships, with wild, strange boys who were as ill suited for school and team sports as I was. They were from broken homes. Poor homes. Group homes. We were like little cement mixers, keeping ourselves in constant motion, our skateboards’ movement the only thing preventing us from hardening into blocks of pure rage.

It was through those friends that I first realized the oddity of my own home. Skateboarding gave my mother panic attacks. She bought me helmets and pads (which I never wore), and gasped at my scars and bruises. She would have forbidden me to skateboard at all if she believed for a second that I would comply.

This might sound like typical parental anxiety. But with my mother, it was something deeper.

My mother was agoraphobic, which means she was often housebound, terrified of stores, cars and crowds. Vacations were impossible. As were jobs, and simple errands. She cut our hair, made us clothes, prepared us complex meals. Together we painted and drew, watched movies and read. She taught me to build a bookshelf, reattach a button and piece together a quilt. She worried that school stifled my creativity. So she encouraged me to stay home whenever I wanted. Which was often. School couldn’t compare with the bright spotlight of her attention, and besides, I knew she needed me close by.

I felt uneasy around other kids until that day when I was 11, and I saw a boy outside my house perform an ollie (that magical, clacking leap by which skateboarders temporarily glue their board to their feet and vault into the air). To my mother’s horror, I rushed outside and begged him to let me try. From then on, I realized I needed to be skateboarding in the streets as much as she needed to be safe in the house. I stopped coming home except to shower and sleep.

At 17, I left for good, and spent a decade and a half on the other edge of the continent. I seldom called her. When we talked it felt as if she was trying to siphon something vital from my cells, so I parried her inquiries about my welfare with sharp, monosyllabic replies. It was a time of great anger and resentment, a time I’m not proud of.

But in 2008, when I was 32, my mother got sick with Stage 4 lung cancer. I went home to care for her during a last-ditch chemotherapy regimen, and found her in the process of throwing away everything she owned. To prevent some artifact of our family history from being accidentally lofted into the trash, I presented her with boxes and three options: keep, donate or trash. There was plenty of clutter to sort. Mostly it was liquor boxes of paperbacks and her artwork and crafts — the accumulation of a life lived predominantly indoors.

Eventually I dragged a box from her closet and found it was stuffed with skateboard magazines, bookmarks peeking out from their pages. I leafed through one and discovered a picture of myself, five years younger, atop a skateboard mid-tailslide on a wooden handrail, my mouth open and my eyes fixed wide with both terror and joy.

“I didn’t think you could look at these,” I said.

“I took out subscriptions,” she said, avoiding my eyes. “You never sent any photos over the years. These were the only ones of you I could get.” Then she sighed, “Keep.”

A few weeks after her chemotherapy ended and I returned to Vancouver, I woke late one night to a call from my father. My mother had died. I remember sitting at my kitchen table, weeping, clutching my stomach. I had to get outside. I grabbed my skateboard and rolled for hours in the orange streetlights, aimlessly ollieing manholes and weaving between parked cars. Fresh-faced people were power-walking to work by the time I got home.

Five months later, my first son was born. My love for him was instant, soul-flooding. I had trouble taking my eyes off him. We went on long walks while my wife slept. Out with his stroller in the midday traffic, I found that I had suddenly become attuned to the world’s menace, to the human being’s naked vulnerability in the face of it. The city throbbed with dangers that I’d long been insensate to: veering cars, potential kidnappers, toxic exhausts, carelessly discarded needles. It was as though the world had turned double agent and become my enemy.

I developed a Border collie-like attentiveness when it came to my son’s safety. When he stood at the coffee table like a cute little drunk bellying up to a bar, I’d be hovering there, his personal safety net. After my long acquaintance with the physics of crashing, I knew exactly what whap his forehead would make if it hit the lamp, what thunk his cerebellum would issue on the hardwood if he tipped back.

Or perhaps, I worried, it was because of my mother. My inherited brain chemistry, my angst-ridden genes taking over.

Things got worse. I kicked a dog at the park that looked as if it was going to bite him. I complained to my wife that his day care workers were inattentive. If my son choked on something at the table, even momentarily, it would take me an hour and a few drinks to smooth out my nerves. Sleep became impossible.

I never imagined that parenthood meant learning to live with this unrelenting, impaling fear. With the question of when to catch your children and when to let them fall. To date I’ve watched my son’s precious body bounce off concrete, wood and brick. We had another son last year, this one more fearless than his brother. Someday I may be forced to hear their bones snap and see their blood gush. And then, after all that growing and falling, they might move away, far beyond my protective reach. My mother was ill, but she was also right: It is terrifying to be a parent.

It is a cliché to say children teach us about ourselves and about our parents, but it’s true. My sons are teaching me to calm down. I’ve seen pain shape them for the better. I’ve watched a trip to the ground leave them incrementally stronger. I even recently bought them both skateboards, which have yet to interest them.

I’m learning to forgive my mother, for her life lived inside, for her inability to cope. My mother was afraid of everything, yet she was brave. I used to fear nothing, but parenthood has rendered me a coward. I wish I could tell her that now.

But when I picture her leafing through those skateboard magazines she’d collected over the years, skipping over the interviews and advertisements, searching for her reckless, angry son, only to find me falling from the sky in some place she couldn’t follow, I’m certain she understood how I felt then, how I’d feel now.

A skateboard is the most basic ambulatory machine. It has no gears, offers no assistance. It will protect you from nothing. It is a tool for falling. For failure. But also for freedom. For living. On a skateboard you must stay balanced in a tempest of forces beyond your control. The key is to be brave, get low, stay up and keep rolling.

February 12th, 2015
david korty


February 20 – March 28, 2015
Opening reception: Friday, February 20, 2015

Wallspace

February 11th, 2015
Old Ways Prove Hard to Shed, Even as Crisis Hits Kimono Trade

Washing silk in a river. Artisans on Amami Oshima use the island’s iron-rich mud to turn silk a rich shade of chocolate brown. Credit Kentaro Takahashi for The New York Times

By MARTIN FACKLER
NY Times Published: FEB. 9, 2015

AMAMI OSHIMA, Japan — Kazuhiko Kanai uses the traditional method to dye the elegant kimonos for which this small, semitropical island is renowned: he carries a bundle of pure white silk to a nearby rice paddy and hurls it into the mud.

Mr. Kanai is one of the last practitioners of a method known as “dorozome,” or “mud-dyeing,” which uses the island’s iron-rich soil to turn silk the color of the darkest chocolate. This is just one step in an elaborate production process that can take a year to produce a kimono with the glossiest silk and most intricately woven designs in all Japan. In a nation that esteems its traditional form of dress as high art, Amami Oshima’s kimonos became some of the most prized of them all, once capable of fetching more than $10,000 apiece.

But those heady days are over, as a shift to Western fashions and Japan’s long economic squeeze have led to plummeting demand, especially for high-end kimonos.

On Amami Oshima, production has fallen so far in the last two decades that only 500 people on an island with 73,000 residents remain employed full-time in kimono production, and many of them are in their 70s or 80s. That’s down from 20,000 people a generation ago, according to the Authentic Amami Oshima Tsumugi Association, the island’s union of kimono producers.

Amami Oshima has fallen harder than most of Japan’s famous kimono production centers, dragged down by a complex web of wholesalers, dealers and specialized retailers who distribute and sell the island’s kimonos. While this antiquated system once benefited the remote southern island near Okinawa by spreading its kimonos to the rest of Japan, islanders say it has now become a burden, keeping the kimonos prohibitively expensive while driving down wages.

Yet the old ways have proven hard to discard, despite a growing sense of crisis. Many fret that there will soon be too few islanders left with the skills to sustain each of the 30 separate steps needed to produce one of the kimonos.

“If we lose one link in the chain, we lose our ability to make kimonos,” said Mr. Kanai, 56, who owns a dirt-floored wooden workshop where silk is dyed in bubbling iron caldrons and then hung from the ceiling to dry. “If we cannot make kimonos any more, what will be left here?”

Mr. Kanai says the mud-dyeing process alone takes more than a month, as the silk is first colored a burgundy hue with natural dye made from the pulp of a local plum tree. Getting the right shade of red requires repeating the cycle of staining and drying the silk 30 times, he said. Only then is the silk ready to be immersed in the black mud, whose iron reacts with tannins in the tree dye to create the coveted dark brown color.

That is not the most elaborate step. Even before the silk arrives at Mr. Kanai’s workshop, it is first woven into a temporary fabric as part of a unique method that the islanders have devised for creating minutely detailed patterns.

After this temporary fabric has been mud-dyed, it is unraveled back into its original silk threads. Each colored thread now has thousands of tiny white stripes where it overlapped with another thread, blocking the mud from touching it at that point.

As the threads are rewoven into new fabric by nimble-fingered island women, they slowly reveal perfectly formed patterns, ranging from starkly minimalist shapes to elaborate scenes of bamboo groves and flying storks.

“The weaver has a tremendous responsibility,” said Mifuko Iwasaki, 70, who has been teaching young islanders how to weave these perfectly aligned patterns on hand looms for 35 years. “If we make a mistake, we undo all the hard work of those who spent so much time preparing this thread.”

Ms. Iwasaki says that when she began teaching her yearlong classes, she typically had 40 students, who were drawn by the fact that weaving offered higher wages than fishing, farming and logging, the island’s other industries at the time.

These days, she says she is lucky to get more than two or three students, because weaving no longer pays as well. The myriad of middlemen in the cumbersome distribution system each take a cut, making it hard to reduce prices at the same rate as other items in deflationary Japan. Worse, the brunt of what price cuts have been made inevitably falls on the island’s dyers and weavers. As a result, while a new Oshima kimono can still cost $3,000 to $6,000 in Tokyo, weavers say they are lucky to get more than $400 for a month’s exacting work. Other craftsmen in the production process get even less.

Nonetheless, islanders say they are reluctant to bypass the antiquated distribution system, saying they feel bound by generations-old obligations and a fear of change. This makes them a microcosm of Japan as a whole, which has been slow to give up its outdated postwar economic model despite years of stagnation.

“It is ironic that we can no longer make ends meet producing something so expensive,” said Shigehiko Furuta, 67, who uses colored pens and graph paper to design the minutely detailed patterns.

Shinichiro Yamada, 83, the head of the producers’ union, said the island’s ornately woven patterns have their roots in the colorful culture of the Kingdom of the Ryukyus, centered in present-day Okinawa. The kingdom ruled Amami Oshima until the early 17th century, when the island was conquered by Japanese samurai, who claimed its kimonos as tribute.

Mud-dyeing started when disobedient islanders buried kimonos in the ground to hide them, only to discover on digging them up again that the fabrics had turned a beautiful dark color, said Mr. Kanai, who owns the mud-dyeing workshop.

His son, Yukihito, now uses those same centuries-old dyeing techniques to color new types of items, including T-shirts, jeans and even guitar bodies. He is experimenting with selling these over the Internet, to avoid the onerous distribution system.

“We need to become more like artisans in Europe or artists in New York,” said the younger Mr. Kanai, 35, who said he is one of the few “young successors” in the island’s kimono industry. “Even traditions have to evolve.”

February 9th, 2015
Michael Frimkess

Directed by Daniel Riesenfeld

February 6th, 2015
ROBERT OVERBY

Living Room, Paul’s Place (detail), 1971

February 6 – April 11, 2015

Opening Reception:
Friday, February 6th, 6-8pm

Marc Selwyn

February 5th, 2015
Robert Hudson

Loop Mask, 2012/2014
Kinetic Sculpture (top spins on a point)
steel, stainless steel, shards of cast iron, enamel, steel cable, aluminum, acrylic paint coated with clear epoxy
38 x 39 x 26 inches

South Willard Shop Exhibit

February 4th, 2015
The Long-Run Cop-Out

By Paul Krugman
NY Times Published: FEB. 2, 2015

On Monday, President Obama will call for a significant increase in spending, reversing the harsh cuts of the past few years. He won’t get all he’s asking for, but it’s a move in the right direction. And it also marks a welcome shift in the discourse. Maybe Washington is starting to get over its narrow-minded, irresponsible obsession with long-run problems and will finally take on the hard issue of short-run gratification instead.

O.K., I’m being flip to get your attention. I am, however, quite serious. It’s often said that the problem with policy makers is that they’re too focused on the next election, that they look for short-term fixes while ignoring the long run. But the story of economic policy and discourse these past five years has been exactly the opposite.

Think about it: Faced with mass unemployment and the enormous waste it entails, for years the Beltway elite devoted almost all their energy not to promoting recovery, but to Bowles-Simpsonism — to devising “grand bargains” that would address the supposedly urgent problem of how we’ll pay for Social Security and Medicare a couple of decades from now.

And this bizarre long-termism isn’t just an American phenomenon. Try to talk about the damage wrought by European austerity policies, and you’re all too likely to encounter lectures to the effect that what we really need to discuss is long-term structural reform. Try to discuss Japan’s effort to break out of its decades-long deflationary trap, and you’re sure to encounter claims that monetary and fiscal policy are sideshows, and that deregulation and other structural changes are what’s important.

Am I saying that the long run doesn’t matter? Of course not, although some forms of long-termism don’t make sense even on their own terms. Think about the notion that “entitlement reform” is an urgent priority. It’s true that many projections suggest that our major social insurance programs will face financial difficulties in the future (although the dramatic slowing of increases in health costs makes even that proposition uncertain). If so, at some point we may need to cut benefits. But why, exactly, is it crucial that we deal with the threat of future benefits cuts by locking in plans to cut future benefits?

Anyway, even where the long-term issues are real, it’s truly strange that they have so often taken center stage in recent years. We are, after all, still living through the aftermath of a once-in-three-generations financial crisis. America seems, finally, to be recovering — but Bowles-Simpsonism had its greatest influence precisely when the United States economy was still mired in a deep slump. Europe has hardly recovered at all, and there’s overwhelming evidence that austerity policies are the main reason for that ongoing disaster. So why the urge to change the subject to structural reform? The answer, I’d suggest, is intellectual laziness and lack of moral courage.

About laziness: Many people know what John Maynard Keynes said about the long run, but far fewer are aware of the context. Here’s what he really said: “But this long run is a misleading guide to current affairs. In the long run we are all dead. Economists set themselves too easy, too useless a task if in tempestuous seasons they can only tell us that when the storm is long past the ocean is flat again.” Quite. All too often, or so it seems to me, people who insist that questions of austerity and stimulus are unimportant are actually trying to avoid hard thinking about the nature of the economic disaster that has overtaken so much of the world.

And they’re also trying to avoid taking a stand that will expose them to attack. Discussions of short-run fiscal and monetary policy are politically charged. Oppose austerity and support monetary expansion and you’ll be lambasted by the right; do the reverse and you’ll be criticized and maybe ridiculed by the left. I understand why it’s tempting to dismiss the whole debate and declare that the really important issues involve the long run. But while people who say that kind of thing like to pose as brave and responsible, they’re actually ducking the hard stuff — which is to say, being craven and irresponsible.

Which brings me back to the president’s new budget.

It goes without saying that Mr. Obama’s fiscal proposals, like everything he does, will be attacked by Republicans. He’s also, however, sure to face criticism from self-proclaimed centrists accusing him of irresponsibly abandoning the fight against long-term budget deficits.

So it’s important to understand who’s really irresponsible here. In today’s economic and political environment, long-termism is a cop-out, a dodge, a way to avoid sticking your neck out. And it’s refreshing to see signs that Mr. Obama is willing to break with the long-termers and focus on the here and now.

February 2nd, 2015

By Bryan Armen Graham
The Guardian Published: 30 January 2015

You’ve streaked at more than 500 events, most famously during Super Bowl XXXVIII between the New England Patriots and Carolina Panthers. Have you always had a taste for adventure? In school I was the class joker. In my local bar I’m the crazy guy – the guy who likes to do things for fun. It’s just stemmed from there. I’ve just had this taste for doing crazy things, mostly just to make people laugh. I love to see people smile and laugh if I do something silly or tell a joke, whatever it may be.

What made you do this for the first time? I went to Hong Kong in 1993 with a one-way ticket and £30 in my pocket. The energy there at that time in the early 90s was infectious, just unbelievable. The Rugby Sevens was happening there, a two-day event, and I drunkenly said in a bar one evening that anybody can streak. The owner of the bar dared me to do it the next day during the final. I was only talking through the alcohol. I had no intention, but the next day a guy came and dragged me out of the apartment. I went to the stadium and had a few beers, then I ran on and took the ball off the New Zealand All Blacks – who were the best team in the world at the time – and scored a try. The whole stadium went crazy. I sobered up in an instant. The police threw me out of the stadium and I came straight back in and did it again about a half hour later. I believe I’m the first person who’s ever streaked twice at the same event on the same day. What a debut.

And you were instantly hooked? It was just infectious. It was the adrenaline I got from that first day. I went crazy. There was a big Chinese football game on two days later and I agreed to go to that. I quickly realized that people really enjoyed watching me do this – this crazy mad streaking – so I decided I’d see when I came home to England if it would be the same here. There hadn’t been anyone streaking for nearly 20 years in the UK. So I did a big football game, Liverpool v Everton, and it went down great again. A couple of months later I did press-ups in the penalty area at a Liverpool v Arsenal game and that’s when I got a football ban.

A ban? I wasn’t allowed to go to any stadiums for 12 months. That’s when I started doing golf, tennis, all the rugby events and such. Every time, in every stadium at every sport, it’s the same reaction.

Are you recognized? Even if I’m not planning anything, if there’s a major sporting event in the UK, the police are pre-warned and they have security talks just on me. They carry photographs. That was all part of the adventure. Now I’ve got to get past people who are looking out for me. That’s when I started wearing disguises. I’ve even got into Royal Ascot, one of the biggest horse race weekends in the UK, dressed as a woman — though I looked like the Undertaker from WWF. I don’t make a good looking woman, honestly.

Does the fear factor fluctuate according to the event? Streaking among the skinny stewards at Wimbledon must be different than a police state like the Super Bowl. Most definitely. Big soccer events are always scary. The biggest fear factor I had was for the Super Bowl and that started before I even left Liverpool. I went through every emotion in my head, the repercussions and the consequences. I was scared. I was worried about one of the players chasing me out and all the players jumping on top of me. That was my biggest fear because I suffer from a bit of claustrophobia. If they’d have all gone on top of me, there was a good chance I might die – I don’t know. I was even prepared to take a bullet in the leg from a sniper in the room. Remember, this was right after 9/11. The whole point of me doing the Super Bowl was to make people smile again.

Had you always eyed the Super Bowl? I was offered tickets to the Oscars that same year and said no. If I’m going to America to perform it had to be the biggest event in the States. They said it was impossible and I said ‘Why? Who said it’s impossible?’ They said it just can’t be done.

Dare to dream. How much planning went into it? It was a year in the making. I’d tried the year before in San Diego. I’d arranged with a ticket broker over the phone for a $500 ticket, but when I arrived he wanted $5,000 and there was no way in hell. I was with friends and we didn’t even have that much money between us. I even got as far as the stadium on the tram, but the policeman on duty would not let me get off even though I said a friend had a ticket at the gate and I’d come all the way from England. He refused to let me off, so I’d made it all the way to the stadium just to be turned back, which was very disappointing. Of course I believe things happen for a reason. I wasn’t meant to do it that year because I was totally unprepared. I didn’t have a lawyer, I didn’t have any money, but now I had a whole year to plan for Houston. The next year I had front row tickets, I had a lawyer, and a referee’s uniform that I’d had sent to me from the States.

Wait, a lawyer? At this time I did have a sponsor. They said just have our name on your chest and we’ll get you the best and they did. They got me Richard Haynes, who was the No1 lawyer in Texas and one of the top six in the whole of the US. He was just unbelievable. I thought he was going to be some two-bit lawyer until I met him. You just knew you were in the presence of someone who was unbelievable. He said we go to trial, we plead not guilty: nobody told you you couldn’t go onto the field. There were no signs, no warnings. That was our argument.

Were there any close calls with security? Going into the stadium I had two sets of clothes on: my own clothes and the referee’s uniform underneath. Both sets had Velcro and could be easily torn away. As I’m getting frisked by security at the gate, he felt the Velcro on my trousers and asked what that was about. The only thing I could think of was I have a skin disorder and need to be able to put the cream on quickly – and he accepted that explanation! Then he lifted my top up and saw the referee’s uniform and asked about that. I said it was my lucky referee’s uniform and I wear it to every game. And he let me in. I looked at my friend and said this was meant to happen. If I can get through security with excuses like that, this will happen.

So what happened next? When we sat down at last, we had front row seats on the 50-yard line. The police and security were absolutely everywhere, and there was one security guard right in my direct line of where I planned to enter the field who stayed on his mark from two hours before the game started right up until the point when I needed to go on. He walked off his mark at the very moment I needed to go. It was like divine intervention.

And then you made your move. The adrenaline at that time was just pumping. The biggest noise I’d ever heard to that point was that very first time in Hong Kong. That noise was never replicated until I’m dancing naked on the 50-yard line at the Super Bowl. I felt like I was on there for so long. It felt like minutes before the chase happened. It was just surreal: I was dancing in the middle of the field at the Super Bowl and nobody was coming after me. I thought, “What the hell is going on out there?” expecting people to chase me straight away, so I had to come up with all these dance moves. I’m glad they started coming after me when they did because I was running out.

Mark Roberts says the Super Bowl remains ‘the holy grail of streaking’.
What do you think causes the hesitation? Why so often are you not chased immediately? The element of surprise – going on as a referee – is what made it possible. As soon as I disrobed, I think the players thought the referee has gone crazy straight away. As I’m dancing in circles around the ball, I’m looking at the police and they all looked confused. They didn’t know what the hell was happening for nearly a minute. If I’d have run off naked, I wouldn’t have gotten 10 or 15 yards. The element of surprise and confusion adds to the whole performance, the adventure.

What happened next? The police were going to let me go after a half hour, but the head of the NFL came running down. They’d spent millions and millions of dollars on security at the Super Bowl and I beat them all and they didn’t like it. At all. So they wanted the book thrown at me. The NFL even appeared in court against me. The head of the Reliant Stadium, the head of the NFL, a top policeman from Houston: they were going all out to get me sent to jail. I want to know where it says in the law books it’s illegal to make people laugh.

And what happened at the trial? I got a $1,000 fine, but I’d been paid $1m to do the Super Bowl. It was crazy because my lawyer turned out to be a woman – Richard Haynes assigned his right-hand lady, Sharon Levine, who’s sadly not with us anymore. The prosecutor was a lady, the judge was a lady, the jury were 12 women. So the fate of my life was in the hands of all these ladies, who found me guilty. But when the courtroom was cleared and it was just me, the prosecutor, my lawyer and the judge, all the jurors went into the jury room and all you could hear was laughter. The judge said to me, “Mr Roberts, in all my years on the bench, I have never heard laughter coming from the jury room. Would you please escort me into the room?” And she linked my arm, we walked in and everyone was cheering and laughing and even the judge started to laugh. It was cool, one of those experiences that you never forget.

That’s it? A $1,000 fine? I wanted to get back to the States in 2006 to travel, but I was stopped in Newark and turned around and sent back to the UK. The Super Bowl was only a misdemeanor, but because I’ve got a police record now in the States, I’m told I need to apply for a visa at the embassy in London. There’s still a chance I might not get in. I love the States, you know. I’ll be deeply disappointed if I can never go back to the US again.

Is there a perfect time to streak? It has to be when the game is not in play, because you don’t want to change the course of the game. So it’s usually just before kickoff, or at the beginning of the second half when everyone is watching. Before the ref blows his whistle, that’s the time to go. Not like that one absolute idiot who’s trying to copy me in Spain. He does it with his clothes on during the game.

Jimmy Jump? Yeah, he’s an idiot. He is. He ruins games. He changes the course of the match.

How do you ensure you get your clothes back? Usually when I run onto the pitch or the field, I rip my clothes off when I’m on. So the first thing the police or security do is grab my clothes, give them to me and tell me to put them back on. This happens 90% of the time. Occasionally I’ve had to leave and travel from city to city – you know those blue paper suits the police give you? – I’ve had to wear those a few times to come home. But in general I get them back straight away.

But not always. Once in Spain I didn’t. I streaked at a Barcelona-Real Madrid match in Madrid and the police took me to the station. At half one in the morning they said I could go. I asked if I could have my clothes and they said security had thrown them into the spectators. I’d told nobody I was going to Spain and my passport, my phone and my money were all in my clothes. So now I’m naked at half one in the morning in the middle of Madrid, but I found my way home.

How? Well, I’m running naked through the streets just to keep warm, initially not knowing where I’m going. This Spanish guy stopped me so I explained my situation. The only place I thought to go was the Holiday Inn near the Bernabeu where I’d picked up my ticket. The Spanish guy gave the taxi driver his last 20 euros for me to get back to the hotel. When I walked in, everybody had watched it on TV. They lent me a t-shirt and shorts, let me stay in a room, lent me 200 euros to get home and I made it back.

All this must cost a lot of money. The cost of tickets to some of these events alone must add up. How do you account for the costs? It’s who you know I suppose. Over the years I’ve got to know a lot of ticket brokers. There’s one in particular – he’s like a best friend – and if it’s a big event he’s normally there and he gives me tickets for free. I met him actually at the 2002 Champions League final. The very first time we met, he gave me a ticket for £20 when the face value was £60. That was the time I scored the goal against the Germans. Since then we’ve been really good friends. If it’s in another country, sometimes I’ll have a sponsor. As long as I have their name on my chest, they’ll pay the costs.

Do you ever research the penalties in the countries where you streak? No, I go in blind. Usually what I’m attempting has never been done before, so it’s uncharted territory. There’s no precedent, so I just have to put faith in the hands of the police when I get there. Every single country I’ve been to – I think I’ve streaked in 22 countries – it’s been the same. I’ve been treated the same by the police every time. The police are great, especially in England. They absolutely love it. They chase me laughing their heads off. The police after the Super Bowl, when I went to jail, took my mugshot and then duplicated it and asked me to sign it for them and their girlfriends.

It’s not really a violent crime, is it? It’s not a crime, really. If you look at it in the whole sense of things it’s just a bit of fun. Which makes it more appealing for me because it is supposed to be illegal. If it was legal, everybody’d be doing it. People are scared of consequence, you see. If you cancel out the fear, you can do anything.

Thanks to Jonathan Maghen

January 31st, 2015
Can Students Have Too Much Tech?

Credit Josh Freydkis

By SUSAN PINKER
NY Times Published: JAN. 30, 2015

PRESIDENT OBAMA’s domestic agenda, which he announced in his State of the Union address this month, has a lot to like: health care, maternity leave, affordable college. But there was one thing he got wrong. As part of his promise to educate American children for an increasingly competitive world, he vowed to “protect a free and open Internet” and “extend its reach to every classroom and every community.”

More technology in the classroom has long been a policy-making panacea. But mounting evidence shows that showering students, especially those from struggling families, with networked devices will not shrink the class divide in education. If anything, it will widen it.

In the early 2000s, the Duke University economists Jacob Vigdor and Helen Ladd tracked the academic progress of nearly one million disadvantaged middle-school students against the dates they were given networked computers. The researchers assessed the students’ math and reading skills annually for five years, and recorded how they spent their time. The news was not good.

“Students who gain access to a home computer between the 5th and 8th grades tend to witness a persistent decline in reading and math scores,” the economists wrote, adding that license to surf the Internet was also linked to lower grades in younger children.

In fact, the students’ academic scores dropped and remained depressed for as long as the researchers kept tabs on them. What’s worse, the weaker students (boys, African-Americans) were more adversely affected than the rest. When their computers arrived, their reading scores fell off a cliff.

We don’t know why this is, but we can speculate. With no adults to supervise them, many kids used their networked devices not for schoolwork, but to play games, troll social media and download entertainment. (And why not? Given their druthers, most adults would do the same.)

The problem is the differential impact on children from poor families. Babies born to low-income parents spend at least 40 percent of their waking hours in front of a screen — more than twice the time spent by middle-class babies. They also get far less cuddling and bantering over family meals than do more privileged children. The give-and-take of these interactions is what predicts robust vocabularies and school success. Apps and videos don’t.

If children who spend more time with electronic devices are also more likely to be out of sync with their peers’ behavior and learning by the fourth grade, why would adding more viewing and clicking to their school days be considered a good idea?

An unquestioned belief in the power of gadgetry has already led to educational snafus. Beginning in 2006, the nonprofit One Laptop Per Child project envisioned a digital utopia in which all students over 6 years old, worldwide, would own their own laptops. Impoverished children would thus have the power to go online and educate themselves — no school or teacher required. With laptops for poor children initially priced at $400, donations poured in.

But the program didn’t live up to the ballyhoo. For one thing, the machines were buggy and often broke down. And when they did work, the impoverished students who received free laptops spent more time on games and chat rooms and less time on their homework than before, according to the education researchers Mark Warschauer and Morgan Ames. It’s drive-by education — adults distribute the laptops and then walk away.

It’s true that there is often an initial uptick in students’ engagement with their studies — interactive apps can be fun. But the novelty wears off after a few months, said Larry Cuban, an emeritus education professor at Stanford.

Technology does have a role in education. But as Randy Yerrick, a professor of education at the University at Buffalo, told me, it is worth the investment only when it’s perfectly suited to the task, in science simulations, for example, or to teach students with learning disabilities.

And, of course, technology can work only when it is deployed as a tool by a terrific, highly trained teacher. As extensive research shows, just one year with a gifted teacher in middle school makes it far less likely that a student will get pregnant in high school, and much more likely that she will go to college, earn a decent salary, live in a good neighborhood and save for retirement. To the extent that such a teacher can benefit from classroom technology, he or she should get it. But only when such teachers are effectively trained to apply a specific application to teaching a particular topic to a particular set of students — only then does classroom technology really work.

Even then, we still have no proof that the newly acquired, tech-centric skills that students learn in the classroom transfer to novel problems that they need to solve in other areas. While we’re waiting to find out, the public money spent on wiring up classrooms should be matched by training and mentorship programs for teachers, so that a free and open Internet, reached through constantly evolving, beautifully packaged and compelling electronic tools, helps — not hampers — the progress of children who need help the most.

January 30th, 2015


356 Mission

January 28th, 2015
Library Visit, Then Held at Gunpoint

By Charles Blow
NY Times Published: JAN. 26, 2015

At Yale, the Police Detained My Son

Saturday evening, I got a call that no parent wants to get. It was my son calling from college — he’s a third-year student at Yale. He had been accosted by a campus police officer, at gunpoint!

This is how my son remembers it:

He left for the library around 5:45 p.m. to check the status of a book he had requested. The book hadn’t arrived yet, but since he was there he put in a request for some multimedia equipment for a project he was working on.

Then he left to walk back to his dorm room. He says he saw an officer “jogging” toward the entrance of another building across the grounds from the building he’d just left.

Then this:

“I did not pay him any mind, and continued to walk back towards my room. I looked behind me, and noticed that the police officer was following me. He spoke into his shoulder-mounted radio and said, ‘I got him.’

“I faced forward again, presuming that the officer was not talking to me. I then heard him say, ‘Hey, turn around!’ — which I did.

“The officer raised his gun at me, and told me to get on the ground.

“At this point, I stopped looking directly at the officer, and looked down towards the pavement. I dropped to my knees first, with my hands raised, then laid down on my stomach.

“The officer asked me what my name was. I gave him my name.

“The officer asked me what school I went to. I told him Yale University.

“At this point, the officer told me to get up.”

The officer gave his name, then asked my son to “give him a call the next day.”

My son continued:

“I got up slowly, and continued to walk back to my room. I was scared. My legs were shaking slightly. After a few more paces, the officer said, ‘Hey, my man. Can you step off to the side?’ I did.”

The officer asked him to turn around so he could see the back of his jacket. He asked his name again, then, finally, asked to see my son’s ID. My son produced his school ID from his wallet.

The officer asked more questions, and my son answered. All the while the officer was relaying this information to someone over his radio.

My son heard someone on the radio say back to the officer “something to the effect of: ‘Keep him there until we get this sorted out.’ ” The officer told my son that an incident report would be filed, and then he walked away.

A female officer approached. My son recalled, “I told her that an officer had just stopped me and pointed his gun at me, and that I wanted to know what this was all about.” She explained students had called about a burglary suspect who fit my son’s description.

That suspect was apparently later arrested in the area.

When I spoke to my son, he was shaken up. I, however, was fuming.

Now, don’t get me wrong: If indeed my son matched the description of a suspect, I would have had no problem with him being questioned appropriately. School is his community, his home away from home, and he would have appreciated reasonable efforts to keep it safe. The stop is not the problem; the method of the stop is the problem.

Why was a gun drawn first? Why was he not immediately told why he was being detained? Why not ask for ID first?

What if my son had panicked under the stress, having never had a gun pointed at him before, and made what the officer considered a “suspicious” movement? Had I come close to losing him? Triggers cannot be unpulled. Bullets cannot be called back.

My son was unarmed, possessed no plunder, obeyed all instructions, answered all questions, did not attempt to flee or resist in any way.

This is the scenario I have always dreaded: my son at the wrong end of a gun barrel, face down on the concrete. I had always dreaded the moment that we would share stories about encounters with the police in which our lives hung in the balance, intergenerational stories of joining the inglorious “club.”

When that moment came, I was exceedingly happy I had talked to him about how to conduct himself if a situation like this ever occurred. Yet I was brewing with sadness and anger that he had to use that advice.

I am reminded of what I have always known, but what some would choose to deny: that there is no way to work your way out — earn your way out — of this sort of crisis. In these moments, what you’ve done matters less than how you look.

There is no amount of respectability that can bend a gun’s barrel. All of our boys are bound together.

The dean of Yale College and the campus police chief have apologized and promised an internal investigation, and I appreciate that. But the scars cannot be unmade. My son will always carry the memory of the day he left his college library and an officer trained a gun on him.

January 26th, 2015
Where’s the Empathy?

Kevin Green, left, and Nicholas Kristof in 1977. Carlton Union High School

NY Times Published: JAN. 24, 2015
By Nicholas Kristof

YAMHILL, Ore. — THE funeral for my high school buddy Kevin Green is Saturday, near this town where we both grew up.

The doctors say he died at age 54 of multiple organ failure, but in a deeper sense he died of inequality and a lack of good jobs.

Lots of Americans would have seen Kevin — obese with a huge gray beard, surviving on disability and food stamps — as a moocher. They would have been harshly judgmental: Why don’t you look after your health? Why did you father two kids outside of marriage?

That acerbic condescension reflects one of this country’s fundamental problems: an empathy gap. It reflects the delusion on the part of many affluent Americans that those like Kevin are lazy or living cushy lives. A poll released this month by the Pew Research Center found that wealthy Americans mostly agree that “poor people today have it easy because they can get government benefits without doing anything in return.”

Lazy? Easy? Kevin used to set out with his bicycle and a little trailer to collect cans by the roadside. He would make about $20 a day.

Let me tell you about Kevin Green. He grew up on a small farm a couple of miles from my family’s, and we both attended the same small rural high school in Yamhill, Ore. We both ran cross country, took welding and agriculture classes and joined Future Farmers of America. After cross country practice, I’d drive him home to his family farm, with its milk cows, hogs and chickens.

The Greens encapsulated if not the American dream, at least solid upward mobility. The dad, Thomas, had only a third-grade education and couldn’t read. But he had a good union job as a cement finisher, paying far above the minimum wage, and he worked hard and made sure his kids did, too. He had no trouble with the law.

Kevin and his big sister, Cindy — one of the sweetest girls in school — both earned high school diplomas. Kevin was sunny, cheerful and astonishingly helpful: Any hint that something needed fixing, and he was there with a wrench. But then the dream began to disintegrate.

The local glove factory and feed store closed, and other blue-collar employers cut back. Good union jobs became hard to find. For a while, Kevin had a low-paying nonunion job working for a construction company. After that company went under, he worked as a shift manager making trailer homes. He fell in love and had twin boys that he doted on. But because he and his girlfriend struggled financially, they never married.

Then, about 15 years ago, Kevin hurt his back and was laid off. Soon afterward, his girlfriend moved out, took the kids and asked for child support. The loss of his girlfriend, kids and job was a huge blow.

“It knocked him to the dirt,” says his younger brother, Clayton, also a pal of mine. “It destroyed his self-esteem.”

Kevin’s weight ballooned to 350 pounds, and he developed diabetes and had a couple of heart attacks. He grew marijuana and self-medicated with it, Clayton says, and was arrested for drug offenses.

My kids would see Kevin and me together and couldn’t believe he had run cross country with me, and that he wasn’t 20 years older.

Kevin eventually got disability benefits, but he was far behind in child support and was punished by losing his driver’s license — which made it pretty much impossible to get a job in a rural area. Disability helped Kevin by providing a monthly check that he desperately needed, but it also hurt him because he might have looked harder for a job if he hadn’t been getting those checks, Clayton says.

Yet it’s absurd to think that people like Kevin are somehow living it up. After child support deductions, he was living on about $180 a month plus food stamps and a small income from selling home-grown pot. He supplemented this by growing a huge vegetable garden and fishing in the Yamhill River.

Three years ago, Cindy died of a heart attack at 52. Then doctors told Kevin a few weeks ago that his heart, liver and kidneys were failing, and that he was dying. He had trouble walking. He was in pain.

He was also worried about his twin boys. They had trouble in school and with the law, and were jailed for drug and other offenses. The upward mobility that had seemed so promising a generation ago turned out to be a mirage. Family structure dissolved, and lives became grueling — and shorter.

Kevin wrote a will a few days before he died. He bequeathed his life’s savings of $3,500 to his mom for his funeral expenses. Anything left over is to be divided between his children — and he begs them not to fight over it. His ashes will be sprinkled on the farm.

I have trouble diagnosing just what went wrong in that odyssey from sleek distance runner to his death at 54, but the lack of good jobs was central to it. Sure, Kevin made mistakes, but his dad had opportunities for good jobs that Kevin never had.

So, Kevin Green, R.I.P. You were a good man — hardworking and always on the lookout for someone to help — yet you were overturned by riptides of inequality. Those who would judge you don’t have a clue. They could use a dose of your own empathy.

January 25th, 2015
roger herman

Untitled Pot, 2010
Ceramic and Glaze
20 x 12 x 12 inches

Roger Herman

January 24th, 2015
Whose Tomb? Greece Wonders

Two headless sphinxes at the entrance to a tomb in northern Greece. The country’s Culture Ministry announced this week that the bones of five people were found inside.

By RACHEL DONADIO
NY Times Published: JAN. 23, 2015

Call it “CSI: Alexander.” For months, the excavations at a large ancient tomb in northern Greece have gripped the country. First, a marble slab wall was unearthed. Then, through announcements and leaks from the Greek Culture Ministry meted out with the pacing of a good mystery series, headless sphinxes and other statues were found. Finally, bones! But whose?

Is it possible — as culture officials implied with a wink and a nod but never actually stated — that the tomb could be for the family of Alexander the Great? Archaeologists say it’s highly unlikely. But that’s hardly the point. By the time the Culture Ministry announced this week that the bones of five people, not one, had been found in the tomb, it was the latest episode in an archaeological reality show that has entertained and distracted Greeks from their economic troubles.

The show has also starred Prime Minister Antonis Samaras. Even before he found himself fighting for political survival in national elections to be held this Sunday, Mr. Samaras used the excavation in Amphipolis, in the Greek region of Macedonia, to tap into national pride. He made a televised visit to the tomb in August — widely seen as an evocation of Alexander’s legacy — and later showed Chancellor Angela Merkel of Germany artifacts found at the site. This week, Mr. Samaras, whose conservative New Democracy party was trailing the leftist Syriza Party in a close race, mentioned Amphipolis in a campaign speech, calling Macedonia “the eternal bastion of Greece.”

“It’s a kind of positing of national pride, but also nationalist connotations and feelings,” said Yannis Hamilakis, a professor of archaeology at the University of Southampton, in England, and the author of “The Nation and Its Ruins: Antiquity, Archaeology, and National Imagination in Greece.” Conjuring up Alexander the Great — “his assumed civilizing campaigns, his conquest of the Orient” — emphasizes his place in “the Western imagination as a whole,” Mr. Hamilakis said.

While acknowledging that the tomb is a significant find that deserves public enthusiasm, Mr. Hamilakis and other archaeologists argue that the dig has been conducted hastily and in a way that places popular appeal over serious scholarship. The Greek news media have thrived on the story — a rare bright spot in a cycle dominated by austerity and unemployment. Archaeology buffs have taken to the blogosphere, floating their own theories. Last year, “Amphipolis” was the most popular search term on Google in Greece.

Busloads of tourists have flocked to the site but can view it only from afar because the Culture Ministry has blocked access. The ministry also declined to allow Aikaterini Peristeri, the lead archaeologist on the dig, to be interviewed. (As elections neared, a cartoon in the Greek press showed Mr. Samaras lifting up Ms. Peristeri, whose last name means dove in Greek, as if she were a peace offering.)

The mound where the tomb was discovered lies in the ancient city of Amphipolis, about 200 miles north of Athens. Archaeologists began excavating the site in the 1960s. After a new dig started in 2012, archaeologists unearthed a 9-foot-high marble slab that would later prove to be part of a large wall with a nearly 500-meter perimeter enclosing the tomb. The Culture Ministry dates the wall to the fourth century B.C.

But the dig really picked up in June, and archaeologists eventually discovered two headless sphinxes at the tomb’s entrance, “an impressive and unique feature not previously encountered in Macedonian tombs,” Anna Panagiotarea, a spokeswoman for the Culture Ministry, wrote in response to questions. This “understandably generated great interest, excitement, enthusiasm and expectations among scientists, the media and the public alike,” she added.

The Alexander factor emerged in August, when Mr. Samaras visited the site and his culture and sports minister, Konstantinos Tasoulas, made elliptical comments about how Greece had been waiting 2,300 years for the tomb to be discovered — an implicit reference to the era of Alexander the Great, who in the fourth century B.C. was tutored by Aristotle and built up a vast empire before dying in Babylon.

Last fall, archaeologists said they had found three vaulted chambers behind the facade decorated with the sphinxes, as well as a mosaic depicting the abduction of Persephone by Hades and two female statues known as caryatids, each 7 feet tall.

On Monday, months after announcing that bones had been found in the tomb, the Culture Ministry said the fragments belonged to at least five bodies — two men ages 35 to 45, a woman over the age of 60, an infant and a fifth body whose bone fragments showed signs of cremation. One of the men’s bones had cut marks, possibly indicating wounds by a sharp object. The ministry said it would conduct DNA exams to see if there are family links between the bodies.

Though the official news release said nothing about Alexander the Great, a report on Monday in the Athens daily Kathimerini cited vague sources at the Culture Ministry saying that the female skeleton might be Olympias, Alexander’s mother, who was murdered after his death. The article first appeared under the headline “Amphipolis Scientists Point to Olympias,” but was later revised to read “Skeletons Pose Many Questions.”

Archaeologists have said it’s more likely that Olympias was buried alone, and they cite multiple inscriptions that place her burial site in the city of Pydna, in northern Greece. They also say there’s no evidence that the tomb was that of Alexander the Great or his family. “No, we don’t believe it is,” said Olga Sakali, the president of the Association of Greek Archaeologists. “All the historical sources that we have until now don’t give us such a clue.”

The association filed a formal complaint with the Culture Ministry protesting the timing and methodology of the dig, which it says bypassed normal procedures. Its members were aghast that mechanical earth-moving equipment was used instead of more delicate means. “It was an excavation for the media,” Ms. Sakali said.

In a statement, the Culture Ministry said that the earth-moving equipment was “restricted to the early stages of the excavation,” at which point “expert staff” took over and “followed all established scientific methodologies, protocols and etiquette to the fullest.” The ministry said the dig had not been “hasty, archaeologically ‘unorthodox’ or scientifically improper.”

Regardless of the debate, many have just enjoyed the ride. “It’s great material,” said Antonis Kanakis, the host of Radio Arvyla, a comedy program on Greek television, who “interviewed” a skeleton in a popular sketch last fall. Mr. Kanakis quoted one of his subject’s answers: “O.K., you’re very proud of your ancestors, but have you asked your ancestors if they’re proud of you?”

January 24th, 2015
Bonfire of the Humanities

Xi’an’s Terracotta Warriors were discovered by a group of seven farmers digging a well on their communal farm in 1974.
Historians are losing their audience, and searching for the next trend won’t win it back.

By Samuel Moyn
The Nation
January 20, 2015

History has a history, and historians rarely tire of quarreling over it. Yet for the past few centuries, historians have maintained an uneasy truce over the assumption that the search for “facts” should always take precedence over the more fractious difficulty of interpreting them. According to Arnaldo Momigliano, the great twentieth-century Italian scholar of ancient history, it was the Renaissance antiquarians who, though they did not write history, inadvertently made the modern historical profession possible by repudiating grand theory in order to establish cherished fact. The antiquarians collected remnants of the classical past, and understandably they needed to vouch for the reliability of their artifacts at a time when so many relics were wrongly sourced or outright fakes. Momigliano cited the nineteenth-century Oxford don Mark Pattison, who went so far as to remark about antiquarians—approvingly—that “thinking was not their profession.” It may remain the whispered credo required for admission to the guild.

More wary than anthropologists, literary critics or political scientists of speculative frameworks, historians generally have been most pleased with their ability simply to tell the truth—as if it were a secret to be uncovered through fact-finding rather than a riddle to be solved through interpretation. Anthony Grafton once honored Momigliano with the title “the man who saved history,” and it seems fair to say that the latter voiced the consensus of a profession that makes facts almost sacred and theories essentially secondary.

Even when historians started to think a little, they did so gingerly. If antiquarians merely paved the road for modern history, to proceed down it required doing more than displaying the hard-won truth. Momigliano reported that it took a while for our early modern intellectual ancestors to suspect that they could ever improve on the classical historians of Greece and Rome, thanks to the new facts that antiquarians had eked out. The true antiquarians simply stashed their goods and, Momigliano vividly wrote, shivered in “horror at the invasion of the holy precincts of history by a fanatic gang of philosophers who travelled very light.” But their heirs, like Edward Gibbon, author of the stupendous Decline and Fall of the Roman Empire, realized that storytellers would have to take on board speculation or “philosophy,” corralling facts within an intellectual scheme to lend them meaning. Facts alone were blind, just as theory was empty on its own. Yet Momigliano, sharing Pattison’s approval of the antiquarian origins of history, acknowledged the necessity of thinking almost regretfully, as if the results were an inevitably ramshackle edifice built on the bedrock of fact that it was the real job of historians to lay down. Theories could be stripped away, and stories renovated as fashion changed, but the facts on which the edifice was built would endure. The “ethics” of the profession, Momigliano testified, rested on the ability of historians to stay true to them.

In the early days of Gibbon’s Enlightenment, most of the frameworks on which historians relied were theories about the origins and progress of society; in the two centuries since, historians have been willing to have their facts consort with a wide variety of suitors, from nationalism to Marxism to postmodernism. The discipline has gone through so many self-styled theoretical “turns” that it is frankly hard to keep up. It is paradoxically because most historians have looked on theory with suspicion—as a lamentable necessity, at best, to allow the facts their day—that they have often been avid trend-watchers. Precisely because they are so fickle, opportunistic and superficial in their attitude to speculation, historians seem to change popular theories often, treating them not as foundations to be built on, but as seasonal outfits to clothe the facts they have so assiduously gathered.

* * *

Today, historians worry that they have lost their audience, and their distress has made the search for the next trend seem especially pressing. At the beginning of her new book, Writing History in the Global Era, Lynn Hunt remarks that “history is in crisis” because it can no longer answer “the nagging question” of why history matters. David Armitage and Jo Guldi, in their History Manifesto, concur: in the face of today’s “bonfire of the humanities,” and a disastrous loss of interest in a topic in which the culture used to invest heavily (and in classes that students used to attend in droves), defining a new professional vocation is critical. History, so often viewed as a “luxury” or “indulgence,” needs to figure out how to “keep people awake at night,” as Simon Schama has said. Actually, the problem is worse: students today have endless diversions for the wee hours; the trouble for historians is keeping students awake during the day.

In the last few decades, Hunt has had the most reliable eye for new trends in the American historical profession, and what she considers important always amounts to more than the sum of her current enthusiasms. You may not like the enterprises she is bullish on; you may try to blow up one of her bandwagons—as I did in these pages when she invented human-rights history—only to find yourself riding it for life [“On the Genealogy of Morals,” April 16, 2007]. What you cannot dispute is that she has a preternatural sense of the new new thing being touted by historians to study old things.

Like a few other famous trendsetters, Hunt, who recently retired from UCLA, was trained in the 1970s during the rising tide of social history, when what mattered most was learning about the ordinary men—and, even more important, women—lost to the enormous condescension of posterity. Having focused for centuries on kings (and, eventually, presidents) and their wars and diplomats and negotiations, historians realized that they had mostly ignored the social forces pulsing from below, and they longed to identify with the forgotten people who had been written out of history simply because they were not elites. Social historians often had left-wing sympathies, and, following the lodestar of E.P. Thompson’s luminous The Making of the English Working Class (1963), they wanted social history to chronicle the rise in political consciousness of the laboring people (and, later, other oppressed or marginalized groups) who deserved justice. Because they were interested in the shape of society and not only its working classes, social historians drew on a then-newfangled body of thought. It was not just left-wing politics but Marxism as a theory of society that prospered under social history’s reign; in turn, the whole tradition of such thinking, from the Enlightenment to Emile Durkheim and Max Weber, became canonical.

Hunt left the fold in the 1980s, bolting for what she famously dubbed “the new cultural history.” Worlds became full of meaning, renegade social historians discovered, and the representations of power that people create, the rituals they practice, and the ways they interpret their worlds now trumped basic information about the social order. It wasn’t enough to understand the class structure at the time of the French Revolution, Hunt argued in her landmark book Politics, Culture, and Class in the French Revolution (1984); one also needed to understand the world of political symbols and “political culture” that made social action meaningful—especially since class turned out not to matter as much as the Marxists believed. Trading in Marxism for anthropology and “postmodern” theory, the new cultural history was, among other things, a protest against the tabulation of people according to static categories like “the workers” or “the peasantry,” and its breakthrough coincided with the failure of political efforts to win greater social equality.

Then Hunt changed her mind again. No sooner had the ink dried on The Family Romance of the French Revolution (1992)—a creative application of Sigmund Freud’s originally individualized psychoanalysis to a collective event, which remains her most interesting book—than she declared that “theory” had gone too far. It seemed, Hunt complained, to be little more than a recipe for saying whatever you want. “Postmodernists often put the word ‘reality’ in quotation marks to problematize the ‘there’ out there,” Hunt and several colleagues wrote in Telling the Truth About History (1994). But this statement wasn’t itself realistic—the point of theory is that no “reality” is self-interpreting—and her verdict could hardly prove the uselessness of broader frameworks of interpretation, except to those who treat them as secondary in the first place. Frightened by the whirling fashions that seemed to threaten mere chaos, Hunt rallied around facts. She declared the cultural turn a vast mistake, and postmodernism a tissue of error. From whatever heaven or hell they reside in, the antiquarians were smiling.

* * *

But if facts provide permanent refuge to historians, fashions continue to entice them. Twenty years on, Hunt is again scrutinizing the latest trends, and the opinions she offers about them in Writing History in the Global Era should not be taken lightly. She begins by reviewing the shift from social to cultural history. As she confesses, one big problem with the search for “meaning” in the past is that it was so vague as to be useless, even if it showed that a shortcoming of social history was an incessant focus on anonymous and supposedly objective processes. But cultural history proved to be another cul-de-sac. Hunt explains it with a different metaphor: “What began as a penetrating critique of the dominant paradigms ended up seeming less like a battering ram and more like that proverbial sucking sound of a flushing toilet.” In Hunt’s telling, the clear need even two decades ago was for a new “paradigm” for historians to apply to their facts. But what is it?

Where cultural history often emphasized the small and the local, Hunt continues, the current wave of interest in “globalization” favors the far-flung. It gets its name from a process exalted by Thomas Friedman and excoriated by Naomi Klein, and Hunt shows that historians have hardly been immune from suddenly discovering the world beyond their cramped former national or regional redoubts. She also shows that the very term “globalization” has experienced a crescendo in the past two decades, with books and articles pouring forth from presses offering global histories on a welter of subjects. We have been treated to global histories of cod, comics and cotton, and one publisher offers a series dedicated to global accounts of foodstuffs like figs, offal, pancakes and pizza. German historian Jürgen Osterhammel’s history of the nineteenth century, The Transformation of the World, shows what life was like when it took eighty days to travel around the globe, anticipating our age of supersonic movement of people and instantaneous transmission of bytes. Even Hunt has recently gotten into the act, editing a book about the French Revolution from a global perspective.

Proponents of globalizing history have persuasively argued that history has remained “Eurocentric,” but Hunt rightly asks whether the contemporary fashion of writing history across large spaces does more than drastically expand the canvas for historical depiction. “Is globalization a new paradigm for historical explanation that replaces those criticized by cultural theories?” she asks. It may enlarge the scale of study, focusing on long-distance trade, far-flung empire or cross-border war, but such a perspective could merely draw greater mountains of facts in view, without explaining what they mean or why they matter.

What global history emphatically does not prove is that the classic authorities for interpreting the past have become obsolete, especially since Karl Marx himself described the phenomenon now called globalization. Hunt’s starting point is different. She argues that because she and her fellow cultural historians so irreparably damaged the social theories that commanded history from Gibbon’s time to our own, the options for doing history now can only take one of two forms. One is to do without any reigning “paradigm,” which Hunt stipulates cultural history never had—beyond a general commitment to recapturing meaning, without agreement on how to interpret it. The other is to invent a new paradigm. Hunt’s fear is that globalization, because it foregrounds anonymous processes once favored by social historians, will end up preferring the sorts of frameworks they once relied upon. Globalization could, that is, make obsolete the insights of the cultural revolution Hunt originally sponsored, while doing nothing to lead historians beyond the limits she now thinks are intrinsic to a global focus.

To her credit, Hunt makes it clear that her need for a new dispensation is hardly universal within the profession. It is conventional to group Hunt with her generational colleagues Joan Scott and William Sewell, since all three bolted from social history in a crowd, and all three have regularly explained their turns over the years. (Sewell is the author of the greatest book in the historiographical landscape, Logics of History: Social Theory and Social Transformation [2005].) But as Hunt notes, Scott has stuck it out with postmodernism—apparently believing it more defensible than Hunt does—while Sewell has gone “backwards” to Marxism. Hunt is not satisfied with either choice: “Must historians choose between a return to the previous paradigms,” she wonders, “or no paradigm at all?”

For Hunt to ask this question, her twin premises—that cultural history utterly devastated social theory, while generating no real interpretive worldview of its own—must bear a lot of weight. Perhaps too much: Sewell doesn’t think the first is true, while Scott would bridle at the second. For that matter, you might wonder whether the source of the problem is the roller coaster of approaches and its endless loops, which produces the demand for a new new theory.

Bravely, Hunt forges ahead to shape her own paradigm, in what is the most interesting chapter of her book. She concludes that historians need a novel approach to society—or, more precisely, a theory of the mutual relationship between the individual self and the larger society. Neither social nor cultural history, which submerged the individual in a larger system of forces or meaning—often to the point of rendering him entirely insignificant—could possibly fit the bill, Hunt says. But there is good news: “Ideas about the society-self connection are now emerging from an unlikely conjunction of influences.” Her goal is to spell out what these are, as sources for a new paradigm.

Two of Hunt’s sources are evolutionary neuroscience and cognitive psychology, which she tinkered with in earlier work. Her enthusiasm for them appears strange, given that the rule of biological processes is hardly less anonymous and deterministic than a globalizing turn that effaces human agency. Importing newfangled theories from other esoteric fields and leaning on works of pop science doesn’t seem like a recipe for success. Remember the crop of historians of the late nineteenth and early twentieth centuries who placed their bets on scientific racism? Nobody does, except as cautionary tales, because their work is worthless.

What becomes even more confusing is that Hunt grafts this trend onto a return to the hoary tradition of social theory that she explicitly admits is simply a broader version of the approaches that cultural history supposedly overturned. The idea that “the social is the ground of meaning”—in Hunt’s ultimate formula—was central to the tradition of thinking from the Neapolitan sage Giambattista Vico to Durkheim, Marx and Weber. It may be that social historians badly misunderstood this tradition in their efforts to think about society in terms of broad categories of people, just as cultural historians reversed the error in celebrating “meaning” as a separate object. But in her proposed return to the social, Hunt is essentially admitting that we progress not by seeking a new paradigm, but by fixing past mistakes. One of the biggest is the trend-driven thought that historians had to choose between studying society and studying culture, even if that false choice once made sense to Hunt and her generation.

For this reason, Hunt’s book sometimes reads as if we have to live her own intellectual life story in order to follow her venture to craft a new paradigm. It could be, however, that all this talk of “paradigms” is misleading—a distraction from the fact that the relation of self and society has been the constant concern of social theory since its origin, and that there is a huge range of options within that tradition to explore and improve upon. Hunt repudiates the common postmodern position that the self is a historical product, as if merely proposing a compromise between the claims of society and the self were specific or sufficient. Even when it comes to her own modish neuroscientific flourish, Hunt connects it to an older French thinker, Maurice Merleau-Ponty, and his broader notion that selves are embodied. But like Marcel Gauchet, a contemporary Frenchman on whom she draws heavily, Merleau-Ponty is merely one figure within a rich fund of resources in social thought.

Hunt raises but never resolves what may be the key quandary for historians today. The emergence of global history inevitably makes one wonder whether the categories—starting with “society” itself—that Westerners have devised to study themselves are applicable to peoples of all times and climes. Hunt repudiates extremist commentators who insist that Western categories can only ever explain Western things, but it is not clear that this rejection overcomes the difficulty.

* * *

Whereas Hunt wants to reckon with the fashion of globalization, Armitage and Guldi are interested in larger time scales and not merely expanded geographical spaces. Armitage, a trusted Harvard colleague of mine, has never been above spotting trends himself, having already helped define the study of Atlantic history, Pacific history and international history. Now he has a couple of new themes—long-term history and present-minded history—and in his effort to expound them he is joined by Guldi, a younger whiz kid who is an expert in “big data.”

Their exciting argument goes like this: in the past few decades, historians have dropped their emphasis on what the French historian Fernand Braudel called the longue durée. In his celebrated history of the Mediterranean Sea littoral, published in 1949, Braudel insisted on the superior reality of the long-term rhythms of life. The commanding forces of demography and environment, Braudel assumed, made individuals—even kings—mere “dust.” Armitage and Guldi offer a series of reasons why, contrary to Braudel’s inspiring example, historians broke for the short term. Perhaps the main one was cultural history: “meaning” seemed inevitably tied to a specific time and place, in ways that grand stories across vastly different times would always slight. But there were other reasons, too, like the pressures of finding new topics in the professional competition for turf. The results, Armitage and Guldi believe, were profound, as the average time scale of history books was precipitously compressed.

But retrieving our sensitivity to what the pair somewhat mysteriously call “vibrations of deeper time” is not just an attempt to return to Braudel’s cool and remote surveys of aeons. The real reason to ascend to Olympian heights and the sweeping gaze they allow, Armitage and Guldi say, is to plunge into the political affairs of the city. How is it, they ask, that history, which since classical times played the role of magistra vitae—roughly, a teacher for living—and above all a guide for political actors, has now been rudely displaced by other fields, and especially by dismal (and often disastrous) economic thinking? History used to be, if not exactly philosophy, then at least “philosophy teaching by examples,” in the saying traditionally attributed to Thucydides that the early modern Viscount Bolingbroke repeated in his Letters on the Study and Use of History (1735).

In this plea for relevance, Armitage is cutting against the famous stricture of his mentor, the Cambridge University don Quentin Skinner: if thinking is to be done, it has to be done “for ourselves,” without the aid of historical perspective. Where Skinner voiced a conventional antiquarian view that the role of writing history is to cut the present off from very different pasts, Armitage and Guldi insist on the operative value of historical work, indeed for the highest public causes. After chronicling the cult of the short term, the two turn to the pressing political reasons for abandoning it in order to bring the long term to bear on our present, with the help of new digital tools. Historians need, they say, to immerse themselves in the vast digital archives of searchable information now on offer, compared with which the old search for individual archival documents looks narrow and quaint.

Even as they have some wise and penetrating things to say about the new services that big data affords, Armitage and Guldi make it clear that their brief is not for every historian to shift to the long term. In defense of such smaller-scale work, they cite none other than Lynn Hunt. Time-bound and local puzzles will always remain to be confronted; but for Armitage and Guldi, the really uplifting new new thing is that computerized data and computing power allow rapid solutions to challenges that took Braudel and his ilk a decade to decipher. And these, they argue, could in turn allow historians a return to the public stage, whether in debates about international governance or global land reform.

* * *

Armitage and Guldi are careful to distinguish their notion of the long term from other calls for “deep” and “big” history. Given her scientism, Hunt has a soft spot for the call for depth, which is associated with another Harvard scholar, Daniel Smail, author of On Deep History and the Brain (2008). Smail refuses to restrict the history of humanity to the last few millennia and their documentary record, when archaeology and especially biology provide tools to extend history back much further. For acolytes of “big” history, like the Australia-based scholar David Christian, “deep” history that starts so late—with human beings—is itself too unambitious. It’s an argument that has resonated beyond the ivory tower. Bill Gates has been agitating for high schools to teach history starting with the Big Bang. “I just loved it,” Gates told The New York Times of his experience exercising on his treadmill while watching Christian explain the concept of big history on a video. “It was very clarifying for me. I thought, God, everybody should watch this thing!”

Perish the thought. Apart from the fact that Gates’s scientism sacrifices the critical perspective that humanists have learned to maintain since their disastrous nineteenth-century dalliance with biology and other natural sciences, the trouble with massive expansions of the time line, even just to the totality of human history, is simple: it forces historians to become scientists, effectively converting their discipline into what is already somebody else’s job. Gates’s big historians already exist: they are called physicists. In any case, this is not what Armitage and Guldi seem to want. They justifiably insist that humanistic inquiry like history is supposed to provide an alternative to “the natural-law models of evolutionary anthropologists, economists, and other arbiters of our society.” More than that, excessive expansion forfeits the idea that the drama of human history concerns the fate of our own ends, and therefore what we ought to care about most, even when those ends affect the nonhuman world.

Yet even in their comparatively modest call for long time lines to confront burning problems (including a literally burning earth), Armitage and Guldi have no answer to what has always been the really hard question: How do you interpret facts across a tiny or huge time scale? Just as the globe provides a larger space, an extended time line merely allows a longer frame. To think about what happens in the sunlit uplands beyond the confinement of the local and time-bound, you need a theory. Data—including big data about the long term—is never self-interpreting. Nor is orientation toward the past for the sake of the future solely a problem for which more information is the solution; it is ultimately a philosophical problem that only speculation can solve. This was the point of social theory from Vico to Marx: to integrate necessary facts with a vision of human becoming, which never lacked an ethical and political dimension. Arguably, it is this, most of all, that people need today, not merely a proclivity for the long term.

Armitage and Guldi have no use for Marx except to inspire their title, and to allow them to begin their book by invoking the specter of the long term and to end it by demanding that the historians of the world unite. Unlike Hunt, they do not regard the newly won chronological sweep—like the larger space of globalization—as something that has to be filled by some theory or other that allows new or big (or old or little) data to be interpreted in compelling ways. Or if they do, it is not the focus of their brief for ambition.

Even our boldest trendsetters, then, do not see the wall between history and philosophy as the final frontier to breach, in part because it was the first wall that antiquarians in love with their facts erected to define the discipline. Armitage and Guldi wisely remark that fashionable “critical turns” conceal “old patterns of thought that have become entrenched.” Of these, the most durable is not the affection for the short term, but the refusal to risk the certainty of facts for the sake of a fusion of history and philosophy.

* * *

In 1966, Hayden White published “The Burden of History,” his still invigorating attack on his professional colleagues. “History is perhaps the conservative discipline par excellence,” White wrote, coming out swinging, and perhaps most of all against the factological ethics so central to the modern craft. The consequences, according to White, were grave: “As history has become increasingly professionalized and specialized, the ordinary historian, wrapped up in the search for the elusive document that will establish him as an authority in a narrowly defined field, has had little time to inform himself of the latest developments in the more remote fields of art and science.”

Momigliano wrote a notorious polemic against White (a former teacher of mine) precisely for denigrating the recovery of factual truth, which he thought central to history. But if Momigliano turned that recovery into a punishing imperative of the historical superego, White wanted to substitute a different “ethics” for history—one that would make room for theory, or even insist on seeing beyond the contrast between history and theory, in the service of the present. Now in his late 80s and still ahead of his time, White is back this year with his own lively new book, The Practical Past.

Because the past needs to be practical for us—there is no reason to care about it except insofar as it is useful to the present—White begins his book by once again putting Momigliano’s professional “ethics” in their proper place:

The older, rhetorically structured mode of historical writing openly promoted the study and contemplation of the past as propaedeutic to a life in the public sphere, as an alternative ground to theology and metaphysics (not to mention as an alternative to the kind of knowledge one might derive from experience of what Aristotle called the “banausic” life of commerce and trade), for the discovery or invention of principles by which to answer the central question of ethics: “What should (ought, must) I do?” Or to put it in Lenin’s terms: “What is to be done?”

It seems as if, in roundabout ways, all of our current historiographical trend-followers finally agree with White, in the face of what they regard as a great crisis for historical writing today. But it is one thing to call for speculation for the sake of relevance, and another to bring about a new marriage of history and philosophy. For the coming generation, one thing is clear: thinking will have to become our profession.

Writing History in the Global Era
By Lynn Hunt.

The History Manifesto
By Jo Guldi and David Armitage.

The Practical Past
By Hayden White.
Edited by Ed Dimendberg.

January 24th, 2015
Betty Woodman

The Chartreuse Table, 2014, glazed earthenware, epoxy resin, lacquer, acrylic paint, canvas, wood, 70 x 85 x 12 inches (177.8 x 215.9 x 30.5 cm)

January 31 — March 21, 2015

David Kordansky
