Tuesday, March 10, 2009

'The War Behind Me: Vietnam Veterans Confront the Truth About U.S. War Crimes,' by Deborah Nelson


"Villagers, acting as human minesweepers, walked ahead of troops in dangerous areas to keep Americans from being blown up. Prisoners were subjected to a variation on waterboarding and jolted with electricity. Teenage boys fishing on a lake, as well as children tending flocks of ducks, were killed. “There are hundreds of such reports in the war-crime archive, each one dutifully recorded, sometimes with no more than a passing sentence or two, as if the killing were as routine as the activity it interrupted,” Deborah Nelson writes in “The War Behind Me.”

The archive in question, a set of Army documents at the National Archives and Records Administration in College Park, Md., reveals widespread killing and abuse by American troops in Vietnam. Most of these actions are not known to the public, even though the military investigated them. The crimes are similar to those committed at My Lai in 1968. Yet, as Nelson contends, most Americans still think the violence was the work of “a few rogue units,” when in fact “every major division that served in Vietnam was represented.” Precisely how many soldiers were involved, and to what extent, is not known, but she shows that the abuse was far more common than is generally believed. Her book helps explain how this misunderstanding came about.

After the My Lai story broke, officials acted quickly. They looked into other crimes — for example, studying anonymous letters sent to superiors by “Concerned Sgt.,” which described the deaths of hundreds of civilians, or “a My Lai each month for over a year.” Serious offenses were indeed investigated, and 23 men were found guilty, though most got off easy. The harshest sentence was 20 years’ hard labor, for the rape of a 13-year-old girl by an interrogator in a prisoner-of-war compound. The rapist served seven months and 16 days.

“Get the Army off the front page,” President Richard Nixon reportedly said. Investigations were a good way to do that. A cover-up attracts attention; a crime that is being looked into does not. The military investigations, Nelson argues, were designed not to hold rapists and murderers accountable, but to deflect publicity. When reporters heard about a war crime, they’d call the Army to see if it would provide information. If they suspected a cover-up, they’d pursue the story. If a military spokesman said an investigation was under way, the story was usually dropped.

Nelson, who wrote a series on war crimes with a military historian when she was at The Los Angeles Times, is a diligent, passionate reporter. Her zeal, though, sometimes leads to awkward moments. In Vietnam, villagers tell her about killings that took place in a ravine, giving her “hope” that she has discovered a hamlet where a massacre occurred in 1968. It is a different massacre, as it turns out; she seems vaguely disappointed.

Still, this is an important book. Nelson demonstrates that cover-ups happen in plain sight and that looking for an exclusive can blind reporters to the real story. She also points out that these crimes are endemic to counterinsurgency operations. When troops fight among a civilian population, in conflicts that extend for years, atrocities are almost bound to happen. “If we rationalize it as isolated acts, as we did in Vietnam and as we’re doing with Abu Ghraib,” a retired brigadier general tells her, “we’ll never correct the problem. Counterinsurgency operations involving foreign military forces will inevitably result in such acts, and we will pay the costs in terms of moral legitimacy.” Whether it’s Vietnam or Iraq, the truth is disturbing. “After such knowledge,” T. S. Eliot wrote, “what forgiveness?”

THE WAR BEHIND ME
Vietnam Veterans Confront the Truth About U.S. War Crimes
By Deborah Nelson
296 pp. Basic Books. $26.95






'Giordano Bruno: Philosopher, Heretic,' by Ingrid D. Rowland


"It has become an overused word, but Giordano Bruno may justly be described as a maverick. Burned at the stake in Rome on Ash Wednesday in 1600, he seems to have been an unclassifiable mixture of foul-mouthed Neapolitan mountebank, loquacious poet, religious reformer, scholastic philosopher and slightly wacky astronomer. His version of Christianity is impossible to label. Educated by the Dominicans — the guardians of Catholic orthodoxy in those days — he revered certain scriptures and the writings of St. Augustine, always doubted the divinity of Jesus and flirted with dangerous new ideas of Protestantism, and yet hoped that the pope himself would clear him of heresy.

Bruno was a martyr to something, but four centuries after his immolation it is still not clear what. It doesn’t help that the full records of his 16 interrogations in the prisons of the Roman Inquisition have been lost or destroyed. The enigma of Bruno runs deeper than that, as Ingrid Rowland, a scholar of the Renaissance who teaches in Rome, makes clear in her rich new biography, “Giordano Bruno.” Was he some sort of scientific pioneer, to be compared with Galileo, whose milder encounter with the Roman Inquisition — indeed, with the same inquisitor, Cardinal Bellarmine — followed not long afterward? Like Galileo, Bruno rejected the earth-centered cosmology and Aristotelian physics endorsed by the church. In the 19th century, historians of science saw him as an early proponent of atomic theory and the infinite universe.

Or was Bruno an occultist dreamer, more magician than mathematician, as the renowned historian Frances Yates influentially argued in the 1960s? Either way, Bruno suffered for speaking his mind, though he also had a lot of bad luck, some of which he brought upon himself.

His story begins in Nola, a small city to the east of Naples. Bruno referred to himself as “il Nolano,” and Rowland echoes this, calling him “the Nolan” and frequently speaking of the “Nolan philosophy.” (This moniker may be harmless in America today, but it has awkward connotations for those who remember the Nolans of the 1970s and 1980s European pop scene, and their biggest hit, “I’m in the Mood for Dancing.”) The son of a well-connected professional soldier, Bruno entered the Neapolitan convent of San Domenico Maggiore at the age of 14 and was quickly noticed for two things. First, there was his prodigious memory: as a 20-year-old he was sent to perform his feats of recall before the pope. The ancient art of enhanced memorization was what he was best known for in his own time, and teaching it to others was his most marketable skill. Mnemonic feats were not only a practically useful party trick, but were often held to enable a practitioner to arrive at a systematic understanding of the world. Second, there was his religious unorthodoxy. As a boy, he removed all pictures from his convent cell, keeping only a crucifix, and he scoffed at a fellow novice for reading a devotional poem about the Virgin.

Although he was ordained a priest in 1572 and licensed to teach theology three years later, he was soon under investigation by the local head of the Dominicans for his irregular and outspoken views. By 1576 he had fled to Genoa and abandoned his clerical garb, teaching astronomy and Latin in a nearby town. The next 15 years were spent wandering through Europe on a hunt for patrons and professorships. First came Venice, then Padua, then Lyons, then a copy-editing job in Calvinist Geneva, where he was jailed and excommunicated for publishing an attack on a local philosopher. After two years of lecturing in Toulouse on Aristotle and astronomy, he had some success in Paris teaching the art of memory, with Henry III as royal patron. It was in Paris that he published a long philosophical drama, “The Candlemaker,” which Rowland implausibly suggests can be staged successfully, despite its five-hour running time. Its title page names the author as “Bruno the Nolan, the Academic of no Academy; nicknamed the exasperated.”

In 1583 Bruno joined the household of the French ambassador in London, where he published his major philosophical works, all dialogues, in which he espoused an infinite universe teeming with life. The timing was bad for such unorthodox cosmology. A century earlier, a German cardinal and mathematician, Nicholas of Cusa, made similar suggestions; but back then the church was not yet threatened by Protestant heresy and took a more relaxed attitude to strange views. A century later, a book by a writer of the early French Enlightenment, Bernard le Bovier de Fontenelle, popularized the same idea. (Though technically banned by the church, Fontenelle’s “Conversations on the Plurality of Worlds” was a literary sensation.) Bruno was both too late and too early to paint a universe in which man and his planet were not the center of a cozy domain.

In 1591 Bruno returned to Italy, where the real trouble began. A Venetian grandee, Giovanni Mocenigo, invited Bruno to teach him the art of memory, and Bruno moved into the family’s palazzo on the Grand Canal. After seven or eight months, relations between the two men began to cool (there are also suggestions that relations between Bruno and Mocenigo’s wife heated up), and the Venetian denounced him. Among the many unacceptable things Mocenigo claimed to have heard Bruno say, listed in a letter to the Inquisition in May 1592, were that Christ was a wretch and a magician, that the world is eternal but divine punishment is not, that bread does not turn into flesh in the Eucharist, that the Virgin cannot have given birth and that all friars are asses.

Bruno made a few unwise admissions to his Inquisitors, but denied most of the accusations. One informant was not enough for a conviction — a second witness was needed — and Bruno was willing to repent in order to gain release. The matter could have ended there, but the Roman Inquisition asked for Bruno’s extradition, and Venice, after months of negotiations, complied. The Romans interviewed many of Bruno’s old cellmates from Venice, and found one — an unstable Capuchin friar, himself later burned at the stake — who falsely believed that Bruno had denounced him and decided to return the favor.

Even with this second witness, it took the Roman Inquisition nearly seven years to bring the case to its sorry conclusion, and it managed to do so only when the Jesuit cardinal Robert Bellarmine took charge. Rowland quotes Bellarmine as once saying that “I hardly ever read a book without wanting to give it a good censoring.” Bruno’s fate was sealed when he unsuccessfully attempted to appeal over the heads of the Inquisition to the pope himself.

Though it can be hard to follow the story line in Rowland’s early chapters, where the background to Bruno’s later work is jumbled in with biographical fact, her telling of his end is gripping. As an intellectual biography, however, the book has too little examination of his ideas. Although Rowland would like us to see Bruno as a martyr to science, his work comes across more as theologically inspired science fiction. He was a poetic speculator, not an empirical or systematic investigator. Thus it is still not clear what the great master of memory should be remembered for."

GIORDANO BRUNO
Philosopher, Heretic
By Ingrid D. Rowland
335 pp. Farrar, Straus & Giroux. $27





'American Therapy: The Rise of Psychotherapy in the United States,' by Jonathan Engel


"Does psychotherapy work? Depends on what you mean by “psychotherapy.” And by “work.”
The answer matters. In trying to ascend from (as Freud once put it) “hysterical misery to ordinary unhappiness,” millions of Americans attend weekly therapy sessions of myriad kinds, at costs that can exceed $10,000 a year. Large professional edifices — psychiatry, psychology, social work, among others — are constructed atop the notion that psychotherapy works. If it were to be conclusively demonstrated that therapy doesn’t work, therapists would be put out of business; that’s effectively what’s already happened to Freudian psychoanalysts.

Jonathan Engel, a professor of health care policy at Baruch College, begins “American Therapy” by asserting: “Psychotherapy works. Multiple studies conducted over the past half-century have demonstrated that two-thirds of people who engage in psychotherapy improve.” But then, intentionally or not, he dedicates the better part of this fascinating book to complicating that proposition.

For starters, there’s that one-third of patients who don’t get better with psychotherapy; by definition, it doesn’t work for them. And then, perhaps more damningly, there’s the one-third of patients who have been consistently shown to get better without any treatment at all.

And then there’s this: a survey published in the early 1970s found that whereas a majority (59 percent) of people who had visited a professional psychotherapist for mental distress reported having been “helped” or “helped a lot” by the consultation, much larger majorities of people who had consulted a clergyman (78 percent) or a physician without specialized psychological training (76 percent) or — get this — a lawyer (77 percent) reported the same thing. Of course, psychotherapy did develop some pretty wacky offshoots in the 1970s — primal scream therapy, rebirthing therapy and Z-therapy (which seems to have involved, among other things, poking and tickling the patient) — so maybe it’s not surprising that people got more psychic relief from their lawyers than their therapists. But while a 1974 paper by a Johns Hopkins psychiatrist criticized the “charlatans” who “preyed on the gullible and the self-deluded,” these kooky therapies were actually surprisingly effective; many of the patients who underwent them reported themselves cured. This would certainly seem to undermine the claims of mainstream professional psychotherapy to specialized knowledge of any particular usefulness. If someone can poke and tickle a neurotic patient to health, why should an aspiring psychotherapist bother to get a graduate degree? Is psychotherapy just a high-priced placebo?

Engel describes an experiment that seems to have been animated by these very questions. In 1979, a Vanderbilt University researcher named Hans Strupp divided 30 patients with psychological problems into two groups, one to be treated by trained psychotherapists, the other by humanities professors with no psychological expertise. The result? The two groups reported improvement at the same rates. “Effective psychotherapy,” Engel writes, “seemed to require little more than a willing patient and an intelligent and understanding counselor who met and spoke regularly and in confidence.”

A University of Pennsylvania study found that the most successful therapists — regardless of whether they were Freudians or behaviorists, cognitive therapists or Z-therapists — were honest and empathic and connected quickly and well with other people. (Strupp’s humanities professors may have fared so well because they were chosen based on how well liked they were.) Studies like Strupp’s rattled the foundations of the field and, as Engel puts it, “shook therapists’ confidence in their own rectitude.” But, as Engel takes pains to remind us, if twice as many distressed people improve with therapy as without it — as studies consistently show — those are still pretty good odds for psychotherapy.

The question of effectiveness is only incidental to Engel’s main goal, which is to tell the story of how, over the course of less than 100 years, psychotherapy went from being an obscure treatment for upper-middle-class Jews in fin-de-siècle Vienna to being a staple of mainstream American medical practice and a fixture of our popular culture. Mining both medical journals and the popular press, Engel spins a richly textured tale of psychotherapy’s rise.

Naturally, the story begins with Freud, a thoroughly unlikely candidate to become the progenitor of anything distinctly American. He visited the United States only once, in 1909, and found the country rather barbaric. Practical-minded Americans, for their part, would not seem to have provided a receptive audience for his arcane theory of mind, with its id, ego and superego, and its references to Oedipal crises, castration complexes and penis envy. But the most eminent American doctors of the day — G. Stanley Hall at Clark University, James Jackson Putnam at Harvard, Adolf Meyer at Johns Hopkins and, later, Harry Stack Sullivan, of the Washington School of Psychiatry, among others — embraced and promoted Freudian theory. Psychoanalysis, Engel observes, “seemed to be compatible with a strain in the American zeitgeist,” and the psychoanalytic establishment here “rigidly stood by the Freudian canon for decades” after his death.

For several decades after Freud’s visit to America, psychotherapy remained at the margins of American culture; mental illness was still a little discussed, and highly stigmatized, phenomenon. World War II changed that: when 12 percent of draftees — nearly two million men — were rejected for “neuropsychiatric” reasons, it profoundly altered the American perception of mental illness; psychiatric problems became, in some sense, normal. William Menninger, who was serving as chief psychiatrist of the United States Army, noted that “people are beginning to see that damage of the same kind can be done by a bullet, bacteria or mother-in-law.” After the war, terms like “repression” and “inferiority complex” began cropping up in movies and best-selling novels. “Where the public once turned to the minister, or the captain of industry, or the historian or the scientist,” one social critic observed, “it is now turning more and more to the psychiatrist.” (Engel writes about the fascinating battle lines drawn between psychiatrists and the clergy during this time, with their diametrically opposed notions of guilt.)

When Time magazine put Freud on its cover in April 1956, the psychoanalytic moment in America had arrived, and for the next several decades psychoanalysts largely dominated the mental health field. But even as Freudians occupied the top echelons in the psychiatric institutes and the medical school residency programs, and the psychoanalytic idiom was tightly woven into the culture, more and more studies were calling into question the effectiveness of the psychoanalytic enterprise. In 1975, the behavioral psychologist Hans Eysenck declared (controversially) that “Freudian theory is as dead as that attributing neurotic symptoms to demonological influences, and his method of therapy is following exorcism into oblivion.”

The death warrant may actually have been written earlier, in the 1950s, on the first prescriptions for Thorazine, an antipsychotic medication so effective that it became known as “the drug which emptied the hospitals.” Though Freud himself anticipated the age of biological psychiatry (in 1938, he wrote “the future may teach us to exercise a direct influence, by means of particular chemical substances, on the amounts of energy and their distribution in the mental apparatus”), the realization that drugs could so successfully treat some forms of mental illness thoroughly discombobulated the psychoanalytic profession. If drugs worked, that implied an organic, or medical, basis for neurosis, which in turn challenged some of the basic assumptions of psychoanalytically oriented therapy. If mental illness was due to some physical anomaly in the brain, wasn’t the best way to treat the illness by directly addressing that anomaly, with a pill? By the mid-1960s, the psychiatric establishment was moving definitively in a pharmaceutically oriented direction.

Meanwhile, the advent of even better drugs like Prozac (which went on the market in 1987), and the proliferation of cognitive therapies, in which the patient works with a therapist in a focused way to change maladaptive ways of thinking, further diminished Freud’s standing; repeated controlled studies clearly showed both drug and cognitive therapies to be effective in ways that psychoanalysis, with its hours on the couch, has not been shown to be. Though some Freudian analysts continue to practice today, Engel writes, they resemble “nothing more than a fanatical Essene sect, living apart in the wilderness where they could continue to seek truth in the master’s writings.”

Engel describes how factors like changes in the structure of health insurance shaped (and often distorted) psychiatric care, and his book is studded with fascinating tidbits like this one: in the mid-1960s, two buildings on the corner of 96th Street and Fifth Avenue in Manhattan had as many analysts as Minnesota, Oregon, Delaware, Oklahoma, Vermont, Wisconsin and Tennessee combined.

Engel gestures at, but doesn’t directly address, some of the most interesting questions prompted by the rise of psychotherapy. Is the enormous growth of the field over the last century simply a case of supply surging to meet demand, or does the volume of neurosis fluctuate over the years? Are anxiety and alienation always symptoms to be treated, or are they sometimes appropriate — even healthy — responses to the vicissitudes of late modernity? Is psychotherapy an art or a science, a subcategory of humanism or of biology?

But the story Engel does tell is plenty interesting and his conflicted view of Freudianism well worth absorbing: the most influential school of therapy in American history may not have worked very well as a treatment — but it did revolutionize how we think about the human mind."

AMERICAN THERAPY
The Rise of Psychotherapy in the United States
By Jonathan Engel
351 pp. Gotham Books. $27.50





'The Man Who Owns the News: Inside the Secret World of Rupert Murdoch,' by Michael Wolff


"With continent-spanning business successes, multiple marriages and generational strife roiling beneath him, Rupert Murdoch would seem to be a perfect running story for a tabloid, but the man who owns the means of production rarely becomes grist for his own mill.

As proprietor of The New York Post, Murdoch is — in Michael Wolff’s new book — “the man who owns the news.” Murdoch has made some as well, upending the television business by creating Fox News and more recently stalking The Wall Street Journal — a newspaper that was not for sale — with the relentlessness of Ahab, but with the wrinkle of ultimate triumph.

That hunt and capture serves as the backdrop for Wolff’s portrait of Murdoch. The book is a strangely alluring artifact, with huge gaps in execution and stylistic tics that border on parody; it will nonetheless provide a deeply satisfying experience for the media-interested.

They are a pair, these two. Both adore gossip and revel in their unpleasantness, and neither gives a rip what anyone else thinks of him. Murdoch has achieved improbable business success, and Wolff has made no secret that he covets same. In a hybrid career that continues to this day — Wolff is a columnist for Vanity Fair and a founder of a news aggregator called Newser — he has somehow managed to both float above a demimonde of wealthy titans and seek to enter it at every opportunity.

His chronicle of Murdoch’s purchase of The Wall Street Journal from the Bancroft family last year — this tale constitutes the narrative armature of the book — is full of the kind of detailed ticktock that makes business seem brutally exciting, but in general, Wolff has never distinguished himself as a reporter. Over the years, he has succeeded in cutting through the clutter by being far less circumspect — and sometimes more vicious — than other journalists, whom he views as archaic losers about to go the way of the Walkman.

“The Man Who Owns the News” attacks its subject with casual delight, but contains shockingly few actual quotations from Murdoch himself: a snippet here and there, nothing more. He remains disconcertingly spectral, even though Wolff spoke with him for many hours over many months.

Wolff drops a prophylactic mention of Murdoch’s conversational shortcomings, saying he is “not good at explaining himself and gets annoyed and frustrated when he’s asked to do so,” adding later that “he grips and clutches and descends into muttering and murmuring when forced to talk about himself.”

That may explain Murdoch’s unwillingness to sit on his own couch or one of Wolff’s making, but it does not excuse the general failure to give the reader, as the subtitle promises, a look “inside the secret world of Rupert Murdoch.”

Instead, we get Wolff’s own ineffable takes on how Murdoch became Murdoch: “He tends to create a disturbance, or pick up the tremulous motion of a disturbance, that in the chaotic motion of the atmosphere becomes amplified, eventually leading to large-scale atmospheric changes . . . or some such.”

This sort of protective irony, walking slowly up to a conclusion and then summarily dropping it, is mannered to the point where you can almost visualize Wolff licking his paws between sentences. But he’s smart, a wiseguy really, and often wanders his way toward insight. In the same passage, he observes that Murdoch possesses “the business equivalent of superb hand-eye coordination — of knowing when the opportunity presents itself and how to snatch it.”

Murdoch, for all his lack of an inner life, at least in this book, is an extremely engaging man to listen to. At investor conferences where other media titans drone on defensively, he is far and away the most fearless and factual. But not here. Wolff, who has the columnist’s tic of being far too struck by the fragrance of his own prose, draws attention to the Boswell at the expense of the Johnson.

“They are the substantial historical personages in the room; everybody else is . . . well, everybody else,” he says at one point. At another he writes, “He’s almost never there — except when he is, overwhelmingly, there.”

But the moment the reader is tempted to leave Wolff to marvel at his own devices, the author steps in and reminds us that his primary value is to speak the unspeakable. As he did in his delicious and prescient “Burn Rate,” an early book about the dot-com fantasia, he often just says it: “Every second working for Murdoch is a second spent thinking about what Murdoch wants. He inhabits you.”


And more substantively, Wolff makes it clear that as Murdoch woos the Bancrofts with negotiations over guarantees of editorial freedom for The Wall Street Journal, his cynicism is without bottom: “Murdoch is amazed these people are actually taking this seriously. Really, given everything — not least of all his own well-known history — it is preposterous that they would.”

Much was made of Wolff’s alliance with Murdoch, that it would lead to complicity and sycophancy, but Wolff remains true to his nature, which is joyously nasty. It is a baked-in reflex of a kind that Trollope described: “His satire springs rather from his own caustic nature than from the sins of the world in which he lives.”

Wolff takes no specific offense at Murdoch’s willingness to use his media properties to cold business ends, but depicts him as a cranky, monomaniacal newspaper hack, a con man with bad hearing, no interest in new media paradigms and no real friends to speak of. It is also pointed out that he is “a good family man — even if he has three of them.” Like the man he writes about, Wolff is a gossip who is very skilled at extracting information and sensing weakness.

The rest of the gang at Murdoch’s News Corporation fares no better. According to the book, Bill O’Reilly “talked dirty to underlings” and is loathed by everyone, including Murdoch. Wolff writes that Richard Johnson, the Page Six editor of The New York Post, “took money from sources.”

And Roger Ailes, the ferocious creator of Fox News, is feared by all, even Murdoch himself.


All of which makes the reader wonder why Murdoch would, in practical terms, drop a hungry ferret down his own trousers. Wolff, a guy with a lot of theories, has one about that: “Possibly his willingness had something to do with his perception that I regarded many of his enemies — particularly the journalistic priesthood — with some of the same contempt with which he regarded them.”

More broadly, Murdoch would clearly like his legacy to be cleansed by his acquisition of The Journal. That story, told with a mixture of contempt and awe, is rendered through the prism of family fecklessness and brute business tactics. It makes Murdoch’s globe-trotting history of buying tabloids and collecting politicians with equal facility seem a little beside the point, which might have been the gesture to begin with.

Characters in the Journal saga are quickly introduced and put in a riveting, present-tense motion. Richard Zannino, then the chief executive of Dow Jones & Company, decides to put a toe in the water next to Murdoch and soon finds himself soaked from head to foot. Andy Steginsky, a money manager and “a Rupert Murdoch groupie,” is the Zelig-like figure putting an arm on the Bancrofts, who are played like patsies from the jump. “They seemed like fools even to themselves,” Wolff writes.

A lot of people come off as fools and flunkies, including me: “New York Times media writer David Carr censoriously opined during the takeover that Murdoch ‘has demonstrated a habit over time of using his media properties to advance the business interests of his organization.’ Then, with the takeover completed, Carr pronounced him one of the most admired figures of the new media class precisely because he integrated all his business interests.” It’s much more complicated than that, but subtext is always missed when you’re the one being gored, no?

Historically, one of the problems with Wolff’s omniscience is that while he may know all, he gets some of it wrong. He opens Chapter 12 with a lovely set piece about John Lippman, a Journal reporter who wrote rugged articles about both Murdoch and his third wife, Wendi Deng, running for his life as the deal is about to be consummated in June 2007.

“Not a bad indicator of which way the deal will go is that John Lippman writes his last piece for The Wall Street Journal on June 22 and will shortly head for The Los Angeles Times,” Wolff writes, adding later on the same page, “His long-term paper is being pursued by a man whom he has written about in derisive, cruel, scathing, innuendo-laden terms.”

Lippman left The Journal in June 2006 and soon after went to work for Sitrick & Company, a public relations firm. It’s true he had a byline in The Journal on June 22, 2007, but it was atop a freelance review of a book by Jack Valenti, the longtime chairman of the Motion Picture Association of America. Lippman has since gone to work at The Los Angeles Times.

And Wolff also writes that The New York Times canceled a series on Murdoch after two articles were attacked by News Corp. Only two articles were scheduled, and none were canceled, according to the paper’s editors. Wolff prefers the purity of his constructs — one of which is that The Times is a deeply flawed artifact that is doomed to be crushed by the more nimble, less morally constricted Murdoch.


Obsessed by newsprint and digitally clueless, Murdoch is depicted as a remarkable modern figure. The issue of succession is dealt with in the book as it is at the company: people either put their fingers in their ears or cross them in hopes that Murdoch, who was born in Australia in 1931, will live forever. His unusual relationship with a crew of very talented, able children — pull them close in business matters and then humiliate them — is artfully described in the book, as is his somewhat henpecked relationship with his third wife, who reads his e-mail messages after business hours because he doesn’t use a computer.

Should I/we/you feel dirty for enjoying a little quality time with a man who believes that giving the impression of morals is better than actually having them and whose atavistic corporate impulses are put to contemporary, acquisitive ends? Probably not. Many before us have covered their eyes and waited for Rupert Murdoch to go away. Rupert Murdoch does not go away."


THE MAN WHO OWNS THE NEWS
Inside the Secret World of Rupert Murdoch
By Michael Wolff
446 pp. Broadway Books. $29.95.


'Panic: The Story of Modern Financial Insanity,' by Michael Lewis


"For the past two decades, Michael Lewis, the most charming and one of the shrewdest guides to America’s raucous money culture, has displayed a knack for being at the right place at the right time. He was a young trader on the Salomon Brothers bond desk during the 1987 crash; the experience led to “Liar’s Poker.” His boss at Salomon, John Meriwether, a decade later became a central figure in the downfall of the hedge fund Long-Term Capital Management. Lewis spent a chunk of the 1990s in Silicon Valley, where he profiled the serial entrepreneur Jim Clark in “The New New Thing” and happened on to his next great subject, the Oakland A’s (“Moneyball”). Now, just in time for the Great Credit Debacle of 2008, Lewis has curated “Panic,” a prose exhibition on the past 20 years of monetary madness.

The new book, Lewis writes, is an effort “to recreate the more recent financial panics, in an attempt to show how financial markets now operate.” Of course, after reading this lively, frequently fine collection of newspaper articles, magazine features, academic post-mortems and the odd blog entry, you’ll find the markets are still something of a mystery. The volatility and gut-wrenching jolts seem to defy any rational expectation. “The bottom line is, no one knows,” the economist Franklin Edwards wrote in his introduction to a collection of studies on the crash of 1987.

Still, paging through “Panic” is like wandering through an eclectic museum that houses old masters (Nobel laureates like Paul Krugman and Joseph Stiglitz), but also folk artists (the satirist Dave Barry), artisans (beat reporters) and figurative painters (magazine writers like Lewis, John Cassidy and Roger Lowenstein).

“Panic” covers four major episodes: the 1987 United States stock market crash, the 1997-98 emerging-market bust-ups (called “Foreigners Gone Wild”), the dot-com meltdown and the current housing/credit/stock market collapse. Each is a triptych — the first panel is a brief essay by Lewis, the second is filled with contemporaneous newspaper or magazine articles that set up the boom, and the third presents sober analysis of why it happened. While being anthologized is usually a badge of honor for writers, some of the articles were plainly chosen for the way in which they typified the dangerous pre-panic zeitgeist, capturing “the feeling in the air immediately before things went wrong,” as Lewis puts it. I suspect the authors of the Time article from July 1987 on how individual investors were riding the bull market, and of the January 1996 New York Times article extolling emerging market mutual funds, now regard these works the way my brothers and I regard bell-bottom pants — signs of youthful indiscretion that are best forgotten.

But there are plenty of gems, especially involving the 1987 crash, which now seems quaint. An excerpt from a book by the former Wall Street Journal reporter Tim Metz sheds light on the chaos in the markets then. More broadly, the entries remind us that before CNBC, Yahoo Finance and E*Trade, those not sufficiently with it to possess a hand-held Quotron had to visit brokerage offices to check stock quotes.

What else is noteworthy? Paul Krugman’s dissection in Fortune of what went wrong in Asia in 1998, a Jeffrey Sachs interview on what went wrong in Russia. The Wall Street Journal’s 1998 article on how the stock of the second-tier book retailer Books-A-Million went on a wild rise after the company introduced its new Web site and Katharine Mieszkowski’s May 2000 Salon account on dot-coms’ blowing millions of dollars on Super Bowl advertisements don’t taste as good as Proust’s madeleine. But they sure take you back. Mark Gimein’s July 2000 Fortune article on AllAdvantage, which paid people 53 cents an hour to surf the Net with a special advertising bar on their screens, is a howler. The headline: “Meet the Dumbest Dot-Com in the World.”

The most recent episode, which Lewis calls “The People’s Panic,” is less funny — it’s too close, it roped in many more people, and the costs to the public are likely to be huge. The bailouts are especially galling given the ample warnings, like those sounded by John Cassidy of The New Yorker, who warned in November 2002 that housing would be the next crash. A single entry from the Irvine Housing Blog, which shows how a person in January 2005 bought a $1.157 million house with $270 down, refinanced with a funky teaser-rate mortgage and then proceeded to open up a $491,000 home equity line of credit by 2007, neatly encapsulates the lunacy.

Some of the best entries are Lewis’s own, including his January 1999 New York Times Magazine article on the failed hedge fund Long-Term Capital Management. The quantitative geniuses who designed this vehicle had a tough time grappling with the fact that their model had failed. “It is interesting to see how people respond when the assumptions that get them out of bed in the morning are declared ridiculous by the wider world,” Lewis writes. In each of the episodes, the bottom fell out because a bedrock belief held by many participants — smart professionals, not the perennially stupid individual investor — suddenly evaporated. “Panic” is to a large degree a chronicle of the capacity of highly paid professionals for self-delusion.

This volume could just as easily have been titled “Complacency.” As Lewis shows, there’s something distinctly American in our propensity to blow bubbles until they pop, spend a few months licking our wounds and then hit replay. “Yuppies’ Last Rites Readied,” declared the headline on a New York Times article of Oct. 21, 1987, which documented how the stock market crash was causing materialist, money-soaked urban dwellers to reduce conspicuous consumption and focus more on human relationships. Of course, that moral awakening lasted only as long as the downturn. And the business publications that do such a great job of dissecting the bubbles once they’ve popped are the same ones that help promote and sustain the next one. What drives this? It’s not simply greed, or stupidity, but a kind of learned naïveté. We convince ourselves, over and over again, that nothing can go wrong, and that even if it does the smart ones among us will be insulated from any ill effects. Despite Suze Orman’s pleas, as financial beings we lack self-awareness and irony. In October 2000, Jerry Useem of Fortune called prominent players and asked what they had learned from the dot-com bust. James Cramer, hedge fund manager, media personality and founder of TheStreet.com, declared the Internet over and spoke of spending his time coaching soccer. “I’m done with the material stuff.” Riiight.

In these times, $27.95 may seem a steep price to pay for a collection of articles, many of which can be found online. But there are good reasons to splurge. The book’s profits are going to charity. And as of mid-December, used copies were trading hands on the secondary market (Amazon) for $13. In other words, after a few weeks of ownership, this book still retains about half its value. Which is more than can be said for Citigroup stock."


PANIC
The Story of Modern Financial Insanity
Edited by Michael Lewis
391 pp. W. W. Norton & Company. $27.95


'The Ascent of Money: A Financial History of the World,' by Niall Ferguson


"Niall Ferguson, it is fair to say, is a one-man book factory. In fact, if the American economy cranked out goods as prolifically as Ferguson does histories, we might not be in half the fix we are in right now. But then Ferguson wouldn’t have nearly as much to write about. The onetime enfant terrible of the Oxbridge historical establishment, Ferguson specializes in finding fault with great powers, especially the way they mismanage their empires. Ferguson first came to notice a decade ago with “The Pity of War,” a revisionist tour de force arguing that Britain made a world-historical error by entering World War I (and thereby destroying its empire) when it should have simply waited out the swift German conquest of Europe and remained a superpower, with Europe the better for it. More recently the Scottish-born Ferguson, who now spends half the year teaching at Harvard and the other half at Oxford, has turned his attention to the prodigal young heir to the British imperial crown, the United States. In “The Cash Nexus” (2001) and “Colossus” (2004), he urged Americans to emerge from their self-denial and fulfill their obvious destiny as the next “liberal” empire spreading the light of democracy and Anglo-­Saxon legalism across the globe. “The greatest disappointment facing the world in the 21st century,” Ferguson concluded in “The Cash Nexus” (published in the opening months of the Bush presidency), is that “the leaders of the one state with the economic resources to make the world a better place lack the guts to do it.” Ferguson later supported the Iraq war as evidence that Washington had finally gotten up its courage, imperially speaking.

Whatever one thinks of his arguments, it’s impossible to ignore Niall Ferguson. He’s like the brightest kid in the debating club, the one who pulls all-nighters in the library and ferrets out facts no one thought to uncover. And in his latest book, “The Ascent of Money” — humbly subtitled “A Financial History of the World” — Ferguson takes us on an often enlightening and enjoyable spelunking tour through the underside of great events, a lesson in how the most successful great powers have always been underpinned by smart money. “The ascent of money has been essential to the ascent of man,” he writes, making a conscious reference to the BBC production he loved as a boy, Jacob Bronowski’s “Ascent of Man.” (In fact, like Ferguson’s three previous books, “Colossus,” “Empire” and “The War of the World,” “The Ascent of Money” was written as a companion to a TV documentary series.)

“Behind each great historical phenomenon there lies a financial secret,” Ferguson says. He goes into fascinating detail about how “it was Nathan Rothschild as much as the Duke of Wellington who defeated Napoleon at Waterloo” by selling bonds and stockpiling gold for the British Army. The richest bankers on the Continent in the 19th century, the Rothschilds became known as die Finanzbonaparten (the Bonapartes of finance). And, as Ferguson argues, they also played a crucial part in the South’s defeat in the Civil War by declining to invest in Confederate cotton-collateralized bonds. Imperial Spain amassed vast amounts of bullion from the New World, but it faded as a power while the British and Dutch empires prospered because they had sophisticated banking systems and Spain did not. Similarly, the French Revolution was made all but inevitable by the machinations of an unscrupulous Scotsman named John Law, whom the deeply indebted French monarchy recklessly placed in charge of public finance. “It was as if one man was simultaneously running all 500 of the top U.S. corporations, the U.S. Treasury and the Federal Reserve System,” Ferguson writes. Law proceeded to single-handedly create the subprime mortgage bubble of his day. When it collapsed, the fallout “fatally set back France’s financial development, putting Frenchmen off paper money and stock markets for generations.”

Wilhelmine Germany, meanwhile, came up short in World War I because it “did not have access to the international bond market,” Ferguson writes. Every one of these episodes sounds like a warning shot: Will America be the next great power to fall because of unsound finance?

The question is particularly pressing in the midst of what is widely seen as the worst financial crisis since the Great Depression. And Ferguson’s conclusions are troubling. Only a few years after accusing Washington of “imperial understretch” for failing to flex its muscles — and without any hint of irony — Ferguson now argues that the United States may be succumbing to financial overstretch. Deeply in debt to the rest of the world, it has become part of a “dual country” that he calls “Chimerica.” “In effect, the People’s Republic of China has become banker to the United States of America,” he writes. Until the current global financial crisis, this seemed to be a fairly reliable relationship. American consumers over-bought goods and over-borrowed from China, and the Chinese in turn accumulated huge dollar surpluses that they plowed back into Wall Street investments, thereby supplying profligate Americans with the financing we needed to consume and sustain ourselves as the lone superpower. “For a time it seemed like a marriage made in heaven,” Ferguson writes. “The East Chimericans did the saving. The West Chimericans did the spending.”

Suddenly, however, it’s looking more like a marriage made in hell. According to Ferguson, much of the current crisis stems from this increasingly uneasy symbiosis. It turns out “there was a catch. The more China was willing to lend to the United States, the more Americans were willing to borrow.” This cascade of easy money, he argues, “was the underlying cause of the surge in bank lending, bond issuance and new derivative contracts that Planet Finance witnessed after 2000. . . . And Chimerica — or the Asian ‘savings glut,’ as Ben Bernanke called it — was the underlying reason why the U.S. mortgage market was so awash with cash in 2006 that you could get a 100 percent mortgage with no income, no job or assets.” Going forward, the system seems likely to be increasingly unstable, as Treasury Secretary Henry Paulson suggested recently when he warned that unless fundamental changes are made, “the pressure from global imbalances will simply build up again until it finds another outlet.”

Previous periods of global stability and peace had relied on judicious mechanisms like the Congress of Vienna or the Bretton Woods agreements. Now the international system — and America’s position within it — has come to depend on what looks more like a global Rube Goldberg machine running on hot money. And though Ferguson doesn’t come out and say it, the Chinese may now have the upper hand in this chimerical Chimerica. While so far it’s worked in Beijing’s interest to underwrite America’s rampant consumerism — because we buy so many of their goods — the Chinese also have the option of recycling some of their surplus billions into their own huge population. We, on the other hand, don’t have the option not to borrow from them. Indeed, it’s no secret on Wall Street and in Washington that the real targets of President Bush’s $700 billion bailout plan were the foreign funds, including “sovereign wealth funds,” that keep America’s financial system afloat. Unless these foreign financiers — principally China and Japan — get reassurance that the global financial system can function properly again, America’s long period of growth and power may be coming to a close.

Perhaps, then, the conclusion should be that Americans need to flex our muscles less as an empire and fight a little harder for fiscal sobriety and balance in our foreign policy. To be fair, Ferguson was early in seeing that America’s fiscal problems were serious. In “Colossus,” he warned presciently of America’s increasing reliance on Chinese capital, although he argued then that we should be mainly worried about domestic entitlements like Medicare and Social Security — indicating that he, like the Bush administration, seriously underestimated the ultimate cost of the Iraq war.

As with Ferguson’s three previous documentary efforts, “The Ascent of Money” sometimes feels as if it were laid out like a shooting script. Ferguson will depart from an exegesis on the 17th century or the Great Depression to pop up in post-Katrina New Orleans or Memphis (for a report on bankruptcies), and we surmise it’s to record another on-scener for PBS. The book, whose main text comprises a scant 360 pages (a light effort for Ferguson, especially considering the ambitious subtitle), is also reductionist at times. Is it really fair to say Chimerica is mainly at the root of our current problems? (A lack of oversight and regulation of the subprime mortgage market here at home had a lot to do with it as well.) China’s backwardness between the 1700s and 1970s was largely due to its dearth of financial innovation, he suggests, but other historians have pointed equally to the absence of technological innovation of the kind that arose in Europe’s close-quartered patchwork of states because of repeated wars.

And in the end, as Ferguson himself seems to acknowledge, the scope of the financial crisis that is plaguing the world today calls into question the book’s premise — that the “trajectory” of finance through history, while “jagged and irregular,” is “unquestionably upwards.” Our increasingly sophisticated finance clearly contains self-destructive tendencies, and its very complexity may have become our undoing. Ferguson wonders whether the cruel realities of biological evolution are the model for what is happening now. Contemplating the financial Armageddon that has devastated Wall Street and set back globalization, he asks: “Are we on the brink of a ‘great dying’ in the financial world — one of those mass extinctions of species that have occurred periodically, like the end-Cambrian extinction that killed off 90 percent of Earth’s species, or the Cretaceous-Tertiary catastrophe that wiped out the dinosaurs?” Here we thought we were making all this progress as a species, and suddenly we find our supposed innovations lumped with Tyrannosaurus rex. Doesn’t sound like much of an ascent to me."


THE ASCENT OF MONEY
A Financial History of the World
By Niall Ferguson
Illustrated. 442 pp. The Penguin Press. $29.95


'The Widow Clicquot: The Story of a Champagne Empire and the Woman Who Ruled It,' by Tilar J. Mazzeo


“The Widow Clicquot,” Tilar J. Mazzeo’s sweeping oenobiography of Barbe-Nicole Clicquot Ponsardin, is the story of a woman who was a smashing success long before anyone conceptualized the glass ceiling. Her destiny was formed in the wake of the French Revolution when, Mazzeo suggests, “modern society — with its emphasis on commerce and the freedom of the individual — was invented.” Barbe-Nicole, daughter of a successful textile maker turned Jacobin, is portrayed as someone whose way of doing business helped define the next century.

Fate cursed or blessed her with the mantle of early widowhood. Her husband, a winemaker from whom she learned the craft, died when she was 27, leaving her a single mother — the veuve (widow) Clicquot. Officially, the cause of François Clicquot’s death was typhoid, which was then commonly treated by feeding the patient Champagne, believed to strengthen the body against what was known as malignant fever. “To think that a bottle of his own sparkling wine might have saved François!” Mazzeo writes, going on to speculate that it is also possible he killed himself because business wasn’t good.

Already savvy about winemaking, Barbe-Nicole plunged into a new life. Despite contemporary mores and the Napoleonic Code, which emphasized a woman’s role at home, she was not alone. She saw the success of such wine merchants as the widow Germon, the widow Robert and the widow Blanc, and understood that widows were the “only women granted the social freedom to run their own affairs.” With the gate open, she was off and running with spectacular results.

What a prescient entrepreneur she was, with a business outlook that sounds more 21st century than 19th. Toward the end of her life, in the 1860s, she wrote to a great-grandchild: “The world is in perpetual motion, and we must invent the things of tomorrow. One must go before others, be determined and exacting, and let your intelligence direct your life. Act with audacity.”

Her audacity was unleashed at the right time. Napoleon’s abdication in 1814 was cause for toasts among both the British and Russians. “Champagne,” Mazzeo writes, “was on its way to becoming another word for mass-culture celebration.” While the war’s naval blockade still paralyzed commercial shipping, Mme. Clicquot conspired to sneak a boat around the armada, delivering 10,000 bottles of high-proof, cork-popping 1811 cuvée Veuve Clicquot to Königsberg, where it sold for the equivalent of $100 per bottle. When the powerhouse 1811 reached St. Petersburg, Czar Alexander declared he would drink nothing else. Within two years the widow Clicquot was “at the helm of an internationally renowned commercial empire — and she was one of the first women in modern history to do it.” People said she had conquered Russia with Champagne; soon, London clubgoers simply asked for a bottle of “the Widow.”

As much about Champagne itself as about the woman who helped elevate it to celebrity status, “The Widow Clicquot” reveals that the wine’s history is as filled with faux folklore as a glass of it is with tiny bubbles. For one thing, Dom Pierre Pérignon did not invent it. The oft-told fable is that Dom Pérignon, the cellar master at the Hautvillers abbey, took a first sip and cried out to his fellow monks: “Come quickly! I am drinking the stars!” A charming tale, but bogus. Mazzeo says that for a decade after 1660, when Dom Pérignon gained fame as a master blender, he steadfastly worked at ways to prevent wine from developing bubbles. “In the 17th century,” she reports, “winemakers were anything but delighted by the voluntary sparkle that developed in their casks come spring.” Champagne did not even originate in France. While Dom Pérignon was struggling to stamp out bubbles, British oenophiles already were drinking sparkling wine made from Champagne grapes. Why? Customers rich enough to buy whole barrels realized they had to do something to keep their prize from turning to vinegar. They put still wine from Champagne into sturdy British bottles, sometimes with a little brandy to act as a preservative. At some point, somebody realized that sugar bottled with the wine would start a secondary fermentation, creating Champagne. Bubbly was not invented; it was discovered by accident.

At its beginning, Champagne scarcely resembled the dry, fine-fizzed champers we know today. Whereas a modern demi sec might contain 20 grams of sugar per bottle, the Champagne of Mme. Clicquot’s time held 10 or 15 times that much and was served as icy as a Slurpee. Nor did the original stuff have elegant little bubbles to tickle your nose. Veuve Clicquot customers complained about bubbles so big and gassy that they left the wine topped with a beery foam.

Madame Clicquot disparagingly called the unwelcome froth “toad’s eyes,” and was determined to make better bubbles. Although she was head of the company, her devotion to the craft of wine making never wavered; she worked with her cellar master to devise a riddling rack to facilitate remuage, the process by which sediment is drawn from the liquid to the bottle’s neck. Her obsession with creating a beverage as clear as a flawless diamond may well have been her most important achievement. Without it, Mazzeo writes, “Champagne could never have become the world’s most famous wine.”


THE WIDOW CLICQUOT
The Story of a Champagne Empire and the Woman Who Ruled It
By Tilar J. Mazzeo
265 pp. Collins/HarperCollins Publishers. $25.95


'Street Gang: The Complete History of "Sesame Street,"' by Michael Davis


"In 1981, when I was 6, about 10 million American children daily tuned in to the PBS show “Sesame Street.” That same year, one of the writers for “Sesame Street,” my real-life neighbor, asked if I’d like to appear on the show. It was my golden ticket, but crossing over to the other side of the television screen can be a demystifying journey. The “Sesame Street” soundstage looked like a facsimile of the televised world — small and (surprisingly) indoors. The Muppets were controlled by operators; we were told not to look down at them. And there was Big Bird, stored in the middle of the set on a massive hook. When I reached out to pet him, a voice came from the sky: “Don’t touch those feathers!” admonished one of Big Bird’s creators, the remarkably named Kermit Love.

The address of 123 Sesame Street was never quite the same. Yet to be cast out of the garden of television-land can be a learning experience. “Street Gang: The Complete History of ‘Sesame Street,’ ” by Michael Davis, a former columnist for TV Guide, now offers the behind-the-lens story, the first comprehensive account, of this 39-year-old show.

The book details the awesome lengths to which “Sesame Street,” undoubtedly the most workshopped and vetted program in the history of children’s television, went to captivate its young audience. The show’s music and quick cuts concealed its educational ambitions. “Commercial breaks” advertised numbers and the alphabet through Jim Henson’s Muppet pitchmen: the Count, Grover and Cookie Monster. Kermit the Frog, wearing a trench coat, told fairy tales through news flashes from Rapunzel’s tower. Meanwhile, the urban street scenes at the center of the show communicated the social values of a progressive culture. Here was TV at its most sublime, but also an entrancing product of a liberal age, something Mom was happy for us to watch.

The “Sesame Street” story begins on a Sunday in December 1965. At 6:30 in the morning, 3-year-old Sarah Morrisett tuned in to the test patterns while waiting for her cartoons to begin a half-hour later. Her father, Lloyd Morrisett, an experimental psychologist and a vice president of the Carnegie Corporation, took note. “It struck me there was something fascinating to Sarah about television,” he says.

“Sarah Morrisett had memorized an entire repertoire of TV jingles,” Davis writes. “It is not too far a stretch to say that Sarah’s mastery of jingles led to a central hypothesis of the great experiment that we know as ‘Sesame Street’: if television could successfully teach the words and music to advertisements, couldn’t it teach children more substantive material by co-opting the very elements that made ads so effective?”

The thought of using the trappings of television for progressive ends seemed anathema to most intellectuals, who were wholly skeptical of this mass-culture medium, but Morrisett brought up his observation at dinner with Joan Ganz Cooney, the future creator of “Sesame Street.”

In the mid-1960s, as one of his grand social initiatives, Lyndon B. Johnson took up the cause of National Educational Television (later known as the Public Broadcasting Service), a lackluster confederation of chalk-dusted channels. Like the show she developed for PBS that would define the network, Cooney was steeped in the ideals of Johnson’s Great Society. In New York, while working in publicity for commercial television, she was introduced to William Phillips, co-founder of Partisan Review, the small but vastly influential journal of highbrow leftist opinion. In her spare time, Cooney did publicity for Partisan Review and produced a fund-raiser at Columbia that was attended by Norman Mailer, Mary McCarthy and Lionel Trilling.

Cooney’s ability to transcend the divisions between high and low culture defined her success at “Sesame Street,” which brought Madison Avenue advertisers and game show creators together with New York intellectuals and the education department of Harvard. Lloyd Morrisett, through his connections at the Carnegie Corporation, helped Cooney line up the millions in grants to cover the research, writing and production needed to create a show that could compete with the commercial networks. McGeorge Bundy, one of “the best and the brightest” in the Kennedy administration and by then president of the Ford Foundation, sharpened the show’s political edge by homing in on the children of the urban underclass. “Sesame Street” would be the television equivalent of Head Start, the federal child-welfare program founded by Johnson in the belief, Davis writes, that “the tyranny of America’s poverty cycle could be broken if the emotional, social, health, nutritional and psychological needs of poor children could be met.”

In its high ideals and comprehensive approach, “ ‘Sesame Street’ came along and rewrote the book,” Davis says. “Never before had anyone assembled an A-list of advisers to develop a series with stated educational norms and objectives. Never before had anyone viewed a children’s show as a living laboratory, where results would be vigorously and continually tested. Never before in television had anyone thought to commingle writers and social science researchers.”

“Sesame Street” turned the entertainment of children’s television into a science, as the program was extensively tested with nursery school audiences through a “distracter” machine that gauged children’s eye focus second by second during the run of each show. It is no coincidence that the program proved to be so popular. When early studies determined that its street scenes were faltering, Jim Henson brought about a final breakthrough. At the time, his Muppets were relegated to the “commercial” segments as cut-aways from the street-based story line. For this, Henson drew on his own experience. He had originally developed Kermit and the Muppets for commercial work; his 1950s show “Sam and Friends,” with its zany ads for Wilkins coffee, has now found a second life on YouTube. Over the objections of researchers, who had advised against mixing the fantasy of the Muppets with the reality of the street, Henson developed Big Bird and Oscar the Grouch to be central characters on the main stage, both driving and subverting the program’s self-seriousness.

Davis tracks down every “Sesame” anecdote and every “Sesame” personality in his book, and the result is more an oral history than a tightly organized narrative. The development of the show’s characters, as well as the performers’ own lives, can be illuminating. Bob McGrath, who has played Bob from the start, once enjoyed a pop singing career in Japan. Gordon, the neighborhood’s black role model, played by Matt Robinson and then Roscoe Orman, was named for the photographer Gordon Parks. The character Susan, Gordon’s stay-at-home wife, was once denounced by feminists. Emilio Delgado and Sonia Manzano joined the cast in the ’70s as Luis and Maria after protests against the show’s lack of Hispanic characters. Will Lee, who played the store owner Mr. Hooper, came through the Yiddish theater and the radical Group Theater, and was blacklisted in the ’50s; Lee’s death in 1982 became a defining moment when “Sesame Street” chose to address the news directly on the air. Northern Calloway, who played Mr. Hooper’s young assistant, David, proved to be an even more tragic case: by the time I appeared on camera with him, according to Davis, Calloway was medicated with lithium after a violent psychotic breakdown; a manic-depressive in and out of treatment, he remained on the show through the late ’80s, but died in 1990 after suffering a seizure in a psychiatric hospital.

Davis lingers on such gossip. I could do without dwelling on the drinking habits of Captain Kangaroo (Bob Keeshan, forever jealous of the acclaim for “Sesame Street”) or several of the book’s other trivial details. Do we really need to know that Cooney served boeuf bourguignon, “a traditional French country recipe . . . on Page 315 of the first volume of ‘Mastering the Art of French Cooking,’ ” to Lloyd Morrisett at their 1966 dinner?

Far more interesting are the failings and criticisms of the lavishly praised show. Terrence O’Flaherty, a television critic for The San Francisco Chronicle, accused “Sesame Street” of being “deeply larded with ungrammatical Madison Avenue jargon.” Carl Bereiter, a preschool authority, said, “It’s based entirely on audience appeal and is not really teaching anything in particular.” And Neil Postman complained that it relieved parents “of their responsibility to teach their children to read.”

The real challenge to the show came in the 1990s, around the time Joan Cooney retired as chairwoman of the Children’s Television Workshop, the program’s nonprofit governing body. Once revolutionary, “Sesame Street” came to be seen as a dated reminder of urban decay, while the purple dinosaur Barney took children’s television out to the clean suburban schoolyard. “None of Barney’s friends lives in a garbage can, and none grunts hip-hop,” National Review cheered. In response, “Sesame Street” made an ill-fated attempt at urban renewal, developing an extension to the set called “Around the Corner” that seemed “less like Harlem and more like any gentrified up-and-coming neighborhood in America,” Davis writes. Professional child actors were regularly employed for the first time.

The broken-window theory may have worked to clean up New York, but not so for “Sesame Street” — as its empire expanded abroad, ratings eroded at home, and the gentrified set was abandoned. “Sesame Street” ceased to be a reflection of its surroundings. Early on, the writer-producer Jon Stone rejected the traditional trappings of children’s television: “Sesame Street” would have “no Treasure House, no toy maker’s workshop, no enchanted castle, no dude ranch, no circus,” Davis says. But this is what “Sesame Street” had become, and perhaps what it really always was: an urban fantasy world born of ’60s idealism. Davis has written a tireless if not altogether artful history of this unique place. Here, finally, we get to touch Big Bird’s feathers."


STREET GANG
The Complete History of “Sesame Street.”
By Michael Davis
Illustrated. 379 pp. Viking. $27.95


'Shakespeare and Modern Culture,' by Marjorie Garber


"Although women did not begin performing in his plays until several decades after the playwright’s death, it is hardly surprising to encounter a quotation from an actress in a scholarly volume on Shakespeare. Ellen Terry conducted a lively debate with Shaw over her interpretations of various characters. Sarah Bernhardt was among the many women who have taken on the role of Hamlet, making up, after a fashion, for female forebears denied the chance to play even Rosalind or Viola or Lady M.

But you certainly don’t expect to hear from Ali MacGraw. I wouldn’t want to classify this placid movie star of the 1960s and ’70s as the very last actress I would expect to be offering insights on Shakespeare — Jessica Simpson would come a lot farther down the list — but still, the inclusion of a quotation from MacGraw in Marjorie Garber’s new book, “Shakespeare and Modern Culture,” drew a raised eyebrow.

A smile, too. There is much heady scholarship in Garber’s wide-ranging survey of the ways in which “the timelessness of Shakespeare is achieved by his recurrent timeliness.” MacGraw, the “Love Story” star, is in the impressive company of gleamingly illustrious intellectual eminences like Freud and Marx, not to mention respected literary critics from Hazlitt to Foucault. But the book is as much about the uses (and abuses) to which Shakespeare has been put in the last few centuries, with a concentration on the one just past, as it is about the plays themselves. So Garber’s range of references encompasses a dizzyingly diverse cast of characters, from political pundits to gurus of the business world, from theoreticians known mostly to academics to the British rock band Dire Straits.

This capacious reach is actually the book’s signal achievement and its primary flaw. That’s a fat paradox, I know, but I’m playing Garber’s own game. “Shakespeare and Modern Culture” is founded on proving the truth of a mind-bending formulation, that “Shakespeare makes modern culture and modern culture makes Shakespeare.” The history of the plays as they have been performed and debated across the centuries is “the story of a set of mutual crossings and recrossings across genres, times and modes.” The book’s overarching idea derives from the rhetorical device known as chiasmus, or “crossing of words” — the theoretical two-way street illustrated by that phrase about Shakespeare both making and being made. “The structure of thinking exemplified by chiasmus,” Garber writes, “works both structurally and symbolically: the productive confusion between art and life, inside and outside, container and contained was essential to both the stability and the destabilization of Shakespearean theater.” Seemingly opposed ideas — or chronologically distant phenomena — are made to converse, and the conversation often brings forth bright streams of revelation, if also the occasional banality.

We seem to have strayed a long way from Ali MacGraw, but it should be emphasized that Garber — a Harvard literature professor and the author of “Shakespeare After All,” a big, terrific study of the whole canon, as well as several other books on varied subjects (real estate, transvestism, dogs) — writes mostly lucid, engaging prose that requires little recourse to a Glossary of Academic Literary Jargon. Following her along the many intellectual crosswalks of the book will pose no major obstacles for the general reader.

“Shakespeare and Modern Culture” is devoted to 10 major plays; the non-hilarious “Merchant of Venice” is the only comedy among them. I wish Garber had included at least one of the lighter comedies, or addressed the question of why none were deemed worthy of attention. I also wish the inappropriately dreary, subtitle-ish title had been appended to something more pithy: “Enter, Fleeing,” perhaps, a frequent Shakespearean stage direction that set Walter Benjamin musing profitably, as Garber notes.

Each chapter explores a single play and various cultural responses to it over the years — in particular, how it relates to one of “the central concepts and topics of literary and cultural investigation for the past hundred-plus years.” “Hamlet” is discussed in concert with changing ideas about “character,” both literary and personal. “Macbeth,” with its fatally misleading prophecies, is investigated through the prism of “the necessity of interpretation.”

So far, so good. It is only when we come to the other major structural formula — linking each play not just with an idea but with at least one modern genre — that things occasionally go awry. When Garber examines the plays through their impact on intrinsically interesting samples of 20th-century culture, the reading is wonderful in its depth of insight and flexible toggling between the Shakespeare texts and the later works. When they are viewed in the light of less rich fare — the pep-speak of business-advice books, or glib media coverage of politicians, sometimes even the movie versions of the plays — the inspiration dries up.

The culminating chapter, titled “ ‘King Lear’: The Dream of Sublimity,” displays Garber at her best, and brings the book to an aptly sublime conclusion. She fruitfully traces the play’s rise to the top spot in the canon during the course of the 20th century, as a work that seemed both to prefigure the horrors of that blood-soaked era and to offer endless stimulation to the thinkers and writers born into it. Garber discusses Samuel Beckett and Jan Kott (who linked Beckett’s “Endgame” and “Lear” in his seminal book “Shakespeare Our Contemporary”) and offers provocative digressions about Albert Camus and many others. Her elucidation of the relationship between zero and nothingness is occasionally marked by “leaps” — to use a term she also dissects brilliantly — that don’t land her anywhere much. The intellectual hopscotching, though, is ultimately an example of lit-crit patternmaking that has the beauty and intricacy of fine lace.

But when Garber spends a few pages of her chapter on “Henry V” analyzing the many ways in which Shakespeare’s characters, especially that rousing leader of the battle-waging “band of brothers,” have been stripped of their ambiguity to serve as exemplars of leadership qualities and modern management techniques, the mind shuts down. As she herself writes at the conclusion of this section: “This kind of work is not useful in illuminating, analyzing or interpreting Shakespeare. It uses Shakespeare, but the use is not commutative. It does not go both ways.” Precisely. So why not spare us?

The paradox I referred to above — the book’s inclusivity as both an asset and a problem — derives from the issue Garber cogently describes here. I can’t argue with her when she includes a list of occasions on which comparisons to Lady Macbeth have been used to demonize women politicians. But the point will be obvious to anybody paying attention to the currents of popular culture. Similarly, in the chapter about “Richard III,” centered on the elusiveness of “fact,” Garber provides a long list of recent examples of “the temptations to rewrite history”: the weapons of mass destruction argument for waging war on Iraq, the Jessica Lynch and Pat Tillman stories, and (yawn) even James Frey. These digressions are time wasters, and have a Google-searchy quality that detracts from the far more edifying work Garber does elsewhere in exploring Shakespeare’s impact on modern art and thought. (She’s particularly astute on Freud.)

In the end, the nourishing material far outweighs the thinner gruel. And Garber’s aim, to encourage a deep engagement with the plays by emphasizing their ubiquity in modern culture, is so exemplary that the book’s occasional descent from the stimulating to the trite is forgivable. A fierce devotion to Shakespeare shines forth from every page. And, really, to corrupt a phrase made famous by Ali MacGraw, loving Shakespeare means never having to say you’re sorry."


SHAKESPEARE AND MODERN CULTURE
By Marjorie Garber
Illustrated. 326 pp. Pantheon Books. $30


'Lives Of The Artists,' by Calvin Tomkins


"The allusion in the title of this collection of New Yorker profiles to Giorgio Vasari’s biographical sketches is only partly apt. Like Vasari, Tomkins approaches art writing as a raconteur rather than a formalist: artists’ lives, Tomkins says, are “integral to what they make.” But while the highly opinionated Vasari wrote of the “progress” of art toward “the perfection” of the Renaissance, Tomkins is a creature of our anything-goes cultural era. Though not a strictly nonjudgmental postmodern­ist, Tomkins is cheerfully ecumenical, defying both the squabbling sectarians of the art world and critics who believe art is becoming increasingly frivolous. These 10 profiles, which originally appeared from 1999 to 2008, helped ratify the contemporary canon in all its eclecticism, from the market-savvy provocations of Damien Hirst to the austere if gigantic minimalism of Richard Serra to the enigmatic theatricality of Cindy Sherman. Declinist critics might fault Tomkins for not advancing a broader aesthetic defense of the artists he champions, but his consummate mastery of the magazine profile form and enthusiasm for his subjects are winning. And it’s hard to begrudge these artists their charmed lives. These pages are filled with amiable but ambitious eccentrics who follow their whims (“What gave him the confidence to make art out of petroleum jelly and tapioca and athletic equipment?” he asks about Matthew Barney) and soon earn riches on a scale that would impress their Wall Street patrons. The painter John Currin sums up the current mood when he tells Tomkins, “All art is about saying yes.”


LIVES OF THE ARTISTS
By Calvin Tomkins
John Macrae/Holt, $26.


'Some Of It Was Fun: Working With RFK and LBJ,' by Nicholas Katzenbach


"As Robert Kennedy’s deputy and then successor as attorney general, Katzenbach charged into the scrum over civil rights. He was the top administration official present at the deadly 1962 battle between a segregationist mob and federal marshals enforcing a Supreme Court order to admit James Meredith to the University of Mississippi. And it was Katzenbach who confronted Governor George Wallace, in a carefully staged tableau of integration and resistance, at the door of the University of Alabama. Katzenbach’s government career had its share of dramatic and consequential moments, but his memoir of those years is also concerned — to an extent that will challenge the patience of many nonspecialist readers — with the routine inner workings of the federal bureaucracy. His purpose is to recall Washington’s political culture during a less partisan era. Robert Kennedy, Katzenbach writes, was an attorney general who would never “sacrifice law to political aspirations.” This is an implied contrast with George W. Bush’s Justice Department that is made explicit elsewhere in the book. But Katzenbach would soon find that it’s not so easy to put policy over politics. He became undersecretary of state in 1966, and though he was dovish on the Vietnam war, he acknowledges the predicament Lyndon Johnson faced as he contemplated the “domestic political consequences if territory was lost to the Communists.” It is a lesson in the limitations of good intentions: Katzenbach, who so honorably pursued racial justice, wound up affiliated with a war muddled by electoral calculation."


SOME OF IT WAS FUN
Working With RFK and LBJ
By Nicholas deB. Katzenbach
Norton, $27.95.


'Now the Drum of War: Walt Whitman and His Brothers in the Civil War,' by Robert Roper


"Walt Whitman’s mother, Louisa, had a limited education but a “quicksilver intelligence and unostentatious decency,” says Roper. Her matriarchal moral authority is clear from the family’s wartime correspondence, the main source for Roper’s book. Louisa’s letters were a lifeline to her sons George, an indomitable Union officer; Jeff, a successful engineer; and Walt, whose poetic masterpiece, “Leaves of Grass,” was still largely unheralded in 1862, when he went to Washington and became a civil servant, a hospital volunteer and a journalistic and poetic war chronicler. Roper’s book contains multitudes, in the all-inclusive and meandering spirit of Whitman’s poetry — to a fault, since Roper sometimes gets lost in domestic, military and historical detail. But he has something to say about Whitman’s hospital work and late poetry. Though often described as a “nurse,” Whitman was more like a secular chaplain, a “visitor & consolatory,” as he put it, to thousands of wounded soldiers. Roper sees this service as the application of Whitman’s concept of “adhesiveness,” which Roper characterizes as a force of “comradely love,” vaguely homoerotic, that would bind the nation. As an angel of mercy, Whitman rose to the challenge of the war, but “as a poet he was stymied,” Roper says. “Drum-Taps,” Whitman’s “main poetic response” to the war, he remarks, “is surprisingly thin.” Whitman, “profoundly immersed in the human material,” writes Roper, seemed “not really to be listening.”


NOW THE DRUM OF WAR
Walt Whitman and His Brothers in the Civil War
By Robert Roper
Walker, $28.


'A Jury Of Her Peers: American Women Writers From Anne Bradstreet to Annie Proulx,' by Elaine Showalter


"It may be surprising that there’s been no comprehensive history of women’s writing in America. But Elaine Showalter has now undertaken this daunting venture with her vast democratic volume, “A Jury of Her Peers: American Women Writers From Anne Bradstreet to Annie Proulx,” in which she energetically describes the work of long-forgotten writers and poets along with that of their more well-known contemporaries. In the 1970s, Showalter wrote “A Literature of Their Own: British Women Novelists From Brontë to Lessing,” which established an alternative canon of British women writers at a moment when feminist studies were very much in vogue, and her new book is an attempt to do the same thing for American literature. Showalter was, for nearly two decades, a professor in the department of English literature at Prince­ton (she was the head of the department when I was graduate student there), and she remains a grande dame of feminist literary studies.

It’s worth noting that many of the most talented writers she discusses — Edith Wharton, Willa Cather, Mary McCarthy, Elizabeth Bishop, Joan Didion — objected to being categorized as women writers and preferred to think of themselves simply as writers. As Elizabeth Bishop put it, “art is art and to separate writings, paintings, musical compositions, etc. into two sexes is to emphasize values that are not art.” Showalter handles these rebels by corralling them into special subchapters with titles like “Dissenters.” One of the dissenters, Cynthia Ozick, argued against expecting “artists who are women . . . to deliver ‘women’s art,’ as if 10,000 other possibilities, preoccupations, obsessions, were inauthentic, for women, or invalid, or worse yet, lyingly evasive.”

“A Jury of Her Peers” announces its inclusiveness with its size and heft, and the breadth of Showalter’s research is indeed impressive; it seems there are women scribblers under every apple tree, in every city street and small-town cafe across our great nation. In fact, the encyclopedic nature of the book is both its satisfaction and its limitation. The entries are brisk, informative and often less than a page long. There are too many writers here to go into much depth about any of them, and one finds oneself, in many of the more absorbing passages of the book, wanting more. Of course, distilling any writer’s life work into a brief entry entails a certain amount of glossing over. To cover so much territory necessitates a kind of breezy simplification, and that very breezy simplification is also the pleasure of this kind of ranging, inclusive history.

Though she refers to “A Jury of Her Peers” as literary history, Showalter is less attentive to artistic merit, to what separates good fiction from bad, than to cultural significance; she is less concerned with the nuances of style or art than with the political ramifications of a book, or the spirited or adventurous behavior of its lady characters. She is not interested in whether the writers she discusses are good, or in the question of how their best writing works, but in whether they are exploring feminist themes. And so she ends up rooting through novels and poems for messages and meanings about women’s position in society, for plots that criticize domesticity or that expound on the narrowness of women’s lives. (She once coined the term “gynocritic” for critics freed “from the linear absolutes of male literary history.”) This exploration of subversive plots and spunky heroines is fruitful from a purely historical point of view, but it doesn’t always feel like literary criticism at its most sophisticated. One thinks of Joan Didion’s line about feminists: “That fiction has certain irreducible ambiguities seemed never to occur to these women, nor should it have, for fiction is in most ways hostile to ideology.”

Showalter is occasionally prone to bouts of reductionist readings that belong to a faded era of bell-bottoms and consciousness-raising groups, as when she says the elaborately drawn characters Gus Trenor, Percy Gryce and Simon Rosedale in Edith Wharton’s “House of Mirth” are “products of their own crisis of gender,” or when she writes that Sylvia Plath’s richly nuanced poem “Daddy” “embodied women’s rejection of patriarchal mythologies.” But on the whole her writing is clear and lively and mercifully free of the fashionable jargon of academic criticism.

Showalter’s wide net draws in writers like Dorothy Canfield Fisher, whose novel, “The Home-Maker,” written in 1924, includes the abysmally written passage: “What was her life? A hateful round of housework, which, hurry as she might, was never done. How she loathed housework! The sight of a dishpan full of dishes made her feel like screaming. And what else did she have? Loneliness; never-ending monotony; blank, gray days, one after another full of drudgery.” Very few people, I imagine, would argue for the elegance of the prose, but the passage is undoubtedly interesting from a feminist point of view. And so the question becomes: Is this capacious, political way of looking at writing a flawed way to view the mysteries of literature? Willa Cather put it this way: “The mind that can follow a ‘mission’ is not an artistic one.”

Showalter’s final section on modern women writers, with headings like “From Chick Lit to Chica Lit,” is the flimsiest in the book. Where, one wonders, are some of the quirkier and more interesting talents of the past few decades, from Paula Fox to Mary Gaitskill to Claire Messud? Showalter spends too much time on frothy entertainments like Jennifer Weiner’s “Good in Bed” and Terry McMillan’s “Waiting to Exhale” at the expense of more serious literary work.

Toward the end of this ambitious book, Showalter concludes that one “must be willing to assume the responsibility of judging. A peer is not restricted to explaining and admiring; quite the contrary.” But one wishes there was more judgment in this book, more selection. The idea of resurrecting women’s writing from the neglect of previous eras is a project of ’70s feminism, but is the mere fact of being a woman and jotting down words in a notebook and then publishing them worthy of quite so many drums and trumpets? It may not be sensitive to say that some, just some, of the writers in this generous volume might have rightfully been relegated to obscurity, but one can’t help thinking, at times, that literary history may have passed them over for a reason, just as it has passed over mediocre male writers. One also wonders about the sheer democracy of the project, the fair-minded curiosity about nearly every woman who thought to pick up a pen. Does Dorothy Canfield Fisher really merit as much space as Elizabeth Bishop? It is a vexed and knotty question: Is Showalter in some way devaluing the achievements of the greatest American writers by giving equal or greater space to the less talented? Is she slighting women writers by holding them to a standard that is not about artistic excellence, but about the political content or personal drama of their writing? In her brilliant essay “Silly Novels by Lady Novelists” George Eliot wrote, “the severer critics are fulfilling a chivalrous duty in depriving the mere fact of feminine authorship of any false prestige which may give it a delusive attraction, and in recommending women of mediocre faculties — as at least a negative service they can render their sex — to abstain from writing.”

Still, this comprehensive record of American women’s attempts at literary achievement holds its own fascination; the small, vivid portraits of women’s lives are extremely readable and enlightening. Writing about times when women’s stories were too often ignored, Showalter offers a series of vignettes about what their struggles consisted of and how difficult it was for a woman to forge a professional identity as a writer. She is concerned with the drama of women writing; the lives she describes are filled with abortions, divorces, affairs, unhappy marriages, postpartum depressions and suicides. Her short, incisive biographies offer a glimpse into the exotic travails of the past and the eternal concerns of female experience; and, of course, from a purely biographical standpoint the literary mediocrities can be as interesting as the successes.

“A Jury of Her Peers” is likely to become an important and valuable resource for anyone interested in women’s history. It outlines the rich and colorful history of women struggling to publish and define themselves, and the complex and tangled tradition of women’s writing in this country. It also leaves us with many memorable moments, like Dorothy Parker praying, “Dear God, please make me stop writing like a woman.”


A JURY OF HER PEERS
American Women Writers From Anne Bradstreet to Annie Proulx
By Elaine Showalter
586 pp. Alfred A. Knopf. $30
