Tuesday, December 27, 2005

The [Brand Name Here] News

If there were ever any doubt that it's advertising's world and we just live in it, that doubt should have been erased with the announcement, earlier this month, that a radio station in Madison, Wis., had sold the naming rights to its newsroom to advertisers. Starting Jan. 1, the newsroom for WIBA radio will morph into the Amcore Bank News Center. For journalism, that apocalypse we always talk about being upon us may really, finally be here.

Or has already been here: The Associated Press reports that a Milwaukee station sold its naming rights to another bank, Pyramax, in 2004. Monica Baker, Pyramax's senior vice president of marketing, told The AP that it was just a matter of the bottom line. "This is a way for us to maintain our revenue levels and make the station successful," she said. "The concern about any possible conflict of interest is just ridiculous."

But there's much room for concern among journalists, and there should be among the public as well. The Pyramax and Amcore deals don't just smudge the line between news and advertising, they cross the line with blithe ruthlessness and a reflexive justification on grounds of fiscal responsibility. Such alliances are a problem for anyone hoping for news outlets separate from the corporations that those outlets must sometimes cover.

Company brands on news broadcasts aren't really new. David Klatell, vice dean at Columbia University's Graduate School of Journalism, told The AP that the practice was discontinued in the 1950s, after broadcasts such as NBC's evening news featured the orange and white Gulf Oil logo on the set.

The problem NBC brass must have figured out then, Klatell noted, is the same issue they'll be forced to wrestle with this time if the trend continues. "They couldn't fairly cover not only Gulf, they couldn't cover anything that was relevant to the oil industry or its competitors."

As more and more American companies engage in the reverse mitosis of mergers and acquisitions, bulking up by consuming their smaller rivals (witness the new AT&T, happily digesting what was once SBC), the inevitability of conflict of interest is obvious. Consider the range of divisions of just some of the country's leading communications companies: from theme parks to magazines, from aircraft engines to the 50-gallon water heater I plan to buy for my home.

With a range of interests like that, it's obviously hard enough to maintain some degree of editorial independence when stories even peripherally related to a parent company's bottom line cross the editor's desk. You could imagine, for example, the dilemma of reporting a story on faulty aircraft engines implicated in a series of deadly airline crashes when your network's parent company makes the engines in question.

That dilemma is that much more transparent when you've branded your news product with the name of your biggest advertiser. Those newsrooms in Madison and Milwaukee can only hope they don't have to deal with any banking scandals tied to anyone working for Amcore or Pyramax banks. But that's the easy call; how will they handle stories on the banking sector in general? Can they be counted on to report news about other banks as aggressively as they'll cover news related to Amcore or Pyramax?

And how do these news departments handle the issue of professional integrity, namely, the matter of not just avoiding conflict of interest but also avoiding the appearance of conflict of interest?

That's a matter that two newsrooms, for now, are grappling with. The deeper, bigger question is what happens next. But with advertisers increasingly challenged to find a way to get into the public mind and stay there, with corporations increasingly challenged to hit their numbers, with news outlets increasingly challenged to inform the public and bludgeon their competition in the process, the Brought To You By News of the past may well be a worrisome development to come.

Monday, December 26, 2005

No, no, nano

So there I was at the Apple Store at University Village on Christmas Eve, internally debating which vision of the digital future of entertainment I should endorse with my dollars. Which way to go: video iPod or nano?

Lately there’s been a lot of press about both of them. The video iPod got attention, naturally, for its video capabilities; the nano was rightly heralded as an advance in miniaturization that other music-player manufacturers would come around to, sooner or later.

I checked them both out thoroughly, mostly with that touchy-feely, tactile sense of validation we use before making any big purchase (one reason I wasn’t overjoyed at the prospect of waiting to get one in the mail).

The nano’s reputation for sleekness is well-deserved; you could easily forget you had the thing and lose track of its location. There's a charm aspect to the nano that tries hard to make it seem irresistible, especially to younger buyers of leisure technology.

Likewise, the video iPod has garnered a lot of attention for the video feature, which, depending on the degree of your addiction to television, may or may not be the coolest thing.

Both devices have their champions and their audiences. But there in the Apple Store, it was my turn to decide. Which paradigm would get my dime?

Then, in an instant -- in one of those flashes of thought that could be insight but is probably more just a spendthrift shopping as shrewdly as possible -- the decision became crystal clear, as clear to me as I suspect such decisions will be for others in the future.

Like others, I too had fallen in love with nano: its sleek shape, its almost invisible size. But when the rubber met the road, I began to suspect something was amiss. First, it was a matter of cost efficiency. The nano I intended to buy would let me carry about 1,000 songs on 4 gigabytes of flash memory. Cost: $199. The video iPod, on the other hand, allowed for downloading 7,500 songs on a 30-gigabyte drive. The larger size of the video iPod wasn't important. But the cost? $299. Clearly you don't have to be a math major at MIT to see which device, on a dollar-per-gigabyte comparison, is the better value.
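For anyone who wants the math spelled out, here's the dollar-per-gigabyte arithmetic, a throwaway sketch using the prices quoted above:

```python
# Cost-per-gigabyte comparison, using the figures from this post.
nano_cost, nano_gb = 199.0, 4        # iPod nano: $199 for 4 GB
video_cost, video_gb = 299.0, 30     # video iPod: $299 for 30 GB

nano_per_gb = nano_cost / nano_gb    # $49.75 per gigabyte
video_per_gb = video_cost / video_gb # roughly $9.97 per gigabyte
```

Fifty bucks a gigabyte versus ten: no MIT math major required.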

And then there was the other matter to be considered. Let's call it the vision thing. The nano, despite its cuteness factor and the buzz that preceded its arrival into the culture, effectively takes us no further than where we've already been -- where Apple's already been. The nano is a clever sonic tweak of the original iPod technology, a smart rehash of the model launched in 2001 -- but still a rehash, one that doesn't advance that original technology with anything more than size. Setting aside the early hosannas for nano, I've got the suspicion that other late adopters are likely to come to the same conclusion. Not to rain on Steve Jobs' parade, but the nano of 2005 may well go the way of the iPod mini in two years' time: slowly but reliably phased out, ushered to the back of the universal serial bus.

The video iPod, with its capacity for downloading videos with the same ease as downloading music, is the trailblazing device, the one that networks and music websites will be retooling for in the future. NBC and ABC have already started that process: NBC makes downloads of the "Nightly News" available free, while ABC offers up downloads of "Desperate Housewives" and "Lost" for about $2 each. As the driver of an online business model, the video iPod had it nailed. At least that's what I thought as I walked to the counter, debit card in hand, eagerly anticipating my ticket to the cutting edge.

At least until Apple comes up with something new.

Think they won't? You watch; they will.

iGuarantee it.

Thursday, December 8, 2005

Lennon

It was Dec. 8, 1980, early in the evening in Boulder, Colorado, and I'm sitting in my bedroom at the foot of my bed with Marjorie, my next-door neighbor and infatuation of the moment. With my roomies sitting in the next room noisily watching ABC's "Monday Night Football," I kissed Marjorie for the first time; we necked and snuggled and considered the possibilities of spending some big percentage of our lives together; and then we heard, with the door not quite closed enough, the unmistakable voice of Howard Cosell telling us, telling the world, that John Lennon was dead.

Things started to turn sour for me and Marjorie from almost that moment on. It took a while to fully play out, but our romantic fortunes went downhill -- which shouldn't have been surprising. One way or another, things went south for all of us from that day on.

Some mile markers in life are unavoidable, like road obstacles that are too big to drive around. There's no escaping them when they happen or even years later. There's behavior that's common to our species -- or maybe just particular to our era: we tend to measure the gravity of events in the context of the terminal. We all have our stories of where we were when John F. Kennedy was assassinated, or Malcolm X, or Martin Luther King or Robert Kennedy or Yitzhak Rabin. John Lennon joined that pantheon of eternals, but in a slightly different way.



Lennon always had an edge about him. In his life as a Beatle and afterward, there was a sense of the precipitate, the volatile about John Lennon. More than any of the other Beatles -- the often-sullen George, the relentlessly chipper Paul, the phlegmatically loyal Ringo -- John embodied rock and roll's potential for unalloyed danger, that feeling you get in the best rock music of careening headlong into a new and strange place -- and not being frightened by the prospect in the least.

Long before that nightcrawler trapped him in the vestibule of the Dakota, John Lennon wore a target; it's been said that years before the Beatles exploded, when he was still playing with the Quarrymen, blokes in his Liverpool neighborhood wanted him hurt, or worse, for reasons we can only guess at now.

His was rock and roll's first outright assassination. With his killing, legions of fans got the wake-up call they'd been dreading, or avoiding, at least since the Beatles broke up a decade earlier.

And it was hard to let go. It was, and still is, hard to give up that giddy frisson of the Beatles' first performance on the Ed Sullivan Show, when four shaggy knuckleheads in suits landed in a country numb from the loss of Camelot about nine weeks before they arrived. The audience that night, its screams like a flock of crazed birds, was a taste of the future. The screams followed them around the world for six years -- the reaction of fans who lost themselves in a music and a style and a world-view that began the transformation of popular culture.

Not long after the bottom fell out of the Beatles, in 1970, Lennon was warning us. Telling us in his fashion to get ready. Be prepared for anything. Grow up. "I don't believe in Beatles," he told us in the song "God." "The dream is over." Lennon was teaching us to grow up even while, paradoxically, he was growing up himself. There were dalliances for a time when things with Yoko went badly.

He hangs out with Harry Nilsson in L.A. He and Harry get tossed out of the Troubadour for heckling the Smothers Brothers. He goes to another nightclub and gets upbraided by a waitress for wearing a tampon on his head. He learns the process of starting over.

We're lucky that the fruition of that process was something positive. Rather than an obituary of a rock star who passed from the scene with a spike in his arm or a shotgun to his head, we got from John Lennon the evidence of his mellowing, his maturity. You can't listen to "Double Fantasy" without hearing that growth process in the works; like a butterfly fighting its way out of a cocoon, John and Yoko were fighting to regain their own identities -- regain, hell! maybe fighting to just have identities separate from those foisted on them by the media, the music biz, and always, always, the fans.

We got a taste of that great possible. But only a taste. Now, a quarter century after Lennon died -- sounds tree-ring strange saying "quarter century" -- we're in some ways more in need of his candor, his wit, his passion, than we ever were before.

John Winston Ono Lennon remains an indelible spirit of our times, a man whose wrestling with demons within and without has made our own cage match with reality a little more bearable. It's a real tribute to someone's life when you find that you miss that person, need that person, feel that person's presence more and more as time goes by, not less.

John Lennon was a rock dropped into the water of our time and our lives, and the ripples from that rock get stronger and stronger all the time, the further and further we get from their source.

Imagine that. Just imagine.
-----
Image credits: Top photo: Roy Kerwood, 1969; Dakota: David Shankbone

Wednesday, November 23, 2005

Goodbye, Mount Koppel

For twenty-six years we've come to know him and, if not exactly love him, certainly welcome his steady presence and journalistic gravitas in our living rooms when we had enough of the late-night froth of Leno and Letterman, and the denizens of post-prime-time programming who didn't last.

Last night we bid goodbye to Ted Koppel, the craggy, sometimes relentlessly mirthless fixture of ABC News' "Nightline," a program that began its life with the United States in the midst of a hostage crisis -- a show that, ironically enough, finds the nation caught up in another hostage crisis in the Middle East, this time of its own unwise design. Koppel handed over the reins of the program to multiple hosts who'll produce the show from New York and Washington. Reporters Chris Bury and John Donvan will take over hosting duties, along with Vicki Mabrey, late of CBS' "60 Minutes II," as well as Martin Bashir, Cynthia McFadden and Terry Moran as contributors. The new raft of hosts and correspondents will begin on Nov. 28.


"Nightline" began, of course, with the ABC News program "The Iran Crisis: America Held Hostage," which aired Nov. 8, 1979, just days after American hostages were seized at the U.S. Embassy in Tehran. Koppel introduced viewers to what would become the "Nightline" formula. After "Nightline" began as a formal entity in March 1980, Koppel unveiled the point-counterpoint style of interviewing, pitting two ideological opposites against each other on a given topic, with Koppel aiming thoughtful, sometimes provocative questions at both combatants.

In some respects Koppel's form was the antecedent to the attack-dog style of TV news journalism that's led us to "Hardball" and any number of other confrontational programs. But Koppel's on-air demeanor was a cut above that of the pit-bull interrogators that populate the 21st-century TV landscape.

Even in his globe-trotting prime, Koppel always betrayed a bit of Alistair Cooke in his delivery and his use of language. He maintained a high editorial standard, and resisted doing stories on the lowbrow tabloid titillation of the moment.

Even in last night's valedictory broadcast, when he could have resorted to the summational reflex of a video-grab retrospective -- a greatest-hits approach that's so overdone you wonder why TV journalists bother to do it or why we still bother to watch -- Koppel took another tack, going back to his poignant 1995 broadcasts on the life of Morrie Schwartz, the sociology teacher whose battle with Lou Gehrig's disease gave "Nightline" some of its more riveting moments, even as it gave a young sportswriter named Mitch Albom the storyline for a book, "Tuesdays with Morrie," that's become a publishing phenomenon.

Still, only being human, Koppel couldn't resist taking one final shot at ABC News, the network that almost kicked him to the curb not so many years ago, in the big rush to late-night, stand-up stupidity.

Since ABC is presumably still in the hunt for a permanent anchor for "World News Tonight," a post vacant since the death of Peter Jennings earlier this year, we have to wonder how Koppel might have done in the big chair. ABC could do worse, and probably will.

Last night the ever-charitable Koppel asked his dwindling audience -- down to about 3.6 million viewers from 5.5 million a decade ago, according to "Nightline" producer James Goldston (speaking to AP) -- to give the new hosts a chance to become a regular fixture in their TV-viewing lives, the way he had become one.

“If you don’t,” he said, “I promise you the network will just put another comedy show in this time slot. Then you’ll be sorry.”

But to some degree, we're already sorry at the departure of something modern American television sorely lacks these days: an institutional memory, a hard drive of the historical, a conduit between the present and the past, a tree-ring experience in a time of loudmouthed saplings ... something that lasts.
-----
Image credit: Koppel: ABC News

Friday, November 18, 2005

A hawk flies north

Not that the antiwar movement needed a bigger gun than the thunder of its own convictions, but those opposing the debacle in Iraq gained a powerful ally yesterday, when one of the more resolute and influential hawks in Congress – a Democrat! – came out, with passion, power and an eloquence sorely lacking on Capitol Hill, against the war he had supported in 2002.

Pennsylvania Rep. John Murtha, a Vietnam veteran with a Bronze Star and two Purple Hearts, said it plainly in Washington. “It’s time to bring them home. … Our troops have become the primary target of the insurgency … we have become a catalyst for violence,” he said. “The war in Iraq is not going as advertised. It is a flawed policy wrapped in illusion.”

In his comments, Murtha went on to fire a smart broadside at Vice President Dick Cheney, who the night before at a black-tie event, adopted the administration party line, railing against those who want U.S. forces brought home as “dishonest,” “reprehensible,” and claiming that withdrawal now sends the wrong signal and opens the door for more insurgents, presumably emboldened by our nation's premature departure from Iraq.

“I like that,” Murtha said yesterday, with a palpable sarcasm. “I like guys who got five deferments and never been there and send people to war, and then don’t like to hear suggestions about what needs to be done.”

The same day, Senate Minority Leader Harry Reid chimed in, warning the White House to halt its ad hominem smear campaign against Iraq-war critics, calling it “a weak, spineless display of politics at a time of war.”

As expected, the administration responded aggressively. “They want us to retreat,” said Speaker of the House Dennis Hastert. “They want us to wave the white flag of surrender to the terrorists of the world.”

Texas Republican Rep. Sam Johnson, himself a Vietnam veteran and a POW for seven years, weighed in as well. “We’ve got to support our troops to the hilt and see this mission through,” Johnson said, underscoring one of the fundamental administration disconnects related to public debate on the war: an eagerness to equate criticism of the war with criticism of the decent, loyal Americans sent to fight it.

In a statement, White House press secretary & mouthpiece Scott McClellan leveled another attack. “The eve of an historic democratic election in Iraq is not the time to surrender to the terrorists,” McClellan said. “Congressman Murtha is a respected veteran and politician who has a record of supporting a strong America. So it is baffling that he is endorsing the policy positions of Michael Moore and the extreme liberal wing of the Democratic Party.”

The vitriolic administration position against Murtha raises the question of why they think a man who has stood on principle for so long, in both his career as a public servant and as a decorated military veteran, would have suddenly vacated those principles. It escapes them that, by the very fact of his reversal, just maybe the policy positions Murtha has adopted aren’t as extreme as the administration would have Americans believe.

One problem for the Bushies is Murtha’s 31-year stature as a Congressman. As the leading Democrat on the House Appropriations Defense Subcommittee, Murtha has expertise long sought by Democrats and Republicans. He once worked as an aide to Cheney when Cheney was secretary of defense, and he has visited Iraq numerous times. As a confidant of American forces, Murtha is thought to be that rare politician: one who’s got both the throw weight in the hallowed halls and the gravitas to speak for troops on the ground on military matters – not an easy thing for the administration to dismiss.

Murtha’s comments from Capitol Hill lead us to make a not-so-venturesome prediction: The year 2006 will be the watershed year for debate on the Iraq war. The relatively sporadic protests against the war – Cindy Sheehan’s mobile vigil; comments and reporting from disinterested international observers and journalists; the growing concern among Republican lawmakers – will coalesce into the visibly broad, transgenerational tide of fearless public sentiment that the administration has no doubt anticipated, if not feared outright.

It’s taking shape already: On Tuesday, even while defeating a Democratic plan for a firm exit timetable, the Republican-controlled Senate approved a statement of its concern, saying that 2006 should be the year in which the conditions are established for the start of a gradual withdrawal of U.S. forces from Iraq.

On Thursday, the Republic of South Korea blindsided President Bush by announcing its intention to withdraw 3,200 troops from Iraq sometime next year – an announcement that must have been a particular embarrassment to Bush, who the same day met with Asian economic leaders at a summit … in Pusan, South Korea.

And today, Sen. John Kerry, who knows a thing or three about being the victim of character assassination, spoke from the Senate floor with a novel interpretation of the oft-used Republican phrase “cut and run”:

“We are in trouble today, Mr. President, precisely because of a policy of cut and run – a policy where the administration made the wrong choice to cut and run from established procedures of gathering intelligence … to cut and run from the best military advice, to cut and run from sensible wartime planning, to cut and run from their responsibility to properly arm and protect our troops, to cut and run from history’s clear lessons about the Middle East and about Iraq itself – to cut and run from common sense.”

Another example of the cut-and-run was suggested in a retort from Arizona GOP Sen. Jon Kyl, who spoke right after Kerry did. Kyl resorted to the longstanding Republican “choral error” argument in defending the decision to go to war, saying (again) that the United States had a lot of company in its prewar assessments of Iraq’s danger.

“Our intelligence, and that of virtually every other nation in the world, believed that Saddam Hussein was a threat to the world and had weapons of mass destruction, and in some cases was developing capability for additional weapons of mass destruction,” Kyl said.

Thus was this nation led into the single most disastrous American military misadventure since Vietnam – by cutting and running from the facts: Not waging war on the basis of knowledge but on the basis of a suspicion; not on the strength of singular intelligence singularly arrived at but on the strength of groupthink, assumptions and a deviously cultivated fear.

And at the end of the day, it’s not prewar intelligence that’s the pivotal issue, despite the vituperative claims and counterclaims of those in Congress. The real issue up for debate is the actions taken by the United States even after the inaccuracy of our prewar intelligence was well-established. Long after that prewar intel was found to be toweringly wrong – the aluminum tubes Colin Powell demonized at the United Nations; the yellowcake uranium found not to even exist; the fictional linkage of 9/11 and the Iraqi regime – this nation persisted in following a military course of action, which strongly suggests that course of action was what the administration intended to pursue all along, no matter what the facts were.

Jack Murtha’s courageous stand – one of several he’s taken in the last thirty-five years – really shouldn’t surprise anyone. It’s a case of a hawk having the nerve to fly north for the winter, breaking from a flock still largely heading in the other direction. Murtha’s a bird of a different feather; sooner or later, others are likely to join him. Once again, the chickens are coming home to roost.

Wednesday, November 9, 2005

An elephant in distress

It may be too soon to officially put the Republican Party on a respirator, and reports of its demise may be premature, but the outcome of some of Tuesday’s state and local elections has clearly sent a signal that the GOP is in trouble. It is trouble that party loyalists insist has nothing to do with the standard-bearer of the Republicans, President George Bush, but the shifting mood of the electorate suggests otherwise.

Ron Fournier, veteran political writer for The Associated Press, said “President Bush’s political ills seem contagious,” but that's a masterful understatement. Right now, politically speaking, George Bush has the walkin’ pneumonia and the boogie-woogie avian flu, and the results of some of those contests point to the Republicans’ ironic inability to inoculate themselves from the one man who should have been their doctor.

It began Tuesday night with a very big win for the Democrats in Virginia, that longtime Republican stronghold, once a seat of the Confederacy, a state that hadn’t gone for a Democrat for president since 1964 – and one in which Republicans have control of the Legislature and the state’s seats in Congress. Despite a personal appearance on the stump by President Bush, GOP gubernatorial candidate Jerry Kilgore was beaten by Democratic lieutenant governor Tim Kaine, who trumped Kilgore by 6 percentage points.

The postmortem indicated that citizen Kaine took a page from the Republican playbook, outflanking Kilgore on the GOP’s once-unassailable selling point: Values. Kaine placed his first campaign ad on a Christian radio station, Fournier reported.

Kaine’s first television ad played up his past experience with Catholic missionaries. And maybe most damaging was Kaine’s association with the popular Democratic governor, Mark R. Warner, who last year was mentioned, albeit briefly, as a possible running mate for Sen. John Kerry.

Then came news of the governor’s race in New Jersey. Sen. Jon Corzine, the popular Democrat who vaulted to his Senate post after a career on Wall Street, trounced Republican challenger Doug Forrester in a vituperative contest that saw a lot of name-calling before it was all over.

Elsewhere, the drumbeat against Republicans continued. In California, Arnold Schwarzenegger, the actor turned Republican governor, saw pet-project ballot initiatives rebuffed by voters increasingly fed up with the once and probably future movie Terminator.

It may be hard to find the links between these sound Republican defeats and President Bush – other than the power of the theory that standing next to a man perceived as a loser makes you a loser by association. Bush’s job approval ratings continue to spiral downward in the wake of Katrina fallout and the still-developing problems stemming from the indictment of White House aide and novelist “Scooter” Libby in the CIA leak investigation.

The president’s loyal minions are working hard to shore up the distance between Bush and the GOP losers on Tuesday. White House press secretary Scott McClellan, for example, dismissed the idea that Bush’s problems aided in Kilgore’s defeat in Virginia.

“Any thorough analysis of the gubernatorial elections is going to show that the elections were decided on local and state issues, and the candidates and their agendas,” McClellan said, presumably with a straight face, at the White House on Wednesday.

But whether he realizes it or not, McClellan’s comment deftly, if accidentally, undercut the ability of This President to employ the intangible powers of his office – the bully pulpit of the presidency – to do anyone in his party any good at all. It’s a tacit admission of the toothlessness of George Bush in his second term, an indicator of a relative impotence that’s likely to continue.

Whether the Democrats can capitalize on this next year is anyone’s guess. Tuesday’s elections in a handful of states point to the strong possibility of the Dems finally getting some messengers, still leaving open the question of whether they’ll finally get a message.

But the Republicans in power are looking down the tunnel and, to paraphrase the poet Robert Lowell, the light they see at the other end may well be an oncoming train. After five long years, it seems, reports of Republican invulnerability have been greatly exaggerated.
-----
Image credit: Kaine: Steve Helber, Associated Press

Monday, November 7, 2005

The Making of the President 2008, take 1?

Just for a moment, as you watched the two politicians prowling the stage, microphones in hand as they argued their own positions and railed against the other guy’s, you forgot what day it was. Or even what year it was. Did they give an election and forget to send you an invitation?

For just the sliver of time it took for your faculties to kick in, what was broadcast last night on NBC appeared to be a real debate between contenders for the presidency of the United States. But wasn’t that Detective Bobby Simone, Sipowicz’s former partner on “NYPD Blue” in one corner, and Hawkeye Pierce from “M*A*S*H” in the other?



Last night NBC undertook, in prime time, the latest smudging of the ever-blurring line between fiction and reality when it broadcast a live episode of its acclaimed political series “The West Wing.” Jimmy Smits, late of “NYPD Blue,” appeared as Rep. Matt Santos, the Democratic challenger. Alan Alda, from “M*A*S*H,” portrayed his Republican opponent, Sen. Arnold Vinick, in a mock debate meant to hew to the “West Wing” story line of a presidential election just starting to heat up in TV World (three years before the real thing in our own).

It was a compelling prime-time trick, meant to help shore up NBC’s sagging ratings. MSNBC.com took the stunt a step further, ordering a poll from the esteemed Zogby International polling organization to sample 1,208 viewer opinions to find out who “won.”

Regardless of who prevailed – Santos/Smits was declared the “winner” by a handy double-digit margin – the “West Wing” exercise may be instructive in what it suggests could play out in the 2006 elections, and maybe even in 2008. It would be the height of folly to follow this thing out the window, but who, right now, can say for sure that the mock debate won’t set some baseline of perception for the real elections to come?

The real-life polls for the Republican Party, and especially for President Bush, are nothing to write home about. A failed Supreme Court nomination, a burgeoning scandal with possible origins in the White House, a bid for a Social Security overhaul stuck in neutral (at best), and the slow bleed of our national misadventure in Iraq have combined to wear down the national patience for GOP leadership. Perception is reality, the saying goes, and the perception of the Republicans has been increasingly disappointing for more and more Americans.

If the perception is that a fictional Democrat won a fictional presidential debate, it’s at least possible that Americans fed up with the gradual erosion of the nation’s global credibility will take a subconscious cue from the “West Wing” goof and entertain the notion of change in who leads the country starting in 2008.

It wouldn’t be the first time that television has imparted its own reality to American politics.

In the legendary September 1960 presidential debate between John F. Kennedy and Richard Nixon, TV viewers were treated to a contrast of styles that became its own reality. Kennedy, prepared and looking haberdasher-smooth and polished, engaged Nixon, who was clearly fighting off some kind of ailment, looking sweaty and nervous, eyes shifty and seemingly insincere, a man pasty as a corpse even on black and white TV.

Theodore H. White understood it even then. In his book “The Making of the President 1960,” White observed the viral proliferation of television sets among American families and how “[W]ithin a single decade the medium has exploded to a dimension in shaping the American mind that rivals that of America’s schools and churches.”

It’s telling that, according to White, Nixon was thought to have held his own in the debate for a radio audience, but for the millions of television viewers Kennedy was the clear choice hands-down:

“Those who heard the debates on radio, according to sample surveys, believed that the two candidates came off almost equal. Yet every survey of those who watched the debates on television indicated the Vice President [Nixon] had come off poorly and, in the opinion of many, very poorly. It was the picture image that had done it – and in 1960 television had won the nation away from sound to images, and that was that.”

The rest is history, the stuff of our political folklore, and proof of the still-evolving power of visual perception in our culture.

Again, it’s a huge leap from a real debate of presidential contenders to a fake debate of actors pretending to be presidential contenders. But again again, perception engenders its own reality.

And when one political party is beset with intractable challenges, some of its own making, the burden of responsibility rests on its shoulders to separate – in the mind of an impatient public dazzled by a visual culture – fiction from the realest of real things.
-----
Image credit: NBC

Friday, November 4, 2005

'Simple Sambo'

That lovely sobriquet is one blogger’s name for Michael Steele, the African-American lieutenant governor of Maryland, a man seeking to become the state’s first black senator. The phrase didn’t come from what might be seen as the Usual Suspect of a white supremacist or a Web-savvy bigot on a tear. The label – and its accompanying minstrel-makeup image – was the work of a New Yorker who helms a news commentary Web site, a black member of the blogosphere with a bone to pick about a brother seen to have crossed the political tracks.

Therein lies a story of how African-American sentiment is at some pivotal intersection of race, politics and historical loyalty – a story of the ways in which black Americans are caught up in the same ugly, red state-blue state polarities as everyone else.

Steele – who plans to run in 2006 for the Senate seat that is set to open with the coming retirement of Democrat Paul Sarbanes -- has been the target of black Democrats for his presence and emerging role in the Republican party, a party seen, right or wrong, as antagonistic to black causes and aspirations.

Steele, and the GOP leadership generally, have been working to repair the longstanding shaky courtship between blacks and Republicans. Steele has hastened to put things in a wider historical context, reminding people of the love feast between the two in centuries past.

It's true enough – as the GOP has never, ever tired of letting people know – that the Republican party is “the party of Lincoln.” After the agonies of the Civil War, the presidential heirs to the Great Emancipator made black advancement more of a priority than it’s been in modern times.

After the assassination of President Lincoln, and no doubt partly as a reflexive commiseration reaction to that murder, African Americans flocked to the Republican party, which led the way to Congressional passage of the 13th Amendment to the Constitution — the one outlawing slavery. Republican efforts were also central to passage, in 1866, of a civil rights act that extended full rights to black Americans.

Progress continued in fits and starts into the twentieth century. But with President Franklin Roosevelt’s New Deal, and thoroughly in the spirit of “what have you done for me lately?”, blacks changed course in 1936, voting overwhelmingly for FDR, in part because of government spending initiatives, like the work relief programs that benefited black Americans during the depths of the Great Depression.

Black people were beneficiaries of other Democratic policies, including President Truman's 1948 signing of Executive Order 9981, which committed the U.S. government, at long last, to integrating a long-segregated military.

Fast forward 15 years or so: Civil rights programs under President Kennedy and, more dramatically, under President Johnson (the Great Society program and his role in the passage of the 1965 Voting Rights Act) further cemented the relationship between blacks and Democrats.

But by then, the split that started between the Democrats and many of their white southern counterparts — who hated Truman's 1948 desegregation order and the Democrats' subsequent support of the civil rights movement — had widened into a figurative Grand Canyon.

The capstone of that Democrat-driven bid for progress was probably when Johnson undertook his Great Society initiative – an effort that led to the Voting Rights Act, the Civil Rights Act, and Johnson’s own prophetic suspicion that his actions on behalf of black Americans delivered the Southern states as a voting bloc into the hands of Republicans for years to come.

The Dixiecrats — those disaffected Democrats who bolted to the Republican party in 1964 — were central to the success of Richard Nixon's “southern strategy” in 1968, and formed the nucleus of what would ultimately become the modern GOP.

The problem many black Americans have with the GOP is largely a contemporary one, and it explains, to some extent, the allergic reaction blacks have toward the Republicans.

Let's count the ways: There's the lingering bad taste of the 2000 “hanging chad” election, and the persistent belief among many black voters that the results were somehow rigged, invalidating their votes.

Then there's Bush's nomination of Judge Charles Pickering to the Court of Appeals for the Fifth Circuit, despite Pickering's less-than-stellar record on civil rights decisions. Pickering criticized the “one-person, one-vote” principle recognized by the Supreme Court and tried to limit remedies provided by the 1965 Voting Rights Act.

Apparently indifferent to minority concerns, Bush installed Pickering on the federal bench in January 2004 as a recess appointment, bypassing the routine Senate confirmation process.

In January 2003, on what would have been Rev. Dr. Martin Luther King's 74th birthday, Bush condemned the admissions system at the University of Michigan, which used race as only one of several factors to determine qualification for admission, as “divisive, unfair and impossible to square with the Constitution.” Rep. John Conyers, a founding member of the Congressional Black Caucus, called the president's position “yet another slap in the face of African-American and minority leaders across the country.”

And in July 2004, in an action fraught with deep symbolism, Bush rebuffed the NAACP’s invitation to speak before its national convention, becoming the first sitting president since Warren G. Harding to refuse to address the convention. Bush chose instead to speak before the National Urban League convention – apparently oblivious to the fact that many members of the one organization also belong to the other.

The chafing was perhaps symbolized in August 2004, when Nadia Naffe, a black former field director for the Republican Party, filed a federal lawsuit accusing the Florida GOP of racial discrimination, saying she was fired after complaining about being “race-matched,” or assigned to work only with black organizations.

Naffe's lawsuit alleges she was threatened by Republican Party officials and subjected to stereotypical comments by the staff.

“It seems like the Republican Party is in a continuous search for those elusive black voters,” said David A. Bositis, a senior research associate at the Joint Center for Political and Economic Studies, a think tank concentrating on African American and minority issues.

“The party of Lincoln? I don't think so,” Bositis told me in October 2004. “The Republican party is now the party of Jefferson Davis.”

All of which begins to explain the genesis of “Simple Sambo.” Such a hateful, ugly label doesn’t emerge in a vacuum. It speaks to an automatic antipathy that’s as unproductive as any directed at African Americans.

At least one black conservative said as much. “I don’t quite understand why Michael Steele has been targeted for this kind of hatred,” said Garland Williamson, president of a black business organization. “Anybody can disagree with Michael Steele or anybody else they want to disagree with, but let’s talk about the issues,” Williamson told The Associated Press in a Nov. 3 story.

But another black conservative, while not agreeing with the slander, at least appreciated its historical foundation. Ron Walters, author of a book on conservative public policy in black America, said black Republicans are often “perceived to be tools of the conservative white power structure.”

“Terms like Uncle Tom, sellout, Stepin Fetchit — those terms have not come from nowhere. They have a history,” Walters told the Associated Press. “It is deserved, to the degree that they support anti-racial policies.”

It hasn’t gone unnoticed by the GOP leadership and others under its banner. In October 2004, Marc Racicot, the former chairman of the Republican National Committee, told me of the GOP mission: to increase outreach to black voters. He pointed to the diversity of the Bush Cabinet.

“Have we made progress? We’ve made significant progress," Racicot said. "This president has the most diverse Cabinet in the history of America and has relied on the competence of a talented group of African Americans in that Cabinet – from [Secretary of State] Colin Powell to [HUD secretary] Alphonso Jackson to [then-National Security Adviser] Condoleezza Rice, and others.”

“Those are the kind of demonstrations that mean something to people in the African American community.”

Which begs the question, then and now, more than a year later, how such opposing philosophies can exist in one party – the diversity of the Cabinet played out against policies that call into question the value of having such a diverse group of advisers in the first place. It’s this dichotomy that confuses black voters, and suggests a fundamental insincerity that’s hard to overcome.

Armstrong Williams, the conservative black commentator, cut to the chase in January 2003: “The Republican Party has to realize that it cannot be lily-white any longer,” he said. “Change must come about, and it must start within our house.”

No question, the “Simple Sambo” slight was a nasty slap in the face of a brother hoping to advance his own political star and, by extension, elevate the electoral profile of black America in a way that breaks with the Democratic party’s longstanding reflexive assumptions of black people as being in their camp, no matter what.

But that slight didn’t come out of nowhere. It’s also a manifestation of the frustration that African Americans more broadly feel because of a party that’s seemingly complacent about addressing the needs and concerns of black Americans – concerns symbolized most recently by the piss-poor response of the federal government to the plight of mostly-black victims of Hurricane Katrina [see “American Tsunami I-V”].

Michael Steele’s upcoming campaign may well change his own political destiny, but black people are hoping to change their own destinies – to shape their own destinies – and the GOP needs to step up to the plate and speak to their social, economic and political concerns in a way that doesn’t look like so much photo-op window dressing. Speaking to the complexities of those concerns may be the paramount challenge for the Republicans, in both the midterm elections next year and the presidential vote in 2008.

It’s as simple as that.
-----
Image credit: Steele: U.S. Navy (public domain). Bottom image from the 1899 version of the children's book 'Little Black Sambo.'

Friday, October 28, 2005

The first shoe drops

With stunning speed after a seemingly endless 22-month investigation, U.S. Attorney Patrick Fitzgerald today finally put someone’s ass in his briefcase in connection with the CIA leak inquiry [see “Waiting for Fitzgerald”]. The first shoe has dropped and it landed squarely on Irving Lewis “Scooter” Libby, the chief of staff for Vice President Cheney, who was indicted by a federal grand jury, on Fitzgerald's recommendation, on five counts of perjury, false statements and obstruction of justice.

Libby, lately glimpsed coming in and out of the White House on crutches, resigned his position in the White House, surrendering his access pass and, presumably, all the perks and privileges connected to his high-profile, maximum security clearance position in the Bush administration. The man known as “Dick Cheney’s Dick Cheney,” the first White House staffer indicted while in office in 130 years, is looking down the barrel of a possible maximum sentence of 30 years in the slammer and $1.25 million in fines. Scooter was last seen scooting his Yale-educated ass out of the White House, probably for the last time without a visitor’s pass.

President Bush, the gravity of the situation evident on his face, appeared before the cameras to offer a terse, six- or seven-sentence statement he might as well have phoned in: We’re sorry to see Scooter go; like everyone in America, he’s entitled to a presumption of innocence until proven guilty; we’ve got a job to do on behalf of the American people and we’re going back to work. He turned and practically fled from the microphones and headed for Marine One for his weekend trip to Camp David, where he’s no doubt hunkering down with aides to try and pull a distracting rabbit out of his hat – most likely the name of a new nominee for the Supreme Court, to be released to great fanfare on Monday.

What followed on the television talk shows and news programs was one of the most immediate and orchestrated exercises in damage control this White House has ever undertaken, and there have been plenty. Right-wing apologists of every stripe emerged to defend the administration, some of them with the nerve to say that, paraphrasing now, “things could have been worse; all in all, this is actually a great day for the administration.” The thinking seemed to be that, with Libby out of the way, things could get back to something approaching normal for the Bushies.

MSNBC analyst and former Republican apparatchik Pat Buchanan called the president’s statement on the White House lawn “brilliant and savvy.” It was neither. In Bush’s brief comments there was nothing more or less than boilerplate butt-covering – hardly anything akin to brilliance, not so much savvy as politically instinctive CYA.

Questions remain. At a 66-minute news conference announcing the indictment, Fitzgerald repeatedly made it clear that the investigation continues and that more indictments are possible. He didn’t mention administration architect Karl Rove by name, but he spoke of a shadowy “Official A” who may be the source of the leak, and possibly the linchpin to the outcome of this sordid mess. Could it be Rove? Could it even be Cheney?

There's much to compel that suspicion. Newsweek reporters Howard Fineman and Richard Wolffe, in the magazine's newest issue, offer a plausible connecting of the dots: "Fitzgerald will inevitably have to shine a light on the machinery that sold the Iraq war and that sought to discredit critics of it, particularly Joseph Wilson. And that, in turn, could lead to Cheney and to the Cheney-run effort to make Iraq the central battleground in the war on terror."

Thus are second presidential terms frozen in place. It damn near happened to Reagan; it did happen to Nixon and Clinton -- that slow unraveling of noble intentions and grand agendas, made the victim of hubris in high places.

Some people in Washington must surely know this, or at least sense it. There’s a strong probability, for example, that junior White House staffers are working this weekend at 1600 Pennsylvania Avenue, though not so much working as pursuing their own form of damage control: updating their resumes to remove all mention of employment there.

Next week should be entertaining, if not instructive. The wheels of Fitzgerald’s investigation are grinding forward, with no telling where all this will end up. This bit of political theater is revealing the seams and patches of the Bush administration mindset more tellingly than any of the Bushies’ official policy statements. The Libby indictment, along with the Harriet Miers debacle and the long-running agony of the Iraq war, show the Bush White House in the position of reacting to events, rather than directing them. The flight to Camp David will lead, mark our words, to a transparent attempt to regain the high ground of public attention next week.

Regardless of those attempts, the historical parallels between this administration and another presidency are inescapable and have been for some time. A new parallel is lately emerging.

It’s been obvious for some time that Iraq is George Bush’s Vietnam; what remains to be seen is whether or not the president and his associates can prevent this current domestic meltdown from becoming George Bush's Watergate.

The betting window is open.

Thursday, October 27, 2005

The 'pit bull' in the kennel

George Bush’s annus horribilis just keeps rolling along. An announcement today reveals just how high the burn rate really is on that “political capital” he claimed to have inherited two days after the 2004 election. With White House counsel Harriet Miers’ voluntary withdrawal from consideration to fill the pending vacancy of Associate Justice Sandra Day O’Connor, President Bush has experienced the latest in a series of body blows to whatever vestige of Leadership & Prestige he has left. In the world of bad-luck presidential poker, George Bush keeps doubling down – by accident.

It’s anyone’s guess if the president’s nomination of Miers was meant as a real, honest validation of his estimation of her as a litigator and a champion of the law, or as a flattering but calculated maneuver meant to buy him time to build momentum for naming Attorney General Alberto Gonzales to the high court, which is what he’s thought to have wanted all along. Whatever the real reason, it has backfired in a way that underscores the image of an administration in a free fall – a government waiting on what may or may not be the next blow to the head from some guy named Fitzgerald.

You’re tempted to blame his handlers and advisers, but something about the nomination of Miers seemed, from the beginning, so stunningly off the wall that in this instance the president was quite clearly taking no one’s bad advice but his own. It was always hard to imagine Harriet Miers as an especially brilliant mind, with anything close to withering erudition, anything faintly resembling the rapier insights of Holmes, Brandeis, Frankfurter and Warren.

There were, in the early going, some people (me included) who were willing to give her the benefit of the doubt on the grounds of bringing to the high judiciary a lived-in life, with experiences and viewpoints off the beaten track; we knew she was a lawyer, one who worked capably and efficiently in the pressure cooker of the post-Sept. 11 White House, and she was in some personal respects refreshingly out of the mold [in both spatial and fungal contexts] of those who ascend to the closest thing we have to a throne in America.

Then the Senate Judiciary Committee began the process of drilling down into the nominee’s substance – as much for her positions on topics of the day as for a sense of her grasp of constitutional law. Sen. Charles Schumer of New York, a mensch on the Hill if there’s ever been one, gave her more than one benefit of the doubt, saying at first that she could be the consensus candidate they were dreaming of. Schumer later met with her, made nice … and kept digging, and he didn’t have to dig terribly far before coming up on dry wells of constitutional scholarship where there should have been gushers.

Others in the Senate said much the same. A groundswell of bipartisan opposition emerged, rude and smug, strident and talk-radio mean. You half expected a crowd with torches and pitchforks to roll the tumbrels up onto the White House lawn in a hunt for her office, her staff and herself.

Harriet Miers had the inescapable bad luck to be the juggling act to follow a performance by the judicial equivalent of Lord Olivier. The nimble, polished presentation of John G. Roberts Jr. at his confirmation hearings set an incredibly high bar, and anyone following his act in front of the senators would have paled by comparison.

But there were shortcomings. The senators on the committee knew it. Harriet Miers probably knew it. The press knew it (or they said they knew it, but of course they say they know everything). Maybe even the president knew it (he wouldn’t say he knew it, but frankly sometimes we wonder if he knows anything).

The “pit bull in size 6 shoes” who stormed out of the gate with such promise – what, two weeks ago? – is back at her desk in the White House kennel again, one of many people lately dispatched to the doghouse, and not likely to be the last.

Wednesday, October 26, 2005

Waiting for Fitzgerald

All apologies to the late great Samuel Beckett, but one can't resist appropriating the title of his most famous play and tweaking it a little, in honor of the most intense, excruciating waiting game Washington has seen in at least five years -- one that's likely to have a big impact on the Bush administration's domestic agenda for the next three years.

The nation's capital waits -- in what MSNBC's Chris Matthews turgidly called "a tangy meringue of the maudlin and the giddy" -- on the indict/no-indict decision by U.S. Attorney Patrick J. Fitzgerald, the special counsel appointed more than two years ago to investigate the possible complicity of certain figures in the Bush White House in the outing of CIA agent Valerie Plame's identity, as an act of political retribution against her husband, former ambassador Joseph Wilson, for his failure to toe the mark on the White House party-line policy of Iraq as nuclear-capable boogeyman.

The administration is in the unlikely position of being at the mercy of an outside force; long used to being the catalyst of events, the Bushies are compelled to wait and see what transpires. In the balance: possible indictments for perjury and obstruction of justice for Karl Rove, White House deputy chief of staff and by all estimations one of two powers behind the throne; and I. Lewis "Scooter" Libby, chief of staff for Vice President Cheney, the shadow president of the United States, who walks the corridors of power with the smirk of a man giddy with the realization of being the one who runs the country without getting the heat of the man elected to run the country.

The prospect of losing the architect of the Bush presidency and another high-ranking official a heartbeat away from the man an irregular heartbeat away from the presidency has Beltway Republicans in a quiet panic. Some have already retreated into defensive postures that owe more to instinct than to intellect. Sen. Kay Bailey Hutchison of Texas, herself once under an ethical cloud, has gone on the record saying that the possible perjury charges amount to a "technicality." Bow-tied right-wing apologist Tucker Carlson said as much to Matthews on the same show last night.

These reflexive political crouches beg the question of why perjury was an impeachable offense for President Bill Clinton, during the Monica Lewinsky scandal, but is now no more than a minor flaw in the judicial process and one to be overlooked this time. What can you say? It's automatic, about as automatic and scripted as the probable response from the White House if indictments do come down (possibly tomorrow). The Los Angeles Times Web site has a story up by the excellent Doyle McManus reporting the likely strategy if the shithammer comes down: try to do the backstroke in the toilet bowl, work to serenely rise above the situation and come up with a suitable distraction.

As if they haven't adopted that approach in the past. Like yesterday, when the 2,000th known U.S. military fatality was reported. President Bush, in as close to a pre-emptive strategy as he's ever ventured, spoke before an audience of military spouses at Bolling Air Force Base in Washington, and admitted the agony of the nation's mounting losses affected him as well -- all the while insisting that America stay the course in Iraq.

These strategies may fail to work if Fitzgerald finds a true bill in this case. If he brings indictments against Rove, Libby or possibly even Cheney for being the source of the leak of Valerie Plame's identity, the mendacity of this administration will be laid bare on a global scale. The very underpinnings of the rationale for pre-emptive war will be questioned again, but with an urgency and a foundational skepticism from the national judiciary not seen before. And indictments will sure as hell embolden the already growing number of Americans against the war in Iraq, giving weight to their longstanding belief that the war on terrorism as prosecuted in Iraq was a fiction from start to finish.

Even if Fitzgerald doesn't indict them, or anyone else, there's a sentiment out there that suggests Fitzgerald -- by all indications something of a Boy Scout, a single, single-minded Irish-immigrant's son from Brooklyn utterly dedicated to the case at hand, whatever the case is -- may feel compelled to release some kind of statement of progress, a disclosure of what he did and didn't find, if for no other reason than to justify his expenditure of the taxpayers' money for two years.

Unlike the situation in Beckett's play, in which the two characters wait for someone who never comes, at least two of the protagonists in this little drama will realize a very real finality, one way or the other. Will Vladimir Libby and Estragon Rove be the subjects of target letters? Let's wait and see.

Tuesday, October 25, 2005

2,000 points of light

The news came across the wires at 12:07 p.m. West Coast time: The U.S. armed forces fatality count reached 2,000 today. Army Staff Sgt. George T. Alexander Jr., 34, of Killeen, Texas, died over the weekend in San Antonio. Alexander, assigned to the 1st Battalion, 15th Infantry Regiment, 3rd Brigade, 3rd Infantry Division at Fort Benning, Ga., was wounded by a roadside bomb on Oct. 17 in Samarra, 60 miles north of Baghdad.

The 2,000th-death event was expected for some time; newspapers and Web sites were for weeks preparing special sections to make note of that grim, presumably inevitable numerical signpost. Editors and reporters probably had the phrase "grim milestone" coded into macro keys on their computers; so many of them used those words for previous somber war-related anniversaries -- like when 100 troops died, and when 500 died, and when 1,000 died.

Today the chief spokesman for the American-led coalition, Army Lt. Col. Steve Boylan, asked reporters covering the conflict not to read too much into a single number, actually having the nerve to describe the number as an “artificial mark.”

“The 2,000 service members killed in Iraq supporting Operation Iraqi Freedom is not a milestone," Boylan said presumably with a straight face, given the gravity of the circumstances. "It is an artificial mark on the wall set by individuals or groups with specific agendas and ulterior motives."

Boylan's half-right. It's an artificial mark not unlike the convenient numerical benchmarks the press relishes as a way to make their lives easier. Another one the press is fond of has happened with every administration of the past thirty years: the 100-day "report card" that's such a bane of our existence it begs the question of why we even bother to do it any more.

But in another way Boylan, like so many others seeking to legitimize an illegitimate conflict, misses the point. To call it artificial is to minimize the impact, individually and collectively, on the people involved. In our society we use numbers as an index to our joy and our pain, our triumphs and our sadness. That's how we keep score ... of everything that matters. There's nothing artificial about reaching the level of two thousand Americans killed in the prosecution of an unnecessary war.

“The 2,000th Soldier, Sailor, Airman, or Marine that is killed in action is just as important as the first that died and will be just as important as the last to die in this war against terrorism and to ensure freedom for a people who have not known freedom in over two generations,” Boylan e-wrote to reporters with an eloquence that would be profound if it weren't so self-serving.

The statement overlooks the fact that if the first soldier to die wasn't sent to die -- wasn't dispatched to perform a politician's errand -- the 1,999 to follow wouldn't have had to die either. Those deaths, either the first or the most recent, would be somewhat easier to take if the mission that Boylan parrots -- "to ensure freedom for a people who have not known freedom in over two generations" -- was the real reason those soldiers, sailors, airmen and Marines are over there in the first place.

And it's not. And the continuing deception practiced by the Bush administration -- its very own "specific agenda," pursued for the administration's own "ulterior motive" -- only makes our great national agony that much worse.
-----
Photo credit: Department of Defense

Monday, October 24, 2005

Rosa Parks (1913-2005)

It was on a day in December 1955 when a black seamstress in Alabama got uppity in the Deep South, boarded a city bus and, by taking a seat in the wrong place, took a stand for the right thing.

Rosa Louise McCauley Parks died today at her home in Detroit, at the age of 92. In the rapidly passing parade of events -- a relentless cascade of tragedy and folly that makes it hard to keep track of what happened fifty hours ago, let alone fifty years -- Rosa Parks' statement endures, resonates in ways that many Americans, and just as certainly many black Americans, may have forgotten.

They never lived a life in which they had no choice about where to sit on a city bus or a commuter train; they've never had to contend with ridiculous distinctions made between one water fountain and another, or one bathroom and another, or one lunch counter seat and another.

Those distinctions began to be erased with Parks' stand on principle in December 1955. From that action, and her arrest afterward, the black residents of Montgomery, Ala., began a boycott that underscored the economic power of African Americans living under siege. For 381 days, blacks boycotted Montgomery buses, in an action spurred on by a relatively unknown minister named Martin Luther King.

With the landmark Brown v. Board of Education ruling of the year before her sit-down statement, and the bus boycott that lasted more than a year after, the groundwork was laid for what became the modern civil rights movement, a concatenation of events, legislation and acts of personal courage that rings in the nation's ears today, whether or not the nation really wants to hear.

What more to say? Thank you, Sister Rosa; go to your rest, your job well done.
-----
Image credit: Montgomery (Ala.) Sheriff's Department

Wednesday, October 19, 2005

Doc Rice & the contractors

Sometimes the fallacy in an argument, a policy, a world-view is revealed in the smallest, slightest way. Walls of rationale and well-thought-out positions fall apart with a single word, one word that points to how weak the whole structure is.

Dr. Condoleezza Rice, the secretary of state, was testifying on Capitol Hill today, offering senators a blueprint for the foreseeable future of Iraq, flush with the apparent success of the Oct. 15 referendum on a draft constitution, despite vote certification irregularities that at this writing are yet to be fully identified -- is this the Middle Eastern equivalent of the hanging chad controversy?

In the course of her appearance -- customarily categorical, abrasive and dismissive, sometimes in the same sentence -- Doc Rice responded to a question from Massachusetts senator John Kerry. Madame Secretary spoke of the future of Iraq and how Iraqis' assumption of their own affairs, in the embrace of a Western-style democracy, would constitute "victory in this war."

Victory. The word summons every outmoded, antediluvian image you can think of. There's a 17th- or 18th- or 19th-century feel to its usage in this context, a subscribing to a polar, binary, us-vs.-them view of the world that properly ended with the dissolution of the Soviet Union. Doc Rice's use of the word, maybe more than any other, illustrates the anachronistic thinking behind much of the planning and execution of the war in Iraq.

The present conflict is fundamentally at odds with traditional conceptions of warfare, the comfortable framing devices of past conflicts that defined success in terms of geography, loot, empire in the most martially atavistic terms. Doc Rice's embrace of the word as a level of achievement suggests she's overlooked the ways that ideology, religion and faith have become the new yardsticks of success and failure. The war in Iraq is not a turf battle; this is hearts and minds writ large, and the danger is in this country's failure to see that indelible message, and to see how we ignored or overlooked that message before.

Doc Rice's simplification is one that the architects of the current conflict continue to embrace. On PBS' "News Hour With Jim Lehrer," one of the prevailing thinkers on the right gave a glimpse into the way the conservatives have distilled the contours of this conflict into something innocuous, and not a little elitist.

Walter Russell Mead, senior fellow at the Council on Foreign Relations, defended Doc Rice against the barrage of questions on the timetable of the war, and explained, or tried to, the futility of coming up with a timetable for getting out. "Fighting a war is a little like having a contractor come in and redo your kitchen," Mead said. "You want to know the deadline and you want to know the cost and you want everything to be done on time, and if things go over budget, you're very irate."

In Mead's "contractor mentality" cosmology, war is a willfully improvisational exercise in which armies are "testing their strength and trying strategies and looking for weak points in the opposition." (Much like you look for weak spots when you're trying to shore up your home's foundation.)

Mead's is a perfectly plausible, articulately expressed, readily accessible argument. It is also complete and unmitigated bullshit.

The basic wrong-headedness of his position stems from making an assumption based on his apparent past experiences with unscrupulous contractors. Despite what Mead believes, there's not a contractor worth his Better Business Bureau recommendation that doesn't really know what it takes to complete a job. It's not that the contractor doesn't know what the job will cost or how long it'll take -- it's just that he won't share that knowledge with you.

This begins to explain the dancing Doc Rice got into when asked, point-blank by Maryland Sen. Paul Sarbanes, for something resembling a timetable for extraction from the land of George's miseries. Will we start to be out in five years? In ten?

"We are moving on a course in which Iraqi security forces are rather rapidly able to take care of their own security concerns," she said. "...And as they are able to do certain tasks, as they are able to hold their own territory, they will not need us to do that."

The sharp reader will of course notice Doc Rice's use of the word -- that single undermining word again -- rapidly. That word, by its very nature, implies an action in some chronological framework, one in apposition to another, longer chronological framework. You don't think of things happening rapidly in a theoretical vacuum. It's always "rapidly" in relation to "slowly." The adverb preceding "rapidly," "rather," makes the sense of a prospective timetable even more obvious.

But clearly, this is a timetable that head contractor Doc Rice would prefer to keep to herself. It takes as long as it takes; it costs what it costs.

For Mead, the idea of a defined timetable for exit from Iraq isn't possible because of the precarious, unpredictable nature of warfare. "It's not one of those simple controllable processes," he said on "News Hour." Mead overlooks just how controllable a situation can be when you create it in the first place. His insistence that a timetable for gradual withdrawal is neither prudent nor possible runs up against the way the United States got into the war: by slow degrees, through incremental deceptions that happened so slowly but methodically it was hard to discern them as a building wave, until the wave was upon us.

Over more than a year the United States accrued the intelligence, materiel and popular support for beginning a war, then slowly developed plans for attack and invasion, then slowly executed those plans. The documentable metrics for starting military action were arrived at over a period of time; how can developing a documentable plan for ending that military action be unsound or difficult?

Like we said, it's not that the contractor doesn't know what the job will cost or how long it'll take -- it's just that he won't share that knowledge with you.

Zbigniew Brzezinski, the national security adviser under President Carter, understands what's at stake. On the same "News Hour" segment, Brzezinski made the case for full disclosure. "We have to take a critical look at the overall costs of the war, for America's legitimacy in the world, for our moral standing and, indeed, even for our resources, both military and economic."

Doc Rice and the contractors have the high ground right now, and they can charge whatever they want. But for the American people, the bill comes due too frequently. It shows up somewhere in America every day, a bill that doesn't come in the mail, a bill that shows up on the front porch, in uniform, with somber expressions that say, without a word being spoken, that we regret to inform you ...

It's a bill the nation won't be willing to pay forever.

Subject to change

All props to the Swedish Academy and the Norwegian Nobel Committee, the august and righteous bodies of intellectuals responsible for selecting the Nobel laureates, and among the few groups on the planet apparently able to keep a secret in the Internet age.

Within the last ten days, the Academy has made two inspired laureate selections that, when revealed, had the appearance of nothing less than magic (def. the art of elegant misdirection). Speculation over the likely winner of the Nobel Peace Prize kept the people at Ladbrokes and the Las Vegas Sports Book busy for days. A multitude of names were tossed around, including some of the academy's customarily compelling choices from the developing world -- as well as a few zingers, for the sake of sexing up an awards ritual quite long in the tooth.

For their personal efforts to address some of the world's enduring problems -- debt relief, famine relief, affordable AIDS therapies -- Live Aid architect Bob Geldof and Bono of U No Who 2 were nominated. Theirs were long-shot chances; Bono graciously admitted as much the night before the announcement, telling Conan O'Brien it was just an honor to have been nominated.

The winner of the Nobel, a man who was said to be on the short list but somehow got lost in the wash of current events, hit like a clap of thunder: The director of the International Atomic Energy Agency, Mohamed ElBaradei, shared the $1.3 million prize with the agency that tried to act as an honest, reasonably non-ideological broker between the Middle East and a fractious Washington. ElBaradei's award was widely perceived as a global/philanthropic two-by-four to the head of the administration, which had done battle with ElBaradei in the past, as recently as the summer, when the United States opposed ElBaradei's reappointment as IAEA director. The choice was, well, inspired. And there was more.

Harold Pinter, long acknowledged as the lion of British theater for the second half of the twentieth century, and lately an avowed opponent of the U.S.-led initiative in Iraq, won the Nobel Prize for Literature. The Academy hailed Pinter as a playwright "who in his plays uncovers the precipice under everyday prattle and forces entry into oppression's closed rooms."

A kind of prolonged howling was observed coming from the West Wing that evening, followed in short order by the customary conservative outrage. It was the second pointed rebuff-by-proxy of the American misadventure in the Middle East.

And what might be seen in isolation as the rogue reflex of intellectuals in one of the world's enduring pacifist nations is actually something wider. When culture is pushed, sooner or later, culture pushes back. It's happening now, building on previous successes, and taking advantage of a slow groundswell of opposition to the war in Iraq. And not just at the investiture-and-morning-coat level of the Nobel Prize. It's happening, again, at the multiplex near you.

With "Good Night, and Good Luck," George Clooney's masterful glimpse at one high point in the clash between Edward R. Murrow and Sen. Joe McCarthy, the reigning device of popular culture managed to tell a contemporary American story through the images of an older one, the demagoguery of an earlier uncertain age suddenly a mirror on the slicker, more camera-ready demagogues of the present day. "We will not walk in fear," Murrow says. "We cannot defend freedom abroad by deserting it at home." The linkage of then and now couldn't be more real, more pointed, more indicative of a tide beginning to turn.

Then there's "Jarhead," Sam Mendes' treatment of Anthony Swofford's chronicle of a Marine's wrenching, emotionally expensive transition from raw boot-camp recruit to sniper in the first Gulf War. And on Dec. 9, we'll have "Syriana," Clooney's next film, a political thriller that plumbs the interplay of rapacious U.S. oil companies and the disillusioned of the Middle East, who find solace and meaning in pursuing violent work against the West. The film, written and directed by Stephen Gaghan ("Traffic"), seems likely to be a project to exercise Clooney's maverick streak for candor, and opposition to policies and practices the administration holds dear.

So what's different? "Fahrenheit 9/11," released in mid-2004, couldn't have the benefit of hindsight. For all its impressive splash into the culture (it was the first documentary to gross more than $100 million at the box office), Michael Moore's masterpiece could only take us so far into events that were still occurring as the film went into post-production. Now there's a sense, broadly supported by numerous opinion polls, that the populist underpinning of antiwar sentiment is far broader than it was before, in the angrily heady months after Sept. 11.

What's different? More and more Americans are against the war in Iraq, and unlike before, they're increasingly willing to say so. And since culture, high and low, is the basis for so much of our everyday identification -- the fabric of the national conversation -- that antiwar sentiment takes on a life and a resonance it didn't have before.

Our culture, like our society, is always subject to change; the slow perforation of a tissue of lies is underway, and you get to watch it with popcorn in your lap.

When culture's pushed, sooner or later, culture pushes back.
-----
Image credits: ElBaradei: IAEA. Jarhead poster: Universal Studios.

Monday, October 17, 2005

American Tsunami V

Like we didn't know already: Today, courtesy of memos obtained by the Associated Press, we get a fuller picture of the ineptitude of the Federal Emergency Mismanagement Agency -- FEMA -- as the magnet for taxpayer dollars struggled to prepare for the arrival of Hurricane Katrina. The memos showed an agency woefully out of touch with the gravity of the situation, an organization that failed to get it before the storm made landfall and didn't fully understand things days after the storm devastated the Gulf Coast.

Among other tragic missteps: FEMA could not find food, ice, water and even the necessary body bags in the days after Katrina hit. The agency charged with stockpiling such basic emergency provisions in advance apparently couldn't track them down when the real deal arrived.

The AP story recounts running e-mail conversations, policy discussions, bureaucratic infighting and concerns about how the agency would be perceived in the press -- the usual shortcomings of a government agency increased by orders of magnitude by the greatest natural disaster to hit the United States since the Galveston hurricane of 1900.

Of particular note are the shortcomings of Michael Brown, then director of the agency, a man who, on the weight of available evidence, could not find his ass in the Category 3 windstorm that ultimately cost him his job. Five days after the storm hit on Aug. 29, Brown sent an e-mail to an aide saying there had been “no action from us” to evacuate storm victims using planes that airlines had provided.

“This is flat wrong. We have been flying planes all afternoon and evening,” said a subordinate, Michael Lowder, in an e-mail sent half an hour later. The question is an obvious one: How can the director of an agency not know whether planes are in the air or not?

AP: "A day earlier, a FEMA official in Mississippi received an e-mail asking for Brown’s satellite phone number so a senior Pentagon official in the Gulf Coast could call him. 'Not here in MS (Mississippi). Is in LA (Louisiana) as far as I know,' FEMA official William Carwile e-mailed back, seemingly uncertain on the whereabouts of the administration’s point man for responding to the disaster."

The late Dr. Laurence J. Peter was the author of "The Peter Principle," a celebrated bible for business management before bibles for business management became the rage. One of the book's core principles is instructive when considering the FEMA debacle: [paraphrasing now] Ineptitude -- what Dr. Peter describes as "occupational incompetence" -- rises to its own level of authority in a given organization.

The curious rise and utterly predictable fall of Michael Brown reveals again how the trajectory of a disaster is something discernible from a long way off -- sometimes, even often, from before the disaster even takes place. FEMA's lack of imagination combined with a lack of resources and a lack of willingness to step outside the usual boxes of procedure and routine resulted in a perfect storm within a storm: the hurricane of bureaucratic chaos that doomed hundreds of people as surely as the waters themselves.

Saturday, October 15, 2005

iWorld

With his customary fanfare and flair for the theatrical, Apple Computer Chief Executive Steve Jobs on Thursday announced the release of the video-enabled iPod, the latest tweak on the music recall device everyone seems to have today. With Jobs' latest bold move, the entertainment industry is again on notice that the relationship between producer and consumer is changing more quickly than ever; and that, regardless of market share and gross capitalization relative to other corporations in the space, Apple continues to enjoy an almost mythic high ground among devotees of the cutting edge.

As with other Apple products, the introduction of the video iPod will likely terrify the business world of personal entertainment technology. The rollout ads and at least some of Jobs' presentation were impressive. One ad for the new iPod showcases a video starring U2 (fast becoming Apple's house band); the camera pulls back slightly to frame up the video not in the spatial context of an actual performance but as an image on the screen of the New Device. And just that fast, entertainment changes.

Microsoft, Apple's eternal bete noire, was late to the party, having introduced mobile video in 2004, about two years after the introduction of the iPod. In a story in yesterday's Seattle Post-Intelligencer, analysts told reporter Todd Bishop that sales of Microsoft's Portable Media Center "have been sluggish and consumer awareness is low."

Bishop asked a tech-savvy lawyer what he thought of the Microsoft product. "Never heard of them," the lawyer said. But he was hot to maybe buy the new iPod.

This begins to explain the differences one encounters when comparing Microsoft and Apple; both are visionary companies with enviable track records and reliable products; both are immediately recognized around the world; both are adapting technology to the human experience in compelling ways. But in the look & feel world of marketing, in the way the public experiences the product, Apple comes out on top.

Why? Let's call it the JNSQ Factor. Those four letters are from the words in the French phrase "je ne sais quoi," which more or less literally translates to "I know not what," an expression used to describe the ineffable, ephemeral, something quality that makes a person, an idea, a performance, a device stand apart from, and usually above, the competition.

Apple has always held down that lofty perch. From the beginning, when the Macintosh was introduced, Apple set the pace for weaving art and computer technology in a way Microsoft couldn't hope to achieve. One reason for this is very much rooted in what each company is and what each company sells.

Apple's stock in trade is the development of the computer and tangible devices -- objects that, like all objects, summon a personal identification with the owner, a sense of individuality, a personality. Apple's core products reinforce our proprietary instincts, our embrace of things -- things we can hold and touch and manipulate and feel. Microsoft's business is software, arrays of code and markup language, ones and zeroes designed to improve efficiency, enhance productivity, streamline operations -- all utterly necessary in the modern world, but in the end not seen as being as personal, as individual, as downright cuddly as that work of art sitting on your desktop, blinking every so often to remind you just how cool it looks.

JNSQ for sho. It's that extra element in an Apple product rollout that a Microsoft rollout doesn't seem to have. That edge. That mystery. That cool, elegant visual vocabulary that appears to owe as much to Zen-monastery simplicity as to the technology that embodies that simplicity. Want proof? Walk through an Apple retail store, like the one hard by the campus of the University of Washington. Watch people handling the new products. They play with them like at any consumer electronics store, of course. But there's often an extra lingering glance, the sweep of a hand across the monitor that's less the act of a consumer than it is the act of a suitor, one truly desirous of the object of his or her affections.

Sorry -- you don't feel that way about software, no matter how enabling it is. It's hard to fall in love with tags and code.

Not that Microsoft isn't subject to people's emotional reactions. One problem for MSFT is the still-lingering perception of the company as a technological bully, a perception arising from the landmark civil trial in which the U.S. government sued the company for stifling competition, and one that persists despite a settlement in the case approved in 2002. With baggage like that trailing and preceding you, as well as a longstanding reputation for playing hardball with everyone from competitors to your own contract workers, it's hard for the public to embrace you, even if your product is in every facet of their daily lives. It all comes down to perception.

Apple has always played up its reputation as the mouse that roared; Microsoft has always positioned itself as an unstoppable force. Little guy vs. big guy. David vs. Goliath. And the public has voted, about 30 million times at last count, for the iPod from the little guy, while the big guy's Portable Media Center goes wanting. Reason? It's partly, or even mostly, human nature: People respect Goliath but their hearts, and regions south of their hearts, are with the underdog.

Apple's made missteps too -- can you say "Lisa"? Ironically enough, the video iPod Jobs rolled out this week was the result of a change of mind; as recently as last year, Bishop reported Friday, Jobs had come to the conclusion "that video may be the wrong direction to go." Jobs presumably got religion when he saw the market-share bite that Sony's PlayStation enjoys, and considered the prospect of a groundbreaking distribution deal with Disney (and by extension the best programming on ABC).

And Apple innovates more frequently (almost too frequently). That's great for running rings around the competition, but it can be pleasantly maddening if you're a Macintosh devotee. Apple zealously tweaks and upgrades hardware and operating systems so often it can be hard to keep up. Case in point: I bought a new iMac computer with a 75GB hard drive, Bluetooth wireless mouse and 17-inch all-in-one display for about $1,900 in February. By July, if memory serves, Apple had dropped the price: had I waited, I could have bought a 20-inch iMac for the same price as my 17-inch five months earlier. Then they did it again: When Jobs announced the video iPod, he also took the wraps off the new iMac, which features a remote control for access to music and images from across the room, and a built-in web camera -- all for about what I paid, months earlier, for less.

That's the cost of keeping up. And the 30 million who ponied up for the iPod understand that, and they're willing to pay even more. They can't get enough. They're in the gym and at the coffee shops and in their cars and on the street, listening to whatever they want whenever they want, every one caught up in their own little world.

But not really. It's Steve Jobs' world, actually. We just boot it up.

Wednesday, October 12, 2005

White House of Pain

The escalating cost of war in lives and dollars comes home on the eve of the next stab at democracy in Iraq. With the Oct. 15 constitutional referendum dead ahead, the Pentagon announced Oct. 7 that the number of U.S. troops in Iraq has mushroomed to 152,000. A Pentagon official told the AP that troop strength probably will hold near that level at least through the election.

At almost the same time we get the money hit: A paper by the nonpartisan Congressional Research Service determined that the average monthly cost of waging war in Iraq has risen from below $5 billion to $6 billion. The conflict in Afghanistan costs another $1 billion a month. The research service concluded that war costs could total $570 billion by the end of 2010, assuming those 152,000 troops are gradually, eventually brought home.

This begins to explain the haggard, beaten look lately on the face of George Bush. The cocksure certainties of the Ivy League jokester who changed into a governor who morphed into a president in a flight suit have given way to the insecurities of a man under siege. The political capital he announced with a smirk after the election is gone, like the billions already poured into Iraq, like the global capital of goodwill expended at a white-hot burn rate over the last three years.

The last three years. You want proof of how the presidency, this presidency has aged George Bush? Compare pictures from Bush's Cocksure Period -- the State of the Union 2002; even from the summer of 2003, before the grim slide began -- to shots taken today. His hair was dramatically darker then, his face less drawn, more animated, the gait of his walk sturdier, more aggressive, like the stride of a good ol' boy spoiling for a parking-lot fight. In that first year in office, and even after the horrors of Sept. 11, George Bush nearly bounded off of Air Force One and Marine One, walking with a jauntiness that underscored a joy of those marvelous perks of the presidency, pointed to an embrace of the levers of the most formidable power on earth.

Fast forward to the Blue Period: Nowadays when Bush alights from Marine One, he looks like a man painfully going through the motions, returning the salute from the Marine at the foot of the chopper's staircase like he'd rather be doing anything else. There's an aspect of sourness to some of his more recent public appearances, what looks like a foundational disappointment ... or maybe it's the look of a man who realizes that, as bad as it's been so far, he's staring down the barrel of another three years of the same.

It's tempting to write this off as the obligatory second-term jinx common to two-term presidents: Roosevelt, Eisenhower, Johnson, Nixon, Reagan, Clinton. Presidential historian Michael Beschloss, talking to Don Imus on MSNBC this morning, called it "almost an eternal part of the modern presidency." But there are factors in play that are specific, things particular to this administration that give the president's current straits a singular status in the annals of second-term debacle.

A poll brought that home today. The latest NBC News/Wall Street Journal survey finds that Bush's approval rating has fallen below 40 percent, a new low, while the percentage of Americans who think the country is moving in the right direction has dropped below 30 percent. A plurality of respondents would prefer a Democratic-controlled Congress, and fewer than one in three -- just 29 percent -- think White House counsel Harriet Miers is qualified to serve on the Supreme Court.

What makes it especially damning for the Bushies is the fact that the poll was taken after a period of small triumphs. Bush delivered a prime-time speech from New Orleans, promising to rebuild the Gulf Coast after the damage caused by Hurricanes Katrina and Rita. He made no fewer than eight visits to the region, pressing the flesh, wielding the odd hammer, doing his best to build goodwill among the locals. And earlier he witnessed the Senate confirm walking Caucasian archetype John Roberts to the Supreme Court with relatively little fanfare or controversy.

It'd be easier to commiserate with a president going through a bad patch if so much of this agony weren't self-inflicted. The president and his crew have stubbornly resisted a rising tide of popular opposition to the war in Iraq. They've resisted learning the lessons of our peculiar American history and the grim Santayanaesque course of events that is playing itself out in Iraq: that ultimately, democracy cannot be imposed, even by proxy, and still be called democracy. They've turned deaf ears to the thousands of American families who have suffered losses in the Iraq engagement, and called on the United States to leave a place we should never have entered in the first place.

They have even disregarded the admissions of the generals charged with prosecuting that war, generals and officials who have said the primary objectives of the war are unattainable [see post "Great(ly lowered) expectations"]. And they will no doubt be ready to disregard or ignore the gravity of the report from the Congressional Research Service, continuing to exhaust our children's children's treasure in the name of a chimera.

John Lee Hooker got it right: "Serve you right to suffer ... "

With all due respect to whichever football stadium first acquired the nickname -- Lambeau Field? Applied Materials/Amgen/Unilever Park? -- there's a new house of pain in America, and its address is 1600 Pennsylvania Avenue. Its current occupants will be there until the lease runs out in January 2009, unless they decide to sublet the place in the meantime -- most likely to a vice president with a bad ticker, an implanted pacemaker and orders to stay away from microwaves.

We don't really want to go there.

Three more years! Three more years!