On the News – The Public Sphere
http://thepublicsphere.com | A Provocative Space of Critical Conversation

U.S. Exceptionalism and Opposition to Healthcare Reform
http://thepublicsphere.com/political-exceptionalism-opposition-healthcare-reform/ | Tue, 15 Dec 2009

By Luke Perry

Political discourse surrounding healthcare reform has included purposeful disruptions of Congressional town hall meetings, the brandishing of firearms at opposition rallies, and the use of Nazi imagery to depict President Obama. Why has opposition to healthcare reform been so contentious? Conventional responses from the political right typically focus on ideological differences, such as varying views on the appropriate role of government in society, or the perceived need to prioritize other issues, such as the economy. Conventional responses from the political left typically focus on the perceived entrenchment of private insurance companies or the unwillingness of Republicans to work in a bi-partisan fashion. Discussion of U.S. political culture is notably absent from efforts to understand opposition to healthcare reform. This essay will illuminate the ways in which the exceptionalism of U.S. political culture provides a context to better understand this opposition.

Exceptionalism is the idea that U.S. society, politics, and economics are unique and superior to those of other societies and peoples. U.S. political culture has a long history of exceptionalism dating back to colonial America. Puritan leaders, such as John Winthrop, viewed the Massachusetts Bay Colony as “a city on a hill with the eyes of the world upon them.” The Puritan goal was to create a model of Christian morality. Theocracy gave way to broadening conceptions of freedom, which eventually led to an irreparable breach with Great Britain. The Founders articulated their conceptions of freedom in universal language, focused on all of humanity rather than just citizens of the U.S.A. This was remarkable considering that this small group of colonies was breaking away from the most powerful empire in the world; success was far from likely. Thomas Jefferson began the Declaration by placing the American Revolution “in the course of human events” and explaining that when rebellions occur, reasons must be provided. The Founders justified the rebellion through dedication to certain natural rights premised on the notion that all “men” were created equal. Essentially, the one thing all human beings have in common is that we are not God, so all people, and all governments, must respect basic human rights. The Founders believed they were making a grand statement for all people whose government infringed on their natural rights, not just colonial Americans in 1776. The U.S. remains unique in having natural rights written into the country’s founding document, including the right to rebel if government infringes on these rights. To this day U.S. leaders regularly invoke the imagery of “a city on a hill” when speaking about the exceptional character of the U.S. experience.

A second way exceptionalism manifests itself is through U.S. foreign policy. The U.S. first embraced democracy promotion during World War I under Woodrow Wilson, who famously stated “the world must be made safe for democracy.” This quote is revealing because it highlights the belief that the world must be adapted to suit U.S. political beliefs and values, rather than the other way around. The U.S. emerged as a major superpower after World War II and as the world’s sole superpower after the Cold War. From a Western perspective, democracy’s major ideological rivals, fascism and communism, were severely discredited after the three major conflicts of the twentieth century. Exceptionalist elements of U.S. political culture now believe that the U.S.’s unique path to the top demonstrates that U.S.-style democracy and capitalism constitute the best of all types of social order. This is exemplified in President George W. Bush’s 2003 State of the Union address, in which he stated that “Americans are a free people who know that freedom is the right of every person and the future of every nation.” The exceptionalism of the U.S. tradition is now connected with the geo-political realities of U.S. military and economic power. The U.S. views itself as the model of democracy in an era of globalization where major powers have a profound impact on the world at large.

Exceptionalism provides a useful perspective through which to better understand the contemporary healthcare debate, given its historical prominence in U.S. development and culture. Senate Republicans, such as Senate Minority Leader Mitch McConnell, Orrin Hatch, Jim DeMint, and Richard Shelby, have argued that the U.S. has the best healthcare system in the world, as did George W. Bush and President Barack Obama’s rival in the 2008 election, John McCain. These arguments have created controversy and confusion. One of the few things that Republicans and Democrats agree on is that healthcare reform is needed. Major differences emerge over how to do this. How can the U.S. healthcare system simultaneously be the best in the world and be in need of reform? Conservatives inherently want to restrain the pace of change. One way to articulate and justify this political behavior is to laud the status quo, which, in this case, is the current healthcare system. One tactical way to do this is to hyperbolize the effectiveness of the current system, which particularly resonates with many U.S. citizens because of the role of exceptionalism in U.S. political culture. The inverse approach has been adopted as well: in addition to lauding the status quo, opponents have demonized the enemy, in this case Barack Obama. Prominent examples include Representative Joe Wilson’s unprecedented shout of “you lie” during a presidential address before Congress and popular conservative talk show host Rush Limbaugh comparing Obama to Hitler. “Going negative” and criticizing political rivals is not new. Importantly, however, these criticisms gain more traction and can be more outlandish when framed by the belief that U.S. healthcare is exceptional, so that whoever seeks to change the status quo threatens national well-being and deserves harsh criticism.

Public opinion is a second way to consider the impact of exceptionalism on the opposition to healthcare reform. Access to healthcare, a major concern of Democrats, does not resonate with broader U.S. culture to the same degree that it does within the Democratic party, even though Democrats received widespread support in the 2006 and 2008 elections. People in the U.S. predominately view poverty as the result of individual failures; this view contrasts with that of much of Europe, whose people predominately view poverty as the result of structural problems, such as lack of education or lack of opportunity. The U.S. view constrains reform efforts because people who are financially successful are considered exceptional and thus more deserving of healthcare coverage than financially challenged Americans, who are blamed both for being poor and for their inability to obtain or purchase healthcare coverage. These attitudes reflect a form of Social Darwinism. In the nineteenth century, Social Darwinists, such as Herbert Spencer and William Graham Sumner, justified economic inequality as a natural product of competition and used this belief to advocate limited government involvement in social activity; such attitudes linger in the exceptionalist sentiments of the U.S. public. Not surprisingly, the U.S. has the most limited welfare state in the West. In turn, people in the U.S. are divided over whether the federal government should make sure all U.S. citizens and legal residents have healthcare coverage, again in sharp contrast to European countries, all of which give government a greater role in healthcare to ensure access.

The divisions that now plague healthcare reform in the U.S. run much deeper than this moment. U.S. political culture is inherently resistant to political change that questions the exceptional nature of how people in the U.S. live or that seeks to build a more collective understanding of the public good. The U.S. has not decided whether it wants to remain committed to the welfare state, pursue a long-term process of deregulation and privatization, or continue shifting back and forth in a highly polarized fashion. Greater understanding of and appreciation for the cultural dynamics influencing this situation helps explain why opposition to healthcare reform has been so contentious. Conventional and scholarly examinations of opposition to healthcare reform would be well served by greater discussion of the role of U.S. political culture. The final bill, regardless of its specific form, will likely raise a new and important set of questions, the answers to which will determine whether a movement toward a more European-style welfare state is truly progressive or whether it moves the U.S. away from the exceptionalism that made the country what it is today. This will inevitably shape and be shaped by U.S. political culture, no matter how exceptional and enlightened we think we are.

I Am Indignant – These Are the People We Have to Look up to Now?
http://thepublicsphere.com/i-am-indignant-these-are-the-people-we-have-to-look-up-to-now/ | Tue, 15 Sep 2009

By Paloma Ramirez

In case you hadn’t heard, Michael Jackson, aka the “King of Pop,” passed away earlier this year.  Even though no one had really heard anything from him in a while and the last time he was in the media it was something to do with allegedly inappropriate relationships with kids, his death was kind of a big deal.  In fact, it was one of those events for which newsroom directors the world over fall to their knees and thank the media gods.  If you were anywhere near a television or computer or people talking, there was no escaping the momentous news of his unexpected passing.  For that entire weekend, it seemed as if nothing else of note had taken place anywhere in the world.

It exemplified the extent to which our culture has become irrationally obsessed with celebrity.  At the time, I couldn’t resist joking about how Michael Jackson’s death had brought world peace, simply because it created a media blackout of everything else.  Many people were disturbed by the level of attention Jackson received, especially when the Iranian government was violently repressing election protestors, over 70 people had just been killed in another bombing in Baghdad, the US government had just sent arms to aid the Somali government’s fight against Islamists, and, of course, the governor of South Carolina had just admitted to having an affair.  But in a way, it made sense to focus on the sudden permanent loss of a person whose fame will most likely never be equaled, a person whose death actually does signal the end of an era.

Unless you happen to be a member of one of those South American tribes who have managed to exist completely isolated from the modern world, you knew who Michael Jackson was.  That’s only slight hyperbole.  I remember, as a kid, seeing video footage of his concerts in Europe and Asia, even in Russia during the Cold War.  He had fans in Iran during the Revolution.  My own father, who deliberately ignores almost everything that could be considered pop culture, has fond memories of listening to the Jackson 5 in his younger days.  For the entire decade of the 1980s, Michael Jackson was probably the most famous non-politician on the planet.  He’d worked for it, and he’d earned it.  There is something to be said for that.

Only a handful of artists have truly made an enduring mark on popular culture in the past century: Charlie Chaplin, Elvis Presley, Audrey Hepburn, the Beatles, Madonna, Michael Jackson, to name a few. These are people whose images and work are recognized almost everywhere. They displayed talent, hard work and dedication, and what they created inspired people all over the world. They also gained their fame and popularity long before the age of “new media.” Perhaps it’s no coincidence, then, that of all the faces featured in current celebrity-focused magazines and websites, none stand out as potential Beatles or Madonnas. I’m convinced none ever will because, with the rise of 24-hour news, internet tabloids and social networking sites, our concept of fame and our ability to recognize and bestow it have been utterly altered.

We live in the Age of Information. The internet is the great democratizer. Anyone with a mobile phone can broadcast their thoughts and observations to any number of people at any time via Facebook or Twitter. Anyone with a video camera can subject the general public to their pets’ quirks, their friends’ idiocy or anything else via YouTube. This is all well and good, but it has had a few consequences. One is that everyone wants to be famous and believes not only that they should be, but also that they deserve to be. Another is that fame itself has been completely diluted.

Thanks to the prevalence of reality TV and voracious internet tabloids, there are so many famous people in this country that I gave up trying to keep track years ago. Names I have never seen or heard of before pop up in the latest celebrity gossip headlines every day. They’re always treated as though everyone naturally knows who they are. Most of the time, not only do I not know who they are, I can’t even discern what they might have done to warrant their apparent fame. As it turns out, most of them haven’t done anything beyond mug for the cameras on some random cable network reality show or date someone with a well-connected PR person. This generation of celebrities has earned its fame by being the bitchiest, sluttiest, craziest, crudest, most racist or sexist person in the cast of whichever reality show they appeared on. They don’t seem concerned with displaying any real talent or holding any responsibility, only with their own notoriety. Media outlets like Us Weekly and TMZ highlight every scandal, every bar brawl, every traffic ticket and botched Botox job that these personalities can conjure. And the public consumes it like a drug. No one seems particularly concerned with the fact that fame of this kind is especially fleeting in this age of instant gratification. With so many outlets, so many sources, so many contenders, the public consciousness can only process each one for so long. Like bubbles on a playground, these celebrities rise and burst in an instant. Occasionally, they snap, like the contestant from a VH1 show who apparently murdered his ex-wife and became a fugitive only to commit suicide himself. Or the DJ who was known among Hollywood celebrities, but whom I heard of only because he’d died of a drug overdose. Or the unfortunate Jon and Kate, whose marriage disintegrated in the glare of the spotlight, which boosted their show’s ratings but at what cost to their eight kids?

Of course there have always been one-hit wonders, flash-in-the-pan starlets, and child stars who disappeared after they hit puberty. But most of them made some kind of positive contribution to the entertainment world while they had their moments, whether it was a fun, catchy song or a movie that made people happy. Many were part of a larger pop culture trend (80s hair metal bands, for example) that had its day and faded. I can only hope that the current obsession with superficiality in celebrity is one of those. As it stands, it is beyond me how people who become famous for shooting each other with staple guns on cable TV (does anyone even remember those guys?) can possibly be making a positive contribution, let alone a lasting impact that inspires anything good in anyone. And I find it a bit sad that, given the viewing public’s devotion to a show like American Idol, even the competitors who show real talent and stage presence usually last barely long enough to release an album before that same public has lost interest. Some don’t even last that long.

Ancient heroes sought glory, fame and fortune in quests and on the battlefield. In the early days of Hollywood and in the old Broadway musicals, a small town kid was always trying to break into show business to become a famous actress, singer or dancer. For all his inexplicable eccentricities, Michael Jackson was an extremely talented musician and performer. People gained fame because they had unusual talent, determination, charm, intelligence, or at least savvy. Even people who sought fame for its own sake had to do something to earn it. Madonna, for example, could never really sing, but she’s an intensely ambitious self-promoter, and she worked her ass off, quite literally, to become a world-class entertainer. With the rise of new media, our admiration of talent and dedication is fading along with our capacity to appreciate a well-crafted coupling of gifted performance and marketable personality. Now we just pay attention to whoever makes the most noise until they are drowned out by someone else. Mass media truly does represent the masses now that just about everyone has a digital camera and internet access, but there are very few filters and even fewer incentives to create anything of quality. As Andy Warhol predicted, people who have done nothing more than lip-sync in front of a webcam seem to feel entitled to their fifteen minutes. Fame has always been something to aspire to and admire, but very rarely to achieve. The whole point was that not everyone could do it. It meant more than having your picture taken on a red carpet and posted on Perez Hilton’s website with graffiti over it. It took more than sitting around gossiping with your friends in front of a video camera. And yet, it seems that this is what fame means now. But, in this world, where anyone can become famous for the slightest or most random act, how can fame mean anything at all?

On Iranian Cats, Mice, and Revolutions
http://thepublicsphere.com/mice/ | Mon, 15 Jun 2009

By Mohammad Razi

June 12, 2009 was the date of the latest Iranian political crisis, a coup. This coup was special, however. Not only was this coup a military act to seize power, but it is also an act that completes the Iranian revolution in a very ironic fashion. The last remnants of those who began the revolution and developed its ideology have been wiped out. Thirty years after the revolution’s victory, the revolution finally ate all its first children.

The revolution’s generational consumption was completed in different stages. First, starting in June 1980, Marxists and political organizations with Marxist tendencies were massacred. Then the secular nationalists and moderate religious groups were banned and pressured. In 1989 Ayatollah Khomeini’s designated heir, Ayatollah Montazeri, was removed from power, and a few months after Khomeini’s death in the same year, the newly-elected government of Rafsanjani eradicated from parliament (the Majlis) those who were considered “leftist” inside the political establishment. During the 1990s there was a fight for power within the right wing of the Islamic Republic. For the first time, elements of the traditional religious groups who had no revolutionary background found their way into the government and held key positions. The revolutionary left came to power again in June 1997, and the years between that date and today were the years of political struggle between the last of the revolutionaries of the 1960s and 1970s and the new generation of rulers trained not in the battle with the Shah’s regime but in the military camps of the Iranian Revolutionary Army. With the defeat of the Reformists in this recent “election,” and their arrest for supposedly inciting riots, the revolution is complete; all her children have been consumed.

In my Iranian childhood in the 1970s, the most memorable mouse and cat characters were not Tom and Jerry (whom I used to watch on the “American” channel) but the characters from a short story for children written by the fourteenth-century Iranian satirist poet Ubaid Zakani. My sixteen-page book of “The Mice and the Cat” was a reproduction of an old lithograph print, which gave it a unique look among my other books.

Zakani, as is customary among the classics, began his story by reminding the young readers that by the end of the book they should pay attention to the moral of the story: “Be smart and mind the story of the cat and the mice. You’ll be astonished about what the story might teach you. Even you, who are wise and prudent, listen to the tale and let it be like a jewel earring on your ear.” The playful language of the story and its funny, unusual rhyme scheme made it easy to memorize and a joy to read. But the ending was not quite what one might expect from a children’s tale.

The story, as the name suggests, narrated the tale of mice, powerlessly oppressed before the paws of a brutal cat. At some point in the story, the cat’s conscience appears troubled by what he does to the mice. Taking refuge in a mosque, he prays, cries, regrets his viciousness towards the mice, and becomes a “man of god.” A mouse hidden under the “manbar” (pulpit) sees the repentant cat and takes the news to the other mice. The news about the cat’s spiritual change spreads among the mice. The joyful mice decide to show their appreciation by offering food to the cat. So they send their leaders to the cat to deliver him a message of friendship with trays of food. The message delivery, of course, gets interrupted; the new cat of god eats both the food and the messengers. This makes the mice extremely angry, unites them, and motivates them to change the course of their miserable life once and for all. They decide to fight back against the cat. The mice organize a revolution, defeat the army of the cats, and capture the cat that ate their leaders.

Up to this point, we have a regular Hollywood-style movie plot where the little guy rises up against an oppressive overlord and seemingly wins; the good and the meek defeat the evil and the cruel. The last few lines, though, undo such a happy ending. The mice take the cat to the stake to hang him. In the last minutes the cat frees himself from the ropes, kills the mice around him, and forces the army of mice to scatter. Brutality wins. Life goes back to “normal.” The “oppressed” remain powerless, and the winner is the one who uses hypocrisy, brutality, and ruthlessness.

I remember being nine years old and reading that story in 1977. Iran was pregnant with a revolution. The Shah was widely despised by the educated, secular intellectuals as well as many traditional Shiite clergy and their followers. For many members of the newly formed middle-class families of the 1960s and the 1970s, Islam was the alternative means of reforming Iran, a country supposedly corrupted by Western ideas. In those decades, many Iranian religious intellectuals tried to create a socialist- and Marxist-inspired Islam, a “modern” Islamic ideology. To many of them, Shiite Islam was an authentic “Iranian” alternative to Western radical ideas. They believed a reinvigorated political Islam could be the revolutionary solution that would make Iranians independent of Marxism or any other Western ideology. Many of these intellectuals were more invested in the power of the idea than in their own faith in Islam. They believed political Islam would mobilize the masses against the Shah’s dictatorship. Others, perhaps more faithfully, viewed Islam as the true solution to any problem, even though they never could define how the religion would digest modern values. For the secular nationalists, liberals, and Marxists, it did not matter how Shiite Islam would become a modern political ideology.

The year 1978 began with the first serious anti-Shah demonstrations. Massive protests continued for the rest of the year. By January of 1979, the Shah had left the country. In February of that same year the secular and Islamic revolutionaries, united under the leadership of Ayatollah Khomeini, overthrew the Shah’s regime. The Shah’s army could not fight back.

On February 11, 1979, at the age of eleven, I witnessed the collapse of one of the most brutal dictatorships of the century. I was elated that the mice had defeated the cat, that the oppressed could finally live free. The moral of Zakani’s story seemed to be wrong.

Things didn’t go the way the mice had intended. The next thirty years witnessed a Zakani-style victory of the cat. The king was gone, but the kingdom was reincarnated in the Islamic dictatorship called “Velayate Faqih.” In 1979 the first constitution of the newly-formed “Islamic” republic institutionalized a new position above the government and the president to oversee the acts of the republic and “guide” them according to Islamic Sharia: “Velayate Faqih,” meaning the Jurist Ruler, or as it is translated into English, the Supreme Leader. The story of post-revolutionary Iran became the struggle of a nation with its self-invented monster.

Today’s fight in Iran between the reformists and the hardliners is the result of a thirty-year struggle within the nation’s mind, a battle between those who finally recognize the face of the brutal cat in their self-made system and those who do not. No one knows if the story must ultimately end as Zakani would predict, with the cat’s brutality triumphing and the mice left oppressed as before. I still want to believe, as I did on February 11, 1979, that Zakani does not always have to be right.

The End of the End of the University
http://thepublicsphere.com/the-end-of-the-end-of-the-university/ | Mon, 15 Jun 2009

By Marc Lombardo

In the April 26th, 2009 New York Times, the eminent scholar of religion and technology Mark C. Taylor contributed an op-ed entitled “End the University as We Know It” in which he suggests that the university is in the midst of a tremendous crisis requiring the institution to initiate significant reform. Professor Taylor (perhaps an ironic title given his transparent resentment of his own professional class) begins his series of proposals for the future of the university by drawing an attention-grabbing metaphor between the state of U.S. academic institutions and the state of U.S. automakers and banks. I cannot help but admire the boldness of Taylor’s approach and his lack of reserve in questioning the entrenched practices and assumptions of his profession. Many of his proposals, e.g., the abolition of traditional disciplines and the permanent abandonment of the tenure system for faculty in favor of renewable contracts, seem specifically chosen to incite the antipathy of university professors the world round. In these proposals, Taylor directly attacks both the manner in which professors make their livelihood and the very nature of the work that they do.

While hearing this kind of talk from amongst the professorial ranks is rare, the ability to offer proposals that draw the ire of one’s colleagues does not in itself amount to an actual platform of reform that is in the public interest. If Taylor has done nothing else, and in my opinion, by and large he hasn’t, he has nevertheless succeeded in launching an opening salvo for a broader conversation as to what the university endeavors to be. How and toward what end should our institutions of higher education operate?

Taylor’s argument presumes that the U.S. system of higher education is going through a fundamental crisis, one similar in nature to the financial crisis. Already in Taylor’s opening metaphor, he demonstrates the pronounced influence of the very sort of disciplinary specialization his proposals would eliminate. As a scholar of Kierkegaard and other philosophers of religion, Taylor understandably sees the world around him through that particular disciplinary lens. In this academic theological parlance, a crisis (as in “crisis of faith”) refers not simply to a dilemma posed to a particular person, institution, or belief structure regarding how it ought to proceed; a crisis is a dilemma that both threatens and stems from the essence of what a person, institution, or belief structure most takes for granted. A crisis can only be resolved (and such a resolution is itself always and only temporary, the philosophers remind us) insofar as the person, institution, or belief structure in crisis abandons the very basis upon which it relates to the world. Moreover, no real assurances can be given that the person, institution, or belief structure which comes after the “leap of faith” will be any better or less problematic than the one that came before. After all, if a radical transformation seemed like a good idea from the untenable perspective that it was trying to abandon, then such a transformation would not be radical enough because it may still be wedded to the basic orientation of the original perspective.

Viewing contemporary social issues through the lens of specialized theoretical concepts may (or may not) be accurate, suggestive, or useful. For instance, “crisis” may very well be a good name for the state of financial institutions at present, and this is perhaps why Taylor compares the state of higher education to the financial crisis. The collapse of the housing market and the credit market (and the subsequent reverberations throughout the world economy) was not on the whole caused by isolated cases of fraud and misconduct. Propelled by the drive to ever-increased speculation that comprises their very essence, financial institutions kept making more and more bets on more and more bets that became increasingly distant from the actual transactions. To offer an extended metaphor, it’s as if instead of betting on boxing matches seen in person, or boxing matches watched on TV, or boxing matches with results that would be put in the paper, our financial institutions were all betting on boxing matches for which the results would never be known at all; in fact, boxing matches that would never even take place. One doesn’t have to be Marx to see that this logic presupposes its own destruction.

To say that such a situation constitutes a crisis (if we follow out the Kierkegaardian line of thought) is to suggest that the only possible way of recovering from the current situation, or preventing a similar crisis from happening in the future, is to transform the fundamental basis for how we go about coordinating economic transactions. Given the deleterious effects that follow from speculative capitalism even when it is working well (e.g., ever-increasing disparities of wealth, the various forms of social stratification that result from these disparities, the transformation of the Earth into a place less and less habitable by various forms of life including human beings, etc.), a case could be made for Taylor’s Kierkegaardian “crisis” approach to the problem. The course of action counseled by this analysis (the “leap of faith”) would not be the ameliorative-incrementalist approach of utilizing the political system to place a series of regulations upon financial institutions in the attempt to prevent those institutions from realizing their inner need for destruction. The leap of faith would instead demand that we get rid of financial institutions entirely. After all, if we allow the continued existence of these institutions, it will only be a matter of time before their immense power and influence, which will always be accorded them due to the central role they play in capitalist production, will once again be utilized in order to manipulate whatever political systems dare to constrain them.

In this essay, I am not claiming that an anti-capitalist revolution is the most sensible way of responding to the current financial debacle. ((Just as it is necessary to point out the likely failings of incrementalist regulation, it is also necessary to recall the conservative, Burkean observation that revolutions often serve to pave the way for counter-revolutions. However, the current approach to the financial crisis taken by our political representatives is neither the radical leap of faith of anti-capitalist revolution, nor is it the band-aid approach of confining the market’s impetus to self-destruction through strict regulation. The approach adopted by our political leaders (the “bailout”) could best be equated to the strategy employed by an alcoholic who drinks more and more every morning in order to get over the hangover from the night before. This strategy will continue to work in the short term (again, as long as one defines “working” as the cycle of booms and busts that we have grown accustomed to) until one day when it blows itself up completely. Given their role in encouraging this process to reach its apotheosis, our leaders might very well be considered anti-capitalist revolutionaries after all. We might just all be exterminated in the process, but that’s the leap of faith for you!)) I chose to elucidate the revolutionary position regarding the financial crisis in order to demonstrate how the application of specialized theoretical concepts can help us to consider courses of action for addressing social problems beyond those which are the most obvious. By enlarging our discussion of social problems through the deliberate inclusion of a pluralistic variety of ideas, perspectives, positions, proposals and opinions (including many with which we disagree, perhaps even vehemently), we are more able to see the limitations of our habitual ways of encountering those problems. Professors, artists, and intellectuals of specialized training and temperament (especially those who have a difficult time marketing their labor directly in the consumerist economy) have a unique role to play in public deliberation. We need them to think up crazy, impractical, nihilistic, idiosyncratic, faulty, utopian, tangential ways of seeing the world so that they can share those perspectives with the rest of us. This social role of the university is performed best when the pluralism of its constituents is encouraged to the greatest conceivable extent. The university should be an asylum (or perhaps a zoo) without walls in which the freaks and outcasts who are its inhabitants are encouraged to come and go as they please and the rest of us are free to visit as long as we agree to preserve (and perhaps contribute to) the oddity of the surroundings.

It is in the public interest (and in the interest of the capitalist market as well, incidentally) for the university to function as an incubator and store-house for ideas that would otherwise be discarded because they are too iconoclastic, counterintuitive, or controversial to come into being equipped with their own revenue streams. The internet is an example of one such idea. From this perspective, the problems faced by U.S. institutions of higher education today are not best understood as the result of a fundamental crisis stemming from the university’s pursuit of specialized, impractical knowledge as Mark Taylor suggests. In fact, I believe that a better argument could be made for the converse hypothesis: the university’s present problems can be seen as the result of its failure to adequately preserve its own working ideal (entelecheia, in Aristotelian terminology) as a sphere of infinite pluralistic debate that operates in relative autonomy from the immediate dictates of the market.

We Don’t Live in Postracial America Yet
http://thepublicsphere.com/we-dont-live-in-postracial-america-yet/

By Jacqueline Hidalgo

After the inauguration of the 44th president of the United States of America, Glenn Beck, on FoxNews.com, quickly criticized the racialism of Barack Obama’s inaugural ceremony. While he was not the least bit bothered by Rick Warren’s divisive invocation, Beck found the closing benediction from civil-rights veteran, Joseph Lowery, aggravating simply because it ended on a theme of race. Incensed by Lowery’s implication that “white” people may not always embrace what is “right,” Beck responded with frustration, “Even at the inauguration of a black president, we are being called racist.” Though Beck no doubt would have happily criticized everything about the ceremony, he focused his critique on Obama’s inaugural failure to meet his supposed “post-racial” promise.

Beck’s criticisms suggest that, at least for him, post-racial means that he should never have to be accused of being racist again, and perhaps he should never have to hear the word "race" again without the "post" in front of it. While 52% of the voting populace of the U.S. can congratulate itself on the election of Obama and the transformation of racial discourse such an election may portend, said election is not license to end conversations about race. Obama, in his own candidacy (in Philadelphia in March of 2008), actually demanded we push conversations about race further and deeper. In his recent book, Between Barack and a Hard Place: Racism and White Denial in the Age of Obama, Tim Wise, as he has done in previous works, critiques the machinations of white privilege and the need for this nation to wrestle with "racism 2.0," the subtler forms of racialization that perpetuate white dominance. Current practitioners of racism 1.0 will admit they did not vote for Obama because they do not trust a black man. Practitioners of racism 2.0 may have even voted for Obama, but they still found him difficult to pin down and untrustworthy, a perception which, though they may not admit it to themselves, had everything to do with Obama not being white. Statistics show that, on average, people in the U.S. tend to maintain negative stereotypes of minoritized communities, (("Minoritized communities" refers roughly to those who fall into the following "racial" categories: African American, Asian American, Latin@, Middle-Eastern American, and Native American.)) even while willingly casting a vote for Obama. Such negative stereotypes participate in a system of continued white racial dominance, whose effects can clearly be seen, for instance, in the disproportionate numbers of Latin@s and African Americans in prison. ((Tim Wise’s book provides further details that lay bare the inequalities resulting from such dominance, such as white Americans’ disproportionately lower rates of imprisonment for drug use than other parts of the population, in spite of this group’s being able to boast a higher percentage of drug users among them than is the case for other races. The work also points to systemic inequalities that target racialized minority groups besides Latin@s and African Americans.)) Such stereotypes also appear in subtler ways, like the recent New York Post cartoon fiasco, in which people debate whether the image, because it is no longer an overtly racist moniker, can still be deemed racist. ((The reluctance to deem the cartoon racist also circles around the way that the label "racist" tends to be a conversation stopper instead of the conversation-starter it was meant to be. In a recent speech, Attorney General Eric Holder called the people of the United States "cowards" for their unwillingness to confront race. New York Times editorialist Charles M. Blow criticizes Holder’s comments on the basis that calling people cowards does not make them willing to talk to you; it just makes them defensive. Numerous people of color have made similar statements about the "r" word; don’t call people racist because then they just get defensive and refuse to change. Confronting aspects of what John L. Jackson, Jr. has called de cardio racism in his recent book, Racial Paranoia: The Unintended Consequences of Political Correctness, Holder challenges all the people of the U.S.A., not just white Americans, to confront the issue of race straight on and have the guts to talk about these aspects of de cardio (internal matters of the heart that hide beneath the surface) racism, those parts of racial discourse and practice that can only be read between lines and beneath the surface because political correctness has eliminated most surface racism and made many Euro-Americans terrified of being called prejudiced, as Blow elucidates in his own essay. Jackson suggests, as my own experience does, that in fact the only way forward in confronting de cardio racism is through close personal relationships that require courage. We must be willing to confront racism in our friends, who must get over their fear of being called racist. We must be willing to confront racism in our own minoritized communities and ourselves, and not just internalized racism against ourselves. Latin@s, for instance, need to confront racism toward African Americans. Why, for instance, is the only significant African American character in Ugly Betty, played by Vanessa Williams, also a major villain (even if she has a better-rounded character in the most recent season)? Too often comments like Holder’s focus on the relationship between white and black America without consideration of other communities, like Native Americans, who are so painfully absent in Holder’s comments, even if his comments address Black History Month. All communities need to respect and listen to the racial quagmire confronting all groups. These groups should also confront additional questions of privilege like class and heteronormativity. We must have the courage to remain committed to friendships with people who do not always make things easier for us, but who challenge us most of all when we lie to ourselves.)) Recent research on the subconscious connection of "apes," African Americans, and police violence proposes that racialized programming, with deep roots in U.S. history, continues at a deep level among people of all racial-ethnic backgrounds and political inclinations.

I write this editorial not solely out of concern that Beck and other Euro-Americans think that Obama’s election means we can stop talking about race. I also fear that those of us who hail from minoritized backgrounds have internalized racism 2.0. Many of us need to query the privileges we have had in life, but we also need to examine the ways we have internalized negative stereotypes about ourselves from dominant culture. I, for one, continue to perceive myself through Du Bois’s double-consciousness, ever concerned about my measurements according to others’ tape, feeling my plural identities ((I have at least two warring and at times complementary identities, maybe more since I am a woman, a gender category that carries its own double-consciousness, and my racial and ethnic identities are not neatly circumscribed by U.S. racial terminology.)) ever unreconciled. This internalized racism also means that I never learned to feel, truly, that such double-consciousness is my strength and not my shame. Following some arguments in John L. Jackson’s recent book, Racial Paranoia: The Unintended Consequences of Political Correctness, I have been unwilling to own my paranoia around matters of race in personal and public life as justified. While confronting new and complex racial realities, realities that cannot be adequately addressed through pre-civil-rights-era terminology, I too often fear that I misread and overreact. At the same time, in these trying economic times, I wonder if I can actually get a job given my "strangeness," my ethnic and other non-ethnically specific behavioral deviations from established cultural norms. And the big question: if I do get a job, is it because of my skill set and unique abilities, or is it simply because my last name helped an institution fill a quota?

That those of us from historically dominated groups still must wrestle with racism 2.0, especially around the question of affirmative action, was made apparent to me in the weeks following Obama’s election. On November 20, 2008, I watched an episode of the television show Ugly Betty, “When Betty Met YETI.” Ugly Betty, a U.S. interpretation of a Latin American telenovela, centers on Betty Suarez, a young assistant to the editor-in-chief of the fictional fashion magazine, Mode. While the show has been known to confront challenging issues artfully, in “When Betty Met YETI,” the heroine decries the inhumanity of affirmative action.

At the beginning of the episode, Betty learns of the existence of the “Young Editors’ Training Initiative” (YETI). This course of seminars is a key career stepping stone. Betty decides to pursue admission to YETI, though she only learns of it two days before the deadline. As it turns out, the heroine must compete for a spot in YETI against fellow Mode assistant, and gay Euro-American, Mark (because heaven forbid she should compete against an unambiguously privileged white heterosexual male). While Betty gives a strong presentation in her interview for YETI admission, everything about Mark’s application strikes the viewer as significantly better than Betty’s. Where she only supplies a cover and a letter from the editor, Mark creates an entire magazine, featuring articles contributed by famous columnists. When YETI admits Betty over Mark, she goes to comfort him, but he informs Betty that she was only admitted because she is Latina and fills a quota. Later in the show, Mark hopes she doesn’t think he is a racist for having said this, and the viewer is made to feel that what he said was completely reasonable and obvious. Then, YETI confirms for Betty that they only admitted her because she was Latina. ((YETI’s acknowledgment surprised me because, unless I have applied to a program that specified a desire strictly for under-represented applicants, I have never had anyone admit so plainly that I received anything for filling a quota. I have, however, had plenty of colleagues hint at such an agenda, claiming that I have had advantages they were denied because of their blinding whiteness. I have had white male colleagues tell me I have a better shot at getting a job than they do because I have affirmative action 2.0 working in my favor, but they never doubt that they deserve a job more than I do, or question whether all Euro-Americans who receive good jobs necessarily earned them.))

Tortured by this revelation of her quota-filling prowess, Betty decides to withdraw from YETI so that Mark can have her spot. ((This is another leap from my reality; I have never been admitted to a program and then been enabled to name my replacement if I withdrew.)) Betty’s father, Ignacio, provides a short litany of the discrimination he experienced for being Mexican, and ultimately he concludes that if they want to give Betty something because she is Mexican, she should take it. ((I, unlike the producers, happen to agree with Ignacio that this is the least YETI can do to compensate for all the times people at Mode have insulted her father’s cuisine.)) Betty, however, prefers YETI to want her for who she is, not because of some arbitrary system that picks her just because she is Latina. Fair enough: the experience of what I dub affirmative action 2.0, the system of racial quotas that educational institutions and employers have wielded in response to the 1960s civil rights movements, can be incredibly painful. We would all prefer to be admitted and hired because we really are the best people around. Alas, we live in an unjust world that is not a meritocracy, and without affirmative action 2.0, Betty’s application would have been disregarded because it bore the name Suarez. For instance, a 2003 University of Chicago-MIT study found that people whose resumes bore equal qualifications but white-sounding names were 50% more likely to be called for an interview than people with black-sounding names. ((Project Implicit also studies our unexamined racial biases. It is worth everyone taking a trip to this site and taking these psychological tests. We may be surprised to discover the racial biases we hold inside.))

So Betty withdraws from YETI. In the end, however, her boss, the wealthy and well-connected Euro-American heterosexual man, Daniel Meade, pulls some strings. He speaks with the YETI board and manages to get Betty re-admitted alongside Mark. Betty has now become a recipient of what I term affirmative action 1.0 (or "old-school affirmative action"), the old boys’ network. Affirmative action 1.0 has no doubt existed for countless generations; thus Plutarch partially admires ancient Sparta for its supposed elimination of nepotism, thereby criticizing Roman-style affirmative action 1.0. Affirmative action 1.0 is not a system that explicitly prevents members of minoritized communities from acquiring jobs and educational degrees. Instead affirmative action 1.0 assists the already well-connected in acquiring jobs and spots in schools because of whom they know and who their family is rather than through any merit-based analysis of their actual skill-set. ((Actually, another recent book suggests that post-civil-rights affirmative action is actually the era of affirmative action 3.0, and that the New Deal ushered in an era of affirmative action 2.0 that specifically helped Euro-Americans while denying assistance to minoritized communities. See Ira Katznelson, When Affirmative Action Was White: An Untold History of Racial Inequality in Twentieth-Century America.)) In the case of Ugly Betty and YETI, her wealthy and powerful (and not coincidentally Euro-American) boss likes her and pulls strings for her. Betty, who refused affirmative action 2.0, should be equally incensed that YETI would admit her just because of affirmative action 1.0. Here again, YETI admitted her not for the merit of her ideas but because of systemic practices whereby an admissions committee would be reluctant to deny wealthy and powerful Daniel Meade’s pushy request. Oddly, our heroine is no longer upset that YETI has no interest in her as Betty Suarez; rather, she is thrilled that Daniel supports her, and she happily takes a position received thanks to affirmative action 1.0.

What lesson should a Latina draw from watching such a television episode, especially cast in the light of an historic election, which supposedly, finally, signifies that those from non-dominant communities in the U.S. can do anything? Affirmative action 2.0 is, and always has been, an imperfect solution to an imperfect system. Yet affirmative action 2.0 always leaves recipients open to scapegoating. We have become accustomed to the label “affirmative-action case,” and naturally, someone at Ugly Betty wished there were some other way to navigate our uncomfortable reality. For most of my life, I have been riddled with self-doubt about whether I deserve what I have achieved. At the same time, I have watched a fairly talentless upper-middle-class “WASPy” heterosexual male slide easily into well-paying jobs and artistic gigs, without once questioning whether he really deserved them or if he was just an affirmative action 1.0 case, receiving appointments merely because of his privileged upper-middle-class white racial identity and family connections. Perhaps one of the main problems with affirmative action 2.0 is its failure to challenge the injustices of affirmative action 1.0, injustices which affect people from the lower middle classes, regardless of their racial background, just because they lack connections to the elite and powerful. ((Marc Lombardo in reviewing this piece also suggested that an advantage of affirmative action 2.0 for the dominant culture could be that affirmative action 2.0’s recipients constantly question whether they are good enough. Such self-doubt often pushes these recipients from minoritized communities to work harder than their non-affirmative-action colleagues.))

I understand that talented people genuinely believe they have earned better in life and are frustrated to see others receive accolades they think they deserved. Affirmative action 2.0 allows many Euro-Americans to think that some minority kid unfairly took their spot, instead of pushing these Euro-Americans to question not only their own qualifications (did they really deserve that job, or has their privileged background led them to assume they deserved it?) but also the broader injustices of the world in which we live. In a discussion of the Gratz v. University of Michigan case, ((Jennifer Gratz sued the University of Michigan-Ann Arbor for a system of what she deemed unfair racial preference, which granted students from minority backgrounds extra points in their admission applications. This system, she contends, denied her a spot at her first-choice university because it went to some non-white student with lower SAT scores. How she actually proved this is beyond me. Ultimately, the Supreme Court decided in her favor, 6-3. Naturally, I am in complete disagreement with Gratz and the then-Rehnquist court.)) however, Wise points out that while some students of color at the University of Michigan had lower test scores than Gratz did, over 1000 white students did as well. Her spot, if in fact she deserved one, was more likely taken by a white student whom the University of Michigan favored because of athletic ability or because s/he grew up in the wealthy Upper Peninsula (both categories also received extra points in the Michigan admissions process). ((The one worthwhile anti-affirmative action 2.0 question to pursue here might be: are there people of color in the wealthy Upper Peninsula who applied, and did they get double bonus admissions points? Affirmative action 2.0 undoubtedly needs to take up issues of class, still with a preferential option for minoritized communities, who are still underserved even in comparison to underserved lower-middle-class white families. Yet issues of class should factor into these considerations more, as underprivileged poor Euro-Americans likewise lack access to affirmative action 1.0 and need systemic help. Jennifer Gratz, however, being from a middle-class family that could afford to sue the University of Michigan, was not underprivileged in any way; thus my objections to the Supreme Court’s decision in her favor stand.)) Yet her anger was not directed at athletic or class preference, a class preference largely favoring those of Euro-American background. Her anger did not criticize affirmative action 1.0 as it was encoded into the University of Michigan admissions policies; her anger was directed at affirmative action 2.0 and the recipients from minoritized communities.

The assumption that we are more entitled to a spot at USC than an African American with slightly lower SAT scores has everything in the world to do with “privilege,” being privileged enough to nurture a sense of entitlement. Privilege allows one to challenge affirmative action 2.0 because its recipients are the underprivileged. I can only assume one refuses to challenge affirmative action 1.0 because its recipients are privileged, and one hopes to someday avail oneself of precisely affirmative action 1.0’s privileges. ((Privilege is also too complex to examine in this essay. Again, I direct readers to Peggy McIntosh or Tim Wise for further information on white privilege in particular.)) I certainly recognize that I had privileges over other Latin@s that enabled me to be where I am; among these privileges are my fair skin and perfect standard English, privileges that allowed me to widen a door cracked open for me by the privilege of affirmative action 2.0. Yet my life is not everything I wanted. I have not received everything I have applied for. Why? Because that is life; we’re not all Susan Sontag or Albert Einstein. The Jennifer Gratzes of the world need to make peace with the injustice of life just as much as I do, while fighting for a system that is fairer for everyone, and not just for our privileged selves. I would have respected Betty’s character more if she had turned down YETI’s position the second time because she did not want to receive affirmative action 1.0 either. Or I would have admired her for taking the position at YETI while saying, “I am going to use this affirmative action 1.0 now to make sure that no one is ever admitted for anything other than their qualifications again.”

Sorry, Will Smith, but just because Barack Obama is president does not mean that other people in this country have no excuse for failing to achieve their dreams. How many people, black, white, or what have you, have had either the educational opportunities of Punahou (Obama’s elite Hawaiian preparatory high school) or the sheer genetic good fortune to be as handsome and intelligent as Obama (or Smith, for that matter)? Yes, the U.S.A. has made immense progress on what Jackson terms de jure and de facto racism. ((According to John L. Jackson, Jr. in Racial Paranoia, de jure racism is racism “of law”; that which is rooted in and can be rectified by law. De facto racism is racism “of fact”; that which is obvious and easy to name, which differentiates it from de cardio racism. De cardio racism is racism “of the heart,” that which transpires beneath the surface, between the lines, and that which we hide even from ourselves. Racial paranoia exists in our new post-political-correctness racial age where much racism is a matter of de cardio racism, a racism that irks and evokes suspicion but cannot be named. Its elusiveness gets treated as license to pretend it no longer exists, and the post-racial epithet responds precisely to the perpetuation of de cardio racism, that which lies at the heart of continuing racialization. We must all own our racial paranoia and work on de cardio racism if we are ever to be truly “post-racial.”)) Still, real systemic inequalities persist, and all of us still wrestle with de cardio racism, the racism that transpires in our hearts and that we hide from the world and often from ourselves. Affirmative action 2.0 sought to redress some of the inequalities that pervade our racially contorted system, but yes, it is an imperfect solution. The imperfections of this approach do not mean, though, that we have the luxury of throwing it out the window. Ugly Betty over-simplifies the complexities of affirmative action 2.0 while favoring the even more unfair unwritten policy of affirmative action 1.0. If a show supported by prominent Latin@s can propagate such a simplistic narrative, then those of us who can speak complexly about race have much work before us. Sorry, Glenn Beck, but just because affirmative action 2.0 made it possible for enough people of color, including President Obama, to take advantage of affirmative action 1.0 does not mean the work of affirmative action 2.0 is complete or that we can stop talking about race and racism.

By Jacqueline Hidalgo | The post We Don’t Live in Postracial America Yet appeared first on The Public Sphere.

Stadiums and Terrorism http://thepublicsphere.com/stadiums-and-terrorism/ http://thepublicsphere.com/?p=631

By Derek Charles Catsam | The post Stadiums and Terrorism appeared first on The Public Sphere.


The public’s right to know or the public’s right to be safe? Preserve civil liberties at all costs or err on the side of caution? These questions, honestly asked, are at the heart of debates over how best to preserve both our safety and our liberties in an age of terrorism and violence.

Some time ago, an ideal test case for these questions played out here in Texas, where the Dallas Cowboys tried to fight requests (that entered the legal system and fast became demands) for public release of the plans for their new $650 million stadium in Arlington.

Their rationale? Both security and business concerns.

The problem? At least $325 million, and likely a lot more, is coming out of taxpayer pockets, and the city used eminent domain to force homeowners to sell their property to make way for the new stadium. Whatever name is on the lease, the stadium is in many ways public property and should be considered only nominally the Cowboys’ property.

The Cowboys and their advocates argued that both proprietary business interests and security concerns should have allowed them to keep the information secret. Yet, in the public’s eyes, the Cowboys sacrificed their proprietary business claims as soon as they stuffed their snouts in the public trough. The Cowboys’ claim so reeked of arrogance that it almost overwhelmed all of the other arguments about security. It is increasingly common for professional sports teams to suckle at the public teat and then turn around and pretend that they owe that same public nothing because they are fundamentally engaged in private enterprise.

Billionaire owners want the public to pay for their opulent facilities, in which they then charge exorbitant prices for tickets and concessions. The public is slowly learning, just like the poor guy who stands in line at halftime to spend $60 for a gelatinous pile of food and drinks, that there is no such thing as a free lunch. Or a half-billion-dollar stadium.

But the other claims, those tied to security, are less temporal and thus give pause. After all, we now know the plans for the stadium, but security will be an enduring concern. I’ve argued for years, even before 9/11 and the occasional recent news of potential attacks at football games (always revealed to have been hoaxes, though such threats remain all too credible), that stadiums on game day or concert night are among the most vulnerable targets for terrorist attacks: tens of thousands of people in a celebratory mood, unwary and focused on something else; screaming crowds; loud public address announcers; amps at rock shows; lots of drunk people; showy but largely perfunctory security. Providing diagrams and blueprints to terrorists, whether Islamist or of the home-grown variety (many in the U.S. seem to have forgotten about the Eric Rudolph, Ku Klux Klan, Tim McVeigh, Unabomber, and Charles Whitman types), does appear shortsighted at first blush. The public does not have the right to know everything.

Then again, someone with malicious forethought can take plenty of time to plan an attack upon an open stadium. Providing diagrams that, once a stadium opens, will be available anyway hardly seems like a serious breach of either public safety or security. The danger will not come from terrorists simply knowing a stadium’s layout, however essential that might be to a planned attack, but rather from terrorists who are able to identify and exploit weaknesses and security flaws.

Prevention of a stadium attack will come in the form of vigilance, intelligence, and competence, rather than slapdash and showy efforts to appear tough. A little sanity would also go a long way toward bringing reasonableness to our discussions. When you enter a stadium on a hot day and are drinking a bottle of water, scare stories from the news notwithstanding, the odds that your water will become a deadly weapon are almost nil. It is hard not to be cynical about a policy that happens to profit the concessionaires who sell overpriced drinks without demonstrably increasing safety. It also inspires less, not more, confidence if our official approach to matters of terrorism and security seems reactive to news stories or rumors rather than part of a rational and comprehensive strategy. Meanwhile, if I had hidden a gun in my waistband, security would not have noticed because they did not bother checking. In terms of odds, I would surmise that an attack at a big game will more likely come from someone wielding a gun than someone wielding a half-empty bottle of water.

We are similarly foolish and shortsighted in our approach to security at airports, where appearing vigilant and tough on potential terrorism has taken the place of commonsense policies that would actually make us safe. A batty Englishman tries to light a shoe bomb, and now we all have to take our shoes off at security. There are rumors that terrorists are going to try to use small amounts of liquid explosives, so we develop an inane policy whereby we can carry on a few ounces of liquid in small containers that we must place in a plastic bag. In your shaving kit? It’s a menace to the airways. In a ziplock? We can all breathe easier. And then there is the water issue again: if you try to bring a bottle of water or juice or soda through security, you’re going to lose it. But don’t worry, you can buy any drinks you want at the usurious rates the airport concessionaires are able to get away with charging. You can even buy an extra-hot venti coffee right before you board, a potentially more lethal weapon than all of the aftershave and Nikes and half-consumed Ozarka water. But woe unto you if you forget to take your laptop out of its case or if you are impatient with a security person because your child is crying and you’d rather attend to her than to the guy who randomly pulled you out of line for a perfunctory pat-down.

Texans take football seriously. They take travel seriously. They take terrorism seriously. But there is a difference between serious and foolish. The Cowboys finally released the plans for the enormous new stadium, as was inevitable. Thus far, nothing bad has happened to Jerry Jones’ gleaming jewel. And if terrorists ever do attack the new stadium, the blame will fall on our scattershot, improvised, shoddy policies and lack of foresight because we were preparing for the last attack rather than the next one.

Creative Commons License photo credit: Traveling Fools of America

By Derek Charles Catsam | The post Stadiums and Terrorism appeared first on The Public Sphere.

Is Mexico Headed for War? http://thepublicsphere.com/is-mexico-headed-for-war/ Sat, 13 Dec 2008 16:00:00 +0000 http://thepublicsphere.com/?p=614

By Rosa Guzmán | The post Is Mexico Headed for War? appeared first on The Public Sphere.


Is Mexico heading for war? If history repeats itself, the nation will present a challenge for president-elect Barack Obama. Mexico was at war in 1810 and in 1910, and war in 2010 seems imminent if the country is not in fact already there. This time around, it seems the United States will have to do more than provide monetary aid to its neighbor. This year has been one of Mexico’s bloodiest, with deaths surpassing the number of U.S. military casualties in Iraq since that war began.

The root of the problem is the drug cartels’ involvement in violent turf wars over trafficking routes. A mega-alliance was formed when the Gulf Cartel hired a paramilitary group, known as Los Zetas, as a hit squad. The Zetas were originally members of the Mexican Army’s elite Airborne Special Forces Group, known as GAFE, and they specialized in locating and apprehending drug cartel members. They have been trained by Israeli, French, and U.S. forces, and some believe they received instruction at the infamous School of the Americas in the U.S. Many assert that they became rogue officers because the cartel pays substantially higher wages than the Mexican government does.

These turf wars impact both small towns and big cities. The cartels incite fear in residents, sometimes even conscripting them into their service. Many U.S. citizens who had retired to Mexico have been forced to return to the United States for their safety. Among other reasons for the U.S. to act in a meaningful fashion is the fact that more Mexican citizens will seek asylum to the north in order to flee their war-torn country. Mexico is becoming a place where lawlessness dominates daily life, and its citizens are losing faith in the government’s ability to punish the perpetrators.

With the murders becoming more cold-blooded, the question becomes whether the punishment will fit the crime. Mexico, a country long opposed to the death penalty and often at odds with the U.S. over the issue, has served as a safe haven for criminals fleeing prosecution. With the current wave of violent crime sweeping the nation, however, many argue for the death penalty’s reinstatement in an effort to curb the violence and to punish those committing the crimes. The cartels are carrying out gory acts, including beheadings and executions, which almost always are meant as a message to a rival cartel. The government wants to send a strong message back, and many hope that if Mexico follows the U.S. practice of capital punishment, that message will be heard.

The U.S. has already agreed to provide $400 million in foreign aid, but it has yet to release the funds, even though Condoleezza Rice visited the country to assure Mexican officials that help is on the way. When Mexican President Felipe Calderón called President-elect Barack Obama to congratulate him on his historic victory, Calderón made sure to mention that Mexico is in dire need of help. Corruption is being uncovered at the highest levels of the Mexican government. The cartels are laughing in the government’s face, openly recruiting soldiers to join them and even posting recruitment banners and signs in states across the country. They promise significant benefits, like higher pay, a house, and a car. A corrupt government cannot offer such luxuries to its citizens, who are living in poverty. Drug wars are not foreign to the Americas; Colombia has been mired in them for years. But this time, residents of U.S. towns that border Mexico are living in fear. If the U.S. cannot help Mexico out of a desire to meet its neighbor’s needs, then at the very least, the country to the north should assist Mexico in order to protect its own citizens.


Creative Commons License photo credit: ANGELOUX

By Rosa Guzmán | The post Is Mexico Headed for War? appeared first on The Public Sphere.

Your Government Lied to You. So What? http://thepublicsphere.com/your-government-lied-to-you-so-what/ http://thepublicsphere.com/?p=290

By Marc Lombardo | The post Your Government Lied to You. So What? appeared first on The Public Sphere.



Ron Suskind’s new book, The Way of the World, contains a number of revelations concerning the activities of the George W. Bush Administration during the months preceding the Iraq War. According to Suskind’s sources, when Bush was confronted with intelligence reports that did not support the projected war with Iraq, he quipped, “Why don’t they [the CIA] give us something we can use?” This petulant remark was followed by a concerted effort to fabricate the evidence. Members of the Bush Administration (Vice President Cheney seems the most clearly implicated) led an effort to forge documents alleging Iraq’s possession of WMDs. If that wasn’t enough, these documents were attributed to a source who was actually saying exactly the opposite. It is hard to imagine a clearer breach of the public trust by governmental officials. The severity of this overt deception is compounded by the numerous atrocities that have followed in its wake. And yet, no one really cares. President Bush will finish his term unabated, after which time he’ll probably go back to his old career of running businesses funded by Saudi oil money (running them into the ground, that is). He’ll make for some amusing stories in the tabloids. Maybe he’ll even get his own reality TV show. In any event, he’ll never have to account for his actions in a court of law.

The lack of public outrage over this new confirmation of the Bush Administration’s mendacity is not a great surprise. Suskind’s account does not really change our understanding of any of the facts regarding how the country was led into war; it only gives us a clearer picture of the complicity of members of the Administration (including Bush himself) with respect to those facts. We have known for over four years now that the central justification for the Iraq War was incontrovertibly false. They said we had to go to war because Iraq might have WMDs. We went to war. We now know Iraq didn’t have WMDs. Mission accomplished. Not even the Administration’s chief apologists now deny that the intelligence leading to war was faulty. If everyone, regardless of political or ideological affiliation, recognizes that the intelligence which led to war was faulty, then why haven’t those responsible for the decision to go to war faced legal consequences? Indeed, as we look at the political landscape on the eve of the 2008 elections, those responsible for the Iraq War will likely never be held legally accountable for their actions. If impeachment proceedings could be started against President Clinton for a blowjob, then why not against President Bush for an unnecessary war?

It is not enough for us merely to remind ourselves of the grave results of Bush’s unpunished crimes and misdemeanors: e.g., over 4,000 dead Americans, over 500,000 dead Iraqis, ((The number of Iraqi dead may very well be much higher than the conservative estimate of 500,000. Already in a 2006 study, researchers at the Johns Hopkins Bloomberg School of Public Health estimated the number of dead Iraqi civilians at 600,000. A more recent study by the British research firm Opinion Research Business calculates the death toll at over a million lives. Neither of these studies factored in deaths from sources like malnutrition and disease that occurred due to the war.)) millions of Iraqi refugees, the emboldening of extremists within Iran, the decline of the U.S.A.’s standing in the world community, the collapse of the U.S. economy, etc. The more pressing task at hand is to examine, in as dispassionate, apolitical, and clinical a manner as possible, just why it is that Bush’s crimes will almost certainly go unpunished. Understanding why we won’t prosecute the Bush Administration for the Iraq War just might help us to diagnose the functional disorders within the body politic as a whole. We need to regard our inability to hold Bush legally accountable as a matter of public health.

The first major barrier to prosecution of the Bush Administration is a simple but remarkably effective one: ignorance. War apologists now claim that while they may have been wrong about Iraq’s possession of WMDs, they were wrong in good faith. Intelligence is an imperfect science. We used the best information we had. We got it wrong. Sorry. Now, it is certainly refreshing any time political ideologues admit that they were wrong about anything. Indeed, this admission indicates that reality still has a small role in mainstream political debate. However, we must not lose sight of the fact that the admission of ignorance is being used in order to avoid responsibility. Why is the claim to ignorance a plausible escape from legal and political responsibility?

In order to answer this question, we should consider the role of the claim to ignorance in our moral and legal judgments more generally. We take it for granted that when we hold person X legally accountable for a given action, the physical evidence concerning how that action transpired ought to demonstrate that person X was the causal agent of the action in question. The claim to ignorance attempts to deny person X’s moral and/or legal responsibility for a given action without denying the evidence of his or her participation in that action. For this claim to be valid, person X’s lack of adequate information concerning the plausible alternatives and/or consequences of the action in question must be of such a constraining nature that, for all intents and purposes, person X was physically forced to commit the action in question.

The claim to ignorance is very difficult to refute on the basis of the scientific model of the universe. This worldview assumes that every action is caused by other prior actions. When a forest fire starts, we don’t assume that the fire caused itself. Rather, we investigate what might have ignited the fire, and if and when we find the source of ignition (e.g., a lightning strike, an arsonist, etc.), we accept that source as the fire’s causal agent. Why do we stop there? Isn’t it fundamentally arbitrary to say that either the lightning strike or the arsonist was responsible for the fire? Surely, other prior actions must have caused the lightning to strike or the arsonist to set the fire. We identify the lightning strike or the arsonist as the fire’s causal agent simply because finding the next anterior agent becomes exceedingly difficult. In other words, according to the scientific model of the universe, agency is always a matter of explanatory efficiency and nothing else.

Today, we all more or less accept this model of the universe, whether we know and admit it or not. The claim to ignorance relies upon the difficulty of scientifically identifying a single agent as the beginning of a causal chain. Aren’t we all fundamentally ignorant of the reasons why we do anything? What would it mean to make a truly informed decision? Could anyone but God ever really accomplish such a feat? When Bush, Vice President Cheney, Dr. Rice, et al. appeal to ignorance as an excuse for starting an unnecessary war, they are appealing to our enculturated sense of skepticism and fallibility with respect to the limits of knowledge.

At various points in our lives, each of us has made a decision that later turned out to be wrong. Modern science’s single greatest achievement was to make the fallibility we know from personal experience applicable, in an even more fundamental way, to judgments and observations about the universe as a whole. Thus, when the Left criticizes Bush for his ignorance and stupidity, it places his actions within a narrative that resonates intuitively with the way we understand the world and our place in it. Not coincidentally, Bush and his fellow apologists themselves appeal to ignorance when explaining the decision to go to war. The more we accept Bush’s claim to ignorance, the less able we are to hold him accountable for the consequences of his actions.

If ignorance were the only thing that Bush could rely upon, however, he would likely have been impeached already. Even though we are likely to give the benefit of the doubt to people who claim to have acted out of ignorance, this benefit extends only so far. Therefore, we have the long-standing legal principle ignorantia juris non excusat (ignorance of the law is no excuse). The success of a claim to ignorance depends entirely upon others perceiving the claimant to have acted in good faith. While no one blames an idiot, everyone hates a liar. Hence the moral outrage and legal proceedings directed at Nixon and Clinton.

For what other reason do we not prosecute Bush? I have already argued that part of the problem resides in our skepticism, which is emblematic of what is best in both our common sense and the scientific method. The other significant barrier to holding Bush legally accountable for his actions similarly arises from another great and rich cultural tradition: distrusting the government. The United States inherited this tradition from the philosophies and practices of modern liberalism as articulated by figures like John Locke and Adam Smith.

John Locke argued for limiting governmental powers on the basis of a strict distinction between public and private: the king can perhaps tell a man what he ought to do when that man is out in the world, but no one should tell him what he can or cannot do when he is in his own home. The male pronoun is instructive in this case, as Locke was effectively transposing the classical figure of the pater familias, resulting in the birth of a peculiarly modern entity: homo economicus. Adam Smith made the economic significance of Locke’s notion of private liberty more explicit, showing that the concepts of property and liberty are fundamentally intertwined. Smith argued that even the public good (i.e., what is best for all) is most effectively and efficiently pursued only when private interests are left unchecked by any external influences whatsoever (most especially, that of the government). The liberals defined private liberty as existing only to the extent that the government did not interfere with it. This in turn required that private liberty could only be protected if and when private individuals came together collectively in order to limit the exercise of governmental power upon their lives. As such, from the liberal viewpoint, the ability to do what one wants in one’s private life depends entirely upon the public and cooperative practice of constantly and diligently surveilling and criticizing everything that the government does. The active public manifestation of the distrust of government is the basis for all other private liberties. The U.S. Founders, being good liberals, naturally placed the “First Amendment” first in the Bill of Rights.

What do I take to be the pernicious consequences of this tradition? For the time being, I will leave aside the issues concerning the conflation of human dignity with economic property. I would like to focus upon the issue that most directly applies to the distrust of government. When such suspicion results in the active communication of information concerning the pernicious effects of government policies, no temperament is more conducive to the well-being of the public as a whole. The distrust of government, however, becomes pernicious if it results in either of the following: (1) its use by government institutions in order to disavow accountability for their actions, or (2) a vague but inoperative cynicism on the part of the populace, whereby the task of holding the government accountable is abandoned as a fool’s errand.

Both of these pernicious uses of the distrust of government mutually reinforce one another. When the government itself claims its own incompetence, ignorance, and/or ineffectiveness, the aim is to escape public accountability. A government’s appeal to the distrust of government will successfully displace the responsibility for policy insofar as that claim resonates with the people’s own cynicism regarding what government can do. Similarly, whenever the people are content with their own private suspicions regarding the incompetence of government, an incompetent government that disavows responsibility for its own policies is exactly what they will receive. When these sentiments of government disavowal and personal resignation support one another, the distrust of government becomes a self-fulfilling prophecy.

Precisely this brand of mistrust, turned into a self-fulfilling prophecy, prevents us from holding Bush and his accomplices accountable for their abuses of the public trust. There is only one cure for this disease, and one hope for rehabilitating our social institutions in its wake: the distrust of government must be transformed from a vague, inoperative, private sentiment into a series of specific public accusations. Those in power must be made to account directly for the consequences of their actions. It will not matter what party we choose in November unless all of our representatives fear their constituents more than they defer to private interests. Furthermore, if this fear of the public extends only to the election cycle or the polls, its actual effects upon policy are nil. Cheney’s response, when asked about the Bush Administration’s low poll numbers, was quite candid in this regard: “So?” This blatant disregard for any and all forms of public accountability should be taken as a challenge upon which the health of our democracy entirely depends. For the liberal tradition, it is meaningless to talk about personal liberty unless it is accompanied by a collective mechanism for restraining governmental and corporate interests. If the crimes of the past eight years go unpunished, we can conclude that the U.S. iteration of the liberal experiment in governance ended in failure.

I have already said that Bush and his accomplices will never be held legally responsible for their actions in a criminal court. Fortunately, humanity discovered another more effective means of punishing those guilty of violating the public trust. The social response to such crimes does not strictly require that guilty individuals undergo physical suffering comparable to that which they exacted upon the innocent. Surely, a traitor can never be hanged enough. Rather, criminals of this sort must be made to account for their actions publicly. Admittedly, as Frederick Douglass notes, the threat of physical violence may be required as a persuasive instrument for the procurement of such testimony. However, the true punishment and the true reparation is the testimony itself. Bring on the Truth and Reconciliation.

Image of the U.S. Bill of Rights is available from the National Archives.

By Marc Lombardo | The post Your Government Lied to You. So What? appeared first on The Public Sphere.

Evidence of Things Hidden Behind the Voting Booth http://thepublicsphere.com/evidence-of-things-hidden-behind-the-voting-booth/ http://thepublicsphere.com/?p=25

By Edward Robinson, Jr. | The post Evidence of Things Hidden Behind the Voting Booth appeared first on The Public Sphere.


As the 2008 Democratic primary season ends with Barack Obama as the presumptive nominee, I want to reflect on some of the political themes, realities, and pundit theories that have shaped and invigorated the United States of America over the last year. Some could say citizens of this nation faced three major questions leading to the Democratic nomination. Americans faced the heart of blackness in Senator Obama, a strong and politically shrewd woman in Hillary Rodham Clinton, and their love-hate relationship with her husband, former United States President Bill Clinton. Between these two talented senators, Americans also had to face the question of religion in the antics of Rev. Jeremiah Wright. Thus, Americans have had to ask themselves serious questions about their attachments to race and religion, and about what the Oval Office should stand for in a time of economic and environmental crisis as well as an unpopular war.

Commentators of all stripes have heralded this Democratic primary as the fulfillment of democracy’s promise. The primary has claimed a new generation of youth, especially white students from the halls of Columbia to the spacious green grounds of the Claremont Colleges, who have grown weary of their hollow privilege, power, and racial divisions. Perhaps more impressive is the sudden maturity and awareness black America is displaying in weighing in on issues important to its community and to the country. The academically inclined would ponder the possibility that the US is finally ready, after 232 years, to be the light of democracy it has claimed it would be since 1776. As much as I would love to join the “Yes We Can” bandwagon, I suggest that commentators, students, and black Americans take a step back and truly assess the political realities flowing from Senators, US Representatives, and even some Republicans regarding Barack Obama. Admittedly, I have seen people of all races, ages, and class groups speak earnestly about what Senator Obama as president could possibly represent. People have voiced such excitement about the senator even when he asks Americans to do the one thing that the sixties and the rise of the Republicans have reminded all citizens America has a hard time doing: change.

The word change reminds me of the important twentieth-century black novelist and essayist James Baldwin and his The Evidence of Things Not Seen (1985). Laws can be changed and treaties can be signed, but can US citizens really put in the work required to change? Baldwin concluded that, except for the few, the evidence of change is not visible. Senator Obama’s bid to be the president of the United States has caught Americans in a perfect storm of post-racial aspirations and public sphere inertia. Americans seem unable to speak and act truthfully within one American public without the residue that still exists in the old, divisive American republic: the ever-present racial divide between black and white. It appears that the roots of a 262-year-old racialized society still run too deep. We must remember that black Americans fought and celebrated along with white citizens in the Revolutionary War and the Civil War, and yet freedom did not ring true for all. Another example is the incredible coalition of black, Jewish, and white Americans in the Civil Rights Movement. When the laws were changed, American citizens slipped back into the comforts of their homes, suburbs, jobs, schools, and clubs, and the U.S.A. became two nations again, splintered not only by race but by racial class groups as well.

The evidence of the racial past reviving itself is revealed in how the Democratic primary competition has ended. More and more, what seemed to have been a cohesive union of black people, white intellectuals, the working poor, college students, and people fed up with the Bush administration slowly revealed the U.S.A.’s inescapable racial divide. Consider for a moment the differences between caucus voting patterns and poll booth voting patterns. In caucuses people voted openly, and Senator Obama generally came out ahead. Perhaps in a caucus room, Senator Obama pulls at the souls of Americans who are publicly forced to reckon with a history denied and commonly unspoken in the public arena of presidential politics. In a caucus, people must make a public profession and hope that GOD does not confront them with questions about blackness when they reach the pearly gates of heaven. Imagine going to hell because of one’s treatment of black people, or for looking silently away from their treatment. I am talking about the treatment of people who brought so much comfort as slaves and served as scapegoats in bad times after integration. I am writing of the same people who were crucified by Affirmative Action, except for the careers it built for men such as Ward Connerly and Shelby Steele. Perhaps one would answer GOD by saying they were allowed to sing, dance, and contribute spirituals, blues, rock, soul, jazz, pop music, and hip hop?

What the Democratic primary revealed instead was that there is still a major divide between private and public. Unlike the caucuses, in the primaries people retreat to the privacy of the voting booth. In that voting booth Americans start to waver, and differences, such as race, religion, and class, still matter in the U.S.A. The race-baiting tactics of the Clinton campaign revealed this voting booth contrast. President Clinton’s rhetoric in the South Carolina primary parted the thin cohesive sheen of the Democratic Party when he reminded the country that Obama would win South Carolina mainly because he was black and not because of his impressive credentials. Surely, Senator Obama had the upper hand in the state, but he still had to win black and white voters who were just getting to know him. Moreover, the problem with the former President’s statement was that he was reminding white voters around the country to be wary of those black people. It has never failed in the U.S.A. that when masses of black people start getting energized about something, other people assume it means something is being taken away from white America. Like clockwork after South Carolina, it seems most white people started pursuing the more familiar and more comfortable racial path regardless of the cost to the country, American youth, and the world. The U.S.A. is starting to resemble the world that Andrew Hacker’s Two Nations: Black and White, Separate, Hostile, Unequal (1992) and John Hope Franklin’s The Color Line: Legacy for the Twenty-First Century (1993) warned against in the last decade.

Americans have been given a great opportunity in the nomination of Senator Obama. He has managed something unique, promising, and infrequently remarked upon. He broke the coalition of elder black politicians and pariahs who have fed the flames of race-baiting and hatred over the last century. Let me be clear on this point: I understand that many black elders lived through some of the worst aspects of a racist system, like being called a nigger and being treated as second-class citizens with second-class toys and parks because a segment of their own country felt and voiced that they were somehow less than human. When US Representative John Lewis, a major figure in the Civil Rights Movement, looked into his heart and saw that the people he represented supported Senator Obama with excitement and record numbers of votes, Rep. Lewis switched his support from Senator Clinton to Senator Obama. From city to city and state to state, politicians started listening to their constituents. The promises made by black politicians who were looking for gains under the Clinton regime gave way. Black Americans and Senator Obama broke the black politics of the past and showed that they would stand up to the Jesse Jacksons and Al Sharptons.

Black America gave the U.S.A. an opportunity to move beyond race. However, Senator Clinton’s campaign, after a succession of losses in the South, decided that its best chance was to play the race card. The Clintons, who had been the pearls of black Americans’ hearts, asked the pivotal question: can Americans seriously be considering a black man over the storied house of Clinton? Suddenly, pundits, youth, and black and white Americans started to rehash the stories of yesterday. Racial divisions and classism returned. And, of course, one can purchase the new book by Shelby Steele, titled A Bound Man: Why We Are Excited About Obama and Why He Can’t Win (2008), fresh off the press, as he continues to profit from the racial divide.

Regardless of what the November elections will teach Americans about themselves, Senator Obama’s triumph as the presumptive nominee is incredible. But my heart tells me that the evidence of change is still hidden behind the voting booth. Will the United States finally live up to its founding creed of liberty for all? White Americans hold the answer, but they will only reveal it behind the closed door of the voting booth. In the U.S.A. and around the world, people seem to feel as if something amazing is happening in America. I stand firm on James Baldwin’s sentiment of the evidence of things not seen. There will be no post-racial America, and people will continue to say the right thing in public and do the race thing in private. I just hope that white America does something good for itself and not for black Americans. Exorcising the ghosts of the past might improve the US economy, education, and foreign relations. We, the people of the United States of America, run our own government, and if Senator Obama cannot deliver the changes the country desperately needs after the Bush Administration, then we can always change again. Yes we can!

By Edward Robinson, Jr. | The post Evidence of Things Hidden Behind the Voting Booth appeared first on The Public Sphere.
