The Public Sphere | A Provocative Space of Critical Conversation

The New, Hard Work of Play
By Hollis Phelps | October 31, 2014

As a parent of two young children, I spend a lot of time at playgrounds. Most parents probably do. We’re certainly lucky to have some great ones where I live, but to be honest, I really don’t like going to them. In fact, I often hate it. It’s not that I don’t enjoy spending time with my kids, playing with them and doing all the normal things that parents are supposed to do. I love my kids dearly, but that doesn’t mean I want to spend every single minute with them.

And that’s where playgrounds come in. Maybe I’m getting nostalgic now that I’m closer to forty than thirty, but it seems that my childhood playground trips largely involved me hanging out with other kids, without any parental interference. In fact, I distinctly remember going to the playground by myself—without my parents—when I was about the same age as my oldest, who’s six. Granted, we lived in a small town that was mostly walkable, but the point is that I didn’t need my parents with me, guiding my every move to make sure I had a good time. In fact, it was the lack of supervision that made it fun. Going off on my own was probably good for my parents, too, since it allowed them some worry-free time away from their children.

It’s not that way anymore. Going to the playground with kids involves constant supervision on the part of parents, supervision that is exhausting for all parties involved. Rather than being a place for play, where kids can get together with other kids free from the watchful eye of parental authority, the playground now largely just mirrors household organization in another space.

I observed this change this past weekend. There were, I would guess, about fifty kids at the neighborhood playground—but rather than making up games together or seeing who could get down the slide first, for the most part, they just “played” with their parents. It was an interesting, yet disconcerting scene: the members of each familial group interacting with each other, but not with anyone outside of their immediate circle.

Indeed, for the most part, the other groups present appeared to be a nuisance, intruders on the terrain of solitary, yet oddly public, family fun. The parents were, on the whole, protective, intent on establishing clear boundaries (“Be careful!,” “Watch where you’re going!,” “Slides aren’t made for climbing up!”). In response, the kids could only envision a morning governed by parental wishes (“Push me on the swing!,” “Play hide and seek with me!,” “Watch me do the monkey bars!”).

Let me be clear: I’m as guilty as the next parent, so I’m not trying to take the moral high ground here. And I’m certainly not knocking anyone who enjoys the structured back and forth of parent-child playground interaction. But I find it exhausting, and I would bet that most parents—and children—do, too.

I feel it, and I can see it on their faces. The parents, me included, rushing from one activity to the next, when all they want to do is sit on a bench for a while and have a normal conversation with another grown human being. The kids, looking for permission and approval, when they obviously would love to be left alone to do what they want, without guidance and, of course, criticism. They’d prefer, in other words, to climb the slide, without being told, “No!”

One might object that it’s important to keep an eye on things, so that no one gets hurt. It’s also important to teach children to behave properly. That’s why our kids need our constant attention, of course. Don’t misunderstand me, but it’s good for kids to get hurt now and then, and it’s good for parents to let their kids get hurt now and then. Besides, at most playgrounds, it’s virtually impossible to be seriously injured: the merry-go-rounds are now gone, all the edges are rounded, and the ground on which it all stands is akin to memory foam. There are, of course, exceptions, but exceptions should never determine the rule. As to teaching kids to behave properly, is it really that big of a deal to climb a slide?

All to say, perhaps we would all enjoy ourselves more—at the playground, but more generally as well—if we let kids be kids when and in the places they need to be kids, and parents be adults when and in the places they need to be adults. And that often involves both parties doing their separate things, without interference. Allowing that to happen is much better for the sanity of all involved, but also, I would argue, for everyone’s enjoyment.

* Photo: Children at play, from the collections of the State Library of NSW.

What We Lose When We Lose God
By Alan R. Van Wyk | October 25, 2014

A.O. Scott has recently proposed that we are living in a post-patriarchal age that is also the end of adulthood. Here I want to suggest that the death of God continues to be a more fundamental liberating loss of our cultural moment.

Our most interesting beginnings often only appear after a time of wandering; usually we end where we ought to begin. So at the end of a rather strange essay arguing for the death-of-patriarchy as the death-of-adulthood, A.O. Scott offers a wonderful articulation of our present cultural moment:

A crisis of authority is not for the faint of heart. It can be scary and weird and ambiguous. But it can be a lot of fun, too. The best and most authentic cultural products of our time manage to be all of those things. They imagine a world where no one is in charge and no one necessarily knows what’s going on, where identities are in perpetual flux. Mothers and fathers act like teenagers; little children are wise beyond their years. Girls light out for the territory and boys cloister themselves in secret gardens. We have more stories, pictures and arguments than we know what to do with, and each one of them presses on our attentions with a claim of uniqueness, a demand to be recognized as special. The world is our playground, without a dad or mom in sight. I’m all for it. Now get off my lawn.

Or, as Nietzsche had it a few years earlier:

God is dead. God remains dead. And we have killed him. How shall we comfort ourselves, the murderers of all murderers? What was holiest and mightiest of all that the world has yet owned has bled to death under our knives: who will wipe this blood off us? What water is there for us to clean ourselves? What festivals of atonement, what sacred games shall we have to invent? Is not the greatness of this deed too great for us? Must we ourselves not become gods simply to appear worthy of it?

The argument Scott presents prior to this conclusion-as-more-interesting-beginning is, in many ways, his attempt to reconcile himself to the rather simple realization that culture, whatever else it may be, is an historical thing, that it is a heterogeneous becoming constantly in motion and as movement sometimes passes us by. And in being passed by, as Nietzsche also pointed out, we may find ourselves overcome in resentment; or as Scott might recognize in himself, if allowed a moment of self-reflexive retraction, we become invested by our own irrelevance (a subjection he quite rightly recognizes in others). In this irrelevance we are left to the whims of our own personal preferences masquerading as critical insight; which is to say, with Scott, that the “elevation of every individual’s inarguable likes and dislikes over formal critical discourse, the unassailable ascendancy of the fan has made children of us all.” At which point Scott becomes irretrievably implicated in his own argument; which isn’t to say the argument is wrong, just that it is a little confusing.

To consider our cultural moment as saturated with the death of God, especially in its Nietzschean variations, proposes that what is lost or has been lost and continues to be lost – or more properly what has been struggled against and partially overcome moving into our cultural past – is not patriarchy or adulthood per se but the hierarchy of being through which our world has been organized. That hierarchy functions according to an imago dei of imagined likeness and proximity to a creating-seeing god. This is a god who created through the word but judged through vision, a world spoken into being but seen to be good. This god, whom Nietzsche declared murdered, saw all – and in seeing judged all, in life and in death – but could never be seen, or rather, could only be seen at the price of death. For Nietzsche there are always too many murders to keep track of. In its historical development, European-American patriarchy was organized around this imagination: drawn to a certain height by the transcendence of their god, white men projected themselves as mundane judges of the world, unseen seers and organizers of the world of their vision. We see this in practice when Scott admits to “feeling a twinge of disapproval when I see one of my peers clutching a volume of ‘Harry Potter’ or ‘The Hunger Games.’ I’m not necessarily proud of this reaction.” (Although his loss of pride is not much of a deterrent.) “As cultural critique, it belongs in the same category as the sneer I can’t quite suppress when I see guys my age (pushing 50) riding skateboards or wearing shorts and flip-flops…” For Scott, the world presents itself before his gaze to be judged and in being judged put in its place. The patriarchy and adulthood-conflated-parenthood that concern Scott seem, in this sense, to be different if at times overlapping practices of this spectacular hierarchical judging imagination.

In this sense, the “best and most authentic cultural products of our time” are not, as Scott argues, simply those that manage to be invested by the scary, weird, ambiguous and fun condition of our day. This could be said of any day of any time, because abundant living is rather scary, weird, ambiguous, fun, and quite a bit more. The most interesting cultural – and we should add political here too – products of our time are those that are attempting to create worlds outside hierarchies of vision, worlds organized around being together on an immanent plane feeling ourselves in our worlds and in our relations other than as being on display to be judged.

Scott almost sees this in his recognition of a feminism that exists outside the world of post-patriarchal men, a world of irrelevant losers who can imagine nothing other than their being the center of their own narratives. But rather than being merely a passive concept for existing in the wake and absence of patriarchy, the cultural feminism that Scott sees (as that circumscribed by Beyoncé and pop-music and network and cable television) is also an active practice of imagining the world otherwise, a world organized around practices other than transcendent seeing and judging. These are worlds organized around finite relationships of friendship, work, sex, cultural production and participation, and maybe, even sometimes, love. This is of course a difficult world to see in; in a literal way, when meeting face-to-face, without the domineering privilege of transcendent height, I only ever see at best half of you. For our being in the world, our relations with and in the world, something, and usually many things, are necessary beyond a mere seeing; a relational experience is necessary that cannot be reduced to a simple vision. This is in part why the casual nudity of Hannah Horvath/Lena Dunham is often so troubling for cultural critics to understand: her body is not being presented for a gaze, anyone’s gaze. It is, as we learn almost immediately, a way of relating and of being together. This is also why it sounds so strange for Scott to even worry about the sexualization of female pop-stars: most bodies simply aren’t for your viewing pleasure. It takes a special kind of hubris to think otherwise.

But this, of course, is not easy. As Nietzsche warned and worried, the announcement of God’s death may be a bit premature: we have not gotten rid of God because we have not gotten rid of His grammar. If patriarchy and parental adulthood are lost, we still live in their shadow and their logic. Alongside Hannah/Lena’s casual nudity is a pervasive self-criticism of her appearance. But what makes these shows interesting, what makes feminism – and all those other movements that Scott ignores – interesting, is not that they are complicated but that they are trying to create a new world and begin new conversations that may make no sense in the grammars that we have been given. This, it would seem, is the proper work of adulthood in the wake of God’s death, a learning to live without.

Red Baiting Mandela
By Derek Charles Catsam | October 17, 2014

In the wake of Nelson Mandela’s death in 2013, a small but vocal number of critics decided to pull out an old canard about his affiliation with Communism.

A friend and I were having drinks in one of the many bars in Melville in the midst of South African winter in 2012. We both write about South African history and politics, and so were debating the governing African National Congress.

We were confronted by a disheveled white guy. He came over and sat down and started talking with us. He discovered that we both were professors and seemed interested in our being American, but only long enough to allow him to start off on a tirade about liberal Americans not getting the realities of the role of Communism in the ANC. He was getting increasingly riled up and increasingly incoherent. But he was name-dropping Stephen Ellis, the veteran scholar of African history and politics, and insisted that Ellis’s book would rip the roof off the cover-up of Communism in the ANC, and especially Nelson Mandela’s Communism.

The man, I later realized, was Rian Malan, the scion of the Apartheid elite who rose to fame for his apostasies against the legacy of white supremacy of which he was the rightful inheritor. His book, My Traitor’s Heart, was a bestseller, and branded Malan as a courageous figure who turned his back on a world in which he could have risen rapidly.

Yet in recent years, Malan, for reasons I cannot divine, has become increasingly obsessed with the idea that Communists deeply pervaded the ANC, and in particular that Nelson Mandela was one of them. This is a peculiar obsession in no small part because it is not particularly accurate, and whatever accuracy there might be to the case is not especially compelling.

It is worth pointing out briefly the historical salience of rabid anticommunism in South Africa. Throughout the entirety of the apartheid era, the ruling National Party attributed nearly all opposition to its rigidly racist policies to an encroaching Communist menace. The Nats used absurd rhetoric and draconian policies not only to keep black nationalist opposition (such as the ANC, but also the Pan Africanist Congress and other organizations) at bay, but also to prevent the United States and Great Britain, themselves besotted with anticommunist fervor, from taking a stand against their white South African Cold War allies. This gambit was born of both cynicism and self-preservation. And it worked.

Earlier in the year, I was reminded of my unpleasant encounter with an aggressive, bombastic, boorish Malan when, as celebrations of Mandela’s life were still pouring in after his death, a voice emerged out of the wilderness of the overheated right-wing bunkers. In a January 2014 PJ Media article, Ron Radosh, who has made his own lucrative career at the nexus of scholarship and journalism by finding Commies under every bed and in every faculty club, played up the Mandela-as-Communist trope. And naturally he invokes Rian Malan. Who invokes Stephen Ellis (whom Radosh also invokes).

Reading the piece brought me back to the conversation with Malan, and in both cases my thoughts were: There is not much there there. Radosh and Malan have created an echo chamber that is deeply reliant on Ellis’ scholarly imprimatur. It pretends that it is somehow revelatory that Communists and the ANC worked together even though that was never a secret. And crucially, it makes all sorts of arguments based on a shockingly thin evidentiary base. I’ve linked to the article above and won’t do much to rehash what Radosh has to say – it is the usual strident Red Baiting coupled with more than a little race baiting.

But let’s go straight to Ellis, the alleged source for this new Red Scare. First, in some of his writing Ellis has shown himself to be a bit of a fellow traveler with Radosh and Malan, but he is also undoubtedly a far more serious and prolific scholar. Ellis’s book External Mission: The ANC in Exile 1960-1990, which was supposed to provide the big reveal that Malan warned me about in Melville, is presented as a trump card. However, it’s telling that Radosh and Malan don’t really cite the book. External Mission is a fine work of scholarship. But let’s just say that if it provides the foundation for the Mandela-as-Commie-radical meme, that foundation has a lot of cracks.

According to Radosh and Malan, Ellis purports to reveal that far from having dabbled in Communism, Mandela was instead a Party Member, and a high-ranking one at that. His evidence is pretty shallow – a handful of interviews with people testifying to Mandela’s membership in the party decades after the fact are about the extent of it. Not a single document. Not a single testimony from the time. Not a whole lot at all. And he presents even less to indicate anything other than perfunctory involvement, even if, of course, Mandela had access to numerous higher-ups in the SACP hierarchy, given how much it was intertwined with the ANC’s leadership.

And yet, Radosh and Malan, so reliant upon Ellis to bolster their claims of Mandela as an untrustworthy Commie Radical, must not have read as far as pages 33 and 34, where Ellis writes, “Despite [South African Communist Joe] Slovo’s disappointment at the cooling of Mandela’s communist sympathies, it is evident that Mandela’s brief membership of the Party was motivated by pragmatism rather than ideological commitment, that his opinions on communism had a strongly Christian tint, and that his primary allegiance was to Africa.” In essence, it is tough to take seriously Radosh’s dismissal of the allegedly leftist claim that Mandela’s alliance with the SACP was brief, utilitarian, and subsumed to larger ideals in light of the fact that the man he cites as demolishing that allegedly leftist myth is pretty clear that Mandela’s alliance with the SACP was brief, utilitarian, and subsumed to larger ideals.

But if we even grant that there is a scintilla of an argument that Nelson Mandela was indeed more deeply involved than is commonly believed with Communism for a few months (at most) in the early 1960s, when a radical response to Apartheid seemed like just about the only sensible response – so what? Why does it matter? What does it all mean now, and how does any of it have anything to do with who Mandela was and what he meant and what he accomplished?

Mandela had a brief alliance with Communism and clearly grew disenchanted; this is not only my interpretation, this is Stephen Ellis’s in the very book that Malan and Radosh use to try to build a Communist mountain out of a molehill of an alliance. However, this should not come as a surprise. Communists were, after all, right on the Apartheid question, which is to say the question that was at the heart of Mandela’s struggle, his existential struggle and his fight for freedom. And in South Africa, the various strands of Communism, even the most doctrinaire, yielded the class struggle to the struggle against white supremacy, though many hoped to see the two yoked together.

So many of those who opposed Mandela, the ANC, and the anti-apartheid opposition, especially in the fraught post-Soweto era, now want to point out Mandela’s communism. They were wrong then, and now they are trying to validate where they stood as history played out by invoking the Communist bogeyman still lurking in the fever dreams and dark corners of a certain kind of conservative and neoconservative in the US, Great Britain, and even South Africa.

It is perhaps also worth noting that Mandela’s flirtation with Soviet Communism, with what amounted to an alliance of those who opposed National Party white supremacy, took place just a decade and a half after the United States, Great Britain and other allies – including, it must be noted, the white segregationist government of South Africa (an alliance that admittedly created divisions that sowed the seeds for the National Party takeover in 1948) – had allied with Joseph Stalin’s Soviet Union. And the red baiters are usually the first to acknowledge Stalin as an apodictic form of evil. But why is it that FDR and Truman, Churchill and Attlee, and even Smuts were able to ally with Stalin to fight the “Good War,” but Mandela and Mbeki, Tambo and Sisulu, were not able to join with a particularly South African and decidedly non-Stalinist version of Communism for their own existential struggle in which they were on the side of right?

And it is not as if the pragmatic alliance with evil in World War II marks a particular exception in American history. The United States – Ron Radosh’s United States, the United States of Ronald Reagan – has forged myriad alliances that toss cold water on a whole lot of freedom-loving ideals, and oftentimes in the very name of combating the Communist evil they have allied with their own devils. One can imagine that justifying selling arms to Iranian jihadists in order to support nun-raping, child-murdering anti-Communists in Latin America, supporting the Mujahedeen in order to combat Soviet incursions into Afghanistan, supporting Saddam Hussein against that very same Iran, and support for Mobutu Sese Seko (just to name four) might give some Americans pause when it comes to condemning even the strangest bedfellows of others. Ron Radosh, alas, does not have a history of being particularly self-reflective as he plows forward with his moral verities.

Finally, if Mandela was a Communist, he was, in the words of the friend with whom I spent that evening in Melville, “The Worst Communist Ever.” It was Mandela’s government that set the New South Africa on what some of its critics have loudly and often declared to be the path to neo-liberalism. Mandela’s South Africa has become many things. A Socialist paradise or Communist idyll is not among them.

Nelson Mandela may or may not have briefly been a Communist. But if he was, that membership ended long before he went to trial in 1964, spent 27 years in prison, negotiated with the National Party (alongside Joe Slovo, the Communist, who also was central in the CODESA negotiations), governed as a center-left pro-market neoliberal, and worked to reconcile a country torn apart because of an apartheid regime that the most ardently anti-Communist nations buttressed. That particular strain of anti-Communism was wrong and dangerous when it was applied to South Africa then. It is wrong, anachronistic, and frankly, just kind of sad now.

Howling Again
By Katy Scrogin | October 10, 2014

People just keep checking out

forever:

Wits savage enough for prophecy

Straight backbones with x-ray vision

and unheeded verses

Malcontent hearts ready to recycle the heated shrapnel colonizing their guts

over years

over voices raised

over the choking scrape of verbal metal on membrane

ready to expel ancient buckshot newly honed

for the slaughter of aches and ignorance

I would quote Ginsberg but

his long truth is all wrong

now after anthologies

I would say why not others

any random sampling of the mundane mass

of less than lambs

upright on pale feet

wrapped in child-crafted throwaway single-season size sevens

innocent of the weight of annihilation or

the meaning of that word

in conjunction with any reality in which they star

I would say that but

It would be wrong, wouldn’t it

to ask for sacrificial stand-ins

cheap substitutes

death spores wandering

amid untimely removed kernels

of brilliance

and light

and hard-shelled compassion

Wouldn’t it?

I would howl but

in empty rooms

the screaming echoes

rebound

gain force with each pound

against rigid absence

take on too much momentum

to swerve around the creator

who gave them shrill life

cannot avoid throwing her

to the floor of the world

still searching for the annulled

the ghostly whys

her friends


Roaming the Land: The Immigration Crisis and The Walking Dead
By Josh Barfield | October 3, 2014

The immigration crisis at the US-Mexico border is beginning to look like an episode of AMC’s wildly popular drama, The Walking Dead (TWD). This year alone has seen a 500% spike in apprehensions of families at the border, while it is estimated that over 57,000 unaccompanied children have been detained in fiscal year 2014. If these numbers, along with pieces such as Leni Velasco’s on the Filipino sugarcane crisis and NBC’s on Honduras’ gang violence, are any indication of these travelers’ reality, which includes a search for place amidst hunger, danger, and dehumanization, this alarming correlation between TWD and the current immigrant exodus into the U.S. is not unwarranted.

In line with the experiences of many immigrants, TWD focuses primarily on its characters’ search for a new home and their fight to survive along the way. Although this zombie-filled show has its fair share of zombie-on-human crime, TWD is rarely about zombies, and is instead often at its best when showing the day-to-day struggle of a traveling group of survivors dealing with hunger and difficult social situations, travelers in search of a place to settle down after losing everything. At its roots, the show is a tale of desperate homelessness in a land where finding sanctuary is the hope, and the greatest impossibility.

This sense of homelessness, shared by the show’s characters and the very-real immigrants risking their lives to cross the border, is what I call dislocation: being cut off from one’s own land and the life that comes with it. Like a limb separated from its socket, with great pain, the characters become dislocated from their own homes, pasts, and often, loved ones. The show’s main character, Rick Grimes, knows that stability is important, and as the leader, his number one concern for and answer to creating such stability is to find a suitable place to live and start over, to call home. By the show’s third season, the zombies are almost an afterthought, and easily protected against. The real enemy, then, in both fiction and reality, becomes dislocation.

At one level, the threat of being dislocated from a land to call your own is simple: danger is everywhere on the road. Walkers (the show’s name for zombies) are always a threat; other hungry travelers are always trying to take what is yours; and the places you think are safe and inviting end up belonging to a horde of cannibals. If Cormac McCarthy’s The Road – another father-son tale of wilderness survival – teaches us anything, it’s that traveling post-apocalypse is perilous and unpredictable, which is why it induces so much fear among those forced to wander.

That fear, however, also operates in another register in current political rhetoric: the immigrant’s fear of landlessness is met with fear of the immigrant. This other register displaces immigrants from the place of the survivor, and paints them in the shade of the zombie. For example, Texas Governor Rick Perry sent 1,000 National Guard troops to the Texas border in early August, as if trying to eliminate a zombie threat or contain a contagion, while in 2008, former Senator Fred Thompson likened immigrants to zombielike “ . . . suicidal maniacs [who] want to kill countless innocent men, women, and children around the world.” Also in zombie-movie fashion, Fox News ran a piece on the possibility of immigrants carrying communicable diseases, noting that some doctors fear that immigrants carry a drug-resistant strain of TB that is spreading in several counties in Texas. The immigrant, in this rhetoric, is the embodiment of an unknown menace, one that threatens the physical, moral, and economic health of America and its way of life—a rhetoric containing a dehumanizing zombification of the immigrant.

In a conflicting manner, TWD embodies both this cultural fear of the immigrant and its paradigmatic tales of survival, in terms of a conservative reaction that focuses on setting humans apart from zombie-immigrants. Concerning the current immigration-crisis narrative, the characters in the show simultaneously serve as allegories both for the weary immigrant travelers looking for new homes and for the conservative project that would keep these travelers on the other side of the wall.

TWD presents the dueling narratives of the zombification of immigrants and of control and counter-immigration particularly well. We simultaneously see a group of survivors and separatists. This conjunction is quite telling at times, especially in exposing what both groups share: a concern for land and place. Throughout the show, this concern cuts through the ambiguous division of good and evil, human and zombie. As far as the zombies are concerned, the only thing that separates them from humans is the latter’s hope of shedding vagrancy and settling down, a further reiteration that the zombies are not necessarily the enemy, but fellow travelers forced into survival against their will. Hence the worth of calling the zombies “walkers” and “roamers”: they are creatures who wander around, look for food, and kill to survive, much like Rick and his group. Whether Rick is biting the neck off a guy to save his son, or group members are covering themselves in zombie-flesh to disguise their smell and avoid the roamers, the show often plays with the dark similarities between survivors and walkers.

The difference between the dead and the living further collapses when the group discovers everyone is infected and will inevitably turn into zombies when they die. “You KNOW that when we die,” Rick powerfully explains in Issue 24 of the comic book parallel to the show, “we become them. You think we hide behind walls to protect us from the walking dead? Don’t you get it? We ARE the walking dead! WE are the walking dead.”

The arrival at Terminus at the end of Season 4 clearly shows this lack of difference between zombie and human. Terminus is undoubtedly a colony of cannibals, representing the breakdown of the only thing that truly separates walkers from (sometimes) smarter, less flesh-hungry human walkers. Here, people eat other people, formidably bypassing the need to turn into a zombie to act like one. Terminus, which means “end destination,” suggests that cannibalism is the end point on humanity’s devolving path from people into monsters. Terminus is what it looks like to survive in the apocalypse, which means becoming more like the dislocated zombies who roam about.

Despite this dark revelation of their path to zombiehood, Rick’s group is in constant search of what sets them apart from their zombie co-inhabitants of post-apocalypse Georgia, refusing to accept that people can’t be good and live together in peace. That hope is anchored in finding a place to make a new home, a real community in a permanent place. The characters of the show are happiest and most stable when they know they have a safe keep. If the most humanizing aspect of TWD is finding a place and a home and forgoing wandering around searching for food like the zombies, then it is clear why places like the prison and the town of Woodbury are so coveted.

But as we have seen, there are pitfalls to even the most stable communities, especially when their sense of community is predicated on fear of the other and on their unity in keeping everyone else out. As much as Woodbury, a protected community with heavy-handed leadership, values place and people’s attachment to the safety and humanity it brings, it ultimately falls apart from greed, exploitation, and a failure to live with the other. Terminus also seems a stable, safe place to settle down, but the end of the fourth season revealed that it is only a static hub for weary travelers to come and become dinner. In this way, Terminus still relies upon and, literally, feeds off transient culture.

Whether the show is telling the story of conservative walling off or immigrant survival, or both simultaneously, it is surely concerned with the effects and struggles of people looking for something more while juggling whether or not to let others, outsiders, join in that search. At many levels, TWD expresses our inner brokenness and capacity to devour those around us without concern. In the end, the show powerfully suggests the risky, yet beautiful, reality of letting others in from the outside to live and survive in the land together, a reality ironically and somewhat unevenly expressed in the prison and its farm. Here, Rick and Herschel come to realize land and place are vital to survival, which is what makes them different from the wanderers on the road, both human and dead alike.

The similarities we’ve seen between fearful humans and rootless zombies play into the vicious nature of the characters and their attempt to stay away from such violence. More precisely, I think both human and zombie remind us what we could become if we continue to act in an exclusionary way. That is, if we are unwilling to rethink our own constructed borders, we may find that we, too, are the walking dead.

Issue № 6 | December 2009
By The Public Sphere | December 15, 2009

What is “health,” and what does it mean to be “healthy”? In this issue, Breanne Fahs queries how we, along with the pharmaceutical industry, have come to redefine mental, emotional, and sexual health. Meanwhile, Helen Heightsman Gordon’s poem reflects on caretaking. Alex Jay Kimmelman reminds us that people once traveled to find healthier climates in the Western U.S.A., while Hope Miller reflects on a last breakfast before leaving the Western state of Utah. Luke Perry offers insights into the exceptionalist bent of U.S. political culture that underlies a contentious healthcare debate, and James K. Walker examines alternative approaches to the body in Le Parkour in Britain.

This is the last issue of The Public Sphere until September 15, 2010, when The Public Sphere will return from a brief hiatus. Because much of the magazine will stay the same, we will accept submissions until August 15, 2010.

When the U.S. West was a Place to Find Health
By Alex Jay Kimmelman | December 15, 2009

When Josiah Gregg and a company headed southwest on the Santa Fe trail in 1831, the young man was confined to lie prone in the bed of a Dearborn wagon. He suffered from chronic dyspepsia and tuberculosis, and western travel was prescribed for his condition. This therapy proved to be highly successful. Two weeks into the journey Gregg was riding a pony, and within eight weeks he had recovered completely. When his book, Commerce of the Prairies, was published in 1844, it became one of the most influential books of its time. The legend of the West as a place where health was restored became firmly embedded in America and beyond.

Healthseeker, health migrant, old lunger, one “chasing the cure” – these were all names given to people who took to the road in search of healthier places. Today, when people become seriously ill, they check into a hospital; the same could not be said of an earlier era. Finding better health was a search for a place where the person felt better. This idea of travel for health was an ancient tradition. Since the sixth century, when Greek and Roman physicians prescribed a sea voyage across the Mediterranean to North Africa for their patients with cardio-pulmonary conditions, travel therapy remained one of the few options for invalids.

America of the 19th and early 20th centuries was a sickly place. Gastrointestinal ailments caused by bad food and bad water afflicted nearly everyone. Seasonal fevers persisted, especially in humid, riverine locales; precisely the places where most Americans lived. The medical theory that fog and mist were miasma (bad air) persisted well into the 1900s. The major killer, however, was tuberculosis. The disease thrived in most places, especially in conditions of overcrowding, humidity and contaminated air. Mortality rates for TB ran ten to twenty percent overall, and as high as forty percent in urban areas.

Together with Josiah Gregg, two other notable personalities at the end of the century, Teddy Roosevelt and Mark Twain, expressed their own siren calls to “rough it”; that is, to obtain a tent to live in and pitch it in the forests, prairies, and deserts of the American West. Billy Jones, in his 1967 book Healthseekers in the Southwest, concluded that “the search for health was a factor second only to the desire for land in attracting permanent settlers to the Southwest; easily 20 percent of those who migrated to the region between 1870 and 1900 were hopeful invalids.” As late as 1904, the International Conference on Tuberculosis issued a declaration stating, “in the failure of any medication and therapy, travel remains the most effective method for combating the disease.”

Two schools of medical thought emerged to channel the healthseekers to certain locales. The heliotherapists, taking their lead from Swiss physician Auguste Rollier, sought places where the solar rays were superior and might be employed to kill bacteria. High mountain retreats in Switzerland and the U.S. Rockies competed with the low desert valley communities of southern Arizona and California. U.S. climatologists headquartered in Colorado Springs sought to match patient needs to climate conditions. Individuals in the incipient stage of disease might be directed to high altitudes, where heart and lung function would be taxed with the beneficial result of white blood cell creation. Those in the acute stage were directed to low altitudes, where weather conditions provided warmer days without extended periods of precipitation.

Both the heliotherapists and the climatologists took a page from the work of Dr. Edward Trudeau at his wilderness sanitarium in the Adirondack Mountains. Trudeau housed his patients in tents, affording them maximum exposure to clean, fresh air. Plenty of good food and absolute rest rounded out the therapy regimen.

Regardless of the intent of the medical practitioners, individual patients often took matters into their own hands and engaged in seasonal migration. Finding the heat of the deserts in summer as oppressive as snow in the highlands during winter, they moved about as need be, always in search of the elusive “maximum level of comfort.” In the absence of a mechanism that might cure their condition, they traveled in search of that place where their ailments seemed to abate. Financial status was not a limiting factor among the masses traveling the West in search of better health. For those with financial means, an industry was rapidly growing throughout the West to compete for their dollars. Hotels, convalescent homes, sanatoria, rest camps, boarding houses; all were sprouting up in towns and villages along the railroad routes. For the indigent healthseeker, shanty towns and tent cities had to make do.

This phenomenon was never more evident than in the aftermath of World War I, when the federal government faced the daunting task of caring for some 300,000 veterans with a variety of conditions. Among these were victims of poison gas, survivors of the Spanish influenza who had developed secondary conditions, wounded soldiers from the war, and tuberculars. In the name of efficiency, the government concluded that regional treatment centers were the answer.

The veterans had other ideas. In the instance of the southwest regional center located at Livermore, California, some found the proximity to the Pacific Ocean’s damp and the salt air irritating. Levels of precipitation might be disagreeable along with a number of other factors. While technically assigned to the post, men took the initiative to move to more agreeable environs. Hence places like Tucson and Phoenix became inundated with gas victims and tuberculars. The search for comfort paralleled the impulse for survival.

However, not all local populations embraced the healthseekers who sought convalescence in their communities. In the early years, healthseekers who brought investment capital were welcome citizens. The legions of poor and working classes who followed were not. Healthseekers were soon ostracized by those fearful of the highly contagious diseases they carried. Denver, Colorado, was an excellent example. With the city touting its climate as therapeutic, four tuberculosis hospitals opened there by the early 1890s. Within a decade, locals were contracting TB in dangerous numbers. Suddenly, the invalids were no longer as welcome. Those already in the Queen City were shunted off to isolated areas.

Some places sought to deter the arrival of tuberculars through quarantine or outright prohibition. States on the southern tier appealed to the Federal government to take action. They argued that they suffered an excessive financial burden in being forced to care for the large number of indigent invalids. In 1914, the Shafroth-Calloway Bill was proposed in Congress by nine Southwestern states. Among the bill’s provisions were the use of abandoned military reservations and other government property as tuberculosis sanitariums specifically for indigent patients. Western cities and states would receive financial aid for providing welfare to those arriving from the eastern part of the country without the assistance of their home state. Critics contended that the legislation included no provisions to prevent physicians and other welfare agencies from sending their indigent consumptives West, an argument that helped convince Congress to reject the bill.

Others took a different tack to convince invalids to remain in their home communities; they wrote reports and editorial comments that appeared in magazines and major eastern newspapers. Journalist Samuel Hopkins Adams contributed several articles, including an influential piece that appeared in McClure’s Magazine in January 1905 in which he lambasted the western health movement, promoting instead the principal element of Dr. Trudeau’s therapy: fresh air. Adams pointed out that fresh air and building ventilation were not the sole purview of the West and argued that, “where a tent is unavailable, a roof or porch will do. . . . Climate, while it may be an aid in some cases, has much less influence on tuberculosis, except in the later stages, than is generally supposed.” Adams and others offered an alternative to moving west: move out onto your porch. Thus was created the “porch cure.”

Writers in the West also contributed missives about migrating for health. Warner Watkins, a Phoenix physician, contributed an article, “Ignorance or Malpractice,” to a 1909 issue of the Journal of the American Medical Association. Watkins blasted Eastern doctors who sent “patients of meager means with advanced cases of consumption” to Arizona. He pointed out that “each winter the Associated Charities of this city [Phoenix] is swamped with such a class of patients and the county hospital is filled with them and our potter’s field is a veritable monument to the guilt of all practitioners who are guilty of such malpractice.” Consumption was a term used to describe the withering away of the body and the difficulty in maintaining weight that was common among tuberculars. However, given the limitations of medical diagnostics of the time, consumption was also used to describe a range of respiratory ailments including lung cancer, emphysema, asthma, chronic bronchitis and sinusitis.

Journalist and historian Sharlot Hall, in her article “The Burden of the Southwest” appearing in Out West (January 1908), spoke of “a strangely careless disregard of details, an iridescent illusion” created about Arizona. She wrote of an all too familiar situation for the healthseeker. “He goes out, too often, with a light pocket to a strange place, to seek work which he is not able to do for the sake of a climate about which he knows nothing.” Hall was one of the very few to point out that conditions in the West were especially difficult for women. She advised women that bringing sufficient financial resources was a must to ensure a good place to live and adequate fresh food. The most likely employment available to women was of the domestic variety, which was not likely to provide the rest necessary to recovery and recuperation.

As the 1920s commenced, many western communities sought to attract other migrants, just not indigent healthseekers. Thus Western writers re-inscribed their locales. Instead of being the place of last resort when a person had one foot in the grave, the West became a site of youthful vim and vigor. Instead of going out West to regain health, one traveled there to retain good health. The new marketing approach targeted the healthy tourist, rather than the sickly immigrant. The tourist and retiree took over as the seasonal migrants. The snowbirds had arrived.

Why did the healthseeker movement last so long, and why does it, by some accounts, continue today? Science prior to the mid-twentieth century offered only personal observation to support the travel therapy, and the individual in most cases engaged in multiple relocations. In fact, a hard science discovery inadvertently led to greater dislocation of invalids. In 1882, when German physician Robert Koch identified the tubercle bacillus as the cause of tuberculosis, he disproved the conventional wisdom that the disease was passed through heredity. Where a family might keep a loved one who suffered the disease as “God’s will,” they were more likely to evict one who contracted the disease through moral failings. Overwhelmingly, one started down the road to health and traveled to a specific location because of a testimonial touting the life-saving aspects of that place.

And what of today? The U.S. West is still attractive as a location for retirees and seasonal migration. The climate, particularly during winter, has not changed. Some desire to live close to the Mexican border so as to access cheaper drugs, medical services, and therapies unavailable in the U.S. Some travel to places where folk remedies are more accessible (and less scrutinized). The medical industry has reached new heights in diagnostics, drug and physical therapies, new technology, and health maintenance mechanisms. At the same time, diseases like tuberculosis have mutated into drug-resistant varieties, particularly among the HIV/AIDS community. The cost of conventional care is rising and some individuals are increasingly resistant to institutional and regulatory dictates. Are we merely connecting health and place in a newer, more technological fashion in the telecommunications era, or are we preparing for a return to the physically wandering healthseeker?

U.S. Exceptionalism and Opposition to Healthcare Reform
By Luke Perry | December 15, 2009

Political discourse surrounding healthcare reform has included purposeful disruptions of Congressional town hall meetings, the brandishing of firearms at opposition rallies, and the use of Nazi imagery to depict President Obama. Why has opposition to healthcare reform been so contentious? Conventional responses from the political right typically focus on ideological differences, such as varying views on the appropriate role of government in society, or the perceived need to prioritize other issues, such as the economy. Conventional responses from the political left typically focus on the perceived entrenchment of private insurance companies or the unwillingness of Republicans to work in a bi-partisan fashion. Discussion of U.S. political culture is notably absent from efforts to understand opposition to healthcare reform. This essay will illuminate the ways in which the exceptionalism of U.S. political culture provides a context to better understand this opposition.

Exceptionalism is the idea that U.S. society, politics, and economics are unique and better than those of other societies and peoples. U.S. political culture has a long history of exceptionalism dating back to colonial America. Puritan leaders, such as John Winthrop, viewed the Massachusetts Bay Colony as “a city on a hill with the eyes of the world upon them.” The Puritan goal was to create a model of Christian morality. Theocracy gave way to broadening conceptions of freedom, which eventually led to an irreparable break with Great Britain. The Founders articulated their conceptions of freedom using universal language, which was focused on all of humanity, rather than just citizens of the U.S.A. This was remarkable considering how this little group of colonies broke away from the most powerful empire in the world; success was far from likely. Thomas Jefferson began the Declaration by placing the American Revolution “in the course of human events” and explaining that when rebellions occur, reasons had to be provided. The Founders justified the rebellion through dedication to certain natural rights premised on the notion that all “men” were created equal. Essentially, the one thing all human beings have in common is that we are not God, so all people, including governments, must respect basic human rights. The Founders believed they were making a grand statement for all people whose government infringed on their natural rights, not just colonial Americans in 1776. The U.S. remains unique in having natural rights written into the country’s founding document, including the right to rebel if government infringes on these rights. To this day, U.S. leaders regularly invoke the imagery of “a city on a hill” in speaking about the exceptional character of the U.S. experience.

A second way exceptionalism is manifested is through U.S. foreign policy. The U.S. first embraced democracy promotion during World War I under Woodrow Wilson, who famously stated “the world must be made safe for democracy.” This quote is revealing because it highlights the belief that the world must be adapted to suit U.S. political beliefs and values, rather than the other way around. The U.S. emerged as a major superpower after World War II and as the world superpower after the Cold War. From a Western perspective, democracy’s major ideological rivals, fascism and communism, were severely discredited after the three major conflicts of the twentieth century. Exceptionalist elements of U.S. political culture now believe that the U.S.’s unique path to the top demonstrates that U.S.-style democracy and capitalism constitute the best of all types of social order. This is exemplified in President George W. Bush’s 2003 State of the Union, where he stated that “Americans are a free people who know that freedom is the right of every person and the future of every nation.” The exceptionalism of the U.S. tradition is now connected with the geo-political realities of U.S. military and economic power. The U.S. views itself as the model of democracy in an era of globalization where major powers have profound impact on the world at large.

Exceptionalism provides a useful perspective through which to better understand the contemporary healthcare debate given its historical prominence in U.S. development and culture. Senate Republicans, such as Senate Minority Leader Mitch McConnell, Orrin Hatch, Jim DeMint, and Richard Shelby, have argued that the U.S. has the best healthcare system in the world, as did George W. Bush and President Barack Obama’s rival in the 2008 election, John McCain. These arguments have created controversy and confusion. One of the few things that Republicans and Democrats agree on is that healthcare reform is needed. Major differences emerge over how to do this. How can the U.S. healthcare system simultaneously be the best in the world and be in need of reform? Conservatives inherently want to slow the pace of change. One way to articulate and justify this political behavior is to laud the status quo, which, in this case, is the current healthcare system. One tactical way to do this is to hyperbolize the effectiveness of the current system, which particularly resonates with many U.S. citizens because of the role of exceptionalism in U.S. political culture. The inverse approach has been adopted as well. In addition to lauding the status quo, the enemy, Barack Obama in this case, has been demonized. Prominent examples include Representative Joe Wilson’s unprecedented shout of “you lie” during a presidential address before Congress and popular conservative talk show host Rush Limbaugh comparing Obama to Hitler. “Going negative” and criticizing political rivals is not new. Importantly, however, these criticisms have more traction and can be more outlandish when framed in a belief that U.S. healthcare is exceptional, so that whoever seeks to change the status quo threatens national well-being and is deserving of harsh criticism.

Public opinion is a second way to consider the impact of exceptionalism in the opposition to healthcare reform. Access to healthcare, a major concern of Democrats, does not resonate with broader U.S. culture to the same degree that it does in the Democratic party, even though Democrats received widespread support in the 2006 and 2008 elections. People in the U.S. predominantly view poverty as the result of individual failures; this view contrasts with much of Europe, whose people predominantly view poverty as the result of structural problems, such as the lack of education or the lack of opportunity. The U.S. view constrains reform efforts because people who are financially successful are considered exceptional and thus more deserving of healthcare coverage than financially challenged Americans, who are blamed for being poor and for their inability to gain or purchase healthcare coverage. These attitudes reflect a form of Social Darwinism. In the nineteenth century, Social Darwinists, such as Herbert Spencer and William Graham Sumner, justified economic inequality as a natural product of competition and used this belief to advocate limited government involvement in social activity, and such attitudes linger in U.S. public exceptionalist sentiments. Not surprisingly, the U.S. has the most limited welfare state in the West. In turn, people in the U.S. are divided over whether the federal government should make sure all U.S. citizens and legal residents have healthcare coverage, again in sharp contrast to European countries, all of which have a greater government role in healthcare to ensure access.

The divisions that now plague healthcare reform in the U.S. run much deeper than this moment. U.S. political culture is inherently resistant to political change that questions the exceptional nature of how people in the U.S. live and that seeks to build a more collective understanding of the public good. The U.S. has not decided whether it wants to remain committed to the welfare state, pursue a long-term process of deregulation and privatization, or continue shifting back and forth in a highly polarized fashion. Greater understanding of, and appreciation for, the cultural dynamics influencing this situation helps explain why opposition to healthcare reform has been so contentious. Conventional and scholarly examinations of opposition to healthcare reform would be well served by greater discussion of the role of U.S. political culture. The final bill, regardless of its specific form, will likely raise a new and important set of questions, the answers to which will determine whether a movement toward a more European-style welfare state is truly progressive or is moving the U.S. away from the exceptionalism that made the country what it is today. This will inevitably shape and be shaped by U.S. political culture, no matter how exceptional and enlightened we think we are.

By Luke Perry | The post U.S. Exceptionalism and Opposition to Healthcare Reform appeared first on The Public Sphere.

]]>
Lifestyle drugs and the new wave of pharmaceutical personality sculpting http://thepublicsphere.com/lifestyle-drugs-wave-pharmaceutical-personality-sculpting/ http://thepublicsphere.com/lifestyle-drugs-wave-pharmaceutical-personality-sculpting/#comments http://thepublicsphere.com/?p=1789 At the annual conference of the Society for the Scientific Study of Sexuality last year, I heard a researcher describe how the pharmaceutical industry “jukes the stats”—that is, crunches numbers creatively in order to persuade the public that their products actually accomplish their stated tasks.

By Breanne Fahs | The post Lifestyle drugs and the new wave of pharmaceutical personality sculpting appeared first on The Public Sphere.

]]>

“Ask your doctor if medical advice from a television commercial is right for you.”
—Bumper sticker slogan ((www.northernsun.com))

At the annual conference of the Society for the Scientific Study of Sexuality last year, I heard a researcher describe how the pharmaceutical industry “jukes the stats”—that is, crunches numbers creatively in order to persuade the public that their products actually accomplish their stated tasks.  This researcher, Dr. Duryea, offered a succinct finding: antidepressant manufacturers go to great lengths to disguise the fact that people kill themselves during the “wash out” phase of antidepressant trials.  Once participants stopped taking certain antidepressants (and, in clinical trials, before they resumed taking them again), they had an increased risk of suicide compared to their pre-drug state.  Of course, since these users were not technically ingesting the drug during this “wash out” phase, the pharmaceutical industry convinced the FDA that antidepressants did not increase the risk of suicide—a creative interpretation with a potentially fatal cost to those who blindly take these drugs. ((Duryea, E. J. (2008, April). What every sexuality specialist should know about ‘sexual numeracy’: How we present quantitative information is important. Paper presented at the annual meeting of the Society for the Scientific Study of Sexuality, Western Region, San Diego, CA.))

I bring up this anecdote because it is one of many problems in the U.S. today surrounding the issue of “lifestyle drugs”—drugs one takes not as a temporary cure for an ailment (in the way penicillin kills bacterial infections), but rather as a response to lifelong, forever ailments (e.g., depression, anxiety, high cholesterol, acid reflux, impotence, and so on).  As anyone who has watched television commercials in the last decade can imagine, the pharmaceutical industry expends enormous sums of money to encourage consumers to “ask their doctor” about a host of drugs, nearly all of which advertise “lifestyle” remedies.  Get erections that last for days!  No more burping up acid after eating mountains of salty, fatty, chemical-laden food!  Stop feeling anxious despite chronic sleeplessness and slaving away at your vacationless McJob!  And, like all advertising ploys—particularly ones where astronomical sums of money are expended—it works.  Not only do people in the U.S. tolerate direct-to-consumer advertising (note that, within the Western world, the U.S. is alone in such a practice), but we do indeed consume more and more lifestyle drugs each year, making us the most medicated, and most pharmaceutically profitable, society around.

So how do we explain this phenomenon?  What about the U.S. lends itself to this perfect synthesis of self-medication, corporate greed, and pharmaceutical horsepower?  I propose that to tackle such a question, we must consider three separate factors: first, the invention of sickness, whereby normal aspects of daily life get branded as illness, like inventing female Viagra because women may not always desire sex; second, our refusal to live with the most basic elements of the human condition, as evidenced by the multi-billion-dollar antidepressant industry; and third, our nearly reckless disregard for common sense, as evidenced by a host of lifestyle drugs, particularly Viagra for men.  It is not just that those in the U.S. have been duped, or that the pharmaceutical industry wields uncanny powers, or that we largely cannot decipher the difference between self-generated needs and manufactured needs (all true); additionally, at its most basic level, people in the U.S. have embraced a new wave of pharmaceutical personality sculpting, ((This phrase was first used in Zita, J. (1998). Prozac feminism. Body talk: Philosophical reflections on sex and gender. New York: Columbia University Press.)) a philosophy arguing that pharmaceuticals can compensate for our unfulfilled desires and needs.

Let’s begin with the case of female Viagra.  Six years ago, pharmaceutical efforts to repackage the success of Viagra into a female-friendly version began in earnest.  First, Pfizer attempted to replicate the powerhouse success of male Viagra with a simple goal: create physiological arousal in women, stimulate lubrication and swelling responses, and (voila!) women would achieve orgasm in unprecedented numbers, thereby ending their relatively higher rates of “sexual dysfunction.”  Unfortunately, this did not come to pass as expected.  The big problem?  Women who became aroused physiologically still did not choose to initiate or submit to sex with their (male) partners.  Unlike male Viagra—where physiological arousal and desire for sex allegedly worked more in tandem—female Viagra successfully achieved physiological arousal but failed to generate mental arousal or motive for sex.  Women with aroused vaginas still said no.  This frustrated Pfizer to the point where, during one interview with the New York Times, researchers declared, “Although Viagra can indeed create the outward signs of arousal in many women, this seems to have little effect on a woman’s willingness, or desire, to have sex…Getting a woman to connect arousal and desire…requires exquisite timing on a man’s part and a fair amount of coaxing.  ‘What we need to do is find a pill for engendering the perception of intimacy.’” ((Harris, G. (2004, February 28). Pfizer gives up testing Viagra on women. The New York Times, C-1.))

Perhaps said in jest, this statement nevertheless perfectly illuminates the first of three problems that contribute to the age of pharmaceutical personality sculpting: illnesses are invented, often for profit, by industries that have a serious investment in making people believe they are sick when they are not.  In a for-profit healthcare industry where sickness is money, invented sickness makes even more money.  Case in point: a recent psychological study by Jan Shifren and her colleagues found that, though 43.1% of women reported feeling that they had some form of sexual dysfunction, fewer than half felt troubled by this fact. ((Shifren, J. L., Monz, B. U., Russo, P. A., Segreti, A., & Johannes, C. B. (2008). Sexual problems and distress in United States women: Prevalence and correlates. Obstetrics & Gynecology, 112(5), 970-978.)) Rather than rely upon women’s self-description, the pharmaceutical industry instead convinces women through conversation and commercials that their inconsistent sexual desire is a defect, and that their bodies are imperfect and in need of drug treatments to “repair” their “dysfunctional” libidos.  We live in an age where illness makes profit, and where the invention of “disorders” improves the economic bottom line of the healthcare industry. Such profit-driven healthcare requires the consumer to imagine these invented illnesses as real. Unless people learn to call out and resist such inventions, pharmaceutical personality sculpting will become the mainstay of the industry.

Step two in the process of selling people on lifestyle drugs involves an almost laughably ill-advised premise: convince people that the human condition no longer entails sadness, anxiety, depression, loneliness, social unease, lost erections, ups and downs in libido, and grief.  Indeed, the antidepressant industry has swooped in during a time when we have a lot to be unhappy about: unprecedented class warfare (the top 1% of U.S. earners now make more than the bottom 95% combined!), new and insidious forms of sexism (women’s desires usurped by the whims of patriarchy, ongoing failure of the Equal Rights Amendment, increasing reports of eating disorders and body dysmorphia, alarmingly high rates of women faking orgasm, national failure to recognize working mothers’ needs, and so on), rampant and shameless forms of racism (states retaining rights to block interracial marriages, anti-Obama rhetoric latching onto anti-socialist rhetoric throughout the nation, erosion of communities of color, overrepresentation of men of color sent to Iraq, etc.), and, in essence, a whole lot of things to be anxious, depressed, and un-aroused about!

Again, denying the difficulties of human existence seems to be a peculiarly U.S. phenomenon.  Along with their ironic taste for high-cholesterol foods, plentiful red wine, and good health, the French (yes, the French!) construct tragedy as an unavoidable part of human existence.  It is entirely remarkable that people in the U.S. want to manufacture an existence without such tragedy, yet this is exactly what antidepressant manufacturers count on.  They make a bargain, albeit without full consent: take these drugs and you’ll feel less—both the positive and the negative.  Those on antidepressants report exactly this: they feel less sadness, they can get out of bed in the morning, and they can go to work and walk their dogs and enjoy modest pleasures.  However, they no longer feel the same happiness they once felt either.  They are dampened down, as the clinical literature says.  The antidepressant industry wants to trick us out of experiencing ourselves as fully human, as fully engaged in the process of being alive.  How bad for business if we accepted that, when people die, grief is a horrendous, sometimes long, and certainly painful process, but one that we need to experience in order to process death. What a blow to their bottom line if people in this country started considering what their anxiety at work meant about their job satisfaction?  What a downer to the shareholders’ stock portfolios if we stopped to consider that feeling bad might propel us to take action in order to feel better?  After all, aren’t we at least a little bit suspicious that Prozac and Zoloft and Wellbutrin create obedient, gracious, mellow, toned-down citizens, ready for the work of tolerating gender inequities, pay inequities, class inequities, and race inequities?  What if people instead confronted their reasons for being upset, depressed, and anxious?

Which brings us to the third point: the pharmaceutical industry relies upon our most basic denial of common sense, intuitive wisdom, and self-affirmation.  Consider the recent discussions about the paradoxes of the modern food industry.  As Michael Pollan has pointed out, we have lost touch with common sense about eating because the food industry has systematically done three things.  First, the food industry has asserted a singular, authoritative knowledge of what kinds of food make us healthy.  Second, it has extracted, via “nutritionism,” the elements of food that yield health without considering the interplay between enzymes and vitamins within a whole piece of food (e.g., Eat Omega-3s!  Don’t worry if they come from actual salmon or from fish oil tablets!  It’s all the same!). ((Pollan, M. (2009). In defense of food: An eater’s manifesto. New York: Penguin.))  Third, the food industry has assaulted our common sense by forcing us to rely upon its definitions of “healthy food” at the expense of what our grandmothers and great-grandmothers already knew to be true (e.g., we eat processed boxes of chemical goo that claim to be “low fat” and “enriched with vitamins” rather than simply eating an apple or a carrot or a head of lettuce from the produce aisle).  The same process has occurred with other elements of health, particularly mental health.  Rather than considering the ways that our unhappiness, anxiety, and grief stem from elements in our lives that deserve our attention, “experts” feed us insights about how pill-popping and pharmaceutical personality sculpting will come to the rescue.

Case in point: A friend of mine once dated a man who had erectile dysfunction with onset in his early 20s.  All physiological tests came out normal: doctors could find no organic reason for his erectile dysfunction.  He tried Viagra for four or five years, with decreasingly successful outcomes.  He had a more and more difficult time becoming erect, and often could not get an erection even in the most stimulating of circumstances.  Viagra eventually stopped working entirely (as it often does).  The man sought out psychological therapy to discuss his distress about his seemingly inexplicable erectile dysfunction.  Frustrated by his lack of success at relying upon Viagra, he eventually discovered, during the course of a multi-year therapy, that his lifelong incestuous relationship with a family member—one in which he consistently became aroused in situations of potential punishment and shame—had contributed greatly to his current erectile dysfunction.  Indeed, all of the signs pointed to his traumatic sexual history as a culprit in his current dysfunction.  He had begun to masturbate at work, and could get aroused only right before his boss walked in on him.  He had asked his partner to have sex in crowded movie theaters, in subway cars, and on park benches.  He could never become aroused while at home in bed with her.  During this course of treatment, he began a slow and difficult recovery, disentangling his associations with shameful early life experiences and replacing them with healthier models of consensual, non-punitive sex.  I tell this story because it represents, most basically, a truth that should seem obvious to most people if they consider common sense: erectile dysfunction, like most “illnesses” treated by lifestyle drugs, is rooted in a person’s reality, and without addressing that reality, the drugs simply mask the underlying issues.

Yet we in the U.S. continue to perfect our skills at denying common sense, to the point of dismissing outright how deeply our psychological problems are rooted in the realities of our existence. We do this with food and we do this with mental health.  We eat fewer and fewer apples because food-industry consultants have told us to eat fiber-enhanced apple-flavored fruit-roll-ups.  We deal less and less with the complexities of our psychological lives because “scientists” have told us that a pill will solve the problems of our brain chemistry and will repair our wounded histories.  We rarely stop to consider why unhappiness pervades our culture because the “experts” have told us that it not only is possible to medicate this away, but is in fact medically sound to do so!  This all comes at a great cost, personally, socially, and culturally.  A generation raised on Lean Cuisine and Paxil has learned to condition away the intuition of mind and body.  As a consequence, we no longer recognize what tastes good because experts have successfully tricked our taste buds into believing we are eating “butter” when we aren’t.  We do not recognize that unhappiness can have positive, affirming, enriching results in our lives (as in, motivation toward something else—a new partner, a new job, activism on behalf of oppressed groups, and so on) because we have become susceptible to marketing campaigns selling us on the fundamental lie that life is pleasant.  We have already begun selling women on the promise of pharmaceutically terminating menstruation for “convenience” and trimming their labia in order to generate better orgasms, despite the known tissue damage and reduced sensation from such surgeries.  Just last week, advertisements promoted a new “mint” that will disguise the vagina’s natural smell.  We sculpt and trim, tweak and prune.  This comes at a considerable cost to us as individuals, as a society, and, as a potentially toxic contagion, to the global community.  Until we seriously challenge the impact and reach of the pharmaceutical industry, these assaults on our most basic ways of being human will continue in earnest.

By Breanne Fahs | The post Lifestyle drugs and the new wave of pharmaceutical personality sculpting appeared first on The Public Sphere.

]]>
Le Parkour: The Body as Politics http://thepublicsphere.com/parkour-body-politics/ Tue, 15 Dec 2009 04:44:06 +0000 http://thepublicsphere.com/?p=1804 As an eighteen-year-old climbs up on top of a telephone box, a couple on their Saturday errands prepare to tell him to get down. By the time they have cantered over he is back on the ground, thanks to a reverse back-flip.

By James K. Walker | The post Le Parkour: The Body as Politics appeared first on The Public Sphere.

]]>
As an eighteen-year-old climbs up on top of a telephone box, a couple on their Saturday errands prepare to tell him to get down. By the time they have cantered over he is back on the ground, thanks to a reverse back-flip. This is greeted with applause from his friends and whitened knuckles from the couple, who grip their shopping bags tightly while pretending nothing untoward is going on. Welcome to the world of Le Parkour, or ‘free-running’ as it is more commonly known, a subcultural movement which combines mental and physical agility to achieve oneness.

Television documentaries such as Jump Britain have described this activity as ‘urban ballet’, given the sense of ceremony or ‘Tai Chi’-like deliberation which comes with the performed movements. On a more realistic level, and one away from the television cameras, however, it appears as a hybridized leisure activity – incorporating elements from gymnastics and break-dancing to enable elegant and graceful movement over ‘obstacles’ found in the urban environment. Having studied a group of Parkour enthusiasts in Nottingham, UK (NottsPK) for the past couple of years, I have become as intrigued by their ‘sport’ as by public reactions to it. Because Parkour takes place mostly in urban space, it has been seen as a kind of reclaiming of the streets. Although this is undoubtedly true, it is the reclaiming of the body, and the implications this has for health, which I find of particular interest. Before expanding on this further, though, we should take a brief historical look at how the body has been used elsewhere to construct identity.

Many socially marginalised groups have positively employed the phrase ‘the personal is political’ in celebrating their identities. Within sections of the gay community this is best exemplified in the ‘hanky code’, whereby different coloured bandanas signal individual sexual preferences and interests. Encoding sexual activities enables conversations to develop in which they are ‘talking’ rather than ‘listening’. I see this as a political act: taking control of one’s own identity. The resulting sense of self is visual and proud and, in its defiant construction around sex, celebrates [and subverts] a common prejudice used to marginalise gay men.

Similarly, the feminist movement in the early 1970s attempted to reclaim ownership of the body through the politics of abortion, ‘access’ and diet. Taking control of the body and using it as a boundary enabled a certain level of self-control, particularly in relation to identity.

By this logic, voluntary mistreatment of the body must also be thought of as political and personal expression. Nowhere is this better exemplified than in the MTV spin-off show Jackass (2000–2002). The programme revolves around a group of men recording a series of humiliating and dangerous pranks on camcorder, such as BMX jousting, shark hugging or being shot at. The group leader, Johnny Knoxville, warns with subtle irony, ‘Do not try this at home.’ This bodily mistreatment clearly struck a chord with the public, as Jackass: The Movie (2002) grossed US $64 million.

While Jackass may be read as another example of the ‘levelling-down’ process, an analysis of the body in Jackass provides an alternative explanation. Using the body as a cultural text, self-harm and mutilation give material expression to certain cultural anxieties, like the supposed ‘crisis of masculinity’, and are a basic inversion of the destructive machismo which epitomised 1980s classics such as the Rambo and Rocky films. Both explanations are plausible, but Jackass also entails a rational assessment of risk. As people encounter greater daily intervention into their lives from bureaucratic forms of governance, such as ‘health and safety’ legislation and the whole ‘culture of blame’ which this has created, the message of Jackass is simple: This is my body and it is the one thing which you can’t control, so sit back and watch me smash bottles over my head and fire nails into my arse.

I situate Parkour within this tradition of bodily empowerment and as a more nuanced reaction to similar anxieties. In Jackass the body is treated with contempt, as something expendable, which could be seen as indicative of a wasteful capitalist modernity. In Le Parkour we see an inversion of these values, so that self-preservation, finesse and agility are favoured. The goal is to move between objects as fluidly as possible, with the minimum of fuss and hopefully no injury. Through this experience a kind of oneness is achieved between body, mind and environment.

Le Parkour can be thought of as an urban philosophy: it has a clearly defined manifesto, but rather than having one specific ideology, it is formed out of multiple narratives drawn from a wide range of influences such as fantasies, escapism, cult icons, films, books, comics, etc. It also extends into philosophies of self-improvement and self-awareness drawn from both the West and the East. In many ways this is emblematic of many new forms of modern identity which have grown out of internet forums and chat rooms; thus Parkour as philosophy is a kind of cultural sponge, able to absorb information and influences without ever losing its shape.

This is possible because Parkour centres around emotive rather than factual language and thereby opens itself to interpretation and play. One word which pops up more than others, for example, is ‘fluidity’, which itself implies the ability to change and transform smoothly. For fluidity to be achieved, participants must overcome four kinds of obstacle: mental, social, martial and familial.

The mental obstacle – and perhaps the most difficult of them all – entails conquering your fears and gaining the necessary mental strength and confidence to make a particular jump. As different movements vary in complexity and risk, the rate at which strength and confidence are perfected varies from person to person. Working together as part of a large cooperative helps, as each group member is able to guide and reassure the others. When one member performs a particularly risky jump, it then motivates another to try.

Overcoming mental obstacles leads to a certain degree of confidence that arguably will translate into other areas of personal life. It is for this reason that Le Parkour can be seen as a philosophy of self-help and realization. The underlying message is that if you can make a jump which seemed impossible, what is to stop you from sorting out emotional and mental problems in other areas of your life?

It should be noted that some movements are clearly built upon physical agility and power, and are therefore easier for older, taller and more disciplined bodies to achieve. Self-confidence in itself is not enough. But the fact that you are able to realise these limitations of your own volition is important, as it is only by emotionally relating to something that we are able to fully comprehend it. Far too often in life, restrictions are imposed on people without allowing them to discover their limits for themselves. It is perhaps for this reason that Parkour enthusiasts on forums such as Urban Freeflow turn to the wisdom of movie idols such as Bruce Lee: ‘If you always put limits on what you can do, physical or anything else, it’ll spread over into the rest of your life. It’ll spread into your work, into your morality, into your entire being. There are no limits.’

Le Parkour is described as ‘the way’ on the UF website, which suggests that it is a particular way of perceiving reality. While Parkour’s ideologies are influenced by films such as The Matrix (1999), these films may have had a physical influence as well. One thing which The Matrix, comic-book superheroes such as Batman and Spider-Man, and computer games all have in common is that the characters can do superhuman things with their bodies as they swing and fly through the metropolis. Technology has been criticised for creating inertia, obesity and an artificiality in everyday human existence. Yet could it not be the case that engaging in such fantasies has inspired individuals to redefine and expand the limits of human potential? Le Parkour, in trying to overcome mental obstacles and achieve seemingly impossible movements, seeks to reverse the potentially negative effects of technology while heightening human experience and the body’s potential in the process.

As Le Parkour is performed in public space, individuals must be prepared to overcome certain social obstacles or stigmas such as people staring, pointing and ridiculing. Letting go of inhibitions and ignoring negative comments from passersby (comments which are rare, I should point out) can lead to more confidence in other areas of life. However, in my experience it is the observers rather than the participants who go through the real anxiety. On numerous occasions I have seen people try to coax a participant down from a wall because he might injure himself, only to be shocked when he exits with such panache.

Martial obstacles come in the form of authority figures who move participants on because they’ll ‘cause damage’ or are unwittingly on ‘private property’. As frustrating as this may be, the group I studied never argued back or behaved rudely. Arguing with authority figures who weren’t listening because they were ‘just doing their job’ was seen as a waste of time which stopped them from doing what they were there to do. It was easier to just move somewhere different.

Contemporary subcultures like Le Parkour are often described in terms of moral decay, whereby social regulation has broken down, metanarratives have crumbled, and youth have been left to run wild. But Le Parkour clearly refutes such claims. In explaining their ‘art’ to law enforcement agencies, participants are learning to reason. They are also learning humility, tolerance, and understanding, thereby re-embedding a sense of order in a supposedly atomised and increasingly fragmented society. Indeed, they are actively encouraged to make concessions to authority figures in arguments over space, as in effect they are ambassadors for this relatively new discipline. Failure to be civil could lead to the activity being banned in certain areas, ruining it for other enthusiasts.

Perhaps the most formidable of hurdles to overcome is the negative attitude of relatives, in particular parents. This can be intensified by negative representation in the media, which tends to favour the more extreme aspects of the discipline rather than the more everyday practice that I witnessed. But you only need to watch this group of kids working with each other to realise that everything is calculated risk, clearly well thought out and planned before anything serious is attempted. Similarly, there are endless videos and training advice on the UF website. As one member of the group once pointed out to me, ‘my mum’s just glad I’m not doing drugs or gettin’ in fights’.

The ability to persuade loved ones to trust and support the decisions you make with your life helps to develop communication and reasoning skills which will spread into all areas of lived experience. These may seem like a new set of values, but really all recreational activities, in particular sports, promote a certain degree of friendship, fair play, respect, teamwork and problem-solving. What differentiates this urban sport from more traditional sports, however, is that it is built around cooperation rather than competition.

Risk clearly has an important role to play in Le Parkour, as it has to be managed to minimise injury and courted to enjoy the extreme experience fully. But what it really offers participants is the opportunity to draw a thick line between life and death. There are many false or thin risks in modernity which have made death appear ubiquitous: killer bugs in hospitals, terrorism, GM and processed food, overzealous health and safety intervention, etc. The list is endless – but such ‘risks’ make everything seem a potential danger.

Le Parkour reacts against this gross and perhaps inevitable trivialisation of knowledge. The constant intervention by the state and its systems ‘for our own good’ (and often it is) has meant that alternative forms of expression and self-diagnosis have emerged. As history has proven time and time again, how we use our body, and the boundaries it enables us to make, are as integral to our mental and physical health as they are to our identity.

By James K. Walker | The post Le Parkour: The Body as Politics appeared first on The Public Sphere.

]]>
Breakfast: December 2007 http://thepublicsphere.com/breakfast-december-2007/ Tue, 15 Dec 2009 04:18:06 +0000 http://thepublicsphere.com/?p=1795 The furniture was gone. And only the promise of empty space stared back at me. It was the promise of empty space that had beckoned me to Utah six and a half years earlier. The naked sky offered me the possibility to do anything and be anyone, and the silent mountain sentinels assented to shield me from mistakes.

By Hope Miller | The post Breakfast: December 2007 appeared first on The Public Sphere.

]]>
The furniture was gone. And only the promise of empty space stared back at me. It was the promise of empty space that had beckoned me to Utah six and a half years earlier. The naked sky offered me the possibility to do anything and be anyone, and the silent mountain sentinels assented to shield me from mistakes.

This isn’t exactly how it happened.

I fell in love, first with the mountains, then with a woman. And it ended. But it didn’t end quickly, in one fell swoop, or a nice quick chop. The love faded like the furniture, piece by piece. This is the love of the woman I’m talking about. My love of the mountains never vanished even if the jagged outline of a ridgeline or a range no longer appears on my horizon. It took a whole week for my furniture to disappear. It took 18 months for that love to evaporate.

The teak futon was the first to go, and my kitchen table was the hardest to let go. It was the first piece of furniture I had bought in Salt Lake, and I carted it everywhere: from the South Temple apartment to our house on the West Side to the 9th & 9th cottage. A modest pine table with two hinged leaves, it had been painted multiple times and partially sanded. The leaves had stretches of green and white paint on them while red and yellow paint twisted down the legs. My dad and I had found it at an art gallery consignment shop, and, with a little elbow grease, we unearthed matching chairs in the bowels of the shop. I sold the table and chairs for a crumble of cash to a half-drunk guy in his 20s.

With the furniture gone, the cottage reminded me of what it felt like when I moved in. This place had radiated potential. Here was a humble place where someone could make a life or recover from a past one. The shower had good water pressure, and the kitchen had a gas stove. Windows dotted the east, south, and west walls. The neighborhood boasted a park, a coffee shop, a yoga studio, and killer burritos. The basement provided ample storage space. There was a porch and screen door off the living room. The backyard was fenced. I had even started a compost pile. But sometimes it’s not enough to have everything in place.

The one thing that remained in the cottage was the Delta Sky Kennel, Prufrock’s home for the next several hours. I had purchased the kennel a month earlier, and Pruf and I had been practicing. First, we worked on simply being in the kennel. I’d coax him in with a treat and close the door. We worked up to Pruf spending 30 minutes or more in the kennel while I was in the other room or out on a quick errand. But, now, our last morning in Salt Lake, our beloved Salt Lake, I went full-bore with him. I scooped him into the kennel, slammed the door, and prepared him for takeoff. The rumpled brown carpet offered just the right amount of resistance. We didn’t sail across the floor; instead, we bumped along, much like, I told my dog, flying over the merciless Wasatch. Wrestling the kennel across the floor, I pushed, pulled, and shimmied. I dragged it in circles, I rocked it side-to-side, I pounded on the top, I rattled the sides. I even howled. I just didn’t want him to be scared. I didn’t want him to be as scared as I was.

Natalie chuckled when she walked in on me whirling my dog around. I had said goodbye to everyone else, leaving my best friend Natalie to the final hours. We had met through a yoga workshop and had built our friendship from the ground up: funny emails at first, followed by more personal ones, and eventually, we found the courage to hang out in person. Over “B & N,” beer and nachos, we listened to each other cry, offering up yoga pointers—“Try 10 minutes of bound lotus every day for a month to break bad habits”—and relationship counsel—“She’s really missing out on life by not going with you.” While nine a.m. was too early for B & N, we could at least have coffee and eggs at the Avenues Bakery, a prime brunch spot in my old neighborhood.

It’s hard to say what the Avenues Bakery was more famous for: its tasty food or its unforgivable service. Waiting 10 minutes for a cup of coffee was routine. The wait staff was young, pierced, and inked, and as they clotted behind the counter in their black T-shirts and aprons, glaring off into the distance, it was clear that they had better things to do. The well-intentioned middle-aged couple who ran the bistro bakery had studied in France and were trying to import a foodie culture to a homogenous city whose idea of fine cuisine extended little beyond green Jell-O. They sponsored wine tastings, scrambled local farm-fresh eggs, served up a mouth-watering assortment of tortes, tarts, and other tangy confections, and yet they consistently hired a slow, surly staff. This questionable combination of the earnest and the disenfranchised made any meal there a dangerous proposition.

Shortly after nine, the Saturday crowds had yet to appear at the Avenues Bakery. Natalie and I easily found a table by the window, and our coffee arrived within a few minutes of our order. The Bakery covered half a block on South Temple, a wide boulevard with cast iron streetlamps, ancient trees, and Gothic “gentile” churches. The windows spread almost from floor to ceiling, making this place a prime people-and-car-watching venue. My first couple of years in Salt Lake, when I was in grad school, I had lived just three blocks away, between the Presbyterian church and the Catholic cathedral. Once a week, usually after my seven o’clock seminar, I would treat myself to take-out. Picking up the turkey-and-brie panini on my way home from the university, I’d pass the evening stretched out on the floor, with plenty of beer and a weepy Lifetime movie, my books, notebooks, packets, papers, and handouts circling me. No dog yet, no lover yet, just all those words.

Over huevos rancheros and rosemary toast, Natalie regaled me with the latest Buchi family drama. This time, her younger siblings were torpedoing her efforts to resurrect Grandma Marge’s famous Christmas Eve Pajama-Waffle party. I fixed on Natalie’s story, laughing on cue, because I had gotten tired of saying goodbye. I commiserated on cue, inhaling and nodding, because there were too many questions I couldn’t answer, not even to myself. There was only the thin thread of something I knew. The thread was enough to hold on to, but if I tugged too hard or tried to pull myself up, it would snap. And so I explained sparingly but ached excessively. If it hurts so bad, then something here must matter. But if that were true, if something—or someone—here mattered, then why would I leave?

But, I could commiserate for only so long. Together, we had to face the unavoidable: I was leaving. Bumping over our words, we tried to explain what it meant to know each other. I thanked her for taking care of me during the six months of the so-called “separation” from my lover. I wished I had said more, but the thread tightened in my throat. Natalie thanked me for dragging her outside to play in the dead of winter. We laughed about our final excursion, just last week. Natalie and her husband Sam joined Prufrock and me on the Shoreline trail after a snowfall. The fresh snow tempted me. “I want to roll down this hill,” I announced, uncertain, for a moment, of my own sanity because the hill in question was really the side of a mountain. Natalie and Sam looked at each other and shrugged. “Let’s do it,” Sam said in his honey-velvet voice. Praying we didn’t lose our keys, we dove off the ridge, belly-flopping on the snow. And then we slid. And the momentum of the slide sent our legs up and over our heads. And then we tumbled. And we went faster and faster until the tumble turned into a roll. Rolling over and over until we plowed to a stop at a gully full of scrub oak. Drunk on vertigo and Utah’s famous champagne powder, we tried to stand up. And we fell over. And we tried again. And we fell over again. Piece by piece we pulled ourselves back up the hill, wobbling, cackling, and chucking snowballs at each other. And then we did it again. And again. And again. All three of us were thirty, and we flew and fell, over and over, with the grace and promise of a child, someone not yet disappointed, not yet afraid of the rocks, lying in wait under the thin veil of snow.

Pruf danced around us, darting up the hill and down. Reaching his haunches up in the air, he stretched his paws forward and barked, his black ears waving. He licked Sam’s face, sat on my belly, and nipped at Natalie’s heels. He taunted us for being slow and dizzy and showed us how to run and kick up powder at the same time. My dog taught me how to love the mountains. I hoped he’d forgive me for taking him away. Another space lost.

The server cleared our plates and twisted his lips in something like a smile. Natalie and I drained our coffee cups empty and settled the bill. We still had time.

On the way to the airport, I asked Natalie to drive me around town, my last chance to lock my eyeballs on this city. From the Bakery, we headed north through the Avenues neighborhood, and I marveled at the cozy Arts and Crafts bungalows with their recessed porches and the fanciful Queen Annes. We worked our way up to 11th Avenue and then headed west, winding around City Creek Canyon. The road hugged close to the steep, towering land. We swayed from side to side at every bend. Pruf began to stir. He stood up, his claws clicking against the plastic floor of the kennel. His tail thumped and he whinnied. As he turned circles in the kennel, his whinnies grew into full-fledged barks. He wanted to get out and run. I wanted to get out and run. I thought about the moose, deer, elk, coyotes, bobcats, magpies, jackrabbits, and rattlesnakes I had seen in this canyon. We had seen in this canyon. This was our place, and when the car made the last bend in the road, the canyon vanished. Defeated, Pruf pancaked on the plastic floor. The car continued on its course. As we made our way out North Temple, passing the Red Iguana, I asked Natalie to take me by the house. We still had time.

I hadn’t seen it in a year, since my lover had sold it. We idled at the curb. “Wow,” Natalie exhaled, “it’s so cute.” Except for that storm door, I thought. But I was also glad that the new Mission-style front door was protected. It took us two contractors and three months to get that door from the factory in Tennessee. The living room window, with the BB-gun bullet holes in it, had been replaced with a monolithic plate of glass. We had wanted to repair that window—which had snowflake stickers over the holes when we bought the house—but we didn’t want to do what these people had done. We didn’t want to swap one giant plate of glass, albeit with small holes, for another, equally unattractive plate of glass, however solid. Somehow, we wanted that window to be able to open, to offer us some fresh air, but we couldn’t figure out how. We left it the way it was, snowflake stickers and all, for the full four years of our shared life.

The front yard was still intact. We spent every weekend of August 2004 digging up the front yard, just the two of us, armed only with a pickax and a shovel whose handle was splintering. Hours and hours passed as we wedged the blunt shovel into the sun-baked sod and wielded the ax overhead. Thirsty and tired at the end of the hot afternoon, my lover and I stumbled to the Red Iguana and sought refuge in cold Coronas and homemade mole. That August was the only time in my life I ever looked forward to Monday mornings. At work, I could rest, recharge, recover, my muscles twitching, my eyelids heavy.

I made more than she did, and that fall I spent my money on plants. Silver fountain grass, yucca, Japanese blood grass, saltbush, Russian sage, feather reed grass, and blue fescue. The front yard was spare but textured. The violet blossoms of the Russian sage sparkled next to the corduroy bricks of the house. The pointed yucca and billowing saltbush took over the southwest corner of the yard. The silver fountain and feather reed grasses reached high as their plumes bobbed in the wind. Struggling to find their footing in the rocky soil, the fescue and blood grass kept their bold colors close to the ground. But, it was the zebra grass that enthralled me the most. The tall, broad blades alternated from base to tip between a rich but pale green hue and a neutral fawn color. Like a tiger-striped kitty or my own speckled blue heeler, this grass was nature’s version of a rugby shirt, the Fair Isle sweater, argyle socks. Patterns released by genes, no elaborate stitching required.

Next to the zebra grass, there was a spot in my heart for the Alpine Blue Spruce, a young evergreen we had planted in front of the living room window. We had told ourselves we planted the tree there to block the late-day western sun. But really we had planted it to prevent passersby from seeing the snowflake stickers and their sister BB holes. We named the tree Bruce, Bruce the Blue Spruce. He was a squat Christmas tree, tinged with smoky blue, and we loved him. When you look at something you love every day, you don’t really notice that it’s changing. Bruce looked the same every day, but we told each other that he was getting bigger. “Look at him, now,” she’d say to me. “He’s getting so tall! In a few years, we may have to prune him. In 10 years, we’re going to have so much shade in the front yard.” Today, on this bitter, drab December morning, Bruce did look taller. My throat swelled and my jaw tightened. In 10 years, that will be an enormous tree. In one year, the space in my heart for her will contract so smoothly that I won’t even notice until it’s almost closed. This isn’t exactly how it happens.

At the airport, Natalie gave me a gift, a candle. “For meditation,” she said. We watched Pruf and his kennel get wheeled away. We hugged goodbye.

The plane to Atlanta was empty. I scooted over to a window seat. Pruf’s kennel sat on the tarmac, next to a ramp. I could see his black nose pressed up against the holes. The ground crew sweet-talked him as they loaded the kennel on the ramp. His tail flickered. He disappeared into the cargo hold. I closed my eyes.

By Hope Miller | The post Breakfast: December 2007 appeared first on The Public Sphere.

]]>
Close Parentheses (The Last Love Song) http://thepublicsphere.com/close-parentheses/ Tue, 15 Dec 2009 04:01:07 +0000 http://thepublicsphere.com/?p=1785 A poem by Helen Heightsman Gordon.

By Helen Heightsman Gordon | The post Close Parentheses (The Last Love Song) appeared first on The Public Sphere.

]]>
As I watch you eating creamed corn with a fork, I think of your mother,
Who once placed a spoon in your hand as I do now.
You take it trustingly and finish your corn without spilling
On the napkin I tucked under your chin.
I think again of her, seeing my mirror image.
We are the women whose love framed the eight decades of your life,
The opening and closing parentheses,
The braces enclosing your magnificence.

When you cough, I hand you a napkin, remind you to cover your mouth,
Coach you in swallowing. I take your hand, gently, as she would have done,
Help you rise from your chair, steady your hesitant steps.
You are like petals folding into their calyx, or a hibiscus closing for the night.
I admire the young mother who coaxed this bud into bloom,
Who intuited from slender wisps of hope the man you might become.

Now as our world shrinks to a table for two,
The taste of butter your sole residual joy,
I remember how you could spin me in a waltz,
Turn on the sun with a moonlight kiss,
Harbor me within your encircling arms.
I feel sure you cannot unbecome what you became.
What you have been, you are.

I must intuit, as your mother did, what you need and feel.
Once you said fervently, “With all my being I love you.”
Now the words will not come, yet I believe them.
Even as my heart grows heavy with fearful tears,
I read your smile, and strangely find content.

She has done well by you,
The woman whose love you did not have to earn,
Who guided your toddler steps uphill,
Releasing you when your manly stride
Assured her all was well.
May I do equally well by you, holding your hand to guard against a fall,
Helping you gently down the shadowy slopes,
Releasing you when the evening petals close
And the music from the stars
Assures me all is well.

© 1999 by Helen Heightsman Gordon

By Helen Heightsman Gordon | The post Close Parentheses (The Last Love Song) appeared first on The Public Sphere.

]]>
Issue № 5 | September 2009 http://thepublicsphere.com/issue-5-sept-2009/ Tue, 15 Sep 2009 06:13:27 +0000 http://thepublicsphere.com/?p=1569 While questions of “identity” may seem very 1990s and pre-Facebook, certain discourses surrounding summer events, like the nomination of now-Supreme Court Justice Sonia Sotomayor, remind us that questions of “identity,” individual and collective, still remain with us in a globalized age. Valerie Bailey finds that her best friends all share a uniquely common bond, the [...]

By The Public Sphere | The post Issue № 5 | September 2009 appeared first on The Public Sphere.

]]>
While questions of “identity” may seem very 1990s and pre-Facebook, certain discourses surrounding summer events, like the nomination of now-Supreme Court Justice Sonia Sotomayor, remind us that questions of “identity,” individual and collective, still remain with us in a globalized age. Valerie Bailey finds that her best friends all share a uniquely common bond, the cultural memory of being ancillary to someone else’s meta-narrative, while Colin Dickey meditates on the study of phrenology and our changing assumptions about identity. Living life in the hyphen, Sheila Espineli explores the complexities of her first visit to the Philippines, the country in which her parents were born, and Cesar Gomez remembers his grandmother and the lessons from her Andean youth that impacted his California childhood. Carrie Hawks’s artwork raises many questions about how we imagine women’s sexuality. Following the death of Michael Jackson, Paloma Ramirez wonders about the future of fame in the age of the internet.

By The Public Sphere | The post Issue № 5 | September 2009 appeared first on The Public Sphere.

]]>
Sprawl http://thepublicsphere.com/sprawl/ Tue, 15 Sep 2009 06:12:40 +0000 http://thepublicsphere.com/?p=1624 To be docile, demure and alluring. There's often focus on the soft aspects of women, but why not celebrate the aggressive side of female sexuality? I've started this series using collage elements from clothing catalogs. I looked for the least threatening part of the model's anatomy. Arms resting on a beach towel, arms hung to the side, or hands stuffed in a pocket. Sexuality has power. Not just to be the object of attainment, but to actively pursue with confidence.

By Carrie Hawks | The post Sprawl appeared first on The Public Sphere.

]]>
To be docile, demure and alluring. There’s often focus on the soft aspects of women, but why not celebrate the aggressive side of female sexuality? I’ve started this series using collage elements from clothing catalogs. I looked for the least threatening part of the model’s anatomy. Arms resting on a beach towel, arms hung to the side, or hands stuffed in a pocket. Sexuality has power. Not just to be the object of attainment, but to actively pursue with confidence.

[nggallery id=9]

By Carrie Hawks | The post Sprawl appeared first on The Public Sphere.

]]>
My So-Called Asian Identity: I Shall Go: In Search of My Filipina Roots http://thepublicsphere.com/i_shall_go/ http://thepublicsphere.com/i_shall_go/#comments Tue, 15 Sep 2009 06:11:42 +0000 http://thepublicsphere.com/?p=1565 The question, “Have you been back?” used to bother me much more than the question “Where do you come from?” because it stabbed me with a pang of guilt. It was this self-created guilt that I had not yet made the pilgrimage that so many of my fellow Filipino-Americans had already made, some multiple times. While most Filipinos do emigrate to the United States to create a better life for themselves economically, many of them visit frequently and end up retiring back in the Philippines since the cost of living there is comparatively low. I heard “Have you been back?”so much, I was tempted at times just to lie, to claim that I had been there so I could get out of having to explain why I hadn’t made the journey. Eventually the question only strengthened my resolve. I knew I would go to the Philippines at least once in my life before I became too old to appreciate its natural wonders and to see the places where my parents were raised before deciding to embark on the American dream they bequeathed to my sister, brother and me.

By Sheila Espineli | The post My So-Called Asian Identity: I Shall Go: In Search of My Filipina Roots appeared first on The Public Sphere.

]]>
“Have you been back?”  This is a question I got a lot whenever I met with other Filipino-Americans during various family functions like birthdays and baptisms throughout my youth. I would then have to explain to a nosy tita ((“Tita” or “Tito” is Tagalog for Aunt and Uncle. However, this title is not just for actual aunts and uncles. We use the title for close family friends who are just like relatives to us. This is similar to how “Aunt” and “Uncle” are used in the United States and other countries.)) or cousin that since I was born in Culver City, California, I had never been at all, much less been “back.”  “Back” refers of course to “the mother country,” as many Filipinos and Filipino-Americans call the country of my parents’ birth – the Philippines.  The question, “Have you been back?” used to bother me much more than the question “Where do you come from?” because it stabbed me with a pang of guilt.  It was a self-created guilt that I had not yet made the pilgrimage that so many of my fellow Filipino-Americans had already made, some multiple times.  While most Filipinos emigrate to the United States to create a better life for themselves economically, many of them visit frequently and end up retiring in the Philippines since the cost of living there is comparatively low.  I heard “Have you been back?” so much, I was tempted at times just to lie, to claim that I had been there so I could get out of having to explain why I hadn’t made the journey.  Eventually the question only strengthened my resolve.  I knew I would go to the Philippines at least once in my life before I became too old to appreciate its natural wonders and to see the places where my parents were raised before deciding to embark on the American dream they bequeathed to my sister, brother, and me.

I did not have a lot of opportunities to “return.”  My parents did not go back often, and when they did, the cost of a trans-Pacific trip was too prohibitive for my sister, brother, or me to join them.  My brother was the first of us siblings to visit the Philippines, and he went with Dad after his sophomore year in high school.  He had a great time meeting our relatives but complained about having been a feast for the mosquitoes there.  After they returned, I told Mom and my sister that someday we would have to make a girls’ trip to the Philippines, as this was only fair.  At that moment, my brother became one of “them,” someone who had “been back,” and I admit that I envied him.

I used to feel this interminable divide between Filipino-Americans like me who were born in the United States and Filipino-Americans who immigrated to the United States, mostly as small children with their parents.  I often wondered if they were somehow superior Filipinos, somehow culturally predisposed to be more proficient in Tagalog and to have an undiscerning taste for Filipino cuisine, no matter what ingredients and strange animal parts were involved.  Being U.S.-born, I felt that there was some ineffable, missing element that made me more of a poseur than a “real” Filipino-American.  In fact, for a time, I insisted on identifying myself as just “American” because I was born in the United States and did not see the point of placing my parents’ national origin in my own ethnic identification.  I also saw the label “Filipino-American” as something of a lie – how could I dare to label myself with a country I had never seen with my own eyes?

As originally planned, we Espineli women finally set off on our own Philippine journey on June 7, 2007, with the intention to canvass a selection of the country’s thousands of islands in a scant two weeks.  Like some kind of strange time warp across the International Date Line, Mom, my sister Lauren, and I left Los Angeles for Manila in the early evening of a Thursday and arrived in Manila early Saturday morning.  The sixteen-hour plane ride was punctuated with many hot meals – an unexpected treat given that U.S. domestic flights no longer serve meals.  The hot meals were Filipino dishes, which helped make it all the more real that we were finally going to visit our parents’ home country.  I remember feeling nervous about meeting my large extended family and wondering what they would think of us.  Mom is the fifth of nine children, so we had plenty of aunts, uncles, and cousins to meet.  Dad only had three siblings, all of whom are now in the United States, but his uncle had eleven children and his aunt had sixteen, which makes for many more cousins, many of whom are scattered around the world (such as in Norway).

As per usual, Lauren and I procrastinated about packing, and we each ended up packing a huge suitcase, a decision we regretted as soon as we landed. In addition to all of our suitcases, we had a huge cardboard box filled with gifts and supplies for relatives. If you have ever passed the Philippine Airlines counter in the international terminal, you have probably seen many passengers waiting to check in huge cardboard boxes called Balikbayan ((“Balikbayan” literally means “returnee” or someone coming home after an extended stay.)) boxes. These boxes are a long-standing tradition that also adds to the cost of a trip to the Philippines – because you can’t just go there empty-handed. We brought old clothes, little gifts, and souvenirs, as well as foodstuffs like instant coffee, corned beef, and Coffeemate that are very expensive and hard to come by in the Philippines.

As we deplaned and made our way to the baggage claim, we felt the profound humidity engulf us while we tried to find our bearings. So this was what the tropics really felt like. Our first trip to the bathroom was an experience: we had to tip an attendant when we finished using the facilities. The last time I had encountered this was at a nice hotel, so it was a bit unexpected in an airport. Thankfully, we had been warned in advance to bring our own toilet paper, as this convenience is very much a Western one. As soon as we gathered all of our luggage, we needed to find our connecting flight to Tacloban. Our first stop on the journey was Mom’s hometown of Calbayog on the island of Samar, part of the middle region of the Philippines known as the Visayas.

As we dipped beneath the thin layer of clouds, we got our first peek at the lush greenery that awaited us. I had seen some photos of Mom’s hometown, but they were mostly of people and buildings, so my imagination had never filled in the amazing nature that enclosed it all. I wondered why my mother never mentioned this…then again, it was probably something she saw as normal and not worth pointing out to us.

Fortunately, we got help acclimating to our new environment. Our uncle Tito Ecot (Mom’s brother-in-law) and our cousin Francis met us at the airport in Tacloban. Tacloban is on the island of Leyte and is best known as the humble birthplace of Imelda Marcos – a factoid with which we were immediately supplied. Tito Ecot and Francis had hired a van for the day to pick us and our luggage up, since it was a five-hour trip by car to Calbayog. I think our luggage outweighed us, so this was good planning. Tito Ecot warned us that it would be a bumpy road, but that was an understatement. The potholes in some places were so deep that the driver would drive on the dirt shoulders, which were actually smoother than the road itself. We were amazed that this was the main highway of Samar! When we asked why the roads were in such a state, Mom explained that, due to political corruption, funds for public works were siphoned off to more personal interests. This got me thinking about how much I took for granted in the United States. Despite the frequent potholes I encounter in the Boston area, I don’t complain about them anymore. Having a road in good repair is not a right but a privilege in my mother’s home province.

When I asked whether they would ever repair the road, my uncle and cousin laughed. They explained that the road had been and would always be dangerous to travel and that they avoided taking this route when possible. Our cousin Francis also mentioned that Calbayog’s airport might soon offer flights to and from Manila (it does today). It was great to see Francis, having known him only through photos and relatives’ stories. I knew he was a little older than me and that he and his twin brother Terrence were both married with kids. I looked forward to connecting with him and all of our cousins. I could not help but wonder what we would talk about, whether we had any interests in common, and what they would think of me and Lauren and our American ways. It was a nice surprise to discover that he had so much to share with us about the Philippines, including local attractions and historic sites that he wanted us to see.

Before we started out on our treacherous five-hour journey to our relatives’ hometown of Calbayog, we took a quick trip to a nearby monument. General MacArthur’s promise, “I shall return,” was one of the few tidbits I remembered learning about the Philippines in my high school world history class. It was a surprise for Lauren and me to learn that we could go to the exact spot where General MacArthur had indeed returned with forces to liberate the Philippines at the end of World War II. The monument’s statues of MacArthur and his officers looked to me like performance artists standing in water. The monument recreates how MacArthur and his men waded through the Pacific waters to return to the Philippine shores, fulfilling his famous promise.

Just as I had once felt awe standing in front of Notre Dame Cathedral in Paris, where so much of that city’s history had taken place, I felt chills standing where such a momentous event for both the United States and the Philippines had occurred only 63 years before. Now here were my sister and I, making our way to the country that our parents left behind to pursue a brighter future in the United States. Would they have left the Philippines had MacArthur not returned as promised? The Philippines would never be the same and still struggles with the repercussions of that moment today. Gone were its Japanese oppressors, and in came the democratic saviors. But at what cost? Did the United States seduce the Philippines with so much of its culture and language that we first-generation Filipino-Americans feel even more of a disconnect from our ethnic origins than other first-generation Asian-Americans? I couldn’t help but think of all the implications that MacArthur’s return had for the Philippines’ destiny as well as my own.

We crossed a bridge connecting Leyte to the island of Samar, and I was blown away by this island of palm trees. It looked completely untouched by human hands, the palms growing thick and wild to the very edges of its shores. How many islands were there like that in this archipelago of thousands? Crossing that bridge made me think of the threshold I had waited so long to cross – to become one of those Filipino-Americans who have been “back.” Of course, I did not feel any different, but I knew that thereafter, I would never be the same.

(Sheila Espineli’s travels in the Philippines will be continued in a later issue)

By Sheila Espineli | The post My So-Called Asian Identity: I Shall Go: In Search of My Filipina Roots appeared first on The Public Sphere.

Becoming Nona: Memories of a Grandmother http://thepublicsphere.com/becoming-nona-memories-of-a-grandmother/ Tue, 15 Sep 2009 06:10:16 +0000 http://thepublicsphere.com/?p=1550

Nona. To a platoon of us Americanized cousins that included my little brother and me, our maternal grandmother was always Nona. “Nona” is not a common term for grandmother in Latino families. Abuelita is much more widely used, especially in Mexican families, but my grandmother trained a whole wave of her first- and second-generation immigrant grandchildren to use “Nona.” You see, we “americanos,” as Nona described those of our generation (even if technically we had been born in our original home country of Peru), spoke utterly broken Spanish.

Describing the Spanish that we used as “broken” is like saying water is wet. Our mangled word pronunciation, notoriously bungled syntax, and grammatical non-sequiturs were linguistic train wrecks in the making every other second that we opened our mouths to “articulate” our breathlessly pidgin Spanish. In contrast, Nona and her adult children spoke a sturdy and grammatically flawless Spanish. So all things considered, our grandmother had a world of patience for the linguistic disasters that we sent crashing her way during our everyday conversations with her.

There was one exception. In Peru, the term for grandmother is “mamavieja,” an affectionate if rather formal compound title comprising four syllables that translates into “Old Mother.” My older brother, my senior by nine years, and his contemporary cousins enunciated this word perfectly. Alas, “mamavieja” was at least three if not four syllables too long for us latter-born “americanos” to ever come within a Peruvian kilometer of pronouncing even semi-correctly.

So our grandmother, one of the most practical people I have ever known, intervened before I myself was out of diapers and drew the line with the then-present and all future grandchildren. “Nona,” which means grandmother in Italian and other cultures, was so comparatively easy to say that not even we could blow the pronunciation. So “Nona” her title would be, and “Nona” she always was to us, even after her death in 2002.

Since my grandparents lived with my mom, my brothers, and me in an extended-family household until I turned sixteen, Nona played a towering role in the world I grew up in. Because my mom worked the night shift during my early grade school years, Nona was the one who got me up for school in the morning, and Nona was the one who waited for me when I ambled home from school, as my mom got in what rest she could before she was off again to her nighttime job.

Nona was old-school strict and old-world tough. She grew up in the 1920s on a windswept and isolated mountain ranch in the nether reaches of the northern Peruvian Andes, far above Peru’s second-largest city of Trujillo. The glorified hamlet of about 150 people that was her ancestral hometown carried a Quechua name, Paranday. Paranday in the 1920s more closely resembled, say, Fargo, North Dakota, circa 1890 than the relatively antiseptic 1980s-era California surroundings that I walked out to every time I left the family house. In fact, Paranday was so geographically and technologically shut off from the rest of the country that the town, along with all of the surrounding mountain ranches like Nona’s, was completely inaccessible by car until after 1981, nearly sixty-five years after Nona was born. Until that year, any hardy soul trying to reach Paranday from the nearest sizeable population center had to do so Old Testament style, traveling twelve hours by donkey just to make it to the town limits.

Nona’s upbringing was forged in the crucible of this frontier-like environment. She grew up living a utilitarian and hardscrabble life that put iron in her blood. Six of the seven children she gave birth to were born right on the ranch she grew up on, without the benefit of epidurals or any other kind of modern anesthetic. All things considered, it is safe to say that Nona brought her frontier values with her everywhere she went, and this was as true in how she raised me as it was for anything else. One thing that meant was nothing ever went to waste. Let me repeat: Nothing. Wasted. Ever.

This was most especially true in the area of food. Nona’s rural upbringing, which meant she was intimately familiar with the back-breaking manual labor involved in cultivating agricultural products, and Nona’s legendary cooking wizardry in preparing her home-cooked meals, combined to form in Nona’s heart an exalted appreciation for the sanctity of food. Thus, for Nona, throwing away food was akin to an insult against God’s benevolence and an affront to the starving Ethiopian children depicted in what at the time felt like an infinite loop of World Vision television commercials.

In my early grade school years I was often Nona’s captive audience for one of her home-cooked meals. Ever faithful to her Spartan values and rural heritage, Nona naturally considered me morally obligated to eat all of the food she served on my plate. This stayed true even if the designated mealtime consequently tumbled into an overtime period of interminable length because of my passive resistance to what I then considered Nona’s culinary tyranny.

Those endless mealtimes often devolved into a test of wits between Nona and me. However, school morning breakfasts were especially perilous for my second-grade self because Nona insisted on serving me a daily bowl of Quaker Oats oatmeal, and there was a school bus to catch, so I was up against a clock in addition to Nona’s formidable resolve. Now, Nona always mispronounced this non-Spanish word for oatmeal as “Quack-errr,” dutifully left out the Oats part, and saw it as her grandmotherly duty to make me ingest this particular kind of breakfast meal down to the last soggy oat. As for me, I was just as determined not to. In fact, I felt I had a sacred responsibility to my kid palate not to drink the despised Quack-errr down to anything like the bottom part of the bowl, where all the doomed soggy oats submerged to rest in watery oblivion.

However, I could not argue this point with Nona directly. I never did, as I had been raised not to. At this particular point in my family’s immigrant experience the rules were so strict that young children could never for any reason so much as say the word “No” to any responsible adult. So despite my kid’s eye view of the tragic injustice involved, no way and no how was I going to start the soundtrack of “No” with Nona around the consumption of Quack-errr.

Instead I employed subterfuge and tactical misdirection wrapped up in a metaphorical falafel of non-violent resistance. Dr. Martin Luther King Jr. organized historic sit-ins for racial integration. John Lennon choreographed a televised 1969 bed-in for peace. And at age seven I began staging spoon-ins for escaping the de facto jail that Nona’s kitchen table was to me.

You may ask, what was a “spoon-in”? While Nona watched (or, more accurately stated, pretended not to watch) me “finish” my breakfast from the business side of the kitchen (where the oven was), I dramatically and repeatedly buried my spoon deep into the tilted bowl and pretended to scoop out every one of the surviving oats and eat them all, thus justifying, in Nona’s eyes, my leaving the kitchen table. My goal was to sustain my spoon-in pantomime convincingly and long enough that Nona would be distracted by a phone call or a bathroom break or some other minor miracle that would put me outside her line of sight. This in turn would allow me to jog, sight unseen, to the kitchen sink and flush the offending Quack-errr oats down the drain before Nona was any the wiser.

My spoon-ins were occasionally successful, but in truth, Nona usually achieved her goal of making me eat everything she set on my plate. She could and often would wait out my spoon-ins because right after breakfast she walked me straight to the school bus stop. Even at age seven I knew the school bus waited for no one, not even anti-Quack-errr kid crusaders like myself. And seeing as how Nona physically stood in the middle of the only possible route to the kitchen sink, unless she was distracted or otherwise called away from her ambush spot, my spoon-ins were doomed to fail. Of course, the quiet irony is that at this point in my life I would gladly trade any number of material things to be able to taste any and every part of Nona’s cooking again and to hear, even if only one more time, the soft grandmotherly laugh that she would so often share with me at the beginning of our mealtimes together.

Nona had a wonderful mealtime laugh, I assure you. Her laugh was vibrant, infectious, and carried within it a love of life that found its original expression in Paranday and brought its resilience and generosity to my little childhood corner of Pasadena. No matter where I am, I can hear its echo in my memory and know how blessed a grandkid I am to have had her in my life. Nona’s laugh was graceful, loving, and communicated the elemental essence of who she was, how she lived, and where her truest treasure could be found.

By Cesar Gomez | The post Becoming Nona: Memories of a Grandmother appeared first on The Public Sphere.

I Am Indignant – These Are the People We Have to Look up to Now? http://thepublicsphere.com/i-am-indignant-these-are-the-people-we-have-to-look-up-to-now/ Tue, 15 Sep 2009 06:08:40 +0000 http://thepublicsphere.com/?p=1552

In case you hadn’t heard, Michael Jackson, aka the “King of Pop,” passed away earlier this year. Even though no one had really heard anything from him in a while, and the last time he was in the media it had something to do with allegedly inappropriate relationships with kids, his death was kind of a big deal. In fact, it was one of those events for which newsroom directors the world over fall to their knees and thank the media gods. If you were anywhere near a television or computer or people talking, there was no escaping the momentous news of his unexpected passing. For that entire weekend, it seemed as if nothing else of note had taken place anywhere in the world.

It exemplified the extent to which our culture has become irrationally obsessed with celebrity.  At the time, I couldn’t resist joking about how Michael Jackson’s death had brought world peace, simply because it created a media blackout of everything else.  Many people were disturbed by the level of attention Jackson received, especially when the Iranian government was violently repressing election protestors, over 70 people had just been killed in another bombing in Baghdad, the US government had just sent arms to aid the Somali government’s fight against Islamists, and, of course, the governor of South Carolina had just admitted to having an affair.  But in a way, it made sense to focus on the sudden permanent loss of a person whose fame will most likely never be equaled, a person whose death actually does signal the end of an era.

Unless you happen to be a member of one of those South American tribes who have managed to exist completely isolated from the modern world, you knew who Michael Jackson was.  That’s only slight hyperbole.  I remember, as a kid, seeing video footage of his concerts in Europe and Asia, even in Russia during the Cold War.  He had fans in Iran during the Revolution.  My own father, who deliberately ignores almost everything that could be considered pop culture, has fond memories of listening to the Jackson 5 in his younger days.  For the entire decade of the 1980s, Michael Jackson was probably the most famous non-politician on the planet.  He’d worked for it, and he’d earned it.  There is something to be said for that.

Only a handful of artists have truly made an enduring mark on popular culture in the past century: Charlie Chaplin, Elvis Presley, Audrey Hepburn, the Beatles, Madonna, Michael Jackson, to name a few. These are people whose images and work are recognized almost everywhere. They displayed talent, hard work, and dedication, and what they created inspired people all over the world. They also gained their fame and popularity long before the age of “new media.” Perhaps it’s no coincidence, then, that of all the faces featured in current celebrity-focused magazines and websites, none stands out as a potential Beatle or Madonna. I’m convinced none ever will, because with the rise of 24-hour news, internet tabloids, and social networking sites, our concept of fame and our ability to recognize and bestow it have been utterly altered.

We live in the Age of Information. The internet is the great democratizer. Anyone with a mobile phone can broadcast their thoughts and observations to any number of people at any time via Facebook or Twitter. Anyone with a video camera can subject the general public to their pets’ quirks, their friends’ idiocy, or anything else via YouTube. This is all well and good, but it has had a few consequences. One is that everyone wants to be famous and believes not only that they should be, but also that they deserve to be. Another is that fame itself has been completely diluted.

Thanks to the prevalence of reality TV and voracious internet tabloids, there are so many famous people in this country that I gave up trying to keep track years ago. Names I have never seen or heard before pop up in the latest celebrity gossip headlines every day. They’re always treated as though everyone naturally knows who they are. Most of the time, not only do I not know who they are, I can’t even discern what they might have done to warrant their apparent fame. As it turns out, most of them haven’t done anything beyond mugging for the cameras on some random cable network reality show or dating someone with a well-connected PR person. This generation of celebrities has earned its fame by being the bitchiest, sluttiest, craziest, crudest, most racist or sexist person in the cast of whichever reality show they appeared on. They don’t seem concerned with displaying any real talent or holding any responsibility, only with their own notoriety. Media outlets like Us Weekly and TMZ highlight every scandal, every bar brawl, every traffic ticket and botched Botox job that these personalities can conjure. And the public consumes it like a drug. No one seems particularly concerned with the fact that fame of this kind is especially fleeting in this age of instant gratification. With so many outlets, so many sources, so many contenders, the public consciousness can only process each one for so long. Like bubbles on a playground, these celebrities rise and burst in an instant. Occasionally, they snap, like the contestant from a VH1 show who apparently murdered his ex-wife, became a fugitive, and ultimately committed suicide. Or the DJ who was known among Hollywood celebrities but whom I heard of only because he’d died of a drug overdose. Or the unfortunate Jon and Kate, whose marriage disintegrated in the glare of the spotlight, which boosted their show’s ratings, but at what cost to their eight kids?

Of course there have always been one-hit wonders, flash-in-the-pan starlets, and child stars who disappeared after they hit puberty. But most of them made some kind of positive contribution to the entertainment world while they had their moments, whether it was a fun, catchy song or a movie that made people happy. Many were part of a larger pop culture trend (80s hair metal bands, for example) that had its day and faded. I can only hope that the current obsession with superficiality in celebrity is one of those. As it stands, it is beyond me how people who become famous for shooting each other with staple guns on cable TV (does anyone even remember those guys?) can possibly be making a positive contribution, let alone a lasting impact that inspires anything good in anyone. And I find it a bit sad that, given the viewing public’s devotion to a show like American Idol, even the competitors who show real talent and stage presence usually last barely long enough to release an album before that same public has lost interest. Some don’t even last that long.

Ancient heroes sought glory, fame, and fortune in quests and on the battlefield. In the early days of Hollywood and in the old Broadway musicals, a small-town kid was always trying to break into show business to become a famous actress, singer, or dancer. For all his inexplicable eccentricities, Michael Jackson was an extremely talented musician and performer. People gained fame because they had unusual talent, determination, charm, intelligence, or at least savvy. Even people who sought fame for its own sake had to do something to earn it. Madonna, for example, could never really sing, but she’s an intensely ambitious self-promoter, and she worked her ass off, quite literally, to become a world-class entertainer. With the rise of new media, our admiration of talent and dedication is fading, along with our capacity to appreciate a well-crafted coupling of gifted performance and marketable personality. Now we just pay attention to whoever makes the most noise until they are drowned out by someone else. Mass media truly does represent the masses now that just about everyone has a digital camera and internet access, but there are very few filters and even fewer incentives to create anything of quality. As Andy Warhol predicted, people who have done nothing more than lip-sync in front of a webcam seem to feel entitled to their fifteen minutes. Fame has always been something to aspire to and admire, but very rarely to achieve. The whole point was that not everyone could do it. It meant more than having your picture taken on a red carpet and posted on Perez Hilton’s website with graffiti over it. It took more than sitting around gossiping with your friends in front of a video camera. And yet, it seems that this is what fame means now. But in this world, where anyone can become famous for the slightest or most random act, how can fame mean anything at all?

By Paloma Ramirez | The post I Am Indignant – These Are the People We Have to Look up to Now? appeared first on The Public Sphere.

Bumps: Confessions of an Amateur Phrenologist http://thepublicsphere.com/bumps/ Tue, 15 Sep 2009 06:02:23 +0000 http://thepublicsphere.com/?p=1562

Lately I’ve spent a lot of time looking at the size of people’s eyes. Not just the eyes themselves, but also the area around them: the bags under the eyes, unusually heavy lids, prominent brows, and all the rest. Strangers I stare at furtively, from behind sunglasses or in sideways glances. With friends and relatives I can make direct eye contact, but holding it too long can create uncomfortable intimacy. And it’s not intimacy I want; I’m measuring. Gathering data.

All the while I ask myself: what is a large eye? A small eye? What is a normal-sized eye?

Two hundred years ago, a young Austrian medical student found himself with the same question. He was struggling in school, and he was jealous of those classmates who so easily excelled at memorization. In interminable lectures he watched these men, trying to figure out what made them different from him, why it was so easy for them to remember and so difficult for him.

It was the eyes, he decided. They all seemed to have larger eyes.

This young medical student was Franz-Joseph Gall, and this simple, odd insight would within two decades bloom into an unstoppable cultural force. Convinced of this causal connection, Gall began to look for other correlations between mental attributes and physical appearance. “Proceeding from reflection to reflection,” he would later write, “from observation to observation, it occurred to me that, if memory were made evident by external signs, it might be so likewise with other talents or intellectual faculties.” Gall set out looking for other correspondences between physical appearance and personality, and from then on, “all the individuals who were distinguished by any quality or faculty, became the object of my special attention, and of systematic study as to the form of the head.”

Gall’s obsession drove him to search for a visible means of discovering the brain’s secrets: a process he called “cranioscopy”—what became colloquially known as “bump reading” and what his pupil Johann Spurzheim would rechristen “phrenology.” It was predicated on a few simple principles. First, Gall theorized that, all other things being equal, size determines propensity: a bigger brain implies a higher capacity for intelligence. This was, Gall asserted, equally true of different parts of the brain—if the segment of the brain devoted to memory was larger in one individual than in another, then it stood to reason that the former would have a higher capacity for memory. Second, it was well known that the skull, like all bones, is malleable at birth, only gradually becoming more rigid. So it stood to reason, Gall theorized, that the ridges and folds of the brain might imprint themselves on the bone while it was still pliable, and that one could come to know the brain by understanding these imprints. From this apparent insight Gall began to explore the possibility that the brain’s workings might be made visible by the patterns it made on the skull. Each part of the skull was assigned a different aspect of personality—mirthfulness in the temples, sexual propensity at the base of the skull, and so on. With precise measurements of each of these areas, Gall theorized, you could develop an entire picture of an individual’s character.

One’s identity, in other words, was written in the bumps of one’s head.

The rest of the story of phrenology is well known enough: blossoming into full-scale quackery, it became a juggernaut of an industry unto itself, even as it was more and more discredited by legitimate science. By the twentieth century it was all but abandoned, but in the nineteenth century it was perhaps the most popular mode of understanding the human brain. In his preface to Leaves of Grass, Walt Whitman proclaimed, “the anatomist chemist astronomer geologist phrenologist spiritualist mathematician historian and lexicographer are not poets, but they are the lawgivers of poets and their construction underlies the structure of every perfect poem.” It seems odd that the one profession on this list that actually purports to deal with who we are, why we’re motivated to do what we do, and how we define ourselves, is the one profession that seems so startlingly out of place nowadays. But it makes some sense that the rest of the disciplines on Whitman’s list are hard sciences, since phrenology presents itself as the hard science of the mind, a system of objective measurements that offers, in its own way, a certain amount of rigor. Phrenology has none of the messiness of psychoanalysis or modern therapy; the phrenologist doesn’t care about your dreams, needs no narratives about your past, your abusive parents, your failed aspirations. Everything the phrenologist needs is right there, laid out in a perfect, analytic grid. Your mind revealed in the same topographic language the lexicographer would use.

For all the ridiculousness of such a premise, there is a simple elegance in such a map of identity, where everything is so neatly arranged, so perfectly knowable. I’m not the only one who’s drawn to the trappings of Gall’s pseudoscience—lately, phrenology charts have popped up everywhere, from CD covers to bicycle helmets. They’re a graphic designer’s dream: iconic, ironic, eye-catching, nostalgic. But as much as layout artists may fetishize Gall’s chart nowadays, no one is eager to revisit the science. I’m not bothered that phrenology—with its dubious method and explicit racism, sexism, and all the rest—has disappeared. Good riddance. But what intrigues me is that such a ubiquitous measure of personality has literally disappeared off the face of the earth in less than a century. Compare the number of people who can read Egyptian hieroglyphs or ancient Greek to the number of practicing phrenologists—there are dead languages and there are dead languages, and the language of phrenology is about as dead as it gets.

And this is where my problem begins. For the past year, I’ve been trying to teach myself phrenology, this now-dead art.  At first I assumed this would be a fairly easy task, far easier than reconstructing Egyptian hieroglyphs from the Rosetta Stone. After all, the relics of phrenology are visible everywhere; libraries and online resources still preserve the literature. It’s everywhere in popular memory—the pseudoscience to end all pseudosciences, the template for every self-help scheme from The Secret to the Master Cleanse. How hard could it be to learn it?

It was easy enough to track down what I thought would be the Holy Grail: Lorenzo Fowler’s “Self-Instructor in Phrenology.” Lorenzo and his brother Orson did far more to popularize phrenology in the United States than anyone else, selling their now-iconic busts and performing thousands of readings out of their New York headquarters. The title says it all: who needs phrenological experts when the book promises to let you teach yourself?

“To TEACH LEARNERS those organic conditions which indicate character is the first object of this manual,” the preface boldly proclaims. “And to render it accessible to all, it condenses facts and conditions, rather than elaborates arguments—because to expound Phrenology is its highest proof—states laws and results, and leaves them upon their naked merits; embodies recent discoveries, and crowds into the fewest words and pages just what learners need to know, and hence requires to be STUDIED rather than merely read. ‘Short, yet clear,’ is its motto. Its analysis of the faculties and numerous engravings embody the results of the very extensive observation and experience of the Authors.”

The library copy I acquired, an original from 1850, even has its first owner’s chart, filled out by Lorenzo Fowler himself, with each region given a number on a scale from 1 to 7. His pencil marks are faint but still visible; I found myself wondering what a graphologist would make of them. But as tantalizing as Lorenzo’s presence in these pages is, it is also the problem: the book’s owner did not phrenologize himself. As the preface goes on to explain, the actual work is done by the examiner, in this case, Fowler: “The examiner will mark the power, absolute and relative, of each function and faculty, by placing a figure, dot, or dash on a line with the name of the organ marked, and in the column headed ‘large,’ or ‘small,’ according to the size of the organ marked, while the printed figure in the square thus marked refers to those pages in the book where, under the head ‘large,’ ‘small,’ etc., will be found description of the character of the one examined in respect to that organ….”

This is the problem—the Fowlers don’t teach you how to read heads, they teach you how to interpret their readings. And the bust they sold is great for learning where the various propensities of Amativeness, Philoprogenitiveness, Adhesiveness, Inhabitiveness, Alimentiveness, and all the rest are located, but it’s useless for separating a “3” from a “4.” You still need a phrenologist, one who knows how to classify the size of each bump.

In all the phrenological literature I’ve scoured, there’s not one description of bump size in objective terms, no measurements that can be applied to a contemporary head. How does one even objectively measure such bumps? In centimeters? In degrees? What is the “normal” shape of the head, from which one could single out a noteworthy bump? Once proprietary trade secrets, now these secrets of identity are likely lost for good. As with any dying language, without a living community practicing phrenology, its mysteries have disappeared from the storehouse of knowledge.

So I spend my time trying to reconstruct this data, taking my measurements, looking for enough statistical data to form a working knowledge of an elusive “average” by which to judge the remainder of humanity. Not unlike the work of the Egyptologist, there’s an archeological aspect to this work, a reconstitution of a forgotten discourse.  I have no dreams of spreading the bump-reading gospel. The question for me has never been: how do we resurrect phrenology? Rather, the question is: what does it say about our ideas of identity when a “science” (however dubious) can go from such importance to the dustbin of history, in such a short space of time? The disappearance of phrenology suggests that the study of identity isn’t like biology—it doesn’t necessarily move inexorably forward, building on past discoveries. Each age has its own ideas about identity, and its truths are always in flux.

By Colin Dickey | The post Bumps: Confessions of an Amateur Phrenologist appeared first on The Public Sphere.

Of Red Shirts: The Saga of the Minor Character in Someone Else’s Epic http://thepublicsphere.com/red-shirts/ Tue, 15 Sep 2009 06:02:20 +0000 http://thepublicsphere.com/?p=1557

“Remember Mortimer, there are no small actors. Only small parts.” — from the play, “The Fantasticks” (end of Act 1)

Much to the surprise of my parents and my black community, I found friends among my peers in my private, predominately white elementary school. That’s fine for children, said some of the elders at my black Pentecostal church. Even before the Civil War, they would say, white children played with the black slave children. However, these elders would add in hushed tones, once the children came of age, those friendships were impossible. And so it will be with you, the elders said, as I came of age in the early 1980s. Some boundaries, the elders said, are impossible to cross.

In college, some of my roommates who shared my theologically conservative upbringing were skeptical about my secular peers, especially my friends who were neo-atheists and, in some cases, Wiccan wannabes. My conservative friends were fine with my having relationships with “non-believers,” as long as I was trying to convert them to Christianity. Other than that, my religious friends said, these relationships were impossible and would eventually fade once the superficial boundaries of dormitories and classes ended with graduation. Some boundaries, my friends said in quiet, prayerful tones, are impossible to cross.

Much to my delight, while having dinner with two college friends in the early 1990s, we realized that our friendship had lasted more than ten years. We marveled at how our college-era acquaintanceship had evolved into lasting friendships. We were from different ethnic and religious backgrounds, and all of us had grown up in communities that cautioned us against alliances with the communities that we each represented. During that dinner, we talked about how we were able to cross the supposedly impenetrable boundaries that we had been raised with, the fences that were supposed to keep us within communities often defined more by who we were not than who we happened to be.

At first, we thought we had remained friends because we were able to forgive each other. We had other close friends in college, some of whom we assumed we’d be friends with for the rest of our lives. However, disagreements, busyness, distance, and shifts in ideology ended many of those relationships. So despite our ability to forgive each other for various clashes, forgiveness alone did not seem to explain why we had managed to remain friends into adulthood.

Perhaps we were friends because the world had changed so much that the boundaries of our childhood were no longer applicable. Those ethnic and class boundaries that once confined us to a station in life were now looser. A shared college degree from the same institution also leveled our playing field. We found that we had arrived on the doorstep to adulthood with more baggage from college than from childhood. Perhaps we were surprised at how four years at the same institution created new bonds that now redefined our communities of origin. The old fences of ethnicity and religion still mattered; however, four years in the same place created new alliances and boundaries.

And in the new communities formed by this common experience of college, we discovered that we as a group of friends shared something that barely registers in today’s multicultural discussion. This “something” is probably what gave us that additional comfort level with each other. The best way I can describe this “something” is that my friends and I all come from ethnic and religious communities that had once been on someone’s list for being wiped from the face of the earth. Now, this aspect of our identity is not the kind of thing you introduce yourself with: hello, my great-grandparents were once forcibly detained in some manner (concentration camp, reservation, ghetto, sexual, ethnic or religious discrimination laws, immigration status designations) for some difference deemed dangerous by the majority culture. Although these nineteenth- and twentieth-century atrocities are rarely discussed in polite company, even among Jews, African Americans, and Native Americans, this legacy of oppression still defines these communities. These narratives of communal and shared oppression are often talked about among close family members and friends. The stories of pain are spoken in whispers over dinner and drinks, often while reflecting on the latest news of some genocide somewhere in the world. Our whispers tell stories in which our family members were not the main characters, but the secondary, unnamed cast members, the corps, the nameless masses, the expendable people who were not important to some oppressor’s major plot point. And this aspect of our identity as the secondary character in someone else’s story of glory and power is a powerful moniker. For lack of a better metaphor, being someone else’s minor character is like being the doomed “Red Shirt” in a popular television series.

The Red Shirt character is a colloquial reference among fans of a 1960s-era television science fiction program. In Star Trek’s opening scenes, two or three of the lead characters (often wearing yellow or blue uniforms) would land on a planet, accompanied by one or two characters wearing red uniforms. Within the first ten minutes of the show, someone wearing a red uniform generally died, and her or his demise introduced the central conflict of the episode’s plot. So if someone appeared in a red shirt at the beginning of the episode, you knew that this person, no matter how likeable or competent, and regardless of how much he or she connected for the moment with the yellow- and blue-uniformed lead characters (often the stars of the show), was toast.

When I first walked into my private, white religious school as a sixth grader, an African American from the inner city, my classmates probably looked at me with a mixture of shock and pity. They had designated me a Red Shirt in the meta-narrative of their educational experience, which was supposed to result in a high school diploma, the gateway to college, business, or some other kind of suburban success. This suburban success would elude me, the new black kid, because I was slated to suffer some kind of fate early in the narrative of our shared school experience. This new black kid, they might have thought, is probably a nice person, but the poor girl is doomed. She’s probably a future welfare mom, I imagined people thinking, or perhaps a future member of the service industry that would help cater to someone else’s suburban success. I remember being treated politely, but eventually people stopped reacting to me at all. I became invisible; maybe my expected short and irrelevant existence was too much to bear. As a Red Shirt, I could not be an equal in a community where the white children were groomed for the leadership and privilege that no minor character could acquire. I was merely to be a prop, or a token of their kindness. Eventually, for the convenience of the plot, I would be dismissed, either in actuality or existentially, through being ignored and rendered invisible. I suppose this is much better than being wiped off the face of the earth. Then again, there is not much difference. Either way, I was being removed from the plot.

Perhaps the warnings about crossing boundaries to make friends came from the reality that if you are the designated Red Shirt in someone’s narrative, the initial camaraderie can quickly devolve into the experience of genocide on a personal or communal level. The warnings were quite accurate, and there was wisdom in not becoming too comfortable with your friends until you understood where you fit into someone else’s narrative. Being a Red Shirt created insanity, psychosis, neurosis, paranoia, and addictive behaviors, all related to the strangeness of knowing that you are the extra, the easily disposable character, in someone else’s epic narrative. It’s probably why so many marginalized people end up being designated the “crazy” Red Shirt person. As part of the elimination process, the crazy Red Shirt person is blamed for their own negation, thus relieving the main characters of guilt and ensuring their roles as heroes in their own meta-narratives.

So, in an effort to find true friends and avoid insanity, I heeded the warnings, made friends cautiously, and tried to live out my own meta-narrative in which I was the lead character and conquering hero. I had not planned on the narrative’s transformation. The change started after college in the 1990s, when my Red Shirt status expired and was replaced with a new narrative shaped by the shared experience at an institution that treated me not like a minor character, but as an equal with my peers. My new uniform after graduation was not red. I was no longer the character whose demise was required by the plot of the larger narrative. I had become a productive member of society with a college degree and thus no longer a threat to the meta-narrative of US culture…sort of.

While I enjoyed this new narrative status, I found that most of my friends felt similarly about their former Red Shirt status. Red Shirt status crosses ethnic, religious, and gender boundaries, as was also the case with the science fiction show. I remember taking comfort that it was not always the black character who died in the first ten minutes; the Red Shirt might be a man or a woman, a black or white or Asian character. The Red Shirt status of non-existence was an equal opportunity position.

For my friends (whom I have known for almost 25 years), this former Red Shirt identity was often a coat that hid our original ethnic and religious attributes. For some, assimilation was adopted in an effort to stay off the Red Shirt list. For others, assimilation was part of living out our own personal meta-narratives while ignoring the majority culture’s efforts to assign us the role of the doomed Red Shirt (by, for example, attending college and gaining access to networks of privilege). I found that beneath the surface of my friends’ skin lurked Catholic guilt, habits honed in former British colonies, a hidden ability to dance rhythmically shaped by a Celtic heritage, or perhaps a secret and unexpressed taste for kugel, bratwurst, and kimchi.

I vacillated between the paranoia of being someone else’s minor character in their majority-culture epic narrative and my newfound identity outside of my previous Red Shirt status. As I grow older as an African American, I must not forget my Red Shirt reality: that in someone else’s meta-narrative, I am not supposed to exist. I must hang onto the sane part of my paranoia as a reminder that someone’s meta-narrative once required my demise. This paranoia is not needed to keep me safe from false friends or tokenism anymore. What I hope is that by remembering my former Red Shirt status, I won’t absentmindedly write my own meta-narrative that assigns the role of the Red Shirt to some kind, jovial, and unsuspecting person out of convenience or in a delusional attempt at some kind of suburban nirvana.

By Valerie Bailey | The post Of Red Shirts: The Saga of the Minor Character in Someone Else’s Epic appeared first on The Public Sphere.

Issue № 4 | June 2009 http://thepublicsphere.com/issue-4-june-2009/ Mon, 15 Jun 2009 04:09:35 +0000 http://thepublicsphere.com/?p=1425

We at The Public Sphere are celebrating the one-year anniversary of our test issue 0. Given the myriad anniversaries honored in 2009, from the French Revolution to the Chinese revolution to Venezuelan President Hugo Chávez’s show, Aló Presidente, it seemed an appropriate time to ponder the power and meaning of anniversaries while considering the complex issues confronting us in daily life regardless of annual recognitions. Linda Levitt takes up the impending 40th anniversary of the moon landing, querying mediations of NASA’s space explorations. Meanwhile, T.R. Kiyoshi Oshiro questions the role anniversaries play in individual lives, and Mohammad Razi reflects on his own anniversary, having lived through the Iranian Revolution. Continuing explorations of life as a Filipina American, Lauren Espineli examines the importance of language and public recognition in her own life. Responding to Mark C. Taylor’s editorial on the crisis of U.S. higher education, Marc Lombardo fathoms the deeper source of Taylor’s use of crisis as a descriptor while also considering how we might better understand the state of the university. Missing her local video rental store, Paloma Ramirez wonders about what we have lost in our transition from physical to virtual consumerism. Finally, Nikhil Thakur considers the five issues Republicans must confront if they hope to revive their party in the next four years.


By The Public Sphere | The post Issue № 4 | June 2009 appeared first on The Public Sphere.
