Monday, May 18, 2009

Sundays At Six is on hiatus!

Dear Readers,

First of all, thank you for reading my blog over the last few months. Whether you read every week or have only popped in once, thanks for checking it out.

Sundays At Six will be on hiatus this summer, as I will be bicycling across the United States with the organization Bike and Build. I'll have limited access to the Internet and less energy at the end of long cycling days, but I will be keeping a blog about my travels. It won't be the political commentary of Sundays At Six but instead a travelogue of my journey across the country and a journal of my experiences building houses with the affordable housing organizations we'll help out along the way.

Check out my blog at http://abbycoasttocoast.blogspot.com. I'll post as often as I can and try to include plenty of pictures.

Have a great summer!

Thanks so much,

Abby

Sunday, May 3, 2009

These days coughing in public can earn you looks of poorly concealed contempt. I know because I've been fighting a cold for the better part of a week, which happens to have coincided with the explosion of swine flu hysteria. Whatever I have is nasty and spreading prodigiously around campus, but it is not, I promise, swine flu. When I went to the health center to get checked out, I was shocked to see how many of the waiting students were wearing surgical masks. While the waiting room was no doubt full of icky ills, I think the surgical mask is a bit drastic. I considered making a joke about swine flu, but thought better of it. Making a joke about swine flu at the health center is rather like joking about bombs at the airport.

By the way, I realize it’s not called swine flu any more. But I’m a creature of habit, and H1N1 just doesn’t have the same ring to it. Whatever it is—swine flu, H1N1, influenza A—it is causing some funny behavior in humans.

Of course I believe that contagious illnesses are not to be taken lightly, especially when an illness is both contagious and deadly. But the alarmist headlines and non-stop coverage are doing nothing to help the situation. Fear mongering in the media has reached a fever pitch.

Fear-inducing propaganda is certainly nothing new. Our country has turned it into an art, beginning with the Salem witch trials and moving along at a steady clip right up to the color-coded terrorist alert scheme. I'm a frequent traveler, and never once has the alert been less than "orange." Have we really been at "high risk of terrorist attacks" for eight years? But the most disturbing aspect of the chart is that "no risk" is not an option. Even the lowest color, green, still warns of a "low risk of terrorist attacks." I suppose a terrorist attack is always a possibility, but so is getting hit by an asteroid, and there isn't a color-coded warning system for that (yet).

Perhaps humanity’s most visceral emotion, fear makes people do stupid things. Scared people don’t think rationally. Biologically speaking, this is a pretty good plan. When an organism is faced with an imminent threat, its ability to drop everything and assume a “fight-or-flight” mentality helps ensure survival. But what is an imminent threat? And who decides?

One of the biggest advantages of the hyper-connectivity age is the quick dissemination of news. In an emergency, the ubiquity of cell phones and the internet can help save lives. Many factors converged to create the catastrophe of the bubonic plague, but one wonders whether its effects could have been lessened by quick communication. Certainly, easy access to loved ones eases the psychological effects of disaster. Parents around the globe can calm their anxious worrying with a simple phone call or email, where in years past the Pony Express would have had to suffice.

While we can find things out fast, we can also find them out constantly. If I wake up at 3 am with a burning desire to know what's going on in Belarus at this precise moment, I can find out with a few clicks. But this also makes news pretty difficult to escape. As technology becomes more ingrained in society, it becomes more addictive. The constant onslaught of information can make it difficult to decide what is actually "news" and what is just filler.

When it comes to fear mongering, I think there are two different kinds at work: the sales-driven actions of the media and the more sinister programs of the government. I am by no means a believer in the Big, Bad Government. I don't buy conspiracy theories that the government is out to get us, nor do I think higher-ups spend all their time orchestrating evil schemes. But I do think programs like the terrorism warning chart are not entirely conceived with our best interests in mind.

The Bush Administration won support for many of its programs based on the assumption that we were under threat of attack. If most Americans hadn't believed terrorism was an imminent threat, it would have been a lot more difficult to justify policing the world and cracking down on the "Axis of Evil."

The fear mongering of the media is a very different case. Newspapers and channels are all pursuing the same thing: a profit. In the news media, the bigger the news, the bigger the audience (and the bigger the sales). If a political sex scandal or school shooting isn't available, the easiest thing to do is take a piece of real news and blow it up. I think this is exactly the case with the swine flu.

Obviously it is a tragedy that people have lost their lives to this illness. And obviously, since it is contagious, it behooves us to exercise caution. But screaming doomsday predictions only incites fear and produces illogical responses, like the slaughtering of pigs in Egypt. Sure, someone at the World Health Organization warned that the swine flu could reach "pandemic proportions." But the WHO's job is to keep contagious illnesses like this under control, so it makes sense that it would prepare for a worst-case scenario.

While I’m glad the people at the top are bracing themselves for impact, I don’t see any reason why the rest of us should be any more frightened than usual. Wash your hands, lead a healthy lifestyle, and cover your cough and you should be just fine. And spare that sneezing girl your dirty looks: she probably just has a cold.

Sunday, April 26, 2009

Since the release ten days ago of the so-called "torture memos" from the Department of Justice, the blogosphere has been abuzz with condemnation, excitement, and anticipation. Will CIA operatives under the Bush administration be prosecuted for greenlighting torture? Do the "enhanced interrogation techniques" described in the memos constitute torture in the first place? Can the United States finally move past this grisly period in our history in which we, according to Obama, lost "our moral bearings"?

First of all, I find the idea that the instances described in the torture memos constitute a loss of our moral bearings to be a bit laughable. Considering our history of genocide, slavery, segregation, internment, and a host of other ills conducted at home and abroad, it seems the United States is not recovering from a loss of moral bearings so much as suffering a setback in the from-scratch construction of a moral compass.

That being said, I do rejoice in the transparency afforded by the release of the memos, and am heartened by the communal outrage that has resulted. There is little doubt that "enhanced interrogation" is torture, in the minds of the American people anyway. And while I doubt that a majority of Americans realize that torture is forbidden under international law (meaning we are committing illegal acts by practicing it), the general consensus is that torture does not align with American values.

In his 2002 memo to John Rizzo, former Assistant Attorney General Jay S. Bybee spends no time justifying the use of torture. Rather, he takes 18 pages to explain why the interrogation techniques used against Abu Zubaydah are not torture, and why they are necessary. Specifically, he outlines ten different techniques, ranging from a facial slap to stress positions to waterboarding, and argues in detail why each cannot be considered torture. I find the memo (available to read on the American Civil Liberties Union's website) disturbing not just for the techniques described but for the way in which Bybee argues for their use. What he does, in essence, is attempt to civilize an uncivilized thing.

Whatever the great strides achieved by humanity since the dawn of civilization close to 10,000 years ago, we continue to practice many crudely uncivilized things. The biggest of these is warfare, along with the tools of war. Torture, while certainly not confined to wartime use, is defined as the deliberate causation of physical or mental suffering to an individual, and in war in particular it is used as a means of extracting information. But the problem with torture is that its very definition is inhumane. Regardless of motive, the deliberate causation of suffering to an individual is a violation of that individual's human rights.

Bybee seems to understand this, and spends significant time in the memo arguing that the suffering endured by Zubaydah as a result of the techniques really isn't that bad. The statute prohibiting torture in the United States defines torture as the infliction of "severe physical or mental pain or suffering." But who decides the definition of "severe"? Or for that matter, the definition of "suffering"? For his part, Bybee defines "severe physical pain" as pain akin to that felt from a major injury. And I would agree with him that none of the techniques in the memo is likely to inflict pain of that intensity.

But what about suffering? Certainly sleep deprivation, stress positions, confinement in small spaces, and waterboarding constitute suffering. What Bybee doesn't seem to understand, or what he simply chooses not to acknowledge, is that the whole point of torture in this context is to get someone to talk who won't do so willingly. Zubaydah proved himself resilient in standard question-and-answer sessions, so the interrogators decided to up the ante. And the only way to get someone to say something they don't want to say is to make it worth their while. Meaning, if you cause them enough discomfort (or suffering), they will say what you want to hear in order to make circumstances change.

My wording in the last sentence is deliberate. That is, I don’t believe torture is likely to produce any useful information, regardless of the ethical question. If suffering is great enough a person is likely to say anything just to get it to stop. But I don’t think we need to spend much time debating whether or not torture is effective, because it seems clear to me that its efficacy is a moot point.

Bybee takes pains to outline all the protective measures in place: the presence of medical and mental health professionals, the assurance that none of the techniques used would interfere with the healing of an injury Zubaydah sustained during capture. While he might have thought he was covering his bases, I think these precautions confirm without a doubt that the actions described are unethical. The presence of doctors attests to the potential of the techniques to turn dangerous, and the participation of mental health professionals substantiates the claim that harsh interrogation can have adverse psychological effects.

To me, if a technique has the potential to become dangerous, it should not even be considered. The problem, I think, is that interrogation as a tool of war is part of a larger scheme of uncivilized practices we really should have outgrown by now. Does Zubaydah have information which could be useful to the United States? Of course he does. But the need to hold enemies prisoner (really, the whole concept of "enemies") and interrogate them against their will points to an acute lack of human cooperation and belief in a common destiny.

In a way, humanity is like an adolescent. We have come far and grown much in our development, but we still have trouble rationalizing all of our choices or seeing our conflicts for what they are. Like teenagers, we are capable of both profound insights and dismal failures of judgment. But my hope is, like teenagers, we will with time grow to see the errors in our thinking and enter healthfully and happily into adulthood.

Sunday, April 19, 2009

There are some problems so multi-faceted and so controversial that it seems a solution will never be found. The Israeli-Palestinian conflict is one of these. History, religion, ignorance, racism, unending violence, and numerous grievances by all parties involved have created the most hotly contested political environment on planet Earth. The conflict tends to incite a for-or-against mentality in both people and governments, leaving little room for compromise or middle ground. The United States is no exception to this stubborn thinking. It is so set in its ways that it has chosen to boycott the United Nations' Durban Review Conference Against Racism, Racial Discrimination, Xenophobia, and Related Intolerance.

The absence of the United States, as well as Israel, Italy, and Canada, from the conference is just part of the controversy. The presence of Iranian President Mahmoud Ahmadinejad has incited outrage and worry that the conference will crumble into an attack on Israel. Hordes of protesters from both sides have already taken to the streets in Geneva. The International Jewish Anti-Zionist Network, the International Coordinating Network on Palestine, and the Boycott, Divestment, and Sanctions Committee hosted their own conference over the weekend, representing an extremist anti-Israeli attitude. Conference attendees equated Zionism with Nazism and Apartheid and called for Israel to be brought before the International Criminal Court for war crimes.

I find the stance of the United States and the stance of anti-Israeli conference attendees to be equally troublesome. This is not an either/or situation. Both Israel and Palestine are guilty of violence toward the other, and both bear some of the responsibility. While I tend to side with the Palestinians, I think comparing Zionism to Nazism is a bit over the top.

While I find the circumstances of the creation of Israel to be highly disturbing and regrettable, an obsessive focus on these beginnings will not help us move forward. After all, the circumstances of the creation of my own nation are highly disturbing, and a burden we carry with us to this day, but we cannot go back and undo them. All we can do is move forward.

The staunch either/or thinking on both sides of the Israeli-Palestinian conflict is what has kept us running in circles over the same problem for years. As both sides refuse to budge and the whole world adds its shouting to the din, the violence simply escalates, giving both sides more reason to dig in their heels and wish for the total destruction of the other.

As happens often with highly controversial subjects, it seems both sides refuse to see any truth in the other's message. Protesting the alternative conference, anti-Palestinian advocates said they were in Geneva to "fight the good fight." But of course, that is what anti-Israeli groups say about their message as well. It was my hope that under the Obama administration the United States might finally adopt a more balanced approach to the conflict, but so far the president seems to be maintaining the nonsensical attitude of his predecessors.

There will not be peace in the Middle East without an independent Palestinian state. There may not be peace with one, but there certainly will not be peace without one. Of this I am sure. The US's harsh stance on the Durban Review Conference stems from the conference document's assertion that the Palestinians have a right to an independent state and to protection from racism. The same document states that the Israelis have the right to freedom from racism as well, along with the right to national security, but this point seems to be lost on the Obama administration.

But the larger problem is that the Durban Review Conference is not just about the Middle East. It is about racism and racial discrimination in all their manifestations around the globe. It is about the caste system in India, the ethnic conflicts in Sudan and the Congo, and the discrimination against the indigenous peoples of Australia. And it is also about issues of racism of particular prominence in the US, such as gang violence and the burden the heritage of slavery places on the national conscience. What message is America sending the world by not being present for these discussions? What message is it sending its own citizens?

The conflict in the Middle East is absolutely, in part, a product of racism. It is also a product of religious discrimination, which these days can be pretty indistinguishable from racism in many contexts. So it makes sense that it would be a central talking point of the conference. Even if the United States expects to hold the minority opinion on this issue, why decide not to show up at all? If its opinions cannot stand up to scrutiny, maybe it needs to rethink its position.

Sometimes those who disagree with us are best left ignored (Ann Coulter comes to mind). And sometimes there really is no gray area between right and wrong, and a firm stance must be maintained no matter how long it takes to achieve victory. I certainly wouldn't want to sit in a room and listen to President Ahmadinejad deny the Holocaust and call for the destruction of Israel. But no more could I tolerate the insistence that Israel only uses violence in defense of lands it rightfully claimed. Both opinions are likely to come up at the Durban Review Conference, and the wisest delegates will refrain from entertaining either. This is not an issue of absolutes. The longer we give credence to extremists on either side, the longer it will take for a resolution to be found. And by boycotting the conference, the United States is taking an extremist stance of its own and missing out on the first reasonable step: conversation.

Sunday, April 12, 2009

On April 15th, protesters will gather around the country for TEA (Taxed Enough Already) parties. These cleverly named rallies are meant to send a message to President Obama that his stimulus package is not appreciated by all. While I am always pleased to hear of the engagement of citizens in political affairs, the irony of this particular situation is almost comical.

The gist of the TEA partiers' woes is the fear of an impending socialization of our nation's banking system. As one protest poster reads: "Revolt Against Socialism." I will refrain from wondering here why socialism is such a bad thing (after all, the free health care and college education of Western Europe look pretty nice from where I'm sitting…) because hatred of socialism is of course a calling card of the Republican Party. And to be fair, the government's response to the economic crisis is certainly a Republican's nightmare. With bailouts placing the government as a primary shareholder in many of the nation's largest industries, it's easy to see how terrifying this looks to a right-winger.

But what flabbergasts me about these protests is the complete lack of responsibility Republicans are accepting for the crisis in the first place. They should be wandering sheepishly around Washington with their tails between their legs, begging to be given another chance, not organizing large-scale protests around the country. It was precisely the extremist ideas of some Republicans that got us into this mess.

Of course, we could have had a Republican president trying to get us out of this mess. That Republican ideas got us here is hardly in doubt, and that a Democrat is attempting to get us out is the reality. But what if a Republican president were attempting to get us going again? I have to say, I am at a complete loss wondering what Republicans would have us do instead. After all, the government isn't bailing out the auto industry or mortgage brokers or banks for giggles and grins. It is doing so because the economy cannot get moving again without these industries recovering. If the government were to just let the housing market flounder and writhe in a prolonged death, or let the domestic auto industry decay into obsolescence, the effect on the economy would be devastating—possibly irreversible.

Many of the alternative solutions proposed by protesters are actually not bad. Yes, some are horrifically, terribly, dreadfully awful, such as expanding the search for American oil (what part of "The Oil Will Run Out" do you not understand??), but others are worth considering. For example, I would think homeowners' assistance is something everyone can get behind. The economy cannot recover without a functioning housing market, and tax credits for home buyers are a great way to get things moving again. But Congress already approved a $15,000 tax credit for home buyers. Are protesters asking that a larger tax credit be given, or are they just not aware that there already is one?

I also delight in the sweeping naiveté which must have been required for them to come up with the goal of “eliminating Congressional earmarks and wasteful pork-barrel spending.” Well sure, that sounds like a great idea, right up there with getting politicians to actually achieve everything they say they will in their campaigns. While they have been splashed across the headlines throughout this crisis, earmarks and pork-barreling are nothing new to this or any economic climate.

For those not familiar with the lingo, earmarks are special spending items that members of Congress request, regardless of the benefit of those items to the country at large. Sound like a bad idea? Darn right it does. But eliminating this practice would require the dethroning of one of the most powerful pieces of the governmental machinery: lobbyists. Lobbyists wield a huge amount of influence over government affairs (see: unquestioning support for Israel), and elected officials who attempt to rise above them do so at their peril. Because lobbyists have the power to give officials what they need most. Think about it: officials need votes to get and keep their jobs. So if a lobby threatens to withdraw its support unless it gets 100,000 bucks to build some ice rinks, you fork over the cash. Because really, in a multi-million dollar budget, who is going to miss 100 grand? But the problem is these earmarks add up fast.

The other problem is, in times like this, every penny counts. Am I thrilled that my tax dollars are going to hockey enthusiasts in Toledo? Not particularly—but I’m also not thrilled they’re going to Iraq to help kill people, and I’d like to think I can keep my priorities straight. Still, I agree that earmarks are a pestilence in our budget, and in a perfect world we could be rid of them. And that’s not to say that people aren’t trying. Heck, George W. Bush tried, or at least gave a nod to the effort every now and again. Maybe I’m too cynical, and maybe we can be rid of them, but I think anyone who considers these a primary financial concern on par with energy spending and a health care plan is a bit out of touch with reality.

The same could be said about pork-barrel spending. A mainstay of the United States government since the Bonus Bill of 1817, pork-barrel spending is much like earmarking except that it is the result of a more direct relationship between representatives and constituents, instead of representatives and lobbyists. Funding is given to localized projects in an official's district, and in return the citizens of that district continue to support that official. From this textbook definition, it actually seems like, well, isn't that kind of the idea? An official is elected to represent the best interests of his or her constituency, so wouldn't that include securing funding for projects which would improve the lives of voters? But again, the concept and the reality are somewhat removed from each other. Especially in a time of national (and international) turmoil, the federal government really doesn't have the money to indulge the pet projects of specific representatives, even if they are well-meant and possibly helpful to their beneficiaries.

Will President Obama’s stimulus package work? Only time will tell. It certainly is a shame that we are shouldering more and more debt that we will pass on to our children and grandchildren. But ignoring the ideas that got us here and suggesting we attempt to solve the problem with those very same ideas is preposterous and stupid. After all, the only bad mistakes in life are the ones you make twice.

Sunday, April 5, 2009

In the months following the November election, the blogosphere has been overflowing with predictions and speculation regarding the future of the Republican Party. How can it recover from its devastating defeat in 2008, everyone wonders. Can it pull itself out of the muck of the Bush Administration by its own bootstraps, or will it wither into increasing obscurity as the Bright Light of Barack Obama shines ever brighter? The answer, I think, could be the former, if the Republican Party could remember what it means to be conservative.

In his brilliant essay for Harper's Magazine, Garret Keizer scathingly points out the irony of the Republicans' current predicament. After all, the party "was supposed to stand for small government and fiscal restraint, and instead it has given us big government and the virtual socialization of large segments of our economy." When did conservatism call for the completely unrestrained frolicking of large corporations? Say what you want about the free market, but conservatives don't trust large bodies of people to do the right thing (no pun intended), and as a result they preach the virtues of small government high and low. So how can a true conservative (or anyone, for that matter) believe that corporations aren't susceptible to the same flaws which make a large government dangerous? After all, corporations have the same powers of influence as government, only with more money.

The politics of the Bush Administration have been based not on conservatism so much as absolutism. As Barack Obama points out in his book The Audacity of Hope, it was not a conservative idea that brought on the current financial crisis, but an absolutist one: no market regulation, no government interference, no safety net, rather than conservative participation in the same. To again quote Keizer, such blatant eschewal of authority reeks more of '70s adolescence than any political idea ought. He even refers to Reagan as "the last of the California hippies, a man who told us that if we just let the markets run wild and the Magic Bus of juggernaut capitalism go barrel-assing down the road with its freak flag flying all would be groovy and out of sight."

From that point of view, the idea of a free market economy seems to stem from a healthy dose of flower power as much as insatiable greed. Free love is a great idea too, in theory, but only in a world without chlamydia.

As everyone knows, the economy is seriously infected. All that playing sure was fun, but now we have an epidemic on our hands, and as is often the case, the ones who are most affected are not the ones that got us into this mess in the first place. And the sad thing is, the ones that did bring us here were the very ones who should have been the first to spot trouble coming.

But that’s water under the bridge now, and I don’t see any reason to add my voice to the finger-pointing multitudes. What interests me is where the Republican Party is going to go from here, whether they will forge a new identity which reclaims their conservative roots, or whether they will continue to alienate themselves from American reality with absolutist ideas.

Obama goes on to describe the "religious absolutism of the Christian right…[which] insists not only that Christianity is America's dominant faith, but that a particular, fundamentalist brand of that faith should drive public policy, overriding any alternative source of understanding, whether the writings of liberal theologians, the findings of the National Academy of Sciences, or the words of Thomas Jefferson." For a party which considers upholding the Constitution a matter of fierce patriotism and pride (see: maniacal obsession with the Second Amendment), they have a crafty way of sweeping "separation of church and state" under the rug.

Or take the example of abortion. Nothing illustrates how thoroughly the Christian right has overtaken the Republican Party better than the GOP's stance on a woman's right to choose. To quote Barry Goldwater's 1960 book, The Conscience of a Conservative: "the choices that govern [a man's] life are choices that he must make; they cannot be made by any other human being, or by a collectivity of human beings." Sexist language aside, that certainly sounds like a classic pro-choice argument. It seems to me that a true conservative would tell any government which attempts to control their body to take a long walk off a short pier.

Surely there are some aspects of the Republican Party which are reminiscent of conservatism. Its views on gay marriage can hardly be surprising, given that the very definition of conservatism includes a proclivity toward tradition and the maintenance of existing institutions. So I am certainly not saying that a return to conservatism by the GOP would erase its sins in my eyes, or earn my affiliation. But I do think a genuinely conservative Republican Party would be a lot more successful, a lot more effective, and a lot more American.

Sunday, March 29, 2009

It's funny how people can get all up in arms when an official simply asks that a law be enforced. Indiana Superintendent of Public Instruction Tony Bennett announced last week that starting next year, schools can no longer count half-days towards the 180 full days each year required by state law. He also ordered that schools make up days cancelled because of bad weather, as the law also requires.

One would think Mr. Bennett's announcement would be greeted by nary a bad word from anyone, save high schoolers with acute senioritis, but it turns out that government officials are getting just as riled up. House Democrats proposed a bill adding language to the state law which would allow schools to count half-days used for teacher training and parent-teacher conferences towards the 180 full days.

As a liberal, I find the Democrats' position to be a bit embarrassing. For a party which claims to be an advocate of education to the bitter end, working to keep kids out of school is a base move. Of course, it isn’t that the Democrats' sole aim is to keep kids out of school. They argue that teacher training and parent-teacher conferencing are vital to the overall performance of schools.

Teaching is a challenging job, and frequent training can help keep teachers in top form. Similarly, parent-teacher communication is crucial to students’ success. Whatever progress or struggles a student is experiencing in the classroom, involvement at home is one of the best ways to overcome problems or ensure advancement.

But while these things are important, so is making sure students receive adequate instruction each year. With the school year making up just half of the calendar year, kids aren’t really in school all that much as it is. Many countries around the world have closer to 200 instructional days a year, which still leaves plenty of time for breaks.

My guess is that the Democrats' position is being largely influenced by the teachers' union. Superintendent Bennett's plan would in fact result in more working days for teachers. And of course, the problem is that they will probably not be paid accordingly. So it seems the root of the problem is—as ever—valuing our teachers.

American schools have many problems, and Indiana is no exception. In Indianapolis, fewer than 50% of middle schoolers are meeting the state standards in reading and math. Will more instructional days solve this problem? Not on their own, surely, but having fewer days will only aggravate the issue.

Of course, the length of the school year has been a hot-button topic for a while now. I remember being in high school and being struck with fear by the thought of "year-round school." The long summer break is a holdover from our agrarian roots, and many feel it is passé in the 21st century.

Summers can be wonderful. For me, summers involved dance training in New York City and choir tours to Europe and family vacations to Yellowstone and Yosemite. But for some children summers mean long days without lunch, since they are normally provided with a free one at school. Some kids attend science camp and horse camp and music camp, padding their college applications with the requisite diversity of experience. But many kids can't afford these opportunities, so while the kids who can get farther and farther ahead, the kids who can't fall farther and farther behind.

That being said, I'm not sure year-round school is the answer. Certainly the solution is not the ludicrous idea of "multi-tracking," wherein the school building is in use year-round with different "tracks" of students attending school at different times. To get an idea of the logistical hell on Earth this plan creates, look at the academic calendar of the Wake County Public School System. Pity the parent who has children in different tracks!

But even without multi-tracking, it seems year-round schooling would be a very hard sell. Another main argument for the idea is that students lose much of what they've learned over the summer, so teachers have to spend too much time reviewing. But opponents counter that students start to forget what they've learned as soon as two weeks into a break, so having more frequent two- to three-week breaks would actually make progress slower.

As a university student, I still operate on the school calendar system. The main school year is just two semesters, and the summer is long. But as an adult, I have a wide range of opportunities at my disposal during the summer months. Some summers I’ve taken classes, others I worked full time to save money for the regular year, and others I supplemented my degree program with study abroad and intensive training. To me the current calendar system at the collegiate level works just fine.

But the public school system is a different story. Kids don't have as much control over how they spend their summers as college students do, and some kids have no control at all. Teachers are not usually PhDs with research grants and fellowships to supplement their income, and as a result they have to live 12 months on a nine-month salary.

If all my dreams came true, the school system would stay as it is, but all students would have ample summer enrichment opportunities. Furthermore, teachers would have more work days in the form of training and parent conferencing, but they would be paid accordingly.

Governor Mitch Daniels has promised to veto the measure in Indiana, so it seems for now schools are stuck with following the law. I do sympathize with teachers who may take a hit from this, but I think the Democrats would have served them better by introducing legislation to ensure teachers' salaries are appropriate to their workload.

Sunday, March 22, 2009

Facebook recently changed the layout of its home page. To no one’s surprise, everyone hates it. Every change in Facebook’s design, organization, or configuration has come as a major upheaval to users whose “status updates” convey their confusion, exasperation, and outrage at the nerve of the administrators. It’s just a slight change in how the content on your home page is displayed—what’s the big deal?

But the fact is, Facebook is a big deal. Facebook is a big deal whether you are a junkie or choose to ignore it. Facebook and its ilk—MySpace, Twitter, and so many others—are changing the rules of social interaction. I remember the very moment I was introduced to Facebook. I was in college, living with five friends and in the midst of becoming myself, when one of my roommates showed it to me. I wasn't in the very first batch of users, but I was initiated fairly early on.

I loved it. As I said, I was in the middle of becoming myself, and excited about it. And now I could share it with the whole wide world! Dating became a snap. Have your eye on someone and need to know their sexual orientation, relationship status, and political and religious bent before proceeding? Just look it up on Facebook—there’s no need to waste time on a date or three. I reconnected with childhood friends and high school buddies, and became “friends” with fellow participants in summer programs before I met them.

I carried on in a happy relationship with Facebook for a couple of years before it exploded. Now it is bigger than ever, with more and more users and more and more features, and of course there is only one sensible reaction. Panic. Reading the sensationalist headlines about how Facebook will turn your kid's brain into a kumquat, or some such, I got pretty tired. Again—what is the big deal?

But then, when I joined Facebook I was 20. How would I have dealt with it at 12? The horrifically awful braces-and-glasses pictures, the agonizing "relationships," the fleeting friendships, all displayed online for the world to see? I had some pretty nasty fights with friends in junior high school; how might they have played out in cyberspace? What about the "break-ups" I found so devastating? A secret "de-friending" by a boy who had just crushed me probably would have seemed like the end of the world to me.

As I read more about it, the issue that seems most pressing is bullying. Whether you were the perpetrator (which I sometimes was) or the victim (which I also sometimes was), we all dealt with bullying at some point. What happens when it goes global? I read an article recently about some junior high school kids getting in trouble at school for creating a group called "Eric is a Hairy Beast." Complete with cell phone pictures and anecdotes, the group's sole purpose was to torment a boy at school. Of course in any generation kids will make fun of other kids, but I think the public humiliation of cyber bullying is worse.

And bullying isn’t the only problem. High schoolers (and indeed, many of my peers) often show remarkable lack of judgment in what they post online. Drunk pictures, obscene quotes, photos in various stages of undress, and far, far, Too Much Information may seem tame when you think you’re just sharing it with your friends. But what goes online stays online, and even after you change your mind about something you posted it’s pretty much impossible to make it go away forever. One thing is for sure: I do not envy the presidential candidates of 2044.

As everyone who has been through it knows, adolescence is a time of huge change. And not just in fashion sense and music taste: research shows that adolescence includes a rate of brain development on par with a toddler's. In fact, researchers now believe that the brain doesn't reach complete maturity until the mid-twenties. As someone currently living through her mid-twenties, I'd have to say my personal experience leads me to believe that is true. After a decade of thinking I was all grown up, I think it's only been in the past year or so that I have really gotten close.

After giving it some thought, I can understand parents' concerns about young people's use of Facebook. But I still don't think panic is quite the right reaction. What the alarmist viewpoints don't take into consideration is that the times are changing. Online social networking is changing how we interact, and it will never be the same as it was before. Of course it looks scary when you look at today's situation with yesterday's rules. But if every politician in 50 years has drunk pictures from spring break in Cancun, what will happen then? Maybe, just maybe, no one will care.

Of course I still think it’s a good idea to be prudent about what one posts online. It is also definitely a good idea for parents to be involved in their child’s online life and make sure it doesn’t cross into legitimately harmful behavior. But beyond that, I think we can all calm down. The times are changing, and the wise change along with them. There may be growing pains, reminiscence for the good old days, or worry about the unknown future, but that’s how change is. We’ll be just fine.

Sunday, March 15, 2009

A few weeks ago, my beautiful friends Mark and Miranda got engaged. They are extraordinary people and a wonderful match, and I am excited for them as they embark on this adventure of a lifetime together. And while I'm more interested in the wedding itself and their plans for the future, yes, I did notice the jaw-droppingly gorgeous ring on Miranda's finger.

My surrender to the allure of diamonds annoys me. While jewelry of all kinds is a favorite status symbol in the Western world, in America especially diamonds are in a class of their own. Diamond stud earrings are considered a fashion staple, a diamond necklace is marketed as the “go-to” anniversary gift, and the diamond engagement ring is all but obligatory. Touted as the symbol of eternal love and described with phrases like “fiery brilliance” and “as everlasting as your love”, diamonds are the most ubiquitous rare gem in the American marketplace. And the most fascinating thing about diamonds is that they really aren’t all that rare. They are, however, one of the most brilliant examples of marketing of all time.

As minerals go, diamond is in fact pretty interesting. Diamond is compressed carbon, the fundamental element of life. That alone is a pretty romantic notion. Furthermore, diamond is the hardest naturally occurring substance known. In geological terms, hardness measures a substance's resistance to being scratched, and the full extent of diamond's hardness is hard to pin down, because diamond can only be scratched by itself. It is amazingly dense, repels water, and is among the most brilliantly reflective of all transparent substances.

So it isn’t that diamond isn’t interesting. It’s fair to say it is unique geologically, and it’s also fair to say that it’s pretty. But its unparalleled status in the minds and hearts of Americans is not a result of its chemistry or its beauty, but of pure marketing genius.

The giving of engagement rings is a centuries-old tradition. Emperor Maximilian I gave Mary of Burgundy a diamond ring upon their betrothal in the 15th century, and the rest, as they say, is history. Traditions die hard, especially where love is concerned, so it's unsurprising that diamond engagement rings remain so highly prized in the 21st century. But what if diamonds weren't very expensive? Is it actually the somewhat romantic chemical properties of diamond that so draw in consumers? Is it actually their beauty, when cubic zirconia looks pretty dang similar? Or is it their extravagance, their apparent rarity?

The De Beers Diamond Trading Company certainly wants to keep diamonds expensive. They have monopolized the market for over 100 years, artificially controlling supply to keep diamonds rare. Their scheme of stockpiling gems and orchestrating a massive advertising campaign has kept the industry running like a well-oiled machine for decades.

While the artificiality of the rarity of diamonds is a bit annoying, it is actually the aspect of the industry which bothers me the least. For a symbol of eternal love, diamonds carry a lot of negative baggage.

The most pressing problem is, of course, conflict diamonds. Thrust into the public eye by the 2006 film Blood Diamond, conflict diamonds have been used to fund devastating wars in Sierra Leone and Angola. While De Beers and other industry leaders claim to have abolished any illegal doings in their mining processes, it is hard to tell what is truth and what is carefully crafted propaganda.

The current focal point of diamond conflict is Zimbabwe. Diamond smuggling is increasingly propping up Robert Mugabe's stranglehold on the presidency, as more diamond mines come under military control and stones are smuggled out of the country by the bucket load. The Kimberley Process, the diamond industry's regulatory body, has thus far done nothing effective toward reversing a situation that is spinning ever further out of control. In a place where less than 6% of the population is employed, six million people are dependent on emergency food supplies, and infant mortality has tripled since the 1990s, a corrupt diamond industry is the worst thing imaginable.

Clearly diamonds are still shrouded in misery, but they’ve never been more popular. To no one’s surprise, the diamond industry is currently suffering like everyone else due to the financial crisis, but if the ring fingers of my colleagues are any indication, traditional engagement rings are as popular as ever.

While the mining end of the industry is troubling, the American end isn't exactly all sunshine either. Perhaps nowhere else are our expectations of women lower. My jaw dropped at a recent visit to www.diamonds.com. Under the pages "For Women Only" and "For Men Only," disgusting stereotypes about men, women, and relationships were paraded one after the other. The women's page was all about how to get a man to propose, or more specifically, how to get a man to propose with a big, fat diamond ring.

It shocks me that we are still here. That still, in the 21st century, it is expected that the man will propose, and it's the woman's job to manipulate him into doing so, for heaven forbid she propose herself. The tips in this section are laughably absurd ("respond to everything with 'that has a nice ring to it'"), but the sad truth is that many women buy into this kind of thinking. I asked a coworker recently how long she and her boyfriend had been together, and her response was a smirk and "Too long to not be engaged." I asked why she didn't just propose herself, and she looked at me as though I had six heads.

The "For Men Only" section was just as bad. From the picture of a dazed, confused, and slightly terrified fellow (because all men are thus struck at the idea of marriage and diamond buying) to the tip that "a diamond gets you out of the doghouse," the page would make me laugh if it weren't serious. Of course, this is a diamond website, so it can be expected to proudly perpetuate the lowest common denominator. Still, I am convinced this kind of thinking is well ingrained in many an American brain.

I think jewelry is lovely. I have spent many a daydream on thoughts of a white-dress, sparkly-ring, one-knee nature. And let me just say here and now that I actually like tradition when it comes to cultural ritual. But even traditions need to evolve with the times, and I think the American culture of the diamond ring is falling far behind. If a man wants to propose with an African-mined De Beers diamond, that's fine by me (so long as it is truly conflict-free and child-labor-free). But the societal insistence that this be the only way to embark on an engagement is ridiculous. Every person is unique, every relationship is certainly unique, and we should have engagement traditions enough to go around.

Sunday, March 8, 2009

I was going to write a post today about feminism. Several things made me think of this: the appearance of one of my feminist heroes (Jessica Valenti) at my university, and my article in the student paper about gendered language. I was going to talk about how the use of the generic masculine in English grammar is damaging to women (and especially to girls learning the language as children). I was going to talk about the many double standards which face women in our society, especially when it comes to sexuality. I also planned to discuss how women of my generation often avoid the term "feminism," thinking it synonymous with man-hating and bra-burning.

I had the post pretty well figured out (because I, um, always have my posts carefully planned out before beginning to write…) and was ready to go when I opened my email inbox this morning. I had an email from the National Organization for Women, announcing that today is International Women’s Day.

International Women’s Day has been observed since the early 1900s. In some incarnations it has been similar to Mother’s Day and is treated as an opportunity to give gifts to mothers and grandmothers. It is an official holiday in 15 countries and is still often celebrated with flowers and gifts, but in the 21st century it has taken on a more ambitious bent.

For decades (centuries, really) people around the world have been fighting for women’s equality. Today International Women’s Day serves as a catalyst for social change to benefit women. Thousands of events take place around the world celebrating women’s achievements and bringing attention to causes which require further work.

Globally, the inequality between men and women is fairly obvious (some may argue that it is obvious in the United States as well, but I think that if so many young women are so averse to the word "feminism," then it isn't). Women are dwarfed in numbers by men in business, politics, and the sciences. Twenty-five percent more girls than boys are out of primary school. More than half a million women die every year from complications of pregnancy and childbirth. Violence towards women and girls is prevalent, and some of it is even sanctioned by society, as in the case of honor killings and female genital mutilation.

From a global perspective, the feminist issues I normally write about seem paltry. But I don’t think we should dismiss them. Any efforts, subconscious or deliberate, to undermine women need to be reversed. And the more empowered women from this country are, the more able they will be to help women around the world. So from that standpoint, I think International Women’s Day needs to serve as a reminder to Americans that yes, while we have work to do in our own country, we cannot forget the women of other nations who struggle so much more.

The theme of this year’s International Women’s Day is “Women and men united to end violence against women and girls.” Violence against women is unfortunately a problem that is just as familiar in this country as it is in others. On average, every day in the United States 600 women are raped or sexually assaulted. One-third of female murder victims are killed by an intimate partner, and domestic violence affects millions of women every year. And this is in the United States alone. In the world as a whole, the statistics are truly horrifying.

One in five women will be a victim of rape or attempted rape. Homicide is a disturbingly common cause of death among women of reproductive age. In some countries, as many as 50% of women have been abused by an intimate partner. Violence is often underreported, and the victim is frequently held responsible for the violence against her. In many cases the patterns of abuse are so ingrained, and the public support structure so faulty (or absent), that the task of reversing these statistics is unbelievably daunting. But the global community, and Americans in particular, cannot forget or ignore the issues of women. I also think many of the problems women face are interrelated. Education, health care, legal rights: all of these will help improve the lives of women. None of them alone may end violence against women, but if we address all the issues together we have an excellent chance of succeeding.

Clearly there is work to be done. But International Women’s Day also serves as an opportunity to celebrate all the progress that has been made. We have women prime ministers, Olympians, astronauts, Nobel Prize winners, and CEOs. The world is full of women role models, women who work or raise families or do both. Women are accepted into universities around the world, and in the United States graduate at greater rates than their male classmates. We have far to go, but we have come so far already. At the dawn of the 21st century, the future is very bright.

Sunday, March 1, 2009

Liberty University scares me for many reasons. Any university which has a dress code ("ponytails for men are unacceptable"), dorm room checks to make sure beds are made, and rules forbidding any media displaying an "anti-Christian" message seriously disturbs me. And let's not forget the code of conduct which disallows any physical contact between members of the opposite sex besides hand-holding (though I have to wonder how they'd feel about hand-holding between members of the same sex).

But of course, no one is required to go to Liberty University. If someone really wants to attend a school which has the good graces to allow its students to attend the movie theater but not if the movie is rated R, then I guess that’s their prerogative. It’s the student’s education, their life, and their business. However, there is one aspect of Liberty University that I do consider my business: the teaching of intelligent design in biology classes.

"But wait," you say. "Liberty is a private university; they can teach whatever they want." True, but the problem is the graduates of this biology program, many of whom will go on to teach science in public schools. Liberty University is licensed by the State Council of Higher Education for Virginia to award degrees. Those degrees are recognized as a primary qualification for a teaching job, including jobs teaching science.

Looking at the syllabus for an introductory biology course, it seems ludicrous that it can pass for a science course. There may be discussions of photosynthesis and DNA, but it is all meant "to reach the conclusion that life is derived from and dependent on a Creator." The scientific community by no means uses biology to reach that conclusion. While I don't think the purpose of a university is to churn out students brainwashed to regurgitate concepts without thinking critically on their own, I do think it's a bad sign when one school or course dramatically departs from the consensus of an entire academic field.

Biology 101 at Liberty University also asks its students to "explain how truth in Scripture distinguishes the species Homo sapiens theologically and spiritually from all other species created by God." They may feel they've tipped their hat to science by using our species' biological name, but the statement drips with such absurdity that it cannot be called science.

First of all, there is the God part. As I previously mentioned, the scientific community does not endorse the creation of life by a god of any kind. I’m sure many scientists haven’t ruled it out as a possibility, at least in their own spiritual life, but it is simply not a question which is being considered by science at this time. Will science someday be able to answer the question of God’s existence? Maybe. But to assume the existence of a god when there is no scientific evidence to support the claim makes the statement, well, not science.

Secondly, the theological and spiritual differentiation of humans is not really a scientific matter. At least it seems the professor is aware of humans' biological similarity to other animals—after all, it is fairly obvious. And whether other animals have what seem to be uniquely human experiences (dreaming, killing of one's own species, homosexuality, etc.) is being researched by scientists in a variety of fields. But theological concerns are primarily (if not exclusively) investigated by theologians. Again, not science.

But what is most intensely disturbing about this statement is the "truth in Scripture." Again, the assumption here is that the Bible is true, and the professor has built an entire course upon that assumption. Jerry Falwell founded an entire university upon it. Never has it been proved that the Bible is true in its entirety. While there are some historically supported events and people, the vast majority of it is either contradicted by science and history or simply unverifiable.

To fancy that the Bible is 100% factual is to dismiss not only biology but also geology and other areas of science. The Bible has not stood up to the scrutiny that every scientific theory must withstand. Teaching it alongside heavily researched and rigorously tested science would be funny if it weren't so scary.

Of course everyone should be able to believe whatever they like. I'll respect your view that God created the Earth in seven days if you'll respect mine that we're actually situated on a ping pong ball careening out of control in a table tennis set in some alien's garage. Religion has certainly secured its place in society and culture, and I don't have a problem with religion being taught at institutions of higher education (especially the private ones). But to award science degrees to students who base their thinking on legend rather than science is reprehensible. The scientific method is a rigorous and disciplined process for acquiring human knowledge. It cannot be matched by any amount of superstition, questionable religious "evidence," or faith.

If Liberty University wants to teach its biology students that life was created by an all-knowing God, fine. But if it does not want to abide by the knowledge of the scientific community, it should not be awarding science degrees. Period.

Sunday, February 22, 2009

To be bold, if I may: the United States’ predilection for standing strongly opposed to major international accords is arrogant, greatly harmful to the world at large, and a damned nuisance.

Let me explain.

On February 11th, 2009, the International Criminal Court (ICC) issued an arrest warrant for President Omar Hassan al-Bashir of Sudan. It was a bold move in the increasingly desperate quest for an end to the tragic events taking place in Darfur, and further proof that the young ICC is fully prepared to kick ass and take names.

In only seven years of existence, the ICC has become a dominant force in the pursuit of peace and justice worldwide. Prosecuting crimes against humanity, war crimes, and genocide when a nation is unwilling or unable to do so itself, the ICC’s sole aim is to ensure no one—regardless of military or government rank—is exempt from accountability. One wonders who could be against such an endeavor, but surprise surprise, the United States was one of only seven countries to vote against the Rome Statute, which created the ICC.

The Bush Administration further distanced the US from the ICC in 2002 with the American Servicemembers’ Protection Act (ASPA). The law prohibits the US from cooperating with the ICC, authorizes the use of “all means necessary” to free US nationals from ICC custody, and prohibits US participation in international peacekeeping activities unless immunity from the ICC is guaranteed. Furthermore, the Bush Administration pressed countries around the world to sign “Article 98 Agreements,” which prohibit the surrender of US nationals to the ICC. Given the overwhelming support for the ICC among the world’s nations, it appears the United States would like two sets of rules: one for us, and one for everyone else.

This isn’t the first time the US has remained conspicuously absent from major international agreements. We may have signed the Kyoto Protocol, but we have yet to ratify it, which renders the signature meaningless. World leader we may be, but if an international resolution means progress around the globe at the price of a possible hit to Wall Street, our decision is made swiftly and always in favor of the mighty dollar.

But the ICC issue isn’t so much about money as it is about power, insofar as the two can be distinguished. The ASPA does little to hide the government’s fear that the ICC will prosecute members of our military. Yes, the ICC can and will investigate people regardless of military status, but only if their own government won’t, and only if they are accused of truly horrific crimes. So it seems that a government desperately trying to avoid this is a government with something to hide.

The ICC does not interfere with the legal proceedings of nations, provided they are carried out fairly. It does not remove heads of state, invade countries to “set things right,” or unnecessarily step on the toes of functioning legal systems. If it did, the disapproval of the United States would be understandable, if scathingly ironic. But the ICC does none of those things. It steps in only when all else fails, and when no one else will.

In addition to President Bashir’s warrant, the ICC is dealing with issues in the Central African Republic, the Democratic Republic of the Congo, and Uganda. Currently on trial are warlords charged with enlisting child soldiers, directing attacks on civilian populations, and knowingly permitting sexual slavery, among other atrocities. The scope of the crimes is truly incomprehensible, but the process is as fair as any country with a Bill of Rights should expect. The defendants are presumed innocent, granted legal counsel, and have the right to present evidence in their defense in court and to remain silent. All charges must be proved beyond a reasonable doubt.

It is both baffling and infuriating that the most powerful nation in the world is opposed to proceedings such as these. Fair due process of the law addressing the most heinous crimes in recent memory would seem to be something everyone can get behind. While the circumstances from which these wrongs come are immeasurably complex and not solvable by justice alone, letting them go unpunished is unthinkable.

Given the gravity of the offenses currently on trial, my hope is that the United States military doesn’t have anything on the same level it’s trying to hide. But I have to say: perhaps our objection to the ICC will prove prudent when the addition of the crime of aggression to the statute is discussed later this year. Since the crime has yet to be formally defined, it’s hard to say whether the US is guilty of it, but I’d be surprised if we weren’t close. That’s all speculative, though. As far as I know we have not committed crimes against humanity, war crimes, or genocide since 2002 which we are not already investigating ourselves.

President Obama has said multiple times that he plans on repairing America’s relations with the world. I can think of no better step in the right direction than supporting the ICC, especially in light of President Bashir’s arrest warrant. After all, it’s up to Bashir himself or his government to produce him at court in The Hague. Without substantial international pressure it’s likely he’ll opt to skip town instead (or worse, escalate violence in hopes of garnering a desperate plea for peace in the form of a deferral from the UN).

It may seem hypocritical for us to throw our weight behind one of the most important human rights organizations in the world when we are certainly guilty of some questionable human rights practices of our own. But it’s time for the US to earn the respect of the world again, and being nonchalant about whether war criminals get away with their crimes or not isn’t exactly the way to do it. We can hardly be beacons for peace and justice in the world if we don’t support their greatest defender.

Sunday, February 15, 2009

It’s the middle of February now—how are you doing on those New Year’s Resolutions? If you’re like me, and love to set goals, you probably concocted a long list of things to do, achieve, eradicate, and solve in 2009. If you’re like me, you also probably have declined to take action on many of them in favor of sitting on the couch. It’s not that I don’t care about my goals any more. It’s just that sitting on the couch is so pleasant, and taking action is, well, hard.

But the difficulty of my resolutions is nothing compared with the ambitions of the United Nations’ Millennium Development Goals. These aren’t just New Year’s resolutions; these are New Millennium resolutions. Adopted in 2000, the Millennium Development Goals (MDGs) are ambitious in the extreme. End poverty and hunger. Ensure universal primary education, and equal access to education for girls. Reduce child and maternal mortality and combat HIV/AIDS. Promote and ensure environmental sustainability, access to safe drinking water, and open access to the market for developing nations. Oh, and all these things are supposed to be achieved by 2015.

I don’t mean to be a cynic, but it’s 2009. 2015 is only six years, two World Cups, and two Harry Potter movies away. It took me nearly six years to graduate from college, so I am just a little bit skeptical that plans as grand as the MDGs are achievable in just over half a decade. It would seem impossible in the best of times, but now, with the global economy still in a free-fall, extreme poverty is actually increasing.

This isn’t to say there hasn’t been progress. Given the scale of the goals and the brevity of the time span, the numerous agencies working on the MDGs have made some remarkable changes. They have halved the number of people living in extreme poverty in East Asia. They have provided widespread measles immunization in Northern Africa and Latin America. They have increased access to clean drinking water in urban areas around the globe. Clearly there is cause to celebrate, but the end is nowhere near in sight.

There are 20 specific goals, and Sub-Saharan Africa has improved on none of them. No region on Earth reports progress in all 20 areas, and many of the problems are worsening instead of improving. What is going wrong? In an incredibly thorough 2005 report, Investing in Development: A Practical Plan to Achieve the Millennium Development Goals, Professor Jeffrey D. Sachs and his coauthors walk through the current state of affairs in all the target regions and discuss what is working and what isn’t. They identify four major contributors to the lack of progress: governance failures, poor geographical conditions, pockets of poverty in otherwise middle-income nations, and “poverty traps,” their term for countries too poor to help themselves.

These pitfalls obviously pose enormous obstacles to improvement, but I believe, perhaps naively, that they can be rendered obsolete. All it would take is one monumentally simple but impossibly difficult act: getting people to care. History has proven time and again that any adversity can be overcome if enough human souls are dedicated to overcoming it. But the larger the obstacle, the more people are needed, and the worldwide eradication of poverty is about as big an obstacle as there is.

The other problem with getting people to care is that, in almost all cases, the caring of those directly affected by a crisis is not enough. If there is a government that needs overthrowing or a revolution that needs starting, a group of sufficiently angry people is often adequate. But most of the time, those in trouble are unseen enough that their voices alone go unheard. The abolition of slavery in the United States took not just the efforts of black abolitionists and the slaves themselves; it also took the work and passion of white Northerners whose everyday lives weren’t affected by slavery one way or the other. Female suffragists campaigned tirelessly and nobly to get the vote for women, but ultimately it was up to male government officials to make it a reality.

Last semester I took a gender studies class, and in one session we broke into small groups to discuss gay marriage. After an inconclusive discussion in my group, a classmate announced, “Well, it doesn’t affect me, so I guess I don’t really care.” At the time I was offended by her comment, but of course I am guilty of the same apathy towards issues which don’t affect me. While I do think people are naturally empathetic, it is difficult to turn empathy into action.

The problem with the Millennium Development Goals is not only that they don’t directly affect those of us in the best position to make a difference, but also that many of us are completely unaware of the true scope of the problem. How many Americans and Canadians do you suppose have heard of the MDGs? How many Western Europeans are insisting that their governments put more time and money into their realization? How many Japanese and Australian citizens treat poverty eradication as a primary factor in their voting decisions? We all feel sadness when we see a picture of a starving orphan, but when poverty isn’t staring us in the face it is all too easy to forget about it.

I do believe large-scale caring about the MDGs would be the most powerful agent for seeing them achieved. I do not, however, have any clue how to awaken that caring in people the world over. The most powerful governments will only take strong action if their citizens insist upon it. The economic crisis has paralyzed many with fear, and it seems unlikely that the concerns of the poorest billion folks on the planet will occupy much of our thoughts until it is over. I think it is unwise, however, to assume that once the economy is on its way up the MDGs will achieve themselves. Poverty has existed as long as civilization has. Its eradication is going to take more than some random acts of kindness during the good times.

I don’t have any answers. I have only questions. Evolutionarily, it makes sense for us to care for one another, and I think this is the biological basis for empathy. But why does empathy sometimes motivate us to action and sometimes pass without a second thought? What will it take to convince the wealthiest in the world to help the poorest? We know our strength, and what we are capable of when we join together. What is stopping us?

To learn more about the Millennium Development Goals, please visit http://www.un.org/millenniumgoals

Sunday, February 8, 2009

“Government…can’t be trusted to control its own bureaucrats or collect taxes equitably or fill a pothole, much less decide which of its citizens to kill.”

~Sister Helen Prejean

Since 1973, exactly 130 people have been released from death row upon evidence of their innocence. To me, this is a troubling number, since I find it highly disturbing that an innocent person could be on death row at all. But being a human enterprise, the justice system is bound to make mistakes, and the sentencing of criminals to death is not an exception. That is precisely the problem.

Troy Anthony Davis has been in prison for 18 years following his conviction for the 1989 murder of a police officer. His execution has been scheduled and rescheduled three separate times, and once a stay of execution was ordered mere hours before the lethal injection was to be carried out. His case has attracted the attention of human rights organizations worldwide, and it was through Amnesty International that it came to my attention.

Since Mr. Davis’ trial, 7 of the 9 witnesses have recanted their testimonies. One of the remaining witnesses has been implicated as the real killer. There is no physical evidence linking Davis to the murder, and the murder weapon has never been found. I am not a lawyer, but I find it hard to believe that his guilt remains beyond a reasonable doubt. While he has once again been granted a stay of execution, no court has yet held an evidentiary hearing to examine the recantations.

I cannot imagine the suffering Mr. Davis has endured throughout this agonizing process. It could be argued that the process of repeatedly scheduling executions and granting last-minute stays is itself cruel and unusual punishment. Of course, the unconstitutionality of cruel and unusual punishment is often cited as an argument against capital punishment at all. Others contend that executions are no better than the murders for which the prisoner is being executed.

The death penalty is certainly an intriguing issue, philosophically. Whether the state ever has the right to kill one of its citizens is a fascinating if morose ethical question. Whether “an eye for an eye” is morally viable is a similarly intriguing topic for philosophers both armchair and professional. But in my opinion these and any other theoretical musings hold little value in the actual debate over capital punishment in the United States and elsewhere.

Capital punishment should be abolished because it is not fair. The snarky observer may point out that nothing in life is, but that too is beside the point. Life itself will never be fair, but that doesn’t give us license to continue practices which we know could put an innocent person to death.

The handing out of death penalties in the United States is discriminatory in two of America’s favorite ways: race and class. Many people know of the rampant racism at work in the justice system. African-Americans in particular are imprisoned in hugely disproportionate numbers, with Hispanics in not much better shape. What may come as a surprise then is that more white people have been executed than black. Sadly, this is not a case of improved attitudes towards race in this country. For while the race of the perpetrator doesn’t seem to be a huge factor in capital cases, the race of the victim makes all the difference in the world. 78% of victims in capital cases are white, while only about 50% of total murder victims are white. So you see, it’s not that we see black criminals as being worth less, we just see white victims as being worth more.

Furthermore, the poorer defendant is at a huge disadvantage. This is true of all criminal proceedings, but when the stakes are as high as a capital case it becomes especially disturbing. Defendants in the lowest third of spending on attorneys are more than twice as likely to be sentenced to death as those who spend more.

With odds like these, it is sadly unlikely that Troy Davis is a remarkable exception to an otherwise spotless system. Given that 130 people have been exonerated, one has to wonder how many innocent people have been put to death. Just one is enough to require that the practice be abolished, and I have little doubt that there has been more than one. Even with improvements in forensic science and the ubiquity of DNA testing, there will be more. It is a human enterprise, after all, and can never be infallible.

Of course, while Troy Davis may be innocent, there is also a chance he is not. But whether you believe he is guilty or innocent, whether you are opposed to the death penalty or not, I hope you can agree that denying him a hearing amid such an upheaval in the witness testimony would be a crime in itself. He has a right to a fair trial, one free from police belligerence and harsh interrogation. One would hope that he had this the first time around, but since he did not, it is unthinkable that he might have to pay the price without ever getting that chance. Especially when the price is his one life.

It is easy to feel powerless in these situations, but I hope you remember that you have one very powerful tool at your disposal: your voice. It may be just one voice, but as Thomas Jefferson reputedly said, one person with courage is a majority. I added my voice in the pursuit of justice through Amnesty International’s page on Troy Davis. If you are as enraged by this situation as I am, I hope you will visit too, and add yours.

Sunday, February 1, 2009

If newspaper headlines are any indication, the economic crisis is the only topic worth talking about. Gaza and the Super Bowl get their time in the sun here and there, but for the most part everyone is talking about everyone’s favorite topic: money. Thankfully it seems we’re spending a little less time rehashing who is to blame for the current state of affairs and a little more time focusing on how to get out of it.

The stimulus package currently on the Senate floor has many elements. It includes tax incentives for home buyers, lower mortgage rates, and billions of dollars in infrastructure spending. None of these things is set in stone, of course, as the bill is still under debate, but the golden combination preferred by politicians and economists seems to be stimulating infrastructure and the housing market. The other piece of the pie is tax breaks for “working families” (whatever that means). There are dozens of proposed tax breaks, each with a vehemently passionate economist behind it. But the common thread running through all the various ideas is: how do we get the people to spend?

It’s an odd idea, in this country. When have the American people not spent? For better or worse, consumerism defines us as a nation perhaps better than any other concept. In the land that invented malls and McDonald’s, the real American dream seems to be to get more. The well-oiled advertising machine assures us that we deserve the big-screen TV while making sure we realize that what we have is never enough. More people visit the Mall of America annually than the Grand Canyon, the Smithsonian, and the Statue of Liberty combined. Americans spend hours on the internet every day, and the ubiquity of online shopping now means we need never be far from that coveted pair of shoes. And in addition to all that online time, the average American watches 4.5 hours of TV per day, over an hour of it commercials.

I’m proud to be an American, but these things disturb me. I am certainly not immune to the allure of either a big screen TV or a fierce pair of heels, but the acquisition of goods is not a cornerstone of my existence and never will be. I am more concerned with my time, and how I spend it: with my loved ones, in places which revive me, and doing things which fulfill me. Of course, I am not the only one with these values, and they are not safe from the advertising behemoth either (just look at Hallmark commercials and tourism ads). Be it goods or services, advertisers want you to consume endlessly and voraciously, at the expense of everything else.

I don’t believe that advertisers are inherently evil, that the existence of the credit card is the root of all our problems, or that capitalism should die. What makes me sad is that so many people choose less-fulfilling lifestyles in order to buy more. Or as one unknown sage said: “you work a job you hate, to buy stuff you don’t need, to impress people you don’t like.” Keeping up with the Joneses is a national pastime, and folks will do just about anything to get the newest, fastest, shiniest, hippest product. And yet—despite sacrifices made for such purchases, there doesn’t seem to be any direct correlation between happiness and the amount of “stuff” a person has.

As omnipresent as the voice of consumerism is, the anti-consumerism culture is growing. More people are becoming concerned with the environmental impact of our consumption culture, and others simply can’t afford to spend so much. Campaigns such as Adbusters and Kill Your Television promote a lifestyle focused on family, community, and personal fulfillment. Environmentally focused groups educate people on how to consume less as reducing one’s carbon footprint becomes a mainstream ambition. Anti-consumerism actually seems to be on the rise, and it’s not just for hippies anymore.

So where does that leave us, in the current economic climate? Personal consumption makes up 70% of our economy. While it will take a lot to get the economy going again, it seems like it will be virtually impossible without increased spending. This is of course why the government is carefully crafting its tax cuts to inspire people to spend, not save. If there is one thing this country doesn’t need, it’s sound financial decisions on the part of the masses.

On the one hand, this seems like a perfect opportunity to stick it to The Man. “Oh yeah, you want me to spend that tax rebate? Well too bad for you, mega-conglomerate! I’m going to put it in savings and see if I care if you don’t get your Christmas bonus.” Unfortunately, it is not just The Man who gets short shrift from this thinking. We’ve been seeing for months the results of a crumbling retail industry. Thousands of jobs have been lost, and announcements of more layoffs seem to come daily. This, on top of the fiasco that is Wall Street and the worthless housing market, has sent the country into an economic tailspin.

In addition to a massive unemployment rate, the dark side of an economy in crisis is a huge cutback in public spending. We all know that the arts, libraries, parks, and museums are the first programs to get the ax. So it seems the irony is that the very things that serve as alternatives to consumerism, the things that nourish us and keep us healthy in body and mind, are the first casualties of a drought in consumption.

It is the horrific nature of a country without these things that causes me to, for the first time in my life, call upon my fellow Americans to go out and spend some money. After all, it is not the products themselves which are the problem, but the place they hold in our lives. I believe that if you appreciate and enjoy your material wealth, but ultimately value it less than people, places, and activities, then happiness can always be yours whether you are flush or strapped for cash. So by all means, eschew television and orchestrate your plan for overthrowing Big Bad Business. But just for fun, go ahead and buy those shoes, just because you want to.

Sunday, January 25, 2009

Barack Obama is not the Messiah. Some of you may believe he isn’t because someone else already is, but for the rest of you, I just wanted to clear that up. Whether he will even be a stand-out American president is still up for grabs. But one thing is for certain: the American people are bewilderingly both wishy-washy and emphatic when it comes to political leadership. President Obama is the best example in recent memory of the latter.

The overwhelming roar from his fans produced an almost maniacal devotion to him: comedians who mocked him found their audiences silent; manufacturers churned out t-shirts, mugs, jigsaw puzzles, and lava lamps (yes, lava lamps. Buy yours at http://lavaworld.com/obama/obamalavalamp.html). Even after the race was won and the new president assumed office, the enthusiasm has continued. The logo of Pepsi’s Optimism Project is a not-at-all-veiled imitation of Obama’s campaign logo, and you can even express your adulation for him by eating a scoop of Ben and Jerry’s Yes Pecan. Whether this will last throughout his term will largely depend on how he performs, but it certainly seems like many are seriously considering proclaiming him the Son of God and calling it a day.

The Messianic rise of Obama has been brought about through a combination of the public’s potential for ferocity when ferocity is due and, I believe, a conscious effort on Obama’s part to harness that capacity. It didn’t hurt that he is also young, good-looking, an eloquent and charismatic speaker, and the first of his ethnicity to head a major party’s ticket. It also didn’t hurt that he followed one of the worst presidents in United States history.

Many of our best-loved and most influential presidents have succeeded our most abysmal failures. President James Buchanan is widely regarded among scholars as one of the worst presidents in our history, and perhaps not unfairly, since his dismissive treatment of Southern secession helped bring about the Civil War. After him came Abraham Lincoln, who not only navigated and ended the war but also oversaw the end of slavery. Since civil war is arguably the most serious crisis to befall a nation, any president who gets reelected during one is a remarkable politician indeed.

In the last century, a similar exchange took place, when the widely unpopular Herbert Hoover lost his reelection bid to Franklin D. Roosevelt. Whether President Hoover could in fact have ended the Great Depression given more time is left only to speculation. But the people were sufficiently unconvinced that he could. Not only did they choose someone else when given the chance, they reelected that someone three times. FDR is remembered as a great president not only because he had an easy act to follow, but because he did shovel the country out of the Great Depression. He also helped the Allies to victory in World War II, founded the Social Security system, and helped create the United Nations. Would he and Lincoln be so revered if they hadn’t followed such incompetent presidents? That too is only speculation, but surely the great popularity with which the public embraced them is at least partly due to severe dissatisfaction with their predecessors.

At this point, most agree that George W. Bush was a terrible president. Even many Republicans abhor his decisions, and many who voted for him—twice—have grown to resent the effects of that decision. Between multiple wars, a complete economic meltdown, and growing energy concerns, our country hasn’t been in such bad shape for a very long time. Barack Obama may very well have been elected president under different circumstances, but in my opinion his historic win is due not only to his own merits but also the conditions under which he ran.

Citizens are nervous about transferring power when the going is good. And why wouldn’t they be? When what you currently have is doing just fine, it’s hard to get too excited about an unknown. And despite multiple books, thousands of speeches, and endless internet campaigning, Obama was still an unknown quantity. A junior senator with no executive experience, he had done little that offered direct evidence of how he would perform as president. But the people didn’t seem to care. Obama knew this, and sculpted his campaign around electrifying speeches rather than exhaustive accounts of his prior accomplishments. He certainly didn’t hide that he went to Harvard, that he ascended through the political ranks quickly, or that he opposed the Iraq invasion from the start. But he didn’t focus his campaign on these things.

He nailed his campaign by telling the people what they wanted to hear: change. Few recent presidents have had approval ratings as low as Bush’s. Obama’s campaign capitalized on this, not just by making “change” its mantra but also by pronouncing Senator McCain just another Bush. Most conservatives still went for McCain in the end, but the liberals got so fired up about Obama that it was almost impossible to hear anything else above the din. While many conservatives were fed up with Bush, by the end of Bush’s second term liberals had become completely unhinged. Obama fed the fire of the liberals’ desperation by delivering his message of change in eloquent, inspiring, and sometimes even quasi-religious language.

He frequently asked his supporters to believe, most prominently through the slogan “Change We Can Believe In.” And in at least one instance (a speech in Lebanon, New Hampshire in January of ’08) he likened voting to a spiritual experience: “You will have an epiphany…and you will suddenly realize that you must go to the polls and vote for Obama.” Senator Clinton mocked this rhetoric, but that it was effective is without question. The American people were so utterly convinced that he could spearhead change in Washington and in their lives, and that he was personally invested in making the country a better place, that not only did they nominate him as the Democratic candidate, they elected him president.

I believe that President Obama is serious about wanting to bring change to America. I believe that he genuinely wants to make the country, and the world, a better place. But I also think that he, like every other candidate this nation has seen, has much more personal reasons for running for the most powerful office in the world. People don’t run for president just because they want to change the world. They run for president because they are ambitious.

So while I am excited by the civic involvement he inspired in so many people, and while I am excited by what he has done thus far in office, I think the Obama-as-Messiah idea is growing a little tired. There is no need to put him on magazine covers basking in a celestial aura, as Rolling Stone and many other publications did. And there is danger in assuming that because he is now president, all our problems will be solved. He himself spoke sternly against complacency in his Inaugural Address, and it is imperative that citizens not rest on their laurels assuming he’ll do all the work. He is a human being; he will make mistakes. Ultimately, the fate of this nation is in the hands of the people, as it has always been.

Sunday, January 18, 2009

As you may have heard, we made history on November 4th, 2008, with the election of Barack Obama as President of the United States. Headlines erupted around the world announcing America’s first black president, magazines declared a triumph for African Americans, and comedians quickly parodied a projected downfall of white power. Along with millions of other Americans, I shared in the elation over Obama’s election. I was elated first and foremost that he is a Democrat—bringing a much-needed balm to a Republican-scarred Washington—and a charismatic one besides. I was elated by his message of change, by the way he got so many millions of Americans fired up for their country again, and by his plans for our troubled nation. And while I absolutely appreciate the historic significance of this election, I can’t say that I am particularly excited because Barack Obama is black. I am excited because he is not white.

From what I’ve seen in the months since the election, the media, as well as the general populace, places special emphasis on President-elect Obama’s blackness, as opposed to his minority-ness. But I would have been equally excited about a Hispanic candidate, an Asian candidate, or a Native American candidate (that’ll be the day). This is not to say that I am indifferent to the unique history of African-Americans. While every minority has faced adversity in this great but often unforgiving nation, most agree that African-Americans had it the worst (probably because Native Americans were wiped out to such an extent that people forget they are there). Because of the horrific nature of slavery, and the dramatic civil rights struggle previous generations of blacks went through, it is truly thrilling to see one of their number elected to the most powerful office in the world (even though he, um, isn’t a descendant of slaves or of the civil rights movement…but that’s another topic for another blog). So while this history does move me, I am more focused on bringing diversity into the American government—any elected official who is not a white man gives me at least a little excitement, regardless of party affiliation. Because the problem with our government, if I may be so bold as to reduce the woes of a preposterously complex institution to a single fault, is that it is made up of too many white men.

Not that I have anything against white men. Several of my absolute all-time favorite human beings are white men. And while the penchant of some white men in previous centuries for stealing land and killing and/or oppressing the previous occupants can leave a sour taste in one’s mouth, it is of course neither the whiteness nor the maleness that is itself the problem (let’s not forget: whiteness and maleness gave us penicillin, the Sistine Chapel, and the moon landing). In fact, I think a government of almost exclusively white men would be just dandy—if our country were made up of almost exclusively white men. The United States of America is an astonishingly diverse nation, yet the government represents but a sliver of that diversity with its overwhelming percentage of white male elected officials.

And of course I’m not just talking race here. Half the population of our country is made up of women, yet we hold just 16 seats in the United States Senate. Can it come as a surprise, then, that issues of particular interest to women are often ignored and routinely suppressed? Similarly, while blacks make up 12% of the population, they have just one member in the Senate, and representation of Hispanics, Asian Americans, and Native Americans is also disproportionately low. The result is a government which requires insurance companies to cover Viagra, but bends over backwards to restrict access to contraception. The result is a justice system which puts African-Americans behind bars in staggeringly incommensurate numbers, despite equitable participation in illegal activity by all ethnicities.

This is not to say that a diverse government will magically right these wrongs. After all, politicians theoretically pledge to represent all their constituents, so citizens of all colors and flavors should have a voice. It is abundantly clear, however, that this is not the reality of the United States government. Perhaps instead of working toward broader representation of women and minorities, we could just work on getting the current government to listen to them a little more closely, but in the end I don’t think this would be as effective. Say what you want about empathy, about walking a mile in someone else’s Manolos or however the adage goes, I simply cannot believe that so uniform a government can truly be of the people, by the people, and for the people; not when those people are so diverse.

This logic is admittedly a slippery slope; class misrepresentation among elected officials is as bad as or worse than that of gender or race. But the realities of achieving a high-level political career are such that the need for an advanced education is basically built in—or so I’d like to hope, anyway. Theoretically, in Horatio Alger America, anyone from any class can become a politician, but the nature of the education involved means that they will likely be of a higher class by the time they get there. I do think, though, that a wider representation of races and genders in government can only help the causes of other disproportionately represented demographics.

In the late 18th century, a group of white male landowners had a dream. It was a stunningly magnificent dream, but it did provide rights first and foremost for other white male landowners, leaving everyone else behind. From that point of view, the very foundation of this country is based on sexism, racism, and classism. And yet, in the past 233 years, people of all genders, races, and classes have worked tirelessly to create a new version of that dream. Suffragists, abolitionists, civil rights and labor leaders have all demanded a voice in their government, and it is through their efforts that the American political landscape is becoming more inclusive each day. We have a long road ahead of us, but I for one am incredibly grateful that I am alive to see the great step forward taking place this Tuesday, with the inauguration of President Barack Hussein Obama. In the 21st century, let us have a new dream: that someday the United States will have a government that is not only by the people and for the people, but truly of the people—all the people.