
Haiyan and the Resilience of a Community

Typhoon Haiyan: residents of Tacloban city

It’s been a little more than a week now since Typhoon Haiyan plowed into the Philippines.  With maximum sustained winds of 195 mph, Haiyan – also known as Yolanda – is the most powerful tropical storm system in recorded meteorological history to make landfall anywhere in the world.  The previous record had been held by Hurricane Camille, which packed 190 mph winds when it slammed into the U.S. Gulf Coast in 1969.

The Philippines are no stranger to typhoons.  Strategically situated just north of Indonesia, between the South China Sea and the Philippine Sea, this country of 96.7 million has to brace itself every year for tropical storms.  But, this time things are much worse.  As far as storms go, Haiyan couldn’t have hit a more vulnerable location.

Barely a quarter century removed from the brutal, 20-year dictatorship of Ferdinand Marcos, the Philippines still rank as a developing nation, even though it’s a relatively fully-functioning democracy.  Geographically classified as an archipelago, the Philippines comprise 7,107 islands.  But, it’s actually part of the overall Malay Archipelago, the world’s largest such area.  Humans have occupied the Malay region for at least 30,000 years.  For centuries, though, the Philippines often served as a crossing point between mainland Asia and the larger islands of Borneo and New Guinea.  The arrival of Islam in the latter part of the 14th century changed much of the Philippines’ culture; a fact that remains even now, as the nation battles more radical Islamic elements.

In 1521, Spanish explorer Ferdinand Magellan became the first documented European to arrive in the Philippines.  He didn’t last long.  Barely a month later, local warriors killed him and several others from his expedition during an intense battle.  But, the Spanish government, in its own bitter rivalry with Great Britain for world domination, persisted and launched more expeditions to the Malay area.  More battles ensued and more blood was spilled, but in 1565, King Philip II succeeded in making the islands a Spanish colony.  It is he for whom the Philippines are named.  The Philippines remained a Spanish outpost until the 1898 Spanish-American War.  In 1935, the islands became a self-governing entity.

The “self-governing” part is always tricky for any nation that tries to set itself apart.  It’s especially difficult for those where democracy is an alien concept – which is pretty much most of the developing world.  After centuries of Spanish domination and Roman Catholic indoctrination, the Philippines weren’t a good candidate for automatic conversion to the democratic process.  I recall how a contingent of average Filipinos known as EDSA 1 toppled the Marcos regime in 1986, sending him and his family fleeing for their lives.  Even if his wife, Imelda, couldn’t haul her cache of designer shoes out of the imperial palace, the Marcos family had managed to siphon billions from national coffers before exiling themselves to Hawaii.  As the haggard clan disembarked from a plane, one Marcos relative clutched a bag of diapers, as if it were her only possession.  Then again, it’s quite possible fine jewelry and blocks of cash were hidden inside, so why wouldn’t she keep a tight grip on it?  In an attempt to make peace with the Philippines, the U.S. government indicted Ferdinand and Imelda Marcos on a series of racketeering and money laundering charges.  After Marcos died of cancer in 1989, the U.S. dropped all charges against Imelda.  She may never have gotten her shoes back, but at least she’s living in paradise.  Who says crime and corruption don’t pay?

When EDSA 1 finally rid the Philippines of Marcos, it installed Corazon Aquino as president.  Her husband, Benigno, had been a vocal critic of Marcos and went into exile in 1980.  When he dared to return to the land of his birth three years later, he was assassinated – a killing widely blamed on the Marcos regime.  Thus began the torturous battle for freedom and the long slog towards a democratic state.  When international pressure compelled Marcos to call for elections in February 1986, Corazon Aquino was chosen as the opposition leader.

But, as foreign observers feared, everything that could have gone wrong with the Philippine election process did.  Results eventually proved Aquino the victor, but not before scores had died in rioting.  When the Marcos family fled, Aquino took her rightful place as president of the burgeoning democracy and spent her single, six-year term fending off criticisms of ineptitude and coup attempts by Marcos supporters.

With a labor force that is about 52% services and 32% agrarian, it’s no surprise the Philippines continues to struggle against the tide of wealth inequality.  Roughly 26% of the population lives at or below the poverty line.  Thus, Haiyan’s arrival added to the misery.  But, that happens wherever communities subsist in states of financial insecurity.  When Hurricane Katrina struck the U.S. Gulf Coast in 2005, President George W. Bush received staunch criticism for his inaction.  True, as a lackluster president, Bush didn’t have the mindset to respond to a natural catastrophe.  No one in his administration did.  But, for years, scientists had been warning the state of Louisiana that its southern enclaves were vulnerable to devastation, notably low-lying New Orleans.  But, the Crescent City itself was already in a state of decay.  Most of its citizenry relied upon government assistance and menial cash jobs just to survive.  The people were ill-equipped to help themselves get out of harm’s way; e.g., renting a car or buying a plane ticket.  The endemic corruption in both the city of New Orleans and the state of Louisiana set everyone up for disaster.

As of now, the death toll in the Philippines from Haiyan stands at 3,631 – the “official” estimate.  With so many rural areas still cut off due to lack of electricity and telecommunications, the number of victims may be higher.  I see reports of how bodies were left to rot on city streets, and I’m glad the chances of that happening here in the U.S. are slim.  People were upset that so many damaged vehicles were left on the streets of New Orleans almost a year after Katrina.  But, human bodies and animal carcasses?

Every one of those bodies was once a person; an individual who had a family and friends; someone who had hopes for a better future.  When death occurs on so massive a scale, it’s often difficult to think of the deceased as individuals.  It depersonalizes the disaster for us; it’s easier to think of the dead en masse and just shake our heads at the horror of it all.

Governments can’t address each one of them, so they have to consider the entire calamity and do what they can.  But, it’s really up to the survivors and their communities to cope with the aftermath.  They have to deal with the destruction; they have to clean out their homes; they have to gather what food and water they can find; they have to tend to the injured; they have to defend what’s left of their world.  In other words, they have to care for themselves.  That sounds brutal, but in a brutal situation, who better to take care of you and your loved ones than you, if you’re able-bodied?

I do know this: despite the mess, people will survive.  Someone will always get through such disasters and continue with their lives by rebuilding their neighborhoods and therefore, their countries.  After the initial shock, they stand up and just keep going.  It’s hard and it hurts; nothing like that is ever easy.  They may never recover emotionally or even physically from the upheaval, but they go on for as long as they can.  It’s just human nature.

This post from fellow blogger Donna Amis Davis, a long-time resident of the Philippines, provides more personal insight into the disaster.

International Red Cross.

Doctors Without Borders.

Project Hope.

International Fund for Animal Welfare.


Sometimes People Do Deserve to Die Like That


One of my favorite television shows is “The First 48” on the A&E Network.  Camera crews follow homicide detectives around major metropolitan areas as they try to solve murders.  The show’s title is based on the concept that police must try to solve a killing within 48 hours of its occurrence, or the chances of finding the culprits decrease exponentially.  People who know me may find it a strange choice, considering I’m suspicious of law enforcement.  The few times I’ve needed the help of a police officer, none have been around.  But, if I should exceed the speed limit by 5 mph, or have an expired inspection sticker, suddenly they’re on the scene.  Still, I admire the tenacity of the homicide detectives I’ve seen on “The First 48.”  I also admire their tendency to remain neutral in the face of such tragedies; the worst that humanity has to offer.

While consoling the victim’s relatives, the detectives almost always declare that the person “didn’t deserve to die like that.”  True, no one really deserves to be murdered.  But, the adage about playing with fire and getting burned applies just as well to criminal activity.

In one of the “The First 48” episodes, a Miami homicide detective stood in the middle of a street in a particularly crime-riddled neighborhood and announced that it was “haunted by the ghosts of young Black men.”  Indeed, it seems so many of the crime victims and perpetrators are either Black or Hispanic.  I’m honestly surprised when a White person shows up as either a victim or a suspect.  That feeds into the mythology, though, that Blacks and Hispanics are more crime-prone than their White and Asian counterparts.

But, I’ve also noticed many of the homicide detectives – at least half – are either Black or Hispanic also.  So are many of the regular police officers.  They somehow go unnoticed in discussions of race and crime.

It’s not so much, however, that non-Whites are more likely to commit crimes.  Civil rights activists have long accused the criminal justice system in the U.S. of being skewed against non-Whites, especially non-White men.  The U.S. also maintains the highest number of incarcerated individuals in the world: roughly 2.3 million people, or 25% of the global prison population.  When one realizes that the U.S.’s roughly 300 million residents comprise only 5% of the people on planet Earth, it should make folks stop and think.  While Blacks and Hispanics each represent less than a quarter of the U.S. population, together they make up 58% of the U.S. prison population.

People may scoff at these statistics and proclaim the U.S. just has a better legal system.  If that’s the case, then why do we have one of the highest violent crime rates in the developed world?  As of 2011, the U.S. experienced 1.2 million violent criminal acts.  One would think we’re akin to Somalia: a completely lawless state with no functioning government.

I’m neither a criminologist nor a psychologist, so I have to rely on whatever statistics I can find and verify, instead of on personal or professional knowledge.  But, in viewing “The First 48,” I’ve noticed something critical: whenever police enter a crime-ridden neighborhood and seek help, they’re often met with a wall of silence.  No one saw anything; no one heard anything; no one knows anything.  It’s as if the victim abruptly turned up with a bullet in their brain, while nearby residents were sleeping, watching TV, or talking on the phone and ‘didn’t hear anything,’ or ‘don’t know nothing.’  At times, it seems such neighborhoods are group homes for the mentally retarded.

In one of the show’s episodes here in Dallas, officials arrived to investigate a shooting death in an apartment complex.  When one of the detectives approached a group of young men sitting on the hood of a car, the latter jumped off the vehicle and walked away.  They didn’t say anything, but their actions spoke for them: ‘we don’t want to talk to you.’  But, if you’re upset about crime in your neighborhood, then why don’t you talk to the police and tell them what you know?  Of course, that’s always easier said than done.  The police don’t have to live there.  People are often mired in poverty and can’t afford just to get up and move to a safer place.

In one episode of “The First 48,” a resident of a Miami housing complex complained to a detective that police only come around to issue tickets for cars parked in front of the trash dumpsters.  I can understand her point.  Police get frustrated when people won’t communicate with them.  But, why should residents talk, if all officers are going to do is write up parking tickets?  I can see both sides of this issue.  Criminals don’t just hurt one person; they terrorize the entire community.  People become scared and lose hope that law enforcement will help them.

There are no easy answers to these complex social issues, where race, gender and socio-economic circumstances often factor into the discomforting mix.  People have noted that, when a White female goes missing or turns up dead, police not only move Heaven and Earth to find out what happened, the story goes national.  Think JonBenét Ramsey; think Natalee Holloway.

Still, things really are different when you compare a child who is kidnapped from their own home in the middle of the night to a 20-something in an impoverished neighborhood who’s trying to get into the drug trade because of the easy money.

Consider the case of Gary Leon Ridgway, known colloquially as the “Green River Killer.”  From 1982 to 1998, Ridgway murdered at least 49 women and teenage girls – and perhaps dozens more – in the state of Washington.  He dumped the bodies in wooded areas near the Green River.  Most, if not all, of his known victims were prostitutes.  The teenaged ones were most likely runaways.  Ridgway had become a suspect in 1983, a year after he’d been arrested in Seattle for patronizing a prostitute.  He took and passed a polygraph in 1984, when police again questioned him about the string of murders.  Thus, he remained on police radar for nearly two decades, before being arrested in 2001.  In 2003, a judge sentenced him to life in prison; a shockingly lenient outcome for one of this nation’s worst serial murderers.  But, prosecutors took the death penalty off the legal bargaining table to coax Ridgway into confessing to other slayings; including some in the state of Oregon.  How he managed to escape a massive police dragnet for so long confounds even the most seasoned homicide detectives.

But, the families of many of the victims say they know why: Ridgway murdered prostitutes, not choir girls.  That many of his victims were Black or Native American added the ubiquitous and disturbing racial component.  Except for Ridgway’s teenaged victims – naïve girls who may have fled broken homes – I think it’s fair to say the adult women knew what they were doing.  Yes, prostitution is illegal.  But, you can’t expect police to monitor every interaction between hooker and client, unless the latter turns violent.  Police can only do so much to protect average citizens.

It’s tough for me to have empathy for someone who consumes alcohol for half a century and then complains when they develop cirrhosis.  As a former alcoholic, I could see where my life was headed and got hold of the problem years ago.  And, it’s equally tough for me to have sympathy for a drug dealer who ends up in a dark alley with scores of bullet holes in his or her body.  I’m not being judgmental.  I’m just pointing out the obvious.

In yet another episode of “The First 48,” homicide detectives in Memphis looked strangely at a suspect when he told them that murder is just how some people die.

“Do you realize how serious this is?” responded one of the detectives.

Obviously he didn’t, as he sat in the interrogation room with a sour expression.  He was young, but already emotionally hardened by a community that seemingly had accepted its dire fate as a crime pit.

Most people don’t deserve to be murdered.  But, when individuals deliberately engage in criminal activity and end up on a mortician’s table, what do you expect?


And Me?


In September of 2012, I was at my parents’ house when my father was getting ready to go have his car inspected, and my mother decided she needed to take out the trash.  I had come in following an earlier and somewhat stressful job interview.  I had brought my dog with me to the house – not the interview.  I suddenly thought that I needed to check on my mother.  I don’t know why; it just suddenly occurred to me.  Good thing, though.  As I entered the garage, my mother was returning from the recycle bin, when one of her slippers got caught on the cracked driveway.  She slammed hard onto the concrete and immediately started screaming.  I rushed to pick her up; her left arm looked broken.  With my help, she hobbled back into the house.

My father came down the hallway, horrified.  “What the hell happened?” he bellowed.  He already had a loud voice, so with any extra effort, he could wake the dead.

I quickly explained the situation, which only made him mad.

It wasn’t the first time my mother had tripped while wearing those slippers.  They were cheap, rubber footwear with a two-inch heel; what I called high-heeled slippers.  A few months earlier I was again at their house with my dog, when he indicated he needed to visit the back yard.  My father decided to take him out; my mother decided to join them.  She leapt up from the couch and tripped on those same slippers; slamming hard onto the tile floor.  In fact, she came right out of them.  They seemed to get stuck to the floor.  She ended up with a severe bruise up the right side of her leg.  A visit to their orthopedic doctor the following week confirmed nothing was broken, or even fractured.

When she fell in the driveway, my father hurriedly called that same orthopedic doctor.  He told them to come in immediately.  I drove them to his office; the receptionist could sense my frustration, as I signed them into the log book.

“Be patient, hon,” she drawled.

My mother’s arm wasn’t broken, but her shoulder was dislocated.  The doctor and two of his assistants tried to pop it back into place, as she lay on the X-ray table, but the muscles and ligaments around it had swollen too much.  They had to admit her to the neighboring hospital and put her under anesthesia.  It turned out to be an all-day affair.  We left the hospital around 7 P.M.

My father tossed that pair of slippers – and another similar pair my mother had in their closet – into the trash.  Since they were made of rubber, I switched them over to the recycle bin.  I hoped they could be reincarnated as the wheels of a “Hoveround” and therefore, serve a greater purpose.

She’s not the only one who’s tripped in and around the house.  My father, an avid gardener, has fallen several times outside with no one around to help him back up.  One afternoon he fell in the master bathroom and couldn’t get back up.  He started hollering for help.  My mother had fallen asleep on the couch and couldn’t hear him.  I had lain down in my old bedroom and – with the door closed – couldn’t hear him either.  My dog’s whining woke me up.

It’s a good thing I was there to help my parents in both those predicaments.  Many senior citizens live alone and often find themselves in compromising situations.  Several years ago I had a friend who volunteered for “Meals on Wheels.”  One afternoon he arrived at the home of a client, an elderly woman who lived alone.  Two of her neighbors were at the front door; frantic because she wasn’t responding to their knocks.  My friend wandered towards the back where he climbed the tall wooden fence – and saw the woman lying on the ground, just outside the back door.  She had stepped out the previous evening and tripped.  Unable to get up by herself, she simply remained on the ground; knowing her “Meals on Wheels” visitor would be there the next day.

As I rapidly approach 50, I’m now seeing all these incidents in a new light.  Who’s going to take care of me when I get old – if I should be that lucky?  I’m an only child.  I’ve never been married and don’t have any kids.  I’m close with a couple of cousins on my father’s side, but they have their own lives.  I don’t know if I’ll end up in this house where I grew up, or if I’ll have a home of my own.  But, if I should have the good fortune of living to an old age, who could I depend on for support?  I can see dogs in my future, though.  They make great companions, but unless they can be trained to dial 911, or administer first aid, companionship is about the extent of their practicality.  Still, I’d almost rather have a dog than a spouse or a partner.  I’ve never been good at romantic relationships.

It’s a serious issue facing us, as life expectancy in the U.S. and other developed nations reaches ever-increasing highs.  The current (and relentless) American obesity epidemic may put a dent in the welfare of my fellow citizens.  However, medical and scientific advances have allowed the populations of developed nations to experience greater rates of longevity, which is a good thing, of course.  People should be able to live as long as they possibly can.  But, those longer life expectancies also present some unique challenges; a fair trade-off, I presume.  It goes beyond just tolerating old folks’ stories of ‘way back when.’  Older people generally require specialized medications and treatments.  Arthritis, hearing and vision loss and immobility are among many such concerns for senior citizens.  There’s a growing industry within the medical community that targets elder care.  It’s virtually uncharted territory.

My paternal grandmother lived to age 97.  But, the years between the death of my grandfather in 1969 and one traumatic night in the spring of 1992, she’d spent mostly alone.  She got up in the pre-dawn hours, needing to go to the bathroom, when her foot became entangled in the bedding.  She stumbled forward into the baseboard of her antique bed and fell to the floor – her right elbow cut and broken.  Despite the pain and bleeding in the pitch-black darkness, she managed to pull herself back around to the nightstand where she found the telephone cord; she yanked the phone down and called one of my aunts.  My aunt called one of her sisters, before rushing to my grandmother’s house with her husband.  Someone called the paramedics.  As my aunts and uncles stood outside, they simultaneously realized one terrifying fact: none of them had a key to the house.  One of the paramedics announced he was going to break a window, when one of my uncles remembered he had a glass-cutter in his car.  They used that to gain access to the house.

At the hospital, everyone was startled to learn something more critical than not having a key: my grandmother’s body was riddled with bumps, bruises and cuts.  She conceded that she’d fallen several times in the house and had always managed to get back up.  This time was worse, though, because of the elbow break.  The emergency room doctor looked askance at my relatives.  Elder abuse had become a hot topic in the medical community by the early 1990s, and our family became concerned that someone would look at those bumps and bruises on my grandmother and think the worst.  But, no one did.

Ultimately, my father and his six siblings decided that someone needed to be with her at all times.  My grandmother wasn’t too keen on the idea, though.  She relished her independence and privacy and didn’t want someone monitoring her every move.  But, her children ruled against her.  She was fortunate – and blessed.

A close friend of mine is caring for his elderly mother and an elderly aunt.  His aunt is in her early 90s, and his mother is fast approaching that milestone.  He works full-time, so it’s a challenge to tend to the needs of both women.  He once confided to me that, on a few occasions, he wanted to pack up and leave Dallas for somewhere else; anywhere!  Just as long as he had no one to worry about except himself.  Alas, he couldn’t bring himself to do it.  He’s not so cold-hearted.  His older brother died a few years ago, and his younger sister has a daughter who just turned one.  His sister also has a 20-something son from a long-ago relationship who lives in the same house as his uncle, grandmother and grand-aunt.  He’s a very responsible young man who finished a hitch in the U.S. Marines three years ago and just earned an associate’s degree from a community college.  But, he also works and, at his age, I don’t think he envisions a lifetime of caring for old folks.  One day, however, he and his half-sister may face the concerns of elder care with their mother.

It’s difficult to watch my parents age.  “It’s hell getting old,” they inform me periodically.  Not until a few years ago, about the time I turned 45, did I really sit down with nothing but my most honest thoughts and contemplate life as a senior citizen.  Aside from previous bouts with alcohol addiction, I’ve tried to take care of myself both physically and mentally.  I’ve suffered from severe depression and anxiety in the past; adverse effects, I now realize, of not being able to kill people who pissed me off and get away with it.  Otherwise, I’m pretty healthy.  People who don’t know me occasionally tell me I look 30-something.  That’s a good thing.  But, surface appearances are no substitute for a strong inner core.

In 2043, for example, I’ll be 80 – the same age as my parents are now.  Will I still have relatively good vision and the mental acuity needed to drive a vehicle?  More and more older Americans are still driving, even as their reflexes slow.  Some states are approaching the delicate issue of how to deal with the growing number of senior citizen drivers; another effect of longer life expectancies.  Will I learn from my parents’ mistakes and watch where I’m walking?  Falls are the leading cause of injury to the elderly.  It was bad enough that my mother would wear those damn rubber slippers with a two-inch base, but she also had the habit of dragging her feet.  I can understand why.  Joints become stiff with age, as the cartilage in the knees wears thin.

It would be nice for me, at age 70 or 80, to sit around the house and relish the fruits of a successful writing career.  But, at some point, I’d have to do laundry, or go to the grocery store.  If I have dogs – which I honestly intend to have – I must take them to the vet periodically.  It is possible that, in 30 years, grocery shopping will be done strictly online with customers sitting at their computers using web cameras to analyze fruits, meats and vegetables.  I wouldn’t be surprised if the U.S. Postal Service – now fighting valiantly to stay alive and relevant – will be a memory in 30 years; akin to my paternal grandfather’s early 20th century carpenter tools.  But, could there also be a vet who makes house calls?

Twenty years ago, when a good friend of mine died of AIDS, I felt lucky to reach my 30th birthday less than two months later.  Before I knew it, though, the turn of the century came – and went – a milestone most generations never get to witness.  I turned 40 just weeks before I marked my first anniversary with an engineering firm – and then came down with the flu for the first time in my entire life.  Now, the first decade of the 21st century is old news.  Yes, technology changes, but so do people.

I’m not a braggart.  I don’t live for the moment, or for mounds of attention.  I’m an introvert who prefers quiet spaces most of the time; a hermit, perhaps, but one who cherishes books more than booze and dogs more than people.  If we’re fortunate, we get to live to see 70, 80, 90 and so on.  But, for me personally, what does that type of future hold?

I don’t cry out, ‘What about me?’  I’ve moved beyond wishing for the adulation of others.  But, seriously contemplating my later years, I really do have to ask, ‘What’s going to happen to me when I get old?’


Morass


As of 12:00 A.M. today, October 1, the United States government – for all intents and purposes – has stopped functioning.  I know it seems nothing has changed.  I mean, seriously – is there any difference?  But, the painful reality is that some 2 million government employees will not get paid and national parks have closed.  That’s the immediate effect.  It gets worse if the shutdown continues: military veterans won’t receive their benefits; the Centers for Disease Control and Prevention (CDC) will have to halt its flu vaccination program – just as flu season approaches; some food safety operations will stop (and in a nation where behemoth butts have become the norm, that spells catastrophe); small business financing will stop; Head Start programs will start closing; disability benefits could be interrupted; funding for disease treatment through the National Institutes of Health could cease.

In the meantime, every member of both houses of Congress will receive their paychecks; their own health care won’t be adversely impacted.  Ironic, though, considering that the Affordable Care Act is the genesis of the squabble between the two principal political parties.  Most Republicans – especially the “Tea Party” clowns – despise the ACA, which they’ve derisively called “Obamacare.”  And, in an attempt to stop funding for the President’s signature law, the GOP is willing to risk what little integrity they have in their xenophobic bones and shut down the government.

Over the weekend, one particular “Tea Party” darling, Senator Ted Cruz, launched into a marathon tirade against the ACA.  Hoping to make a name for himself, the Canadian-born, Cuban-American Cruz has been campaigning for president since he took office back in January.  Representing my beloved home state of Texas, Cruz has done little else with his time and energy except commandeer the Republican Party’s vitriolic bandwagon and try to obstruct President Obama in any way possible.

Altogether, Congress has about a 10% approval rating.  I think ptomaine poisoning and getting stranded in the desert without water or cell phone service rank just above it.  Last year I wrote about the ongoing lack of progress from the Senate and the House of Representatives.  My wishful demand was for every elected official in Washington to get impeached, so we – the average, hard-working Americans – can elect more level-headed people to fill the apathetic void.  A million dollars in gold bullion has a greater chance of landing on my doorstep tomorrow morning.

I clearly remember the 1995–96 government shutdown, in which a beleaguered President Bill Clinton ran head first into a recalcitrant Republican Party (led by the self-righteous Newt Gingrich) – and won.  It was a different time, though.  The GOP held strong majorities in both houses of Congress; we weren’t at war; and the economy exploded into profitability for everyone shortly thereafter.  Clinton didn’t back down, thus forcing the GOP into embarrassingly humble defeat.

Today, the U.S. economy is still reeling from the worst downturn in 80 years; we still have troops in Afghanistan; and Republicans control only the House of Representatives.  Regardless, I’ve lost all respect for our elected officials.  Obama still hasn’t found any steel bars to inject into his spine, and the GOP has let itself be dominated by right-wing extremists.  I’m trying to imagine how things could get any worse.  If they do, colonizing Mars looks better all the time.


Okay, Let’s Attack Syria, But…


President Obama has placed himself into a quandary with Syria.  As the world observes what can only be deemed a human atrocity with a chemical assault upon Syrian civilians, the United States collectively contemplates intervention.  Obama won the presidency in 2008 primarily based on his opposition to the Iraq War – the illegitimate enterprise launched by the draft-dodging George W. Bush and Dick Cheney.  We now know that American oil interests used the horror of 9/11 to justify the invasion of Iraq.  Those same entities are surely behind Obama’s sudden desire to attack Syria.

It’s amazing how the U.S. government selects its battles.  President Bill Clinton said he didn’t intervene in the 1994 Rwandan massacre because he simply had no idea what was happening; a dubious claim at best.  Ronald Reagan sent covert military operatives into Central America allegedly to stamp out communist insurgencies.  In reality, U.S. conglomerates like United Fruit wanted to maintain their lock on local commodities.

Chemical warfare is nothing new.  Technically, people have been using chemical weapons for millennia, starting with poisoned arrows.  They gained prominence, however, at the start of the 20th century, when Germany launched a chlorine gas attack at Ypres, Belgium, in 1915 and made extensive use of such weapons throughout World War I.  Consequently, in 1925, an assemblage of nations banned chemical weapons with the Geneva Protocol.  But, things always look great on paper.

No one jumped when Saddam Hussein used mustard gas and sarin against Kurdish civilians in 1988; perhaps because the U.S. might have been involved.  Hussein may have used chemical weapons against the U.S. military during the 1991 Persian Gulf War.  A decade ago the U.S. accused Hussein of stockpiling weapons of mass destruction, which, of course, prompted the invasion.  Notice how these things are cyclical?

Now Obama wants to don the mantle of international hero by ousting Bashar al-Assad.  So far, he hasn’t convinced too many in the U.S. Congress, nor has he been able to persuade our biggest ally, Great Britain.  He plans to take his case to the American public in a televised address tomorrow night.  Good luck.

But, if the U.S. does plan to attack Syria, here are two conditions I’d like to see take place first:

  • Raise taxes on the wealthiest citizens and largest corporations to fund the war.  Our engagements in Iraq and Afghanistan occurred without the benefit of significant tax revenue, which ultimately contributed to the current economic crisis.  Besides, all those rich folks and oil conglomerates are the ones who benefited the most from the conflicts.
  • Institute the military draft for every able-bodied person ages 18 – 25.  But, this time include women and rich men.  Yes, if women want to be treated as equals to men in business and politics, that means they have to serve alongside men on the battlefield.  In the past, sons of affluent families have been able to bypass military service.  (Mitt Romney comes to mind.)  But, if those boys can expend energy racing their million-dollar speed boats or partying all night in Cancún, then they can damn well haul rucksacks across the Syrian desert.  There also should be no exceptions for conscientious objectors, such as the Amish, Mennonites, or Quakers.

I won’t hold my breath on passage of either.  I know it’s a long shot to expect multi-millionaires to share the tax burden (not their hard-dollar wealth), or for “Millennials” to put down their iPods and actually do something constructive.  But, what’s life worth if you can’t dream?  Ultimately, my dream is for the Syrian people to rise up and depose al-Assad all on their own.  Regardless, war is just too ugly for only a handful of people to endure.

Image courtesy Warrior of Ideas.

1 Comment

Filed under Essays

Save the Boys, Damn the Religion!


Where was the outrage?

Last year the Centers for Disease Control and Prevention (CDC) reported that, between November of 2000 and December of 2011, eleven infant boys in the New York City area developed herpes infections following orthodox Jewish circumcision rituals.  In keeping with religious tradition, every infant male born into the Jewish faith undergoes a bris, or brit milah, on the eighth day of life, during which the foreskin of his penis is removed.  The cleric, a mohel, often dabs the infant’s lips with a drop of wine supposedly to numb the pain before performing the ritual.  In the rare cases when the baby is born without a foreskin – a condition called aposthia – or if he was circumcised outside of the standard bris ceremony, the mohel performs a symbolic circumcision called a hatafat dam brit, in which he pricks the head of the infant’s penis to draw a drop of blood.  All of this is done in accordance with Jewish scripture – Genesis 17:10-14 and Leviticus 12:3 – which commands circumcision as part of the covenant God allegedly made with Abraham, the founder of Judaism.  Orthodox Jews, like many staunchly religious people, view their faith as an unmitigated commandment that should not be questioned.

No one knows if Abraham considered the possibility of herpes infections.  But, during some of these ultra-orthodox rituals, the mohel often performs metzitzah b’peh, or oral suction, to minimize blood loss.  In other words, he sucks on the baby’s penis, while family members and others stand around in quiet observation.  I believe, in keeping with contemporary federal law, that’s called pedophilia and – regardless of one’s religious affiliations – it’s a felonious criminal offense.

Health officials have known for years that herpes infections can be detrimental to newborns.  Because of their undeveloped immune systems, babies born to women infected with genital herpes (herpes simplex type 2) can develop fevers, seizures and/or blindness.  Death is not uncommon among these infants.  Herpes simplex type 1 usually causes blisters on the mouth, lips or eyes; otherwise known as cold sores.  Of the 11 aforementioned New York cases, 10 of the babies were hospitalized; at least 2 developed brain damage, and 2 others died.

In December of 2005, New York Mayor Michael Bloomberg – tiptoeing through the minefield of religious sensibilities – issued a letter to the local Jewish community warning of the health risks of metzitzah b’peh and politely asked rabbis to cease the practice.  Religious leaders scoffed at the notion, insisting that the ritual was perfectly safe.  As usual, they claimed religious freedom and vowed to fight any attempts to ban it.

Such cases may be rare, but I noticed no demands were made of New York’s Jewish community to stop putting their infants at risk; no threats of prosecution; no criminal charges – nothing but courteous requests to think about what they were doing.  Had those infants been girls, I realized, Bloomberg himself would have rounded up every religious leader and every parent and thrown them in jail.  But, since male circumcision has become such an insidious element of pediatric care in the U.S. and since violence against males – even infant males – is socially acceptable here, no one seemed to notice.

Religious freedom – like free speech and voting – is one of the hallmarks of American society.  It’s a critical feature of any civilized state.  But, I have to wonder how the public would react to infant females contracting genital herpes following some archaic religious ceremony.  Would the local mayor merely ask religious leaders to stop and just hope for the best?  Where, in fact, was the media outrage over the 2012 CDC report?  Why is it that people seem to think it’s okay that baby boys aren’t just being cut up in the name of religion, but dying because of it?

Male circumcision is primarily associated with Judaism, but it’s also a sacred rite among Muslims.  Unlike Jews, however, Muslims wait until their sons are older to perform the ritual – usually between the ages of 6 and 11.  Its origins in the Islamic faith, however, are unclear.  It’s mentioned in the hadith (sayings of the prophet Muhammad), but not in the Quran.  Circumcision is not considered a religious rite among Christians, even though the “Gospel of Luke” states that Jesus was circumcised on the eighth day after his birth.  Circumcision was also a rite of passage among some African and Indigenous Australian groups, where it was viewed as a pathway to manhood for boys.

A purported circumcision from the Temple of Khonspekhrod in Luxor, Egypt, c. 1360 B.C.

Male circumcision was once virtually unknown in the United States.  Early proponents were doctors who believed it would prevent male sexual deviants from committing further crimes, such as rape and pedophilia; others included homosexuality in that evil repertoire.  Circumcision was even recommended for men charged with adultery and to stop boys from masturbating.  This was during a time when physicians believed human sexuality (and its various perversions) was strictly tied to genitalia.  In 1858, for example, the European medical community urged clitoridectomies to overcome frigidity and hysteria in women.  In 1891, England’s Royal College of Surgeons published On Circumcision as Preventative of Masturbation.  Around the same time, John Harvey Kellogg, a nutritionist and self-proclaimed sexual advisor, developed his corn flakes cereal as a means to prevent children from masturbating.  Kellogg believed masturbation – then often called onanism or self-pollution – caused insanity and, if left unchecked, could be fatal.  He even suggested threading silver wire through the foreskins of young boys to prevent them from getting erections and, therefore, stamp out their sexual urges.  He also came up with the idea of injecting some of his patients with yogurt enemas to cleanse their intestinal tracts.  Fortunately, neither of these latter two practices caught on with the American public.

Neither did circumcision.  That began to change, however, after World War II.  Much of it has been credited to the rapid influx of Jewish immigrants fleeing Nazi-riddled Europe.  But, a growing body of medical practitioners had already begun to urge circumcision of newborn boys as a means of preventing penile cancer later in life.  In 1932, Abraham Leo Wolbarst published a review of 1,103 cases of penile cancer in the U.S. [Circumcision and penile cancer. Lancet 1932; 1: 150-153] and noted that none occurred among Jews.  He cited similar figures from Europe and pointed out that Muslim men who had been circumcised as pre-teen boys were less likely to develop penile cancer.  A 1935 report entitled “Epithelioma of the Penis,” published in the Journal of Urology [Dean AL Jr. Epithelioma of the penis. J Urol 1935; 33: 252-283], seemed to confirm those findings with an analysis of a mere 120 penile cancer victims at New York’s Memorial Hospital: none were Jews.  Circumcision among adult males began to increase throughout the 1930s.

Detail of Friedrich Herlin’s 1466 depiction of the circumcision of Jesus, “Twelve Apostles Altar.”

Then, in 1946, various reports started coming out in the U.S. claiming that men returning home from World War II, especially those who’d served in North Africa, were suffering from penile cancer.  These men, some medical professionals supposedly observed, had gone for long periods without bathing and, for the uncircumcised ones, this culminated in a build-up of smegma; which, in turn, developed into penile cancer.  It is true that many of those servicemen were uncircumcised and had gone without bathing for lengthy stretches.  But, they weren’t suddenly afflicted with penile cancer.  Instead, many of them were suffering from venereal diseases, mainly syphilis.  It’s quite plausible to assume many of them, happy that the relentless war had finally ended, celebrated by patronizing local brothels before returning home.  Yet, the unsubstantiated claims of a sudden outbreak of penile cancer nonetheless launched a movement, and circumcisions of newborn males began occurring at a rapid pace.  By the mid-1950s, up to 90% of newborn American boys were circumcised; thus making it the most common surgical procedure in the country.  By the early 1960s, some health insurance companies began reimbursing doctors for circumcisions, thus invoking a profit motive.  Some hospitals started performing circumcisions without the parents’ knowledge or consent – and then charging them for it.  In the early 1980s, the rate of newborn male circumcisions began a slow but steady decline.  By 2010, the rate stood at roughly 40% in the U.S. – the first time it had been below 50% in over half a century.

Preventing penile cancer is perhaps the top myth related to male circumcision.  As with anything, the truth often gets lost amidst the rancor of popular opinion and uncertain medical advice.  Tell a lie often enough, as the saying goes, and people start to believe it.  But, here are the facts, starting with that number one lie:

Myth:  It prevents penile cancer.

Fact:  Penile cancer is one of the rarest forms of carcinoma known to humanity.  Worldwide, penile cancer accounts for about 0.2% of all cancers in men.  In the U.S., it accounts for some 0.1% of all cancers in men, or about 1 man in 100,000.  Men are actually more likely to die from the rare male form of breast cancer than from penile cancer.  Even in other developed nations, such as England and Japan, where male circumcision is uncommon, penile cancer is rarer still.

After years of intense medical analyses with various groups of men, no doctor has been able to prove conclusively that intact foreskins are linked directly to penile cancer.  Doctors do know that the number one cause of penile cancer is the human papillomavirus (HPV), which is spread through unprotected and often frequent sex.  Poor diet, obesity and nicotine consumption are other contributing factors.

Myth:  It prevents cervical cancer in men’s female partners.  This is another top reason provided for male circumcision.

Fact:  As with penile cancer, HPV is the leading cause of cervical cancer, with poor diet, obesity and nicotine consumption listed as other risk factors.  Up until the mid-1950s, cervical cancer was one of the leading causes of cancer deaths among women in the U.S.  But, physicians don’t credit the increase in male circumcisions for the decline; rather, they point to the increased prevalence of pre-cancerous screenings (Pap smears) and greater attention to women’s overall gynecological health.

Circumcising males to protect females may be politically correct, but it’s morally unethical and medically impractical.  You don’t safeguard one group of people by violating the basic human rights of another.  Even if all men are circumcised, venereal diseases can still be spread through unprotected sex.  As with the number of pregnancies and births, the rates of venereal disease infections drop when women are empowered with information.  Women in developed countries, for example, have on average 2 children; while women in developing nations have as many as 5 children.

Myth:  It minimizes the risk of venereal disease transmissions.

Fact:  The term “minimize” is often substituted for the term “prevent,” but the misunderstanding can be dangerous.  Even though most males born in the U.S. from the 1950s to the 1970s were circumcised, the rates of sexually transmitted diseases increased exponentially during that same time period.  Gonorrhea was one of the biggest culprits, with 193 reported cases per 100,000 individuals in 1950; and 442 reported cases per 100,000 individuals in 1980.  Syphilis actually experienced a dramatic decrease: 642 reported cases per 100,000 individuals in 1950; and 60 reported cases per 100,000 individuals in 1980.  The key term, of course, is “reported.”  Even now, though, both those ailments remain the most commonly transmitted venereal diseases.  (Health, United States, 2010, U.S. Health and Human Services, Trend Tables: Table 44, p. 212.)

Genital herpes exploded from an average annual 5% infection rate in the late 1960s to about 30% by 1980.  Chlamydia, which was rarely reported before 1990, saw 1.4 million cases in the U.S. in 2011.  Hepatitis B has also been tenuously linked to male circumcision.  Scientists identified Hepatitis B as a separate strain in 1955 and discovered it could be sexually transmitted in 1975; the same year they identified Hepatitis C, which they initially called “non-A, non-B.”  Until the 1970s, Hepatitis B had been dubbed the “druggies’ disease” because it primarily infected intravenous drug users.  In the 1980s, Hepatitis B became linked with another growing epidemic, another consequence of the sexual revolution: AIDS.  And that, in turn, has now metamorphosed into yet another ruse for circumcision.

In recent years, some epidemiologists have claimed that circumcision minimizes the spread of HIV (human immunodeficiency virus) infections.  Much of this is based on a controlled study of 5,534 uncircumcised, HIV-negative Ugandan men, beginning in 2002.  Doctors convinced the men (all of whom identified as heterosexual) to get circumcised.  None of the physicians believed circumcision alone would protect the men from HIV, but they wanted to see whether the circumcised men acquired the virus at a lower rate.  As often happens, things looked great on paper, but didn’t go as planned once put into action.  Many of the men – believing they’d been rendered immune to HIV – began having unprotected sex; others disappeared from the study group, so doctors couldn’t track their activities.  Still, the doctors insisted the study showed promise; claiming that circumcision reduced a man’s risk of acquiring HIV by as much as 60%.  But, to me, the concept of a bunch of mostly White, mostly female European and American physicians urging a cluster of uneducated, basically illiterate Black men to have their penises mutilated seems as racist and sexist as it does immoral.

Myth:  It prevents urinary tract infections (UTI), especially in male children.

Fact:  The medical community can’t seem to make up its mind on this one.  On average, about 5% of girls and 2% of boys will develop a UTI.  Between 1971 and 1999, the American Academy of Pediatrics published 5 policy statements on the circumcision of boys in relation to UTIs and could find no credible evidence of a direct correlation.  In other words, circumcision didn’t prevent UTIs in boys.  In 1986, however, the Academy still noted the procedure “has potential medical benefits.”  Then, in 1999, it reversed course and declined to recommend it.

An analysis of 136,086 boys born at U.S. Army hospitals from 1980 to 1985 showed that 100,157 were circumcised.  Of those, 193 experienced complications related to the procedure; that apparently included UTIs.  Of the 35,929 uncircumcised infants, 88 (or 0.24%) developed UTIs.  It’s obvious infants develop UTIs because they can’t control their bladders and, therefore, can’t clean themselves.
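Run the Army hospital figures through a quick back-of-the-envelope check and the two groups come out within a fraction of a percentage point of each other.  Here is a minimal Python sketch, using only the numbers quoted above (the variable names are my own):

```python
# Figures from the U.S. Army hospital analysis of boys born 1980-1985
circumcised = 100_157        # circumcised boys in the study
circ_complications = 193     # procedure-related complications (incl. UTIs)
uncircumcised = 35_929       # uncircumcised boys (136,086 - 100,157)
uncirc_utis = 88             # UTIs among the uncircumcised

# Express each as a percentage of its own group
circ_rate = circ_complications / circumcised * 100
uncirc_rate = uncirc_utis / uncircumcised * 100

print(f"Circumcised complication rate: {circ_rate:.2f}%")   # 0.19%
print(f"Uncircumcised UTI rate:        {uncirc_rate:.2f}%") # 0.24%
```

In other words, roughly 2 cases per 1,000 boys either way – hardly the stuff of a universal surgical mandate.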

There is only one legitimate medical reason for circumcision: phimosis, which is the inability of the foreskin to be retracted.  The condition can lead to inflammation of the penile glans and urinary tract infections.  Occasionally, topical ointments such as hydrocortisone can relieve the tightness of the skin and subsequent inflammation.  But, more practically, removal or loosening of the foreskin is appropriate.  Still, on average, only about 1% of boys are born with or develop this condition.

Another medical reason often given for circumcision is prevention of balanitis, which is inflammation of the penile glans.  This usually occurs in uncircumcised men, but is traced to one primary cause: poor hygiene.  Severe balanitis requires more aggressive treatments, such as antibiotic pills or steroid creams.  But, it’s amazing what regular hygienic habits can accomplish.  Simple hand-washing, for example, can reduce the risk of respiratory-associated infections by up to 16% and reduce the risk of diarrheal disease-associated deaths by up to 50%.

Yet another explanation often given to justify circumcision is purely aesthetic: it allegedly makes the penis look better.  That, of course, is a matter of personal opinion, not grounds for mandatory foreskin removal.  I’ve entered into a number of debates about this one in particular; often with women who would scream if I suggested they have a surgical procedure done to meet my own definition of beauty.  Any woman who thinks the uncircumcised penis looks ugly needs to hold a mirror up to her own crotch; female genitalia aren’t exactly a work of art either.  Human genitalia aren’t built for appearance; they’re built for function.  You don’t look at them; you work with them.

Then, there’s the presence of smegma – the nasty buildup of dead skin cells beneath the foreskin.  Its primary cause?  Once again, poor hygiene.  For most uncircumcised men, hygiene is a simple matter, like breathing – we retract the foreskin and clean ourselves.  Any uncircumcised man who doesn’t engage in this most basic of behaviors has far more problems than an inability to reach for soap and water.

If circumcision truly prevented penile or cervical cancers, then perhaps we should mandate, or at least strongly recommend, that women have double mastectomies once they pass their child-bearing years to avoid breast cancer.  Despite recent medical advances and awareness, breast cancer remains one of the leading cancer killers of women in the U.S.  For that matter, we should mandate that adult males have prostatectomies to avoid prostate cancer, the second greatest cause of carcinoma-related deaths among men in the U.S.  (Lung cancer is the top killer, but I don’t think mandatory thoracotomies would be practical.)

Appendicitis is much more common than penile cancer, and since the appendix serves no known essential purpose in the human body, preemptive appendectomies could save valuable time and money.  Tonsillitis is a common affliction in children, but doctors still don’t perform tonsillectomies as a preemptive measure.  Wisdom teeth often become impacted and necessitate removal, but again, doctors don’t automatically mandate it.

A global map of male circumcision prevalence at the country level.

Unlike bans on so-called female circumcision, calls to ban male circumcision have been met with hostility from people who suddenly develop an affection for religious freedom.  The loudest voices have come from the Jewish community; many of whom will use any excuse to play the victim.  When a handful of Muslim groups protested that banning female circumcision violated their religious freedom, human rights activists paid no attention.  In that regard, protecting the health and safety of infant and toddler females trumped the religious ideologies of their parents.  A number of countries, including the U.S., rightfully passed laws outlawing the practice.  When it comes to males, however, that religious freedom issue abruptly rears its ugly head and suddenly takes precedence over the rights of the child.

In 1996, then-Congresswoman Pat Schroeder of Colorado proposed the Female Genital Mutilation Prevention Act (FGMPA) to outlaw female circumcision in the U.S.  It didn’t seem to matter that the ritual had never been practiced here, or most anywhere in the developed world.  The FGMPA passed unanimously, and then-President Bill Clinton signed it into law.  I’d never even heard of female circumcision until the early 1990s, when human rights advocates started complaining about the thousands of girls suffering and dying in isolated parts of Africa and Asia.  For a much longer period, however, others had been complaining about the savagery of male circumcision and the fact that boys are suffering and dying as well.  The same devout Muslims who practice female circumcision in Africa and Asia also practice male circumcision – with the same level of barbarity: no anesthesia, no sterilization and no post-operative medical care.  With each child – female or male – they just cut off part of the flesh.  But, as in the developed world, the deaths and injuries suffered by males are ignored.  It is truly a gender-biased abomination.  In the politically correct universe of 1990s America, though, that didn’t seem to matter; thus, the FGMPA became law without question and remains on the books.

In 2011, two California cities – San Francisco and Santa Monica – saw proposals to ban male circumcision.  In both cases, the issue reached the state legislature, where Assemblyman Mike Gatto responded by introducing a bill to prevent any municipality in California from outlawing the procedure.  Ultimately, supporters of the bans in both cities came up empty.  In San Francisco, a judge ordered the measure removed from the November 2011 ballot.  In Santa Monica, those who had proposed the anti-circumcision measure simply withdrew it from consideration.

In July of 2012, the German government backed away from a sweeping proposal to ban male circumcision.  Chancellor Angela Merkel announced that Jewish and Muslim groups would be allowed to circumcise their sons in accordance with their respective religious beliefs.

The 11 cases highlighted in the 2012 CDC report aren’t really anomalies.  Every year in the U.S., about 100 infant and toddler boys die due to botched circumcision procedures, including complications from the administration of anesthesia.  Some say the number sometimes reaches 300, but actual statistics are difficult to ascertain.  I’m quite certain that if 100 to 300 infant or toddler girls were dying from botched medical procedures, the practice would have been outlawed without question, no matter whose religion was offended.  If 100 to 300 adult females died annually from a botched cosmetic procedure, it definitely would have been outlawed!

It’s shocking to think that infant male circumcision is the most common surgical practice performed in the United States, but it has been for over six decades.  Even with the rash of weight reduction surgeries and face lifts in recent years, removing the foreskins of baby boys still ranks number one among cosmetic procedures.  But, the adverse effects of those circumcisions are conveniently left out of the debate.

Almost every year for nearly a decade, a bill simply titled the “Male Genital Mutilation Bill” has been presented to the U.S. Congress.  And, every year, it never comes up for discussion.  It goes back to the cloak of religious freedom, and the grip it has on society.

When people make medical decisions based on religious ideology, other people – usually infants and children – often die.  In medieval Europe, the Roman Catholic Church often punished as heretics any medical practitioners who tried to ease the difficulties of pregnancy and childbirth; the Church believed women had to suffer for the sins of “Eve.”  Even now, the Fundamentalist Church of Jesus Christ of Latter Day Saints (FLDS) forces women and girls to endure the agony of childbirth because of Eve’s alleged transgressions.  When the “Black Death” struck 14th century Europe, the Roman Catholic Church pointed to Jews as the culprits.  As we now know, of course, the “Black Death” was the bubonic plague, a bacterial disease transmitted by fleas that live on rats and other animals.  The lack of hygiene among medieval Europeans and the fact they often slept in the same quarters as their animals contributed to the disease’s spread.  Jews were spared mostly because they often washed their hands before preparing food and engaged in other such ghastly habits as bathing more than once a year.

Just recently, a measles outbreak in Fort Worth, Texas, was traced to an evangelical Christian church whose members refused vaccinations of any kind.  When some in the congregation returned from overseas proselytizing trips infected with the highly contagious disease, church leaders prescribed prayer instead of medicine.  Now, 21 people in two counties have been diagnosed with measles.

I realize it’s difficult to alter religious ardor.  People tell me Jews and Muslims should be allowed to circumcise their sons because they’ve been doing it for centuries.  Well, for centuries, slavery was considered perfectly acceptable.  Blatant racism was a fact of American life from its beginning; something that changed only in recent decades.  That, in and of itself, ties into the enslavement of the first African-Americans; the Europeans who enslaved them believed slavery was mandated by the Bible.  In the 19th century, White Americans concocted the philosophy of “Manifest Destiny” to forge westward across North America, which obligated them to destroy any darkness and savagery they encountered; meaning, of course, God commanded them to kill any heathenous Indians who got in their way.

As a former Roman Catholic devotee – an altar boy at that! – I once believed in the concept of “original sin” and the story of creationism.  Then, I saw the light and divorced myself from such ludicrous ideology – a sacrilege unto itself in the Church.  The Church’s disrespectful treatment of women was the real catalyst for my departure from its ranks of the blind faithful.  Roman Catholicism – like all branches of Christianity – has always taught that women are second-class citizens; another by-product of Eve’s wickedness.  Even now, the Church forbids birth control; believing everyone should procreate whether they like it or not.  The Church naturally doesn’t feel obligated to provide financing for those procreative results.

When human rights clash with religious freedom, religion needs to take a back seat – always and forever, no exceptions.  I don’t care about anyone’s religious affiliation – Jewish, Christian, Muslim, whatever – infants have more of a right to have their bodies left intact than their parents or their communities have to practice a certain philosophy.  If all of Judaism or Islam collapsed because parents couldn’t carve up their sons’ penises, that would be a good thing.  Religion has been a great oppressor throughout human history.  Judaism, Christianity and Islam, in particular, have been the worst offenders; more people have been maimed and murdered because of those three religions than any other human construct.  It’s still happening even now.

And again, with 11 newborn babies infected with herpes, I ask – where was the outrage?

Attorneys for the Rights of the Child

International Coalition for Genital Integrity

MGMbill.org

Jews Against Circumcision

Mothers Against Circumcision

Nurses for the Rights of the Child

8 Comments

Filed under Essays

Sinkers


The above photo is from fellow blogger Penny Howe who sat on a bench overlooking the Columbia River, near her home, during this past spring’s winter snow melt.  I shared it with several friends who expressed concern for Penny’s mental health.  I assured them she’s a writer like me, so they immediately understood.

But, the picture made me think of the real threat soil erosion poses to major urban areas located near large bodies of water.  It’s a genuine concern with climate change and rising sea levels.  Half of the world’s population – roughly 3.5 billion people – lives in urban areas; a sharp rise from 13% in 1900.  At the start of the 20th century, only 12 cities across the globe had populations of 1 million or more; now there are 336.  More alarming has been the rise of “mega-cities,” urban areas with populations of at least 10 million.  In 1950, New York was the only city in the world with that distinction; now, there are a total of 17 such metropolitan areas.  Those people have to live and work somewhere, and that has increasingly come to mean larger edifices – gargantuan structures of concrete, steel and glass.  All of those people and buildings add up to millions of tons of weight, which – along with food and water consumption – has an impact on the overall environment.

People will probably be debating the pros and cons of global warming until…well, until they drown.  But, here in alphabetical order, is an informal list of some of the world’s fastest sinking cities.

Amsterdam – The Dutch capital is also the Netherlands’ largest city, with about 820,654 people crammed into 84.56 square miles (219 km²); the greater metropolitan area has over 2.3 million residents.  More importantly, Amsterdam is under constant threat from the water that surrounds it on 3 sides.  In February of 1953, a series of calamitous floods from the North Sea killed over 1,800 people in the Netherlands alone and prompted Dutch engineers to rethink defenses for all of the nation’s cities.  A large series of dikes and canals mostly keeps the waters under control, but Amsterdam – built on sand and clay – is still sinking at roughly 0.078 inches (2 mm) per year.

Winter floods in 1953 forced the Dutch to re-think their urban defenses.

Bangkok – The capital of Thailand boasts a population of some 8.281 million people, crowded into 606 square miles (1,569 km²), with over 14 million living in the general metropolitan area.  Located on the Chao Phraya River delta, Bangkok has experienced a major economic boom in recent years.  Like Amsterdam, Bangkok has relied on intricate waterways to navigate the city for centuries.  But, constructed on soft marine material known as Bangkok clay, the growing metropolis is sinking some 4.7 inches (120 mm) annually.  Some engineers have warned about the dilemma for decades; it stems mainly, of course, from soil erosion and groundwater removal.  Only recently, however, has Thailand undertaken measures to protect Bangkok by building dikes and retrofitting flood gates.  But, for a city considered a “climate change hot spot,” that may not be enough.

Houston – The fourth largest city in the United States has some 3 million residents in its 627 square miles (1,625 km²) and practically sits right on the Gulf of México.  In June of 2001, Tropical Storm Allison devastated parts of the Texas Gulf Coast, but Houston experienced the worst flooding.  Allison dropped 6 – 10 inches (152 – 254 mm) of rain in less than 5 hours.  That made Houstonians realize how vulnerable they are to nature’s elements.  But, in 2010, University of Houston geologist Shuhab Khan announced that much of Houston (and overall Harris County) is sinking at approximately 2 inches per year.  Like so many other coastal cities, Houston continues to build and drain groundwater to accommodate the expansion.

Jakarta – Located on the northwest corner of the island of Java, Indonesia’s capital has nearly 11 million people residing in 285.8 square miles (740.3 km²) and over 28 million inhabitants in the greater area known as Jabodetabek.  About 40% of Jakarta’s land area sits at or below sea level.  A 2010 report by the Bandung Institute of Technology noted that Jakarta is sinking at a rate of 4 – 4.7 inches (10 – 12 cm) per year; most of it due to the usual culprits: groundwater extraction and rapid infrastructure development.  But, they act in concert with soil compaction and plate tectonics.  A massive 9.1 earthquake off the coast of nearby Sumatra in December 2004 proved that seismic activity makes the entire Indian Ocean region vulnerable.  Analyses done from 1974 to 2010 show that large portions of Jakarta sank anywhere from 9 – 27 inches (25 – 70 cm).  A massive seawall built to prevent the Java Sea from inundating the city is also sinking.  The Indonesian Forum for Environment has gone so far as to claim that Jakarta will sink completely beneath the Java Sea by 2030, if construction and groundwater extraction aren’t limited.

Flooding earlier this year almost paralyzed Jakarta.

London – As the capital of both England and the United Kingdom, London is unique in its dual role.  And, contrary to popular American mythology, not everyone in England lives here – even with 8.174 million residents in its 607 square miles (1,572 km²).  People have lived in the area for millennia, but the Roman Empire began building the former Londinium on the Thames River in the first century A.D., making it one of the oldest continuously-occupied cities in Europe.  In 2002, however, satellite photos showed that London had sunk about 2 cm between 1996 and 2001.  Recent observations have noted that the legendary “Big Ben” at Britain’s Palace of Westminster is tilting at a somewhat precarious angle and that the entire parliamentary structure is gradually sliding towards the Thames.  The growing subsidence may be due partly to development of the “Jubilee Line Extension” and the new “London Power Tunnels;” all constructed to meet the demands of a growing population.  But, much of London’s descent can be traced to Britain’s overall recovery from the last great Ice Age, when a massive ice sheet blanketed most of the island and depressed the entire land area downward.  With the retreat of the ice, Britain is showing signs of a colossal rebound: Scotland is actually rising, while Wales and eastern England are technically sinking.  Still, with a series of walls, dikes and the “Thames Barrier” – the world’s second-largest movable flood barrier – London hopes at least to delay any pending deluge.

México City – The Mexican capital is the largest city in the Western Hemisphere – in both population and land area – with some 8.851 million residents in 573 square miles (1,485 km²) and roughly 21.2 million people in the overall metropolitan area.  It’s the only city on this list not located by an ocean or a sea, but its continuing subsidence is very real.  That subsidence can be traced to the ancient Aztecs, who began building Tenochtitlan, the center of their vast empire, nearly 1,000 years ago on a marshy island amidst 5 lakes that formed the base of the Valley of México.  They dredged the shallow lakebed to create an extensive series of canals and causeways as the city grew.  Spanish explorers were awed by the sight of it upon their arrival in 1519; at the time, Tenochtitlan had about 200,000 residents, larger than any city in Europe.  After gaining control of the region, the Spaniards merely continued the expansion.  Today, only a small portion of one of those bodies of water, Lake Texcoco, remains.  But, this giant metropolis, which was plunging at an astonishing 19 inches annually in the middle of the 20th century, is still sinking 2 inches per year into the soft lakebed sediments.  Many streets have sharp drop-offs from their sidewalks, while water and electricity lines are in constant danger of snapping or bursting.

An artist’s conception of what Tenochtitlan may have looked like when Spanish explorers arrived.

New Orleans – Like Amsterdam, New Orleans is surrounded by water on 3 sides: Lake Pontchartrain to the north and the Mississippi River to the west and south.  With about 343,800 people in 350.2 square miles (907 km²), it also has the dubious distinction of being the fastest-sinking city in the U.S. – roughly 1 inch (2.5 cm) per year.  In fact, after the devastation of Hurricane Katrina, scientists took a closer look at New Orleans’ geological state and realized it was sinking into the Gulf of México much faster than previously thought.  That may explain why the city’s complex levee system failed during Katrina and allowed some 80% of it to be flooded.  As of 2006, scientists estimate, some of those levees had sunk up to 3 feet in the previous 40 years.

New York City – Composed of 5 individual, self-governing boroughs, New York City has about 8.245 million people in a total land area of 468 square miles (1,213 km²); altogether, about 19.3 million people reside in the New York metropolitan area.  Last year’s Hurricane Sandy made residents all along the Northeastern seaboard realize the extent of their vulnerability to nature’s wrath; even hardened New Yorkers trembled.  Sandy flooded parts of 4 of the city’s boroughs, but scientists have noted for years that densely-packed Manhattan Island, in particular, is slowly sinking into the Atlantic.

Shanghai – Founded around A.D. 1291, the most-populous city in the world boasts some 23.47 million residents.  Located on the Yangtze River, Shanghai (which means “Above the Sea”) is sinking as much as 4 inches (101 mm) per year.  Groundwater extraction has added to the problem, but so has the city’s rapid infrastructure growth.  One report by the Shanghai Geological Research Institute claims that the physical weight of the city’s skyscrapers accounts for as much as 30% of Shanghai’s subsidence.  In response, city officials have begun pumping roughly 60,000 tons of water per year back into wells; built hundreds of levees along the Yangtze; and are planning an emergency floodgate on the river’s estuary.

Venice – One of the oldest and most ornate cities in the world, Venice long ago ceded its fate to the sea; its residents are just partying in advance of the grand finale.  Some 264,000 people live in its 160.1 square miles (414.6 km²), which scientists believe is dropping between .04 and .08 inches (1 – 2 mm) annually – more than previously thought.  Built on a lagoon off the Adriatic Sea, Venice is actually a cluster of islands that has always had a classic love / hate relationship with the water.  Moreover, it’s tilting to the south at .12 – .16 inches (3 – 4 mm) per year.  Over just the past 300 years, scientists believe, Venice has dropped 2 feet (60 cm).  That may not seem like much, but Venetians are experiencing more floods – about 4 or 5 times a year.  Canals and bridges are part of the city’s landscape, along with floodgates designed to close when high tides reach a level of 43.30 inches (110 cm).  Street lamps linked to flood gauges automatically shine brighter as the water begins to rise; thus warning pedestrians to seek higher ground – or at least jump into a boat.
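The annual rates quoted above can be put side by side with a little arithmetic.  Below is a minimal sketch of that math; the rates are rough figures drawn from the entries above (converted to millimeters, taking midpoints of ranges), and the assumption that subsidence stays linear over decades is a simplification – in reality rates shift with groundwater policy and construction.

```python
# Back-of-the-envelope projection of cumulative subsidence,
# using the approximate annual sinking rates quoted above.
# Rates are in millimeters per year; ranges use midpoints.

RATES_MM_PER_YEAR = {
    "Amsterdam": 2,
    "Bangkok": 120,
    "Houston": 51,      # ~2 inches
    "Jakarta": 110,     # midpoint of 10 - 12 cm
    "Mexico City": 51,  # ~2 inches
    "New Orleans": 25,  # ~1 inch
    "Shanghai": 101,    # ~4 inches
    "Venice": 1.5,      # midpoint of 1 - 2 mm
}

def projected_drop_cm(city: str, years: int) -> float:
    """Total projected subsidence in centimeters over `years`,
    assuming the current annual rate holds constant."""
    return RATES_MM_PER_YEAR[city] * years / 10.0

for city in sorted(RATES_MM_PER_YEAR):
    print(f"{city}: {projected_drop_cm(city, 20):.0f} cm over 20 years")
```

Even this crude projection shows why Bangkok and Jakarta draw so much more alarm than Amsterdam or Venice: at current rates they would lose over 2 meters of elevation in two decades, versus a few centimeters for the European cities.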

While sunken cities feel like the products of wild imaginations, recent advances in submarine archeology have proven the existence of submerged metropolises across the globe.  Take Helike, for example, an ancient Greek port city that once thrived on the southwestern shores of the Gulf of Corinth.  For centuries, its existence and demise were dismissed as purely mythical.  But, in 2001, scientists found remnants of Helike buried further inland and have since confirmed that a massive earthquake and tsunami devastated the region in 373 B.C.; subsequently leading to the city’s destruction.  It’s possible the catastrophe spawned the legend of Atlantis.  Karen Mutton’s “Sunken Realms” provides an extensive and fascinating list of many other submerged cities, along with theories of what may have happened to them.

Nothing lasts forever – certainly nothing made by humans.  Quakes and tsunamis may have once posed the greatest threat to archaic urban areas.  But, in our infinite arrogance and bloated self-assurance, we modern people don’t realize how little control we often have over our own fates.

Proven true – Helike really did exist.

Filed under Essays

Vote Like It Counts

One of the many elements that came out of the 1963 “March on Washington for Jobs and Freedom” was a loud call for the United States to honor its commitment to voting rights.  People here often don’t think much about it, but voting is a critical factor in any democracy.  If you look at what’s happening in Syria right now, I’m certain a number of that country’s citizens wish they had the luxury of simply voting Bashar al-Assad out of office – or impeaching him.

A positive effect of the March on Washington was the 1965 Voting Rights Act, which guaranteed that the U.S. would uphold the right of every eligible citizen to cast one vote for the candidate of their choice.  It struck down poll taxes and literacy tests; measures often used, particularly in the Southeast, against non-Whites and poor people.  So why don’t people take this right seriously?

I’m especially concerned after a report showing my beloved home state of Texas ranks 51st, after the District of Columbia, in voter turnout.  On average, declares the Texas Civic Health Index, only about a third of eligible voters in the nation’s second-most populous state make a concerted effort to vote.  I think that explains why Texas looks to be a blood-red bastion of far-right lunatics.  It’s why Rick Perry has been able to hold onto the governorship like the Pope and why Ted Cruz easily won a Senate seat last year, despite his extremist views.

The state’s Democratic Party hopes to turn its political establishment a striking royal blue.  I personally don’t want to see Texas metamorphose into another California or Illinois where extreme taxation and heavy regulations drive away businesses.  But, I definitely don’t want it to remain mired in crimson red.  A nice fuchsia would be more palatable, but I’m not a color maven.

The study noted – not surprisingly – that people with higher levels of education are more likely to vote.  Thus, it recommended improving civic literacy through education, starting at the grade school level.  But, recent cuts by the Texas legislature in education funding may make that challenging.  Conservative state officials moved Heaven and Earth to ban abortion, but show little concern for those children once they reach school.  Hence, the need for voting.

It’s actually an embarrassment.  I’ve made a concerted effort to vote in every major state and national election since 1992.  Obviously, I haven’t always seen the results I’d like – but, at least I tried to make a difference.

Low voter turnout appears to be a national trend.  Last year, only some 57.5% of eligible voters made it to the polls; lower than in the 2 previous elections, but surpassing the dismal rate of 54.2% set in 2000.  Critics at the time liked to point out that more people voted for “American Idol” contestants than in the 2000 presidential election.  When you realize that, in 2012, Mexican voters turned out at a rate of 62.45% – despite the omnipresent threats of violence and endemic corruption – it certainly speaks poorly of Americans.

Voting is like budgeting: you just can’t let things go and hope for the best.  It requires work and patience.  It’s what any civilized society – not just the United States – is all about.  It’s the foundation of democracy.  It really does count.


One Quiet Voice

The story is disturbingly familiar: a White male with anger and/or mental health issues storms into a crowded venue with a bevy of firearms, intent on doing unmitigated damage.  It occurred twice last year: in Aurora, Colorado, and Newtown, Connecticut.  In this uniquely American phenomenon – a relentless nightmare – another such drama unfolded at a Georgia elementary school on Tuesday, the 20th.  Michael Brandon Hill, a 20-year-old, entered the school with a cache of weapons – and was stopped with an ‘I love you’ from an unimposing office clerk.

As school administrators and teachers frantically ushered the young students out of the building and police descended upon the area, Antoinette Tuff dialed 911 and began talking calmly to the troubled young man.  Her reassuring voice has been playing out on the national media these past couple of days; leaving people amazed and thankful that she managed to defuse a hostile situation with mere words.  This is not the ending people have grown accustomed to seeing.  All of the other hallmarks were present: people running for their lives; scores of police officials in riot gear; and media hawks jockeying for the best camera position.  Antoinette Tuff provided a surprising, yet pleasantly different conclusion.  No one expected that.  Even veteran hostage negotiators are expressing awe.

I have to admit I was surprised as well.  But, only for a moment.  As a life-long pacifist who suffers bouts of anxiety from not trying to hurt people who piss me off, I know that words can soothe the angst of almost any situation.  It’s a sign of intellectual prowess and emotional maturity when people make an attempt to be quiet and interact on a verbal level.  Dialogue solves more problems than a hail of bullets.

After last year’s massacre in Newtown, the ubiquitous National Rifle Association was compelled to speak publicly about the issue of guns and America’s brutal gun culture.  “The only thing that stops a bad guy with a gun,” Wayne LaPierre, the group’s executive vice president, proclaimed, “is a good guy with a gun.”

Listening to Antoinette Tuff tell Michael Hill that she identified with his emotional distress and insist that he’s worth something, I feel almost vindicated.  It’s better to talk than to fight.  It’s better to discuss matters and find common ground than to inflict bodily harm and revel in the bloody aftermath.  In the end, over 800 children went home and returned to school the next day.  Police took Michael Hill into custody and spirited him away for psychological evaluation.  Now, for the first time that I can recall, a would-be mass murderer was stopped without a shot being fired.  Hopefully, doctors can learn what happened inside Hill’s mind; what traumatized him so badly that he went to that school with so many weapons.  And, we won’t have to rely upon Facebook rants or indecipherable drawings to ferret out the truth and try to make sense of the insensible.

Here’s something that’s not surprising – Antoinette Tuff doesn’t consider herself a heroic figure.  She merely views herself as an unimposing school district employee who became enmeshed in a frightening situation and utilized both her spiritual faith and her unconditional love to thwart a tragedy.  She didn’t need a gun and she didn’t need a bomb; she just needed some gentle words.


No Tears

Last month actor Cory Monteith died of a drug overdose in a hotel room in Vancouver, British Columbia.  He was 31.  Monteith, a star of the popular musical TV series “Glee,” apparently had struggled with drug addiction for some time.  I had never heard of him until his death; due mainly to the fact I’ve never watched “Glee.”  Something about cheery high school kids breaking out into song in the midst of their teenage angst is just too saccharine for me.  But, while I didn’t know Monteith even existed until after he died, I’ve heard of his sad dilemma too many times.  His circumstances are all too common: celebrity – drug addiction – rehab – dead in a hotel room.  Think Janis Joplin; think Whitney Houston.  Drug addiction and celebrity-hood are almost symbiotic.  It’s truly heartbreaking when someone becomes hooked on drugs or alcohol to the point that it rules and ultimately destroys their lives.  But, despite the tragedy, I simply can’t bring myself to cry for them.  I have the same reaction to someone who smokes for 40 years and comes down with lung cancer, or who fucks almost everybody they meet and contracts HIV.  Yes, it’s sad, but what did you think would happen?

I also find hypocrisy in the mix.  Trayvon Martin, for example, only had a trace of THC in his system when he was killed by an overzealous neighborhood watchman last year, but he was branded a thug.  Monteith had been in and out of drug treatment for most of his young life, but he’s merely considered troubled.  The gloss of celebrity seems to upgrade one’s station in life, and thuggish behavior transmutes into personal issues.

Drug addiction costs the U.S. roughly $160 billion annually; second only to alcohol abuse, which costs us about $185 billion every year.  Those are just hard dollar figures related to various tangible things like hospitalizations and property damage.  There’s no way to put a price on the emotional toll substance abuse takes on people.  There’s no real means to assess the heartbreak parents feel as they look at their dead child in a coffin, or the fear residents of a neighborhood racked by drug violence experience every night.

I don’t feel too sorry for people like Monteith because they pretty much bring the damage upon themselves.  They’re essentially responsible for the incessant carnage along the U.S. – México border.  Since 2006, when then-Mexican president Felipe Calderón launched a massive crackdown on drug trafficking, some 40,000 people have been killed.  Thousands more have disappeared.  And, not all of them are tied to the drug cartels.  Not every victim is a drug mule, or a hit man for a powerful drug lord.  Many of them are innocent people caught in the crossfire of spontaneously brutal narcotics battles.  Other victims are people who dared to refuse to bow to the cartels’ extortion tactics.  The U.S. has supplied the funding, which only makes sense, since the problem lies here.  Mexican officials like to point out that, for every Mexican who uses illegal drugs, there are up to 10 Americans who do.  The other half of the problem, of course, is the gross incompetence and glaring corruption of the Mexican political system, as well as the governing bodies of other Latin American countries.  But, if people didn’t have an insatiable appetite for narcotics, the border region wouldn’t be in the vise grip of bloodshed.

Drug laws in the United States have always had a racial component.  The first – anti-opium laws passed in the 1870s – were aimed at Chinese immigrants.  The first cocaine laws, passed in the early 1900s, were designed to prevent Black men from raping White women, even though White women at the time were much more likely to use cocaine.  It’s hard to imagine now, but cocaine was once perfectly legal.  It was a common substance in many cold medicines.  And – in case you didn’t know – it was an original ingredient in Coca-Cola.  Contemporary narcotics laws – most stemming from Richard Nixon’s self-proclaimed “War on Drugs” – have put more people in jail in the past four decades than at any time in U.S. history.

But, think how Cory Monteith obtained his drugs.  He had to go out and get it; he had to know where to get it.  Or, he had to pay someone to go out and get it.  Or, know someone who could bring it to him.  The stuff didn’t just magically appear in his hands.  No one accidentally dropped it into his luggage – a ruse some celebrities have tried before.

I can’t relate to the anguish of drug addiction, but I understand alcoholism.  I had known for a long time I had a problem.  But, it all came into focus for me back in the mid-1990s, when a young man named Byron* arrived to work in the same bank as I did.  Not much taller than me, Byron was affable and intelligent; his wire-rimmed glasses making him look especially distinguished.  And, he walked with a pronounced limp – one result of a catastrophic drunk driving wreck a few years earlier.  He was returning home from his job as a waiter around 1:00 one weekday – a college student trying to balance school and work – when he noticed the car ahead of him suddenly veer off to the right.  Then, he saw a pair of headlights bearing down on him.  That’s the last thing he recalled before waking up in the hospital some two weeks later.  In a strange twist to the usual drunk-driving tragedies, he had survived, and the intoxicated driver had died.  But, Byron wasn’t much better.  His body was damaged as badly as his sense of security.  He spent months in recovery, which included a partial hip replacement and a prosthetic lower leg.  But, aside from being alive, he found something good amidst the tragedy: that’s how he met his wife; she was a nurse in the hospital.

Hearing his story made me reflect on one weekend night in 1988.  I attended a party at a coworker’s place where I consumed plenty of wine and even smoked some marijuana.  I have to concede marijuana never did anything to me, except dry out my throat.  But, as I headed home, I spotted a set of headlights far off in the distance.  They were coming right at me.  I managed to steer right and return to the proper side of the road.  But, that fleeting second scared me enough to stay sober – for a while.  I can’t remember the number of times I’ve driven intoxicated.  Occasionally, I was smart enough to lie down on the front seat of my vehicle for a while; other times, I pulled off the road; on some nights, I was fortunate to have a friend drive.  I ruined entire weekends because I let Friday happy hours get out of control.  A few times I had to take a day off work because I’d imbibed too much on a week night.  I recall one Friday several years ago when a long happy hour inexplicably metamorphosed into suicidal mania.  I arrived home suddenly feeling lethargic and viciously depressed.  I don’t know what came over me or why, but I managed to calm myself down after a while.  I haven’t had any such events in years.  I’ve long since learned to control myself.  Some people never get that proverbial grip on themselves.  And, the outcomes are filled with sadness.

America’s drug policy obviously hasn’t worked out as well as its designers intended.  We saw what happened with alcohol prohibition early in the last century.  People still consumed it, and its banishment led to a long series of crime waves.  Once prohibition was repealed, alcohol was regulated and taxed.  That didn’t exactly solve the problem of alcoholism.  But, anti-drunk driving campaigns that began in the 1980s raised awareness of that particular crisis, and people take alcoholism much more seriously now.  Personally, I think the U.S. at least could legalize marijuana.  But, legalization of any narcotic is a much more complex matter.

If we could somehow track that one last drug hit Cory Monteith consumed, I doubt he’d turn out to be the only casualty.  God only knows how many people died just so he could get a fix.  Yes, it’s tragic.  It’s never a good thing when someone that young dies, much less under those circumstances.  But, my heart doesn’t ache too much for people like him.  I just can’t bring myself to shed too many tears.

*Name changed.

Image courtesy Pomegranates & Pearls.
