Category Archives: Essays

What’s It Like?


My father sat on a riding lawnmower; an interesting thing considering we’d never had one. But, as he traveled across a vast field of bright green grass, he came upon some people standing beneath a large tree; an oak, he thought. Getting closer to them, he realized they were some relatives: his parents, his oldest sister, his older brother and another sister. They all had one thing in common – they were deceased. He could see his parents clearly, especially his father who died in 1969. He could make out the face of his older sister. His brother looked to be in the shadows, and the other sister was cloaked in a black veil. But he knew it was them.

“Do you want to come with us?” his mother asked him.

My father turned to the expanse of grass and shook his head. “I can’t,” he told her. “I have to finish mowing.”

Then, he woke up. It was late 2004, and he abruptly snapped out of a depressive funk. He’d lost his second-oldest sister and his older brother within a five-month period that year. Our family was still reeling from that. But, when he recounted the episode to me, he wondered aloud if his time was coming sooner than expected.

“No,” I told him. “They were testing you. They wanted to see if you were ready to give up. But you obviously have a lot of things left to do in this world.” He’d always liked gardening, I reminded him; a trait he’d gotten from his mother. The large field of grass was just a metaphor for life.

What’s it like, I wonder, to be dead? How do people navigate the afterlife? I’ve always been fascinated with that; what happens to people when they die. Unlike some people, I don’t pretend to know exactly what will happen to me once I expire. But, unlike others, I don’t believe this is it; that our life here on Earth is all we get. I’m not so arrogant as to express a firm knowledge of such things. I just have my own beliefs.

When my father’s oldest sister died in February of 1998, we had a simple ceremony in a chapel at the cemetery and then watched her be interred in a place near her father. That’s how she wanted it: just throw her body into a box, drop her into the ground and go on with our lives. Nothing fancy; no drawn-out church mass; no miles-long funeral procession; and no rosary. When I told a close friend about it, he expressed shock that we didn’t have a rosary; a pre-funeral Catholic affair akin to a Protestant wake.

“I hate to tell you this,” he said matter-of-factly, almost ominously, “but your aunt’s chances of getting into Heaven are slim.”

If we’d been sitting face-to-face, I would have smacked him. “Who the fuck do you think you are?!” I screamed into the phone. I unleashed a slew of other invectives, before slamming down the receiver.

This came from a guy who was raised devoutly Catholic, like me, but who – at some point in his early 20s – detoured into voodoo. He later renounced voodoo and returned to his Christian roots. Yet, his self-righteous proclamation about my aunt’s spiritual survival was more than an insult; it was an abomination.

Years ago, when I attended a Catholic parochial school, an ancient nun (I knew of no other kind) abruptly informed me and some other students that animals have no souls. They just die, she said, and that was it. I was horrified. Did that mean I would never see my beloved dog, a German shepherd named Joshua? I cried deep inside. How could that be? Why would God be so cruel as to deny us animal lovers the company of our pets in the afterlife?

Fortunately, I’ve long since recovered from the perversions of Roman Catholicism and Christianity in general. It’s one reason why I divorced myself from that mess – a sin unto itself. Religion makes people say and do stupid shit.

Theology or not, I’ve never really been afraid of the unknown. I’m not a Goth-like critter who looks for ingenious new ways to kill himself – well, not anymore. My fascination with death started when I was young; perhaps, because I really did think of killing myself. The relentless bullying I experienced in school and the loneliness of being an only child made me contemplate suicide when I should have been thinking about sports or games.

Now, as an adult, I still think about death, but not so much dying. I consider it the afterlife, or more appropriately, the after-this-life. It’s another level the human soul attains; a world superior to this one. I’m not eager to get there! I’m just curious about it. I tell people I have so many books I hope I get to read them all before I die. But then, maybe my after-this-life activities will include reading. And playing with dogs!

6 Comments

Filed under Essays

The Chief at 51


Wow! I’m 51 today. That’s more than half a century. So, what? I feel pretty good. I’m certainly glad to make it to this age. The alternative isn’t pleasant. I think of the few people I know who died well before 51 and I certainly can’t be thankful enough that I’ve lived this long. Each day I wake up gives me another chance to make my life better.

I do have a few simple wishes:

  • That my parents’ health improves long enough for me to get their life stories on video. They’re not celebrities, but they’ve led some interesting lives and have some great tales to share.
  • That my dog lives a few more years. He’s 12 now, which apparently puts him in the same age bracket as my parents. He’s only the second dog I’ve ever owned, but he’s made me realize what’s important in life.
  • To get my novel published within the next few months. I’ve worked on this thing longer than most DVD players have been around, so it’s way past time to get it into print. Being a professional, published writer is all I’ve ever wanted to do with my life anyway. I know it’s a tough business, but I can imagine no better profession for me.
  • To see my freelance writing career take off. Business or technical writing is the second greatest passion I have – somewhere after lifting weights and sleeping nude.
  • To find a box with $1 million in cash somewhere on the side of the road.

Okay, maybe getting my novel published now is a bit of a stretch. But who says we can’t dream extravagantly?

I don’t know why I’ve made it to this age, nor do I know why I’ve gone through all the crap I’ve experienced. I’ll find out one day. But it’s brought me here. And my life isn’t done yet. I don’t know how much longer I have, but I want to make up for all the lost years of being terrified of the future. Here’s to more time on Earth with the people I love and care for the most!


Monumental


Tomorrow evening, October 27, the Dallas Cowboys will play the Washington Redskins at AT&T Stadium in Arlington, Texas. Currently, the Cowboys are on a winning streak, and hopes for a successful season look brighter than a cure for Ebola. But, amidst the usual revelry of a brutal contact sport, the issue of naming has arisen once again – the Redskins’ name.

Yolanda Blue Horse, a Dallas resident and member of the Lakota Nation, has scheduled a formal protest outside the stadium for 3 p.m. on Monday.

“When we all stand together as one, we also honor those before us and those to come after us,” Blue Horse declared. “The continued use of this negative word is not only derogatory, but it is offensive and we demand that the owner, Dan Snyder, stop using this racist word to promote his football organization.”

For years, Native Americans have been demanding that Washington change its team name; a racial slur as bad as nigger, spick, chink, or elected official. And, for years, Washington has balked at the suggestion. But, in recent years, I’ve noticed something different: people are starting to pay more attention to the issue. Moreover, the Federal Communications Commission (FCC) has taken greater interest in the subject. For the first time in memory, it has actually contemplated banning radio and TV stations from using the term ‘redskin’ while broadcasting.

“We will be dealing with that issue on the merits, and we’ll be responding accordingly,” said FCC Chairman Tom Wheeler. Wheeler admits he’s a critic of Washington’s name, calling it “offensive and derogatory” in a recent interview. He refers to the club as “the Washington football team” instead.

Banning the term ‘redskin’ would effectively prevent radio and TV outlets from using it on the air. If they did, in other words, they could lose their licenses. That would mean any TV network or radio station broadcasting a game featuring Washington couldn’t openly refer to the team as the Washington Redskins. The announcers couldn’t utter the name, and it couldn’t be displayed even in written form. Therefore, it’s possible a network wouldn’t take the chance and would decide not to televise the game. That could result in millions of dollars in lost revenue for the network and its sponsors. If Washington should make it to the Super Bowl, that could create a financial calamity. Earlier this year the U.S. Patent and Trademark Office went so far as to cancel the team’s trademark; denouncing it as disparaging to Native Americans. That’s the closest anyone has come to banning ‘redskin’ from public usage at the national level.

This past spring, 50 members of the U.S. Senate sent letters to National Football League Commissioner Roger Goodell prodding him and the league to endorse a name change for Washington.

“The NFL can no longer ignore this and perpetuate the use of this name as anything but what it is: a racial slur,” said one letter, signed by 49 senators. “We urge the NFL to formally support and push for a name change for the Washington football team.”

Not surprisingly, owner Dan Snyder has refused calls to change the team’s name, proclaiming it a noble moniker, not a slur. In a recent interview with ESPN, he once again insisted he won’t bow to public pressure. “It’s just historical truths,” he said, “and I’d like them to understand, as I think most do, that the name really means honor, respect.”

Snyder highlighted both William Henry “Lone Star” Dietz, Washington’s first coach and for whom the team was named to honor his “Native American heritage,” and Walter “Blackie” Wetzel, the late former president of the National Congress of American Indians and chairman of the Blackfeet Nation, who helped design and approve the team’s logo, as true-life examples of the positive history of the nickname.

I wrote an essay on this issue a couple of years ago; wondering aloud if anyone would tolerate sports team names such as the Washington Niggers or the Houston Hebes. The word ‘redskin’ has a muddled history. Many claim it was a reference created by early European explorers and/or colonists who took note of the often-ruddy complexion some Indigenous Americans have. Others declare it was a reference to the reddish body paint some native peoples adorned themselves with as they prepared for battle or engaged in religious ceremonies. Whatever its origins, redskin is still a vulgar and racist term.

Quite frankly, though, some people of Indian extraction aren’t offended by it; seeing it strictly as a name only, with no racist overtones. In the ever-mutating world of American English, however, plenty of folks view attempts to ban ‘redskin’ and force Washington to change its name as another chapter in the ‘Book of Political Correctness.’

In an editorial last year, “Washington Post” columnist Charles Krauthammer lamented, “I don’t like being lectured by sportscasters about ethnic sensitivity. Or advised by the president of the United States about changing team names. Or blackmailed by tribal leaders playing the race card.”

The “Conservative Tribune” deemed calls for Washington to change its name “absurd,” adding, “If anything, the team is showing respect to native Americans by actually naming themselves after them.” The same site also recently published a photo of a “conservative’s reaction” to the ‘Hands Up, Don’t Shoot’ mantra over the Michael Brown shooting.

Erick Erickson, editor of RedState, blamed President Obama for the USPTO’s decision. “The lesson here is that guilty feeling white liberals are a threat to freedom and, in Barack Obama’s America, the key to survival is to not appear on the radar in Washington, D.C.,” Erickson wrote. He further implicated “a bunch of overeducated white guys who cry during ‘Love Actually’” and “a class of men who pee sitting down.”

Rush Limbaugh noted the Patent and Trademark Office is part of the Obama Administration, which, in turn, is the source of all this “tyranny.”

Right-wing blogger Matt Barber sees an unsettling trend looming on the horizon with the USPTO’s decision. “Whether or not you believe the Redskins should change its team name, you should be concerned by this troubling development,” he wrote. “It’s a harbinger of things to come. The American free market and private enterprise are no longer free nor private. Liberty is under threat as never before. Here’s to the good ol’ U.S.A.! We’ve officially become an Obamanation.”

Comments to a “Dallas Morning News” piece about the matter last week displayed an inordinate amount of vitriol. One man complained that he felt like suing the Cracker Barrel restaurant chain for its obviously racist name. Yes, I’m sure millions of Caucasian-Americans get sick to their stomachs when they see the Cracker Barrel sign; that’s why so many of them keep patronizing those stores!

Okay, I get it! A bunch of middle-aged White conservatives are pissed off that someone dares to challenge their view of American society. It’s the same reaction many had to school desegregation and the 1964 Civil Rights Act. They’re the ones who believe the United Nations still has covert operatives hovering along the U.S.-Canadian border, just waiting for the right moment to launch an assault and force gay marriage and mandatory abortions on God-fearing Americans.

No, you idiots, this isn’t political correctness. Political correctness is saying that all Indian people are great and wonderful, even if they’re drunk-ass bastards who engage in criminal behavior. Political correctness is telling men they must always respect women, no matter what stupid or awful things they do or say. Political correctness is U.S. foreign policy towards Israel.

Since Snyder is Jewish, he could easily change it to Washington Kikers, but then, political correctness would really get turned upside down. But, I believe the Washington Monuments would be appropriate. Washington, D.C., is home to some of the nation’s premier monuments to its heritage. Besides, a monument – as in the Washington Monument – is a long, thick column of granite, sticking straight up to the sky. I think it’s appropriate, considering football is the last bastion of male athleticism in the U.S.; a tribute to excess testosterone and men’s aggression.

Despite the right-wing rancor, this issue isn’t going away. And it’s never been a matter of political correctness; it’s simply a matter of respect.


Viral Vitriol


By the time the President of the United States made a public statement about the epidemic, several people had died and an untold number were already infected. But, when he stepped to the podium to address the media, his words were not what some in the audience had hoped to hear. His brief speech wasn’t about funding or education directed towards stemming the scourge and ultimately finding a cure; it was about policy. A cacophony of jeers slammed into his geriatric face, and he merely lifted an eyebrow, as if to say, ‘Well, that’s all I need to say about it.’ Indeed, that’s all anyone should have expected Ronald Reagan to say about AIDS.

On June 5, 1981, the “Morbidity and Mortality Weekly Report,” a publication from the Centers for Disease Control and Prevention, presented data about the peculiar cases of five young men, “all active homosexuals,” who had developed Pneumocystis carinii pneumonia (PCP) at three different hospitals in Los Angeles. Two of them were dead by the time the report came out. PCP is a very rare form of pneumonia, occurring only in people with depressed immune systems. That seemingly healthy young men in large urban areas around the country were coming down with it seemed to contradict medical scripture about the ailment. Because the patients were all “active homosexuals,” the CDC labeled the new disease “Gay-Related Immune Deficiency” (GRID). Within months, however, the CDC realized that “active homosexuals” weren’t the only victims. Intravenous drug users were also coming down with the mysterious new disease; then prostitutes, but also other people who didn’t fit into any of those groups. The agency quickly renamed it Acquired Immune Deficiency Syndrome (AIDS). But the damage had already been done by those two words: “active homosexuals.”

When Reagan addressed the press on September 17, 1985, he mentioned AIDS only to declare a travel ban for all HIV-positive and AIDS-afflicted people. By then, scientists had identified the AIDS virus, and the U.S. Food and Drug Administration (FDA) had approved usage of the first test to detect it, the ELISA test. Scientists had already confirmed one critical fact about the new scourge: it was a blood-borne pathogen; infectious, but not contagious. Still, panic had set into the nation. Gay men were being targeted with more violence than ever before in the nation’s history. Even as the gay-rights movement gained momentum in the 1970s, gay men hadn’t faced the sort of vitriolic backlash that they did with the rise of AIDS.

In 1983, Pat Buchanan, a former speech writer for President Richard Nixon, published a column about the AIDS epidemic, in which he claimed, “The poor homosexuals – they have declared war on nature, and now nature is exacting an awful retribution.”

In 1986, perennial presidential candidate Lyndon LaRouche proposed legal discrimination against people with HIV and AIDS as a matter of public interest. He wanted federal and state governments to protect people from AIDS the same way they protect the citizenry against other diseases: by quarantining the infected in concentration camp-like structures.

Reagan’s lack of concern for the burgeoning epidemic has always been a sore point for human rights activists. The former actor, however, repeatedly extolled the virtues of personal responsibility, even with health matters, and bemoaned government involvement. During his 1966 run for governor of California, Reagan denounced President Lyndon B. Johnson’s Medicare program as “socialized medicine.”

But the U.S. government has responded quickly to previous health scares. When several people attending the annual American Legion convention in Philadelphia in July of 1976 came down with a vicious flu-like ailment, health care workers jumped into action and identified the source within months: a waterborne bacterium later named Legionella.

That same year U.S. health officials warned the public about a pending influenza epidemic, swine flu, and urged people to get vaccinated as soon as possible. Panic set into the American psyche and several individuals rushed to their doctors. The resulting hysteria now stands as one of the worst debacles in U.S. healthcare history.

When 7 Chicago-area residents died from ingesting cyanide embedded in Tylenol capsules in the fall of 1982, the federal government jumped into action to help Tylenol maker Johnson & Johnson manage the crisis. The company pulled every single one of its products off store shelves, resulting in a multi-million dollar loss, and then reintroduced them with tamper-resistant packaging. It’s difficult for younger folks to imagine now, but there was a time when you could open a bottle of something and not have to peel away a layer of plastic or foil. The crime spawned only one known copycat incident – in Auburn, Washington in 1986 – but it remains unsolved.

For those of us who recall the hysteria over the AIDS epidemic in the 1980s, the current reaction to the Ebola fiasco is painfully similar. Like HIV, Ebola is a blood-borne virus; spread only by close contact with the body fluids of an infected person. Both originated in Africa. HIV has been traced to African primates, in which it started out as simian immunodeficiency virus, or SIV. How or when it metamorphosed is still being investigated, but researchers believe it made its first appearance in humans in the 1920s in Kinshasa (then Léopoldville), in what is now the Democratic Republic of the Congo. Scientists still don’t know the host source of Ebola, but they believe it comes from fruit bats. That’s pretty much where the direct comparisons end. Ebola is far deadlier; it induces a severe hemorrhagic fever, in which the internal organs not only collapse, but literally begin to disintegrate. Once an infected patient reaches the stage where they’re bleeding incessantly, it’s too late to save them. There are now drugs that can slow the advance of HIV and even full-blown AIDS. But there isn’t even a vaccine for Ebola. Agents like ZMapp haven’t gone beyond the experimental stage yet. Now some have the audacity to wonder why there isn’t enough of it.

It’s ironic that the world learned of Ebola before it learned of HIV and AIDS; yet more people have died from the latter. That the developed world never contemplated (outside of scientific circles) that Ebola could spread beyond remote Central African villages signals a certain degree of naiveté, if not stupidity. In this increasingly interconnected global economy, there’s no reason to suspect otherwise.

But, the attitude of ‘them-vs-us’ is what allowed the AIDS epidemic to get so out of hand. The “active homosexuals” comment – something the CDC regrets to this day – burned into the minds of socially conservative activists who saw the scourge merely from the viewpoint of a moral lens. Conservatives warned Reagan not to mention AIDS or HIV during his speech at the 1984 Republican National Convention in Dallas, lest he lose the party faithful. Those in control of the U.S. blood industry, such as the Red Cross, didn’t want to believe their products and patients were at risk from HIV; literally asking some hemophiliacs and organ transplant recipients if they wanted to be placed in the same group as “them” – meaning the gay male / drug user / prostitute gallery.

If the U.S. had taken AIDS seriously from the start, we might have developed protease inhibitors by the end of the 1980s, instead of a decade later. By now, we might even have a vaccine, if not a cure. (If you read my 2012 essay, “I Almost Hope They Don’t Find a Cure for AIDS,” you might understand my sense of trepidation about this particular matter.)

The perception of ‘it’s their problem’ has impacted countless issues of various types: economic, medical, political, religious and social. Some health officials saw the need to work towards a cure, or at least a treatment, for Ebola long ago. Dr. Kent Brantly, a U.S. medical missionary, contracted Ebola this past July while working with patients in Liberia. When he was brought to Atlanta’s Emory University, looking like an extreme beekeeper, he became the first person with the disease to set foot on American soil, or anywhere in the Western Hemisphere for that matter. Some people have wondered aloud why he would spend so much of his time and energy working with Ebola patients in Africa when we have people dying of obesity and drug addiction here in the U.S. That’s a fair question. Yet Brantly sees his purpose in life as more than just a dispenser of medicine and sage advice. His Christian outlook on life (and I don’t want to bring religion into this debate) prompted him to be concerned about everyone around him – not just his immediate circle of family and friends. More than a few people have used their religious ideology to narrow their view of ‘Others.’ I’ve worked with plenty of them. Just look at the AIDS epidemic. Even now, more than three decades after the epidemic was given a name, several individuals still look at AIDS from a moralistic perspective. They still don’t understand that morality really has no place in health and medicine.

Right-wing extremists have proposed simple solutions to the Ebola epidemic. Sen. Ted Cruz called for a complete ban of people traveling from Ebola-ravaged nations in West Africa. “Common sense dictates that we should impose a travel ban on commercial airline flights from nations afflicted by Ebola,” he said. “There’s no reason to allow ongoing commercial air traffic out of those countries.”

He’s just one of many who have made such idiotic proclamations. But Dr. Anthony Fauci, an early proponent of AIDS research and current head of the National Institute of Allergy and Infectious Diseases (NIAID), scoffed at the notion; dubbing it “counterproductive.” “[W]hen people come in from a country, it’s much easier to track them if you know where they’re coming from,” he noted. “But what you do, if you then completely ban travel, there’s the feasibility of going to other countries where we don’t have a travel ban and have people come in.”

There are no direct flights from the Ebola-stricken West African nations to the U.S. Thomas Eric Duncan, the Liberian man who developed Ebola shortly after arriving in Dallas last month and who died on October 8, had initially flown from Monrovia to Brussels before connecting onward to the United States.

Reductions in the CDC’s budget also may have played a part in the Ebola mess. As usual, conservative Republicans were quick to demand cuts in health care; rampaging through the CDC’s financial allotments like a drunk rabbi in a Catholic boys’ school. Even President Obama bought into the philosophy that this was a wise move, slashing $72 million from the CDC’s public health emergency preparedness program for fiscal year 2012. I’ve noticed social conservatives are never so eager to cut military spending or funding for more prisons.

I don’t know what’s next in the Ebola scourge. It shows no signs of abating in West Africa, and there’s a good chance more people are going to contract the virus outside of that region. I shudder at the thought of it reaching India or China. Politics and religion don’t have places in health and medical care. Whenever they’re factored into the mix, people get hurt and die. In this modern world, we can no longer afford it.


Where Are They Now?


A couple of weeks ago I watched the latest documentary series by Ken Burns, “The Roosevelts: An Intimate History.” It focuses on the three most famous members of this legendary family: Theodore, Franklin and Eleanor. They are also three of the most fascinating individuals of the 20th century, and this series only solidifies, in my mind, a deep longing for similar people in public life today. The Roosevelts were much like the Kennedy family of Massachusetts. They were ambitious, assertive, intellectual and strong-willed. Their progressive ideals and bold honesty shoved the United States onto a (sometimes unwilling) forward track. Yes, they were wealthy and traveled in elitist circles. But, for the most part, they had overwhelming respect for their fellow citizens. They were committed to public service, not politics. And, as the United States stumbles from one crisis to the next in this strange, new world of the 21st century, I have to ask: where are people like the Roosevelts and the Kennedys now?

The U.S. has never had a royal family. Our founders escaped European feudalism and the vice grip that small bands of inbred aristocrats held on their ancestral homelands. But I’d have to say the Roosevelts and the Kennedys come close to American royalty. The Roosevelts produced two extraordinary presidencies, and the Kennedys produced one; albeit a tragically short one. Yet both families charted progressive courses for the U.S. that ultimately gave freedom to so many of their contemporaries and challenged future generations to keep America a beacon of democracy.

I’ve always viewed Theodore Roosevelt as a personal hero. It’s odd, considering he had been a sickly child burdened with asthma. As an adult, he suffered from depression. Yet, he grabbed life by the throat and wrung every ounce of energy from it. He was a ball of lightning; unafraid to take on the notorious bosses of Tammany Hall and the ruthless titans of industry. A nature lover, he established the national park system.

His fifth cousin, Franklin, and the latter’s wife, Eleanor, helped move the nation closer to racial equality than anyone had before. Franklin broke from family tradition when he accepted a post as Assistant Secretary of the Navy in 1913 in the administration of Woodrow Wilson. He ran for the vice-presidency as a Democrat in 1920. After losing that race, he returned to a simpler life, enjoying his family and earning a living as a lawyer. But, in the summer of 1921, while vacationing in New Brunswick, Franklin experienced a life-altering event: he contracted polio – then called infantile paralysis – a frightening and debilitating scourge (usually afflicting children) with no cure or vaccine. Franklin never regained use of his legs and could only stand or walk with the help of someone or something. He persevered, however, and became determined to heal himself as best as possible with lengthy stays at a resort he eventually purchased in Warm Springs, Georgia. There he could languish in a pool for hours, which eased the agony of twisted muscles and constricted joints.

But, Franklin also remained committed to life as a public servant. In 1928, he ran for and won the governorship of New York state. Four years later he successfully ran for president. He ran three more times, holding the office for an unprecedented 12 years.

Like his familial predecessor, Franklin Roosevelt wasn’t afraid to make bold decisions and launch big projects. Whereas Theodore took on various industries, such as oil and timber; compelled the U.S. Congress to mandate safe working conditions; and commenced the national parks system, Franklin forced the federal government to take control of the slew of banks still faltering during the Great Depression; created the Civilian Conservation Corps; and introduced Social Security. Franklin’s predecessor, Herbert Hoover, and Hoover’s Treasury secretary, Andrew Mellon, held typically conservative attitudes about business and the economy: government had no real role in managing corporations; if a company – or even a bank – got itself into financial trouble, it was incumbent upon that entity to get itself out of trouble. Franklin understood that philosophy, but he also understood the true scope of the economic calamity afflicting the nation in the early 1930s. People were losing their jobs, their money and sometimes their lives as banks folded. The crisis was gigantic in scope, and the hands-off approach of the Hoover Administration only exacerbated matters. Roosevelt created the Federal Deposit Insurance Corporation (FDIC) during his first year in office; an entity that would safeguard consumer bank deposits and – slowly – restore trust in the nation’s financial institutions.

After the U.S. became embroiled in World War II, Franklin’s health began to deteriorate. He hardly campaigned in 1944. But, he didn’t give up. He was determined to lead the nation out of the war. Sadly, he didn’t see the day when America’s enemies surrendered, yet he maintained a high degree of spiritual vigor. He didn’t stop until his body forced him to do so.

Eleanor Roosevelt triumphed as well on many levels, but not really until after Franklin died. Like most of her female contemporaries, Eleanor had few choices in life. She had to be someone’s wife or someone’s mother, but she could never be her own person. A niece of Theodore and a distant cousin of Franklin, she felt uncomfortable in the role of First Lady. But, once she realized how desperately poor many of the nation’s citizens were because of the Great Depression, she pushed her husband to enact the strident and controversial legislation for which he’d become famous; not being given even a smattering of credit for it, of course, until decades later. Almost accidentally, she also became a torch bearer for the burgeoning civil rights movement; knowing that all Americans – regardless of gender, race or ethnicity – deserved to be treated equally. Not long after Franklin’s death, Eleanor prodded his successor, Harry S. Truman, to proceed with the establishment of the United Nations and, later, battled for the “Universal Declaration of Human Rights.” Her tireless efforts towards gender and racial equality made her an enemy of the staid social right-wing (even to the point of receiving death threats), but they helped her carve out her own legacy in the gallery of extraordinary Americans. “No one can make you feel inferior without your consent,” she declared in her 1937 autobiography, “This Is My Story.”

John F. Kennedy is another personal hero of mine, but not because he was the nation’s 35th president, or an heir to a prominent and wealthy Irish Catholic family. Like his older brother, Joseph, Jr., John Kennedy joined the military during World War II. Joseph was killed in action in August of 1944, and John nearly lost his own life in the South Pacific a year earlier. John had joined the U.S. Navy shortly after graduating from Harvard University in 1940. While he was commanding a torpedo boat, a Japanese warship rammed the small vessel. Despite severe injuries, Kennedy led other surviving crew members to a nearby island. His back never fully healed, and he suffered with the pain for the remainder of his life.

Before his stint in the Navy, however, John Kennedy attained a modest level of intellectual notoriety. In 1938, his father was appointed U.S. Ambassador to Great Britain. During a visit to England the following year, the younger Kennedy researched why the nation was unprepared to fight Germany at the onset of WWII. It became his senior-year thesis; a detailed analysis so well-received that it was published as “Why England Slept.” Years later, as president, Kennedy launched the space race by challenging the U.S. to “land a man on the moon” before the 1960s ended – which we did.

Other giants of the 20th century shouldn’t go unnoticed: Wilson and Truman, of course, but also Dwight D. Eisenhower and Lyndon B. Johnson. Wilson was reluctant to jump into World War I (then called “The Great War”) and envisioned the League of Nations, the forerunner of the U.N.; a multi-national entity that forced the United States onto the world stage. Truman integrated the U.S. armed forces. Eisenhower jumpstarted the interstate highway system. Johnson signed into law some of the most important pieces of legislation of the modern age.

They were not without their faults. Theodore Roosevelt was essentially a racist in that he believed Caucasians were biologically superior. But, one has to consider that he was a product of his time, so I think he can be forgiven for that. A lot of otherwise good people felt that way back then. Franklin Roosevelt, John F. Kennedy and Lyndon Johnson were adulterers. Johnson may have been a modernist with regard to civil rights, but he also led the U.S. into the quagmire of Vietnam.

The closest the U.S. has to a political dynasty is the Bush family, which isn’t saying much. The Bush clan has produced two of the most dismal presidencies within a quarter century. Therefore, I lament the fact I can’t point to many notable political leaders right now. I placed a great deal of faith in Barack Obama when he first ran for office. Now, I’m disappointed in him. I know it’s not completely his fault. He’s dealing with an arrogantly recalcitrant Congress; a hodgepodge of right-wing extremists who are more concerned with banning gay marriage and injecting creationism into America’s educational curriculum than with more critical tasks, such as punishing those responsible for the 2008 economic collapse and rebuilding our crumbling infrastructure. I’m certainly disappointed in U.S. Attorney General Eric Holder, who just announced his resignation. Our elected officials are wrapped up in petty battles with one another.

There seem to be no big dreamers anymore – and I don’t know why.

3 Comments

Filed under Essays

Set Up

09-11-01_raising_the_flag

“They’ve bombed the World Trade Center in New York,” my mother said.

“Who?” I asked.

“I don’t know. We just heard about it.”

It was a few minutes before 9 A.M. on Tuesday, September 11, 2001, and her phone call had awoken me from a sound sleep. I had lost my job at the bank almost five months earlier and had taken to sleeping late throughout that summer. I had my alarm clock set for 11 A.M. My father had an appointment with his eye doctor at 1 P.M. Exactly one week earlier one of the doctor’s colleagues had implanted some radiation pellets into his left eye; another attempt to destroy a small tumor that had formed behind the eyeball. The doctor had made his first effort to eliminate it nearly seven months earlier by cauterizing the blood vessels around the tumor. But, it had regenerated. The pellets could only remain in his eye for a maximum of seven days.

After my mother had called me, I really couldn’t go back to sleep and finally got up around 10:00. Turning on the TV, I couldn’t believe what I was seeing: the South Tower of the World Trade Center collapsing. It had just occurred, and I was watching a replay. “What the hell happened?” I kept asking myself.

I thought back to Memorial Day weekend, when I visited New York with a close friend, Phillip*, who had lived there for five years. He had attended New York University, beginning in 1991, and after graduating, decided to stay and try to build a long-desired career in the film industry. When that didn’t go as planned, he returned to Dallas. Yet, Phillip kept his tiny one-bedroom apartment in the heart of Greenwich Village; subletting it to college students. He hoped to move back there one day.

I had first visited New York with Phillip over Memorial Day weekend 1997. We stayed with some friends of his who lived across the Hudson in New Jersey. I had no desire to patronize the World Trade Center back then. Seeing the Statue of Liberty from a boat was enjoyable enough, but a cluster of office buildings wasn’t exactly akin to viewing the remains of Tenochtitlán.

But, before our 2001 visit, I told Phillip I’d visit the World Trade Center complex – just to say I’d been there. Then, as we made our way to Manhattan’s financial district, I stopped. Literally. In mid-stride.

“What’s wrong?” Phillip asked me.

I was silent for a moment. “Nothing,” I finally said. I don’t know what it was, but I had suddenly developed a sickening feeling as I looked at those two gargantuan structures just a few blocks away. I don’t remember exactly what I said afterwards, but I shifted my focus to an Indian restaurant Phillip had wanted me to try out. My appetite had evaporated, yet we made our way back up to the Greenwich Village area; by the time we reached the restaurant, though, my hunger had returned. I couldn’t explain to Phillip why I’d abruptly changed my mind about the World Trade Center. I couldn’t explain it to myself.

As I sat alongside my father in the waiting room, we stared at the TV monitor snuggled high up into a corner. An older couple sat opposite us, and, of course, we all wondered aloud who had wreaked such havoc on us and why. None of us actually cared why. We just wanted retribution.

But, thirteen years on, I know why Al-Qaeda attacked the U.S. in so brutal a manner. It’s not like they all woke up one morning and decided to hijack those planes because they had nothing better to do with their time and money.

 

Forgetting Afghanistan.

On December 27, 1979, the Soviet Union unexpectedly invaded Afghanistan. Back then, the average American probably couldn’t locate the landlocked nation on a map. It was the U.S.S.R.’s last concerted effort at a land grab. At the time, however, the United States was preoccupied with the Iran hostage crisis. Before then, most Americans probably couldn’t find Iran on a map either. In retrospect, though, the ordeal was the U.S.’s first battle with radical Islam. President Jimmy Carter appeared thoroughly inept in his handling of it; a fact that cost him the 1980 presidential election. Ronald Reagan rode into the White House with a promise to help the mujahideen fighters drive out the Soviets. The Cold War was still very active; the U.S. and the U.S.S.R. locked in a never-ending battle of hearts and minds. In March of 1985, Reagan signed National Security Decision Directive 166, which allowed for much-needed financial and military support to the Afghan warriors. Within two years, the U.S. was shipping up to 65,000 tons of arms and supplies and covertly spiriting a bevy of military operatives and specialists into Afghanistan via Pakistan. When the Soviets finally left Afghanistan in 1989, the Afghan people expected the U.S. to live up to its Reagan-born vow. We were supposed to stay and help the impoverished country move from its medieval environment into the 20th century. We never did. President George H.W. Bush simply didn’t see it as a priority. Neither did Bill Clinton. People don’t forget something like that.

Blindly supporting Israel.

The U.S. and Israel have one major thing in common: both were founded by White Europeans fleeing religious persecution who ended up displacing the indigenous peoples through violence and intimidation. As of 2013, the U.S. has been providing roughly $3.1 billion annually to support its only true ally in the Middle East. This small nation of 7.1 million was formally established in 1948 and now has the highest standard of living of any country in the region with a 95% literacy rate and an average life expectancy of 79. It’s not that its neighbors are bitterly envious of Israel’s global success. The harassment of non-Jews by Israeli police and government has always bordered on the criminal. But, any criticism of Israel’s actions is met with a harsh rebuke by its supporters. President Barack Obama is repeatedly accused of abandoning Israel; a declaration born more out of political partisanship and racism than fact. Yet, Israeli Prime Minister Benjamin Netanyahu has received scant criticism about his refusal to acknowledge a “Palestinian state,” or a “two-state solution.” The ongoing battle between Jews and Palestinians is a little like the English-French divide in Canada, but more pointless. Israel’s assault upon Lebanon in 2006 was met with silence, even as news of atrocities at the hands of the Israeli military seeped out, along with images of civilians fleeing to the island nation of Cyprus. The U.S. also remains mum on Israel’s constant push into the West Bank; forcing out entire families and destroying Palestinian property. Other democratic nations always seem to look away.

 

The United States should have seen 09/11 coming. There were plenty of signs: the 1993 bombing of the World Trade Center; the 1996 assault on the Khobar Towers in Saudi Arabia; the 1998 bombings of the U.S. embassies in Kenya and Tanzania; and the attack on the U.S.S. Cole in October of 2000. There’s also plenty of blame to go around. On September 12, 2001, people kept asking how something so horrifically grand could happen. Didn’t anyone suspect that planes could be used as missiles? Didn’t anyone believe it was imprudent to overlook the expired visas of foreign nationals? Didn’t someone think box cutters and pocket knives could be so deadly? Didn’t somebody alert authorities to the curious behavior of Middle Eastern men at flight schools? Well, yes to all of the above. Various people at various times had already expressed concern about those things. And, it goes far beyond just the infamous “August 6, 2001 Presidential Daily Briefing.”

There’s nothing that can take back the horror of that late summer day more than a decade ago. People launching themselves from the top floors of the World Trade Center towers is one of the most blood-curdling things I’ve ever seen. We’ll never just get over it. And, while I’m no security expert, I know the U.S. should never set itself up for catastrophe through an imaginary veil of isolation.

09/11 Memorial.

*Name changed.

2 Comments

Filed under Essays

Laborious

Finally – some good news!

A few years ago – about a year after I got laid off from an engineering company and while I struggled to find even a temporary job while trying to launch my freelance writing career – I told a close friend of mine via email that, when the economy improves, people will start switching jobs without giving much, if any, notice to their employers.

“True,” he replied.

It’s starting to happen. The recent economic crisis – the worst this nation has seen since the Great Depression – almost completely destroyed our financial stability. Multiple factors were responsible for it: broad-based tax cuts for the wealthiest citizens and largest corporations; further deregulation of banking and housing; and the wars in Afghanistan and Iraq. Between December 2007 (when the recession officially commenced) and June 2009 (when it officially ended), the U.S. economy shed roughly 8.7 million jobs. Employers began to add jobs in 2010. Only recently, however, have we regained all those lost jobs.

There’s no real cause for celebration. The aftereffects of such a prolonged economic debacle are as varied as the causes. People lost accumulated personal wealth; state and local economies suffered decreased tax revenue; and home values dropped. Wages remain stagnant, despite increased productivity. People have always worked too damn hard for their money. Of course, everyone feels they’re overworked and underpaid. But now, we have statistical proof. According to Ben Bernanke, former chairman of the U.S. Federal Reserve System, the “Great Recession” actually was worse than the Great Depression. In a statement filed on August 22 with the U.S. Court of Federal Claims, as part of a response to a lawsuit over the 2008 bailout of insurance giant American International Group (AIG), Bernanke said:

“September and October of 2008 was the worst financial crisis in global history, including the Great Depression.” Of the 13 “most important financial institutions in the United States, 12 were at risk of failure within a period of a week or two.”

When asked why he thought it was critical for the U.S. government to rescue AIG, Bernanke replied:

“AIG’s demise would be a catastrophe” and “could have resulted in a 1930s-style global financial and economic meltdown, with catastrophic implications for production, income, and jobs.”

Obviously, too-big-to-fail truly has become too big to fail! The Great Depression was exacerbated by the fact the Federal Reserve System didn’t take command of the banks. Billionaire financier Andrew Mellon was the U.S. Treasury Secretary during the Hoover Administration and – like a typical conservative Republican – believed the nation’s banks had gotten themselves into trouble and needed to get themselves out of it, even if that meant they failed and took their customers’ money with them. Which they did, of course, in very large numbers. At the time, though, we didn’t have a Federal Deposit Insurance Corporation (FDIC) to safeguard people’s financial assets. The federal government’s lackadaisical attitude at the onset of the Great Depression cost Republicans control of the U.S. House of Representatives in the 1930 midterm elections and shoved Hoover out of the White House two years later. That same kind of ineptitude is probably what caused them to lose both houses of Congress in 2006.

Yet, as the economy continues to recover and employers continue adding jobs, I see my aforementioned prediction materializing. During sluggish markets, employers can afford to be picky about whom they hire and can freeze wages and salaries at will. It’s almost cruel and inhumane the way some can behave. And, what’s the average worker to do? With children, mortgages, car payments and other debts, they’re often stuck. They have little power.

But, from January to June of this year, more than 14 million people quit their jobs. I would like to think they left for better jobs. And, I’d like to believe they gave little notice to their employers. After all, companies don’t have to give employees any real notice when they plan to let someone go; although, quite often, people can feel it coming. In 2009, there were approximately seven people for every job opening. As of June 2014, the ratio had dropped to 2-to-1. Overall, the number of unemployed has dropped by 5 million, while the number of new jobs has grown by 2.5 million. Now, there’s talk of a problem we haven’t seen in a while: a labor shortage. Companies are starting to feel one of the adverse effects of an improving economy: there aren’t enough people, or at least not enough qualified people, to fill certain positions. Thus, it’s employees and jobseekers who can be picky.

And, that’s a good thing. It’s really the way it should be. Only once in my life have I had the pleasure of quitting a job I hated; in January 1989, I left a retail position, which I’d held for nearly three years. I just walked into the place and gave my immediate supervisor a typewritten note announcing my resignation. But, I’ve known a few people who, in recent years, essentially gave their boss the middle finger and walked out of a company. They recounted their experiences with glee. We spend a great deal of time at work; often more than with our own families. Work gives people personal value and a sense of accomplishment, and everyone who makes an effort to complete a job should be respected. Whether that person answers the phones in a call center; digs ditches for sewer lines; programs a voice mail system; or rings up items at a cash register, they should be considered important. They pay taxes and insurance and they put the rest of their money back into the economy as consumers.

Last week, an executive in the company where I’m working as a contract technical writer staged an impromptu meeting to announce a major organizational change. After presenting a variety of business details, he said something that I’d never heard from someone at his level: “Family is more important than work.” He emphasized that everyone needs to place greater value on their loved ones than on their careers; noting that he hadn’t done that and almost paid the price for it. I’ve heard some executives tell people on an individual basis the same thing – but never in such a large setting. He’s right. A company won’t collapse because you can’t make it to a business conference. You won’t necessarily recall that training seminar. But, you most likely will remember a child’s sports event. And, you’ll cherish it forever.

3 Comments

Filed under Essays

Now I Understand


In the mid-1970s, Freddie Prinze was leading an extraordinarily successful life. In December 1973, at the age of 19, he had come to the nation’s forefront after a stint on “The Tonight Show,” which led to him landing the first half of the title role in “Chico and the Man,” an NBC television comedy. He appeared opposite Jack Albertson, a stage and film veteran. Despite their age and cultural differences, the two became good friends, with Albertson serving as a mentor to his younger co-star. I remember the series clearly. Prinze’s character was a breakthrough role. For the first time, American television boasted a Hispanic figure who spoke English perfectly.

By January of 1977, Prinze had a rollicking standup comedy career with sold-out gigs wherever he went and a top-selling comedy album; “Chico and the Man” remained a highly-rated show. He even performed at Jimmy Carter’s inaugural ball. He was married with a 10-month-old baby boy, Freddie, Jr.

And, he was miserable.

Things had begun to spiral out of control for Prinze. He’d become addicted to Quaaludes and cocaine and, in November 1976, was arrested for drunk driving. Then, on January 26, 1977, his wife, Kathy, startled him with a restraining order. Two days later, Prinze planted himself at the Beverly Hills Hotel and began making a series of “goodbye” calls to his mother, a few friends and his manager, Marvin Snyder. Snyder rushed to the hotel to try to stop his young client from harming himself. But, it was too late. Prinze put a gun to his head and pulled the trigger. He survived the initial shot, but the next day, his family authorized officials at UCLA Medical Center to remove Prinze from life support. He was 22.

The news of Prinze’s death – a suicide, no less – shocked and horrified the masses who loved him. How could someone that young with so much talent, success and money, plus a beautiful wife and baby, be so unhappy? I was 13 at the time and couldn’t understand. He was popular, right? He had lots of money, right? Why would he kill himself? It just didn’t make sense.

The recent suicide of actor / comedian Robin Williams exposes, yet again, a miserable underside that lurks beneath a life of outwardly blissful happiness in the entertainment world. There’s a reason why the symbol of the theatre consists of dual masks: the comic Thalia, smiling, and the dramatic Melpomene, frowning. They’re high and low; top and bottom; the moon’s bright side and its dark side. Intertwined and – for the most part – interchangeable. All emblems of life. One can’t exist without the other.

Both Prinze and Williams had a great deal of money and a great deal of fame. It seemed everybody loved them. If someone has those two things – money and fame – then everything else is inconsequential. They should be completely and totally satisfied with their lives. Isn’t that the way it’s supposed to be?

Money may make life easier, but it really doesn’t make it completely satisfying. As cliché as it sounds, money truly does not buy happiness. No amount of money will make you like a job you hate. I love writing, for example, even though I haven’t made much money from it; a few freelance and contract technical writing gigs over the past few years. When I lost my job with an engineering firm in 2010, I was earning more money than I ever had before. Yet, in that last year, I hated the place. For some reason, tension had been building since the end of 2009, and I ultimately felt management was targeting me specifically. It was almost a relief to get laid off.

It’s difficult for people outside of artistic communities to understand. But, comics, actors, singers and other artists are people, too. We’re weird, yes, but we’re human beings first. We have the same emotional fluctuations and experience the same anxieties in life that everyone else does. We’re just a bit more expressive about it. Yet, because professional artists exist in the public realm, their lives fall under greater scrutiny. They’re magnified a thousand times for all to see. And, when someone makes a career out of telling jokes and doing impersonations, people assume they’re always happy. But, it’s difficult for most to imagine the pressure an artist must feel to perform and be “on” all the time. People expect a comedian to make them laugh – all the time. Entertain me, my little clown. I want nothing less from you.

And so, the entertainer does what they’re supposed to do – entertain. That’s why they’re paid – very well, sometimes – and thus, despite whatever agonies they’re facing, they pull the spirit of that entertainer deep from within the depths of their souls and put on a show. The writer, the singer, the dancer – all of them do what they’ve trained themselves to do; what they’ve wanted to do perhaps since childhood.

It appears artists, in particular, are prone to severe mood swings that often lead them to substance abuse and untimely deaths. Actors, writers, painters and the like experience the best and worst that humanity has to offer. That’s why the word “troubled” often accompanies the moniker of artist.

Jackson Pollock was one of the most innovative abstract painters of the 20th century, but he battled alcoholism his entire adult life. Ernest Hemingway was a literary giant, a larger-than-life persona who was the epitome of masculinity and steadfast courage; yet injuries he incurred during his raucous life apparently took a toll on his mental and physical health, and he committed suicide in 1961.

But, it’s not that every artist is troubled; we’re not all mentally unbalanced and destined for an early grave. We just observe life through a more acute lens; we balance things out differently. We don’t see the world strictly in terms of black and white. We watch it move in all its colorful glory; the laughter and the pain mixed up together. That’s how and why we create the art that we do. If we didn’t experience the full gamut of human emotions, then we wouldn’t be so creative. We’d be … well, just like everyone else.

Fellow blogger Gus Sanchez touched on this very subject a few weeks before Williams’ death. “On Mood Disorders and the Writing Process” jumps directly into the fire of the artist-mental illness connection. As someone who’s gone through the manic highs and lows of creativity and dry spells where I feel the entire world is out to get me, I fully comprehend the realities of depression and anxiety.

It’s a blessing to be imbued with such creative elements. We can make other people happy, or make them think. It’s a curse in that we see the ugliest sides of the world glaring back at us and challenging us to do something about it. We often take up that challenge. Many times it works out for the best; sometimes, it hurts.

The Melpomene mask doesn’t conform to our vision of life in the limelight. Everyone wants to be around Thalia; we always demand Thalia be there to make us feel good about things. But, Thalia just can’t be a part of our world unless Melpomene is also present. They’re undeniably symbiotic; conjoined twins held together by the same heart. They can’t live separately. Without cold, there can be no hot. Wherever there’s a smile, there must also be a frown.

Towards the end of my tenure at the engineering company, I had a private meeting with my immediate supervisor. I told her that everyone was on edge and just didn’t feel good about things. She shot back, accusing me and the others of “creating all this drama.”

“There’s no drama,” I quietly responded. This wasn’t a soap opera. It was the real thing. I guess she couldn’t understand it the way I did. She was looking at the situation through a narrow, gray tunnel. I saw all of the sign posts, in blazing red and yellow, warning of danger ahead.

When Freddie Prinze passed away, my young mind couldn’t fathom such horror. But, as information about Williams’ emotional problems begins to surface, his tragic death seems only slightly more comprehensible. I keep thinking Freddie Prinze and other artists who died by their own hands reached out from the netherworld, grabbed Williams’ soul as it departed his beleaguered body and said, ‘Come with us. We understand. You’re safe now.’

So, I look at all the happiness and all the tragedy that make up this wonderfully unique thing called human existence, and I understand, too.

 

National Suicide Prevention Lifeline.

2 Comments

Filed under Essays

Frat Crap

skull-shaped-beer-pitcher

Thirty years ago this month I made one of the worst decisions of my entire life: I joined a fraternity. In August of 1984, I was a shy, naïve 20-year-old; the kind of person college social groups eat up and spit out. When I started classes at what was then North Texas State University (now, the University of North Texas), I hoped to complete my education within two years and begin a career in computers – anything to do with computers – like my parents had planned for me. I also hoped to break out of my shell of insecurity, make plenty of friends and find my future wife – after losing my virginity first. I ended up suspended from school for the fall 1985 semester, addicted to alcohol, maniacally depressed – and still a virgin.

Then, as now, I blame that fucking fraternity. I know the status of “Victim” has been a coveted one in America since the 1980s. But, hear me out on this mess.

I’ll say flat out that social Greek-letter organizations serve absolutely no purpose. They have only one function: party, which means getting drunk and having sex. Yes, they toss in the occasional charity function bullshit just to look good. For example, in November 1984, the frat I joined teamed up with the county to drive people to voting stations. In another self-righteous instance, we participated in a campus blood drive; where the director (a pre-med professor) walked around in a stupid vampire outfit. (Get it? Blood drive? Vampire?) Anne Rice probably would have killed him on the spot. Other than those two saccharine-laced, cringe-worthy exceptions, we just got drunk (they called it “enjoying alcohol – immensely”); tried to seduce as many unwary females as possible; engaged in quasi-macho antics; and partied at an aging two-story house on the edge of campus.

On my first day in the dorm, I saw a flyer advertising a party for the frat, which I’ll call Alpha Omega Dipshit (AOD). After I settled in – living away from home for the first time in my life, along with a flamboyantly gay roommate – I looked again at that ad for AOD and thought it must be a great way to make new friends. I was desperate to meet new people. This wasn’t high school, which I hated. Life at a community college the preceding two academic years had been nice. But, I didn’t spend a lot of time with people. My social life during my first two years out of high school revolved around whatever plans my parents had and my German shepherd. My dating life revolved around my hands and a bottle of baby oil. Things would be better now, I assured myself. North Texas was different. I wasn’t dealing with kids anymore. I was dealing with men and women. Or so I thought.

On a whim, I followed a guy I’d met and quickly befriended in the dorm to the AOD party, where beer flowed like the testosterone through my body. There were lots of beautiful people, and I tried making friends with every one of them. I really wanted people to like me. Being shy hurt, and I had to break free of it.

In 1984, President Ronald Reagan signed a federal law requiring states to raise their minimum legal alcohol consumption age to 21; otherwise, they’d lose highway funding. The law was a response to the growing anti-drunk driving movement. Before the 1980s, drunk driving was viewed with an almost humorously dismissive attitude. Despite fatal accidents involving alcohol, intoxicated driving still wasn’t considered nearly as egregious as interracial marriage or homosexuality. That all changed after the young daughter of Candy Lightner, a California woman, was struck and killed by a habitual drunk driver. Lightner made it a national issue. Hence, the 1984 federal law.

But, then-Texas Governor Mark White essentially told Reagan to go to hell when he mandated the legal alcohol consumption age wouldn’t be raised to 21 in the Lone Star State until 1985. Texas had enough money to fund its own highways without some former B-movie actor telling us what to do. (That anti-Washington sentiment has always sort of been part of the Texas identity. White, I might add, was a Democrat.) It really didn’t matter to me, though. I didn’t drink that much alcohol anyway at the time.

Three years earlier, 18-year-old seniors at my high school were upset because Texas planned to raise the minimum alcohol-drinking age to 19.

“They can give you a right,” one girl told me at the start of an English class, “but they can’t take it away.”

How profound. I didn’t care. I just wanted to get the fuck out of that high school.

But, when I stepped into the back yard of the AOD house, I followed the crowd to the beer kegs and started partaking of Coors Light. Even now, the mere smell of Coors Light incurs bitter images of college boys behaving stupidly. I had one plastic cup of beer. And then, another. And then, another. And then, another. And then, another. And then, another. And then, another. And then, another.

And, that’s where it began.

I wanted so much to Belong. My lifelong shyness had stunted my personal growth. Aside from my dog, I felt no one liked me. But, in pursuing that friendship goal – paying money along the way – I became a punching bag for most of those guys. More importantly, my entire academic regimen collapsed, and the university placed me on academic probation for the spring 1985 semester. That prevented me from becoming a full, active member of AOD. I still had to pay monthly dues, of course. But, I remained in the netherworld of pledgeship. That’s something like a glorified time out. Can you feel the hopelessness?

Things got worse that year. We had to put our dog to sleep in April, and then, the university suspended me for the rest of 1985. My parents were outraged, and I became suicidal. I felt I’d lost everything. My dog was dead; I didn’t have any new friends; and my future looked bleak. And, I was still a virgin.

My life reached a new low that October when I got arrested for drunk driving. I showed up to my waiter job at a country club already intoxicated one weekday evening. Carl*, my openly-gay supervisor, wouldn’t let me work, even though the gaggle of mostly-Jewish members wouldn’t have given me a second look anyway. Instead, Carl made me sit in the back office where I ate a meal he had one of the cooks prepare for me and admitted he had the hots for me. Great, I thought. After all my efforts at chic one-liners and coy humor, the only person interested in me was a middle-aged man with a beer gut. After I sobered up a little, he told me to go home. But, I didn’t. I felt I had nothing to live for at the time. So, I got into my little Ford Escort and went bar-hopping. Coming off Dallas’ Greenville Avenue, I stumbled into a police trap and then into a police car. I had never felt as much humiliation as the moment I called my parents from Lew Sterrett Jail in downtown Dallas. They bailed me out early the next morning. Fortunately, my blood-alcohol level tested below what was then the legal limit of .10.

I returned to North Texas for the spring 1986 semester and then again for the ensuing academic year. I left for good a year later; vowing to return and complete my education. I never went back. But, I finally did earn a college degree – 20 years later.

I made only two really good friends during my tenure at North Texas. One, Dean*, I had met through AOD. He was a tall, skinny guy with tousled brown hair and a penchant for short girls. We became close – like brothers. Not frat brothers. Real brothers. As an only child, that meant everything in the world to me. He became the kind of friend I’d always wanted. He was upset that I didn’t become a full member of the frat, yet he didn’t let that bother him.

But, AOD did get in the way of our friendship. In September 1986, after I’d settled in once more at North Texas, I ran into Dean in a parking lot, while headed to class. We hadn’t seen each other in over a year. We traded phone numbers, and later, he invited me to drop by an AOD rush party. Against my better judgment, I took him up on his offer. I went with a guy named James* who’d just graduated from high school and who I’d met at my new job a few months earlier. There, I ran into many of the people I’d known before. It felt so strange – being in that house – with those familiar faces – and the smell of Coors Light. But, nothing could have prepared me for what happened next.

At some point, I got into a heated discussion with a guy named Kyle*. He’d been part of the same pledge class as Dean and me and now, two years later, was AOD’s president. Kyle was already kind of a strange character; someone who did a great Keith Richards impersonation, but was probably the type to walk into his workplace with a shotgun. I didn’t realize he could be such an asshole, though. I don’t know what prompted the argument, but a short while later, Dean asked me to leave. Actually, he had been told to ask me to leave. He was the frat’s “Sergeant-at-Arms” – a glorified Boy Scout-type role – and apparently, since we’d been such good friends, he’d been given the task of letting me know I was no longer welcome. Fine. I didn’t need them. So, I calmly departed with James in tow; acting as if nothing was wrong.

Deep down inside, however, I felt completely dejected. I had wanted so badly to be a part of that group. The next night I scampered about the campus, ripping down flyers advertising AOD. I guess I showed them! Regardless, Dean and I stayed in touch throughout the remainder of the academic year. We just didn’t talk about the frat.

The other friend, Robert*, had actually attended the same grade school as me. We barely knew each other back then. But, on my first day in the dorm in August 1984, Robert stepped into my open doorway and introduced himself; he was in the room just across the hall. He startled me at first, but I was glad people were so friendly. Or, at least he was. After another moment, though, I thought I remembered him. It’s one thing to reconnect with people from high school. But, grade school?!

Ironically, he joined AOD – at my urging – and did well with it. He wasn’t there the night Dean asked me to leave. But, Robert has remained one of my best friends ever since. He’s tolerated my moodiness over the years. For example, I had an alcohol blackout one night in the early 1990s and unwittingly called him to tell him “this was it.” I was determined to kill myself. (I seriously don’t remember the incident, but I trust he’s telling me the truth.) Being the good real estate salesman he is, Robert stayed calm and managed to talk me into exhaustion.

When he revealed that to me a few years ago, I apologized to him for making such a scene and taking up so much of his time. It’s not his fault I couldn’t get my stuff together and heal myself from depression and alcoholism. Which I eventually did. Several years later.

Over the past two decades, I’ve been dumbfounded – angered, actually – to learn of incidents involving social Greek-letter outfits on college campuses. They almost always feature severe alcohol abuse, hazing and, quite often, sexual assault. How is it, I ask, that colleges allow these groups to exist? I guess the frat culture is embedded that strongly in the realm of America’s higher education. What a waste.

In the summer of 2003, my employer hired three young female temporaries to assist with an ongoing project. One had just graduated from high school and planned to attend a major Texas university that fall. Shortly before she resigned her position, I warned her to stay away from social fraternities – and sororities. “They’re just no good,” I told her.

I last saw Dean on South Padre Island during spring break 1987. I’ve retained my friendship with Robert, but I still often think of Dean. Not long after he had ordered me to leave the AOD house in 1986, Robert told me Dean had gone on a drinking binge. He felt he’d turned on a friend, Robert said, and couldn’t handle it. I never knew that. I can only hope Dean didn’t descend into a decades-long battle with alcohol like I did. I wouldn’t wish that on anybody.

It wouldn’t be fair if I said that Dean and Robert were the only decent guys in that fraternity. In fact, most of them were great guys. It was the handful of assholes who ruined it for everybody else. Isn’t that the way it often works?

Yet, I wonder – where is Dean now? Is he okay? Did he succeed in life? I felt, if anyone deserved it, he did. I’m not so arrogant as to assume he thinks of me, though. But, we had the kind of friendship that should have lasted a lifetime. If that damn fraternity just hadn’t thrown so much crap all over us.

*Name changed.

Filed under Essays

Bad Boys, Dumb Broads

‘Nice guys finish last,’ goes the old maxim. Apparently, they also go home alone. At least the straight ones do. Recently, a photograph of a young man named Jeremy Meeks has been circulating on the Internet. This isn’t just a simple cell phone snapshot or a Facebook post. It’s a mug shot. Meeks’ picture went viral in June, after his arrest on weapons charges; earning him the affectionate moniker of “handsome mug shot guy.” The 30-year-old Californian isn’t exactly husband material, though, and the dark spot on the outer edge of his left eye isn’t a birthmark. Meeks has a lengthy criminal rap sheet dating back to 2002; the tear drop mark is a gang tattoo.

That didn’t stop thousands of people from visiting his Facebook page and “liking” it, as people are wont to do in this digital age. It didn’t even prevent talent agent Gina Rodriguez from accepting Meeks as a prospective client. Rodriguez, whose gallery of talent includes such media gems as Nadya “Octomom” Suleman and Farrah “Teen Mom” Abraham, hoped to get Meeks a modeling contract as voluptuous as his lips. Knowing star potential when they see it, officials with a porn studio have also approached Meeks; offering him a $100,000 contract. Meeks has been held in the San Joaquin County Jail on a whopping $1,050,000 bail.

“Handsome mug shot guy” is married with a young son and, according to family members, has been trying to live a quiet life after spending time in prison and being involved with the “Northside Gangster Crips,” an offshoot of the Los Angeles-based “Crips,” one of the oldest and most violent street gangs in the U.S. He was arrested June 18 after driving away from a suspected drug house in Stockton, California that was due to be searched. Two others were with him in the vehicle, which also contained a loaded and unregistered semi-automatic handgun and two extended magazines in the trunk. Police also found marijuana in the car.

Insisting her son is an innocent “working man,” Meeks’ mother, Katherine Angier, seized upon his newfound celebrity to plead for help. She established a profile on the “Go Fund Me” web site to raise money for his legal defense, adding that her precious offspring has been stereotyped because of his past behavior. “He’s my son, and he is so sweet,” Angier opines.

Well, who could argue with her?! Unfortunately, Meeks’ mother isn’t the only one who’s come to love his face. Plenty of desperately lonely females have swooned over those cornflower blue eyes and chiseled cheek bones.

Just the right look.

A second cousin of mine who’s an active-duty soldier in the U.S. Army recently went on a Facebook rant about the lascivious response Meeks is getting; ending it with a bitterly deprecating piece of advice: “Keep it classy, girls!”

I can empathize. This is the kind of crap that drives men crazy. While women often complain that men lust after the ubiquitous supermodel chicks, the reality is that most men don’t become infatuated with female criminals. At least most supermodels aren’t of a criminal bent – excluding Naomi Campbell. By contrast, two of America’s worst serial killers, Ted Bundy and Richard Ramirez, developed legions of female fans during their respective criminal trials. It didn’t seem to matter that these monsters deliberately sought out and slaughtered untold numbers of innocent people. Some felt there was an angel inside each man and that they had the ability to bring it out.

I wish I had purple eyes and stood six feet tall. But, I don’t. I just wasn’t born with those attributes. I wish I’d joined the U.S. Navy some 30 years ago; my life might have gotten into better shape a long time ago. But, I just never did. I’m not alone. People often want who and / or what they can’t normally have. Poets and psychologists have debated this issue for millennia; knowing it’s part of the human psyche to crave the unattainable. Modern science has deduced that dopamine, a chemical precursor to adrenaline, is the primary culprit. It’s a complex substance the brain develops naturally; one that generates feelings of pleasure and desire – but, not necessarily satisfaction. It may be a key factor in substance abuse, such as alcoholism. Researchers still don’t understand why some people respond more acutely to one set of stimuli than others. The brain may be the most powerful sex organ in the human body, but it remains a mysterious one.

Females who prefer the stereotypical “bodice-ripper” (think Rhett Butler carrying a shrieking Scarlett O’Hara up the staircase in “Gone with the Wind”) might want to confer with occupants of a domestic violence shelter; women who either fell for or stayed with a man they thought could change with a good meal and the right perfume. It’s amazing how stupid some women can be in genuinely believing their feminine charms are powerful enough to alter the core personalities of the worst men; a sort of hormonal alchemy that would be the “Holy Grail” for marriage counselors, psychologists and talk show hosts. But, apart from a few exceptional cases, it rarely occurs.

Such blind self-adulation can be fatal. There are countless stories of women dying at the hands of men who really didn’t have a Prince Charming hidden beneath those balled up fists and bloodshot eyes. But, when I contemplate such odd pairings, I recall the tragic tale of a cousin who took her own life in January 1983. Already a somewhat fragile soul, she had married a man with a drug problem a couple of years earlier; believing she could somehow cure him of his ailment. Her mother strongly opposed the union, as did most everyone else in the family. But, no one could stop it. After all, she was an adult. And, apparently no one – not even my cousin – could stop her husband’s drug addiction. So, she left him. That would seem a happy enough ending, but her marriage’s sudden dissolution plunged my cousin into a state of extraordinary despair. I guess she blamed herself for the guy’s inability to shake free from his wicked habit; shattering her vision of a bright and loving future for the two of them. So, she sat down in a closet one night after work and stuck a pistol in her mouth. He had been a very bad boy, and she was a very good girl. Yet, she’s the one who ended up dead. He had failed miserably, but she felt like a miserable failure. Where’s the justice, I asked quietly at the funeral. Where, in a decent world, is there room for something so twisted as that?

Wearing a San Joaquin County jumpsuit – in what I called “arresting amber” – Meeks made a court appearance on July 8 and received mixed news: he’s no longer facing multiple weapons charges. But, the state turned his case over to the federal government, and now, Meeks is looking at a single federal weapons possession indictment. As a federal case, it’s obviously much more serious, and if convicted, he could face up to 10 years in prison and a $250,000 fine.

And, knowing how desperate some women are for a man, there’ll be more than a few nitwits holding vigil for his sorry ass in the comfort of their delusional minds. Meanwhile, the truly nice men will still be at home alone.

Cartoon courtesy of Joke All You Can.

Filed under Essays