Monumental


Tomorrow evening, October 27, the Dallas Cowboys will play the Washington Redskins at AT&T Stadium in Arlington, Texas. Currently, the Cowboys are on a winning streak, and hopes for a successful season look brighter than the prospects of an Ebola cure. But, amidst the usual revelry of a brutal contact sport, an old controversy has arisen once again – the Redskins’ name.

Yolanda Blue Horse, a Dallas resident and member of the Lakota Nation, has scheduled a formal protest outside the stadium for 3 p.m. on Monday.

“When we all stand together as one, we also honor those before us and those to come after us,” Blue Horse declared. “The continued use of this negative word is not only derogatory, but it is offensive and we demand that the owner, Dan Snyder, stop using this racist word to promote his football organization.”

For years Native Americans have been demanding that Washington change its team name; a racial slur as bad as nigger, spick, chink, or elected official. And, for years, Washington has balked at the suggestion. But, in recent years, I’ve noticed something different: people are starting to pay more attention to the issue. Moreover, the Federal Communications Commission (FCC) has taken greater interest in the subject. For the first time in memory, they’ve actually contemplated banning radio and TV stations from using the term ‘redskin’ while broadcasting.

“We will be dealing with that issue on the merits, and we’ll be responding accordingly,” said FCC Chairman Tom Wheeler. Wheeler admits he’s a critic of Washington’s name, calling it “offensive and derogatory” in a recent interview. He refers to the club as “the Washington football team” instead.

Banning the term ‘redskin’ would effectively prevent radio or TV outlets from utilizing it while on the air. If they do, in other words, they could lose their license. That would mean any TV network broadcasting a game featuring Washington couldn’t openly refer to them as the Washington Redskins. The announcers couldn’t utter it, and the name couldn’t be displayed even in written form. Therefore, it’s possible a network wouldn’t take the chance and decide not to televise the game. That could result in millions of dollars in lost revenue for the network and its sponsors. If Washington should make it to the Super Bowl, that could create a financial calamity. Earlier this year the U.S. Patent and Trademark Office went so far as to cancel the team’s trademark registrations, denouncing the name as disparaging to Native Americans. That’s the closest anyone has come to banning ‘redskin’ from public usage at the national level.

This past spring 50 members of the U.S. Senate sent letters to National Football League Commissioner Roger Goodell prodding him and the league to endorse a name change for Washington.

“The NFL can no longer ignore this and perpetuate the use of this name as anything but what it is: a racial slur,” said one letter, signed by 49 senators. “We urge the NFL to formally support and push for a name change for the Washington football team.”

Not surprisingly, owner Dan Snyder has refused calls to change the team’s name, proclaiming it a noble moniker, not a slur. In a recent interview with ESPN, he once again insisted he won’t bow to public pressure. “It’s just historical truths,” he said, “and I’d like them to understand, as I think most do, that the name really means honor, respect.”

Snyder highlighted both William Henry “Lone Star” Dietz, Washington’s first coach and for whom the team was named to honor his “Native American heritage,” and Walter “Blackie” Wetzel, the late former president of the National Congress of American Indians and chairman of the Blackfeet Nation, who helped design and approve the team’s logo, as true-life examples of the positive history of the nickname.

I wrote an essay on this issue a couple of years ago, wondering aloud if anyone would tolerate sports team names such as the Washington Niggers or the Houston Hebes. The word ‘redskin’ has a muddled history. Many claim it was a reference created by early European explorers and/or colonists who took note of the often-ruddy complexion some Indigenous Americans have. Others declare it was a reference to the reddish body paint some native peoples adorned themselves with as they prepared for battle or engaged in religious ceremonies. Whatever its origins, redskin is still a vulgar and racist term.

Quite frankly, though, some people of Indian extraction aren’t offended by it; seeing it strictly as a name only, with no racist overtones. In the ever-mutating world of American English, however, plenty of folks view attempts to ban ‘redskin’ and force Washington to change its name as another chapter in the ‘Book of Political Correctness.’

In an editorial last year, “Washington Post” columnist Charles Krauthammer lamented, “I don’t like being lectured by sportscasters about ethnic sensitivity. Or advised by the president of the United States about changing team names. Or blackmailed by tribal leaders playing the race card.”

The “Conservative Tribune” deemed calls for Washington to change its name “absurd,” adding, “If anything, the team is showing respect to native Americans by actually naming themselves after them.” The same site also just published this brilliant photo of a “conservative’s reaction” to the ‘Hands Up, Don’t Shoot’ mantra over the Michael Brown shooting.

Erick Erickson, editor of RedState, blamed President Obama for the USPTO’s decision. “The lesson here is that guilty feeling white liberals are a threat to freedom and, in Barack Obama’s America, the key to survival is to not appear on the radar in Washington, D.C.,” Erickson wrote. He further implicated “a bunch of overeducated white guys who cry during ‘Love Actually’” and “a class of men who pee sitting down.”

Rush Limbaugh noted the Patent and Trademark Office is part of the Obama Administration, which, in turn, is the source of all this “tyranny.”

Right-wing blogger Matt Barber sees an unsettling trend looming on the horizon with the USPTO’s decision. “Whether or not you believe the Redskins should change its team name, you should be concerned by this troubling development,” he wrote. “It’s a harbinger of things to come. The American free market and private enterprise are no longer free nor private. Liberty is under threat as never before. Here’s to the good ol’ U.S.A.! We’ve officially become an Obamanation.”

Comments to a “Dallas Morning News” piece about the matter last week displayed an inordinate amount of vitriol. One man complained that he felt like suing the Cracker Barrel restaurant chain for its obviously racist name. Yes, I’m sure millions of Caucasian-Americans get sick to their stomachs when they see the Cracker Barrel sign; that’s why so many of them keep patronizing those stores!

Okay, I get it! A bunch of middle-aged White conservatives are pissed off that someone dares to challenge their view of the way American society is. It’s the same reaction many had to school desegregation and the 1964 Civil Rights Act. They’re the ones who believe the United Nations still has covert operatives hovering along the U.S.-Canadian border, just waiting for the right moment to launch an assault and force gay marriage and mandatory abortions on God-fearing Americans.

No, you idiots, this isn’t political correctness. Political correctness is saying that all Indian people are great and wonderful, even if they’re drunk-ass bastards who engage in criminal behavior. Political correctness is telling a man he must always respect a woman, no matter what stupid or awful things she does or says to him. Political correctness is U.S. foreign policy towards Israel.

Since Snyder is Jewish, he could easily change it to Washington Kikers, but then, political correctness would really get turned upside down. But, I believe the Washington Monuments would be appropriate. Washington, D.C., is home to some of the nation’s premier monuments to its heritage. Besides, a monument – as in the Washington Monument – is a long, thick column of granite, sticking straight up to the sky. I think it’s appropriate, considering football is the last bastion of male athleticism in the U.S.; a tribute to excess testosterone and men’s aggression.

Despite the right-wing rancor, this issue isn’t going away. And it’s never been a matter of political correctness; it’s simply a matter of respect.


Viral Vitriol


By the time the President of the United States made a public statement about the epidemic, several people had died and an untold number were already infected. But, when he stepped to the podium to address the media, his words were not what many in the audience had hoped to hear. His brief speech wasn’t about funding or education directed towards stemming the scourge and ultimately finding a cure; it was about policy. A cacophony of jeers slammed into his geriatric face, and he merely lifted an eyebrow, as if saying, ‘Well, that’s all I need to say about it.’ Indeed, that’s all anyone should have expected Ronald Reagan to say about AIDS.

On June 5, 1981, the “Morbidity and Mortality Weekly Report,” a publication from the Centers for Disease Control and Prevention, presented data about the peculiar cases of 5 young men, “all active homosexuals,” who had developed Pneumocystis carinii pneumonia (PCP) at 3 different hospitals in Los Angeles. Two of them were dead by the time the report came out. PCP is a very rare form of pneumonia, occurring only in people with depressed immune systems. That seemingly healthy young men in large urban areas were coming down with it contradicted everything medicine knew about the ailment. Because the patients were all “active homosexuals,” however, the CDC labeled the new disease “Gay-Related Immune Deficiency” (GRID). Within months, however, the CDC realized that “active homosexuals” weren’t the only victims. Intravenous drug users were also coming down with the mysterious new disease; then prostitutes, but also other people who didn’t fit into any of those groups. They quickly renamed it Acquired Immune Deficiency Syndrome (AIDS). But the damage was already done by those 2 words: “active homosexuals.”

When Reagan addressed the press on September 17, 1985, he mentioned AIDS only to declare a travel ban for all HIV-positive and AIDS-afflicted people. By then, scientists had identified the AIDS virus, and the U.S. Food and Drug Administration (FDA) had approved usage of the first test to detect it, the ELISA test. Scientists had already confirmed one critical fact about the new scourge: it was a blood-borne pathogen; infectious, but not contagious. Still, panic had set into the nation. Gay men were being targeted with more violence than they ever had been in the nation’s history. Even as the gay-rights movement gained momentum in the 1970s, gay men hadn’t faced the sort of vitriolic backlash they did with the rise of AIDS.

In 1983, Pat Buchanan, a former speech writer for President Richard Nixon, published a column about the AIDS epidemic, in which he claimed, “The poor homosexuals – they have declared war on nature, and now nature is exacting an awful retribution.”

In 1986, perennial presidential candidate Lyndon LaRouche proposed legal discrimination against people with HIV and AIDS as a matter of public interest. He wanted federal and state governments to protect people from AIDS in the same way they protect the citizenry against other diseases – by quarantining the infected in concentration camp-like structures.

Reagan’s lack of concern for the burgeoning epidemic has always been a sore point for human rights activists. The former actor, however, repeatedly extolled the virtues of personal responsibility, even with health matters, and bemoaned government involvement. During his 1966 run for governor of California, Reagan denounced President Lyndon B. Johnson’s Medicare program as “socialized medicine.”

But, previously, the U.S. government did respond quickly to health scares. When several people attending an American Legion convention in Philadelphia in July of 1976 came down with a vicious flu-like ailment, health care workers jumped into action and, within months, identified the source: a water-borne bacterium later called Legionella.

That same year U.S. health officials warned the public about a pending influenza epidemic, swine flu, and urged people to get vaccinated as soon as possible. Panic set into the American psyche, and many people rushed to their doctors. The resulting hysteria now stands as one of the worst debacles in U.S. healthcare history.

When 7 Chicago-area residents died from ingesting cyanide embedded in Tylenol capsules in the fall of 1982, the federal government jumped into action to help Tylenol maker Johnson & Johnson manage the crisis. The company pulled every single one of its products off store shelves, resulting in a multi-million dollar loss, and then reintroduced them with tamper-resistant packaging. It’s difficult for younger folks to imagine now, but there was a time when you could open a bottle of something and not have to peel away a layer of plastic or foil. The crime spawned only one known copycat incident – in Auburn, Washington in 1986 – and the original Chicago case remains unsolved.

For those of us who recall the hysteria over the AIDS epidemic in the 1980s, the current reaction to the Ebola fiasco is painfully similar. Like HIV, Ebola is a blood-borne virus; spread only by close contact with the body fluids of an infected person. They both originated in Africa. HIV has been traced to chimpanzees and other African primates, where it started out as simian immunodeficiency virus, or SIV. How or when it metamorphosed is still being investigated, but researchers believe it made its first appearance in humans in Kinshasa, Democratic Republic of the Congo (then known as Zaire) in the 1920s. Scientists still don’t know the host source of Ebola, but they believe it comes from fruit bats. That’s pretty much where the direct comparisons end. Ebola is far deadlier; it induces a severe hemorrhagic fever, in which the internal organs begin to fail and hemorrhage. Once an infected patient reaches the stage where they’re bleeding incessantly, it’s usually too late to save them. There are now drugs that can slow the advance of HIV and even full-blown AIDS. But, there’s not even a vaccine for Ebola. Agents like ZMapp haven’t gone beyond the experimental stage yet. Now some have the audacity to wonder why there isn’t enough of it.

It’s ironic that the world learned of Ebola before it learned of HIV and AIDS; yet more people have died from the latter. That the developed world never contemplated (outside of scientific circles) that Ebola could spread beyond remote Central African villages signals a certain degree of naiveté, if not stupidity. In an increasingly interconnected global economy, there was never any good reason to assume it couldn’t.

But, the attitude of ‘them-vs-us’ is what allowed the AIDS epidemic to get so out of hand. The “active homosexuals” comment – something the CDC regrets to this day – burned into the minds of socially conservative activists who saw the scourge merely through a moral lens. Conservatives warned Reagan not to mention AIDS or HIV during his speech at the 1984 Republican National Convention in Dallas, lest he lose the party faithful. Those in control of the U.S. blood industry, such as the Red Cross, didn’t want to believe their products and patients were at risk from HIV; literally asking some hemophiliacs and organ transplant recipients if they wanted to be placed in the same group as “them” – meaning the gay male/drug user/prostitute gallery.

If the U.S. had taken AIDS seriously from the start, we might have developed protease inhibitors by the end of the 1980s, instead of a decade later. By now, we might even have a vaccine, if not a cure. (If you read my 2012 essay, “I Almost Hope They Don’t Find a Cure for AIDS,” you might understand my sense of trepidation about this particular matter.)

The perception of ‘it’s their problem’ has impacted countless issues of various types: economic, medical, political, religious and social. Some health officials saw the need to work towards a cure, or at least a treatment, for Ebola long ago. Dr. Kent Brantly, a U.S. medical missionary, contracted Ebola this past July while working with patients in Liberia. When he was brought to Atlanta’s Emory University Hospital, looking like an extreme beekeeper, he became the first person with the disease to set foot on American soil – or anywhere in the Western Hemisphere, for that matter. Some people have wondered aloud why he would spend so much time and energy working with Ebola patients in Africa when we have people dying of obesity and drug addiction here in the U.S. That’s a fair question. Yet Brantly sees his purpose in life as more than just a dispenser of medicine and sage advice. His Christian outlook on life (and I don’t want to bring religion into this debate) prompted him to be concerned about everyone around him – not just his immediate circle of family and friends. More than just a few people have used their religious ideology to narrow their view of ‘Others.’ I’ve worked with plenty of them. Just look at the AIDS epidemic. Even now, more than three decades after the epidemic was given a name, several individuals still look at AIDS from a moralistic perspective. They still don’t understand that morality really has no place in health and medicine.

Right-wing extremists have proposed simple solutions to the Ebola epidemic. Sen. Ted Cruz called for a complete ban of people traveling from Ebola-ravaged nations in West Africa. “Common sense dictates that we should impose a travel ban on commercial airline flights from nations afflicted by Ebola,” he said. “There’s no reason to allow ongoing commercial air traffic out of those countries.”

He’s just one of many who have made such idiotic proclamations. But Dr. Anthony Fauci, an early proponent of AIDS research and current head of the National Institute of Allergy and Infectious Diseases (NIAID), scoffed at the notion, dubbing it “counterproductive.” “[W]hen people come in from a country, it’s much easier to track them if you know where they’re coming from,” he noted. “But what you do, if you then completely ban travel, there’s the feasibility of going to other countries where we don’t have a travel ban and have people come in.”

There are no direct flights to the U.S. from the West African nations afflicted by Ebola. Thomas Eric Duncan, the Liberian man who developed Ebola shortly after arriving in Dallas last month and who died on October 8, had initially flown from Monrovia to Brussels; then from Brussels to Washington, D.C., before a final connection to Dallas.

Reductions in the CDC’s budget also may have played a part in the Ebola mess. As usual, conservative Republicans were quick to demand cuts in health care, rampaging through the CDC’s financial allotments like a drunk rabbi in a Catholic boys’ school. Even President Obama bought into that philosophy, slashing $72 million from the CDC’s public health emergency preparedness program for fiscal year 2012. I’ve noticed social conservatives are never so eager to cut military spending or funding for more prisons.

I don’t know what’s next in the Ebola scourge. It shows no signs of abating in West Africa, and there’s a good chance more people are going to contract the virus outside of that region. I shudder at the thought of it reaching India or China. Politics and religion don’t have places in health and medical care. Whenever they’re factored into the mix, people get hurt and die. In this modern world, we can no longer afford it.


Happy Birthday Tom Petty!


“Music is probably the only real magic I have encountered in my life. There’s not some trick involved with it. It’s pure and it’s real. It moves, it heals, it communicates and does all these incredible things.”

Tom Petty

 

“Don’t Come Around Here No More”

“Free Fallin’”

“I Won’t Back Down”

“Learning to Fly”

“Stop Draggin’ My Heart Around” with Stevie Nicks

“The Waiting”


Update: Ebola Hits Dallas

On October 5, a hazardous-materials crew cleaned outside the Dallas apartment building of a nurse who was infected with Ebola.

As some have feared and others predicted, the Ebola situation in Dallas has worsened. The man who became the first person diagnosed with Ebola in the United States was identified a couple of weeks ago as Thomas Eric Duncan, a 42-year-old Liberian native who arrived here on September 20. He died of the disease early on October 8. The facts surrounding Duncan’s case have changed almost as quickly as promises from the mouths of Texas politicians. But, then again, when the media hurries to publish a story, the truth almost always gets lost in the chaos.

Duncan had been accused of lying on a questionnaire he was given upon boarding a flight from Monrovia to Brussels on September 19; one that asks travelers if they’ve had recent contact with an Ebola patient or have recovered from the ailment. We learned almost as soon as news of Duncan’s dilemma became known that he had carried a pregnant 19-year-old woman to and from a taxi cab shortly before he departed Liberia. The woman died of Ebola not long after she’d been turned away from a local hospital because they were filled to capacity. Then, some of her other relatives got sick and died. By the time she passed away, Duncan was already in Dallas.

Now, news reports claim that neither Duncan nor any of the pregnant woman’s relatives were aware she had Ebola. Apparently, the latter didn’t realize it until after she died. Such is the case in Liberia and other developing nations of West Africa. The health care infrastructure is as pathetic as the road infrastructure. That’s why it doesn’t surprise me that the pregnant woman was turned away from a hospital.

“If he had known he had Ebola … he would not have put the love of his life in a situation like this,” family friend Saymendy Lloyd said of Duncan after he died.

But, officials at Texas Health Presbyterian Hospital of Dallas should have known better than anyone in a ramshackle hospital in Liberia. It’s bad enough that – when Duncan first arrived on September 25, complaining of fever and nausea – the hospital merely sent him home with a prescription for antibiotics. Now, we’ve learned he had a 103° temperature that night. I’m not a healthcare professional, but even I know someone with a 103° temperature needs to be hospitalized. Then, there’s the breakdown in information. The nurse who saw Duncan initially recorded his temperature in the hospital system. But somehow, that crucial bit of data got lost in the electronic shuffle. Hospital officials were quick to blame the software, which was designed and distributed by Epic, a Wisconsin-based firm that controls about 20% of the U.S. market in electronic hospital records. Another piece of lost information – Duncan revealed he’d recently traveled from Liberia.

Someone once told me that computers are only as smart as the people who operate them. No, I responded, they’re only as smart as the people who design them. Actually, it’s both. Presbyterian Dallas’ story keeps shifting, so hopefully they’ll settle on a final version before the book and TV-movie come out. Regardless, none of it leaves me with any sense of confidence in the U.S. health care system. The U.S. has spent more time and money building prisons and sports arenas than health care facilities. Our backward-thinking politicians have made sure oil companies got large tax breaks, while funding for education is always put up for a vote – and fails.

Yet, it’s gotten worse. A nurse who tended to Duncan while he was in isolation at Presbyterian has now tested positive for Ebola. The news just broke, so there aren’t many details, except that she’s a 20-something native of Fort Worth. Supposedly, her apartment has already been cleaned out, although reports state her dog is still inside. Hopefully, the animal won’t suffer the same fate as the pet of a Spanish nurse last week. I’d sooner sacrifice the entire Texas State Legislature.

And, the drama continues.


Seattle Goes Native


Seattle, Washington has become the latest city in the United States to rename Columbus Day “Indigenous Peoples Day.” On October 6, the Seattle City Council voted unanimously to celebrate the nation’s indigenous inhabitants instead of the Italian-born adventurer who didn’t know where he’d actually landed. Columbus Day has always been a point of contention for Native Americans. Saying that Christopher Columbus “discovered” America is akin to stating that Galileo “discovered” the moon. Many Americans of European extraction believe that Columbus technically opened the door for a new society. Most Indians feel it was the start of the world’s greatest and longest-lasting holocaust; the effects of which are still being felt today throughout the Western Hemisphere.

In 1992, celebrations for the 500th anniversary of Columbus’ voyage met with strong blowback from indigenous groups. A parade in Denver, for example, was canceled that year for fear that protests would turn violent. Some protests have turned violent, given the hostilities that exist – due, in no small part, to the racist ideologies of some White Americans, as well as the arrogance of some Italians. It’s odd because Columbus couldn’t get financial backing from his own people. In the 15th century, Italy was actually a collection of city-states that wouldn’t jell into a single nation until the 1860s. Even now, some people may refer to themselves as Sicilian, instead of Italian, which is like saying the sky is azure, not blue. Columbus turned to Spain and Queen Isabella I. He had wanted to find a western route to the Indies to gain an advantage in the lucrative spice trade. It’s difficult to imagine now, but spices were as precious as gold and silver at the time.

I’ve always felt Native Americans should have their own holiday. I don’t see the point in revising Columbus Day; let the Italians have their holiday, if they want. All the renaming won’t change history. We simply can’t go back and make everything all better again. It’s happened, and we need to continue moving forward, while still acknowledging the past. We’re all part of the human race, so ethnic divisions serve no real purpose. Some day, I hope, everyone else will realize that.


Anatomically Correct and Socially Uptight

One of nine bronze sculptures by artist Jorge Marin in Houston. Try not to look too hard.

In January of 2002, as the United States was still reeling from the calamity of the 9/11 terrorist attacks, then-U.S. Attorney General John Ashcroft became overwhelmed with a more pressing matter: two statues of partially nude figures in the Great Hall of the Department of Justice. Feeling undignified being photographed in front of them, he ordered one, “Spirit of Justice,” to be covered. At taxpayer expense, $8,000 worth of drapery shielded unsuspecting viewers from both of the art deco statues. These were the same statues that stood behind former U.S. Attorney General Edwin Meese in 1986, when he announced the findings of a Department of Justice study on pornography.

In recent decades, social conservatives have associated nudity and human sexuality with pornography. The dysfunctional comparison has arisen again in Houston, where Mexican artist Jorge Marin has – well – erected nine bronze sculptures of anatomically correct male forms in a park. Collectively entitled “Wings of the City,” the figures have taken up residence in the city’s downtown area. Houston is just the latest major metropolitan area to see Marin’s artwork; he’s exhibited his statues over 200 times. They stood on Mexico City’s heavily-traveled Paseo de la Reforma, where millions of people viewed them.

But, to the easily-offended souls of America’s fourth largest city, the statues don’t qualify as art; they’re pornography. Get real!

“It’s very inappropriate, seeing that they have a lot of kids here,” resident Trena Cole told the “Houston Chronicle” recently.

“I don’t know that it enhances the park,” another resident, Julie Griffis, who lives nearby, also told the Chronicle. “I don’t think it fits in with the theme.”

Other residents, such as Jim Thomas, don’t see any problem with the statues. “We see them as art,” he told the Chronicle, mentioning one of the most famous anatomically-correct nude male figures of all time: Michelangelo’s “David.”

College student Alan Lima pointed out, “It’s part of the body. What can you do? That’s the way you were born.”

Exactly! That’s how we’re born. There seems to be a growing sense of animosity towards the male physique in recent years. It’s gotten to the point where I often see young men wearing two or three shirts during winter and long pants during summer, while their overweight wives and girlfriends parade around in mini-shorts that make me want to call Greenpeace about beached whales. Professional basketball players wear shorts so long and baggy they qualify as split skirts. I’ve heard stories of school boys who won’t shower in the locker rooms after physical education classes because someone might think they’re queer.

If the fools who think the statues are “pornography” could get proctologists to help find their brains, they might want to hop over to Houston’s rougher sides where people are dropping dead from drug use and gun violence. Visit a homeless shelter where children often stay and tell me again you think a nude male sculpture is “pornography.”

There’s nothing pornographic or offensive about the male body. I have plenty of pictures of my body. Videos, too! Oh, wait…that’s a different subject. Anyway, check out Marin’s work and try not to get too upset.


Where Are They Now?


A couple of weeks ago I watched the latest documentary series by Ken Burns, “The Roosevelts: An Intimate History.” It focuses on the three most famous members of this legendary family: Theodore, Franklin and Eleanor. They are also three of the most fascinating individuals of the 20th century, and the series only deepens, in my mind, a longing for similar people in public life today. The Roosevelts were much like the Kennedy family of Massachusetts. They were ambitious, assertive, intellectual and strong-willed. Their progressive ideals and bold honesty shoved the United States onto a (sometimes unwilling) forward track. Yes, they were wealthy and traveled in elitist circles. But, for the most part, they had overwhelming respect for their fellow citizens. They were committed to public service, not politics. And, as the United States stumbles from one crisis to the next in this strange, new world of the 21st century, I have to ask: where are people like the Roosevelts and the Kennedys now?

The U.S. has never had a royal family. Our founders escaped European feudalism and the vice grip that small bands of inbred aristocrats held on their ancestral homelands. But, I’d have to say the Roosevelts and the Kennedys come close to American royalty. The Roosevelts produced two extraordinary presidencies, and the Kennedys produced one; albeit a tragically short one. Yet, both families charted progressive courses for the U.S. that ultimately gave freedom to so many of their contemporaries and challenged future generations to keep America a beacon of democracy.

I’ve always viewed Theodore Roosevelt as a personal hero. It’s odd, considering he had been a sickly child burdened with asthma. As an adult, he suffered from depression. Yet, he grabbed life by the throat and wrung every ounce of energy from it. He was a ball of lightning; unafraid to take on the notorious bosses of Tammany Hall and the ruthless titans of industry. A nature lover, he vastly expanded the nation’s system of national parks, forests and monuments.

His fifth cousin, Franklin, and the latter’s wife, Eleanor, helped move the nation closer to racial equality than anyone had before. Franklin broke from family tradition when he accepted a post as Assistant Secretary of the Navy in 1913 in the administration of Woodrow Wilson. He ran for the vice-presidency as a Democrat in 1920. After losing that race, he returned to a simpler life, enjoying his family and earning a living as a lawyer. But, in the summer of 1921, while vacationing in New Brunswick, Franklin experienced a life-altering event: he contracted polio, then called infantile paralysis, a frightening and debilitating scourge (usually afflicting children) with no cure or vaccine. Franklin never regained use of his legs and could only stand or walk with the help of someone or something. He persevered, however, and became determined to heal himself as best he could with lengthy stays at a resort he eventually purchased in Warm Springs, Georgia. There he could languish in a pool for hours, which eased the agony of twisted muscles and constricted joints.

But, Franklin also remained committed to life as a public servant. In 1928, he ran for and won the governorship of New York state. Four years later he successfully ran for president. He ran three more times, holding the office for an unprecedented 12 years.

Like his familial predecessor, Franklin Roosevelt wasn’t afraid to make bold decisions and launch big projects. Whereas Theodore took on various industries, such as oil and timber; compelled the U.S. Congress to mandate safe working conditions; and expanded the national parks system, Franklin forced the federal government to take control of the slew of banks still faltering during the Great Depression; created the Civilian Conservation Corps; and introduced Social Security. Franklin’s predecessor, Herbert Hoover, and Hoover’s Treasury secretary, Andrew Mellon, boasted typically conservative attitudes about business and the economy: government had no real role in managing corporations; if a company – or even a bank – got itself into financial trouble, it was incumbent upon that entity to get itself out of trouble. Franklin understood that view, but he also understood the true scope of the economic calamity afflicting the nation in the early 1930s. People were losing their jobs, their money and sometimes their lives, as banks folded. The crisis was gigantic in scope, and the hands-off approach of the Hoover Administration only exacerbated matters. Roosevelt created the Federal Deposit Insurance Corporation (FDIC) during his first year in office; an entity that would safeguard consumer bank deposits and – slowly – reinstill trust in the nation’s financial institutions.

After the U.S. became embroiled in World War II, Franklin’s health began to deteriorate. He hardly campaigned in 1944. But, he didn’t give up. He was determined to lead the nation out of the war. Sadly, he didn’t see the day when America’s enemies surrendered, yet he maintained a high degree of spiritual vigor. He didn’t stop until his body forced him to do so.

Eleanor Roosevelt triumphed as well on many levels, but not really until after Franklin died. Like most of her female contemporaries, Eleanor had few choices in life. She had to be someone’s wife or someone’s mother, but she could never be her own person. A niece to Theodore and a distant cousin to Franklin, she felt uncomfortable in the role of First Lady. But, once she realized how desperately poor many of the nation’s citizens were because of the Great Depression, she pushed her husband to enact the sweeping and controversial legislation for which he’d become famous; receiving scant credit for it, of course, until decades later. Almost accidentally, she also became a torch bearer for the burgeoning civil rights movement; knowing that all Americans – regardless of gender, race or ethnicity – deserved to be treated equally. Not long after Franklin’s death, Eleanor prodded his successor, Harry S. Truman, to proceed with establishment of the United Nations and, later, battled for the “Universal Declaration of Human Rights.” Her tireless efforts towards gender and racial equality made her an enemy of the staid social right-wing (even to the point of receiving death threats), but they helped her carve out her own legacy in the gallery of extraordinary Americans. “No one can make you feel inferior without your consent,” she is famously quoted as declaring.

John F. Kennedy is another personal hero of mine, but not because he was the nation’s 35th president, or an heir to a prominent and wealthy Irish Catholic family. Like his older brother, Joseph, Jr., John Kennedy joined the military during World War II. Joseph was killed in action in August of 1944, and John nearly lost his own life in the South Pacific a year earlier. John had joined the U.S. Navy shortly after graduating from Harvard University in 1940. While he was commanding a patrol torpedo boat, PT-109, a Japanese destroyer rammed and sank the small vessel. Despite severe injuries, Kennedy led the surviving crew members to a nearby island. His back never fully healed, and he suffered with the pain for the remainder of his life.

Before his stint in the Navy, however, John Kennedy attained a modest level of intellectual notoriety. In 1938, his father was appointed U.S. Ambassador to Great Britain. During an extended stay in England the following year, the younger Kennedy researched why the nation had been unprepared to fight Germany at the onset of WWII. It became his senior thesis at Harvard; a detailed analysis so well-received that it was published as “Why England Slept.” As president, Kennedy energized the space race by challenging the U.S. to “land a man on the moon” before the 1960s ended – which we did.

Other giants of the 20th century shouldn’t go unnoticed: Wilson and Truman, of course, but also Dwight D. Eisenhower and Lyndon B. Johnson. Wilson was reluctant to jump into World War I (then called “The Great War”) and envisioned the League of Nations, a forerunner of the U.N. and a multi-national entity that pushed the United States onto the world stage. Truman integrated the U.S. armed forces. Eisenhower jumpstarted the interstate highway system. Johnson signed into law some of the most important pieces of legislation of the modern age.

They were not without their faults. Theodore Roosevelt was essentially a racist in that he believed Caucasians were biologically superior. But, one has to consider that he was a product of his time, so I think he can be forgiven for that. A lot of otherwise good people felt that way back then. Franklin Roosevelt, John F. Kennedy and Lyndon Johnson were adulterers. Johnson may have been a modernist in regard to civil rights, but he also led the U.S. into the quagmire of Vietnam.

The closest the U.S. has now to a political dynasty is the Bush family, which isn’t saying much. The Bush clan has produced two of the most dismal presidencies within a quarter century. Therefore, I lament the fact I can’t point to many notable political leaders right now. I placed a great deal of faith in Barack Obama when he first ran for office. Now, I’m disappointed in him. I know it’s not completely his fault. He’s dealing with an arrogantly recalcitrant Congress; a hodgepodge of right-wing extremists who are more concerned with banning gay marriage and inserting creationism into America’s educational curriculum than with more critical tasks, such as punishing those responsible for the 2008 economic collapse and rebuilding our crumbling infrastructure. I’m certainly disappointed in U.S. Attorney General Eric Holder, who just announced his resignation. Our elected officials are wrapped up in petty battles with one another.

There seem to be no big dreamers anymore – and I don’t know why.
