Review: Martha Nussbaum on Anger, Apologies, and Forgiveness

Over the years, I’ve spent a considerable amount of time discussing anger, apologies, and forgiveness with therapists and survivors of child abuse and other traumas. Survivors and therapists alike are often passionate in their belief that forgiveness is the only way to move forward from traumatic abuse. Without forgiveness, they feel, healing is impossible.

Having a typically transactional view of forgiveness, I always held that it makes no sense to forgive when there is no acknowledgment of wrongdoing on the part of the abuser. Asking a survivor to forgive unilaterally and unconditionally is bereft of meaning at best and morally repugnant at worst. Only if the abuser were to apologize and at least make some effort at amends could I then see extending forgiveness, and I would consider this a charitable act on the part of the survivor.

Others have hastened to tell me that such an exchange is not necessary. They insist that unconditional forgiveness, freely given, is more meaningful and more liberating to survivors than the transactional form of forgiveness. Besides, they say, forgiveness is cleansing and is, indeed, the only way for survivors to rid themselves of the burden of intense and destructive anger.

I have always countered that it is possible to put anger aside without offering forgiveness to someone undeserving and unrepentant. To choose a somewhat less emotional and inflammatory example, I once had a moderately expensive lawnmower stolen from me. It wasn’t the end of the world, but it certainly made me angry. The thief was not caught and, I assume, never suffered any pangs of guilt for the crime. Over time, I was able to get on with my life, though I still remember the theft 30 years later. I decided to stop dwelling on it: I stopped thinking about it and focused on things that could improve my life.

My interlocutors quickly countered that losing a lawnmower is nothing like the pain of having your innocence robbed (some described it as theft of a child’s “soul”). I am quick to agree, but I see it as a difference in degree, not kind, and I still cannot see how offering forgiveness to a remorseless abuser can aid healing.

My view was bolstered by the work and words of Alice Miller, the famed psychoanalyst and child advocate who died in 2010. In her 1991 book, Breaking Down the Wall of Silence, Miller writes, “Forgiving has negative consequences, not only for the individual, but for society at large, because it means disguising destructive opinions and attitudes, and involves drawing a curtain across reality so that we cannot see what is taking place behind it.” Instead, she tells us, “Survivors of mistreatment need to discover their own truth if they are to free themselves of its consequences. The effort spent on the work of forgiveness leads them away from this truth.”

Martha Nussbaum’s new book, Anger and Forgiveness: Resentment, Generosity, Justice, offers a third way of viewing anger and forgiveness. Nussbaum agrees that therapists should not force forgiveness, but she offers a more nuanced and philosophically grounded way of viewing the work of anger and the way forward from even extreme wrongs and injustices.

While many philosophers have ignored or dismissed the moral relevance of the emotions, others such as Aristotle have noted the importance of anger to a good life. While anger is a negative emotion, it has benefits for people seeking to flourish in life. Namely, anger is said to enable us to recognize injustice when it occurs and then motivate us to action to correct the wrongs inflicted on innocent parties. For Aristotle, anger occurs when someone’s status is lowered without good cause. Indeed, an attack on one’s character or social rank is likely to provoke anger and, in many cases, a wish for revenge. Nussbaum notes that revenge has few or no practical or moral benefits. Other than a temporary sense of satisfaction, payback accomplishes nothing of importance for us.

But if payback isn’t a useful result of anger, then perhaps contrition, apology, and forgiveness are necessary components of a moral and flourishing life. Most of us have grown up in a culture that stresses the importance of apologies and forgiveness for wrongs. Nussbaum traces ancient Jewish and (primarily) Christian texts dealing with the role of forgiveness. The most familiar form is transactional: when someone lowers the status of another, the perpetrator shows remorse and asks forgiveness. When the wronged party bestows forgiveness, the proper ranking of the parties is restored, and justice, it seems, is served.

Of course, contrition and apologies are not always forthcoming. Sometimes the perpetrator is simply stubborn and sometimes the perpetrator is no longer alive. This is often the case for survivors of child abuse. In the absence of an apology many therapists, as noted above, advise survivors to offer unconditional forgiveness. This kind of forgiveness is said to release the victim from the shackles of anger and enable a flourishing life to happen. Of course, contrarians such as Alice Miller claim this type of forgiveness traps survivors in a life-long lie that destroys them emotionally.

Nussbaum recognizes these challenges and takes a different approach. She offers several examples of people who move forward without offering forgiveness but in a more positive way than Alice Miller would likely think possible. In the example of the Prodigal Son, the son returns to his father to be greeted with open arms. Although the son has behaved quite badly, his father thinks only of the future with his son and not the past (his other son is not quite so ready to embrace his wayward brother). It is the focus on the future that makes all the difference for Nussbaum.

In an even more painful and poignant example, she describes a father from Philip Roth’s American Pastoral, whose daughter becomes a terrorist and kills several people. The father finds his daughter and realizes he is helpless to change what she has done or her future prospects. He does all that he can do. He loves her and stays with her. Nussbaum says, “There is no apology, and there’s really no question of forgiveness on the agenda, whether conditional or unconditional. There’s just painful unconditional love.”

When anger is useful, Nussbaum says it is useful as a transition from a wrong to a focus on a better future. In the transition, someone would say in anger, “That’s outrageous! Something must be done to prevent this in the future!” Nussbaum applies this model in three realms: the intimate, the middle (public), and the political (social) realm. Simply because of my interest and background, I found her discussion of the intimate realm the most interesting and compelling.

In the middle, or public, realm, I think most of us realize our anger at strangers is rarely helpful. Minor wrongs (e.g., someone cutting in line at the grocery store) are best forgotten as quickly as possible. More serious wrongs are a matter for law enforcement and the court system. Being consumed with anger is only a form of self-torture.

In the political realm, though, anger is said to be a great motivator toward justice, and surely anger has propelled many social movements to success. Again, though, Nussbaum warns that it is easy to get caught up in concern for revenge or payback rather than creating a better world. After great harms, we need to focus on truth and reconciliation, not punishment. Indeed, the most successful social movements have focused on the future and not redressing wrongs.

Nussbaum sees Nelson Mandela as an exemplary role model for looking to the future rather than the past in response to injustice. She says, “Mandela frames the entire question in forward-looking pragmatic terms, as a question of getting the other party to do what you want. He then shows that this task is much more feasible if you can get the other party to work with you rather than against you. Progress is impeded by the other party’s defensiveness and self-protection.”

For years, I have had difficulty clearly delineating exactly what I found problematic with our accepted model of anger and forgiveness. Nussbaum has provided a welcome bit of clarity for a universal yet surprisingly complex human problem. Realistically, we will not be able to let go of useless anger and focus only on transitional anger, but at least we have a better target. When we do succeed it will be because we rely on another human emotion—love.

April 22, 2016 Workshop: Intersection of Ethics and Justice for Therapists


CEUs: Three CEUs for LMFTs (provider 891), LCSWs (provider 6900), and LPCs (provider 2444).
Date: Friday, April 22, 2016
Time: 9 am – noon
Location: 2017 Colquitt St, Houston, TX 77098
Price: $35.00
Registration: This is a small group workshop limited to six registrants. To register, please complete and return the attached form: Workshop Registration April 22 2016

Contact: Randall@ethicsbeyondcompliance.com

Description:

Justice is often neglected in discussions of ethics as it seems to be a societal problem rather than an individual problem, but what obligations do therapists have toward promoting justice? If we are not responsible for justice, who is? What is the importance of rights, interests, development, and capabilities? How can therapists promote justice, particularly given that they typically work with individuals rather than large populations?

Objectives:

Define libertarian theories of justice
Define utilitarian theories of justice
Define contractarian theories of justice
Define capabilities theories of justice
Distinguish ethics and justice
Analyze problems related to access
Analyze problems related to marketing
Analyze problems related to choosing clients
Analyze problems related to advocacy
Analyze problems related to activism
Analyze problems related to pricing
Evaluate theories of justice
Apply theoretical reasoning to practical problems

He isn’t being vulnerable, he’s crying

As a child, I grew up in a culture defined by rampant sexism, racism, and homophobia. While I now realize many of the people around me were gay, they were invisible to me at the time. At least, their sexuality was invisible to me. As a teenager, I made an intellectual decision that everyone had a right to equal dignity and expression. Living in a seemingly homogeneous society, though, I didn’t have the opportunity to experience my own implicit biases until later.

I strongly defended the rights of gay people to live, work, love, and express their love publicly, but my reaction to actual gay lives was untested. I was probably a bit too comfortable with myself and my choice for equality, because the first time I saw two men kissing, I was horrified to find that I looked away with feelings of discomfort and perhaps even disgust. I was then filled with shame for the latent feelings I obviously had, but I did my best not to turn away.

Over time, I was lucky enough to find many gay friends and to experience their love and affection in ways that seemed perfectly natural because they were perfectly natural. I’m sure I still have many implicit biases, and I keep trying to overcome them all, but at least now I can usually deal with people kissing with no internal conflict. (As I age, I have become painfully aware that many young people feel the same disgust when they see older people kissing.)

Unfortunately, many people react to a man crying in the same way I initially reacted to men kissing men—they turn away in discomfort or even disgust. It is widely assumed that it is men who are disgusted by other men crying (and I’m sure some are), but famed vulnerability researcher Brené Brown found that it is more often women who can’t accept men’s vulnerability. Obviously, being vulnerable means much more than just crying, but I think crying really is the single behavior that sets people’s stomachs churning.

We find crying so shameful, in fact, that we often call it “being vulnerable” in order to avoid saying the word “crying.” I don’t mean this to be a criticism of researchers’ use of the word “vulnerability” while they discuss men’s emotional health. Rather, I mean to suggest that the rest of us have adopted the word “vulnerability” as a way of avoiding discussion of crying. Often we will only say that a man “was vulnerable,” because to say that he was “openly sobbing” would be to rob him of his dignity and bring shame to him. Paradoxically, by trying to protect him from judgment, we reinforce the judgment that all men face for being weak, sad, or emotional.

I should qualify that last statement. We don’t judge men so much for being emotional as we judge them for what particular emotions they express. Crying is acceptable for women and girls, but anger is reserved for boys and men. If a man loses his son or father, for example, he may seek revenge in various ways, and he is often honored for doing so, especially if the death was caused by malice or negligence.

Historically, revenge frequently took the form of actual violence, and vengeful violence has certainly not disappeared, but revenge can also take the form of lawsuits, public shaming campaigns, and other legal and socially acceptable forms. But the man who falls into a deep depression or cries uncontrollably for an extended period will face criticism. I once talked to a father who was told he needed to “get it together” at his own son’s funeral.

We pretend that men aren’t in touch with their feelings or that men are incapable of expressing their feelings. If these things are true, it is only because we have conditioned men to suppress their feelings through our own reactions of disgust. Boys are taught in their first months out of the womb that crying is unacceptable. The result is that men must either destroy themselves or destroy those around them in order to process their own feelings.

The price we pay is that the men we are around are emotionally drained, stressed to the breaking point, and prone to anger and destruction over empathy and connection. Of course, this is an oversimplification and is an exaggerated statement of what happens. We all know well-balanced men who are nurturing and emotionally connected. Some men are lucky that their lives have not burdened them with too much grief and sadness. Other men have, in spite of social programming, been lucky to find people who accept them and their emotions. And, finally, some men have the fortitude to find effective means of self-care.

Still, we can and should work to remove the shame and stigma from male weakness, and that begins with removing disgust from the sight of male tears. How do we do it?

  1. Don’t turn away. If a man is crying in your presence, do not avert your gaze. Continue to look at him and let him know that you are with him, free from judgment.
  2. If you are a man, openly discuss your own tears with both women and men. When we remove our own shame, the disgust of others cannot affect us.
  3. Stop saying, “boys don’t cry” to anyone, especially a child. Boys hear this almost as soon as people start talking to them. Support the full emotional range of boys.
  4. Stop mocking male tears. Some feminists seem to feel that making fun of male emotions is an acceptable response to centuries of male tyranny, but mocking male tears is a sure way to help perpetuate misogyny and the oppression of women.
  5. Create safe spaces for men. Men need opportunities to talk to other men about crying and weakness. Men need to let one another know that crying is not weakness. You can take care of your family, be a protector, or be a warrior and still take time to cry.
  6. Recognize the varied experiences of men. Adult men are often victims of childhood abuse whether it be physical, emotional, or sexual. Men are victims of domestic violence and abuse. While physical violence is a reality for many men, emotional battery is even more common. The victimization of men is not a joke, so please stop laughing at it.

Many men will reject my suggestions as absurd and will suggest I should just “man up.” I ask those men to remember those words the next time (and it will happen) they are struggling to force back the knot forming in their throats as they build a dam against the tears threatening to break forth. Whether we choke the tears back successfully or not, the damage is done. We still feel the shame and disgust. We feel devalued and demoralized by our own natural emotions. We can be free and we can be whole. We just have to come out and be honest about what and who we are.


Diogenes Versus Plato: Who will set you free?

No one can question Plato’s writing and rhetorical abilities. He was a superstar of the ancient world, and the fact that his dialogs have endured for millennia attests to the beauty of his writing. Of course, Bertrand Russell found it ludicrous to praise Plato’s ideas based on the quality of his writing, saying, “That Plato’s Republic should have been admired, on its political side, by decent people, is perhaps the most astonishing example of literary snobbery in all history.” Other famous thinkers of the ancient world weren’t as lucky as Plato; although their reputations survive somewhat through the words of others, we often have no copies of their original works or just a few remaining fragments. It may be that Plato was simply such a great writer that his works were preserved while the works of others were not, or perhaps other factors played a role in which works were saved and which were lost.

According to the biographer of philosophers, Diogenes Laertius, the Cynic philosopher Diogenes of Sinope (no relation to the biographer) also wrote a number of books.* If he actually did, none survives today. The biography is here. The Cynic is infamous for masturbating in public, going naked, eating in the market, and carrying a lamp around in the middle of the day. As we don’t have the original works of Diogenes, we can’t be sure which of these stories might be true and which are apocryphal, as they reflect how others saw him, not necessarily how he presented himself. The lack of surviving texts may be down to Diogenes himself, at least partly. When Hegesias asked to read some of his writing, he reportedly replied, “You are a simpleton, Hegesias; you do not choose painted figs, but real ones; and yet you pass over the true training and would apply yourself to written rules.”

So, it seems that Diogenes, like Socrates before him, valued face-to-face interaction over the more passive learning that comes from reading. It is worth noting that Diogenes was a student of Antisthenes, who was in turn a student of Socrates. Although Antisthenes was reluctant to accept Diogenes as a student, Diogenes considered Antisthenes, not Plato, to be the true successor to Socrates.

According to Bertrand Russell’s History of Western Philosophy, Antisthenes enjoyed a comfortable and aristocratic life until the death of Socrates. After that, “He would have nothing but simple goodness. He associated with working men, and dressed as one of them. He took to open-air preaching, in a style that the uneducated could understand. All refined philosophy he held to be worthless; what could be known by the plain man.” Also, “There was to be no government, no private property, no marriage, no established religion.” Diogenes, it would seem, followed the lessons of his teacher to their logical extremes, which led Plato to describe Diogenes as “Socrates gone mad.”

When studying the history of philosophy, we generally follow the lineage from Socrates to Plato to Aristotle. We could just as easily follow it from Socrates to Antisthenes to Diogenes. With the former approach, we find justification for authoritarian rule over the ignorant unwashed masses constantly threatening the fabric of society. With the latter approach, we find a rejection not only of authority but of all the values that drive the totality of social regulation and empty social status.

It should be no surprise, then, which works were preserved. We know Socrates primarily through the works of Plato, which painted Socrates as a victim of ignorant Athenian leaders who rose to positions of power through a democratic process and not on their own merit. Threatened by the wisdom of Socrates, the thoughtless and insecure leaders sentenced Socrates to death. In response, Plato promised order could be secured under the direction of educated and dispassionate leaders who would tame the rabble, leading from their own realm outside the cave of illusion and delusion. The Cynics, on the other hand, would cause disruption, encouraging the working people to believe that they could take control over their own lives even without the aid of book learning and academic discipline. The Cynics valued reason, but not the well-heeled reason of the aristocrats such as Plato and Aristotle.

Further, the Cynics encouraged citizens to question the value of everything that is supposed to motivate the working class. For Plato, workers driven by their appetitive elements would produce more goods in order to receive rewards to satisfy their hungers and desires. Diogenes rejected the value of expensive clothing, food, shelter or anything else, and often lived off what he could get through begging. Having almost no possessions and no desires for any more, how could anyone take control over him or threaten him with anything? When Perdiccas threatened Diogenes with death if he didn’t appear before him, Diogenes reportedly replied, “That is nothing strange, for a scorpion or a tarantula could do as much: you had better threaten me that, if I kept away, you should be very happy.” As Todd Snider said in his song, “Looking for a Job,” “Watch what you say to someone with nothing. It’s almost like having it all.”

Imagine if the working class (note: if you work for money, you are working class) now began to question the value of cars, wide-screen TVs, sports, clothing, and “good” neighborhoods. And if the poor of the world adopted Diogenes’s views on citizenship, who would fight our wars? Diogenes gets credit for coining the word “cosmopolitan,” which is usually taken to mean citizen of the world. People who travel the world, speak more than one language, eat varied cuisine, and are not, to put it simply, provincial, consider themselves cosmopolitan, but this is not what Diogenes meant. Diogenes considered himself a citizen of the universe with no political allegiance and without political rights. He was banished from his home for defacing currency or something, and he was what would now be described as a “man without a country.” Imagine everyone being that way (John Lennon thought it should be easy, if you try).

Examined rationally, as the Cynics would have us do, virtually nothing we hold dear has any intrinsic value. We spend our lives working for trifles while ignoring anything that makes us genuinely happy. When Diogenes was told it is a bad thing to live, he said, “Not to live, but to live badly.” We can live well, but we may be thought mad.

* Diogenes Laertius says, “The following books are attributed to [Diogenes of Sinope]. The dialogues entitled the Cephalion; the Icthyas; the Jackdaw; the Leopard; the People of the Athenians; the Republic; one called Moral Art; one on Wealth; one on Love; the Theodorus; the Hypsias; the Aristarchus; one on Death; a volume of Letters; seven Tragedies, the Helen, the Thyestes, the Hercules, the Achilles, the Medea, the Chrysippus, and the Oedippus.”

Illness as Financial Ruin (US only)

Every human who has drawn a breath has faced illness, injury, and death. The universal experience of illness creates vulnerability, loss of identity, anxiety, diminished autonomy, and fear. The inescapable battle between health and illness defines human experience and shapes our personalities, our worldviews, and our spiritual depth.

For most of the developed world, though, illness does not mean financial ruin. In the United States, alone among developed nations, even a relatively minor injury such as broken bones, or an illness requiring a brief hospital stay, can lead to economic disaster. As a result, when we in the US get sick, we don’t think about how we can recover, how we can endure the pain, or the spiritual significance of our pain; rather, we think of how we will pay our bills.

As we face our anxiety over possible diagnoses, we must constantly be prepared to battle with insurance companies, aggressive hospital billing agents, and doctors exhausted from dealing with insurance paperwork. Few things in life create as much anxiety as financial insecurity, and illness always brings the threat of insecurity to US residents. When people have serious accidents, they balk at calling an ambulance because they fear the bills—they worry over whether the ride will be covered and whether the ambulance will take them to a hospital that is in-network. As a result, many people suffering medical emergencies drive themselves to the hospital.

When it isn’t an emergency, Americans often forgo treatment altogether. A Gallup poll in 2014 found that one-third of Americans skip needed medical treatment because of cost concerns, even when they have insurance.  According to the report, “Some 34% of Americans with private health insurance say they’ve skipped out on care because it was too expensive, up from 25% last year. Additionally, 28% of households that earn $75,000 or more report that family members have delayed care, up from just 17% last year.” The Affordable Care Act succeeded in insuring more people, but it also created greater financial burdens for middle-income families through higher deductibles and co-pays. Many people who have been accustomed to being able to afford healthcare now find that it is out of reach.

While healthcare inflation has slowed a bit in recent years, catastrophic medical events still carry costs beyond the reach of most of us. The United States alone finds medical fundraisers to be normal and routine. According to an article in the Journal News, the number of GoFundMe contributions for medical expenses “was up more than 293 percent in 2014, when more than 600,000 medical campaigns were launched, compared to just over 158,000 in 2013.” Families with or without insurance cannot afford their medical bills. A serious accident or an illness such as cancer creates an existential crisis while forcing people suffering from illness and their families to scramble to avoid destitution.

I don’t write this impersonally: my wife and I buy our insurance through the healthcare exchanges. We pay $682 per month ($8,184 per year) with a $4,000 deductible per person. The out-of-pocket limit on expenses is $13,700 per year. Balance-billed charges do not count toward the out-of-pocket limit, so there really is no upper limit to possible charges. Even ignoring balance billing, my costs could easily exceed $20,000 per year.
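For readers who want to check the arithmetic, here is a minimal sketch of the worst-case calculation, assuming (as is typical of exchange plans) that the deductible counts toward the out-of-pocket limit, and ignoring balance billing as the figures above do:

```python
# Worst-case annual healthcare cost, using the figures quoted above.
# Assumes the deductible counts toward the out-of-pocket limit (typical
# for exchange plans) and ignores balance billing.
monthly_premium = 682
out_of_pocket_limit = 13_700

annual_premium = monthly_premium * 12        # premiums are owed whether or not you use care
worst_case = annual_premium + out_of_pocket_limit

print(f"Annual premium: ${annual_premium:,}")  # $8,184
print(f"Worst case:     ${worst_case:,}")      # $21,884
```

By this reckoning, “could easily exceed $20,000 per year” is, if anything, conservative.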

I often hear the argument that universal healthcare coverage is too expensive and will require raising taxes on the middle class. As I see it, I would still come out ahead even with a tax increase of $15,000 or $20,000 each year. It is true that others are not in my position, but all Americans should realize they are at risk. No one stays young and healthy forever. Eventually, everyone will be at greater risk for catastrophic illness, and even those who are currently young and healthy can face illness and injury, though we may not like to think about it. Further, everyone’s income is subject to great variability. Those who have employer-provided health insurance may not want to pay into a national system, but employer-provided insurance is never guaranteed. Employers may cut benefits, employees may lose jobs through layoffs and termination, and illness can end an employee’s ability to work.

The same is true for business owners. The tides of fortune shift. When the Affordable Care Act was passed, Mary Brown brought a lawsuit against it, saying she did not want to be compelled to purchase health insurance. Mary Brown owned an auto repair shop that went under due to the pressure of economic recession and the Gulf oil spill in 2010. Of note, her bankruptcy filing listed “among the couple’s unsecured creditors several providers of medical care – a hospital and a physician group in Florida; an anesthesiology group based in Mississippi; and an eye care center in Alabama.” https://newrepublic.com/article/98145/affordable-care-act-mandate-lawsuit-nfib-mary-brown-bankruptcy-court-standing

Like many people, when she was doing well, Mary Brown thought that guaranteed universal access to healthcare was something the government was providing to other people. It didn’t occur to her that she might ever be in a position where she could not pay for her own medical care, but that is exactly what happened. I recently had the opportunity to speak to a Swedish citizen about Sweden’s healthcare system. He was a middle-aged man who explained that healthcare was paid through higher taxes. He said he didn’t mind the taxes, though, because you never know when you will be the one needing care.

It seems many Americans are not able to make this basic calculation of risk. Most people, even those who consider themselves well off, are not immune from the financial ruin that illness and injury can bring. Once people realize their own vulnerability, they support universal coverage for healthcare. The time for a more sober and accurate assessment of risk is well past due. We must wake up to the fact that the US healthcare system is not sustainable, that it leaves us at risk of financial failure, that it makes the experience of illness exponentially more stressful, and that we can do better.

It will not be easy. The US spends far more than other developed nations on healthcare. Each excess dollar we spend is profit for an insurance company, hospital, testing facility, pharmaceutical company, biotechnology company, or other player in the healthcare industry. Many people profit from the dangerous, expensive, and inefficient system we have in the United States. Every reduction in healthcare spending will be a reduction in profit for someone, and each person (or business) facing a loss of income will argue vehemently and vociferously that such a loss of income is a horrible tragedy and an impossible feat.

We will be told that reducing healthcare spending will reduce the quality of care. We will be told it will reduce our choices and control. We will be told it is impossible. We already have little choice or control, and we already have higher mortality rates than the rest of the industrialized world, so we have nothing to lose and everything to gain. We have plenty of ideas on how to improve the system. What we lack is political will, but I think the will is growing. If we want universal coverage, we must demand it, and the time to demand it is now.


Stop infantilizing old people, please

As I write this, I am 55 years old. Like most people my age, I like to think I am a “young 55” or that I look good “for my age.” As I get older, I think I have become a little more patient, more accepting, less doctrinaire, and, yes, sadder and wiser. However, I have not become more adorable, precious, charming, or sweet.

Although I am not yet extremely old, I’ve already noticed that younger people I hardly know sometimes refer to me as “sweetheart” or “sweetie.” This seems to be a particular problem in healthcare settings. Some call it “elderspeak,” which is characterized by treating older people more as children than as fully functioning adults (I personally feel this demeaning language is often inappropriate for children as well, but I will take one thing at a time). For some reason, when people talk to older patients, they tend to slow their speech, raise the volume, and sing their sentences. In addition, every statement seems to become a question, and second-person pronouns are replaced with first-person plural pronouns (e.g., “you” becomes “we”). You can read more about this phenomenon here. At a time when nursing home workers are sharing explicit photos and videos of older adults on social media, complaining about “sweetheart” seems almost quaint, but both the diminutive terms and the more extreme demeaning media rob patients of their dignity and personhood.

Other people seem to think they are honoring older adults by treating them as mascots. Many videos on social media feature adults who are “adorable” or “precious” dancing, singing, or doing other activities they have no doubt done for their entire lives. The videos are presented with the same attitude as videos of kittens, puppies, and babies. Samuel Johnson once said, “A woman’s preaching is like a dog’s walking on his hind legs. It is not done well; but you are surprised to find it done at all.” Videos of the elderly take the same attitude: it is amazing that older people might still do the things they love. If they make the attempt to engage in the activities that make them happy, they are “so cute.”

Assuming that adults become children once again in later life can have serious consequences. For instance, healthcare providers often ignore the sexual health of older patients. As this article states, “prevailing misconceptions among healthcare providers regarding a lack of sexual activity in older adults contribute to making elders an extremely vulnerable population.” The result of this ignorance is that STD rates among the elderly are increasing at an alarming rate. Although about 80 percent of adults aged 50 to 90 years old are sexually active, they are infrequently screened for STDs.

I am more concerned, though, about the basic harm of a society that treats its elders as mascots for amusement. As we age we lose the respect of our fellow beings and we lose our status as persons. For the most part, younger people don’t mean any harm, even if they are doing harm; they are acting out of ignorance. That being the case, I am here to help. The following are things you should know about your elders:

  1. They have and talk about sex. In a movie, it is always easy to get a good laugh by having an old person, especially an old woman, make any kind of statement that indicates she knows what sex is. Apparently, many young people believe that when you hit a certain age you become an innocent and naïve virgin, completely unaware of how people reproduce.
  2. They curse. This is related to the first point, but it is slightly different. If you curse now, you will probably curse in 10 or 30 years. At what point do you think it should become funny or cute? Old people have the same right to words that everyone else has. Language is a human right.
  3. They still know how to do things. It isn’t amazing that someone who has danced since he was seven still likes to cut the rug when he is 80. Our abilities may diminish over time (some do and some don’t), but we don’t suddenly forget everything we’ve learned over a lifetime.
  4. They are still rational and intelligent. I realize we all suffer some cognitive decline as we age and some are affected by diseases that accelerate or accentuate that decline, but young people also suffer brain injury, disease, and other limitations on cognitive ability. Age is not a sufficient reason to believe someone is stupid.
  5. They’ve won the battles you are fighting. Somehow, your elders have survived. If you can manage the same, you should be honored, as you should honor them now. Any old person can tell you it isn’t easy growing old. Someone who has survived had the wits and strength to overcome many adversities. They could teach you a thing or two.
  6. They are persons. Here, I am using the word “persons” in a philosophical sense of someone who bears human dignity and value. That dignity does not diminish with age. If anyone has value, you do.

In case you haven’t seen any of the videos I described above, here is an example:
[youtube https://www.youtube.com/watch?v=R7Br3-5L6hM]

Sparkle, Autonomy, and the Right to Die

Recently a woman in the UK known only as C won the right to effectively end her life by refusing dialysis treatment. Owen Bowcott, writing for The Guardian, described it as a “highly unusual judgment,” but, in making the decision, the judge said, “This position reflects the value that society places on personal autonomy in matters of medical treatment and the very long established right of the patient to choose to accept or refuse medical treatment from his or her doctor.”

The judge is correct; the right to refuse treatment is one of the bedrock principles of medical ethics. In most medical decisions, autonomy trumps all other considerations, including efficacy of possible treatment. In other words, you are not obligated to accept treatment simply because it will prolong your life. This is the way things work in the world of medicine, but there could be other approaches.

Given the facts of this case, it seems a suicidal person sort of “lucks out” when an unrelated medical issue arises. Unlike C, not everyone seeking death is able to find a legal way out. Those who are so physically incapacitated that they cannot possibly end their lives without help often find too many roadblocks to death to ever carry it out. Even when healthy people try to commit suicide, the rest of us are obligated to prevent it when possible. If we find someone who has taken a drug overdose, for example, we try to save him or her. If someone is trying to jump off a bridge, we try to prevent it. And if someone asks for drugs to commit suicide, only a few places in the world allow them to be prescribed.

It is clear that we do not always respect the autonomy of suicidal individuals. Even in the case of C, the judge said, “My decision that C has capacity to decide whether or not to accept dialysis does not, and should not prevent her treating doctors from continuing to seek to engage with C in an effort to persuade her of the benefits of receiving life-saving treatment in accordance with their duty to C as their patient.” The judge seems to feel that the doctors ought to continue trying to save C, even while recognizing that she has the right to refuse treatment.

Clearly, the law in this case is built around autonomy, but perhaps it shouldn’t be. Autonomy assumes a rational and unimpaired person making a fully informed decision. The judge notes that C is fully functional and has no cognitive impairments. At the same time, though, C is facing a diagnosis of breast cancer and a severely damaged self-image. It is far from clear that she would not modify her view with a little time and, perhaps, psychotherapy.

If her mental health is impaired, she may not be fully autonomous in the first place. If she isn’t, then perhaps she needs care more than freedom. An Ethics of Care would possibly guide us to respect her wishes as well as her needs. A little more time may be needed to assess whether her decision, which is not reversible, is truly the decision she wants to make. With a little time and support, she may come to believe that sparkle is still possible for her.

I also think a focus on capabilities might be relevant. An ethics focused on capabilities would try to enable her to have a fulfilling life by maximizing the abilities she still has. Care and capabilities both emerged as feminist approaches to ethics and justice. On the surface, this may not seem to be a feminist issue, but the judge also said, “It is clear that during her life C has placed a significant premium on youth and beauty and on living a life that, in C’s words, ‘sparkles’.”

It is clear that C has operated under rather sexist values for most of her life. That is her choice, to be sure, but it might be possible to find new values. Many who have experienced crippling injuries have sought suicide only to later find their lives are valuable and meaningful even without the activities and relationships they once held dear.

Book Review: The Experiment Must Continue by Melissa Graboyes

We all have a complicated relationship with medical research. We know that every effective treatment or therapy that exists was once an experimental treatment or therapy. We know that some drugs have been so effective that they eradicated various diseases completely, and we also know that someone had to be the first one to try all those new drugs. On the other hand, most new drugs don’t work out. Some are simply not effective, some are effective but have serious side effects that make them all but useless, and others turn out to be deadly.

Medical research is plagued with problems related to consent, coercion, therapeutic misconception, benefit, and access. All these problems exist in North America and Europe, with both well-educated, affluent populations and with so-called “vulnerable” populations.

Informed consent is an example. Virtually everyone agrees that patients who participate in medical research should know about and agree to their own participation. Ethics committees, lawyers, and bioethicists have gone to great pains to develop proper informed consent procedures. Sadly, too many people talk to their doctors about treatment options, hear about ongoing research, and sign consent forms without actually realizing they have agreed to participate in a medical experiment. Despite the best intentions of everyone involved, patients believe they are receiving treatment that is expected to help them (therapeutic misconception).

I sometimes use the HBO film adaptation of Margaret Edson’s play, W;t, in my classes. The main character in the play agrees to experimental treatment, is informed of the side effects and goals of the research, and then goes on to suffer tremendously for her decision. When I have my students write about the movie, more than half of them still believe the doctors were trying to cure the cancer of the main character. Despite all the frank discussions of the research, they still don’t understand that the protagonist was never expected to benefit from the treatments she was receiving. Furthermore, the character never seemed to fully realize that her participation was never expected to benefit her in any way.

If these kinds of misunderstandings happen between researchers and research participants from the same culture speaking the same language, the problems are sure to be compounded by cross-cultural communication. In her book, The Experiment Must Continue: Medical Research and Ethics in East Africa, 1940–2014, Melissa Graboyes explores ethical challenges and lapses in numerous studies conducted in East Africa. Her book is a refreshing attempt to dispel “conventional wisdom” about research in Africa.

For example, I think anyone who has studied research ethics has heard that African chiefs would sometimes provide consent for all the people in a village to participate in research projects. Graboyes says she could find no evidence that anyone in any of the locations under study ever recognized the right of anyone to give collective consent for a group of people. Further, many describe African research participants as “vulnerable” populations with little to no agency. In the sense that many people lack adequate medical care, they are vulnerable, but Graboyes challenges the notion that they lack agency and gives several examples of Africans responding actively and rationally to both exploitative research and beneficial research. In short, she shows that they are actually persons with wills, minds, autonomy, and awareness.

Another common theme for those studying research ethics is the use of coercion to get people to enroll in trials. Many wring their hands worrying over whether offering payment or gifts might unduly coerce potential participants whose desperate poverty might drive them to enroll. Those who did enroll, however, were more concerned about inadequate compensation than undue coercion. Participants realized that others would benefit from research carried out on their bodies or in their homes. In exchange for participating, they felt some reasonable benefit was due, whether it be in the form of cash, medicine, or health services.

One possible benefit, of course, is access to medicines; researchers commonly advertise that participants will receive a new treatment at no charge. Many African participants assumed they were trading their blood for research and in turn would receive medicines that would benefit them. In some cases, participants did receive helpful medications, but those medicines were then withheld from them at the end of the research, even when they proved effective. Researchers say it isn’t their responsibility to provide the medications, which may or may not be expensive, but leaving people with the knowledge that an effective treatment exists without making it available seems to me a particularly cruel kind of harm.

In the United States, people also expect access to new medications. When people find they have a terminal illness, they will often (I want to say usually) demand to receive experimental medicines. In the 1980s, AIDS activists in the US demanded that experimental treatments be distributed to HIV-positive individuals, and demands for quick approval for experimental drugs have become routine. In this sense, medical research may be a victim of its own success. Most people in either America or Africa fail to appreciate the risk they take with unproven medicines.

Many researchers view Africa as a fertile field for research (some describe Africans as “walking pathological museums”) because of the abundance of diseases present and the relatively low costs involved compared to research conducted in Europe and North America. Graboyes describes both successes and failures in East Africa, but the failures can be depressing. In some cases the research never got off the ground, in some it never produced usable results, and in some it made conditions much worse.

Is it unethical to conduct research in Africa? Graboyes doesn’t think it is necessarily unethical to conduct research in East Africa, but she does feel some of the research has been unethical, some simply misguided, and some poorly designed. Many Africans do not trust researchers, which is frustrating to researchers who feel they are on a noble quest to end disease, but many fail to realize how many researchers before them have told outright and deliberate lies in East Africa. People do not forget so easily.

I don’t want to give away too many details of the book, as it can become something of a page-turner. One last thing I will mention, though, is the fact that Graboyes was aware that she was another researcher visiting East Africa asking for cooperation. Although she wasn’t taking blood, spraying insecticides, or injecting treatments, she still needed to ensure that she was proceeding ethically and had the trust of the people she was interviewing. Her efforts are admirable but remind us that any reporting of facts is a matter of interpretation and may be subject to modification.

This book is admirable and compelling, especially for those interested in the ethics of international research. In addition, her insights might help to develop better ethical practices for domestic research, as many of the issues are the same.

Reid Ewing and the Failure of Autonomy in Bioethics

Reid Ewing of Modern Family fame recently wrote publicly about his struggle with body dysmorphia in a personal essay on the Huffington Post. Ewing revealed that his dysmorphia led him to seek and receive several surgeries. He feels his surgeons should have recognized his mental illness and refused to perform surgery. He wrote, “Of the four doctors who worked on me, not one had mental health screenings in place for their patients, except for asking if I had a history of depression.”

The principle of autonomy is by far the most discussed principle of bioethics. Discussions typically focus on the rights of patients to refuse treatments, not to seek them. On either side, the issues can be thorny. If a depressed and suicidal patient refuses life-prolonging treatment, is it ethical to respect the patient’s autonomy or should mental health services be provided first? As in Ewing’s case, the ethical problem arises from the claim that the decision is driven by mental illness and not reason. If someone is mentally ill, they are not fully autonomous agents as they are not fully rational.

This is a problem with autonomy in general. Our ideas of autonomy come largely from Immanuel Kant, who claimed that all rational beings, operating under full autonomy, would choose the same universal moral laws. If someone thinks it is okay to kill or lie, the person is either not rational or lacks a good will. How do we determine whether someone is rational? Usually, most of us assume people who agree with our decisions are rational and those who do not are not. If they are not rational, they are not autonomous, so it is ethical to intervene to care for and protect them.

Earlier this year, a woman named Jewel Shuping claimed a psychologist helped her blind herself. She says she has always suffered from Body Integrity Identity Disorder (although able-bodied, she identified as a person with a disability). Most doctors, understandably, refuse to help people damage their healthy bodies to become disabled, which can lead clients to desperate measures to destroy limbs or other body parts, sometimes possibly endangering others.

Jewel Shuping never named the psychologist who may have helped her, so it is impossible to check the story. It is possible to imagine, however, that some doctors would help someone with BIID in the hopes of preventing further damage to themselves or others. Shuping says she feels she should be living as a blind person, and she appreciates the help she received to become blind. In contrast, Ewing feels he should have undergone a mental health screening before he was able to obtain his surgery and that his wishes should not have been respected.

Plastic surgeons are often vilified as greedy and unscrupulous doctors who will destroy clients’ self-esteem only to profit from their self-loathing. On the other hand, these same plastic surgeons are hailed as heroes when they are able to restore beauty to someone who has been disfigured in an accident or by disease. Unfortunately, we do not have bright lines to separate needless surgery to enhance someone’s self-image from restorative surgery to spare someone a life of social isolation and shame. Some would argue the decision should not be up to the doctors in the first place but should be left in the autonomous hands of clients.

Many have similarly argued that doctors should refuse gender confirmation surgery to transgender men and women. As with BIID, many assume that transgender individuals are mentally ill and should see a mental health professional, not a surgeon. Transgender activists (and I) argue that transgender individuals need empowerment to live as the gender that best fits what they actually are. If surgery helps them along that path, they should have access.

All this leaves us with the question of when to respect autonomy and when to take the role of caregiver, which may involve a degree of paternalism (or maternalism, for that matter). Is it more important for doctors to ensure patients’ right to seek whatever treatment they see fit, or is it more important to provide a caring and guiding hand to resolve underlying mental health issues before offering any treatment at all?

One of Ewing’s complaints is that he was offered plastic surgery on demand with no screening at all. The process for people seeking gender confirmation surgery, by contrast, is arduous. Before surgery, transgender people go through counseling and live as their true gender for an extended period of time. At the far end of the spectrum, people with BIID rarely find doctors willing to help them destroy parts of their bodies and resort to self-harm. These three cases are not the same, but they place similar demands on the distinction between respect for autonomy and a commitment to compassionate care.

It seems reasonable to accept Ewing’s claim that mental health screenings should be a part of body modification surgery, especially when someone has no obvious flaws that need to be repaired. In all these cases (dysmorphia, gender identity, and BIID), mental health support is necessary. In each case, patients describe depression, emotional turmoil, and, too often, thoughts or attempts of suicide. Mental health care does not require a violation of autonomy, but it may help a person’s autonomous decisions to form more clearly from deliberation and not desperation.