Monday, December 30, 2013

Years


I’m surrounded these days by reminders that the digits on the calendar are about to change. Lists of the best of everything in 2013 and promises of what’s ahead in 2014. Requests for year-end donations and plans for the coming year. Looking back and looking ahead. I guess the same messages have always floated around as New Year’s Day approached. But as the years roll by, they seem more compelling and more confusing. I find balancing on the boundary a little dizzying.

For instance …

On solstice evening, we went to a wonderful party with a group of women friends. We burned candles to bid farewell to things we’d like to release from our lives as we finish this year and candles to welcome things we’d like to bring into our lives instead. The sun turns in its path, finishing a cycle and beginning another. Around the same time, I learned that two people in my life, two of my age peers, are very ill. One is seeing her last new year, and the other may well be. And then I heard my partner’s grandson, about to turn 12, talk about something he did “a long time ago,” even as his journey has just begun. Almost surely, this transition is experienced by my two peers as the close of a year, hopefully one laced with good memories. And it is just as likely experienced by this boy as a step forward into new adventures. Ends and beginnings. 

The yin and yang of time. The edge of the year.

So what, I ask myself, is it to me? I know the “right” answer: it is what I make of it. But my experience of this edge feels more complex than that. I understand that I’m responsible for what I create of this year, within the limits that reality imposes. And I also know that reality does impose limits. Among these is the fact that as I grow older (which, by the way, we all do), the years ahead look different—as do the years behind.

You’ve probably heard it plenty, especially from old folks: time passes faster as we age. In fact, anyone of a certain age is likely to be thinking it right about now: I can’t believe how fast this year passed! Well, it seems that there is considerable evidence that this is true—i.e., that the older we get, the faster time seems to move. That applies to individual hours, and it definitely applies to whole years. Two questions come to mind: (1) Why? and (2) So what?

As for why … some folks argue that it’s because we have fewer novel experiences, and novel experiences are intrinsically more memorable than familiar ones. One early-20th-century philosopher/psychologist (who was also, by the way, a brilliant writer) said it well:

"In youth we may have an absolutely new experience, subjective or objective, every hour of the day… Each passing year converts some of this experience into automatic routine which we hardly note at all, the days and the weeks smooth themselves out in recollection to contentless units, and the years grow hollow and collapse.” 

Recent neurological research seems to bear him out, although it does so far less poetically than William James. During childhood, it seems, we devote a lot of attention, which translates to a lot of neurological effort, to understanding and mastering the simplest bits of information, the most basic skills—more by far than we now remember having invested. But by adulthood, the brain has adapted to so many sorts of input, has learned to process so many things automatically, that events do in fact flow by without our noticing them. They become James’ “contentless units,” and the years collapse into one another as we look back through the telescope of time.

Another explanation is simply that each day, week, year is a smaller part of our total life’s experience, so of course it seems shorter, if only in relation to the whole. After all, one day to an 11-year-old would be approximately 1/4,000 of her life, while one day to a 66-year-old would be approximately 1/24,000 of hers. So it makes sense that a day—or a week or a year—would seem much longer to an 11-year-old than to a 66-year-old. Who hasn’t heard a pre-teen say something like, “Back when I was a kid …”? Point made.
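
(For the arithmetically inclined, here’s a minimal back-of-the-envelope sketch of that proportional idea; the ages and the 365-day year are simply the illustrative numbers used above.)

```python
# A rough look at the "proportional" explanation: one day as a fraction
# of all the days lived so far. The ages and the 365-day year are just
# the illustrative numbers from the paragraph above.
DAYS_PER_YEAR = 365

for age in (11, 66):
    days_lived = age * DAYS_PER_YEAR
    print(f"At age {age}, one day is roughly 1/{days_lived:,} of a lifetime so far.")

# Prints:
# At age 11, one day is roughly 1/4,015 of a lifetime so far.
# At age 66, one day is roughly 1/24,090 of a lifetime so far.
```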

Turning the telescope around, focusing forward rather than backward, other folks argue that time seems to move so fast because we have fewer and fewer years ahead. Seen in this way, the years ahead seem so very precious. Of course each one seems to disappear faster—like coins to a poor person who has few versus a rich person who can’t imagine the end of wealth.

So, in the swirl of endings and beginnings, I’ve been thinking about this. This transition, this edge: is it an ending, the close of a year … or a beginning, the opening of one? Of course it’s both. But I mean psychologically, for me, which is it? Or, perhaps more to the point, which will it be?

First, I agree wholeheartedly with James’ suggestion that having new experiences and learning new things make the years richer and give them memorable content. I’ve learned that lesson (unfortunately, over and over) in my own life. Novel experiences create memories, and memories give a year an identity of sorts: “2013: the year when I did a weeklong astrophysics course, when I climbed Storm King Mountain,” etc. And when years have an identity, they don’t “grow hollow and collapse” on one another.

But I’m not sure that this phenomenon—as noteworthy and psychologically important as it is—explains why the present moment or the years ahead seem so short. That part of time’s collapse needs, for me, an additional explanation. And here, I think, the “fewer years ahead” interpretation fits. When I look at my parents’ life spans—both of them died of so-called natural causes—and consider that mine is likely to be roughly similar (fantasies to the contrary notwithstanding), I can make a rough guess about the time I have ahead. And then, if I count backward that many years, I’m stunned by how recent it seems. That many years ago, I was doing x and y—but those things seem so recent! Is that really all the time I have left?

And here rests the challenge, the “so what?”, at least for me. Because if that’s all the time I have left, I had better make it time worth living, within the limitations imposed by reality. Yet, my penchant is to coast, to slip into comfortable routines, as I did with such ease when the time ahead seemed endless. So, I ask myself, if I woke up tomorrow to the news that my time was up, would I be content with my life as I’m living it today? I’m not talking about creating a bucket list here. I’ve written before about the concerns I have about bucket lists. I’m not talking about fantasies I want to realize some day. I’m talking about reality, today. Am I spending this day in a way that would make me content if it were my last?

Let me take stock: my partner and I made plans last night to spend time with old friends from San Francisco later this week, and I’m looking forward to that. This morning, I’m wrapping up arrangements for an interview for my radio show next week, which is exciting. I took today off from my editing work, a gift to myself of a leisurely day, which gives me great (rare) pleasure. I’m writing this blog, which is always huge fun for me. I’ll run some errands. (OK, yuk. Necessary life maintenance. I can feel fine about that, if not excited.) I’ll take a walk in the beautiful Colorado sunshine. If I have time, I’ll work on another blog. This evening, I’ll join the other folks in our KGNU collective to do a show on queer events of the past year … and maybe look forward to next year a bit. (There it is again, that old year—new year thing.) OK, would that feel fine as my last day? Yes. And now, can I say that every day … OK, most days (granting reality the right to intrude)?

Because, now that I think about it, we don’t have years—old ones or new ones. We have days, minutes. The only thing that demarcates Wednesday from Tuesday will be the date, the digits on the calendar. There’s nothing magical there. It’s just a day, a date. We may invoke it as a moment for review and anticipation, but we could review and anticipate any day. And as the days grow fewer (as they do for all of us), it seems like we might want to pay attention to each one while we can.

I think I’ll go take that walk.




Thursday, December 19, 2013

Making a friend

I’ve written here before about assorted cultural consciousness-raising experiences, but last weekend’s was unique—three “cultural” activities, each with a different purpose and a different tone. All three were time well spent, each in its own way, but the bigger story (for me) is the amazing bit of self-realization I encountered along the way.

The day started with a meeting of the local chapter of Old Lesbians Organizing for Change (OLOC), a national group whose local chapters vary greatly in how they live out their name, “organizing for change.” The post-potluck program (the potluck is, of course, required for all lesbian events) was a video about an old lesbian couple—a growing new genre of films, documentary and fiction. It’s an interesting marker of the progress we’ve made toward visibility and the hesitant acceptance of both LGBTQ people and LGBTQ aging. More about that another time. (Soon, since I have a radio show on the topic coming up in January.)

From OLOC, we went to Sound Circle’s solstice concert. Many of you know about Sound Circle and their marvelous music, and anyone who reads this blog knows how much I love them. I’ll have more to say about them in a minute.

And from there, we rushed off to a roller derby match. Yup, you read right: roller derby. I’d never seen a roller derby match before, never even considered it as something I particularly wanted to do. But a colleague of my partner does roller derby in her spare time, so there we were, squeezing into the crowd in a chilly warehouse. Scores of folks had come to watch women in colorful (and sometimes weird) costumes swirling around the oval track, doing their best to bump and block and generally disrupt one another en route. I don’t especially need to go back, but as a cross-cultural experience, it was really interesting—and it does indeed seem to have a whole culture wrapped around it. There’s currently a picture/sign in the Walnut Café that asks, “When was the last time you did something for the first time?” Good question. This was my answer. Here’s a picture to prove I was there. I’m not in the picture, to be sure, but I did take it.





So, in the middle of that cultural sandwich was Sound Circle. Their solstice concert is always an excellent way to welcome the return of the light, and this one, with a theme sketched from sleeping and waking, dark and light, rising and falling, seemed perfect for the season. I especially loved a few songs. “Something Inside So Strong,” an anti-apartheid song, and “Woke Up This Morning (with My Mind Set on Freedom),” a song from the Civil Rights movement, reminded me of last weekend’s experiences and of OLOC’s mission, “organizing for change.” Their inclusion in this concert also seemed brilliant, a twist that translated the theme of rising and waking, shifting it from the seasons to the realm of human striving. And then there was this marvelous piece called “Snowforms” by Murray Schafer. Schafer introduced the term “soundscape” and popularized the field of “acoustic ecology,” which sees sound as part of the environment. So naturally, his music depicts the environment through sound. “Snowforms” uses Inuit terms for various kinds of snow to punctuate this wonderful drifting, flowing, sometimes crunchy musical soundscape. The music is so non-standard that the “score” doesn’t have staffs and notes. Instead, it takes the form of swooping waves, white on blue, intended to depict sounds, not neatly structured music. It takes a group like Sound Circle to pull this off, I imagine. It was delightful. And a nice nod to winter.




As always, I loved this concert. But it was different for me from earlier ones with Sound Circle. And that’s the real point of this blog.

First, I should mention that I never used to consider myself much of a fan of choral music. I appreciate the fact that many voices can create sounds that a single voice (or a few voices) cannot. And I know, in principle, that a chorus represents something important in itself: a synergy among people that says something meaningful about human existence, speaks to our desire for community. Still, until recently, all of that was just theory to me. But over the past few years, as I’ve started hanging out around folks, my partner among them, who sing in choruses—Sound Circle and Resonance Women’s Chorus, in particular—my feeling about all this has shifted. It was gradual, I suppose. Hearing more choral music in general, hearing choral music that’s this good, hearing people who sing in (and direct) choruses talking about the experience. It all had an impact, I’m sure, although I wasn’t especially thinking about it.

Until Saturday. And then I got it. I realized that I was experiencing this concert in a whole new way, and it surprised me. I took more pleasure in noticing the different voices, whereas before, I just heard the overall sound. I found new delight in the variations in mood created by different songs—I heard it more in the music and I felt it more in the audience. I was more delighted than usual by the energetic songs, and I got more absorbed in the reflective songs than I usually do (although “Praises for the World” has always moved me to the core and remains in a class of its own). And I was more aware of the musical skill of the singers, individually and collectively. Simply said, the music touched me more. I was genuinely sorry to have it end. Despite the fact that I had a roller derby match to attend.

Now, it’s possible that I was just more “present,” more mindful, more attentive than I’ve been before. But I think it’s something more. “So what was it?” you’re probably asking. I wondered this myself, even during the concert.

Why, I asked myself, is this so much more engaging for me today? My answer: I think it’s because I’ve grown such a different relationship with music lately. I’m hanging around with music a lot these days, spending time with it, sometimes alone and sometimes in company. I’m playing with it, listening to it, watching how it relates to other people and they to it, asking it questions, wondering what it wants. We’re becoming friends. And this process of getting acquainted has changed how I understand music and, quite apparently last Saturday, how I relate to it.

I didn’t come to this new friendship easily. Never having been a singer, I always related to music as an outsider, an observer, not a participant—not the best way to form a friendship, I realize. So, from this less-than-intimate perspective, I think I always thought that music was something that other people did, not me. And that people who could sing just did. They’d stand up, open their mouths, and lovely music would pour out. Well phrased, perfectly on key, precisely modulated. It’s nice, I thought to myself, but it’s no big deal. It’s just what they do, because they can. And then Sue Coffee, the director of Sound Circle and Resonance, asked whether I’d like to be involved in some way with Resonance. That led to my unexpected journey into a new friendship with music.

I’ve written here before about my recently assumed role as “Assistant Maven” for Resonance, one result of Sue’s inquiry. In this role, I get to share space with the chorus as they practice every week. I halfway expected it to be boring. But it turns out to be fascinating. It first challenged and now seems to have changed how I understand singing and choruses. Listening to these women prepare for a concert, sound by sound, line by line, song by song, I’ve rather quickly come to a whole new appreciation for how much work it takes to make music sound good. From them, I’ve learned that the synergistic power of choral music, wonderful as a whole, also reflects all the countless pieces it encompasses. Individual notes, individual voices, individual parts magically stirred together—all in the context of relationships, carefully tended.

Another part of this path has been my unexpected and tentative personal foray into singing. Never (ever!) having thought of myself as a singer, I found that the invitation to become involved in Resonance made me wonder, vaguely, whether I might be able to sing in the chorus. Before daring such an outrageous step, I decided to take a voice lesson or two. Now, I still don’t think of myself as a singer, except in the broadest sense as someone who sometimes sings out loud, and I’m not singing in the chorus. But I have discovered that learning to sing is actually fun. It’s made me more comfortable with my voice (“more” not equating to “very”) and more comfortable with singing out loud in a group—like, during the sing-along part of a Holly Near concert. What’s more, I actually enjoy these activities. A lot.

And taking voice lessons (those words seem so improbable to me!) has also given me the opportunity to hang out on a regular basis with someone who is a singer (in Sound Circle), as well as a musician in ways I can’t even imagine (how do you even begin to “do an arrangement” of a whole song?). One of the most important lessons for me has been her talking, casually, about her own singing. “When I’m working on a song …,” she says. And I’m thinking “You? Working on a song?” Hmmm. Maybe good singers don’t just stand up and open their mouths to let the music escape. Or she says, “When I’m performing, I have to remind myself to …” So then I wonder, “You mean to tell me that you’re actually thinking about what you’re doing? You’re working on doing it right? It doesn’t just flow from you like water from a faucet?” It almost seems like making good music is like any relationship: it takes work. Really?

This is a bonus I never expected from these activities—in fact, I never would have known I’d be interested in a “friendship with music.” But sought out or not, this combination of experiences appears to have changed music for me.


Heck, I even hear the 5 a.m. clock radio differently. Truly. 


Wednesday, December 11, 2013

Mandela, reconciliation, and reparation

Yesterday, I listened to some of the coverage of the memorial/celebration service for Nelson Mandela.* From all reports, it was a moving event, with an extraordinary outpouring of grief and gratitude for this singular man. One person described him as “our moral compass.” She was referring specifically to South Africa and the deeply disappointing political landscape there in recent years, so stark against Mandela’s clarity and compassion. But she could have been referring to the world. Her words captured what I wanted to say when I first learned of his death: South Africa and the world – we all lost our moral compass.



It was with this sense of loss at Mandela’s death that I went to three events this weekend that reminded me of how easily we do that, lose our moral way. How easily we—especially those of us who are generally comfortable in the world where we live—get caught up in our own lives, forgetting that our comfort often comes not from following our “moral compass” but from ignoring the hints that we’re off course.

The first event was on Friday night, the day after Mandela’s death. It was a workshop organized by the Boulder Meeting of Friends (Quakers), and it focused on the history of Europeans’ treatment of America’s native peoples. The workshop was perfectly crafted to merge a great deal of information with a very moving bit of participant involvement. Briefly, we were all asked to stand—about 30-35 of us—on blankets spread out on the floor, pretty much filling the room. Then narrators told the history of native peoples in America, with different voices representing Indians, Europeans, government entities, and the historian. As the story progressed, groups of participants were told that they represented Indians who had died during different historical periods—from illnesses brought by the Europeans, in massacres, walking the Trail of Tears—and those people left the blankets to sit around the room. Slowly, as our numbers dwindled, the blankets were folded inward around our feet, shrinking the “land” where we stood even as the population shrank. Finally, by the end of the exercise, only five of us were left standing, and we occupied just a small patch in the middle of the room. Like the others, I stood silent, a bit stunned by what had just happened.

The informational part of the workshop focused on two documents. The first was the “Doctrine of Discovery,” an actual declaration issued and then reiterated by various European leaders, secular and religious, declaring that Europeans had the right, even the obligation, to claim all lands they visited and to enslave or eliminate all the peoples they found there. The other was the U.N. Declaration on the Rights of Indigenous Peoples—an internationally agreed-upon document that recognizes the human rights of indigenous people everywhere and directs nations to honor them. I was dismayed to learn that the U.S. has refused to sign on to this declaration.

It’s not hard to figure out that our nation was created by the general willingness of the colonists and then the “settlers” to abide by the Doctrine of Discovery. Now, I’ll grant people of the 15th, 16th, 17th, 18th centuries … maybe even the 19th century … the historical and cultural context that would make such behavior seem self-evidently “right.” But we are now in the 21st century, and we should know better. Why, oh why, I asked myself, have we not signed the U.N. Declaration that would have to some degree made amends for that earlier, unconscionable decree? Then I read the U.N. Declaration, and I knew why. Let me quote two short sections:

Article 10. Indigenous peoples shall not be forcibly removed from their lands or territories. No relocation shall take place without the free, prior and informed consent of the indigenous peoples concerned and after agreement on just and fair compensation and, where possible, the option of return. … States shall provide redress through effective mechanisms, which may include restitution, developed in conjunction with indigenous peoples.

Article 28. Indigenous peoples have the right to redress, by means that may include restitution, or, when this is not possible, just, fair, and equitable compensation for the lands, territories, and resources which they have traditionally occupied or used and which have been confiscated, taken, occupied, used, or damaged without their free, prior and informed consent.

Of course we haven’t signed this declaration! To do so would mean we are willing to provide “redress, by means that may include restitution … or just, fair, and equitable compensation” for all the land that the Indians occupied and we took. All the blankets we stood on Friday night.

I was still thinking about this when I went on Saturday to a half-day “CU on the Weekend” class on how unconscious attitudes influence health care. More specifically, we learned about health care disparities (which, interestingly, are called “health care injustices” elsewhere in the world) and the mechanisms underlying them. Which is to say we learned why it is that people of color (poor people, queer people, old people, women … pick a marginalized group) get poorer health care and have poorer outcomes even when all the variables that should affect health care are identical.

The professor’s proposition was that this occurs because of the unconscious attitudes we all carry around with us. She was talking about implicit attitudes, which I’ve mentioned here before. Basically, even if no one intends to treat marginalized groups differently, we are all influenced, in ways we don’t even recognize, by attitudes we’ve absorbed over a lifetime and don’t even realize we have. That goes for health care professionals as well as for the rest of us. It’s also true for patients, who approach health care with their own unconscious biases. And it’s true in virtually every situation we face, to one degree or another. I won’t even try to summarize three hours in two paragraphs. So let me just set this aside for a minute and mention another event that will help bring this all together.

Saturday evening, after this health care lecture, I saw the film “12 Years a Slave.” If anyone out there hasn’t seen it, do. As you all surely know, this film is based on the true story of a free African American man who was kidnapped and sold into slavery. The film is gripping and disturbing, all the more so for the fact that it’s a true story. So many of the horrors of slavery that we’ve heard about are lived out in this story. Watching it, I found myself wanting it to be fiction because it’s just too awful to imagine that people were actually subjected to this sort of treatment. And with no recourse. Absolutely none.

The echoes of that treatment live on in all of us, in those implicit attitudes that we have inhaled with the racism that still floats around in our world. In the attitudes that make for health care disparities and that invite the selective forgetting that lets us think genocide and slavery are history and have no relevance to today. That allow us to refuse, as a nation, to sign the U.N. Declaration on the Rights of Indigenous Peoples.

Over the years, demands for reparations for the mistreatment of African Americans and of Indians have come and gone in this country. Reparation would be one way to fulfill the dictates of the U.N. document … but we haven’t signed it, so we’re not obliged to adhere to it. Besides, when we consider the magnitude of the moral failings to be addressed, it’s easy to see why movements for redress meet with resistance. Partly because it’s hard to imagine how we could ever compensate either group for what this nation has taken from them. And partly because, we insist, that’s history and we weren’t personally involved, so it’s not our responsibility.

And this brings me back, full circle, to Mandela, our moral compass. It will be a great tragedy if, having lost him as a living model of the power of forgiveness, we also lose him as a moral guide, our north star. When Mandela became the first president of a newly democratic South Africa, he could so easily have used his position to punish those who had persecuted him and his people. But he didn’t. He chose truth and reconciliation over vengeance. That choice represented true north, and he never wavered from it, although he could have; he had the power to do so. The Truth and Reconciliation Commission sought exactly what the name stated: open acknowledgement of the wrongs that had been committed and sincere efforts to forge ties between people who had been at virtual war for centuries.

This, it seems to me, is our task vis-Ă -vis Indians, African Americans, and all the other groups that any of us continues to marginalize and disregard. We have to—individually and collectively—confront our failure to stay on course, to find a moral path. We have to do our part, individually and collectively, to find ways to acknowledge and then work to reconcile the differences that have kept us so apart. This isn’t an easy proposition, at least not for me. But when I was standing on that blanket, I knew I was being called out. And when I heard the lecture and saw the film, I knew I was in the dock again. Much as I might wish to think of myself as having mastered these issues, I know I’m not done with my work. Not by a long shot.

The morning after Mandela died, I considered posting a blog, but changed my mind. It seemed like everything had already been said. Besides, I couldn’t find words for what I was feeling. In the process of trying to write it, though, I was reflecting on how Mandela represented something about who we could aspire to be as human beings. I was reminded of a quotation from Abraham Lincoln’s first inaugural address. It seemed to capture who Mandela was in the world. And it also reminds me of my continuing task, highlighted by the weekend’s activities.


We are not enemies, but friends. We must not be enemies. Though passion may have strained, it must not break our bonds of affection. The mystic chords of memory, stretching from every battlefield and patriot grave to every living heart and hearthstone all over this broad land, will yet swell the chorus of the Union, when again touched, as surely they will be, by the better angels of our nature. – Abraham Lincoln, 1861



_______________________
* Initially, I used the name "Madiba" interchangeably with Mandela. I like that name. It's actually Mandela's clan name, and it's used as a term of respect and affection. But after doing a bit of reading around about it, I realized that my using it would be a form of appropriation ("I like that name. I think I'll just take it for my own use"), implying a close connection with Mandela that I did not in fact have. So, I chose not to use it. In the process, I was reminded again of how easily I can assume that I have a "right" to the cultural artifacts, names among them, of other peoples. Privilege can catch us anywhere if we're not looking. 


Tuesday, December 3, 2013

Hiding Places

A conversation with friends the other day got me thinking about things we hide away. Some are treasures, things of value that we hide to protect them. And then there are things we hide away because we don’t want to think about them. Things that shatter our simple worldviews, that make us uncomfortable, that scare us.

We were talking about books and movies that offer especially thought-provoking insights into lives that are so different from our own that we get surprised by them. One woman had just seen a movie, “The Book Thief,” about a German family who hid Jews during the Holocaust. We talked about the odd realization that we never much thought about there being “good” Germans during WWII. Of course there were, we all knew, once we thought about it. But their lives are so hidden, buried under the reams and reels of stories about the “bad” ones.

That discussion brought to mind an older movie, “Sarah’s Key,” which told the story of a Jewish family in France who were rounded up by French collaborators and sent to concentration camps. The whole family, that is, except for one small boy, whom his sister, Sarah, hid in a closet, locking the door. She told him to keep silent until she came to let him out. She thought she’d be right back—the police hadn’t been taking girls and women. Instead, it was years before she returned. Her brother had followed her admonition to stay there and stay still. The French family who took over the apartment noticed the smell but couldn’t locate its source. She knew where to look: his hiding place.

Our conversation about these two movies—one about Germans hiding Jews in the basement, the other about Jews hiding a child from his countrymen, both about lives largely hidden from our own awareness—was threaded through with this theme of hiding. It was this theme that brought to mind a pair of short books I read a few years ago that have clung to my consciousness ever since: The Buddha in the Attic and When the Emperor Was Divine, by Julie Otsuka. The first is about things left behind—like the Buddha hidden in the attic—when Japanese Americans were forcibly removed from their homes and sent off to detention camps during WWII. The second is about the places we hid those people, out of sight, and the invisible lives they led. I love this author’s writing style, which is unique and totally engrossing—you really have to read it to see what I mean. But more relevant to this discussion is her skill at laying out these devastating events for us to see without lecturing, without even commenting on the unspeakably cruel acts of our government against its own citizens. Just showing it, word by word and step by step.

Thinking about this later also reminded me of a play we saw recently, “Do You Know Who I Am?” This play was crafted from simple first-person accounts of “dreamers,” undocumented youths who are trying to make lives for themselves in this country. They’re daring to come out of hiding, to be public about their status in order to tell people—or rather, show people—who they are. There they were on the stage, proud, nervous kids (OK, maybe young adults, but to me they seem like kids) asking the audience, “Do you know who I am?” Good question.

Our discussion set me to reflecting on things that are hidden and how carefully—if unintentionally—we keep them that way. How often do we really think about Germans who risked their lives to save Jews, about French officials who collaborated with the Nazis, about the immense sorrow of Japanese Americans being marched away from their lives to remote camps, about undocumented kids who are our children’s and grandchildren’s friends, about the fact that we are walking on land stolen from the Indians? We all “know” that these stories exist, but we have the luxury of hiding them away when we want to. So the antidote is probably choosing not to hide them, at least from ourselves.

“Do You Know Who I Am?” was a production of Motus Theater, who also did a very powerful play last year called “Rocks, Karma, Arrows” (which I wrote about here). Our conversation the other day turned to that play and to the mistreatment of Indians—including massacres of whole communities by the US Army—and thence to Thanksgiving. I wrote last year about how uncomfortable I am about Thanksgiving. Basically, I find it really hard to celebrate genocide. I know that there are responses to that claim, but I just don’t find them persuasive, so I’m always a bit “off” on Thanksgiving.

The other day, as we were talking about some of these stories, my partner and I hatched a plan for next year at Thanksgiving. One of the most reprehensible massacres of Indians (if such things can be placed in a hierarchy of awfulness) occurred at Sand Creek in southern Colorado—now a national historic site. It turns out that one of the Japanese internment camps, Amache, was in Colorado, very close to Sand Creek.

When we lived in New England, we used to go to the counter-Thanksgiving held by local Indian tribes at Plymouth Rock, the land of the Pilgrims' pride. That’s too far away now, so we decided to make our own counter-Thanksgiving by visiting Sand Creek and Amache, bearing witness to the stories that we don’t honor with holidays. Trying to avoid hiding them away. Again.


Monday, December 2, 2013

Natalya

There are times when someone’s passing just has to be acknowledged. This was one of those times. My partner called my attention to the New York Times obituary: “Natalya Gorbanevskaya, Soviet Dissident and Poet, Dies at 77.” I had seen the headline, but hadn’t recognized the name until she said it on the phone. The words of a Joan Baez song came immediately to mind. I’d heard her sing “Natalia” many times, and I knew it was an homage to a woman who had defied the Soviets and was summarily committed to a mental institution. (You can hear it too if you click on “Natalia.”)

And now, thanks to the Times and a little Internet sleuthing, I know much more about Natalya. Her most notorious brush with the Soviet authorities involved her 1968 protest against the USSR’s invasion of Czechoslovakia. In the words of the Times, Natalya and a group of fellow dissidents “stood on a spot reserved for executions in prerevolutionary times and held up banners with slogans like ‘shame to the invaders.’” Not finished with her outrage, she wrote about the arrest and trial of her companions in an independent newspaper she had helped found, whose aim was explicitly to stand in opposition to the USSR’s “official” newspapers.

The following year, she helped found a group to promote civil rights in the Soviet Union. Her explanation: 

“One must begin by postulating that truth is needed for its own sake and no other reason.”

The simple statement speaks so clearly to so many issues: kindness, poverty, race, education, war, healthcare, love. So dangerously clearly.

Shortly after the protests, in 1969, Natalya’s writings challenging the Soviets got her arrested. She was diagnosed with “continuous sluggish schizophrenia” and committed to a mental institution, where she remained until 1972. Two years after the demonstration in Red Square, she published a book about the protest and the subsequent trials, which later appeared in English as Red Square at Noon. I haven’t read it, but I plan to.

Finally, in 1975, she emigrated to Paris, where French psychiatrists pronounced her mentally healthy and concluded that she had been committed for political, not medical, reasons. No fooling. It reminds me of a speech that Dr. Martin Luther King, Jr. gave to the American Psychological Association in 1967 in which he urged the creation of an “International Association for the Advancement of Creative Maladjustment,” an organization dedicated to the practice of defying common norms in the name of justice, of honoring “maladjustment” that speaks truth to power. Seen in this light, Natalya’s story is both an inspiration for us and a cautionary tale about the many ways that people can be silenced.

In her introduction to the song written in Natalya’s honor, Joan Baez said, “It is because of people like Natalya Gorbanevskaya, I am convinced, that you and I are still alive and walking around on the face of the earth.”

That’s a large debt of gratitude we owe.




Sunday, December 1, 2013

AIDS, still


December 1 is World AIDS Day, and in my queer world, the date is usually marked in some way. But this year, it seems especially salient. For one thing, I’ll be hosting a radio show about HIV/AIDS on December 2 on the program I mentioned here before, “Outsources,” which airs Mondays at 6:30 on KGNU (88.5 FM. Listen in!). Also, we just saw a movie about the early years of the AIDS epidemic—“Dallas Buyers Club,” which reminded me of a documentary on that period that we saw early this year—“How to Survive a Plague.” These things all come together in my mind today—the films, the show, the awful reality of what AIDS has been and still is, despite its general disappearance from the front pages.

I realize that some folks aren’t too aware of the trajectory of HIV/AIDS. Not everyone was old enough to follow the story back when it began in the early 1980s, and many who were may have lost track of the account. So I think I’ll start with a quick primer: call it HIV/AIDS 201 (for 101, check out the hyperlinks in the Wikipedia entry, which will take you to all sorts of information). I’m hoping this doesn’t seem like a lecture. I actually mean it to be the story of a part of our lives. It’s the story that’s running through my mind today, on World AIDS Day, 2013.

It’s hard for me to believe that the HIV/AIDS pandemic is now over 30 years old. It seems like far less time than that since I first heard of this strange disease that seemed to be affecting gay men. On the coasts, mostly—New York and San Francisco. The first case in the US was identified in 1981. Initially, no one knew quite what to make of the swirl of unusual symptoms that were suddenly appearing—uncommon cancers, uncommon forms of pneumonia, heightened susceptibility to all manner of infections. Soon, it was clear that the disease, whatever it was, appeared to be most common among gay men. So it was called GRID—gay-related immune deficiency. Its association with homosexuality and later (to a lesser extent) with IV drug use set the social trajectory of the illness. The disease emerged into a culture already laced with virulent homophobia, and it quickly became fodder for that homophobia.

The stigma associated with homosexuality was so intertwined with the disease that AIDS was virtually ignored—or actively dismissed as unimportant—by society at large, even as scores and then hundreds and finally thousands of people died. Then-President Ronald Reagan was famously silent for six years as the epidemic raged. By the time he finally mentioned AIDS in May of 1987, nearly 21,000 Americans had died from complications of AIDS. And as we watched (or not) the disease’s progression in gay and bisexual men, HIV/AIDS was also thriving in other communities—especially among IV drug users and their partners (of whatever sexual orientation and gender identity) in this country, and largely among heterosexual people elsewhere around the world. By the time Reagan broke his silence, the disease had been identified in 113 countries, with more than 50,000 cases worldwide.

Reagan’s surgeon general, C. Everett Koop, who would have spoken out, was silenced “because,” he said, “AIDS was understood to be primarily in the homosexual population and in those who abused intravenous drugs.” The president’s advisers, Koop said, argued that “They are only getting what they justly deserve.” This refusal to care was a stark indication of the intensity of the stigma attached to homosexuality and IV drug use: they were seen as so reprehensible that they warranted a death sentence. In that climate, it’s not surprising that many people weren’t out to their families. Suddenly, countless men came out to their families by telling them they were dying of AIDS.

The silence and the stigma persisted as a generation of gay and bisexual men watched their friends and lovers die around them, as did uncounted IV drug users, people who received unsafe blood by transfusion, people who had sex with anyone who had the virus from any source, people who had sex with people who had sex with anyone who had the virus … the list was monstrous, and the potential for it to grow on and on, unimpeded, was huge. Finally, it was not our government but folks on the street who stepped up, spoke out, yelled and protested and organized. The LGBTQ community and many straight folks began to care for those who were infected. Where the government failed, home-grown clinics, personal-care and food-delivery services, buddy programs, cadres of volunteers and friends showed up for the people who were dying. The LGBTQ community had never been so united. That period—the 1980s and beyond—remains a monument to what communities can do. Among the groups that sprang up were the Gay Men’s Health Crisis, the first major grassroots AIDS-related group, and ACT UP (the AIDS Coalition to Unleash Power), whose motto was “Silence = Death.” Such groups became the strong, insistent, irreverent voices for all the folks who were unheard, using grassroots organizing, learning everything they could about HIV and AIDS, being rowdy and misbehaving in staid and proper circles.

Finally, others got on board. Slowly, drugs were approved—often under extreme pressure from grassroots activists—and slowly, those drugs got more effective and less toxic. What had been a virtually certain death sentence eased its grip, and people began living longer with HIV. Then, in the mid-1990s, new drug “cocktails” arrived on the scene, seeming to make of HIV/AIDS a lifelong health condition instead of a short path to early death. These drugs were still too toxic for some people, and millions of people around the world who could benefit from them couldn’t afford them or couldn’t manage the grueling regimen and precise schedule they required. But for many, the future changed.

It is these most awful of days, months, and years that were depicted in both the documentary (“How to Survive a Plague”) and the movie (“Dallas Buyers Club”) I mentioned above. The deep fear, the desperation, the sense of total abandonment, the wrenching sadness of loss after loss after loss, the coalescence of a community under siege. The documentary is about AIDS activism—especially the work of ACT UP, of sympathetic docs, of the allies who stepped up, and especially of the LGBTQ community (and its allies) that saved its members and in the process birthed itself. The other film is about the same epidemic, same period, same activism, but this time featuring an individual instead of a community: a lone heterosexual man who got AIDS, wanted drugs for himself, and began a crusade to provide them for others (and to make a lot of money in the process), and also to insist that the government stop acting as an obstruction and start caring for its citizens. Both are, from different perspectives, inspiring reminders of, to paraphrase Margaret Mead, the power of a few committed people to change the world. You’ll be glad you saw them.

By the mid-1990s a corner seemed to have been turned, and many people—including those most visible to mainstream America—went back to living. The drugs continued to get better, and official, government-sponsored systems emerged to provide medication and services to people living with HIV/AIDS. The newspapers were no longer filled with news of AIDS, and the obituary columns weren’t filled with men who died young. The funerals slowed, the persistent terrible news waned. People who had quit their jobs returned to work. Relationships seemed possible again. Life seemed possible again. On the other hand, paradoxically, this change was complicated for some. People who had spent or given away everything and cashed in life insurance policies assuming that they would die now tried to figure out how to recover financially. People who had stayed with a partner rather than abandon him (or sometimes her) as he (she) was dying had to decide anew whether to stay. Organizations that were created to provide services and political clout saw their support and their influence decline as grassroots services were replaced by “official” ones.

Along with these challenges, the stigma lived on. LGBT people, whatever their HIV status, and people with HIV/AIDS, whatever their gender identity and sexual orientation, still struggled with stigma. Even today, HIV/AIDS is often seen as a sign of moral failure. More than any other disease I can think of, there is great shame attached to HIV/AIDS—as if the disease weren’t enough of a burden. Which brings us to this year, World AIDS Day, 2013.

The good news is that we have several important gains to celebrate these days:

  • New guidelines issued by the US Preventive Services Task Force encourage routine testing for all adolescents and adults 15 to 65 years of age. More testing means earlier detection (currently, about 20% of adults in the US who are infected don’t know it because they’ve never been tested), earlier life-saving treatment, and less of the transmission that happens when people don’t know they’re infected.

  • The drugs keep getting better and easier to manage, and people are living longer. 

  • With the Affordable Care Act, medical care will become easier to get and more affordable for folks in this country. 

  • Although there is no “cure” on the horizon, a so-called functional cure has been achieved in an infant. That means that the virus is barely detectable and is not replicating, despite the fact that the child isn’t receiving drugs any longer.


But in the midst of this tremendous sense of relief, this easing of tragedy and terror, we mustn’t lose sight of all that remains undone in the fight against HIV/AIDS. Here’s a short list. We’ll address some of this in Monday’s show.
  • Even as HIV/AIDS has become less devastating (at least for the sub-group mentioned above), the stigma associated with HIV/AIDS persists. Largely because the disease in its earliest days was so closely associated with groups that were already stigmatized—gay and bisexual men, other men who have sex with men, IV drug users—this stigma grew strong and deep from the beginning. It persists for infected people both within and outside those groups.

  • The “cocktails” have been hugely successful in transforming HIV/AIDS from a death sentence to a manageable condition—among people who have access to the drugs and who can afford them and tolerate them. However, as the people who were most vocal have become less visible because they are in less urgent need, attention to AIDS has waned, as has funding for AIDS-related research and services.  

  • For a whole host of reasons, we are currently seeing an increase in the number of gay and bisexual men who engage in high-risk behaviors—a number that had been lowered, painfully and slowly, during the early years of the epidemic. Many factors probably contribute to this rise. The young men in these communities didn’t live through the worst of the epidemic and have no personal sense of how awful it was. Besides, they think, new drugs have made of HIV/AIDS “just a condition” that can be managed with meds. Some folks have suggested a role for internalized homophobia—the self-hatred that arises when stigma is incorporated into your own self-concept. Another factor may be that men are choosing sexual partners with the same HIV status as their own—or so they believe. Finally, some research indicates that the newest drugs also protect against transmission of the virus, so people may think they needn’t worry. None of these rationales holds up entirely. Despite the new drugs, unprotected sex still increases the risk of contracting HIV, and HIV/AIDS still has massive consequences. The drugs continue to have serious side effects, and growing old with HIV/AIDS comes with serious costs. Recent research has found that people who have been on the drugs for a long time have higher incidences of cancer, heart disease, diabetes, dementia, depression, and other major health problems. They also die younger, on average, than their uninfected peers. This is not to berate people who have the disease, but to point out that the drugs have not turned living with HIV/AIDS into a non-issue, a trivial inconvenience, a consequence of such little importance that it needn’t be factored into sexual decisions.

  • The communities in the world that are most severely affected by AIDS are not the sort of mainstream American gay and bisexual men who became the face of AIDS in the US in the 1980s and 1990s. They are IV drug users of all genders and sexual orientations, people of color (ditto), people in other nations (especially in Africa), and poor people everywhere who do not have access to adequate health care. In this regard, the Affordable Care Act will help by (theoretically) providing access to adequate and affordable medical care for US citizens. That, of course, still leaves out people everywhere else in the world, as well as people in this country who don’t connect with health care providers for any reason. Among the reasons might be lack of education, poverty (you still have to sign up through the ACA, which is hard if you have no transportation and no internet), and that persistent, painful bugaboo: stigma. Fear of judgment.

  • Among the groups with high rates of infection are the children of mothers with the virus. Recent research has found ways to reduce this transmission, including administering drugs to the mother during pregnancy and to the infant after birth and then forgoing breastfeeding. With appropriate treatment, the risk of mother-to-child infection can be reduced to about 1%. But it’s not hard to figure out that people in some of the most heavily infected areas of the world do not have the luxury of following these procedures. AIDS education is non-existent or minimal, drugs are difficult or impossible to acquire, and breastfeeding is often both culturally and logistically the only option.

So, more than 30 years into the epidemic, it’s easy to find reason for great hope and reason for great concern. Personally, my greatest concern is the risk of apathy, indifference, complacency. In contrast to the 1980s, it’s easy to understand why 2013 looks great for many in the LGBTQ community. But we’re not done, and other groups are in far worse shape than those in the American mainstream. We just can’t stop.

My greatest hopes lie in three domains. The first is the tremendous progress now being made in the medical field. Given what researchers are now learning in non-AIDS-related areas (epigenetics, especially), major continuing progress in treatment seems virtually certain. I hesitate to mention words like “cure” and “vaccine,” but we’re talking about hope here. My second great hope lies in the gradual downward trajectory of the stigma associated with HIV/AIDS—at least in some segments of the population in this country. Clearly, HIV infection still comes with a load of stigma. But as attitudes toward LGBTQ people improve, some part of that stigma may slowly wither away. My hope is that over time, just as the stigma of HIV/AIDS grew from and then reinforced homophobia, the slow decline in (at least some forms of) homophobia may usher in less judgmental attitudes toward HIV/AIDS infection. My third great hope lies in the work of folks like Bill and Melinda Gates and U2 singer Bono, both of whom have worked long and hard (and spent a lot of their money) to bring understanding, education, and treatment to some of the poorest and most heavily infected regions of the world.

It’s common in the queer community to mark the history of our movement by reference to Stonewall, the 1969 uprising that birthed the contemporary “gay rights” movement—events are marked as “before Stonewall” and “after Stonewall.” But the gay and bisexual men I know mark their history in another way. Events are remembered and coded as “before AIDS” and “after AIDS.”

HIV/AIDS changed the world, and it’s not over yet. I hope we don’t forget that.