Most of you are probably familiar with the ALS Ice Bucket Challenge. Celebrities have taken part, including Bill Gates, and it’s been filling social media.
But for those of you who aren’t familiar, it’s pretty simple: when challenged, you either dump ice water on your head or donate $100 to ALS research and treatment. Many donate the money regardless, but if you do dump the ice water, you can challenge three more people, giving them 24 hours to comply. In effect, it goes like this:
The goal, besides raising money, is to spread awareness. The viral quality of the campaign has proven particularly effective, raising over $100 million, according to this article from Aug. 15, and bringing ALS to the forefront of the public sphere. It is a brilliant viral campaign, seeming to make a positive difference.
But for some the project feels too public, too self-broadcasting. It reeks of shallow millennial-led narcissism and low-effort activism, where over-rich Americans throw cold water on themselves, film it, send it to the world, and think it constitutes “help.”
I know I’ve been posting a lot lately about the Internet and digital literacy, but this time, it’s based on one of the more recent Idea Channel videos:
To summarize, many online “speech communities” from specific groups and interfaces have their own linguistic patterns, expressions, and focuses. In the language of the video, they have “dialects,” just as different geographic regions have different wording, slang, and linguistic personalities.
For example, as the video shows, the /b/ forum on 4chan feels and sounds coarse, chaotic, and (to some) unfriendly. And Tumblr tends to use many .gifs, thanks to its .gif-friendly interface.
As I think of these topics, I often turn to the German philosopher Jürgen Habermas and his conception of the “public sphere.” While the details often differ depending on the theorist or the argument, the public sphere is essentially a space where people from different backgrounds can meet and discuss topics in a united context. Imagine a park, bringing together a web of people, or a coffee shop, open constantly to the public.
For Habermas, one of the key principles of the public sphere is its “universal access.” Here, many others attack him, as access to the public sphere often requires certain things, like a reliance on shared symbols and rules, a level of education, and material access. Many also critique his assertion that this public sphere must be rational, a carryover from the historic genealogy that Habermas uses. Action- and meaning-defining discourse may be happening, they argue, even if it is not “rational.”
Thus, while the Internet may seem like a “public sphere” of sorts, it clearly isn’t because it lacks this universal access. You need a connection, something many people do not have, and the Internet lacks the order and unity that a public sphere seems to imply. Its borders and spaces have no geographic limitations. Some exist beyond the realm of legislation. Professional or educational websites coexist with amateur, joking, obscene, pornographic, criminal, and chaotic spaces. Many different languages and symbols collide, and many users don’t “discourse,” but troll or produce random content, like “YouTube poop.”
My vocabulary is deliberately spatial and organic here. Like our living spaces, the Internet is a lived-in space, changed by those who live in it. Or, to go back to language, the Internet is always in a constant dialogue with itself, as the theorist Mikhail Bakhtin might have seen it. It builds meanings, connotations, and references constantly through the shared use of its symbols and spaces. Memes change. Expressions change. Words emerge, like “smol” or “lol.” The Internet and digital technology, in Bakhtin’s language, is the new novel, alive and changing.
I said to myself, “Look, I have increased in wisdom more than anyone who has ruled over Jerusalem before me; I have experienced much of wisdom and knowledge.” Then I applied myself to the understanding of wisdom, and also of madness and folly, but I learned that this, too, is a chasing after the wind. For with much wisdom comes much sorrow; the more knowledge, the more grief.
I once acted in a series of one act plays, and when I wasn’t running lines or rehearsing, I watched the other shows. One particular line has stood out from the experience: “Why be better?” I almost missed it, but hearing that line over and over, I finally realized how nihilistic it was. Yet, some days, I ask myself the same thing.
For the most part, it seems to be a modern question. Ennui, hysteria, and melancholy became common, even expected, medical diagnoses for the growing middle class in the 18th and 19th centuries as prosperity and public reform democratized leisure. Prior to that, some historians argue, people didn’t have the resources for ennui.
Couple this with growing cities, rising industry, increased skepticism for religion and morality–Darwin’s work being one cause–and one can see the anxiety and hopelessness that spurs such questions, especially by the start of the 20th Century.
Monday’s Merriam-Webster word of the day was hashtag. Few other elements of social media have endured the same ire and satire. I’m sure many eyes rolled with the #ashtag selfies from this past Ash Wednesday. And Jimmy Fallon has poked fun at them with major celebrities. In both instances, I found myself laughing, but I didn’t know why.
Indeed, the octothorpe, relabeled and retrofitted for new media, has broken beyond the realm of the phone. In its new place, it has had some helpful uses. The hashtag organizes the flood of rapid-fire information on Twitter. Revolutionaries and activists in the Arab Spring used it, and for journalists, it lets their observations climb above the noise and sail alongside other “trending” news and topics. And, as with any creative use of language, a well-used hashtag can trigger a laugh or a smile.
So why the scorn and parody? I think it’s the growing pains of a new mode of speaking entering our lexicon. We’re still learning how to use the hashtag, and as with any piece of literacy, open use creates some strange, comical combinations and incurs the skepticism of tradition.
I found this link today to some inspiring words from Bill Watterson, the reclusive creator of the beloved Calvin and Hobbes series. The words critique the high-climbing, fast-paced American view of success and happiness: work hard, keep climbing, and one day you’ll be happy, or at the very least you’ll have fame, success, and a lot of money. Pointing out the statistics and the logical fallacies this view entails is not new. Neither is Watterson’s encouragement to break away from social pressure and follow personal passions, ignoring the flak and shame that comes from following “the road less traveled.”
Some people may think such encouragement is trite or naive–the sort of drivel that idealistic college kids tell themselves when struggling in classes and accruing debt, or that peppy elementary teachers post on walls, but ultimately a lie, as pervasive and false as the American dream. But when one considers the way Watterson lived out his own advice, the words gain a new depth. He did resist corporate pressure and created one of the most beloved, evocative comic strips around. Not everyone would want to follow his path, and many may think his reclusive life unstable and unhealthy.
But still, hearing such words in such a monoculture of competition and corporate ambition is refreshing. Hearing such words from Watterson, transformed into a homage by cartoonist Gavin Aung Than–that is truly moving:
Hey all, I have a longer post I wrote today, but I want to edit it and post it tomorrow or later in the week. It’s about the relevance of philosophy, so I figured that this engaging video would be a nice primer.
The video is a roundtable discussion and lecture about the relevance of philosophy. It takes place at The New School in New York, with some leading thinkers in the field of philosophy and otherwise. Some of the conversation is quite interesting and well-worth the watch if you, too, wonder what the point of philosophy is.
I moved back into school today to start the next semester. A new semester has always had more of a “New Year” feeling than New Year’s itself, since school provides a ready-made change of scenery and lifestyle.
That said, I try not to treat “resolutions” like “revolutions.” Often, this time of year–especially the first week or two at the start–gets annoying. Everyone has a hundred hopes, impossible plans, and vague outlines, all aimed at turning them into a new person. I respect the hope and spirit that goes into this, but as with many things, the hope outshoots the reality. Would-be gym-goers, dieters, meditators, and volunteers slump back into their old habits, like a well-worn couch, and lose momentum until “next year.”
Other research has different numbers, but the conclusion seems pretty clear: resolutions don’t come easily.
One thing that may hamper our ability to reach our goals is an inherent limitation to self-control. Recent research seems to indicate that we can only use so much self-control before we succumb to temptation. Or, at the very least, we become more likely to succumb. That pizza, ice cream, and beer hits us much harder after a long day at work.
Sometimes we even rationalize it, saying “Well, I worked hard today and kept up my diet, so I deserve a little something.” The psychologist Kelly McGonigal, who studies willpower, critiques this particular tactic that she calls “moral licensing” in an interesting video.
Moreover, moral licensing and limited self control aren’t the only things that impede resolutions. The stubborn resilience of bad habits, our inability to visualize future selves, competing priorities, guilt-saturated procrastination, and more set strong roadblocks between us and progress.
I found this recording of the famous post-modern novelist, cultural critic, essayist, and educator David Foster Wallace delivering a commencement speech at Kenyon College. The words are all the more haunting knowing that Wallace hanged himself Sept. 12, 2008 after a lifelong struggle with depression. The main focus of the speech is the “human value” of a liberal arts education. For Wallace, an ideal education provides “awareness” of our world and our way of processing the world.
With this in mind, two passages in particular struck me. The first deals with the potential dangers of the mind. As Wallace says:
Twenty years after my own graduation, I have come gradually to understand that the liberal arts cliché about teaching you how to think is actually shorthand for a much deeper, more serious idea: learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about ‘the mind being an excellent servant but a terrible master.’
As someone who suffered from depression, Wallace clearly understood the tyranny of a mind mastering reality, the way it warps and weaves impressions into a gloomy, self-destructive haze, leaving one alone in a world of friends.
But equally destructive is the closed-minded comfort that creates destructive prejudices or what Wallace calls our “default setting”: the self-focused way we narrate, judge, and arrange our life. In itself, this is innocuous, but when we start to think our reality is the norm or the “right” way of doing things, a process called “normative hubris,” we can become destructive.
As the blog and book You Are Not So Smart argues, our “rational” or “informed” opinions are often biased rationalizations. Some of these biases may be cultural or biological, but many are self-created, or at the very least, they can be self-controlled.
This, argues Wallace, is the goal of the liberal arts education: the ability to recognize this hubris and ignorance and do our best, if possible, to keep it in check. It grants us the ability to recognize the most basic thing, the way we explain reality.
Wallace is not the only person to say this. It rings with the self-conscious ignorance of Socrates and echoes Albert Camus’ dictum from his notebooks: “An intellectual is someone whose mind watches itself.” Its view of education also mirrors what astronomer Carl Sagan said in his final interview about science: “Science is more than a body of knowledge: It’s a way of thinking.”
Wallace’s unique addition is the painful awareness he has of his own limitations and the poignant, almost Zen-like awareness that the simplest, most pervasive things are the most hidden. Wallace opens the speech with a didactic story about two young fish swimming. Coming from the opposite direction, an older fish swims by them saying, “Morning boys, how’s the water?” The two younger fish keep going, and eventually one of the fish turns to the other and asks, “What the hell is water?” The very fabric of their existence is far from obvious.
This parable returns toward the speech’s conclusion in a pointed restatement of the theme:
[T]he real value of a real education [has] almost nothing to do with knowledge, and everything to do with simple awareness; awareness of what is so real and essential, so hidden in plain sight all around us, all the time, that we have to keep reminding ourselves over and over:
‘This is water.’
‘This is water.’
It is unimaginably hard to do this, to stay conscious and alive in the adult world day in and day out. Which means yet another grand cliché turns out to be true: your education really IS the job of a lifetime.
As someone who works as a T.A. for a freshman composition class and in a writing center that aids students with the composition process, I’ve come to reach a similar point of view–I hesitate to call it a conclusion. Now on the other side of the desk, where I’m supposed to provide “knowledge” or “guidance” to new students, I painfully recognize the subjectivity of it all, the hubris of trying to “teach” someone how I see the world.
Instead, I just want to make them aware–aware of the world around them, with its conversations and conventions, and how they fit into it. What their own voice has to say. Or what their own voice has misidentified, misunderstood, or overgeneralized. But I often feel torn between the immediate goals of polishing up their arguments, correcting their grammar, or getting them a good grade and this much more idealistic, long-term longing. Moreover, I often struggle with normative hubris or unaware autopilot in myself.
Most of the time, I’m not the older fish who sees the water. Most of the time, I’m simply the younger one, asking, “What the hell?”
For a while now I’ve been wanting to write an essay about end tables, coat pockets, bag bottoms, and storage cabinets. We often forget these clutter-gathering crevices of our individual lives, until we fish through an old coat and pull out receipts, candy bar wrappers, and a dollar or two tangled with some coins. These seemingly random articles, disused and long-forgotten, once played a role. We bought something, earning that receipt. We ate that candy bar and couldn’t find a trash can. We pocketed that loose change.
Such odds and ends reveal our former lives, providing a time stamp for our days and habits, whether they are the books and jewelry on our end tables, the unused casserole dishes in our cabinets, or the grit at the bottom of our bags. Our past selves leave traces. And just as archaeologists dig through the rubbish of past cities, we can dig through our own lives.
But unlike archaeologists, we don’t normally care about these random bits of rubbish. We crave the big picture–the narrative that collects the pieces, not the pieces themselves. Perhaps a few things transcend this bias, like a stone from our childhood house, a ticket stub from a memorable movie, or that framed first dollar a business might display. We infuse these random pieces of existence with meaning and display them, like a museum of our lives.
But in themselves, they are mere physical objects. That dollar passed through hundreds of indifferent hands before it fell behind that frame. Its “it-narrative” probably included stints buried in coat pockets or lost in the wrappers and rubbish on the bottom of a bag. Maybe it fell behind a bed. Maybe it went from a lemonade stand to a store clerk to a strip club. That dollar connects us to hundreds of other lives–including our past selves–but its average everydayness camouflages it.
Sorry for the absence–it’s been the final weeks here at school, so I have been grading, tutoring, and working on final papers like crazy. Expect a post this Sunday, but in the meantime, here is a link to the first part of a documentary about Charles Olson, a poet I wrote about this semester. The rest of the documentary is online as well.
Olson, considered the foundational figure for the “projective verse” movement and a key figure for New American Poetry, was a well-read and fascinating character. Born Dec. 27, 1910 in Worcester, MA, to a postman, Olson spent most of his life in the small fishing town of Gloucester, MA, where he wrote his most famous work, The Maximus Poems.
He read voraciously, and through his own work as a postman in and around Gloucester, he developed an intimate eye for detail. This latent curiosity and a love of history spurred his studies at Wesleyan and Harvard, where he became a critical expert on Herman Melville, prompting his 1947 book Call Me Ishmael.
Besides his poetry and his 1950 critical essay “Projective Verse,” Olson’s most well-known accomplishment was his time teaching at and directing Black Mountain College, a small liberal arts school near Asheville, NC, that acted as a gathering point for avant-garde teachers and students from its founding in 1933 until it closed in ’57. Some of its faculty and students included Robert Creeley, Ed Dorn, John Cage, Josef Albers, Robert Duncan, Allen Ginsberg, Willem de Kooning, and more.
One of his most original ideas is the notion of “polis.” Drawn from the Greek word for city-state, “polis” for Olson constituted the ability of a certain local area to connect to and mirror the world. Olson, a historian and observer by trade, studied the records, geography, and people of his local Gloucester, and by doing so, he laced his personal memories and existence into the geography and history. Synthesizing the personal connection and history, he was able to create an overlap, where the personal bled into the historical and geographical. This was polis: seeing the “totality of the system” by “inverting” it, the macrocosm through the microcosm.
Olson, however, was a controversial figure. He opposed the capitalism that now directs our everyday way of life, seeing it as a “mu-sick” that flooded out and leveled down polis. And his larger-than-life personality–he stood 6 feet 7 inches–was as well known as his womanizing and his dismissive attitude toward most women poets. Some also think his writing and presence at Black Mountain and elsewhere assumed the role of a high prophet or Zen master, didactic and needlessly cryptic.
While some of these criticisms may be more accurate than others, one has a hard time doubting Olson’s influence or intelligence. And taking a leaf from his own book, I encourage anyone interested in him to do their own research, this documentary providing an engaging start. Enjoy.