Self-care and Students

This week, Jay Dolmage–a prominent scholar of disability studies and disability rhetoric–has been to Syracuse for our department’s Spring Conference, giving a talk and leading a workshop (both were wonderful, and here are the materials and more on accessibility and disability studies). This, along with some other things, has made me think about self-care and students.

I’ve always found that writing instructors have a unique connection with students compared with instructors in other disciplines. We often have smaller classes, we tend to teach a huge portion of the university’s students, and we get a lot of frosh. In addition, writing tends to involve many more skills than “grammar”: critical thinking, reading strategies, synthesizing ideas, formulating arguments, researching topics, analyzing primary and secondary sources, evaluating sources, cultivating and managing productive work and workflow strategies, etc.

And, perhaps in a more Romantic sense, writing, even academic writing, is a personal task. Though the image of the lone writer in some castellated tower is not accurate, writing and authorship–crafting a cohesive document that carries our mark and oftentimes our name–is something powerful, even in our information-saturated age. Issues of voice and privilege play a role, along with identity. And, one of my favorite adages from the field remains: “Writing is thinking.” Often, as we think through an issue by writing, we learn something new.

I am not saying that this is true for all writers or all writing, but it happens. And we often give a space for students to reflect on issues like identity and experience because literacies, in all their forms, are fundamental to how we exist in and experience the world. As the humanities get stripped and softened in many universities, “writing” provides a space to reflect on fundamental questions and experiences–provided students and administrators allow it.

All this is to say that increasingly, though we already have too much to teach in a semester, I’ve been trying to address, or think about addressing, issues that are not immediately tied to writing. And in this post, I want to stress self-care (or self care, with no hyphen?). Inspired by my colleague Allison Hitt, and others, I’ve increasingly made some space to address self-care with students, particularly strategies and experiences.

This deserves its own post or article, and the input of others with more experience, but for now, I often start with this article on perfectionism and procrastination. It disarms the usual narratives we and our students tell ourselves–and are told–about productivity and laziness.

But in a more general sense, I think “addressing self care” involves getting into the embodied, day-to-day experiences of being students and writers, of being friends and partners, of being sons and daughters (or something else, articulated or not)–in short, we get at being human.

And, it does not always work. And it can be exploitative and risky for you and students, with plenty of pitfalls–and is impossible with increasingly destructive teaching loads for adjuncts and others–but I feel like it is important to consider and strive for when possible, not only for our unique connection with students in higher ed, but also for the fact that questions of productivity, writing, and development involve care.

You cannot succeed, or even survive, if you’re always treading water.

And, as part of my own focus on technology, I also increasingly think it’s important to talk about privacy, cyber security, and technology habits as part of our profession. Media literacy, password security, the mental effects of social media, screen time, etc., are both questions of literacy and questions of self-care. As our digital lives and “flesh” lives fuse–as well as the literacies and skills we rely on to negotiate these lives–the importance of these topics increases.

I know I am not the first to do this: from Stuart Selber to Estee Beck–and Selfe and Hawisher and a bunch of other brilliant young and cornerstone scholars–writing instructors have long recognized the role of technology in composing. But, I think we also recognize the intersection of self, technology, and literacy in ways that are profound and unique. And increasingly important.

Watson Talk – Ownership and Online Composition

“There is a pleasure in the pathless woods,
There is a rapture on the lonely shore,
There is society, where none intrudes,
By the deep sea, and music in its roar:
I love not man the less, but Nature more,
From these our interviews, in which I steal
From all I may be, or have been before,
To mingle with the Universe, and feel
What I can ne’er express, yet cannot all conceal.”

-Lord Byron

Watson Talk Slides

Starting off a reflection about social media with a quote from Byron about the solitude of nature seems counterintuitive. A “society, where none intrudes” clashes with the usual rhetoric surrounding the networked culture of digital spaces, and the “lonely shore” and “pathless woods” probably lack WiFi–or broadband.

But bringing in Byron highlights the paradox of place that the Internet and digital technology bring. We are networked selves, accessing the Internet in multiple ways from multiple places or portals, as our physical self continues to take up space and air “irl.” And much like the narrative locales of Romantic poetry, many digital spaces are constructed and emergent.

Byron’s saga traces the physical geography of Southern Europe, but Byron’s textual place–his “pathless woods” and roaring sea–arrives at us in ephemeral language through his poetry. They are authored locales. Phrased another way, one can visit the spaces where he allegedly traveled while writing Childe Harold’s Pilgrimage, but those irl locations—the rocks, the rivers, the trees, the moss-laced logs—all differ from the locations we envision when reading or hearing his poetry—nor are they constant over time the way the printed word is. Language both signifies and creates locales.

Similarly, I think that the quality of born-digital space forces us to look at space as an ephemeral, emergent gathering. Websites may have a url pinning them down and servers in the world sucking up power and taking up space, but we largely experience them subjectively. In his later work, Martin Heidegger discusses the notions of “location” (or “locale”) and “space.” As he writes in “Building, Dwelling, Thinking”:

“The location is not already there before the bridge is. Before the bridge stands, there are of course many spots along the stream that can be occupied by something. One of them proves to be a location, and does so because of the bridge. Thus the bridge does not first come to a location to stand in it; rather, a location comes into existence only by virtue of the bridge.”

The bridge in this example, by being constructed, opens a “location,” a significant site where different elements can gather and be. One can look at the bridge as a concrete space of possibility, a site that can direct meaning at some level in ways that an unmarked, undeveloped area cannot. Before the bridge exists, the area is just a “spot.” Things are happening in it, but nothing is built there. And with no building–or inscribed significance, like a park or childhood memory–the place feels anonymous.

On the one hand, this is obvious, and Heidegger’s obscure thinking may over-complicate the matter. But I think it gets at something important: how construction creates a fundamentally new reality at a site. Before the bridge, the space was simply “nature” or a river bend. Now, the bridge may have a name. It serves a human purpose for commerce. Lovers add locks to it. It may be in a film. It may represent a certain style or culture. It interacts with the nonhuman environment, deflecting rain and providing shelter for animals.

In Heidegger’s thought, a “thing,” like a bridge, is not an inert site of stone and steel. Drawing on the older use of thing in the Icelandic and Germanic languages–“Ting” and “Ding,” respectively–a thing is a site for assembly, a gathering of people to reach decisions. With thinkers like Bruno Latour and Thomas Rickert picking up on this use more recently, I think we can look at Internet architecture with a similar dynamism.

A site is often even more of a “thing,” in this sense, than Heidegger’s bridge. It is a place for gathering. And in that gathering, a fundamentally location-attuned way of being arises through the interplay of different forces. As Nancy Baym argues in “The Emergence of On-Line Community,” online communities are emergent rather than dictated. As she writes, “Social organization emerges in a dynamic process of appropriation in which participants invoke structures to create meanings in ways that researchers or system engineers may not foresee.” Participants inherit certain structures or systems, Baym points out, and users dwell in and add to these initial elements to construct social practices and communal spaces. Location emerges. The community of individual authors writes and is written by the location.

But I want to turn, particularly, to authorship.

As Jessica Reyman argues in “Authorship and Ownership,” such spaces are often “co-authored” by algorithms and multiple people. By drawing from user data—as they point, click, and browse the digital spaces—algorithms tailor ads, curate feeds, and allegedly cocoon users in “filter bubbles” of easy-to-consume content, all the while drawing metadata for marketing and research. Today, this data mining and site curation is commonplace, and though scandals involving Cambridge Analytica and others have brought renewed scrutiny, Reyman offers an important perspective. She argues that users have a right to this data: they are the ones creating it, while corporations profit off it. This sort of free labor, sometimes filed under the term “playbor,” abounds on the Internet. As Andrew Ross argues, “The social platforms, web crawlers, personalized algorithms, and other data mining techniques of recent years are engineered to suck valuable, or monetizable information out of almost every one of our online activities” (15).

The relationship between authorship and labor has a long history, reaching back to the Statute of Anne in 1710 and the tensions of “intellectual property.” The image of the gentlemanly author plucking inspiration from muses and native genius to create new ideas, taken down in print, remains a sticky one. Today, if one follows Reyman’s argument, we are all authors at some level, as our being-in-the-(digital)-world adds to that world, co-authoring these spaces through our content creation and metadata. Considerable playbor takes place in the form of Instagram posts, linking to articles, fanfiction, videogame modding, and more. Indeed, part of the reason that videogame companies endure the cottage industry of streamers and walkthroughs is the free publicity it provides, and it has been commonplace since the ’90s to collect and re-release content created by fans for company profit. Turnitin also owns student work, creating a financial empire from the labor of student writers.

In the more material sense, in terms of dollars and cents, this is a problem, but I want to take it to a somewhat deeper level–first addressing the authoring on the other side.

As philosopher Daniel Estrada wrote in a Medium article on filter bubbles, “in a very deep sense, you are your bubble. The process of constructing a social identity is identical to the process of deciding how to act, which is identical again to the process of filtering and interpreting your world.” While I would argue that identity is more than “the process of deciding how to act,” a point that I reckon Estrada would likely recognize, I think it definitely plays a central role. Sartre put it best: “We are our choices.” Our choices have echoes, and sometimes those echoes etch our being–or how others view our being.

But Estrada goes on: “any constraints imposed on your filter are also constraints on your possibilities for action, constraints on the freedom of your decisions and the construction of your world. If you are your bubble, then any attempt to control or manipulate your bubble is likewise an attempt to control you.” As technology ethicist Tristan Harris puts it, you may get to decide what you eat on these platforms, but they provide the menu.

Again, this has implications as we consider our selfhood or identities. While for Kant the self is largely insular, cognitive, sensory, and self-contained, thinkers from a Buddhist metaphysics of emptiness to Diane Davis in Inessential Solidarity and Thomas Rickert in Ambient Rhetoric continue to argue that the self is more osmotic or relational. It is permeable and messy, bundled and blurry, oozy and diffuse, yet localized by language and materiality. As Rickert puts it, we don’t just live in a world, we are enworlded.

And here come the algorithms. These too, if you want to go this way, are part of us, and so are the digital pathways they “co-author” from our metadata. To use Kant’s term, this digital world informs–or possibly is–our phenomenological experience and the self that this experience informs. In many cases our digital selves are ourselves—networked and saturated by technology and the nameless bots and programs in the background. And as both Reyman and Estrada point out, we don’t really own, or fully understand, these algorithms. As Eusong Kim has argued about trending, for example: “We don’t know why something trends. The algorithm is a locked secret, a ‘black box’ (to the point where MIT professors have built algorithms attempting to predict trending tags). The Fineprint: Trending is visibility granted by a closed, private corporation and their proprietary algorithms.”

This leads me back to Reyman’s view on data and our ownership of it. Since we live under a largely English model of copyright, economics and law tend to steer the conversation. But as this digital composing infuses our lives, both the deliberate messages we send out and the co-authoring of our data, issues of ownership, autonomy, and originality come to the forefront—especially that of ownership. Who owns our data is not just an issue of privacy; it is an existential one. As our being-in-the-world co-authors and becomes entangled with our personas and places online, so do our selves. Just as England wrestled with the intellectual labor and textual ownership of traditional authors, we face a world in which our own ideas and our own digital being have become monetized and divested from our hands. Despite efforts by Facebook and others to allow us to see our data or have more input on our privacy and feed, a fundamental structure of black-boxing already exists, persistent through law and custom, to own and profit from our online meanders and statuses—and filter our own experience and online localities.

As we make paths in this pathless wood, Facebook profits and shapes the woods around us.

Fake News, Affect, and Media Literacy (C&W 2018)

Here is my introduction as part of a round table at the 2018 Computers and Writing Conference at George Mason:

As Bruce McComiskey describes in his recent Post-Truth Rhetoric and Composition, “fake news” has become another means to validate and circulate falsehoods, facilitated by social media and an audience’s desire to share and support this erroneous news. But it goes beyond this. As Collin Brooke argues in “How #Trump Broke/red the Internet,” many people critiquing articles share them, causing them to trend, and beyond human agents, bots share and comment. “The Spread of True and False News Online” by Soroush Vosoughi, Deb Roy, and Sinan Aral finds that fake news tends to spread faster than truthful sources on Twitter.

Fake news thus offers a sticky paradox: opponents of “post-truth” are often hampered in their fight by broader histories of habit (especially in the media), infrastructure, and economic goals and models. While this brief introduction does not have the space to detail this, I want to describe what I mean, why it’s significant, and two possible approaches.

A Backdrop: Media and Post-Truth Rhetoric

In terms of these histories of habit, Michael X. Delli Carpini argues in “Alternative Facts,” “Rather than an exception, ‘Trumpism’ is a culmination of trends that has been occurring for several decades” (18). The blur between news and entertainment, the weakening of traditional gatekeepers, and the growth of what Delli Carpini calls a “multiaxial” and “hyperreal” media landscape, where contradictory news co-exists and information often replaces the underlying material reality it represents—all of these represent long-standing trends contributing to Trump and post-truth rhetoric.

Mainstreaming fringe discourse also contributes. As Waisbord et al. argue in “Trump and the Great Disruption in Public Communication,” mainstream news offered platforms for fact-free, intolerant discourse from formerly fringe groups, and as Zeynep Tufekci argued in a recent New York Times op-ed, algorithms on sites like YouTube often draw viewers to more extreme content. Angela Nagle, in Kill All Normies, and a recent report from Whitney Phillips at Data and Society also point out this mainstreaming, highlighting the role of trolls. Furthermore, as Safiya Umoja Noble’s Algorithms of Oppression highlights, digital infrastructure often enforces hegemony and racism.

As rhetoric has long been central to public deliberation, we need to teach what has become of this deliberation. While political enmity, fractured discourse, and fake news are not new—from Ancient Athens killing Socrates to the strife of Reconstruction—our media landscape is. And I think two points bear deeper scrutiny.

Possible Responses

First, as Zizi Papacharissi argues in Affective Publics, we often underestimate the role of affect in public debate. This is especially true today, as her work with social media shows. Many of these point-and-click economies rely on affect, often stoking social change—or the means for it—through revenue models, forming “affective publics” as networks organize online and offline. Many legacy media outlets also rely on affect to draw and maintain viewers, informing coverage. While we, as a field, may often prioritize logos and ethos in writing, we need to recognize affect and its ability to circumvent other appeals—through humans and interfaces.

Second, much as the digital humanities has advocated working with computer science departments while developing computer literacies of our own, I think we need to connect with media and journalism. As public rhetoric often takes place through news—fake or otherwise, on television or through Facebook—we need to connect with those who do this work and understand how it is done, its history, and how it circulates. In other words, we need to interrogate the whole structure, not just consumer media habits and literacies.

Patricia Roberts-Miller argues in Demagoguery and Democracy that demagoguery comes from an underlying culture. Even as we fight the daily battles of post-truth rhetoric, we must also—per our energy’s allowance—wage the underlying war, as it pervades our media, politics, and daily lives.

 

Works Cited:

Boczkowski, Pablo J., and Zizi Papacharissi, eds. Trump and the Media. Cambridge, MA: The MIT Press, 2018.

Brooke, Collin Gifford. “How #Trump Broke/red the Internet.” Skinnell 122-141.

Carpini, Michael X. Delli. “Alternative Facts: Donald Trump and the Emergence of a New U.S. Media Regime.” Boczkowski and Papacharissi 17-24.

McComiskey, Bruce. Post-Truth Rhetoric and Composition. Logan, UT: Utah State University Press, 2017.

Nagle, Angela. Kill All Normies: Online Culture Wars From 4Chan and Tumblr to Trump and the Alt-Right. Winchester, UK: Zero Books, 2017.

Papacharissi, Zizi. Affective Publics: Sentiment, Technology, and Politics. Oxford, UK: Oxford University Press, 2015.

Phillips, Whitney. “The Oxygen of Amplification.” Data and Society. 22 May 2018. Web.

Roberts-Miller, Patricia. Demagoguery and Democracy. New York, NY: The Experiment, 2017.

Skinnell, Ryan, ed. Faking the News: What Rhetoric Can Teach Us About Donald J. Trump. Exeter, UK: Imprint, 2018.

Tufekci, Zeynep. “YouTube, The Great Radicalizer.” The New York Times. 10 March 2018. Web.

Vosoughi, Soroush, Deb Roy, and Sinan Aral. “The Spread of True and False News Online.” Science 359.6380 (2018): 1146-1151.

Waisbord, Silvio, Tina Tucker, and Zoey Lichtenheld. “Trump and the Great Disruption in Public Communication.” Boczkowski and Papacharissi 25-32.

Image Credits:

Featured: Lorie Shaull, “Lightning strikes Trump bus…fake news?” (via CC)

 

Stardew Valley, Sorge, and Martin Heidegger

I’ve been playing a lot of Stardew Valley lately. The pixel-graphics farm RPG enjoyed its one-year anniversary this past Feb. 26, but mostly I’ve found the game to be a bit of an escape as Syracuse’s nickel-grey March and school’s looming deadlines deepen a seasonal depression.

For those of you who have not played Stardew Valley, the plot is simple. Inheriting your grandfather’s rustic farm in the bucolic Stardew Valley, you start with some loose coins and tools and gradually nurture the farm back to health, interacting with the community and the surrounding countryside–from mysterious woods, to mines, to the ocean–as you plant and harvest seeds, forage, mine, and care for animals. Like any RPG, you level up your skills, from crafting to combat, and build relationships with NPCs by giving gifts and completing small quests. The player can eventually get married and raise a family.

The game has some overlap with the Harvest Moon and Animal Crossing series, placing the player as a caretaker enmeshed in a community. The simple music, pixel graphics, and winsome, quirky cut-scenes have their charm, and while the mechanics can get a bit grind-inducing (depending on one’s style and goals), the rhythm of rising, getting set for the day, working, and heading to sleep is a calming metronome that structures your daily actions, whether attending a community celebration, fighting “Slimes” in the mine, or simply fishing away a few hours.

More deeply, though, I kept coming back to what Stardew Valley teaches about Martin Heidegger (1889-1976), especially his notion of Sorge, or “caring,” as it’s often translated.

Continue reading “Stardew Valley, Sorge, and Martin Heidegger”

Tech’s Silicon Tower

I was just reading Cathy O’Neil’s (@mathbabedotorg) New York Times piece on the tech industry and academia, which argues that academics have not done enough to study issues caused by recent technology, including filter bubbles and big data. Others have already critiqued some of the tone and oversights of the piece, with varying degrees of sass, but I want to look at it as a rallying cry. While I think the piece could give more credit to current researchers, it recognizes a dangerous gap between this research and the tech industry.

A few of O’Neil’s points are especially key. For one, she notes how big data is often cloistered in companies, reducing access to academics. She also notes how private companies hire academics, and she describes how funding that drives engineering and computer science programs may not include more humanities-tinged concerns for the ethical, social dimensions of technology.

More contentiously, O’Neil also says, “There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology — and specifically, the algorithms that are responsible for so many decisions — in our lives.” While a distinct field of study may be harder to name and locate, plenty of subfields and interdisciplinary projects hit at exactly this issue. For example, in rhet-comp, Kevin Brock and Dawn Shepherd discuss algorithms and their persuasive power, and Jessica Reyman has analyzed issues of authorship and copyright with big data. Beyond rhet-comp, danah boyd continues to write on these issues, along with work from the University of Washington.

But a gap remains to some extent, despite this research.

Personally, I see two potential reasons: hubris and tech’s failure to consider social media more critically. Regarding hubris, George Packer’s “Change the World” (2013) explores Silicon Valley’s optimism and its skepticism of Washington. After describing how few start-ups invest in charity, for instance, Packer writes:

At places like Facebook, it was felt that making the world a more open and connected place could do far more good than working on any charitable cause. Two of the key words in industry jargon are “impactful” and “scalable”—rapid growth and human progress are seen as virtually indistinguishable. One of the mottoes posted on the walls at Facebook is “Move fast and break things.” Government is considered slow, staffed by mediocrities, ridden with obsolete rules and inefficiencies.

After Russia’s propaganda push and amid ongoing issues, like Facebook’s role in genocide, this optimism seems naive and dangerous. Zuckerberg’s trip to the Midwest, the hiring of more fact checkers, and increasing government scrutiny seem to point to a change. But I’m not sure how much is actually changing in tech–or larger structures like education and law.

This leads me to my second thought. In Being and Time, Martin Heidegger distinguishes between the ready-to-hand and the present-at-hand. The former refers to how we normally go through life, interacting with objects without much reflective thought, while the latter refers to the way a scientist or philosopher may look at stuff. In his hammer example, Heidegger says that we normally use a hammer without much second thought, but once the hammer breaks, we reflect on what it is or does.

Similarly, with the ugly realities of social media surfacing more, we are more apt to examine and reflect. Before it “broke,” we used it as a neutral tool to communicate and pontificate digitally. As long as we continue to see social media as a neutral tool, or a tool just needing tweaks or fixes, we miss considering what social media is within a broader context of culture, economics, and society. We may be waking up to these deeper questions now, but we can’t fall back on ready-to-hand approaches to use and design.

As Lori Emerson (2014) argues, companies rush to intuitive designs and ubiquitous computing, but we must consider how these trends blackbox the values and potentials of our tools. As Emerson and others argue, we can challenge these trends with firmer technological understanding, more democratized development, and the resistance of hackers and activists.

But with tech having so much power, I am not optimistic for change without a broader attitudinal shift in tech and elsewhere. I only see incremental changes coming, like increased fact checking and algorithmic tweaks. These are good and may lead to significant change in time, but fundamental outlooks in tech–what philosophers may call instrumental rationality–will likely stay the same. Many critique the Ivory Tower for its obsession with present-at-hand abstraction, but the Silicon Tower seems just as dangerous with its ready-to-hand reduction.

[Image: “Hacker” by the Preiser Project, via Creative Commons]

 

Play(dough)

[Image from Learning4kids.net]

Playdough. Tiny hands tweak, pinch, stretch the dough into tinsels, meaty threads, snakes curling into snail shells–suddenly smashed flat, “like pancakes,” and rolled smooth in young palms into spheres. Perhaps, with a few gentle, well-placed tugs, the children tease out arms and legs, or a simple face, then the fingers close, vise-like, dough peeking slightly from the spaces between, molding and shaping it into a small brain, nooked and crannied, and grained with palm lines.

Then, at the end of the day, it all goes back in the plastic can, smashed, once more, into an uneven cylinder. “Don’t forget,” say the teachers, “or else you won’t be able to play with it anymore.” Sealed behind primary-colored lids and walls, the malleable plaything remains withdrawn and dormant, waiting.

II.

“Writing is revision.” A teacher I once shadowed said this a few times. So did I to my own students, thinking it a properly provocative, axiomatic phrase. Something White Lotus from Kung Fu might say if he taught first-year composition.

But, to be honest, I don’t really know what it means. Is it a reference to something like Linda Flower and John R. Hayes and their “cognitivist approach to writing,” in which revision and pre-writing are part of the “writing process”? Or perhaps it’s a more political adage, on the “revision” of ideological entrenchments and social structures. Writing allows one to “revise” the state of things, both inside our heads and outside, in the world.

Or it may stretch the never-ending inventive tweaking that revision entails over the whole of writing. In other words, writing is a constant “revision” of sorts, a constant trying to get words out as best as we can. We are never done. The moment we pick up our pens, we are already revising. The moment we “finish,” we are still revising.

III.

If one mentions (or Googles) Albert Camus, the word “absurdity” is not far behind–neither is “existentialist,” which is a whole other issue. But, as with most cases of historical association, things are more complicated.

The “absurd” is the first of three philosophical progressions for Camus. During WWII, Camus wrote the trilogy of the absurd: the play Caligula (1944), the novel The Stranger (1942), and the essay The Myth of Sisyphus (1942). This, he said, would be his guiding process, tackling his ideas with a play, a novel, and an extended philosophical essay.

His second trilogy centered on revolt, asserting human values in the face of nihilism. With the book-length essay The Rebel (L’Homme révolté, 1951), Camus received a wave of criticism. For one, he attacked the French left, which included his friends Sartre and Beauvoir, because they knew about the atrocities of the GULAG and still supported Stalin.

But more pointedly, Camus also changed his thoughts. He was no longer the “prophet of the absurd,” but the spokesman of revolt. While some argue this shift was a complete rejection of the absurd and others say it forms a “continuum” with it, both views acknowledge a shift.

As Camus writes in his essay “Enigma,” “Everyone wants the man who is still searching to have already reached his conclusion. A thousand voices are already telling him what he has found, and yet he knows that he hasn’t found anything.”

Camus was still searching, still stumbling and exploring his ideas, flashlight in hand, but his public name was already solidified–and, in many ways, remains so.

IV.

Playdough is revision. You’re never done tweaking or sculpting it. As its name suggests, playdough is always “play,” never product. Pure process, pure doing, all about feeling the grainy pliant substance stick and fold with your fingertips. And each time, it goes back in the container, like an artist who scrubs away his canvas just to start again.

It’s not “art for art’s sake,” but creative construction and exploration without a clear endpoint. Like a sandbox or a “sandbox” game, playdough provides a space to explore the space. That’s its end and means.

In a sense, it even differs from a “game,” our usual sites of play, as playdough has no constraints. No “rules” that structure the game. For example, in soccer (i.e. football), because you can’t use hands and arms, the “game” is to use one’s other body parts to head, dribble, kick, cross, and score.

Playdough has no “rules,” except, perhaps, a parent saying you can’t stick it on the rug.

V.

Camus also wrote that writing is a “daily fidelity,” a daily act of holding onto and working one’s ideas and images into something that may take years. For some reason, Camus often latches on to five years, saying that one must have an idea five years before one starts writing about it.

Camus’ often forgotten first novel A Happy Death is a steppingstone to The Stranger. Its protagonist is a cold, detached Algerian named Mersault (a one-letter difference from The Stranger‘s Meursault). It evokes similar images, similar echoes and feelings, though the novels differ profoundly.

Scrapped and unpublished in his lifetime, A Happy Death may be a failure in some ways. Or a mere writing exercise, a book-length warm up for a new writer. But still, the question remains: how much does it stand on its own? How much is it part of The Stranger? And how much does the distinction matter?

VI.

I always remember that our English “essay” comes from “essai,” the French word for “trial” or “to test the quality of” (like metal in a furnace), echoing Michel de Montaigne’s Essais, which he viewed in a similar light. They were not meant to be polished, finished pieces, but “trials” and “attempts,” sketches or studies in a sense that tested his ideas.

Like Camus’ daily fidelity and playdough’s unfinished pliancy, Montaigne’s Essais were searching, roving, and unfinished–despite countless edits, read-throughs, and revisitings. And like Camus’ writing, the Essais offer profound political and philosophical insights. Here, writing is revision, and revision is powerful.

Encountering most essays, however, we often see them as static and finished. We also see them as discrete and separate–or when not separate, as “derivative” or “remixed.” But technology provides a possible return to Montaigne’s Essais or a possible shift into the realm of playdough, of productive play, as our “interfaces” are often not static. Here, writing is much like revision.

Only I shudder to use the word “productive,” because it has become an instrumentally focused word, layered with nasty, anxiety-inducing overtones that make me wonder if I’m “doing enough,” and “keeping up,” and not “wasting time.”

So, in a sense, technology allows us to have interfaces of co-authorship, interaction, constant change, new mechanics of invention, etc., but we also need a culture that can explore this. We may have playdough interfaces, but we need a playdough culture, a culture that isn’t telling us what we have “found,” to paraphrase Camus, but relishes the play of the finding. Doing so, we may further liberate our technology and creativity to innovate and express. But most of all, it may bring more freedom and joy back into the creative process.

As I said above, it’s not art for art’s sake, but doing for doing’s sake. It’s about turning revision into invention and vice versa. It’s about taking our tacky, doughy language and playing with it, seeing what comes out as we stretch and flatten it into compositions.

Identifying the Alien in our Humanity

Look around you. At any given moment, “beings” encircle us from all sides. I’m using a computer on a table, while sitting on a chair. Nearby, some window blinds murmur a restless patter and a kettle hisses and whines. Outside, the stirring, purring, scratching, sniffing scuttle of nature persists indefinitely. Indeed, we are not alone.

On the one hand, this is pretty obvious. Humans have always had “tools” or “technology,” and we’ve always been in the environment. But at a deeper level, this intimacy with other beings implies a kinship. Particularly in contemporary culture, people constantly interact with and through technology, like cell phones, buses, radios, computers, or televisions. Doing so, we express our humanity in and through technology, and this technology has an important role in how that occurs.

In other words, humans do not express what we often call “humanity” in a vacuum. To compose the great texts of history, Shakespeare, Beethoven, Sappho, and Sun Tzu needed technology. They needed ink or stylus, paper or tablet. And these texts always grew out of a place. The tablets of Mesopotamia needed the clay of the Fertile Crescent. The cave sketches of Lascaux needed the water and pigment–along with the cave wall.

This is what the scholar Thomas Rickert is getting at, to some extent, with the notion of “ambience”: we grow out of stuff, express with stuff, “are” through stuff and space. As Carl Sagan said, “We’re made of star stuff. We are a way for the cosmos to know itself.” Humans may like to center the world around our own being, but we are intimately part of the nonhuman, “spoken” in a sense by our environment and the objects and nonhuman beings that compose it.

Continue reading “Identifying the Alien in our Humanity”