BiText: The Early Days of Texting

Sharmaine E.D.S. Browne | 30 October 2019

“Well, you need to fall into line because that’s how it’s done now.” I need to fall int-? I was stunned.

Van, a fairly new acquaintance, was talking about tech; specifically, texting; more specifically, my failure to respond to each and every text he sent, which arrived sometimes every fifteen minutes for an hour at a time, nearly every day, for stretches at a time. Sometimes, when I failed to respond because I was out or at work or on the subway, a dialogue would develop without me; he would write, wait a little while, answer his own inquiry, then text again, and so on: a script.


It wasn’t that I didn’t respond at all, just that I failed to respond to every text he sent. I always responded about plans we had or whatever personal crises might be brewing, though it seemed to me that if a longer communication was in order, then so was a meeting or a phone call; he didn’t roll that way. And if I didn’t respond to the little electronic tugs at my shirtsleeve as I tried to go about my day, buying fruit at the little grocery, sipping a coffee in my favorite café, checking out a book at the library, browsing at the local bookstore, chatting with my neighbor on the stoop, or having a drink with friends at our local watering hole, it was because I preferred living in my days – not alongside them in some digital multiverse.

The incessant rings and buzzes disturbed my peace of mind, frazzled my nerves after a while, and interrupted whatever flow I managed to catch through the days. Since everyone had begun cultivating these invisible, electric attachments, I tried to keep my irritation to myself. I was sympathetic. I had developed my own digital attachments via email a few years prior, and by telephone some years before that, and the addictive qualities had been powerful.

There is something special and sometimes secretive about a correspondence. It’s a space in which you not only communicate but also cultivate a unique and creative manner of textual bounce with someone else. The problem was, I found such technological bonds pulled me out of the world I was in and into one shared only with another mind. I worked hard to rid myself of the habit. Once I did, like a former smoker, I simply did not want to be pulled back in. I felt I had a right to my time and my days and the rhythms of life I had worked so hard to cultivate, so I tried to explain when chastised. I also tried to head off future conflicts by announcing that I unplugged when I traveled, and certainly over summers, but that excuse held some people off for only so long.

One winter, it reached a breaking point with one friend. The weather was bitter cold though not yet snowing, and I was leaving for Spain, anxious to escape the doldrums of my job, happy to put as much distance between myself and it as possible. I was moving slowly. The doctors had cut into me, through skin and muscle, and had made a mistake. The new pain was intense. Even sitting up was an effort. Red called and offered to help me pack. She and three other friends came over and put a month’s supply of life in a suitcase and we laughed, though it hurt, and had a good time overall. We made blue cheese sandwiches with pears, listened to the Boss in honor of the East Coast, took silly pictures, and sat around watching the cats. Another friend took me to the airport where a wheelchair was waiting, which I stubbornly declined out of some sense of displaced pride. So I made my way slowly and somewhat painfully instead, my head heavy with thoughts and wondering how I got there, in life, weighed down by the events of a decade.

It was one of those omen-filled journeys, there and back, a time for healing but also a time of enormous transition, which I tried to handle as gracefully as possible and probably failed miserably, though that’s a different story. In any case, my head was full: dreaming, discarding, cleaning house. Familiar with my need for space, most of my friends let me be, knowing they would see me a month or two out; they might call me on the phone to chat, but they ceased texting out of respect for time and money. Not Van.

Within days, he began texting, dramatic texts like ominous tweets, “it’s gone,” or “it’s over,” or “alone now,” without context or conversation, without identification or explanation, little hooks meant to draw me in to inquire or attend. I called him the first night I received a little script of bits and left a message, but he didn’t call back, just persisted with the hooks the next day, which I couldn’t respond to while in a meeting. I called him back again. He didn’t answer. For my own reasons, I didn’t want to lock into the little screen, so I didn’t text. Late that night, I received several more dramatic notes without context, and finally called a mutual friend to find out what was going on. He filled me in. There had been some drama. So I called one last time, out of a sense of responsibility and empathy, left a message, heard nothing, tried calling one more time, heard nothing, and then went on with my month. As far as I know, he didn’t text again, or if he did, the texts were lost, and I had far too much going on in my life, and others around me had far too much going on in theirs, to pursue a silence punctuated by tiny scripts from a close but definite acquaintance thousands of miles away.

We had never really gotten to know each other very well, but his texts had become a daily event for over a year by then. What I found most astonishing about them was their lack of psychic regard for another’s day or preoccupations. It was as though he couldn’t imagine another life or mind outside his own, that we were all magically there and available for his time, and specifically, his space. If we weren’t in his space, and he rarely left his basement apartment or his computer, then we weren’t doing anything at all; our being was non-being in his imagination’s cyberspace.

I recently heard that there is something about that in Deleuze — that there are those who cannot imagine the unfolding life of another who is not themselves, that it doesn’t exist for them. It’s a challenge for most of us anyway, to really center ourselves in another’s perspective and fix it there, if just for a moment, but for some it may be entirely impossible, or so it seems. It’s some combination of empathy and imagination and a kind of intellectual selflessness that’s needed. Without that chemical, emotional, and imaginative combination, these folks imagine you are there just for them, somehow, and that nobody else exists, not your friends or family or lovers, let alone your own self, life, and feelings. The illusion of the Top Dog.

In Van’s case, his inability to imagine another’s life could well have been a fracture in his perception. Could it be that the technology provides a medium to watch our pathologies play out? Could the technology simply reflect the ups and the dips, the swings in mood without the filter of another’s face offering a reflection of the behaviors? I’m empathetic. I’m familiar with feelings of isolation and confusion, and we all know the times when a kind of desperation for understanding takes hold.

But what interests me, what frightens me, is that I’ve watched the technology turn into a leash for some, and in both directions. We appear to be most vulnerable when life isn’t going well. Anyone who is feeling slightly off-kilter, a little sad, somewhat isolated, or just incapacitated by circumstances or illness seems vulnerable. It has me thinking about the larger implications of texting technologies in the world at large. For every impulse, the technology is there to satisfy it—if we feel sad, we text; if we feel frustrated, we text; if we feel confused, we text, and texts want an immediate response. This doesn’t leave a lot of time for rumination, contemplation, or self-examination. Moreover, given the distance, geographical and emotional, the absence of another tangible voice or face, and our ability to ignore the effect of our words on our audience, the technology seems to amplify the feelings of the sender while cutting off the feelings of the receiver; dialogues become monologues. The empathetic line, the reciprocity, is lost.

As we strive to be open, we become insensitive ourselves—because when we don’t have to see the effects of our words on someone else, we may well formulate communications less thoughtfully or considerately. Instead of measuring our words, we blurt them. Instead of considering the listener, we hold them hostage to our needs. Being on the other end of an emotive communication which fails to consider our being-in-the-world has become increasingly difficult. Texting separates us. It also controls us.

To me, texts can begin to feel like small tugs, demanding time, demanding attention, and demanding response. And when I haven’t responded, sometimes, the texts intensify, becoming stronger and stronger tugs, until they’ve grabbed my shirt, and twisted it behind me, again and again, tighter and tighter, until what I have left behind is choking me.

Also, texting leaves us with very little privacy. People know how long we might take to consider a meeting, or they know whether or not we are busy or free, presumably anyway, and we no longer have the freedom of our own space and time to think – to ruminate – to just get away from the world for a while without it feeling like a deliberate decision. Each and every text asks us to answer it or not. Texts don’t let us be.

And what’s sad, really sad, is that what it means to be a good friend to someone is losing its resonance amidst the cacophony of ringtones and message alerts. When I returned to New York City, to freezes and storms, the sidewalks were icy and the wind was biting. I emerged from the bookstore late, and as I looked across the square to the street where we often met for a beer, just two doors from his building, I called and left a message: I was in the neighborhood, back from Spain, if he wanted to hang out.

I began walking downtown and had gone two and a half long, wintry blocks when the text came in. “im sorry i dont think i can c u again.” I stared at it for a moment, cold fingers beginning to clutch at my guts in either anger or hurt, I couldn’t tell which. I texted back, “huh?” “ive texted u and u havent responded. u obviously dont want to be friends u r not a good friend.” “huh? I just called you. I just got back.” He began to text long messages, bleeping on and on, while I stood trapped at the top of the subway stairs in the cold to avoid losing the signal, thinking to myself, unsure what to do, as my phone buzzed with the alerts. I didn’t go down into the tunnel but went to wait for the bus instead—another aspect of these kinds of digital communications which few people take into account: the ways they disrupt the most basic aspects of our lives.

When I looked down again, a few minutes later, I responded to a text I saw there, confused, bummed, and somewhat irritated. I texted that if he needed to talk he needed to call me. I called, but he didn’t answer. I texted that I didn’t want to have a serious conversation over text screens. I asked him to call. I heard nothing. I rode the bus home.

After exiting the bus, I walked through the cold to my door feeling sad and a little ill. I was thinking about the ill feeling in my guts, my guilt, and my friend’s accusations as I took off my boots, the snow melting onto the wood, as I pulled off my coat, shivering, walked over to turn up the thermostat, and made cocoa. There was no feeling of warmth or comfort. I didn’t get to enjoy the heat or my cocoa. There was no welcome upon returning home—only accusations. I remembered the days before digital—when friends were excited to see me return from a trip, and I was excited to see them come home from theirs. I began singing softly, “Those were the days my friend, we thought they’d never end, we’d sing and dance, forever and a day ….”

The screen alert and ringtone were silent until two o’clock in the morning, when scrolls of texts started pouring in again. Exasperated, and getting angry, I ignored them and called. This time he answered. I told him that serious conversations should not be had via text, and he flung angry accusations at me, hard and fast, all of them specious and somewhat invented. It was drama I didn’t need or want. He read from the standard script – I wasn’t a good friend because I hadn’t stayed in touch during my vacation. I told him that I felt suffocated, that it seemed he wanted me on call for him, that my time was in fact his time. There was no room in his rule book for anyone to fall or wander out of range.

I was tired of feeling that this most normal of possibilities in years past, close enough in time to easily remember, just falling off the map for a little while, was considered a violation of friendship, never mind one’s needs. The conversation went poorly for about fifteen minutes—and then he berated me for neglecting my best friend. My best friend? It was out before I could stop it. “I’ve barely known you a year?!” I said. But it was then that I realized that he had no conception at all of my life outside his tiny and narrow experience of it. My world existed only within the limits of what he could imagine; a tiny slice of his life was the entirety of mine—to him. My presence in that slice was my life. His texts were everything. I lived inside of them, being a good friend by responding, or a bad one by not responding.

Later that year, while in California, I met up with an old boyfriend from college, someone with whom I had spent many nights up the coast and afternoons wandering the beach, phoneless back then, and free in so many more ways than now. We were walking his dog through Santa Cruz, discussing my interest in communication technology, and it was he who pointed out that we were able to disappear back then and come home to roommates who had already started the bbq, unruffled that we couldn’t be reached. “Where were you guys?” “At the beach.” The answer sufficed. No more. My nostalgia runs deep.

A year later, I had another collision, with a man who was fully plugged in, in every way, who could not tolerate being alone for even an hour without logging onto social media or texting someone. I hadn’t seen him in many years, and whatever correspondence I did have with him usually consisted of a brief request for some kind of advice—always accompanied by some snarky remark about how difficult I was to reach. In fact, it was not difficult to reach me at all by the old standard. Were someone to call me one morning and leave a message, they could expect and receive a callback that night or, at the latest, over the next day or two if I was really busy. But that old standard no longer applies. The pace and peace of communication are no longer slow or measured, no longer glassy and smooth. They are fixed by a staccato beat and rapid-fire responses, brief, unpunctuated, and immediate. One no longer really has time to think, or even time for one’s self.

I had lived on the other coast for years and had just returned. For the week and a half since my return, each of my days had been peppered with little tweets of texts from him. At first, I chalked the abundance of texts up to excitement that I was in town. We met for a drink, and in person, I had a lovely time. The next morning, he texted, “i miss u mama!” Feeling stupid, I wondered how he could miss me; we had just seen each other. “It was good to see you too,” I responded. But when the “cant wait 2 c u”s and the “miss u mama”s and the “i luv u”s began rolling through my days, I began to feel overwhelmed. I watched the texts roll in over the course of days, curious, perplexed, and flummoxed. I wondered if he ever wanted time to himself.

He invited me over for dinner and to see his new house, a meeting that was overdue, and I agreed, but he wanted me to meet him at work first. He shared these aspects of his life, rolled them out for me, and that was fine. He had a good job, a professional position that adhered to a traditional nine-to-five schedule, unlike mine as a teacher, or my colleagues’—artists and intellectuals who work, if not outside the grid, then beside its timetables. But he didn’t ask about my job; in fact, he had never asked what I do at all, and had no sense of it.

Instead, he wanted me to embrace a lifestyle I had worked hard to avoid, routine and safely contained, suburban and local. I could feel the insistence, the psychic weight pushing at me to like his condo in a gated community, to be impressed, and to want to be involved. He showed so little interest in who I was while insisting I celebrate his choices. I have since wondered if, to some extent, I was simply offended by his lack of attentiveness or inquiries. I began to realize he hadn’t the faintest idea about my beliefs or interests, my passions or my tendencies, never mind my work. At some point I realized we barely knew each other. The tech had kept us divided, as strangers.

He texted me several times during the day on a Sunday, and I replied once saying that I had work to do, that … I’d let him know if I had time in the evening. He texted again late in the afternoon, and I told him I didn’t think I’d be able to meet. My work was unfinished, and I had a deadline.

I thought to myself at the time, and realize even more clearly now, that he had little idea of what I do and not the faintest notion of what it entails. In his world of getting up and out the door to an official corporate job, writing, research, and even teaching aren’t work at all. My time is free time in his mind. He followed up with three more texts of increasing intensity and vulgarity, a little less than ten minutes apart, all imploring me to come to his house, and I finally texted him that if I were up for going out, I had coffee or pool in mind, not going over to his house an hour away. But in any case, I didn’t want to go out at all. I wanted to finish my work.

After a few minutes, a new slew of text messages started flowing in, a rapid succession of cajoling, teasing, irritating, and offending messages, and with every alert, I found myself becoming more tense as I tried to concentrate. After half a dozen more texts, all of which I honestly didn’t see at first because there were so many, I responded, simply, “nope.” A couple more texts of disappointed sentiment and “awwww” followed, then a fifteen-minute pause.

A while later, another half dozen or more cajoling texts followed, and ten minutes after my final “no,” I got one more punctuated message, “so then when will i c u?” at which point, admittedly, I became totally exasperated. I took a breath, remembered compassionate communication, and asked myself, “what does he need?” I texted back, “what is going on with you? Are you okay?” There was no response, so I texted a “?” to which I again received no response, and so I called and asked him again. He was unresponsive, clearly disappointed in my answer, and admittedly, amidst the awkward pause and silence, I felt bad, realizing I had put him on the spot. He declined to have a conversation, assured me that everything was fine, and I went back to work.

The next morning I received another text; he had sent me a letter on FB. Facebook? Seriously?!?!? Now, granted, I have a peculiar aversion to writing serious letters on FB as well as on text screens, but I swallowed my aversion, calmly read it, and thought about what he had to say all day. I really, really believe that letters of a certain gravitas should be read carefully and that a response should take time. As I was mulling it over, falling asleep that night, another text came in: he hoped I had read his letter. I could have responded, and maybe I should have, but I was tired and thought that a morning response would be okay, and reasonable. When I woke up late, however, I had ten more texts, sent around six o’clock in the morning.

It would be hard to accurately describe my reaction to those break-of-dawn texts. They hurt in a number of ways, leaving many colors of bruises on my feelings, on my sense of self, on my ego—but especially on my good intentions: the patience and effort I had exerted over the past two weeks to tolerate what I felt was extreme, unreasonable, and immature behavior, to look past my own very palpable judgments, and to assess my own reactions as somewhat shallow and perhaps premature. Looking back, of course it was my fault to a large extent; I should not have exerted such effort, and I should have made my feelings even more clear, but truly, I was trying to be polite, trying to get past my own sense of irritation and, at times, dismay over these conflicting communications. And I really had been open to spending more time with him, but I needed some time, and I needed to take some time. I had been lonely myself, and I welcomed the companionship, but I was ill-prepared for what seemed like a demand for a complete collapse of boundaries.

Suffice it to say, things didn’t end well, but looking back at it, it was the texting, the availability of the medium, and the expectations harnessed to that tech (immediacy of response, accessibility, perpetual perceptual misreadings) which cost us whatever friendship we might have maintained. He was an aggressive person in life, persistent and dogged in his desires and demands, and the technology answered to his every impulse; his frustration built, then blew, when the human pulse on the other side of that digital leash wasn’t there and available on his schedule, compliant and responsive On Demand. My number was a premium channel with its own life and mind.

I’m not arguing that technology is bad, but I am noticing that technology provides us with an instantaneous outlet for every neurosis, every mental tic, every emotional need on call, and this is a problem. It amplifies needs, pathologies, and feelings. It can blind us, or it can create an echo chamber in which every response we think we hear is in fact one of our own making.

Whether or not you believe in ruminating or taking your time with a thought or a feeling, the pause that time is always generous enough to offer often opens up into reflection. Anger cools, forgiveness blossoms, desire evens out, excitement finds a rhythm. But the quiet, too, the time of reflection, allows another to take in what we want or need, to slosh it around, and to figure out how to respond, how we want to respond.

Like it or not, technology affects human interaction, and not always for the better, maybe often for the worse. The way we use it is customized to our own particular needs, peculiarities, issues, personalities, and yes, even mental or emotional disturbances. As a leash, it yanks others down into whatever hole we’re falling into with us, and that, simply, is no good.


The Quests and Questions that Stories Inspire: A Case for the Humanities in the Time of Technology

Sharmaine E.D.S. Browne | 25 June 2019

After a particularly difficult week, a colleague asked how my class went that afternoon, and I lit up. “It was amazing!” And generally, that’s how it goes. I work hard to create lively, vibrant classroom environs to cultivate intellectual and creative risk-taking. The students, in turn, motivate me to work harder, achieve more. What first struck me about my job years ago is that the heaviness of life falls off when I walk into a classroom. I feel the responsibility I have to my students deeply, and their own efforts inspire me and give me hope for the future of our planet.

I love my work, and I appreciate all my colleagues, many of whom are, really, the students. Every day presents possibilities for transformation, for learning, and most of all, for discovery and a different process of making connections than the digital connectivity to which we have been culturally trained for the past decade. It’s difficult to describe the sensation of witnessing a class of students burst into conversation, each one eager to contribute, the excitement of critical thinking at work. But what I find most meaningful about teaching in the humanities isn’t just what we teach, but where what we teach takes us.

So, when I read article after article bemoaning the increasing irrelevance of the humanities, I realize that those of us who study and work in the humanities and the arts too often fail to assert ourselves and the worth of our work. In fact, lately, it seems like too many of us in the humanities are apologizing or retreating, saddened but resigned to irrelevance. The death knell has been sounding for several years now from various corners of this digital world.

I’ve seen STEM friends gleefully giggle on social media whenever an article comes out about the low pay and lack of suitable working conditions for humanities majors. They openly chastise those who chose to work in the humanities whenever an article rolls through discussing the lack of basic benefits for those who cannot find full-time work and must work as adjuncts in an ever-shrinking field which daily discards full-time jobs in favor of temporary or part-time positions. Scrolling through the smirks gets old. But I also read experts weighing in on the death of the humanities, their supposedly diminishing purpose, and their inevitable end in a world overlaid with ever-multiplying digital platforms. And of course, The Chronicle of Higher Education seems to publish a new article on the demise of the humanities every time I look. It is certainly true that universities no longer value the humanities and arts as much as they should, but this has been the case for decades, from some perspectives centuries; still, our culture’s failure to value what is important doesn’t make it unimportant. Just as morality and the law don’t always align, what has value and what is valued are not always in sync either.

Let me be clear: when I talk about the importance of the humanities in education, I am not referring to the hyperbolic clichés of out-of-touch scholars laboring for years on minuscule topics of irrelevance in the rarefied air of an imaginary ivory tower—not to say there isn’t an important role for intensive work on minute, obscure topics in all research and scholarship. There is, in fact, a crucial role for such intellectually and labor-intensive work in all fields. And nobody snubs scientists and programmers for most often being engaged in small, tedious studies which would be of little interest to the general public; they are not chastised for their research. In all scholarship, research that takes place beyond the public eye may have cascading, significant cultural effects sooner or, often, years later. Such work is necessary and should not need to be defended.

The humanities, though, also occupy a more expansive space in the cultural imagination because they are intrinsic to the broader business of humanity—critical thinking, empathetic understanding, remembering history, foreseeing the future, protecting individual liberties, creating imaginative possibilities. But that expansiveness means exposure, and under the weight of an ideological gravity which insists on a certain analytical (and profitable) direction, gross generalizations keep arising from the misguided prejudices of an ideology which values STEM over the humanities.

1

As is the habit of many of us in the humanities, I have spent the week pondering the many ways corporations slip habits into our lives, habits which seep into the minutiae of our days and transform them, reified, invisible yet integral to our daily existence. I find myself asking what the social (read: digital) world and the natural world now have in common. In both cases, the worlds in which we move through our lives changed, and we with them, and we really didn’t have a choice in the matter. We didn’t even know to make a choice, or that there was a choice to be made. As now-digital natives living in a changing climate, I keep trying to look clearly at what carried us here, to this time, under these conditions, in the places we live, and the answer looms large: technology, or rather, the late twentieth- and early twenty-first-century corporate harnessing of our technologies—technologies which now feel so natural that we no longer notice their reified presences pressed into every aspect of our experiences and our days.

I call the where of how we live now the ‘digitalverse.’ We move through and interact with a fully integrated series of overlapping, intersecting networks now—while awake and asleep. As organisms, we have been fully assimilated into this digital, global grid, throbbing with imperceptible, seemingly endless electrical impulses. However, for millions of years, our ancestors were fully integrated into an entirely different series of networks—those of nature. It would be our particular species, Homo sapiens, who would deeply affect and change our world beginning with agriculture and the control of plants and animals.

Two hundred thousand years ago, modern humans evolved in Africa. Homo sapiens were one of four species of humans, as far as we know. We are the sole survivors. The other three became extinct: Homo erectus 70,000 years ago, Homo neanderthalensis, or Neanderthals, 28,000 years ago, and Homo floresiensis 17,000 years ago. Then, 12,000 years ago, civilization as we know it began when we stopped living nomadically and began living in villages, permanent settlements which emerged from the agriculture that would eventually give rise to cities. It would take another 11,800 years before we would reach the technological savvy of the Industrial Revolution and become bold enough to make a play for world domination—over Nature.

Our technological prowess has always marked our cleverness as a species, and it has been largely through our technologies that we have been able to survive or thrive over millennia. Over thousands of years, we became experts at our various technologies, our technê, which in the Greek meant “craft” or “art.” Most basically, as a species, we use our technê, or technology, to interact with our environs—often to tame and control them, but always to manage them—or rather, ourselves within or in relation to them. First, fire enabled us to cook our food so that we would evolve to think, warm ourselves to thrive, transform elements to progress and control our environs, and fight off enemies to survive. While there is evidence that hominins were able to control fire 1 million years ago, evidence of hearths is only about 400,000 years old. The wheel wouldn’t be invented until 3,500 B.C. A rudimentary steam turbine was invented during the first century A.D., but the first steam machine and the first steam engine, from Spain and England respectively, would not come until the 17th century. These inventions would eventually allow for the collapse of time through space by train and transport. The printing press would be invented in the 15th century. Electricity, first harnessed by Benjamin Franklin in the 18th century and then applied to power in the 19th, would further collapse the distances of space and the lengths of time through communication devices—the telegraph, the telephone, the cell phone, the flip phone, and eventually the smartphone.

Our ability to transform and manipulate raw materials, sand into glass into windows, metals and stone into bridges and structures, and coal, oil, and gas into fuel (the list is long), seemingly enabled us to ‘overcome’ nature. However, climate change would lay bare the tragedy that our way of doing business in modernity has turned our strength into our weakness. Has our cleverness, via the corporate harnessing of technology, turned out to be our species’ tragic flaw, our hamartia? This is the kind of question which emerges from fields in the humanities.

I think back over the digital years—to the first time a colleague complained, with tears of frustration, that he couldn’t get his students to put down their phones and focus. That was nine years ago, 2010, but it feels like at least twenty. It was 19 years ago, though it seems longer, that Professor Ellen Willis, an unapologetic feminist critic and journalist of might, complained to us during our cultural journalism class that when you saw someone talking to themselves in the streets, it once meant that they were, as she put it, “crazy.” “Now,” she said, “it means they are talking on a phone.” Cell phones were still flip phones then, not yet ubiquitous, certainly not addictive.

Today, smartphones are everywhere, and we are now wired into them because they are wired intrinsically into our days—our schedules, daily habits, weekly rituals, and our minds and bodies. To accommodate this intimacy with this new digital presence in our lives, we had to change how we sleep, wake up, meet up, study, move, friend, date, teach, work, relate, do business, and bank. Even the way we are entertained has changed. Now, movies are likely to have little dialogue—as the drama largely takes place on screens. Plots set in the present must be wrapped around the widespread digital habits and infrastructures of our days—or the plots have to take place before the 90s. It is astonishing how completely we have changed every last corner, nook, and cranny of our lives. We didn’t miss a beat, a moment, or a spot on earth. All together now, planet earth, as far as its human inhabitants go, just changed – en masse – and fast. The change was so fast, generations rippled apart ….

We no longer share a knowledge of baselines—what it means—or once meant—to live. We have lost each other in this mighty shift. But somehow, we forget who (or what) drove the changes we now take for granted as our new normal—a handful of corporations moving in concert to accomplish, without resistance, a complete takeover of the globe: Google, Facebook, and Amazon. In the 1990s, such a sci-fi scenario would have been taken for fantasy. Yet here we are. It isn’t the tech that astonishes me as much as the unbridled, unchecked, unhinged power of tech’s corporate masters over our lives today, set against our complete complacency about the corporate good.

When referring to “tech,” it is easy to list technologies, but what technology actually “is” at any given time isn’t so simple to define. Our relationships to our technologies are even less simple than that because our understanding of what technology actually is changes from generation to generation, and that understanding can still, roughly, be calibrated to lifetimes. At a given time, and in a given place, a given group of people can generally agree about what they would designate “technology.” Today, for instance, most of us would first describe smartphones, tablets, and laptops as current technology. From there, we might move outward from ourselves, describing various technologies with which we are less intrinsically connected. However, the farther away we get from the invention of a technology, the less technological an innovation feels. An Uber driver recently said to me that he thought technology was changing people, but on second thought, he said, people used to say the same thing about television—as though TV were not a technology. This is a common way of thinking.

In our minds, technology means “new,” therefore recent, at least at first blush. Few people would name trains as a technology in 2020, but in 1820, trains would have been among the first technologies to come to mind, a “shock to the status quo,” as Marc Greuther describes it in his review of Michael Freeman’s Railways and the Victorian Imagination (1999), technology which first dazzled then transformed the nineteenth-century mind. And while generations used to be lifetimes, as life spans have grown longer, generations have been pared down to a handful of years because we measure generations, now, by current-time technology savvy, not milestones or lifetimes. We laugh about it, teasing each other based on levels of expertise and ways of doing life. And really, when one stops to think about it, it’s kind of absurd. When our tech savvy becomes the gold standard for respect, we have a problem. Frankly, STEM fields have a vested interest in ignoring the problem.

Children are nimble. They are quick to assimilate any technologies that adults make available to them, and our technologies are the invisible scaffolding of our lives. Of course they become tech-savvy college students and adults. However, everyone, Baby Boomers, Gen X, Millennials, and Gen Z alike, has cultivated a facility with technology, so that swiping, scrolling, and clicking have become natural enough to be mundane. But as with language, the younger you are when you are immersed in it, the easier it comes to you. Our facility with tech can be compared to accents in language, only with tech, what marks one’s fluency is speed and assimilation, the integration of the technologies into the fluid motions of our bodies and our lives. The youngest among us are perhaps most fluent with at least the surface aspects of tech—they can “speak” it easily. My parents are always surprised by my speed with various devices; if they saw my students in action, I imagine they would be floored. We measure our technologies by our gains, which we now, unfortunately, gauge by efficiency, speed, and productivity, but we rarely stop to consider our losses amidst technological progress, seduced by convenience and our own pleasure centers in the brain. Convenience, progress, and efficiency carry costs, often overlooked in culture but as often explored in literature.

In addition to technology, another uniquely human endeavor is storytelling, and this one is ancient. Stories began orally and pictorially, and they too have changed in sync with emerging technologies. What is quite special about stories, however, is that stories remember. They resist acceleration, demanding our attention, slowing our pace, inviting thought. We have told stories for as long as our collective memory reaches back. I imagine storytelling sliding back through generations, around fires, at bedsides, over tables, and along travels. Our plays, poems, epics, short stories, and novels have ridden the currents of culture over centuries. And I wonder, might storytelling preserve the foresight we lost when we traded foresight for fire?

In fiction, we read stories such as E.M. Forster’s “The Machine Stops” (1909), Ray Bradbury’s “The Veldt” (1950), Ursula K. Le Guin’s “The Ones Who Walk Away from Omelas” (1973), and Octavia Butler’s “Speech Sounds” (1983). Forster manages to create a world in which every surviving individual is willingly solitary, communicating with others, even family, only through versions of Skype or other screen technology. When a son wants to meet with his mother in person, she is shocked. Bradbury creates a world in which a smart house becomes the primary caregiver, and the children, Peter and Wendy, no longer have any need for parents who would turn “off” the house and take away the children’s virtual playroom. Le Guin paints a dreadful picture of a “perfect” world in which the suffering of a child is necessitated and justified by the relative success of the many. And Butler depicts a post-apocalyptic world in which the loss of communication parallels a loss of intellect and an increase in violence. Each of these writers creates a dystopian future out of their own creative musings, and each one, eerily, extracts vulnerabilities and tendencies in human nature to weave threads of story into parables which capture so many of the nuances of our contemporary world. How can we allow the relevance and resonance of our stories, our imaginative play, to become irrelevant to education?

In some works of literature, as in Forster’s story written 110 years ago, the author’s foresight is downright bewildering. It isn’t just that he foresaw the dominance of machine technology, but that he was able to guess its general progression in relation to human behavior, psychology, and ideology. His guesses weren’t precise, but they were prescient. Forster imagined a world overlaid by a technological infrastructure through which human beings lived their lives in isolation, nonetheless fully integrated into the gridwork of the world by interfacing with others via screens, even teaching online. The adults tremble at the idea of going outside, and they dissuade their children from venturing into a world they believe remains terrifying and toxic. In many ways, their there is our here. In his wider work, Bradbury imagined tiny seashells for ears (earbuds), an ever-accelerating world in which nobody walked, only sat in front of screens while attention spans grew shorter and shorter, books shrank into clips (before being burned as illegal), and relationships diminished into talking walls (screens). The value of this work isn’t just entertainment; it lends perspective, and it does so at a distance, illuminating without the heat. If a lightbulb is 5% light and 95% heat, literature is 95% light and 5% heat.

Literature stirs us to action and rumination because it emerges out of the world without being accusatory. It combines the fields of history, philosophy, drama, and anthropology. It teaches us about the past, yes, but more importantly, it teaches us about the present’s direction into the future. And most often, literature is able to articulate the feelings of individuals who are searching to express their own experiences. In essay after essay, and class discussion after class discussion, reading literature motivates students to share their thoughts on technologies, social media, isolation, loneliness, and an increasing inability to connect in a world of increasing digital connections. They are quick to discern the pitfalls alongside the boons, the dangers amidst the progress because they themselves are experiencing them. Contrary to what some of my older friends and relations expect to hear, younger generations are increasingly aware of diminishing gains—and individual losses sustained. Literature pries open the imaginative spaces that the world closed for them; it elicits their silent questions and provides them with the spaciousness to search for answers.

In a recent class in which many of the students were film majors, future game designers, and aspiring writers, we were studying the foundations of stories, and we were reading the thoughts and theories of various critics who have weighed in over the centuries, including Northrop Frye (1912–1991) and his ideas on archetypal narratives. In this case, I drew a circular diagram on the board depicting these narratives in four categories: Romances (usually involving quests; science fiction falls under this category), Comedies (usually involving obstacles to be overcome, resulting in “inclusion”), Tragedies (of course ending in disaster), and Satires (ironic, sardonic, and often gritty). What becomes abundantly clear while reading Frye is how we have kept telling so many of the same stories, in different variations, for thousands of years. By studying such work, we become aware of our own contexts and histories. Shared worlds of past and present materialize in our imaginations. Understanding where we come from often helps us decide where to go. Our stories emerge from our lives, and our lives make up our histories. Telling a good story can change an individual’s world. Telling a good story well can leave a lasting impression over generations.

One student observed, “so Eternal Sunshine of the Spotless Mind really brings all of these genres together.” Yes! A friend once said that this movie, in particular, made magnificent use of its medium rather than simply borrowing the literary medium (and badly) as many movies do. Whether we are aware of it or not, we still largely adhere to Aristotle’s standard in the Poetics that plot is primary; however, Aristotle was referring to plays, and stories were generally dramatic or oral. We did not yet have the technology for film or the novel then, and each genre has had its own unique possibilities as media emerged. A great story does not simply rely on plot. A great story draws upon the numerous story elements available to the particular medium which carries it to an audience – whether that’s a stage, a screen, or a book. Eternal Sunshine of the Spotless Mind plays with its possibilities and makes use of its form in the delivery of its content. But I hadn’t considered the wide array of archetypal narratives it drew upon, and this student’s observation delighted me. More importantly, it inspired other students by bringing history into the present. Studying the history of philosophy, art, literature, music, and culture has a direct bearing on what we can learn and what we can create today. But this demands time, spaciousness, and thought. In this instance, the example is light-hearted, fun, and quirky, but imagine when it applies to graver topics.

In an era of so many digital natives, we complain a hell of a lot about diminishing attention spans, increasingly unfocused students, and multiplying addictions to social media platforms, gaming, and what should otherwise have been mundane methods of communication: emailing and texting. We often (falsely) believe that while the youngest among us are prone to the addictive tendencies that media giants like Google and Facebook prey upon, older generations are more immune; in fact, we are all vulnerable to the predatory algorithms, and like most addicts who are protective of their substances, the vast majority of us are protective of our content fixes. Our egos fiercely protect our basal ganglia’s circuits of triggers, needs, and fixes. We operate in an odd space of denial (not unlike decades of climate change denial), somewhat acknowledging that we can’t do without our devices while stubbornly or secretly looking forward to our next fix. This leaves us vulnerable in hard-to-discern ways.

I first assigned Nicholas Carr’s “Is Google Making Us Stupid?” (2008) soon after it was published in the Atlantic. I read in Carr’s article the truth of my own experience: my mind was changing. As Carr states,

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I brought the topic to my students. Only a few initially dismissed the possibility that Google was anything but good and benevolent, but by the end of the semester, we had all agreed: yes, our reliance on, and in some cases dependence on, the internet was changing our minds. Since then, I have not even had to pitch my case. We all find the internet, with its search engines and media platforms, useful and indispensable. But students are becoming increasingly savvy and self-aware about the nefarious effects of our online addictions and attachments—able to weigh the good and the bad without going to extremes.

Today, my students are downright sophisticated in their self-reflection about the damaging fallout of our social media use. The effects are of very different natures. Our reliance on Google signals a loss of serendipity, individuality, and the kind of deep knowledge accessed only by deep drilling – into databases, books, and archives, and even daydreams—the stuff that doing nothing is made of. In addition, we are actually losing the ability to drill down into ideas because we are losing the practices of research, browsing, questioning, and exploring the vast resources available to us over centuries. Google is a godsend in many ways, especially for students and writers, but it is only a tool of convenience, even as it is substituted for a well and wealth of knowledge—which it isn’t.

I’ve always been skeptical when we defend our internet searches with the argument that we have the world at our fingertips—as though the wealth of knowledge available is somehow just right there and doesn’t demand work and investments of time and mental energy. And when we think to ourselves, “I can look up anything I want!”, the question is, do we? I’m not arguing against the internet. Rather, I am skeptical of it becoming the first and last word, the answer to it all. Any scholar knows such a belief is rife with problems. As a friend once said, we are becoming a world of fingertips—a quip which carries multiple meanings. We are enclosing ourselves within ourselves; we are curating images of ourselves which play out in the world to be seen and read by others enclosed in themselves, who reply to our curated selves with their fingertips. As we venture out of our homes less frequently, we venture out of our own opinions and perspectives less frequently still. New algorithms facilitate this—delivering the search results we prefer to see, based on our search histories and online behaviors, which are being recorded, stored, and disseminated back to us in the form of ads and information that corporations want us to see or opinions various interest groups want us to believe.

The effects are all around us now, and one would need to be wrapped in a cocoon of greed, opportunism, or denial to insist that we do not have a social crisis on our hands – from the disintegration of relationships and social skills, the rise in mental illness and loneliness, and the decline in empathy, to diminishing attention spans, the erosion of our privacy, the death of small businesses and with them whole neighborhoods, and the elimination of lives alive with serendipity. As another friend once lamented, who wakes up on a Saturday morning inspired by memory anymore, a memory that would motivate us to get out of our pajamas and take a walk, stroll through downtown, browse the stacks of an independent bookstore which has stood at a corner for 75 years, or thumb through vinyl albums at the same record shop that everyone in the neighborhood has frequented for decades? How many small hellos have we given away on any given morning? How many conversations haven’t we had? And all of that without being seen, recorded, or tracked by cameras, Alexa, and Google apps on our phones. Why would we give all that up? For convenience and efficiency, essentially, bargains and ease.

The digitalverse focuses our attentions narrowly, whether it’s to purchase, vote, or curate images of ourselves and our lives, and there are benefits, of course, but the conversations that emerge from the humanities classroom force us to engage with the world as it is. The tangential and inspired sparks which come from debate and discussion help us hopscotch to unexpected realizations, so that we are better able to make unpredictable connections which relate to the world. During a lesson on creating atmosphere in stories, in addition to analyzing literary language, we also looked at cinematographic techniques. Horror movies are a particularly effective genre for this, and John Carpenter a particularly effective director. While watching a clip from a 1980 movie, The Fog, I was startled by a particular scene at a gas station. The aesthetics of the set were beautiful, and they made me nostalgic for our mechanical devices and the slower pace of life before digital. The lighting, the atmosphere, the colors were all gorgeously orchestrated. But what surprised me was the abundance of glass bottles and containers in the refrigerated displays of the gas station’s convenience store.

There was just a lot less plastic in circulation in 1980. I hadn’t really thought about it until noticing this detail in a movie during a lesson: the shift to plastic came as recently as the past 30 years, not, as I had assumed, farther back in the 1950s, when the suburbs were being built and convenience was being sold at a furious pace via the appliances advertised to populate the new suburban households. No, it was during the 1980s and the 1990s—when our awareness of the coming global crisis emerging from climate change was already well underway. This was not, of course, in the lesson plan design, but materialized naturally—leading to further discussion about the deliberate choices that were made for profit at the expense of polluting the planet.

We chose single-use plastic over recyclable glass because plastic was easier and cheaper. For the consumer, it was simply easier to load up shopping bags (then, paper, then plastic too) with plastic bottles and jars instead of glass ones. Plastic was lighter, it didn’t shatter, and it was (in the shortsighted view) easier to dispose of. For the corporations? Plastic was cheaper and far more profitable than glass. Plastic is now poisoning us and our oceans, all the inhabitants of the planet, and of course, the planet itself. Plastic is a problem. I don’t know many who would disagree. I bring it up as an example of shortsightedness, as an example of an inability to foresee its devastating consequences. Unfortunately, the same can be said of many of our inventions of efficiency, progress, and profit. Making the connections, or “critical thinking,” is central to the humanities at any given moment, unpredictably, unexpectedly.

2

In a recent WSJ Saturday Essay, “Stop Worrying About the ‘Death’ of the Humanities” (2019), Adam Kirsch rightly argues that “[p]ursuits such as literature, art, and philosophy are fundamental expressions of human nature. While they have taken very different forms in different times and places, no civilization has been without them, and there is no reason to think that ours will be the first.” With respect to recent trends, such as the soaring popularity of STEM majors and the drain on humanities majors, Kirsch notes that “[t]hese trends started to spark alarm around five years ago, when observers began to talk about the ‘death,’ ‘decline’ or ‘crisis’ of the humanities. Since then, alarm has turned into something more like panic. ‘Who is going to save the humanities?’ wondered Michael Massing in the New York Review of Books earlier this month, echoing last summer’s headline in the Chronicle of Higher Education: ‘The Humanities as We Know Them Are Doomed. Now What?’”

Massing attributes the huge numbers of students flocking to STEM majors to the 2008 financial crisis, looming student debt, and the disparity between starting salaries for STEM and humanities graduates. As he notes, “the median annual earnings for engineering grads is $82,000, compared to $52,000 for humanities grads.” Eric Hayot, in “The Humanities as We Know Them Are Doomed. Now What?,” notes with alarm that “the decline in the humanities majors since 2010 is over 50 percent.” And rightly, Hayot advises us to reconsider the ethics and wisdom of continuing to accept doctoral candidates into programs when we know that there won’t be any jobs waiting for them on the other side of harrowing years of study, research, debt, and effort. As he asks, “[w]hen an English department goes from 414 majors in 2005 to 155 in 2015 (as did the University of Illinois at Urbana-Champaign’s), or from 126 to 55 (as did Harvard’s), what department head can justify increasing the size of the tenure-line faculty? On what grounds can we even argue for hiring replacements for our retiring colleagues?”

The mindset which has drained the humanities of their perceived value in the larger culture comes, in part, from a misunderstanding of their role in education. Kirsch demonstrates this tendency when he argues that “[u]niversities are not responsible for, or capable of, creating a living humanistic culture” because while “scholarship is an important part of that culture,” it is “not its engine.” While scholarship in the humanities may indeed not be the engine per se of the arts and humanities, the business of universities is not only to engage in scholarship but also, quite obviously, education. Education may lay the foundation for the creation of the arts and a humanistic culture, and education may most certainly provide the atmosphere needed to facilitate and produce a humanistic culture writ large. In the university, there is room for both: tucked into the corners of databases, libraries, and labs, research and scholarship may blend with what happens in the classroom, teaching the foundational studies and creating the environs for creative invention, depth of understanding, and the kinds of critical skills which may facilitate a better, kinder, more vibrant world. Some institutions of higher education emphasize research; others emphasize teaching. This is common knowledge. In either case, there is room for both the study and the creation of the critical and the creative—in the humanities as well as in the scientific, technological, engineering, and mathematical fields.

A point most often overlooked is this: in financially elite institutions, even drained of historically stable financial backing, the humanities will continue in some form. The danger is that the humanities may become less accessible to all but the most privileged. This is a problem because the education offered by the universities is for everybody, not only a financially elite few, and with that education comes exposure: an education in the humanities is also about access (a blind spot I suspect stems from the kind of ignorance born of privilege, a Marie Antoinette absence of knowledge). So while we may not have cause to worry about the death of the humanities by way of creating or accessing the arts and philosophies in the more privileged strata of American society, we certainly need to worry about access to literature, art, history, philosophy, and music at a time when the culture so one-sidedly values STEM and its potential profits for those who require more grist for their multi-billion-dollar mills.

Yet it is true, as Kirsch claims, that interest in the humanities does not seem to be waning among the American public. People are still reading, art exhibits and theatrical productions still have audiences, and movies continue to play a central role in American culture, perhaps more so than ever before. Contrary to Kirsch’s claim that “[m]uch of what students learn may not be directly applicable to their lives and careers,” which cites “need[ing] to know how to identify a school of painting or interpret a poem” as terribly narrow examples of the skill sets the humanities provide, the humanities directly apply to students’ lives in ways that are transformative, empowering, and irreplaceable: 1) We encourage students to take risks and enter into conversations while confidently demonstrating communication skills ourselves, kindly but strongly. A young woman, nearing graduation, approached me at the end of a class to tell me that she feels empowered to enter a male-dominated field confidently after taking my class. She is armed with tools for success. 2) We teach literature, which, among other things, cultivates empathy. A young man told me that they had all learned how to be more empathetic toward people. He will be well equipped to succeed in team efforts and to gauge the consequences of actions and policies on others. 3) We encourage them to advocate for themselves and to identify the connections between the work we do in class and their future fields of study. Another student wrote that they learned how to be adults. They are better prepared to take responsibility for their work and their goals. 4) We teach research and reasoned argument in every class. Another student told me that she learned, for the first time, how to converse about controversial topics without getting angry, armed with new skill sets, language, and historical precedents. The ability to keep a level head during conversation or conflict will serve her well professionally. Other comments, feedback, and evaluations attest to learning how to write clearly, persuade effectively, read deeply, organize efficiently, and research thoroughly, to say nothing of thinking critically and historically while analyzing thoroughly and fairly. I could cite, literally, hundreds of these student comments. What so many discussions about the humanities lack is knowledge of the actual experience of the actual students for whom education is most meaningful. And yes, this does and will affect American culture and even democracy for generations to come.

The humanities do teach us how to think critically, which in turn may teach us how to feel empathetically. The humanities provide us with skill sets that include reading critically, writing articulately, analyzing astutely, and, yes, creating inventively. Perhaps most importantly, the humanities empower and enable us to climb out from the inner coils of the ideologies that bind us too tightly, to see the bigger picture, to recognize our own individual worth and value the worth of others, and to learn how to navigate, and in some cases transform, the cultural currents of the moment. This is indisputable though too easily overlooked, because such effects are so diverse, distant, and difficult to measure.

Kirsch quotes Matthew Arnold, the nineteenth-century critic, as having “defined culture as ‘the best which has been thought and said in the world.’” Taken out of context, as it is here, it is true that “few humanities professors would now want to claim the authority to say what is best, or even agree that there is such a thing as ‘the best’ art or thought.” Arnold’s two finer points in “The Function of Criticism at the Present Time,” however, had to do with the role of criticism in culture, what we would now think of as scholarship in the humanities. First, Arnold juxtaposes the critical with the creative, judging the critical to be of a “lower rank” than the creative, but he goes on to say that creativity is not limited to “producing great works of literature or art.” Critical endeavors may be creative in their own right. Second, he draws our attention to the role of the critical in relation to the creative, which is that the critical work creates the atmosphere in which the creative work may thrive:

for creative literary genius does not principally show itself in discovering new ideas; that is rather the business of the philosopher; the grand work of literary genius is a work of synthesis and exposition, not of analysis and discovery; its gift lies in the faculty of being happily inspired by a certain intellectual and spiritual atmosphere, by a certain order of ideas, when it finds itself in them; of dealing divinely with these ideas, presenting them in the most effective and attractive combinations, making beautiful works with them, in short.

In myriad ways, the cultural critic has the capacity to create the atmosphere in which the arts and humanities may flourish. Critics and scholars alike have played pivotal roles in bringing the public’s attention to one work of art or another, pulling works to the forefront or out of obscurity, and moving individuals to create. To claim otherwise is stubborn and, tragically, an offshoot of the current climate’s way of thinking. Shakespeare, for instance, was rescued from falling into disfavor by a scholar, Samuel Johnson.

As for the role of the humanities in culture, Kirsch echoes a claim made by George Steiner, among others, that the humanities do not humanize. The idea that the “humanities humanize,” Kirsch says, is a “difficult argument to make” given examples to the contrary, and he points to two of the most horrifying, depraved, inhumane instances of hatred, the genocides of Hitler and Stalin. Kirsch observes that “highly educated people … were, for instance, devoted Nazis or Stalinists and … used their learning to buttress their defense of inhumanity.” I am reminded of post hoc ergo propter hoc, or false cause. Although these tragedies marked a failure of the hope of progress for humanity, the great collapse of Enlightenment hopes for us as a species, these crimes against humanity were not perpetrated because of the humanities. The horrors and atrocities of twentieth-century human behavior were a failure of humanity, not of the humanities. Moreover, the humanities remember.

The humanities may not be able to save us from our destructive and sometimes evil tendencies as a species; however, in an admittedly faith-based essay, Mark Watney makes a compelling observation: “despite the humanities’ inability to ‘humanize’ us in the Enlightenment sense of the word, they have left us a remarkable record of our deep-rooted and mysterious sense of right and wrong.” He suggests that “the true ‘humanizing’ power of the humanities [is] not in protecting us from depravity, but in exposing its terrifying reality, and recording our despair and yearning for purity.” In a fascinating Atlantic article celebrating the life of a former professor in memoriam, “Humanizing the Humanities: John Rassias and the Art of Teaching,” Lara N. Dotson-Renta describes the situation this way:

The humanities and the “soft” skills these disciplines foster are pitted against the sciences when in fact they are a part of an ecosystem of knowledge, a balance in ways of thinking and seeing. There is value in debating the ethics of King Creon’s refusal to allow Antigone to bury her brother in Antigone, just as there is worth in delving into the mysteries of atoms. While society at large requires more “education” to advance, it has failed to see that it comes in many forms, arguably narrowing its meaning and application.

Dotson-Renta is right when she observes that “[a]s a culture, the country has come to place decreasing value on thoughtfulness, abstraction, and nuanced critical thinking that poses big (uncomfortable) questions rather than presuming answers.” I often ask my students to “follow the question” in search of answers rather than chasing, through research, the answers they hope will reinforce their existing opinions. Being able to rest in uncertainties, seek out uncomfortable truths, and extricate ourselves from blind adherence to what we already believe is one of the true powers of a humanities education.

3

Anecdotally, I first realized how endemic the undervaluing of the humanities was in academia during my time at NYU twenty years ago. The disparity between the living accommodations offered to humanities versus STEM students would have been laughable were it not so deleterious. Humanities graduate students, for example, were crammed in pairs into studio apartments built for one person while neuroscience graduate students enjoyed enormous apartments all to themselves. We paid $2,000/month for the pleasure of living in a small, dark studio tucked into one of the over-large, super-block Washington Square Village buildings built in 1959 (though granted, they were well located in the Village), as we pathetically tried to create some semblance of privacy with folding screens we purchased at the family-owned corner hardware store (gone by now, I’m sure). By contrast, my neuroscientist friend paid $400/month for a single, a hip, inspiring, three-times-as-large apartment in a small, red-brick, nineteenth-century building with character, an open floor design, and a sweet fire escape. He had pillars. We had cockroaches. He had large windows and hardwood floors. We had aluminum-framed windows (only one opened) and linoleum. To a student of cultural journalism and literature at the time, or, in my roommate’s case, music therapy, the message was clear: theorizing a tinier part of a teeny tiny part (as my neuroscience friend described it) of a tiny specialized field of science, which might one day be useful for a population, or not, was more valuable than an informed population or healing an abused child. The humanities address individuals and communities, which are made up of individuals, while STEM fields address populations, theories, and the possibilities of progress, irrespective of individuals. Where value is placed is reflected in the treatment of each field’s students and scholars. A professor of mine in graduate school, Ira Shor, once told us to look around: the rooms we inhabit, as tenants, workers, or students, tell us a lot about how we are valued, or not.

The maltreatment trickled down into a trailing condescension that infected even our social lives, slyly swirling around the cocktail table one night at a favorite nightclub, The Fez in NoHo (gone since 2005). Caught in the middle of a debate I now look back on as toxic masculinity in action, I sat uncomfortably, a young woman between two neuroscientists, a would-be knight in shining armor and an aggressor, as they debated the merits of a graduate degree in the humanities. My friend argued the case for the humanities, having investigated what we humanities types do with our critical thinking and scholarship. Literary criticism had passed the test, and he had become convinced that our intellectual pursuits were as intellectually rigorous as theirs. My friend’s colleague, on the other hand, snorted and disagreed, looking simultaneously amused and uncomprehending. He made it clear he thought the very idea ridiculous. Only later would I realize just how insulting that conversation had been on a personal level. But the attitude was endemic. The universities have been telling us in the humanities that we aren’t worth as much as our STEM counterparts for some time. Why?

Follow the money. It isn’t because the humanities aren’t worth as much culturally, artistically, individually, or intellectually; it’s because STEM is more profitable and can be harnessed for maximum power by elites and organizations, whereas the business of the humanities is actually, in part, to stem that flow of unchecked power and money through many means: cultivating critical thinking skills, questioning dominant authorities, challenging unquestioned ideologies, studying history, reading literature, and fostering the kinds of perspectives through which, gasp, empathy emerges. As a result, in a market-based economy that has transformed into a full-blown market-driven culture, the humanities have been hobbled, trimmed, cornered, and harnessed—because the humanities have the potential to get people to think.

In 2014, seemingly while sipping an old fashioned, Benjamin Winterhalter wrote a wonderful article for The Atlantic, “The Morbid Fascination With the Death of the Humanities,” in which he observes that “the humanities crisis is largely a positive feedback loop created by stressing out over economic outcomes.” After meeting and talking with Matt Langione, who was studying literature at Berkeley at the time and whose interest in Modernist literature ran by way of neuroscience, Winterhalter captures the irony: “The novelty of Matt’s studies, it seemed to me, encapsulated the craziest thing of all about the whole ‘crisis of the humanities.’ The conversation about funding for the humanities somehow manages to proceed in complete isolation from the actual practices of today’s humanistic scholars.” The fact is, the humanities are not irrelevant. They have never been irrelevant. They are, nonetheless, a problem for corporate America because the humanities are (or should be) engaged in the business of thinking. If we cannot save the humanities in education, we will see the destructive effects in culture, perhaps not with a bang but a whimper.

 

Works Cited

Baker, Kevin. “The Death of a Once Great City.” Harper’s, July 2018, https://harpers.org/archive/2018/07/the-death-of-new-york-city-gentrification/. Accessed 20 June 2019.

Bosker, Bianca. “The Binge Breaker.” The Atlantic, Nov. 2016, https://www.theatlantic.com/magazine/archive/2016/11/the-binge-breaker/501122/. Accessed 20 June 2019.

Bradbury, Ray. “The Veldt.” The Illustrated Man, 1951, pp. 7-19, http://teachers.wrdsb.ca/jackmaw/files/2012/09/The-Veldt-Ray-Bradbury-pdf.pdf. Accessed 19 May 2019.

Butler, Octavia. “Speech Sounds.” Isaac Asimov’s Science Fiction Magazine, Dec. 1983, https://www.unl.edu/english/docs/englishweek17/engl200-speechsounds.pdf. Accessed 19 May 2019.

Carr, Nicholas. “Is Google Making Us Stupid?” The Atlantic, July/Aug. 2008, www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868/. Accessed 19 May 2019.

Dotson-Renta, Lara N. “Humanizing the Humanities.” The Atlantic, 17 Jan. 2016, https://www.theatlantic.com/education/archive/2016/01/humanizing-the-humanities/424470/. Accessed 25 June 2019.

Freeman, Michael. Railways and the Victorian Imagination. Yale UP, 1999.

Forster, E.M. “The Machine Stops.” Oxford and Cambridge Review, Nov. 1909, The University of Rhode Island, https://www.ele.uri.edu/faculty/vetter/Other-stuff/The-Machine-Stops.pdf. Accessed 19 May 2019.

Hayot, Eric. “The Humanities as We Know Them Are Doomed. Now What?” The Chronicle of Higher Education, 9 July 2018, https://www.ewa.org/latest-news/humanities-we-know-them-are-doomed-now-what. Accessed 25 June 2019.

“Humans Change the World.” The Smithsonian Institution’s Human Origins Program, Smithsonian National Museum of Natural History, 14 Sept. 2018, humanorigins.si.edu/human-characteristics/humans-change-world. Accessed 19 May 2019.

Kerry, Cameron F. “Why Protecting Privacy is a Losing Game Today—and How to Change the Game.” Brookings, 12 July 2018, https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game/. Accessed 20 June 2019.

Kirsch, Adam. “Stop Worrying About the ‘Death’ of the Humanities.” The Wall Street Journal, 26 April 2019, https://www.wsj.com/articles/stop-worrying-about-the-death-of-the-humanities-11556290279. Accessed 25 June 2019.

Lawton, Graham. “Every Human Culture Includes Cooking – This is How it Began.” New Scientist, 2 Nov. 2016, https://www.newscientist.com/article/mg23230980-600-what-was-the-first-cooked-meal/. Accessed 25 June 2019.

Le Guin, Ursula. “The Ones Who Walk Away from Omelas.” New Dimensions 3, edited by Robert Silverberg, Nelson Doubleday, 1973, https://www.utilitarianism.com/nu/omelas.pdf. Accessed 19 May 2019.

Manney, PJ. “Empathy in the Time of Technology: How Storytelling is the Key to Empathy.” Journal of Evolution and Technology, Sept. 2008, https://jetpress.org/v19/manney.htm. Accessed 25 June 2019.

Massing, Michael. “Are the Humanities History?” The New York Review of Books, 2 April 2019, https://www.nybooks.com/daily/2019/04/02/are-the-humanities-history/. Accessed 25 June 2019.

Palermo, Elizabeth. “Who Invented the Steam Engine?” Live Science, 19 March 2014, https://www.livescience.com/44186-who-invented-the-steam-engine.html. Accessed 25 June 2019.

Rosin, Hanna. “The End of Empathy.” Civility Wars, NPR, 15 April 2019, https://www.npr.org/2019/04/15/712249664/the-end-of-empathy. Accessed 20 June 2019.

“The History of Electricity—A Timeline.” The Historical Archive, 13 Feb. 2007, http://www.thehistoricalarchive.com/happenings/57/the-history-of-electricity-a-timeline/. Accessed 25 June 2019.

“The Invention and History of the Printing Press.” PS Print, https://www.psprint.com/resources/printing-press/. Accessed 25 June 2019.

Twenge, Jean M. “Have Smartphones Destroyed a Generation?” The Atlantic, Sept. 2017, https://www.theatlantic.com/magazine/archive/2017/09/has-the-smartphone-destroyed-a-generation/534198/. Accessed 25 June 2019.

Watney, Mark. “Torturing Jews and Weeping Over Schubert: Have the Humanities Failed to Humanize Us?” Dappled Things, n.d., https://dappledthings.org/12338/torturing-jews-and-weeping-over-schubert-have-the-humanities-failed-to-humanize-us/. Accessed 25 June 2019.

Winterhalter, Benjamin. “The Morbid Fascination with the Death of the Humanities.” The Atlantic, 6 June 2014, https://www.theatlantic.com/education/archive/2014/06/the-morbid-fascination-with-the-death-of-the-humanities/372216/. Accessed 25 June 2019.

Wolchover, Nathalie. “Why It Took So Long to Invent the Wheel.” Live Science, 2 March 2012, https://www.livescience.com/18808-invention-wheel.html. Accessed 25 June 2019.

 

The Lost Quiet

When was the last time you walked into the world without filters? When was the last time you gazed out onto a beautiful landscape without thinking of taking a picture, or deciding not to take a picture? When was the last time a picture never entered your mind? When was the last time you worked in silence or sat inside a quiet house? I miss the quiet noises, lost now to the buzzing, blinging, swooshing, and scrolling.

Radio Days

Lately I’ve been thinking a lot about how to cultivate a radio life again. I never had to cultivate a “radio life” back in the days I now reminisce about. Those days just happened. My days were my own back then, before 2010. I don’t even need to go back to the ’90s to recall times when I woke to look out my window instead of down at a screen, when my thoughts, plans, and daydreams wandered according to a pace and direction my own mind and life set before me instead of the random pings and alerts set by my social media feeds. Back in the day, the radio days, I had more spaciousness to ruminate, to take walks, to listen to the radio. I miss the work of the mix tape and the living of the mixed-up life. “Say So Long to Serendipity,” wrote John Balzar in a 2002 article. I still have the now-old newspaper clipping taped to the wall over my desk back at home. The iPhone would not exist for another five years, Facebook for a couple more, never mind Instagram, Twitter, and Tumblr, eight, four, and five years away, respectively. Was the world far quieter back then? Not really, but my days were my own. Over the next few months, I’ll be working out how to cultivate the old radio days again.

Following the question …

It’s all too easy to argue what we know, or what we think we know, what’s familiar, or what we are sure we believe. It’s much harder to let go of our attachments to those beliefs and to follow our questions into new ways of thinking, being, and feeling. Here we invite inquiry, connection, and curiosity with the goal of cracking open new ways of seeing that cultivate empathy. By leading with questions instead of answers, we may find ourselves more able to think critically, analytically, and creatively. Maybe we can then write ourselves into newer, more vibrant, more compassionate worlds by following the question.

?