THAT VS. WHO: The Politics of Linguistics
I Object! Or is that Subject? This was my first reaction to the common linguistic objectification of sentient beings which has become pervasive in America—main case in point: The use of the word that in the place of the word who. Common linguistic misuse often does not register as such, eventually becoming acceptable, despite written grammatical standards chronicled in dictionaries and stylebooks. Who? That? Whatever . . . . No! These are simple yet powerful words, the usage of which potentially can have serious ramifications—root and core words, therefore of primal importance.
The improbable foreigner that remembered his dream of strangers realized he was a tourist in the eyes of the person that spoke in logical conditionals that was probably a judge or a journalist writing about my friend that was renowned for raising the eyebrows of his pet rabbits with his nomadic life.
A parallel may be drawn between the case of that vs. who and concepts of objectivity vs. subjectivity, rationality vs. irrationality, and the apparent schizophrenic dichotomy of the left and right hemisphere functions of the human brain—how they can operate in harmony, at odds, or with one side dominating due to biological causes or, more likely, less exertion of the other side.
The usage of that instead of who has become prevalent; but I ask you to consciously recall the origins of how it has permeated your use of it (if it has indeed)—just how did you absorb it? This is opposed to the sort of trendy, infectious waves of linguistic rashes (of short duration), or diseases (lingering) of certain words, for example “actually” and “basically,” which people used compulsively (nervous tics), as anchors to somehow reassure themselves of their reality; here in New York City, among journalists, a swift, ubiquitous rash spread of the word “ubiquitous,” as if the unintentional punning was somehow too much to remain contained; the obscure, unearthed word “schadenfreude,” which, once committed to memory, took hold with a (short spate of) vengeance; the virulent “like” has been a veritable insidious plague; but, most alarming, has been the fact that many people are referring to one another as that’s instead of who’s (and thankfully not it’s, yet, because that would be just too blatantly inappropriate), perpetuating, subconsciously, the dehumanizing of people as quantities or objects—this recent mutation in the usage of the word that, oddly enough, has coincided with the growth of globalization.
Globalization and its media coverage were projected to aid in the “humanizing” of peoples in other countries, for non-travelers and those with no direct familial ties abroad; believe it or not, there are adults who have never lived anywhere but in their own hometowns, in other countries too!; and people dying or being killed at any given moment, in other nations too, like in our own!—to prove to us that our country should not be as egocentrically nationalistic as (gasp) other countries are! What country in the world projects the notion: “Yeah, I guess we are an OK people, but there are far better countries, so we might as well just go crawl under a rock for their edification. We give up, just invade us now, and get it over with already.”?—if you hear of one, let me know. New world journalism was supposed to bring us all closer (making the world “smaller.” What? Why not bigger?), calling us to empathize with this person’s hurting or that person’s pure joy; but it has, just as likely, had the opposite effect of objectifying people’s suffering or diminishing their exultations, depending on who is watching from what perspective or state of mind—which has led to juxtapositional schizophrenia, induced by data and sensory overload.
Globalization, a flashback to the roots (pre-omni tête-à-tête): Most world news coverage in the grand United States used to be limited to relatively small segments on the evening news, a couple of weekly news “magazines” on television, and a few informative newspapers and print magazines; flashing forward again, we see that, if people are interested, they can know about the political situation of almost any country on any day of the year, even obtaining up-to-the-minute reporting if there is a particularly interesting breaking news story—but not as readily in circumstances of an extended and ongoing one, such as collective starvation, genocide or epidemics such as AIDS (those can be researched extensively on the Internet); since the average resident of the USA possesses the attention span of a two-year-old caught between the throes of yet another temper tantrum, news stories (if they do not consist of pure sensationalism) warrant one solid week of media concentration, two tops, if they are lucky, before interest wanes drastically—people are out there working, shopping, and online right now, ignoring much more than that, just one or two clicks away from news of any current travesty they wish to avoid; so, the world goes on as usual, just as it did during this or that war occurring at any period in human history, but now many of us can know so much more, forcing us to be more responsible and culpable for the willful obliviousness which we display day to day.
Globalization, exponentialized by the awesome Technological Revolution, promised to allow us more leisure time, because the popular assumption (fed by the media) was that the artificial intelligence of machines would make lives easier; but in reality, it has just accelerated activities phenomenally, generating endless work and stress, especially for people in metropolitan areas, some of whom are becoming overwhelmed, juggling scatterbrains, producing inevitable errors. As much as I extol the concept of the evolution of our species by pushing intelligence further and further out, I predict that more people are going to break, despite their drugs, exercise, religion, therapy, and multifarious escapes (some of which are supported by open-minded corporate employers to ostensibly reduce stress levels, but strategically, to prepare the employee to endure even more work).
Is this really going to have to get brutally Darwinian?
I can imagine—and use the word quite carefully in this context—America moving on a slow course in the direction of Nazi Germany in its nationalistic mentality—I know that envisioning this might seem to some a harsh criticism, perhaps unwarranted or hyperbolic, yet, seriously, consider this: Who ever believes the atrocities of a country until they become concrete and verifiable? Then the ensuing, retroactive debate—hindsight. Recent politics have given birth to unsettling political Family Dynasties, such as those of The Bushes and the Bin Ladens, associated through The Carlyle Group business of President George H. W. Bush (a former Director of the CIA), directly, and President George W. Bush (a former CEO), indirectly, one company down—until The Fall out of the World Trade Center; this seems even less surprising considering that the preceding Prescott Bush (a US Senator) found himself transacting indirect business through a few companies which financed the German government during the Third Reich, until the U.S. government’s Trading with the Enemy Act put a stop to it. The U.S. government aided Saddam Hussein’s authoritarian reign during the Iraq/Iran war. These are just a few examples in U.S. history of advantageous business involvement with smaller countries (exhibiting no sense of ethics) against others (for relatively temporary gains), followed by distancing, dissociation and erasure through propaganda when a tie is no longer advantageous; then ultimate betrayal—a recognizable cycle. To drill down a bit more: In the environs, be they literal or political, people do not heed the warning signs of toxic situations until the damages become manifestly undeniable. Journalists’ and politicians’ collective jaws drop this week or that, as they ask “How did this happen?” and proclaim “We must do something about it. It should never happen again.”—with regularity, these serial emergencies are treated with the same rhetoric, despite the inherent differences of the actual events.
The only Revolution that I can get in front of is that of The Mind; Freedom ultimately is in one’s Head—that of Absolute Sovereignty.
Which brings me back to who vs. that.
It is still grammatically proper to use the word who when referencing a person by his or her classification, title, or entity, as defined by the object relations of societal function (such as a person’s job title—lawyer, cashier, waitress, etcetera): “The judge who . . .” “The journalist who . . .”—no matter how objective these specific groups of people strive to be, and regardless of their attempts to flatten the field with their egalitarianism, grammatically, they are still who’s, not that’s.
Lumping people into easily assigned groups, sussing and dismissing, without attempting to understand them individually, is, by the very action, generalizing; yes, it is much easier to consider people as components of groups, such as Capitalists, Socialists, Communists, Radicals, Republicans, Democrats and so on, rather than perceiving them as individuals encountered on a one-to-one basis (group vs. person) as, say, a coworker, with whom I converse daily, because that might compromise one’s ideas about being for or against a specific group or movement. The ability to see subjectively and objectively, at will, can be of extreme importance in how one navigates one’s life.
One could posit that all sentient beings, human beings and other animals, could be considered who’s, with consciousness—one could even broaden this to vegetation, to the extent that this ideal might create vegetarians out of people who wish to counteract the narrowness of their natural, speciescentric inclinations; so, even broccoli could be considered a who, as that’s disintegrate into the oblivion of who suchness: when every particle exists with its own level of consciousness, without which it could not exist. Descartes was wrong in his assessment that animals do not have souls because they do not “think.” Everything in existence must “think,” therefore it All Exists—there is no space that is not full of Something; even the seemingly Empty is Full; and everything is in flux, as Heraclitus intuited so simply; this has been proven by the sciences of biology, chemistry, and physics, among others. Then again, applying the same logic, everything just mentioned could all be thought of as thatness, instead—yet whoness has been a speciescentric construct, but thatness, too impersonal.
One can apply some imagination to spot the obvious anthropomorphizing of objects all around us: Humans have created abundant objects in the encompassing architecture—many of these objects directly reflect their creators: The phallic, the round, the facial—structures which echo parts of the human body and psychology, as well as mimic natural patterns and phenomena. For centuries, some languages have featured the genderizing of practically every object—all that linguistic effort ensconced in the poetic and/or the sexist. So humans, theoretically, could anthropomorphize all existing things, transforming everything into a who—this is where some have created a God in their own image who necessarily must think like a Human—a very LARGE Human; other Gods created by Humans are allowed more freedom to be less known.
So is everything, including human beings, destined to be all that, all who, or both, in some fluctuating proportion, until we eventually become extinct or leave the planet? Should we even try to retain these linguistic distinctions between the human who, and all other matter that?
The Turk who murdered the Armenian that; The Nazi who murdered the Jewish that; The Jew who murdered the Muslim that; The Arabs who murdered The American World Trade Center that; The Americans who murdered the _______ that . . . .
Who always wants to win.
The _______ who will murder the _______ that.
The _______ that will murder the _______ that?
Wars kill linguistic distinctions.
Copyright Carol Maric 2007
All Rights Reserved
3 Comments:
It's nice to see that someone else is also bothered by the "that v. who" issue. When I was in college, one of my professors gave a long lecture (dissertation, almost) about this. Thanks for the post!
Remember, all of the rules of grammar in the English language were simply made up when we decided to write a dictionary who knows how many years ago. Even spelling was simply made up. I suggest not trying to reason with it because there is no reason behind it; I say whichever usage is popular in society at the time is the correct usage. If who is popularly interchangeable with that, then it's correct; if it's the other way around, then it's fine. Forget rules. It's not considered proper to use "they" when referring to one person, but I do it anyway because I don't want to say "he," I find it sexist.
"The usage of that instead who has become prevalent; but I ask you to consciously recall the origins of how it has permeated your use of it (if it has indeed)—just how did you absorb it?"
Microsoft Word's grammar check seems to favor "that."