Scenario 1: You’re in traffic; someone drives recklessly and almost causes you to wreck; you get angry and exhibit “road rage.” You say naughty words.
Scenario 2: You’re in a theatre; a character in a film dies; you get sad and try to hide your tears. But the sobbing is irrepressible.

Scenario 3: You’re reading T. S. Eliot’s “The Metaphysical Poets” and come across the phrase “dissociation of sensibility.” You look up, turn a thought over, but other than that, nothing happens.

Scenarios 1 and 2 are examples of personalizing. Scenario 3 is an example of depersonalizing. We spend our days personalizing and depersonalizing. If we saw a near miss on the highway by two drivers and a subsequent road-rage response, we would probably depersonalize. If we were forced to see a film on a subject for which we had no affinity, we would probably exhibit our own dissociation of sensibility.

Questions about personalizing and depersonalizing: Which of the following YouTube videos would you personalize, and which would you depersonalize? A video:
Which of the following would elicit personalizing or depersonalizing from you:
Well, you get the point, don’t you? We spend much of our days personalizing or depersonalizing our experiences. In instances involving us directly, personalizing is often, but not invariably, automatic. Depersonalizing results from distance, from cognitive training, and sometimes from a neurological malady that inhibits empathy. I suppose one might argue that depersonalizing is synonymous with distancing. As you observe yourself and others over the next day, ask whether you are witnessing personalizing or depersonalizing. That which is personal is meaningful.
Huh? Mayor Eric Adams of NYC and Mayor Muriel Bowser of Washington, D.C., held press conferences to express their anger that the Texas governor bused illegal border crossers to their cities. At a distance, the border residents’ problems with illegal aliens, the unseen deaths of tens of thousands by fentanyl, and the crimes committed by illegal aliens seemed to be meaningless until the problem became a personal one. Shipped by plane by the federal government at night to New York, and now in the day by bus by the government of Texas, the illegal migrants are suddenly a personal problem for the people in “sanctuary cities,” and the border crisis has become meaningful. What happens elsewhere is one thing. What happens in my place is another, eh? That which is personal is meaningful.

Futuristic design, graceful arches: the new Sixth Street Bridge closed shortly after it opened. No, it doesn’t have a structural problem. It has a human problem: drag racing, donut hole maneuvers, and people climbing the wide arches for selfies. Police have issued numerous citations, and now the authorities are devising preventive measures to keep people safe; the first measure is temporarily closing the multimillion-dollar bridge.*
The best-laid plans, eh? That our human club membership includes both criminal and foolish people shouldn’t surprise anyone, but those who designed and funded the bridge might have considered that some would discover new uses for it and would do so to their own and others’ jeopardy. There isn’t much anyone can do to ensure a bridge will be used as it was intended, including any social or psychological bridge.

Take the current perception of an angry America split along political, economic, and racial lines as an example of an unbridged gap. Can we build a bridge that connects one side to the other without its being misused? Note what happens during televised debates between pundits. In very few instances does either side cross the bridge, and in some instances one or both sides destroy it. The next time you see a debate, picture a bridge, shiny and new and built with the purpose to connect. And then watch the drag racing, the donut hole maneuvers, and the people climbing the supports to show off for onlookers. Watch, also, the frustrated moderator try to facilitate free movement across the bridge only to yield to the necessity of shutting it down by interrupting the debate with a commercial, a cutaway, and another set of guests. In some instances, those on either side of the gap simply walk away from the bridge, unwilling even to meet at the halfway point or acknowledge its purpose to connect one side to the other.

The primary purpose of a bridge is to connect places and people. Bridges facilitate the interactions of society. That all bridges eventually need some repairs is inevitable, but what we see in social and psychological bridges isn’t a long-term wearing down. Almost as soon as such bridges span gaps, they need repairs.
*https://www.nbclosangeles.com/news/local/sixth-street-bridge-arches-boyle-heights-downtown-la/2944212/
https://www.latimes.com/california/story/2022-07-20/the-sixth-street-viaducts-baptism-by-la
https://abc7.com/sixth-street-bridge-takeovers-friday-night-closure/12069497/

Although I have not read every word in the Bible, I have read, in my opinion, enough to have an opinion. And no, I’m neither a theologian nor a literalist. Just a guy who picked up the Bible and read a bit, more, I confess, than I have read in the Koran, the Book of Mormon, and the Upanishads, all of which I have also picked up to read at times. In all those books, and in the passed-on wisdom of the Buddha, I sense an underlying, if not explicit, assumption that pride is on average the biggest moral no-no. It’s pride, for example, that enables one to lord over another, to rob another, and to injure or kill another. Sure, there are other motives involved; greed, we could argue, underlies slavery, lust underlies sex slavery, and the other Seven Deadlies underlie behaviors that we might associate with pride, including its excessive form: vanity. If I were so bold as to rewrite the Ten Commandments, I might simplify all of them into a single dictum: “Be not proud” or its synonym “Be humble.” (Or, for devotees of the King James Version: “Be ye not proud.”)
I suppose that many people believe that the first sin had something to do with sex—what with all that nakedness in Paradise and nobody to watch what was going on. But the two creation stories in Genesis—yes, there are two—provide a context for ascribing the first sin to pride.* In the second creation story (chapter 2), God tells Adam that he is forbidden to eat from the Tree of Knowledge of Good and Evil. Later in that story, after her creation, Eve encounters the serpent, who asks her why she can’t eat from that one tree. The serpent then tells her that upon eating the fruit, she will find her “eyes opened” and that, with Adam, she will be like a god. Tell me that isn’t the groundwork for an act of pride. Nothing says “You can’t tell me what I can or cannot do” more than eating from the Tree of the Knowledge of Good and Evil.

Pride is always part of, or on the edge of, human actions involving authority and individualism, as every rebellious generation runs counter to the previous generation, swinging the social, philosophical, and psychological pendulum. It is the issue of Big Government bureaucrats’ intrusion into the lives of individuals; it underlies the quandary of choosing between any governing person or body and personal freedom. It lies at the heart of collectivism and individualism. Conformity in the extreme is acquiescence; rebellion is self-reliance. Both sides of the issue are manifestations of pride, the collectivist decreeing that all must follow whatever he writes as law, and the individualist rejecting dicta and accepting the role of a god on Earth, fully aware of the knowledge of good and evil. Pride feeds the root of rebellion, and it seems destined to reveal itself either subtly or overtly. If in this year you want to see its subtle variations, watch on-the-street interviews of Russians who oppose the war in Ukraine.
In fear that opposition to the official position means imprisonment, those who wish to rebel say, “I cannot comment.”

So, a couple of 2022 stories from America illustrate the quandary over whether or not to eat from the Tree of Knowledge of Good and Evil to become “like gods,” as the serpent says. First, there’s the story of a ten-year-old girl in Hawaii who, having been bullied by another girl or girls, drew a picture that some of her classmates reported to the bully’s mother. That mother called the school to tell them to call the police. The police showed up at the school, and they carted the ten-year-old artist off in handcuffs.** How does this story illustrate the quandary? The “offended” mother decided that she could determine what was Good and Evil, and she decided that her Good outweighed the Evil of the ten-year-old. And the police who responded, acknowledging the folly of the complaint, still acted to arrest the girl for drawing the picture and supposedly putting on it the word kill plus curse words. They acquiesced, gave up their common sense, and conformed regardless of their own sense of what was good and what was evil, the latter being the arrest of a hurt child whose only response was to draw a picture.

Big Government (school authorities and police) can control individuals because it makes no exceptions to its rules unless the government officials du jour decide pridefully that their personal agendas are more important than both law and common sense. In the instance of the little girl, the police obeyed the commandment against violence drawn on a sheet of paper, ignoring, by the way, Big Government’s own rules about freedom of speech. Second, there’s the story of the bishop in Peoria, Illinois, who has forbidden some Catholics to pray the Rosary in the cathedral during their “March for Catholics” at the end of September.
The Catholics are upset by certain actions the clergy—including the Pope—has taken in the past few years, such as restricting the Latin Mass and including rites and religious instructions from other religions. The disgruntled Catholics want to march to the cathedral and pray for the Church’s return to the traditions they know. Now, one might argue that the word of the diocese’s bishop should be taken as law, but again, two sides represent two prideful motivations. On the side of the bishop, pride manifests itself in his refusing to accept a questioning flock or even to talk to them; on the side of the protesting Catholics, pride manifests itself in their Adam-and-Eve-like disobedience and defiance of his rule. In past centuries, the conflict between the two would have resulted in actions like torture and condemnation by the Inquisition, and in wars like those of the Reformation. Pride’s at play here, mark my words.

Not that I have opened your eyes to what you don’t know, but I pride myself here in showing you the root problem in most, if not all, human interactions. That I have also railed against socialism in a number of these postings is also an indication of my own pride and rebelliousness. Sure, I like order and realize that without some ordering system, humans will act in their own interests, often to the detriment of others. But I also realize that complete conformity denies me my sense of individualism. And I am not unaware that becoming rebellious is, to some extent, itself a form of conformism. Take rebellious youth who adopt a fashion and subculture, such as Goth. Their rebellion against the fashion of the times simply manifests itself in the fashion of a group. So, each rebellious person exhibits both pride and acquiescence. Adam, as you recall, followed Eve’s lead.

*The two different creation stories are probably the work of different authors whose works were then collated by a priestly class. The scholarly work on this was done by E. A.
Speiser for his translation of Genesis for the Anchor Bible. By the way, he also points out the problem of the first words of Genesis, which can be read either as an introductory adverbial clause or as a prepositional phrase: “When God set about to create…” as opposed to “In the beginning.” Both beginnings present us with a problem. The adverbial clause implies Time before Time; the prepositional phrase implies a God who made some mistakes that had to be corrected over the course of six days (“In the beginning God created Heaven and Earth and the Earth was void and without form…”) and, like the adverbial clause, presents us with a deity bound by time as we are, acting as we do sequentially.

**https://www.hawaiinewsnow.com/2021/11/10/response-10-year-olds-arrest-hpd-says-offensive-drawing-was-credible-threat/

As he lay dying, General Edward Braddock reportedly said to George Washington either or both “We shall know better another time” and “Who would have thought?”
That latter expression might make sense for all of us as we approach our own demise, though we might add “at this place and at this time.” For Braddock, death came by war wound in a skirmish during the French and Indian War along what is now called the National Road, or Route 40, between Chestnut Ridge and Laurel Ridge east of Uniontown in southwestern Pennsylvania. I suppose every soldier knows that death is possible while believing it is not probable. The other expression ascribed to Braddock, “Who would have thought?” is not unexpected. Braddock was a Londoner who had survived a duel and other military engagements; dying in the woods far from home was most likely not his plan for the day. The former quotation, “We shall know better another time,” appears to be the lot of all of us. Our limitations, like those of a British major-general who failed to anticipate the nature of his enemy, include our running out of time to use the lessons that we learn, especially any “last lessons.” Thus, Braddock’s last words might be repeated by many: the selfie-taker who stands too close to a precipice and falls to her death, the saber-rattling leader of a belligerent country like North Korea, the hegemonic leader who decides to invade a neighboring country, or the rash home invader who meets death at the hands of an armed homeowner. Knowing better another time isn’t an option because this is not our practice life. Each of us will have a Braddock moment when we learn a last lesson that we can never apply at “another time.”

Ah! The Grand Canyon. The majestic White Mountains of New Hampshire. The West Coast’s sea stacks and marine terraces. When I see those features, I know truth lies in the seeing. Forget all the Matrix stuff. Those places are real to the feel, and they stimulate wonder. And then there are the majestic scenes of Avatar. Wondrous indeed! But wholly made up. Fake.
As an “older” American, I can go back to a time before we had a television, to a time when I traveled via black-and-white movies with scratchy sound into sub-Saharan Africa, to the pyramids, and to the “Wild West.” Black-and-white films, mind you, not “Technicolor” with Dolby Sound. And I traveled to space on a sparking, noisy spaceship with Buck Rogers. I’m sure that the actors had colorful clothing. I assume the same for the galas in movies featuring haute couture. But I had to imagine the green of the selva, the tan of Giza, the green of the saguaro, and those multicolored outfits of actresses walking with poise down grand staircases. All that I had to imagine. All that was a vision in the brain. The eyes? Well, the eyes were as they are now, just the receivers, not the interpreters, and with their blind spots, they even leave a tiny part of the scene blank. Not to worry, however. The brain paints in the scene; the brain is the “seer.” The brain enabled those who watched film prior to The Wizard of Oz to envision roads, yellow or otherwise. So, I guess I can say that “way back then” I was used to “fake” imagery, to an Ansel Adams black-and-white majesty, because I could fill in the missing color from experience with a world of color. Had I grown up in the Arctic, however, I might not have had the breadth of color in experience from which to superimpose a complete picture over the partial one. Too much monotone in regions of sea and ice.

Today, I find myself sitting in front of a computer with HD capability, and I see videos labeled with numbers like “3” or “4” for the level of detail. And yet my eyes—that is, my brain—is no better off than it was when I was a child because I realize that I might be subject to some artistic reinterpretation or computer enhancement. I have noticed the clickbait YouTube videos purportedly showing strikes by drones on military vehicles in Ukraine.
Some of those videos are no more than game videos serving as “reality.” Fortunately, with a long history of interpreting that goes back to those black-and-white days, I can more often than not tell which videos are “real” and which are “fake.” Not, I will remind you, because of what my eyes can do, but rather because of what my brain can do. But in a world in which our ability to discern fake from real is constantly tested, I find myself questioning anything that comes into my field of view, and my brain sometimes substitutes virtual realities for the “real.”

And I know exactly when I first experienced this phenomenon of perception that is unique to our times. I saw a car accident one day shortly after the first season in which “instant replay” was used during televised football games. Immediately after the accident—within seconds, I’d estimate—I turned my head to “re-see” it, as though I might have been watching a replay of a touchdown. The reaction was automatic. I was conditioned to see what I had not seen if, for example, I had left the TV room for a snack and had to go back for that replay. I had, as all of us have since that time, entered a period when very few “blind spots” exist. And where they do leave part of a scene blank because we were not paying attention or were absent during an event, we rely on re-seeing—sometimes repeatedly and often from many different angles. I’m almost a god of seeing, and you are, too. I can go back in time to re-see, and I can adopt multiple angles of seeing. I’m used to not missing much, as you are. And in all this “seeing,” both you and I have learned more or less to interpret and infer. But we are not gods, and we are not infallible, so the eyes can—sorry, the brain can—misinterpret, especially when there is an element of technical enhancement. Take the images given to us by sophisticated but artificial “eyes” like the James Webb Space Telescope. Recently released images reveal colorful celestial objects.
But those images are computer enhancements that bring together images in different wavelengths. The color is a composite. Each part of the spectrum detected by the JWST and other telescopes merges to make the “reality” we see. Any one of the images prior to merging would seem dull by comparison to the final product. But my brain accepts the collated images as “real.”

We live in a time when we cannot trust much of what we see unless we are standing at the edge of the Grand Canyon, at the base of Mt. Washington, along Big Sur, or in any of those places I knew only in black-and-white during my childhood. We are so intertwined with artificial imagery that we now have to question much of what we see. Were our ancestors plagued with this phenomenon? Maybe a little. After all, they had to interpret the world of colorless night and colorful daytime, of flickering firelight and foggy morning distortions. But our reliance on “virtual” imagery has exacerbated our confusion about what we think we see. If we throw into that confusion the purposefully misleading work of others, we have a level of problematic interpretation that generations a century and longer ago never encountered. Do my eyes deceive me, or does my brain? Does this question shade all that I experience? Does doubt in the validity of perception precipitate a skepticism that our ancestors did not have prior to motion pictures, TV, replays, and computer-enhanced imagery? If so, then what stone can I kick, as Dr. Samuel Johnson kicked one, to determine what’s real and what’s a product of the imagination à la Berkeley?