Wednesday, April 19, 2017

Riding a Big Ball of Wibbly Wobbly*

In his book Time Travel, author James Gleick provides, according to the jacket blurb, "a mind-bending exploration of time travel: its subversive origins, its evolution in literature and science, and its influence on our understanding of time itself." I'm only into the first chapter so far, but Gleick seems to be living up to the blurbomise [book jacket promise of stellar content]. It all began, in literature at least, with H.G. Wells' The Time Machine, as Gleick describes:
Our hero fiddles with some screws [on the time machine], adds a drop of oil, and plants himself on the saddle. He grasps a lever with both hands. He is going on a journey. And by the way so are we. When he throws that lever, time breaks loose from its moorings.

What happens when cover artists smoke too much crack

Wells, according to Gleick, "invented a new mode of thought" when he dreamed up the time machine. Then, in a later chapter, Gleick asks, "Can you, citizen of the twenty-first century, recall when you first heard of time travel?" He doubts it, given the massive infusion of chronotraipsing into our imaginative world since The Time Machine. Felix the Cat traveled in time. So did Elmer Fudd and, of course, Mr. Peabody and Sherman. Time travelers [chrononauts?] have been countless and their ranks will only grow, I'm sure. I read TTM who knows how long ago (I would need to chronotraipse myself to find out), but I couldn't swear to it being my first exposure to time excursions.

All of the chrononautic adventurers have, well, adventures, scary and exciting for them and sometimes also for the readers. Si Morley, the main character in Jack Finney's classic 1970 novel Time and Again, is one of my favorites. (Two others are Clare and Henry from The Time Traveler's Wife.) In T&A (an unfortunate acronym) Morley is recruited into a secret Army project to save the world (naturally). He learns how to time travel using self-hypnosis; that is, with the help of movie sets and actors he is able to convince himself that he is in 1882 instead of 1970 and, voila, there he is! In the end [SPOILER ALERT!], he decides the time project is ill-conceived, perhaps even evil, and returns to 1882. There he stops the parents of the project head from meeting and, presto chango, no project. He stays in the past with the woman he met there and fell in love with and, presumably, they lived happily ever after. He also conveniently avoids witnessing or taking responsibility for any "collateral damage" in the future that may have come from his actions.

Anyway, we all experience time and the desire, occasional for some and obsessive for others, to slow it down or speed it up, travel at will to the past and/or future, or even merely wind our life clocks backward to answer all those pesky "what ifs." We also have invented myriad clever ways to measure it and track it and remind us of how much of it we...okay me/I...waste, in this case while trying to figure out if I can self-hypnotize and zip ahead three years or so. The fear in that, of course, is that, if I manage it, I will discover that Idiocracy has come to pass. If you haven't seen the movie, it's where we get dumber rather than smarter as time marches on and the United States becomes a place where "the English language had deteriorated into a hybrid of hillbilly, valley girl, inner-city slang and various grunts." In the film, this degradation takes five hundred years. In the real world...well, I'm afraid to even think about it.

* PS: The title comes from someone who truly understands, or who truly understands that no one truly understands, what we call time: Doctor Who. As he puts it, "People assume that time is a strict progression of cause to effect, but actually from a nonlinear, nonsubjective point of view it's more like a big ball of wibbly wobbly...timey wimey...stuff." Brilliant!

Monday, March 13, 2017

Time to Be Anti-Social?

"The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight." So goes the picture caption to a New Yorker article by Elizabeth Kolbert titled "Why Facts Don't Change Our Minds." She begins her piece by describing a 1970s Stanford University research project that asked some undergraduates to distinguish between real and fake suicide notes. Some were told they did really well and some were told they did really poorly. In reality, they all did about the same. The hidden purpose of the study (all psychology experiments thrive on lacksparency [the intentional lack of transparency]) was to see how the subjects responded after the researchers revealed their true purpose. Even after learning that the scores were bogus, the subjects who had been told they excelled still said they thought they had done really well, and vice versa. "Once formed," the researchers noted, "impressions are remarkably perseverant." Scientists doing another study like this one found similar results. Despite seeing that the evidence "for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs."

No, you cannot come to the orgy. No blue sashes in the red sash group, get it?*

Kolbert poses the question, "How did we come to be this way?" She then describes the answer that two Harvard researchers posit in their book The Enigma of Reason. They say, "Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve problems posed by living in collaborative groups." The biggest advantage humans have over other species, these two assert, is our ability to cooperate, something that might seem extremely doubtful these days.

The same two researchers also note that a large obstacle to our ability to discern truth (this is an all-inclusive "our") is confirmation bias, our tendency to embrace information that supports our beliefs and reject information that contradicts them. In addition, they point out that while we're good at spotting weaknesses in other people's arguments, "almost invariably the positions we're blind about are our own." The researchers argue that if this trait were negative and harmful, genetically speaking, it should have been "selected against." Since it wasn't, "it must have some adaptive function," a function they say relates to our "hypersociability."

This sociability, the tendency to group together, did us good back in the days of wooly mammoths and saber-toothed tigers. Now, not so much. One thing we suffer from in our sociability is the "illusion of explanatory depth," that is, our thinking we know way more than we actually know. In some places, this doesn't hurt us much. For example, we can toggle the handle on a toilet without knowing how it works and the toilet will flush (usually). In other areas, again, not so much. For instance, in another study, the less able people were to locate Ukraine on a world map, the more they favored US military intervention there.

In similar fashion, the less we know about President Trump's "immigration ban," the more likely we are to strongly favor or oppose it. The more others in our circles, whatever they may be, concur with and confirm our opinions, the more we resist or flat-out disbelieve and dismiss anything that counters those opinions. "This is how," two other scientists quoted by Kolbert observe, "a community of knowledge becomes dangerous." She then concludes, perhaps oversimplifying, "If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration." Whatever side of that perspective you come down on, what seems incontrovertible is that our reasoning abilities and methods haven't evolved past the Pleistocene. Somehow, I don't find that the least bit surprising.

*Time Saving Truth from Falsehood and Envy, François Lemoyne, 1737. Public Domain.

Thursday, March 09, 2017

Buffaloed Beyond Belief

In a recent blog post, Grammar Girl Mignon Fogarty asked me (well, everyone) a question I never expected to be asked: "Have you ever struggled to figure out the 'buffalo buffalo buffalo' sentence and given up?" This is one query I can answer unequivocally: no. In fact, I was not aware that a "buffalo buffalo buffalo" dilemma existed.

That's American Bison to you, if you please.*

The sentence GG references is (not simply) eight "buffalos" in a row: "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo." Yes, this is a real sentence that is grammatically correct and makes sense. The capitalization is part of what makes this happen. I will pass along Mignon's explanation, which includes a sentence diagram I won't repeat here, in summary form because she does it so much better than I ever could.

First, she notes that three different meanings for "buffalo" are involved:
  1. the city in New York
  2. the plural form of the noun describing the animal
  3. the verb that means "bamboozle" or "fool" or "trick"
The sentence comprises subject-verb-object plus modifiers and a restrictive clause. Translated, it reads "New York bison [that are] tricked [by other] New York bison [also themselves] trick [other] New York bison." Put into the original sentence, it comes out this way: "Buffalo buffalo (bison from New York) Buffalo buffalo buffalo (that New York bison trick) buffalo Buffalo buffalo (trick New York bison)."
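For fellow word nerds who also dabble in code, the parse can be laid out mechanically: here's a small Python sketch (my own gloss, not Mignon's diagram) that tags each of the eight words with the role the parse assigns it.

```python
# Tag each "buffalo" in the famous sentence with its grammatical role.
# The role labels are my own shorthand for the parse described above.
sentence = "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo".split()

roles = [
    "city (modifier)",          # Buffalo  -> New York...
    "animal (subject)",         # buffalo  -> ...bison
    "city (modifier)",          # Buffalo  -> [that] New York...
    "animal (clause subject)",  # buffalo  -> ...bison
    "verb (in clause)",         # buffalo  -> trick [the subject]
    "verb (main)",              # buffalo  -> [the subject] tricks
    "city (modifier)",          # Buffalo  -> [other] New York...
    "animal (object)",          # buffalo  -> ...bison
]

for word, role in zip(sentence, roles):
    print(f"{word:10s} {role}")
```

Lining the words up against their roles makes the subject-verb-object skeleton, and the restrictive clause buried in the middle, a little easier to see.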

If you would like a more prosaic translation, there's this one from Mental Floss: "Bison from Buffalo, New York who are intimidated by other bison in their community also happen to intimidate other bison in their community."

There are two lessons here. One comes from the Mental Floss author Chris Higgins, who warns "Beware of Buffalo buffalo, buffalo, for they may buffalo you." The other is, should you ever travel to Buffalo, be sure to buffalo Buffalo buffalo before Buffalo buffalo buffalo you. Or you could just avoid Buffalo altogether and not have to worry about getting caught in the middle of buffaloed Buffalo buffalo buffaloing Buffalo buffalo. From my perspective, that seems the saner course.

* Bison image was released by the Agricultural Research Service of the United States Department of Agriculture. Public Domain.

Monday, March 06, 2017


I've broken a solemn vow. When Survivor debuted about a hundred years ago or so, I said I would never watch a "reality" show. I managed to keep my self-promise until now. What happened? Hekla, Katla, Vigdis, Bubbi, and Por -- that's what happened. You might think these are Norse gods of some ilk or perhaps the participants in a Norwegian version of Big Brother, which would be Storebror if you want to get picky. However, these five individuals are neither superhuman nor subhuman. They are kittens and the current stars of Keeping Up with the Kattarshians, which Time Magazine describes as "the reality show you never knew you needed."

Humph! Felinality TV? How undignified!

The current stars of the show, which is sponsored in part by the Icelandic Cat Protection Society, are the kittens of a pregnant cat found in Reykjavik and brought to a shelter. The kids were born in a foster home and then, once they were weaned, moved on to stardom.

Not much happening with the Kattarshians at the moment. The kittens are snuggled together in the top bunk in their "bedroom." If you stare at them for a while, you can see an ear twitch now and then. But that's the beauty of this particular reality show. If you ever need what Trevor Noah calls "your moment of Zen," go there, watch cats sleep, breathe deeply, and feel the stress evaporate. Or watch them when they're playing, have a few good chuckles and grins, and feel the stress disappear. It's been proven that owning a cat reduces your risk of having a heart attack or stroke. I'm sure watching them has the same effect, perhaps even more so since there are no litter box duties involved.

In 2014, according to National Geographic, the US had eighty million cats residing with people and the world had three cats to every dog (ha! take that dog people!). Theory has it that humans and cats came together five or six thousand years ago (some say nine thousand) when wild cats started to hang around human granaries and eat the mice that were eating the grain. The humans noticed this and made an offer to the cats: you keep the mice from eating our food and we'll give you a warm, dry place to sleep when you like. The cats thought about it for about two seconds and then said, "Deal." The rest is history.

Research has shown that as cats spent more and more time with us, their genetic makeup changed (ours, too, most likely). Specifically, the genes linked to fear and motivation shifted so that they became less fearful of new situations and more driven by rewards, as in when they jump on the bed at night and stare at you until you give in and break out the treats. Our relationships with these animals are either mutualistic (when both benefit from it) or commensal (when one benefits and the other just hands out the Friskies).

But I'm spending way too much time on this and I'm missing out on my cat watching. Three kittens just got up. The red tabby Por is playing with some fuzzy toy in the foreground, while some of the others spectate. The tortoiseshell Vigdis is still curled up in the top bunk. That looks very appealing at the moment as opposed to opening the newspaper or turning on the radio or TV. Oh, look. Either Hekla or Katla (they look alike) has gotten back in the bunk with Vigdis. This looks like a trend. I feel like I should join in. If I book a ticket to Iceland now, I can get there in a few hours. Then I just have to get them to skooch over a bit to make room for me. I'm sure they would.

Wednesday, March 01, 2017

Deepity Doodoo

The day after President Trump's first state of the union address (which, for some reason, isn't called the SUA when it's a first-timer's), it seems appropriate to bring up the concept of deepity (thanks, Randy!). Deepity is a word not coined but adopted by philosopher Daniel Dennett. According to Wikipedia,
Dennett used "deepity" for a statement that is apparently profound, but is actually trivial on one level and meaningless on another. Generally, a deepity has two (or more) meanings: one that is true but trivial, and another that sounds profound and would be important if true, but is actually false or meaningless. Examples are "Que sera sera!" and "Beauty is only skin deep!"

If only it were true.*

Now that you know that, here, for your consideration, are a few phrases from President Trump's address to Congress:
  • A new surge of optimism is placing impossible dreams firmly within our grasp.
  • We will provide massive tax relief for the middle class.
  • I am going to bring back millions of jobs.
  • The time has come for a new program of national rebuilding.
  • The way to make health insurance available to everyone is to lower the cost of health insurance, and that is what we will do.
  • Everything that is broken in our country can be fixed.
One author who picked up on Dennett's deepity idea is Stephen Law, who coined a term of his own: "pseudo-profundity." He defines it this way: "Pseudo-profundity is the art of sounding profound while talking tosh. Unlike the art of actually being profound, the art of sounding profound is not particularly difficult to master." One of the numerous ways to achieve pseudo-profundity, Law points out, is to spout deepities.

I'm guessing that many people, I among them, would be pleased as punch should the president's promises to make our lives better, safer, and more affordable become reality. If that happens, I will be first in line at the crow deli for a big sandwich. If not, I see two choices. The first is to buy a gazillion copies of Law's book Believing Bullshit: How Not to Get Sucked into an Intellectual Black Hole and start passing them out to everyone. The second is to order several pairs of hip waders to prepare for what's coming. The first choice would take buckets of cash and a long-term commitment. The second would arrive in two days with free shipping from Amazon. Seems an easy choice, don't you think?

* Hitchcock movie that introduced "Que Sera Sera," sung by Doris Day. Copyrighted by Paramount Pictures, Inc. Artist(s) not known. Public Domain.

Wednesday, February 22, 2017

An Exercise in Portendity

On April 20, 2016, Merriam-Webster posted this article in its A Thing about Words blog: "2,000 New Words and Senses Added to Merriam-Webster Unabridged." Thankfully, it didn't list all 2,000 words and senses, just a few to give you a taste of what's new and now "official" in our language. These are words like "waggle dance" (what bees do), "dipsogenic" (producing thirst), "ICYMI" (in case you missed this, it stands for "in case you missed it"), "hella" (extremely or much of), and "dox" (to publicly identify or publish information about a person with negative intent; "dox" is a respelling of "docs").

One million? Who'll go one million? Anyone? Anybody?

So, after adding 2,000 more, how many words are there in the English language anyway? No one really knows. One could, if one were insane, sit down with an unabridged dictionary and start counting. Or one could turn to the Global Language Monitor (GLM), which thinks we have exactly 988,968 words in English. You might be interested to know (or not) that besides counting words with their algorithm, the GLM people have taken on another intriguing task: "Top Words for the First 15 Years of the 21st Century & the Trends They Portend." The list contains 40 terms. Here are a few of note (to me anyway):
  • Word: selfie. Comment: Evidently an ego-manical [sic] madness gripped the world in 2013-14. 21st Century Trend: The more people populate the planet, the greater the focus on the individual.
  • Word: the above-mentioned hella. Comment: An intensive in Youthspeak, generally substituting for the word "very" as in "hella expensive." Trend: The world is being subdivided into the various tribes of youth. [Oh, good. Animal Farm is coming.]
  • Word: singularity. Comment: Singularity was originally the name for the Cosmic Genesis Event (the Big Bang). Trend: Spoiler Alert: Now used to describe when computer intelligence surpasses that of humans (SKYNET!!--possibly before mid-century).
  • Word: God Particle. Comment: The Large Hadron Collider (LHC) continues its quest for the Higgs boson, popularly known as the God Particle [now found by the way]. Trend: Scientists have calculated a one in fifty million chance that the LHC will generate a small black hole that could devour the Earth. [ICYMI, it hasn't done this yet.]
To end on a positive note (of sorts), I was inspired to create my own trendy word by one item in the GLM list. I think these two go hand in hand. Here are both for your perusal:
  • GLM Word: truthiness. Comment: Stephen Colbert's addition to the language appears to be a keeper. While something may not meet the standard of truth, it certainly appears to be true. Trend: Truthiness seems to set the new standard, unfortunately.
  • My Word: trumprication. Comment: A bastardization of "fabrication" that characterizes pretty much any statement uttered by our new president. Trend: I see a whole line of words spawned in the future by this one, maybe enough to get English over the one million mark, words like "trumpricalicious" for particularly juicy pronouncements, "trumpricatory" for actions shamelessly emulating the president's "trumpricacious" behavior, and "trumpricaphobia" and the attendant "trumpricaphile" and "trumpricaphobe" that will surely appear in the sixth edition of the Diagnostic and Statistical Manual of Mental Disorders being, I'm certain, rushed to press right now.
Well, that's enough multisyllabic words for one day. I wouldn't want to start an epidemic of hippopotomonstrosesquipedaliophobia (fear, if you didn't guess, of words longer than "SAD!" or "DUMB!" or "LOSER!").

* By Larousse, Nouveau Dictionnaire Larousse, 1899. Public Domain.

Wednesday, February 15, 2017

All Out of Whack

So, balance. It's a complicated thing. Our bodies achieve it (or try to) every day, mostly unbeknownst to us. They keep (or try to keep) our core body temperature, our blood glucose, our plasma ionized calcium, our blood O2/CO2 pressure, our arterial blood pressure, our extracellular sodium and potassium content, our water volume, and our extracellular fluid pH, among many other things, on an even keel. I have no idea what most of these are but thank you, body, for your pursuit of homeostasis. Without it, we'd be in serious trouble.

The real question, of course, is whether the bottle is half-full or half-empty.*

Another kind of physical balance is biomechanical. In short, it's our ability to "maintain the line of gravity within the base of support with minimal postural sway." In English that means we (most of us) can stand, walk, sit, and do other things without falling flat on our proverbial gobs or tushies. To maintain our balance (and dignity), we need constant input from our vestibular system (the inner ear), our somatosensory system (sensors in our skin, tissues, muscles, bones and joints, internal organs, and the cardiovascular system), and our visual system (eyes and everything between them and the brain). Somehow, our brain makes sense of all this input and makes adjustments as needed to keep us upright. I'm very glad that it all happens autonomously. If I had to do it consciously, every moment would be a hello-face-plant experience. While this might improve my looks, it would wreak havoc on activities of daily living.

Of course, we do fall down on occasion. This usually involves, at least I surmise it does, a tipping point of some kind. There are numerous versions of TPs. The physical one is "the point at which an object is displaced from a state of stable equilibrium into a new equilibrium state that is qualitatively dissimilar from the first." Face plant, in other words. In climatology, it's similar but scarier, i.e., when the climate is tipped from one stable state into a different stable state with no return to the original state possible (spilled milk on a global scale). In sociology, the TP is the point where a group of people radically change their behavior. The latter occurrence is also referred to as a "social epidemic."

Social tipping points were made famous (kindasorta) by Malcolm Gladwell in his book The Tipping Point. The subtitle is "How Little Things Can Make a Big Difference." Gladwell tells us that three factors are key to tipping over into epidemic behavior, or social contagion if you will: 1) the law of the few, 2) the stickiness factor, and 3) the power of context. The first refers to someone with a particular and rare set of social gifts. The second refers to that certain something about the "message" that makes it memorable. The third is the influence of environment, or, more specifically, the influence of the conditions and circumstances of the times and the places in which they occur. Without a doubt we have seen these three at work recently in America.

Now the social/political pendulum seems poised at another tipping point. Having swung as far right as possible, it appears ready to head in the other direction. It's the "restoring force" of gravity that brings a pendulum (eventually) back to its resting position. Let's hope that works here, too. Let's hope there is a restoring force. Let's hope gravity wins over reckless, disturbing pushes, whatever their origin, and that the wild swings calm and then stop and we find equilibrium or, as those of us who flunked seventh-grade physics call it, and you knew this was coming, balance.
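For readers who did pass seventh-grade physics, the restoring-force idea can be sketched in a few lines of Python. This is a toy damped-pendulum model of my own devising, not anything from the post: give the pendulum a big shove off center, let the restoring force plus a little friction act on it, and watch it settle back toward equilibrium.

```python
import math

def settle(theta=1.5, omega=0.0, damping=0.3, dt=0.01, steps=5000):
    """Simulate a damped pendulum (toy model, unit length and gravity).

    theta is the angle from rest; the restoring acceleration -sin(theta)
    always points back toward equilibrium, and the damping term slowly
    bleeds off the swing's energy.
    """
    for _ in range(steps):
        alpha = -math.sin(theta) - damping * omega  # restoring + friction
        omega += alpha * dt   # semi-implicit Euler: velocity first...
        theta += omega * dt   # ...then position, for numerical stability
    return theta

# Shoved hard off center, the pendulum ends up back near zero.
final_angle = settle()
print(f"angle after settling: {final_angle:.4f} rad")
```

The moral, such as it is: with a genuine restoring force and a bit of damping, even a wild swing eventually calms down near center. Whether the social version obeys the same equations is anyone's guess.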

*Woman balancing on champagne bottle. 1903. Public Domain.