Thursday, May 18, 2017

The Faster I Go, the Slower I Get

Few things in life are constant. One of them, or so I thought, is the speed of light (186,000 miles per second in a vacuum). The uncertainty about light speed gets raised in "Understanding the Speed of Light, How Much Do We Know?" by Ryan Young. Young notes that the speed of light might be affected by something called "quantum vacuum fluctuations," which is not, as I initially thought, what happens to my Hoover whenever there's a power surge.

The distance from the Sun to the Earth is shown as 150 million kilometers, an approximate average. Sizes to scale.
Light travel time: 8 minutes, 17 seconds on average or instantaneous?

No, QVFs, in theory, occur because outer space is not empty but packed with particles that constantly alternate between existing and not existing--now you see me, etc. If so, if these particles happen to be in their existing state when light passes through, they could slow it down. This might seem weird or even impossible, but so was the idea that light is simultaneously or alternatingly, I forget which, a wave and a beam of particles.

Young emphasizes that QVFs are just an idea that sprang from some physicist's fevered brain, which is somewhat reassuring for those of us who crave stability in whatever form it may come. He also mentions Einstein's special theory of relativity, which tells us that "time gets slower the closer you get to the speed of light, stops when you reach the speed of light, and would go backward if you were to exceed the speed of light."

Along with his text, Young includes a helpful video titled "Why Is the Speed of Light the Speed of Light?" from a source called Answers with Joe (not the plumber, I assume). Joe informs us that the speed of light is actually a snail's pace from the universe's perspective. It takes, after all, 45 minutes for it to get from the Sun to Jupiter. "Can't it go any faster?" his audience asks. "No," Joe says. "Why not?" "Because Einstein says so." In fact, anything with mass, i.e., us, gets heavier and heavier as it moves faster and faster and so impedes itself from reaching the speed of light or going beyond it. Light has no mass and, ipso facto, can travel at, well, its speed. Since light travels at the speed of light, time stops for it. So the journey to Jupiter that supposedly takes 45 minutes at the speed of light takes no time at all from the light's point of view. Whoa.
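Joe's travel times are easy to sanity-check with grade-school arithmetic: distance divided by the speed of light. Here's a minimal Python sketch, using rough average orbital distances (my numbers, not Joe's):

```python
# Back-of-the-envelope light travel times from the Sun.
C_KM_PER_S = 299_792.458  # speed of light in a vacuum, km/s

# Rough mean distances from the Sun, in km (assumed average values).
DISTANCES_KM = {
    "Earth": 149.6e6,
    "Jupiter": 778.5e6,
}

for body, distance in DISTANCES_KM.items():
    minutes = distance / C_KM_PER_S / 60
    print(f"Sun to {body}: about {minutes:.0f} minutes")
    # → Sun to Earth: about 8 minutes
    # → Sun to Jupiter: about 43 minutes
```

Which lines up with the 8-minute figure in the caption above and puts Joe's 45 minutes in the right ballpark.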

There are lessons to be learned from all of this. First, if you want to get somewhere or do something faster, reduce your speed because, to steal from Lewis Carroll, "the sloweder you go, the aheader you get." Second, going slower makes you weigh less. Now that seems a win-win of colossal proportions. The nagging questions re the latter effect, however, are if you don't move at all, would you weigh nothing, and if you weigh nothing, would you cease to exist? Intrepid adventurer that I am, I will refrain from all activity and see what happens. If you never hear from me again (no cheers, please), consider the "cease to exist" question answered.

Wednesday, May 03, 2017

Now? There Is No Now. Only Then.

Thought to ponder, from John Banville's novel Ancient Light:
Even here, at this table, the light that is the image of my eyes takes time, a tiny time, infinitesimal, yet time, to reach your eyes, and so it is that everywhere we look, everywhere, we are looking into the past.
If you're thinking "he [me] must be reading that Time Travel book again," you'd be right. If you are exceedingly discerning and/or prescient, you might even be thinking "now he's going to pretend he understands Einstein's special theory of relativity." Once again, correctomundo. Well, sort of.

Incomprehensible scientific graphic included because, well,
I needed a visual. Hey, it does mention space and time at least.*

The STR, put extremely briefly, defines the relationship between space and time. Rather than try and fail to explain this myself (as if I could), I bow to the Dummies website. The STR "created a fundamental link between space and time.... If you move fast enough through space, the observations that you make about space and time differ somewhat from the observations of other people, who are moving at different speeds." This effect is known as time dilation, "when the time moving very quickly appears to pass slower than on earth" (see Interstellar). Put even more simply by the HowStuffWorks guy, "Time is not a physical thing. Time is an experience, and my time is not your time."
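The Dummies description corresponds to what physicists call the Lorentz factor, gamma = 1/sqrt(1 − v²/c²): a clock moving at speed v appears, to a stationary observer, to run slow by that factor. A quick Python sketch (my illustration, not from the article):

```python
import math

def lorentz_factor(v_over_c: float) -> float:
    """Time-dilation factor gamma for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_over_c ** 2)

# One second aboard a fast-moving ship lasts gamma seconds for an observer at rest.
for v in (0.1, 0.9, 0.99):
    print(f"v = {v:.2f}c -> gamma = {lorentz_factor(v):.2f}")
    # → v = 0.10c -> gamma = 1.01
    # → v = 0.90c -> gamma = 2.29
    # → v = 0.99c -> gamma = 7.09
```

At everyday speeds gamma is indistinguishable from 1, which is why nobody notices; at v = c the factor blows up, which is the "time stops" case; and beyond c the square root goes imaginary, so the formula itself refuses to cooperate.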

It wasn't long after Einstein put out the STR that "space" and "time" became "space-time" or, to use one of Doctor Who's favorite phrases, the "space-time continuum." As the song goes, you can't have one without the other. Our perception of reality, whatever that is, depends on this.

So where am I? Back to square one, I guess. Even though our world can function now only because time has been standardized, universalized, synchronized, or whatever (thanks initially to the railroads and the telegraph), we all experience time differently. We all travel through space at different speeds. Our senses and cognition are perpetually catching up to what has already happened.

That last statement prompts a revelation of sorts. Everyone on earth is speaking or writing incorrectly when they use the present tense because, well, there is no present that we can perceive. To be accurate, we need to eliminate the now in our interactions. This will take some adaptation, changing, for example, greetings like "how are you doing?" to "how were you doing?" when you meet, sorry, met someone you know, er, knew on the street. Sounds doable, don't you think? Wait, what am/was I saying? It's already done.



* Illustration of a light cone [whatever the heck that is]. Public Domain.


Monday, May 01, 2017

What Lies Ahead

At the end of the 19th century, French commercial artist Jean-Marc Cote created a series of illustrated cards envisioning what the world would be like in the year 2000 (see examples here). Besides the classroom shown below, where the teacher grinds up books and feeds them electronically into students' heads, Cote foresaw things like flying firemen, whale-powered submarine commuting, an electric floor scrubber, and a mobile home. These cards were first put in cigar and cigarette boxes and then printed as postcards. Science fiction writer Isaac Asimov famously stumbled on and bought a complete set and then reproduced them in his nonfiction book Futuredays (described by Kirkus Reviews as a "featherweight non-book" -- but hey, we need those sometimes, don't we?).

Finally, a plausible way to finish the Great Books series.*

I stumbled myself (something I do often) on the image above in Time Travel, the James Gleick book I've been reading. Gleick describes these cards as "prescient images." He notes that, while the practice of prophecy (as in fortune telling) is ancient, prior to this time "no one asked the oracle to forecast the character of daily life in years to come." With H.G. Wells' The Time Machine, the ideas of "futurity" and "futurism" came into vogue. These refer to, in Asimov's view, what Gleick describes as "a sense of the future as a notional place, different, and perhaps profoundly different, from what has come before." He then quotes Isaac directly: "It may seem to us that the potential existence of such a future is self-evident, but that was most definitely not so until comparatively recent times."

When I saw the school image above, I remembered that an outtake from it appeared on the t-shirts and other paraphernalia sold when the 2012 Key West Literary Seminar, titled "Yet Another World: Literature of the Future," focused on science fiction. The seminar was cool, the authors interesting panelists and speakers, and Kalo and I got to sit at the same table with William Gibson at the closing night dinner. Gibson, author of Neuromancer and other well-known works, is described as a writer of speculative fiction. "Speculative fiction," I guess, is now de rigueur, which makes sense because not all visions of the future involve science.

So have we come a long way with regard to how we view the future? Perhaps the pre-Time Machine world, being "nowists" instead of "futurists," had the better idea, that is, why spend time imagining something you can imagine but can never imagine correctly? Gibson captures this thought much better than I can in his novel Pattern Recognition [long quote warning--sorry]:
We have no idea, now, of who or what the inhabitants of our future might be. In that sense, we have no future. Not in the sense that our grandparents had a future, or thought they did. Fully imagined cultural futures were the luxury of another day, one in which "now" was of some greater duration. For us, of course, things can change so abruptly, so violently, so profoundly, that futures like our grandparents' have insufficient "now" to stand on. We have no future because our present is too volatile.... We have only risk management. 
The idea of our future being nonexistent because the present is too volatile is undeniably relevant and frightening these days, and the safeguard of risk management, if it is a safeguard, is getting shakier every minute. Our future, in Gibson's words, "is there...looking back at us" from the news media screens, assuring us that our fate is safe in his not-small-but-large hands. In this case, "he knows not what he says" seems a huge understatement. While a benevolent god might forgive him this, will we? Should we? Can we? Who knows? What I do know is that old saw about living in the now, even though we can never really do this (stay tuned--to be explained), seems the best advice at this moment and, for me, that means lunch, which is something, unlike most other things, that one can, and does, imagicipate [imagine with anticipation] with pleasure. And you thought, I'll bet, that this would not end on a positive note.



* "France in the Year 2000 (XXI Century): At School." Paper card, France. Public Domain.

Tuesday, April 25, 2017

Every Breath You Take

My wife Kalo wrote a beautiful poem about searching for inspiration in various places (a donut shop, CVS, the public library) and not finding it and, in not finding it, finding it, despite her closing stanza lines "At my computer reading Bukowski: writing about writer’s block is better than not writing at all. Do I believe him? Do you believe him? If only."

So it's not all in the mind after all.

So what is inspiration? Fortunately, you don't have to depend on me to define the undefinable because David Brooks, NY Times columnist and PBS NewsHour commentator, has already asked and answered. Or has he?
Inspiration is a much-used, domesticated, amorphous and secular word for what is actually a revolutionary, countercultural and spiritual phenomenon....
Inspiration is always more active than mere appreciation. There’s a thrilling feeling of elevation, a burst of energy, an awareness of enlarged possibilities. The person in the grip of inspiration has received, as if by magic, some new perception, some holistic understanding, along with the feeling that she is capable of more than she thought.
So the answer, apparently, is that inspiration, that EUREKA! moment, is magic. Indeed, magic seems an appropriate descriptor for "an unconscious burst of creativity in a literary, musical, or other artistic endeavor" or any other kind of invention for that matter. (In fact, "invention" seems a sister term to "inspiration," or maybe offspring is a better characterization.) The ancient Greeks viewed inspiration as a kind of madness in which the poet would go into a furor poeticus (an ecstasy), be transported out of his or her own mind, and be given access to hear, understand, and then embody the thoughts of the gods.

Merriam-Webster has another, more down-to-earth definition of the term being discussed here: "the act of drawing in, specifically the drawing of air into the lungs." This means, of course, that every breath I take, you take, we take is, well, an inspiration. So there's no need to look for it. It's right here inside us hiding in plain sight. I can't wait to tell Kalo. With any luck, this will put an end to her many bouts of inspiromnia [inability to sleep due to waiting for something unexpected and amazing to move your intellect or emotions]. On the other hand, she may be a little dismayed to know that every time she writes a poem, she goes a little mad, although, when I think about it, I suspect this will come as no surprise to her.

Wednesday, April 19, 2017

Riding a Big Ball of Wibbly Wobbly*

In his book Time Travel, author James Gleick provides, according to the jacket blurb, "a mind-bending exploration of time travel: its subversive origins, its evolution in literature and science, and its influence on our understanding of time itself." I'm only into the first chapter so far, but Gleick seems to be living up to the blurbomise [book jacket promise of stellar content]. It all began, in literature at least, with H.G. Wells' The Time Machine, as Gleick describes:
Our hero fiddles with some screws [on the time machine], adds a drop of oil, and plants himself on the saddle. He grasps a lever with both hands. He is going on a journey. And by the way so are we. When he throws that lever, time breaks loose from its moorings.

What happens when cover artists smoke too much crack

Wells, according to Gleick, "invented a new mode of thought" when he dreamed up the time machine. Then, in a later chapter, Gleick asks, "Can you, citizen of the twenty-first century, recall when you first heard of time travel?" He doubts it, given the massive infusion of chronotraipsing into our imaginative world since The Time Machine. Felix the Cat traveled in time. So did Elmer Fudd and, of course, Mr. Peabody and Sherman. The number of time travelers [chrononauts?] is already countless and will only grow, I'm sure. I read TTM who knows how long ago (I would need to chronotraipse myself to find out) but I couldn't swear to it being my first exposure to time excursions.

All of the chrononautic adventurers have, well, adventures, scary and exciting for them and sometimes also for the readers. Si Morley, the main character in Jack Finney's classic 1970 novel Time and Again, is one of my favorites. (Two others are Clare and Henry from The Time Traveler's Wife.) In T&A (an unfortunate acronym) Morley is recruited into a secret Army project to save the world (naturally). He learns how to time travel using self-hypnosis: that is, with the help of movie sets and actors he is able to convince himself that he is in 1882 instead of 1970 and, voila, there he is! In the end [SPOILER ALERT!], he decides the time project is ill-conceived, perhaps even evil, and returns to 1882. There he stops the parents of the project head from meeting and, presto chango, no project. He stays in the past with the woman he met there and fell in love with and, presumably, they lived happily ever after. He also conveniently avoids witnessing or taking responsibility for any "collateral damage" in the future that may have come from his actions.

Anyway, we all experience time and the desire, occasional for some and obsessive for others, to slow it down or speed it up, travel at will to the past and/or future, or even merely wind our life clocks backward to answer all those pesky "what ifs." We also have invented myriad clever ways to measure it and track it and remind us of how much of it we...okay me/I...waste, in this case while trying to figure out if I can self-hypnotize and zip ahead three years or so. The fear in that, of course, is that, if I manage it, I will discover that Idiocracy has come to pass. If you haven't seen the movie, it's where we get dumber rather than smarter as time marches on and the United States becomes a place where "the English language had deteriorated into a hybrid of hillbilly, valley girl, inner-city slang and various grunts." In the film, this degradation takes five hundred years. In the real world...well, I'm afraid to even think about it.



* PS: The title comes from someone who truly understands, or who truly understands that no one truly understands, what we call time: Doctor Who. As he puts it, "People assume that time is a strict progression of cause to effect, but actually from a nonlinear, nonsubjective point of view it's more like a big ball of wibbly wobbly...timey wimey...stuff." Brilliant!

Monday, March 13, 2017

Time to Be Anti-Social?

"The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight." So goes the picture caption to a New Yorker article by Elizabeth Kolbert titled "Why Facts Don't Change Our Minds." She begins her piece by describing a 1970s Stanford University research project that asked some undergraduates to distinguish between real and fake suicide notes. Some were told they did really well and some told they did really poorly. In reality, they all did about the same. The hidden purpose of the study (all psychology experiments thrive on lacksparency [the intentional lack of transparency]) was to see how the subjects responded after the researchers revealed their true purpose. Even knowing they did average, the subjects who were told they excelled said they said they thought they had done really well and vice versa. "Once formed," the researchers noted, "impressions are remarkably perseverant." Scientists doing another study like this one found similar results. Despite seeing that the evidence "for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs."

No, you cannot come to the orgy. No blue sashes in the red sash group, get it?*

Kolbert poses the question, "How did we come to be this way?" She then describes the answer that two Harvard researchers posit in their book The Enigma of Reason. They say, "Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve problems posed by living in collaborative groups." The biggest advantage humans have over other species, these two assert, is our ability to cooperate, something that might seem extremely doubtful these days.

The same two researchers also note that a large obstacle to our ability to discern truth (this is an all-inclusive "our") is confirmation bias, our tendency to embrace information that supports our beliefs and reject information that contradicts them. In addition, they point out that while we're good at spotting weaknesses in other people's arguments, "almost invariably the positions we're blind about are our own." The researchers argue that if this trait were negative and harmful, genetically speaking, it should have been "selected against." Since it wasn't, "it must have some adaptive function," a function they say relates to our "hypersociability."

This sociability, the tendency to group together, did us good back in the days of wooly mammoths and saber-toothed tigers. Now, not so much. One thing we suffer from in our sociability is the "illusion of explanatory depth," that is, our thinking we know way more than we actually know. In some places, this doesn't hurt us much. For example, we can toggle the handle on a toilet without knowing how it works and the toilet will flush (usually). In other areas, again, not so much. For instance, in another study, the less able people were to locate Ukraine correctly on a world map, the more they favored US military intervention there.

In similar fashion, the less we know about President Trump's "immigration ban," the more likely we are to strongly favor or oppose it. The more others in our circles, whatever they may be, concur with and confirm our opinions, the more we resist or flat-out disbelieve and dismiss anything that counters those opinions. "This is how," two other scientists quoted by Kolbert observe, "a community of knowledge becomes dangerous." She then concludes, perhaps oversimplifying, "If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration." Whatever side of that perspective you come down on, what seems incontrovertible is that our reasoning abilities and methods haven't evolved past the Pleistocene. Somehow, I don't find that the least bit surprising.


*Time Saving Truth from Falsehood and Envy, François Lemoyne, 1737. Public Domain.

Thursday, March 09, 2017

Buffaloed Beyond Belief

In a recent blog post, Grammar Girl Mignon Fogarty asked me (well, everyone) a question I never expected to be asked: "Have you ever struggled to figure out the 'buffalo buffalo buffalo' sentence and given up?" This is one query I can answer unequivocally: no. In fact, I was not aware that a "buffalo buffalo buffalo" dilemma existed.

That's American Bison to you, if you please.*

The sentence GG references is (not simply) eight "buffalos" in a row: "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo." Yes, this is a real sentence that is grammatically correct and makes sense. The capitalization is part of what makes this happen. I will pass along Mignon's explanation, which includes a sentence diagram I won't repeat here, in summary form, because she does it so much better than I ever could.

First, she notes that three different meanings for "buffalo" are involved:
  1. the city in New York
  2. the plural form of the noun describing the animal
  3. the verb that means "bamboozle" or "fool" or "trick"
The sentence comprises subject-verb-object plus modifiers and a restrictive clause. Translated, it reads "New York bison [that are] tricked [by other] New York bison [also themselves] trick [other] New York bison." Put into the original sentence, it comes out this way: "Buffalo buffalo (bison from New York) Buffalo buffalo buffalo (that New York bison trick) buffalo Buffalo buffalo (trick New York bison)."

If you would like a more prosaic translation, there's this one from Mental Floss: "Bison from Buffalo, New York who are intimidated by other bison in their community also happen to intimidate other bison in their community."
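For the programmatically inclined, Fogarty's parse can be spelled out word by word; here's a toy Python sketch (the role labels are my paraphrase of her diagram, not hers):

```python
# The eight-word sentence, annotated per the parse described above.
SENTENCE = "Buffalo buffalo Buffalo buffalo buffalo buffalo Buffalo buffalo"

ROLES = [
    "proper noun: the city, modifying the subject",
    "noun: the animal (subject)",
    "proper noun: the city, inside the relative clause",
    "noun: the animal (subject of the relative clause)",
    "verb: 'trick' (verb of the relative clause)",
    "verb: 'trick' (main verb of the sentence)",
    "proper noun: the city, modifying the object",
    "noun: the animal (object)",
]

for word, role in zip(SENTENCE.split(), ROLES):
    print(f"{word:<8} {role}")
```

Note that the three capitalized "Buffalo"s are all the city, and the sentence diagram falls right out of the word order: subject (1-2), relative clause (3-5), main verb (6), object (7-8).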

There are two lessons here. One comes from the Mental Floss author Chris Higgins, who warns "Beware of Buffalo buffalo, buffalo, for they may buffalo you." The other is, should you ever travel to Buffalo, be sure to buffalo Buffalo buffalo before Buffalo buffalo buffalo you. Or you could just avoid Buffalo altogether and not have to worry about getting caught in the middle of buffaloed Buffalo buffalo buffaloing Buffalo buffalo. From my perspective, that seems the saner course.


* Bison image was released by the Agricultural Research Service of the United States Department of Agriculture. Public Domain.