Sunday, February 28, 2016

Predicting the Future

I have given some thought to the question of why it is difficult to predict the future. That it is difficult is obvious. Unless the extrasensory power of precognition is possible, no one can know what will happen in the next minute or tomorrow, let alone next year, next decade or next century. However, I can be fairly safe in predicting what will happen in the next minute or tomorrow in certain respects. Despite the possibility that many things can and sometimes do change in a moment, it is very likely that my life will basically be the same a minute or a day from now and will follow the same routines that it has in recent days. Fate, or accident, can always intervene, but there are degrees of predictability for the different factors that impinge on future outcomes.


The laws of physics, and to only a slightly lesser extent, biology, will presumably never change, but our knowledge of them undoubtedly will if our species survives long enough. We can stick to what we know about science to make some predictions about the future. But by sticking to the safety of what we know, we cannot go very far in predicting the future. Safe predictions can be far too tame; there is always room for the unpredictable in reality. Also, the cutting edges of scientific theory and discovery give us some clues as to what might be possible in the future, leading us quickly into the realm of pure speculation.

Eventually, the sun, the star around which the earth travels every year, will burn out, and it is possible to make guesses about when that will happen; though, barring some catastrophe, this will not happen for billions of years. On a grander scale, we do not really know the nature of the universe. Are we in a universe of limitless order, or does order prevail only in the corner of the universe where we happen to be, while the rest is a realm of utter chaos? Is it possible to travel through space faster than the speed of light, or is there some other way to quickly get from here to there in our vast universe? Is time travel possible? That is, are leaps in time possible? (After all, we are always travelling into the future at what we perceive to be a steady pace.)

Despite the popularity of creationism in some quarters, the theory of evolution proposed in the mid-nineteenth century appears to be on the right track, but there are clearly many aspects of this process that we do not understand, and new discoveries about the process will undoubtedly change our understanding of what is possible. For example, five years ago, a biochemist suggested that evidence from certain microbial organisms might tell us that life began independently more than once on earth, which would prove that not all life on earth is related to a single instance of biopoiesis (the creation of life from non-life). This would also tell us whether life on other planets is likely: if it happened twice on earth, then it probably happened on other planets, too. While an intriguing idea, and one that remains possible, the evidence for this particular scientist’s claim has generally been rejected. However, one natural reason why evidence of independent biopoiesis could be possible but undetectable is that once life on earth got started, any newly generated organisms might be eaten up by the already existing ones. Another problem is that the conditions for the generation of new life might dictate that all newly generated one-cell organisms are genetically similar or identical to preexisting ones and therefore indistinguishable from them.*


The application of science to practical problems often results in technological innovation. There is a certain logic to technology, based on scientific principles that can help us predict future developments in the technical aspects of our future, but the specific directions technology will take are subject to unpredictable factors. How will science itself limit technological innovation? How will cultural and political factors quicken or slow the pace of technological change?

One of the most famous shortcomings of science fiction is that virtually no science fiction writer foresaw the decentralizing effect of computer technology. Almost to a man and woman, sci-fi writers of the 1950s through 1970s thought that computer technology would lead to and enable the centralization of information and social control. To everyone’s amazement, the opposite became possible and was realized.

The development of artificial intelligence is a favorite science fiction device, but the invention of a sentient robot like those of Isaac Asimov’s story collection, “I, Robot,” is a complex problem. Computer technology and mechanical robotics need to be fully integrated if we are to create sentient robots. Asimov’s genius in exploring the ethics of artificially intelligent robots was that he tried to answer questions that did not seem to need to be asked yet, but we now realize that we need to ask them before A.I. robots become a reality. We now have robots that can do repetitive tasks and move around. We are even seeing robots that can learn to change their behavior based on circumstances encountered in the real world. The pace at which we are solving the problems of artificial intelligence and robotics reminds us that putting together something as seemingly unitary as a sentient robot depends on separate technologies and technological problems that are of different degrees of difficulty and will not all be solved at the same time. The biggest how-to problem may be mimicking human thought, language and communication. Anyone who has encountered machine translation knows that computer programs that translate between two or more human languages have gotten better but are still unreliable—sometimes wildly and hilariously off-base. Until this and other language problems are solved, it will not be possible to teach a computer to think and speak like a human being.


One of the myths often put forth about the Founders of the United States and Framers of the U.S. Constitution is that they had no idea what the future and its needs might be. Actually, while it is true that they did not know exactly what the future would bring—and arguably made the Constitution short and simple as well as amendable precisely because they understood the need for a lawful and regular way to change it—they were capable of realizing that the future would be different from their present. For example, gun-control advocates have argued that the Framers of the Second Amendment to the Constitution did not know that future firearms would be different from the single-shot, muzzle-loading muskets in use in the late eighteenth century. Actually, they did have inklings of technological change in this as well as other areas. Remember that several of the Founders were inventors and tinkerers—notably Benjamin Franklin and Thomas Jefferson—who not only contributed to technological innovation themselves but were deeply appreciative of the innovations of others. In the area of firearms innovation, prototypical breech-loaders and multiple-shot guns were invented in the late eighteenth century, and the implication that future firearms might be capable of firing several times without reloading was already dawning on astute thinkers at the time that the Second Amendment was framed. This fact did not change the rationale for the right of access to the technology of self-defense that was recognized in that amendment, and the explicit recognition of such innovations would not likely have given the Framers any pause.

Political policies can profoundly affect technological innovation. Laws passed by governments can encourage or hinder new technologies in every field. Laws and regulations in such areas as pharmaceuticals can help to make new drugs safer before they reach the market but can also prevent useful drugs from reaching patients in a timely fashion. Predicting technological innovation with any degree of accuracy must take into account politics among other factors. Political developments are unpredictable because they are malleable and can even be whimsical, dependent as they are on such factors as popular satisfaction and dissatisfaction with the status quo.


It is difficult to predict fads, although betting on a certain degree of conservatism in the preferences of human beings will often reward prognosticators. For example, one of the wildly erroneous predictions of popular visions of the future in the twentieth century was the notion that people in the future would drive wildly absurd-looking automobiles—including flying ones. (It is interesting that the actual analogue of the manned flying car has turned out to be the unmanned drone, which few if any foresaw. On the other hand, some sci-fi writers predicted that automobiles might one day be driven by robots, which is now possible and about to happen.)

Automobiles are a good example of an area where fad, technology and politics intersect. Designs have to be functional, but they also lead to consumer expectations that may not be easy to change. After all, an automobile is an update of—and retains some features of—earlier technologies such as horse-drawn carriages. On the other hand, changes in design can be exciting to consumers depending on what influences are in the Zeitgeist. As mid-twentieth-century popular culture became fascinated by rocketry, faux rocket motifs became more acceptable in automobile design. By contrast, the political mandate for fuel efficiency stemming from the 1970s oil crisis led to design changes that would have been difficult to predict earlier. Not only was the mandate itself not easy to foresee (see the discussion of politics above), but the idea that cars would have to be made with lighter materials to achieve maximum fuel efficiency could only be foreseen when the mandate was foreseen (although, given the politics of an oil shortage, technology’s own logic would, indeed, predict lighter vehicles). As well, the results of the computer models used to achieve more aerodynamic designs would have been difficult to predict without having those computer models.

Another erroneous prediction from twentieth-century popular visions of the future was that, surely by 2016, people would be going about their daily lives wearing silvery overalls. (These would be skin-tight, of course, when worn by buxom women, according to some of the cheesier sci-fi movies and pulp magazine illustrations.) In the actual 2016, we can see that this has yet to happen, and a good guess is that it won’t happen. Why? Most human beings prefer the feel of natural fabrics over synthetics. Cotton, silk and even wool feel more comfortable against the skin than substitutes, which are more acceptable if they can match the feel of natural fabrics. It is true that synthetic clothing has been developed, but usually it has been made to look if not feel like traditional fabrics. And there has been considerable resistance to synthetic fabrics that are uncomfortable or lack aesthetic appeal. One of the most radical innovations in clothing on the technological horizon is the embedding of electronic technology into fabrics, but this technology has not been embraced by most people. The best guess is that the more unobtrusively it can be woven in, and the more useful it becomes, the more it will catch on.**

Predicting cuts of clothing has been one of the most fickle areas of forecasting. The problem is that such guesses are based on the latest trends, which have not necessarily been popularly embraced and which might fizzle and disappear, never to be embraced at all. An instructive example is the costume design of the movie “2001: A Space Odyssey” (1968), in which the designers, who included Rudi Gernreich, assumed that then-current clothing trends would either continue or develop toward some teleological endpoint. The result was the projection of the then relatively new miniskirt (although it had had earlier incarnations, for example, in the 1920s) into the next century. Although the miniskirt went out of style in the mid-to-late 1970s, it did, indeed, reappear by the 1980s and seems to have become almost perennial in Western society up to and including the present. Likewise, the movie’s guess that the lapels on men’s suit coats might grow narrower was a safe one, since lapels get wider and narrower in cycles, and the only other possibility for them is that they might disappear altogether. On the other hand, “2001” predicted that men’s neckties might disappear entirely by 2001. As things turned out, trust in the conservative tendency of attitudes toward clothing would have been the winning strategy in predicting the future of the tie. It is true that men in the twenty-first century sometimes go without neckties, but they still wear collared shirts that beg for a tie. The fringy fads made fashionable in the 1980s, especially as popularized by the TV series “Miami Vice,” did briefly promote an avant-garde look of wearing an un-collared shirt and no tie with a suit, but these styles always made one look like a Miami drug dealer, and that look was never going to move permanently to the center of acceptable fashion.

As with clothing, there is no accounting for whimsical fads in hair that start out on the edges of social acceptability and sometimes take over the center, as happened with many of the fashion trends of the 1960s and 1970s. The musical group the Beatles promoted radical fashions in hair and clothing that no one saw coming, nor could they have. In the area of hairstyles, “2001” did not promote the then relatively new fashions in longer hair on men (production of the film began in 1965), and by not doing so the film seems to have correctly predicted that shorter men’s hair styles would become acceptable again. In the 1980s, men’s hair styles shortened after lengthening in the previous two decades, but “retro” longish hair remained a possibility and did not go away. The same applied to beards: experimentation with nineteenth-century-looking handlebar mustaches and muttonchops in the ’60s and early ’70s eventually yielded to more natural full beards, though fanciful designs in facial hair as well as atop the skull, often reminiscent of topiary gardens, continue to be popular among the avant-garde. Meanwhile—and again promoted by “Miami Vice”—men began to sport stubble as a fashion statement, a fad that persists on the trendy fringe to this day.***

Women’s hair underwent even more frequent and radical changes than men’s during the same period. The popularity of natural and Rapunzel-esque long hair on women in the 1960s yielded, even during the ’60s, to the imperatives of ease of care, prompting shorter page-boy and pixie styles. There were also pressures for the more curvaceous tresses only achievable in hair salons or with the home use of rollers. In the eighties, “big hair” bouffants were popular; they were later supplanted by more natural-looking styles in the 1990s and beyond.

Hair can also be a political statement. In the 1960s and early ’70s, one could often tell someone’s political agenda and subcultural identity just by looking at their hair and clothes. To wear longish hair told people that you probably smoked pot and were opposed to the Vietnam War. As such styles became mainstream, there was actually a phenomenon whereby sideburns crept downward and thickened as one became more politically or culturally radical or willing to advertise it. But eventually all of this became mainstream fashion so that it no longer necessarily carried a message. At the same time, African-American culture made assertive use of hair and clothing to make a political statement. The Afro hairstyle practically spoke a sentence if not a long paragraph about one’s politics: “I reject conformity to the dominant white culture and will not try to look white in any respect. My hair is naturally kinky and I will not use chemicals to straighten it.” In conformity to the trend toward longer hair, however, some of the Afro styles were very long, especially among women, although not limited to them. But this too could be part of the statement: “Nor will I hide my hair by shaving or shortening it; rather I will wear it so it can be seen in all its glory.”

My point is that all of these fads would have been very difficult to have predicted without also predicting the political changes that made them imperative or the technological changes that made them possible. The intersection of science, technology, politics and fads combines more and less manageable influences on society, making it impossible to predict the future with accuracy. At best, an understanding of these factors will be helpful to the prognosticator, but mainly in teaching him that there are limits to the reliability of his prediction.
*The issue here is called “spontaneous generation,” an officially discredited notion in science, despite the fact that proving that it cannot and has never happened means attempting to prove a negative. Evidently, something like it must have happened at least once, or how else are we here?

**The joy of taking an idea to the limit is illustrated by Neal Stephenson’s novel, “The Diamond Age,” in which nanotechnology is embedded in everything from clothing to buildings and from books to chopsticks.

***A fascinating book is “Fashions in Hair,” by Richard Corson, which illustrates changes in hairstyles from ancient to modern times. Even in Ancient Greece and Rome there were cycles in what was popular, although, while trends seem to turn over more rapidly in modern times, cycles in the ancient world tended to be longer term. For example, periods during which men either wore beards or went clean-shaven lasted a century or two in ancient Greece and Rome. A young Roman senator was once expelled from the Senate for wearing a goatee during a clean-shaven cycle. Also, the Romans seem always to have opposed wearing a mustache without a beard because it reminded them too much of the Germanic tribesmen they often warred against. While it was often difficult to tell why beards came and went in ancient fashion, in one case it seems clear that an emperor ended a period of fashionable clean-shaven-ness when he grew a beard to cover facial scars. One could only predict a fashion-shift like that by anticipating the ascendance of a trend-setter in politics or society. To know that such things happen allows us to guess, but knowing when this will happen and what form it might take is unpredictable.

Tuesday, February 9, 2016

Abraham Levitsky, Ph.D., psychotherapist and hypnotist, 1922-2012

Back in the 1970s and 1980s I wanted to become a psychologist or psychotherapist. I finally decided not to become one and am glad I did not for reasons I won’t go into, but one of the most remarkable people I met in the course of this process was Dr. Abe Levitsky. He worked out of a little house on Milvia St. in Berkeley, California. He tried to hypnotize me once. I did not think he had succeeded at the time, but then I had to wonder, since I fully cooperated with the whole process, how do I know he didn’t hypnotize me?

Published in San Francisco Chronicle on July 8, 2012

Abraham Levitsky, April 1, 1922-July 2, 2012. Bay Area psychotherapist and longtime Berkeley resident Abraham Levitsky died peacefully of natural causes at the age of 90 on Monday, July 2nd. Abe was born on April 1st in 1922 in Montreal, Quebec, the only member of his immediate family to be born in North America. His father, mother and three older siblings were Russian Jewish immigrants. When Abe was three years old the family moved to Brooklyn, New York. He received his B.A. from Brooklyn College in 1942. After college Abe served in the Army in various Midwestern locations as a translator for Italian prisoners of war. Although he did not speak Italian prior to this assignment, his gift for languages allowed him to finesse many potentially embarrassing situations. During a brief stint as an aircraft navigation specialist he fared less well. After completing his Army tour he enrolled at the University of Michigan, completing his Ph.D. in Clinical Psychology in 1955. Shortly thereafter he moved to St. Louis and started an active private practice. During the 15 years he lived there he developed his signature flair for integrating diverse perspectives into his clinical work: hypnosis, Gestalt, and later, mysticism. He trained with Milton Erickson, became an associate editor of The American Journal of Clinical Hypnosis, and subsequently a Vice President of the American Hypnosis Society. In 1967 he moved to California to study at Esalen with Fritz Perls, one of the formative figures in Abe's life. During the 1970s Abe was deeply involved in the Gestalt Institute of San Francisco, teaching and training the principles of awareness, authenticity, and spontaneous creativity that were at the heart of his approach to psychotherapy. Abe had many passions: Advaita mysticism, Schubert lieder, Rhodesian Ridgebacks and Portuguese Water Dogs, tennis, and scrambled eggs.
He is remembered by all for his singular sense of humor and a talent for packaging great wisdom into memorable one-liners. Abe is survived by his wife Ellen (Nina) Ham of Kensington, his stepson Todd Porter, his step-daughters Jessica Mason and Jillian Jolie; his grandchildren, Amiya Mason, Kenzi Jolie, and Chad, Lorna and Kathryn Porter; his nephew Larry Levitsky, and his nieces Marianne Levitsky, Anne Rothman, and Barbara Bergeron. Donations in memory of Abe may be made to the Bay Area Gestalt Institute. Contact Lu Grey at for further information.