Tuesday, December 23, 2014


Charlemagne, called Carolus by contemporaries, was born in the 740s (historians are uncertain about the year), crowned King of the Franks in 768, additionally made King of (northern) Italy in 774, and Emperor of (western) Europe on Christmas in 800. He spent much of his life fighting to expand his empire. Pope Leo III was a close friend and ally, sanctioning his imperial aspirations.


Although illiterate, Charlemagne spoke at least two languages fluently, Frankish and Latin, and encouraged the preservation of learning as well as the thorough education of his children. He was a Christian and demanded that his subjects accept Christianity. When he died in 814, he had expanded his kingdom in all directions but especially toward the south and east. He controlled most of what is today France and Belgium and also ruled much of what is now western Germany and northern Italy. (His capital, Aachen, is now part of Germany.) Within his borders was some territory now belonging to Spain, and there were other lands where he made his influence felt although they were never fully under his control.


His son Louis the Pious became emperor after him, but over time the Carolingian Empire, as it was called, split into various kingdoms, only to unify as several separate countries in more recent centuries.

Friday, November 28, 2014

Television: A Personal History

(Note: When I began the following lengthy essay, I had some consistent policy in mind for when to use "TV" and when to use "television," but this seems to have broken down so that there is no rhyme or reason for my usages.)

I have had a very long history with television. I am 63 years old. My first memory of television must date earlier than 1955. I was of the first generation for whom the one-eyed box served as baby-sitter. My mother would plunk me in front of the tube and do her household chores: bed-making, vacuuming, laundry, cooking, etc. Staring at the television screen is among my earliest memories, though not my earliest. I had yet to learn to control the knobs and dials (there were not yet commercially available remote controls), and so I had to ask my mother to change the station.

One of the things that stands out is that I watched a lot of old movies and cartoons. There were only a few new and recent programs such as "Romper Room," a show aimed at kids my age; "Louise Morgan," a Neolithic talk show; and the local news and weather, of course, which my mother would watch with me. By and large, there was too little television programming to fill even a short broadcast day, so I watched silent cartoons (e.g., "Felix the Cat") from the 1920s and sound cartoons from the 1930s and early 1940s. The motion picture industry was afraid of television in those days and had banned the sale of post-World War II films of any kind to television. Consequently, though born in 1951, at the beginning of the second half of the twentieth century, my early experience of television was of looking through a window onto the first half of the century. Today, television has built up more than half a century of its own programming; it no longer needs the motion picture industry in order to broadcast 24/7. (Yet there has been a truce between movies and television since the 1960s and now a marriage that seems uncharacteristically solid by Hollywood standards.)

Most television stations went off the air between midnight and six a.m. I remember my brother and me getting up early on Saturday mornings to watch Laurel and Hardy and/or the Three Stooges on TV; only, if we got up too early, there was nothing but a test pattern on the screen until the station signed on. Speaking of the Stooges, et al., these were favorites of my brother and me, but my very favorite was the Marx Brothers, whose movies my mother let me watch, mainly during the weekdays, while she did her chores. I remember her suggesting that if I liked the Marx Brothers, I would surely like the Ritz Brothers. I didn't. They made several films between 1936 and 1942, but the few I saw were boring and decidedly unfunny. Another favorite program was the first filmed, as opposed to live-broadcast, TV series, which was also the first to be shown in re-runs, in the 1950s: "I Love Lucy." I used to watch episodes of that show that I suppose were only a year or so old when I watched them. This was the beginning of the trend of television feeding off of its own inventory instead of having to use early-twentieth-century cartoons and movies. Those movies were often made and took place either pre-World War II or during the war. I might well have thought that the war was on-going, at first, until my Mom assured me that it wasn't. My fears could not be so easily assuaged when it came to TV promos for I-do-not-know-what service or agency, but they showed Nikita Khrushchev banging on the table at the U.N. and shouting in Russian, which the narrator translated as "We will bury you!" Why this angry old man wanted to bury me, I could not fathom, but he scared the stuff out of me. Other ads were less threatening. "A tab in your washer is all that you need,/ Starts whitening action with light-n-ing speed." That early television jingle for bleach tabs still pops into my head sometimes.

Television, as almost nobody knows, was invented in 1927 by Philo Farnsworth. I asked my parents who invented television once, and they declared that it was invented by many people and not by any one person. They didn't know. There had been experiments in hybrid mechanico-electrical television before Farnsworth, but his was the first all-electronic television system. His invention was stolen from him by RCA, and although the U.S. Patent Office declared in Farnsworth's favor in the mid-1930s (1936, I believe), he never made the fortune he deserved because his patents expired in 1947 just as the popularity of television in the United States took off. His was the same system--with some technical improvements added by RCA--that lasted from the 1930s to the 2000s when high definition television was widely introduced and eventually supplanted the old standard.

Another little-known aspect of television history is that Great Britain had a more sophisticated television system in operation than the United States did from 1936 to 1939, even though the technology had been invented in the United States. (A Scotsman named John Logie Baird had invented what was probably the best of the mechanico-electrical television systems, but when he first saw Farnsworth's system he realized that his own system was doomed.) The reasons why the British system developed more rapidly than American systems were several: 1) The United States is vast and requires many television stations to reach every part of it, while Great Britain is small enough that a large part of the country could be covered by even a single station. 2) While the television manufacturing business was in private hands in Britain, the main signal broadcaster was the BBC, which, again, only had to reach a small (and densely populated) amount of territory to make television broadcasting worthwhile. 3) The BBC also set up a regular programming schedule of six hours a day, six days a week; contrast this with the United States where, during the same period, a typical station might run three or four hours of programming, three or four days a week. This made the expense of a television set more attractive to British customers than to their American counterparts because at least there was "something to watch" on British television. There was a joke in the fledgling television industry in 1939: "There are only 100 TV sets in New York City, and 99 of them belong to RCA executives."

The German government (i.e., the Nazis) never allowed private industry to produce sets for the average consumer, maintaining state control over viewing as well as broadcasting. (The 1936 Berlin Olympics were seen on television but only in a few public viewing rooms in a few German cities.) Consequently, Germans saw even less proliferation of television than Americans did. (BTW, when Farnsworth went to Germany to collect royalties for television from the government, they chased him out of the country, practically at gunpoint.) Television broadcasting was curtailed almost everywhere--with the possible exception of some U.S. markets--for the duration of World War II, but came back up in Britain in 1946. Even after World War II, the price of a television set in Britain remained slightly lower than in the United States, because the volume of sales allowed the price to stay low, but by the 1950s, U.S. TV sets were becoming cheaper as Americans bought sets by the millions.

(Note: How television helped win the war even though it was largely suspended for the duration: The factories that had made TV sets before World War II were converted to make radar screens; the development of radar technology, more advanced in Britain and the United States than in Germany and Japan, had a profound impact on the balance in air power between the combatants in World War II.)

From the 1950s, we had a TV set in our living room, which was a good-sized room, longer east to west than north to south, and the TV occupied the western end of the room. (In December, our Christmas tree occupied the eastern end.) The television was faced by a couch with an easy chair on one side and a rocker on the other. This was pre-transistors, and TV sets, like radios, had tubes inside them. The screen just went black when enough of these tubes burned out, and the tubes would need to be replaced. When the TV went on the fritz, we either took it to the little shop, jam-packed with TV sets and parts, where we had bought it, or else the shop owner or one of his sons made a house call.  I cannot remember the name of the man who fixed our TV. He was Armenian, though, as so many citizens of Worcester, Massachusetts, are. He would open the back of the TV, bend over and take various instruments from his worn leather tool belt. (Mercifully, I never saw the crack in the back the way I did when the plumber came by.)

Every few years, we got a new TV set for the living room. Always black and white.  Maybe we went through three or four over two decades(?). In addition, in the early 1960s, we got a portable TV, also black and white, which my parents put in their bedroom. Later this TV moved down to the back porch where I would watch "Honey West" (1965), while my Dad watched "Gomer Pyle, USMC" (1964-1969) in the living room, only a few feet away. What can I say? Adolescent boys are driven by forces more powerful than the funny-bone.

That portable TV has many associations. While it was in my parents' bedroom, I saw parts of shows I was never allowed to watch otherwise, including "The Dick Van Dyke Show," "That Was the Week That Was," and others. In the summers we took that set on vacation and I remember seeing some shows that I didn't see the rest of the year. There was "G.E. Theater," for example, an anthology series with a memorable episode about Christopher Clayton Hutton, a British intelligence officer who invented the "cloth escape map," an ingenious device that allowed prisoners of war to secrete a highly durable map of the territory in which they were captured and use it to find their way home once they escaped. (In 1942, American intelligence studied the British cloth maps and began issuing their own version of them to American servicemen, especially pilots. I am sad to say that although I had a silk map of the Philippines when I was a child, it went missing.)

The most memorable thing I saw on the portable set was the two-hour-old video of the murder of Lee Harvey Oswald by Jack Ruby in Dallas, Texas, two days after the assassination of President John F. Kennedy. An irony: The night of the assassination of JFK, one of my favorite anthology series, "The Great Adventure," had been scheduled to air an episode about Wild Bill Hickok, starring Lloyd Bridges. I had been looking forward to it, but, naturally, the program was preempted by live coverage of Lyndon Johnson and Mrs. Kennedy arriving in Washington, D.C. on Air Force One. When that episode of "The Great Adventure" finally aired, many months later, it ended with a re-enactment of the assassination of Hickok in 1876.

I remember traveling to Washington, DC, in 1962--so, before the assassination--to take my maternal grandmother to visit my grandfather's grave in Arlington. He had been a World War I veteran. My Dad and I stayed in a motel and we watched "Have Gun Will Travel" and "The Jack Paar Show," and my father advised me not to let Mom know that he let me stay up so late. I never did.

My parents never accepted those newfangled color TVs. My uncle, that is, my father's older brother, had a brand new house and a brand new color TV set, while my father had a fifty-year-old house and the latest (more or less) black-and-white. My parents' chief rationale for not going color was that most programs were in black-and-white, anyway. This argument fell apart in January 1966 when the television industry launched a new policy: All new television programs would be in color. This meant that, although any program that was already black-and-white stayed that way, for the time being, all new shows would be in color from then on. And most shows that were renewed for the following fall would be "color-cast," too. My favorite show--or my hormones' favorite show--at the time was "Honey West," about the eponymous, gadget-enhanced super-detective. The late Anne Francis, the star of the show, said that she was told that if the show got renewed in the fall of 1966, it would be in color, but it was cancelled. A show that did premiere in color on the 12th of January 1966 was "Batman." I was never a big fan, but I watched this goofy show often enough because everybody else was watching it. Despite the growing proportion of TV shows in color, my parents never did buy a color set.

We got the best picture on our TV set when we tuned in to the Boston stations. There was channel four, which was an NBC/Westinghouse station (many if not most of Westinghouse's stations were NBC affiliates; most of their programming came from NBC, but Westinghouse produced a few of its own programs); there was a CBS channel on five, and seven was ABC. Before stations began playing musical chairs in the 1970s and '80s (in Boston, channel five became an ABC station and seven became CBS), and especially before Fox took over a number of former-CBS stations in the early '90s, which network was on which channel had been almost the same everywhere in the United States, especially in the big cities. When I moved from Boston to San Francisco in 1980, I was surprised to find that the same channels were paired with the same networks as they had once been in Boston. Eventually, I found out that this had been deliberate and no coincidence. Four was always NBC, five always CBS, and seven always ABC because the big three networks had grabbed low station numbers in the 1940s, and each of them took the same channel number in each market. There was an early fourth network called DuMont, which existed only from 1946 to 1956, but I do not remember watching it, which could mean that it was not available in my area; my partner, who is older than I am and lived in New York in the '50s, does remember it.

Smaller cities often did not have their own television stations in the early days of television, unless there was a history of experimentation with television in that particular town; for example, Schenectady, New York, and Lincoln, Massachusetts, had experimental television stations back in the 1930s. Worcester, Massachusetts, where I grew up, did not have a station until about 1968 when the State Mutual Life Insurance Company, with its headquarters only about a mile from where I grew up, built UHF channel 28. This brings me to the difference between VHF and UHF. UHF required a separate tuner from the standard VHF tuner. VHF required a standard antenna, either set-top "rabbit ears" or a rooftop antenna. The VHF channels would go from 2 to 12, and the UHF channels were all double digits, often 28, 29, 38, 44, etcetera. The DuMont and early Fox networks were generally relegated to UHF, which limited their viewership to people who had sets with UHF tuners. DuMont never overcame this problem and went out of business, whereas Fox did overcome it when it acquired all those former-CBS stations in the '90s. Channels 38 and 44 were strong, popular channels in both the Boston and San Francisco areas, probably for the same reason: they all showed lots of old movies. Of course, when broadcast channels were included on cable line-ups, VHF and UHF channels were on equal footing for the first time; you didn't need to adjust your VHF rabbit ears (or rooftop antenna) or your UHF hoop on the back of your set.

When I moved to central Virginia a little over a decade ago, I saw another odd result of the big network claims on certain channel numbers: The Shenandoah Valley never had good broadcast television reception, so in the 1970s a cable system was established. This system included the NBC station from Washington, DC, which, of course, was on broadcast channel four, and wound up on channel four of the new cable system. Years later, Charlottesville, Virginia, acquired its own NBC broadcast affiliate, channel 29. When it replaced the Washington NBC affiliate on the cable system, 29 took over cable channel four. Most of the cable customers in this area never wonder why they are watching channel 29 on channel four.

For a long time, channel six was unavailable in most (though, somehow, not all) markets, because the FCC decided to put the FM radio spectrum where television channel six is located. I remember being puzzled in the early '70s when I was tuning up and down the FM range and stumbled on the dialog from an NBC-TV soap opera at one end of the dial.

I left home in 1970 to go to college. The TV sets in the dorms were color, but most of them were monopolized by the other students. I did find a TV in the basement that hardly anybody watched, and I watched "The Dick Cavett Show," a great talk show, and other more or less memorable programs that nobody else wanted to watch. Later, during the '70s and '80s, I went through periods when I did not own a TV set at all. I nevertheless saw the advances in television and its accessories during this time. I remember seeing a store that sold video recorder-players in the early '70s. The only movie available on video at the time was "The Godfather," and the cost of these devices was prohibitive. Once, when I did own a TV set in the '70s, I saw an episode of the now-classic detective series, "Columbo," in which William Shatner played a man who used a newfangled video recorder-player to create a time-shifted alibi for a murder. Columbo figured out how the video recorder-player worked and solved the case. How dated that episode would seem now.

The 1970s saw the beginning of cable TV as we now know it. Cable had been regarded as a necessary evil for people who lived in valleys where they could not get broadcast television. But the idea of charging people to watch cable TV in big cities was in its infancy in the '60s and '70s. "TV Guide" magazine--which used to be a publication aimed at thinking adults instead of being aimed, as it is now, at people who cannot tell the difference between reality and fiction--had articles on the coming cable revolution by the late '60s. They called it "pay TV" at that time, because the fact that people would have to pay to watch television was the novelty that jumped out at everyone. I remember trying to explain it to a classmate in college in 1970, and he couldn't comprehend why someone who was from, say, New York, New York, as he was, would pay to watch cable when New York is bombarded by many broadcast stations. It was neither the first nor the last time that I ever "lost" an argument to someone who turned out to be entirely wrong.

During the '70s, HBO, TBS and TNT came on-line. I remember my first exposure to a cable system that allowed someone in Ohio to watch TV stations from Chicago and Atlanta. It seemed a new world. (Of course, valley-bound cable systems had been doing something like that for years, pulling in channels from the nearest big city, though it might not be at the distance of Chicago and Atlanta.) I remember being slightly less impressed when I saw someone else's cable system in the '70s, because they got HBO, and that seemed to be about it. By 1980, however, cable was big business and everybody wanted it. CSPAN and other services had been added, and the picture on every channel was perfect; no more snowy channels like the ones we got when I was a kid from Manchester, New Hampshire, and Providence, Rhode Island, in contrast to the Boston stations, which were crisp and clear. (Although I am currently staying in a motel where the CBS channel on the cable is snowy for no good reason.) I think I first saw closed captioning on a big screen TV in a restaurant during the 1980s. The '80s were also the era of video rental stores, and the dawn of music videos and MTV. (Yes, kids, the "M" in MTV once stood for "music," and that cable channel once actually showed music videos!)

As cable spread, I became aware of a growing political problem. Cable television was--and still is--treated as a monopoly utility so that local governments license a single cable company to operate a monopoly in that locality. This is a system that goes from flawed to worse very quickly. Like all monopolies, the cable company has its customers over a barrel, and if, in theory, residents have some recourse by complaining to the government body licensing the cable company, in actuality, the cable companies just contribute to the politicians' campaigns and your service complaint goes nowhere. The city of Allentown, Pennsylvania, once had a suitable solution. (I wonder whether they still have it.) They licensed two cable companies and set them in competition with each other. Customers whose complaints were not satisfied simply switched to the opposing cable company at no extra charge. Prices went down and service quality went up.

Another result of the politicization of cable television is not just systemic but actively corrupt. In sprawling Los Angeles, California, each section of the city and county got its own cable system. When it came to wiring South Central L.A., the poorest section of the city, a system of corruption kept the right to wire the neighborhood in play for many years, as the holders of the right--themselves often ex-politicians and bureaucrats who had left government to go into the cable business (in one case, after having written the ordinances governing cable systems!)--sold and resold it. During this time, not a single foot of cable was actually laid. The pols kept saying that they did not want to sell the rights to a big cable company like Warner, but, in the end, that is exactly who they sold them to! During all of this, a pair of private businessmen who actually had the means and experience to wire the neighborhood were trying to get the rights, but the ex-pols played keep-away, dragging the whole thing out as long as they could so that they could get rich selling and reselling the rights. The two businessmen asked what they could do to get the rights, and they were told that they should donate to the mayor's campaign fund. They didn't do that, and therefore never got the contract. Ultimately, the two businessmen sued the city in federal court, and a judge ruled that the city had violated the civil rights of these businessmen (who were black, by the way). There was no ruling on the irony that the black and Hispanic politicians and bureaucrats of Los Angeles had violated the civil rights of the black and Hispanic residents of Los Angeles by keeping so many of them from having the latest communications technology in their homes for so many years.

About four years ago, my own access to television technology underwent a radical change. I had had cable for many years, both in California and when I moved to Virginia. I had also discovered the power of Internet video-viewing on newer computers. When we moved into our current apartment, we had to set up our TV set near the wall where the cable connection was available. We were dissatisfied with this location and wanted to move our set to an area of our apartment away from the cable outlet, but this would have meant the cable provider drilling a new hole from the outside--verboten to our landlord--or else running cable across the room and underfoot. What to do? Finally, I was watching a video of Glenn Beck on the Internet, and he announced that his new Internet television network was going to become available on Roku. I looked into Roku and found out that, through the Internet networks Hulu, Amazon Prime, and Netflix, a Roku viewer could connect his TV set to the Internet and watch many, though not all, of the same programs that a regular television viewer could watch, plus many on-demand offerings that a cable viewer could not see. Our television viewing now consists of a combination of about ninety percent viewing of older movies and series on the above-mentioned three Internet networks, and about ten percent viewing on our DVD player. We still have basic cable, which allows us to record programs from television and then watch them later. We especially use it to watch CBS programs because, while they are available on Amazon Prime, there is an additional charge per episode. Meanwhile, NBC, ABC, and Fox let Hulu stream their programs about 24 hours after original broadcast. Also, through Netflix, we can still rent DVDs of movies and TV series that are not available online. Another feature of Roku is availability of the Internet channel Pandora, which offers a stream of music that can be tailored to the listener's preferences.

Telecommunications is a brave new world, indeed. It seems obvious that the future holds some sort of overall integration of media. People are already able to access their interactive computer functions through the same devices that allow them to watch Hulu or Netflix or download a book from Amazon's Kindle store. "Star Trek: The Next Generation" used to show characters going into their quarters and asking one interactive computer to play a certain piece of music, change the temperature of the room, or access information. Any of that that isn't already possible soon will be. The main change will be the diffusion of that sort of technology across our entire society.

A final caveat. In my reading about the history of television, I was struck by the irony that David Sarnoff, William Paley and other pioneers in the television industry saw television as an instrument for global communication that could do good by getting people in touch with each other and allowing them to see and understand each other. The irony lies in the underhanded way that Sarnoff went about controlling communications through power grabs, back-biting, intimidation, and in some cases even theft. How did these visionaries expect to achieve global harmony and good will when their methods of achieving it exemplified the exact opposite of harmony and good will?

Monday, October 6, 2014

Will the Real James Bond Please Stand Up

It is night. A secret agent stealthily emerges from the sea. He peels off his rubber suit to reveal a dinner jacket. He enters the casino and mingles with the guests. No one suspects that he’s a spy.

His name is James Bond, right? And, of course, it’s the swinging 1960s?

Actually, no. You’re thinking of fiction. I’m talking about something that happened in real life. In the 1940s. During World War II. His name wasn’t James Bond. It was Peter Tazelaar. Though a native of Holland, he was working for the British. He had just infiltrated the Netherlands by way of a gambling resort at Scheveningen. The casino was crawling with German counter-intelligence agents, but they never suspected the man in eveningwear who even smelled as if he had been drinking brandy all evening. (A nice touch, eh?)

(The Koerhuis Hotel in Scheveningen near where the spy Peter Tazelaar emerged from the sea. All pictures courtesy of the wikimedia picture gallery.) 

Someone who was well aware of this operation was a young naval intelligence officer named Ian Fleming (Who do you think provided Tazelaar with his transportation to the beach?) who later incorporated the incident into his novel "Goldfinger." Tazelaar, by the way, was one of only four agents who survived missions into the Netherlands at that time. The rest were all compromised. Like Bond, Tazelaar was a man who could handle danger. He served all over Europe and the rest of the world during the war* and afterward worked for the Dutch government and the CIA.
It has often been said that Ian Fleming based his famous fictional spy on this or that real life spy, but the truth is that James Bond was a composite of several spies that Fleming was aware of and admired.

Aside from Tazelaar, let’s look at some other candidates:
Capt. Sidney Reilly, M.C., also known as Agent ST1.

Reilly's dates say a lot about his mystique: 1874? to 1925? No one knows where or when he was born or where or when he died. Much of his "official" biography was probably false. (The best guess is that he was from Eastern Europe, possibly the Ukraine.) He was certainly not a native of Britain, though he worked for the British Secret Service and spoke English with an English accent. But, then, he spoke German, French, Italian and Russian like a native, too. He had the Bond charm, all right. A connoisseur, a ladies' man, a master of disguise, and an expert marksman. One of his legendary exploits occurred during World War I. Supposedly, Reilly killed a German officer and put on his uniform, entered a German Army staff meeting, listened to all of their battle plans, then disguised himself as a German civilian and slipped back to the British lines. After World War I, Reilly became embroiled in Russian politics, seeking to overthrow the Bolsheviks, sometimes on behalf of the British, but apparently also on his own. Had one of his schemes worked, Reilly himself might have become a top official of a new government. Reilly is presumed to have been secretly executed by the Cheka, the forerunner of the KGB.

Dushan "Dushko" Popov, OBE, also known by two code names: Tricycle (British) and Ivan (Germans).

A double agent of Serbian background, he worked for British Intelligence during World War II feeding disinformation to the Germans who thought Popov worked for them. The story goes that Fleming, who knew that he worked for the British, saw Popov place a daring bet at the baccarat table just to rattle a rival. The scene went into Fleming's first Bond novel, "Casino Royale." Popov lived in London, Lisbon, and New York. He warned the FBI that the Germans wanted information about U.S. defenses at Pearl Harbor, Hawaii, but his warning went unheeded. He was, like Reilly and the fictional Bond, a linguist, ladies' man, and connoisseur of the finer things in life. Like Bond, he pretended to be a representative of an export-import company. He lived from 1912 to 1981, but may have claimed, presumably out of vanity, to have been born in 1919.

Peter Fleming, OBE

During World War II, Ian Fleming's older brother fought behind German lines with resistance fighters in occupied Norway and Greece. He also ran deception (psychological warfare and disinformation) operations in Southeast Asia.

William Stephenson, code name: Intrepid; nickname: Little Bill.

Stephenson was a Canadian who became a spy for Britain during World War I and later ran a counter-intelligence network for MI6 during World War II. He was based in New York but his network covered the entire western hemisphere.

Cmdr. Lionel Crabb, Royal Navy

(Coote, R G G (Lt), Royal Navy official photographer [Public domain], via Wikimedia Commons)

Crabb was not as sophisticated as Bond, but, like both Bond and his friend Fleming, he was a naval commander and a chain-smoker. In his prime Crabb was the best frogman in the navy, engaged in covert ops against the Germans and Italians during World War II. Crabb mysteriously disappeared during an op against the Soviet navy in the mid-1950s. Years later, a Soviet frogman claimed that he had killed Crabb in an underwater knife fight. Though few knew the details of Crabb's final mission at the time, Fleming knew enough to use it as the basis for his novel "Thunderball."

Cmdr. Ian Fleming, Royal Navy

As an intelligence officer himself, one of Fleming's jobs was thinking up wild operations, some of which were deemed too farfetched by his superiors. Historian Vejas Liulevicius notes that, as a novelist, Fleming was much better paid for much the same ideas. He also got in touch with his inner Walter Mitty when he created the fictional Bond: In Lisbon, Portugal, during World War II, Fleming played high-stakes card games with enemy agents, but, unlike his fictional hero, he lost. Bond smoked the same cigarettes, drank the same liquor and enjoyed the same sports as Fleming. Fleming's masochistic fascination with Bond's endurance of torture might trace back to Fleming's education at Durnford School, a private British elementary school where some allege that extreme child abuse was the norm.**

This topic is further covered by a Wikipedia article entitled "Inspirations for James Bond" and in Ben Macintyre, "For Your Eyes Only: Ian Fleming and James Bond," London: Bloomsbury Publishing, 2008.

*Born in the Dutch East Indies, Tazelaar helped rescue his own mother from a Japanese POW camp.
** Ben Macintyre, "A Spy Among Friends." New York: Crown Publishers, 2014, p. 5.

Saturday, July 12, 2014

Hand in Glove

Trivia: Latex surgical gloves are generally used today to protect the patient from germs on the hands of the surgeon and his team. Surprisingly, though, this was not why surgical gloves were first used. Dr. William Stewart Halsted of Johns Hopkins introduced surgical gloves for the exclusive use of his favorite surgical nurse, Caroline Hampton (who eventually became Mrs. Halsted). In the late nineteenth century, the idea of keeping the field around the patient sterile was relatively new, and doing so involved spraying an antiseptic mist containing mercury. Unfortunately, Caroline's skin had an allergic reaction to the mercury; so, rather than do without his favorite nurse, Halsted gave her rubber gloves to wear during procedures. Nobody else wore gloves, just her. Eventually, of course, it was realized that gloves on the hands of the entire team might actually benefit the patient, too, and the rest is history.

Not Too Big to Fail?

It is an article of faith among those on the left that if one person, company or country is successful while another fails, it must be because the one stole the success of the other. This is rarely the way the world works. Take, for example, the fact that in 1914, London, the capital of Great Britain and its far-flung empire, was the financial capital of the world. If you were in the business of making and controlling large amounts of money in stocks and other investments, London was the place. By 1918, however, New York City in the United States was the financial capital of the world. Did New York achieve this by taking something away from London? Not quite. What happened between those two years was World War I, which financially ruined Germany and, to a degree, France as well. To pay for its own share of the war, Britain spent and borrowed money and brought itself near bankruptcy, too. London had to borrow from someone, so it borrowed from New York. It wasn't so much that London had something taken from it as that it exhausted itself. New York stepped in to take up the slack, lending not only to London but to anyone else who wanted to borrow money and could no longer borrow it from London.


Failure is never so much what others take from us as it is the result of our failing ourselves. Recently, the AMC cable channel started airing a series called “Halt and Catch Fire” about the heady days of PC clone computers in the 1980s. This is a remarkable and instructive (and therefore forgotten) story about how the action of the free market confounds the common (leftist) wisdom about how economics works.

Shortly after IBM announced its pending introduction of the OS/2 operating system for its PCs in the mid-1980s, I met a man who worked for "Big Blue," and he told me that IBM was going to continue to dominate the PC market because it was big (a kind of precursor of "too big to fail") and people would buy OS/2 just because it was IBM that produced it.

To grasp the fallacy in his thinking, you need to understand the history of IBM’s PC project. Of course, IBM is one of the oldest computer companies in the world, and made its reputation building enormous computers. In the 1960s, the company introduced “mini-computers,” which were so-called because they were smaller than the house-sized behemoths IBM had been making, but these minis were still more than twice the size of the refrigerator in your mom’s kitchen.

Well, after Apple introduced desktop computers with user-friendly software, Big Blue assigned some executives to a team project intended to develop IBM's own desktop to compete with Apple. The trouble was that IBM's people did not know what they were doing and did not really know that they did not know; and if they had known, they would not have believed that it made any difference. They represented IBM, and they were the five-hundred-pound gorilla in the computing business.

So the first thing IBM did was write its own BIOS, the proprietary firmware at the heart of the machine, and claim exclusive rights to its code. The next thing it did was build a desktop computer around that BIOS, using off-the-shelf parts—an Intel processor and other components IBM did not own. Anyone else could buy those parts and build a computer with them, but IBM didn't think they would, because they would be competing with Big Blue, and Big Blue was the five-hundred-pound gorilla, right?

Next the IBM boys went to Redmond, Washington, and met with Bill Gates, then-president of a little software company called Microsoft. Gates (now the semi-retired chairman of Microsoft) and his business associate, Steve Ballmer (who recently stepped down as CEO of Microsoft), quickly realized that the IBM guys did not have a clue. They thought that Gates could magically give them CP/M, the best operating system at the time, but not one that Microsoft owned. Gates sent IBM to the owner of CP/M, but when IBM called at his home, he wasn't there, and his wife, who was vice-president of his little company, was so freaked out by the IBM men in their crisp blue suits, waving sheaves of non-disclosure agreements in her face, that she sent them away. Not knowing what to do next, the IBM men went back to Gates. This time, Gates decided to stop being so ethical and take some money from these suits. He bought QDOS—the Quick and Dirty Operating System—from a long-haired hacker and re-sold it to IBM. Now, Gates knew that QDOS was basically a rip-off of CP/M, but any lawyer could have told him that the law in the early 1980s was very unclear about how far copyright protected a software program. A program fell into a gray area: it was not a conventional piece of writing, and, dubbed a "virtual machine," it was not an actual machine either. (Do not try this today; the courts have since made clear that computer programs are protected.) This legal murkiness, which helped Gates sell an operating system to IBM, would later come back to haunt Big Blue.

IBM took the Microsoft DOS (MS-DOS after Gates replaced the Q with an MS) and called its version PC DOS. It operated the first IBM PC. It was an adequate little desktop computer for its day. For one thing, it was cheaper than the Apple desktop, and you could buy one anywhere and even get it repaired through your dealer.

(A side note: Michael Dell was a college dropout who sold bootleg IBM PCs out of the trunk of his car. The legality of this was interesting because the dealers who sold Dell those PCs for him to resell were breaking a legally-binding contract they had signed with IBM; but Dell was not technically breaking the law because he had signed no such agreement. Michael Dell eventually went into the business of selling his own PC clones, but that gets us ahead of our story.)

Gates again remembered his ethics and—realizing that someone could use Microsoft DOS to operate a PC clone—offered to sell the exclusive rights for the operating system to IBM. But IBM did not see why they needed to do that. The software only worked in a PC, and IBM made the only PCs. They were Big Blue after all. They were the five-hundred-pound gorilla, right?

It might have worked out differently. When companies like the fictional one in the TV series "Halt and Catch Fire" began the expensive and elaborate process of reverse-engineering the BIOS at the heart of the PC, it was only a legal theory that judges would declare this legal. What the would-be clone maker had to do was get someone who had never worked for IBM and had never seen its BIOS code to read a specification carefully extracted from a report on how the IBM BIOS works. This specification would not tell how IBM's code was written; it only told what the BIOS does and what a clone would need to do. The reverse-engineer had to write, almost from scratch, code that would mimic everything the IBM BIOS was capable of doing. It would cost the company that wanted to do it millions of dollars just for the reverse-engineering project, but the prospect was not as daunting as it might have been. If you reverse-engineered the BIOS, you did not have any other proprietary pieces to worry about. IBM did not own the other components of the hardware, and, since IBM had turned down Gates's offer of exclusive rights to the operating system, the clone manufacturer could just pay Microsoft for the right to use MS-DOS.

Was all of this legal, though? IBM would have to sue somebody to find out. The problem, of course, was that candidates to sue were everywhere. There were both foreign and domestic clone makers within a short time. And when IBM finally sued some of them, the courts made the landmark decision that reverse-engineering was legal because the people who actually wrote the new BIOS had never seen IBM's code and merely produced firmware that could do the same things.

At that point, the argument that people would buy IBM's new proprietary OS/2, just because IBM made it, fell flat. IBM failed to realize that it was closing the barn door after the cows had already escaped. Microsoft DOS was a good enough operating system as far as consumers were concerned, and the price of a PC clone was right. (It was cheap.) The desktop computer became ubiquitous, and the revolution was underway.

The head of the PC project at IBM tendered his resignation, by the way, but his boss refused to accept it. The lesson taken at Big Blue seemed to be that this was a learning experience for everyone. Quite a lesson: Size only matters until it doesn’t.

It was not that the PC clone makers took anything away from IBM so much as that IBM shot itself in the foot by not taking proprietary control over more than just one component of its invention. IBM failed to gain a monopoly over the PC because of its own weaknesses, not because of the strengths of its competitors who resembled a swarm of mosquitoes more than an opponent anywhere near IBM’s own size. And the assumption that IBM would beat Apple because Apple was a smaller company just showed a failure on IBM’s part to see the bigger, more complex picture. It was also an error on the part of both IBM and Apple to think that the real battle was over the hardware. In the war between IBM and Apple—which my IBM acquaintance assumed that IBM would naturally win because of the five-hundred-pound gorilla thing—it turned out that the winner was Microsoft, not either of the hardware champions. Licensing MS-DOS and subsequent software products to PC clones is what made Gates the billionaire that he is today.

Gay Fascism?

There are more and more cases popping up all over the country of gay people suing business owners for not photographing or catering their weddings or parties. In at least one case a t-shirt customizer was required to make t-shirts for a gay pride parade or else get out of the business.

Gay people of past millennia would have given their right arms just to be tolerated. They wanted to live without other people's values imposed on them. To now impose on non-gays an obligation to participate in gay celebrations is a kind of reverse fascism, forcing people who would rather not to take part in same-sex celebrations. It is one thing to expect a baker to sell you a cake regardless of your sexual orientation, but it is quite another to force him to cater your gay wedding or make signs for your gay pride parade. If you call the police on him, he isn't guilty of harassing you; you are guilty of harassing him.

Thursday, June 5, 2014

Jews and Christians

The elephant in the living room for most Christians is the fact that Jesus was Jewish. Joseph and Mary were Jewish, his disciples were Jewish, most—if not all—of his friends were Jewish.  Yet the only people called Jews in the gospels are people who are not currently following Jesus. Take his friend Lazarus in the Gospel According to John. Now Lazarus is Greek for Eleazar. That’s really a Jewish name. Eleazar was Jewish and so were his sisters, Martha and Miriam (called Mary in the Gospel, naturally).


So in the eleventh chapter of the Gospel of John, Lazarus’s sisters send word to Jesus that Lazarus—that is, Eleazar—is dying. And Jesus takes his time getting there so that by the time he arrives, his friend Eleazar is dead and buried. And John says there are Jews who have come to sit shiva with the grieving family as if they’re Jewish and Eleazar’s family isn’t, although I don’t know why, in that case, the family would be sitting shiva and why “the Jews” would sit with them.


Anyway, Jesus comes, and you really know Martha is Jewish because she actually believes that Jesus is the messiah and this means that the messiah of all people apparently has come to sit shiva with her family, and yet the first thing she says to him is, basically, “You couldn’t have come sooner?”


Now there has recently been some controversy about verses being left out of the gospels, and in this case it happens to have come to light that a line was left out of this story about Eleazar being resurrected by Jesus. Jesus brings him back to life and presents him to his sisters, and what’s been left out is Martha’s line: “But he had a hat.”


I think I was doing well in the above commentary until I brought out the old hat joke, but you get the idea.


It has been said that Jews talk about being Jewish in a way that Christians don’t talk about being Christian. This is true in the sense that being Jewish is an identity that goes deeper than one’s religious beliefs. Being Jewish depends on history, ethnicity, family, and the experience of being an often oppressed minority. It is precisely that experience of oppression—many Jews themselves believe—that has given many Jewish people an ironic, sarcastic, skeptical sense of humor. One Jewish man went so far as to describe his parents as “very satisfied middle-class people and I love them dearly but they have never been funny. My grandparents, on the other hand, escaped pogroms in Europe and spent the first years of their lives in America in poverty, and they are hysterically funny.”


The same man observed that Christians, across the board, don’t have the kind of identity issues that would lead them to sit around talking about being Christians per se. Although you might want to watch the Coptic Christians escaping Egypt today and settling in places like New Jersey. In Egypt they have been killed by mobs, their daughters have been raped, their homes, businesses and churches burned—there might be a good chance that their sons and daughters, if they survive to reach America, might be the next wave of young standup comics.

While Christians don’t get together to discuss being Christian in as deep a sense as some Jews might, we should qualify that by saying that Christians have identity issues, usually surrounding their belief systems and how that has affected their personal lives, families and even their community. Obviously, people who have deep Christian beliefs do relate to each other in a special way. A Christian woman I know told me that she went into an office and the secretary, a woman of a different race, asked how she was. My acquaintance said, “I am blessed.” “Well, so am I,” said the secretary, and the two strangers ended up hugging each other.


There is also a kind of high sign between apostate Christians. You know what I mean if you have ever seen two ex-Catholics or two ex-Baptists meet. There is a sense that they each know what the other has been through: parents, friends, community, clergy, all frowning on their current life choices. This has to do with belief—or in this case, the absence of belief—but it affects relations with parents, siblings and community members who either take a long time to come to terms or never do. My best friend no longer practices Catholicism but she still says, “I am not an ex-Catholic; it has marked me forever.” I asked her today whether she feels that she has a “Catholic soul” in the sense that some Jews have said that they feel they have a Jewish soul. No, she says. A soul is more likely to come from ethnic identification and usually when they have a history of having been denigrated. So Irish people might feel they have an Irish soul, or African-Americans might feel they have a unique soul. (Indeed, African-Americans have virtually owned the concept of "soul" for some time.)


A New Age guru ran a commune where he invited speakers of different faiths to come and present their spiritual gifts. After a charismatic rabbi spoke to the commune, the guru stood up and said that before the rabbi came, the commune was about six percent Jewish; now that the rabbi had come, they were still six percent Jewish, but only because each member of the community was now six percent Jewish.

Four possible reactions from  four (or fewer) Jews:

Reaction number one: Wow, what tolerance! What acceptance!


Reaction two: This kind of logic is the best you can expect from the goyim?


Reaction three: Only SIX percent?


Reaction four: How come the six Jews in this cockamamie commune lost ninety-four percent of their Jewishness?

Tuesday, March 25, 2014

Number Magic

There are many tricks involving numbers that demonstrate their surprising relations to each other. What does any of this have to do with history? Well, number magic is as old as man's discovery of numbers.

An old chestnut is the way the results in the multiplication table for the number nine change with each step: the tens digit counts up by one while the ones digit counts down by one.

 1 x 9 =  9

 2 x 9 = 18

 3 x 9 = 27

 4 x 9 = 36

 5 x 9 = 45

 6 x 9 = 54

 7 x 9 = 63

 8 x 9 = 72

 9 x 9 = 81

10 x 9 = 90

This pattern actually continues after both columns start over at "9."

11 x 9 = 99

Then the first two digits can be read together as consecutive numbers, counting up, while the last digit, in the ones column, counts down.

12 x 9 = 108

13 x 9 = 117

14 x 9 = 126

You can also see the results over one hundred as a "subtract one" formula. For example, 4 x 9 = 36, so 14 x 9 can be solved by taking 36, subtracting 1 from its tens digit (the 3 becomes a 2, giving 26), and putting a 1 in front: 1, 2, 6, or 126.
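This shortcut is easy to confirm mechanically. The following few lines of Python are my own illustration, not part of the original post; they restate the digit manipulation as arithmetic, since dropping the tens digit by one is the same as subtracting ten, and putting a 1 in front is the same as adding one hundred:

```python
# The "subtract one" shortcut for 11 x 9 through 19 x 9:
# take (n - 10) x 9, subtract 10 (the tens digit drops by one),
# then add 100 (a 1 appears in front). Net effect: add 90.
for n in range(11, 20):
    base = (n - 10) * 9          # e.g., for n = 14 this is 36
    shortcut = base - 10 + 100   # 36 -> 26 -> 126
    assert shortcut == n * 9
```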

Also notice that if you consider the digits as separate numbers and add them up, you always get nine.

For example, 1 + 2 + 6 = 9, or take the sequential results 18, 27, 36 and 45: 1 + 8 = 9; 2 + 7 = 9; 3 + 6 = 9; 4 + 5 = 9. Also, consider 12 x 9 = 108. If you add 1 + 0 + 8, you get 9. Same with 13 x 9 = 117: 1 + 1 + 7 = 9.

Of course, one of the most obvious reasons why the first relationship works the way it does is that in a counting system based on the number ten, adding ten to any number moves the digit in the tens column up by one while the digit in the ones column stays the same; so 16 + 10 = 26, 17 + 10 = 27, and so on. Nine, however, is one less than ten; so when you add nine to any number, the ones digit of the result is always one less than the ones digit of the number you were adding to. So, 9 + 10 = 19, but 9 + 9 = 18, or one less than 19. Each subsequent time you add nine, the next result in the sequence is one less in the ones column than the previous result.
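For readers who like to check such claims by machine, here is a short Python sketch, my own illustration rather than anything from the original post, verifying both patterns: the up/down movement of the digits and the digit sums that always come to nine:

```python
def digit_sum(n):
    """Add up the decimal digits of n."""
    return sum(int(d) for d in str(n))

def digital_root(n):
    """Repeatedly sum digits until a single digit remains."""
    while n >= 10:
        n = digit_sum(n)
    return n

# In 1 x 9 through 9 x 9, the tens digit counts up as the ones digit counts down.
for k in range(1, 10):
    tens, ones = divmod(9 * k, 10)
    assert tens == k - 1 and ones == 10 - k

# Every positive multiple of nine reduces to 9 when its digits are summed.
for k in range(1, 1001):
    assert digital_root(9 * k) == 9
```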


Here is another trick: Add any three consecutive numbers whose last member is divisible by three, and the sum can always be reduced to the number 6 by adding up its digits (repeating the process if the result has more than one digit).


Let's take 1, 2, 3.

1 + 2 + 3 = 6.

No need to do anything more in this case because the last number in the sequence is 3, which is obviously divisible by three (3 ÷ 3 = 1), and since six is the number we are looking for, we need look no further than the sum.

The next sequence (4, 5, 6), however, will give us a two digit sum, so we will have to add the component digits of the sum:

4 + 5 + 6 = 15.

And when we add the digits of this sum (1 + 5), we get 6.

This trick goes on working forever: the sum of three consecutive numbers ending in a multiple of three is always three less than a multiple of nine, and any such number reduces to 6.

7 + 8 + 9 = 24; 2 + 4 = 6 ...

16 + 17 + 18 = 51; 5 + 1 = 6 ...

202 + 203 + 204 = 609; 6 + 0 + 9 = 15; 1 + 5 = 6,

and so on.
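Brute force confirms it. This Python sketch (again my own illustration, not from the original post) checks every run of three consecutive numbers ending in a multiple of three up to 3,000; the underlying reason is that such a sum always equals 9m - 3 for some m, which leaves remainder 6 when divided by nine:

```python
def digital_root(n):
    """Repeatedly sum the decimal digits of n until one digit remains."""
    while n >= 10:
        n = sum(int(d) for d in str(n))
    return n

# If the last of three consecutive numbers is 3m, their sum is 9m - 3,
# which is congruent to 6 mod 9, so its digital root is always 6.
for last in range(3, 3001, 3):
    total = (last - 2) + (last - 1) + last
    assert digital_root(total) == 6
```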

Tuesday, January 7, 2014

The Impossibility of Saying What We Mean

I found the quotation below scribbled on a piece of paper. Does it make sense to you? It makes perfect sense to me, but then I have spent years of my life contemplating how easy it is to reach colossal misunderstandings with others. I suspect that I copied this from something by Vardis Fisher, the little-known but worthwhile American novelist. Probably from his book "God or Caesar."

"In no language can we say what we mean. We conform what we mean to the limitations of the language(s) we happen to speak."

If Fisher didn't say this, it is very like something I know he said in "God or Caesar" about English not being a precise language. If you don't qualify what you mean very carefully, you will regularly be misunderstood. Fisher advised the writer to keep this in mind. It is one of the reasons for re-reading your own writing before sending it out into the world. On second reading—or third, or fourth—you will probably catch an ambiguity that you didn't think about when you were writing your letter, essay or story. You didn't see it when you were writing it because you knew what you meant, of course. There was no ambiguity in your mind, just in your careless choice of words, and that could only be recognized when you stood back and read your own words with more remove, if not quite the eyes of a stranger.