Friday, December 30, 2011

Debunking the Debunkers

Snopes.com has made its name debunking (and presumably occasionally bunking) urban myths and Internet circulars, those emails your friends (real and faux) send to you while you are trying to work.

One Snopes post from six years ago debunks a piece that has been in circulation for much longer. The Snopes article, at www.snopes.com/politics/religion/monument.asp, includes a more or less full copy of the circular email upon which it comments.

The circular tends to interpret the appearance of religious symbols surrounding the Washington Monument in the nation’s capital as evidence of the religious nature of our country and of the Founding Fathers, George Washington in particular. The Snopes commentary tends to undermine the circulator’s religious interpretation by showing, for example, that while there is a Bible in the cornerstone of the monument, there is also a long list of other books and objects, most of which have no religious significance. Or that one of the monument’s many engraved memorial stones, which the circulator claims was put there by Chinese Christians, was actually put there by Chinese citizens who did not leave any kind of Christian message at all. Or that the circulator’s claim that the placement of the monument in combination with the Capitol and other monuments forms a cross that was part of “the original plan of the designer, Pierre Charles L’Enfant” is belied by the fact that the Lincoln and Jefferson memorials were built long after L’Enfant’s time, took their present locations only after debate as well as adjustment to material conditions (the soil of the original site chosen for the Washington Monument was unsuitable for building), and “impose a cross upon the landscape” only by accident. (Ah, but the National Park Service admits that the prominent placement of the monuments and the Capitol does form a “great cross” along north-south and east-west axes originally found in the plans of L’Enfant; if the religious want to believe that this was “ultimately” no accident, there is no way to prove them wrong.)

Another quibble with Snopes: They argue that an alleged prayer by Washington was not actually a prayer but part of a letter to all of the governors of the “thirteen states,” that it was not written by Washington and that it has been altered to seem more like a prayer. (Snopes notes that the quotation was turned into a prayer not by the email writer but by someone who made a bronze tablet for St. Paul’s Church in New York where Washington was a parishioner during his presidency.) On each of these points we might quibble. Certainly the email circulator did not realize that the tablet was an alteration of the original text, but even granting that objection it is not entirely accurate to say that the quotation is not in any sense a prayer. The original paragraph begins, “Now I make it my earnest prayer….” Thus while the author of the letter is not at the moment praying, he does report his prayer.

As to whether General Washington wrote the letter himself, or it was rather the creation of his aide, Col. David Cobb, I can add something further to consider. In the 1990s, I saw a televised conference of Washington scholars meeting at Mount Vernon. One of them presented his discovery that President Washington characteristically wrote all correspondence himself and then had a secretary rewrite each letter so that it appeared in another’s hand; Washington would then sign the letter and have it mailed, thereby deliberately creating the impression that he was not micromanaging everything that went on in his administration when in fact he was doing just that. It is hardly improbable that Washington established this practice by the end of the Revolution when the letter in question was written. BTW, I put “thirteen states” in quotation marks because Vermont had already been carved out of New York State by the end of the Revolution, though it was not formally admitted as the fourteenth state until 1791. (Of course, perhaps Washington snubbed Vermont for some unknown reason.)

The email circular ends with a declaration that “Washington’s America” was “established under the guidance, direction and banner of Almighty God, to whom was given all praise, honor and worship by the great men who formed and fashioned her pivotal foundations.” Snopes slaps this down by pointing out that the Washington Monument was a nineteenth-century project and that its religious messages (whose significance Snopes could undermine but not deny altogether) reflected the later century rather than the sentiments of the eighteenth.

Not so fast. The circulated email might be inaccurate on many points, but this one is debatable. The Founders, by and large, were, indeed, religious men. Snopes reports that it is difficult to find references to Jesus, by name, in Washington’s writings, but references to “Providence”—by which all of the Founders who used that word meant “Almighty God”—are plentiful in Washington’s pronouncements. Many of the Founders—including Washington—clearly saw the creation and development of the United States as a sacred venture.

Does Snopes.com betray a liberal bias as some complain? At least it would seem that Snopes follows the liberal party line on the issues brought up in this case. Liberals generally have an animus against religious belief, especially a personal, passionate faith grounded in traditional values. Liberals buttress their bias by maintaining that history shows the Founders to have been deists whose faith was cool and intellectual and did not play up traditional Christian faith in Jesus Christ. This view is not entirely justified by historical fact. Some of the Founders had their doubts about questions of traditional religious faith, and they were often liberal in their ecumenical acceptance of other sects of Christianity and even of Judaism, but most of them were church-goers and possessed a palpable degree of passion in their trust of what they so often called Providence. This was no small or unimportant thing to them; many of them often expressed the feeling that they were embarked on a dangerous adventure that could only be successfully navigated under divine protection and guidance. It was just this kind of concern that led Benjamin Franklin—not a man whose religious views were entirely conventional—to complain about the absence of any reference to God in the proposed Constitution of 1787.

I am reminded of Bernard Goldberg’s observation in his book, “Bias,” that liberals do not think they are being liberal, just reasonable. The “knowledge” one has received at the temples of American liberalism—originally the universities but now almost every public school—and kept up with by reading the daily press, seems reasonable. But too often this “knowledge” is fully wrapped in a coating of shiny ideological bias, so familiar that its consumers cannot recognize it for what it is.

Friday, December 16, 2011

"All Men are Born Free and Equal"

Some politically minded thinkers have suggested that the founding generation of the United States did not mean to include African Americans when they declared, in Thomas Jefferson’s immortal phrase, that “all men are created equal,” and that they saw no contradiction between uttering these words and owning slaves.

It is difficult to believe that any thoughtful person can think this, especially if they know the actual history. How do they explain the fact that within five years of Jefferson’s declaration, two states ended slavery and other states soon followed until the northern states had either abolished it or set slavery on the road to abolition? (Even South Carolina’s legislature was forced to vote on a bill that would have abolished slavery, but, of course, it did not pass.) Jefferson himself had mixed feelings about the institution, saying, “I tremble for my country when I reflect that God is just, that his justice cannot sleep forever. Commerce between master and slave is despotism. Nothing is more certainly written in the book of fate than these people are to be free.” Because of Jefferson’s qualms about slavery, Alexander Stephens, the vice-president of the Confederate States of America, later declared that no Southern gentleman should read Jefferson.

Benjamin Franklin was remarkable for being one of the most forward-thinking Founders despite being one of the oldest. At age 40, he owned a couple of slaves; at age 80, he not only no longer owned slaves but had become president of the Pennsylvania Society for Promoting the Abolition of Slavery. On behalf of this society, Franklin submitted a petition to the First Congress of the United States asking that the government come up with a way of legally ridding the nation of slavery, “removing the Inconsistency from the Character of the American People.”

Perhaps nothing better illustrates the burgeoning awareness of an “Inconsistency” between liberty and slave ownership than the case of Massachusetts, where the new state constitution of 1780 was publicly proclaimed, making citizens of the commonwealth fully aware that “All men are born free and equal, and have certain natural, essential, and unalienable rights; among which may be reckoned the right of enjoying and defending their lives and liberties.”

The conventional wisdom in some quarters is that the thought occurred to no one at the time that these high-sounding words had anything to do with the rights of those held in slavery in the newly-minted states. But the thought did occur to a slave woman known as Bett who was held by John and Hannah Ashley of Sheffield, Mass., when she heard the new constitution read in public. Further, she shared this thought with an attorney named Theodore Sedgwick*, who was convinced that Bett was onto something. He argued the case for Bett and another slave named Brom in county court in August 1781, and a jury declared that the Ashleys had no right to hold Brom and Bett in bondage under Massachusetts’s new constitution.

After a subsequent anti-slavery case, Walker v. Jennison, was decided by the Massachusetts Supreme Judicial Court in the slave’s favor (citing the Brom and Bett case as a precedent), slave owners in Massachusetts were on notice that their so-called property rights over their fellow human beings would no longer be upheld by the courts. They were forced to prepare to free their slaves or, at least, to upgrade their slaves’ status to that of indentured servant, which, while not an immediate improvement, carried the promise that servitude could not be enforced perpetually or be forced upon an indentured servant’s children. By 1790, there were virtually no slaves left in Massachusetts. Notably, the commonwealth never officially abolished slavery, but the constitutional argument was used to undermine the institution and force its rapid demise.

All of the northernmost states took steps to eliminate slavery by 1800. This did not mean that the end of slavery in all of these states actually coincided with the end of the eighteenth century. Some states dragged their feet, doing little more than prohibiting or restricting the acquisition of new slaves. Thus, on the eve of the Civil War, there were still a couple of elderly slaves living in New Jersey (according to the 1860 census). Delaware and Maryland, not being culturally northern states, clung to slavery far longer: Maryland did not abolish it until late in the war, in 1864, and Delaware held slaves until the Thirteenth Amendment was ratified in 1865, after the Civil War had ended. (There are records of Union officers from states such as Delaware marching off to battle with their slaves in tow.)

* * *

Bett, upon gaining her freedom, changed her name to Elizabeth Freeman and became a wage-earning servant for the family of her former attorney. She is buried beside her close friend, Judge Sedgwick’s daughter, Catharine, and is, I believe, the only African American buried in the Sedgwick family plot in Stockbridge, Mass.

Catharine Sedgwick recorded Elizabeth Freeman’s memoirs. (Freeman herself was illiterate.) Occasionally, Sedgwick appears to have tried capturing the flavor of Freeman’s speech:

“Any time, any time while I was a slave, if one minute's freedom had been offered to me, and I had been told I must die at the end of that minute, I would have taken it—just to stand one minute on God's airth a free woman— I would.”

As an amateur linguist, I am struck by what I presume is Sedgwick’s transcription of “earth” as “airth.” It could very well say as much about the speech of Sedgwick as it does about that of Freeman. It is my understanding that “airth” may be an old alternative spelling of “earth,” though no one is certain of this. By 1780, the modern spelling of the word was standardized, and as a highly educated woman from a highly educated family, Sedgwick would have known it. Apparently, she spelled the word “airth” in order to suggest her friend’s pronunciation. If so, then Freeman’s pronunciation differed from Sedgwick’s own, which was perhaps very close to our modern pronunciation.

In Old and Middle English—the languages of “Beowulf” and “The Canterbury Tales,” respectively—“airth” would probably be, to our modern eyes and ears, a close approximation of how “earth” was pronounced. While the pronunciation of courtiers and other educated classes gradually evolved toward something like modern educated speech, regional dialects tended to hold on to such pronunciations as “airth” for “earth.” If I am correct, then, Catharine Sedgwick is unwittingly telling us that she herself said “erth” while Elizabeth Freeman said “airth.” The stratification of American language along class lines—but often based on region of origin in the Old World—would continue throughout the history of the United States. Of course, while some African Americans’ speech probably bore some African influences, especially if the speakers had actually been born in Africa, the kind of English that African Americans learned from lower-class Englishmen often constituted an even heavier influence. That appears to be what happened in the formation of Elizabeth Freeman’s dialect.

* Trivia: Theodore Sedgwick is the great-great-great-great grandfather of actress Kyra Sedgwick, star of one of my favorite TV series, “The Closer.”


Dec. 19 - Actress Meryl Streep told the story of Elizabeth Freeman last night on CBS's "60 Minutes." She thought, as many do, that Freeman was defending her sister when she was struck with a hot metal object and suffered serious burns at the hands of her mistress, Hannah Ashley. Apparently there is anecdotal evidence that Freeman was protecting her sister, but more solid evidence suggests that it was her daughter she was defending. Further, there is no evidence outside the anecdote that Freeman even had a sister.

Streep wants to help found a national women's museum, and it would undoubtedly include an exhibit on Freeman.

Monday, December 5, 2011

Words Spelled the Same but Pronounced Differently; Other Notes on English

Read (to read; pronounced like "reed") – A present tense verb involving the recognition of the meaning of written words or the act of translating written words into spoken ones (reading out loud). Sometimes used as a noun, as in, “The mystery novel was a good read.”

Read – The past tense of “to read.” To add to the confusion, it is pronounced identically to the color “red.”

Wind (the vowel is pronounced similarly to that in the word "win") – Noun meaning the movement of air, especially meteorologically, but its meaning also extends to the breath or even flatus.

Wind (to wind; the vowel is pronounced similarly to that in the word "wine")  – Present tense verb meaning to twist or turn something, as a watch stem or string or thread around a spool. By extension, something can be said to turn or twist over the terrain, such as “The Long and Winding Road,” the title of one of Paul McCartney’s songs.

Wound (the vowel is pronounced similarly to that in the interjection "wow") – The past tense of the above verb “to wind.” Verbs that form their past tenses by changing their vowels are technically called “strong verbs”; they are ancient forms that resist the more modern simplification of keeping the present-tense stem and simply adding “-ed” to make the past tense.

Wound (the vowel is pronounced like the old word for courtship, "woo") – An injury such as a cut, puncture or other damage to the flesh. It can be extended to include psychological hurt.

Close (the "s" is pronounced as a true "s") – An adjective that describes one object that is near another.

Close (the "s" is pronounced more like a "z") – The present tense of a verb meaning to shut, stop or suspend something. “Please close the book.” “Close the swimming pool for the rest of the year.” It can also be a noun: the end of something such as an event. “At the close of the festive evening, everyone went home.”

Present (pronounced PREzent) – Adjective describing the condition of being here, now. Noun: The time experienced right now. The same pronunciation is used for the noun describing an object given to someone. “Thank you for the Christmas present.”

Present (preZent) – Verb: to show or introduce something or someone to another person. “Your Highness, may I present the Count of Monte Cristo.” To introduce or make something known. “He presented his symphony to the public for the first time on May 7, 1824.”

Record (REcord) – A written document or vinyl disk for playing music. Adjective describing a noteworthy event. “The temperature will reach a record low.”

Record (reCORD) – The act of making a written document or registering sound or events by a device such as a tape recorder or seismograph. Notice that in the cases of “present” and “record” we mark each as a noun or adjective on the one hand and as a verb on the other by emphasizing the first syllable for noun/adjective and the second syllable for the verb. So “reCORD” is what we do in order to make a “REcord.”

These words are pronounced differently for different reasons although it mostly boils down to sound change over the history of the language. In Old English, the accent was always put on the root syllable. French does not follow such a rule, so the introduction of French words into English changed that expectation. Indeed, “present” and “record” were derived from French. So, although we spell many words the same way, we now change the accent to change their meaning thanks to the influence on English of continental languages such as French and Latin.
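For readers who like to see the pattern laid out mechanically, here is a minimal Python sketch of the stress rule described above. It is only an illustration of my own; the word list and the informal respellings are mine, not drawn from any dictionary: nouns and adjectives take first-syllable stress, verbs take second-syllable stress.

# Illustrative sketch: initial-stress nouns/adjectives versus final-stress verbs
# for the heteronym pairs discussed above. Respellings are informal, not IPA.

HETERONYMS = {
    # word: (noun/adjective pronunciation, verb pronunciation)
    "present": ("PREZ-ent", "pre-ZENT"),
    "record":  ("REK-ord",  "re-KORD"),
}

def pronounce(word, part_of_speech):
    """Apply the stress-shift rule: first-syllable stress for nouns and
    adjectives, second-syllable stress for verbs."""
    noun_form, verb_form = HETERONYMS[word.lower()]
    return verb_form if part_of_speech == "verb" else noun_form

print(pronounce("record", "noun"))   # REK-ord  (a document, a vinyl disk)
print(pronounce("record", "verb"))   # re-KORD  (to register sound)
print(pronounce("present", "verb"))  # pre-ZENT (to introduce someone)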

Have you ever noticed the words in English that change their sounds when the word changes its function (a process technically called “inflection”)? For example, take the verb “to go.” Today I go. Yesterday I went. “Went”? It’s not the same word; “went” doesn’t have a single letter or sound in common with “go.” In fact, “went” began as the past tense of the old verb “to wend” and was borrowed into the paradigm of “go,” a phenomenon linguists call suppletion. But wait, what happens when I have gone? “Gone”? Why is it pronounced “gawn”? After all, the way we spell it, it should rhyme with “bone,” but it doesn’t. How come?

Partly it has to do with the influence on the vowel of the sound of the “n” after it. The “n” sound keeps the “o” from being a long sound. Originally, the “o” in “go” was pronounced the same as the higher and shorter “o” in “gone,” but as the sound of “o” changed in most of the words in English (part of what is called the Great Vowel Shift), it changed more in “go” than in “gone.”

The Great Vowel Shift began late in the period of Middle English (spoken roughly from the eleventh or twelfth century until the late fifteenth) and continued into Early Modern English. Middle English had five major dialects, each of which pronounced words quite differently (or even possessed completely different words for the same concept). While in part of the southern Middle English area the word “go” was pronounced more or less the way “gone” is now, northern Middle English speakers still pronounced it the way they had in Old English, which was more like “gah” than “gaw.” They also pronounced “bone” as if it were “bahn.” (Like someone from Boston pronouncing “barn.”)

These regionalisms – both of words and their pronunciation – led to confusion as England became united as a single country and more or less unified as a culture. Either one dialect had to be raised as the standard or features of several dialects needed to be blended together. English spelling and vocabulary might have been more regular if the first course had been followed, but it was pursuit of the second one that gave us the hodgepodge that is English today. For example, in the southwestern dialect of Middle English, people said “vox” instead of “fox,” and their word for a female fox also began with a “v” sound: “vixen.” Standard English now has “fox” but we also use “vixen.”

Many common English words, such as “their” and “horse,” come from the northern Middle English dialect. The thing that really becomes confusing is that each dialect was written so that it had its own spelling system. When the dialects were blended to create one English language, more or less near the end of the fifteenth century, we kept the various dialectal spellings of the words without striving for uniformity. In some cases, words had to be repurposed or a new distinction had to be made. For example, “shirt” is southern English and “skirt” is northern. Originally, they meant the same thing, but since both came into the standardized English language together, each was given a different specific meaning, even though they both still mean an article of clothing.

Wednesday, October 19, 2011

230th Anniversary of the Siege of Yorktown

The Siege of Yorktown, the final major battle of the American War of Independence, came to an end 230 years ago today with the signing of the Articles of Capitulation on October 19, 1781. At the formal surrender ceremony, British General Charles Cornwallis pleaded illness and sent his second-in-command, Brigadier General Charles O’Hara, to offer his sword; Washington in turn directed his own second-in-command, General Benjamin Lincoln, to accept it.

I do not know what Cornwallis would have said to Washington had the two met face to face that day, but he might have pointed out that the British won most of the battles of the war; to which Washington might have replied that they had lost the last one.

Washington himself had been technically defeated in battle again and again. In my view, his true talent as a field commander was his ability to recognize when he had blundered into a dangerous situation and to admit the mistake quickly enough to organize an orderly retreat and save his army for another day. At least twice, junior British officers tried to prod their generals into pursuing the rebel army and destroying it, but the generals always seemed to notice that it was tea time and so they broke off the pursuit until the next day, by which time Washington was always long gone.

The British war effort during the War of Independence was rather dreadful. The high command in London consisted of Lord George Germain. If you look in a dictionary for the term “upper-class twit” there should be a picture of Germain. If he could not develop an effective strategy for the war, he was at least supposed to manage communication between the generals in the field, but he failed even at that, never telling one general that another general needed his help, so that the war was completely uncoordinated. After this sort of incompetence led to the American victory at Saratoga, New York, in 1777, the French decided to send actual military aid in the form not only of money and weapons but troops and ships as well.

So it was that in the fall of 1781 there were fifty-one French ships, including thirty-three warships, off the coast of the American colonies and converging on the Chesapeake Bay. Meanwhile, the British, having just sent a fleet home to England from the Caribbean, had only nineteen warships in the vicinity. On September 5, the French won a sea battle off the coast of Virginia. It became clear to Cornwallis that the British were not going to relieve his army at Yorktown, leading to his surrender the following month.

Wednesday, August 10, 2011

An Inapt Analogy

I used to hear people say, “If they can put men on the Moon, why can’t they feed the world?” Perhaps you don’t hear that anymore because a more precise rendition of the question contains its own answer: “If the government can put men on the Moon for two weeks (the total time logged on the Moon during six missions by American astronauts over 41 months, from 1969 to 1972), then why can’t the government feed the world?”

Doesn’t the answer become clear? Human beings cannot eat six times over a 41-month period and then go for more than forty years without eating at all.

Monday, July 18, 2011

PREVENT THE REVOLUTION

Today I saw that bumper sticker that says,

"Stop Bitching And Start The Revolution"

To which I reply:

99 Percent Of Revolutions Lead To Dictatorship.
99 Percent Of Revolutions Yield Poverty And Famine.
99 Percent Of Revolutions End In Mass Murder And Genocide.
Don't Like Those Odds?
KEEP ON BITCHING,
AND PREVENT THE REVOLUTION.

Saturday, June 18, 2011

Ignorance About the Declaration and Constitution

We are apt to assume that Americans’ inability to tell the United States Constitution from the Declaration of Independence is strictly a contemporary problem, but it has existed for at least a few decades, and I am afraid it is a much older problem than we think. Still, I believe that, say, fifty years ago, most high school graduates and certainly most college graduates knew the difference, while, according to a recent survey, only twelve percent of high school seniors are able to identify what those documents are. I doubt that today's college graduates could do much better.

Partly this might be explained by the fact that at one time fewer people graduated from high school, let alone college; however, I do not think that can explain the difference in knowledge, because the rate of high school graduation has, if anything, increased from the 1930s to the present. Also, there was a push in the 1950s to teach American history with an emphasis on the founding. That push had come to an end, however, by the 1970s.

American history is no longer being taught with the emphasis on things like the difference between the Constitution and the Declaration. I know that the documents are being taught in some school systems even at the elementary level, but I do not know for how long this has been so at those schools and whether this reflects a wider trend. Naturally, for any real understanding to develop, a subject must be taught at higher levels than the elementary. Second and third graders are not prepared to grasp very much about the meaning and context of these two documents. If they never hear about them again, they can be forgiven for not being able to tell the difference.

Those who cannot be forgiven for the ignorance of our children and the adult citizens they become are the educators and, especially, those who decide what goes into the curriculum and textbooks and what is left out of them.

Friday, June 17, 2011

Do We Dream to Forget?

The late Francis Crick’s thesis was that we dream in order to forget. That is, we are not supposed to remember our dreams but, rather, we are supposed to forget them. Was he right or at least onto something?

No, simply because we do NOT forget our dreams. They do not go away, as he suggested, even if we do not consciously recall them.

Recurring dreams, for example, recur precisely because we remember them (unconsciously).

The fact that people who pay attention to their dreams come to remember them demonstrates that all of our dreams are stored in our brains, just as our waking memories are.

The interesting question is whether dreams and waking memories are stored in different places or in the same place. Is there something different about the storage of waking memories and dreams that explains the difference in their recovery? Perhaps the difference has less to do with storage than it has to do with access (how we remember dreams versus how we remember waking memories), but I am not sure what that means.

Tuesday, June 14, 2011

Who Is John Hospers?

John Hospers died at age 93 in Los Angeles, Calif., on June 12. He was the first Libertarian Party candidate for president of the United States and got one electoral vote in 1972 when a renegade elector from Virginia, who had been pledged to Richard Nixon, instead voted for Hospers and his running mate, vice-presidential candidate Theodora "Tonie" Nathan, the first woman to receive an electoral vote.

Hospers was at one time head of the philosophy department at the University of Southern California and the author of the textbooks "Meaning and Truth in the Arts," "Introduction to Philosophical Analysis," and "Human Conduct." He had befriended Ayn Rand, author of "Atlas Shrugged," in 1960, but the two eventually had a falling out.

My favorite story from his presidential campaign (he was on the ballot in only two or three states) is about his reply to a reporter who asked, "If elected, what will you do for me?"

Never one to lose sight of his libertarian message, Hospers replied, "I'll leave you alone."

Monday, June 6, 2011

Never Said It and Doesn't Exist

I Never Said That!

“I invented the Internet.”

--Attributed to Albert Gore. He never said it, although he took credit for helping to pass legislation that kept the Internet from being overregulated back in the 1990s.

“Mission accomplished.”

--Attributed to George W. Bush, but he never said it. He made a speech that was televised from the deck of a ship that was being brought back from the Middle East because the SHIP’s mission had been accomplished.

“I can see Russia from my backyard.”

--Attributed to Sarah Palin, but she never said it. Professional Palin impersonator Tina Fey said it.

Do Not Exist--or Do They?

Ask the relevant governments and they'll tell you that the following do not exist:

Seal Team 6, a.k.a., DEVGRU or U.S. Naval Special Warfare Development Group (NSWDG). This unit of frogmen goes by all those names and yet doesn't exist. But everyone knows they killed Usama bin Laden, and there are movies about them.

Area 51. U.S. secret testing area near Groom Lake, Nevada, where stealth technology and so much else was invented. There's also an Area 52, but it doesn't exist, either. Yet there is a new book out: "Area 51: An Uncensored History of America’s Top Secret Military Base," by Annie Jacobsen.

MI5 and MI6. United Kingdom intelligence services. The Crown used to deny their existence officially, although now they have their own websites: https://www.mi5.gov.uk (MI5) and https://www.sis.gov.uk (MI6).

Sunday, May 15, 2011

Rasha Limbo and Going the Way of France and Italy

Rasha Limbo

I was listening to the Teaching Company’s lecture series “Understanding Linguistics: The Science of Language,” and in Lecture 22: “Languages Sharing the World--Bilingualism,” Professor John McWhorter cites a study in which Russian-Americans were asked to tell how the same sentence might be said by different individuals. The topic of Rush Limbaugh came up during interviews.

The interviewer found that older immigrants who had come to America as adults rendered the sentence “I know another person who listens to Rush Limbaugh” something like this (I am spelling this phonetically with English pronunciation in mind, and I am certainly screwing up some of the words):

Ya zna-you eshch-yo odnovo chyel-o-vyeka kotoree slusha-yet Rasha Limbo.
(Lit.: I know one other fellow who listens to Rush Limbo.)

Second-generation Russian-Americans, who were born in the United States, would be more apt to say:

Ya zna-you dru-goi chyel-o-vyek ee on slusha-yet TO RUSH LIMBAUGH.
(Lit.: I know another fellow and he listens "to Rush Limbaugh.")

Aside from different word choices, the younger Russian-American also switches to English when he says “to Rush Limbaugh.” The older people said “Rasha” not because they can’t pronounce “Rush” but because Russian nouns are declined like Latin ones and “Rush” in the genitive (possessive standing in for accusative or direct object) or, perhaps, the dative* (indirect object) case becomes “Rash-a.” The younger people don’t want to—or don’t know how to—change the name to another case, so they just switch to English and say “Rush Limbaugh” with the preposition “to” in front of it. (There is no preposition in the pure Russian version because changing the noun to another case implies the meaning “to.”)
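To make the grammar concrete, here is a rough Python sketch of my own; the simplified, transliterated ending table is an assumption for illustration, not anything from Professor McWhorter's lecture. It shows why the name surfaces as “Rasha”: for an animate masculine noun, the accusative (the case governed by the verb for “listens to”) borrows the genitive form, while a true dative would sound different.

# Rough, transliterated illustration of hard-stem masculine declension.
# The ending table is simplified and covers only the cases mentioned above.

ENDINGS = {
    "nominative": "",   # Rash
    "genitive":   "a",  # Rasha
    "dative":     "u",  # Rashu
}

def decline(stem, case, animate=True):
    """For animate masculine nouns, the accusative (direct object)
    borrows the genitive form; for inanimate ones, the nominative."""
    if case == "accusative":
        case = "genitive" if animate else "nominative"
    return stem + ENDINGS[case]

# "... slusha-yet Rasha Limbo": the name is the direct object of "listens to."
print(decline("Rash", "accusative"))  # Rasha
print(decline("Rash", "dative"))      # Rashu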

It is interesting that Rush Limbaugh is a topic of conversation in the Russian immigrant community, although it is nothing new to listeners of his radio program who might remember that Russian immigrants have called into his program now and then over the twenty-three years that it has been on the airwaves.

Going the Way of France and Italy

When Americans make fun of the French and Italian military it makes me queasy, not because I am particularly attached to or sympathetic to the French and Italians, but because I know enough about history to know how the two nationalities reached their low military reputation and how far they fell, and I know therefore that history—or more properly I should say fate—is fickle enough that the same decline that happened to them could happen to the United States.

In the mid-nineteenth century, France and Italy were respectable military powers—or, rather, parts of Italy were, since Italy was not unified into one nation until the 1860s. Indeed, during the first century of America’s premier military academy, West Point, cadets were thoroughly drilled in French because they were expected to study the strategy of Napoleon Bonaparte in the original language. When France and Prussia went to war in 1870, the world assumed that either the French would win or it would be a stalemate. That the French were massacred is no surprise to us today, but it was to contemporaries, other than some of the Prussians who, of course, anticipated a win. The French have simply never recovered their reputation as a great military power. It has been forgotten and replaced with ignominy.

Similarly, Italy’s reputation as a military power was drained away, never to be regained. Perhaps the key moment came during World War I. The Italians’ commander-in-chief, Luigi Cadorna, was a martinet who was good at giving orders but no good at strategy and tactics. He sent armies of brave Italians to their deaths and lost battle after battle. Eventually, the Italian soldiers figured out the obvious: their commander did not know what he was doing and, if he ordered them to march forward, the wisest course was retreat. The shame ought to be on this individual commander, but instead, the lasting shame has been unfairly visited on the whole Italian military and, indeed, the Italian nation.

The lackluster performance of the Italian military under Mussolini during World War II served to worsen an already bad reputation. (This time the excuse was that no one in Italy, arguably including Il Duce himself, was that enthusiastic about entering World War II to begin with, and the conflict turned into a civil war in which Italians may have killed Italians more often than they killed Yanks, Brits or Nazis.)

I can only conclude that we Americans dasn’t be so smug as to assume that the same ignominy could not befall us. It is not as much of an exaggeration as one might think to say that all it might take is another few years of President Barack Obama or his ilk to make the United States another laughingstock in the world ranking of military prestige.

* Professor McWhorter says it is genitive standing in for accusative, and I should bend to his expertise; it is just that I know Russian is not necessarily one of his languages, and he might be wrong, in which case my guess that it is dative might be right after all.

Monday, May 2, 2011

Now, Back to Libya, Which is Already in Progress

I have been in drone mode, I guess, lulled into assuming that official pronouncements about what is happening and what is supposed to happen in Libya are substantive and not examples par excellence of what George Orwell said all political speech is: intended to give the illusion of solidity to pure wind.

We are told that nobody is going to put European or American “boots on the ground,” but it is obvious now and should have been obvious from the start that there are only two alternatives:

1. American troops will go to Libya and take out Muammar Gaddafi’s military and then the man himself; or else,

2. Gaddafi will continue to rule Libya until he dies of natural or self-inflicted causes, and he will exact his vengeance on Europe and America.

There are no other alternatives. Troops will either be used against Gaddafi or we will have to live with his return to Libya’s state-sponsored terrorism of years past—the terrorism that brought you the Lockerbie bombing of twenty-two years ago.

This should have been obvious because it became clear during World War II that air power alone, using conventional bombs, can never bring a country to submission—unless, of course, you use nuclear bombs. At some point you have to send in the troops or else resistance will continue. (In 1921, the Italian military theorist Giulio Douhet published “The Command of the Air,” in which he argued that you CAN bring a country to its knees simply by dropping conventional bombs on it. The German Luftwaffe was very impressed by this theory and tried it out on Britain. It didn’t work, and the theory has been thoroughly discredited.)

It might seem surprising, given what I have said above, but I am not advocating that we send troops into Libya. I am saying that this is exactly the moment when we might still have a chance to step back from the brink and say, leave Gaddafi alone for the moment. We arguably already missed our moment when we could have eliminated him with surgical precision; every option from now on will just get messier and messier. One benefit to leaving him alone—though not a clean solution since Gaddafi himself will remain a problem—is that the Muslim Brotherhood would not gain power in Libya as they have in Egypt and look to be gaining elsewhere in the Middle East.

You Say Usama and I Say Osama

Some might have noticed that Fox News last night called the Dead Head in Abbottabad “Usama” instead of “Osama.” Lest anyone read anything into this, “Usama” is just the proper classical Arabic pronunciation and transliteration of bin Laden’s given name. There was no equivalent of “O” in classical Arabic. In Saudi Arabia, where bin Laden grew up, and where the modern dialect of Arabic does have the “O” sound, people probably did call him “Osama.” But the Quran is written in classical Arabic, and that book arguably exercises even more control over the conception of what is “correct” in the minds of Arabic speakers than the works of Shakespeare and the King James Bible do over the minds of English speakers. So calling him “Usama” instead of “Osama” is oddly respectful and proper. Of course, pronouncing it “Usama” instead of “Osama” also helps you keep from saying “Obama” when you mean “bin Laden”; better still, everybody should simply say “bin Laden” instead of “ ’sama” so as not to say the wrong name.

Thought of the Day on Afghanistan

"The only thing worse than staying in Afghanistan, would be what will happen when we leave."

--former Secretary of State Lawrence Eagleburger

Is that a valid conundrum or an excuse for continuing to do something stupid, like sticking your hand in a bear trap so that the only way to get it out is to risk sticking your other hand in?

Proposed Bumper Sticker

You rid us of Osama bin Laden,
But now, Obama, ban Biden.

Sunday, May 1, 2011

Scientific and Technological Advancement Not Promoted by War

According to Wikipedia, the first kidney dialysis machine, or dialyzer, was invented by a Dutch physician during World War II. Since his country was occupied by the Nazis at the time, the good doctor was unable to build an effective dialyzer from the materials available to him. Most of his dialysis patients died until 1945, when a woman in a coma caused by kidney failure awoke after treatment and went on to live for another seven years.

In 1946, American comic actor John “Rags” Ragland died from kidney failure, three days shy of his forty-first birthday. Whether dialysis was available to him at that time, I do not know; his condition was evidently advanced. But I wonder whether the new technology might have saved him if it had been developed during peacetime and so had been both more available and more advanced. Instead, that advancement had not occurred, because war trumped the development of this healing invention.

This is simply another example that puts the lie to the common myth that wars cause advances in technology. At the least, such a notion has to be taken with a large grain of salt. Obviously, kidney dialysis was held back by the war, although it is conceivable that if an American doctor instead of a Netherlander had invented it, there might have been more rapid development since American medical researchers might have been able to overcome wartime shortages by arguing that dialysis was important enough to justify additional resource allocation. Those are a lot of “mights,” however. And certain kinds of medicine are more interesting to wartime planners than to the peacetime marketplace. They might not have given the green light to funding non-war-related medicine. War always involves the reallocation of resources by the government, and that is what usually retards advances in technology for peaceful purposes. (It is significant in light of that that early advocates of social engineering through government intervention spoke of their wish to create a peacetime cause as potent as war; people could be made to follow government edicts during war far more easily than during peace.)

Television is a classic example of a technology that did not develop as it might have due to war. Invented by an American in 1927, fully electronic analog television languished in the United States largely for economic and geographical reasons, though for legal reasons as well, but it got off to a start in Britain where conditions were more favorable. By 1939, there were more television sets per capita in England than anywhere else in the world. The BBC unceremoniously ceased television broadcasting, however, when Britain declared war on Germany. In the same year, the Farnsworth Corporation and Radio Corporation of America (RCA) finally settled their patent issues, but the continuing Great Depression and the subsequent entrance of the United States into World War II held back the proliferation of television stations as well as further marketing of television sets until after the war.

One of the main wartime uses of already developed television technology was the manufacture of radar screens, which are essentially like television screens. At the outset of World War II, the British had already built the most advanced radar defense system in the world. The Germans evidently never had a clue as to how significant this was. The British “Chain Home” radar network was largely responsible for defending Britain against the Luftwaffe during the Battle of Britain.

During the years preceding World War II, British scientists had been studying radar more intensely than most of the world’s radio technology researchers. They even developed a device called the “cavity magnetron.” This was the world’s first microwave device. Key to my point here, the cavity magnetron, like the television screen that became the wartime radar screen, was invented before the war, not during it. It is always the trend during wartime that technology invented before the war is repurposed for war while all peaceful uses are put aside until after the war. So it was with the cavity magnetron. Afraid that Germany would invade Britain and that this microwave device could fall into the wrong hands, Britain gave its two prototype magnetrons to the United States. It is an interesting side note that the conveyance of the device to the Massachusetts Institute of Technology was done so secretively that the scientists at MIT who received the device were given very little explanation of what it was and how it worked. One of them finally came up with a clever explanation by analogy. The cavity magnetron is like a whistle: a whistle works by confining air within a space so that it is forced to change frequency; in the same way, the magnetron changes the frequency of radio waves. Out of the work at MIT, a light radar system was developed. Up until this point, radar had depended on large towers or shore-based installations used by the navy. The magnetron made possible smaller radar systems that could be placed on board aircraft for the first time. The British and Americans now were able to bomb Germany with radar-equipped planes that did not need to see their targets. They could fly at higher altitudes at night and still destroy strategic enemy sites. It was not until the Germans shot down some of the Allied bombers and reverse-engineered the microwave radar that they were able to use this technology themselves. By then, it was too late for Germany, however.

The technology developed during World War II—from television to radar to rockets to nuclear fission—had been invented before the war. Governments commandeered it and focused on its war applications. It was not until after the war that peaceful uses of these technologies could be realized. It is likely that the marketplace, in the absence of war, would have brought about the peacetime world of advanced technology a decade earlier. For example, the widespread use of television for communication and entertainment that we knew after 1947 would undoubtedly have arrived well before 1945. (It already had in Britain where, by 1939, the BBC was broadcasting news and entertainment on television six hours a day, six days a week; at the same time, the few American television stations were broadcasting about three days a week and fewer hours a day.) As it was, it is noteworthy that the post-war trend was for the price of a television set to gradually go down in the United States while it gradually went up in Britain. This is because the post-war economy of the United States boomed while the British economy declined. To a large extent, this was a symptom of the fact that the British government, after the war, maintained wartime rationing and prevented a booming recovery like the one that occurred in America. From this we can see that war is not the only thing that retards technological development; rather, anything that retards economic growth, as well as anything like war that siphons off resources, will tend to retard technological advancement.

Saturday, April 23, 2011

Banning Easter Unfair to Christians and Pagans Alike

The celebration of Easter is as verboten in some places as Christmas is, not only in the public square but in public schools as well. Recently, a school told children they could not have Easter eggs, but they could have "Spring Spheres," even though eggs are not technically spherical. I guess that means the Easter Bunny is persona non grata and we had better not see any Easter lilies on school grounds.

It is all part of the progressive policy of knocking Christian festivities down a peg in order to achieve fairness after so many years of Christian hegemony in our society. Now it is OK to proclaim Islam, paganism, Wicca--and perhaps even wiki--from the rooftops, but not Christianity. (There is even some difference of opinion in progressive circles about whether and to what extent Judaism should remain safe from the politically correct.)

No one seems to be mindful of how the suppression of Easter is oppressive to pagans. After all, Easter is only belatedly a Christian festival. It was originally a pagan one. I mean, read your New Testament: there is no mention of the Easter Bunny or Easter eggs there. These are pagan customs. Even the word "Easter" is a pagan name. It was a spring festival associated with a female deity who ushered in the growing season.

I am tempted to become a druid just so that I can protest society's oppression of pagans as well as Christians in this culture war against Easter. Christians and pagans should make common cause over our rights to our separate versions of Easter.

Thursday, April 14, 2011

Are people essentially good?

One of the most clichéd philosophical questions is whether people are essentially good or, perhaps, essentially evil. (One of my favorite sayings goes, “That there is a devil, there isn’t any doubt,/ But is he trying to get in us, or trying to get out?”) I have always leaned toward the view that people are essentially good. The key word in that declaration, however, is “essentially.”

People are also complex biologically and psychologically. (Indeed, this complexity is related to the unity of body and mind; the two are not merely intertwined and interdependent but functionally identical.) Any complex system is more vulnerable to disorder precisely because of its complexity. Disorders of many if not every kind are possible because of our complexity. Thus, human beings are easily corrupted. But that does not rule out the existence of essential goodness. Goodness is simply fragile, especially when it is based in innocence and its attendant ignorance. Experience will tend to corrupt.

It is one thing for a young child to be basically good, having had no experience of the world beyond what the psychoanalysts Fairbairn and Winnicott called “a good enough mother”—that is, a nurturer who makes all reasonable efforts to provide adequate care. It is quite another thing to grow up and make one’s way in the world beset by people who have not had good enough nurturing from parents and others. So many complex beings, corrupted by the weak and craven impulses of those around them, cannot do anything but challenge even the most well-adjusted people, tempting them to indulge even the smallest tendencies toward weakness and cravenness in themselves.

We humans might well be essentially good, but it takes more than essential goodness to prevent us from succumbing to evil. Goodness can only prevail if it is fostered by the intellect. It comes to be, as we grow up and grow old, that we can continue to be good only if we understand why it is better to be good and why it is wrong to succumb to evil.

Monday, April 4, 2011

Being From New England, One Expects This Sort of Thing

Sunday morning I took a book off my shelf and perused it. "Cloak and Gown" is about the Yale graduates who, disproportionately, populated the CIA and its World War II predecessor, the OSS (Office of Strategic Services). The book consists largely of a series of biographies of Yalies.

My eye fell on the one about Norman H. Pearson. I quickly became more interested in the account of Pearson's early years. He was born in Gardner, Massachusetts, in 1909, not far from Worcester, where I grew up. My father often drove around Worcester County as part of his work, and I remember that he often went to Gardner. Then I noticed that Pearson's mother's maiden name was Fanny Kittredge. I thought, wait, I am descended from Kittredges if you go back far enough (five generations, it turns out); so I spent the rest of Sunday--until I had to go to work--trying to work out whether Pearson and I are related.

We are indeed related. Seventh cousins, twice removed. That means that if you go back, my eighth great-grandparents are the same as his sixth great-grandparents. It also happens to mean that my grandmother (born 1903) was in the same generation as Pearson; so they were seventh cousins, period, not removed. BTW, Norman Pearson's uncle on his mother Fanny's side was Alfred B. Kittredge who, though born in the same New Hampshire town as Fanny, became a U.S. Senator from South Dakota. (And I thought I could write a post without reference to politics.)
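The degree-and-removal arithmetic can be checked mechanically. Here is a small Python sketch, assuming the usual genealogical convention: count the generations from each person back to the shared ancestral couple; the cousin degree is one less than the smaller count, and the "removed" number is the difference between the counts.

# Standard cousin arithmetic (assumes the usual genealogical convention).
# Generation counts: parent = 1, grandparent = 2, great-grandparent = 3,
# so an Nth great-grandparent is N + 2 generations back.

def cousin_relationship(gens_a, gens_b):
    """Given each person's generation count back to the common ancestors,
    return the cousin degree and the number of times removed."""
    degree = min(gens_a, gens_b) - 1
    removed = abs(gens_a - gens_b)
    return degree, removed

# My 8th great-grandparents are 10 generations back from me; the same couple
# are Pearson's 6th great-grandparents, 8 generations back from him.
print(cousin_relationship(10, 8))  # (7, 2): seventh cousins, twice removed
print(cousin_relationship(8, 8))   # (7, 0): my grandmother and Pearson, seventh cousins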

The migration from England to the Massachusetts Bay Colony during 1630-1640 was large only in comparison to the smaller but more famous arrival of the Pilgrims in 1620. Over the next several generations, this relatively small population burgeoned into millions, with extensive intermarriage among the members of the first families. The result is that if you are related to one of these families, you are related to several of them. If you are willing to go out to seventh, eighth, ninth or tenth cousins, everybody whose ancestors arrived in New England before 1700 is related to everybody else.

Thursday, March 3, 2011

The Inevitability of Comparisons to Hitler and Nazism

Placards carried by union demonstrators in Madison, Wisconsin, showed pictures of Gov. Scott Walker with a toothbrush mustache and a Hakenkreuz or swastika added to the photograph. The obvious suggestion is that Governor Walker is somehow like the infamous German dictator Adolf Hitler. One of the protesters explicitly compared Walker’s proposal to reduce state workers’ bargaining power to Hitler’s complete outlawing of the German labor movement. Not long before the demonstrations in Wisconsin, Egyptian strongman Hosni Mubarak was portrayed as Hitler by sign-carrying protesters in Cairo. None of this is new. During the run-up to passage of the health care bill in Congress, TEA Party protesters carried signs that compared President Barack Obama to Hitler, and at the height of fighting in Iraq a few years ago, anti-war protesters carried pictures depicting President George W. Bush as Hitler.

Whenever opponents of any government want to reach for an easy and powerful symbol of tyranny, the comparison to Hitler and his National Socialist (Nazi) Party is trotted out. Modern political rhetoric would seem to be at sea without the option of comparing an offending office holder to the most heinous political leader of the twentieth century. (Although the Soviet Union’s Joseph Stalin, China’s Mao Tse-tung, and Cambodia’s Pol Pot rival Hitler as callous and brutal mass murderers.) Only someone who has lived under the proverbial rock does not know that Hitler and the Nazi Party are inextricably associated with evil and tyranny in the popular imagination; so invoking them has become shorthand for the worst that can be said about a political opponent. If the comparison to Hitler is accepted in the mind of one’s audience, then it is an effective bit of propaganda. On the other hand, if it strikes the listener as too much of an exaggeration and deserving of condemnation, it will hurt the case of the person making the comparison.

The comparison is deemed worth risking because it is so powerful and, once you overlook the risk of backlash, it is convenient. Almost any position that any political official might take could be compared to one in the Nazi program. This actually becomes inevitable for a couple of reasons. Every government is at least superficially alike in some respects regardless of its constitution or lack thereof. All governments tend to preserve themselves and their power, all seek to limit the participation in the political process of those who would reduce government’s size and power or transfer it to a new political cast of characters. The measures that governments will take, the extremes to which they will go, vary from case to case, and that is what separates the Nazis from most governments. I am reminded of biographer and historian Edward Crankshaw’s evaluation of late nineteenth century German Chancellor Otto von Bismarck of whom he said that while other nineteenth century political leaders did highhanded and unethical things, one gets the impression that there were things that they would not do, extremes to which they would not go to get their way. In contrast, Bismarck seems to have had no such scruples. Whatever he wanted, he was willing to do anything without compunction to get it. His only calculus had to do with whether he thought he could get away with it.

Yet even Crankshaw recoils from comparing Bismarck to Hitler. Bismarck was not as bad as Hitler. He did not lead Germany to destruction, and one of the reasons is that he had more patience than Hitler: although Bismarck gambled as Hitler did, he did not overreach. (Arguably, however, he made possible Kaiser Wilhelm II and Hitler—the leaders who led Germany into World Wars I and II respectively—by setting the example of the strongman leader who does whatever he thinks is best without consulting the people through their representatives.)

The comparison of any opponent’s policy to any Nazi policy is convenient in the most cynical sense because the Nazis had policies on every conceivable issue and often were on both sides. Consider abortion, for example. The Nazis divided the world into Aryans, or Germans and some other northern Europeans like themselves on the one hand, and non-Aryans, or people of other ethnic groups on the other; they did not allow abortion for Aryans but they championed abortion for non-Aryans. Consequently, those who oppose abortion have compared their opponents to Nazis because Nazis favored abortion for some, while proponents of abortion have compared their opponents to Nazis because Nazis opposed abortions for some. Obviously both sides are guilty of disingenuousness.

Whatever your opponents do, something in their platform is bound to be reminiscent, or able to be made reminiscent, of some position in the Nazi program, however tenuous the similarity might be. Since the Nazis did everything ruthlessly and to the extreme, the rhetorical game is to find, and if need be force, a match between one of their extreme policies and an opponent’s policy and then draw the inevitable conclusion, even if the similarity is weak or not of the same degree.

Even when the comparison does fit, its effect can be cheapened by habitual overuse, and it can also be beside the point: a comparison of Stalin to Hitler might as well be made the other way around, because both strongmen were equally autocratic and brutal. Yet the comparison has a legitimacy when it serves to warn people that a government is moving in the same direction of increased arbitrary power that the Nazis took, even if the government in question has not gone as far as the Nazis did. When a government moves quickly toward tyranny, it is necessary to inform people quickly that this is happening and to capture the public's attention, because it is something people ought to be alarmed about. It is up to the audience of any message to be educated enough to recognize the relative merits of the comparison and judge its value. A carelessly used comparison will bring disrepute on the user, while an apt comparison can be made effective.