About Jamie Rawson

I make my living teaching a variety of high-tech subjects, but my undergraduate degree is in history, and history remains an avocation. I have diverse and widely varied interests and opinions, but if there is any theme which ties all of this together, it is perhaps Professor William Slottman's view that we study history to learn compassion.


This is a distinctly “political” post. But herein I make no citations of contemporary politics; I write about the politics of two millennia past. The subject is a favorite theme of mine, because I find it illuminating: The Roman Republic. This is far too brief to be comprehensive, but I have aimed to keep it short enough to be readable.

* * *

The Roman Republic may well be the most important, influential, and enduring republic that human history has ever produced. Its span lasted from the overthrow of the Roman Kings in about 509 BC until the rise of Augustus Caesar as the first Emperor of the Roman Empire in 27 BC. In those five centuries, Rome grew from a small, agrarian city in a backwater of the Mediterranean world into an empire that became unrivaled in human history. (Some empires governed greater land areas, and some lasted longer by some measures, but for combined size and duration, no other empire truly comes close.) And during those five centuries, Rome excited the interest and envy of the classical world from Spain to Egypt. The Greek political historian Polybius, writing in the mid-second century BC, attributed Rome’s unparalleled success to its ingenious blending of Democracy (Rome’s Popular Assemblies), Aristocracy (Rome’s Senate), and Monarchy (Rome’s binary executive, its two Consuls).

Rome’s masterful blending of these varied elements of ancient forms of government became known as “The Concerns Of The People,” in Latin, “RES PVBLICA,” which evolved into our modern term “republic.” Rome was never a pure democracy (there have been precious few of these in the world, because pure democracy does not scale well to larger polities), but there was the democratic element of the Popular Assemblies, where registered citizens cast their votes to elect governmental officials. These assemblies also originated a variety of domestic legislation, including bills of finance. Rome’s abiding dread of kingship led to the development of a *dual* executive consisting of two Consuls selected from the top two vote-getters in an annual election. Each Consul could override the acts of the other, which made for a cumbersome yet typically, and surprisingly, effective executive. But the third “branch” of Rome’s creative government was the most powerful: the Senate.

The Senate was not directly elected. Those who had served in the senior elected positions served in the Senate, and to a very real degree, membership in the Senate was an inherited position, because a small number of influential families maintained a presence in the Senate for centuries. The Julian clan, from which Julius Caesar sprang, for example, was represented in this august body for the entire span of the Republic and thereafter. The Senate controlled foreign policy, had the power to declare and wage war, and controlled a variety of crucial state revenues and resources. Yet the Senate was to a degree beholden to the popular assemblies. Notably, for much of the life of the Republic, the popularly elected Tribunes had the power to veto acts of the Senate. (“VETO” means “I forbid!” And that is what the Tribunes did.)

For centuries, the Roman Republic successfully balanced the needs of the people with the wants of the aristocracy. This resulted in the pragmatic formation of an effective government which was willingly borne by commoners and nobles alike. It is worth remembering that Rome’s very self-identity is expressed by the acronym “SPQR,” “SENATVS POPVLVSQVE ROMANVS,” “the Senate and the Roman People.” This formula succinctly expressed Rome’s concept of its government.

As Rome grew more and more powerful, the impact of an expanding empire induced strains upon this successful civil government. By the late Second Century BC, several factions had formed within the Senate, ultimately coalescing into two major rivals: the “Optimates,” or “the Best People,” and the “Populares,” or “the Popular Party.” The Optimates clearly favored the interests of the aristocrats in the Senate. (“Aristocrat,” bear in mind, comes from the Greek for “government of the best people.”) The Populares were associated with programs that assisted the common citizen, such as agrarian land reform and public assistance.

Note that I will refer to these two competing groups as “factions” and not parties. They never represented anything like what we have termed political “parties” for the past 250+ years. In modern times, some interpreters assert an analogy between the Optimates faction and the United States’ own Republican Party; likewise a parallel is drawn between the Populares and today’s Democratic Party. The comparison is temptingly easy and satisfyingly simple. Yet it is also unhelpful and ultimately false. The two factions in Rome’s Senate did each have certain “popular” or “optimate” “planks,” but fundamentally, these two factions did not represent any identifiable principles at all.

Marxian historian Michael Parenti, in his 2003 work, “The Assassination of Julius Caesar,” claimed to find in Julius Caesar a champion of the common man against the aristocrats. Yet this seems far too much of a reach. Julius Caesar did build a powerful base of support among the commoners, but he was an aristocrat first and foremost. The real basis of the conflict and contest between Rome’s two major factions was simply rivalry for power. Ideology, such as it existed, was purely a means to the end of gaining power. The Populares built their base upon the leverage of the Popular Assemblies and the like, whereas the Optimates more plainly served the interests of the power elite. But neither “party” really gave a tinker’s dam about the people. The superordinate goal was simply power for each faction’s adherents and cronies.

By the late period of Rome’s remarkable Republic, the only consistent characteristic of either of its leading political factions was a drive for power for its own supporters and the destruction of its rivals. So consumed by this petty infighting did the Republic become that its last century was basically ten decades of upheaval, chaos, and civil disorder. So fixated were Rome’s Senatorial factions that they neglected or abandoned the care of the state and the needs of the people; misery and catastrophe dominated. The end of this confusion came about with the demise of the Republic: first, Julius Caesar seized control as “Dictator for Life,” and after his assassination, his great-nephew and adopted son, Octavian, rose to become the first Emperor of Rome.

Stability finally returned to Rome, but the Republic survived only in ceremonial references. The people of Rome never regained their ancient Rights. For the final one thousand years that the last vestiges of Rome’s Empire survived, only autocrats and aristocrats had political sway. There were no parties, only factions; no principles, only the drive for power.


Jamie Rawson
Flower Mound, Texas

I doubt that I can even begin to compose a proper bibliography, but some titles that informed this piece include:

Michael Parenti, “The Assassination Of Julius Caesar,” ISBN-10: 1565847970

H. H. Scullard, “From The Gracchi To Nero,” ISBN-10: 0415584884

Erich Gruen, “The Last Generation Of The Roman Republic,” ISBN-10: 0520201531

A Tiresome Bit Of Obsessivity Again Raises Its Head

The question is again pressing upon us: when does a new decade begin?


This question comes up repeatedly. It makes readily appreciable sense to count a new decade as starting with the rollover of the digits — we easily know that 2020 is a clear division in our counting scheme, so it is sensible that it should serve as the boundary for a new decade. Yet the most insistent amongst us declare that, no, a decade starts on the xx01 year following xx00. Of course, since there was no Year Zero, it is true, from a purely mathematical perspective, that the first “decade” would have run from Year One to Year Ten, inclusive, because a decade is, strictly speaking, ten years.

But a “decade” is also a generalized term of calendration, much as a month, a year, or a century is. We have absolutely no problem with irregular months, or variable years, or inconsistent centuries, so it seems rather strange to insist upon rigid consistency with decades.

Since long before Julius Caesar decreed, upon extensive consultation and coordination with his Greco-Egyptian astronomer Sosigenes, that 45 BC (“Annus Confusionis”) should last a rather astonishing 445 days, irregular calendration had been pragmatically used from time to time to resolve issues with the calendar. Add to this that Caesar promulgated the insertion of a quadrennial “Leap Day” to correct the calendar, making every fourth February one day longer. But because Sosigenes’ every-four-year correction proved a bit too aggressive, Pope Gregory XIII acceded to the omission of ten days to reset the calendar in 1582 to align with celestial mechanics (and when England finally got on board in 1752, eleven days were excised).

Along with the skipping of days, the “Gregorian” calendar changed such that only century years divisible by 400 would be leap years thereafter. (In 1600 and 2000, February had 29 days; in 1700, 1800, and 1900, February had its usual 28.) These extra-calendrical solutions are purely pragmatic rather than algorithmic. They definitely mar the “purity” of the calendar, but they work! This sort of practical fudging is still done periodically today: “leap seconds” are inserted into the scheme of things from time to time. So there is no reason to insist that decades logically have to start on years ending in one. Simply no reason. One cannot cite calendrical consistency, nor mathematical purity, to back up such an insistence. It is literally without precedent.
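The century rule described above can be sketched as a short function — a minimal illustration for the curious reader, not part of the original essay:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian leap-year rule: every fourth year is a leap year,
    except century years, which must also be divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
```

By this rule, 1600 and 2000 qualify as leap years, while 1700, 1800, and 1900 do not — exactly the pattern of 29-day and 28-day Februaries noted above.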

The reality is that decades pragmatically run from 0 – 9 because that is the way people treat them. It is plainly practical. A calendrical genius noted back in the late 1990s that the matter is actually readily resolved with a simple bit of pragmatic reasoning, one with which noted essayist Stephen Jay Gould concurred: the first “decade” of the current era simply had nine years; the rest are correctly aligned 🙂 This simple, pragmatic resolution makes all of the debate moot.
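In the same pragmatic spirit, the everyday 0 – 9 convention amounts to nothing more than integer division — a hypothetical sketch (the function name is my own, not from the essay):

```python
def decade_of(year: int) -> int:
    # Everyday convention: a decade runs from a year ending in 0
    # through the year ending in 9, so 2020 opens a new decade.
    return (year // 10) * 10
```

Under this rule, decade_of(2019) yields 2010 while decade_of(2020) yields 2020, matching the common-sense reckoning.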

This next New Year, 2020, is clearly the start of the newest decade. Anything else is an excess of pedantry.



Fifty Years Ago

Some of us are going to have an extremely difficult time believing it, but it was FIFTY years ago today, on 20 July 1969, that a human being first walked on the moon!

“That’s one small step for [a] man; one giant leap for mankind,” said Neil Armstrong as he touched toe upon the lunar landscape. Watching this on television at our neighbors’ house, despite the noisy static — which left it unclear whether the “a” was actually spoken; without it, the line is a slight slip-up — it was clear that the quote was intended to be a famous and inspiring legacy. The conquest of space was not simply an act of American bravado, but a development that all human beings could share in.

At the time, the broadcast of the event gained the largest world-wide audience ever recorded. For a short while, hundreds of millions of people around the globe were united in watching and celebrating an achievement of the most dramatic sort: new heights had been reached, and in a peaceful enterprise (albeit one that certainly had potential military implications.) If you were old enough to have watched this momentous event, do you recall where you were at that time?

My sisters were at a camp in the Colorado Rockies, where the chance to watch television was an unheard-of and special exception to business-as-usual. A friend was in Vermont, staying at the Middlebury Inn, and watched in the inn’s common room. An older work colleague of mine recalled having listened to the broadcast on an ancient, unreliable tube radio in an otherwise unmemorable bar in Talkeetna, Alaska. No matter where or how, the event remains vivid for most everyone who watched. For a brief point in time, the medium of television perhaps truly reached its great potential.

I recall watching the event on a large television at the house of our next-door neighbors. They had a pool in their back yard and decided to make the occasion a festive, holiday sort of event rather like Independence Day: a large grill was used to cook large quantities of hamburgers and albacore-patty “burgers,” and a multitude of neighborhood kids splashed and played in the pool. After both Neil Armstrong and Buzz Aldrin had ventured onto the surface of the moon, the television coverage became a bit less exciting for us kids, so back to the pool we went. But I do remember, once full night had settled, looking up at the moon in the night sky and asking my Mom if she thought we would one day plant colonies in outer space. “Yes,” she said, “but not in my lifetime.”

I cannot help but reflect that five hundred years ago, in 1519, Spain landed in Mexico, and within thirty-two years had founded a university in the thriving metropolis of Ciudad Mexico; likewise Spain landed in Peru in 1531 and within twenty years had established a university in the thriving metropolis of Lima. Now, to be fair, Spain – for all that the age was technologically primitive by today’s standards – did not have to deal with the challenges of blasting away from Earth’s atmosphere and existing on an atmosphere-free rock in space. But even so, the speed with which the New World was integrated into the Old was astonishing. It is also true that the integration was neither mutually beneficial nor benign: the peoples present at the arrival of Spain suffered mightily from the contact and conquest. Nevertheless, questing humans had reached a new and utterly foreign world and made that world integral to their own within a generation. Yet fifty years after “one small step for a man …”, space remains almost wholly the stuff of science fiction.

Certainly the race for the moon was an outgrowth of the Soviet/American Cold War, and the technological achievement served to impress the Soviets with how much a committed United States could achieve in an amazingly short time, but it was and is more than that. Mankind is a questing species. New horizons have always beckoned. To paraphrase a television show that predates the moon landing, “Space is the final frontier.” It remains to be seen whether we will ever return.

Jamie Rawson
Flower Mound, Texas

It is not because things are difficult that we do not dare; it is because we do not dare that things are difficult.

— Seneca

When What Is Legal Is Not What Is Right

On 20 March 1852, Harriet Beecher Stowe’s landmark novel Uncle Tom’s Cabin; or, Life Among The Lowly was first published. The book unquestionably ranks as one of the most important novels in world history: it was the best seller of the entire Nineteenth Century throughout the world, second only to the perennial leader, The Bible, and it was significantly responsible for creating a profound change in attitudes toward slavery throughout the United States of America, though mainly, of course, in the Northeast and Midwest. Pro-slavery and Abolitionist sentiments had been at odds from the very earliest days of the founding of our republic, but in the wake of the publication of Uncle Tom’s Cabin, abolitionist sentiment grew immensely. Tolerance for the manifest hypocrisy of the evil of slavery flourishing in “The Land of the Free” grew less and less. As absolutists on each side clashed, war seemed inevitable. Upon being introduced to Mrs. Stowe at the White House, Abraham Lincoln is reported to have said, “So this is the little lady who made this great big war!”

The book sold out its first run almost immediately. Before the end of 1852, more than 300,000 copies had been printed in the United States, an unprecedented success at that time. Another 200,000 copies were printed elsewhere in the English-speaking world, and translations began to appear in other languages before the year was out. These figures do not include the unauthorized and “bootleg” copies that flooded the market, nor the many unauthorized abridgements and digests of the book. Additionally, dramatic interpretations ranging from public readings to many hundreds of theatrical versions brought the tale to millions of people. It was a phenomenon unlike any the world had seen. And, as noted above, its impact was immense and immediate.

For today’s tastes, Uncle Tom’s Cabin is overwrought, melodramatic, and verbose in the extreme; Stowe’s prose styling makes one think of Charles Dickens as spare and concise by comparison. The novel’s major appeal today lies in its historical importance. It is today much more often read about than read; indeed, some recommend against actually reading the unabridged original novel because it contains characterizations of its enslaved protagonists that are distinctly stereotypic and often unflattering (though no one decries the portrayal of the wicked and greedy Simon Legree, whose name is still a byword for nastiness and evil). This seems to me a perfectly fine state of affairs, as the book really is a labor to slog through; it is enough to know what it was about and to know its impact.

But there is an important question that one must ask: since anti-slavery movements and the cause of abolition had been present in American society since the 18th Century, what was it that gave this book such an unusual impact? Why was Stowe’s depiction of slavery suddenly more powerful and more effective at raising anti-slavery sentiment than the many efforts that had come before? Why did Uncle Tom’s Cabin ignite a fire that had merely smoldered previously?

The reasons are no doubt many, but foremost among these is that a major theme of the novel is the forced separation of families. The brutality of this forced tearing apart had been raised repeatedly by abolitionists, but their tracts and treatises and speeches often found limited audiences, audiences of already like-minded people. Stowe’s novel was a gripping and emotionally affecting epic that reached many millions who perhaps had never given serious thought to the issue of slavery or its abolition. Though anti-slavery activists placed much emphasis on the beatings and lashings and physical cruelty of the institution of slavery, such sensationalism apparently failed to foster a lasting concern among most Americans. The United States was mainly rural in the 1850s, and it was perhaps easy for those who heard a speaker describe the torments inflicted upon slaves to accept these as aberrations or outlying incidents; every farmer knew that no wise man would regularly abuse his valuable resources in such a way, for such behavior would be self-destructive. Thus, even when confronted with first-person accounts and eye-witness testimonials, most Americans seemed quite able to dismiss these cases as being sufficiently rare as to be unworthy of sustained outrage, and unnecessary to work against.

Perhaps the abolitionists’ putting so much emphasis on the physical horrors of slavery had in reality caused far too many Americans to assume they were exaggerating. Perhaps the nation’s citizens simply could not admit that their country countenanced such evil. Perhaps they were simply in denial.

But Uncle Tom’s Cabin changed all of that at one fell swoop! Almost literally overnight, the truth of the pernicious cruelty of slavery became palpably real through the medium of a wholly fictitious novel. While physical torments and tortures might be impossible for readers to effectively conceive, the unending trauma brought about by forcibly ripping families apart was far more “real,” far more familiar and readily conceivable. Stowe’s melodramatic account of “poor Eliza” facing the prospect of being rent forever from her son touched millions of readers of the book, listeners at public readings, and playgoers. While few could honestly relate to the horror of a fierce whipping, virtually everyone had first-hand experience with separation and loss. The majority of mothers and fathers of that age knew first-hand the pain of the loss of a child. Widows and widowers and orphans were commonplace. In a day of limited communication, most Americans knew the ache of separation when friends and family moved on to the West, or embarked upon a sea voyage, or simply rode to a neighboring town. Whether or not Harriet Beecher Stowe anticipated the potent effect of making forced separation a main theme of the novel, she had touched upon a truly effective way to reach people and to permit them to truly understand and feel the horrors of a system that was legal in large areas of “The Land Of The Free.”

Suddenly, the wickedness of the institution of slavery became undeniable: through the portrayal of the forcible splitting of families, an enormous wrong that was such a plainly evil feature of slavery could no longer be ignored. In the book, Stowe has one character observe: “The most dreadful part of slavery, to my mind, is its outrages of feelings and affections – the separating of families, for example.” (Italics mine.) People knew then what we still know now: forcibly separating families is torture, and it is torture of the most callous and barbarous and unbounded sort, for it tortures entire families, young and old, and the torment goes on and on and on. And millions of Americans became aware that they could no longer be proud of a nation that permitted and protected such an intolerable system. Slavery was not just antithetical to the ideals and principles of a Free Country, it was intolerable evil, pure and simple. Mrs. Stowe’s decision to show the cruel injustice of slavery by first focusing on the forcible separation of families did more to capture people’s hearts and minds than decades of graphic emphasis on the physical cruelty. Stowe hit a resonant note, and the supporters of slavery were rightly terrified.

I make note of this today, especially, because the past still has so immensely much to teach us today. Systems and regimes that rely on such inhumane and uncivilized practices as the forced isolation of children from parents have always been on the wrong side of fundamental human morality, and they are ultimately judged to be on the wrong side of history as well. No matter the time, no matter the place, and no matter the justifications proffered, forcible separation of families is torture, and it is torture of the most callous and barbarous and unbounded sort. This cruel practice tortures entire families, young and old, and the torment goes on and on and on. Americans woke up to this stark and undeniable reality 164 years ago; we must wake up to it again. It is true that there are those who earnestly approve of such barbarity today, just as there were in the past. But these are bad people today just as they were in the past. Our great and prosperous nation must not once more join the ranks of those wicked and ultimately failed governments that were led by vicious oppressors or murderous tyrants. Those unspeakably benighted leaders justified their brutality as being “the law.” How very hollow and cowardly it sounds to hear our own Executive Branch claiming that “the law” demands such vile cruelty. Forcibly separating families is despicable. Always. Any time. Anywhere. Today’s victims of such brutality are non-citizens; this does not mean that they are non-people. These are human beings, no matter how poor they may be, no matter how little they bring, no matter what their ethnic heritage or social background may be. These are people. Our own self-respect as human beings, and our nation’s very character, demand that we treat these people with compassion. Those who attempt to enter this country illegally must be dealt with, but this country has both the resources and the character to do so without descending to barbarism.

The past is not dead; it is not even past: it is with us in the present. But knowing the past and frankly facing it with all its glories and all of its shames, is necessary to living today and building a better future for all of us.

Jamie Rawson
Flower Mound, Texas

Fable is more historical than fact, because fact tells us
about one man and fable tells us about a million men.

— G. K. Chesterton


Bury The Chains: Prophets and Rebels in the Fight to Free an Empire’s Slaves, Adam Hochschild; Houghton Mifflin, 2005: ISBN: 0618104690

A professor of journalism at the University of California, Berkeley, Hochschild has nevertheless built an impressive reputation as a historian, though perhaps “popular historian” should be used (historians can be a snarky bunch, and there always seems to be a certain disdain for those who write things that many people actually want to read!). In Bury The Chains, Hochschild recounts the struggle of the British anti-slavery movement. He notes that this cause was the first modern popular cause, employing mass media – newspapers and broadsheet posters – and organizing economic action against slavery in the form of sugar boycotts. He says, “It was the first time a large number of people became outraged, and stayed outraged for years, over someone else’s rights.” This book too has its villains, such as Banastre Tarleton (the evil English dragoon colonel featured in Mel Gibson’s The Patriot; he was pro-slavery in Parliament), and its heroes, such as John Newton, who wrote Amazing Grace.

Uncle Tom’s Cabin online

The Great Fire Of London

With this post, I depart from my usual practice of noting an anniversary or similar connection to an event. This topic is not inspired by a long past event on this date nor an especial anniversary. The Great Fire of London happened 351 years ago last month. But given current events, it is perhaps pertinent and fitting to speak about this historical event as fires yet rage in California.

Between Sunday, 2 September 1666 and early Wednesday 5 September 1666, the Old City of London was almost completely destroyed by a devastating conflagration known ever after as The Great Fire of London. The fire was unprecedented both in scope and scale – more than 15,000 structures were consumed over an area of more than 700 acres within the ancient city which was at that time still bounded by its Roman walls built almost 1,500 years before.

The fire was fiercely fueled by the fact that the vast majority of London’s buildings were made of wood. Today, we know London as a city of incredibly creative and beautiful brickwork and stone masonry, but in the late 17th Century, cheap and easy wood construction dominated. The presence of so much ready fuel – the structures themselves – plus stores of goods such as turpentine, pitch, hemp, and timber – all so necessary for a maritime economy – as well as the enormous amounts of gunpowder inevitable in a great military capital, meant that the fire was able to burn with an unusual intensity: ingots of steel and bronze liquefied in the heat and poured like water into gutters running into the Thames.

The Lord Mayor of London was unwilling to take action against the fire, and though it was contrary to tradition and law, ultimately King Charles II commanded the creation of fire breaks which required that rows of houses and shops and warehouses be pulled down or, ultimately, blown up with gunpowder. This meant that property was destroyed before the fire reached it, but it also meant that the fire was, at last, stopped. This tactic, plus the cessation of powerful, dry winds finally brought an end to the devastation.

But London was, effectively, no more. More than 80 churches, in those days the center of neighborhood life, and more than 13,000 homes were gone. The Old City was a black and reeking ground.

Some visionaries such as the remarkable architect Sir Christopher Wren proposed a complete redesign of London and a rebuilding that would utterly modernize the ancient city. But King Charles II knew that England could not long stand without London in full working order – England was even then still engaged in a costly and dangerous war with The Netherlands. So the time needed to effect a complete redesign of the City was out of the question. The King convened special courts to settle matters of land ownership and legal responsibilities. These special courts facilitated rapid reconstruction, though they left many unsatisfied.

Notably, in the aftermath of the fire, London’s building codes – already strict “on paper” – were strengthened and, more importantly, firmly enforced. No longer would wooden buildings be permitted; no longer would thatched roofs be allowed. Any new buildings in London were required to be constructed of stone or brick, with roofs of tile or slate. And until 1997, when the recreated Globe Theater was constructed, these restrictions remained rigidly in force.

London has suffered fires since 1666, of course. There was no way to escape vast fire damage during The Blitz, when German bombers carpeted whole neighborhoods of the great city with incendiary explosives. But the wisdom of building with fire-resistant materials proved itself: though London went up in flames several times as The Blitz was waged, the flames were quickly and efficiently contained.

The Great Fire of London happened more than 350 years ago. The immediate and forceful response of London’s government in the wake of the fire has helped protect London ever since.

I have always wondered why, in our nation, particularly in areas where fire is a frequent and devastating threat, we still gladly and blithely accept the practice of building our homes and schools and churches and places of business out of easily flammable, readily combustible material. The technology to build fire-resistant structures is more than 5,000 years old, and in the past few millennia it has only improved. Why do we still build with so much fire fuel?

Jamie Rawson
San Antonio, Texas

Your country is desolate, your cities are burned with fire.

— Isaiah, 1:7

Sharp Medicine

It was on this day forty years ago, 10 September 1977, that the last legal beheading in the western world took place, as France executed Hamida Djandoubi by guillotine. Djandoubi had been convicted earlier that year of the kidnapping and torture-murder of his former girlfriend. Because the crime was especially brutal, and because Djandoubi had attempted another kidnapping shortly after the murder, neither the courts nor the executive would extend clemency and commute his sentence. His execution therefore marks the final operation of La Guillotine, which had been France’s only legal method of execution since the French Revolution.

It seems strange, perhaps, but the guillotine was chosen as a method that was both extremely humane and intensely egalitarian. Before the revolution, only nobility were beheaded; commoners could suffer a variety of methods of execution depending upon the crimes in question, with hanging being most common, though burning and breaking on the wheel were still legal under France’s monarchy. The guillotine was fast and effective. It remains the only form of capital punishment which is uniformly fatal within a predictable time range. Hanging, even when the neck is broken, can still leave a soul to linger for 8 to 12 minutes. Gassing and other poisons are far too variable in the range of reactions and responses, as any observer can note. The electric chair is not even to be considered. (We may as well revive burning at the stake.) Firing squads are iffy at best, even when staffed with excellent marksmen. But examine, if you will, the virtues of the guillotine: it is fast. Almost all of the blood drains from the severed head in less than a minute, guaranteeing the cessation of all brain activity within three minutes. There is no missed shot, no unexpected reaction, and no second attempt.

During France’s dire Reign of Terror, the intended “deterrent effect” was also in full force: severed heads were snatched up by the hair and shaken before the crowds to reinforce the severity of the penalty. In more modern times, all executions were carried out in rather private circumstances. And despite the vaunted “benefits” of this method of execution, there remains something profoundly disturbing about it.

I still find it rather odd to realize that this final victim of La Guillotine was executed just one day before I left home to begin my time at college.

Jamie Rawson
Flower Mound, Texas

This is a sharp Medicine, but it is a Physician for all diseases and miseries.

— Sir Walter Raleigh, contemplating the headsman’s axe

Remember The Past; Do Not Honor Its Errors

The current impassioned debate about the need to remove monuments to the Confederacy and its leadership has rekindled my long personal struggle with the subject. When I was young, we lived in Montgomery, Alabama, which in the early 1960s was at the very forefront of an outdated and oppressive culture of racism. We subsequently moved to Northern Virginia, rather more up-to-date than Alabama, perhaps, yet fiercely proud of its Confederate past. And I have lived in or near Dallas, Texas for nearly twenty-seven years. In my early years, therefore, I regularly passed by Confederate memorials such as the Robert E. Lee house in Alexandria, and a statue honoring Confederate soldiers nearby. Even today, I often attend events and gatherings at Dallas’ Lee Park, honoring General Robert E. Lee. I have had friends who taught at Dallas’ Jefferson Davis Elementary School. Such familiarity has fostered in me an acceptance of such things, despite knowing that it must be difficult for a black student to attend a school honoring the president of the Confederacy. I felt that undoing such legacies from the past might hide an important part of our common history. But I have been reconsidering.

I do honestly think these monuments and memorials helped me develop an appreciation for the nearness of the past, and instilled in me a sense of history. And for quite a while, the idea of taking them down was unappealing to me. So I have wrestled with this question mightily. On the one hand, I do think that the very existence of Confederate monuments represents something utterly unique in our history – how many losing rebellions throughout history were ever permitted to honor their failed “heroes?” The fact that these monuments were permitted possibly speaks volumes about American tolerance.

On the other hand, the Confederacy was ultimately about preserving and promoting a system of exploitation and human misery that ensured that an entire race was oppressed, disenfranchised, and denied the right to be full Americans. Slavery – The United States of America’s “original sin” – bore a legacy that still tears at this nation today. Honoring that legacy is incomprehensible.

Too, it is important to note that many of these Confederate monuments did not arise in the near aftermath of the failed rebellion; they typically date from the early 20th Century, long after the Civil War. Dallas’ Oak Lawn Park was rechristened Lee Park upon the dedication of a statue in 1935, 70 years after the Civil War, at a time when Americans were embracing a romantic, idealized, and fictitious image of The Old South. They were inspired by a misguided nostalgia for the “noble” South that never really was. And, quite importantly, as these monuments usually date from the time when the Ku Klux Klan was at its apogee, it is probable – almost inevitable – that they genuinely do represent a racist reminder that despite the defeat of the Confederacy, its heirs intended to maintain the status quo of oppression and exclusion of Blacks from mainstream America.

Battlefield parks, historic forts, preserved sites of significance (such as plantations that include slaves’ quarters and aspects of their lives) are appropriate ways to keep history present and tangible. Memorials to the fallen of both armies at battlefield sites such as Gettysburg remain appropriate and fitting, much as if in a cemetery. But civic statues and monuments are neither necessary nor educational even when they claim to represent “history.” For one thing, Confederate soldiers were in rebellion against the lawful government of the United States of America. For all that ‘rebellions are lawful when they succeed,’ this one failed, and honoring its proponents is simply improper. For another, the majority of these memorials date to a full two generations after the Civil War, and do indeed reflect a reprehensible reminder that, despite the Confederacy’s defeat, its aim of racial oppression was in full force.

It is time to move beyond the early 20th Century mythos of The South and embrace an America that truly is more free and equal, but which has much work remaining toward genuinely achieving those goals. Undoing reminders and symbols of an oppressive past is not erasing history; it is embracing our 21st Century America.

Jamie Rawson
Flower Mound, Texas

The Greatest Shipwreck That Few Recall: SS Eastland

In the second largest city in America, a passenger steamship, tied to the dock, loaded with 2,500 working people dressed in their picnic clothes, topples slowly and sinks to the river bottom like a dead jungle monster shot through the heart. Over 1,000 men, women and children, trapped like rats in a cellar, are drowned. — Carl Sandburg, 1915

One of the worst maritime disasters in U.S. history happened on 24 July 1915, causing the loss of more than eight hundred lives. Yet this tragedy is little known today.

Between Monday 15 April 1912 and Friday 7 May 1915, three enormous maritime disasters rocked and stunned the world. The two bracketing the trio are quite well known, being the subjects of many works of history and fiction in books, plays, and movies: RMS Titanic and RMS Lusitania. The middle one, RMS Empress of Ireland, which sank after being rammed on 29 May 1914, is nowhere near as famous these days as its bookends, but it was every bit as shocking and stunning a disaster. Each of these catastrophes caused the loss of more than a thousand lives; each was probably avoidable. In the aftermath of the wreck of Titanic, new regulations were quickly put into place to ensure adequate lifeboats, but the speed with which Empress and Lusitania sank precluded any effective deployment of life-saving equipment.

It is profoundly and tragically ironic that the loss of S.S. Eastland in July of 1915, less than two months after the sinking of Lusitania, may have been precipitated by the vessel’s compliance with the requirement for more lifeboats. Eastland was a Great Lakes steamer as large as an ocean liner. Launched in 1903, Eastland early on revealed a tendency to list excessively because of a somewhat top-heavy design. To counteract this tendency, the ship’s operators adopted rules regarding how many passengers she could board, and how many could be on the upper decks at any given time. These measures permitted the ship to ply the lakes for a dozen years without major incident. However, after the new lifeboat regulations that followed the loss of Titanic, Eastland’s tendency to list returned quite noticeably. The extra life-saving equipment made the upper decks even more top-heavy.

On 24 July, Eastland had been chartered by Western Electric Company to ferry employees to a company picnic in Indiana. The ship was docked on the south bank of the Chicago River between LaSalle and Clark Streets. The employees began boarding the vessel about 6:30 AM; within forty minutes the ship was at full capacity of about 2,500 passengers. To take fullest advantage of the “pleasure cruise” to the picnic, many passengers congregated on the upper decks. Several witnesses who observed the vessel from shore recalled that the ship seemed to have developed a distinct tilt toward its port, riverward side. Something seems to have attracted the attention of a large number of passengers, for at about 7:30 AM, hundreds of passengers began crowding to the port side.

There was no crash, no explosion, no fire, indeed, nothing especially dramatic at all; Eastland simply slowly and quietly capsized into the shallow water of the river.

Another ship nearby, Kenosha, immediately maneuvered to start rescuing passengers, but many were trapped below decks. Though the emergency response was immediate, some 844 passengers and crew perished. The city and the nation reacted with shock and horror. The local papers immediately passed judgment as to responsibility for the disaster. Western Electric committed $100,000.00 to the efforts to care for survivors and to recover victims’ bodies. Naturally, a spate of legal actions was undertaken. Famous lawyer Clarence Darrow represented victims in one action in Federal court. Ultimately, the courts found neither negligence nor malfeasance in the disaster.

Chicago journalist Carl Sandburg – later renowned as one of America’s great poets – wrote a scathing commentary on the disaster for the 15 September 1915 edition of the International Socialist Review. He later penned a poem to commemorate it as well. The poem’s incendiary tone ensured that it would not be published until 1993.

Jamie Rawson
Flower Mound, Texas

It was a hell of a job, of course
To dump 2,500 people in their clean picnic clothes
All ready for a whole lot of real fun
Down into the dirty Chicago river without any warning

— Carl Sandburg, The Eastland

Last week I was in Chicago, and I walked right above the site of this disaster, and motored past on a river tour boat.

For reference:

Sandburg in the ISR

Sandburg’s poem

Texian Victory At San Jacinto

It was on this day in 1836 that the news of the outcome of the Battle of San Jacinto reached New Orleans, Louisiana, from which it was quickly communicated to the rest of the United States. Though Texas was decidedly not a part of the United States, and the Texas Revolution was not a matter of U.S. national involvement, nevertheless, the country was greatly interested in the outcome. News of the Texian victory made Sam Houston a national hero in the U.S., and it was widely assumed that the soon-to-be-independent Texas would become a state in the Union.

On April 21, 1836, Texian (as they then styled themselves) forces under the leadership of General Sam Houston defeated Mexican General Antonio Lopez de Santa Anna’s army at the battle of San Jacinto. This victory won for Texas its independence from Mexico, and paved the way for the creation of The Republic of Texas, an independent nation for a decade.

Mexico’s loss of Texas really comes as no surprise under the circumstances: the population of Texas was overwhelmingly composed of Yanqui settlers who had been invited into Texas. Spain had claimed the land that would become Texas as early as 1536, but for the next 290 years made no effort to settle the vast and often inhospitable land. After the United States acquired the Louisiana Territory by purchase in 1803, Spain realized that the Americans would soon be looking westward at the open lands of Texas. In nearly three centuries, Spain had established just three settlements in Texas: San Antonio, Nacogdoches, and Goliad. By 1820, the population of Texas that was reckoned as Spanish subjects amounted to about 4,000 (no one bothered to count the native tribes, naturally!)

In order to strengthen their claim on the territory, Spain invited Americans to settle in Texas, so long as they swore allegiance to Spain and nominally adopted the Roman Catholic faith. Moses Austin and his son Stephen Fuller Austin organized American settlements in Texas, drawing heavily from Tennessee, Alabama, and Mississippi. With Mexico’s successful overthrow of Spanish rule in 1821, these newly imported Yanquis were theoretically citizens of Mexico. Mexico decided to maintain Spain’s policy of inviting more Yanqui settlers into Texas. So successful was this program, that by 1835 the population of Texas had grown to about 45,000!

Despite oaths of allegiance, most of these Yanqui settlers considered themselves Americans, and they enjoyed the relatively free hand they were given to manage their affairs within the Mexican federal system. As Mexico became more concerned about the loyalty of these settlers, it began to impose restrictions on immigration and also put limits on local autonomy. These impositions were greatly resented by the Yanqui Texians (“Texian” is the demonym used to denote inhabitants of Texas prior to statehood), and several rebellious demonstrations took place.

In 1834, General Santa Anna, who had been democratically elected Presidente the year before, assumed dictatorial powers, and suspended the Mexican Constitution of 1824, which had granted significant local power to Texas. The Texians, both Yanqui settlers and native Tejanos, resented this restriction – and notably, they resented Santa Anna’s intention to prohibit slavery in Texas. Santa Anna swore to force the Texians back into submission. The Texians pledged to defend themselves against the oppressor. As armed conflict grew inevitable, popular sentiment moved toward complete separation from Mexico, though not as an independent nation: from the start, Texians felt that Texas should join The United States of America.

When General Santa Anna marched north into Texas, he headed a small but well-trained army. Determined to capture San Antonio, then the most important city in Texas, Santa Anna learned that a small garrison was holding the city’s mission church, known as The Alamo, as a fortress. Santa Anna decided that he would have to resolve this situation before he could advance further.

The two weeks that Santa Anna devoted to taking the Alamo gave the Texian army time to regroup and reinforce. The brutality with which Santa Anna treated the Alamo’s defenders ignited a great fury and an even greater resolve among the Texians, who made “Remember the Alamo” a cry that echoes unto our own day. As is the case with most symbolically important events, the details of the Alamo are often obscured and exaggerated, but it is true that the time gained by the defenders of the Alamo was crucial to the ultimate success of the Texian rebellion.

In late March, Santa Anna’s troops captured the Texian defenders of the town of Goliad. Despite their formal surrender in accordance with military protocol, on 27 March Santa Anna ordered all the captured Texian soldiers to be shot, beaten to death, or trampled by cavalry. This gross breach of the Law of Arms not only enraged the Texians and ultimately much of the outside world, it greatly demoralized Santa Anna’s own troops: not only had their general flouted the rules of warfare, his mercilessness would surely be returned by the Texians!

Six weeks after the fall of the Alamo, Santa Anna’s 2,400 man force was encamped on low ground near the present city of Houston at San Jacinto. In the late afternoon of 21 April, the Texian forces caught the Mexican Army literally as it slept: some 900 or so Texians announced their presence with a shout – it is claimed they shouted “Remember the Alamo! Remember Goliad!”

What ensued was a slaughter, pure and simple. The unwary Mexican troops were unable to organize a coordinated response, and despite their formal military expertise, they were trained for set engagements on the battlefield, not hand to hand fighting in swampy mire. In about 18 minutes it was all over: more than 600 Mexican troops had been killed, nearly 800 captured. The Texian losses amounted to nine killed and thirty wounded. Though Santa Anna initially escaped, he was captured and forced to grant Texas’ independence.

General Sam Houston, wounded during the combat, survived and was later elected the first president of The Republic of Texas. He would later serve as Senator and Governor of the state after it was admitted into the Union. Texas remained an independent nation for almost ten years, and was widely recognized in Europe, but not by most Latin American nations. The four-vessel Texas Navy repeatedly attacked Mexican shipping during the years of independence, helping to deter Mexico from a new invasion. Finally, in 1845, the United States admitted Texas as the 28th state.

But for Santa Anna’s overwhelming and unjustified confidence, and his Texas-sized ego, Texas might have remained under Mexican rule. But for the Battle of San Jacinto, Texas would never have become independent. There is a massive monument at the battle site – taller than the Washington Monument, of course! – and Texans still take pride in the stories and myths that make up the history and traditions of this great state!

Jamie Rawson
Flower Mound, Texas

If I owned Hell and Texas, I’d live in Hell and rent out Texas!

— General W.T. Sherman

And here is a poem that my Texan Grandfather used to recite, and which
my Mom taught me before I was in school; it has always colored
my impressions of Texas:

Hell in Texas

Oh, the Devil in hell they say he was chained,
And there for a thousand years he remained;
He neither complained nor did he groan,
But decided he’d start up a hell of his own,
Where he could torment the souls of men
Without being shut in a prison pen;
So he asked the Lord if He had any sand
Left over from making this great land.

The Lord He said, “Yes, I have plenty on hand,
But it’s away down south on the Rio Grande,
And, to tell you the truth, the stuff is so poor
I doubt if ’twill do for hell any more.”
The Devil went down and looked over the truck,
And he said if it came as a gift he was stuck,
For when he’d examined it carefully and well
He decided the place was too dry for a hell.

But the Lord just to get the stuff off His hands
He promised the Devil He’d water the land,
For he had some old water that was of no use,
A regular bog hole that stunk like the deuce.
So the grant it was made and the deed it was given;
The Lord He returned to His place up in heaven.
The Devil soon saw he had everything needed
To make up a hell and so he proceeded.

He scattered tarantulas over the roads,
Put thorns on the cactus and horns on the toads,
He sprinkled the sands with millions of ants
So the man that sits down must wear soles on his pants.
He lengthened the horns of the Texas steer,
And added an inch to the jack rabbit’s ear;
He put water puppies in all of the lakes,
And under the rocks he put rattlesnakes.

He hung thorns and brambles on all of the trees,
He mixed up the dust with jiggers and fleas;
The rattlesnake bites you, the scorpion stings,
The mosquito delights you by buzzing his wings.
The heat in the summer’s a hundred and ten,
Too hot for the Devil and too hot for men;
And all who remained in that climate soon bore
Cuts, bites, stings, and scratches, and blisters galore.

He quickened the buck of the bronco steed,
And poisoned the feet of the centipede;
The wild boar roams in the black chaparral
It’s a hell of a place that we’ve got for a hell.
He planted red pepper beside of the brooks;
The Mexicans use them in all that they cook.
Just dine with a Mexican and then you will shout,
“I’ve hell on the inside as well as the out! “

from American Ballads and Folk Songs, Lomax

275 Years Ago: Handel’s “Messiah” Premieres

It was on this day, 13 April, that George Frideric Handel’s magnificent oratorio Messiah was premiered in the Great Music Hall on Dublin’s Fishamble Street. Though in this day and age Messiah is most often performed for Christmas, it was written specifically for Easter. The thirteenth of April in 1742 was Tuesday of Holy Week according to the older Julian calendar which Great Britain and Ireland used at the time of the premiere. Messiah has remained one of the most frequently performed of all choral works. Because the full oratorio can take more than two and a half hours to perform, Messiah is today most often performed as selected excerpts.

The first performance on 13 April 1742 was realized by a rather modest orchestra and chorus. As Messiah gained in popularity, it became common to field bigger and bigger choirs with larger numbers of instrumentalists and more complex instrumentation. No less a musical luminary than Mozart felt moved to create a more elaborate, grander orchestral setting of Handel’s “greatest oratorio.” The trend toward ever more immense productions perhaps reached a zenith with an 1857 rendition in London’s Crystal Palace which included an orchestra of 500 and a chorus of 2,000 singers! The end of the Victorian Era saw a decline in the great numbers of choral societies that had characterized the 19th Century musical landscape. Accordingly, in the 20th Century, a revival of more “authentic,” smaller-scale performances gained adherents. These performances returned to the surviving 18th Century manuscripts for musical details and aimed to more nearly match the original scope and scale of the productions of Handel’s day.

The enduring popularity of Messiah and the nearly perfect musical expression of deep religious sentiment which appropriately pervades Messiah have fostered various tales and legends about the creation of the masterpiece. Some of these remarkable stories are quite true: Handel did, in fact, compose the entire oratorio in a mere 24 days between 22 August and 14 September 1741. The surviving autograph score does contain certain errors, but surprisingly few for a piece of such length. Handel himself has often been quoted as asserting that while composing the Hallelujah chorus, “I did think I did see all heaven before me and the Great God Himself!” Though the quotation does not appear in sources contemporary to Handel (it first appears in print in Horatio Townsend’s 1852 Handel’s Visit To Dublin), it has become an inextricable part of the tales surrounding Messiah.

Certainly, Handel did feel moved by religious sentiment in creating this great work; he concluded the manuscript with the abbreviation SDG (Soli Deo Gloria; “Only to God the Glory”), and he later noted that he was pleased that so many of the performances in his lifetime, including the premiere, were charitable benefits.

Of the many beautiful and noteworthy choruses throughout Messiah, surely none is more famous or more familiar than the glorious Hallelujah which concludes Part II of the oratorio. By tradition, the entire audience rises at the start of the chorus and remains standing until its conclusion. King George II attended the first London performance of Messiah in 1743. Moved by the opening of Hallelujah, the King rose to his feet. Etiquette required that his subjects do the same. Because this tribute seemed an especially apt recognition of the inspired music, the practice of standing for the Hallelujah chorus has been maintained to this day. The great classical composer Franz Josef Haydn, upon first hearing the chorus in London’s Westminster Abbey, stood with the audience, and wept from emotion. At the conclusion, he proclaimed, “He is the master of us all.”

This chorus has been frequently praised, and just as frequently criticized for the innumerable versions and parodies in which it is heard. However, there can be no more sincere tribute to this iconic masterpiece than that from Ludwig van Beethoven: “Go and learn from [Handel] how to achieve great effects with simple means.”

Jamie Rawson
Flower Mound, Texas

I should be sorry if I only entertained them, I wish to make them better. – George Frideric Handel, upon being informed that the audience for the premiere had found Messiah entertaining.


The Oxford Companion To Music, Tenth Edition, edited by John Owen Ward; Oxford University Press, 1995

Handel: Messiah, edited by Watkins Shaw; Novello Handel Edition, 1992

Accompanying booklet from Messiah, recorded by The Smithsonian Chamber Players, the American Boy Choir, with tenors and basses of the Norman Scribner Chorus, conducted by James Weaver, 1981.

The Life of George Frederick Handel, William Smyth Rockstro; MacMillan and Company, London, 1883

Handel’s Visit To Dublin, Horatio Townsend, Esq; William S. Orr and Company, and J. A. Novello, London, 1852