SHOES: A Modest Proposal To Overturn An Unconstitutional Curtailment Of Our God-Given Freedom

Let us today address the “elephant” that is in everyone’s room: shoes.

For years, people have been told that they “must” wear shoes. From early childhood, we are inculcated with the belief that shoes help promote good health and hygiene, and that it is our social responsibility to wear shoes. People unquestioningly accept the utter tyranny of stores and eating establishments and other public places posting placards proclaiming: “No Shoes, No Service!” and similarly oppressive policies. Shoes are even used as a proxy for decent, civilized behavior itself. Just try attending the symphony or dancing at a fancy gala affair without shoes. The oppression is real! We cannot escape the dictatorial forces which force shoes upon us.

Even as infants, we are commanded to wear shoes by well-meaning parents and care-givers. Yet observe the natural behavior of many infants who scream upon being forcibly shod, or the many toddlers who shed their shoes at every opportunity. In a state of nature, even the smallest child instinctively prefers the natural and inherent freedom of shoelessness. This freedom is a Constitutional right!!!

Our shoe-enforcing overlords also assure us that shoes provide for better hygiene and sanitation, and are a part of overall healthful living. But what are the risks really? No one gets hookworm anymore. And even those who do contract a hookworm helminthiasis are very unlikely to spread it to other unshod compatriots since the worms spread through fecal contact. And though some might argue that the obnoxious infection of athlete’s foot may be readily spread among barefoot folks, there are no studies that really confirm this. And even if such spread is occasionally confirmed, we know that athlete’s foot is typically the result of bad personal hygiene and is therefore the fault of those who catch it, as with most diseases. And why should I have to wear shoes because others are unclean?

“But,” we are assured, “shoes protect your feet!” And, “Shoes help to improve your posture!” Though those who impose shoes upon us may claim these things, where are the scientific studies to back up these wild and exaggerated claims? Sidewalks are paved and homes and public places are completely free of hazards to the unshod foot. Shoes are utterly unnecessary. Sure, if you fear for your feet, go ahead and wear shoes in public, but be aware: tens of thousands of studies confirm that shoes are completely ineffective in protecting your feet from harm.

In my own personal experience, I once suffered two broken and badly mashed toes when an 80-pound stove dolly fell from a pickup truck tailgate onto my foot. Though I was at that time wearing steel-toed boots, the injury happened nevertheless. Shoes provided me no protection from harm! Another time, in the desert south of Tucson, Arizona, I stepped directly on a three-inch spine from a desert ebony tree which pierced the sole of my boot and punctured the skin of the sole of my foot. The shoe failed to protect me! So even without relying on the hundreds of thousands of studies proving the uselessness of shoes, I know because my own experience perfectly confirms the fact that shoes cannot protect my feet.

There is simply no reason to wear shoes! They are uncomfortable, expensive, and awkward. Wear your shoes if you are too timid to stride forth unshod. Patronize those tyrannical establishments that compel shoe-wearing. As for me, I shall boldly step forth, fearlessly free of footwear! Just say, “No!” to shoes!!!

Dammitol! I seem to have stepped on a tack!!!

IGNORANCE

In a Facebook group in which I participate, a question was recently posted to elicit thoughtful comment: “What’re you most afraid of?”

This is a good question, and one that is most pertinent to ask in today’s crisis-saturated world. What do we fear?

I’d love to be able to align myself with the thinking of Franklin Roosevelt who once assured this nation that, “the only thing we have to fear is fear itself — nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” But at present it is not “fear itself” that I fear. What I fear even more than fear itself is ignorance — wanton, unreflective, unjustifiably confident ignorance which makes fun of sensible precautions and facilitates the spread of infections.

Today I read a posting asserting, among other points, that the simple act of wearing a mask “reduces oxygen up to 60%,” and “increases the risk of CO2 poisoning.” (I think they intended to terrify us with a warning about CO poisoning.) The picture also included some reasonably accurate concerns, such as the fact that masks can promote increased face-touching, but the premise appeared to be that no one would think to practice basic hygiene.

But this sort of hairy-scary hype against the wearing of masks is just ignorance on parade. For instance, masks are not, of course, impermeable to gases. They are intended to permit nearly normal breathing. While it is possible that some reduction of O2 could happen from the effect of the mask, it certainly could not be 60%. A person who experiences a drop in O2 saturation of 20% — that is O2 saturation of 80% — is dangerously near death. If a person had an oxygen reduction of 60% — that is O2 saturation of 40% — they would be dead. No one has yet been recorded to have died simply from donning a mask.
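To put plain numbers on it, here is a minimal arithmetic sketch in Python, assuming, as the argument above does, that the claimed "60%" refers to percentage points of O2 saturation; the 98% baseline is my own illustrative figure for a healthy adult, not anything stated in the original posting:

    # A toy illustration; the 98% baseline is an assumed typical value.
    NORMAL_SPO2 = 98  # healthy blood-oxygen saturation, in percent

    for claimed_drop in (20, 60):
        remaining = NORMAL_SPO2 - claimed_drop
        print(f"A {claimed_drop}-point drop leaves saturation near {remaining}%")

    # Around 78% is a medical emergency; around 38% is not survivable --
    # yet no one has died simply from donning a mask.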



Too, CO2 poisoning, while not an impossible condition, is astonishingly hard to achieve in any space that is even poorly ventilated, and requires a concentrated source of CO2, such as blocks of dry ice. Therefore I infer the reference was actually intended to have been to CO — carbon monoxide — which is indeed dangerous in relatively small concentrations (occupational exposure limits are around 35 ppm) and which is a byproduct of our very own metabolism. However, the concentration of CO present in our exhalation is sufficiently negligible as to be safely ignored; even a gas-impermeable mask would not cause a person to die from CO intoxication.

By making frightening claims, and by asserting neatly precise numbers, the author of this deceit aims to have the appearance of “Science.” Of course, they are clearly ignoring medical, scientifically informed practice that has been in place for more than fourteen decades. But ignorance delights in attacking generally accepted practice as some sort of conspiracy against the ignorant. While there is every good reason to question common thinking, and to examine conventional wisdom, it is neither scientifically valid nor logically sound to immediately declare a premise false simply because it is widespread.

Despite the fact that we live in an age of ready access to vast volumes of reasonably reliable information on every subject from Science to History and vastly beyond, there seems to be a concerted effort to convince people to retreat into the “certainty” of their own fears, doubts, and personal inclinations, and not to disturb the lovely comfort of certainty with intrusive facts.

Writing in Newsweek, 21 January 1980, biochemist and writer Isaac Asimov observed, “There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’” Wishing a thing to be so does not, of course, make it so. Ignorance is not neutral. Ignorance is not benign. And when ignorance becomes aggressive, people die.

What am I most afraid of? Humankind’s deadliest affliction: Ignorance.

— Jamie Rawson
12 May 2020

This Is Wartime

My parents lived through World War II. It was a time of extraordinary trial for the United States of America and for the entire world. And the extraordinary challenges demanded extraordinary action. Citizens of the U.S. found themselves subject to extraordinary measures and restrictions that were needed to address the extraordinary crisis.

I remember my Mother recalling that there were indeed some people who mewled and moaned about fuel rationing and other impositions. But she also noted that the average American bore these extraordinary impositions with the understanding that it was wartime, and that it was certainly not “business as usual.” Those who whined and complained about the necessary extraordinary measures were considered to be ignorant or selfish. People knew that the extraordinary circumstances of wartime demanded extraordinary changes in behavior.

Today we face a different sort of challenge with COVID-19. This is an extraordinary challenge that demands extraordinary reaction. We must not panic, but we must respond. Taking wise precautions is responsible and sensible. The disease is here now. Limiting opportunities for easy and large-scale transmission is a rational and reasonable response. Denying reality is not. There is no reason to panic, but there is every reason to treat this as immensely serious, and to act with urgency. Italy’s healthcare resources are already being over-extended due to the explosion of cases there. We need to work vigorously to avoid that situation here. Limiting contact is NOT a panic response. Not at all.



Large gatherings should be suspended, postponed, or cancelled. (Penalties, breach-of-contract claims, cancellation fees, and the like that are typically associated with such actions should not apply.) These are extraordinary times. Ordinary operations are inapplicable and insufficient.

These are not ordinary times; this is wartime. Things must change. We can make the needed changes. It will not last forever.

Jamie Rawson

Flower Mound, Texas



Not “History,” But Observation

I’ve no wish to be a doom-sayer. And I am neither easily frightened nor am I paranoid. Yet I am deeply, profoundly troubled by trump’s latest move against the press.



With so much attention being focussed on an ailing stock market and an increasing concern about the coronavirus, it is easy to lose sight of the utterly chilling fact that president trump has filed a lawsuit against an organ of the press; in filing suit against the New York Times, we see the unprecedented spectacle of a sitting president actively attempting to silence critics in the media through intimidation via the courts.

Despite years of blustering threats of suits, trump has rarely followed through on suing the media, and with good reason: long before he assumed the presidency, trump was a sufficiently public figure that the bar for such suits would be extremely high. Heretofore, it would have been unthinkable for a sitting president even to attempt such a suit. And, in fact, this absurd suit is going to go nowhere. Our courts are not yet completely corrupted; they will uphold the Constitution.

So why has trump pursued this particular suit, and why now?

One might think it is just an attempt to garner free publicity. The media that favor trump will surely find this inane suit a bold and courageous stand against dissenting opinions. Chilling in and of itself, but not unexpected. So publicity could be a motive.

But I fear the motive is more sinister in its end goal. The very act of a sitting president suing a newspaper for libel is so unnatural and so utterly un-American that it has never happened before. Not even Richard Nixon, who had such a contentious relationship with the press throughout his career, contemplated such an unorthodox move toward chilling Freedom of the Press. Yet by taking this step toward the trappings of a tyrant, trump is simply laying groundwork.

Even a year ago, a president who took such a step would have had critics and opposition even from among supporters and Congress members of his own party (back in the days when adherence to an oath to support and defend the Constitution mattered more generally; once upon a time, such oaths were taken so seriously that the wording was specifically crafted to ensure that loyal ex-Confederates could not serve.) But today, trump is testing the waters on all fronts to determine just how imperial he can be before someone, somewhere within the power structure tells him, “Enough!”

It starts as a mere frivolous lawsuit, but it normalizes a president punishing a dissenting press. Soon that “dissenting” press becomes that “dissident” press. The unthinkable of the historic past becomes the plausible for today; the inconceivable of last year becomes the inevitable for 2020. It is not so very great a leap from filing harassing lawsuits intended to intimidate newspaper after newspaper, or chill broadcaster after broadcaster, or silence journalist after journalist, to the next stage: actually criminalizing dissent.

“But that cannot ever happen here!” people may confidently cry, “This is America!”

But sadly it could. Even now we have seen a vast array of actions that are so utterly unpresidential evolve into the new normal. We have seen sweeping abuses of power that would have greatly constrained or unseated previous administrations pass as if unremarkable. Freedom of the Press, alas, despite being a “God-given” right, is nevertheless fully subject to being revoked by human agency. It starts with previously unimagined suits, but where it ends could very well be a far further descent into the unimaginable.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
— Jamie Rawson
Flower Mound, Texas

The people never give up their liberties but under some delusion.
— Edmund Burke

Though A Good Law, A Deep Shame, Too

I guess that Congress would like me to feel proud that we at last have a Federal law criminalizing the vicious terrorism and utterly lawless crime of lynching.

I suppose I should be relieved that it has finally been formally criminalized.

But I am not proud. I am ashamed. Deeply, achingly ashamed.

Congress has failed to enact this plainly obvious legislation almost two hundred times since 1900. TWO HUNDRED TIMES. In the 120 years between 1900 and 2020, Congress has never heretofore decided that this legislation was worthy. Not during the heyday of “Jim Crow.” Not during the resurgence of the Ku Klux Klan in the 1920s. Not during the racial tensions preceding the Civil Rights Era, and not during it or thereafter.

Not in a dozen decades has Congress had the courage or principle to do the right thing before today. Today. Two decades into the 21st Century.

If there is any piece of evidence that shows how slowly we have progressed, surely this is it. How could it have ever been a question to make lynching a crime? And by the time of Emmett Till’s murder, sixty-five years ago, it was already long overdue to treat lynchings as especially heinous. But to have allowed twelve decades to have elapsed before today’s legislation is a disgrace that fully covers this nation with a heavy blanket of grievous shame! Yes, it is right to finally accomplish this. But the circumstances are still a deep and pervasive shame.

I am glad this is finally going to be sent to the President for his signature, which surely will happen, no matter his personal preferences, but I am ashamed that it has taken twice my entire life-span to come to pass.

S P Q R

This is a distinctly “political” post. But herein I make no citations of contemporary politics; I write about the politics of two millennia past. The subject is a favorite theme of mine, because I find it illuminating: The Roman Republic. This is far too brief to be comprehensive, but I have aimed to keep it short enough to be readable.

* * *

The Roman Republic may well be the most important, influential, and enduring republic that human history has ever produced. Its span lasted from the overthrow of the Roman Kings in about 509 BC until the rise of Augustus Caesar as the first Emperor of the Roman Empire in 27 BC. In those five centuries, Rome grew from a small, agrarian city in a backwater of the Mediterranean world into an empire which subsequently became unrivaled in human history. (Some empires governed greater land areas, and some lasted longer by some measures, but for combined size and duration, no other empire truly comes close.) And during those five centuries, Rome excited the interest and envy of the classical world from Spain to Egypt. The Greek political historian Polybius, writing in the mid-second century BC, attributed Rome’s unparalleled success to its ingenious blending of Democracy (Rome’s Popular Assemblies), Aristocracy (Rome’s Senate), and Monarchy (Rome’s binary executive, its two Consuls).

Rome’s masterful blending of these varied elements of ancient forms of government became known as “The Concerns Of The People,” in Latin, “RES PVBLICA,” which evolved into our modern term “republic.” Rome was never a pure democracy (there have been precious few of these in the world, because pure democracy does not scale well to larger polities) but there was the democratic element of the Popular Assemblies, where registered citizens cast their votes to elect governmental officials. These assemblies also originated a variety of domestic legislation including bills of finance. Rome’s abiding dread of Kingship led to the development of a *dual* executive consisting of two consuls selected from the top two vote-getters in an annual election. Each Consul could override the acts of the other, which made for a cumbersome, yet usually surprisingly effective, executive. But the third “branch” of Rome’s creative government was the most powerful: the Senate.

The Senate was not directly elected. Those who had served in the senior elected positions sat in the Senate, and to a very real degree, membership in the Senate was an inherited position, because a small number of influential families maintained a presence in the Senate for centuries. The Julian clan, from whom Julius Caesar sprang, for example, was represented in this august body for the entire span of the Republic and thereafter. The Senate controlled foreign policy, had the power to declare and wage war, and controlled a variety of crucial state revenues and resources. Yet the Senate was to a degree beholden to the popular assemblies. Notably, for much of the life of the Republic, the popularly elected Tribunes had the power to veto acts of the Senate (“VETO” means “I forbid!” And that is what the Tribunes did.)

For centuries, the Roman Republic successfully balanced the needs of the people with the wants of the aristocracy. This resulted in the pragmatic formation of an effective government which was willingly borne by commoners and nobles alike. It is worth remembering that Rome’s very self-identity is expressed by the acronym “SPQR,” “SENATVS*POPVLVSQVE*ROMANVS,” “the Senate and the Roman People.” This formula succinctly expressed Rome’s concept of its government.

As Rome grew more and more powerful, the impact and effect of an expanding empire induced strains upon this successful civil government. By the late First Century BC, several factions formed within the Senate, ultimately becoming two major rivals: The “Optimates,” or “the Best People,” and the “Populares,” or “the Popular Party.” The Optimates clearly favored the interests of many aristocrats in the Senate. (“Aristocrat,” bear in mind, comes from the Greek for “government of the best people.”) The Populares were associated with many programs that assisted the common citizen, such as agrarian land reform and public assistance for citizens.

Note that I will refer to these two competing groups as “factions” and not parties. They never represented anything like what we have termed political “parties” for the past 250+ years. In modern times, some interpreters assert an analogy between the Optimates Faction and the United States’ own Republican Party; likewise a parallel is drawn between the Populares and today’s Democratic Party. The comparison is temptingly easy, and satisfyingly simple. Yet it is also unhelpful and ultimately false. The two factions in Rome’s Senate did each have certain “popular” or “optimate” “planks,” but fundamentally, these two factions did not represent any identifiable principles at all.

Marxian historian Michael Parenti, in his 2003 work, “The Assassination of Julius Caesar,” claimed to find in Julius Caesar a champion of the common man against the aristocrats. Yet this seems far too much of a reach. Julius Caesar did build a powerful base of support from the commoners, but he was an aristocrat first and foremost. The real basis of the conflict and contest between Rome’s two major factions was simply rivalry for power. Ideology, such as it existed, was purely a means to the end of gaining power. The Populares built their base upon the leverage of the Popular Assemblies and the like, whereas the Optimates more plainly served the interests of the power elite. But neither “party” really gave a tinker’s dam about the people. The superordinate goal was simply power for each faction’s adherents and cronies.

By the late period of Rome’s remarkable Republic, the only consistent characteristic of either of its leading political factions was a drive for power for its own supporters and destruction of its rivals. So consumed by this petty infighting did the Republic become that its last century was basically ten decades of upheaval, chaos, and civil disorder. So fixated were Rome’s Senatorial factions that they neglected or abandoned the care of the state and the needs of the people; misery and catastrophe dominated. The end of this confusion came about with the demise of the Republic. First, Julius Caesar seized control as “Dictator for Life,” and after his assassination, his adopted son and great-nephew, Octavian, arose to become the first Emperor of Rome.

Stability finally returned to Rome, but the Republic survived only in ceremonial references. The people of Rome never regained their ancient Rights. For the final one thousand years that the last vestiges of Rome’s Empire survived, only autocrats and aristocrats had political sway. There were no parties, only factions; no principles, only the drive for power.

—————————————

Jamie Rawson
Flower Mound, Texas

I doubt that I can even begin to compose a proper bibliography. But some titles I am informed by include:

Michael Parenti, “The Assassination Of Julius Caesar,” ISBN-10: 1565847970

H. H. Scullard, “From The Gracchi To Nero,” ISBN-10: 0415584884

Erich Gruen, “The Last Generation Of The Roman Republic,” ISBN-10: 0520201531

A Tiresome Bit Of Obsessivity Again Raises Its Head

The question is again pressing upon us:

IS 2020 THE START OF A NEW DECADE?

This question comes up repeatedly. It makes readily appreciable sense to account a new decade as starting with the rollover of the new digits — we easily know that 2020 is a clear division in our counting scheme, so it is sensible that it should serve as the boundary for a new decade. Yet the most insistent amongst us declare that, No! A decade starts on the xx01 year following xx00. Of course, since there was no Year Zero, it is true, from a purely mathematical perspective, that the first “decade” would have been from Year One to Year Ten, inclusive, because a decade is, strictly speaking, ten years.

But a “decade” is also a generalized term of calendration, much as a month, a year, or a century is. Yet we have absolutely no problem with irregular months, or variable years, or inconsistent centuries, so it seems rather strange to insist upon rigid consistency with decades.

Since long before Julius Caesar declared, upon extensive consultation and coordination with his Greco-Egyptian astronomer Sosigenes, that 46 BC (“Annus Confusionis”) should last a rather astonishing 445 days, irregular calendration had been pragmatically used from time to time to resolve issues with the calendar. Add to this the fact that Caesar promulgated the insertion of a quadrennial “Leap Day” to correct the calendar so that every fourth February is an extra day longer. But, because Sosigenes’ every-four-year correction proved a bit too aggressive, Pope Gregory XIII acceded to the omission of ten days to reset the calendar in 1582 to align with celestial mechanics (and when England finally got on board in 1752, 11 days were excised.)

Along with the skipping of days, the “Gregorian” calendar changed such that only century years divisible by 400 would be leap years thereafter. (In 1600 and 2000 February had 29 days, but in 1700, 1800, and 1900, February had its usual 28 days.) These extra-calendrical solutions are purely pragmatic rather than algorithmic. They definitely mar the “purity” of the calendar, but they work! This sort of practical fudging is still done periodically today: “leap seconds” are from time to time inserted into the scheme of things these days. So there is no reason to insist that decades logically have to start on years ending in one. Simply no reason. One cannot cite calendrical consistency, nor mathematical purity, to back up such an insistence. It is literally without precedent.
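For those who like their calendration algorithmic, the full Gregorian rule just described fits in a few lines. This is merely a sketch of the standard rule (every fourth year, except century years not divisible by 400), in Python:

    def is_leap(year: int) -> bool:
        # Gregorian rule: divisible by 4, but a century year
        # counts only when divisible by 400.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    # The century years mentioned above:
    for y in (1600, 1700, 1800, 1900, 2000):
        print(y, "leap year" if is_leap(y) else "common year")
    # 1600 and 2000 are leap years; 1700, 1800, and 1900 are not.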

The reality is that decades pragmatically run from 0 – 9 because that is the way people treat them. It is plainly practical. A calendrical genius noted back in the late 1990s that the matter is actually readily resolved with a simple bit of pragmatic reasoning, one which noted essayist Stephen Jay Gould concurred with: The first “decade” of the current era simply had nine years; the rest are correctly aligned 🙂 This simple, pragmatic resolution makes all of the debate moot.
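Indeed, the 0-through-9 convention is exactly what anyone doing the arithmetic would write down. A toy function (my own illustration, not Gould’s) makes the point:

    def decade_of(year: int) -> str:
        # Label a year with its 0-through-9 decade: 2020 -> "2020s"
        return f"{(year // 10) * 10}s"

    print(decade_of(1999))  # 1990s
    print(decade_of(2020))  # 2020s: the start of a new decade

Under this convention only the first “decade” of the era (AD 1 through AD 9) runs short, and every decade thereafter aligns.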

This next New Year, 2020, is clearly the start of the newest decade. Anything else is an excess of pedantry.

THE CALENDAR IS HERE TO SERVE HUMANKIND,
NOT HUMANKIND TO SERVE THE CALENDAR

Fifty Years Ago

Some of us are going to have an extremely difficult time believing it, but it was FIFTY years ago today, on 20 July 1969, that a human being first walked on the moon!

“That’s one small step for [a] man; one giant leap for mankind,” said Neil Armstrong as he touched toe upon the lunar landscape. Watching this on television at our neighbors’ house, despite the noisy static (which may account for the quote’s possible slight slip-up: if the “a” was not spoken, the line is a bit off), it was clear that the quote was intended to be a famous and inspiring legacy. The conquest of space was not simply an act of American bravado, but a development that all human beings could share in.

At the time, the broadcast of the event gained the largest world-wide audience ever recorded. For a short while, hundreds of millions of people around the globe were united in watching and celebrating an achievement of the most dramatic sort: new heights had been reached, and in a peaceful enterprise (albeit one that certainly had potential military implications.) If you were old enough to have watched this momentous event, do you recall where you were at that time?

My sisters were at a camp in the Colorado Rockies, where the chance to watch television was an unheard-of and special exception to business-as-usual. A friend was in Vermont, staying at the Middlebury Inn, and watched in the inn’s common room. An older work colleague of mine recalled having listened to the broadcast on an ancient, unreliable tube radio in an otherwise unmemorable bar in Talkeetna, Alaska. No matter where or how, the event remains vivid for most everyone who watched. For a brief point in time, the medium of television perhaps truly reached its great potential.

I recall watching the event on a large television at the house of our next-door neighbor. They had a pool in their back yard and decided to make the occasion a festive and holiday sort of event rather like Independence Day: a large grill was used to cook a large quantity of hamburgers and albacore-patty “burgers” and a multitude of neighborhood kids splashed and played in the pool. After both Neil Armstrong and Buzz Aldrin had ventured onto the surface of the moon, the television coverage became a bit less exciting for us kids, so back to the pool we went. But I do remember, once full night had settled, looking up at the moon in the night sky and asking my Mom if she thought we would one day plant colonies in outer space. “Yes,” she said, “but not in my lifetime.”

I cannot help but reflect that, five hundred years ago, Spain landed in Mexico in 1519, and within thirty-two years had founded a university in the thriving metropolis of Ciudad Mexico; likewise Spain landed in Peru in 1531 and within twenty years had established a university in the thriving metropolis of Lima. Now, to be fair, Spain – for all that the age was technologically primitive by today’s standards – did not have to deal with the challenges of blasting away from earth’s atmosphere and existing on an atmosphere-free rock in space. But, even so, the speed with which the New World was integrated into the Old was astonishing. It is also true that the integration was neither mutually beneficial nor benign. The peoples present at the arrival of Spain suffered mightily from the contact and conquest. Nevertheless, questing humans had reached a new and utterly foreign world and made that world integral to their own within a generation. Yet fifty years after “One small step for a man …” Space remains almost wholly the stuff of science fiction.

Certainly the race for the moon was an outgrowth of the Soviet/American Cold War, and the technological achievement served to impress the Soviets with how much a committed United States could achieve in an amazingly short time, but it was and is more than that. Mankind is a questing species. New horizons have always beckoned. To paraphrase a quote from a television show that predates the moon landing, “Space is the final frontier.” It remains to be seen if we ever return.

Jamie Rawson
Flower Mound, Texas

It is not because things are difficult that we do not dare; it is because we do not dare that things are difficult.

— Seneca

When What Is Legal Is Not What Is Right

On 20 March 1852, Harriet Beecher Stowe’s landmark novel Uncle Tom’s Cabin; or, Life Among The Lowly was first published. The book unquestionably ranks as one of the most important novels in world history: it was the best seller of the entire Nineteenth Century throughout the world, second only to the perennial leader, The Bible, and it was significantly responsible for creating a profound change in attitudes toward slavery throughout the United States of America, but mainly, of course, in the Northeast and Midwest. Pro-slavery and Abolitionist sentiments had been at odds from the very earliest days of the founding of our republic, but in the wake of the publication of Uncle Tom’s Cabin, abolitionist sentiment grew immensely. Tolerance for the manifest hypocrisy of the evil of slavery flourishing in “The Land of the Free” grew less and less. As absolutists on each side clashed, war seemed inevitable. Upon being introduced to Mrs. Stowe at the White House, Abraham Lincoln is reported to have said, “So this is the little lady who made this great big war!”

The book sold out its first run almost immediately. Before the end of 1852, more than 300,000 copies had been printed in the United States, an unprecedented success at that time. Another 200,000 copies were printed elsewhere in the English-speaking world, and translations began to appear in other languages before the year was out. These figures do not include unauthorized and “bootleg” copies that flooded the market as well, nor the many unauthorized abridgements and digests of the book. Additionally, dramatic interpretations ranging from public readings to many hundreds of theatrical versions brought the tale to millions of people. It was a phenomenon unlike any which the world had seen. And, as noted above, its impact was immense and immediate.

For today’s tastes, Uncle Tom’s Cabin is overwrought, melodramatic, and verbose in the extreme. Stowe’s prose styling makes one think of Charles Dickens as being spare and concise by comparison. The novel’s major appeal today lies in its historical importance. It is today much more often read about than read; indeed, some recommend against actually reading the unabridged original novel because it contains characterizations of its enslaved protagonists that are distinctly stereotypic and often unflattering (though no one decries the portrayal of the wicked and greedy Simon Legree, whose name is still an epitome of nastiness and evil.) This seems to me to be a perfectly fine state of affairs, as the book really is a labor to slog through; it is enough to know what it was about and to know its impact.

But there is an important question that one must ask: since anti-slavery movements and the cause of abolition had been present in American society since the 18th Century, what was it that gave this book such an unusual impact? Why was Stowe’s depiction of slavery suddenly more powerful and more effective at raising anti-slavery sentiment than the many efforts that had come before? Why did Uncle Tom’s Cabin ignite a fire that had merely smoldered previously?

The reasons are no doubt many, but foremost among these is that a major theme of the novel is the forced separation of families. The brutality of this forced tearing apart had been raised repeatedly by abolitionists, but their tracts and treatises and speeches often found limited audiences, audiences of already like-minded people. Stowe’s novel was a gripping and emotionally affecting epic that reached many millions who perhaps had never really given serious thought to the issue of slavery or its abolition. Though anti-slavery activists placed great emphasis on the beatings and lashings and physical cruelty of the institution of slavery, such sensationalism apparently failed to foster a lasting concern among most Americans. The United States of America was mainly rural in the 1850s, and it was perhaps easy for those who heard a speaker describe the torments inflicted upon slaves to accept these as being aberrations or outlying incidents; every farmer knew that no wise farmer would regularly abuse his valuable resources in such a way, for such behavior would be self-destructive. Thus, even when confronted with first-person accounts and eye-witness testimonials, most Americans seemed quite able to dismiss these cases as being sufficiently rare as to be unworthy of sustained outrage, and unnecessary to work against.

Perhaps the abolitionists’ putting so much emphasis on the physical horrors of slavery had in reality caused far too many Americans to assume they were exaggerating. Perhaps the nation’s citizens simply could not admit that their country countenanced such evil. Perhaps they were simply in denial.

But Uncle Tom’s Cabin changed all of that at one fell swoop! Almost literally overnight, the truth of the pernicious cruelty of slavery became palpably real through the medium of an utterly fictitious novel. While physical torments and tortures might be impossible for readers to effectively conceive, the unending trauma brought about by forcibly ripping families apart was far more “real,” far more familiar and readily conceivable. Stowe’s melodramatic account of “poor Eliza” facing the prospect of being rent forever from her son touched millions: readers of the book, listeners at public readings, and playgoers. While few could honestly relate to the horror of a fierce whipping, virtually everyone had first-hand experience with separation and loss. The majority of mothers and fathers of that age knew first-hand the pain of the loss of a child. Widows and widowers and orphans were commonplace. In a day of limited communication, most Americans knew the ache of separation when friends and family moved on to the West, or embarked upon a sea voyage, or simply rode to a neighboring town. Whether or not Harriet Beecher Stowe anticipated the potent effect of making forced separation a main theme of the novel, she had touched upon a truly effective way to reach people and to permit them to truly understand and feel the horrors of a system that was legal in large areas of “The Land Of The Free.”

Suddenly, the wickedness of the institution of slavery became undeniable: through the portrayal of the forcible splitting of families, an enormous wrong that was such a plainly evil feature of slavery could no longer be ignored. In the book, Stowe has one character observe: “The most dreadful part of slavery, to my mind, is its outrages of feelings and affections – the separating of families, for example.” (Italics mine.) People knew then what we still know now: forcibly separating families is torture, and it is torture of the most callous and barbarous and unbounded sort, for it tortures entire families, young and old, and the torment goes on and on and on. And millions of Americans became aware that they could no longer be proud of a nation that permitted and protected such an intolerable system. Slavery was not just antithetical to the ideals and principles of a Free Country, it was intolerable evil, pure and simple. Mrs. Stowe’s decision to show the cruel injustice of slavery by first focussing on the forcible separation of families did more to capture people’s hearts and minds than decades of graphic emphasis on the physical cruelty. Stowe hit a resonant note, and the supporters of slavery were rightly terrified.

I make note of this today, especially, because the past still has so immensely much to teach us today. Systems and regimes that rely on such inhumane and uncivilized practices as the forced isolation of children from parents have always been on the wrong side of fundamental human morality, and they are ultimately judged to be on the wrong side of history as well. No matter the time, no matter the place, and no matter the justifications proffered, forcible separation of families is torture, and it is torture of the most callous and barbarous and unbounded sort. This cruel practice tortures entire families, young and old, and the torment goes on and on and on. Americans woke up to this stark and undeniable reality 164 years ago; we must wake up to it again. It is true that there are those who earnestly approve of such barbarity today, just as there were in the past. But these are bad people today just as they were in the past. Our great and prosperous nation must not once more join the ranks of those wicked and ultimately failed governments that were led by vicious oppressors or murderous tyrants. Those unspeakably benighted leaders justified their brutality as being “the law.” How very hollow and cowardly it sounds to hear our own Executive Branch claiming that “the law” demands such vile cruelty. Forcibly separating families is despicable. Always. Any time. Anywhere. Today’s victims of such brutality are non-citizens; this does not mean that they are non-people. These are human beings, no matter how poor they may be, no matter how little they bring, no matter what their ethnic heritage or social background may be. These are people. Our own self-respect as human beings, and our nation’s very character, demand that we treat these people with compassion. Those who attempt to enter this country illegally must be dealt with, but this country has both the resources and the character to do so without descending to barbarism.

The past is not dead; it is not even past: it is with us in the present. But knowing the past and frankly facing it with all its glories and all of its shames, is necessary to living today and building a better future for all of us.

-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-
Jamie Rawson
Flower Mound, Texas

Fable is more historical than fact, because fact tells us
about one man and fable tells us about a million men.

— G. K. Chesterton

FURTHER READING

Bury The Chains: Prophets and Rebels in the Fight to Free an Empire’s Slaves, Adam Hochschild; Houghton Mifflin, 2005: ISBN: 0618104690

A professor of journalism at the University of California, Berkeley, Hochschild has nevertheless built an impressive reputation as a historian, though perhaps “popular historian” should be used (historians can be a snarky bunch, and there always seems to be a certain disdain for those who write things that many people actually want to read!) In Bury The Chains, Hochschild recounts the struggle of the British anti-slavery movement. He notes that this cause was the first modern popular cause, employing mass media – newspapers and broadsheet posters – and organizing economic action against slavery in the form of sugar boycotts. He says, “It was the first time a large number of people became outraged, and stayed outraged for years, over someone else’s rights.” This book too has its villains, such as Banastre Tarleton (the evil English dragoon colonel featured in Mel Gibson’s The Patriot; he was pro-slavery in Parliament), and its heroes, such as John Newton, who wrote Amazing Grace.

Uncle Tom’s Cabin online

The Great Fire Of London

With this post, I depart from my usual practice of noting an anniversary or similar connection to an event. This topic is not inspired by a long past event on this date nor an especial anniversary. The Great Fire of London happened 351 years ago last month. But given current events, it is perhaps pertinent and fitting to speak about this historical event as fires yet rage in California.

Between Sunday, 2 September 1666 and early Wednesday 5 September 1666, the Old City of London was almost completely destroyed by a devastating conflagration known ever after as The Great Fire of London. The fire was unprecedented both in scope and scale – more than 15,000 structures were consumed over an area of more than 700 acres within the ancient city which was at that time still bounded by its Roman walls built almost 1,500 years before.

The fire was fiercely fueled by the fact that the vast majority of London’s buildings were made of wood. Today, we know London as a city of incredibly creative and beautiful brickwork and stone masonry, but in the late 17th Century, cheap and easy wood construction dominated. The presence of so much ready fuel – the structures themselves – plus the additional effect of stores of goods such as turpentine, pitch, hemp, and timber – all so necessary for a maritime economy – as well as enormous amounts of gunpowder, which was inevitable in a great military capital, meant that the fire was able to burn with an unusual intensity: ingots of steel and bronze liquefied in the heat and poured like water into gutters running into the Thames.

The Lord Mayor of London was unwilling to take action against the fire, and though it was contrary to tradition and law, ultimately King Charles II commanded the creation of fire breaks which required that rows of houses and shops and warehouses be pulled down or, ultimately, blown up with gunpowder. This meant that property was destroyed before the fire reached it, but it also meant that the fire was, at last, stopped. This tactic, plus the cessation of powerful, dry winds finally brought an end to the devastation.

But London was, effectively, no more. More than 80 churches, in those days the center of neighborhood life, and more than 13,000 homes were gone. The Old City was a black and reeking ground.

Some visionaries such as the remarkable architect Sir Christopher Wren proposed a complete redesign of London and a rebuilding that would utterly modernize the ancient city. But King Charles II knew that England could not long stand without London in full working order – England was even then still engaged in a costly and dangerous war with The Netherlands. So the time needed to effect a complete redesign of the City was out of the question. The King convened special courts to settle matters of land ownership and legal responsibilities. These special courts facilitated rapid reconstruction, though they left many unsatisfied.

NOTABLY, in the aftermath of the fire, London’s building codes – already strict “on paper” – were strengthened and, more importantly, firmly enforced. No longer would wooden buildings be permitted; no longer would thatched roofs be allowed. Any new buildings in London were required to be constructed of stone or brick, with roofs of tile or slate. And until 1997, when the recreated Globe Theater was constructed, these restrictions were rigidly enforced.

London has suffered fires since 1666, of course. There was no way to escape vast fire damage during The Blitz when German bombers carpeted whole neighborhoods of the great city with incendiary bombs. But building with fire-resistant materials proved quite wise indeed. Though London went up in flames several times as The Blitz was waged, the flames were quickly and efficiently contained.

The Great Fire of London happened more than 350 years ago. The immediate and forceful response of London’s government in the wake of the fire has helped protect London ever since.

I have always wondered why it is that in our nation, particularly in areas where fire is a frequent and devastating threat, we still gladly and blithely accept the practice of building our homes and schools and churches and places of business out of easily flammable and readily combustible material. The technology to build fire-resistant structures is more than 5,000 years old, and in the past few millennia it has been improved. Why do we still build with so much fire fuel?

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Jamie Rawson
San Antonio, Texas

Your country is desolate, your cities are burned with fire.

— Isaiah, 1:7