About Jamie Rawson

I make my living teaching a variety of high-tech subjects, but my undergraduate degree is in history, and history remains an avocation. I have diverse and widely varied interests and opinions, but if there is any theme which ties all of this together, it is perhaps Professor William Slottman's view that we study history to learn compassion.

Though A Good Law, A Deep Shame, Too

I guess that Congress would like me to feel proud that we at last have a Federal law criminalizing the vicious terrorism and utterly lawless crime of lynching.

I suppose I should be relieved that it has finally been formally criminalized.

But I am not proud. I am ashamed. Deeply, achingly ashamed.

Congress has failed to enact this plainly obvious legislation almost two hundred times since 1900. TWO HUNDRED TIMES. In the 120 years between 1900 and 2020, Congress never once decided that this legislation was worthy. Not during the heyday of “Jim Crow.” Not during the resurgence of the Ku Klux Klan in the 1920s. Not during the racial tensions preceding the Civil Rights Era, and not during it or thereafter.

Not in a dozen decades has Congress had the courage or principle to do the right thing before today. Today. Two decades into the 21st Century.

If there is any piece of evidence that shows how slowly we have progressed, surely this is it. How could it ever have been a question whether to make lynching a crime? And by the time of Emmett Till’s murder, sixty-five years ago, it was already long overdue to treat lynchings as especially heinous. But to have allowed twelve decades to elapse before today’s legislation is a disgrace that fully covers this nation with a heavy blanket of grievous shame! Yes, it is right to finally accomplish this. But the circumstances are still a deep and pervasive shame.

I am glad this is finally going to be sent to the President for his signature, which surely will happen, no matter his personal preferences, but I am ashamed that it has taken twice my entire life-span to come to pass.

S P Q R

This is a distinctly “political” post. But herein I make no citations of contemporary politics; I write about the politics of two millennia past. The subject is a favorite theme of mine, because I find it illuminating: The Roman Republic. This is far too brief to be comprehensive, but I have aimed to keep it short enough to be readable.

* * *

The Roman Republic may well be the most important, influential, and enduring republic that human history has ever produced. Its span lasted from the overthrow of the Roman Kings in about 509 BC until the rise of Augustus Caesar as the first Emperor of the Roman Empire in 27 BC. In those five centuries, Rome grew from a small, agrarian city in a backwater of the Mediterranean world into an empire that became unrivaled in human history. (Some empires governed greater land areas, and some lasted longer by some measures, but for size and duration combined, no other empire truly comes close.) And during those five centuries, Rome excited the interest and envy of the classical world from Spain to Egypt. The Greek political historian Polybius, writing in the mid-second century BC, attributed Rome’s unparalleled success to its ingenious blending of Democracy (Rome’s Popular Assemblies), Aristocracy (Rome’s Senate), and Monarchy (Rome’s binary executive, its two Consuls).

Rome’s masterful blending of these varied elements of ancient forms of government became known as “The Concerns Of The People,” in Latin, “RES PVBLICA,” which evolved into our modern term “republic.” Rome was never a pure democracy (there have been precious few of these in the world, because pure democracy does not scale well to larger polities), but there was the democratic element of the Popular Assemblies, where registered citizens cast their votes to elect governmental officials. These assemblies also originated a variety of domestic legislation, including bills of finance. Rome’s abiding dread of Kingship led to the development of a *dual* executive consisting of two Consuls, selected from the top two vote-getters in an annual election. Each Consul could override the acts of the other, which made for a cumbersome yet typically surprisingly effective executive. But the third “branch” of Rome’s creative government was the most powerful: the Senate.

The Senate was not directly elected. Those who had served in the senior elected offices sat in the Senate, and to a very real degree, membership in the Senate was an inherited position, because a small number of influential families maintained a presence in the Senate for centuries. The Julian clan, from whom Julius Caesar sprang, for example, was represented in this august body for the entire span of the Republic and thereafter. The Senate controlled foreign policy, had the power to declare and wage war, and controlled a variety of crucial state revenues and resources. Yet the Senate was to a degree beholden to the popular assemblies. Notably, for much of the life of the Republic, the popularly elected Tribunes had the power to veto acts of the Senate (“VETO” means “I forbid!” and that is precisely what the Tribunes did).

For centuries, the Roman Republic successfully balanced the needs of the people with the wants of the aristocracy. This resulted in the pragmatic formation of an effective government which was willingly borne by commoners and nobles alike. It is worth remembering that Rome’s very self-identity is expressed by the acronym “SPQR,” “SENATVS*POPVLVSQVE*ROMANVS,” “the Senate and the Roman People.” This formula succinctly expressed Rome’s concept of its government.

As Rome grew more and more powerful, the impact and effect of an expanding empire induced strains upon this successful civil government. By the late First Century BC, several factions formed within the Senate, ultimately becoming two major rivals: The “Optimates,” or “the Best People,” and the “Populares,” or “the Popular Party.” The Optimates clearly favored the interests of many aristocrats in the Senate. (“Aristocrat,” bear in mind, comes from the Greek for “government of the best people.”) The Populares were associated with many programs that assisted the common citizen, such as agrarian land reform and public assistance for citizens.

Note that I will refer to these two competing groups as “factions” and not parties. They never represented anything like what we have termed political “parties” for the past 250+ years. In modern times, some interpreters assert an analogy between the Optimates Faction and the United States’ own Republican Party; likewise a parallel is drawn between the Populares and today’s Democratic Party. The comparison is temptingly easy, and satisfyingly simple. Yet it is also unhelpful and ultimately false. The two factions in Rome’s Senate did each have certain “popular” or “optimate” “planks,” but fundamentally, these two factions did not represent any identifiable principles at all.

Marxian historian Michael Parenti, in his 2003 work, “The Assassination of Julius Caesar,” claimed to find in Julius Caesar a champion of the common man against the aristocrats. Yet this seems far too much of a reach. Julius Caesar did build a powerful base of support from the commoners, but he was an aristocrat first and foremost. The real basis of the conflict and contest between Rome’s two major factions was simply rivalry for power. Ideology, such as it existed, was purely a means to the end of gaining power. The Populares built their base upon the leverage of the Popular Assemblies and the like, whereas the Optimates more plainly served the interests of the power elite. But neither “party” really gave a tinker’s dam about the people. The superordinate goal was simply power for each faction’s adherents and cronies.

By the late period of Rome’s remarkable Republic, the only consistent characteristic of either of its leading political factions was a drive for power for its own supporters and the destruction of its rivals. So consumed by this petty infighting did the Republic become that its last century was basically ten decades of upheaval, chaos, and civil disorder. So fixated were Rome’s Senatorial factions on this rivalry that they neglected or abandoned the care of the state and the needs of the people; misery and catastrophe dominated. The end of this confusion came about with the demise of the Republic. First, Julius Caesar seized control as “Dictator for Life,” and after his assassination, his great-nephew and adopted son Octavian rose to become the first Emperor of Rome.

Stability finally returned to Rome, but the Republic survived only in ceremonial references. The people of Rome never regained their ancient Rights. For the final one thousand years that the last vestiges of Rome’s Empire survived, only autocrats and aristocrats had political sway. There were no parties, only factions; no principles, only the drive for power.

—————————————

Jamie Rawson
Flower Mound, Texas

I doubt that I can even begin to compose a proper bibliography. But some titles that have informed me include:

Michael Parenti, “The Assassination Of Julius Caesar,” ISBN-10: 1565847970

H. H. Scullard, “From The Gracchi To Nero,” ISBN-10: 0415584884

Erich Gruen, “The Last Generation Of The Roman Republic,” ISBN-10: 0520201531

A Tiresome Bit Of Obsessivity Again Raises Its Head

The question is again pressing upon us:

IS 2020 THE START OF A NEW DECADE?

This question comes up repeatedly. It makes readily appreciable sense to account a new decade as starting with the rollover of the digits: we easily recognize that 2020 is a clear division in our counting scheme, so it is sensible that it should serve as the boundary for a new decade. Yet the most insistent amongst us declare that, no, a decade starts with the xx01 year following xx00. Of course, since there was no Year Zero, it is true, from a purely mathematical perspective, that the first “decade” would have been from Year One to Year Ten, inclusive, because a decade is, strictly speaking, ten years.

But a “decade” is also a generalized term of calendration, much as a month, a year, or a century is. Yet we have absolutely no problem with irregular months, or variable years, or inconsistent centuries, so it seems rather strange to insist upon rigid consistency with decades.

Since long before Julius Caesar declared, upon extensive consultation and coordination with his Greco-Egyptian astronomer Sosigenes, that 46 BC (“Annus Confusionis“) should last a rather astonishing 445 days, irregular calendration had been pragmatically used from time to time to resolve issues with the calendar. Add to this the fact that Caesar promulgated the insertion of a quadrennial “Leap Day” to correct the calendar, so that every fourth February is one day longer. But, because Sosigenes’ every-four-year correction proved a bit too aggressive, Pope Gregory XIII acceded to the omission of ten days to reset the calendar in 1582 to align with celestial mechanics (and when England finally got on board in 1752, eleven days were excised).

Along with the skipping of days, the “Gregorian” calendar changed such that only century years divisible by 400 would be leap years thereafter. (In 1600 and 2000 February had 29 days, but in 1700, 1800, and 1900, February had its usual 28 days.) These extra-calendrical solutions are purely pragmatic rather than algorithmic. They definitely mar the “purity” of the calendar, but they work! This sort of practical fudging is still done periodically today: “leap seconds” are from time to time inserted into the scheme of things. So there is no reason to insist that decades logically have to start on years ending in one. Simply no reason. One cannot cite calendrical consistency, nor mathematical purity, to back up such an insistence. It is literally without precedent.
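
As an aside, the post-1582 century rule is easy to state precisely. Here is a minimal sketch in Python, my own illustration rather than anything official, just to make the rule concrete:

    def is_leap(year: int) -> bool:
        """Gregorian rule: every fourth year is a leap year, except that
        century years are leap years only when divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    # Matches the examples above: 1600 and 2000 leap; 1700, 1800, 1900 do not.
    for y in (1600, 1700, 1800, 1900, 2000):
        print(y, is_leap(y))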

The reality is that decades pragmatically run from 0 – 9 because that is the way people treat them. It is plainly practical. A calendrical genius noted back in the late 1990s that the matter is actually readily resolved with a simple bit of pragmatic reasoning, one which noted essayist Stephen Jay Gould concurred with: The first “decade” of the current era simply had nine years; the rest are correctly aligned 🙂 This simple, pragmatic resolution makes all of the debate moot.
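
If it helps to see that pragmatic convention spelled out, here is a trivial sketch of my own (the function name decade_of is purely illustrative):

    def decade_of(year: int) -> int:
        """Pragmatic convention: a decade runs from a year ending in 0
        through the year ending in 9; the first 'decade' (years 1-9)
        simply had nine years."""
        return (year // 10) * 10

    print(decade_of(2019))  # 2010
    print(decade_of(2020))  # 2020 -- the start of a new decade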

This next New Year, 2020, is clearly the start of the newest decade. Anything else is an excess of pedantry.

THE CALENDAR IS HERE TO SERVE HUMANKIND,
NOT HUMANKIND TO SERVE THE CALENDAR.

 

Fifty Years Ago

Some of us are going to have an extremely difficult time believing it, but it was FIFTY years ago today, on 20 July 1969, that a human being first walked on the moon!

“That’s one small step for [a] man; one giant leap for mankind,” said Neil Armstrong as he touched toe upon the lunar landscape. Watching this on television at our neighbors’ house, despite the noisy static (which may be why the “a” seems to have gone unspoken; without it, the line is a bit off), it was clear that the quote was intended to be a famous and inspiring legacy. The conquest of space was not simply an act of American bravado, but a development that all human beings could share in.

At the time, the broadcast of the event gained the largest world-wide audience ever recorded. For a short while, hundreds of millions of people around the globe were united in watching and celebrating an achievement of the most dramatic sort: new heights had been reached, and in a peaceful enterprise (albeit one that certainly had potential military implications.) If you were old enough to have watched this momentous event, do you recall where you were at that time?

My sisters were at a camp in the Colorado Rockies, where the chance to watch television was an unheard-of and special exception to business-as-usual. A friend was in Vermont, staying at the Middlebury Inn, and watched in the inn’s common room. An older work colleague of mine recalled having listened to the broadcast on an ancient, unreliable tube radio in an otherwise unmemorable bar in Talkeetna, Alaska. No matter where or how, the event remains vivid for most everyone who watched. For a brief point in time, the medium of television perhaps truly reached its great potential.

I recall watching the event on a large television at the house of our next-door neighbor. They had a pool in their back yard and decided to make the occasion a festive and holiday sort of event, rather like Independence Day: a large grill was used to cook large quantities of hamburgers and albacore-patty “burgers,” and a multitude of neighborhood kids splashed and played in the pool. After both Neil Armstrong and Buzz Aldrin had ventured onto the surface of the moon, the television coverage became a bit less exciting for us kids, so back to the pool we went. But I do remember, once full night had settled, looking up at the moon in the night sky and asking my Mom if she thought we would one day plant colonies in outer space. “Yes,” she said, “but not in my lifetime.”

I cannot help but reflect that Spain landed in Mexico five hundred years ago, in 1519, and within thirty-two years had founded a university in the thriving metropolis of Ciudad Mexico; likewise Spain landed in Peru in 1531 and within twenty years had established a university in the thriving metropolis of Lima. Now, to be fair, Spain, for all that the age was technologically primitive by today’s standards, did not have to deal with the challenges of blasting away from earth’s atmosphere and existing on an atmosphere-free rock in space. But even so, the speed with which the New World was integrated into the Old was astonishing. It is also true that the integration was neither mutually beneficial nor benign: the peoples present at the arrival of Spain suffered mightily from the contact and conquest. Nevertheless, questing humans had reached a new and utterly foreign world and made that world integral to their own within a generation. Yet fifty years after “One small step for a man …”, space remains almost wholly the stuff of science fiction.

Certainly the race for the moon was an outgrowth of the Soviet/American Cold War, and the technological achievement served to impress the Soviets with how much a committed United States could achieve in an amazingly short time, but it was and is more than that. Mankind is a questing species. New horizons have always beckoned. To paraphrase a quote from a television show that predates the moon landing, “Space is the final frontier.” It remains to be seen if we ever return.

Jamie Rawson
Flower Mound, Texas

It is not because things are difficult that we do not dare; it is because we do not dare that things are difficult.

— Seneca

When What Is Legal Is Not What Is Right

On 20 March 1852, Harriet Beecher Stowe’s landmark novel Uncle Tom’s Cabin; or, Life Among The Lowly was first published. The book unquestionably ranks as one of the most important novels in world history: it was the best seller of the entire Nineteenth Century throughout the world, second only to the perennial leader, The Bible, and it was significantly responsible for creating a profound change in attitudes toward slavery throughout the United States of America, though mainly, of course, in the Northeast and Midwest. Pro-slavery and Abolitionist sentiments had been at odds from the very earliest days of the founding of our republic, but in the wake of the publication of Uncle Tom’s Cabin, abolitionist sentiment grew immensely. Tolerance for the manifest hypocrisy of the evil of slavery flourishing in “The Land of the Free” grew less and less. As absolutists on each side clashed, war seemed inevitable. Upon being introduced to Mrs. Stowe at the White House, Abraham Lincoln is reported to have said, “So this is the little lady who made this great big war!”

The book sold out its first run almost immediately. Before the end of 1852, more than 300,000 copies had been printed in the United States, an unprecedented success at that time. Another 200,000 copies were printed elsewhere in the English-speaking world, and translations began to appear in other languages before the year was out. These figures do not include unauthorized and “bootleg” copies that flooded the market as well, nor the many unauthorized abridgements and digests of the book. Additionally, dramatic interpretations ranging from public readings to many hundreds of theatrical versions brought the tale to millions of people. It was a phenomenon unlike any which the world had seen. And, as noted above, its impact was immense and immediate.

For today’s tastes, Uncle Tom’s Cabin is overwrought, melodramatic, and verbose in the extreme. Stowe’s prose styling makes one think of Charles Dickens as spare and concise by comparison. The novel’s major appeal today lies in its historical importance. It is today much more often read about than read; indeed, some recommend against actually reading the unabridged original novel because it contains characterizations of its enslaved protagonists that are distinctly stereotypic and often unflattering (though no one decries the portrayal of the wicked and greedy Simon Legree, whose name is still an epitome of nastiness and evil). This seems to me to be a perfectly fine state of affairs, as the book really is a labor to slog through; it is enough to know what it was about and to know its impact.

But there is an important question that one must ask: since anti-slavery movements and the cause of abolition had been present in American society since the 18th Century, what was it that gave this book such an unusual impact? Why was Stowe’s depiction of slavery suddenly more powerful and more effective at raising anti-slavery sentiment than the many efforts that had come before? Why did Uncle Tom’s Cabin ignite a fire that had merely smoldered previously?

The reasons are no doubt many, but foremost among these is that a major theme of the novel is the forced separation of families. The brutality of this forced tearing apart had been raised repeatedly by abolitionists, but their tracts and treatises and speeches often found limited audiences, audiences of already like-minded people. Stowe’s novel was a gripping and emotionally affecting epic that reached many millions who perhaps had never really given serious thought to the issue of slavery or its abolition. Though anti-slavery activists placed great emphasis on the beatings and lashings and physical cruelty of the institution of slavery, such sensationalism apparently failed to foster a lasting concern among most Americans. The United States of America was mainly rural in the 1850s, and it was perhaps easy for those who heard a speaker describe the torments inflicted upon slaves to accept these as being aberrations or outlying incidents; every farmer knew that no wise farmer would regularly abuse his valuable resources in such a way, for such behavior would be self-destructive. Thus, even when confronted with first-person accounts and eye-witness testimonials, most Americans seemed quite able to dismiss these cases as being sufficiently rare as to be unworthy of sustained outrage, and unnecessary to work against.

Perhaps the abolitionists’ putting so much emphasis on the physical horrors of slavery had in reality caused far too many Americans to assume they were exaggerating. Perhaps the nation’s citizens simply could not admit that their country countenanced such evil. Perhaps they were simply in denial.

But Uncle Tom’s Cabin changed all of that at one fell swoop! Almost literally overnight, the truth of the pernicious cruelty of slavery became palpably real through the medium of an utterly fictitious novel. While physical torments and tortures might be impossible for readers to effectively conceive, the unending trauma brought about by forcibly ripping families apart was far more “real,” far more familiar and readily conceivable. Stowe’s melodramatic account of “poor Eliza” facing the prospect of being rent forever from her son touched millions of readers of the book, listeners at public readings, and playgoers. While few could honestly relate to the horror of a fierce whipping, virtually everyone had first-hand experience with separation and loss. The majority of mothers and fathers of that age knew first-hand the pain of the loss of a child. Widows and widowers and orphans were commonplace. In a day of limited communication, most Americans knew the ache of separation when friends and family moved on to the West, or embarked upon a sea voyage, or simply rode to a neighboring town. Whether or not Harriet Beecher Stowe anticipated the potent effect of making forced separation a main theme of the novel, she had touched upon a truly effective way to reach people and to permit them to truly understand and feel the horrors of a system that was legal in large areas of “The Land Of The Free.”

Suddenly, the wickedness of the institution of slavery became undeniable: through the portrayal of the forcible splitting of families, an enormous wrong that was such a plainly evil feature of slavery could no longer be ignored. In the book, Stowe has one character observe: “The most dreadful part of slavery, to my mind, is its outrages of feelings and affections – the separating of families, for example.” (Italics mine.) People knew then what we still know now: forcibly separating families is torture, and it is torture of the most callous and barbarous and unbounded sort, for it tortures entire families, young and old, and the torment goes on and on and on. And millions of Americans became aware that they could no longer be proud of a nation that permitted and protected such an intolerable system. Slavery was not just antithetical to the ideals and principles of a Free Country, it was intolerable evil, pure and simple. Mrs. Stowe’s decision to show the cruel injustice of slavery by first focussing on the forcible separation of families did more to capture people’s hearts and minds than decades of graphic emphasis on the physical cruelty. Stowe hit a resonant note, and the supporters of slavery were rightly terrified.

I make note of this today, especially, because the past still has so immensely much to teach us. Systems and regimes that rely on such inhumane and uncivilized practices as the forced isolation of children from parents have always been on the wrong side of fundamental human morality, and they are ultimately judged to be on the wrong side of history as well. No matter the time, no matter the place, and no matter the justifications proffered, forcible separation of families is torture, and it is torture of the most callous and barbarous and unbounded sort. This cruel practice tortures entire families, young and old, and the torment goes on and on and on.

Americans woke up to this stark and undeniable reality 164 years ago; we must wake up to it again. It is true that there are those who earnestly approve of such barbarity today, just as there were in the past. But these are bad people today just as they were in the past. Our great and prosperous nation must not once more join the ranks of those wicked and ultimately failed governments that were led by vicious oppressors or murderous tyrants. Those unspeakably benighted leaders justified their brutality as being “the law.” How very hollow and cowardly it sounds to hear our own Executive Branch claiming that “the law” demands such vile cruelty.

Forcibly separating families is despicable. Always. Any time. Anywhere. Today’s victims of such brutality are non-citizens; this does not mean that they are non-people. These are human beings, no matter how poor they may be, no matter how little they bring, no matter what their ethnic heritage or social background may be. These are people. Our own self-respect as human beings, and our nation’s very character, demand that we treat these people with compassion. Those who attempt to enter this country illegally must be dealt with, but this country has both the resources and the character to do so without descending to barbarism.

The past is not dead; it is not even past: it is with us in the present. But knowing the past, and frankly facing it with all its glories and all of its shames, is necessary to living today and to building a better future for all of us.

-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-
Jamie Rawson
Flower Mound, Texas

Fable is more historical than fact, because fact tells us
about one man and fable tells us about a million men.

— G. K. Chesterton

FURTHER READING

Bury The Chains: Prophets and Rebels in the Fight to Free an Empire’s Slaves, Adam Hochschild; Houghton Mifflin, 2005: ISBN: 0618104690

A professor of journalism at the University of California, Berkeley, Hochschild has nevertheless built an impressive reputation as a historian, though perhaps “popular historian” is the better term (historians can be a snarky bunch, and there always seems to be a certain disdain for those who write things that many people actually want to read!). In Bury The Chains, Hochschild recounts the struggle of the British anti-slavery movement. He notes that this cause was the first modern popular cause, employing mass media – newspapers and broadsheet posters – and organizing economic action against slavery in the form of sugar boycotts. He says, “It was the first time a large number of people became outraged, and stayed outraged for years, over someone else’s rights.” This book too has its villains, such as Banastre Tarleton (the evil English dragoon colonel featured in Mel Gibson’s The Patriot; he was pro-slavery in Parliament), and its heroes, such as John Newton, who wrote “Amazing Grace.”

Uncle Tom’s Cabin online

Het Rampjaar – The Disaster Year

In Dutch history, the year 1672 has from that time to the present been known as “Het Rampjaar,” “The Disaster Year.”  So many catastrophes and calamities befell the Dutch Republic in that year, that to the Netherlanders of that day and later, the whole year merited the description “Disaster.”  The Dutch today describe their forebears of 1672 as “Het volk redeloos, de regering radeloos en het land reddeloos;”  “The people, irrational; the government, irascible; the country, irredeemable.”  

Prior to 1672, the tiny nation had seemed to be enjoying a charmed and fortunate existence: After decades of struggle, the United Provinces of The Netherlands had finally achieved independence from Spanish rule and in 1648 had secured international recognition as a nation.  Unusually, in an age of emperors and kings and princes, the United Provinces formed a republic to govern themselves; this proved a brilliant boon to the nation.

The governments of monarchs were notoriously bad risks for money lenders, for if a king died in debt, his debts often died with him.  But a republic, well, that was another thing altogether.  A republic, like a corporation, is intended to be immortal.  This stability also means that a republic cannot simply shed its debts by the death of a leader.  So the Dutch Republic quickly became known as a uniquely good risk for money-lending.  Basically, the Dutch Republic had a really high credit score, and the result was that it could borrow money far more cheaply than the great kingdoms around it.  And wars cost lots and lots of money.

Thus it was that the tiny Dutch Republic could rival England and France and the sundry German potentates on fields of battle, and utterly outdo these rivals in the commercial arena.  Small though she was, the Dutch Republic in the 17th Century sailed the largest merchant fleet in the world and maintained a navy that was numerically on par with those of France and England, and which had by far the fastest, most maneuverable ships.  This is why the reach of The Netherlands extended from the West Indies, where today’s Kingdom of The Netherlands still holds sovereign possessions, to the East Indies, and from New York (“Nieuw Amsterdam,” originally) to Cape Town.  Such a trading empire also enabled the urban middle class of the Dutch Republic to become the wealthiest in the world, allowing them to build comfortable urban residences and to become patrons of a luminous constellation of portraitists and landscape painters of unsurpassed ability, so that their compact and efficient homes could be adorned with artworks.

But “Het Rampjaar” changed everything.  In the wake of the Disaster Year, Prince William of Orange and his supporters gained control of the government, allowing William to be granted the title of Stadtholder, “keeper of the state,” and to act as a de facto monarch.  Though the Dutch Republic was not destroyed — it would officially persist until the time of Napoleon some 130 years later — it became less and less meaningful and finally yielded to a royal kingdom in the early 19th Century, a status it retains today.

So what happened during Het Rampjaar?  In the main, too many wars on too many fronts happened simultaneously.  England, France, and a coalition of German Princes and Electors all attacked The Netherlands, and the invading armies conquered much of the nation’s territory.  Cities were pillaged, immense stores of goods were looted or burned in merchants’ warehouses, people were forced to flee to safety, and civil disorder completely disrupted normal trade and commerce.  Of particular note, tensions which had strained the Republic’s politics for generations, between those who desired to retain their republic and those who aimed to install a proper royal monarch, finally exploded.  Johan DeWitt, the “Raadpensionaris” of the Republic (“Prime Minister,” effectively) and his brother were attacked, at the instigation of Admiral Cornelius Tromp, by a mob of Orangists — supporters of Prince William of Orange — who tore the men limb from limb and are said to have roasted and eaten their flesh!  After two generations of unparalleled prosperity and success, the Dutch were unprepared for defeat and temperamentally unsuited to cope with it, and the nation was riven at precisely the time when it could least afford any disunity.

Of course, the Dutch Republic was, in the end, utterly lost, but The Netherlands remains a vital and vibrant nation to this day.  Despite the upheavals of the Disaster Year, the country did not vanish; the people did not fade away.

So, why am I writing about Het Rampjaar today?  The events contained within that “year” actually span a period from February 1672 until about March of 1673.  Historic forces rarely turn neatly upon the hinge of the calendar.  And I observe that in the later annals of our own age, we might very well find 2017 becoming known as our own “Rampjaar,” running from 8 November 2016 through November 2017.  While we have not been beset by multiple invaders, thanks be, we have seen an almost unrelieved stream of incomprehensible acts and utterances from our leadership. The observation, “The people, irrational; the government, irascible; the country, irredeemable,” seems frightfully fitting.

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Jamie Rawson
Flower Mound, Texas

The Great Fire Of London

With this post, I depart from my usual practice of noting an anniversary or similar connection to an event. This topic is not inspired by a long past event on this date nor an especial anniversary. The Great Fire of London happened 351 years ago last month. But given current events, it is perhaps pertinent and fitting to speak about this historical event as fires yet rage in California.

Between Sunday, 2 September 1666 and early Wednesday 5 September 1666, the Old City of London was almost completely destroyed by a devastating conflagration known ever after as The Great Fire of London. The fire was unprecedented both in scope and scale – more than 15,000 structures were consumed over an area of more than 700 acres within the ancient city which was at that time still bounded by its Roman walls built almost 1,500 years before.

The fire was fiercely fueled by the fact that the vast majority of London’s buildings were made of wood. Today, we know London as a city of incredibly creative and beautiful brickwork and stone masonry, but in the late 17th Century, cheap and easy wood construction dominated. The presence of so much ready fuel – the structures themselves – plus the additional effect of stores of goods such as turpentine, pitch, hemp, and timber – all so necessary for a maritime economy – as well as enormous amounts of gunpowder, inevitable in a great military capital, meant that the fire was able to burn with an unusual intensity: ingots of steel and bronze liquefied in the heat and poured like water into gutters running into the Thames.

The Lord Mayor of London was unwilling to take action against the fire, and though it was contrary to tradition and law, ultimately King Charles II commanded the creation of fire breaks, which required that rows of houses and shops and warehouses be pulled down or even blown up with gunpowder. This meant that property was destroyed before the fire reached it, but it also meant that the fire was, at last, stopped. This tactic, plus the cessation of powerful, dry winds, finally brought an end to the devastation.

But London was, effectively, no more. More than 80 churches, in those days the center of neighborhood life, and more than 13,000 homes were gone. The Old City was a black and reeking ground.

Some visionaries such as the remarkable architect Sir Christopher Wren proposed a complete redesign of London and a rebuilding that would utterly modernize the ancient city. But King Charles II knew that England could not long stand without London in full working order – England was even then still engaged in a costly and dangerous war with The Netherlands. So the time needed to effect a complete redesign of the City was out of the question. The King convened special courts to settle matters of land ownership and legal responsibilities. These special courts facilitated rapid reconstruction, though they left many unsatisfied.

NOTABLY, in the aftermath of the fire, London’s building codes, already strict “on paper,” were strengthened and, more importantly, firmly enforced. No longer would wooden buildings be permitted; no longer would thatched roofs be allowed. Any new buildings in London were required to be constructed of stone or brick, with roofs of tile or slate. And until 1997, when the recreated Globe Theater was constructed, these restrictions were rigidly enforced.

London has suffered fires since 1666, of course. There was no way to escape vast fire damage during The Blitz, when German bombers carpeted whole neighborhoods of the great city with incendiary explosives. But building with fire-resistant materials proved quite wise indeed. Though London went up in flames several times as The Blitz was waged, the flames were quickly and efficiently contained.

The Great Fire of London happened more than 350 years ago. The immediate and forceful response of London’s government in the wake of the fire has helped protect London ever since.

I have always wondered why it is that in our nation, particularly in areas where fire is a frequent and devastating threat, we still gladly and blithely accept the practice of building our homes and schools and churches and places of business out of easily flammable and readily combustible material. The technology to build fire-resistant structures is more than 5,000 years old, and in the past few millennia it has only improved. Why do we still build with so much fire fuel?

-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
Jamie Rawson
San Antonio, Texas

Your country is desolate, your cities are burned with fire.

— Isaiah, 1:7