The “Radium Girls”, Deodands, and the Rise of Worker’s Compensation in the U.S.

Photo of Lower Jaw of a Deceased Radium Girl

Last week I had the pleasure of being invited to participate in a ‘talk back’ session following a performance of Radium Girls with cast members and several trial lawyers who handle worker’s compensation cases. Put on at the Regent Theatre in Arlington by the Burlington Players, and sponsored by the Massachusetts Academy of Trial Attorneys, Radium Girls is a powerful play that traces the events of the mid-1920s as a group of five women employed by the U.S. Radium Corporation try to recover damages in court while succumbing to horrible ailments caused by radium poisoning. Radium, absorbed into the body in a way not dissimilar to calcium, wreaks havoc with bone tissue, often concentrating in the mouth and jaw. It caused debilitating, disfiguring and ultimately lethal injuries to many of those exposed to it–not unlike the ‘phossy jaw’ caused by exposure to phosphorus in the manufacture of matches–yet the legal system generally did not provide redress to workers for these types of injuries.

By way of background, the legal system in the early 19th century was ill-equipped to deal with the rapid technological, economic, social and other changes wrought by industrialization. Beginning in the 1820s in the U.K., and slightly later in the U.S., the advent of textile mills and other forms of industry created a sea-change in the way manufacturing was conducted. The legal system had had centuries to adapt rules regarding traditional employee-employer relationships, most often known as master-servant law (you can access my two articles on this subject here, if desired), and these cases contemplated cottage industries where there were close personal working relationships between an employer and perhaps (at most) a handful of servants. There was something in the nature of reciprocal responsibilities between the parties–while the law was tilted towards employers, servants did have legal recourse for non-payment of wages, breach of the terms of contracts or indentures, mistreatment, and the like. With the advent of industrialization, suddenly there were factories employing hundreds and sometimes thousands of workers, and issues that seldom had to be dealt with before became commonplace. For example, dams burst and flooded fields; steam boilers exploded and scalded or killed workers and steamboat passengers; employees had limbs horribly mangled in the cogs of industry; factories belched pollutants into the air and spewed effluvia into rivers; and locomotives–perhaps the most evident sign of progress–killed cattle, struck pedestrians crossing railway lines, and set fire to fields by spewing sparks into the air.

Industrialization was seen as a significant, perhaps an overriding, social good–and Anglo-American legal regimes generally reflected that. So powerful were the forces of progress that they swept away entrenched, centuries-old legal principles that conflicted with them: my favorite example being the now-largely-forgotten law of deodand. Deodand was the ancient legal principle that if an animal or inanimate object occasioned the death of one of the King’s subjects, the item was forfeit to the Crown; over time the rule changed, commonly ascribing a monetary value to the object or animal instead, with that amount being transferred to the Crown or to the family of the deceased (a marvelous article on this topic, “The Deodand and Responsibility for Death”, may be found here). Somewhat predictably, it was that driver of industrialization known as the ‘railroad’ that was to prove that deodands had outlived their usefulness, as illustrated by the Sonning Cutting accident of 1841 in which 9 people were killed. The law of deodand was formally abolished by Parliament in 1846. It has been said to live on in the U.S. as the basis for the somewhat-related and contentious principle of “civil forfeiture” or “asset forfeiture”.

Employees who sustained injuries on the job were generally barred from recovering for their injuries by what has been referred to as the ‘unholy trinity of defenses to compensation’. The first doctrine was assumption of risk–simply put, a worker was assumed to know the risks of his employment, and to accept them, by virtue of accepting a wage. In theory, this rule posited, wages were adjusted to compensate for the level of risk, and workers were always free to work elsewhere. Employers, for their part, were only required to provide the level of safety measures common to the industry as a whole–a ‘leveling to the bottom’ scenario that meant few precautions related to worker safety were taken. Workers were also frequently required to sign employment contracts in which they waived their right to sue, known not-so-affectionately as “right to die” clauses. The rule of contributory negligence held that if the worker was in any way responsible for his injuries, then the employer could not be held liable; while the fellow servant rule held that employers were not responsible for the actions of another employee–an injured employee had to seek compensation from the fellow employee directly. Predictably, the effect of these principles was to essentially preclude employees from gaining compensation. A poem that wonderfully captures the injustice of these rules is Edgar Lee Masters’s (1868-1950) “Butch Weldy”, found in Poetry of the Law, edited by David Kader and Michael Stanford (University of Iowa Press, 2010), at 78:

After I got religion and steadied down
They gave me a job in the canning works,
And every morning I had to fill
The tank in the yard with gasoline,
That fed the blow-fires in the sheds
To heat the soldering irons.
And I mounted a rickety ladder to do it,
Carrying buckets full of the stuff.
One morning, as I stood there pouring,
The air grew still and seemed to heave,
And I shot up as the tank exploded,
And down I came with both legs broken,
And my eyes burned crisp as a couple of eggs
For someone left a blow-fire going,
And something sucked the flame in the tank.
The Circuit Judge said whoever did it
Was a fellow-servant of mine, and so
Old Rhodes’ son didn’t have to pay me.
And I sat on the witness stand as blind
As Jack the Fiddler, saying over and over,
“I didn’t know him at all.”

By the turn of the century, progress in the U.S. was evident. The earliest departures from these rules applied to railroads–which to this day have different statutory schemes governing worker’s compensation–with Congress passing legislation in 1906 and 1908 to soften the contributory negligence rule. Most reform, however, occurred state by state, with the first comprehensive worker’s compensation scheme enacted in Wisconsin in 1911 and the last in Mississippi in 1948. Meanwhile, injuries continued to mount; it was estimated that in 1900 there were 35,000 work-related deaths in the U.S. and some 2 million injuries. A gradual chipping-away at the law by jury awards, some legislative movement, and a growing sense of the unfairness of many worker’s compensation regimes–not to mention the rise of the contingency fee structure, which made legal services much more accessible to the working class–meant that over time these obstacles to compensation eroded.

These laws and cases, however, dealt with discrete, tangible, traumatic work injuries–they did not encompass, nor could they predict, the damaging effects of latent workplace injuries as exemplified by the experience of the Radium Girls, or by those exposed to phosphorus who contracted “phossy jaw” as mentioned earlier. In the later years of World War I and thereafter, companies such as the U.S. Radium Corporation produced luminous watch dials and other items, using radium salt mixed with zinc sulfide to form a paint known as “Undark”. Young women moistened their paintbrushes in their mouths to keep a fine point as they painted watch faces, working day after day in poorly-ventilated factories where everything was coated with radioactive dust. To amuse their boyfriends, they painted their teeth and fingernails to produce an enticing glow-in-the-dark effect. And glow in the dark they did! At no time were they told that radium was dangerous, even while technicians and others protected themselves from radium’s effects. Predictably, many of these women succumbed to horrific ailments, including necrosis of the jaw (known as “radium jaw”). Five such women fought a lengthy and high-profile legal battle against the U.S. Radium Corporation in the 1920s, culminating in a settlement in 1928–all along the way U.S. Radium denied liability and even smeared the women’s reputations by publicly claiming they were infected with syphilis, while also buying off dentists and doctors, using executives to pose as medical specialists, and employing delaying tactics in court, among other unsavory practices. None of the five women lived more than a few years after the settlement, but the saga helped shape public and political opinion. In 1949 Congress expanded legislative protections for workers harmed by occupational diseases, and industrial safety standards were ratcheted up in the years following the Radium Girls’ struggle. While radium-based paint was used extensively in the World War II period and as late as the 1960s, further cases of radium jaw were avoided through the use of safety procedures and training–procedures and training that were far from onerous, and that indicate how easily these tragedies could have been avoided.

And as Eleanor Swanson writes about them in her poem “Radium Girls”:  “Now, even our crumbling bones/will glow forever in the black earth”….

Late 19th C. Drawing of Phossy Jaw

NY Man Sues Ex-Fiancée After She Breaks Off Engagement

Some of my readers will remember that some time ago I wrote about the law and custom regarding broken engagements, including blog entries on the issue of the return of engagement rings and ‘breach of promise to marry’ lawsuits. In an interesting twist, a NY man, Steven Silverstein, is suing his ex-fiancée Kendra Platt-Lee–they were engaged to each other once before this latest chapter–for more than $61,000 in expenses related to their planned wedding and their time living together: $2,975 in wedding deposits, $13,756.69 in damages owed to wedding vendors, $19,269.90 in funds she withdrew, and $25,668.75 for what he alleges is her share of the rent he paid for their joint apartment while they lived together. She returned the $32,000 engagement ring he had given her; while he concedes she returned it, he alleges that her return of the ring indicates her awareness of the “conditioned nature of all gifts” given to her by Silverstein “in contemplation of marriage”.
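
For those keeping count, the itemized claims do add up to his headline figure: $2,975 + $13,756.69 + $19,269.90 + $25,668.75 = $61,670.34, consistent with the “more than $61,000” reported.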

As they apparently had no written agreement, I find it unlikely he would prevail for the full amount he is seeking–but of course I do not know what types of records he has kept related to these matters, what other evidence the parties are able to provide, or ultimately what a court would rule. Do considerations of equity mean that she owes him at least some of the amount he is seeking? If so, how much? Is this a breach of contract? Or should Silverstein just be out the money as the price of failed love? Let me know what you think!


A Long Time Coming…But Back in Print!

Hello, gentle readers! My next blog post will be up in a week or two, but I wanted to mention my latest work. It is a bit surreal to me to mention it, really, since I completed this as a part of my doctoral thesis at McGill in 2003. Several years later I took it up again and began tweaking it; it then went through a lengthy external review and editing process and, following acceptance, sat in the publication queue for another two years. All this to say that it is a great joy to see it in print, and in such good company, no less! I must also acknowledge that the kind folks at the Law and History Review were a joy with which to work. Regrettably, I had taken a few years off from publishing, but this begins the process anew, with a book chapter due out next year and a few other projects in the pipeline. The subject, while not great fodder for cocktail conversation, deals with an aspect of 19th century criminal justice in Montreal, namely the legal response to infanticide. Combing through the judicial archives and period newspapers uncovered a great deal of information on these otherwise unknown cases, and I hope I had some interesting things to say about this heavily-gendered area of the law, which also reflects one of my favorite themes, namely the intersection between law and custom. Montreal is also, I think, a particularly interesting jurisdiction to study, straddling as it does linguistic, juridical, ethnic and other divides. If you’re so inclined you can peruse the article here, and of course comments are always welcome! https://iancpilarczyk.com/wp-content/uploads/2012/04/So-Foul-A-Deed.pdf

‘So Foul A Deed’: Infanticide in Montreal, 1825-1850

From the introduction to the issue: “Our final article, by Ian Pilarczyk, examines the phenomenon of infanticide and the legal responses to [it] in Montreal from 1825 to 1850, a period marked by significant economic, social, political, and legal flux. Working with thirty-one unpublished case files of infanticide, he illustrates that the legal and social ramifications of this heavily gendered crime were characterized by complexity, compromise, and conflict. He finds that the Canadian response largely mirrored that of other nineteenth-century Western jurisdictions. This finding suggests that local context matters, but should also remind scholars to consider the significance of transnational patterns in policing.”

My intro, in part: “This article argues that infanticide, and the legal and social responses thereto, exhibited a compromise between conflicting sentiments, realities, and paradigms. As a result, the actions of defendants, prosecutors, judges and jurors, and the public at large were characterized by competing motives and countervailing sympathies. The infant victims were nominally the focus of the law, but in reality these acts were viewed as crimes against social conventions. The issue of infanticide during this period therefore presents a fascinating study in this heavily gendered area of nineteenth-century criminal law, reflecting stark differences between law and custom. This article will provide a brief discussion of the historiography and underlying methodology, followed by the political and historical context for the Montreal experience, before moving on to the issue of infant abandonment, coroner’s inquests, and the legal mechanics of infanticide prosecutions.”

‘So Foul A Deed’: Infanticide in Montreal, 1825-1850, 30 Law & History Review 575-634 (May 2012)

‘Ripped from Today’s Headlines’: The Alford Plea–Pleading Guilty But Protesting Innocence

Last week in class, I wandered off on a slight tangent related to the Alford plea. Coincidentally, the very next day this fascinating bit of contemporary legal history and criminal procedure made its way into the news, prompted by a fairly unlikely source: a troubled ex-point guard from the University of Washington, Venoy Overton. The many and varied ways in which legal history can surface–as well as coincidence–never cease to fascinate me, and this example is no exception. What makes the Overton story interesting to a legal historian is, in a word, his plea. The Alford plea, Alford guilty plea, or as it is sometimes more colloquially known, the “I’m-Guilty-but-I-Didn’t-Do-It” plea, allows a defendant to benefit from a guilty plea while maintaining innocence. But why would anyone do that? How did this plea come to pass? And what prompted Overton to enter one?

The Alford plea is a form of “alternative plea”, meaning that it does not fit the traditional pleas of either guilty or not guilty. As my students know, the English common law required one to plead, at the risk of suffering a very unpleasant procedure known as the peine forte et dure, or death by pressing. The rule has changed over time, such that a failure to plead is now entered as a not guilty plea by the court, but the principle of entering some form of plea is well-entrenched. One such form of alternative plea is the no contest plea, or ‘nolo contendere’, in which a defendant in a criminal action neither disputes nor admits the charge(s)–particularly attractive to a defendant at risk of a subsequent civil case for damages, as a no contest plea contains no allocution or admission of guilt that could be used as evidence of wrongdoing. The Alford plea is another variant, premised on the defendant’s acceptance of a plea bargain agreement: s/he enters a plea of guilty while continuing to assert innocence. Typically, this involves a defendant’s acknowledgement that evidence of sufficient weight exists to result in a probable guilty verdict.

The origins of this eclectic plea are fairly recent, dating to the 1970 Supreme Court case of North Carolina v. Alford. Henry Alford had been charged with first-degree murder seven years earlier, and faced an automatic death sentence should two prerequisites be met: first, that he pled not guilty; and second, that upon conviction the jury did not recommend a life sentence instead. Alford felt that, under the circumstances, he was facing a double-bind: only a guilty plea would guarantee he would not face the death penalty, yet he wished to profess his innocence. Alford therefore pled guilty to the non-capital charge of second degree murder, but felt that he was essentially doing so under duress. Alford appealed to the Supreme Court of North Carolina, the federal District Court, and the U.S. Court of Appeals; of these, only the Court of Appeals ruled that his plea was not voluntary. The Supreme Court took up the case, with a majority opinion written by Justice Byron White. The Court set out the standard that was to become the basis for the Alford plea, namely that a defendant “concludes his interests require a guilty plea and the record strongly indicates guilt”, having received the benefit of advice from a competent attorney.

The record showed that Alford’s explanation of his plea was this: “I pleaded guilty on second degree murder because they said there is too much evidence, but I ain’t shot no man, but I take the fault for the other man….I just pleaded guilty because they said if I didn’t, they would gas me for it, and that is all.” For his part, Overton was facing jail time for the crime of promoting prostitution. In his explanation of his plea, he wrote: “While I believe that I am innocent, I believe that the evidence in this case is such that a jury would likely find me guilty of the crime charged….I am entering into this plea agreement to take advantage of the states (sic)…recommendation.” Based on articles that describe Overton’s past conduct, as well as the evidence against him in this case, it is likely (in my opinion) that he holds an overly charitable view of the degree of his personal culpability…but I digress.

The Alford plea is not without controversy, certainly–some defend it on the grounds that anything that promotes plea bargains is beneficial for a system that relies heavily on their use. Others view it as intellectually (and perhaps morally) bankrupt, arguing that it undermines respect for the criminal justice system. If you believe that acknowledgement of one’s culpability is an important and necessary element in our system of justice, then an Alford plea does seem inimical to that goal. Alternatively, critics argue that Alford pleas simply don’t make sense: innocent people should have their day in court, while guilty people should allocute to their crimes, and this plea facilitates neither. This latter interpretation is, I think, true, but only if one excludes the category of people whom one would most want to benefit from the Alford plea–namely, those who are legitimately not guilty of an offense but face a likely conviction. There will always be people who are in the ‘wrong place at the wrong time’ or against whom a strong (albeit misleading) circumstantial case stands. The strongest criticism of Alford pleas is that they can become a vehicle for corruption, where defendants are ‘railroaded’ into taking pleas they do not fully understand. Conversely, it is true that some defendants taking advantage of Alford pleas are simply in denial about their guilt or have strategic reasons to enter such a plea in court. The West Memphis 3 were controversially allowed to enter this plea in 2011, with the result that their previous murder convictions were vacated; they pled guilty to lesser crimes while maintaining their innocence, and were sentenced to time served plus a suspended sentence of 10 years.

In reality, the Alford plea makes up a small percentage of all plea bargains in U.S. courts, estimated at somewhere in the 5% range for federal pleas and 17% of state pleas, according to the U.S. Department of Justice–although these numbers are misleading, as they also include pleas of no contest. Alford pleas are accepted in virtually all state jurisdictions (Indiana, Michigan and New Jersey remain hold-outs); a full list of their acceptance by state may be accessed here. While the civilian federal courts recognize Alford pleas, U.S. military courts do not.

The Alford plea is morally complex, it is true; it is also difficult to categorize and, frankly, even paradoxical. One day historians may look back at it as a fascinating historical relic, an esoteric piece of legal anachronism that seems as out of place as trial by battle–or it may become a bedrock aspect of American criminal procedure.


‘Twelve Angry Men’, or The Origins of the Jury System

An earlier blog post talked about ‘straw men’ and compurgation; to continue in that vein, I wanted to say a few words about the origins of the modern jury. The timing for me is quite fortunate, as I just hosted a small group of law students from Korea. Korea is in the early years of experimenting with the introduction of a jury system for a small number of criminal offenses, and while taking the students on a tour of the U.S. district court (where they had occasion to observe a pre-trial hearing in a civil case, as well as testimony in a criminal case) I had ample opportunity to reflect on our use of juries.

Many people cringe at the sight of the ‘jury summons’ they receive in the mail every few years. I have to admit that I have no direct experience with serving as a juror–I’ve been bounced every time, which frequently happens to those with legal training–but I nonetheless find the institution fascinating and hope to have first-hand experience with it someday. The closest I’ve gotten so far is seeing “Twelve Angry Men” several times.

We say that the two most important duties of a citizen are voting and jury duty; yet too many Americans don’t bother with the former and bend over backwards to avoid the latter. I thought it might be instructive to write a few words about medieval juries so as to put the role of juries in historical context. For their origins, we need to go back at least as far as the Norman conquest of 1066. As originally conceived, a jury was a body of men sworn to give a true answer to a question–they essentially provided information of interest to the Crown related to property and questions of law, the best example being the “Domesday” survey of Great Britain completed in the 1080s. It is not infrequently, if incorrectly, referred to as the “Doomsday Book”–the etymology probably derives from the Old English word “dom”, meaning “judgment”–and judgment it was, as the tax and property judgments made in it were unalterable and not subject to any appeal!

Juries were therefore tasked with providing the Crown’s representatives with information on a wide variety of matters–such as land ownership, agriculture, and the number of sheep or pigs in a county–and with identifying suspected criminals for trial by ordeal. With the decline of ordeals after 1215 (you can read more about ordeals in my article here), juries took on an increasingly formalized and important role. Jurors were to be independent neighbors, culled from the area but not having a direct interest in the questions before the jury. Like witnesses and compurgators, jurors were originally expected to know something of the truth of the matter before them–hence the requirement that they come from the same area as the parties. The questions juries were asked could be questions of fact or of law, or of mixed fact and law. They could be asked, for example, to render a verdict under oath as to the names of all landowners in the district and how much land each of them owned, or the names of people suspected of murder or other crimes. This was one of the largest differences between contemporary and medieval juries: namely, that juries long ago were expected to know in advance the circumstances of the particular case in front of them. Other elements would be familiar today: juries generally consisted of 12 people (although the introduction of women to juries is of recent vintage), were sworn under oath, and were expected to render unanimous verdicts.

By the 14th century it was generally accepted that juries were to work together as one body, with the aim not only of answering questions but of hearing sworn evidence and determining the truth or falsity thereof. Medieval juries typically had a wider range of decision-making than they do today, but the right of juries to vote their consciences, rather than delivering the verdict that was expected, took some centuries to become the norm. For those of you who blanch at the sight of a jury summons, you should know that as courts became increasingly concerned with parties exerting outside pressure and influence on jurors, a process of strict sequestration became common. In the modern era, we equate sequestration with sensational trials (like the O.J. Simpson trial), where the media blitz is so extreme that it is considered imperative to keep jurors isolated from it, although sequestration may also be used for other reasons, such as to ensure juror safety. During the middle ages, however, jurors were essentially kept as prisoners by the court. In order to ensure they took their role seriously, and to expedite a timely and unanimous verdict, they were typically kept under lock and key during their deliberations and, worst of all, given no “meat, drink, fire or candle”–meaning they were kept in the cold and dark, unfed, until they reached a verdict! Should the jury not be unanimous, one old practice was to place the jurors together in a cart and ride them through town until such time as they could all reach a consensus. Juries that issued verdicts that did not comport with the court’s interpretation of the facts, or that raised the royal ire, could face significant consequences: fines and imprisonment were not uncommon, nor was the early practice of razing the houses of jurors who delivered a ‘wrong’ verdict.

So, next time you receive a jury summons, remember: it’s not that bad!

Incidentally, the Commonwealth (in conjunction with Suffolk University) recently created an 18-minute Jury Duty Orientation video; my friend Kathleen P. appears as one of the jurors in the front row. You may wish to check it out here: Massachusetts Jury Duty Orientation video.

And Happy New Year!

Of Christmas Caroling, Extortion, and Mistletoe

What, might you ask, do caroling and extortion have in common? Unless you’re very cynical, the answer probably should be “nothing.” Personally, I love the holidays and believe caroling is a lovely tradition. I still remember the last time I answered the front door, to be greeted by a spirited group of carolers. It was a lovely and festive act, much appreciated at the time, and we invited them in for eggnog and cookies. But as we approach the holidays, consider for a moment some of the lyrics of the popular 19th century Christmas carol, “Here We Come A-Wassailing,” otherwise known as “Here We Come A-Caroling” or “The Wassail Song”:

Here we come a-wassailing
Among the leaves so green;
Here we come a-wand’ring
So fair to be seen.

REFRAIN:
Love and joy come to you,
And to you your wassail too;
And God bless you and send you a Happy New Year
And God send you a Happy New Year.

Our wassail cup is made
Of the rosemary tree,
And so is your beer
Of the best barley.

REFRAIN

We are not daily beggars
That beg from door to door;
But we are neighbours’ children,
Whom you have seen before.

REFRAIN

Call up the butler of this house,
Put on his golden ring.
Let him bring us up a glass of beer,
And better we shall sing.

REFRAIN

We have got a little purse
Of stretching leather skin;
We want a little of your money
To line it well within.

REFRAIN

Bring us out a table
And spread it with a cloth;
Bring us out a mouldy cheese,
And some of your Christmas loaf.

REFRAIN

God bless the master of this house
Likewise the mistress too,
And all the little children
That round the table go.

REFRAIN

Good master and good mistress,
While you’re sitting by the fire,
Pray think of us poor children
Who are wandering in the mire.

REFRAIN (for alternate lyrics, see Here We Come A-wassailing)

Well, when you contemplate the lyrics they seem a bit odd and full of curious juxtapositions, beginning with the nicest sentiments but quickly devolving into demands for beer, Christmas loaf, mouldy cheese (apparently a desirable thing, mind you), and even money to line one’s purse, all mixed in with a little bit of pathos and manipulation (“pray think of us poor children/who are wandering in the mire”). Meanwhile, the carolers emphasize that despite these demands they are no mere “daily beggars” but your neighbors.

No less strident, but much more straight-forward, are the entreaties conveyed in one of my perennial favorites, “We Wish You a Merry Christmas,” which hearkens back to the 16th century or so:

We wish you a Merry Christmas;
We wish you a Merry Christmas;
We wish you a Merry Christmas and a Happy New Year.
Good tidings we bring to you and your kin;
Good tidings for Christmas and a Happy New Year.

Oh, bring us a figgy pudding;
Oh, bring us a figgy pudding;
Oh, bring us a figgy pudding and a cup of good cheer
We won’t go until we get some;
We won’t go until we get some;
We won’t go until we get some, so bring some out here.

Besides the fact that most of us tend to forget the second, “figgy pudding” stanza, and perhaps are somewhat unsure what a figgy pudding is–it’s an ancient type of Christmas pudding, by the way–the lyrics can be quite alarming. Am I to understand that you are wishing me a Merry Christmas and a Happy New Year (thank you, that’s very nice), but that you are also requesting–nay, demanding–a figgy pudding and a “cup of good cheer”? And you won’t leave unless you get it?  Fiddlesticks and bah humbug–that’s extortion!

So, where does this tradition come from? It has been said to date back to Anglo-Saxon pagan traditions, which were subsequently incorporated into Norman-era Christmas customs. It shares common elements with two medieval traditions: the one, the charitable exchange between feudal lords and their serfs on Twelfth Night (the serfs providing song and blessings on the house in exchange for food and drink); the other, the ancient practice of feudal service, whereby a lord was owed goods or services (e.g., a specified number of knights or men-at-arms, crops, or nominal items such as a ‘rose at midsummer’). The acts depicted in “Here We Come A-Wassailing” are the benign form of this exchange between a lord and his serfs: the carolers grant the lord and his family their collective blessing in exchange for a spot by a warm fire, the wassail beverage, and perhaps other food and gifts. The custom expressed in “We Wish You a Merry Christmas” is more analogous to that of trick-or-treating: give us what we want, or we’ll make mischief and/or not leave until you do. To my mind, that shares certain similarities with some forms of the charivari or shivaree (see my entry on the Role of Informal Law), in which rambunctious groups loudly serenaded couples on their wedding day, banging pots and drums under their windows and blowing trumpets until bribed to depart.

So, should carolers come to your door, don’t forget to invite them in for the wassail beverage (or equivalent)–and at all times mind the contractual obligation created by standing beneath the mistletoe! To all of you I wish “good tidings for Christmas and a Happy New Year”. Now off I go a-wassailing.

I am indebted to my dear friend and law school classmate, Robert P. McHale, of R | McHale Law, for suggesting wassailing as a potential blog topic. A figgy pudding to you, my good sir.

What Is A ‘Straw Man’, and What Is the Connection to the Rise of the Jury Trial?

A colleague, while discussing corporate takeovers, recently asked me about the origins of the term straw man–hence this week’s blog entry. I’m always happy to make the connection between the contemporary and the historical! A straw man, as the term is commonly used, typically has two meanings. The first is common in political and other debates, where a straw man or straw man argument is essentially a misrepresentation of an opponent’s argument: one replaces a statement one’s opponent has made with a false (yet superficially plausible-sounding) equivalent and then refutes it. This gives the appearance of having refuted the opponent while in reality one has merely refuted a distorted version of what he or she originally said. This can be a highly effective debating tool–effective but intellectually bankrupt. Common means of creating a ‘straw man’ argument involve exaggerating, over-simplifying, or decontextualizing an opponent’s position, inventing a character that ostensibly reflects the views with which one disagrees, or “quoting” some unnamed person who is said to be representative of the opponent’s arguments. The etymology of straw man in this context probably refers to the straw-stuffed dummies historically used for bayonet drills and the like by the army, or by boxers of old, as they provide no resistance and make success assured. What does this have to do with the rise of the jury trial? As far as I know, nothing!

The other meaning of straw man, however, may very well have a great deal to do with jury trials. In this usage the term refers to someone who is a figurehead or a stand-in for someone else (either a natural or a corporate person). In some instances, a straw man is used in order to meet the letter of the law in such contexts as property conveyances: the common law has traditionally held that one cannot convey property to oneself, so if one holds property and wishes to convert it to a joint tenancy (owned equally with another person), one can “convey” the property to a straw man (such as one’s lawyer) who is then the owner, but only in name; the straw man then conveys the deed to the original owner and their new joint tenant. In this context a legal fiction is used in order to achieve a desired, and lawful, purpose. Straw men can also be used for illegal purposes or as a means of skirting the law; for example, for money laundering purposes a straw man may hold ownership of a business while the real owners (e.g., an organized crime syndicate) stay in the shadows. A straw man may also be used to shield someone from liability; shell corporations (corporations that exist in name only but have no real assets) can be used for this purpose, essentially making their owners judgment-proof.

The etymology of this type of straw man seems to have its origins in an ancient medieval legal practice known as compurgation. Compurgation was the swearing of an oath, usually a very specific, ritualized oath, in order to settle a legal dispute; it was used in the U.K. in debt cases until the early 17th century. One party would “purge” himself of the charges or legal claim by undertaking the oath without error or hesitation before witnesses. The party would bring witnesses (usually 11) to attest to the truthfulness of his claim; this made sense when communities were small and tight-knit, where a claim might well have transpired before numerous witnesses, and where one could actually find 11 people to testify to the party’s credibility. The shortcomings of such a system are quite obvious, and as justice was centralized in royal courts it became completely impractical to bring in 11 people from afar to testify in routine cases. This eventually led to the rise of a class of professional ‘oath helpers’ who congregated around courthouses and offered to swear to the veracity of a party’s claim for a fee. While perhaps apocryphal, it is said that they would advertise by having tufts of straw poking out of the tops of their boots, thereby letting the initiated know of their availability as professional oath helpers–and hence the origins of the term straw man. By the end of the 16th century, part of the official duties of court porters was to find these professional ‘oath helpers’ to assist in the ritual of legal process. The practice began to die out by 1600 and was largely forgotten by the end of the 17th century, at the same time that the jury trial became more and more entrenched as a means of settling criminal, and later civil, cases. The jury, or jurata, were required to swear an oath to deliver a true answer (or verdict); its members were therefore known as jurors (or juratores), referring to the fact that they were people who had been sworn.
While jurors were originally supposed to know something of the truth of the matters that came before them–hence the reason they were culled from the environs near where the cases occurred–juries were to morph into groups of 12 people who were disinterested in the outcome of the trial but tasked with ascertaining the true facts.

And if you think jury duty is an onerous burden now, wait until you hear what being a juror used to be like–which gives me an idea for my next blog entry!

Is Trial By Combat Still a Possible Form of Legal Action?

Imagine you’ve just gotten a ticket for a motor vehicle violation. You have the right to defend yourself against it, but do you have the right to take up arms to do so? In other words, can you demand your right to trial by combat? The question may seem nonsensical. After all, we have an adversarial system, but it’s not that adversarial, right?

Before delving into why I’m posing that question, a bit of legal historical context: trial by combat, also known variously as ‘trial by battle’, ‘wager of battle’, or ‘judicial duel’, was a medieval form of procedure in which the disputants in a legal suit fought each other, with the winner also deemed to have won his or her case. Alongside ordeals (you can read my article on that topic, if you are so moved, at Between a Rock and a Hot Place), trial by combat was a common form of adjudication of disputes. Germanic in origin, it was brought by the Normans to England following the Conquest. In the U.K. its high point of use was between the 11th and 15th centuries. Trial by combat was not available in certain cases, such as where there was very strong evidence against the defendant, where he or she was captured ‘red-handed’, or where he or she had attempted escape following capture. A party lost by dying, by being rendered unable to fight any further, or by crying “craven” (a lovely old word meaning “cowardly”, but originally Old French for ‘broken’, meaning in this context “I am vanquished”). A defendant who was killed lost the case (not that it probably mattered much at that point), but if defeated and alive, could be executed or declared “infamous” (meaning he lost all legal protections, privileges and status). A defendant who defeated the plaintiff, or was able to defend himself successfully from sunrise to sunset, was deemed exonerated. The stakes were high for the plaintiff as well–if killed, he lost the suit (again, probably not very important at that point); if he survived and lost, he was likewise rendered “infamous”.

Interestingly, before being allowed into the ring to begin their trial by battle, the combatants often had to swear an oath that they had not resorted to sorcery; one such surviving oath read as follows: “Hear this, ye justices, that I have this day neither eat, drank, nor have upon me, neither bone, stone, nor grass; nor any enchantment, sorcery, or witchcraft, whereby the law of God may be abased, or the law of the Devil exalted. So help me God and his saints.”

So, why am I asking whether trial by combat is still a possible form of legal action? Just that question came up a few years ago in the U.K. In December 2002 a 60-year-old unemployed mechanic from the town of Bury St Edmunds incurred a £25 fine for a minor traffic infraction resulting from his failure to notify the Driver and Vehicle Licensing Agency (DVLA) that his motorcycle was no longer operational. Leon Humphreys shocked the court by maintaining that he still had the ancient right to fight a champion nominated by the DVLA. Following his hearing, he was quoted as saying: “I am willing to fight a champion put up by the DVLA if they want to accept my challenge. The victor speaks in the name of God and justice so it is a reasonable enough way of sorting the matter out. I know I am in the right so I do not have anything to worry about. I am reasonably fit for my age and I am not afraid of taking anyone on if they want to fight.” The magistrates did not quite know what to make of this–the question of whether this barbaric form of medieval process was still extant hadn’t come up before, to their knowledge. [This leads me to mention that there’s never a legal historian around when you need one, but I digress.]

They eventually decided (quite rightly) that it was not. While they may have been unsure of the reason, the reality is that trial by combat fell into disuse in the late medieval period and was forgotten–or, at least, forgotten until 1818, when a defendant in a murder appeal demanded it to a shocked court (Ashford v. Thornton). It was formally abolished by Parliament the following year, by legislation that also abolished ‘appeals of murder’ (the ability of a third party to prosecute a defendant after he or she had been acquitted of murder charges). But could it still be an option in the U.S.?

English common law was received into the U.S. before the American Revolution. Following American independence, the common law remained entrenched here, albeit supplemented, modified, and expanded upon at both the federal and state levels over the intervening two centuries. Clearly, U.K. decisions regarding English common law are not binding on U.S. courts; moreover, trial by combat survived in the U.K. until formally abolished by Parliament in 1819, well after American independence. Since we inherited the common law, and since subsequent repeal by Parliament has no legal weight here–and as it appears no court in the U.S. has ever grappled with the issue–that leaves open the question of whether, theoretically, trial by combat may have survived as a legacy of our common law system.

This would, of course, bring a whole new meaning to “fighting it out in court”.


…a few more esoteric Constitutional provisions…

My last entry had to do with whether President Obama could unilaterally use the 14th Amendment to raise the debt ceiling. This prompted me to think about the handful of esoteric constitutional provisions that have faded into desuetude, been repealed by subsequent amendments or events, or otherwise been largely forgotten. Two such examples in the Constitution came to my mind. The first, found in Article I, section 9, reads that “The Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress prior to the Year one thousand eight hundred and eight, but a Tax or duty may be imposed on such Importation, not exceeding ten dollars for each Person.” This provision reflected Congress’ reluctance to attempt to restrict slavery at the time of the Constitution’s ratification; in fact, it clearly was designed to ensure that Congress made no restrictions on the slave trade for at least 20 years after ratification. Another example, found in Article IV, section 2, likewise had to do with slavery. It stated that “No Person held to Service or Labour in one State, under the Laws thereof, escaping into another, shall, in Consequence of any Law or Regulation therein, be discharged from such Service or Labour, but shall be delivered up on Claim of the Party to whom such Service or Labour may be due.” This unhappy provision protected the institution of slavery by precluding sanctuary in free states for escaped slaves; it was superseded by the 13th Amendment, one of the Reconstruction Amendments passed in the aftermath of the Civil War.

The Bill of Rights (the first 10 Amendments) also contains an interesting relic. I mentioned in my previous entry the 3rd Amendment, which states that “No Soldier shall, in time of peace be quartered in any house, without the consent of the Owner, nor in time of war, but in a manner to be prescribed by law.” While clearly a response to the British policy of forcibly garrisoning troops in civilian houses during the occupation of Boston and other cities, this amendment has thankfully slumbered quietly since its birth in 1791. While not terribly esoteric (nor terribly controversial, either), the most interesting amendment from a historical perspective is the most “recent”, the 27th, ratified in 1992. Why do I put the most “recent” in quotation marks? Because it was one of the amendments proposed alongside the Bill of Rights in 1789, but was not ratified until 203 years later. It states that “No law varying the compensation for the services of the Senators and Representatives shall take effect, until an election of Representatives shall have intervened.” This restriction on Congress’ power to set its own salary languished for two centuries–it was able to do so because it set no deadline for ratification–until the 27th Amendment was certified following its ratification by Michigan on May 7, 1992, which met the 38-state (or 3/4 majority) requirement. Interestingly, it later came out that the historical record had overlooked Kentucky’s ratification of the amendment in June 1792, meaning that it was actually Missouri’s ratification two days earlier that made the amendment official–nevertheless, Michigan still gets official credit for being the 38th state.

Massachusetts, by the way, still hasn’t ratified the 27th Amendment, making it one of only 5 such states.