As is quickly becoming an annual tradition, media outlets are reporting widespread violence and general chaos in retail outlets across the nation as Americans pack into stores early in the morning in an effort to capitalize on the so-called “Black Friday” sales that begin the holiday shopping season. In Los Angeles, a woman injured twenty people when she used pepper spray in an effort to deter other shoppers and “gain preferred access to a variety of locations in the store,” in what police referred to as an act of “competitive shopping.”

Meanwhile, shootings were reported in Northern California, North Carolina, South Carolina, and Iowa; brawls broke out in New York, Arizona, and Ohio among others; a young girl was trampled in Michigan; and chaotic, frenzied scenes were widespread across the nation. How did it get to this point? How did the day after Thanksgiving morph into the unhinged quasi-holiday known as “Black Friday”?

Despite the valiant efforts of multiple generations of historians, most Americans still speciously assume Thanksgiving to be a largely unbroken American tradition spanning four centuries, from the Pilgrims’ bountiful feast with the Wampanoag Indians to our contemporary football-infused turkey holocaust. To be sure, days of thanksgiving were observed frequently in many American colonies and during the early republic, but they were less bountiful celebrations of abundance than ascetic days of fasting and prayer. Most of the “traditional” Thanksgiving we celebrate today, from the food to its mythical origin story, was in fact the creation of mid-nineteenth-century moral reformer Sarah Josepha Hale. Within her widely read magazine Godey’s Lady’s Book, Hale promoted the would-be holiday as an occasion for the nation to gather, celebrate, and model loving ties and moral life. After decades of unsuccessful lobbying, and with the help of Abraham Lincoln, Hale succeeded in creating a national holiday as a mechanism to help reunite the nation after the tumult of the Civil War. The idea spread quickly, and by the beginning of the twentieth century Thanksgiving was widely celebrated throughout the North (Southerners initially distrusted it as a “Yankee holiday”), second only to Christmas.

Thanksgiving developed during a period in which, as historian Leigh Eric Schmidt has shown, most American holidays became thoroughly consumer-oriented, which was particularly true of Christmas. To a certain extent, of course, this applied to Thanksgiving as well, not only in its focus on certain foods, but also in its strong tie to American football. Indeed, already by 1893, one columnist could grump that “Thanksgiving Day is no longer a solemn festival to God for mercies given… It is a holiday granted by the State and Nation to see a game of football.” Yet, suggests Matthew Dennis, Thanksgiving was simultaneously (however imperfectly) an expression of Americans’ growing ambivalence or even “aversion to commercialization and capitalist excess,” particularly in its de-emphasis on gift-giving and the simple preindustrial vision of America it celebrates. As a result, even in the nineteenth century it was considered in poor taste for retailers to begin the holiday shopping season before Thanksgiving. In 1924, Macy’s created its annual Thanksgiving Day Parade to mark the transition into the Christmas shopping season. The day after Thanksgiving, in other words, has served as the gateway to the Christmas shopping season for nearly as long as there has been a national Thanksgiving.

In fact, by 1939 this tradition had become so firmly entrenched that it precipitated one of the great and largely forgotten political scandals of the twentieth century: Franksgiving. In the midst of the Great Depression, and with Thanksgiving falling on November 30th and therefore allowing a particularly short holiday season, Roosevelt acted on the advice of national retailers and—in an effort to boost the national economy—moved Thanksgiving up a week to provide extra shopping days. The decision was widely panned, with over sixty percent of Americans disapproving (in large part because it screwed up football games scheduled for Thanksgiving). An Indiana shopkeeper protested with a sign in his window reading, “Do your shopping now. Who knows, tomorrow may be Christmas,” while the Oklahoma attorney general wrote:

“Thirty days hath September,
April, June, and November;
All the rest have thirty-one,
Until we hear from Washington.”

The unpopular decision resulted in a joint Congressional resolution designating the fourth Thursday of November as Thanksgiving, which Roosevelt signed into law in 1941.

If the day after Thanksgiving has marked the abrupt transition into the Christmas shopping season, and has been one of the busiest retail days of the year, since the nineteenth century, it only became associated with the name “Black Friday” during the 1960s, when Philadelphia police officers and bus drivers used the term to describe the traffic jams and crowded sidewalks that resulted when mobs of shoppers descended on downtown stores. The term spread slowly across the United States until it gained national exposure and a double meaning, reflected in a 1982 ABC World News Tonight broadcast: “Some merchants label the day after Thanksgiving ‘Black Friday,’ because business today can mean the difference between red ink and black on the ledgers.”

Yet, despite the often frenzied atmosphere, and aside from a few cultish die-hards who camped outside stores in advance of mostly ordinary opening times, Black Friday remained a largely normal shopping day into the mid-1990s. By 1995 and 1996, however, “Black Friday” appears to have become a massive marketing campaign in itself, used extensively in national advertisements and covered with increasing vigor by the news media, precipitating the ongoing race to the bottom we see today. Simply to draw consumers into stores, corporations began advertising lower and lower sale prices on limited-supply items each year, encouraging more and more people to camp outside earlier and earlier and actively cultivating an atmosphere of sport and competition.

By 2005, this manufactured frenzy began turning particularly violent—beyond the already widespread pushing and shoving—as companies opened their doors even earlier while offering even lower prices on more items in short supply, like Wal-Mart’s $300 computer. In 2006, widespread violence was reported as consumers waited for the PlayStation 3, of which there was a worldwide shortage. As the New York Times noted at the time, “Many merchants angered shoppers by trumpeting huge discounts—like $70 portable DVD players and $600 flat-screen televisions—only to announce they were sold out moments after they opened.”

Then, in 2008, Jdimytai Damour was killed in a stampede of 2,000 customers who broke down the doors of a New York Wal-Mart at 4:55 in the morning, trapping him in a vestibule where he died of asphyxiation. A temporary employee at a company infamous for low wages and poor benefits, Damour spent the evening with his family before rushing off to guard a “blitz” line for a “door-busting” sale, only to be trampled by a frantic mob of consumers, many of whom reacted in anger when informed that the store would be closing and continued to shop. Although many Americans expressed horror and anger at Damour’s senseless death, it precipitated few cultural changes or even much sympathy for the underpaid workers forced to endure such madness.

This year, when many of the major American retailers opened at midnight, the decision drew significant pushback from their disempowered employees. Although one petition opposing the move received 190,000 signatures, most Americans appeared to adopt a stance similar to the Minneapolis Star Tribune’s: be grateful you have a job—you’re free to quit if you don’t like it. To suggest that workers with little economic security, and who face daunting unemployment rates, are “free” to quit is, of course, a bizarre and elaborate fiction. To suggest that corporations should have the right to make any unreasonable demand they please of workers living in such a state of unfreedom is a disturbing reminder of how far Americans have strayed from the principles of charity and compassion the season is alleged to represent. It is only very recently, then, that Black Friday has taken on its third meaning: the darkest day in the American calendar.
Oh, and for anyone who thinks they’re part of the solution by shopping online…

I was forwarded this story in an email:

"In the line at the store, the check-out girl told an older man that he should bring his own grocery bags because plastic bags weren't good for the environment. The man apologized to her and explained, "We didn't have the green thing back in my day." The girl responded, "That's our problem today. Your generation did not care enough to save our environment."
She was right -- our generation didn't have the green thing in its day. Back then, we returned milk bottles, soft-drink bottles and beer bottles to the store. The store sent them back to the factory to be washed and sterilised and refilled, so it could use the same bottles over and over. So they really were recycled. But we didn't have the green thing back in our day. We walked up stairs, because we didn't have an escalator in every store and office building. We walked to the grocery store and didn't climb into a 300-horsepower motor vehicle every time we had to go two blocks. But she was right. We didn't have the green thing in our day.

Back then, we washed the baby's diapers because we didn't have the throw-away kind. We dried clothes on a line, not in an energy gobbling machine burning up 240 volts -- wind and solar power really did dry the clothes. Kids got hand-me-down clothes from their brothers or sisters, not always brand-new clothing. But that young lady was right; we didn't have the green thing back in our day. Back then, we had one TV, or radio, in the house -- not a TV in every room. And the TV had a small screen the size of a handkerchief (remember them?), not a screen the size of the state of Alaska! In the kitchen, we blended and stirred by hand because we didn't have electric machines to do everything for us. When we packaged a fragile item to send in the mail, we used a wadded up old newspaper to cushion it, not Styrofoam or plastic bubble wrap. Back then, we didn't fire up an engine and burn gasoline just to cut the lawn. We used a push mower that ran on human power. We exercised by working so we didn't need to go to a health club to run on treadmills that operate on electricity. But she's right; we didn't have the green thing back then.

We drank from a bubbler fountain when we were thirsty instead of using a cup or a plastic bottle every time we had a drink of water. We refilled writing pens with ink instead of buying a new pen, and we replaced the razor blades in a razor instead of throwing away the whole razor just because the blade got dull. But we didn't have the green thing back then. Back then, people took the Metro or a bus and kids rode their bikes to school or walked instead of turning their moms into a 24-hour taxi service. We had one outlet in a room, not an entire bank of sockets to power a dozen appliances. And we didn't need a computerised gadget to receive a signal beamed from satellites 2,000 miles out in space in order to find the nearest pizza joint.

But isn't it sad the current generation laments how wasteful we old folks were just because we didn't have the green thing back then?  Please forward this on to another selfish old person who needs a lesson in conservation from a “Know-it-all” young person.
Remember: Don't make old people mad. They don't like being old in the first place, so it doesn't take much to tick them off."

Dear Old People,

“Back in my day, our families bathed in the same water one-by-one and then brushed our teeth with it. Then the water was used in the preparation of the next day's soup!"

Sorry to interrupt your masturbatory nostalgia, old people, but who do you think invented all the wasteful conveniences you’re railing against here? Was it the Kaiser? Did that rotten scoundrel trick you into driving 5-mile-per-gallon automobiles? Did Stalin force you to buy disposable diapers, to drink bottled water, or to “escalate”? Because I was under the apparently mistaken impression that this was all your bullshit.

You came out of the Depression and jumped face-first into a wasteful, parasitic consumer culture like a fat man into a bowl of pudding, and you never thought twice about it. You sprawled the landscape with wasteful suburban homes—while making sure to keep “those negroes” out, of course—so you could safely surround yourself with the latest in plastic and electronic garbage in a vain attempt to forget the meaninglessness of your shallow, forsaken lives. You enslaved brown people to make it possible and you went to war to protect it. And then you taught your children that this was good, that it was freedom—the gasoline would flow like a river forever, greed is good, developing nations are America’s garbage heap, and your ability to consume is more important than the lives of billions. And if anyone questioned you, if anyone proposed a more just, equitable, or meaningful vision of the future, you called them a “commie.”

I mean, for fuck’s sake, you all elected George W. Bush—twice.

And now that it’s time to pay the piper, now that the unsustainability of the world you created is clear and the rest of us will suffer for your sins (right before you finally do us all a favor and check out), you want to talk about how hard it was for you to wash diapers?
You don’t like “green” bags? You don’t like being old? Still mad they canceled Matlock? Tough shit. 

You’re lucky we don’t line you all up against the wall.

Young People

Collective bargaining: what’s that? Before last week, the only kinds of unions that showed up in the news were civil unions. Now, with the governor of Wisconsin calling for the end of collective bargaining for public employees of the state, signaling a move on the right toward targeting labor rights in general, unions are big news. While Democratic state senators have fled the state to deny Governor Scott Walker the quorum he needs to pass his anti-union bill, President Obama has spoken out against the measure. Meanwhile, the Tea Party has arrived to counter the protests at Madison’s state capitol building, and to up the crazy quotient in Wisconsin in general, which is usually pretty high anyway. I mean, these are people who walk around wearing cheese on their heads at all times, apparently.

We at Bareknuckle Vengeance feel the need to throw in our collective voice to this terribly complex situation. Here’s what we have to say:

1. The right to bargain collectively is a pretty good one.

2. We don’t know Walker personally, but we think he seems like kind of a dick.

Now that that’s clear, I suggest that we all watch some good pro-union films. Here are three of my favorites. Let me know of others that I missed.

Harlan County, USA
This 1976 film won the Academy Award for Best Documentary, and it’s easy to see why: the filmmaker captures an entire yearlong strike at a Kentucky coal mine, chronicling the nasty tactics of the mining corporation and its local allies and the tremendous efforts of the miners. My favorite scene is one in which a miner travels to New York City and ends up talking to a policeman on the street. The miner is shocked when the policeman tells him how great his pay and benefits are. (In Wisconsin, of course, the policemen’s union is safe from Walker’s plan, since it endorsed him in the last election. Methinks that we’re not ready for democracy yet.)

Cradle Will Rock
Tim Robbins’s masterfully written, star-studded recreation of the eponymous Great Depression-era agitprop play. Marc Blitzstein’s 1937 socialist musical, the first of its kind, tells the story of steel workers in Steeltown, USA who organize after World War I. Robbins weaves Blitzstein’s story in with those of Diego Rivera, Nelson Rockefeller, and Federal Theatre Project director Hallie Flanagan, whose autobiography and HUAC transcripts provided much of the film’s dialogue. And did I mention it has Bill Murray, Hank Azaria, Jack Black, John Cusack, Joan Cusack, Vanessa Redgrave, Cary Elwes, and John Turturro?

Matewan
One of John Sayles’s best films, and that’s saying quite a bit. Matewan depicts the “coal wars” of the 1920s in West Virginia, with a keen eye not only for the murderous tactics of the coal company, but also for the ways in which matters of race intersected the battle between capital and labor. The company brings in black and Italian workers via railroad, hoping to appeal to the xenophobia and bigotry of the local miners. I guess it’s unlikely that we will see the kind of violence depicted in Matewan show up in Madison, and that’s definitely a good thing. But understanding the country’s past labor battles, even if we do so through good film, might help put the right’s new attacks on unions in some sort of perspective.

Finally, whatever you do, don’t see The Pajama Game, the 1957 Doris Day musical romantic comedy about women who strike at a pajama factory. It really sucks.

Spring’s first thaw warms the dead of winter with sour smells. Footsteps through slop, slabs of ice sliding off roofs, explosions in knee-deep muck, tufts of fur worn in the ground. When Molly Henry surveys the springtime wretchedness of her Saskatchewan ranch in Wolf Willow, Wallace Stegner captures it well: “Matted, filthy, lifeless, littered, the place of her winter imprisonment was exposed, ugly enough to put gooseflesh up her backbone, and with a carrion smell over all of it.”

Here in town it’s not as bad, since you don’t have to breathe in the stench of rotting cattle carcasses. When it’s warm and windy and the sun’s shining, it’s great to get out for a walk. But as each layer of snow melts, you’ll find fresh depositions of neighborhood garbage—dog shit, frozen squirrels, uncollected ARC bags. Across the street, there is a pile of socks and briefs emerging from the snow bank, a frozen slab of undergarments still stiff enough to break a side mirror.

Sometimes I’ll find a frozen songbird thawing in the sun, but usually it’s squirrels. I’ve thought about eating them before, but I’m not sure they’re safe. A few days will go by and nothing else touches them, not even the local hawks, raccoons, or cats. I bag them up, as reverently as possible (any creature that fails to survive a Minnesota winter deserves some respect), and throw them in the trash.

I’ve never actually eaten squirrel before, but my grandparents did during the Depression. Whenever I ask about it they change the subject. Since my own backyard produces a plethora of squirrels, it seems like a good opportunity for “urban foraging.” From April to October, the squirrels fatten on whatever seasonal crops are growing in the garden. Sometimes they find their way into the garden's chicken-wire enclosure, and then can’t find their way out. They’re like bank robbers trapped in a vault.

When my dad was a kid, he worked for the maintenance division at an Oscar Mayer slaughterhouse. As the team’s youngest member, he was assigned all the gross jobs like cleaning out blood gutters on the kill floor. That was back when employers offered their workers benefits. Each year, Oscar Mayer provided a four-year scholarship to whichever employee scored highest on the SAT. Oscar Mayer paid his way through college.

At Iowa State, where he studied, they had a conveyor belt in the dining hall to take dirty dishes back to the kitchen. One time, he and his friends got a pig’s head from the plant and placed it on a greasy plate as it clattered into the wash room. A few seconds later, screams and the crash of broken glass erupted from the other side of the wall.

These days, pig heads, offal, feet, and faces are fashionable frontiers of the food world. In the last decade, Anthony Bourdain, Fergus Henderson, Michael Ruhlman, and other celebrity chefs and food writers have popularized the lost arts of charcuterie and “nose-to-tail” cookery. Tonight, thousands of hopeful lovers will take a risk and order head cheese on their Valentine’s Day dinner dates.

This renaissance, however, is taking some criticism. In particular, B.R. Myers’ “moral crusade” last week against foodie-ism in the Atlantic has raised a stir. Over the course of a long column, Myers seems to indict just about everyone who enjoys eating as a glutton, especially when it comes to meat, an ultimate source of “caloric wastefulness and environmental damage.” Clearly, “Myers is sitting very high on his horse,” as Robert Sietsema points out in a hilarious rebuttal: “Well, he can’t actually sit on the horse because that would be cruel.” It almost goes without saying that Myers is one of those vegans whose righteousness seems to impair his broader ethical senses.

But I do agree with Myers that the eating public, “foodies” included, should spend more time understanding the environmental and ethical consequences of their food habits. Despite his use of cherry-picked quotations, Myers does a fair job demonstrating the ridiculous explanations that certain food writers have offered to justify their voracious appetites for foods ranging from goat testicles to endangered species. But beyond the typical vegan responses, I wonder what Myers would say about the damage or wastefulness of eating a ubiquitous backyard squirrel.

I've thought about it, but I probably won't do it, even though they eat all the strawberries. I don't have the heart to kill one with a garden spade, and I'd be too embarrassed if a .22 ricochet broke my neighbor's window. These are tough, well-fed squirrels anyway, and I don't know if a pea-shooter is up to the task. I've seen them get run over by trucks and shake it off. Sometimes they even survive the winter.

Nope, this is not an argument for the arming of Tucson citizens with Batarangs.

There was a minor eruption in the blogosphere recently over a decision by DC Comics to create a spin-off of Batman, to be marketed in France, featuring a “Muslim Batman.” The storyline goes something like this: Bruce Wayne has decided to franchise his brand around the world, leading to the creation of Batman, Inc. While in France, he finds Bilal Asselah, who is arrested after being caught up in the 2005 immigrant riots while dressed up in spandex and running up walls, and Wayne picks Asselah to be Paris’s coolest new superhero, Nightrunner.

People were upset. Al Arabiyah reports that the outrage within France has been minimal, but blogs around the world are condemning the decision to make France’s Batman a Muslim. Immigrants, you see, “have been rampaging across the country for several years now,” according to one blogger. The “Angry White Dude” blog was confounded by the idea that a Batman character would subscribe to the “religion of murder.” But the argument that appears to have resonated the most is simply that France should have a French Batman. And apparently, Bilal Asselah isn’t French. Except he was born there.

“Apparently Batman couldn’t find any actual Frenchman [sic] to be the ‘French savior,’” complained one blogger. The issue is not simply one of religion, although attitudes toward Islam certainly play a major role in the backlash against a Muslim Batman. More acutely, French people (and apparently white people everywhere) don’t want France to be understood as a Maghrebi or non-white nation. There is a fear that the nation that created its national identity, according to Jules Michelet, around the superhero Joan of Arc is at risk of losing that identity to a more popular superhero. Everyone loves Batman, so with the creation of a “French” version, the stakes of identity are high.
Whose Paris?

So why are rabid right-wing Americans so concerned with Frenchness? One hears echoes of the birther movement -- even though Barack Obama was born in the United States, his most extremist critics push his perceived foreignness. Maybe they’re concerned about their own othered region to the south -- Mexico. The day of the shooting of Gabrielle Giffords, the New York Times published a story about the abolition of ethnic studies programs in Arizona public schools. (It didn’t get much attention due to the events in Tucson that day.) Because Tucson teacher Curtis Acosta’s class taught students to understand the history of United States imperialism and to explore the contingencies of state borders and national identities, it was understood to be sowing the seeds of division and hatred. Arizona likes its history like the French right likes its Batman: nativistic.

The opposition to the Muslim Batman probably won’t amount to much, but it did lead the creator of Nightrunner, David Hine, to respond to his critics. When asked what he thought about the charge that Bilal is not a Frenchman, he responded: “I’d like to see [the bloggers] in a room full of French Algerians making that point.” But when pressed, he stepped back from the controversy, insisting that he wasn’t trying to make a political statement, only trying to write good comics.

Comics, of course, have always been political, ever since DC published the first Superman story, in which the hero battled evil bankers. The early Superman comics were heavy with New Deal politics. Deeper readings would reveal that Superman himself was an immigrant, an intent often ascribed to his creators’ foreignness. Jerry Siegel and Joe Shuster were the sons of Jewish immigrants, at the head of a long line of Jewish immigrant and first-generation American comic book writers, including Will Eisner, Harvey Kurtzman, Jack Kirby and Stan Lee. Their creations, like X-Men, Mad Magazine, and even Captain America, have often questioned or satirized nationalistic or bigoted discourses. But explicit critiques of normative Americanness were off-limits, the writers’ own subjectivities bound by a heavily assimilationist culture.
Captain America took on these Tea-Partiers last year.

So, in imposing strict anti-immigration policies, abolishing ethnic studies, arming white citizens, and deporting anyone who looks funny, Arizona is doing something far worse than creating an atmosphere of fear and hatred and threatening thousands of people’s material well-being. The state is quite possibly preventing the appearance of the next Harvey Pekar (who passed away recently). And that is unacceptable.

Let’s hope that a Nightrunner movie gets made someday, to show popular audiences the vision of demotic cosmopolitanism that the Jewish-American comic writers fell short of.  If all goes well, M. Night Shyamalan will direct, and will cast Jim Carrey in the role of Bilal.

The Baseball Hall of Fame remains perhaps the most venerated sports institution in the United States, filled not only with the greatest players in the sport’s long history, but also with virulent racists, noted cheaters, drug users, and your everyday run-of-the-mill sociopaths. The process and guidelines for selecting which former players deserve induction into the Hall of Fame are notoriously ambiguous and contentious, leading each year to hilariously exhaustive exegeses or truly bizarre ruminations on the meaning of the word “fame” in “Hall of Fame.” For some writers, though, the real problem with the induction process has less to do with the vague guidelines than with the fact that each player remains eligible for fifteen years; for such writers, it should be immediately obvious whether someone is a Hall of Famer or not.

While having to debate the merits of Jack Morris or Tim Raines (no and yes, by the way) every year is no doubt frustrating, there’s a certain historiographical wisdom, too often overlooked, in a system that forces voters to constantly review and rethink baseball history in light of new knowledge. This was perhaps best demonstrated earlier this month with the selection of Bert Blyleven, a former pitcher whose reputation has grown substantially over the past ten years as advanced statistical analyses (known as “sabermetrics”) have become increasingly accepted within the notoriously stodgy baseball community.

Bert Blyleven: Hey, "taste" isn't a criterion.

In his second year of eligibility, Blyleven captured just 14.1 percent of the vote (75 percent is necessary for induction) and was widely regarded as a solid pitcher with a long career who was nonetheless distinctly unworthy of the Hall of Fame. Over the past decade, however, sabermetrics have revealed Blyleven to have been one of the greatest pitchers of his generation, chronically underrated, both while he was playing and afterward, because he was consistently saddled with terrible teams that masked his dominance. As a result, beneath the dry numbers of his voting record lies a fascinating shift in the way baseball history is understood and interpreted:

1998: 17.5%    2002: 26.3%    2006: 53.3%    2010: 79.7%
1999: 14.1%    2003: 29.2%    2007: 47.7%
2000: 17.4%    2004: 35.4%    2008: 61.9%
2001: 23.5%    2005: 40.9%    2009: 62.7%

At the conclusion of his career, most would have understood Blyleven to have been obviously unworthy of the Hall of Fame. In the intervening period, neither the facts about his career nor his statistical accomplishments changed. Yet the way we understand and periodize baseball has shifted so dramatically that he now seems an obvious Hall of Famer.

The wisdom of this system, I suspect, will be demonstrated even more fully over the next fifteen years, as more and more players tainted by proven or unproven associations with steroids become eligible for induction. This year, for instance, at least four candidates’ cases likely suffered because of alleged ties to steroids:

Jeff Bagwell         41.7%
Larry Walker         20.3%
Mark McGwire         19.8%
Rafael Palmeiro      11.0%
Had the steroid scandals of the previous decade never occurred, each of these players likely would have eventually entered the Hall of Fame; now their cases are in increasing doubt. Bagwell, in particular, probably would have gained entry this year as one of the greatest first basemen in history. But even though there is no concrete evidence linking him to steroids, Bagwell’s skill set as a home run hitter, his hulking body, and the era in which he played were enough for a substantial number of writers—posing as moral guardians of a sacred institution—to conclude that he might have used steroids and could therefore tarnish the Hall of Fame. Rob Neyer points out that, by 2015, there could be as many as 22 entirely deserving candidates who played during the “steroid era” up for election and therefore considered suspect.

At some point, I think, we will really have to question: what was the “steroid era” anyway, how does it differ from what preceded it, and how does it fit into the broader history of baseball?

The most common interpretation, helpfully illustrated here, understands steroids to have been introduced into baseball in 1988 by outfielder Jose Canseco, to have spread like wildfire while home runs flew out of the park at a record pace, and finally to have exploded into public consciousness with the BALCO case, becoming the sport’s single greatest embarrassment since the Black Sox Scandal (and that whole segregation thing).

Canseco is a reviled, bumbling, and bankrupt former star who was recently beaten into a whimpering fetal position in a schadenfreude-laden boxing match and, as such, makes for a convenient scapegoat. But, I think, there is good reason to be skeptical of this interpretation. Baseball, particularly if Ken Burns is to be believed, has always been approached as a representation of American identity and, as a result, the narrative of a “Steroid Era” draws upon—and feeds into—a popular narrative of national purity and innocence corrupted by modern excess—where have you gone, Joe DiMaggio? But, of course, neither baseball nor America was ever as pure and simple as people like to believe.

The first known use of performance-enhancing drugs in baseball was not by Canseco, but instead by Hall of Fame pitcher “Pud” Galvin, who used the “elixir of Brown-Sequard”—made by draining a monkey’s balls—to improve his performance, a decision which yielded accolades from the Washington Post: “If there still be doubting Thomases who concede no virtue of the elixir, they are respectfully referred to Galvin’s record… It is the best proof yet furnished of the value of the discovery.”

More seriously, though, performance-enhancing drugs were widespread in baseball by at least the 1950s, particularly in the form of amphetamines. Pitcher Jim Brosnan first revealed the widespread use of amphetamines by baseball players in his 1959 memoir, recounting how players believed they provided a physical and emotional boost during the grueling baseball season. And boy howdy were amphetamines popular, used by everyone from Hank Aaron to Willie Mays (who was reputed to have kept a bottle of “red juice”—a mix of speed and fruit juice—in his locker).

But it wasn’t just amphetamines—players were dabbling with nearly any drug or cocktail they believed could give them a leg up on the competition or quell their physical pain. Hall of Fame pitcher Whitey Ford, for instance, was alleged by one teammate to use dimethylsulfoxide to fight his aches and pains: “you rub it on with a plastic glove and as soon as it gets on your arm you can taste it in your mouth. It’s not available anymore though. Word is it can blind you.” And by the 1970s, of course, everyone was on cocaine and Dock Ellis was (hilariously) throwing no-hitters while tripping balls on acid:

If amphetamines were widely believed (even if such a belief was false) to improve player performance in terms of reaction and hand-eye coordination, it’s worth asking what precisely the ethical distinction between Barry Bonds using anabolic steroids and Willie Mays using Hi-C from hell is.

But it would be reasonable to believe that steroids were also widespread in baseball well before 1988. Former pitcher Tom House has not only admitted to using steroids himself during the 1960s, but also suggested it was a widespread practice throughout the major leagues, with “six or seven pitchers per team” taking steroids or HGH: “We were doing steroids they wouldn’t give to horses. That was the 60s, when nobody knew.” Zev Chafets has suggested that, during the famous 1961 home run chase between Mickey Mantle (easily considered one of the ten greatest players of all time) and Roger Maris, Mantle faded because of an abscess that developed on his hip after receiving a “vitamin injection” from a quack doctor who mixed steroids, amphetamines, multivitamins, enzymes, solubilized placenta, bone marrow, and animal organ cells.

Steroids were also known to be widely used in nearly every other sport throughout the 1960s. In 1969, a lawsuit by a former defensive lineman revealed that the San Diego Chargers gave their players—and sometimes forced them to take—anabolic steroids and amphetamines in an attempt to enhance their performance. Indeed, steroids were casually placed on the training table along with amphetamines, pain killers, and sleeping pills. The use of steroids was so prevalent that, by the summer of 1973, both the Subcommittee on Investigations and the Subcommittee to Investigate Juvenile Delinquency held hearings to determine the extent of drug use within American athletics.

As Craig Calcaterra suggests, though, the legislators’ efforts were particularly ineffective with baseball, which they problematically wanted to hold up as a model to other sporting organizations, perhaps in part because of its association with ideals of American purity. In response to the initial investigations, baseball commissioner Bowie Kuhn created an educational pamphlet and issued a statement that announced the sport had “no significant problem” with drug use, which was enough to gain a commendation from the Subcommittee Chairman that “baseball’s drug program [is] the best and most effective of its kind in sports.”

But not everyone was satisfied with this assessment. When Senator Birch Bayh tried to get Jack Scott, the founder of the Institute for the Study of Sports in Society, to praise baseball's anti-drug initiative, Scott adamantly refused. Instead, he suggested such anti-drug programs were merely a coverup that fixated on drugs like heroin and LSD instead of those that were actually in widespread use by athletes: “Here, for example, is a book that is put out by the office of the Commissioner of Baseball. The “Baseball vs. Drugs” and this is an education prevention program. There is not one mention of anabolic steroids in the entire booklet—one of the chief drugs athletes are using to help their performance."

So by the 1950s and 1960s, American athletics were saturated with a drug culture: performance-enhancers were widely available, admitted users have described their pervasiveness, and baseball players were widely known to be using amphetamines and other newfangled elixirs to gain an edge. Is it really a stretch to believe some of the greatest players of the 1960s and 1970s may have been using anabolic steroids in addition to amphetamines? Is it even possible that steroid use has declined overall since the 1960s, but merely become more effective? Lastly, what precisely is the ethical distinction between using drugs one thinks will yield performance-enhancement and using drugs that actually yield such enhancement?

I suspect such questions will be asked in upcoming years, as more and more people begin to delve into and challenge the romantic narratives of baseball's simple past. It will be interesting to see if, and how, our analysis of the last thirty years begins to change.