From telltale dried dark smears at crime sites in legal procedurals to bright crimson gouts flying from wounds in fantasy epics to literal deluges of deep red liquid flowing across gory horror sets, blood suffuses modern film and television. Several genres are actually defined by the free flow of fake blood. (Splatter filmmakers often sell movies by touting their use of thousands of gallons in single scenes, and their fans catalogue content by volumetric carnage.) Blood is so commonplace that its absence in a show or movie can be more striking than its presence. “Even a casual viewer has plenty of opportunities to see blood on screen,” stresses Gregor Knape, a makeup artist who designs special effects like fake blood and wounds for Kryolan, a major film cosmetics firm.
Onscreen blood is so common, and often presented with so little fanfare, that it’s become the visual background noise of the modern mediascape—rarely noticed and almost never questioned because of its ubiquity. Even when people do register cinematic gore, few ever interrogate its presentation: broader taboos against showing or discussing blood outside of medicine and media leave most people without many reference points for comparison and scrutiny. Steffen Hantke, an expert on modern horror films, said that he hadn’t even realized, until The Blood Project reached out to him recently, “how little I have thought about [the way blood is presented onscreen], even as I’ve spent considerable time with films in which considerable amounts of blood are spilled.”
But if you make the effort to examine blood in entertainment, you’ll likely be surprised by how different it looks from show to show or film to film. (It can be anything from a thick and vibrant pink-red to a watery and dull rust brown.) Even individual filmmakers, like the prolific bloody horror director Dario Argento, seem to take wildly divergent approaches to designing and shooting blood at different points in their careers. Yet it’s hard to find reliable and cohesive explanations for the apparently ever-changing nature of onscreen blood across projects and through the generations.
This year, in honor of Spooky Season, The Blood Project decided to solve this mystery—by breaking down the evolution of cinematic blood from the pre-screen age to the modern era.
Before the Silver Screen
Folks often assume that, in ye amorphous olde days, people simply used real blood to depict gore in plays. Until the modern era, animal butchery was a more common aspect of daily life, and people had greater access to its bloody byproducts, after all. One persistent myth even holds that the ancient Romans executed condemned criminals onstage for maximal dramatic effect.
But real blood simply isn’t practical for use in any sort of production. Unless properly handled and preserved, it coagulates quickly upon exposure to air, which would have made it difficult to store and deploy as needed. Even before people understood germ theory, they also recognized that blood generally goes off quickly when left out in the open. Blood is a laundry nightmare to this day as well, so costume costs would have negated any potential authenticity or convenience.
Instead, most entertainment historians believe actors usually used a concealed red handkerchief, or simple artists’ paints, as unrealistic but cheap and accessible stand-ins. Crucially, these forms of fake blood often favored vibrant colors over accurate depictions so that people in the back of a large theater would still be able to see violence down on the stage. As always, theater required a degree of suspended disbelief—which audiences willingly granted in exchange for their violent delights.
The earliest recorded bespoke fake blood recipe comes from the Grand Guignol,1 an ironically small Parisian theater that opened in 1897 and earned a reputation for its baroque and (at the time) transgressive horror productions, which often featured borderline gratuitous violence. Experts disagree on the exact recipe, but at its core it was likely a mixture of lush red carmine dye and glycerol, expressly crafted for visibility and vivid impact rather than for authenticity.2
Although early horror films consciously borrowed beats from Grand Guignol productions in their efforts to build a visual language of shock and suspense, they conspicuously veered away from the theater’s love of blood, to cater to the sensitivities and taboos of wider audiences. Even the first vampire films (the oldest of which, Nosferatu, turned 100 this year) showed at most tiny dark smudges in quick shots, rather than the arterial spurts that pervade the sub-genre today.
Early American filmmakers especially faced sporadic censorship (sanctioned by court rulings that free speech protections didn’t apply to film), which encouraged extremely conservative depictions of violence to avoid legal and political blowback. In 1934, these piecemeal-to-chaotic prohibitions coalesced into the centralized Hays Code, which pushed the last vestiges of sex and violence, as well as swearing and moral ambiguity, out of the cinematic mainstream and onto its far fringes.
When filmmakers did portray blood—usually in films made outside of America or in fleeting glimpses stateside—their black-and-white medium meant that they didn’t have to consider its exact hue. Instead, they prioritized materials with dark colors and high viscosity, which would cut stark contrasts against light surfaces. Initially, that meant using oil-based paints. But after companies like Bosco and Hershey developed cheap, pre-mixed chocolate syrups for mass retail in the mid-1920s, directors quickly realized they offered visually compelling textures and switched en masse. Even the iconic blood in 1960’s black-and-white horror masterpiece Psycho is actually just a glop of chocolate syrup trickling from a bottle down towards a swirling drain.
Blood in Technicolor
Technicolor, the first prominent technique for color filmmaking, started out as an expensive and complex process when big studios popularized it in the 1930s. So small shops that went in for controversial content, like blood and gore, didn’t gain reliable access to color until the 1950s. But once they did, this technological shift forced them to reevaluate their usage of dark brown paint and syrup. Some reportedly attempted to recreate the Guignol’s recipe. But as Knape points out, the peculiarities of film lighting and processing imposed unique constraints on filmmakers.
“The incandescent spotlights they used have a very warm, almost amber hue to them. Everything needed to be shifted towards blue-violet to be processed correctly,” he explains. “Technicolor and related film techniques also had a very low color sensitivity. They were basically unable to record anything transparent.” So, if filmmakers wanted to show red blood, they’d need to make very opaque and very saturated reds to read on film. (Knape also suspects that crews’ eagerness to dive into vibrant colors led them to “overdo it a little bit—at least to our modern tastes.”)
For lack of experience and set formulas, filmmakers engaged in trial-and-error experimentation to develop Technicolor-ready blood. But by the 1960s a former pharmacist working for Hammer, the British studio (founded in 1934) that became famous for its gothic horror flicks, developed a blend of golden syrup, water, food dyes, corn flour, and, for blood-in-mouth scenes, flavoring, dubbed Kensington Gore (after a street in London). It was cheap, easy to modify in hue and consistency according to lighting and other technical needs, and easy to wash out of clothes, so it became the standard for fake blood. (Even Stanley Kubrick used it for The Shining’s elevator scene.) But, in line with the brightness and limitations of Technicolor, it was also nail-polish-glossy and thick.
“This fake blood was used more as a symbol,” Knape points out. “It was illustrative, and fit general color palette aesthetics.” It was also, he adds, just amusing at times, even if a little hokey.
Still, not everyone wanted the high melodrama of hokey-garish blood in their horror. Nor could shoestring productions afford full color. So well into the 1960s, films like Night of the Living Dead and Psycho opted to stick with black-and-white film and chocolate syrup blood to achieve their desired impact.3 Granted, they did innovate within their chocolate confines. Notably, Alfred Hitchcock made use of newly-developed plastic squeeze bottles for more realistic splatters and trickles. Meanwhile, over in Japan, Akira Kurosawa experimented with pressurized carbonation to get dynamic spurts of syrup from sword wounds in his early, blood-soaked samurai movies.
The social consequences of widespread experience with industrial warfare through the mid-20th century, decay of censorship regimes in the 1950s, and democratization of a wider and higher-fidelity range of color filmmaking techniques in the 1960s collectively opened the floodgates for fake blood experimentation, with an eye towards achieving greater realism. Notably, Herschell Gordon Lewis, the forefather of modern splatter cinema, struggled with the hyperreal color of Kensington Gore while working on 1963’s Blood Feast, his first major foray into horror. He hired a lab to make an alternative (a mix of red food dye and a common antacid) that he poured over raw offal from a butcher’s shop. But the end result was still simply off.
Around the same time, Dick Smith, a special effects technician whose mentors drilled the value of realism into him, tried his hand at developing a realistic alternative as well. He came up with a mix of clear Karo corn syrup, food coloring, and a few other additives, depending on whether or not actors would need to get any “blood” in their mouths or eyes for a film. The color came out close to that of blood flowing from a minor wound—deep red with a hint of brown—and it flowed with the right consistency. Film legends like Francis Ford Coppola and Martin Scorsese turned to Smith’s blood in their quest for high-impact gore.4 Soon enough, (slight variations on) his recipe became Hollywood’s standard.
Smith also participated in efforts through the 1960s to develop squibs—blood-filled packs that detonate on command—which allowed actors to react to real force as a sudden gush of blood poured from one or both sides of a wound. (Specifically, Smith pioneered techniques for embedding squibs under layers of fake skin on an actor’s body.) These techniques, explains Blair Davis, a film studies professor at DePaul University, removed the layer of artifice inherent in, for example, an actor clasping their chest at the sound of a shot and slowly grimacing their way to the ground while, after a quick scene cut, a line of blood trickles from under their fingers.
A few special effects shops have attempted to develop their own realistic recipes in the decades since Smith got into the game. But most practical effects experts have focused on adjusting his formula to fit their practical needs. Some developed additives to make his blood bead up like the real thing. Others toyed around with the exact mix of food colorings in order to fit their lighting and cameras, or to depict the difference between lighter arterial versus darker venous blood as needed. (“When on set, we must be ready to make final tweaks all the time,” Knape says, “because in the end it’s all about the lighting.”) The team behind 1981’s The Evil Dead notoriously mixed in non-dairy creamer to make the many mouthfuls of blood actors had to hold in more palatable.
Granted, there have been a few notable advances beyond Smith’s revolution, Knape notes, like the development of “scab effects, to show blood on older lesions.” He adds that “recently, Kryolan developed a gel that would build up in semi-spheres, to depict a wound in zero gravity.”
Although realism became de rigueur for filmic gore following the 1960s to the point that several companies gradually discontinued their own once-popular fake blood offerings, Knape points out that a few filmmakers still opted for unreal alternatives. Some simply put a practical premium on their superior washability, he explains. Some needed a truly vibrant red to stand out against dark sets. Some wanted the symbolic aesthetics, or the patent absurdity, of hyperreal blood. Notably, when someone asked famed auteur Jean-Luc Godard about the ample blood in his bright and heavily stylized Pierrot le Fou, he reportedly quipped back that it was “not blood.” It was “red.”
“When blood effects are obviously fake, I tend to giggle,” Knape adds. “Maybe that is not a bad thing. It might even be intended. The lawnmower scene in Peter Jackson’s Braindead comes to mind.” (Before he caught the public eye, Jackson cut his teeth on a series of tongue-in-cheek absurd splatter horror films, featuring over-the-top violence.) “It’s so bad that it’s good.”
Just as Technicolor necessitated changes in blood formulae, so have more recent innovations, like the advent of home video systems. As Davis points out, these tools allowed fans “to study how gory special effects are executed on screen. The ability to pause, rewind, and go frame-by-frame through moments of bloodshed designed to be fleeting experiences for theatrical film viewers has allowed fans to pick apart how filmmakers designed their bloodiest cinematic moments.” This, in turn, changed filmmakers’ calculus: blood effects now had to play not just to in-the-moment emotional reactions but to careful frame-by-frame scrutiny, often through the pursuit of ever higher degrees of realism, building on Smith’s work and on observations of their own blood.5
The switch from film to digital cameras in the final years of the 20th century also allowed for a more direct translation between how a set appeared in real life and how a scene would appear on a viewer’s screen. This likewise forced filmmakers to reconsider their formulae. But it’s also made it much easier to understand, in the moment, how blood on an actor will appear in a final product—a clarity that removes barriers between stylistic or realistic intents and end effects.
Do Computers Bleed Electric Blood?
In the 21st century, rather than rely solely on actual buckets of fake blood, a growing number of filmmakers are gradually turning towards computer-generated imagery (CGI) for some or all of their cinematic gore. Unfortunately, most early attempts at digitally animating blood into a scene—thanks to the nascent nature of the tech itself and a lack of trial-and-error practice—were as clumsy as the early color blood formula experiments. Clumsy CGI blood effects, Davis explains, appear “as unreal as if they had been added to the frame through hand-drawn animation.” This awkwardness, divorced from any stylistic goal, earned CGI blood a bad rap among film fans, who to this day urge directors to forego it in favor of the reliable impact of practical effects.
Yet rather than relent, filmmakers have leaned further into CGI. In 2007, the ever-competent and well-regarded director David Fincher used selective CGI to touch up blood in his serial killer thriller Zodiac. In 2012, The Expendables 2 made headlines for solely using CGI blood effects.
Some of this switchover is purely pragmatic. CGI allows filmmakers to forego the complex, and often costly, considerations that come with practical blood effects—like cleaning up gore after every take for a scene. But some of it is artistically aspirational. As Knape points out, CGI frees filmmakers from the constraints of physics, from the calculus of on-set lighting and blood formulas, and from practical concessions to cleanup burdens. And through that freedom, they can potentially control exactly where and how blood, of an exact consistency and hue, flows in a final product.
As this technology improves and grows cheaper—as access to and competency with CGI blood democratizes—the uncanny valley that separates all-digital blood from realistic practical gore effects is rapidly narrowing. The promise of the future of CGI blood is such that, as Knape puts it, “only purist [filmmakers] demand blood effects that are all physically present on a set” today.
The Future of Fake Blood
“It’s only taken a couple of years for CGI blood effects to change viewers’ expectations,” Knape adds. “Both visionaries and audiences have developed expectations [for] the visuals of bloodshed that are physically impossible, or at least extremely complicated, to realize without digital enhancement.” That is to say, they’ve circled around from expecting realism to craving the utter spectacle—the raw emotional effects—that can flow through competently crafted hyperreal blood.
No one’s sure where this digital frontier, and the changing attitudes it’s ushered in, will take us. But Knape expects that we’ll see a mixture of the more fantastical, physics-defying carnage that it offers, alongside an ironically careful attention to the complex details of blood realism.
“Aspects like color fidelity and control over blood’s reflectivity and opacity will become larger issues for digital blood in the future,” he argues. “At least in high-budget productions.”
But shoestring budgets and effects purism mean there’ll likely be a place for Guignol-style buckets of garish gore, dumped onto sets and over our screens, for years to come yet.
About the Author
Mark Hay is a Brooklyn-based freelance writer. His work has appeared in Atlas Obscura, Forbes.com, VICE, and dozens of other outlets. You can find more of his articles on blood, and many other topics, here.