An invading force, 10,000 strong, marches through a storm toward a fortress built into the side of a mountain.
From a distance, the combatants look like ants -- menacing and alarmingly well organized. They rattle their spears and snarl through teeth that have never known modern dentistry, and when lightning strikes, it reveals their sheer numbers. Volleys of arrows fly, swords find their way to the weak spots around breastplates. Bodies on both sides hit the ground.
This bloody affair is the Battle of Helm's Deep, from The Lord of the Rings: The Two Towers. It was one of the first cinematic battles powered by Massive, a piece of software designed specifically for Lord of the Rings to create computer-generated armies using artificial intelligence to simulate realistic battles on a tremendous scale.
As movies like Star Wars revolutionized special effects and studios like Pixar shaped the use of computers in cinema, Massive, first built by Stephen Regelous for the world-renowned, award-winning visual effects studio Weta Digital, helped elevate what audiences now expect from onscreen battles.
Cutting-edge effects don't always hold up, but 20 years later, the Uruk-hai from Helm's Deep look every bit as real and threatening as they did when they first charged the mountain stronghold. And Massive has helped fuel some of the most iconic battle scenes of just the last few years, including the final season of HBO's Game of Thrones and Marvel's Avengers: Endgame.
"[You] can see the impact by simply looking at the films that the software has been used on since Lord of the Rings over the past 20 years. It's a pretty substantial list of films and TV," says Bob Thompson, founding director of the Bleier Center for Television and Popular Culture at Syracuse University.
Now Massive is competing in a more technologically advanced visual effects world, against other programs capable of crowd simulation and rising expectations for special effects.
From the Dothraki to the Wakandans -- when the fate of the world is on the line, Massive is still marching into battle.
Well before orcs and elves could start hacking at each other in the name of Middle-earth, Massive was literally a dream.
In 1993, Regelous, the creator of Massive, dreamed he'd walked into his office and found a group of people watching a computer simulation of a forest. Inside this computer forest world were trees, animals and weather -- all coexisting as they do in real life, running in real time.
The people, who turned out to be aliens (naturally), told Regelous how it worked: This universe was a system of nodes connected together to create the behaviors behind all the critters on screen.
The dream stuck with him. So when director Peter Jackson asked Regelous to code crowd simulation software at the wrap party for Jackson's horror comedy flick The Frighteners, Regelous had an inkling of how it could work. Though he initially turned Jackson down, he eventually holed up in his apartment, even coding himself a stopwatch to track that he was actually working for eight hours a day on the project.
Two years later, Massive's first battle wasn't nearly as detailed or realistic as Helm's Deep. Regelous went to Weta Digital, then situated in a small house in Wellington, New Zealand, and showed off a clash between 1,000 silver soldiers and 1,000 gold soldiers. Each character in a Massive simulation is called an agent.
At one point, it looked as though some of the soldiers were running away from battle, and the initial assumption in the room was that maybe the agents were smart enough not to want to get involved in a deadly conflict. In another instance, someone else on the Weta team pointed at a couple of soldiers fighting and thought he saw one soldier try to avenge another who had just been killed.
"Actually that's not what [was] going on," Regelous tells me, "but people make it real by seeing into the simulation things that aren't really there."
The screeching overhead is loud. Soldiers turn to the sky in terror to see winged monsters called fell beasts swoop toward them. The beasts pick up a handful of men like M&Ms in a bowl, fly into the sky and then drop them to their deaths. Bodies go limp as they hit the roofs down below, in the city of Minas Tirith.
If anyone thought the Battle of Helm's Deep was intense, the Battle of Pelennor Fields in The Lord of the Rings: The Return of the King was an even worse day in Middle-earth.
In the two years spent coding the program, Regelous made some crucial decisions about how to create the software that made a scene like this, and all that followed, possible.
Massive uses something researchers call "fuzzy logic." If traditional logic states that something is either true or false, fuzzy logic allows for the possibilities in between. In Massive's case, this means that if an orc walks up to an elf, there's a wide variety of options for how those two agents fight each other, based on logic rules written to guide how agents interact. Multiply that by thousands, and no two interactions play out identically.
"Human eyes are very good at picking out duplicates," says Martin Hill, Weta Digital visual effects supervisor.
Using fuzzy logic not only provides for unique interactions, but it's also more flexible than a neural network, Regelous tells me over Zoom. When you hear about artificial intelligence today, you often hear about neural networks, the technology underpinning deep learning, first proposed in the 1940s. A basic example might be an object recognition system that needs to be shown thousands of pictures of items in order to learn what they are. If Jackson had decided, for example, that a group of orcs needed to be more aggressive in a scene, retraining a neural network could take months. Massive allows those changes to be made on the fly.
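To get a rough feel for how fuzzy rules differ from hard true/false logic, here's a toy sketch in Python. None of this is Massive's actual code: the rule names, membership shapes and numbers are all invented for illustration, but the pattern -- degrees of truth blended into a decision -- is the same idea.

```python
# A toy fuzzy-logic controller for a single battle agent.
# All thresholds and rules here are hypothetical, not Massive's.

def triangular(x, lo, peak, hi):
    """Membership function: how strongly x belongs to a fuzzy set (0 to 1)."""
    if x <= lo or x >= hi:
        return 0.0
    if x <= peak:
        return (x - lo) / (peak - lo)
    return (hi - x) / (hi - peak)

def choose_action(enemy_distance, own_health):
    """Blend fuzzy rules into scores, then pick the strongest action."""
    # Fuzzy truth values, rather than hard true/false thresholds.
    enemy_close = triangular(enemy_distance, -1.0, 0.0, 10.0)
    enemy_far   = triangular(enemy_distance, 5.0, 20.0, 40.0)
    healthy     = triangular(own_health, 0.4, 1.0, 1.01)
    wounded     = triangular(own_health, -0.01, 0.0, 0.6)

    # Rules: "if the enemy is close and I'm healthy, attack", and so on.
    scores = {
        "attack":  min(enemy_close, healthy),
        "retreat": min(enemy_close, wounded),
        "advance": enemy_far,
    }
    return max(scores, key=scores.get)

print(choose_action(enemy_distance=3, own_health=0.9))   # a healthy agent attacks
print(choose_action(enemy_distance=3, own_health=0.2))   # a wounded one retreats
```

Nudging the whole crowd's behavior -- making the orcs more aggressive, say -- is then a matter of reshaping a few membership curves, not retraining a model.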
In this case, fuzzy logic only applies to an agent that's alive. Once that agent dies, another important part of Massive is activated: rigid body dynamics. It was the one bit of code Regelous couldn't write himself.
The idea is this: An orc gets hit with an arrow, dies and falls off a cliff, or a soldier from Minas Tirith smashes into a roof. Rigid body dynamics would account for the physics of that body going limp and falling now that the agent isn't acting on its fuzzy logic rules anymore. Because it's dead.
Regelous found a university student who'd written a rigid body dynamics engine and got non-exclusive rights to use it.
"That enabled us to have … all these beautiful physical interactions that would have been impractical [to animate] when you consider you've got thousands of these guys in the shot," Regelous says.
And remember those screeches from the fell beasts as they swooped menacingly overhead? Massive agents are designed to react to that too. They don't just respond to battle scenarios, but to sounds. Weta Digital found a way to integrate sound into Massive: the team could represent a sound with a cue in the software that registers with any agent fighting within virtual earshot. In other words, if they heard something, they could look up -- perhaps just in time to meet their end.
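A guessed-at sketch of that sound-cue idea: a noise event registers with every agent inside an earshot radius, and those agents react. The radius, class names and reaction here are invented for illustration.

```python
# Hypothetical sound-cue broadcast to agents within virtual earshot.
import math

class Agent:
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y
        self.reaction = None

    def hear(self, cue):
        # e.g. glance skyward toward a fell beast's screech
        self.reaction = "look up"

def emit_sound_cue(agents, sx, sy, earshot=30.0):
    """Notify every agent within `earshot` meters of the sound's origin."""
    heard = []
    for agent in agents:
        if math.hypot(agent.x - sx, agent.y - sy) <= earshot:
            agent.hear("screech")
            heard.append(agent.name)
    return heard

troops = [Agent("soldier_1", 5, 5), Agent("soldier_2", 100, 100)]
print(emit_sound_cue(troops, sx=0, sy=0))  # only soldier_1 is in earshot
```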
For the climactic Battle of Winterfell, in the final season of Game of Thrones, expectations could hardly have been higher. Fans look back with disappointment on the season, but the battle itself is one of the most iconic in pop culture history.
In this instance Massive had to animate what viewers couldn't see.
At the outset of the battle, the Dothraki army rides out, blades on fire, into the inky horizon where the White Walkers and their horde of the dead await. But then, one by one, the fires blot into darkness. The Dothraki, a seemingly indestructible band of fearless warriors, are dead, their fires extinguished.
Onscreen it seems subtle, small even. But the question of how to realistically show the fires going out was a challenge for the artists at Weta Digital.
Hand-animating the action would have been difficult, time-consuming and not necessarily convincing in terms of the quality of the movement. Instead, Weta Digital used Massive.
"There's a whole battle going on back there in Massive," Hill tells me. Dothraki clash with the wights, they fall off their horses, drop their blades -- all the carnage you might expect. The difference: Every single element of the battle, besides the flames, was rendered black. Audiences couldn't see a single thing going on in the scene. All that was visible was the blazing swords, their lights being quickly stamped out as they died at the hands of the White Walkers in the darkness.
Rigid body dynamics came into play in the episode as well, in another iconic moment in the battle. After Arya Stark kills the Night King, his armies shatter to pieces. Weta used a digital cue in the software to release the armies from their fuzzy logic rules and literally let them fall apart.
In planning the Battle of Winterfell, director Miguel Sapochnik wanted to draw inspiration from Helm's Deep. Luckily for him, the same artists who worked on the battle back then were still at Weta Digital.
"The DNA of [Helm's Deep] is passed through to the Battle of Winterfell," Hill says.
If the Battle of Winterfell represented the fate of Westeros, in pop culture there's another battle with even higher stakes: the final battle in Avengers: Endgame. On the line? Half the population of the entire universe.
Right around the point where hope seems lost, and even Cap looks done, portals open up and the reinforcements pour through. Phalanxes of good guys: The Wakandans, the Asgardians, the Ravagers and more show up for a final throwdown with Thanos and his infestation of henchmen. They charge each other. In the background of various shots, you can see the combatants running across mounds of debris, bodies hurtling through the air.
Though audience members might not realize it, they're watching AI agents with different fighting styles and weapons going up against each other -- some with swords, some with shields and spears, some airborne with sparks flying from their hands.
To create distinctive battle patterns for the different groups, Weta relied on motion capture, allowing Massive to give each agent an arsenal of moves to choose from.
Endgame was one of the biggest projects the company had tackled in a while, Weta Digital visual effects supervisor Matt Aitken says. Recording the specific fighting styles was a three-day process, preceded by research into previous movies to catalog the moves that had already been established. The motion capture artists made a variety of fighting vignettes the agents could draw from.
Not only are all the fighting styles different, the agents are different. Back in the Lord of the Rings days, Jon Allitt, head of the crowds department at Weta Digital, created a tool in Massive called Orc Builder that could randomly generate different variations of orcs based on characteristics like height and limb length. Orc Builder is now called Body Page, and it worked the same way in Endgame.
"We don't want two guys to ever look the same," Allitt said at the time during a DVD featurette on the visual effects behind Lord of the Rings: The Two Towers.
Massive's not the only way to draw a crowd. Since Regelous, who retained the rights to his creation, turned Massive into its own company in 2003, there have been other programs that simulate crowds. And filmmakers have a number of tactics to fill arenas, stadiums and the like.
"Visual effects, at its very core, are not about being a perfect simulation of reality. It's about making you believe it," says Gray Marshall, industry veteran and chair of the department of visual effects at the Savannah College of Art and Design.
Visual effects artists on a film might decide to make a composite. They might shoot a group of 20 to 30 people, have them change costumes, move them around to another spot, shoot them again, rinse and repeat -- then put all of it together to fill a stadium, the steps of a building or whatever the case may be.
Another method involves essentially placing digital green-screen cards in the seats of a stadium, for example, and projecting people onto those cards.
Once, Marshall had to fill 90,000-ish seats in Wembley Stadium, and did it with flesh-colored digital grass blowing in the wind because so little sharp detail was needed.
"A human very quickly just becomes this fleshy face color, a band of hair color, a band of color for the shirt, and a band of color for the lower 60% of pants," he says.
Why anyone decides on one technique over another depends on the needs of the film, the complexity of the crowd, the budget and the preferences of the visual effects artists.
Allitt says that Weta Digital doesn't always use Massive for crowd simulations.
There are also other special effects programs, like the 3D animation software Houdini, which does a lot more than crowd simulation, including compositing, modeling and lighting; or Miarmy and Golaem Crowd, which are both plugins for Maya, Autodesk's 3D computer graphics program.
From Marshall's perspective, though there's clearly overlap, they're all somewhat different, with different uses in different situations: "It's like comparing Ferrari to Toyota."
For all the conflict Massive generates for movies and television, its own story is relatively free of drama. When I talk to Regelous and various folks from Weta Digital, their highlights and pivotal moments often include instances like when everyone got 64GB workstations, or when they designed a new file format that would be less of a strain on the system.
Regelous remembers building a fabric simulator one Easter, finishing right before dinner. Aitken remembers having to run simulations overnight for the sake of not slamming the network, and coming in the next morning to "see what bloodbath had ensued."
Today at Weta Digital, the crowds team consists of just five people.
"I don't know if they quite realize what visual power they're wielding," Aitken says.
Regelous, still helming the company, says he's always trying to figure out how to keep moving forward. That might mean giving filmmakers the ability to see special effects almost immediately during production. The Mandalorian, for example, made headlines when it premiered for using the real-time rendering of Epic's Unreal Engine to create immersive, computer-generated sets. So instead of inserting CG environments into green-screen footage after the fact, that work happens during filming.
A new version of Massive -- Massive 9.0 -- is slated for release, boasting new capabilities like improved compatibility with other software, such as Autodesk's Maya.
Hill thinks it's an anomaly that a piece of software has carried through for so long.
"It evolved and got better," Hill says, "but the core elements are still what they were 20 years ago." After hundreds of movies and TV shows -- too many for Regelous to have kept track of -- and a few Emmys and Oscars, Regelous is still focused on pushing Massive ahead. Granted, it's got quite a head start.
"It's still current," Allitt says "[Massive] was 20 years ahead of its time 20 years ago."