When the original "RoboCop" premiered in 1987, the idea of a resurrected man in a machine body battling autonomous killer droids was only slightly less outlandish than the replicants of "Blade Runner" or a sociopathic artificial intelligence in "2001: A Space Odyssey."
Flash forward nearly three decades to the "RoboCop" remake. Our protagonist, Alex Murphy, is critically injured rather than brought back from the dead, then robbed of his agency, which leads this popcorn action flick to tackle a different breed of philosophical issues than its predecessor.
While the 1987 version spoke to the blurring line between man and machine and the moral responsibilities of science, the 2014 "RoboCop" addresses those meaty cyberpunk themes even more readily, thanks in part to the work being done in today's research labs at a pace that is rapidly closing in on science fiction.
"We wanted to take the elements -- electronics, robotics -- that we had learned through our process of research into the developments of the technologies happening right now. They may not be market-worthy, but the spark of these ideas is there," said "RoboCop" Production Designer Martin Whist. "We took the spark and embellished it to say that in the future, these things that are only sparks will be real."
Some of those elements will be real, and some are real right now.
"At the time of the original 'RoboCop' movie, this was all just fantasy," said Charles Higgins, an associate professor of neuroscience and electrical engineering at the University of Arizona. "Nowadays it's quite a lot more realistic. It's really not so much that the movie is inspired by real prosthetics as that it's become less fantastic."
RoboCop -- in essence a mixing of man and machine, brain and computer -- was originally born from the depths of science fiction. But now, Higgins points out, it is safer to call it a far-off but not wholly unrealistic ideal in neuromorphic engineering, an interdisciplinary field combining neuroscience, prosthetics, electrical engineering, and computer science that aims to understand how the architecture of biological nervous systems can be artificially replicated for both humans and robots.
Progress in brain-to-computer interfaces -- which augment and assist how our brains interact with machines and information -- holds the key to whole new realms of possibility. We may one day be able to restore accurate limb and sensorimotor function to injured or paralyzed individuals, as well as open up avenues to truly uncharted territory, like augmented cognitive function, direct visual overlays, and other forms of cybernetics.
Combine all that with the jaw-dropping advancements in nanotechnology -- namely the applications of carbon nanomaterial graphene for crafting super strong, conductive material -- and we have a present scientific landscape not too distant from the one featured in 2028 Detroit.
The future of prosthetics and brain interfacing
Creating crime-fighting cyborgs and replacing our healthy body parts with brain-controlled bionic alternatives sounds awesome, but for now it's completely impractical. The real advancements in neuroscience and prosthetic development lie squarely in the medical realm.
"We're looking at giving people confined to a wheelchair the ability to walk around. They're not going to be jumping over walls, flipping, or doing hand-to-hand combat," Higgins said. "That is a much higher level of interfacing. That's much farther in the future."
But thanks to the US military's aggressive funding initiatives aimed at improving the lives of veteran amputees, prosthetics are advancing at an astounding pace. Just this past fall, a team headed by Levi Hargrove at the Rehabilitation Institute of Chicago reported the first thought-controlled bionic leg. The device relies on rerouting severed nerves to healthy ones, allowing amputee Zac Vawter to control his ankle by contracting his hamstring. The process is automatic: Vawter needs only to think about moving his foot, and electrodes and algorithms translate the corresponding hamstring muscle signals into movement.
And it turns out that in the case of RoboCop, nerve rewiring and the delicate operation of brain interfacing play a large part in the updated condition of protagonist Murphy.
"In the movie, Murphy is severely injured. They dispose of most of his body except for an arm, head, lungs, and spinal cord. That's quite smart," Higgins explained. "It turns out the central brain, which we think of as the feed of our intelligence, is where our consciousness resides and most of what we think of as memory, while the spinal cord actually contains the fine muscle movements. The spinal cord is the only place that is stored. So if you severed that and kept the brain, you'd lose all that."
In the original film, RoboCop isn't as much an augmented human as he is a robot with the fluidity and agile precision of a well-trained athlete. The reboot -- which is headed up by Brazilian director José Padilha and stars Joel Kinnaman in the titular role -- presents a protagonist who is able to regain the movements he once knew (and then some) through an augmenting body.
"We'd want to preserve not just the central brain, but the spinal cord all the way down. That has a lot of info the person will have learned across their life to control their body," Higgins explained. "You'd want to tie that into all the motors and synthetic muscles. I suspect it was done for visual effect, but as a scientist, that was the right thing to do."
Another aspect of the film, pulled from the original and updated to match the science of today, is the idea that Murphy's brain is susceptible to overriding programs -- essentially reducing his thoughts to an artificial intelligence program that can be tricked into believing it's exercising free will.
"They make an attempt in the movie to stimulate the brain of Murphy to make him do what they want him to do. They stimulate the brain in order to control him, but he's not supposed to know he's being controlled," Higgins said.
Subtle mind control of someone with a cybernetic brain sounds fitting for fiction, but is, in fact, grounded in modern experimentation.
Researchers at Harvard Medical School this past summer were able to move a rat's tail by remote control using a noninvasive method: a human wearing a brain-to-computer interface sends electrical brain activity, picked up via electroencephalography (EEG), to a corresponding computer-to-brain interface connected to the rat. When the interfaces detect activity at a matching frequency, it is translated into a low-intensity, focused ultrasound pulse aimed at the specific part of the rat's brain that controls tail movement.
Later that same month, University of Washington scientists Rajesh Rao and Andrea Stocco completed the first noninvasive human-to-human brain interface experiment: Rao sent a signal, derived from brain activity recorded via EEG, over the Internet to Stocco, who received the hand motor command directly in his brain through transcranial magnetic stimulation and tapped a key on his keyboard.
Despite these advances, neuroscience is still a long way away from pioneering interface methods to control more powerful prosthetics without constant and risky surgery, let alone something that could be wired into a RoboCop-like nanosuit for augmented athleticism.
"EEG can't get enough info to control a prosthetic. What you can get through EEG is relatively slow; it's about a second and not enough information to control, say, an arm," Higgins said, mentioning that movement akin to a cyborg like RoboCop would require zero latency and a far more powerful communication method. That's where electrocorticography, or ECoG, comes in as a viable alternative.
"If you're willing to be somewhat invasive, something that would be very successful is ECoG. You open up the skull, and instead of penetrating the brain, you lay an electrode array on the brain, like a 2D sheet. No damage done," Higgins said. "You can then close up the brain. It takes a while to heal, but once it's closed, you can power it from a battery without cutting anyone open."
Higgins sees ECoG as playing an integral role in the future of interfaces. "You get a heck of a lot more than you do from an EEG. In the next 15 to 20 years, you'll see prosthetics coming from that. Those are getting implanted in hospitals right now for things like epilepsy," he said.
Graphene nanosuits straight from our video game fantasies
The US military and affiliated organizations, like DARPA and Raytheon, have put considerable effort into developing powered exoskeletons to enhance soldier performance, but human augmentation may have an equally bright future thanks to nanotechnology. Such advancements are featured prominently in video game conceptions of advanced tech like the Crysis series' nanoweave suit and the Halo series' Mjolnir battle armor.
The development of nanomaterials is something Whist and the film's design team used as a main inspiration when imagining an updated RoboCop. The new character's armor incorporates a core of carbon nanotube fiber woven into a full-fledged suit that sits between the outer armor and Murphy's cybernetic components.
"The main sort of revelation I had in terms of material with this is graphene. We saw how it was being used in testing and in the labs and it opens up so many doors for lightweight, barely visible, completely powerful material," said Whist.
If you've never heard of graphene, imagine a light, strong, and incredibly conductive material just one atom thick. Derived from graphite, the carbon mineral we use to make pencil lead, graphene and its properties had been known to scientists for decades, but they were unable to isolate it for production until 2004, when two Russian-born researchers at the University of Manchester, playing around with Scotch tape, discovered they could peel away individual flakes of graphite -- work that earned them a Nobel Prize in physics.
Since then, graphene research has exploded, with potential applications in everything from touch screen interfaces and silicon chip replacements to nuclear cleanup and desalinating seawater.
One of the most promising functions of graphene, from a military and law enforcement standpoint, is the ability to quickly produce individualized protective gear like vests, one step toward the possible development of the nanosuits dreamed up by science fiction.
"If you're talking about 2028 ... there's a high chance that graphene will be used in many applications," said Elena Polyakova, cofounder and CEO of the New York-based nanomaterials company Graphene Labs.
"You want to have a suit that is light, strong, and bulletproof and at the same time you want to embed electronics and sensors and so on. It's not just a strong suit, but it's a smart suit," Polyakova said of graphene's capabilities. Polyakova notes that 3D printing advancements are allowing the production of bullet-protection vests using individualized 3D models of the human body. One day it could be mixed with graphene composite materials to increase its strength and unlock the ability to embed electronics.
As for the development of a suit that could amplify someone's movement, Polyakova is not as optimistic.
"My guess would be no. Not in 15 years," she said. "If we're talking about high performance electronics, no. If we're talking about sensors and printable electronics, then most likely yes, it can be done."
In November, Graphene Labs launched a new company, Graphene 3D Labs, as a spinoff from the company's R&D efforts to "allow products with different components, such as printed electronic circuits, sensors, or batteries, to be manufactured." Polyakova expects graphene to revolutionize 3D printing and the road to 2028 to be paved with the accelerated development of graphene-based materials, which could be applied to car manufacturing, aerospace, computing, and more.
Killer robots still a long way off
One element of the original "RoboCop" that has persisted in the remake is the Enforcement Droid Series 209, or ED-209 for short. The autonomous, emotion-free killer robot acts as the antithesis to the still-somewhat-human cyborg.
The new film also adds an autonomous humanoid droid -- the EM-208 -- a prescient idea given the current political debates surrounding the use of drones domestically and abroad. That's to say nothing of what will inevitably be a long series of philosophical and existential discussions about the roles of robots in daily life as humanoid bots -- fully autonomous machines replacing guards, police officers, and infantrymen -- become a viable alternative to manning streets and security posts.
Integral to the remake's plot is a piece of legislation known as the Dreyfus Act, a publicly supported initiative banning the pairing of droids and firearms on US soil. The law is the reason the callous executives of OmniCorp put their faith in RoboCop to humanize robotic law enforcement -- ultimately an attempt to repeal the act and sell more of the company's autonomous products in the process. Higgins noted that, realistically speaking, OmniCorp would have designed cyborg policemen like RoboCop well before it could pioneer fully autonomous drones. The fact that RoboCop is new while ED-209 is existing technology is one of the film's more unbelievable elements.
"The way it proceeds in both movies, the ED-209 actually precedes RoboCop. In the new movie, OmniCorp is interested in selling its ED-209 robots, which are successful but the military is reluctant to use them," Higgins said.
Current humanoid robots, like Boston Dynamics' ATLAS, are being modified by robotics teams to perform tasks with the ultimate goal of defusing bombs and providing disaster relief and surveillance. And some critics openly question initiatives like DARPA's Robotics Challenge Trials as a discreet way to fund the development of robots as weapons. However, none of these machines contains an artificial intelligence system anywhere near the level depicted in Hollywood's robot tales -- and none will for some time.
"The truth in the real world is that we're much closer to making RoboCop than ED-209. The Robocop tech is an extension of existing prosthetics. The ED-209 is a true autonomous robot," explained Higgins. "It has cognition at the level of a cat or even higher. We don't know how to do that."
While it's humorous to imagine a killer drone thinking like a feline, simulating natural cognition at even its most elementary level is an astounding feat.
"A fully autonomous robot that is as intelligent as a cat," Higgins added, "is much farther in the future."