Microsoft is offering a quick quantum computing primer that makes the esoteric subject almost understandable.
Quantum computing promises to solve unimaginably large and complex problems that classical computing could never crack. That is, if it can ever be made to work in anything like a practical manner. Achievements to date have been modest.
Classical computing machines -- everything from your laptop and smartphone all the way to today's most powerful supercomputers -- "are basically bells and whistles on top of the original Turing machine," said Michael Freedman, head of Station Q, the Santa Barbara, Calif., facility where Microsoft does quantum computing research. The remark comes from a longer Microsoft article intended to describe the topic in a down-to-earth fashion and to introduce the people involved in the pursuit.
The model of computing described by the legendary Alan Turing essentially describes the way contemporary CPUs and algorithms work together.
In today's machines, data is represented by either a 1 or a 0 -- bits that are either on or off. Put enough of those ones and zeros together in a certain combination, and you have yourself an Excel spreadsheet, a weather-forecasting system, or a game like Candy Crush.
By contrast, in quantum computing, data is theoretically handled by qubits, which can exist as both 1 and 0 at the same time. That would open up vast new realms of scientific accomplishment. Other technology heavyweights, like Google, are also pursuing the technology.
"There are some problems so difficult, so incredibly vast, that even if all the computers in the world worked on the problem in tandem they would be sporting that little 'I'm thinking hard' hourglass for a long time," the Microsoft article says.
Quantum computing could solve these kinds of problems in days, if not hours.
Computer scientists find the prospect as daunting as it is tantalizing. Making the jump from classical computing to quantum computing, Freedman said, would be "kind of like getting a peek at the inner workings of the universe."
What's the connection with quantum physics?
Microsoft's video tries to spell that out. It begins by explaining how particles behave predictably at a large scale, but at the nanoscale it's "particles gone wild."
"If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet," the article quotes Niels Bohr as saying. The physicist won the 1922 Nobel Prize for his investigations of the structure of atoms, foundational work for quantum theory.
How shocking? At the molecular level "particles in a quantum state can teleport information from one place to another. Particles can also experience 'entanglement,' remaining eerily connected no matter how far apart they become...like separated identical twins...where one bumps her head in Paris and the other, in Los Angeles...starts to rub her sore noggin."
And it gets weirder. Particles can also achieve "superposition," where they exist in multiple states simultaneously.
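To make superposition a little more concrete, here's a minimal numeric sketch (ours, not Microsoft's): a qubit's state can be written as a pair of amplitudes, and the squared magnitude of each amplitude gives the probability of measuring a 0 or a 1.

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring the qubit yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))  # an equal superposition of 0 and 1

alpha, beta = plus
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(round(p0, 2), round(p1, 2))  # prints "0.5 0.5": each outcome equally likely
```

Until it is measured, the qubit genuinely carries both possibilities; measurement collapses it to one classical bit.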
So how does a quantum computer work?
"Classical computers attack problems like you would navigate a corn maze, those farm-size labyrinths popular in rural areas at harvest time. It proceeds down each long, stalk-lined corridor and at each fork, it picks one direction. If it reaches a dead end, it turns around, finds its way back, and tries another route until eventually it solves the maze," the article explains.
Quantum computing would blow past that limited way of thinking by way of its multitalented qubits: "Because of the bizarre properties of a quantum state, like superposition, a qubit can be a 1 or a 0 -- or it can operate as both a 1 and a 0 at the same time. If one qubit, as both a 1 and a 0, can do two calculations at once, then two qubits can do four, and things get exponential pretty quickly."
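That "exponential pretty quickly" claim is easy to check with a little arithmetic (our illustration, not from the article): describing an n-qubit register takes 2^n amplitudes, one for every possible string of classical bits.

```python
# An n-qubit register is described by 2**n complex amplitudes, one per
# classical bit string. This is the source of both quantum computing's
# promise and the cost of simulating it on classical hardware.
for n in (1, 2, 10, 50):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```

At 50 qubits the count already exceeds a quadrillion, which is why classical simulation of even modest quantum machines runs out of steam so fast.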