Modern computers, like the one you're probably reading this on, are capable of performing the kind of calculations that, in the early days of computing, it would have taken a room-filling apparatus to do. Supercomputers, the modern incarnation of those room-fillers, can execute something close to a thousand trillion operations per second, crunching through incomprehensibly large amounts of information with relative ease. Unfortunately, even the mightiest of these modern analytical engines are cowed by the prospect of tackling the latest player to emerge onto the scene: big data.
Big data is the somewhat unimaginative term for “an accumulation of data that is too large and complex for processing by traditional database management tools.” Data is now gathered at an unprecedented rate, and the data itself is unprecedentedly complex. Though technology has made strides in deciphering huge amounts of data, our ability to collect it has outpaced our ability to analyze it. All that data is no good if no one understands the story it's telling, but a new way has been developed to manage these enormous data sets.
|This graphic view of editing activity by a single robot is just one way to visualize big data.|
Image Credit: Wiki user Fernanda B. Viégas; CC BY 2.0
Topology, the branch of mathematics concerned with properties of shapes that survive stretching and bending, has become an increasingly popular way to study large data sets. Topological data analysis not only examines a structure's individual parts but also lets a data set be viewed as one interconnected structure.
Despite its flexibility, topological data analysis is often too computationally demanding to be practical for real-world problems, even on the most advanced supercomputers. Enter quantum computers. While conventional computers process data as definite bits, each either 0 or 1, quantum computers (still largely experimental) would process data as quantum bits, or qubits. A qubit can be 0, 1, or a superposition of both. For certain problems, this gives quantum computers exponentially more processing power.
To use topological data analysis on a data set with 300 points, a conventional computer would require 2³⁰⁰ processing units, roughly the number of particles in the universe, MIT professor and study author Seth Lloyd explained in a Jan. 25 MIT press release. But a quantum computer would require just 300 qubits. “The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis,” the Nature Communications paper says.
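The gap between 2³⁰⁰ and 300 is hard to picture. A quick back-of-the-envelope calculation (illustrative only; the resource counts come from the press release, not from any real simulator) makes the scaling concrete:

```python
# Illustrative resource comparison from the article: classical topological
# data analysis on n points is said to need ~2^n processing units, while
# the quantum algorithm needs ~n qubits. These are headline figures, not
# a precise cost model.
n_points = 300

classical_units = 2 ** n_points  # an astronomically large integer
quantum_qubits = n_points        # just 300

# 2^300 is about 2 x 10^90 -- more units than there are particles
# in the observable universe (estimated at ~10^80).
print(f"classical: ~{classical_units:.3e} processing units")
print(f"quantum:   {quantum_qubits} qubits")
```

Even doubling the data set to 600 points only doubles the qubit count, while the classical requirement squares an already impossible number.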
Qubits tap into the weirdness of a quantum property called superposition. In superposition, subatomic particles are thought of as existing in all possible states at once, like overlapping waves. Those waves can add constructively into bigger waves, cancel each other out, or combine into anything in between. Superposition allows quantum computers to perform several tasks at once, whereas a conventional computer processor can only perform one task, albeit quickly, at a time. This is what University of Oxford physicist David Deutsch terms quantum parallelism.
|A look at the wavelike properties of superposition—a flat line can correspond to any number of equal-but-opposite wave amplitudes in superposition.|
Image Credit: Institute of Physics