A hundred years ago quantum theory was born, first reported in the journal Nature (1) following a lecture by Max Planck detailing work that would later win him the Nobel Prize.
Since that time quantum theory has revolutionised science and the way we live, and continues to do so to this day. In last week’s post I talked about a few notable ways in which quantum theory is impacting our everyday lives. In today’s post we’re going to discuss yet another aspect of quantum theory; one that may alter our lives irrevocably in the years to come.
Another small part of Planck's quantum legacy, then, is the exciting and relatively new field of quantum computing, where once again Nature, in its 150th year, was at the forefront. In the issue of 24 October 2019 it was announced that Google's AI team had achieved quantum supremacy (2) for the first time. Its 'Sycamore' quantum machine had performed a complex calculation in a little under 4 minutes, one that, Google estimated, would have taken the world's most advanced 'classical' or traditional computer 10,000 years to complete!
Then, just weeks later, IBM announced that its quantum machine, Raleigh, had achieved a quantum volume of 32. And barely 6 months after that came the next great leap forward: Honeywell announced that its new quantum computer had doubled IBM's figure, achieving a quantum volume of 64 and effectively putting Honeywell ahead of the competition. Honeywell's President of Quantum Solutions, Tony Uttley, enthused that the company was 'closer to industries leveraging our solutions to computational problems that are impractical to solve with traditional computers'. He went on: "What makes our quantum computer so powerful is having the highest quality qubits with the lowest possible error rates!"
And having quality qubits seems to be paying off handsomely for Honeywell. Its next-generation quantum computer, the System Model H1, built with 10 trapped-ion qubits, has now achieved a phenomenal quantum volume of 512. That's an eightfold increase in less than a year. A truly astounding achievement!
But what does all of that mean? What is quantum volume and why would this put Honeywell at the head of the queue?
Quantum volume is a metric, created by IBM, that measures a quantum computer's overall capability, taking in both the number of qubits and their error rates. IBM announced that its own machine had reached a score of 32 in January 2020, an achievement now dwarfed by Honeywell's, made possible by the quality of its trapped-ion qubits.
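For the curious, the maths behind a quantum volume score is easy to play with. IBM defines quantum volume as 2 raised to the size of the largest 'square' circuit (equal width and depth) a machine can run reliably, so the base-2 logarithm of the score gives that effective circuit size. Here is a quick Python sketch (the function name is my own, not IBM's):

```python
import math

# Quantum volume = 2**m, where m is the size of the largest "square"
# random circuit (m qubits, depth m) the machine can run reliably.
def effective_width(quantum_volume: int) -> int:
    """Largest square circuit size implied by a quantum volume score."""
    return int(math.log2(quantum_volume))

for qv in (32, 64, 512):
    print(f"QV {qv} -> square circuits of size {effective_width(qv)}")
```

So Honeywell's jump from 64 to 512 corresponds to going from reliably running 6-qubit square circuits to 9-qubit ones.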
As a field of research though, quantum physics, despite all that is known, is still (most likely) in its infancy, but no doubt holds in its infinitesimally small arms a whole universe of amazing wonders that may have the potential to change not only how we see the world around us, but also how we interact with it. And the rise of Quantum computers is no exception.
But what exactly is a Quantum computer and how does it do its stuff?
In order to understand something about quantum computers, though, it is first necessary to know a little about the workings of a traditional computer, as well as a smattering of quantum theory itself.
Traditional computers complete a desired task (a calculation of some sort, for example) by converting the input data into a string of 0s and 1s, where each 0 or 1 is a 'bit' of information. An algorithm then manipulates this binary input to produce the output, another string of 0s and 1s, which encodes the result we see. But even the cleverest algorithm can only ever manipulate strings of bits that are each either a 0 or a 1, and at machine level this either/or dichotomy is represented by electrical circuits that are either open or closed.
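To make that concrete, here is a tiny Python illustration of my own showing how text becomes nothing but bits:

```python
# Everything a classical computer handles is reduced to bits. Here each
# character becomes its 8-bit binary code: just 0s and 1s.
def to_bits(text: str) -> str:
    return " ".join(format(byte, "08b") for byte in text.encode("ascii"))

print(to_bits("Q"))   # 01010001
print(to_bits("Hi"))  # 01001000 01101001
```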
Quantum computers, however, work in the microscopic world, where this dichotomy doesn't hold. Quantum (or sub-atomic) particles, such as photons or electrons, are not restricted to one state or the other: they can, in a sense, be a 0, a 1, or both at the same time. A photon, for example, may exhibit two polarisation states simultaneously. This is called superposition.
But herein lies one of the mysteries of quantum physics: the superposition itself can never be observed, because the moment it is measured, all but one of the possibilities disappear!
Why this should be so remains a mystery, but the behaviour of the superposition before measurement can nevertheless be described mathematically by Schrödinger's equation.
And it is superposition that frees a quantum computer from the dichotomous constraints that restrict traditional computers. A particle in superposition, used to carry information, is known as a 'qubit.'
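A qubit can be sketched, very loosely, as a pair of amplitudes over the states 0 and 1, where each squared amplitude gives the probability of seeing that outcome when you measure. The toy Python model below is my own illustration (real amplitudes are complex numbers, and this is no one's actual hardware):

```python
import random

# Toy qubit: amplitudes (a, b) over states 0 and 1, with a**2 + b**2 == 1.
# Measuring returns 0 with probability a**2 and 1 otherwise -- only ever
# one outcome, mirroring the "collapse" of the superposition.
def measure(a: float, b: float) -> int:
    return 0 if random.random() < a ** 2 else 1

amp = 2 ** -0.5  # an equal superposition of 0 and 1
samples = [measure(amp, amp) for _ in range(10_000)]
print(samples.count(0) / len(samples))  # roughly 0.5
```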
But within a system of two or more qubits, each qubit is unlikely to remain 'independent' and may become 'entangled' with any number of the others, so that when one qubit is measured as a 0, say, the result immediately tells you what you will see in the qubits entangled with it. What's more, qubits can be physically far apart and still be entangled!
This uniquely strange behaviour of entangled particles is what Einstein famously called 'spooky action at a distance.'
Now, for a computer trying to encode a system of several entangled qubits, it isn't just a matter of laying out a string of 0s and 1s. All of the correlations between the entangled qubits need to be described as well, and the number of values required grows exponentially with the number of qubits: a system of n qubits needs 2ⁿ of them. So a system of just 300 qubits already takes more numbers to describe than there are atoms in the observable universe!
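Those numbers are easy to check for yourself:

```python
# n entangled qubits need 2**n values to describe; at n = 300 that
# already dwarfs the roughly 10**80 atoms in the observable universe.
amplitudes = 2 ** 300
atoms_in_universe = 10 ** 80
print(amplitudes > atoms_in_universe)  # True
print(len(str(amplitudes)))            # 91 -- a 91-digit number
```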
Now that is a big WOW!
So, because the numbers get so big, a traditional computer cannot cope; scientists hope that a workable quantum computer could, and this is why such machines hold immense possibilities and potential.
Quantum computers take entangled qubits as input, process them using a quantum algorithm, and then produce output that is also in qubits.
And here lies the current stumbling block in developing a working quantum computer: because the output is in qubits, the results disappear as soon as they are observed, so the machine has to glean as much information as possible from what is, in effect, unobservable.
And unlocking that massive potential could have global ramifications on many fronts. Once a workable quantum computer is built, for example, it could decrypt much of today's encrypted data in a minuscule fraction of the time it would take the most advanced supercomputers on the planet.
This aspect alone holds huge (negative?) potential for both personal security and, on a wider scale, national security. Many organisations are already upgrading their encryption in anticipation of this eventuality, doubling or even quadrupling their key sizes (from, say, 64-bit to 128-bit, 256-bit and beyond).
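The arithmetic behind that key-doubling is worth a glance. Grover's quantum search algorithm can, in principle, find a key among 2ⁿ possibilities in roughly 2^(n/2) steps, effectively halving a symmetric key's bit strength; doubling the key length restores it. A minimal sketch:

```python
# Grover's algorithm searches 2**n keys in roughly 2**(n // 2) steps,
# so an n-bit symmetric key offers only about n/2 bits of security
# against a quantum attacker -- hence the doubling of key sizes.
def quantum_security_bits(key_bits: int) -> int:
    return key_bits // 2

for bits in (64, 128, 256):
    print(f"{bits}-bit key -> ~{quantum_security_bits(bits)}-bit quantum security")
```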
Within a computation, however, as the qubits interact with one another you get an 'exponential expansion in the number of values that can be considered at the same time', making certain types of calculation vastly more tractable. It is this amazing capability that Honeywell hopes to 'leverage' to industry, at rates of around $10,000 per hour!
That should pay the electricity bill one would have thought!
But it doesn't end there! Uttley says Honeywell 'aims to increase the quantum volume of its machine by a factor of 10 every year, reaching a score of 640,000 by 2025', making the once impossible now seem possible within a matter of a few years. And judging by the progress made so far, that target looks eminently achievable!
If achieved, such an advance could in theory threaten the 256-bit cryptography currently used by Bitcoin, making it possible for the 'owner' of a suitable quantum computer effectively to take control of the crypto-currency, which would inevitably send its value plummeting, with chaos in the financial markets doubtless following.
One upshot of this leap forward in quantum computing capability is that it has started an 'arms race' between those trying to protect our data and those developing technologies that could break through the sophisticated encryption now used so widely in industry. The cyber-security industry is fighting back, working on ways to prevent, or at least stall, the seemingly unstoppable advance of quantum technology.
What about legislation?
So the responsible governments of the world and their institutions need to get a legal handle on both sides of this developing drama before the technology falls into the wrong hands. Governments have, thus far, been relatively slow to get to grips with recent technological and digital advances, largely leaving the industries concerned to self-regulate.
But they cannot allow the same to happen with quantum technology. The pace of the advances made, and their potential for both good and bad, mean that legislators everywhere need to be ahead of the curve on this one, because the consequences of not being may well be catastrophic.
And surely the only way forward is a global agenda, set by an internationally recognised body of experts who can lay down the guidelines that must be followed to limit the potential for catastrophe on both a personal and a global scale. The possibilities for good here are immense, and it is vital that they be contained within a rules-based arena for the good of all, and not exploited by bad actors.
So, where now?
It wasn't so long ago that people in the know were talking in terms of decades, possibly even longer, before anything approaching a workable quantum computer would become viable. But things are moving at such a pace right now that that time-frame is shrinking rapidly.
At the back end of 2020, Chinese researchers boldly claimed to have wildly outclassed other quantum computers with a photonic (light-based) quantum computer, named Jiuzhang, that markedly reduced errors and performed at rates far outstripping its rivals.
Whilst in Switzerland, at the Paul Scherrer Institute, yet another approach is being explored, using qubits whose central elements are magnetic atoms of the so-called rare-earth metals. This method borrows the logic-gate idea from traditional computers, implementing 'controlled-NOT gates' (CNOT gates) alongside single-qubit gates, again with the aim of dramatically reducing errors and improving computational speed.
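The CNOT gate at the heart of that approach is simple to describe: it flips a target (qu)bit only when a control (qu)bit is 1. On the classical 0/1 states it behaves like this (a toy Python illustration of the gate's truth table, not the Paul Scherrer Institute's implementation):

```python
import itertools

# CNOT flips the target bit only when the control bit is 1 -- the quantum
# counterpart of classical XOR, and (together with single-qubit gates)
# enough to build any quantum circuit.
def cnot(control: int, target: int) -> tuple[int, int]:
    return control, target ^ control

for c, t in itertools.product((0, 1), repeat=2):
    print(f"({c}, {t}) -> {cnot(c, t)}")
```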
And then in October 2020 Peter Chapman, President of IonQ, made this announcement:
“I am incredibly excited to unveil IonQ’s new quantum computer, the most powerful on the market. The system smashes all previous records with 32 perfect qubits with gate errors low enough to feature a quantum volume of at least 4,000,000.”
What? A quantum volume of 4 million!! That’s outrageous!
Chapman rightly says this puts his team at the forefront of the industry as we move towards a quantum revolution and the accelerated development that may follow.
In a recent podcast (4) Chapman and his co-founder Chris Monroe spoke about how they are working towards a commercially viable quantum computer, one that could potentially be ready for sale and use within the next 3 to 4 years, which is truly amazing.
It was only very recently that researchers were talking in terms of decades for this possibility to become a reality, and yet here it is!
Though they admit that the possible applications of such a computer may at first be limited, they envisage that any limitations will be a consequence of the lack of appropriate quantum software rather than any deficiency in the hardware itself.
So what is it that the guys at IonQ are doing differently from everyone else that puts them potentially so far ahead of the game?
Chapman and Monroe suggest that they are approaching the project as an engineering problem, not a physics problem. As they put it, they are not trying to make the ions (or atoms; they don't differentiate between the two) that form the qubits; the ions are 'everywhere' and always have been. All they need to do is find a workable way to read out the data a qubit stores while keeping the error rate low enough to make a workable computer a viable proposition.
They achieve this using a trapped-ion approach, as opposed to the superconducting method utilised by Google and others. Every qubit has an error rate, they explained, regardless of how it stores data, but with their approach they can reduce the error correction needed during the actual computation to a fraction of that required by superconducting machines.
They explained that there are two types of qubit, physical and logical, and it is the ratio between the two that determines the computing performance of the machine; it is the logical qubits that store the observable data. By trapping the charged particles individually, using laser-controlled ion gates, they can bring that ratio down to about 13 physical qubits per logical qubit, and the error rate down with it. This is orders of magnitude more favourable than in superconducting machines, where the ratio can exceed several thousand to one.
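The practical impact of that ratio is stark once you scale it up. The numbers below are purely illustrative, based only on the ratios quoted above, not on any vendor's specification:

```python
# Physical-qubit overhead for a target number of logical qubits, at the
# roughly 13:1 trapped-ion ratio versus a thousands-to-one ratio.
def physical_qubits(logical: int, ratio: int) -> int:
    return logical * ratio

logical = 100
print(physical_qubits(logical, 13))    # 1300 physical qubits
print(physical_qubits(logical, 1000))  # 100000 physical qubits
```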
And because there are no superconductors present, they avoid the perennial and costly problem of cryogenic cooling, which makes the IonQ trapped-ion approach potentially far more scalable and cost-effective. By further reducing the distance between the laser source and the gates, Chapman and Monroe anticipate not only improving the error rate but also, as a commercially desirable side-effect, shrinking the overall size of the computer.
However, as touched upon above, the limitations of any such machine may lie more on the software side than the hardware side. Chapman explained that few people are currently writing quantum software: the languages are still being invented, and the lack of a 'BASIC' of quantum computing, to use PC-era speak, is likely to hold things back initially.
This burgeoning field is now attracting investors from many different industrial sectors (such as Daimler and Goldman Sachs) who are eager to explore the possibilities this exciting area of research is opening up. And with the financial investment, we may soon see the development of both hardware and software speeding up.
So we have perhaps the exciting prospect of a commercial quantum computer in the next few years, which is not something that I had ever thought I’d be able to write!
Thus the field of quantum computing is generating many innovative, even fantastical, approaches, some of which may develop the field in ways that are not yet readily apparent. Medical technology, materials science, drug manufacture, communications, supply chains and logistics are just a few of the areas that may be revolutionised in the near future through the application of quantum technology, and quantum computing in particular.
A workable quantum computer may still be a few years away, but forward thinking governments and institutions should be readying themselves, preparing the way and undertaking legislative planning for the quantum revolution that is surely just round the corner.
But what do I know? So please keep watching this space and I’ll endeavour to keep you informed.
Thanks for reading. Back again soon.
References:
1. Nature Podcast, 27 December 2019.
2. Nature, 24 October 2019, editorial: 'Precarious supremacy'.
3. Implications of Quantum Computing for Encryption Policy, Carnegie Encryption Working Group, 2019.
4. Exponential View podcast with Azeem Azhar, episode of 7 April 2021: 'Making Quantum Computers a Commercial Reality'.