The Reliability of Quantum Computers

OP · 2019-01-15 14:29:06

——————

This article is taken from The Economist, 2018.2.24, Science and technology section.

If you find it interesting, feel free to follow and share.

——————

The Reliability of Quantum Computers

——Before quantum computers can be put to use, they must first reach a certain level of reliability

Quantum computers perform their calculations with quantum bits (qubits), and by some accounts this is the future of computing. Quantum computers have a large theoretical advantage when it comes to certain mathematical problems, such as factorising large numbers, which are hard or even impossible for classical machines. This is because, unlike the bits of a classical computer, which have only the two logical states 0 and 1, a qubit can occupy many states at once [Translator's note: see the reference link below]. The more qubits a quantum computer has, the more complex the problems it can tackle. Finance, medicine, chemistry and artificial intelligence are all fields expected to be transformed by quantum computing.
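To make the scaling concrete: the state of an n-qubit register is described by 2^n complex amplitudes, so each extra qubit doubles what a classical simulation would have to track. Below is a minimal illustrative sketch (not from the article; it assumes Python with NumPy):

```python
import numpy as np

def uniform_superposition(n):
    """State vector of an n-qubit register with every qubit in an
    equal superposition of 0 and 1: 2**n complex amplitudes in total."""
    return np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)

for n in (1, 2, 10, 20):
    state = uniform_superposition(n)
    print(f"{n:2d} qubits -> state vector of {state.size:>9,d} amplitudes")
```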

And where the future is, there Google is sure to be. Google has already accumulated a great deal of quantum-computing expertise through its Project Bristlecone. The firm intends to build a "quantum supremacy" device, one that is demonstrably much faster than a conventional computer of equivalent size at solving particular mathematical problems. To that end, Google is currently preparing a 49-qubit device, rumoured to be ready before the end of the year. Though he did not confirm those rumours, John Martinis, the head of Google's quantum-computing project, discussed some of the project's problems at the AAAS (American Association for the Advancement of Science) meeting.

As a leader in quantum-computing research, Dr Martinis noted that a quantum computer is usually judged by the number of qubits it can handle; far less attention is paid to its error rate. Because the people building these experimental machines are physicists rather than engineers, they typically quote their best measurements when reporting qubit error rates, in order to show off the machine's capability. Such figures are of little interest to Dr Martinis and his team. They think like engineers, trying to build a robust device that works reliably. In his view, the common practice is therefore misguided: for a quantum computer, it is the worst error, not the best, that matters.
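A rough way to see why the worst error matters more than the best one: to a first approximation, the success probability of a circuit is the product of the fidelities of all the operations it uses, so a single bad gate or qubit drags down the whole computation. A small illustrative sketch (the numbers are made up, not Google's data):

```python
# Illustrative only: made-up fidelities, not measurements of any real device.
best_case = [0.999] * 50              # quoting only the best per-gate fidelity
realistic = [0.999] * 49 + [0.90]     # the same circuit with one poor gate

def circuit_success(fidelities):
    """Crude estimate: probability that every operation in the circuit
    succeeds, assuming independent errors."""
    p = 1.0
    for f in fidelities:
        p *= f
    return p

print(f"all gates at their best: {circuit_success(best_case):.3f}")
print(f"one gate at 90 per cent: {circuit_success(realistic):.3f}")
```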

Project Bristlecone's main concern is the quality of the qubits, in other words their reliability. If the hardware that physically carries out the calculation is unreliable, then the quantum computer's theoretical advantage over classical machines is meaningless.

At the moment, that is sometimes a problem. Preparing and maintaining qubits is a delicate, fiddly business, somewhat like classical computing before the silicon era. The slightest outside interference can disturb a qubit and ruin a calculation. Every one of Google's qubits sits on a chip with 120 wires coming out of it, each of which can introduce troublesome noise. Nor can quantum computers use the error-correction techniques of classical computers, which duplicate data outputs and check the copies against one another. Duplicating a qubit's output would mean measuring it prematurely, which would change its quantum state and wreck the calculation. Everything must therefore be, and remain, perfect.
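For contrast, here is a minimal sketch of the classical trick mentioned above: a repetition code that stores several copies of a bit and takes a majority vote. It works only because classical bits can be copied and read at will, which is precisely what measurement forbids for qubits (illustrative Python, not an actual quantum error-correction scheme):

```python
import random

def encode(bit, copies=3):
    """Classical repetition code: store several identical copies of one bit."""
    return [bit] * copies

def noisy_channel(bits, flip_prob=0.1):
    """Flip each stored copy independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in bits]

def decode(bits):
    """Majority vote: recovers the original bit as long as fewer than
    half of the copies were flipped."""
    return int(sum(bits) > len(bits) / 2)

random.seed(0)
trials = 10_000
failures = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"residual error rate with 3 copies: {failures / trials:.4f}")
```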

Classical computers are highly reliable; hardware failures are relatively rare (software is another matter), thanks to 60 years of continuous progress in the semiconductor industry. Quantum computing today is at a stage comparable to the era when classical computers ran their calculations on vacuum tubes and were the size of a room. In those machines the tubes frequently failed and caused short circuits. Such behemoths were able to evolve into the computers we use today because, at every stage of the journey, they were useful to someone and created value. Quantum computers will ultimately have to pass through the same stages.

 

————Reference————

1. NetEase Open Courses: http://open.163.com/movie/2016/3/I/3/MBHE0RSAC_MBHEIA4I3.html

————Original text below————

Quantum computing: Quality over quantity

——If quantum computers are to be useful, they must first be reliable

Calculating machines that run on quantum bits (known as qubits, for short) are, by some accounts, the future of computation. Quantum computers have the theoretical advantage that they can solve with ease certain mathematical problems, such as the factorisation of large numbers, which are hard or impossible for classical machines. This is possible thanks to a qubit's ability to remain, through the peculiarities of quantum mechanics, in many quantum states simultaneously. The more qubits a computer has, the more mind-bogglingly gigantic are the calculations it can handle. Finance, medicine, chemistry and artificial intelligence are thus all expected to be transformed by quantum computing.

And where the future is, there surely will Google be also. The firm sets great store by its quantum-computing project, which it calls Project Bristlecone. This is intended to develop a quantum-supremacy device, ie, one that is palpably and provably faster than a traditional computer of equivalent size at solving particular mathematical problems. Google is currently preparing such a device, which will have 49 qubits. It is rumoured that this will be ready for demonstration before the year is out. Though not confirming such rumours, John Martinis, Google's quantum-computing boss, told the AAAS meeting about some of the problems involved in doing so.

Leadership in the quantum-computing race, Dr Martinis said, is typically measured in terms of the number of qubits that a machine can handle. Less attention is paid to those machines' error rates. Since the people building such experimental machines are usually physicists, rather than engineers, they typically cite their best measurements when reporting qubit error rates, in order to show the machine's capability. That number is of little interest to Dr Martinis and his team, though. They are thinking like engineers, attempting to build a robust, working device. In this case, Dr Martinis says, it is the worst error, not the slightest, that is important.

Project Bristlecone's main concern is the quality of the qubits. The theoretical ability to beat a classical computer is of little use if the hardware that serves as the physical representation of those calculations is misfiring or dodgy.

At the moment, that is sometimes the case. Preparing and maintaining qubits is a delicate and fiddly process, akin to classical computing in the days before silicon. The slightest puff of interference from the outside world risks disrupting a qubit and scuppering a calculation. Every one of Google's qubits is held in a chip that has 120 wires coming out of it, each of which is capable of introducing troublesome noise. Nor can quantum computers rely on the error-correction techniques that classical computers use. Those duplicate data outputs and check them against each other. Duplicating the output of qubits would mean having to measure them prematurely. That would change the qubits' quantum states, wrecking the calculation. Instead, everything must be and remain perfect.

Such reliability has been mostly achieved in classical computing. Hardware problems are not unheard of, but they are relatively rare. (The software is another matter.) But that dependability is the result of 60 years of continuous improvement of solid-state silicon transistors. Quantum computing is now in the equivalent of the days of vacuum tubes running calculations in room-sized computers. And that was a world in which the tubes often blew, and bugs in the system were literal ones, namely insects that caused short circuits. Such behemoths were able to turn into today's sleek machines because, at every stage of the journey, they were useful. And that, ultimately, is the standard quantum computers will have to match.

