Surface code error correction on a defective lattice

Cited by: 24
Authors
Nagayama, Shota [1]
Fowler, Austin G. [2]
Horsman, Dominic [3]
Devitt, Simon J. [4]
Van Meter, Rodney [1]
Affiliations
[1] Keio Univ, 5322 Endo, Fujisawa, Kanagawa 2520882, Japan
[2] Google Inc, Santa Barbara, CA 93117 USA
[3] Univ Durham, Dept Phys, South Rd, Durham DH1 3LE, England
[4] RIKEN, Ctr Emergent Matter Sci, Wako, Saitama 3150198, Japan
Source
NEW JOURNAL OF PHYSICS | 2017, Vol. 19
Keywords
quantum error correction; surface code; topological quantum error correction; qubit loss; fault tolerant quantum computation; QUANTUM COMPUTATION; IMPLEMENTATION;
DOI
10.1088/1367-2630/aa5918
Chinese Library Classification
O4 [Physics]
Subject classification code
0702
Abstract
The yield of physical qubits fabricated in the laboratory is much lower than that of classical transistors in production semiconductor fabrication. Actual implementations of quantum computers will therefore be susceptible to loss in the form of physically faulty qubits. Although these physical faults inevitably degrade the computation, we can cope with them by adapting error-correction schemes. In this paper we simulate lattices with a single, statically placed fault and lattices with randomly placed faults at functional qubit yields of 80%, 90%, and 95%, showing the practical performance of a defective surface code by employing actual circuit constructions and realistic errors on every gate, including identity gates. We extend the superplaquette solution of Stace et al, originally developed for dynamic losses in the surface code, to handle static losses such as physically faulty qubits [1]. The single-fault analysis shows that a static loss at the periphery of the lattice has a smaller negative effect than one at the center. The random-fault analysis shows that a 95% yield is sufficient to build a large-scale quantum computer: the local gate error rate threshold is approximately 0.3%, and a code distance of seven suppresses the residual error rate below the original error rate at p = 0.1%. A 90% yield is also sufficient if badly fabricated quantum computation chips are discarded, whereas an 80% yield does not provide adequate error suppression even when 90% of the chips are discarded. We evaluated several metrics for predicting chip performance and found that the average, over stabilizers, of the product of the number of data qubits and the stabilizer measurement cycle time correlates most strongly with the logical error rate. Our analysis will help in selecting usable quantum computation chips from the pool of all fabricated chips.
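To make the chip-selection metric concrete, here is a minimal Python sketch that ranks fabricated chips by the average, over stabilizers, of (number of data qubits) × (stabilizer measurement cycle time), the quantity the abstract reports as most predictive of the logical error rate. The data model (Stabilizer, Chip), the helper select_chips, and the toy cycle times are assumptions made for illustration, not the paper's simulation code.

    # Illustrative sketch (assumed data model, not the paper's code): rank
    # surface-code chips by the average, over stabilizers, of
    # (#data qubits) * (stabilizer measurement cycle time).
    from dataclasses import dataclass
    from statistics import mean
    from typing import List

    @dataclass
    class Stabilizer:
        n_data_qubits: int  # data qubits touched by this plaquette/superplaquette
        cycle_time: float   # time units for one measurement of this stabilizer

    @dataclass
    class Chip:
        name: str
        stabilizers: List[Stabilizer]

    def chip_metric(chip: Chip) -> float:
        """Average of (#data qubits * cycle time) over all stabilizers; lower is better."""
        return mean(s.n_data_qubits * s.cycle_time for s in chip.stabilizers)

    def select_chips(chips: List[Chip], keep_fraction: float) -> List[Chip]:
        """Keep the best-scoring fraction of fabricated chips, discard the rest."""
        ranked = sorted(chips, key=chip_metric)
        n_keep = max(1, int(len(chips) * keep_fraction))
        return ranked[:n_keep]

    if __name__ == "__main__":
        # Toy example: a defect-free chip (weight-4 plaquettes only) versus a chip
        # where a faulty data qubit forced two plaquettes to merge into a
        # superplaquette involving more data qubits and a longer measurement cycle.
        perfect = Chip("perfect", [Stabilizer(4, 1.0)] * 8)
        defective = Chip("defective", [Stabilizer(4, 1.0)] * 6 + [Stabilizer(6, 2.0)])
        for chip in select_chips([perfect, defective], keep_fraction=0.5):
            print(chip.name, round(chip_metric(chip), 2))

In the paper's setting, a superplaquette created by a faulty qubit both involves more data qubits and takes longer to measure, so both factors in the product grow around defects; the toy numbers above only illustrate the ranking, not the paper's measured values.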
Pages: 28
Related papers
42 in total
  • [1] Aliferis P, Brito F, DiVincenzo D P, Preskill J, Steffen M, Terhal B M. Fault-tolerant computing with biased-noise superconducting qubits: a case study. New Journal of Physics, 2009, 11.
  • [2] [Anonymous], PHYS SCR
  • [3] [Anonymous], arXiv:1108.5738
  • [4] [Anonymous], ARXIV9811052
  • [5] [Anonymous], SIGARCH COMPUT ARCHI
  • [6] [Anonymous], 2006, P 1 INT C PERF EV ME
  • [7] [Anonymous], NAT COMMUN
  • [8] [Anonymous], J EMERG TECHNOL COMP
  • [9] Bacon D. Operator quantum error-correcting subsystems for self-correcting quantum memories. Physical Review A, 2006, 73(1).
  • [10] Barrett S D, Stace T M. Fault tolerant quantum computation with very high threshold for loss errors. Physical Review Letters, 2010, 105(20).