Deep Transfer Learning-Based Detection for Flash Memory Channels

Cited by: 4
Authors
Mei, Zhen [1 ,2 ]
Cai, Kui [3 ]
Shi, Long [4 ]
Li, Jun [4 ]
Chen, Li [5 ]
Immink, Kees A. Schouhamer [6 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Elect & Opt Engn, Nanjing 210094, Peoples R China
[2] Southeast Univ, Natl Mobile Commun Res Lab, Nanjing 210096, Peoples R China
[3] Singapore Univ Technol & Design, Sci Math & Technol Cluster, Singapore 487372, Singapore
[4] Nanjing Univ Sci & Technol, Sch Elect & Opt Engn, Nanjing 210094, Peoples R China
[5] Sun Yat Sen Univ, Sch Elect & Informat Technol, Guangzhou 510006, Peoples R China
[6] Turing Machines Inc, NL-3016 DK Rotterdam, Netherlands
Funding
National Natural Science Foundation of China
Keywords
Flash memories; Training; Threshold voltage; Channel estimation; Error correction codes; Decoding; Transfer learning; Data detection; error correction code; flash memory; neural network; transfer learning; READ; INTERFERENCE; OPTIMIZATION; RETENTION; RECOVERY; NETWORK; DESIGN;
DOI
10.1109/TCOMM.2024.3357616
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
The NAND flash memory channel is corrupted by different types of noise, such as data retention noise and wear-out noise, which lead to an unknown channel offset and make the flash memory channel non-stationary. In the literature, machine learning-based methods have been proposed for data detection over flash memory channels. However, these methods require a large number of training samples and labels to achieve satisfactory performance, which is costly. Furthermore, with a large unknown channel offset, it may be impossible to obtain enough correct labels. In this paper, we reformulate data detection for the flash memory channel as a transfer learning (TL) problem. We then propose a model-based deep TL (DTL) algorithm for flash memory channel detection. It can effectively reduce the training data size from 10^6 samples to fewer than 10^4 samples. Moreover, we propose an unsupervised domain adaptation (UDA)-based DTL algorithm using moment alignment, which can detect data without any labels. Hence, it is suitable for scenarios where the decoding of the error-correcting code fails and no labels can be obtained. Finally, a UDA-based threshold detector is proposed to eliminate the need for a neural network. Both the channel raw error rate analysis and simulation results demonstrate that the proposed DTL-based detection schemes can achieve near-optimal bit error rate (BER) performance with much less training data and/or without using any labels.
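The moment-alignment idea in the abstract can be illustrated with a small sketch. The snippet below is not the paper's algorithm; it is a hypothetical NumPy example, under assumed channel parameters, of how unlabeled target-domain read voltages might be standardized to the first two moments of a labeled source domain so that source-trained detection thresholds can be reused without any target labels. All names and values (`levels`, `src_volt`, `tgt_volt`, the 4-level MLC-like model, the drift coefficients) are illustrative assumptions.

```python
# Hypothetical sketch of moment-alignment-based unsupervised domain adaptation
# for threshold detection on a flash-memory-like read channel.
# It only illustrates the general idea: match the first two moments of
# unlabeled target reads to a labeled source domain, then reuse the
# source-domain thresholds.
import numpy as np

rng = np.random.default_rng(0)

# Assumed 4-level (MLC-like) source channel: labels known, mild noise.
levels = np.array([1.0, 2.5, 4.0, 5.5])            # nominal cell voltages
src_labels = rng.integers(0, 4, size=20000)
src_volt = levels[src_labels] + rng.normal(0.0, 0.25, size=src_labels.size)

# Assumed target channel: same data, but with an unknown gain/offset drift
# (e.g., retention or wear-out effects) and NO labels available to the detector.
tgt_labels = rng.integers(0, 4, size=20000)         # hidden, used only to score
tgt_volt = 0.9 * levels[tgt_labels] - 0.6 + rng.normal(0.0, 0.25, size=tgt_labels.size)

# Source-domain detection thresholds: midpoints between adjacent levels.
src_thresholds = (levels[:-1] + levels[1:]) / 2.0

def detect(volt, thresholds):
    """Map read voltages to level indices by simple threshold slicing."""
    return np.searchsorted(thresholds, volt)

# Moment alignment (first two moments): standardize the target reads, then
# rescale them to the source domain's mean and standard deviation.
aligned = (tgt_volt - tgt_volt.mean()) / tgt_volt.std()
aligned = aligned * src_volt.std() + src_volt.mean()

raw_err = np.mean(detect(tgt_volt, src_thresholds) != tgt_labels)
ada_err = np.mean(detect(aligned, src_thresholds) != tgt_labels)
print(f"error rate without adaptation:   {raw_err:.4f}")
print(f"error rate with moment alignment: {ada_err:.4f}")
```

In this toy setting the unadapted detector fails because the drifted voltages no longer straddle the source thresholds, while matching the first two moments restores near-nominal detection; the paper's UDA-based schemes address the same mismatch for real, non-stationary flash channels.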
Pages: 3425-3438
Number of pages: 14