Trade-off between lithography line edge roughness and error-correcting code requirements for NAND Flash memories

Bibliographic Details
Published in: Microelectronics Reliability, Vol. 52, No. 3, pp. 525–529
Main Authors: Poliakov, Pavel; Blomme, Pieter; Vaglio Pret, Alessandro; Miranda Corbalan, Miguel; Gronheid, Roel; Verkest, Diederik; Van Houdt, Jan; Dehaene, Wim
Format: Journal Article
Language: English
Published: Kidlington: Elsevier Ltd, 01.03.2012
Summary: The only way to keep pace with Moore's Law is to use probabilistic computing for memory design. Probabilistic computing is 'unavoidable', especially when scaled memory dimensions go down to the levels where variability takes over. In order to print features below 20 nm, novel lithographies such as Extreme Ultraviolet (EUV) are required. However, transistor structures and memory arrays are strongly affected by the pattern roughness caused by the randomness of such lithography, leading to variability-induced data errors in the memory read-out. This paper takes a probabilistic, holistic look at how to handle bit errors in NAND Flash memory and at the trade-off between lithography processes and error-correcting codes needed to ensure data integrity.
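The error-correction principle the summary refers to can be illustrated with a minimal sketch using a Hamming(7,4) code. This is an illustrative stand-in only: NAND Flash controllers typically employ stronger codes such as BCH or LDPC, and the specific codes evaluated in the paper are not reproduced here.

```python
# Illustrative sketch: a Hamming(7,4) code corrects any single bit flip in a
# 7-bit codeword, the simplest instance of the ECC principle used to protect
# NAND Flash read-out against variability-induced bit errors.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based error position, 0 if no error
    c = list(c)
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[5] ^= 1                      # simulate one variability-induced bit error
assert hamming74_decode(codeword) == data
```

A higher bit-error rate from rougher lithography forces a stronger (and costlier) code; the paper's trade-off is between relaxing the lithography and strengthening the ECC.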
ISSN: 0026-2714; 1872-941X
DOI: 10.1016/j.microrel.2011.09.037