Digital Storage And Memory Projections For 2023, Part 2

This is the second in a set of three blogs on digital storage and memory predictions for the coming year, a series we have been doing for some time. The first blog focused on the latest developments and predictions for magnetic recording (HDDs and magnetic tape). This blog focuses on the different types of solid-state memory and storage, as well as DNA for storage. We will cover the latest developments in computational storage, NAND flash, DRAM, NVMe and NVMe-oF, and CXL, and how they will change the way we do computing. We will also talk about life after Optane and how its end will affect the growth and evolution of non-volatile memory technology.

By the end of 2022, demand for all storage and memory technologies for consumer, client and server applications had declined. At the same time, new DRAM and NAND manufacturing capacity came online, increasing product availability (pandemic-related supply chain issues aside). This has lowered the cost of NAND flash and DRAM. At the end of November, TrendForce reported that NAND flash revenue fell 24% quarter over quarter in the third quarter of 2022.

However, beyond 2023 there are many drivers for higher storage demand, which will in turn drive demand for storage devices. Indeed, in mid-December 2022 SEMI reported that the global chip industry is expected to invest more than $500 billion in new factories by 2024, with part of that investment going into memory chip factories (although new semiconductor facility starts are expected to decline by about 15% in 2023 compared to 2022).

NAND flash is the dominant primary storage (storage for data being actively processed) in data centers and enterprise applications, and is often the only storage in personal computers and consumer devices such as smartphones. In large facilities, active data resides on SSDs, while cold data is kept on hard disk drives (HDDs) and magnetic tape.
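
To make the tiering idea concrete, here is a minimal, purely illustrative Python sketch of how a hot/cold placement rule might route data by access age. The thresholds and tier names are invented for the example and are not any vendor's actual policy engine.

```python
from datetime import datetime, timedelta

# Illustrative only: a toy hot/cold tiering rule, not any vendor's actual policy.
def choose_tier(last_access: datetime, now: datetime) -> str:
    """Route data by age since last access: hot -> SSD, warm -> HDD, cold -> tape."""
    age = now - last_access
    if age < timedelta(days=30):
        return "nvme_ssd"   # active (primary) data stays on flash
    if age < timedelta(days=365):
        return "hdd"        # cooler data moves to disk
    return "tape"           # archival data lands on tape

# Example: a file untouched for ~90 days would be placed on HDD.
print(choose_tier(datetime(2022, 9, 1), datetime(2022, 12, 1)))
```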

NAND flash is now available with up to 232 layers from Micron (for consumer SSDs), and SK hynix (which also owns Solidigm, formerly Intel's NAND flash division) has announced that it is making the first 238-layer 512Gb TLC NAND flash die, slated for volume production. In 2023, we expect NAND flash with 200+ layers to gain market share, and perhaps the first NAND approaching 300 layers to be announced.

However, layer scaling isn’t the only way to increase memory density in NAND flash. NAND scaling was discussed at FMS 2022 by Kioxia and its partner WDC. Kioxia’s keynote image shows that lateral scaling (the size of the cells and their spacing from each other) is another important scaling dimension.

In addition to lateral and vertical scaling, there is also architectural scaling, which places different types of semiconductor devices on top of each other to save space, including bonding separately manufactured NAND cell and logic dies together (as advocated by China’s YMTC).

Logical scaling refers to how many bits are stored per cell, with TLC (3 bits per cell) and QLC (4 bits per cell) available in many applications today, and PLC (5 bits per cell) possible in the future. This logical scaling trades density against cell retention time and endurance. WDC said it expects more than 500 NAND flash layers by 2032; total NAND production capacity was 765 exabytes (EB) in 2021 and is expected to exceed 2 zettabytes (ZB) in 2025.
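
A quick back-of-the-envelope calculation shows what logical scaling buys. The cell count below is an arbitrary illustrative figure, not any shipping die:

```python
# Back-of-the-envelope: capacity of a hypothetical NAND array at different bits per cell.
CELLS = 128 * 2**30  # assume a 128 Gi-cell array -- an illustrative figure, not a real die

for name, bits in (("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4), ("PLC", 5)):
    gigabits = CELLS * bits / 2**30
    print(f"{name}: {bits} bit(s)/cell -> {gigabits:.0f} Gb from the same cell array")

# Going from TLC to PLC yields ~67% more bits from the same cells,
# at the cost of retention margin and program/erase endurance.
```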

Samsung is also exploring ways to stack NAND dies to create denser storage devices. The chart below shows a projection in which a 32-die stack enables a 1PB device within about 10 years (by 2032). NVIDIA is interested in using PB-class NAND devices with its GPUs.

DRAM scaling is also continuing. Samsung is the world’s largest DRAM manufacturer and at its 2022 Tech Day, it shared its DRAM roadmap below.

Samsung’s upcoming DRAM solutions include 32Gb DDR5 DRAM, 8.5Gbps LPDDR5X DRAM, and 36Gbps GDDR7 DRAM. Samsung also talked about custom DRAM solutions such as high bandwidth memory with processing-in-memory (HBM-PIM), Acceleration DIMM (AXDIMM) and CXL memory.

There are many types of computational storage devices and architectures available. These include DPU-based computational storage on the network, such as that available from NVIDIA (Mellanox), and SSDs with built-in compute capabilities from major SSD companies. Moving computing closer to, or onto, storage devices reduces data movement (thus reducing system power requirements and latency) and also offloads the CPU for some computational tasks. We anticipate that in 2023 various computational storage devices will become more common for a variety of computational tasks.
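
As a conceptual illustration (not a real computational storage API), the sketch below contrasts filtering data on the host, where every record must cross the storage interface, with filtering on the drive itself, where only the matching records move:

```python
# Conceptual sketch only -- not a real computational storage API.
# It contrasts "ship all the data to the CPU" with "push the filter down to the drive."

RECORDS = [{"id": i, "temp": 20 + (i % 15)} for i in range(100_000)]  # data "on the drive"

def host_side_filter(records, threshold):
    # Conventional path: every record crosses the storage interface, then the CPU filters.
    moved = len(records)
    hits = [r for r in records if r["temp"] > threshold]
    return hits, moved

def on_drive_filter(records, threshold):
    # Computational-storage path: the drive's embedded processor filters in place,
    # so only matching records cross the interface.
    hits = [r for r in records if r["temp"] > threshold]
    return hits, len(hits)

for fn in (host_side_filter, on_drive_filter):
    hits, moved = fn(RECORDS, 30)
    print(f"{fn.__name__}: {moved} records moved across the storage interface")
```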

NVMe is now the dominant flash memory interface, and NVMe-oF (NVMe over fabrics), where the fabric is often Ethernet, is becoming commonplace in data centers. NVMe-oF is being used to create pools of solid-state storage that can be shared between CPUs and servers. This pooling and sharing of storage is called disaggregation (breaking the various parts of a server into shared pools), and software can use it to create a composable infrastructure in which virtual devices or containers can be created or removed as needed. This kind of pooling and composability is being extended to memory with the Compute Express Link (CXL) interconnect.
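
A hypothetical sketch of the composability idea: virtual servers are assembled on demand from shared memory and storage pools. The class names, pool sizes and allocation logic below are invented for illustration and do not correspond to any vendor's orchestration API:

```python
from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    capacity: int          # e.g. GB of NVMe-oF flash or CXL-attached memory
    allocated: int = 0

    def carve(self, amount: int) -> int:
        # Hand out a slice of the shared pool, failing if it is exhausted.
        if self.allocated + amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.allocated += amount
        return amount

@dataclass
class ComposedServer:
    cpu_cores: int
    memory_gb: int
    storage_gb: int

def compose(cores: int, mem_gb: int, stor_gb: int,
            mem_pool: Pool, stor_pool: Pool) -> ComposedServer:
    """Assemble a 'virtual server' from shared memory and storage pools."""
    return ComposedServer(cores, mem_pool.carve(mem_gb), stor_pool.carve(stor_gb))

cxl_memory = Pool("CXL memory", capacity=4096)        # 4 TB of pooled memory (illustrative)
nvmeof_flash = Pool("NVMe-oF flash", capacity=65536)  # 64 TB of pooled flash (illustrative)

vm = compose(cores=16, mem_gb=256, stor_gb=2048,
             mem_pool=cxl_memory, stor_pool=nvmeof_flash)
print(vm, f"-- memory pool now {cxl_memory.allocated}/{cxl_memory.capacity} GB allocated")
```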

CXL provides a switched network for different types of memory, and in the last few years there was a push to use Optane memory alongside DRAM in a shared memory environment supporting memory types with different cost and performance. The CXL 3.0 specification, which allows the creation of memory pools that can be shared between CPUs, was released in 2022.

Intel introduced 3D XPoint technology with then-partner Micron in 2015, began shipping NVMe Optane products (Optane is the trade name for 3D XPoint) in 2017, and memory-bus (DDR) products in 2018. In July 2022, Intel announced that it would phase out its Optane products. Current-generation Optane products are still available from Intel, but there is no Optane CXL product. Instead, memory and SSD makers are looking for ways to release CXL-based products using DRAM and NAND flash.

Some of the major NAND flash companies introduced NAND-based CXL devices in 2022. Samsung has launched a memory-semantic CXL SSD for AI/ML applications. The device contains an internal DRAM cache in front of a larger amount of NAND flash memory: small IOs are handled in DRAM, while regular IOs use the NAND flash. Samsung claims a 20X improvement in random read performance compared to regular PCIe 4.0 SSDs. SK hynix (like other companies) showed off its CXL memory expander and an elastic CXL memory FPGA prototype at FMS 2022, and also said that DDR5-based CXL memory samples are available.
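
A toy model of the read path Samsung describes can help make this concrete: small IOs are served from an internal DRAM cache while bulk IOs go straight to NAND. The thresholds and cache size below are invented for illustration, not Samsung's actual design parameters:

```python
# Toy model of a "memory semantic" SSD read path: small, cache-friendly accesses are
# served from an internal DRAM cache; misses and bulk IO go to NAND. The threshold and
# cache size are invented for illustration, not Samsung's actual figures.
from collections import OrderedDict

SMALL_IO_BYTES = 4096    # treat accesses at or below this size as "small IO"
DRAM_CACHE_LINES = 1024  # toy cache capacity (in cached blocks)

dram_cache = OrderedDict()  # block number -> cached bytes, kept in LRU order

def read(block: int, size: int) -> str:
    if size <= SMALL_IO_BYTES and block in dram_cache:
        dram_cache.move_to_end(block)          # LRU update
        return "served from DRAM cache (lowest latency)"
    data = b"\0" * size                        # stand-in for a NAND read
    if size <= SMALL_IO_BYTES:
        dram_cache[block] = data               # populate the cache for small IO
        if len(dram_cache) > DRAM_CACHE_LINES:
            dram_cache.popitem(last=False)     # evict the least recently used block
    return "served from NAND flash"

print(read(7, 4096))     # first small read: NAND, then cached
print(read(7, 4096))     # repeated small read: DRAM cache hit
print(read(8, 1 << 20))  # 1 MB bulk read: straight from NAND
```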

Marvell and other controller companies are supporting CXL (and NVMe) in their controllers as a means to achieve full data center composability that includes both memory pooling and storage pooling. The image below shows Marvell’s vision of how CXL could lead to top-of-rack (TOR) switches that support CXL and completely disaggregate compute, memory and storage.

The first systems using CXL for memory expansion of existing CPUs will be available starting in 2023, and memory pooling systems supporting CXL version 3.0 are expected to be available sometime in 2024.

While Optane memory is reaching its end of life, a variety of other non-volatile memory technologies are growing in embedded applications, initially replacing NOR flash and some SRAM. These include magnetic random access memory (MRAM) and various resistive RAM (RRAM) technologies. TSMC, Samsung and other foundries have produced a variety of embedded devices using these memories for wearable and automotive applications. In addition, the growing popularity of chiplet technology and the new Universal Chiplet Interconnect Express (UCIe) interface could fuel demand for discrete memory chiplets for both DRAM and emerging non-volatile memories.

As the chart below from the Coughlin Associates and Objective Analysis report Emerging Memories Enter the Next Phase shows, growth in embedded and discrete non-volatile memory technologies (especially MRAM) could drive capacity shipment growth and $44 billion in revenue by 2032.

Finally, let’s take a quick look at the future of DNA-based storage. To some extent this is related to solid-state memory, as several synthetic DNA storage startups are looking to use silicon-based devices as an important element of their approach. The image below, published by Karin Strauss of Microsoft Research, shows the basic steps in using synthetic DNA for storage.
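
The encode step can be illustrated with a naive mapping of two bits per DNA base. Real DNA storage systems use constrained codes (avoiding long homopolymer runs, balancing GC content) and add error correction, so treat this only as a sketch of the idea:

```python
# Naive illustration of the "encode" step: map every 2 bits to one DNA base.
# Real systems use constrained encodings plus error-correcting codes; this skips all of that.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: s for s, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"DNA")
print(strand)                  # 'CACACATGCAAC' -- 4 bases per byte with this simple mapping
assert decode(strand) == b"DNA"
```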

DNA storage is still at the laboratory and early storage-system prototype stage, but rumor has it that there will be demonstrations of DNA storage in 2023.

2022 ended with a decline in demand for all types of storage and memory technologies, but demand is expected to rebound in 2023 to meet growing storage needs and to benefit from newer technologies, including advances in NAND, DRAM, CXL and emerging memories. We also expect some significant advances and demonstrations of DNA storage in 2023.
