- Samsung has announced a new SSD technology aimed at delivering higher performance under AI and ML workloads.
- The new drives are called “memory-semantic SSDs,” and Samsung claims they perform up to 20x better in AI and ML-related workloads.
- Full technical specifications have not been announced yet, but the drives use the Compute Express Link (CXL) interconnect and include a built-in DRAM cache.
Samsung, one of the biggest memory manufacturers, revealed its new product at Flash Memory Summit 2022 in California. The product is developed and optimized for artificial intelligence and machine learning workloads; the company claims it can deliver up to 20x better performance than a traditional SSD.
Memory-semantic SSD
The new product is categorized as a “memory-semantic SSD”: it includes a built-in DRAM cache and connects over Compute Express Link technology. The design works best when reading and writing small chunks of data. The datasets used by AI and ML can be huge, but they are typically accessed in many small pieces, which is exactly the access pattern Samsung’s memory-semantic SSD targets.
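To see why a DRAM cache in front of flash helps with this kind of small-chunk access, here is a minimal sketch of the idea. The latency figures and hit ratios below are assumptions chosen for illustration, not Samsung’s numbers or measurements of the actual product.

```python
# Illustrative only: a toy model of how a DRAM cache in front of NAND flash
# could cut average latency for small random reads. The latency values and
# hit ratios are assumptions for demonstration, not vendor specifications.

DRAM_LATENCY_US = 0.1     # assumed DRAM cache access latency (microseconds)
FLASH_LATENCY_US = 80.0   # assumed NAND flash read latency (microseconds)

def average_read_latency(cache_hit_ratio: float) -> float:
    """Average latency when a fraction of small reads is served from DRAM."""
    return (cache_hit_ratio * DRAM_LATENCY_US
            + (1.0 - cache_hit_ratio) * FLASH_LATENCY_US)

if __name__ == "__main__":
    for hit_ratio in (0.0, 0.5, 0.9, 0.99):
        print(f"hit ratio {hit_ratio:4.2f} -> "
              f"avg latency {average_read_latency(hit_ratio):6.2f} us")
```

The more of those small reads the DRAM layer can absorb, the closer the drive’s effective latency gets to memory-like access times rather than flash access times.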
The technical specifications of the new SSD have not been announced yet, but Samsung’s claim is impressive: the “20x improvement” covers both random read speed and latency. Current-generation SSDs can reach read speeds of up to around 15 GB/s, but only under ideal conditions, which usually means large, sequential transfers. Performance drops sharply when data is read or written in small pieces.
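A rough back-of-the-envelope calculation shows why small requests make peak throughput so hard to reach. The 15 GB/s target and the block sizes below are picked purely for illustration, not taken from any spec sheet.

```python
# Back-of-the-envelope arithmetic (illustrative assumptions, not vendor data):
# how many I/O operations per second are needed to sustain a given throughput
# at different request sizes. Small requests demand far more IOPS, which is
# why peak GB/s figures are usually quoted for large transfers.

TARGET_THROUGHPUT_GBPS = 15                      # assumed peak read throughput
TARGET_BYTES_PER_SEC = TARGET_THROUGHPUT_GBPS * 10**9

for block_size_bytes in (4 * 1024, 64 * 1024, 1024 * 1024):
    iops_needed = TARGET_BYTES_PER_SEC / block_size_bytes
    print(f"{block_size_bytes // 1024:5d} KiB blocks -> "
          f"{iops_needed:,.0f} IOPS to sustain {TARGET_THROUGHPUT_GBPS} GB/s")
```

At 4 KiB requests the drive would need several million IOPS to hit the same number it reaches comfortably with 1 MiB transfers, which is where a memory-semantic design with a DRAM cache is supposed to help.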

Additionally, Samsung has announced that its first PCIe 5.0 SSD, the PM1743, and its first 24G SAS SSD, the PM1653, are now in mass production. The company is also set to begin mass production of UFS 4.0 mobile storage chips this month. Jin-Hyeok Choi, executive vice president of Memory Solution and Product Development at Samsung, said:
“The IT industry is facing a new set of challenges brought on by the explosive growth in big data, and this underscores the importance of a robust, cross-industry ecosystem. We are committed to developing transformative memory technologies that can bring far-reaching changes in how we move, store, process and manage data for future AI, ML and HPC applications, as we navigate these challenging tides together with industry partners.”