Wednesday, March 28, 2018
One bet that's seemingly working well for chip giant Intel (NASDAQ: INTC) is its heavy investment in non-volatile memory technologies such as NAND flash and 3D XPoint. Back in 2015, the company announced that it planned to convert an old logic factory -- Fab 68 -- into a factory that builds both NAND flash memory and a new type of non-volatile memory known as 3D XPoint.
Intel is already the leading vendor of processors and related components for data center and personal computer applications, so it's only natural that the company would push into non-volatile memory -- both as storage and, in the case of 3D XPoint, as a replacement for DRAM in certain data center workloads.
If Intel's execution remains solid, then it seems likely that memory products will represent a significant source of revenue and profit for the company in the years ahead.
In this column, I'd like to go over the next logical step for Intel's memory ambitions.
Building DRAM
Today, Intel is not in the business of building DRAM -- this is a market dominated by Samsung (NASDAQOTH: SSNLF), Micron (NASDAQ: MU), and SK Hynix. DRAM is a fundamental type of computer memory that's required by just about every non-trivial computing device in existence, ranging from entry-level smartphones to the highest-performance supercomputers.
Today, much of the DRAM that goes into personal computers, servers, and even mobile devices isn't integrated onto the same package as the main computing elements. In fact, virtually all personal computer and server DRAM is added to the system by way of stand-alone memory modules that plug into standard slots on the motherboard.
Going forward, though, I believe that memory and processing will become increasingly integrated. For example, Intel recently released a processor -- known as Kaby Lake-G -- that integrates a high-performance CPU, a graphics processor, and a type of memory known as HBM2 (second-generation High Bandwidth Memory) in a single package.
Moreover, there are credible rumors that Intel is planning to introduce variants of its upcoming Ice Lake Server processors with substantial amounts of on-package HBM2 memory.
On-package memory can be made faster and more power-efficient than memory located in discrete modules a significant distance away from the main processor, since the much shorter distances allow a wider memory interface to be driven at lower power. Speed and power efficiency are both of critical importance in virtually all computing segments.
Intel can, of course, source HBM2 memory from third parties, but this would mean lower profit margins for Intel (since it would have to pay a third-party manufacturer's margins) and an uncomfortable dependence on the ebbs and flows of the DRAM market: if the DRAM market booms, Intel's costs go up, though if the DRAM market slumps, Intel can acquire DRAM on the cheap.
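To put rough, purely illustrative numbers on the margin point: suppose a stack of HBM2 costs $50 to manufacture and a third-party memory maker sells it to Intel for $70. That $20 of gross profit lands on the supplier's income statement; if Intel could build an equivalent stack in-house for the same $50, the $20 would stay with Intel (or give it room to price the finished package more aggressively). These figures are hypothetical and don't reflect actual HBM2 pricing -- they're only meant to show where a third-party manufacturer's margin comes from.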
It makes sense, then, for Intel -- in anticipation of this coming marriage of processors and memory chips -- to start building its own DRAM in-house. Not only would Intel be able to offer integrated products more easily and cost-effectively, it could also leverage its dominant position in the PC and data center markets to bundle commodity DRAM modules with its processors, solid-state drives, Ethernet chips, and so on.
In short, if Intel can control its own DRAM supply, it can offer potential customers just about every critical component for their platforms, making it a one-stop shop. The profit potential here is quite large, though it would obviously require significant investment, either in organic DRAM technology development or in the acquisition of an existing memory maker (either Micron or SK Hynix would be a suitable acquisition target).
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved