Wednesday, June 9, 2004
Tools and methodologies need to be "fundamentally reconsidered" as the electronics world enters an era of unprecedented complexity, the chief technologist of Intel Corp. said Tuesday (June 8).
Pat Gelsinger, delivering the keynote at the 41st Design Automation Conference here, outlined the many challenges facing designers in the coming years, from gate and source-drain leakage problems to vexing variability issues that cannot be managed by contemporary tools.
"We believe our design methodologies and our design tools need to be fundamentally reconsidered," Gelsinger said.
Gelsinger struck notes of hopefulness amid his description of major potholes in the design landscape, saying one should "fundamentally believe in (Moore's) law." He described the Semiconductor Industry Association's road map as "promising."
"We believe this doubling remains possible and we're confident it's something we can deliver against," he said. The 486 microprocessor, which Gelsinger oversaw, contained 1.12 million transistors, and the next Itanium will be north of 1 billion, he noted.
However, the issues of power dissipation and process and on-chip variations are thorny ones that require new ways of thinking, he said. While gate oxide thinning should be addressed by high-k dielectrics, source-drain leakage is increasing exponentially, Gelsinger said. To deal with it, Intel is looking at tri-gate transistor structures.
Variation, however, represents "a new class of challenges on the horizon that changes everything about our industry," he noted.
With static variation, for instance, "I have a distribution of devices, some leaky, some less, some faster than others," Gelsinger said. With dynamic variation, designers run into local hot spots and variations across the chip. And as process rules shrink, fewer dopant atoms end up in the channel, and their distribution becomes non-uniform, yet another source of variation.
"Everything we do in our designs will have a probabilistic element to it," he said.
Using the Pentium 4 Northwood processor as an example, Gelsinger said Intel has good control of frequency variation, around 30 percent at 130-nm design rules. Leakage power, however, varies by a factor of five to 10 across parts.
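The shift Gelsinger describes, from fixed design parameters to distributions, can be illustrated with a small Monte Carlo sketch. The numbers below are assumptions chosen only to echo the figures quoted in the article (roughly 30 percent frequency spread, leakage varying by five to 10 times); they are not Intel data, and the lognormal leakage model is a common illustrative choice, not Intel's method.

```python
import random

random.seed(7)  # reproducible illustration

def sample_die():
    """Sample one hypothetical die from process variation.

    Frequency: normal distribution with ~5% sigma, giving a total
    spread on the order of 30% across the population (assumption).
    Leakage: lognormal, whose heavy tail naturally produces a wide
    spread between the best and worst dies (assumption).
    """
    freq_ghz = random.gauss(3.0, 0.15)          # nominal 3.0 GHz
    leakage_w = 10.0 * random.lognormvariate(0.0, 0.45)
    return freq_ghz, leakage_w

dies = [sample_die() for _ in range(10_000)]

# Frequency spread: max-min as a fraction of the mean
freqs = sorted(f for f, _ in dies)
mean_f = sum(freqs) / len(freqs)
freq_spread = (freqs[-1] - freqs[0]) / mean_f

# Leakage spread: ratio of 99th- to 1st-percentile leakage
leaks = sorted(l for _, l in dies)
leak_ratio = leaks[int(0.99 * len(leaks))] / leaks[int(0.01 * len(leaks))]

print(f"frequency spread: {freq_spread:.0%}")
print(f"leakage 99th/1st percentile ratio: {leak_ratio:.1f}x")
```

Even with tight frequency control, the leakage ratio comes out several times larger than the frequency spread, which is the asymmetry Gelsinger highlights: tools that optimize only a nominal corner miss where most of the yield risk actually sits.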
"Our tools can't just optimize for speed or logic functionality. They need to consider yield and bin splits, parameters and variations, leakage power, and the distribution of that across the die," Gelsinger said. "We need design tools and environments that integrate all these variables together."
By: DocMemory Copyright © 2023 CST, Inc. All Rights Reserved