Computerworld

DRAM's inventor, 76, still going strong at IBM

51-year Big Blue veteran Robert Dennard to receive IEEE medal next week
  • Eric Lai (Computerworld)
  • 19 June, 2009 10:36

Dennard's Law? It doesn't have quite the ring of Moore's Law, mostly because IBM researcher Robert H. Dennard remains unknown to the general public.

The research community, however, knows all about two significant contributions made by the 76-year-old scientist.

In the late 1960s, Dennard invented Dynamic Random Access Memory, or DRAM, the memory used in virtually all computers today.

Dennard followed in the mid-1970s with a groundbreaking paper describing how to keep shrinking transistors to build smaller, faster and less expensive chips.

Dennard's "scaling theory" is often lumped in with Moore's Law, though, as Dennard modestly puts it, "scaling and Moore's Law go very well together."

For those achievements, Dennard, who celebrated his 51st year as an IBM employee this week, will receive the Medal of Honor from the Institute of Electrical and Electronics Engineers (IEEE) next Thursday.

Fittingly, Dennard will get his medal from the IEEE one year after Intel's Gordon Moore did.

Without the invention of DRAM, computer memory might be the technology laggard that hard-disk drives and laptop batteries remain today.

As Dennard recalls it, the dominant memory used in IBM's mainframe computers of the late 1960s was magnetic core memory. Co-invented by An Wang (later founder of Wang Laboratories), magnetic core memory used small loops of wire to store bits of data.

Magnetic-core memory "was delicate like jewelry," Dennard said. "They were these teeny little things, almost like Cheerios, but made out of ferrite [iron-based] material."

Magnetic-core memory was not only fragile, but also expensive and slow. It had one great advantage, though: it was non-volatile, meaning that no electric current was needed to maintain the data.

DRAM, by contrast, seemed tricky and complicated. All of the prototypes other researchers had built up to that time used multiple transistors per memory cell, which made designs more complicated and expensive.

To solve this issue, some researchers were testing the use of bipolar junction transistors. But Dennard preferred metal-oxide-semiconductor field-effect transistors, or MOSFETs, even though, he admits, "MOS was definitely less advanced and more problematical. There were some basic problems to be solved to make it manufacturable. But I still considered it more promising."

Dennard was eventually able to create a memory cell that could store a charge (representing a bit of data), periodically refreshed before it leaked away, all in a simple single-transistor package. Dennard patented his invention in 1968. But multi-transistor memory designs continued to reign, both at IBM, then a major memory maker, and at other companies.
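The core idea — a single transistor gating a stored charge that leaks over time and must be refreshed to preserve the bit — can be illustrated with a toy simulation. This is a sketch for intuition only; the leak rate and sense threshold are invented numbers, not anything from Dennard's actual design:

```python
# Toy model of a one-transistor DRAM cell: the stored charge leaks
# each tick, so the bit survives only if it is refreshed in time.
# LEAK_PER_TICK and THRESHOLD are arbitrary illustrative values.

class DramCell:
    LEAK_PER_TICK = 0.05   # fraction of charge lost per tick (made up)
    THRESHOLD = 0.5        # sense amplifier decides 1 vs 0 here

    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def tick(self):
        # Charge on the storage capacitor decays over time.
        self.charge *= (1.0 - self.LEAK_PER_TICK)

    def read(self):
        bit = 1 if self.charge >= self.THRESHOLD else 0
        self.write(bit)    # reading restores the sensed value at full charge
        return bit

cell = DramCell()
cell.write(1)
for t in range(30):
    cell.tick()
    if t % 10 == 9:        # refresh every 10 ticks: read and rewrite
        cell.read()
print(cell.read())         # prints 1 -- the bit survived thanks to refresh
```

Without the periodic refresh, the charge would decay below the threshold after roughly 14 ticks in this model and the bit would read back as 0 — which is exactly why "dynamic" RAM needs refresh circuitry that static RAM does not.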

It wasn't until the mid-1970s that the first single-transistor DRAM appeared. And the market never looked back.

Today, gamers and PC performance addicts can buy gigabytes of DDR3-1600 DRAM with a peak transfer rate of 12,800 megabytes per second (12.8 gigabytes per second) for less than $100.
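That headline figure follows from simple arithmetic, assuming the standard 64-bit (8-byte) DDR3 memory bus: 1,600 million transfers per second times 8 bytes per transfer.

```python
# Peak bandwidth of DDR3-1600 on a standard 64-bit memory bus.
transfers_per_sec = 1600 * 10**6   # 1600 MT/s (the "1600" in DDR3-1600)
bus_width_bytes = 64 // 8          # 64-bit bus = 8 bytes per transfer

peak_bytes_per_sec = transfers_per_sec * bus_width_bytes
print(peak_bytes_per_sec // 10**6)  # prints 12800 (megabytes per second)
```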

At 76, Dennard doesn't play many shoot-'em-up video games or overclock PCs. So he doesn't fully reap the fruits of his innovations.

"I have a seven-year-old PC in my office. Truthfully, I'm not a big computer user," he said, adding that if IBM still issued patent notebooks to its researchers for recording their ideas, he would.

"This was what I was taught at Carnegie Mellon. It was a very efficient way to work," he said.

Interest in DRAM has cooled, giving way to alternatives such as flash memory, used in solid-state disks (SSDs) and touted as an ultra-fast, albeit still thorny, replacement for hard-disk drives.

"Flash? Well, I've got a lot of it in my digital camera," Dennard joked. More seriously, Dennard concedes that "a lot of people are hopeful that flash memory can play more of a role in basic computing as well."

What about other non-silicon forms of memory, such as holographic storage?

"Optical computing has been a Holy Grail for a long time, but it's never broken through," he said. "I'm not sure what people are most hopeful about today."

His prediction: "Miniaturized CMOS technology will keep reaching a high level of performance," Dennard said. "People are still working on improving it. It's what's being manufactured today, so it will be very hard to replace."