Micron challenges conventional computer architecture with new chip

Micron's Automata processor uses modified memory cells that can be reprogrammed to solve specific problems

With Automata, a highly parallel processor that can change its behavior to suit the task at hand, Micron is challenging computer architectures conceived decades ago.

The Automata processor, which was announced this week, has thousands of modified memory cells that can be turned into processing units, said Paul Dlugosch, director of Micron's Automata processor technology development. The memory cells are nonvolatile, and can be erased and reprogrammed to solve a certain problem, Dlugosch said.

"This is indeed a new architecture, it's based on memory," Dlugosch said, adding that the processor has been under development for seven years.

The customized columns of memory in Automata can gang up to process tasks more quickly than conventional computers can, Dlugosch said. There are no fixed data sizes, and with a compiler, instructions can be created on the fly to target specific problems. Data is spread across memory units for parallel processing, and unlike in conventional computers, there is no need to wait for data to be shifted out of memory.
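
To picture that difference, consider the kind of pattern search Automata is aimed at. The short Python sketch below is purely illustrative and is not Micron's toolchain; it mimics the broadcast idea by presenting each incoming byte to every search pattern's state in the same step, so adding more patterns does not add more passes over the data.

    # Illustrative only: a software mock-up of broadcasting each input byte to
    # many pattern-matching "columns" at once. Patterns and data are made up.
    PATTERNS = [b"GATTACA", b"TATA", b"CCGG"]

    def scan(stream, patterns=PATTERNS):
        progress = [0] * len(patterns)   # one state per pattern "column"
        hits = []
        for pos in range(len(stream)):
            byte = stream[pos]
            # The same byte reaches every column in the same step.
            for i, pat in enumerate(patterns):
                if byte == pat[progress[i]]:
                    progress[i] += 1
                    if progress[i] == len(pat):
                        hits.append((pat, pos - len(pat) + 1))
                        progress[i] = 0
                else:
                    # Simplified restart; a real automaton handles overlaps properly.
                    progress[i] = 1 if byte == pat[0] else 0
        return hits

    print(scan(b"AGGCCGGTATAGATTACA"))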

Dlugosch said Automata challenges conventional computer architectures at work since the 1940s, in which data is pushed into a processor, calculated on and pushed back into memory with the help of instructions and logic units. One such architecture was devised in the 1940s by mathematician John von Neumann. But chip-level limitations and programming languages hamper the ability of current CPUs and GPUs to parallelize tasks, Dlugosch said.

Automata combines logic and DDR memory interfaces, but it won't replace conventional CPUs, Dlugosch said. Automata needs a CPU, FPGA (field-programmable gate array), network processor or other host computing unit to feed it high-level instructions.

"We make no claims that the Automata process will run on its own," Dlugosch said. "The Automata processor must be programmed."

For now, Automata can be used as a coprocessor for applications in areas such as bioinformatics, security and video processing.

"We'll see the Automata processor grow in popularity and grow as the dominant analysis engine for unstructured data," Dlugosch said.

The Automata DRAM DIMM must be thought of as a black box, said Jim Handy, an analyst at Objective Analysis. A host processor loads data from another memory, a hard drive or some other source, writes code into another part of that DIMM's DRAM, and then tells Automata to get to work.

"The host then goes off and does something else until the Automata signals completion, whereupon the host reads the results," Handy said.

Automata could be an attempt to get the memory bus out of the way and put the processor in the same package of memory cells, said Nathan Brookwood, analyst at Insight 64.

"They're basically arguing that in order to get better performance, you have to put processing close to memory," Brookwood said.

The concept of Automata has been around for decades and a handful of startups have pursued tight integration of memory and processing elements, analysts said. Earlier constraints revolved around programming models or memory implementation.

"The basic notion has been around for decades, but the DRAM companies have always seen themselves in a silo that doesn't include processors and the processor guys have always looked at DRAM as a nasty business, so neither has ever tried to invade the other's turf," Handy said.

But with recent technology advances, Micron has a chance to succeed with Automata, though it could be years until tangible results surface, analysts said.

"Maybe this time something will actually work," Brookwood said.

A lot of Automata's effectiveness lies in the compiler provided with Micron's software development kit. The compiler configures Automata's architecture and memory units, after which raw data is streamed through the processor. Once the data is loaded, Automata identifies patterns in it and defines the behavior of the processing units that crunch the data.

Dlugosch called Automata a "zero instruction set" processor with the ability to create its own instructions to focus on the targeted problem. The chip has interface logic to buffer input streams and uses high-level instructions from a host processor to control the device at a system level. Automata doesn't receive instructions that represent a program or algorithm.
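
A rough way to see what "zero instruction set" means: the compiled program is not a stream of instructions but a state-transition structure that could sit directly in memory cells. The toy compiler below is an invented illustration of that idea, not Micron's SDK or its pattern syntax.

    # Invented illustration: "compiling" a pattern produces a transition table,
    # not instructions; the raw data is then streamed through the loaded table.
    def compile_pattern(pattern):
        table = {}
        for state, symbol in enumerate(pattern):
            table[(state, symbol)] = state + 1      # advance on the expected byte
        return table, len(pattern)                  # accepting state

    def stream(program, data):
        table, accept = program
        state, matches = 0, []
        for pos in range(len(data)):
            byte = data[pos]
            # Simplified fallback on mismatch; real automata handle overlaps properly.
            state = table.get((state, byte), 1 if (0, byte) in table else 0)
            if state == accept:
                matches.append(pos - accept + 1)
                state = 0
        return matches

    program = compile_pattern(b"CCGG")              # loaded once, like writing memory cells
    print(stream(program, b"AACCGGTTCCGG"))         # data streamed through afterward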

Automata may be years away from practical use, and Dlugosch said it won't replace DRAM. Beyond DDR memory, Handy said the Automata processor architecture is flexible enough to be built around SRAM, flash or emerging memory types such as MRAM (magnetoresistive RAM), PCM (phase-change memory) or RRAM (resistive RAM).

Automata won't replace FPGAs either, Dlugosch said. FPGAs, which are widely used for hardware prototyping, are functionally similar to Automata, offering fast throughput and the ability to be reprogrammed. By comparison, Automata is built on a memory architecture and is aimed more at data analytics than at hardware or code testing, Dlugosch said.

Micron has partnered with the University of Missouri and the University of Virginia to research, test and write applications for the Automata processor. The company did not say when the chip would officially ship, but the Automata software development kit will be available next year.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com
