From the labs: IT's future today

The future of IT is taking shape in the form of next-generation computing technologies under development in today's R&D labs

For all its promise of revolution, the computing industry often lags behind expectations. After all, your netbook is really just a laptop, only smaller and cheaper. The chip that powers your PC today has a direct lineage back to the Pentiums of yesterday. Your latest hard drive might hold 2TB, but it's still just a hard drive. Where's the real innovation?

In the labs, of course.

Researchers in labs at major IT vendors and universities continue to point the way forward. Products and ideas dreamed up in these labs have the potential to shake up the IT industry. From the network to storage systems to the securing of sensitive data to the way in which end-users will one day interact with computing interfaces -- every facet of the enterprise is being pushed in revolutionary directions.

Here are just a few of the ideas evolving in today's labs -- future technologies that could be arriving sooner than you think.

Processors: Breaking Moore's Law

The story of computing since the mid-20th century has largely been the race against Moore's Law. Named for Intel co-founder Gordon E. Moore, the conjecture posits that the number of transistors that can be packed onto an integrated circuit will double roughly every two years -- popularly quoted as a doubling in performance every 18 months. In practice, Moore's prediction has held mostly true. Even as chipmakers have drawn close to the practical limits of modern processor design, the advent of multicore CPUs has allowed them to cram ever more power into increasingly compact chip packages.
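
To put that exponential curve in concrete terms, here is a back-of-the-envelope sketch of what a two-year doubling period implies. The starting count of one billion transistors is an assumed round number for illustration, not a figure from any particular chip.

```python
# Back-of-the-envelope Moore's Law projection. The starting count of one billion
# transistors and the two-year doubling period are assumptions for illustration.

def projected_transistors(start_count: float, years: float, doubling_years: float = 2.0) -> float:
    """Project a transistor count forward, doubling every `doubling_years`."""
    return start_count * 2 ** (years / doubling_years)

today = 1e9  # assume a chip with roughly one billion transistors today
for years in (2, 6, 10):
    print(f"In {years:2d} years: ~{projected_transistors(today, years):.1e} transistors")
```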

But some scientists see Moore's Law nearing the end of its useful life, and not because processors will stop getting faster. Rather, they believe computing will soon undergo a quantum leap. Quantum computers, devices that derive their processing power from the curious properties of subatomic particles, are widely seen as the next major evolution in computing. But while scientists struggle to build practical quantum computers -- even the most successful experiments to date have solved only simple calculations -- a number of other processor technologies in the works could prove equally revolutionary.

At the University of Illinois, Professor Rakesh Kumar has proposed a novel way to improve the speed of current CPUs: Cut the brakes. Today's chips, he says, spend too much time trying to get every calculation exactly right, while real-life integrated circuits make mistakes all the time. Ensuring flawless performance forces chips to consume more power than necessary, making it difficult to shrink the size of the components on the chip. Not every crash is a catastrophe, says Kumar, whose team is working on fault-tolerant rather than error-free processor designs. Combined with software that can cope with CPU errors when they occur, these so-called stochastic chips could run faster at lower power levels without worrying about flying out of control.
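
Kumar's actual designs aren't public code, but the division of labor he describes -- error-prone hardware, error-aware software -- can be sketched in a few lines. In the toy below, a deliberately flaky adder stands in for a stochastic chip, and a simple majority vote in software masks its occasional mistakes; the 1 percent error rate and the voting scheme are assumptions chosen purely for illustration.

```python
import random
from collections import Counter

# Illustrative sketch only: a deliberately unreliable adder stands in for a
# "stochastic" processor, and software-level voting tolerates its occasional faults.
# The 1% error rate and triple-run majority vote are assumptions for demonstration.

ERROR_RATE = 0.01

def flaky_add(a: int, b: int) -> int:
    """Addition that occasionally returns a bit-flipped result, like an error-prone ALU."""
    result = a + b
    if random.random() < ERROR_RATE:
        result ^= 1 << random.randrange(16)  # flip a random low-order bit
    return result

def voted_add(a: int, b: int, runs: int = 3) -> int:
    """Run the unreliable operation several times and trust the majority answer."""
    votes = Counter(flaky_add(a, b) for _ in range(runs))
    return votes.most_common(1)[0][0]

errors = sum(voted_add(2, 3) != 5 for _ in range(100_000))
print(f"Residual error rate with software voting: {errors / 100_000:.5f}")
```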

Meanwhile, teams from Michigan Technological University and Japan's National Institute for Materials Science are trying a far more radical approach. They've forgone silicon altogether, instead using DDQ -- a compound composed of carbon, nitrogen, oxygen, and chlorine -- to build a molecular computer that they claim already mimics the parallel processing structure of the human brain. A truly brainlike computer would outperform current CPUs by orders of magnitude. And because their design uses organic molecules, the researchers believe such computers could be grown from algae, rather than having to be constructed using industrial chemical processes.

Networking: The end of traffic jams

The phrase "the network is the computer," first coined by Sun Microsystems researcher John Gage in 1984, has never been more apt than it is today. From the most powerful server to the smallest mobile device, the one requirement for a modern computer system is that it provides instant, fast, reliable network access. But as demand for rich multimedia content increases, meeting that requirement remains a challenge. Fortunately, new networking technologies are emerging to lend fresh meaning to the phrase "high-speed broadband" -- and they could arrive sooner than you expect.

The 802.11n standard, which offers wireless networking at speeds up to 600Mbps, has been a long time in the making. Customers are only now beginning to upgrade from the slower 802.11b and 802.11g standards. But that doesn't mean work on Wi-Fi has stood still. On the contrary -- the Wi-Fi Alliance and the Wireless Gigabit Alliance have joined forces to develop Wi-Fi's next generation, which promises to be just as much a leap forward as 802.11n. The new standard will broadcast on the 60GHz radio spectrum and will be able to achieve data transfer rates up to 7Gbps -- fast enough to stream Blu-ray-quality video.
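
A little arithmetic shows what those peak rates mean in practice. The sketch below assumes a 25GB file -- roughly a single-layer Blu-ray disc -- and ignores protocol overhead, so real-world times would be longer.

```python
# Transfer-time arithmetic for the theoretical peak rates mentioned above.
# The 25 GB file size (about one single-layer Blu-ray disc) is an assumption.

LINKS_MBPS = {
    "802.11g (54 Mbps)": 54,
    "802.11n (600 Mbps)": 600,
    "60GHz Wi-Fi (7 Gbps)": 7000,
}

FILE_GIGABYTES = 25
file_megabits = FILE_GIGABYTES * 8 * 1000  # decimal units, to match "Mbps"

for name, mbps in LINKS_MBPS.items():
    minutes = file_megabits / mbps / 60
    print(f"{name:22s} -> {minutes:6.1f} minutes to move {FILE_GIGABYTES} GB")
```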

Wi-Fi speeds affect only the local area, however. The speed at which you can access the Web still depends on your connection to your ISP. So far, the fastest connections have been available only to those customers who have access to direct fiber-optic links. But ISPs should soon be able to provide low-cost access at near fiber-optic speeds to a much broader audience, thanks to technology under development at Alcatel-Lucent. Using a combination of signal-processing tricks, the technology promises speeds up to 300Mbps over ordinary copper wire, at a distance of up to 400 meters from a communications hub.

Storage: More, more, more

Today's data centers are like baby birds: always hungry. More storage capacity, greater density, lower power consumption, and faster access times -- the demands are relentless. Fortunately, storage has been one of the fastest-moving areas of computing technology in recent years, and that trend isn't slowing. Hard drive manufacturers are increasing capacity at an astonishing rate, even as chipmakers blaze new trails with high-speed solid-state devices. But the most exciting data storage technologies are still to come, and they bear little resemblance to today's devices.

At IBM's Almaden Research Center, scientists are working on a new form of solid-state storage called "racetrack memory." Using nanoscale wires to store information based on the direction of spin of individual electrons, racetrack memory stores data at greater density than traditional flash and provides access to that data at speeds comparable to those of traditional RAM. As with other solid-state storage media, data is retained even when the power is off. Unlike today's flash-based storage, however, there is no performance penalty to write to a racetrack memory device, and the memory never wears out.

Meanwhile, HP engineers are hoping to mine new value out of an old idea. The concept of a "memristor" -- the name is a portmanteau of "memory" and "resistor" -- has been around since 1971, when it was described in a paper by UC Berkeley professor Leon Chua. But it wasn't until 2008 that HP announced it had successfully produced a working memristor; now HP claims the technology has far more potential than Chua conceived. Because memristors have some properties of conventional transistors, they open the door to storage that can perform its own calculations, in addition to retaining data. What's more, memristors offer roughly double the storage density of flash devices and are much more resistant to radiation. HP hopes to commercialize the technology within the next few years.
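
The article doesn't spell out how a memristor behaves, but the linear ion-drift model HP published in 2008 captures the essential idea: the device's resistance depends on how much charge has flowed through it, and that state persists. The simulation below is an illustrative sketch of that model with arbitrary parameter values, not HP's actual device figures.

```python
import math

# Illustrative simulation of the linear ion-drift memristor model (HP, 2008).
# All parameter values are arbitrary choices for demonstration.

R_ON, R_OFF = 100.0, 16_000.0   # fully doped / undoped resistances (ohms)
D = 10e-9                        # device thickness (m)
MU_V = 1e-14                     # dopant mobility (m^2 / (V*s))
DT = 1e-6                        # integration time step (s)

x = 0.1                          # state variable: doped fraction of the device, 0..1
for step in range(200_000):      # simulate 0.2 s of a 5 Hz, 1 V sinusoidal drive
    t = step * DT
    v = math.sin(2 * math.pi * 5 * t)
    m = R_ON * x + R_OFF * (1 - x)                  # instantaneous memristance
    i = v / m
    x += (MU_V * R_ON / D**2) * i * DT              # linear drift of the doped region
    x = min(max(x, 0.0), 1.0)                       # state stays within the device

# The resistance now reflects the charge that has flowed through the device --
# the "memory" in memristor -- and persists when the drive is removed.
print(f"Final memristance: {R_ON * x + R_OFF * (1 - x):.0f} ohms")
```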

Security: Designing the unpickable lock

As businesses and consumers accumulate ever more digital data, protecting that data has become a paramount concern. Cryptography remains one of the key tools for securing data. But as computer processing power continues to increase, the arms race between encryption methods and the tools used to break them is heating up. That's why mathematicians, engineers, and computer scientists are hard at work on new methods of data encryption that lack the old vulnerabilities.

The most discussed of these is quantum cryptography. Seen as the first "killer app" for quantum computing, quantum cryptography takes advantage of the fact that merely observing a system of quantum particles disturbs it. Anyone who intercepts a message secured with quantum cryptography -- even just to read it -- leaves an indelible fingerprint on the message itself. Thus, it is theoretically impossible to eavesdrop on a channel secured via quantum cryptography without alerting the participants to your presence. As mentioned earlier, however, quantum computing remains largely experimental, and commercial applications of quantum cryptography are a long way off.
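
The article names no specific protocol, but BB84 -- the canonical quantum key distribution scheme -- shows how that detection works. The classical toy simulation below assumes an eavesdropper who measures and re-sends every photon; where Alice and Bob compare bases, her interference shows up as a roughly 25 percent error rate that would otherwise be near zero.

```python
import random

# Classical toy simulation of BB84-style quantum key distribution (illustration only).
# An eavesdropper who measures photons in the wrong basis disturbs them, which Alice
# and Bob detect as errors when they compare a sample of their shared key.

def random_bits(n):
    return [random.randint(0, 1) for _ in range(n)]

def measure(bit, prep_basis, meas_basis):
    """Measuring in the preparation basis returns the bit; otherwise the outcome is random."""
    return bit if prep_basis == meas_basis else random.randint(0, 1)

def bb84(n_photons=10_000, eavesdrop=False):
    alice_bits, alice_bases = random_bits(n_photons), random_bits(n_photons)
    bob_bases = random_bits(n_photons)

    channel_bits, channel_bases = alice_bits, alice_bases
    if eavesdrop:
        eve_bases = random_bits(n_photons)
        # Eve measures each photon (disturbing it) and re-sends what she saw.
        channel_bits = [measure(b, pb, eb)
                        for b, pb, eb in zip(alice_bits, alice_bases, eve_bases)]
        channel_bases = eve_bases

    bob_bits = [measure(b, pb, mb)
                for b, pb, mb in zip(channel_bits, channel_bases, bob_bases)]

    # Keep only positions where Alice and Bob used the same basis, then check
    # how often their bits disagree (the quantum bit error rate).
    kept = [(a, b) for a, b, ab, mb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == mb]
    errors = sum(a != b for a, b in kept)
    return errors / len(kept)

print(f"Error rate, no eavesdropper:   {bb84(eavesdrop=False):.3f}")  # ~0.00
print(f"Error rate, with eavesdropper: {bb84(eavesdrop=True):.3f}")   # ~0.25
```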

A recent discovery by IBM researcher Craig Gentry could be the next best thing -- and it works with today's computer systems. Gentry has devised an algorithm that performs a feat top cryptographers long thought impossible: It allows a computer system to perform operations on encrypted data without decrypting it first. The data arrives encrypted and the result of the calculations goes out encrypted the same way; there's no intermediate step where the data is exposed to prying eyes. Gentry's discovery could have significant implications for a wide variety of digital security systems, from online banking to digital media delivery, even if practical applications are still several years away.
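
Gentry's fully homomorphic scheme is far too elaborate to reproduce here, but the basic idea of computing on ciphertexts can be illustrated with textbook (unpadded) RSA, which has long been known to be partially homomorphic: multiplying two ciphertexts multiplies the hidden plaintexts. The toy below uses absurdly small keys and is insecure -- it's a sketch of the concept, not of Gentry's algorithm.

```python
# Toy demonstration of computing on encrypted data using textbook RSA's
# multiplicative homomorphism. Tiny keys, no padding -- illustration only.

p, q = 61, 53                        # toy primes
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def encrypt(m): return pow(m, e, n)
def decrypt(c): return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# A server that only ever sees ciphertexts can still "multiply" the hidden values:
c_product = (ca * cb) % n

print(decrypt(c_product))            # 42 -- computed without ever decrypting a or b
```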

Displays: New ways of seeing

When IBM first popularized the PC, green-screen monitors were the norm. Later, as graphics cards improved, those monochrome screens gave way to color. Today, bulky color CRT monitors are things of the past, replaced by thin, power-efficient LCDs. Further advances, such as LED backlighting, are making modern flat panels brighter and crisper than ever. But this is hardly the end of the road for display technology.

For one thing, current-generation LCD panels tend to be fragile, as anyone who has dropped or sat on a mobile phone can tell you. To address that problem, Sony is developing flexible display technology that it hopes will yield not only more durable devices, but ones that are less costly and cleaner to manufacture. Current prototypes, while low-resolution, are so flexible that they can be rolled around a cylinder 4mm thick.

Another idea is to get rid of the monitor altogether. LCD projectors are commonplace enough, but they tend to be bulky and require expensive bulbs to operate. Also, their image quality varies greatly depending on ambient lighting conditions. That could change, however, with the introduction of digital projectors based on laser technology. Traditional monitors produce color using a combination of red, green, and blue light. Because reliable green laser light has proved difficult to produce, manufacturers have thus far been unable to use lasers in projection devices. Corning claims to have solved that problem, meaning it may soon be possible to project brilliant imagery from a device as small as a mobile phone.

Displays that perform equally well in daylight remain a challenge, but given the growing popularity of e-readers, it's one that scientists and engineers are eager to solve. A promising technology from Qualcomm called mirasol produces vibrant color imagery by reflecting ambient light from layers of tiny electromechanical mirrors; it should appear in consumer products in the coming year. A competing technology from Philips spin-off Liquavista is a little further off, but it can work with either reflected ambient light or a built-in backlight.

User interface: Beyond the mouse

The ways in which we use computers have changed radically since the dawn of the PC era, but the ways in which we interact with them are essentially the same as when Apple introduced the first Macintosh in 1984. The desktop metaphor, with its cursors, pointing devices, files, folders, windows, scrollbars, and other control widgets, still dominates virtually every graphical OS platform.

At least that's true in real life. Look to the fanciful worlds of Hollywood movies, however, and you'll see a dazzling array of radical new UI concepts, from the hovering screens of "Minority Report" to Iron Man's in-helmet displays. Can any of these ideas be made practical for commercial use?

Microsoft thinks so. Its Surface UI platform offers new ways of transforming familiar objects -- such as tables -- into collaborative computing spaces, complete with novel input methods that move beyond the limitations of the common mouse. But Microsoft's most anticipated input innovation is Kinect for Xbox 360, formerly known as Project Natal. Aimed at helping the Xbox compete with Nintendo's Wii console, Kinect is a gaming interface with no controllers at all: Players manipulate in-game objects simply by moving and gesturing in midair. If the concept catches on with consumers, it could find applications in kiosks and touchscreen devices as well.

Perhaps the best example of Hollywood UI technology made real, however, is the g-speak spatial operating environment from Oblong Industries. With roots at MIT's Media Lab, g-speak enables spatial and gestural control of graphical objects for a wide range of potential applications. With 3-D televisions and monitors expected to go mainstream in the near future, don't be surprised if g-speak or something like it comes soon to a PC near you.

Neil McAllister is a freelance writer based in San Francisco. He also writes InfoWorld's Fatal Exception blog.
