Stories by Gary H. Anthes

The economics of innovation

"The single biggest mistake IT managers make is listening to their customers," says Michael Schrage, a research associate at the MIT Media Lab.

Future watch: Computer to user: You sort it out

Researchers in the U.S. and the U.K. are developing computer systems that offer deliberately ambiguous interpretations of human environments. What's more, the systems are often flat-out wrong. But the developers are delighted with their progress so far, saying that with computers, sometimes less is more.

Supply chain blind spots

Companies are increasingly extending their operations overseas, looking for new markets, lower labor costs and better access to raw materials. Such expansion can bring advantages, but it can also introduce critical blind spots into supply chains as business and IT managers try to monitor activities thousands of miles away.

The future of e-mail

Your company scans incoming e-mail for viruses and outgoing messages for confidential information. Your spam filter snags most of the garbage, and it gets better as it learns the latest spamming and phishing spoofs. You're encrypting sensitive e-mail now, and you recently completed a project that keeps your messages safely archived in case federal regulators come knocking.

Subatomic IT

The work of Jim Allen, a physicist at the University of California, Santa Barbara, is so far removed from everyday experience that he has to explain it by analogy: a tabletop covered with refrigerator magnets. "They all interact with each other and do funny dances," he says.

IBM's Business Insights Workbench: smarter search

Eight years ago, there were plenty of tools to search and analyze structured data, and even a few to go after unstructured information such as free-form text. But the two kinds of tools were not integrated, according to Jeffrey Kreulen, senior manager of service-oriented technologies at IBM's Almaden Research Center in San Jose. And the most sophisticated analytic tools used esoteric mathematical techniques that pretty much kept them out of the hands of nontechnical users.

Software: where digital and physical worlds merge

Ike Nassi is something of a renaissance man in IT, having held senior technical positions at some of the industry's biggest companies as well as in academia. He's now the senior vice president for research at SAP Labs in Palo Alto, California. Companies on his CV include SAP, Cisco, Apple, Digital Equipment Corp. and several others. Nassi founded Firetide, co-founded Encore Computer and helped start the Computer History Museum in California.
He has held positions at Stanford University, MIT, Boston University and the University of California, Berkeley, and played key roles in the design of the Ada programming language and the Mach operating system. Here he tells Gary Anthes what's driving change in the software world.

The price point

Not long ago, a major U.S. company embarked on an ambitious Six Sigma program to improve efficiency. The program was a big success, saving the company US$250 million.

Hard cores

Re-engineering programs to work on multicore chips is already difficult but will get even harder as the number of cores per chip continues to multiply.

Premier 100: Veteran CIO talks about agile enterprise

Michael H. Hugos, a CIO at large, admitted Monday to IT executives at the Computerworld Premier 100 IT Leaders conference that the term "agile enterprise" has a certain faddish, buzzword quality. Then he went on to explain why corporations and their IT departments ignore its concepts at their peril.

High-speed databases rev corporate apps

Relational database management systems have become all but ubiquitous in enterprise computing since E.F. Codd first proposed the relational model in 1970. But as powerful and flexible as those databases are, they've proved inadequate for a handful of ultrademanding applications that have to process hundreds or thousands of transactions per second and never go down. Now, the very-high-performance database technologies that sprang up to serve these niche markets, such as options trading and telephone call processing, are poised to move into mainstream computing.

Supercomputer on a chip

Computer scientists at the University of Texas at Austin are inventing a radical microprocessor architecture, one that aims to solve some of the most vexing problems facing chip designers today. If successful, the Defense Department-funded effort could lead to processors of unprecedented performance and flexibility.

Love that 'legacy'

Like it or not, old code is still around, and it needs special care.

Internet pioneer looks ahead

Leonard Kleinrock is emeritus professor of computer science at the University of California, Los Angeles. He created the basic principles of packet switching, the foundation of the Internet, while a graduate student at MIT, where he earned a Ph.D. in 1963. The Los Angeles Times in 1999 called him one of the "50 people who most influenced business this century."
