How to tell if AI or machine learning is real

False and misleading claims abound that applications and cloud services are now smart. Here’s how to identify true artificial intelligence and machine learning

Suddenly, it seems, every application and cloud service has been fortified with machine learning or artificial intelligence. Presto! They now can do magic.

Much of the marketing around machine learning and AI is misleading, making promises that aren’t realistic—and often using the terms when they don’t apply. In other words, there’s a lot of BS being peddled. Don’t fall for those snow jobs.

Before I explain how you can tell whether the software or service really uses machine learning or AI, let me define what those terms really mean:

Artificial intelligence covers a wide range of cognitive technologies that enable ad hoc or situational reasoning, planning, learning, communication, perception, and the ability to manipulate objects for an intended purpose. These technologies in various combinations promise to create machines or software entities that have—or at least act as if they have—the natural intelligence that humans and other animal species possess. Just as natural life’s intelligence varies dramatically across and within species, so too could the intelligence of AIs.

AI has been a popular motif in science fiction for more than a century, and it’s a particularly strong notion among techies. IBM, MIT, the U.S. Defense Department, and Carnegie Mellon University, for example, have been doing AI work for decades, showcasing the same kinds of examples over and over again for just as long. The promises today are very much like the promises I saw at these institutions in the 1980s, but of course there’s been a lot of incremental improvement that has brought us a little closer to making those promises a reality. But we’re nowhere near the scenarios of sci-fi.

Machine learning is a subset of AI. It refers specifically to software designed to detect patterns and observe outcomes, then use that analysis to adjust its own behavior or guide people to better results. Machine learning doesn’t require the kind of perception and cognition that we associate with intelligence; it simply requires really good, really fast pattern matching and the ability to apply those patterns to its behavior and recommendations. Humans and other animals learn the same way: You see what works and do that more often, while avoiding what you observe doesn’t work so well. A machine without machine learning, by contrast, does only what it is told or programmed to do.
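To make that distinction concrete, here is a minimal, hypothetical Python sketch of “learning from outcomes”: the program keeps score of which of its allowed actions has worked best and shifts its behavior accordingly, with no code change. This is not any vendor’s actual implementation; the action names are invented for illustration.

# A minimal, hypothetical sketch of learning from outcomes: the program
# tracks which of its possible actions has worked best so far and favors it,
# adjusting its own behavior without a software update.
import random

class OutcomeLearner:
    def __init__(self, actions):
        # Success/attempt counts for each action the program is allowed to take.
        self.stats = {action: {"tries": 0, "wins": 0} for action in actions}

    def choose(self, explore_rate=0.1):
        # Mostly pick the action with the best observed success rate,
        # but occasionally try others to keep gathering data.
        if random.random() < explore_rate:
            return random.choice(list(self.stats))
        return max(self.stats, key=self._rate)

    def observe(self, action, succeeded):
        # Feed back the outcome so future choices shift toward what works.
        self.stats[action]["tries"] += 1
        if succeeded:
            self.stats[action]["wins"] += 1

    def _rate(self, action):
        s = self.stats[action]
        return s["wins"] / s["tries"] if s["tries"] else 0.0

# Usage: the learner does more of what works, but only among the actions
# it was given -- it cannot invent a new one.
learner = OutcomeLearner(["retry", "escalate", "ignore"])
action = learner.choose()
learner.observe(action, succeeded=True)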

Snow job 1: Confusing logic with learning

There’ve been a lot of advances in machine learning in recent years, so not all machine learning claims are snow jobs. The quick way to tell is to ask the vendor what the software or robot can learn and adjust on its own, without a software update. Plus, ask how you train it; training is how you help it learn your environment and desired outcomes.

But most of what marketers call machine learning is simply logic. Programmers have been using logic in software since Day 1 to tell programs and robots what to do. Sophisticated logic can provide multiple paths for the software or robot to take, based on parameters the logic is designed to process.

Today’s hardware can run very sophisticated logic, so applications and devices can appear to be intelligent and able to adjust on their own. But most don’t actually learn—if their developer didn’t anticipate a situation, they can’t adjust on their own to handle it through pattern-analysis-based trial and error as a true machine learning system can.

Even if true machine learning is in place, a machine learning system is bound by whatever parameters it has been programmed to “know.” Unlike a true AI, it can’t discover new facts outside its programmed world; it can only learn to understand and interact with that programmed world on its own.
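As a rough illustration of the difference, consider this hypothetical Python sketch: the first function is plain multi-branch logic whose thresholds never move, while the class below it “learns” by nudging those thresholds from user feedback. Both, however, are confined to the same few fan speeds their developer defined; neither of these is a real product’s code.

# Plain logic: looks adaptive, but every path and every threshold
# was fixed at development time, and nothing here adjusts itself.
def rule_based_fan_speed(temperature_c):
    if temperature_c >= 30:
        return "high"
    elif temperature_c >= 24:
        return "medium"
    elif temperature_c >= 18:
        return "low"
    return "off"

class LearnedFanSpeed:
    """A learning version tunes its thresholds from feedback (such as user
    overrides), but it is still bound to the fan speeds it was programmed
    to know about -- it cannot discover a new action on its own."""

    def __init__(self):
        self.thresholds = {"high": 30.0, "medium": 24.0, "low": 18.0}

    def choose(self, temperature_c):
        for speed in ("high", "medium", "low"):
            if temperature_c >= self.thresholds[speed]:
                return speed
        return "off"

    def observe_override(self, temperature_c, user_chose):
        # Nudge the relevant threshold toward what the user actually wanted.
        if user_chose in self.thresholds:
            current = self.thresholds[user_chose]
            self.thresholds[user_chose] = current + 0.2 * (temperature_c - current)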

Snow job 2: The use of IoT or cloud technology makes it smart

Marketers like to take hot technology terms and sprinkle them on whatever they already have. Many don’t really understand what the terms mean, or they don’t care. They only want your attention. You can identify a snow job quickly by looking at the buzzword-to-detail ratio: If all you see are buzzwords and the technology “how” details are lacking, you know it’s the same old technology with new marketing applied.

Today, the internet of things and cloud computing are hot, so they’re often at the heart of that new marketing. Still, both can play a role in machine learning or AI systems (really, AI precursor systems), so it’s not the use of the terms that’s a red flag, but their flippant use.

IoT relies on both local and networked sensors, plus a combination of local and server (cloud) logic: analytics to interpret what the sensors report and actuators to act on that analysis. Together, these allow devices to seem smart because they’re programmed to adjust automatically to various events they sense. For machine learning, they are great inputs for the learning part, as well as great outputs for the adjusted actions.
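Here is a hypothetical sketch of that sense-decide-act loop in Python. The sensor, actuator, and cloud ingest URL are stand-ins invented for illustration, not any real device’s API.

# A hypothetical sense-decide-act loop for an IoT device: a local sensor
# feeds local logic, an actuator responds, and readings are shipped to a
# cloud endpoint where they can become training data for a learning system.
import json
import time
import urllib.request

CLOUD_INGEST_URL = "https://example.com/ingest"  # hypothetical endpoint

def read_temperature_sensor():
    # Stand-in for a real sensor driver.
    return 26.5

def set_vent(open_vent):
    # Stand-in for a real actuator.
    print("vent", "opened" if open_vent else "closed")

def report_to_cloud(reading, action):
    # Log the reading and the action taken; this is the "input" a
    # machine learning system could later learn from.
    payload = json.dumps({"ts": time.time(), "temp_c": reading, "action": action}).encode()
    req = urllib.request.Request(CLOUD_INGEST_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=5) as resp:
        resp.read()

def loop_once():
    reading = read_temperature_sensor()
    open_vent = reading > 25  # local logic: open the vent above 25 C
    set_vent(open_vent)
    report_to_cloud(reading, "open" if open_vent else "close")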

Cloud computing opens up processing and data storage capabilities undreamt of in the past. Devices don’t have to carry all that overhead with them; instead, they can offload to the cloud all that work—and the hardware to support it. This is how Apple’s Siri, Microsoft’s Cortana, and Google Now work: They send your speech to the cloud, which translates it and figures out a response, then sends it back to your phone. That way, you don’t have to carry a mainframe or datacenter in your pocket or keep it on your desk.

Of course, you could do that before the cloud via client/server computing, but the cloud provides at least an order of magnitude more capability than your typical datacenter, so now you can do processing and storage at a scale that whole populations can take advantage of.
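In rough code, that offloading pattern looks something like the hypothetical Python sketch below. The endpoint and payload format are assumptions for illustration, not how Siri, Cortana, or Google Now actually expose their services.

# A hypothetical sketch of cloud offloading for a voice assistant: the device
# only records audio and plays back a reply; the heavy speech recognition and
# response generation run on servers in the cloud.
import urllib.request

SPEECH_SERVICE_URL = "https://example.com/assistant"  # hypothetical endpoint

def ask_assistant(audio_bytes: bytes) -> str:
    # Ship the raw audio to the cloud service and return its text reply.
    req = urllib.request.Request(
        SPEECH_SERVICE_URL,
        data=audio_bytes,
        headers={"Content-Type": "audio/wav"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8")

# Usage: the phone does almost nothing locally.
# reply = ask_assistant(open("question.wav", "rb").read())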

Snow job 3: Machine learning means it’s smart

It’s truly impressive what a service like Siri, Cortana, or Google Now can do. And what developers can do using tools like Microsoft’s Bot Framework with Cortana. But we all quickly see how they fall apart in areas outside their programming, resorting to a simple web search for what they weren’t programmed to learn. No doubt Apple, Microsoft, and Google are using machine learning on the back end to make them appear smarter.
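A rough, hypothetical sketch of that fallback behavior: the assistant matches your request against a fixed set of intents it was programmed to handle, and anything it doesn’t recognize simply becomes a web search. The intents, replies, and URL here are invented for illustration.

# Hypothetical intent routing with a web-search fallback, which is why
# assistants seem to fall apart outside their programming.
from urllib.parse import quote_plus

INTENTS = {
    "weather": lambda text: "Here is today's forecast...",
    "timer": lambda text: "Timer set.",
    "calendar": lambda text: "Your next meeting is at 10 a.m.",
}

def handle_request(text: str) -> str:
    for keyword, handler in INTENTS.items():
        if keyword in text.lower():
            return handler(text)
    # Outside the programmed world: punt to a plain web search.
    return "https://www.example.com/search?q=" + quote_plus(text)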

If someone claims an application, a service, or a machine is smart, you’re almost certainly getting snowed. Of course, people will use the word “smart” as a shortcut to mean “more capable logic,” a phrase that won’t sell anything. But if they don’t explain what “smart” means specific to their offering, you know they think you’re dumb.

The fact is that most technologies labeled “smart” are not smart, merely savvy. The difference is that smart requires intelligence and cognition, whereas savvy requires only information and the ability to take advantage of it (it’s no accident that “savvy” comes from the French word for “to know”). A savvy app or robot is a good thing, but it’s still not smart. We’re simply not there yet.

Even IBM’s vaunted Watson is not smart. It is savvy, it is very fast, and it can learn. But it’s been around in various forms at IBM since the 1980s, so if Watson were truly that smart, IBM would be ruling the business world by now. Watson won’t cure diseases, make peace in the Mideast, create new tax breaks, or solve world hunger. But it can help people better handle all sorts of actions, if the price is right.

If you keep that goal in mind and are truly getting machine learning and AI precursors in your business, you’ll be satisfied. But don’t expect a sci-fi fantasy version like Data from Star Trek, HAL from 2001: A Space Odyssey (inspired by IBM’s 1960s AI research!), or Philip K. Dick’s androids in Do Androids Dream of Electric Sheep? And don’t trust vendors that sell their technology under such guises.
