Op-Ed: IBM Cognitive Cooking — Watson turns chef?

IBM just proudly tweeted its new cookbook, along with an annoying video that runs at about a million times the speed it needs to for anyone to get drooling. That said, cookery is an art and a half, and for a computer program to learn it is no trivial aperitif.
The IBM Cognitive Cookbook is beautifully presented — too much so. The camera delivers a good meal with all the panache of a publisher whose Mylanta supplies are running low. The recipes themselves, though, definitely don’t come from the trash bin, whatever the OCD production values might suggest. This is good stuff, and it’s not something even a competent cook would sneer at.
In the Under the Hood tech section, IBM’s copy people have been having fun with a few puns which may one day be legalized. “A Recipe for Innovation” is almost forgivable, but hey, copy people gotta breathe, too.
In fairness, explaining cognitive computing in a single web page isn’t easy. I’ve done a few articles on the subject, and it’s hard to do justice to its potential. Cognitive computing is the next step in coding. It’s the answer to code writers going nuts on a regular basis. It’s a learning tool, a research tool, and a lot more besides.
Cognitive computing means that a computer learns, and manages code to do more with what it’s learned. It’s called “machine learning,” and it’s the first real version of the self-managing computers of science fiction. This is Gigantic Science, and there aren’t even many parameters for what it can do.
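To make "a computer learns from examples rather than being told the rule" concrete, here is a deliberately tiny Python sketch. The data and the flavour-compound idea are entirely hypothetical illustrations, not Watson's actual (proprietary) pipeline — the point is only that the prediction rule is inferred from labelled examples, not hard-coded:

```python
# Toy machine-learning sketch: the program is never told which compounds
# make a good pairing; it estimates that from labelled examples.
# All data below is made up for illustration.

# Each example: the set of flavour compounds shared by an ingredient pair,
# labelled True if (hypothetical) tasters liked the pairing.
examples = [
    ({"vanillin", "furaneol"}, True),
    ({"vanillin"}, True),
    ({"sulfur_volatile"}, False),
    (set(), False),
    ({"furaneol", "limonene"}, True),
    ({"sulfur_volatile", "limonene"}, False),
]

# "Learning": for each compound, count how often it appeared in liked pairings.
score = {}
for compounds, liked in examples:
    for c in compounds:
        good, total = score.get(c, (0, 0))
        score[c] = (good + (1 if liked else 0), total + 1)

def predict(compounds):
    """Predict whether a new, unseen pairing will be liked."""
    if not compounds:
        return False
    # Average the liked-fraction of each known compound in the pair.
    liked_fraction = sum(g / t for g, t in
                         (score[c] for c in compounds if c in score))
    return liked_fraction / len(compounds) > 0.5

print(predict({"vanillin", "limonene"}))   # inferred from data, not coded in
print(predict({"sulfur_volatile"}))
```

Swap in different examples and the predictions change with them — that, in miniature, is the "doing more with what it’s learned" part.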
Computers and food
The thing I notice about the recipe book is that the computer has selected real food, not the trivial, disgusting packaged slop usually sold to the public. I wonder what Watson would do with the ingredient lists on those packages, if programmed to recognize them. My guess is that it would compare them with its existing recipes and use the real food in preference to the less efficient additives.
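That guess can be sketched as a trivial matching rule. This is purely hypothetical — nothing to do with Watson's real internals — but it shows the shape of "compare a label against what you know, keep the real food, flag the additives":

```python
# Hypothetical "prefer real food over additives" rule (illustration only).
# Both lookup sets are stand-ins for whatever knowledge base a real
# system would carry.
REAL_FOODS = {"tomato", "basil", "olive oil", "garlic", "onion"}
ADDITIVES = {"E621", "maltodextrin", "modified starch"}

def prefer_real(ingredient_list):
    """Split a package label into recognized real foods and flagged additives."""
    keep = [i for i in ingredient_list if i in REAL_FOODS]
    drop = [i for i in ingredient_list if i in ADDITIVES]
    return keep, drop

keep, drop = prefer_real(["tomato", "maltodextrin", "garlic", "E621"])
print(keep)  # ['tomato', 'garlic']
print(drop)  # ['maltodextrin', 'E621']
```

A real system would of course need fuzzy matching and a far bigger knowledge base, but the preference itself is a one-line policy.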
Undersold? Yes.
Cognitive computing will ultimately be the package you use to invent the replacement for the wheel. The under-sell is a result of being a bit too folksy, in my opinion. The people who need cognitive computing need to play with it, figure out what it can do, and above all, apply it to their work. I can appreciate the need to make it comprehensible to the wider market, sure, but the guys who need it want to see it in action.
For example — the gaming industry needs cognitive computing like fish need water. Code is a big issue, and code efficiency is a bigger one. Inefficient code is costing millions per hour, and a better way of handling this ever-growing codebase will also be better business for everyone. It’ll be extremely useful, too, for code writers who need to test, find bugs, and work out fixes and workarounds. There are a few lazy billion waiting around to be picked up, and nobody will mind a bit.
In the current forte zone, Big Data, cognitive computing is likely to be the eventual reason anything works at all. The truth is that Big Data is a whale being steered by minnows. Market analysts and their clients aren’t exactly over the moon with the software, which tries hard but at this stage basically has to find its own way around, while the Big Data gets bigger and more complex on a daily basis.
Cognitive computing is the way to deal with those complexities, learning new methods and, most importantly, learning how to manage this global monster. Having Big Data is one thing; accessing it is another; turning it into usable information is yet another. The more complex it gets, the more important it is to create new methods to manage it.
Creating efficiencies
The major area of undersell, however, is that the movement of data is getting more ponderous. In the early days of the net, things really were fast. Now, everything seems to lumber along. The early net ran on much simpler software, and “simpler” also meant “more efficient.” That’s another area where cognitive computing can perform invaluable services.
Imagine de-cluttering the net with better code management. Better streaming with more efficient videos. Less garbage, more substance. More space online created by far more efficient intrinsic codes, micro codes and data management. When the Cloud really gets going, the uber-data-monsters will be on the rampage.
In theory, human beings can analyze, find and fix code issues and manage data just as well as cognitive computing. In practice, the various data supernovae which can be caused by any event online aren’t really all that susceptible to quick handling, certainly not in real time. That’s just the net, with things going boom, never mind the endless expansion of business, scientific, and other data management usage.
If you’re talking about responsive fixes, cognitive computing is going to cook up more than a few recipes in the coming centuries. Cookery is a good example of taking a basic coding principle and turning it into a practical working thing. Cookery is difficult. It’s demanding, and it involves understanding the basics very well. If cognitive computing can do that and do it well, it has earned its credentials for the very hard work to come.
Cognitive computing – The risky side
There is a problem, however — you realize this whole thing is eventually going to dovetail with I Can Has Cheezburger? What if the cats figure out that Watson can produce all the treats they like, too? I Can Has Cognitive Cheezburger? They’ve been hanging around so many computers on YouTube, the writing is clearly on the wall.
It’s easy — all you need to do is look at your human, look at the computer, purr “Watson” loud enough, and be adorable. Yet again, the world is working for the cats.

Written By

Editor-at-Large based in Sydney, Australia.
