Op-Ed: One big step for AI — Intel’s new self-learning chip Loihi

By Paul Wallis     Oct 1, 2017 in Technology
Sydney - If there's one thing all sciences are truly lousy at, it's explaining themselves. Intel's new “self-learning” chip, Loihi, is a major and very necessary step forward. Try finding out why, though, from the information available.
The new chips act as both memory and processor, much like a human neuron. Better still, Loihi uses very little power to operate, suggesting much higher levels of efficiency in both memory and processing. The self-learning chips are also capable of autonomous operation, in robots and the like.
The chips respond to stimuli in the form of data and data sets, and learn in real time directly from that data. This makes them very adaptable, learning straight from environmental feedback. The chips get more efficient, and “smarter”, as they learn.
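To make the two ideas above concrete (a neuron whose synaptic weights are its memory and whose spiking is its processing, and learning that happens locally as the data arrives), here is a minimal toy sketch in plain Python. This is an illustration of the principle only, not Intel's actual Loihi architecture or API; every name and constant in it is an assumption invented for the example.

import random

class ToySpikingNeuron:
    """A neuron that is both memory (its weights) and processor (its spikes)."""

    def __init__(self, n_inputs, threshold=1.0, leak=0.9, learn_rate=0.05):
        self.weights = [random.uniform(0.1, 0.5) for _ in range(n_inputs)]
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak
        self.learn_rate = learn_rate

    def step(self, spikes):
        # Integrate weighted input spikes into a leaky membrane potential.
        self.potential = self.leak * self.potential + sum(
            w * s for w, s in zip(self.weights, spikes)
        )
        if self.potential < self.threshold:
            return False
        self.potential = 0.0  # reset after firing
        # Local, Hebbian-style plasticity: strengthen the synapses that
        # contributed to this spike. Learning happens as the data arrives.
        self.weights = [
            min(1.0, w + self.learn_rate) if s else w
            for w, s in zip(self.weights, spikes)
        ]
        return True

neuron = ToySpikingNeuron(n_inputs=4)
pattern = [1, 1, 0, 0]  # a recurring stimulus
for _ in range(50):
    neuron.step(pattern)
print(neuron.weights)  # the weights for the active inputs have grown

Run repeatedly on the same stimulus, the weights for the active inputs climb toward their cap: the “memory” and the “processor” are the same physical state, which is the property the paragraphs above are describing.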
It's at the applications level that explaining the practical value seems to come unstuck. Visualization is probably better than description for explaining how self-learning works and why it's so very important, particularly for the most basic real-time work. Loihi is a crucial step forward at an absolutely fundamental, absolutely necessary, level for high-efficiency real-time operations.
The ability to process in real time is critical to many system operations. Fast responses to operational needs and to environments are exactly what the human brain is supposed to be doing all the time. (If yours isn't doing that, check your warranty.)
You can see why a robot, for example, would have to react to multiple ongoing operations and environmental factors. Loihi is, in effect, the platform for doing just that.
Picture this: your mind can take in millions of bits of data, form them into a picture of the environment and the applicable circumstances, and order a ham sandwich and a cup of coffee. The entire process is almost instant, using literally microscopic amounts of energy.
This is a neurological process, in which your memories and knowledge are resolved into a neural assessment of the data, weighing options and ordering your food as a working process. That's exactly what the self-learning chips and “neural networks” are trying to emulate.
Professionals will have noted how hard I'm trying to stay away from the massive cluster of terminological distractions in most discussions of machine learning. Understanding the process simply and clearly is more likely to defuse the artificial-intelligence phobia in the media than anything else. I would also point out that selling every mention of machine learning like it's either a travel brochure or a physics lesson doesn't exactly help understanding.
Humanity vs A.I. and autonomous machines? Maybe not.
With all due respect to Hawking and others (and plenty is due), artificial intelligence is really a new tool. It's not a very advanced tool, yet. How AI learns is the defining issue. Every expert in machine learning is predicting massive advances, soon, but what we have now is the Stone Age baby version. This is the invention of fire in the next wave of computing, and it needs to be seen for what it is.
Turning it into a perceived monster, totally incomprehensible and threatening, isn't helping much. There is no way on God's green golf course that real, fully functional artificial intelligence can be avoided. It's necessary, to start with, just to handle the tidal ebb and flow of data.
Come to think of it, inventing a new tool that can talk back to its users could be a very good idea. What if a machine calls BS on something? What if all the inscrutable BS in global financial markets, for example, could be accurately analysed, instantly? What if economic and business models could be correctly monitored in real time? Sure, it's big data, but it's not inaccessible data when you use AI. Therefore, AI will have to be used.
A totally different ball game? Damn straight. Self-learning could deliver very useful insights and valuable information assets, backed up by observation and learned knowledge. On the purely commercial level alone, a machine that works like Loihi makes sense.
The huge cultural irony here is that human beings are more likely to trust information from a machine than from another human being. Everybody also likes to back themselves up with proven data. Imagine going to a meeting not armed to the teeth with every bit of current supporting data you can find. It's unthinkable.
Now: why, exactly, would anyone deprive themselves of a do-everything, trustworthy source of data and real-time information? AI, self-learning, deep learning, and real-time cognitive operations just happen to be exactly what everyone wants, and will ultimately need.
I'd like to suggest a workout for Loihi in the context of the real major deal for its functions, gaming:
1. All games must be learned, often in considerable depth.
2. Degree of learning, insight and situational awareness in real time are the critical dynamics in game play.
3. Proficiency of play is the really representative, and evolving, metric (see the sketch after this list).
4. If it can do gaming, Loihi could be a real fascination for the X trillion dollar gaming sector, funding further R&D.
5. You couldn't possibly get better metrics than millions of gamers determined to beat the super chip at its own game. That'd be a real test: the chip versus real humans.
6. You could also game Loihi against chips of its own kind, and against others in the new class of chips. Whole lotta learning going on? Yep.
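As a back-of-the-envelope illustration of point 3, here is how “proficiency of play” could be tracked as an evolving metric for any learning player. The game, the agent, and all the numbers below are assumptions invented for this sketch; none of it reflects Loihi's actual hardware or benchmarks.

import random

N_MOVES, BEST_MOVE = 5, 3  # a trivial one-shot "game": pick the winning move
value = [0.0] * N_MOVES    # the agent's learned estimate of each move
wins = 0

for game in range(1, 1001):
    # Epsilon-greedy play: mostly exploit what has been learned, sometimes explore.
    if random.random() < 0.1:
        move = random.randrange(N_MOVES)
    else:
        move = max(range(N_MOVES), key=lambda m: value[m])
    won = move == BEST_MOVE
    # Learn from each game's outcome as it happens.
    value[move] += 0.1 * (float(won) - value[move])
    wins += won
    if game % 200 == 0:
        # Proficiency of play over the last 200 games is the evolving metric:
        # it should climb as the agent learns.
        print(f"games {game - 199}-{game}: win rate {wins / 200:.2f}")
        wins = 0

The point of the sketch is the measurement, not the agent: any self-learning player, silicon or human, can be scored the same way, which is what makes gaming such a clean benchmark.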
Intel have made a brave, if obviously very highly structured and relentlessly verified, step into an important field of computing. Even if the next logical step is merely the world's greatest gaming chip until a new one comes out, nobody will mind. At least they'll understand it, on a very practical level.
This opinion article was written by an independent writer. The opinions and views expressed herein are those of the author and are not necessarily intended to reflect those of DigitalJournal.com