Why neuromorphic technology is the key to future AI

By Tim Sandle     Jul 18, 2017 in Technology
The future of artificial intelligence is seen by many researchers as "neuromorphic". Neuromorphic chips are being designed to mimic the human brain and they may, soon, replace CPUs.
The idea is to develop microprocessors configured more like the human brain than traditional silicon chips, with the aim of making computers more aware of their environment; this is seen as a step forward for artificial intelligence. Neuroinformatics refers to the creation of neuromorphic chips that can replicate the brain’s information processing capabilities in real time.
Key players in the development of neuromorphic computing are Qualcomm, IBM, HRL Laboratories and the Human Brain Project. The Human Brain Project is a 10-year project seeking to simulate a complete human brain in a supercomputer using biological data. On the commercial side, a neuromorphic chip made by IBM contains five times as many transistors as a standard Intel processor, Wired reports, yet it consumes only 70 milliwatts of power. These TrueNorth chips have a million computer ‘neurons’ that work in parallel across 256 million inter-neuron connections (‘synapses’). Perhaps the fastest development, however, has come from Qualcomm.
Talking with Technology Review, Qualcomm’s chief technology officer, Matthew Grob, has revealed how neuromorphic technology can be embedded into the silicon chips that power all manner of electronic devices. In a demonstration, robotic action figures were tested; as the figures, which included a Captain America and a Spider-Man, moved around, they demonstrated awareness of their surroundings.
The figures demonstrated “neuromorphic” responses by processing sensory data, including images of objects and sounds, and they were able to react to changes. What is remarkable is that these responses were not pre-programmed; instead, the computers in the toys collected data, processed it and then made a decision. This is the starting point for machines that will be able to understand and interact with the world in humanlike ways.
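The collect-process-decide loop described above can be illustrated with a toy example. The sketch below uses a simple online perceptron, one of the most basic learning rules; it is a hypothetical illustration, not Qualcomm's actual (unpublished) algorithm, and the feature vectors and labels are invented for the example.

```python
# Toy sense -> process -> decide loop: behaviour is learned from a
# stream of observations rather than pre-programmed as fixed rules.
# This is an illustrative sketch, not any vendor's real algorithm.

def perceptron_update(weights, features, label, lr=0.1):
    """One online learning step: nudge the weights toward the
    observed label instead of following a hand-written rule."""
    prediction = 1 if sum(w * f for w, f in zip(weights, features)) > 0 else 0
    error = label - prediction
    return [w + lr * error * f for w, f in zip(weights, features)]

# Hypothetical sensor observations: (feature vector, desired reaction).
weights = [0.0, 0.0]
observations = [([1.0, 0.0], 1), ([0.0, 1.0], 0)] * 20
for features, label in observations:
    weights = perceptron_update(weights, features, label)

def decide(features):
    """The 'decision' the toy makes after learning from its data."""
    return sum(w * f for w, f in zip(weights, features)) > 0
```

After the loop, `decide` reacts differently to the two stimulus patterns even though no rule for either was ever written by hand.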
Neuromorphic engineering requires developers to understand how the morphology of individual neurons and circuits, and the way they are combined in applications, affects how information is represented. This in turn influences how computerized systems learn and develop. The field draws on biology, physics, mathematics, computer science and electronic engineering.
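A common starting point in neuromorphic engineering is the leaky integrate-and-fire (LIF) neuron, which captures in a few lines how a silicon 'neuron' accumulates input and emits spikes. The sketch below is a minimal illustration; the threshold and leak values are illustrative, not parameters of TrueNorth or any real chip.

```python
# Minimal leaky integrate-and-fire (LIF) neuron. Parameters are
# illustrative only, not taken from any particular neuromorphic chip.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Leaky integration of input over discrete time steps: the
    membrane potential decays each step, and when it crosses the
    threshold the neuron spikes (1) and resets."""
    potential = 0.0
    spikes = []
    for i in input_current:
        potential = potential * leak + i   # leak, then integrate input
        if potential >= threshold:
            spikes.append(1)               # spike event
            potential = reset
        else:
            spikes.append(0)
    return spikes

# A steady input drives the neuron to spike at a regular rate.
spike_train = simulate_lif([0.3] * 10)
```

Chips like TrueNorth wire up vast numbers of such units so that spikes propagate across the synapse connections in parallel, rather than being stepped through one instruction at a time as a CPU would.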
Beyond toys, there are a number of applications that could assist companies with digital transformation. For example, medical sensors and devices could track individuals’ vital signs and responses to treatment over time. Based on key signals, the machines would learn to adjust dosages or notify clinicians of problems.
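The vital-signs idea can be sketched as a monitor that learns a patient's running baseline and flags readings that drift too far from it. This is a hypothetical illustration of the concept; the moving-average coefficient, tolerance and sample readings are all invented for the example.

```python
# Hypothetical vital-signs monitor: learn a running baseline with an
# exponentially weighted moving average (EWMA) and flag outliers.
# All parameter values and readings here are illustrative.

def monitor(readings, alpha=0.2, tolerance=15.0):
    """Return the indices of readings that deviate from the learned
    baseline by more than `tolerance` (e.g. to notify a clinician)."""
    baseline = readings[0]
    alerts = []
    for i, value in enumerate(readings[1:], start=1):
        if abs(value - baseline) > tolerance:
            alerts.append(i)   # anomalous reading: raise an alert
        else:
            # Normal reading: fold it into the learned baseline.
            baseline = alpha * value + (1 - alpha) * baseline
    return alerts

heart_rate = [72, 74, 71, 73, 120, 72, 70]  # abrupt spike at index 4
alerts = monitor(heart_rate)
```

The same learn-a-baseline pattern extends to dosage adjustment: instead of raising an alert, the system would shift its output gradually as the tracked signal moves.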
Taking a second example, from the business realm, researchers could develop software for smartphones that learns to anticipate what the user wants next, such as information about the next client in the appointments calendar. To add to this, neuromorphic computing could assist developments such as Google’s self-driving cars.
Based on these developments, Grob said: “We’re blurring the boundary between silicon and biological systems.” In terms of timescales, Qualcomm’s chips are expected to be available for commercial use in 2018. Qualcomm is a U.S. multinational semiconductor and telecommunications equipment company that designs and markets wireless telecommunications products and services.
Interested in reading more about artificial intelligence? An interesting debate in medical science is whether artificial intelligence can improve on human doctors, a topic which Digital Journal has recently covered.