{"id":5781,"date":"2020-06-26T09:05:26","date_gmt":"2020-06-26T08:05:26","guid":{"rendered":"https:\/\/www.innovationnewsnetwork.com\/?p=5781"},"modified":"2020-09-21T09:37:18","modified_gmt":"2020-09-21T08:37:18","slug":"neuromorphic-technology-accelerating-the-ai-revolution-in-europe","status":"publish","type":"post","link":"https:\/\/www.innovationnewsnetwork.com\/neuromorphic-technology-accelerating-the-ai-revolution-in-europe\/5781\/","title":{"rendered":"Neuromorphic technology \u2013 accelerating the AI revolution in Europe"},"content":{"rendered":"

## Neuromorphic technologies are rapidly advancing across Europe and are poised to help accelerate the AI revolution.

By taking loose inspiration from the brain, artificial neural network algorithms have made tremendous progress in Artificial Intelligence. However, to unlock significant gains in terms of novel real-world capabilities, performance and efficiency, a more ambitious step needs to be taken: developing a new technology that emulates neural computation directly at the hardware level. The NEUROTECH network presents its vision of these 'neuromorphic' technologies and their innovative potential in Europe.

### Efficient vs. power-hungry

Training artificial neural networks to perform pattern recognition tasks on Graphics Processing Units typically requires hundreds of Watts. Simulating even very small parts of animal brains on supercomputers requires tens of megawatts. In comparison, the human brain consumes only around 20 Watts to carry out sophisticated perceptual and cognitive tasks. Neuromorphic technologies aspire to emulate neural processing circuits and bridge this large energy-efficiency gap.
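
To make the scale of this gap concrete, the short sketch below simply puts the figures quoted above side by side; the GPU and supercomputer values are rough, illustrative assumptions rather than measurements.

```python
# Back-of-envelope comparison of the power figures quoted above.
# The GPU and supercomputer numbers are rough, illustrative assumptions.
gpu_training_power_w = 300        # "hundreds of Watts" for GPU training (assumed value)
supercomputer_power_w = 20e6      # "tens of megawatts" for brain-scale simulation (assumed value)
brain_power_w = 20                # human brain, as quoted in the article

print(f"GPU vs brain:           ~{gpu_training_power_w / brain_power_w:.0f}x more power")
print(f"Supercomputer vs brain: ~{supercomputer_power_w / brain_power_w:.0e}x more power")
```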

### Parallel vs. sequential

Although each neuron in a biological neural processing system typically spikes only a few times per second, the massive parallelism of its many neurons and synapses allows it to perform many orders of magnitude more operations per second than artificial neural networks simulated on conventional computers. Approaching such high levels of parallelism (of the order of thousands of processing elements and above) in compact, power-efficient hardware platforms will require drastic changes in computer architectures and electronic devices.
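
A rough order-of-magnitude estimate illustrates the argument; the neuron and synapse counts below are commonly quoted textbook figures, assumed here for illustration only.

```python
# Order-of-magnitude estimate of the parallelism argument above.
# All figures are rough, commonly quoted assumptions, not measurements.
neurons = 1e11             # ~10^11 neurons in a human brain (assumed)
synapses_per_neuron = 1e4  # ~10^4 synapses per neuron (assumed)
mean_firing_rate_hz = 1    # each neuron spikes a few times per second (as stated above)

synaptic_ops_per_second = neurons * synapses_per_neuron * mean_firing_rate_hz
print(f"~{synaptic_ops_per_second:.0e} synaptic operations per second, at ~20 W")
```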

### In-memory computing vs. von Neumann architecture

In conventional computer architectures, a large part of the energy consumption and delay is due to the transfer of information between the physically separated memory and computing units. In neural network algorithms, this 'von Neumann bottleneck' is critical because huge numbers of parameters need to be stored and frequently accessed. Neuromorphic technologies aim to bring memory and computing together, as in the brain, where computing (neurons) and memory (synapses and the topology of the network) are completely intertwined.
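
The NumPy sketch below contrasts the two views; it is a conceptual illustration, not a device model, and the array sizes are arbitrary assumptions. In a physical crossbar, the whole matrix-vector product is read out in place in one step, whereas a von Neumann machine must stream every stored weight through a separate processor.

```python
# Conceptual sketch of in-memory computing: in a crossbar of analogue devices,
# the weight matrix W *is* the physical array, and a matrix-vector product
# happens where the memory sits, instead of streaming W through a processor.
# This is an illustration in NumPy, not a device model; sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128))   # conductances stored in the crossbar (assumed sizes)
x = rng.normal(size=128)          # input voltages applied to the columns

# Von Neumann view: every weight row is fetched from memory and multiplied in turn.
y_sequential = np.array([w_row @ x for w_row in W])

# In-memory view: the whole product is one collective read-out of the array.
y_in_memory = W @ x

assert np.allclose(y_sequential, y_in_memory)  # same result, very different data movement
```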

### Plastic vs. rigid

Learning, both in the brain and in neural network algorithms, corresponds to the repeated modification of synapses until a set of connections is reached that enables the network to perform tasks accurately. In conventional computers, this is done by explicitly modifying the memory banks that store the weights. Neuromorphic technologies aim to build systems in which weights modify themselves through local rules and plastic synaptic devices, as happens in the brain.
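
A minimal sketch of such a local rule is a plain Hebbian update, in which each weight changes only as a function of the activity of the two neurons it connects; the sizes, activity values and learning rate below are illustrative assumptions.

```python
# Minimal sketch of a local plasticity rule (a simple Hebbian update):
# each synapse changes only as a function of its own pre- and post-synaptic
# activity, with no global controller rewriting memory banks.
# Sizes, activities and learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
pre = rng.random(100)                             # pre-synaptic activity
post = rng.random(10)                             # post-synaptic activity
weights = rng.normal(scale=0.1, size=(10, 100))   # synaptic weights

learning_rate = 0.01
delta = learning_rate * np.outer(post, pre)       # each entry depends only on its own pre/post pair
weights += delta
print(f"mean weight change: {np.abs(delta).mean():.4f}")
```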

### Analogue vs. digital

Conventional computers rely on digital encoding (zeros and ones). In the brain, the electrical potential across a neuron's membrane can take continuous values, and so can the synaptic weights. Reproducing such behaviour with digital encoding requires large circuits. Replacing them with analogue components – either CMOS transistors or emerging nanodevices – that directly emulate neural behaviour could improve efficiency. However, large-scale realisations have yet to be demonstrated.
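
The leaky integrate-and-fire model below illustrates what an analogue circuit would emulate directly: a continuously valued membrane potential that is only discretised at the moment a spike is emitted. Time constant, threshold and input level are illustrative assumptions.

```python
# Minimal sketch of the continuous membrane potential described above:
# a leaky integrate-and-fire neuron whose voltage evolves in analogue fashion
# and is only discretised when it crosses threshold and spikes.
# Time constant, threshold and drive are illustrative assumptions.
import numpy as np

dt, tau, v_th, v_reset = 1e-3, 20e-3, 1.0, 0.0
v, spikes = 0.0, []
input_current = 1.2 * np.ones(200)          # constant drive (assumed)

for t, i_in in enumerate(input_current):
    v += dt / tau * (-v + i_in)             # continuous-valued leaky integration
    if v >= v_th:                           # the only "digital" event: a spike
        spikes.append(t * dt)
        v = v_reset

print(f"{len(spikes)} spikes in {len(input_current) * dt:.1f} s")
```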

### Dynamic vs. static

Conventional computers use the steady state of their circuits to encode information. Neurons, by contrast, are non-linear oscillators that emit voltage spikes. They are coupled to each other and capable of collective behaviour such as synchronisation, transient dynamics and operation at the edge of chaos. Neuromorphic technologies aim to emulate such complex dynamical systems in order to go beyond the possibilities of static neural networks, in particular regarding learning.
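
As a toy illustration of such collective dynamics, the sketch below simulates a Kuramoto model of coupled oscillators, a standard stand-in for synchronisation phenomena rather than a neuron model; the coupling strength and natural frequencies are assumptions.

```python
# A Kuramoto model of coupled oscillators: a toy stand-in for the
# synchronisation behaviour described above, not a neuron model.
# Coupling strength and natural frequencies are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, coupling, dt = 100, 2.0, 0.01
freqs = rng.normal(1.0, 0.1, n)               # natural frequencies (assumed)
phase = rng.uniform(0, 2 * np.pi, n)          # random initial phases

for _ in range(2000):
    mean_field = np.mean(np.exp(1j * phase))  # complex order parameter r * e^(i*psi)
    phase += dt * (freqs + coupling * np.abs(mean_field)
                   * np.sin(np.angle(mean_field) - phase))

r = np.abs(np.mean(np.exp(1j * phase)))
print(f"order parameter r = {r:.2f}")         # r close to 1 means the population has synchronised
```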

### Spiking vs. clocked

Conventional computers are driven by a clock that sets the pace of all circuits. There is no such clock in the brain. In sensory processing, for example, the brain achieves a large part of its efficiency by operating in an event-based manner, where signals are only sampled and transmitted when new information arrives or is computed. Neuromorphic computing aims to design spiking architectures that natively support this scheme.
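
A minimal send-on-delta sketch illustrates the event-based idea: a value is transmitted only when it has changed by more than a threshold, rather than on every clock tick. The input signal and threshold are illustrative assumptions.

```python
# Minimal sketch of event-based (send-on-delta) encoding: a signal is only
# transmitted when it changes by more than a threshold, instead of being
# sampled on every clock tick. Signal and threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
signal = np.cumsum(rng.normal(0, 0.05, 1000))   # slowly varying input (assumed)
threshold = 0.5

events, last_sent = [], signal[0]
for t, x in enumerate(signal):
    if abs(x - last_sent) > threshold:          # new information -> emit an event
        events.append((t, x))
        last_sent = x

print(f"{len(events)} events instead of {len(signal)} clocked samples")
```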

### Stochastic vs. exact

Conventional computers aim at very high precision, in contrast to the brain, whose neurons and synapses exhibit variability and stochasticity. Resilience to such imprecision appears to be a key asset of neural networks. Relaxing the constraints on the exactness of components and computing steps, in order to decrease energy consumption while maintaining accurate results, is a goal of neuromorphic technologies.
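
The sketch below illustrates this resilience under assumed sizes and noise levels: when every stored weight of a large linear layer is perturbed by roughly 10% device variability, the layer's output shifts only modestly and most unit-level decisions are unchanged, the kind of imprecision a trained network can typically absorb.

```python
# Illustration of precision relaxation: a large linear layer computed with
# noisy weights still lands close to the exact result, and most sign-level
# decisions are preserved. Sizes and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(4)
W = rng.normal(size=(100, 10_000))                   # a large linear layer (assumed sizes)
x = rng.normal(size=10_000)

W_noisy = W + rng.normal(scale=0.1, size=W.shape)    # ~10% device variability (assumed)
y_exact, y_noisy = W @ x, W_noisy @ x

rel_err = np.linalg.norm(y_noisy - y_exact) / np.linalg.norm(y_exact)
sign_flips = np.mean(np.sign(y_noisy) != np.sign(y_exact))
print(f"output error: {rel_err:.1%}, unit decisions flipped: {sign_flips:.1%}")
```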

Each of these directions represents a break with the current computing paradigm. As such, neuromorphic computing is an extremely ambitious, multi-disciplinary effort: each direction will require significant advances in computing theory, architecture and device physics.

\"\"<\/a><\/p>\n

### Applications

Neuromorphic computing has the potential to bring huge progress to a wide range of applications. Here we outline some of the most important expected advances.

#### Smart agents on the edge

Neuromorphic technologies will provide systems capable of running state-of-the-art artificial intelligence while consuming very little power and energy, enabling embedded and always-on processing. This opens the way to deploying artificial intelligence at the edge, where power consumption and size are critical. Neuromorphic computing enables continuous learning and adaptation to different environments and users. It offers unprecedented capabilities for encoding sensory information efficiently and can lead to smart, distributed local processing that provides faster responses (to trigger further actions, for example) as well as better security and privacy.

#### Service to people

Extremely low-power, always-on systems for voice, speech and keyword detection can enable further speech processing, towards natural language processing, which in turn will lead to more efficient personal assistants. Always-on systems for fall detection and biomedical signal monitoring in households will further enhance the capabilities of personal assistants. Robotic assistants will physically interact with the environment and with humans, and will use the adaptability offered by neuromorphic perception and computation to adjust their behaviour. In healthcare, data privacy is of paramount importance, making on-site processing of information even more critical.

#### Industrial

Anomaly detection, in the broadest sense, is extremely useful in manufacturing plants, where monitoring of workers can improve safety. Neuromorphic technology can also provide solutions for anomaly detection in time series, the automation of controls and tests, design for manufacturing, defect detection and forecasting, predictive maintenance of machines, and more. A similar approach can be deployed in the design of safer cars, both for monitoring the car's status and for advanced driver-assistance systems, within a limited power budget.

#### Research

Besides their impact on applications, neuromorphic chips will also support research: they are ideal systems for simulating biological neural networks, contributing to our understanding of the brain and the mechanisms of intelligent behaviour. This could bring substantial new knowledge towards treatments for neurological diseases.

\"\"<\/a><\/p>\n

### Technological enablers

Neuromorphic computing requires a departure from the traditional computing paradigm. This implies either using conventional computing substrates in a novel way or developing entirely new substrates. Here we outline the key technological enablers for neuromorphic computing that are showing promising results.

#### Digital CMOS technology

The mainstay of the semiconductor manufacturing industry, digital CMOS is well understood and delivers very consistent performance in volume manufacture. It can access the most advanced semiconductor technologies, which helps offset its intrinsic energy-efficiency disadvantages compared with analogue circuits. When applied to neuromorphic architectures, asynchronous, clocked and hybrid approaches to circuit timing can be used, and algorithms can be mapped either into fixed (albeit highly parameterised and configurable) circuits for efficiency or into software for flexibility. Examples of the former include IBM's TrueNorth and Intel's Loihi; an example of the latter is NEUROTECH partner the University of Manchester's SpiNNaker many-core neuromorphic computing platform.¹

#### Analogue/mixed-signal technology

Event-based analogue/mixed-signal neuromorphic technology combines the compact, low-power features of analogue circuits with the robustness and low latency of digital, asynchronous, event-based ones. The key feature of the mixed-signal design approach, compared with purely digital ones, is the ability to build systems that can carry out processing with stringent power and memory resources by: