Intel Unveils Next Generation Neuromorphic Computing Chip

Intel has introduced Loihi 2, its second-generation neuromorphic research chip, and Lava, an open-source software framework for developing neuro-inspired applications. The company says the launch signals its ongoing progress in advancing neuromorphic technology.
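
To give a sense of the kind of code Lava is aimed at, the sketch below wires up a tiny two-population spiking network and runs it on a CPU simulation backend. It is a rough sketch based on the usage shown in Lava’s public repository around launch; the module paths and class names (LIF, Dense, RunSteps, Loihi1SimCfg) are assumptions that may differ in current releases, and the network itself is arbitrary.

    # Illustrative sketch only: class names and module paths follow Lava's
    # public examples at launch and may have changed in later releases.
    import numpy as np

    from lava.proc.lif.process import LIF          # leaky integrate-and-fire neurons
    from lava.proc.dense.process import Dense      # dense synaptic connections
    from lava.magma.core.run_conditions import RunSteps
    from lava.magma.core.run_configs import Loihi1SimCfg

    # Two populations of spiking neurons joined by a weight matrix.
    source = LIF(shape=(3,))
    synapses = Dense(weights=np.eye(3))
    target = LIF(shape=(3,))

    # Wire spike output to synaptic input, synaptic output to neuron input.
    source.s_out.connect(synapses.s_in)
    synapses.a_out.connect(target.a_in)

    # Run 100 timesteps on the CPU simulation backend; the same process graph
    # is meant to be retargetable to neuromorphic hardware.
    target.run(condition=RunSteps(num_steps=100), run_cfg=Loihi1SimCfg())
    target.stop()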

Neuromorphic computing, which draws insights from neuroscience to create chips that function more like the biological brain, aspires to deliver orders of magnitude improvements in energy efficiency, speed of computation and efficiency of learning across a range of edge applications: from vision, voice and gesture recognition to search retrieval, robotics, and constrained optimisation problems.
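
To make the brain-inspired framing concrete, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron, the kind of event-driven unit that neuromorphic chips implement in silicon. This is a plain Python/NumPy illustration of the concept, not Intel code; the leak, threshold and input values are arbitrary.

    import numpy as np

    # Minimal leaky integrate-and-fire (LIF) neuron: the membrane voltage leaks
    # toward zero, accumulates input current, and emits a spike (then resets)
    # whenever it crosses a threshold. All parameters here are arbitrary.
    decay = 0.9          # leak factor per timestep
    threshold = 1.0      # spike threshold
    voltage = 0.0
    rng = np.random.default_rng(0)

    spikes = []
    for t in range(50):
        current = rng.uniform(0.0, 0.3)      # stand-in for weighted input spikes
        voltage = decay * voltage + current  # leak, then integrate
        if voltage >= threshold:
            spikes.append(t)                 # event-driven output: a spike
            voltage = 0.0                    # reset after firing
    print("spike times:", spikes)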

Applications Intel and its partners have demonstrated to date include robotic arms, neuromorphic skins and olfactory sensing.

According to Intel, the advances in Loihi 2 allow the architecture to support new classes of neuro-inspired algorithms and applications, while providing up to 10 times faster processing, up to 15 times greater resource density with up to 1 million neurons per chip, and improved energy efficiency.

Benefiting from a close collaboration with Intel’s Technology Development Group, Loihi 2 has been fabricated with a pre-production version of the Intel 4 process, which underscores the health and progress of Intel 4. The use of extreme ultraviolet (EUV) lithography in Intel 4 has simplified the layout design rules compared to past process technologies; according to Intel, this has made it possible to develop Loihi 2 rapidly.

Dr Gerd J Kunde, staff scientist at Los Alamos National Laboratory, says, “This research has shown some exciting equivalences between spiking neural networks and quantum annealing approaches for solving hard optimisation problems. We have also demonstrated that the backpropagation algorithm, a foundational building block for training neural networks and previously believed not to be implementable on neuromorphic architectures, can in fact be realised on such architectures.”
