Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the views of the Economic Times – ET Edge Insights, its management, or its members


Advances in computing power have taken the world by storm in the form of a stream of technological innovations. Today, everything is connected and sits at the heart of life, industry, and commerce: wearables, smartphones, automobiles, and IoT devices have created a seamlessly connected reality. AI and analytics have become central to our connected lives. However, the true potential of AI-based advancements is likely to be unlocked with the advent of neuromorphic chips, which are already very much a reality today.

Neuromorphic chips are still at an early stage. Their purpose is to mimic the functioning of the human brain and its ability to learn, while delivering lower-power chips with a high degree of efficiency. Neuromorphic computing will complement existing technologies such as FPGAs and ASICs, alongside CPUs and GPUs. Many mainstream chips already exhibit certain neuromorphic capabilities.

Let’s take a closer look at how neuromorphic chips are being used to solve some real-world problems.

Neuromorphic chips to the rescue

In July 2020, the third annual International Conference on Neuromorphic Systems was held, a three-day event with sessions from researchers around the world. The final paper was intriguing: it showed how infection rates in vulnerable populations could be slowed with the aid of neuromorphic computing. The paper was titled ‘Modelling Epidemic Spread with Spike-based Models’. Such models can play an instrumental role in shaping pandemic responses, government policies, and healthcare policies to save lives. With many countries in the grip of a second wave of the pandemic, the need for such insights is more relevant than ever before.
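The paper itself isn’t reproduced here, but the core idea of treating infection events like neuron spikes can be sketched in ordinary Python. Everything below is an illustrative assumption rather than the authors’ model: the function name `spike_epidemic`, the parameter values, and the randomly mixing population are all hypothetical. An infection is treated like a spike, and a recovered individual is permanently ‘refractory’.

```python
import random

def spike_epidemic(n_people, contacts_per_step, p_transmit,
                   infectious_steps, initial_infected, steps, seed=0):
    """Toy spike-based epidemic sketch (illustrative only).

    State encoding: 0 = susceptible, >0 = remaining infectious steps,
    -1 = recovered (refractory, cannot be reinfected).
    """
    rng = random.Random(seed)
    state = [0] * n_people
    for person in rng.sample(range(n_people), initial_infected):
        state[person] = infectious_steps

    history = []
    for _ in range(steps):
        new_infections = []
        for person, s in enumerate(state):
            if s > 0:  # infectious: each random contact may 'spike' another node
                for _ in range(contacts_per_step):
                    other = rng.randrange(n_people)
                    if state[other] == 0 and rng.random() < p_transmit:
                        new_infections.append(other)
        # Advance timers: infectious individuals move toward recovery.
        state = [s - 1 if s > 1 else (-1 if s == 1 else s) for s in state]
        for person in new_infections:
            if state[person] == 0:  # still susceptible (handles duplicates)
                state[person] = infectious_steps
        history.append(sum(1 for s in state if s > 0))
    return history

# Epidemic curve: number of infectious individuals at each step.
curve = spike_epidemic(n_people=500, contacts_per_step=3, p_transmit=0.1,
                       infectious_steps=5, initial_infected=5, steps=60)
```

On neuromorphic hardware such event-driven models are attractive because work is only done when a ‘spike’ (an infection event) occurs, rather than updating every individual at every tick.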

Today, the most impactful advances in neuromorphic chips have come from R&D laboratories, government programmes, and academia. In 2020, the global neuromorphic chip market was estimated at US$2.3 billion, according to Business Wire. The market is set to reach $10.4 billion over an 8-year forecast period, growing at a CAGR of 24.2%.

AI at the hardware level

With the conventional approach, manufacturers pack smaller transistors into a tiny die, along with caches, memory, and data pathways. This is why computing power in gadgets and mobile devices is limited by a frugal power budget. For decades, chip manufacturers have been shrinking transistor sizes to pack in more power; the most advanced commercial nodes are now at 5 nm, and even 4 nm. However, there are physical limits to this approach: transistors can only shrink so much before hitting a wall, which is why techniques such as 3D chip stacking and EUV lithography are used on the most advanced nodes.

Neuromorphic chips take an engineering approach in which computing resources are used far more efficiently, a sharp departure from the brute-force approach of conventional chips. Their design is built from the ground up with a focus on AI and hardware-based machine learning, and they can deliver the very high on-chip throughput that AI applications demand. Much like the human mind, they mimic how the brain processes information from auditory and visual stimuli, adapting and learning as images and sounds change. The human brain excels at picking out patterns from noise and learning from them, whereas conventional chips excel at processing discrete, clearly defined data. In time, the neuromorphic approach should be able to solve problems that conventional computing cannot, or solve specific problems with far fewer resources.
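The spiking behaviour that neuromorphic hardware implements in silicon can be illustrated in software with a leaky integrate-and-fire (LIF) neuron, the standard textbook model of spike-based computation. This is a minimal sketch for intuition only: the parameter values and the function name `simulate_lif` are assumptions, and real neuromorphic chips realise such dynamics in massively parallel analogue or digital circuits rather than in a Python loop.

```python
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane potential leaks toward its resting value, integrates
    the input current, and emits a spike when it crosses the threshold,
    after which it is reset. Returns the potential trace and spike times.
    """
    v = v_rest
    trace, spikes = [], []
    for t, i_in in enumerate(input_current):
        # Leak toward rest, then integrate this step's input current.
        v += dt * (-(v - v_rest) / tau + i_in)
        if v >= v_threshold:   # threshold crossed: emit a spike
            spikes.append(t)
            v = v_reset        # reset after spiking
        trace.append(v)
    return trace, spikes

# A constant supra-threshold input drives the neuron to spike repeatedly.
trace, spikes = simulate_lif([0.1] * 100)
```

The key property this captures is event-driven operation: information is carried by the timing of discrete spikes rather than by continuously updated numeric activations, which is what lets neuromorphic hardware stay idle, and thus frugal with power, between events.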

The road ahead

Today, CPUs, GPUs, and FPGAs largely carry out neuromorphic computing work via deep learning systems, but such an approach isn’t optimal. Chips like Intel’s Loihi are ground-breaking because they have been built from the ground up for neuromorphic tasks and are significantly more efficient at them than other approaches. The efficiency of neuromorphic chips will drive the edge computing initiatives of the future and impel a new generation of miniature devices with AI capabilities.

– Lionel Alva

