Researchers at MIT have developed a new hardware chip that can make neural networks in smart devices more power efficient. Neural networks are the processing systems behind AI features such as speech and face recognition.
Usually, neural networks require a great deal of energy because they compute large amounts of data at once, so while devices such as smartphones do offer neural-network features, they are integrated only on a smaller scale. Instead, the data is typically sent to remote servers, processed there, and the results are sent back to the device.
These new chips, however, compute information three to seven times faster than current dedicated chips. Not only that, they reportedly consume 94 to 95 percent less power, making them an ideal inclusion in smaller handheld devices in the future.
"Since these machine-learning algorithms need so many computations, this transferring back and forth of data is the dominant portion of the energy consumption. But the computation these algorithms do can be simplified to one specific operation, called the dot product. Our approach was, can we implement this dot-product functionality inside the memory so that you don't need to transfer this data back and forth?" – Avishek Biswas, the MIT graduate student who led the new chip's development.
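To make the quote above concrete, here is a minimal sketch (in plain Python, not the chip's actual in-memory implementation) of why neural-network computation reduces to dot products: each output of a fully connected layer is the dot product of the input vector with that neuron's weight vector. The input and weight values below are hypothetical, chosen only for illustration.

```python
def dot(a, b):
    """Dot product: multiply elementwise, then sum the results."""
    return sum(x * y for x, y in zip(a, b))

def layer(inputs, weight_rows):
    """One fully connected layer: one dot product per output neuron."""
    return [dot(inputs, w) for w in weight_rows]

# Hypothetical activations and weights (one weight row per output neuron).
inputs = [0.5, -1.0, 2.0]
weights = [[0.2, 0.4, 0.1],
           [0.7, -0.3, 0.5]]

print(layer(inputs, weights))
```

A conventional chip would shuttle `inputs` and `weights` between memory and the processor for every such multiply-and-sum; the MIT design performs the dot product inside the memory itself, which is where the reported energy savings come from.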
In theory, these chips could also find their way into smartphones. Newer smart devices already run neural networks to some degree, for speech recognition, photo manipulation, and so on, so having the new chips on board could reduce those devices' battery consumption.
This opens up more possibilities for developing powerful chips that require less power across all kinds of devices in the future.