MIT Builds Computer Chip to Power Mobile AI Better than Qualcomm's
Qualcomm's chips are everywhere, which is probably why so many of the world's gadget makers prefer its tech over any other. But while Qualcomm's hardware is undeniably good, researchers at the Massachusetts Institute of Technology are giving the company a run for its money with a new chip that is not only energy efficient but also built to run artificial intelligence.
The chip, according to Tech Times, is expected to enable "neural networks" -- systems that loosely mimic the way the human brain works -- to run directly on mobile devices. These chips, according to the researchers, would be embedded in a mobile device's processing unit, which can pack as many as 200 processing units, or cores, at a time, making the design an especially good fit for running neural networks.
Boasting speeds up to ten times faster and more efficient than other mobile GPUs, the new chip could allow mobile devices to run artificial intelligence algorithms locally, without needing to upload data to or download results from the internet.
Android Headlines reported that the tech, called Eyeriss, was unveiled at the International Solid-State Circuits Conference in San Francisco, California. Each of the chip's 168 cores has its own memory, which means it can work without constantly communicating with a central memory bank -- a design that saves significant time and energy. It also means that data can be shared locally between cores instead.
The outlet also pointed out that data is compressed before being passed between cores, and that a special circuit in Eyeriss allocates work so that each core is given the maximum amount it can handle.
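The two ideas in that description, compressing data as it moves between units and an allocator that keeps every core as busy as possible, can be sketched in miniature. The `Core` class and the greedy `distribute` function below are hypothetical illustrations of the concept, not MIT's actual design:

```python
import zlib
from dataclasses import dataclass, field


@dataclass
class Core:
    """A toy core with its own small local memory (hypothetical model)."""
    local_memory: dict = field(default_factory=dict)
    work_done: int = 0  # total bytes this core has processed

    def process(self, key, compressed_chunk):
        # Decompress into local memory only if the data is not already
        # cached there, mimicking how a core with its own memory can
        # reuse data without fetching from a central memory bank.
        if key not in self.local_memory:
            self.local_memory[key] = zlib.decompress(compressed_chunk)
        self.work_done += len(self.local_memory[key])


def distribute(chunks, cores):
    """Greedy allocator: hand each chunk to the least-loaded core,
    a stand-in for Eyeriss's work-allocation circuit (details assumed)."""
    for key, data in chunks:
        compressed = zlib.compress(data)  # data travels between units compressed
        target = min(cores, key=lambda c: c.work_done)
        target.process(key, compressed)
    return cores
```

Distributing eight equal-sized chunks across four such cores leaves every core with the same load, which is the balancing behavior the dedicated circuit is described as enforcing.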
Funded by DARPA, the Department of Defense agency responsible for emerging military tech, Eyeriss revives research on neural networks conducted back in the '70s.
While designed for artificial intelligence, Eyeriss could also find use in autonomous cars, robots, drones, and other applications that require machine learning and onboard processing.
Vivienne Sze, a professor at the MIT Department of Electrical Engineering and Computer Science, said, "Right now, the networks are pretty complex and are mostly run on high-power GPUs. You can imagine that if you can bring that functionality to your cell phone or embedded devices, you could still operate even if you don't have a Wi-Fi connection."
She went on, "You might also want to process locally for privacy reasons. Processing it on your phone also avoids any transmission latency, so that you can react much faster for certain applications."