Apple is reportedly working on a dedicated AI chipset, known as the Apple Neural Engine, to handle AI-related tasks on iPhones. The chip would power features such as facial recognition and speech recognition. Siri's voice recognition already gave Apple a competitive edge early on.
The new AI chipset would help the company implement enhanced capabilities, especially for self-driving cars and devices running augmented reality (AR), a technology that superimposes graphics and other information onto a person's view of the real world. Apple is presently focused on AI because AR and self-driving cars are widely seen as the future, and AI is the core technology behind both.
Performance improvements from the Apple Neural Engine chipset
The Apple Neural Engine chip is also expected to improve the battery life and overall performance of Apple devices. It is not yet clear whether Apple will ship the chip this year, but reports suggest the company is already testing future iPhone prototypes with it.
Apple devices currently handle artificial intelligence processes with two different chips: the main processor and the graphics chip. With the new chip, Apple could offload those tasks to a dedicated module designed specifically for AI processing, which is how the battery and overall performance gains would be achieved.
Other companies working on advanced AI chipsets
Qualcomm's latest Snapdragon chip for smartphones includes a module for handling artificial intelligence tasks. Other companies, such as Google, are not lagging far behind either. Google introduced the Tensor Processing Unit in 2016, using the chip to improve search results and image recognition. At this year's I/O conference, Google announced a new version of the TPU that will be available through its cloud business. Nvidia Corp. also sells similar chipsets to its customers.
The operating system and software features will be integrated with devices that include the Apple Neural Engine chipset. For example, Apple is already looking to run facial recognition in the Photos application, parts of speech recognition, and the predictive keyboard on the chip. Apple might also offer developers access to the chipset, which would let third-party apps offload their own AI tasks.
Apple might discuss some of its latest AI advancements at its annual developers' conference in June, where it also plans to unveil iOS 11. The company is also expected to discuss new laptops with faster Intel chipsets. The goal of the Apple Neural Engine is to separate heavy AI tasks from the phone's processor and graphics chip, much as Apple already uses separate chips to power motion sensing in its various devices.