Apple Unveils ‘Neural Engines’ at iPhone 15 Event
The new S9 SiP chip will process Siri requests on device instead of via the cloud, making the voice assistant faster.
At a Glance
- Apple plays down AI at iPhone 15 reveal, but showcases improved neural engines and machine learning chips.
This week, Apple showcased the iPhone 15, Apple Watch Series 9 and new AirPods Pro at its annual consumer devices event.
AI was not mentioned prominently, unlike at its tech rivals’ conferences. Among the AI-related news were updates to some of its chips.
The new Series 9 Watch got a brand new chip – the S9 SiP – which contains a four-core ‘Neural Engine’ that processes machine learning tasks up to twice as fast as the original Apple Watch Ultra.
The new S9 SiP chip also boasts on-device processing for Siri, Apple’s voice assistant. Previously, the smartwatch sent Siri requests to the cloud for processing; the shift to on-device processing means faster responses when using the voice tool.
The new machine learning chip on the Watch Series 9 also enables users to perform a finger-pinching gesture – what Apple calls the ‘double tap’ – to pause music, end a call or launch an app.
The new chip enables the Watch to detect subtle movements and changes in blood flow when the double tap action is performed.
The new 'double tap' feature enables users to perform common actions on the Apple Watch Ultra 2. Credit: Apple
iPhone AI updates
The new iPhone 15 was showcased with Apple’s A16 Bionic chip – first used in the prior iPhone 14 Pro line.
It contains two high-performance cores that use less power while handling intensive workloads like streaming video and playing games.
The A16 Bionic did get a boost – Apple added a 16-core Neural Engine capable of nearly 17 trillion operations per second, greatly accelerating machine learning computations.
The boosted A16 Bionic also provides greater power for the iPhone’s camera. The chip lets the iPhone 15 camera automatically capture depth information, producing sharper images without the user having to manually turn on Portrait Mode, the iPhone feature that adds a depth-of-field effect to photos.
It also enables new features like Live Voicemail transcriptions in the new iOS 17. Voicemails are now transcribed in real time, and users can pick up the call while the caller is still leaving their message.
Earlier this week, Qualcomm announced that it will be supplying Apple with its Snapdragon 5G Modem-RF Systems chips for the 2024, 2025 and 2026 iPhones.
At the Apple event, not much was mentioned about its new Vision Pro headset, unveiled at its Developer Conference in June, except that it would ship in early 2024.
The tech giant also heralded its shift to creating what it claims are carbon-neutral products. It pledged that all of its products will be carbon-neutral by 2030.