Apple Unveils ‘Neural Engines’ at iPhone 15 Event

The new S9 SiP will process data on-device instead of via the cloud, making it faster.

Ben Wodecki, Jr. Editor

September 13, 2023

2 Min Read
Image: The iPhone 15 boasts a powerful 48MP main camera and improved safety features. (Credit: Apple)

At a Glance

  • Apple plays down AI at iPhone 15 reveal, but showcases improved neural engines and machine learning chips.

This week, Apple showcased the iPhone 15, Apple Watch Series 9 and new AirPods Pro at its annual consumer devices event.

AI was not mentioned prominently, unlike at its tech rivals’ conferences. Among the AI-related news were updates to some of its chips.

The new Apple Watch Series 9 got a brand-new chip – the S9 SiP, which contains a four-core ‘Neural Engine’ that processes machine learning tasks twice as fast as the original Apple Watch Ultra.

The S9 SiP also brings on-device processing to Siri, Apple’s voice assistant. Previously, the smartwatch had to send Siri requests to the cloud for processing; the shift to on-device processing means faster responses when using the voice assistant.
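Siri’s internal pipeline is not public, but Apple’s Speech framework exposes the same on-device versus cloud distinction to third-party developers. The Swift sketch below is an illustration of that distinction, not Apple’s Siri implementation; the function name, locale and audio file are assumptions for the example.

```swift
import Speech

// Illustrative sketch only: Siri's internal pipeline is not public.
// This uses Apple's public Speech framework to show the on-device vs.
// cloud distinction described in the article.
func transcribeOnDevice(audioFile url: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        print("Speech recognizer unavailable")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: url)
    // Ask for on-device processing when the hardware supports it,
    // so the audio is never sent to the cloud.
    if recognizer.supportsOnDeviceRecognition {
        request.requiresOnDeviceRecognition = true
    }

    recognizer.recognitionTask(with: request) { result, error in
        if let result, result.isFinal {
            print(result.bestTranscription.formattedString)
        } else if let error {
            print("Recognition error: \(error.localizedDescription)")
        }
    }
}
```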

The new machine learning chip on the Watch Series 9 also enables users to perform a finger-pinching gesture – what Apple calls the ‘double tap’ – to pause music, end a call or launch an app.

The new chip enables the Watch to detect subtle movements and changes in blood flow when the double tap action is performed.

iPhone AI updates

The new iPhone 15 was showcased with Apple’s A16 Bionic chip – first used in last year’s iPhone 14 Pro line.

It contains two high-performance cores that use less power than the previous generation while still handling intensive workloads like streaming video and gaming.


The A16 Bionic brings a machine learning boost to the standard iPhone 15 line – its 16-core Neural Engine can perform nearly 17 trillion operations per second to accelerate machine learning computations.
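Apple does not expose its own camera or Siri models, but third-party apps can put the Neural Engine to work through Core ML. The sketch below is a minimal illustration, assuming a hypothetical compiled model file, of how an app asks Core ML to schedule inference across the CPU, GPU and Neural Engine.

```swift
import CoreML

// Minimal sketch: loading a Core ML model with a compute-unit preference.
// The model file passed in is hypothetical and used only for illustration.
func loadModelPreferringNeuralEngine(at url: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    // .all lets Core ML schedule work across CPU, GPU and the Neural Engine;
    // .cpuAndNeuralEngine (iOS 16+) skips the GPU entirely.
    config.computeUnits = .all
    return try MLModel(contentsOf: url, configuration: config)
}
```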

That extra power also benefits the iPhone’s camera. The chip allows the iPhone 15 to automatically capture depth information and take sharper images without the user having to manually turn on Portrait Mode, the iPhone feature that creates a depth-of-field effect when taking pictures.

It also enables new features like Live Voicemail transcriptions in the new iOS 17. Voicemails are now transcribed in real-time, with users able to pick up the call while the caller is leaving their message.

Earlier this week, Qualcomm announced that it will be supplying Apple with its Snapdragon 5G Modem-RF Systems chips for the 2024, 2025 and 2026 iPhones.

Not much was said at the event about Apple’s new Vision Pro headset, unveiled at its developer conference in June, other than that it would ship in early 2024.

The tech giant also heralded its shift to creating what it claims are carbon-neutral products. It pledged that all of its products will be carbon-neutral by 2030.


About the Author(s)

Ben Wodecki

Jr. Editor

Ben Wodecki is the Jr. Editor of AI Business, covering a wide range of AI content. Ben joined the team in March 2021 as assistant editor and was promoted to Jr. Editor. He has written for The New Statesman, Intellectual Property Magazine, and The Telegraph India, among others. He holds an MSc in Digital Journalism from Middlesex University.
