Beyond Apple Intelligence: exploring the iPhone’s ‘hidden’ AI

Apple Intelligence arrived in late 2024 with a splash – and then, arguably, a crash. The initial tools aren’t all they’re cracked up to be. Notification summaries in particular are problematic, with even the BBC slamming Apple for their inaccuracies.

This might make you wary of Apple Intelligence – and of Apple suddenly jumping into the AI game. Yet AI and machine learning (ML) have quietly underpinned many iPhone experiences for years, which at least bodes well for Apple Intelligence’s future. And if you’re unconvinced, let’s explore ways in which AI and ML have already made a positive difference to the iPhone.

Improving the camera

The neural engine inside your iPhone accelerates AI and ML tasks. It’s utilized by an image processing feature called Deep Fusion every time you take a photograph. What you might not realize is that when you press the shutter, your iPhone isn’t capturing one photo – it takes several. Apple’s AI-powered technology figures out the best parts of each, and then provides you with a seamless composite.

Identifying people

The Face ID system works so efficiently that it’s easy to forget everything it does. During each unlock, your phone projects thousands of invisible dots to capture a depth map of your face. Your iPhone’s neural engine transforms that map into a form that can be compared with the face you enrolled on your device – and, importantly, can adapt to changes in your appearance.

Identifying everything else

AI has a role to play in apps too, such as Photos. The app attempts to group like objects, and allows you to ‘train’ Apple’s systems to better recognize people and pets. You’ve also long been able to extract subjects and text from images. And in iOS 18, Photos expands rapid text-based searches to cover all kinds of objects.

Intelligent suggestions

AI and ML help your iPhone learn your habits and make smart suggestions across applications. You’ll see this when Calendar offers to pull an event from Mail, when Journal surfaces data that might be useful for today’s entry, and when Siri and the keyboard attempt to predict, respectively, the next app or word you’ll want – all based on constantly evolving context.

Boosting accessibility

By drawing on AI and ML to learn about a person’s context and surroundings, an iPhone can help make life a little easier for those with additional needs. Object and sound recognition assists people with vision and hearing loss. And Personal Voice uses AI to generate a virtual voice for people at risk of losing their own.

This list is far from exhaustive – plenty of other iPhone features make great use of AI and ML. So when Apple Intelligence serves up a gibberish notification or fails to capture your prompt in Image Playground, it’s worth being patient. AI is already doing great things in millions of iPhones, and there’s huge potential for a raft of new features to transform people’s lives in yet more meaningful ways.