Apple Quietly Integrates AI for Enhanced Basic Functions in New Gadgets



Key Takeaways

  • Apple has introduced AI-powered enhancements to its latest iPhones and Watches, focusing on fundamental functions like call management and image capture.
  • The company, in contrast to competitors like Microsoft and Google, is not explicitly marketing AI but has been subtly incorporating it into core software products and hardware to enhance user experiences.

Apple’s Subtle Integration of AI in Core Functions

In a departure from many tech giants that tout grand transformations through artificial intelligence, Apple is quietly leveraging AI to elevate the fundamental functions of its new gadgets. Without explicitly using the term “artificial intelligence,” Apple has introduced a fresh lineup of iPhones and Watches, featuring improved semiconductor designs that power AI-driven enhancements. These AI features primarily focus on refining core functions such as handling phone calls and capturing superior images.


While Apple has refrained from highlighting artificial intelligence during its product launches and developer conferences, the company has been steadily integrating AI into its core software products in recent months. This understated approach contrasts with the bold AI-driven initiatives of industry counterparts such as Microsoft and Google, which have set ambitious transformational goals. At the same time, industry leaders have issued warnings about the unchecked development of AI tools such as generative AI.

Apple’s Series 9 Watch with Enhanced AI Capabilities

Apple’s latest Series 9 Watch features a new chip with improved data-processing capabilities, including a four-core “Neural Engine” that can execute machine learning tasks up to twice as fast. The Neural Engine is the component of Apple’s chips dedicated to accelerating AI functions. The AI components integrated into the watch’s chip have made Siri, Apple’s voice assistant, 25% more accurate.

Revolutionizing User Interaction with AI

Incorporating machine learning chip components has enabled Apple to introduce novel ways of interacting with the Series 9 Watch. Users can now perform actions like answering or ending phone calls, pausing music, or accessing information like weather forecasts by simply “double-tapping” with their watch hand. This innovation aims to provide users with a means of controlling their Apple Watch when their non-watch hand is occupied, such as holding a cup of coffee or walking a dog. The feature relies on the new chip and machine learning algorithms to detect subtle finger movements and changes in blood flow when users tap their fingers together.
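Apple has not published how the double-tap gesture is actually recognized, so the following is only an illustrative sketch of the general idea: treating two sharp spikes in a motion signal, close together in time, as a double tap. The function name, thresholds, and synthetic signal are all assumptions for the example, not Apple’s implementation.

```python
# Illustrative sketch only: a toy double-tap detector over a 1-D
# accelerometer-magnitude signal. Two above-threshold spikes within a
# short window count as a double tap.

def detect_double_tap(samples, threshold=2.0, max_gap=20, refractory=5):
    """Return True if two above-threshold peaks occur within `max_gap` samples.

    samples    -- accelerometer magnitudes (arbitrary units)
    threshold  -- spike height treated as a tap
    max_gap    -- max samples allowed between the two taps
    refractory -- samples to skip after a detected peak (debouncing)
    """
    peaks = []
    i = 0
    while i < len(samples):
        if samples[i] > threshold:
            peaks.append(i)
            i += refractory  # skip the rest of this spike
        else:
            i += 1
    # Any two consecutive peaks close enough together form a double tap.
    return any(b - a <= max_gap for a, b in zip(peaks, peaks[1:]))


# Synthetic signal: quiet baseline with two spikes 10 samples apart.
signal = [0.1] * 30 + [3.0] + [0.1] * 9 + [3.2] + [0.1] * 30
print(detect_double_tap(signal))  # → True
```

A production system would replace the fixed thresholds with a trained classifier and fuse additional signals, such as the blood-flow changes the article mentions, but the basic pattern of detecting paired events in a sensor stream is the same.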


Apple has also unveiled AI-powered improvements in image capture for its iPhone lineup. While the company has long offered a “portrait mode” that blurs backgrounds using computational photography techniques, users previously had to activate the feature manually. With the latest iPhones, the camera automatically recognizes when a person is in the frame and captures the depth data needed to apply background blurring afterward. This seamless integration of AI improves the photography experience for iPhone users.
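The capture logic described above can be sketched in a few lines. Everything here is hypothetical: the names `detect_person`, `capture`, and the `Photo` structure are inventions for illustration, since Apple’s camera pipeline is not public.

```python
# Hypothetical sketch of "auto portrait" capture: attach depth data to a
# photo only when a person is detected, so blur can be applied later.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Photo:
    pixels: list                       # stand-in for image data
    depth_map: Optional[list] = None   # saved only when a person is detected

def detect_person(pixels):
    """Stub for an on-device person detector (in practice, a neural network)."""
    return "person" in pixels  # toy heuristic for the sketch

def capture(pixels, sensor_depth):
    """Automatically keep depth data when a person is in the frame."""
    photo = Photo(pixels=pixels)
    if detect_person(pixels):
        photo.depth_map = sensor_depth
    return photo

shot = capture(["sky", "person", "tree"], sensor_depth=[1.2, 0.4, 3.0])
print(shot.depth_map is not None)  # → True
```

The design point the article describes is simply moving the decision from the user to the detector: the expensive data is gathered opportunistically at capture time so the blur effect remains available after the fact.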


Apple’s incorporation of AI into its hardware reflects a growing trend in the tech industry, where smartphones and other devices are increasingly leveraging artificial intelligence to enhance user experiences. Google’s Pixel phones, for instance, offer AI-driven features like object removal from images, illustrating the expanding role of AI in modern technology.