AI-powered features of Apple iPhone 15 range, Watch Series 9
Cupertino tech giant Apple unveiled a new line-up of iPhone 15 models and the Watch Series 9 at its Wonderlust event yesterday. The new products feature improved semiconductor designs that enable advanced artificial intelligence (AI) features. These enhancements focus on everyday functions such as taking calls and capturing images. So, how do Apple's latest AI upgrades work? Let us have a look.
Watch Series 9's neural engine boosts Siri
Watch Series 9 features a new chip with improved data-crunching capabilities, including a four-core "Neural Engine" that processes machine learning tasks up to twice as fast. This Neural Engine accelerates AI functions, making Siri, Apple's voice assistant, 25% more accurate. The chip's machine learning hardware also enables a new way to interact with the wearable. Users can "Double Tap" by pinching together the thumb and index finger of their watch hand to perform actions like answering calls, pausing music, or accessing weather information.
Use the watch while walking pets
The "Double Tap" feature works by using the new chip and machine learning to detect the subtle movements and changes in blood flow that occur when a user taps their fingers together. The idea is to give users a way to operate the Apple Watch when their other hand is busy walking a pet or holding a cup of tea or coffee.
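Apple has not published how Double Tap is implemented, but the description above suggests looking for near-simultaneous spikes in motion and blood-flow signals. The following is a minimal, purely illustrative sketch assuming two hypothetical input streams (accelerometer magnitude and a photoplethysmography reading) and made-up thresholds; it is not Apple's algorithm.

```python
# Illustrative sketch only: Apple has not disclosed Double Tap's algorithm.
# We assume two hypothetical, equally sampled signal streams:
#   accel - accelerometer magnitude samples from the watch
#   ppg   - blood-flow (photoplethysmography) change samples
# Threshold values here are arbitrary, chosen for the example.

def detect_double_tap(accel, ppg, accel_threshold=1.5,
                      ppg_threshold=0.3, max_gap=10):
    """Return True if two pinch-like spikes occur close together.

    A 'spike' is a sample where both the motion signal and the blood-flow
    change exceed their thresholds at the same moment, which is the kind
    of coincidence a finger pinch would produce.
    """
    spikes = [i for i, (a, p) in enumerate(zip(accel, ppg))
              if a > accel_threshold and p > ppg_threshold]
    # Two distinct spikes within max_gap samples count as a double tap.
    return any(0 < j - i <= max_gap for i in spikes for j in spikes)


# Synthetic trace: pinch-like spikes at samples 3 and 8, otherwise quiet.
accel = [0.0] * 20
ppg = [0.0] * 20
accel[3], ppg[3] = 2.0, 0.5
accel[8], ppg[8] = 2.0, 0.5
print(detect_double_tap(accel, ppg))  # two nearby spikes -> True
```

In a real wearable, a trained classifier would replace these fixed thresholds, which is presumably where the Neural Engine's machine learning acceleration comes in.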
AI-powered image capture in latest iPhones
Apple has enhanced image capture for its new iPhones. The camera automatically recognizes when a person is in the frame and captures the depth data needed to blur the background later. Apple has long offered a "portrait mode" that blurs backgrounds to simulate a large camera lens, but users previously had to turn the feature on manually. Apple is not the sole smartphone maker to add AI to its hardware: Google's Pixel phones let users erase unwanted people and objects from images.
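The core idea behind portrait mode is simple to sketch: once the camera knows which pixels belong to a person, everything else can be blurred. The toy example below assumes a hypothetical per-pixel person mask has already been produced (the hard, AI-driven part on the phone) and applies a basic box blur to the background of a grayscale image; Apple's actual pipeline is far more sophisticated and unpublished.

```python
# Illustrative sketch, not Apple's pipeline. We assume the camera has
# already produced a per-pixel person mask (1 = person, 0 = background)
# for a small grayscale image stored as a list of rows.

def apply_portrait_blur(image, mask):
    """Blur background pixels with a 3x3 box average; keep person pixels sharp."""
    h, w = len(image), len(image[0])

    def box_avg(y, x):
        # Average the pixel with its in-bounds 3x3 neighbourhood.
        vals = [image[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))]
        return sum(vals) / len(vals)

    return [[image[y][x] if mask[y][x] else box_avg(y, x)
             for x in range(w)]
            for y in range(h)]


# Tiny 2x2 example: the top-left pixel is "person" and stays untouched,
# while background pixels are averaged with their neighbours.
image = [[0.0, 9.0],
         [0.0, 9.0]]
mask = [[1, 0],
        [0, 0]]
print(apply_portrait_blur(image, mask))
```

The article's point is that the new iPhones gather this depth and subject information automatically at capture time, so the blur decision can be made, or changed, after the photo is taken.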