The return of college and NFL football may mark the start of fall for many people, but in the tech world, it’s Apple’s annual September launch event that denotes the new season. Every year, the company trots out its latest iPhones and wearable devices, like Apple Watch and AirPods, and waxes poetic about its latest hardware advances.
This year, however, things were a bit different. Sure, Apple did unveil a family of iPhone 16s, the Apple Watch Series 10 and several new AirPods models, but the real focus was software, specifically the company’s Apple Intelligence AI features. First announced at the WWDC developer conference in June (see “Apple’s WWDC showcases AI to make daily tasks easier” for more), Apple Intelligence provides a number of generative AI-powered capabilities designed to make using your iPhone (any of the new iPhone 16s or last year’s iPhone 15 Pro and Pro Max) better and more intuitive.
Chief among these capabilities is a new version of Siri designed to be much smarter and more relevant than earlier versions. You can use it to find information on your phone more easily, such as when you can’t remember whether a recipe was sent to you in a text or an email or included in a shared document. Siri can also “see” and understand what you’re doing on your phone and offer useful guidance on next steps. And if you want to know how to take advantage of a particular feature, such as turning on a Personal Hotspot or taking a panoramic photo, Siri now has that knowledge built in.
The Siri experience has also changed visually, with a colorful glowing light around the edge of the screen whenever Siri is engaged. Apple also added the ability to type to Siri if you’d rather not speak and, if you do speak, made Siri better at understanding your requests, even if you change things midstream.
On the photos side, Apple has added a Clean Up feature that lets you easily remove an element (or person) from your photos, similar to what Google has offered on its Pixel phones for more than a year now. One unique Apple feature is the ability to create custom Genmoji simply by typing a description; this is bound to be a hit, especially among enthusiastic emoji users.
Like other AI offerings, Apple Intelligence includes tools for creating animation-style images from a text prompt or a simple drawing. Apple has also enhanced text-based search of your images (simple requests have been possible on iPhones for a while) to make it easier to find exactly what you’re looking for. In addition, you can ask your iPhone to build a custom slideshow from a text description, giving you more control over that capability.
On the text side of things, Apple offers text generation for documents and email, the ability to change the “tone” of your writing, text summarization and more. Again, most of these features are available via other cloud-based AI large language models, but by integrating them into the iPhone, Apple will bring them to many more people.
One of the cleverest text-based capabilities Apple introduced summarizes and prioritizes your emails in the Mail app and the notifications on your home screen. So instead of seeing just the first two lines of an email or notification, you quickly get a summary of the crucial information, presented in priority order. For people who deal with lots of emails, texts and other notifications, this could prove to be a huge timesaver. It’s also exactly the kind of simplifying advance many people hoped Apple would bring to the world of AI. Other companies will probably replicate it quickly, but in these early days of AI, improvements in how features get implemented are important.
The real question about Apple Intelligence, however, is whether it will be compelling enough to get owners of older iPhones to upgrade to one of the newer AI-capable models. From my perspective, the jury is still out, for several reasons. First, many of these capabilities will roll out over the course of several months, and they’re initially limited to English, so they won’t pay off immediately for everyone, and the language restriction will slow adoption around the world.
Second, as people get used to using some of these AI capabilities on the device, they may want to start using some of the more advanced capabilities offered by cloud-based providers. Thankfully, Apple has a free-of-charge arrangement with OpenAI’s ChatGPT that lets people start investigating those more advanced options. Still, many owners of older iPhones may find that some cloud-based services are good enough (or even better for some applications) and won’t feel compelled to upgrade to a new iPhone. In Apple’s defense, many of its Apple Intelligence features run directly on the iPhone and don’t send any data to the cloud, which improves privacy, though it seems many people care less about privacy than you might imagine.
Ultimately, the Apple Intelligence capabilities are an important first step, especially given the huge audience of iPhone users, but they are only a beginning. AI-based features will not change things overnight, and even Apple acknowledges that integrating these capabilities will take years. Still, if you’re eager to begin exploring AI, Apple is providing a broad range of AI-capable phones and some genuinely useful AI features to get that process started.
USA TODAY columnist Bob O’Donnell is president and chief analyst of TECHnalysis Research, a market research and consulting firm. You can follow him on Twitter @bobodtech. The views and opinions expressed in this column are the author’s and do not necessarily reflect those of USA TODAY.