iPhone 16 Pro long-term review: While Apple Intelligence underwhelms, Camera Control fits right in
When we reviewed the iPhone 16 Pro last year, Apple Intelligence was barely available. Since then, the iPhone 16 series has benefitted from several new features, apps and improvements. Some (or most) of them were Apple Intelligence features that were teased back at WWDC 2024, months before the iPhone 16 Pro launched.
AI features weren't the only changes this time around, with the iPhone 16 getting an entirely new button. The so-called Camera Control wasn’t just a simple app shortcut, but an elaborate multifunction button that offered a haptic half-press and the ability to swipe across to adjust camera settings and options.
Managing Editor Cherlynn Low said we were still “waiting on Apple Intelligence” in our initial review. Now, as we hit iOS 18.4, was it worth the wait?
iOS 18 and Apple Intelligence so far
Apple Intelligence was late, arriving as part of iOS 18.1 back in October 2024. Initial generative AI features included writing tools like proofreading and rewriting, as well as text summaries and live transcription for phone calls and voice notes in the Notes app. A few months later, iOS 18.2 gave us the Apple Intelligence features that made most of the headlines when first announced at WWDC: ChatGPT integration, AI image generation in Image Playground and Genmoji.
One of the most controversial Apple Intelligence features, which arrived with iOS 18.1, is actually my favorite: Notification Summaries. It’s a good attempt at taming the wild west of countless group chats across multiple messaging apps, calendar reminders, Substack pings and everything else.
You don't need to update every individual app for Notification Summaries to kick in. As soon as iOS 18.1 landed, my messy WhatsApp chats were streamlined. I thought it was cute when my iPhone told me a friend had laughed at my comment and suggested a date and place where we could meet, all distilled from a ten-message barrage they’d sent.
It’s not perfect, though. Apple had to clarify that notification summaries are AI-generated, and make that clearer beyond a small Apple Intelligence icon, after the BBC complained about multiple summaries that twisted the content of some of its headlines. I’ve also had summaries that incorrectly guessed the subject of a sentence or the entire topic of a thread, but on the whole, it’s a useful application of AI smarts.
There are more natural language hooks across most of iOS 18, too. You can now search for images in the Photos app using descriptions, dates, locations and more. Those natural-language smarts take on a different function with writing tools, courtesy of Apple Intelligence. Even though I’m not a particularly clean writer (hey, Cher), I haven’t found these writing tools all that useful. I rarely reach for them beyond quick email responses to work requests and events, but the ability to proofread, rewrite or check tone may prove useful to some.
The writing tools also work inside the Voice Memos app, which can now transcribe conversations, meetings and more. Here, they help make things more concise, with options to turn transcripts into summaries, key points, lists and even tables. If the recordings are clear enough and not too long, iOS 18 handles these transcriptions well. Several times, though, the iPhone 16 Pro straight-up declined to summarize a transcript. Why? All I got was a notification that the tools “aren’t designed to work with this type of content.”
Elsewhere, Apple struggles to catch up with AI innovations on rival devices. Like Google Pixel’s Magic Eraser (and all the other Android riffs), Clean Up now allows you to scrub out distracting elements and photo bombers from your pics. The results aren’t quite up to the standard of the competition, though sometimes it nails it.
Some Apple Intelligence features have faded into the background after an initial buzz. Image Playground offers the ability to AI-generate your own images, and plenty of users were itching to get off the beta waitlist to play around with the app. After a bit of initial testing, however, I haven’t used it in the months since it launched.
There might be a future for Playgrounds within iOS, though. For instance, Apple’s Invites app lets you embed Image Playground results within your events, which is helpful if you don’t have a photo to illustrate the invitation. As a standalone app, however, it doesn’t quite deliver enough to live on my home screen. It’s gone the way of GarageBand, Pages and Apple Maps on my iPhone.
Genmoji, on the other hand, is easier to use, and I use it often. With it, you can create your own emoji reactions with specific people, objects and backgrounds. I already have several established favorites, like me eating cereal and a chronically late friend with clocks in the background, and I’ll probably continue to make emoji as life demands it. For example, a passenger behind me on a flight to Barcelona last month had three cats with her. So, I made a cats-on-a-plane Genmoji. Exactly what Tim Cook intended.
Some software features are also specific to the iPhone 16 Pro series (and the 15 Pro). Beyond its camera duties, Camera Control also offers access to Visual Intelligence with a long press. Visual Intelligence is Apple’s take on Google Lens, tapping visual AI smarts to analyze whatever your iPhone is pointing at. It can recognize text, like words on menus, and even translate for you. If you get lucky, it’ll even identify the outside of a restaurant and (with some ChatGPT power) tell you the opening hours and what kind of cuisine it offers. It’s particularly effective at identifying landmarks, but busy scenes can quickly derail it. Unfortunately, you can’t tap on a particular object in the frame to clarify specifics. However, the ability to create calendar events from a poster is pretty cool, even if I usually forget to use it.
That’s Apple Intelligence, for now. There are a lot of smart touches, but so far, it isn’t remarkable. I do credit Apple’s approach of running most of its AI features on device or through the company’s Private Cloud Compute, which uses larger server-based models. Apple has reiterated that your data is never stored during these requests. Even with ChatGPT integration, if you don’t use an OpenAI account, only your request and attachments are sent to ChatGPT; your Apple Account and IP address are not shared with OpenAI. Apple’s deal with OpenAI means the latter can’t use your requests to improve or train its models, either.
The patchy arrival of iOS features, especially Apple Intelligence, isn’t a good look for the company.
However, the rest of iOS 18 continues to deliver valuable new features and upgrades to the iPhone experience, like the aforementioned Invites app, additional content (and games) for News+ subscribers and my favorite feature since AirPods gained noise cancellation: the ability to nod or shake your head to halt Siri announcements, dismiss Fitness prompts and even decline calls. I’m a busy guy!
Cameras and Camera Control
With Camera Control, Apple has introduced its most intriguing interface change since the short-lived 3D Touch. That tech, if you forgot, offered pressure-sensitive screen presses with haptic feedback during the iPhone 6s era. Camera Control acts like a proper manual camera button, even though I know it combines a physical button with elaborate touch sensors. This enables deeper controls by swiping across it or half-pressing.
Out of the box, it works as a basic camera app launcher, just like the iPhone’s Action button has done in recent years. It’s better placed, though: lower on the right edge of the phone, ready for your thumb when the phone is held vertically or your index finger when it’s held horizontally. And it goes beyond being a launcher. Swiping across it lets you adjust zoom and exposure and even toggle the new Photographic Styles, while a half-press confirms your settings.
Depending on how you use the iPhone’s camera, a lot of the settings might not deserve their place within Camera Control’s menus. While I often tinkered with exposure and zoom, Photographic Styles are easier to adjust the normal touchscreen way. I’m also not going to meddle with simulated f-stops when taking candid shots of my friends and family.
Like Apple Intelligence, Camera Control launched incomplete. When the iPhone 16 Pro first went on sale, it lacked a half-press focus like ye olde traditional cameras, and it took until January 2025 for an AF-AE lock to arrive with iOS 18.3. Even now, the feature remains buried in settings and has to be toggled on.
It works well, though. You tap on an object you’d like the iPhone to focus on, hold the button halfway, and it’ll lock exposure and focus for as long as you keep your finger down, just like a camera. It’s frustrating that we had to wait this long for what seems like a core function of Camera Control.
To play devil’s advocate (and it’s an argument regularly leveled at AI features, both Apple Intelligence and elsewhere), did you need Camera Control? Given how much I use my phone’s cameras, I’d argue that a camera launcher is worth factoring into the hardware. But the Action button already covered that.
Camera Control also seemed like a ‘pro’ iPhone feature, so it’s surprising to see it across the entire device lineup, barring the more recent iPhone 16e.
Another change worth noting: the iPhone 16 Pro got camera parity with the larger Pro Max this year, including the 5x optical zoom I use most of the time. Of course, this isn’t a remarkable upgrade if you were already using an iPhone 15 Pro Max. But if you prefer the smaller of the two Pro options, aside from battery life, there’s nothing stopping you from getting the cheaper iPhone 16 Pro.
Repairability and longevity
I haven’t had to repair my iPhone 16 Pro so far, and there are no pronounced scratches on the 6.3-inch screen or body. While I am a case dweeb, I don’t use screen protectors; I haven’t needed to. The latest iPhones do come with even more repair-friendly hardware and policies if the worst were to happen. Anecdotally, the titanium body and Apple’s Ceramic Shield treatment on the display do seem to make this generation of ‘pro’ iPhones tougher than its predecessors.
Apple’s new Repair Assistant, designed to address parts pairing issues, lets both you and repair professionals configure new and used Apple parts directly on the device, with no need to call Apple personnel to ensure iOS plays nicely with new parts.
Another improvement, though it takes a different form on the 16 Pro, is battery removal. On both the iPhone 16 and 16 Plus, the battery can be released from its enclosure by running a low-voltage current through the iPhone’s battery adhesive. The iPhone 16 Pro’s battery, meanwhile, is now encased in aluminum, making repairs less fraught and better protecting the battery when it’s exposed to repair tools. I haven’t had to put any of this to use yet, but the company’s increased willingness to embrace right-to-repair is headed in the right direction, especially after its sluggish response in the past.
Six months on, the 16 Pro’s slightly bigger battery is holding up. I am a heavy phone user, and 256 charge cycles later, iOS still reports 100 percent capacity, which I find impressive. If I ever need to replace the battery, I’m heartened that it’s easier than ever on an iPhone.
Wrap-up
The iPhone 16 Pro is one of the best smartphones available. But if there’s a particular area where Apple’s phones are lacking compared to the flagship competition, it might be the messaging and the marketing. It took too long for Apple Intelligence to land on devices. If Apple Intelligence hadn’t been so key to Apple’s presentation both at WWDC and the iPhone 16 launch event, the delay wouldn’t look so bad.
Intriguingly, we’re at a time when the likes of the Galaxy and Pixel series have never felt more like iPhones. Or do iPhones feel like Android phones? I’m not sure anymore. Either way, we haven’t seen rival devices mimic the Camera Control button.
I appreciate that a lot of the new features and additions don’t seem to clog up the iPhone experience. Don’t care for camera filters? You’ll rarely see them. Want to swap the Camera Control button for another function? Go ahead. Want to prioritize Messages notifications, but not WhatsApp messages? Go wild.
With a light, strong titanium build, there’s still a tangible premium feel to the iPhone 16 Pro compared to the aluminum iPhone 16. The same can be said of the cameras, with a 48-megapixel ultrawide sensor and 5x optical zoom the base iPhone can’t match. Functionality-wise, however, the base iPhone 16 now delivers the Dynamic Island and Camera Control, which makes a ‘pro’ iPhone a little harder to define. For the iPhone 16 series, it boils down to more premium materials and a powerful zoom camera.