
Apple iPhone 16 Pro Max – Review

Having reviewed iPhones for quite a while now, it has become apparent that there are two phases to each iPhone release. The more obvious one, which detractors dismiss as incremental upgrades, covers the minor improvements of each iteration – an improved camera, a sharper or brighter display, thinner bezels, a new SoC (System on Chip) that outperforms its predecessor, and so on. There’s also the introduction of new, or rather updated, colours, and the elimination of others in a game of musical chairs that Apple plays by reintroducing colours in later years. Yet these are what the majority care about and discuss when deciding whether to buy a new phone.

The second phase is where new features are introduced that change how you use your phone, and while not always fully embraced at first, they turn into something we become comfortable with and eventually accustomed to. Take the Dynamic Island on the iPhone 14 series, where the pill-shaped area at the top of the screen becomes a shape-shifting notification hub. Before that, it was MagSafe wireless charging on the iPhone 12, and Face ID on the iPhone X. On their own, none of these were reasons to pick up a new iPhone, but when both phases combined, they provided a value offering that few could pass up.


It’s the same with the latest iPhone 16 range, and of course, the flagship iPhone 16 Pro Max. Here, Apple has increased the screen size, largely by thinning the bezels, so you get a grand 6.9-inch screen, up from the 6.7 inches of last year’s iPhone 15 Pro Max. Measuring 163 x 77.6 x 8.25 mm and weighing 227g, it is comparable to the 159.9 x 76.7 x 8.3 mm iPhone 15 Pro Max, which weighs a similar 221g. It comes in four titanium finishes – Black, White, Natural and Desert – though we’re just going to call them as we see them: black, white, silver and pink, the last being our review unit, and the one my children refer to as the new, pink iPhone.

On the inside is the new A18 Pro chip, which promises faster and better performance, and you can pore over all the benchmark comparisons out there, but the key thing to know is this – unless you’re stressing the phone by shooting 4K ProRes video at 120 fps, editing high-quality videos on the go, or playing graphics- and performance-intensive games, you’re unlikely to tax the device to its fullest.

Here are Geek Culture’s gaming tests on both the iPhone 16 and iPhone 16 Pro Max, where we ran both devices through a selection of high-performance, graphically intensive games, and as you can see, both phones carve through these hit titles like a hot knife through butter.

As for the 4K ProRes capabilities, that’s the new camera array at work, with a 48MP Fusion Camera that has a faster quad-pixel sensor for reduced shutter lag, a new 48MP Ultra Wide, up from the 12MP on the iPhone 15 Pro Max, and a 12MP Telephoto. With it also comes a selection of new Photographic Styles, where users can apply Apple filters to enhance a photo after it has been taken, and further tweak the hues and contrast of each image with a digital touchpad by dragging a dot around.

But the crown jewel here is the new Camera Control button, located on the right edge of the phone, below where the power button sits. Rotate the device to the left, and the shutter button ends up on the top right edge, where most users expect a camera shutter button to be. In either horizontal or vertical orientation, a single click takes you directly to the camera, and in video mode, a click starts recording. But what’s amazing is that this button also has a capacitive touch sensor, so a light press – resting your finger on the button without fully clicking it – opens up a selection of secondary control options.

The first light press brings up a selection of preset styles, from Standard to Vibrant, Natural, Luminous and so on, each offering a subtle tweak that gives the image a different look and feel, much like a more refined Instagram filter. A second light press gets you to the secondary controls, where Camera lets you switch between 0.5x, 1x, 2x and 5x zoom; Zoom gives you a slider for more precise control when zooming in and out; Depth and Aperture let you adjust exposure settings, while Tone adjusts the brightness. The distinction between a full click and a light press on the Camera Control might seem confusing at first, but once you realise it’s a capacitive touch sensor, it’s rather intuitive, and you’ll catch yourself holding the device horizontally with both hands, adjusting camera settings more efficiently without having to navigate the on-screen touch controls.

What’s more interesting, though, is that Apple has released the iPhone 16 range while heralding what’s to come for owners. The Camera Control button does not have these features now, but a future rollout will introduce a two-stage shutter that lets you automatically lock focus and exposure, as well as one-touch access to Apple’s upcoming Visual Intelligence object identification.

As a general rule of thumb, you should not buy something on a promise of the future, because there’s never a guarantee of success. Take the current excitement over AI, or artificial intelligence, where brands and software developers are asking consumers to update and upgrade their hardware for something better – advanced, machine-supported operations that can help improve the way you write, draw and design. Brands are touting it, but the promised performance has yet to be realised.

Then there’s Apple’s latest iPhone 16 series and the promise of AI. In this case, though, it’s Apple Intelligence, an upcoming artificial intelligence platform whose abbreviation has seemingly been co-opted by the American company to match the more popular nomenclature. First revealed at Apple’s Worldwide Developers Conference in June, Apple Intelligence is a set of functionalities and features that makes use of on-device and cloud computing to mimic human intelligence in the form of writing, transcription, image creation, advanced search and virtual assistant support. It’s coming to the Apple family of iPhones, iPads and Macs, starting with the iPhone 16 series of devices… except that while the devices are here, the full suite of Apple’s AI isn’t available just yet.

This portion of the article has been updated to reflect the introduction of Apple Intelligence from October 2024.

The rollout of Apple Intelligence started with iOS 18.1 in October, alongside iPadOS 18.1 and macOS Sequoia 15.1. It launched first in U.S. English, before expanding to include localised English in Australia, Canada, New Zealand, South Africa and the U.K. in December. By next year, Apple Intelligence will expand to include more languages, including Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, Vietnamese, and more.

Does this mean that Singapore users can only access Apple Intelligence in 2025? No, as these dates refer to language support. If your language is set to US English, Apple Intelligence will work on your device now. The inclusion of Singapore English simply uplifts the Singapore context and picks up on spoken English in Singapore, as we’re known to say “one, two and tree” instead of three, or mudder and brudder instead of mother and brother, because ‘th’ is a challenge to pronounce. Only time (by April 2025) will tell if it will recognise ‘tew-tion’ as tuition, ‘meh-nets’ as magnets, and “wait a-wow” instead of wait a while.

And while AI is not the key selling feature of Apple’s 2024 range of devices, it is perhaps the most exciting one in what we earlier described as the second phase of an iPhone launch, as it’s a feature that changes the way you interact with and use your iPhone, though not the first reason for buying a new model. Apple is not the first to introduce artificial intelligence on mobile devices, but being first means little, as those who got there earlier haven’t made a significant enough impact to draw consumers to their hardware.

To turn on Apple Intelligence, you first have to opt in under the phone’s Settings, where you’ll be placed on a short waitlist as the phone downloads an initial file of over 3GB. Make sure your phone’s language is set to English (US) and you’re all set, as the phone introduces you to a world of advanced features, including Writing Tools, which can help you proofread, edit, rewrite and summarise text. The tool can rewrite a short note in a friendly or professional manner, and if you feel you’re rambling, it can also make it more concise.

A new app, Image Playground, gives you the ability to generate new images, either by describing what you want in text, using photos from your phone’s Photos gallery, or a combination of both. Say you have a self-portrait and want an avatar in a fantasy setting or a winter theme – the app can make one based on your photo, and you can then add a description, say, that you’re dressed in a suit. Meanwhile, Clean Up in Photos lets you edit your images by removing unwanted objects or people, with AI filling in the gaps the removed items or persons leave behind.

Writing Tools turned out to be the most commonly used feature, because with the tap of a button, you can easily have what you wrote proofread and rewritten in a more natural or professional manner. This can be done in a Notes document, or in an email in the Gmail app. It won’t work everywhere, though, as support depends on app developers – you can deploy Writing Tools while typing a message in the Telegram messaging app, but it’s not available in WhatsApp, as the Meta-owned equivalent has its own AI-enhancement offering.

Image Playground is fun to play with for some image generation, but it does have limits – the art style is restricted to a 3D render that looks rather cartoon-like, and even with several prompts, any generated image featuring a person will always have that person as the main subject of the final image. You can also sketch something out and move the sketch to Image Playground along with a description, for the app to generate an animated render. What worked well is how closely the app followed my sketch, keeping the mountain, tree, building and main subject where I had placed them. With another image generation app on a competing platform, the render placed the four elements haphazardly, without reason.

There are also a few noticeable drawbacks. Apple’s implementation does the processing on the device rather than in the cloud. This ensures that your data remains your own, instead of being fed into a learning model to beef up the company’s AI capabilities, but without the power of cloud processing, some things don’t work as well. For example, having it read a paragraph of text will not be as smooth, and if an image is filled with multiple items to clean up, the final result will show image tears and plenty of artifacts.

And how would a user know which tasks are AI-supported? Apple Intelligence uses a graphical hypocycloid to represent AI, so when you see it appear in a menu, you can tap on it to establish what types of AI can be used, such as with Image Playground. When it comes to Writing Tools, however, this icon may not always appear even though the Writing Tools option does, so you have to be familiar with the feature to know what it’s for. The same goes for Clean Up in the Photos app.

Another hiccup is with the standalone Image Playground app. Apple Intelligence is activated on both my iPhone 16 Pro Max and iPhone 15 Pro Max, but for some reason, the app cannot be installed on the newer iPhone 16 Pro Max. The feature is still accessible, though, as users hitting this bug can simply open the Freeform app and enter Image Playground from there – a simple, if additional, workaround. On the iPhone 15 Pro Max, it appears as a standalone app icon.

More features and support for additional languages will roll out throughout the year.

GEEK REVIEW SCORE

Overall: 9.2/10
  • Aesthetics - 9/10
  • Build Quality - 9/10
  • Performance - 9.5/10
  • Value - 9/10
  • Geek Satisfaction - 9.5/10