
I’m torn on the iPhones 16’s Camera Control – it’s handy but unfinished


If you’ve read my previous thoughts on iPhones here at TechRadar and its sibling site Tom's Guide, you’ll know I have fairly firm opinions on Apple’s smartphones.

Since moving from Android to iPhone at the end of 2021, I’ve not gone back to the platform Google built, despite trying some of the best Android phones. The ease of iOS has won me over: I love the titanium construction, I’ve found Ceramic Shield glass to be a minor game changer, I enjoy the Action button, and the cameras almost never let me down.

But for once, I’m on the fence.

What’s got me pondering is the Camera Control ‘button.’ In some ways, it’s a cool new feature that uses haptics well. In other ways, it’s superfluous and not fully featured.

I’ve been trying out the iPhone 16 Pro Max for a couple of weeks now, and when it comes to capturing a photo, I try to use Camera Control as much as possible. As a 37-year-old millennial, I still like snapping photos on my phone in landscape orientation, so having a physical button where my finger naturally sits makes it easy to capture a shot without messing up the framing by tapping on the screen or reaching for the Action button – I have that mapped to trigger the ‘torch’ anyway, which is surprisingly helpful.

I also like flicking through zoom ranges with a swipe on the Camera Control rather than tapping on small icons. The exposure control is kind of cool, though switching between the functions Camera Control can adjust doesn’t quite feel intuitive to me yet, and my taps often cause me to lose the precise framing of a scene.

So yeah, Camera Control is interesting. But…

Did anyone really ask for it? It feels like a feature created so Apple’s mobile execs would have something new to talk about at the September Apple event. It’s a ‘nice to have’, but it’s hardly a phone photography game changer.

Not my tempo

Apple iPhone 16 Pro Max Hands on

(Image credit: Future / Lance Ulanoff)

Maybe I’ll warm to it over time. The bigger issue, though, is the lack of AI tools for Camera Control at launch. Apple actively touts AI features for Camera Control that can smartly identify what the cameras are pointed at and serve up all manner of information. Those haven’t arrived yet, with a rollout coming post-launch once Apple Intelligence fully arrives; there’s a beta option, but I’m not willing to try that on my main phone.

I’ve yet to understand that decision. Sure, other phone makers have touted AI features that arrive after their phones are released and may initially be limited to certain regions, but at least those phones launch with some of the promised AI suite. The iPhone 16 range launched without any Apple Intelligence features at all.

This is not what I expected from Apple, a company that famously doesn’t adopt new tech until it’s refined and ready for prime time. So, for Apple to launch smartphones without their headline next-generation smarts is baffling to me. It’s also the primary reason I feel torn about Camera Control; if it had launched with Google Lens-like abilities baked into a hardware format, I can see myself being a lot more positive about it.

Of course, Apple's use of such a camera button will undoubtedly cause other phone makers to follow suit. I only hope they don’t skimp on features when their phones launch.

As for Camera Control in the here and now, I’ll keep an open mind and keep using it; I’ll just cross my fingers that it'll become seriously handy once it gets its prescribed dose of AI smarts.
