The iPhone 13 is further proof that the phone is quietly turning into an AR machine
This story is part of our comprehensive coverage of the latest Apple news.
If Apple made one thing clear during Tuesday's event, it's that the cameras and processors of these new phones are going to be a big deal. Apple's presentation even included a short film by Oscar-winning director Kathryn Bigelow and cinematographer Greig Fraser that was shot on the iPhone 13 Pro to show off its new cameras.
It's obvious that Apple is marketing the iPhone 13 Pro to photographers and videographers. But the advancements also suggest how Apple could be setting the stage for something bigger. They could lay the groundwork for what should be Apple's next big thing: a headset.
Upgrades like better cameras, more powerful processors, and additional storage options are typical of new phones. But it’s how these additions combine with other iPhone updates from the past two years that suggest Apple is setting up the iPhone to become an augmented reality powerhouse.
The iPhone could one day be the brain of Apple’s AR and VR headset
Rumors have circulated for years that Apple may be working on smart glasses that deliver augmented and virtual reality experiences. But unlike most Apple products, which typically leak in detail long before release, reports on the headset have so far painted a mixed picture.
Bloomberg reported last January that Apple is developing an all-in-one AR and VR headset for developers that runs on its most powerful chips. This headset would serve as a precursor to a more stylish pair of AR glasses, according to the report.
But a recent story from The Information, published just days before the iPhone event, describes something quite different: a headset that runs on a less powerful chip and therefore needs to connect to a host device such as the iPhone.
If The Information's report is correct, the iPhone 13 Pro certainly seems capable of serving as the host for that kind of device. Apple calls the version of its A15 Bionic chip found in the iPhone 13 Pro, which has five graphics processing cores instead of the regular iPhone 13's four, the fastest chip ever in a smartphone. Apple is pitching this phone at photo and video editors, but better graphics also likely mean better performance in AR and VR applications.
Processing power aside, all of Apple's new iPhones also benefit from increased battery life and more storage, with the Pro becoming the first iPhone to offer a 1TB option. Again, these are upgrades that would likely be needed if AR apps became more popular, and Apple certainly seems to think they will be. It makes me think Apple could be future-proofing these iPhones for a scenario in which we all use AR or VR apps on our phones almost daily. Or for when the long-rumored Apple headset finally exists.
The iPhone is gradually becoming better equipped for augmented reality
These upgrades alone don't say anything definitive about Apple's ambitions for future products. But as we've seen in recent years, Apple has clearly been improving both the iPhone's raw power and the AR capabilities that are meant to live on your phone.
Apple positions the iPhone 13 Pro's cameras – including a new Cinematic mode that automatically shifts focus between subjects – as ideal for media professionals. And Apple is probably right that this is the most meaningful use for these sophisticated cameras in the short term. But cameras that can lock onto subjects faster and more accurately would also be extremely useful for AR applications, even though Apple didn't focus on AR during its event.
In addition to upgrading the iPhone's cameras, Apple has equipped its devices with sensors that give them a much better sense of their surroundings. That's key for a technology like AR, which must accurately detect objects in the real world in order to work.
The biggest clue came last year, when Apple added a lidar scanner to the iPhone 12 Pro: a sensor that detects depth by measuring the time it takes for light to reflect off an object. Apple hasn't been subtle about how lidar can improve augmented reality on the iPhone; the company highlighted AR as one of the main reasons lidar was built into the iPhone in the first place. The iPhone 13 Pro's improved cameras combined with lidar could allow it to run even more powerful AR apps.
A year earlier, Apple had also put an ultra-wideband chip in the iPhone for the first time. The iPhone 11 introduced Apple's U1 chip, which provides far more accurate location tracking indoors than GPS. At present, the iPhone's ultra-wideband technology is mainly used to improve AirDrop and to find lost items via AirTags.
Still, this is another example of how the iPhone is becoming more spatially aware, and it could hold a lot of potential for future AR apps. AirTags already offer a first glimpse of how this technology could be used in AR applications.
A feature called Precision Finding, for example, displays prompts on your iPhone screen that lead you to your lost AirTag. It's easy to imagine how this could translate into future AR apps that overlay directions on top of the real world rather than on your iPhone screen.
And if that's not enough, Apple's upcoming iOS 15 also includes features that seem designed with AR in mind, as my colleague Scott Stein notes.
Did Apple give the iPhone 13 a better camera, more processing power, and longer battery life just for augmented and virtual reality apps? No. These upgrades are useful for all smartphone users, even those who never shoot movies on their iPhone and mainly use their device for taking pet photos and reading the news. But when you consider these updates in the context of how the iPhone has evolved over the past few years, it sure looks like there's a lot more in the works.