Issue #53: Apple and Meta’s Latest Mixed Reality Moves
Howdy 👋🏾. Apple surprised everyone by prominently featuring the Apple Vision Pro at its Worldwide Developers Conference several weeks ago. If you follow Apple, there is plenty of reason for concern. Early rumors indicated Apple was cutting its shipment forecasts, initial buyers reported high return rates, and several reviewers, myself included, found the device lacking killer apps.
However, there was one huge positive: the importance of competition. Meta responded by releasing numerous updates to its Meta Quest devices, reviving forgotten or abandoned features to close the gap. Meta also announced a significant reshuffling of its Reality Labs division into two sub-units: one focused on wearables and the other on the metaverse. The metaverse division aims to build out its VR platforms, like the Horizon social platform and VR headsets, while the wearables team is set to capitalize on the unexpected early success of the company's Ray-Ban glasses.
I love the Ray-Bans and have been quite happy with the constant release of features, including adding AI tools to detect objects. Unfortunately, they do not accommodate my prescription for the frames, but I’m hopeful the next set, rumored to include a Google Glass screen, might.
With this new view of the evolving mixed reality market, I downloaded visionOS 2 beta 2 onto Apple Vision Pro before a quick work trip to New York to see how the system-level changes to the device hold up. Special thanks to Mindgrub for letting me hold on to this headset a little longer.
But before we dive in, a word from my sponsors and my thoughts on tech & things:
🤝 This week’s newsletter issue is proudly sponsored by:
If you are looking to find amazing people, contact Baird Consulting.
⚡️Anthropic continues to update its AI models with a new release it says matches GPT-4o. What I'm most excited about is a very cool concept called Artifacts. Artifacts give you a shared document space where you and Claude can work on something together, and this sounds like a killer feature!
⚡️Apple is buckling down for a huge lawsuit and fight with the EU over the DMA and has now stated that certain features, including Apple Intelligence, will not launch in the EU this year, and it hasn't said when they will. Folks have varying views on whether this is a threat from Apple or a genuine issue with a rule that requires interpretation.
Beta means beta, and it’s not uncommon for Apple’s early beta releases to have significant bugs. You never know when a release might cause major issues. Of course, as soon as I got back from camping and had a stable internet connection, I installed all the betas on my TV, watch, phone, laptop, and Apple Vision Pro. Aside from some issues with macOS, I’ve been pleasantly surprised with how complete the betas have felt.
At WWDC, Apple listed an exhaustive array of new Apple Vision Pro features. Some of these are immediately available in the betas, while others, like multi-view, which allows users to watch up to 5 soccer or baseball games simultaneously, are still in development. As I use and test these features, I’ve kept my eyes on how well they address some of my biggest complaints. Here’s what I think so far:
Eye Tracking
Most of Apple Vision Pro's interface works by tracking what you're looking at, letting you interact with an object by looking at it and pinching your fingers together. I found this frustrating because your eyes drift, or your focus jumps to the next target before you realize it. While Apple hasn't announced changes here, I've noticed fewer issues selecting dialogs or smaller elements since installing the beta. It's one of those odd experiences: an interaction that once felt frustrating suddenly doesn't, and you can't quite pinpoint why. I'm glad to see this improving, and I'm curious whether others have noticed the same.
Mouse Support
Apple Vision Pro supported Bluetooth keyboards and trackpads at launch but not mice. I found this odd, though I didn't expect it to be much of an issue. The problem is that many third-party trackpads present themselves to the system as mice, so my portable keyboard-and-trackpad combo didn't work. I'm happy to say these devices now pair and work without issue.
Hand Gestures and Notification Center
Some of the early ideas for how you launch the home screen with apps, access the notification center, or dismiss notifications felt a bit unpolished. The introduction of new hand gestures is a very cool way to fix that. Instead of tapping the digital crown to bring up the home screen with apps, you can now hold your hand out palm up, causing a floating icon to appear. If you complete an “O” with your fingers, Apple Vision Pro will launch or close the home screen without requiring you to touch the device. Turning your hand so the back faces up while holding your fingers together transforms the floating “O” into a display that shows the time and volume, which can easily expand to become the notification center.
It took some trial and error to get used to these gestures, but once they became comfortable, they felt natural. They also kept me from constantly touching the device, which sometimes forced me to readjust the headset after the downward pressure of a tap shifted it.
Screen Sizes
When sharing my laptop's screen, new settings appear that offer many additional resolution choices. I love having a large rectangular display that gives me more space to play, and it still connects and works with ease. One promised but still missing feature is keyboard passthrough for external keyboards, which keeps your physical keyboard visible even after you move into an immersive environment.
Additional Environments
Apple introduced additional immersive environments, including Bora Bora, and I'm already in love with working from the beach. It's perfect for evenings on my patio, where I can recline, sit on a virtual beach, and listen to the waves hit the shore. It's so relaxing that it feels like I'm in a Corona beer commercial.
Handling Darkness
Darkness remains an issue, but tracking is better, and in my testing, Apple Vision Pro does a better job of maintaining its position at night. On my patio, as the evening became darker, I received warnings that tracking would become difficult, but it managed to pick up my hands until things got closer to pitch black.
Overall, the updates, while minor, go a long way toward making the device feel complete. This beta reminds me of my experience with the very first Apple Watch, now known as the Series 0. It was an okay device, but operating system updates fixed issues over time, and that iteration turned an okay device into a good, even great, one. Apple Vision Pro feels set on the same journey, and I can already tell you that the visionOS 2 beta moves it a few notches above okay.
If you like this content, please share it with your coworkers and friends. Also, this is a reminder that I'm looking for newsletter sponsors and that I'm available as a fractional Chief AI Officer or technical consultant.
-jason
p.s. I really love Perplexity AI. I use DuckDuckGo as my default search engine, but that's mostly muscle memory from searching in the browser's URL bar. For anything that feels remotely complicated, I hit up Perplexity AI. Wired has a great article on Perplexity not only creating BS but also indexing content it explicitly says it won't.
If that wasn’t enough, Perplexity indexed Wired’s article (which blocks AI bots) and created an article about itself using the content of the article! You can’t make this stuff up.