Issue #36: Building websites and apps for Spatial Computing — Jason Michael Perry

Howdy 👋🏾, Apple Vision Pro reminds us how flat our virtual worlds are compared to the 3D objects of reality. Our interfaces remain stubbornly flat no matter the application – Word, email, or a browser.

At launch, Apple Vision Pro supports just over 600 native apps. However, the majority I’ve used, even Apple’s own, are simply iPad or iOS applications. The thing is, existing iOS and iPad apps generally just work on Apple Vision Pro, but few truly take advantage of the device’s immense possibilities.

Let’s start with the basics. Any existing iOS or iPad app can run with no adjustments, and these unmodified apps share some common traits:

  • The aspect ratio is locked to iPhone or iPad, limiting how windows resize and move.
  • A floating icon lets you rotate between portrait and landscape modes.
  • The interfaces keep their opaque iOS/iPadOS 17 designs, so they look like giant ported apps.
  • Many apps are touch-first and struggle with eye-tracking controls; you see this in hover behaviors and densely packed interactions.
  • Some touch gestures have no look-and-pinch equivalent, leaving you stuck or confused – a few apps are nearly impossible to use without a paired trackpad.

iPad apps generally fare better than iPhone ones, since iPad interfaces already assume more screen space. And of course, no work is required to get App Store apps onto the device – even if developers did nothing, their apps are available.

This is likely no surprise. For years, Apple has pushed cross-platform app development, letting a single codebase target iOS, iPadOS, visionOS, and macOS. Now that Macs share processor architecture with Vision Pro and iPhone, the real work of delivering a quality experience is the time spent on app design.
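To make that concrete, here’s a minimal sketch (the app and view names are mine, not from any real project) of a single SwiftUI codebase that builds for iOS, iPadOS, macOS, and visionOS, with one visionOS-only tweak behind a compiler flag:

```swift
import SwiftUI

@main
struct CrossPlatformApp: App { // hypothetical app name
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        #if os(visionOS)
        // Only visionOS gets this roomy default window size;
        // the exact same code still builds for iOS, iPadOS, and macOS.
        .defaultSize(width: 900, height: 600)
        #endif
    }
}

struct ContentView: View {
    var body: some View {
        Text("One codebase, four platforms")
            .padding()
    }
}
```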

The easiest way to bring an existing app to Apple Vision Pro is to open the project in Xcode and check the box that adds Apple Vision as a supported destination.

This defaults the app’s look to visionOS’s translucent materials, providing depth and making windows feel like frosted panes of glass rather than opaque objects. Since people and rooms sit behind your interface, translucency lets the environment peek through and lets windows respond to real-world lighting.
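In SwiftUI terms, that frosted-pane look is a single modifier. A minimal sketch, with a view name of my own invention:

```swift
import SwiftUI

struct GlassPane: View { // hypothetical view name
    var body: some View {
        VStack(spacing: 12) {
            Text("Hello, Vision Pro")
                .font(.title)
            Text("Windows read as frosted glass, not opaque slabs.")
        }
        .padding(40)
        // visionOS-only modifier that backs the view with the
        // system glass material, which reacts to the room's lighting.
        .glassBackgroundEffect()
    }
}
```

The modifier only exists on visionOS, which is one reason cross-platform code often wraps visionOS styling in #if os(visionOS) blocks.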

Of course, you can do more. While most apps I used remain anchored 2D canvases, treating the interface as a volume gives you a bounded 3D space that breaks out of the flat frame and can hold objects.
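Here’s a rough sketch of what that looks like in SwiftUI – a volumetric window holding a 3D model (the app name and the “Globe” asset are hypothetical placeholders):

```swift
import SwiftUI
import RealityKit

@main
struct VolumeDemoApp: App { // hypothetical app name
    var body: some Scene {
        WindowGroup(id: "globe") {
            // Model3D loads a 3D asset (e.g. a USDZ file named
            // "Globe" in the app bundle) and displays it in the volume.
            Model3D(named: "Globe")
        }
        // A volume is a bounded 3D window rather than a flat pane.
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```

The half-meter default size is arbitrary; volumes are sized in physical units because they occupy real space in the room.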

Applications that support an immersive space take full control of the environment, transporting you somewhere else – like the Disney+ environments that place you on Star Wars’ Tatooine or the Scare Floor from Monsters, Inc. While amazing, few apps truly lean into the magical potential of walking up to and spinning a 3D globe, as in Carrot Weather, or placing objects on a table. I hope more developers reimagine what’s possible as they get hands-on with Vision Pro.
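Structurally, an immersive space is just another SwiftUI scene that the app opens on demand. A rough sketch, assuming a hypothetical scene id and view names:

```swift
import SwiftUI
import RealityKit

@main
struct ImmersiveDemoApp: App { // hypothetical app name
    var body: some Scene {
        WindowGroup {
            LaunchView()
        }

        // A separate scene the app can open to take over
        // the user's surroundings entirely.
        ImmersiveSpace(id: "environment") {
            RealityView { content in
                // Build the surrounding scene here, e.g. load
                // environment entities and add them to `content`.
            }
        }
        .immersionStyle(selection: .constant(.full), in: .full)
    }
}

struct LaunchView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter the environment") {
            Task { await openImmersiveSpace(id: "environment") }
        }
    }
}
```

The .full style replaces your surroundings entirely; .mixed keeps passthrough so virtual objects blend with the real room.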

For many, the browser remains the killer app – a portal to countless interfaces and web apps. Sitting in front of a giant Safari window is incredible, but most websites were never built with Vision Pro in mind. Some common issues:

  • Selection targets sized for a mouse pointer, which are a struggle to hit with eye tracking
  • Clustered links, icons, and controls that are hard to target precisely

Safari also lets you pull 3D objects out of websites and interact with them in AR. This has been possible on iPhone and iPad for some time – certain Apple product pages offer an AR view option.

Doing this on your phone creates an object you can manipulate through your screen. But on Apple Vision Pro, these 3D objects become things you can hold, place around a room, and scale – all life-sized.

I furnished a room from IKEA’s site by pulling virtual bookshelves and other pieces into my actual space – the pages simply link to stored USDZ 3D models.

By default, web experiences built on WebXR are more limited, staying inside flat 2D portals rather than going fully immersive as they do on Meta Quest and other headsets. Apple Vision Pro requires toggling feature flags in Safari’s settings to enable WebXR – something most people will never discover.

Once enabled, I got some impressive web environments working, including WebXR demo apps. While functional, most felt rough, with interaction expectations that didn’t translate – which is likely why Apple ships these features disabled by default.

3D frameworks like three.js and Babylon.js offer possibilities like spinning 3D models right on a page. Web developers are just getting hands-on with Vision Pro, and experimentation will uncover more, but this opens the door to websites that literally leap off the page and let you interact with objects in new ways. Mindgrub has already started experimenting with what this transformative device makes possible for web development and app design. Now, onto my thoughts on tech & things:

⚡️ San Diego Hospital Invests in Spatial Computing. Augmented reality lets medical professionals see patient vitals, imagery, and more overlaid seamlessly during in-person interactions. Spatial Computing from Apple Vision Pro and other XR headsets could help balance tech with human connection.

⚡️ React Native is ready for Apple Vision Pro. React Native is a popular cross-platform framework for building mobile apps that run on both Android and iOS. By adding support for Vision Pro, React Native lets its large ecosystem of existing apps more easily ship versions optimized for Apple’s new headset.

⚡️ AI Debuts at the Super Bowl. After years dominated by crypto, AI took center stage in 2024’s Super Bowl ads. Numerous commercials experimented with showcasing AI – both its amusing flaws and capabilities. Welcome to the age of AI.

I fought the temptation to bring my Apple Vision Pro out for Mardi Gras, but I would love an app that offered real-time information on parade floats as they passed. At CES, Walmart’s exhibit showcased custom QR codes that let employees pull up information on hard-to-reach warehouse packaging. I can see similar concepts working for a parade.

Of course, all this talk of consumer apps can make us forget the tons of internal enterprise apps out there. Many of our clients have developed apps for all types of tasks, from inventory management to fixing power lines. Apple Vision Pro and other AR headsets open up many possibilities, especially when paired with AI, computer vision, and more. I’m excited to see what we can build.

-jason

p.s. San Fran is really, really fed up with driverless taxis. I enjoyed my ride in Google’s Waymo self-driving car, but living in a giant test bed for training autonomous vehicles must get frustrating for San Francisco residents. The violent backlash seen here reflects simmering community tensions – it highlights worries over safety, transparency, and real-world impacts as companies advance automated transport on public streets, sometimes without enough community engagement.