
Google’s AR design guidelines aren’t complete shit, but should be better

When Google and Apple announced their mobile augmented reality (AR) platforms last summer, they rocked the 3D world. Almost overnight, Google’s ARCore and Apple’s ARKit shifted the center of gravity for 3D UX design away from headsets like the Oculus Rift, HTC Vive, and PlayStation VR toward already ubiquitous mobile devices.

ARCore and ARKit sparked interest in 3D from a new — and huge — group of designers and developers with experience building 2D mobile applications. They also added confusion to an already chaotic field. At the time, 3D workflows were a hodgepodge of tools and design patterns borrowed from gaming, film, 3D printing, and architecture. They were fragmented and time consuming, especially compared to 2D prototyping workflows. They required mastery of new tools with steep learning curves.

To complicate matters, there were no how-tos, reference applications, or design guidelines to help these designers get started in mobile AR. Thankfully, that changed with the recent release of a set of AR design guidelines from Google.

The Google Augmented Reality Design Guidelines should be a lifeline for 2D mobile app developers. But how helpful are they really? Do they reflect the priorities or address the needs of designers engaged in building complex 3D applications? Are they something I would recommend to designers looking to get started in AR?

What follows are my (hopefully helpful) observations as I try to answer these questions.

The pluses

In general, the guidelines excel when they encourage designers to build applications that focus on motion and environmental engagement. Movement, specifically user movement, plays a critical role in AR.

Mobile app developers must account for new device orientations and camera positions. The guidelines recognize that users of AR apps no longer simply cradle a device; they constantly reposition it.
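Concretely, ARCore exposes the device camera's pose on every frame, which is what lets an app react to this constant repositioning. The Kotlin sketch below assumes an ARCore Session that is already configured and updated from a render loop; the DeviceMotionTracker class, the onFrame method, and the idea of measuring per-frame movement are my own illustration, not code from the guidelines.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.Pose
import com.google.ar.core.Session
import com.google.ar.core.TrackingState
import kotlin.math.sqrt

// Illustrative helper: measures how far the user has physically moved the
// device between frames, using the camera pose that ARCore reports.
class DeviceMotionTracker(private val session: Session) {

    private var lastPose: Pose? = null

    /** Call once per rendered frame. Returns metres moved since the previous frame. */
    fun onFrame(): Float {
        val frame: Frame = session.update()
        val camera = frame.camera

        // While tracking is lost (fast motion, poor lighting), skip pose comparisons.
        if (camera.trackingState != TrackingState.TRACKING) {
            lastPose = null
            return 0f
        }

        val pose = camera.pose
        val movedMetres = lastPose?.let { distanceBetween(it, pose) } ?: 0f
        lastPose = pose
        return movedMetres
    }

    private fun distanceBetween(a: Pose, b: Pose): Float {
        val dx = b.tx() - a.tx()
        val dy = b.ty() - a.ty()
        val dz = b.tz() - a.tz()
        return sqrt(dx * dx + dy * dy + dz * dz)
    }
}
```

An accumulated distance (or hold time) from something like this could drive the "take a break" prompts the guidelines recommend.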

Designers must account for user fatigue and whether screen-locked UI elements obscure the camera view. Different device types and weights also impact a user’s experience. A tablet offers a larger screen, but weighs more than a phone, making lengthy user sessions more problematic.

“Be mindful of the length of play experiences,” the guidelines state. “Consider when users might need to take a break.”

The guidelines remind designers of one of the most overlooked aspects of AR: end-user mobility and how it shapes interactions with immersive designs. What if the user is on the bus, uses a wheelchair, or is unable to move or hold a device?

To ensure access for everyone, the guidelines urge designers to consider four user modes: seated, hands fixed; seated, hands moving; standing, hands fixed; and standing, hands moving (full range of motion).

[Image: Google’s four AR user modes. Credit: Google]
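Those four modes also translate neatly into something an app can reason about when deciding where to place content. The Kotlin sketch below is purely illustrative: the enum, the property names, and the distance ranges are my own assumptions, not values from Google's guidelines.

```kotlin
// Illustrative encoding of the four user modes described in the guidelines.
enum class UserMode(val standing: Boolean, val handsMove: Boolean) {
    SEATED_HANDS_FIXED(standing = false, handsMove = false),
    SEATED_HANDS_MOVING(standing = false, handsMove = true),
    STANDING_HANDS_FIXED(standing = true, handsMove = false),
    STANDING_HANDS_MOVING(standing = true, handsMove = true) // full range of motion
}

/** Pick a comfortable placement distance (in metres) for virtual content. Ranges are hypothetical. */
fun placementRangeMetres(mode: UserMode): ClosedFloatingPointRange<Float> = when (mode) {
    // Users who cannot move toward content need everything within easy reach of the start pose.
    UserMode.SEATED_HANDS_FIXED -> 0.5f..1.5f
    UserMode.SEATED_HANDS_MOVING -> 0.5f..2.0f
    UserMode.STANDING_HANDS_FIXED -> 1.0f..3.0f
    // Full range of motion: content can invite the user to walk around the scene.
    UserMode.STANDING_HANDS_MOVING -> 1.0f..5.0f
}
```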