
Controller for AR Experiences

Why


Currently, all phone-based AR apps are experienced by tapping on the phone. The process is usually like this: scan the area to detect feature points, detect a plane or surface, place objects, and once the objects of interest appear, tap on the screen, which performs a hit test / raycast into the 3D world to interact with the object.
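For reference, here is a minimal sketch of that tap-to-raycast flow using ARKit's raycast API. The view controller and gesture wiring are illustrative, not taken from any particular app:

```swift
import UIKit
import ARKit

class TapToInteractViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    // Assumed to be wired to a UITapGestureRecognizer on the AR view.
    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // Raycast from the tapped screen point onto a detected plane.
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .existingPlaneGeometry,
                                                 alignment: .horizontal),
              let hit = sceneView.session.raycast(query).first else { return }

        // Column 3 of the world transform holds the tapped 3D position.
        let position = SCNVector3(hit.worldTransform.columns.3.x,
                                  hit.worldTransform.columns.3.y,
                                  hit.worldTransform.columns.3.z)
        print("Tapped world position: \(position)")
    }
}
```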

There are a few UX issues with this kind of interaction. The user's finger usually covers part of the screen when tapping, and sometimes the object of interest is itself hidden behind the finger. On big phones, or on tablets, fingers have to stretch to reach, creating a bad user experience.

When we have AR glasses or strap our phones to our eyes, we won't be able to touch the screen and we will need a controller. 

We are making good progress on the UX of AR and VR, but a big challenge is that we are building the interfaces of tomorrow with the hardware we have today (our smartphones).

What


In order to deal with some of these issues, I made a simple Bluetooth Low Energy (BLE) controller using an Adafruit Bluefruit Feather and connected it to my iOS AR app. The communication between the controller and the phone has very low latency, and the user can interact with objects without blocking them with their fingers.
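On the phone side, a link like this is plain CoreBluetooth. Below is a minimal sketch of a central that listens for button presses, assuming the Feather exposes the Nordic UART Service that Bluefruit firmware typically uses; the ControllerClient name and the onButtonPress callback are illustrative, not from the actual app:

```swift
import CoreBluetooth

// Minimal BLE central that treats UART notifications as button clicks.
class ControllerClient: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    static let uartService = CBUUID(string: "6E400001-B5A3-F393-E0A9-E50E24DCCA9E")
    static let txCharacteristic = CBUUID(string: "6E400003-B5A3-F393-E0A9-E50E24DCCA9E")

    private var central: CBCentralManager!
    private var controller: CBPeripheral?

    /// Called on every button press from the controller.
    var onButtonPress: (() -> Void)?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [Self.uartService])
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        controller = peripheral               // keep a strong reference
        central.stopScan()
        central.connect(peripheral)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([Self.uartService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first else { return }
        peripheral.discoverCharacteristics([Self.txCharacteristic], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didDiscoverCharacteristicsFor service: CBService, error: Error?) {
        guard let characteristic = service.characteristics?.first else { return }
        // Subscribe so each press arrives as a low-latency notification.
        peripheral.setNotifyValue(true, for: characteristic)
    }

    func peripheral(_ peripheral: CBPeripheral,
                    didUpdateValueFor characteristic: CBCharacteristic, error: Error?) {
        guard characteristic.value != nil, error == nil else { return }
        onButtonPress?()                      // any payload counts as a click
    }
}
```

Any notification on the UART TX characteristic is treated as a click, so the AR code only has to register an onButtonPress closure.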

The controller itself has a very tiny form factor and very simple functionality: clicking. Most of the pointing comes from the motion of the device itself. Because of the controller's tiny form factor, we can attach it to the back of the phone, where fingers can comfortably reach it, to create a good interaction experience.


I explored two interaction methods.

1. Reticle for point and shoot

In this method, you move your phone to point at a target and then press the button to shoot. Pretty straightforward.

In this demo, you point to move the reticle to the box you want to shoot the ball at, and then press the button to release the ball. A sketch of the wiring follows below.
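Here is a rough sketch of how this method can be wired up with ARKit and SceneKit, reusing the hypothetical ControllerClient from above; the PointAndShoot name, reticle geometry, and impulse magnitude are all illustrative:

```swift
import ARKit
import SceneKit
import simd

// Point-and-shoot: the reticle tracks whatever the screen center points at,
// and the controller's button press launches a ball toward it.
class PointAndShoot: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView
    let reticle = SCNNode(geometry: SCNSphere(radius: 0.01))

    init(sceneView: ARSCNView, controller: ControllerClient) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
        sceneView.scene.rootNode.addChildNode(reticle)
        controller.onButtonPress = { [weak self] in self?.shoot() }
    }

    // Every frame, keep the reticle on the surface under the screen center.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
        guard let query = sceneView.raycastQuery(from: center,
                                                 allowing: .estimatedPlane,
                                                 alignment: .any),
              let hit = sceneView.session.raycast(query).first else { return }
        reticle.simdPosition = simd_make_float3(hit.worldTransform.columns.3)
    }

    // Spawn a physics ball at the camera and push it toward the reticle.
    func shoot() {
        guard let camera = sceneView.pointOfView else { return }
        let ball = SCNNode(geometry: SCNSphere(radius: 0.03))
        ball.physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
        ball.simdPosition = camera.simdWorldPosition
        sceneView.scene.rootNode.addChildNode(ball)

        let impulse = simd_normalize(reticle.simdPosition - camera.simdWorldPosition) * 3
        ball.physicsBody?.applyForce(SCNVector3(impulse.x, impulse.y, impulse.z),
                                     asImpulse: true)
    }
}
```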


2. Gaze cursor

In this approach, the assumption is that the user is wearing the headset on their head. When the head moves, the cursor moves to point at the object that would be selected. The cursor exists in 3D space; in the video it is confined to the X-Z plane. When the cursor intersects with an object in the AR world, it appears on top of it.

When the user clicks the button on the controller, the cursor changes shape and the selected object changes color, indicating that it is selected. When the user then moves the phone, or their head in the case of a wearable experience, the selected object moves with it.
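A sketch of this behavior under the same assumptions: the cursor is re-positioned every frame from a SceneKit hit test at the screen center, and a click re-parents the hit node to the camera so it follows the phone or head; a second click drops it back into the world. The class name, cursor geometry, and highlight colors are illustrative:

```swift
import UIKit
import ARKit
import SceneKit

// Gaze cursor: follows whatever scene node sits under the screen center.
class GazeCursor: NSObject, ARSCNViewDelegate {
    let sceneView: ARSCNView
    let cursor = SCNNode(geometry: SCNTorus(ringRadius: 0.02, pipeRadius: 0.003))
    private var selected: SCNNode?

    init(sceneView: ARSCNView, controller: ControllerClient) {
        self.sceneView = sceneView
        super.init()
        sceneView.delegate = self
        sceneView.scene.rootNode.addChildNode(cursor)
        controller.onButtonPress = { [weak self] in self?.toggleSelection() }
    }

    private var screenCenter: CGPoint {
        CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)
    }

    // Every frame, park the cursor just above the node the gaze intersects.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let hit = sceneView.hitTest(screenCenter, options: nil)
            .first(where: { $0.node != cursor }) else { return }
        let p = hit.worldCoordinates
        cursor.position = SCNVector3(p.x, p.y + 0.03, p.z)
    }

    // Click: grab the node under the cursor, or release the current one.
    private func toggleSelection() {
        if let node = selected {
            // Release: re-attach to the world at its current world transform.
            let world = node.worldTransform
            node.removeFromParentNode()
            sceneView.scene.rootNode.addChildNode(node)
            node.transform = world
            node.geometry?.firstMaterial?.emission.contents = UIColor.black
            selected = nil
            return
        }
        guard let camera = sceneView.pointOfView,
              let hit = sceneView.hitTest(screenCenter, options: nil)
                  .first(where: { $0.node != cursor }) else { return }
        let node = hit.node
        let world = node.worldTransform
        // Attach to the camera so the node moves with the phone / head.
        node.removeFromParentNode()
        camera.addChildNode(node)
        node.transform = camera.convertTransform(world, from: nil)
        node.geometry?.firstMaterial?.emission.contents = UIColor.green
        selected = node
    }
}
```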

