Category Archives: Augmented reality

ZenFone AR – Tango and Daydream together

The ZenFone AR is a potentially very interesting device, combining both Tango for spatial mapping and Daydream capability for VR headset use in one package. It is a step up from the older Phab 2 Pro Tango phone in that it can also be used with Daydream (and looks like a neater package). Adding Tango to Daydream means that it is possible to do inside-out spatial tracking in a completely untethered VR device. It should also be a step up from ARKit in its current form, which (as I understand it) relies on just inertial and VSLAM tracking. Still, the ability for ARKit to be used with existing devices is a massive advantage.

Maybe in the end the XR market will divide up into those applications that don’t need tight spatial locking (where standard devices can be used) and those that do require tight spatial locking (demanding some form of inside-out tracking).

Mixed reality: does latency matter and is it immersive anyway?

I had a brief discussion last night about latency and its impact on augmented reality (AR) versus virtual reality (VR). It came up in the context of tethered versus untethered HMDs. An untethered HMD either has to have the entire processing system in the HMD (as in the HoloLens) or else use a wireless connection to a separate processing system. There’s a lot to be said for not putting the entire system in the HMD – weight, heat etc. However, having a separate box and requiring two separate battery systems is annoying but certainly has precedent (iPhone and Apple Watch for example).

The question is whether the extra latency introduced by a wireless connection is noticeable and, if so, is it a problem for AR and MR applications (there’s no argument for VR – latency wants to be as close to zero as possible).

Just for the record, my definition of virtual, augmented and mixed reality is:

  • Virtual reality. HMD based with no sense of the outside world and entire visual field ideally covered by display.
  • Augmented reality. This could be via HMD (e.g. Google Glass) or via a tablet or phone (e.g. Phab 2 Pro). I am going to define AR as the case where virtual objects are overlaid on the real world scene with no or partial spatial locking but no support for occlusion (where a virtual object correctly goes behind a real object in the scene). Field of view is typically small for AR but doesn’t have to be.
  • Mixed reality. HMD based with see-through capability (either optical or camera based) and the ability to accurately spatially lock virtual objects in the real world scene. Field of view ideally as large as possible but doesn’t have to be. Real time occlusion support is highly desirable to maintain the apparent reality of virtual objects.
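As a toy illustration of the occlusion requirement mentioned above, per-pixel depth compositing can be sketched as follows. This is a minimal numpy sketch of the idea, not any headset's actual rendering pipeline:

```python
import numpy as np

# Toy occlusion compositing: a virtual pixel is drawn only where the
# virtual object is closer to the viewer than the real-world surface.
def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    mask = virt_depth < real_depth       # virtual object in front of real scene
    out = real_rgb.copy()
    out[mask] = virt_rgb[mask]
    return out

real_rgb = np.zeros((2, 2, 3))           # black real scene
real_depth = np.array([[1.0, 1.0],       # top row: real surface at depth 1
                       [3.0, 3.0]])      # bottom row: real surface at depth 3
virt_rgb = np.ones((2, 2, 3))            # white virtual object
virt_depth = np.full((2, 2), 2.0)        # virtual object at depth 2

out = composite(real_rgb, real_depth, virt_rgb, virt_depth)
# The virtual object appears only in the bottom row, where the real
# surface (depth 3) is behind it; the top row (depth 1) occludes it.
```

The hard part in a real MR device is obtaining `real_depth` accurately and in real time, which is exactly why occlusion support is still rare.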

Back to latency and immersion. VR is the most highly immersive of the three modes and is extremely sensitive to latency. This is because any situation where the body's sensors disagree with what the eyes are seeing (sensory inconsistency) is pretty unpleasant, leading rapidly to motion sickness. Personally, I can't stand using the DK2 for any length of time because there is always something, or some mode, that causes a sensory inconsistency.

AR is practically insensitive to latency since virtual objects may not be locked to the real world at all. Plus, the ability to maintain sight of the real world seems to override any transient problems. It's also only marginally immersive in any meaningful sense – there is very little telepresence effect.

MR is virtually the same as AR when it comes to latency sensitivity and is actually the least immersive of all three modes when done correctly. Immersion implies a person’s sense of presence is transported to somewhere other than the real space. Instead, mixed reality wants to cement the connection to the real space by also locking virtual objects down to it. It’s the opposite of immersion.

Real-world experience with the HoloLens tends to support the idea that latency is not a terrible problem for MR. Even running code in debug mode with lots of messages being printed (which can reduce the frame rate to a handful of frames per second) isn't completely awful. With MR, latency breaks the reality of virtual objects because they may not remain perfectly fixed in place when the user's head is moving fast. But at least this doesn't generate motion sickness – or at least not for me.
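As a rough illustration of why fast head motion is the problem case, the apparent drift of a world-locked hologram scales as head rotation rate times latency. The numbers below are purely illustrative:

```python
# Back-of-envelope: how far (in degrees) does a world-locked hologram
# appear to drift while the display lags the head by one latency interval?
def drift_degrees(head_rate_deg_per_s, latency_ms):
    return head_rate_deg_per_s * latency_ms / 1000.0

# A quick head turn (around 200 deg/s is easy to reach) with 20 ms of
# motion-to-photon latency leaves the hologram about 4 degrees off.
print(drift_degrees(200, 20))    # 4.0

# At a debug-mode frame rate of ~5 fps (roughly 200 ms per frame) the same
# turn gives a 40 degree error -- very visible, but not sickness-inducing
# in MR because the real world itself stays correctly registered.
print(drift_degrees(200, 200))   # 40.0
```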

There is a pretty nasty mode of the HoloLens, though. If the spatial sensors get covered up, usually because the device has been placed on a table with things blocking them, the HoloLens can get very confused and virtual objects display horrendous jittering for a while until it settles down again. That can be extremely disorientating (I have seen holograms rotated through 90 degrees and bouncing rapidly from side to side – very unpleasant!).

On balance though, it may be that untethered, lightweight HMDs with separate processor boxes will be the most desirable design for MR devices. The ultimate goal is to be able to wear MR devices all day, and this may be the only realistic way to reach that goal.

Wide field of view holographic displays

Absolutely fascinating paper here from Microsoft Research describing the design of a holographic display technology that can achieve 80 degrees field of view or more. I remember sitting in a bar in London circa 1980 with a colleague discussing how to produce custom wavefronts for CGI applications. We went down a black hole fast but this kind of tech is exactly what we would have needed.

Developing Unity projects for Moverio BT-300 AR glasses on Windows

Since the Moverio BT-300 AR glasses run Android 5.1 on an Atom processor, it is possible to run Unity projects on them. The starting point is the instructions here on setting up Unity for the Android platform. One problem is that the android command is apparently no longer included in Android Studio, so Unity builds will fail. To get Unity builds for Android to work, it is necessary to download and unzip the command line tools from the bottom of this page. This creates a directory tree that includes a tools directory, which should be used to replace the original tools directory in the Android Studio install, usually found at:


Incidentally, that is also the path that Unity needs to know in order to perform its builds.
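As an aside, the exact path was omitted above; assuming a standard Android Studio install on Windows, the SDK (and hence the tools directory) usually lives under the user profile, something like:

```
C:\Users\<username>\AppData\Local\Android\Sdk\tools
```

Your install may differ – the authoritative value is the SDK location shown in Android Studio's SDK Manager.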

There is a Unity plugin that provides support for 3D on the BT-300. For instructions on how to use the plugin, read:

 Assets > MoverioBT300UnityPlugin > MoverioController > README

The plugin includes a scene called MoverioTutorial that can be used as a starting point. It demonstrates some of the features of the plugin.

After the package name has been set in Player > Other Settings, it should then be possible to build, deploy and run on the BT-300 directly from Unity. I had a few problems with the tutorial with regard to SDK functionality but the Unity part seemed to work well (although I had to set 3D mode and disable the 2D camera manually sometimes). I am sure that I am doing something wrong – I’ll update the post when I work out what is happening.
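If building directly from Unity misbehaves, sideloading the exported APK with adb is a useful fallback. This assumes the BT-300 is connected over USB with ADB debugging enabled, and the APK name here is just a placeholder:

```shell
adb devices                       # confirm the BT-300 appears in the device list
adb install -r MoverioDemo.apk    # -r replaces any previously installed build
adb logcat -s Unity               # watch Unity's log output while the app runs
```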

WebRTC on the Moverio BT-300

Since the BT-300 is an Android device, I thought it would be fun to try out WebRTC on it using the Android WebRTC sample here. Right now it's WebRTC everything chez richard, so this was a natural thing to test on the BT-300.

It worked just fine. The screen capture above shows the browser display from a desktop running Chrome. I am looking at the display, hence the funky effect. The photo below is an attempt to show what it looks like in the BT-300.

In this case, the desktop camera is watching me desperately trying to juggle my camera and the BT-300 to get this image, which is the left-eye display from the BT-300. I must stress that the real quality of the image on the BT-300 is vastly superior to how it looks here – it's really tricky to get any sort of photo, in fact.

Building the Android sample takes a long time – in fact, downloading the code takes forever as it is about 16 GB of download! One extra point is that, after the gclient sync step, it is necessary to enter:


in order to get all the prerequisites installed. Apart from that, everything was straightforward.
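For reference, the prerequisite-install step omitted above is, in a typical WebRTC Android checkout, the install-build-deps script run from the src directory (exact flags may vary with the vintage of the checkout):

```shell
cd src
# Installs the Linux packages the Android build needs (will prompt via sudo)
./build/install-build-deps.sh --android
```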

Latest fun thing in the office: Epson Moverio BT-300 Smart Glasses

The arrival of a pair of BT-300 Smart Glasses gave me an opportunity to take another daft photo of myself wearing a wearable. My eyes don't really look like that – that's just where the (presumably) semi-silvered mirror surface sits for each eye. Projectors at the sides generate images that are combined with the real light coming in to form a composite AR image.

Continue reading