Configuring Hyper-V networking for the HoloLens emulator

The HoloLens emulator is a handy tool for quickly checking out code destined for the HoloLens itself. Apart from requiring Windows Pro for Hyper-V support, the only problem I encountered was giving the emulator access to some sort of network. I basically followed the instructions here but still had trouble getting it to connect to the sharing server, probably because I wasn’t thinking about what was actually happening.

The HoloLens emulator automatically creates the Emulator NAT Network Adaptor and the Emulator Internal Adaptor when it starts up if they aren’t already there. Manually creating the new switch and binding it to the Emulator NAT Network Adaptor couples things to the physical network adaptor, which gets an address on the external network (e.g. 192.168.1.23). However, the emulator is connected internally and (in my case at least) gets an address of 172.16.80.2, while the host PC gets 172.16.80.1 on that interface.

So, if you run the sharing server on the host PC, the emulator needs to be pointed to the server at 172.16.80.1 rather than the 192.168.1.23 address that a real HoloLens would have to use. Once I realized that, things started working.
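As a quick sanity check on the host, a few lines of Python can list the machine’s IPv4 addresses so the internal NAT address to give the emulator is easy to spot. This is just a rough sketch: the 172.16.80.x prefix is what I saw on my setup and may well differ on yours, and hostname resolution doesn’t always report every adapter.

    # List this machine's IPv4 addresses and flag the one on the emulator's
    # internal NAT subnet (172.16.80.x in my case - yours may differ).
    import socket

    def host_ipv4_addresses():
        """Return the unique IPv4 addresses assigned to this machine."""
        infos = socket.getaddrinfo(socket.gethostname(), None, socket.AF_INET)
        return sorted({info[4][0] for info in infos})

    if __name__ == "__main__":
        for addr in host_ipv4_addresses():
            marker = "  <- point the emulator's sharing client here" if addr.startswith("172.16.80.") else ""
            print(addr + marker)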

Another problem with my initial configuration, I suspect, was that the host PC originally had a static IP address. This seemed to cause strange effects; switching to a dynamic IP address let things work properly.

Phab 2 Pro first impressions

My Phab 2 Pro turned up yesterday and I finally had some time to play with it. The photo shows the WayfairView app running, placing a rather unusual table lamp on the floor. It definitely seems to be AR rather than MR since, as far as I can tell, it doesn’t handle occlusions with real-world (or other virtual) objects – the virtual object is always in front of any real object, and you can merge virtual objects without regard to any concept of physical reality. Hopefully this is just an app limitation rather than a technology limitation, although I tried the Lowe’s Vision app as well and it seemed to be similar. Whatever, it is certainly better than looking at a catalogue or an image on a website.

The reason for getting this phone is to see what it can do with its Tango technology, especially in comparison to the HoloLens. Obviously the HoloLens is six times the price of the Phab 2 Pro but really there’s no comparison, either in terms of immersive experience or any sense of reality. My general impression is that while it was fun playing with the Phab 2 Pro for a bit, now I’d like to go back to working with the HoloLens for the real experience 🙂.

The Phab 2 Pro is now indicated as “temporarily unavailable” on the Lenovo web site – not sure what that means – lots of sales or a problem?

Telepresent Enhanced Reality (TER)

Following on from an earlier post on Enhanced Reality, it occurred to me that separating the stereo cameras (and microphones) from the ER headset creates a new way of achieving telepresent remote participation – Telepresent Enhanced Reality or TER. I was actually trying out a simpler version a while back when I had a camera on a pan/tilt platform slaved to an Oculus DK2 VR headset. A real TER setup would require stereo cameras and multiple microphones on a pan/tilt/roll mount. The user would have a VR headset and the pose of the pan/tilt/roll mount would mirror movements of the user’s head.
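As an illustration of the slaving logic (not anything I have running), here is a minimal Python sketch that maps headset yaw/pitch/roll to pan/tilt/roll servo angles; the servo limits and the send_angles stub are hypothetical placeholders for whatever mount hardware is actually used.

    # Sketch: slave a pan/tilt/roll camera mount to a VR headset pose.
    # Headset pose is assumed to arrive as yaw/pitch/roll in degrees.
    from dataclasses import dataclass

    @dataclass
    class MountLimits:
        pan: float = 170.0   # degrees either side of centre (hypothetical limits)
        tilt: float = 80.0
        roll: float = 45.0

    def clamp(value: float, limit: float) -> float:
        return max(-limit, min(limit, value))

    def head_pose_to_mount(yaw: float, pitch: float, roll: float,
                           limits: MountLimits = MountLimits()) -> tuple:
        """Map headset yaw/pitch/roll (degrees) to clamped mount angles."""
        return (clamp(yaw, limits.pan),
                clamp(pitch, limits.tilt),
                clamp(roll, limits.roll))

    def send_angles(pan: float, tilt: float, roll: float) -> None:
        # Placeholder: real code would write to the mount's servo controller.
        print(f"pan={pan:.1f} tilt={tilt:.1f} roll={roll:.1f}")

    if __name__ == "__main__":
        # Example: user looks 30 degrees left, 10 degrees down, slight head tilt.
        send_angles(*head_pose_to_mount(yaw=-30.0, pitch=-10.0, roll=5.0))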

An interesting use would be for conferences where some of the participants are in a conventional conference room but wearing AR/MR/ER headsets (e.g. HoloLens). Each position in the room reserved for a remote participant would have a stereo camera/microphone unit. The local participants would obviously be able to see each other but, instead of the camera/microphone hardware, they would see avatars representing the remote users. These avatars could be as sophisticated or as simple as desired. Remote participants would see (via the stereo cameras) the conference room and local participants, and would also see the avatars of the other remote participants in place of the physical camera/microphone hardware at those locations. Alternatively, these could be suitably equipped telepresence robots (or even cameras mounted on small drones), which would also allow movement around the room. Really anything that has the essential hardware (stereo cameras, microphones, pan/tilt/roll capability) could be used.

Given that everyone has AR/MR capability in this setup, something like a conventional projected presentation could still be done except that the whole thing would be virtual – a virtual screen would be placed on a suitable wall and everyone could look at it. Interaction could be with simulated laser pointers and the like. Equally, every position could have its own simulated monitor that displays the presentation. Virtual objects visible to everyone could be placed on the table (or somewhere in the room) for discussion, annotation or modification.

Obviously everyone could be remote and use a VR headset and everything could then be virtual with no need for hardware. However, the scheme described preserves some of the advantages of real meetings while at the same time allowing remote participants to feel like they are really there too.

HoloLens tutorials

I’ve been working through some of the HoloLens tutorials and thought that the Holograms 230 tutorial was pretty amusing. The screen capture shows a solar system being projected in space. The spatial mapping mesh can be seen conforming to objects in view. The poster just to the left of the sun isn’t real – it’s one of the things that you can place on a wall to demonstrate this capability.

Building the HoloToolkit.Sharing Library

Collaboration using AR is a fascinating area with many potential applications. The HoloToolkit is a very handy resource in general and includes the HoloToolkit.Sharing library to assist with collaboration. The HoloToolkit-Unity actually contains pre-built versions of the Server, SessionManager and Profiler, but it seemed like a good idea to build them from scratch.

There are a few pre-requisites:

  • Windows SDK 10.0.10240
  • Windows SDK 10.0.10586
  • Common Tools for Visual C++
  • Windows 8.1 SDK and Universal CRT SDK
  • Java 8 SDK

An easy way to get the Windows SDKs is to run the BuildAll.bat script, which will exit with an error if something is missing. Then open the solution file for the component that failed in VS2015; VS2015 will install the missing components. The Java SDK needs to be installed manually and requires two environment variables: JAVA_BIN, pointing to the JDK bin directory, and JAVA_INCLUDE, pointing to the JDK include directory. The BuildAll.bat script should then complete successfully.
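To avoid chasing build errors caused by a missing or mis-set Java environment, a short Python script can verify the two variables before kicking off the build. The check for java.exe and jni.h is just my own sanity test, and the invocation of BuildAll.bat assumes the script is run from the directory that contains it.

    # Verify the Java environment variables the Sharing build expects, then
    # run BuildAll.bat (assumed to be in the current directory).
    import os
    import subprocess
    import sys

    def check_dir(var: str, must_contain: str) -> bool:
        """Check that env var `var` names a directory containing `must_contain`."""
        path = os.environ.get(var)
        if not path or not os.path.isdir(path):
            print(f"{var} is not set or does not point to a directory")
            return False
        if not os.path.exists(os.path.join(path, must_contain)):
            print(f"{var} ({path}) does not contain {must_contain}")
            return False
        return True

    if __name__ == "__main__":
        if not (check_dir("JAVA_BIN", "java.exe") and check_dir("JAVA_INCLUDE", "jni.h")):
            sys.exit(1)
        subprocess.run(["cmd", "/c", "BuildAll.bat"], check=True)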

The server is run using SharingService.exe, and administrator permission is needed to install it as a service; this can be done by opening a command window in administrator mode and running the command from there, for example. It’s actually useful to run the server with the -local flag (as a command-line program), as it’s then easy to see status and error messages. The SessionManager displays the current server state, including connected clients.
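A quick way to confirm from a client machine (or from the emulator’s point of view) that the server is actually listening is a plain TCP probe. Port 20602 is, as far as I can tell, the default session server port – treat it as an assumption and check the -local console output if your configuration differs.

    # Probe the sharing server with a plain TCP connection attempt.
    import socket
    import sys

    def server_reachable(host: str, port: int = 20602, timeout: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError as err:
            print(f"Connection to {host}:{port} failed: {err}")
            return False

    if __name__ == "__main__":
        host = sys.argv[1] if len(sys.argv) > 1 else "172.16.80.1"
        print("reachable" if server_reachable(host) else "not reachable")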