Category Archives: Mixed reality

Smart spaces and IoT data – the challenge is what to do with it

A while back I built some add-on cards for Raspberry Pis to do some environmental monitoring around the house. This is one of them.

The project started collecting dust when I couldn’t really think of good ways of using the data, beyond triggering an alarm under certain conditions. However, it’s often interesting just to see what’s going on around the place, so I have revived the sensors (a good use for old first-generation Pis). The screen capture shows a simple but quite effective way of using the data being generated: a display placed adjacent to the camera feed from a webcam on the same Pi. Between the two streams, you can get a good sense of what’s happening in the smart space.
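The post doesn’t go into how the readings actually get onto the screen, but a minimal sketch along the following lines would do the job on the Pi itself. Everything here is an assumption: read_sensors() and the field names stand in for whatever the add-on card really provides, and the page showing the webcam feed would simply poll the JSON endpoint alongside it.

```python
# Minimal sketch: serve the latest sensor readings as JSON on the same Pi
# that hosts the webcam, so a page can show both streams side by side.
# read_sensors() and the field names are placeholders for whatever the
# add-on card actually provides.
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

latest = {}  # most recent readings, shared with the HTTP handler


def read_sensors():
    # Placeholder: replace with real reads from the add-on card.
    return {"temperature_c": 21.5, "humidity_pct": 48.0, "light_lux": 310}


def poll_loop(interval_s=5):
    # Refresh the shared readings at a gentle cadence.
    global latest
    while True:
        readings = read_sensors()
        readings["timestamp"] = time.time()
        latest = readings
        time.sleep(interval_s)


class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(latest).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    threading.Thread(target=poll_loop, daemon=True).start()
    HTTPServer(("0.0.0.0", 8080), SensorHandler).serve_forever()
```

Polling keeps things simple, and a cadence of a few seconds is plenty for ambient environmental data like this.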

One day, I’d like to get the HoloLens integrated with this so that I can see the data when I am in the smart space. That would be even more fun.

Why HoloLens is like Aibo…except hopefully it isn’t

Looks like Aibo has got hold of my HoloLens again. So why is HoloLens like Aibo? Well, Aibo was an absolutely fantastic piece of engineering and way ahead of its time. Sony managed to make a viable consumer robot that didn’t do anything practical but was nevertheless highly entertaining. Some of the tricks it could do with its ball were very impressive, to say the least! The skill in building robots is to bring together a large number of disparate technologies and integrate them into a coherent product. Aibo was a great example of doing this very successfully.

HoloLens similarly brings a raft of disparate technologies into a very well engineered and complete device that seems to stand alone in terms of the totality of its capabilities for Mixed Reality. It really does remind me of Aibo in this regard.

Just one thing. Sony killed off the entire robotics effort because it wasn’t making enough cash in the short term, a wonderful example of myopia in my opinion. I am hoping that Microsoft don’t fall into the same trap with HoloLens. This piece suggests that HoloLens won’t suffer a similar fate, which is fantastic. The AR and MR market is going to be driven by continuing developments that make devices smaller and lighter with longer battery life so that, one day, people will wear them all day and leave their smartphones gathering dust in a drawer. I look forward to seeing and using many future generations of HoloLens!

Latest fun thing in the office: Epson Moverio BT-300 Smart Glasses

The arrival of a pair of BT-300 Smart Glasses gave me an opportunity to take another daft photo of myself wearing a wearable. My eyes don’t really look like that – that’s just where the (presumably) semi-silvered mirror surface is for each eye. Projectors at the sides generate images that are combined with the real light coming in to form a composite AR image.


Telepresent Enhanced Reality (TER)

Following on from an earlier post on Enhanced Reality, it occurred to me that separating the stereo cameras (and microphones) from the ER headset creates a new way of achieving telepresent remote participation – Telepresent Enhanced Reality or TER. I was actually trying out a simpler version a while back when I had a camera on a pan/tilt platform slaved to an Oculus DK2 VR headset. A real TER setup would require stereo cameras and multiple microphones on a pan/tilt/roll mount. The user would have a VR headset and the pose of the pan/tilt/roll mount would mirror movements of the user’s head.
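The slaving itself is conceptually simple: read the headset’s orientation, convert it to pan/tilt/roll angles, clamp them to what the mount can physically do, and stream them to the servos at a steady rate. Below is a rough sketch of that loop; get_headset_quaternion() and send_servo_angles() are placeholder stubs for whatever headset SDK and servo controller are actually used, and the angle limits are purely illustrative.

```python
# Rough sketch of slaving a pan/tilt/roll mount to a VR headset pose.
# get_headset_quaternion() and send_servo_angles() are placeholders for
# whatever headset SDK and servo controller are actually in use.
import math
import time


def get_headset_quaternion():
    # Placeholder: return the headset orientation (w, x, y, z) from the VR SDK.
    return (1.0, 0.0, 0.0, 0.0)


def send_servo_angles(pan_deg, tilt_deg, roll_deg):
    # Placeholder: command the mount's servos, e.g. over serial or PWM.
    print(f"pan={pan_deg:.1f} tilt={tilt_deg:.1f} roll={roll_deg:.1f}")


def quaternion_to_euler(w, x, y, z):
    # Convert a unit quaternion to yaw, pitch, roll in degrees (ZYX convention).
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    return math.degrees(yaw), math.degrees(pitch), math.degrees(roll)


def clamp(angle_deg, limit_deg):
    # Keep a commanded angle within the mount's mechanical range.
    return max(-limit_deg, min(limit_deg, angle_deg))


def track_head(rate_hz=50, pan_limit=170, tilt_limit=80, roll_limit=45):
    # Mirror the user's head movements onto the remote mount.
    while True:
        w, x, y, z = get_headset_quaternion()
        yaw, pitch, roll = quaternion_to_euler(w, x, y, z)
        send_servo_angles(clamp(yaw, pan_limit),
                          clamp(pitch, tilt_limit),
                          clamp(roll, roll_limit))
        time.sleep(1.0 / rate_hz)


if __name__ == "__main__":
    track_head()
```

In practice some smoothing and rate limiting would also be wanted so that small, rapid head movements don’t make the mount chatter.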

An interesting use would be for conferences where some of the participants are in a conventional conference room but wearing AR/MR/ER headsets (eg HoloLens). Each position in the room reserved for a remote participant would have a stereo camera/microphone remote. The local participants would obviously be able to see each other but, instead of the camera/microphone hardware, they would see avatars representing the remote users. These avatars could be as sophisticated or as simple as desired. Remote participants would see the conference room and the local participants via the stereo cameras, and would also see the avatars of the other remote participants in place of the physical camera/microphone hardware at those positions. Alternatively, the remotes could be suitably equipped telepresence robots (or even cameras mounted on small drones), which would also allow movement around the room. Really, anything that has the essential hardware (stereo cameras, microphones, pan/tilt/roll capability) could be used.

Given that everyone has AR/MR capability in this setup, something like a conventional projected presentation could still be done except that the whole thing would be virtual – a virtual screen would be placed on a suitable wall and everyone could look at it. Interaction could be with simulated laser pointers and the like. Equally, every position could have its own simulated monitor that displays the presentation. Virtual objects visible to everyone could be placed on the table (or somewhere in the room) for discussion, annotation or modification.

Obviously everyone could be remote and use a VR headset and everything could then be virtual with no need for hardware. However, the scheme described preserves some of the advantages of real meetings while at the same time allowing remote participants to feel like they are really there too.

HoloLens tutorials

I’ve been working through some of the HoloLens tutorials and thought that the Holograms 230 tutorial was pretty amusing. The screen capture shows a solar system being projected in space. The spatial mapping mesh can be seen conforming to objects in view. The poster just to the left of the sun isn’t real – it’s one of the things that you can place on a wall to demonstrate this capability.