Just in is this Stereolabs ZED depth-sensing stereo camera. I have been looking for an affordable, good quality stereo camera for ages and this seems to fit the bill. Plus, since it can generate a depth map and point cloud simultaneously with the video frames, it's possible to compose mixed reality scenes in Unity using a plugin available here.
One of the fun things to try is the ZEDfu program.
This is designed for mapping 3D spaces and turns the ZED into a scanner. The screen capture shows an intermediate view while it is collecting data. You can see the original image, a depth map and the surface map. Thought it was a fun image.
Definitely looking forward to working with this device and using it in some MR projects.
Absolutely fascinating paper here from Microsoft Research describing the design of a holographic display technology that can achieve a field of view of 80 degrees or more. I remember sitting in a bar in London circa 1980 with a colleague discussing how to produce custom wavefronts for CGI applications. We went down a black hole fast, but this kind of tech is exactly what we would have needed.
Nice article here about Arvizio (full disclosure – I work for Arvizio (USA)). We are going to be at AWE 2017 – see you there!
A while back I built some add-on cards for Raspberry Pis to do some environmental monitoring around the house. This is one of them.
The project started collecting dust when I couldn't really think of good ways of using the data beyond triggering an alarm under certain conditions. However, it's often interesting just to see what's going on around the place, so I have revived the sensors (a good use for old first generation Pis). The screen capture shows a simple but actually quite effective way of using the data that's being generated, providing a display adjacent to the camera feed from a webcam on the same Pi. Between the two streams, you can get good confidence in what's happening in the smart space.
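As a rough sketch of how a Pi could publish its readings for that kind of side-by-side display, something like the following would do: the sensor names and the `read_sensors` function are assumptions for illustration, not the actual add-on card's interface.

```python
import json
import random
import time


def read_sensors():
    # Stand-in for the real add-on card driver: returns simulated
    # temperature (deg C), humidity (%) and light level (lux).
    return {
        "temperature": round(random.uniform(18.0, 24.0), 1),
        "humidity": round(random.uniform(30.0, 60.0), 1),
        "light": round(random.uniform(100.0, 800.0)),
    }


def sample(ts=None):
    # One timestamped reading, serialised as JSON so a page sitting
    # next to the webcam feed can poll it and render the values.
    reading = read_sensors()
    reading["timestamp"] = ts if ts is not None else time.time()
    return json.dumps(reading)


print(sample())
```

A small web server on the Pi could serve `sample()` on an endpoint that the display page polls every few seconds, keeping the sensor panel in step with the camera stream.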
One day, I’d like to get the HoloLens integrated with this so that I can see the data when I am in the smart space. That would be even more fun.
Looks like Aibo has got hold of my HoloLens again. So why is HoloLens like Aibo? Well, Aibo was an absolutely fantastic piece of engineering and way ahead of its time. Sony managed to make a viable consumer robot that didn't do anything practical but nevertheless was highly entertaining. Some of the tricks it could do with its ball were very impressive, to say the least! The skill in building robots is to bring together a large number of disparate technologies and integrate them into a consistent product. Aibo is a great example of doing this in a very successful way.
HoloLens similarly brings a raft of disparate technologies into a very well engineered and complete device that seems to stand alone in terms of the totality of its capabilities for Mixed Reality. It really does remind me of Aibo in this regard.
Just one thing. Sony killed off the entire robotics effort because it wasn't making enough cash in the short term, a wonderful example of myopia in my opinion. I am hoping that Microsoft don't fall into the same trap with HoloLens. This piece suggests that HoloLens won't suffer a similar fate, which is fantastic. The AR and MR market is going to be driven by continuing new developments that make devices smaller and lighter, with longer battery life, so that, one day, people will wear them all day and leave their smartphones gathering dust in a drawer. I look forward to seeing and using many future generations of HoloLens!
The arrival of a pair of BT-300 Smart Glasses gave me an opportunity to take another daft photo of myself wearing a wearable. My eyes don’t really look like that – that’s just where the (presumably) semi-silvered mirror surface is for each eye. Projectors at the sides generate images that are combined with the real light coming in to form a composite AR image.