Developing Electron apps with Visual Studio Code

I have been trying out Electron as a way of developing some WebRTC apps to work with the Janus gateway. In the end I decided that Visual Studio Code was a good route to take for JavaScript development. One thing that wasn’t at all obvious, though, was how to get breakpoints to work. I found this blog entry that had the answer – there’s no way I would have worked it out myself, so go to that link for the original source (reproduced here for my convenience).

The first thing is to install the Debugger for Chrome extension for VS Code – instructions are here. Then the .vscode/launch.json file should look something like this:

{
  // Use IntelliSense to learn about possible Node.js debug attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit:
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Main Debug",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/electron",
      "windows": {
        "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/electron.cmd"
      },
      "program": "${workspaceRoot}/main.js",
      "protocol": "legacy"
    },
    {
      "name": "Renderer Debug",
      "type": "chrome",
      "request": "launch",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/electron",
      // These runtimeArgs are the usual recipe for exposing the renderer to
      // the Chrome debugger; adjust the port if it clashes with something else.
      "runtimeArgs": [
        "${workspaceRoot}",
        "--remote-debugging-port=9222"
      ],
      "sourceMaps": false
    }
  ]
}

Using this launch file, choose Main Debug to debug code in the main process and Renderer Debug to debug code in the renderer process.

Developing Unity projects for Moverio BT-300 AR glasses on Windows

Since the Moverio BT-300 AR glasses run Android 5.1 on an Atom processor, it is possible to run Unity projects on them. The starting point is the instructions here on setting up Unity for the Android platform. One problem is that the android command is apparently no longer included in Android Studio, so Unity builds will fail. To get Unity builds for Android to work, it is necessary to download and unzip the command line tools from the bottom of this page. This creates a directory tree that includes a tools directory, which should be used to replace the original tools directory in the Android Studio install, usually found at:


Incidentally, that is also the path that Unity needs to know in order to perform its builds.

There is a Unity plugin that provides support for 3D on the BT-300. For instructions on how to use the plugin, read:

 Assets > MoverioBT300UnityPlugin > MoverioController > README

The plugin includes a scene called MoverioTutorial that can be used as a starting point. It demonstrates some of the features of the plugin.

After the package name has been set in Player > Other Settings, it should be possible to build, deploy and run on the BT-300 directly from Unity. I had a few problems with the tutorial’s SDK functionality, but the Unity part seemed to work well (although I sometimes had to set 3D mode and disable the 2D camera manually). I am sure I am doing something wrong – I’ll update the post when I work out what is happening.

Connecting a webcam to a VirtualBox guest OS

I am running Ubuntu 16.04 in a VirtualBox VM on a Windows 10 machine and wanted to access the laptop’s webcam from a Python script running in the Ubuntu VM. The trick (as described here) is to enter this line on the host while the VM is running:

VBoxManage controlvm "vmname" webcam attach .0

where vmname is the name of the VM to be modified.

There doesn’t seem to be any need to add a USB filter for the webcam – doing that doesn’t seem to help at all.

The only problem with this is that the change isn’t permanent – it has to be run each time the VM is started. The simplest way to deal with that is to start the VM from a batch file:

cd "c:\Program Files\Oracle\VirtualBox"
VBoxManage startvm "vmname"
VBoxManage controlvm "vmname" webcam attach .0

Incidentally, this attaches the default webcam. Individual ones can be specified using .1, .2 etc. Use:

VBoxManage list webcams

to get a list of webcams and aliases.

Why HoloLens is like Aibo…except hopefully it isn’t

Looks like Aibo has got hold of my HoloLens again. So why is HoloLens like Aibo? Well, Aibo was an absolutely fantastic piece of engineering and way ahead of its time. Sony managed to make a viable consumer robot that didn’t do anything practical but was nevertheless highly entertaining. Some of the tricks it could do with its ball were very impressive, to say the least! The skill in building robots is to bring together a large number of disparate technologies and integrate them into a consistent product. Aibo is a great example of doing this very successfully.

HoloLens similarly brings a raft of disparate technologies into a very well engineered and complete device that seems to stand alone in terms of the totality of its capabilities for Mixed Reality. It really does remind me of Aibo in this regard.

Just one thing. Sony killed off the entire robotics effort because it wasn’t making enough cash in the short term – a wonderful example of myopia, in my opinion. I am hoping that Microsoft don’t fall into the same trap with HoloLens. This piece suggests that HoloLens won’t suffer a similar fate, which is fantastic. The AR and MR market is going to be driven by continuing developments that make devices smaller, lighter and longer-lasting on battery so that, one day, people will wear them all day and leave their smartphones gathering dust in a drawer. I look forward to seeing and using many future generations of HoloLens!