Back in September 2013, I saw the product announcement for the Vuzix M100 Smart Glasses.
The blurb said “It is an Android-based wearable computer, enhanced with a wearable monocular display and computer, recording features and wireless connectivity capabilities designed for commercial, professional, and prosumer users.”
Reading beyond the hype, the device has real potential merit – especially for field service applications, which we develop at Microsec. Specifically, it could have real benefit in one of our systems, used to manage medical water filters at hospitals, which requires as hands-free an operation as possible – the operator needs two hands to change the filter and scan barcodes, so avoiding the need to hold a smartphone or tablet has real benefits.
So, deciding it was time to invest a bit in the future, I ordered the SDK – with associated (all important) hardware – for $999.
The SDK arrived promptly, within a few days, while the hardware was destined to arrive sometime in October. Fair enough, the product is still in development after all.
To cut a long story short, the physical device finally arrived on 2nd January 2014. To be honest, I wasn’t too upset, as it was clear in my communication with Vuzix that the product is still very much in development.
What’s In The Box?
On unpacking the unit, I was pleasantly surprised with what appeared.
- A well-constructed carrying case
- A pair of Vuzix safety glasses
- A backup battery
- A USB charging data lead
- A soft carrying case
The kit for mounting the M100 on the glasses was documented on the delivery note as ‘to follow’.
The M100 Itself
The build quality is very good – it is sturdily constructed and certainly feels like it’s going to last. The device consists of a main unit containing the main processor, along with an earpiece, buttons (more on those later), a USB socket, a gesture sensor (more later) and LED indicators.
The buttons available are:
- On/off button on the bottom
- Back/select button on the top at the back
- Volume up/down buttons on the top at the front
These are a bit fiddly to use – especially as the buttons on the top have two functions. Short-pressing the back/select button selects the current item, while long-pressing it has the same effect as the back button on an Android device. I found myself struggling with this – repeatedly going back when I wanted to select.
The up/down buttons can be re-mapped by applications to have specific actions depending on whether they are long-pressed or short-pressed.
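To illustrate how an app might take over those buttons, here is a minimal sketch using the standard Android key-event callbacks. It assumes the M100 reports its up/down buttons as ordinary volume KeyEvents – the Vuzix SDK may well define its own key codes instead, and the `nextItem`/`selectItem` helpers are hypothetical placeholders for app-specific actions.

```java
import android.app.Activity;
import android.view.KeyEvent;

// Sketch: remapping a hardware button to short-press/long-press actions,
// assuming it arrives as a standard volume KeyEvent.
public class RemappedButtonsActivity extends Activity {

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
            event.startTracking();        // enables long-press detection
            return true;                  // consume: don't change the volume
        }
        return super.onKeyDown(keyCode, event);
    }

    @Override
    public boolean onKeyUp(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP && !event.isCanceled()) {
            nextItem();                   // short press: move to next item
            return true;                  // (isCanceled() is set if long-press fired)
        }
        return super.onKeyUp(keyCode, event);
    }

    @Override
    public boolean onKeyLongPress(int keyCode, KeyEvent event) {
        if (keyCode == KeyEvent.KEYCODE_VOLUME_UP) {
            selectItem();                 // long press: select current item
            return true;
        }
        return super.onKeyLongPress(keyCode, event);
    }

    private void nextItem()   { /* app-specific: hypothetical */ }
    private void selectItem() { /* app-specific: hypothetical */ }
}
```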
Given that the input options with the buttons are somewhat limited, the device comes with a gesture sensor, allowing hand gestures to be used to control the device. I can already see the “I’m going to look like a real wally” thought processes going through readers’ minds at this concept. So here I am, wearing this bit of kit, and now I’m going to start waving my hand around next to it to control it… However, it’s an intriguing idea.
Unfortunately I’ll just have to remain intrigued, as the gesture sensor API hasn’t been shipped yet, so I couldn’t try it out.
The M100 also comes equipped with the standard GPS and accelerometer sensors, both fully functioning – I installed a standard Android compass/GPS app, which worked well, and a Xamarin accelerometer app, which confirmed that these sensors are indeed working.
Fitting the device was quite fiddly, as it is important to get the screen in just the right position in your field of view. The headset allows adjustment up and down, as well as distance away from the eye. In addition, the arm can be bent laterally and the display rotated up or down by +/- 15 degrees.
The temptation is to position the display directly in front of the eye – but this reveals an immediate flaw. It’s not transparent, and blocks your vision. In fact, I think the best position is just below (well, below or above, but I found below to be better) your main field of vision, as illustrated below.
Firing It Up – The Screen
Pressing the power button results in a Vuzix splash icon, then the standard Android startup icon, and finally the home screen is displayed.
The first thing to notice is that it’s got a very custom size and aspect ratio. The actual figures (taken from the emulator setup instructions) are:
Physical Size: 3.9″, Width: 428dp, Height: 240dp, Size: small, Density: ldpi
Of course the ‘physical size’ here relates to the size as the screen appears through the focussing lens. Crucially, the lens presents the screen at a distance of a long arm’s length, so switching view between looking around and down to the screen requires little eye accommodation. In addition – as can be seen from the photo – the device sits happily in front of a pair of normal glasses.
It’s a bit disappointing that there are no M100-specific apps already installed on the device, to show off its features. The standard apps which are installed consist of:
Given the limited input options on the physical unit, it’s questionable whether several of these (for example, Calendar, Contacts, Email) are of much use, and they may actually serve to emphasize its shortcomings. For example, the Calendar app works:
The camera is more interesting – here, in Escher-esque mode, I’m viewing the Droid@Screen mirror of the M100 on my laptop using the camera.
So, let’s get some more interesting apps on the device.
Here’s where a bit of fun and games comes in, as the M100 doesn’t have the Google Play store, and therefore apps need to be installed over USB using ADB. (Vuzix are promising an M100-specific app store in the near future.)
The M100 uses the Google USB driver, so this needs to be installed via the Android SDK Manager; the android_winusb.inf file then needs to be edited with the right details, and the M100 updated with the driver through the Windows Device Manager (I’m running Windows 7). All a bit fiddly, but also fairly familiar if you’ve had to get a non-standard Android device talking over USB.
Once you’ve got the device visible via ADB (not forgetting to turn on debugging on the M100 itself – something I forgot to do), you can install apps either by using the adb install command, or by using the M100 System File Manager – a Windows utility which allows you to install APKs on the device and also, incidentally, reflash the device with new OS versions.
The Proof Of The Pudding – Custom Apps
So given that we now can get apps onto the device, what can we run?
As a Xamarin developer, I was interested in whether a Xamarin app – written, of course, in C# – would run on the M100.
I needn’t have worried. The standard ‘Hello World’ Xamarin.Android app runs just fine.
Of course, I shouldn’t have been too surprised – after all the M100 is a fairly standard device when you’re at the API level.
Accelerometer and GPS
I also – as previously mentioned – successfully installed the Xamarin.Android accelerometer app, showing that it’s possible to detect motion in the three axes, along with a GPS app. With these capabilities, thoughts start to form regarding possible heads-up navigation or enhanced reality type applications.
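Since the sensors use the standard Android API – which is exactly why off-the-shelf sensor apps run unmodified – reading the accelerometer looks like the usual listener pattern. A minimal sketch (the head-tracking logic itself is left as a placeholder):

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: reading the M100's accelerometer via the standard Android
// sensor API; nothing here is Vuzix-specific.
public class MotionActivity extends Activity implements SensorEventListener {
    private SensorManager sensorManager;

    @Override
    protected void onResume() {
        super.onResume();
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accel,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this); // spare the (small) battery
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];  // acceleration in m/s^2, per axis
        float y = event.values[1];
        float z = event.values[2];
        // feed x/y/z into head-up navigation or augmented-reality logic here
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```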
The standard Barcode Scanner app – built on the excellent zxing android library – also runs very well on the M100. The camera is sharp and focusses well – it scans 1D codes very efficiently, as the screenshot below shows.
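For our own apps, the natural route is to invoke that installed Barcode Scanner app through its public scan Intent, rather than embedding zxing directly. A sketch of the round trip (the filter-management hand-off at the end is, of course, our own hypothetical step):

```java
import android.app.Activity;
import android.content.Intent;

// Sketch: launching the zxing Barcode Scanner app via its public scan
// Intent and reading the decoded result back.
public class ScanActivity extends Activity {
    private static final int SCAN_REQUEST = 49374;

    private void startScan() {
        Intent intent = new Intent("com.google.zxing.client.android.SCAN");
        intent.putExtra("SCAN_MODE", "ONE_D_MODE"); // restrict to 1D codes
        startActivityForResult(intent, SCAN_REQUEST);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == SCAN_REQUEST && resultCode == RESULT_OK) {
            String contents = data.getStringExtra("SCAN_RESULT");      // decoded text
            String format = data.getStringExtra("SCAN_RESULT_FORMAT"); // e.g. CODE_128
            // hand off to the filter-management workflow here (hypothetical)
        }
    }
}
```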
The problem with using the inbuilt scanner is that, while it reduces the number of hands needed, one needs to get the barcode in just the right position in the field of vision, which means positioning the M100 (and therefore one’s head) about 15cm from the barcode.
It would be better if we could use a separate dedicated scanner, paired with the M100 via bluetooth. This is a solution we use on an iOS application which is used for tracking water filters in medical institutions – it has the advantage of very efficient scanning as well as reduced battery consumption on the mobile device.
Alas, trying to pair the M100 with a Bluetooth Koamtac KDC300 scanner has proved problematic – the first version of the M100 OS had a problem with the Bluetooth pairing dialogue, while the latest version displays the dialogue OK but pairing fails. So the jury’s out on the effectiveness of this solution with the M100.
All the apps discussed so far run on the M100 with all input coming from the M100 device itself. However, there is also the option to control the M100 from a paired Android device (with iOS support coming soon, we’re told).
Vuzix have supplied their own Smart Glasses Manager app, which installs on an Android device and allows the user to control the M100 by either using the separate device’s screen as a trackpad to control the M100 cursor, or pushing button clicks to the M100. The Smart Glasses Manager has a representation (but not a live view) of the M100 screen, with buttons to:
- Move up/down on screen lists
- Simulate the ‘Home’ button
- Simulate the ‘Back’ button
- Push text to the M100 from the linked device’s keyboard
- Bring up the current app’s menu
The screenshot above shows the Smart Glasses Manager in action – with the M100 screen mirrored at the top, up/down and keyboard buttons in the middle, the trackpad area below this and the Back, Home, Menu and Sync buttons below the trackpad. The latter button synchronises the M100 screen mirror between the devices.
With the Smart Glasses Manager, it’s possible to control the M100 much more effectively than with the on-device buttons. However, one has to ask why have the M100 at all if you’re also going to need a separate device to drive it efficiently.
Also, the exact API for driving the M100 from the remote is not straightforward, it seems. The method recommended by Vuzix is to use dispatchKeyEvent to send key events to the application. However, this is only straightforward for an app running on the M100 itself. To control an app from a separate device will, I think, require an Android service running on the M100 to pick up Bluetooth data from the remote application and then send the key events to the app – as I say, not completely straightforward, and crying out for a packaged solution. Perhaps a candidate for a third-party library.
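The easy half of that picture – turning a received remote command into a simulated button press inside your own app – can be sketched as below. The command channel itself (Bluetooth socket, bound service, or whatever) is the hypothetical part, and injecting events into other apps is the genuinely hard bit that needs the packaged solution.

```java
import android.app.Activity;
import android.view.KeyEvent;

// Sketch: inside an app running on the M100, translating a remote
// command string into key events via dispatchKeyEvent. How the command
// arrives (Bluetooth, service, etc.) is deliberately left out.
public class RemoteControlledActivity extends Activity {

    // Called when our hypothetical remote-control channel delivers a command.
    void onRemoteCommand(String command) {
        if ("DOWN".equals(command)) {
            injectKey(KeyEvent.KEYCODE_DPAD_DOWN);
        } else if ("SELECT".equals(command)) {
            injectKey(KeyEvent.KEYCODE_DPAD_CENTER);
        }
    }

    private void injectKey(int keyCode) {
        // A key "press" is a down event followed by an up event.
        dispatchKeyEvent(new KeyEvent(KeyEvent.ACTION_DOWN, keyCode));
        dispatchKeyEvent(new KeyEvent(KeyEvent.ACTION_UP, keyCode));
    }
}
```

Note that dispatchKeyEvent only reaches the Activity it is called on – which is exactly why controlling an arbitrary foreground app from a separate device needs something more elaborate.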
So, with all these features, what possible applications spring to mind for the M100? It’s here that I start to struggle.
This is one of the first possibilities – helped by a very slick video on the Vuzix website. However, it’s clear that the video is a mock-up, and not taken from any real device.
The crucial need is for efficient barcode scanning, clear presentation of information, and easy app navigation. For efficient scanning, one will either need large barcodes that can be scanned from a distance, or a separate scanner. For easy app navigation, very careful screen design will be needed, making good use of the M100’s limited buttons.
Field Service App
Our filter management system (currently running on iOS) is a good candidate for the M100. If the user only had to hold the scanner, with the application running on the M100, we would have a significant increase in ease of use over the current system, where the user has to juggle the iPhone and the scanner.
However, there are modes of operation where text needs to be input into the app (for example, when entering data manually). For this, we will need to write a separate partner app to run on an Android phone or iPhone, which sends the input text to the M100 app. Not impossible and maybe the best of both worlds.
One can see M100s being useful for presentations, with autocue features allowing the presenter to wander around freely while being fed with information. However, the app would again need very careful design to avoid the presenter getting lost and having an awkward “Erm, hang on a minute…” moment while fiddling with his/her M100.
There’s also the question of looking like a complete geek while presenting!
Field Instruction Manual
Given the hands-free nature of the device, one attractive concept we are investigating is presenting step-by-step instructions to an engineer or technician performing in-field maintenance – again, however, careful design will be needed!
The concept of wearable, hands-free computers is fascinating, and I think – from what I’ve managed to see so far – that the M100 has the potential for a good implementation. It’s a bit rough around the edges in terms of functionality and implemented features (for example, the lack of gesture controls), but Vuzix are sending out updates regularly. I will post further articles as and when significant updates come out.
The main problem – and it’s a problem shared by other wearable technologies (Google Glass for example) – is designing a compelling, really useful application which takes full advantage of the device. But then perhaps I just lack the necessary imagination!