Vuzix M100 Smart Glasses – A Developer’s Perspective


Back in September 2013, I saw the product announcement for the Vuzix M100 Smart Glasses.

The blurb said “It is an Android-based wearable computer, enhanced with a wearable monocular display and computer, recording features and wireless connectivity capabilities designed for commercial, professional, and prosumer users.”

Reading beyond the hype, the device has real potential merit – especially for the field service applications we develop at Microsec. Specifically, it could bring real benefit to one of our systems, used to manage medical water filters at hospitals, which needs to be operated as hands-free as possible: the operator needs both hands to change the filter and scan barcodes, so avoiding the need to hold a smartphone or tablet is a genuine advantage.

So, deciding it was time to invest a bit in the future, I ordered the SDK – with associated (all important) hardware – for $999.


The SDK arrived promptly, within a few days, while the hardware was destined to arrive sometime in October. Fair enough, the product is still in development after all.

To cut a long story short, the physical device finally arrived on 2nd January 2014. To be honest, I wasn’t too upset, as it was clear in my communication with Vuzix that the product is still very much in development.

What’s In The Box?

On unpacking the unit, I was pleasantly surprised with what appeared.

Vuzix M100 Package Contents

  • A well-constructed carrying case
  • A pair of Vuzix safety glasses
  • A backup battery
  • A USB charging data lead
  • A soft carrying case

The kit for mounting the M100 on the glasses was documented on the delivery note as ‘to follow’.

The M100 Itself

The build quality is very good – the device is sturdily constructed and certainly feels like it’s going to last. It consists of a main unit containing the main processor, along with an earpiece, buttons (more on those later), a USB socket, a gesture sensor (more later) and LED indicators.


The buttons available are:

  • On/off button on the bottom
  • Back/select button on the top at the back
  • Volume up/down buttons on the top at the front

These are a bit fiddly to use – especially as the buttons on the top have two functions. Short-pressing the back/select button selects the current item, while long-pressing it has the same effect as the back button on an Android device. I found myself struggling with this – repeatedly going back when I wanted to select.

The up/down buttons can be re-mapped by applications to have specific actions depending on whether they are long-pressed or short-pressed.
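In an Android app, this kind of remapping boils down to timing how long a key is held and mapping the key/duration pair to an action. The sketch below is a plain-Java illustration of the idea – the keycode values mirror those in android.view.KeyEvent, but the class name, action names and threshold are my own:

```java
// Sketch: classify a key press as short or long and map it to an app action.
// Keycode values mirror android.view.KeyEvent; everything else is illustrative.
public class KeyRemapper {
    public static final int KEYCODE_VOLUME_UP = 24;   // value from android.view.KeyEvent
    public static final int KEYCODE_VOLUME_DOWN = 25; // value from android.view.KeyEvent
    static final long LONG_PRESS_MS = 500;            // typical Android long-press timeout

    /** Returns the action name for the given key held for heldMs milliseconds. */
    public static String actionFor(int keyCode, long heldMs) {
        boolean longPress = heldMs >= LONG_PRESS_MS;
        switch (keyCode) {
            case KEYCODE_VOLUME_UP:
                return longPress ? "PAGE_UP" : "SCROLL_UP";
            case KEYCODE_VOLUME_DOWN:
                return longPress ? "PAGE_DOWN" : "SCROLL_DOWN";
            default:
                return "UNHANDLED";
        }
    }

    public static void main(String[] args) {
        System.out.println(actionFor(KEYCODE_VOLUME_UP, 120));  // prints SCROLL_UP
        System.out.println(actionFor(KEYCODE_VOLUME_UP, 800));  // prints PAGE_UP
    }
}
```

On the device itself, the equivalent logic would live in the activity’s onKeyDown/onKeyUp overrides; the duration check is what distinguishes the two mapped functions per button.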

Gesture Sensor

Given that the input options with the buttons are somewhat limited, the device comes with a gesture sensor, allowing hand gestures to be used to control it. I can already see the “I’m going to look like a real wally” thought process going through readers’ minds at this concept: so here I am, wearing this bit of kit, and now I’m going to start waving my hand around next to it to control it… However, it’s an intriguing idea.

Unfortunately I’ll just have to remain intrigued, as the gesture sensor API hasn’t been shipped yet, so I couldn’t try it out.

Other Sensors

The M100 also comes equipped with the standard GPS and accelerometer sensors, both of which are fully functioning. I installed a standard Android compass/GPS app, which worked well, and a Xamarin accelerometer app, which confirmed that these sensors do indeed work.


Fitting the device was quite fiddly, as it is important to get the screen in just the right position in your field of view. The headset allows adjustment up and down, as well as of the distance from the eye. In addition, the arm can be bent laterally and the display rotated up or down by +/- 15 degrees.

The temptation is to position the display directly in front of the eye – but this reveals an immediate flaw. It’s not transparent, and blocks your vision. In fact, I think the best position is just below (well, below or above, but I found below to be better) your main field of vision, as illustrated below.

It looks like the unit is blocking my vision, but in fact it’s just below a natural distance view. This way, you can see normally and just have to glance down to see the screen.

Firing It Up – The Screen

Pressing the power button results in a Vuzix splash icon, then the standard Android startup icon, and finally the home screen is displayed.


The first thing to notice is that it’s got a very custom size and aspect ratio. The actual figures (taken from the emulator setup instructions) are:

Physical Size: 3.9″, Width: 428dp, Height: 240dp, Size: small, Density: ldpi

Of course the ‘physical size’ here relates to the size as the screen appears through the focussing lens. Crucially, the lens presents the screen at a distance of a long arm’s length, so switching view between looking around and down to the screen requires little eye accommodation. In addition – as can be seen from the photo – the device sits happily in front of a pair of normal glasses.
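For anyone sanity-checking those figures: dp values convert to physical pixels by multiplying by the density scale factor, which for ldpi is 0.75 (120/160 dpi). A minimal sketch of the arithmetic (the class name is mine):

```java
// Sketch: convert density-independent pixels (dp) to physical pixels.
// Per the Android docs, ldpi devices use a scale factor of 0.75 (120/160 dpi).
public class DpConverter {
    public static int dpToPx(int dp, float density) {
        return Math.round(dp * density);
    }

    public static void main(String[] args) {
        float ldpi = 0.75f;
        // The M100's reported 428dp x 240dp at ldpi:
        System.out.println(dpToPx(428, ldpi) + " x " + dpToPx(240, ldpi) + " px");
        // prints: 321 x 180 px
    }
}
```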

Installed Apps

It’s a bit disappointing that there are no M100-specific apps already installed on the device, to show off its features. The standard apps which are installed consist of:

  • Browser
  • Calculator
  • Calendar
  • Clock
  • Contacts
  • Downloads
  • Email
  • Gallery
  • Movie
  • Music
  • Search
  • Settings
  • Sound

Given the limited input options on the physical unit, it’s questionable whether several of these (for example, Calendar, Contacts, Email) are of much use – they may actually serve to emphasise its shortcomings. For example, the Calendar app works:

But would you really want to use an M100 for viewing your calendar?

The camera is more interesting – here, in Escher-esque mode, I’m viewing the Droid@Screen mirror of the M100 on my laptop using the camera.

Installing Apps

So, let’s get some more interesting apps on the device.

Here’s where a bit of fun and games comes in, as the M100 doesn’t have the Google Play store, and apps therefore need to be installed over USB via ADB. (Vuzix are promising an M100-specific app store in the near future.)

The M100 uses the Google USB driver, which needs to be installed via the Android SDK Manager; the android_winusb.inf file then needs to be edited with the right details, and the M100 updated with the driver through the Windows Device Manager (I’m running Windows 7). All a bit fiddly, but fairly familiar if you’ve ever had to get a non-standard Android device talking over USB.

Once you’ve got the device visible via ADB (not forgetting to turn on USB debugging on the M100 itself – something I forgot to do), you can install apps either by using the

adb install

command, or by using the M100 System File Manager – a Windows utility which allows you to install APKs on the device and also, incidentally, reflash it with new OS versions.
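For scripted or repeated deployments, the adb invocation can of course be driven from code. The sketch below just builds the command line to run via ProcessBuilder – the APK name is illustrative, and running it for real requires the Android platform-tools on the PATH and USB debugging enabled:

```java
// Sketch: build the adb command used to sideload an APK over USB.
// The -r flag is standard adb: it replaces an existing install, keeping app data.
public class AdbInstall {
    public static String[] installCommand(String apkPath, boolean reinstall) {
        return reinstall
            ? new String[] {"adb", "install", "-r", apkPath}
            : new String[] {"adb", "install", apkPath};
    }

    public static void main(String[] args) throws Exception {
        String[] cmd = installCommand("FieldApp.apk", true);  // hypothetical APK name
        // To actually run it:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
        System.out.println(String.join(" ", cmd));  // prints: adb install -r FieldApp.apk
    }
}
```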

The Proof Of The Pudding – Custom Apps

So given that we now can get apps onto the device, what can we run?

Xamarin.Android Apps

As a Xamarin developer, I was interested in whether a Xamarin app – written, of course, in C# – would run on the M100.

I needn’t have worried. The standard ‘Hello World’ Xamarin.Android app runs just fine.


Of course, I shouldn’t have been too surprised – after all the M100 is a fairly standard device when you’re at the API level.

Accelerometer and GPS

I also – as previously mentioned – successfully installed the Xamarin.Android accelerometer app, showing that it’s possible to detect motion in all three axes, along with a GPS app. With these capabilities, thoughts start to form regarding possible heads-up navigation or augmented-reality applications.

Barcode Scanning

The standard Barcode Scanner app – built on the excellent zxing android library – also runs very well on the M100. The camera is sharp and focusses well – it scans 1D codes very efficiently, as the screenshot below shows.


The problem with using the built-in scanner is that, while it reduces the number of hands needed, one needs to get the barcode into just the right position in the field of vision, which means positioning the M100 (and therefore one’s head) about 15cm from the barcode.

It would be better if we could use a separate dedicated scanner, paired with the M100 via bluetooth. This is a solution we use on an iOS application which is used for tracking water filters in medical institutions – it has the advantage of very efficient scanning as well as reduced battery consumption on the mobile device.

Alas, trying to pair the M100 with a Bluetooth Koamtac KDC300 scanner has proved problematic – the first version of the M100 OS had a problem with the Bluetooth pairing dialogue, while the latest version displays the dialogue OK but pairing fails. The jury’s therefore still out on the effectiveness of this solution with the M100.

Remote Control

All the apps discussed so far run on the M100 with all input coming from the M100 device itself. However, there is also the option to control the M100 from a paired Android device (with iOS support coming soon, we’re told).

Vuzix have supplied their own Smart Glasses Manager app, which installs on an Android device and allows the user to control the M100 by either using the separate device’s screen as a trackpad to control the M100 cursor, or pushing button clicks to the M100. The Smart Glasses Manager has a representation (but not a live view) of the M100 screen, with buttons to:

  • Move up/down on screen lists
  • Simulate the ‘Home’ button
  • Simulate the ‘Back’ button
  • Push text to the M100 from the linked device’s keyboard
  • Bring up the current app’s menu


The screenshot above shows the Smart Glasses Manager in action – with the M100 screen mirrored at the top, up/down and keyboard buttons in the middle, the trackpad area below this and the Back, Home, Menu and Sync buttons below the trackpad. The latter button synchronises the M100 screen mirror between the devices.

With the Smart Glasses Manager, it’s possible to control the M100 much more effectively than with the on-device buttons. However, one has to ask why have the M100 at all if you’re also going to need a separate device to drive it efficiently.

Also, the exact API for driving the M100 from the remote is not straightforward, it seems. The method recommended by Vuzix is to use dispatchKeyEvent to send key events to the application. However, this is only simple for an app running on the M100 itself. Controlling an app from a separate device will, I think, require an Android service running on the M100 which picks up Bluetooth data from the remote application and then sends the key events on to the app – not completely straightforward, and crying out for a packaged solution. Perhaps a candidate for a third-party library.
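At its core, such a relay service would translate messages arriving over Bluetooth into Android key events to dispatch. The sketch below shows just that translation step as plain Java – the keycode values mirror android.view.KeyEvent, but the command names and class are entirely hypothetical:

```java
// Sketch: translate commands received from a remote-control app into Android keycodes.
// Keycode values mirror android.view.KeyEvent; the command names are invented.
public class RemoteKeyRelay {
    public static final int KEYCODE_BACK = 4;       // android.view.KeyEvent.KEYCODE_BACK
    public static final int KEYCODE_DPAD_UP = 19;   // android.view.KeyEvent.KEYCODE_DPAD_UP
    public static final int KEYCODE_DPAD_DOWN = 20; // android.view.KeyEvent.KEYCODE_DPAD_DOWN
    public static final int KEYCODE_ENTER = 66;     // android.view.KeyEvent.KEYCODE_ENTER

    /** Maps a textual command from the remote into the keycode to dispatch, or -1 if unknown. */
    public static int keyCodeFor(String command) {
        switch (command) {
            case "UP":     return KEYCODE_DPAD_UP;
            case "DOWN":   return KEYCODE_DPAD_DOWN;
            case "SELECT": return KEYCODE_ENTER;
            case "BACK":   return KEYCODE_BACK;
            default:       return -1;  // unknown command: ignore
        }
    }
}
```

On the M100, the service would then wrap each returned keycode in a KeyEvent and hand it to the target activity’s dispatchKeyEvent – the plumbing between the Bluetooth socket and the foreground app is exactly the part that wants packaging up in a library.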

Possible Applications

So, with all these features, what possible applications spring to mind for the M100? It’s here that I start to struggle.

Warehouse picking

This is one of the first possibilities – helped by a very slick video on the Vuzix website. However, it’s clear that the video is a mock-up, and not taken from any real device.

The crucial needs are efficient barcode scanning, clear presentation of information, and easy app navigation. For efficient scanning, one will either need large barcodes that can be scanned from a distance, or a separate scanner. For easy app navigation, very careful screen design will be needed, making the most of the M100’s limited buttons.

Field Service App

Our filter management system (currently running on iOS) is a good candidate for the M100. If the user only had to hold the scanner, with the application running on the M100, we would have a significant increase in ease of use over the current system, where the user has to juggle the iPhone and scanner.

However, there are modes of operation where text needs to be input into the app (for example, when entering data manually). For this, we will need to write a separate partner app to run on an Android phone or iPhone, which sends the input text to the M100 app. Not impossible and maybe the best of both worlds.


Presentations

One can see M100s being useful for presentations, with autocue features allowing the presenter to wander around freely while being fed with information. However, the app would again need very careful design to avoid the presenter getting lost and having an awkward “Erm, hang on a minute…” moment while fiddling with his/her M100.

There’s also the question of looking like a complete geek while presenting!

Field Instruction Manual

Given the hands-free nature of the device, one attractive concept we are investigating is presenting step-by-step instructions to an engineer or technician performing in-field maintenance – again, however, careful design will be needed!


Conclusions

The concept of wearable, hands-free computers is fascinating, and I think – from what I’ve managed to see so far – that the M100 has the potential to be a good implementation. It’s a bit rough around the edges in terms of functionality and implemented features (the lack of gesture controls, for example), but Vuzix are sending out updates regularly. I will post further articles as and when significant updates come out.

The main problem – and it’s a problem shared by other wearable technologies (Google Glass for example) – is designing a compelling, really useful application which takes full advantage of the device. But then perhaps I just lack the necessary imagination!

18 thoughts on “Vuzix M100 Smart Glasses – A Developer’s Perspective”

  1. I’d really like to chat with you further about your use of the M100. As the European PR Manager for Vuzix we’d love to create some case studies based around your usage. I hope this would be possible. Regards

  2. Thank you for the post! It’s great to get some hands-on info from a developer’s perspective. It’s even better that you mention barcode scanning. As soon as we receive our M100, we will check whether reading barcodes from a more comfortable distance (arm’s length) is feasible. That will depend on the image effective resolution and noise level and whether we can apply our algorithms to those images. I look forward to reading more about your experience with the M100.

  • Hi Benoit,
    One thing to consider when thinking of arm’s-length scanning is how the camera auto-focuses. I’m not sure whether it will focus on the barcode effectively if it’s not the main feature in the frame – depending on other high-contrast items also visible.

      The ZXing library definitely needs the barcode to be close up for scanning to work – as you say, it depends on how your algorithms work. It will be interesting to find out.

      • You are right, autofocus can be a lot of trouble, and on smartphones too!
    Another factor to consider is the size of the barcode (the width of the thinnest bar). Some are just too small to be read at arm’s length.
        That being said, we just received our unit and the first tests are very encouraging. We will post our findings soon.

  3. Pingback: Barcode Scanning with the Vuzix M100 Smart Glasses | Vision Smarts blog

    • I’ve just tried Benoit’s M100 demo of the Pic2shop app – and it’s very impressive. Very reliable, efficient scanning.

      The problem now is not that I have to hold the barcode close to the device, but that my arms aren’t long enough to get the code far enough away!

      As Benoit says in his article, judicious use of zoom, and also good algorithms implemented in the VisionSmarts software, are crucial to good performance.
      There remains the worry for me regarding battery usage on the device – even with the more efficient, less CPU-intensive algorithms, day-long commercial scanning will always challenge the battery capacity of the M100.

  4. Hi,

    You said, “The M100 uses the Google USB driver, and therefore this needs to be installed Android SDK manager, and then the android_winusb.inf file needs to be edited with the right details and the M100 updated with the driver through the Windows Device Manager (I’m running Windows 7). All a bit fiddly but also fairly familiar if you’ve had to get a non-standard Android device talking over USB.”

    Is there any way you can go into more details about that? I have never had to get a non-standard Android device talking over USB, and that’s where I’m stuck right now.

    Josh Davis

    • Hi Josh,
      My apologies for not responding earlier. I haven’t got the details of what we needed to do to hand. However, Randy Nunez has commented that he’s sorted this out (quite recently too) – it’s worth contacting him to see if he can help.



    • Hi Zuhry,
      Sorry – I don’t have much of an example application to hand. It really depends what you want to do with the device! I suggest you check the Vuzix and Xamarin websites.



  5. Is global positioning navigation already built into the M100, or do we need a separate beacon device to establish position?

    • The device has GPS incorporated. However, you need to be aware that GPS reception inside a building can be poor, so navigation won’t be possible there (this is of course true of any mobile device with GPS).
