Wednesday, February 08, 2012

Google Glasses rumor. Is it time to take heads-up displays seriously as a device form factor?

Android Authority Story >>

In a smartphone market filled with look-alike, black rectangular products, is it time to eye fresh, new form factors?

Way back in the year 2000, I had the opportunity to try a few near-to-eye displays being pitched at several major trade shows. The devices came from both start-ups and major consumer electronics vendors that saw these gadgets as a potential new computing form factor.

So, the rumor is that Google is actively developing a stand-alone heads-up display device that could be, in effect, "smart glasses." The glasses would run a slimmed-down version of Android on an ARM processor, support voice recognition, and carry a set of sensors, such as a gyroscope and an accelerometer, to detect head movement and direction.
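The sensor side of that rumored setup maps closely onto what stock Android already exposes through its SensorManager API. As a rough sketch only (nothing here comes from Google; the class name and logging are mine), here is how an ordinary Android app could track which way the wearer's head is pointing by reading the rotation-vector sensor, which fuses the gyroscope and accelerometer:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    // Hypothetical demo activity; not derived from any actual Google Glass software.
    public class HeadTrackingActivity extends Activity implements SensorEventListener {

        private SensorManager sensorManager;
        private Sensor rotationSensor;

        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            // TYPE_ROTATION_VECTOR fuses gyroscope, accelerometer, and magnetometer readings.
            rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        }

        @Override
        protected void onResume() {
            super.onResume();
            sensorManager.registerListener(this, rotationSensor, SensorManager.SENSOR_DELAY_UI);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) return;
            // Convert the rotation vector to a rotation matrix, then to azimuth/pitch/roll.
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
            SensorManager.getOrientation(rotationMatrix, orientation);

            float headingDegrees = (float) Math.toDegrees(orientation[0]); // which way the wearer faces
            float pitchDegrees = (float) Math.toDegrees(orientation[1]);   // looking up or down

            Log.d("HeadTracking", "heading=" + headingDegrees + " pitch=" + pitchDegrees);
            // A heads-up display would use these values to decide what overlay to render.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this sketch.
        }
    }

In other words, the head-tracking piece of the rumor requires nothing exotic; the hard part is the optics and the user experience, not the sensors.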

Apparently one of the key applications of the glasses is augmented reality. Such glasses could, for example, use an outward-facing camera to detect what products the user is looking at, then display additional information and pricing details. It's worth considering what new kinds of information collection this would enable: Google already knows what we're looking at in the virtual world. Now the company could begin to collect information about what people are looking at in the physical world as well.

Privacy issues aside, I think Google and other vendors of heads-up displays (HUDs) and near-to-eye displays (NEDs) have an uphill battle. While I think discreet wearable computing has significant potential, I can't say I see eye-to-eye with such an in-your-face approach to keeping users connected. But as always, it's important to remain open-minded about change, and to maintain a heads-up approach to monitoring potential form-factor shifts and opportunities.


View to a kill. (via 9TO5 Google)


Sony's near-to-eye display: the market seems far away.


