Will smartphones and tablets soon be wondering, "What touched me?"
There are almost certainly more than one billion touch devices in active use today. Android-based devices account for at least half a billion, iOS for around 400 million, and there are tens of millions of touch Symbian devices still being used. And there are Windows Phone devices, and there are Bada phones. The list goes on, and it is touching.
With at least 90% of smartphone sales being full-touch monoblocks, the number of full-touch devices is growing very quickly. So it's nice to think that many of these devices could be introduced to a new UI paradigm through a software upgrade.
Check out this user input enhancement called FingerSense from startup Qeexo. FingerSense informs the underlying platform about WHAT is touching the screen: a thumb or an index finger. A knuckle or a fingernail. A big, fat stylus or a little, skinny one. Etc.
Apparently Qeexo accomplishes this with a clever software algorithm that can, among other things, distinguish the sounds and physical vibrations of whatever is touching the screen, combined with variables such as contact thickness and placement.
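To make the idea concrete, here is a minimal sketch of how such a classifier *might* work in principle. This is not Qeexo's algorithm; everything here (the feature names, the two touch classes, the reference values) is a made-up illustration of the general approach of combining vibration features with touch geometry:

```python
import math

def features(waveform, contact_area):
    """Crude touch features from a short vibration snippet plus contact area:
    mean signal energy, zero-crossing rate, and the reported touch size."""
    energy = sum(s * s for s in waveform) / len(waveform)
    zcr = sum(1 for a, b in zip(waveform, waveform[1:]) if a * b < 0) / len(waveform)
    return (energy, zcr, contact_area)

# Hypothetical reference points: a knuckle tap tends to be a sharp,
# high-energy impulse with a small contact patch; a fingertip is softer
# and broader. Real systems would learn these from training data.
CENTROIDS = {
    "knuckle":   (0.80, 0.30, 0.2),
    "fingertip": (0.20, 0.10, 0.6),
}

def classify(waveform, contact_area):
    """Assign the touch to the nearest reference point in feature space."""
    f = features(waveform, contact_area)
    return min(CENTROIDS, key=lambda k: math.dist(f, CENTROIDS[k]))
```

For example, a spiky, oscillating waveform with a small contact area would land nearest the "knuckle" centroid, while a gentle, low-energy touch with a large area would classify as "fingertip". A production system would presumably use far richer acoustic features and a trained model rather than hand-picked centroids.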
I could imagine that clever UI designers could do some amazing things with such flexibility. And poor UI designers could ruin a platform.
With touch device activations running around two million per day (roughly 23 per second), it's nice to think there's something a bit refreshing on the horizon. Unfortunately, I've seen lots of good ideas like this die on the operating table. I do hope this one has the magic touch.
PCMAG: Android Device Activations Top 500 Million
Engadget: Apple brags: sells 365 million iOS devices, 140 million iMessage users