Eye Tribe Makes Jedi Mind Control Available for Your Android Device

Forget gesture and voice control – pretty soon, you’ll be gliding around on your smartphone with just your eyeballs. That’s how The Eye Tribe sees things, anyway, and they’re looking to put it into practice with their new Android SDK.

The Eye Tribe has already made their eye tracking technology compatible with Windows PCs, Macs, and tablets, but this year’s Mobile World Congress is the first time they’re announcing smartphone compatibility. The technology tracks your eyes to enable on-screen navigation and other features – retinal log-in, new kinds of gaming controls, and navigating apps and browsers solely with eye movements and blinking. And, yeah, they aren’t keeping it a secret – the technology also opens up new ways to collect data about how you use your phone. I’m sure you can guess how that will be used.

The Eye Tribe sells a $99 eye tracker and SDK for Windows and Mac, so with their new Android SDK, all of the bases are covered. Stay tuned to see how third parties start working The Eye Tribe’s technology into their apps in the near future.

  • Jeff Kang

    **Integrating an eye tracker into the hardware**

    “The Eye Tribe released its first eye-tracking product to developers in December — a long, thin $99 module that attaches to a Windows laptop, computer or tablet. It sold out immediately and the company is now working on a second batch. But it also has a more exciting proposition in the pipeline — a software development kit for Android phones that it eventually wants to see integrated into a wide range of mobile devices.

    “Most of the requisite hardware is already built into phones. The Eye Tribe just needs to persuade companies to integrate the technology.

    “All that’s required is a camera sensor with infrared capabilities. “What we know is that in Q4 this year, sensors are coming out that can switch between regular camera and infrared camera.””

    wired.co.uk/news/archive/2014-02/25/eye-tribe-android

    **Cost**

    “OEM vendors could likely add this sensor to their handsets for just five dollars”

    reviews.cnet.com/eye-tribe-shows-off-working-eye-tracking-on-a-mobile-phone/

    If adding eye tracking only adds five dollars to the manufacturing cost, then I’m sure that at least one smartphone, tablet, notebook, or laptop manufacturer will make the supposedly easy camera modification.

    **See before touch**

    I think that most of the time, a person will see the widget that they want to touch before they actually reach out and physically touch it.

    (The only time I’m not looking is when I press the Android Navigation Bar buttons near the bottom edge of the screen, although on the larger Nexus 10 I usually have to look at them first.)

    **Eyes + consecutively touching the same few buttons**

    For certain tasks, it might be convenient and fast to have the option of touching “single tap where I’m looking” and “swipe up where I’m looking” buttons. You would only need one or two buttons that are easy to reach (kind of like the Navigation Bar buttons at the bottom).

    Look, touch an easy-to-reach spot, look, and then touch the same button again. You don’t have to keep changing your hand and finger positions between each tap.

    “Looking at icons on a desktop instantly highlights them, and you can then tap anywhere on the screen to open up the selected app.”

    stuff.tv/mwc-2014/eyes-eye-tribe-we-play-fruit-ninja-using-nothing-our-eyeballs/feature

    I guess that in one of their demos, they temporarily made the entire screen a “tap where I’m looking” button.

    Besides the three default buttons in the Navigation Bar, you could add “single tap where I’m looking” and “swipe up where I’m looking” buttons (the latter perhaps to simulate a Page Down for reading), and those alone should allow you to do a lot of things.
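
    Here’s a rough sketch (in plain Android Java, since the Eye Tribe Android SDK and its API aren’t public yet) of how a “single tap where I’m looking” button could work inside a single app. The names (GazeTapController, onGazeSample, tapAtGaze) are placeholders, the gaze hook stands in for whatever callback the SDK ends up exposing, and the synthesized MotionEvents only reach views in your own app’s window:

    ```java
    // Hypothetical sketch: a "tap where I'm looking" action inside one app.
    // onGazeSample() is a stand-in for the eye tracker's gaze callback;
    // coordinates are assumed to already be in the root view's space.
    import android.os.SystemClock;
    import android.view.MotionEvent;
    import android.view.View;

    public class GazeTapController {
        private volatile float gazeX, gazeY;   // latest gaze sample
        private final View rootView;           // e.g. findViewById(android.R.id.content)

        public GazeTapController(View rootView) {
            this.rootView = rootView;
        }

        // Call this from the eye tracker's gaze listener (placeholder hook).
        public void onGazeSample(float x, float y) {
            gazeX = x;
            gazeY = y;
        }

        // Wire this to an easy-to-reach on-screen button; run on the UI thread.
        // (A "swipe up where I'm looking" version would add ACTION_MOVE events
        // between the down and up.)
        public void tapAtGaze() {
            long now = SystemClock.uptimeMillis();
            MotionEvent down = MotionEvent.obtain(now, now,
                    MotionEvent.ACTION_DOWN, gazeX, gazeY, 0);
            MotionEvent up = MotionEvent.obtain(now, now + 50,
                    MotionEvent.ACTION_UP, gazeX, gazeY, 0);
            rootView.dispatchTouchEvent(down);
            rootView.dispatchTouchEvent(up);
            down.recycle();
            up.recycle();
        }
    }
    ```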

    **Vertical touchscreen**

    If you have a vertically propped up tablet with an external keyboard, you could remap a keyboard button to be the “tap where I’m looking” button.
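
    With a keyboard, the same idea only needs a key hook. A minimal sketch, arbitrarily picking the spacebar and reusing the hypothetical GazeTapController from above (the layout name is a placeholder too):

    ```java
    import android.app.Activity;
    import android.os.Bundle;
    import android.view.KeyEvent;

    public class GazeTapActivity extends Activity {
        private GazeTapController gazeTapController;   // sketch from above

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);   // placeholder layout
            gazeTapController =
                    new GazeTapController(findViewById(android.R.id.content));
        }

        // The spacebar (picked arbitrarily) acts as the "tap where I'm looking" key.
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            if (keyCode == KeyEvent.KEYCODE_SPACE) {
                gazeTapController.tapAtGaze();
                return true;   // consume the key press
            }
            return super.onKeyDown(keyCode, event);
        }
    }
    ```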

    **Hands-free interaction**

    Even without the above options, I still think that having a page automatically scroll down when your eyes reach the bottom, or having an e-book automatically turn the page when your gaze reaches the corner of the text, would be pretty good features. They would be especially handy for computer interaction while cooking and eating, or for interacting with a vertically set-up touch device that is more than an arm’s length away while you do other things on the desktop.
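
    The auto-scroll part is easy to express: check each gaze sample against a band near the bottom of the view and nudge the page down when the gaze lands there. Another hypothetical sketch, with an arbitrary threshold and scroll step, assuming gaze samples arrive on the UI thread in the view’s coordinate space:

    ```java
    import android.widget.ScrollView;

    // Hypothetical sketch: scroll down while the gaze sits near the bottom.
    public class GazeAutoScroller {
        private static final float BOTTOM_BAND = 0.85f;   // lower 15% of the view
        private static final int SCROLL_STEP_PX = 120;    // arbitrary nudge

        private final ScrollView scrollView;

        public GazeAutoScroller(ScrollView scrollView) {
            this.scrollView = scrollView;
        }

        // Call from the eye tracker's gaze listener (placeholder hook).
        public void onGazeSample(float x, float y) {
            int height = scrollView.getHeight();
            if (height > 0 && y > height * BOTTOM_BAND) {
                scrollView.smoothScrollBy(0, SCROLL_STEP_PX);
            }
        }
    }
    ```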