Gaze Technology

  • Have any of you read up on Gaze Technology? It looks like Windows 8 is going to include, or at least be able to support, Gaze Technology. (This is where the computer tracks your eye movement and a pseudo-mouse pointer goes where you look. It won't be very accurate, but the mouse takes over from that location for the fine tuning and clicking.)   Article/video

    I say we get to implementing a plugin for this, because I've got some REAL GOOD ideas! Imagine what WE could do with that technology.
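
    Just to make the idea concrete, here is a tiny Python sketch (nothing Construct-specific; the gaze samples and mouse deltas are made-up placeholder values) of that "gaze warps the pointer, mouse fine-tunes" behaviour:

    ```python
    # Sketch of "gaze positions the pointer coarsely, the mouse fine-tunes".
    # In a real plugin the gaze sample would come from the tracker's SDK and
    # the deltas from normal mouse events; here they are just literal values.

    WARP_THRESHOLD = 150  # px: only jump the cursor when the gaze is clearly elsewhere

    def update_cursor(cursor, gaze, mouse_delta):
        """Return the new cursor position (x, y)."""
        cx, cy = cursor
        gx, gy = gaze
        # Coarse step: if the user looks far away from the cursor, warp it there.
        if (gx - cx) ** 2 + (gy - cy) ** 2 > WARP_THRESHOLD ** 2:
            cx, cy = gx, gy
        # Fine step: small corrections always come from the (precise) mouse.
        return (cx + mouse_delta[0], cy + mouse_delta[1])

    pos = (100, 100)
    pos = update_cursor(pos, gaze=(800, 300), mouse_delta=(0, 0))  # warp to the gaze
    pos = update_cursor(pos, gaze=(803, 298), mouse_delta=(4, 0))  # mouse fine-tunes
    print(pos)  # (804, 300)
    ```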

  • O_o

    Whoa.... that would be awesome.

    Although I suppose there would be security issues with a website tracking your EYE MOVEMENTS!!

  • You don't need a particular plugin, since the eye tracker moves your mouse cursor.

  • What if there are more people watching the same screen? ^^

  • What if there are more people watching the same screen? ^^

    Multiplayer ftw

  • You do have to attach the peripheral though, from what I read, so it's not like you wouldn't know it's happening. Though I'm sure at some point the hardware will be built in by default and set to always-on, just for your convenience.

    If you can also track pupil dilation, though, it could be useful for gauging emotional response. That, and seeing what you're looking at and in what order, would definitely be useful to advertisers.


  • Cool stuff though! Kind of reminds me of HAL... Just wait until our computer starts telling us we are too tired to play any more and shuts off, because our eyelids are droopy!!!!

  • The scary side to this is just imagining what the U.S. government will do with this technology. They are already tracking and recording keystrokes, so if you accidentally looked at the wrong website, link, or article you'd be put into their terrorist database. Then SEAL Team 6 would kick in your door and assassinate you.

  • The scary side to this is just imagining what the U.S. government will do with this technology. They are already tracking and recording keystrokes, so if you accidentally looked at the wrong website, link, or article you'd be put into their terrorist database. Then SEAL Team 6 would kick in your door and assassinate you.

    Yep. You know a book or movie is probably being made about this as we speak!

    "Computer, snap a picture of the individual's eyes at IP Address computer console xyz. Computer, run through Identification database... Place individual on watch list ALPHA!"

  • Just for information: the Tobii records the glint of an IR light on the retina, so no arousal information from pupil dilation.

    Another point: you can't infer what people are doing just by looking at their mouse-trail patterns (because in the end, that's all it is). You need to correlate that with what was onscreen at the moment they were looking.

    And no computer available to the public at a reasonable price can record a fullscreen video (let's say something starting from 1440x900) while generating the graphics at the same time, without stuttering. Think about full-frame acquisition, storing in RAM, real-time compression and disk access time, all while rendering the graphics... (rough numbers are sketched at the end of this post).

    And, just for the record, I've worked with these, so I'm not saying they aren't useful: for disabled people with hemiplegia or tetraplegia, it's a wonderful tool. It's the difference between being locked in your body and at least interacting with a computer to say and do things to the world...
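
    Rough numbers for the fullscreen-recording point above (illustrative only; uncompressed RGB at 3 bytes per pixel assumed):

    ```python
    # Raw bandwidth needed to grab a 1440x900 screen at 60 fps, uncompressed RGB.
    width, height, fps, bytes_per_px = 1440, 900, 60, 3

    bytes_per_frame = width * height * bytes_per_px   # ~3.9 MB per frame
    bytes_per_second = bytes_per_frame * fps          # ~233 MB/s sustained
    print(bytes_per_frame / 1e6, "MB per frame")
    print(bytes_per_second / 1e6, "MB per second")
    ```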

  • Pode, you're no fun. I did not read about the glint; that is interesting... Identification from the eye is currently being used by the government. All they need is a picture of it. Every person's iris is different, like a fingerprint. If they are using a couple of IR cameras, the likelihood of it being used against you is slim, but honestly, if the government wanted to, and you have any kind of camera hooked up to your machine, they could already do what we were suggesting. But for this to be true, they would have to have a picture of your eye already on file.

    Boy, how we did digress on this thread...

    My original point is, I want to use eye tracking technology in my games. I hope it comes to pass!

    Pode, "I worked with those". Have you used eye tracking technology?

  • Zetar: but I'm fun! (At least ask those on the IRC channel; I always have something new to discover regarding babies, with my daughter.)

    Yes, I worked with one of those devices, but not a Tobii specifically.

    About the glint and identification: they aren't the same thing.

    The glint, used in eye tracking, comes from an infrared LED shone through the cornea, bouncing back off the retina and coming back out through the cornea, leaving 4 "dots" (because of the two refractions at the cornea). That light blob pattern is then tracked by the eye tracker (a rough blob-detection sketch is at the end of this post).

    It only works if the head isn't moving, obviously. If the head is free to move, you also need to know the head's displacement relative to the camera (a plane of displacement parallel to the camera, in fact, plus three axes of rotation for the head on the neck).

    It's still possible to track that: you need a wide-angle camera working in visible light (a regular webcam, in fact) to track the head movements. You can use models like Active Appearance Models, or Dementhon's POSIT (because, for the problem at hand, the head is a rigid body), to follow the head.

    When I worked on that 5 years ago in my lab, using OpenCV (Matlab is slow as a turtle for that), it wasn't possible to do it in real time.

    I remember the setup: a USB webcam (the ToUcam II, at that time the only one able to do 60 fps), and a second one with the IR filter removed.

    In the end, we could only do the tracking on recorded videos, because no computer could do in real time what we wanted: with two webcams streaming at 60 fps, you need to run the AAM on one image, blob tracking on the other, and mouse-pointer interpolation. Remember that the head isn't moving all the time, and the eye, even when moving, doesn't cover the whole image, so we needed to build a relationship between a displacement on a 320x240 or 640x480 eye image and the whole screen, at 1650 pixels wide or something; each 1-pixel estimation error from the webcam cost us roughly a x3 error on the cursor position! And since the eye is jittering all the time because of saccades, you can imagine the performance was terrible!

    Furthermore, the USB2 protocol is maxed out by images at 640x480 at 60 fps (remember that you have the overhead of the protocol on top of that, and DMA was good enough to handle two cameras in parallel...). The quick numbers at the end of this post give an idea of the scale.

    With Tobii, all that is done on the chip, so it can be done fast. But there are two problems, in the end. First, we don't really know what the health effects are of shining IR lights 8 hours a day or more onto the retina of a geek (my wife is an optometrist, and she did a literature review on that). Second, there are three interactions with the mouse pointer: move, designate, interact. You can do the move part with the eye tracker, and you can designate by leaving the cursor on the target, but how do you interact (i.e. send a command to the computer)? You need another channel (and no, the eyebrow isn't going to cut it, even by frowning or blinking. Too cumbersome. At the end of the day, you are going to die with two heavy, muscular eyelids).

    To do efficient iris recognition, you need to take a high-res picture of the iris and unwrap the image from Cartesian to polar coordinates (to make the comparison against a database possible). That's why it only works with really good cameras (CSI only works on TV, you know. And by the way, who in Hollywood thinks you can walk onto a crime scene in a suit, and that a blonde medical examiner can search for evidence on a body with her hair loose? DNA everywhere, anyone?)
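
    About the glint tracking mentioned earlier in this post: the glints are small, very bright spots in the IR image, so a simple threshold plus connected-components pass finds the candidates. A minimal OpenCV sketch, assuming a grayscale IR frame on disk (the file name and thresholds are made up, not from any real system):

    ```python
    # Minimal glint (bright blob) detector for a grayscale IR eye image.
    import cv2

    frame = cv2.imread("ir_eye_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame

    # Keep only near-saturated pixels: the IR LED glints are far brighter than the iris.
    _, bright = cv2.threshold(frame, 240, 255, cv2.THRESH_BINARY)

    # Label connected bright regions and keep only the small ones (the glints);
    # large regions are more likely reflections off glasses or the sclera.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(bright)
    glints = [tuple(centroids[i]) for i in range(1, n)            # label 0 is background
              if 2 <= stats[i, cv2.CC_STAT_AREA] <= 50]

    print("candidate glint centres:", glints)
    ```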
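
    And the quick numbers behind the x3 cursor error and the USB2 ceiling mentioned above (back-of-the-envelope only, assuming a screen about 1680 px wide and 8-bit grayscale frames):

    ```python
    # Error amplification: a displacement measured on the small camera image has to
    # drive a cursor across the whole screen, so estimation errors scale with the ratio.
    camera_width = 640                     # px of the eye image
    screen_width = 1680                    # px of the monitor ("1650 or something")
    print(screen_width / camera_width)     # ~2.6, i.e. roughly a x3 error per pixel

    # USB2 ceiling: two 640x480 cameras at 60 fps, 1 byte per pixel (grayscale).
    per_camera = 640 * 480 * 60            # ~18.4 MB/s each
    print(2 * per_camera / 1e6, "MB/s")    # ~36.9 MB/s, close to USB2's practical limit
    ```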

  • "At the end of the day, you are going to die with two heavy, muscular eyelids"   <img src="smileys/smiley11.gif" border="0" align="middle" />

    LOL - very interesting, Pode!

  • I appreciate this is an old post, but it's the only one I could see on creating an eye tracking object plugin. Pode, although you are correct about the limited computer interaction with eye trackers, for some people their eyes are all they have. We work with people whose only controllable body part is their eyes, and eye trackers open up a world of opportunities for them. Often this can be enhanced by using adapted switches in conjunction with the eye tracking to provide some extra functionality.

    I was wondering if you are aware of anyone having made progress on this. We have two Tobii PCEye trackers and are starting to develop games, both educational and for entertainment, for our eye tracking clients. Although the games are playable using the Tobii Mouse Emulation mode and the mouse object, it would improve the experience if they could be properly eye enabled, using an eye tracking object, to avoid the erroneous interactions that can arise from using the older Mouse Emulation instead of the newer Tobii Gaze Selection (a minimal dwell-selection sketch is at the end of this post).

    Hope to hear from you.
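
    For what it's worth, the usual way to get the "interact" step without a mouse click is dwell selection: a target only activates after the gaze has rested on it for a while, which is also how gaze-selection style tools avoid accidental clicks. A minimal Python sketch of the idea (names and timings are illustrative, not any vendor's API):

    ```python
    # Dwell-to-select: fire a target only after the gaze has stayed on it long enough.

    DWELL_SECONDS = 0.8   # how long the gaze must rest on a target to "click" it

    class DwellSelector:
        def __init__(self, dwell=DWELL_SECONDS):
            self.dwell = dwell
            self.current = None   # target currently being fixated
            self.since = 0.0      # time the gaze entered that target

        def update(self, target, now):
            """Feed the target under the gaze (or None) at time `now` (seconds).
            Returns the target to activate, or None."""
            if target != self.current:
                self.current, self.since = target, now
                return None
            if target is not None and now - self.since >= self.dwell:
                self.current, self.since = None, now   # reset so it fires only once
                return target
            return None

    # Toy usage: gaze enters "play_button" at t=1.0 and is still there at t=2.0.
    sel = DwellSelector()
    print(sel.update("play_button", 1.0))   # None (just arrived)
    print(sel.update("play_button", 1.5))   # None (still dwelling)
    print(sel.update("play_button", 2.0))   # "play_button" (dwell time reached)
    ```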

  • Heh.. co-op would make me laugh so much.
