Changes between Version 2 and Version 3 of ProjectopenGazerPage


Timestamp: 03/20/18 00:51:43 (7 years ago)
Author: Monika Rizova

= Opengazer: open-source gaze tracker for ordinary webcams =
''Description''
Opengazer is an open-source gaze tracker that uses an ordinary webcam to estimate the direction of your gaze and can pass that information to other applications. It has two versions: the first was developed by Piotr Zieliński and supported by Samsung; the second is a revived version by Emli-Mari Nel, now supported by the European Commission in the context of the AEGIS project and by the Gatsby Charitable Foundation. The workflow of the first version had three stages: feature point selection, calibration, and tracking. First you select the feature points with your mouse, and the algorithm tracks those points in subsequent frames. Then comes calibration, during which red dots are displayed at various positions on the screen and images of the eyes are extracted. Finally, gaze is predicted from the extracted eye images using a trained Gaussian Process. The current version is working on head tracking and a gesture switch.

Opengazer comes from the Machine Intelligence Laboratory in the Cambridge University Engineering Department.
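The calibrate-then-predict idea above can be sketched briefly. This is not Opengazer's actual code (which is C++ with its own Gaussian Process implementation); it is only an illustration of the scheme using scikit-learn's `GaussianProcessRegressor` and made-up eye-image data, where each calibration sample pairs an extracted eye image with the known on-screen target position.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical calibration data: each row stands in for a flattened
# grayscale eye image captured while the user looked at a known target.
rng = np.random.default_rng(0)
eye_images = rng.random((9, 16 * 8))  # 9 calibration targets, 16x8-pixel patches

# Known target positions in normalised screen coordinates (a 3x3 grid).
targets = np.array([(x, y) for y in (0.1, 0.5, 0.9)
                           for x in (0.1, 0.5, 0.9)])

# Fit a Gaussian Process mapping eye images to (x, y) screen positions.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(eye_images, targets)

# At runtime, each freshly tracked eye image yields a gaze estimate.
new_eye = rng.random((1, 16 * 8))
gaze_xy = gp.predict(new_eye)  # one (x, y) prediction
```

In practice the regression input would be a preprocessed eye image from the tracked feature points, and the quality of the estimate depends heavily on how well calibration covered the screen.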
URL:
 1. http://www.inference.org.uk/opengazer/
 2. http://www.inference.org.uk/opengazer/#opengazerprevious
== '''Project Anatomy''' ==