SR Research

Platforms:

Required Python Version:

Supported Models:

Additional Software Requirements

The SR Research EyeLink implementation of the ioHub common eye tracker interface uses the pylink package written by SR Research. If you are using a PsychoPy 3 standalone installation, this package should already be included.

If you are manually installing PsychoPy 3, please install the appropriate version of pylink. Downloads are available to SR Research customers from their support website.

On macOS and Linux, the EyeLink Developers Kit must also be installed for pylink to work. Please visit the SR Research support site for information about installing the EyeLink Developers Kit on macOS or Linux.

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.sr_research.eyelink.EyeTracker[source]

The SR Research EyeLink implementation of the Common Eye Tracker Interface can be used by providing the following EyeTracker path as the device class in the iohub_config.yaml device settings file:

eyetracker.hw.sr_research.eyelink

Examples

  1. Start ioHub with SR Research EyeLink 1000 and run tracker calibration:

    from psychopy.iohub import launchHubServer
    from psychopy.core import getTime, wait
    
    
    iohub_config = {'eyetracker.hw.sr_research.eyelink.EyeTracker':
                    {'name': 'tracker',
                     'model_name': 'EYELINK 1000 DESKTOP',
                     'runtime_settings': {'sampling_rate': 500,
                                          'track_eyes': 'RIGHT'}
                     }
                    }
    io = launchHubServer(**iohub_config)
    
    # Get the eye tracker device.
    tracker = io.devices.tracker
    
    # run eyetracker calibration
    r = tracker.runSetupProcedure()
    
  2. Print all eye tracker events received for 2 seconds:

    # Check for and print any eye tracker events received...
    tracker.setRecordingState(True)
    
    stime = getTime()
    while getTime()-stime < 2.0:
        for e in tracker.getEvents():
            print(e)
    
  3. Print current eye position for 5 seconds:

    # Check for and print current eye position every 100 msec.
    stime = getTime()
    while getTime()-stime < 5.0:
        print(tracker.getPosition())
        wait(0.1)
    
    tracker.setRecordingState(False)
    
    # Stop the ioHub Server
    io.quit()
    
clearEvents(event_type=None, filter_id=None, call_proc_events=True)

Clears any DeviceEvents that have occurred since the last call to the device’s getEvents(), or clearEvents() methods.

Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.

Parameters:

None

Returns:

None

enableEventReporting(enabled=True)[source]

enableEventReporting is the device-type independent method that is equivalent to the EyeTracker-specific setRecordingState method.

getConfiguration()

Retrieve the configuration settings information used to create the device instance. This will be the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).

Changing any values in the returned dictionary has no effect on the device state.

Parameters:

None

Returns:

The dictionary of the device configuration settings used to create the device.

Return type:

(dict)

getEvents(*args, **kwargs)

Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.

Parameters:
  • event_type_id (int) – If specified, only events matching this ioHub DeviceEvent ID are returned; events that have occurred but do not match the specified event type ID are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.

  • clearEvents (int) – Can be used to indicate if the events being returned should also be removed from the device event buffer. True (the default) removes the events being returned; False results in events being left in the device event buffer.

  • asType (str) – Optional kwarg giving the object type to return events as. Valid values include 'namedtuple' (the default).

Returns:

New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, older event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.

Return type:

(list)
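As a sketch of how the returned list might be consumed, the snippet below filters a list of time-ordered namedtuple events by type, mirroring what event_type_id does server-side. The EyeEvent tuple and the numeric type IDs are illustrative stand-ins, not real ioHub classes or EventConstants values:

```python
from collections import namedtuple

# Stand-in for the namedtuple event objects getEvents() returns by default.
EyeEvent = namedtuple('EyeEvent', ['type', 'time', 'gaze_x', 'gaze_y'])

MONOCULAR_EYE_SAMPLE = 51  # placeholder ID, not the real EventConstants value

events = [
    EyeEvent(MONOCULAR_EYE_SAMPLE, 0.010, 100.0, 50.0),
    EyeEvent(99, 0.012, 0.0, 0.0),                 # some other event type
    EyeEvent(MONOCULAR_EYE_SAMPLE, 0.014, 101.5, 49.0),
]

# Keep only one event type; ordering by time (oldest first) is preserved.
samples = [e for e in events if e.type == MONOCULAR_EYE_SAMPLE]
assert [e.time for e in samples] == [0.010, 0.014]
```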

getLastGazePosition()[source]

getLastGazePosition returns the most recent (x, y) eye gaze position received by the ioHub Server from the EyeLink device. This is the position on the calibrated 2D surface that the eye tracker is reporting as the current eye position, in the units in use by the Display device.

If binocular recording is being performed and both eyes are successfully being tracked, the average of the two eye positions is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data or is not connected, None is returned.

Returns:

None: If the eye tracker is not currently recording data or no eye samples have been received.

tuple: Latest (gaze_x, gaze_y) position of the eye(s).

Return type:

None or tuple
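Because getLastGazePosition can return either None or an (x, y) tuple, calling code should check the value before unpacking it. A minimal sketch (the helper function below is illustrative, not part of the ioHub API):

```python
def gaze_or_default(gaze_pos, default=(0.0, 0.0)):
    # getLastGazePosition() returns None when the tracker is not recording
    # or no samples have arrived yet; otherwise an (x, y) tuple in Display
    # coordinate units. Fall back to a default position in the None case.
    if gaze_pos is None:
        return default
    x, y = gaze_pos
    return (x, y)

# Typical use would be: gaze_or_default(tracker.getLastGazePosition())
assert gaze_or_default(None) == (0.0, 0.0)
assert gaze_or_default((120.5, -33.0)) == (120.5, -33.0)
```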

getLastSample()[source]

getLastSample returns the most recent EyeSampleEvent received from the EyeLink system. Any position fields are in Display device coordinate space. If the eye tracker is not recording or is not connected, then None is returned.

Returns:

None: If the eye tracker is not currently recording data.

EyeSample: If the eye tracker is recording in a monocular tracking mode, the latest sample event of this event type is returned.

BinocularEyeSample: If the eye tracker is recording in a binocular tracking mode, the latest sample event of this event type is returned.

Return type:

None, EyeSample, or BinocularEyeSample

getPosition()

See getLastGazePosition().

isRecordingEnabled()[source]

isRecordingEnabled returns True if the eye tracking device is currently connected and sending eye event data to the ioHub server. If the eye tracker is not recording, or is not connected to the ioHub server, False will be returned.

Returns:

True == the device is recording data; False == Recording is not occurring

Return type:

bool

runSetupProcedure(calibration_args={})[source]

Start the EyeLink Camera Setup and Calibration procedure.

During the system setup, the following keys can be used on either the Host PC or Experiment PC to control the state of the setup procedure:

  • C = Start Calibration

  • V = Start Validation

  • ENTER = Accept the calibration or validation at its end, or, in the case of validation, use the optional drift correction that can be performed as part of the validation process in the EyeLink system.

  • ESC = Exit the current state of the setup procedure at any time and return to the initial blank screen state.

  • O = Exit the runSetupProcedure method and continue with the experiment.

sendCommand(key, value=None)[source]

The sendCommand method sends an EyeLink command key and value to the EyeLink device. Any valid EyeLink command can be sent using this method. However, note that doing so is a device dependent operation, and it will have no effect on other implementations of the Common Eye Tracker Interface, unless the other eye tracking device happens to support the same command and value format.

If both key and value are provided, internally they are combined into a string of the form:

“key = value”

and this is sent to the EyeLink device. If only key is provided, it is assumed to include both the command name and any value or arguments required by the EyeLink all in the one argument, which is sent to the EyeLink device untouched.
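The key/value combination described above can be sketched as a pure function. This only illustrates the string format the docs describe; it is not ioHub's internal code, and the 'sample_rate' command name is a hypothetical example:

```python
def format_eyelink_command(key, value=None):
    # With both key and value, the command is combined as "key = value";
    # with only key, the string is passed to the EyeLink device untouched.
    if value is None:
        return str(key)
    return '%s = %s' % (key, value)

assert format_eyelink_command('sample_rate', 500) == 'sample_rate = 500'
assert format_eyelink_command('start_recording') == 'start_recording'
```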

sendMessage(message_contents, time_offset=None)[source]

The sendMessage method sends a string (max length 128 characters) to the EyeLink device.

The message will be time stamped and inserted into the native EDF file, if one is being recorded. If no native EyeLink data file is being recorded, this method is a no-op.

setRecordingState(recording)[source]

setRecordingState enables (recording=True) or disables (recording=False) the recording of eye data by the eye tracker and the sending of any eye data to the ioHub Server. The eye tracker must be connected to the ioHub Server by using the setConnectionState(True) method for recording to be possible.

Parameters:

recording (bool) – if True, the eye tracker will start recording data; if False, it will stop recording data.

Returns:

the current recording state of the eye tracking device

Return type:

bool

trackerSec()[source]

trackerSec returns the current EyeLink Host Application time in sec.msec format.

trackerTime()[source]

trackerTime returns the current EyeLink Host Application time in msec format as a long integer.
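Since trackerTime() reports the Host clock in milliseconds and trackerSec() reports the same clock in seconds, the two differ by a factor of 1000. A sketch of the conversion (the helper name is illustrative):

```python
def msec_to_sec(tracker_time_msec):
    # trackerTime() returns EyeLink Host Application time in msec as a
    # long integer; dividing by 1000 yields the sec.msec form that
    # trackerSec() reports.
    return tracker_time_msec / 1000.0

assert msec_to_sec(1234567) == 1234.567
```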

Supported Event Types

The EyeLink implementation of the ioHub eye tracker interface supports monocular or binocular eye samples as well as fixation, saccade, and blink events.

Eye Samples

class psychopy.iohub.devices.eyetracker.MonocularEyeSampleEvent(*args, **kwargs)[source]

A MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording from only one eye, or is recording from both eyes and averaging the binocular data.

Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE

Event Type String: ‘MONOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the sample. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

The horizontal position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

gaze_y

The vertical position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

angle_x

Horizontal eye angle.

angle_y

Vertical eye angle.

raw_x

The uncalibrated x position of the eye in a device specific coordinate space.

raw_y

The uncalibrated y position of the eye in a device specific coordinate space.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

Coordinate space type being used for pupil_measure_1.

ppd_x

Horizontal pixels per visual degree for this eye position as reported by the eye tracker.

ppd_y

Vertical pixels per visual degree for this eye position as reported by the eye tracker.

velocity_x

Horizontal velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_y

Vertical velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_xy

2D Velocity of the eye at the time of the sample; as reported by the eye tracker.

status

Indicates if eye sample contains ‘valid’ data. 0 = Eye sample is OK. 2 = Eye sample is invalid.

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

left_gaze_x

The horizontal position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_gaze_y

The vertical position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_angle_x

The horizontal angle of the left eye relative to the head.

left_angle_y

The vertical angle of the left eye relative to the head.

left_raw_x

The uncalibrated x position of the left eye in a device specific coordinate space.

left_raw_y

The uncalibrated y position of the left eye in a device specific coordinate space.

left_pupil_measure_1

Left eye pupil diameter.

left_pupil_measure1_type

Coordinate space type being used for left_pupil_measure_1.

left_ppd_x

Pixels per degree for the left eye horizontal position as reported by the eye tracker. Display distance must be correctly set for this value to be accurate.

left_ppd_y

Pixels per degree for the left eye vertical position as reported by the eye tracker. Display distance must be correctly set for this value to be accurate.

left_velocity_x

Horizontal velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_y

Vertical velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_xy

2D Velocity of the left eye at the time of the sample; as reported by the eye tracker.

right_gaze_x

The horizontal position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_gaze_y

The vertical position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_angle_x

The horizontal angle of the right eye relative to the head.

right_angle_y

The vertical angle of the right eye relative to the head.

right_raw_x

The uncalibrated x position of the right eye in a device specific coordinate space.

right_raw_y

The uncalibrated y position of the right eye in a device specific coordinate space.

right_pupil_measure_1

Right eye pupil diameter.

right_pupil_measure1_type

Coordinate space type being used for right_pupil_measure_1.

right_ppd_x

Pixels per degree for the right eye horizontal position as reported by the eye tracker. Display distance must be correctly set for this value to be accurate.

right_ppd_y

Pixels per degree for the right eye vertical position as reported by the eye tracker. Display distance must be correctly set for this value to be accurate.

right_velocity_x

Horizontal velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_y

Vertical velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_xy

2D Velocity of the right eye at the time of the sample; as reported by the eye tracker.

status

Indicates if eye sample contains ‘valid’ data for left and right eyes. 0 = Eye sample is OK. 2 = Right eye data is likely invalid. 20 = Left eye data is likely invalid. 22 = Eye sample is likely invalid.
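Given the four status codes listed above, per-eye validity can be recovered with a small lookup. This sketch covers only the documented values 0, 2, 20, and 22; the helper name is illustrative:

```python
def decode_binocular_status(status):
    # status codes per the BinocularEyeSampleEvent docs:
    #   0  = both eyes OK
    #   2  = right eye data likely invalid
    #   20 = left eye data likely invalid
    #   22 = both eyes likely invalid
    right_invalid = status in (2, 22)
    left_invalid = status in (20, 22)
    return {'left_ok': not left_invalid, 'right_ok': not right_invalid}

assert decode_binocular_status(0) == {'left_ok': True, 'right_ok': True}
assert decode_binocular_status(22) == {'left_ok': False, 'right_ok': False}
```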

Fixation Events

Successful eye tracker calibration must be performed prior to reading (meaningful) fixation event data.

class psychopy.iohub.devices.eyetracker.FixationStartEvent(*args, **kwargs)[source]

A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_START

Event Type String: ‘FIXATION_START’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.FixationEndEvent(*args, **kwargs)[source]

A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_END

Event Type String: ‘FIXATION_END’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.

Saccade Events

Successful eye tracker calibration must be performed prior to reading (meaningful) saccade event data.

class psychopy.iohub.devices.eyetracker.SaccadeStartEvent(*args, **kwargs)[source]
time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.SaccadeEndEvent(*args, **kwargs)[source]
time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.

Default Device Settings

Last Updated: January, 2021

