Pupil Labs - Neon

High Level Neon Introduction

Neon is a calibration-free, wearable eye tracker. The system consists of two inward-facing eye cameras and one forward-facing world camera mounted on a wearable eyeglasses-like frame.

Neon provides gaze data in its world camera’s field of view, regardless of the wearer’s head position. As such, gaze can be analysed with the wearer looking and moving freely in their environment.

This distinguishes Neon from remote eye trackers, which employ cameras mounted on or near a computer monitor and report gaze in screen-based coordinates. Screen-based coordinates facilitate closed-loop analyses of gaze based on the known on-screen positions of stimuli.

To use Neon for screen-based work in |PsychoPy|, the screen needs to be robustly located within the world camera’s field of view, and Neon’s gaze data subsequently transformed from world camera-based coordinates to screen-based coordinates. This is achieved with the use of AprilTag Markers.
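The plugin performs this transformation internally; as a rough illustration of the idea, the detected AprilTag markers define a homography from world-camera pixels to screen coordinates, which is then applied to each gaze point. Below is a minimal sketch assuming an already-estimated 3×3 homography `H` (the function name is hypothetical, not part of the plugin's API):

```python
import numpy as np

def map_gaze_to_screen(gaze_xy, H):
    """Apply a 3x3 homography H (world-camera pixels -> screen pixels)
    to a single gaze point, using homogeneous coordinates."""
    x, y = gaze_xy
    v = H @ np.array([x, y, 1.0])
    return (v[0] / v[2], v[1] / v[2])

# With the identity homography, the point is unchanged:
print(map_gaze_to_screen((100.0, 200.0), np.eye(3)))  # (100.0, 200.0)
```

In practice the homography is re-estimated continuously from the marker detections, so the mapping stays valid as the wearer moves their head.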

Neon in the "Just Act Natural" frame

For a detailed overview of wearable vs remote eye trackers, check out this Pupil Labs blog post.

Join the Pupil Labs Discord community to share your research and/or questions.

Device, Software, and Connection Setup

Setting Up the Eye Tracker

  1. Follow Neon’s Getting Started guide to set up the headset and Companion Device.

Setting Up |PsychoPy|

  1. Install the Pupil Labs plugin using the Plugin Manager and restart Builder.

  2. Open the experiment settings in the Builder window (cog icon in the top panel)

  3. Open the Eyetracking tab

  4. Modify the properties as follows:

    • Select Pupil Labs (Neon) from the Eyetracker Device drop-down menu

    • Companion address / Companion port - Defines how to connect to the Companion Device. These values can be found in the Neon Companion app by clicking the Stream button in the top-left corner of the app.

  5. Add AprilTag components to the routines that require eyetracking.

    • The April Tag Frame component provides an easy way to add an array of AprilTag markers around the edges of the display.

    • If the April Tag Frame is not workable (for example, you need to render a stimulus in the corner of the display), advanced marker placement is possible using individual April Tag components.

    • Three markers are generally considered the bare minimum; more markers will yield more robust detection and more accurate mapping.

    • All markers which are displayed simultaneously must each use a unique marker ID.

    • Markers can be placed anywhere on the screen as long as they are fully visible and do not overlap.
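To make the placement rules above concrete, here is a small sketch (not part of the plugin; the dictionary keys are hypothetical) that computes four non-overlapping marker positions with unique IDs, one per display corner, in PsychoPy "norm" units:

```python
def corner_marker_layout(size=0.2, margin=0.05):
    """Positions for four unique-ID AprilTag markers, one per display
    corner, in PsychoPy 'norm' units ([-1, 1] on each axis).  Each marker
    is inset by `margin` so it stays fully visible, and the corners are
    far enough apart that markers cannot overlap."""
    edge = 1.0 - margin - size / 2.0
    corners = [(-edge, edge), (edge, edge), (edge, -edge), (-edge, -edge)]
    return [{"marker_id": i, "pos": pos, "size": size}
            for i, pos in enumerate(corners)]

layout = corner_marker_layout()
print([m["marker_id"] for m in layout])  # [0, 1, 2, 3]
```

In Builder you would normally let the April Tag Frame component do this for you; manual layouts like this are only needed for the advanced-placement case described above.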

A sample experiment is available for reference.

Implementation and API Overview

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.pupil_labs.neon.EyeTracker(*args, **kwargs)[source]

Bases: EyeTrackerDevice

Implementation of the Common Eye Tracker Interface for the Neon eye tracker.

Uses ioHub’s polling method to process data received from the Neon Companion app’s real-time API.

To synchronize time between the Companion Device and PsychoPy, the integration estimates the offset between their clocks and applies it to the incoming data. This transforms timestamps between the two clocks while taking the transmission delay into account. For details, see this real-time time-sync tutorial.
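The idea behind the offset estimation can be sketched in plain Python: query the remote clock repeatedly, and for the exchange with the smallest round-trip time assume the remote reading was taken halfway through the round trip. This is a simplified sketch of the approach described in the linked tutorial, not the integration's actual code:

```python
import time

def estimate_clock_offset(remote_clock, local_clock=time.monotonic, n=20):
    """Estimate offset = remote_clock - local_clock, compensating for
    transmission delay.  Each exchange is timestamped locally before and
    after; the exchange with the smallest round-trip time (rtt) gives the
    most trustworthy offset estimate."""
    best = None
    for _ in range(n):
        t0 = local_clock()
        remote = remote_clock()          # in a real setup: a network request
        t1 = local_clock()
        rtt = t1 - t0
        offset = remote - (t0 + rtt / 2.0)
        if best is None or rtt < best[0]:
            best = (rtt, offset)
    return best[1]

# Simulate a remote clock that runs 5 s ahead of the local one:
offset = estimate_clock_offset(lambda: time.monotonic() + 5.0)
print(round(offset, 2))  # ≈ 5.0
```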

Note

Only one instance of EyeTracker can be created within an experiment. Attempting to create > 1 instance will raise an exception.

getLastGazePosition() Tuple[float, float] | None[source]

The getLastGazePosition method returns the most recent eye gaze position received from the eye tracker. This is the position on the AprilTag-defined 2D screen surface that the eye tracker reports as the current gaze position. The units are those in use by the ioHub Display device.

If binocular recording is being performed, the average position of both eyes is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.

Returns:

  • None:

    If the eye tracker is not currently recording data or no eye samples have been received.

  • tuple:

    Latest (gaze_x,gaze_y) position of the eye(s)

getLastSample() None | psychopy.iohub.devices.eyetracker.MonocularEyeSampleEvent | psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent[source]

The getLastSample method returns the most recent eye sample received from the Eye Tracker. The Eye Tracker must be in a recording state for a sample event to be returned, otherwise None is returned.

Returns:

  • MonocularEyeSample:

    Gaze mapping result

  • BinocularEyeSample:

    Eye state data

  • None:

    If the eye tracker is not currently recording data.

isConnected() bool[source]

isConnected returns whether the ioHub EyeTracker Device is connected to the Neon Companion app or not. A Neon headset must be connected and working properly for any of the Common Eye Tracker Interface functionality to work.

Parameters:

None

Returns:

bool: True = the eye tracking hardware is connected. False otherwise.

isRecordingEnabled() bool[source]

The isRecordingEnabled method indicates if the eye tracker device is currently recording data.

Returns:

bool: True if the device is currently recording data; False otherwise.

setConnectionState(enable: bool) None[source]

setConnectionState either enables (setConnectionState(True)) or disables (setConnectionState(False)) active communication between ioHub and the Neon Companion app.

Note

A connection to the Eye Tracker is automatically established when the ioHub Process is initialized (based on the device settings in the iohub_config.yaml), so there is no need to explicitly call this method in the experiment script.

Note

Connecting an Eye Tracker to the ioHub does not necessarily collect and send eye sample data to the ioHub Process. To start actual data collection, use the Eye Tracker method setRecordingState(bool) or the ioHub Device method (device type independent) enableEventRecording(bool).

Parameters:

enable (bool) – True = enable the connection, False = disable the connection.

Returns:

None. Use isConnected() to query the resulting connection state.

setRecordingState(should_be_recording: bool) bool[source]

The setRecordingState method is used to start or stop the recording and transmission of eye data from the eye tracking device to the ioHub Process.

If the pupil_capture_recording.enabled runtime setting is set to True, a corresponding raw recording within Pupil Capture will be started or stopped.

should_be_recording will also be passed to EyeTrackerDevice.enableEventReporting().

Parameters:

should_be_recording (bool) – if True, the eye tracker will start recording data; if False, it will stop recording data.

Returns:

bool: the current recording state of the eye tracking device

trackerSec() float[source]

Returns EyeTracker.trackerTime()

Returns:

The eye tracker hardware’s reported current time in sec.msec-usec format.

trackerTime() float[source]

Returns the current time reported by the eye tracker device.

Implementation measures the current time in PsychoPy time and applies the estimated clock offset to transform the measurement into tracker time.

Returns:

The eye tracker hardware’s reported current time.
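In other words, the returned value is simply the PsychoPy timestamp plus the estimated clock offset (offset = tracker clock − PsychoPy clock). As a trivial sketch with an illustrative function name:

```python
def psychopy_to_tracker_time(psychopy_time, clock_offset):
    """Convert a PsychoPy timestamp into tracker time, where clock_offset
    is the estimated difference between the tracker and PsychoPy clocks."""
    return psychopy_time + clock_offset

print(psychopy_to_tracker_time(10.0, 2.5))  # 12.5
```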

Supported Event Types

The Neon–|PsychoPy| integration provides real-time access to MonocularEyeSampleEvent events for gaze data.

class psychopy.iohub.devices.eyetracker.MonocularEyeSampleEvent(*args, **kwargs)[source]

A MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording from only one eye, or is recording from both eyes and averaging the binocular data.

Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE

Event Type String: ‘MONOCULAR_EYE_SAMPLE’

device_time: float

time of gaze measurement, in sec.msec format, using Pupil Capture clock

logged_time: float

time at which the sample was received in PsychoPy, in sec.msec format, using PsychoPy clock

time: float

time of gaze measurement, in sec.msec format, using PsychoPy clock

delay: float

The difference between logged_time and time, in sec.msec format

gaze_x: float

x component of gaze location in display coordinates.

gaze_y: float

y component of gaze location in display coordinates.

Eye state data, if enabled, is provided through BinocularEyeSampleEvent events.

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(*args, **kwargs)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

device_time: float

time of eye state measurement, in sec.msec format, using Pupil Capture clock

logged_time: float

time at which the sample was received in PsychoPy, in sec.msec format, using PsychoPy clock

time: float

time of eye state measurement, in sec.msec format, using PsychoPy clock

delay: float

The difference between logged_time and time, in sec.msec format

left_eye_cam_x: float
  • x component of left eye’s position relative to the scene camera

left_eye_cam_y: float
  • y component of left eye’s position relative to the scene camera

left_eye_cam_z: float
  • z component of left eye’s position relative to the scene camera

left_gaze_x: float
  • x component of left eye’s optical axis vector

left_gaze_y: float
  • y component of left eye’s optical axis vector

left_gaze_z: float
  • z component of left eye’s optical axis vector

left_pupil_measure1: float
  • left eye pupil diameter in mm

right_gaze_x: float
  • x component of right eye’s optical axis vector

right_gaze_y: float
  • y component of right eye’s optical axis vector

right_gaze_z: float
  • z component of right eye’s optical axis vector

right_pupil_measure1: float
  • right eye pupil diameter in mm
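As an example of what the eye-state fields enable, the vergence angle between the two eyes can be computed from the left_gaze_* and right_gaze_* optical-axis vectors. This is an illustrative helper, not part of the API:

```python
import math

def vergence_angle_deg(left_vec, right_vec):
    """Angle in degrees between the two eyes' optical-axis vectors,
    e.g. (left_gaze_x, left_gaze_y, left_gaze_z) and the right-eye
    equivalent.  Parallel axes give 0 degrees."""
    dot = sum(a * b for a, b in zip(left_vec, right_vec))
    nl = math.sqrt(sum(a * a for a in left_vec))
    nr = math.sqrt(sum(b * b for b in right_vec))
    cos_angle = max(-1.0, min(1.0, dot / (nl * nr)))  # clamp rounding error
    return math.degrees(math.acos(cos_angle))

print(round(vergence_angle_deg((0.0, 0.0, 1.0), (0.0, 0.0, 1.0)), 1))  # 0.0
```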

Default Device Settings

Last Updated: October, 2024
