The eyelink_core library defines special data types that allow the same programming calls to be used on different platforms such as Windows, Linux, and macOS. You will need to know these types to read the examples and to write your own experiments. Using these types in your code will also help to prevent common program bugs, such as signed/unsigned conversions and integer-size dependencies.
Several platform-portable data types are defined in eyetypes.h, which is automatically included in eyelink.h and eye_data.h. This creates the data types below:
| Type Name | Data | Uses |
| --- | --- | --- |
| byte | 8-bit unsigned byte | Images, text, and buffers |
| INT16 | 16-bit signed word | Return codes, signed integers |
| UINT16 | 16-bit unsigned word | Flags, unsigned integers |
| INT32 | 32-bit signed dword | Long data, signed time differences |
| UINT32 | 32-bit unsigned dword | Timestamps |
You only need to read this section if you are planning to use real-time link data for gaze-contingent displays or gaze-controlled interfaces, or to use data playback.
The EyeLink library defines a number of data types that are used for link data transfer, found in eye_data.h. These are based on the basic data types above. The useful parts of these structures are discussed in the following sections.
There are two basic types of data available through the link: samples and events.
The EyeLink tracker measures eye position 250, 500, 1000, or 2000 times per second depending on the tracker and tracking mode you are working with, and computes true gaze position on the display using the head camera data. This data is stored in the EDF file, and made available through the link in as little as 2 milliseconds after a physical eye movement.
Samples can be read from the link by eyelink_get_float_data() or eyelink_newest_float_sample(). These functions store the sample data in a structure of type FSAMPLE:
```c
typedef struct {
    UINT32 time;           // time of sample
    INT16  type;           // always SAMPLE_TYPE
    UINT16 flags;          // flags to indicate contents
    // binocular data: indices are 0 (LEFT_EYE) or 1 (RIGHT_EYE)
    float  px[2], py[2];   // pupil xy
    float  hx[2], hy[2];   // headref xy
    float  pa[2];          // pupil size or area
    float  gx[2], gy[2];   // screen gaze xy
    float  rx, ry;         // screen pixels per degree (angular resolution)
    UINT16 status;         // tracker status flags
    UINT16 input;          // extra (input word)
    UINT16 buttons;        // button state & changes
    INT16  htype;          // head-tracker data type (0=none)
    INT16  hdata[8];       // head-tracker data (not prescaled)
} FSAMPLE;
```
Each field contains one type of data. If the data in a field was not sent for this sample, the value MISSING_DATA (or 0, depending on the field) will be stored in the field, and the corresponding bit in the flags field will be zero (see eye_data.h for a list of bits). Data may be missing because of the tracker configuration (set by commands sent at the start of the experiment, or from the Set Options screen of the eye tracker). Eye position data may also be set to MISSING_DATA during a blink, but the flags will continue to indicate that this data is present.
The sample data fields are further described in the following table:
| Field | Contents |
| --- | --- |
| time | Timestamp when the camera imaged the eye (in milliseconds since the EyeLink tracker was activated) |
| type | Always SAMPLE_TYPE |
| flags | Bits indicating which types of data are present, and for which eye(s) |
| px, py | Camera X, Y of pupil center |
| hx, hy | HEADREF angular gaze coordinates |
| pa | Pupil size (arbitrary units, area or diameter as selected) |
| gx, gy | Display gaze position, in pixel coordinates set by the screen_pixel_coords command |
| rx, ry | Angular resolution at current gaze position, in screen pixels per visual degree |
| status | Error and status flags (report CR status and tracking error). See eye_data.h for useful bits. |
| input | Data from input port(s) |
| buttons | Button input data: high 8 bits indicate changes from the last sample, low 8 bits indicate the current state of buttons 8 (MSB) to 1 (LSB) |
| htype | Type of head-position data (0 if none) (RESERVED FOR FUTURE USE) |
| hdata | 8 words of head-position data |
The EyeLink tracker simplifies data analysis (both on-line and when processing data files) by detecting important changes in the sample data and placing corresponding events into the data stream. These include eye-data events (blinks, saccades, and fixations), button events, input-port events, and messages.
Events may be retrieved by the eyelink_get_float_data() function, and are stored as C structures. All events share the time and type fields in their structures. The type field uniquely identifies each event type:
```c
// EYE DATA EVENT: all use FEVENT structure
#define STARTBLINK       3    // pupil disappeared, time only
#define ENDBLINK         4    // pupil reappeared, duration data
#define STARTSACC        5    // start of saccade, time only
#define ENDSACC          6    // end of saccade, summary data
#define STARTFIX         7    // start of fixation, time only
#define ENDFIX           8    // end of fixation, summary data
#define FIXUPDATE        9    // update within fixation, summary data for interval

#define MESSAGEEVENT    24    // user-definable text: IMESSAGE structure

#define BUTTONEVENT     25    // button state change: IOEVENT structure
#define INPUTEVENT      28    // change of input port: IOEVENT structure

#define LOST_DATA_EVENT 0x3F  // NEW: event flags gap in data stream
```
Events are read into a buffer supplied by your program. Any event can be read into a buffer of type ALLF_EVENT, a union of all the event and sample buffer formats.
It is important to remember that data sent over the link does not arrive in strict time sequence. Typically, eye events (such as STARTSACC and ENDFIX) arrive up to 32 milliseconds after the corresponding samples, and messages and buttons may arrive before a sample with the same time code. This differs from the order seen in an ASC file, where the events and samples have been sorted into a consistent order by their timestamps.
The LOST_DATA_EVENT is produced within the DLL to mark the location of lost data. It is possible that data may be lost, either during recording with real-time data enabled, or during playback. This might happen because of a lost link packet or because data was not read fast enough (data is stored in a large queue that can hold 2 to 10 seconds of data, and once it is full the oldest data is discarded to make room for new data). This event has no data or time associated with it.
The EyeLink tracker analyzes the eye-position samples during recording to detect saccades, and accumulates data on saccades and fixations. Events are produced to mark the start and end of saccades, fixations, and blinks. When both eyes are being tracked, left- and right-eye events are produced, as indicated in the eye field of the FEVENT structure.
Start events contain only the start time, and optionally the start eye or gaze position. End events contain the start and end time, plus summary data on saccades and fixations. This includes start and end and average measures of position and pupil size, plus peak and average velocity in degrees per second.
```c
typedef struct {
    UINT32 time;            // effective time of event
    INT16  type;            // event type
    UINT16 read;            // flags which items were included
    INT16  eye;             // eye: 0=left, 1=right
    UINT32 sttime, entime;  // start, end sample timestamps
    float  hstx, hsty;      // href position at start
    float  gstx, gsty;      // gaze or pupil position at start
    float  sta;             // pupil size at start
    float  henx, heny;      // href position at end
    float  genx, geny;      // gaze or pupil position at end
    float  ena;             // pupil size at end
    float  havx, havy;      // average href position
    float  gavx, gavy;      // average gaze or pupil position
    float  ava;             // average pupil size
    float  avel;            // average velocity
    float  pvel;            // peak velocity
    float  svel, evel;      // start, end velocity
    float  supd_x, eupd_x;  // start, end angular resolution
    float  supd_y, eupd_y;  // (pixel units-per-degree)
    UINT16 status;          // error, warning flags
} FEVENT;
```
The sttime and entime fields of an end event are the timestamps of the first and last samples in the event. To compute duration, subtract sttime from entime and add one sample duration (i.e., 4 ms for a 250 Hz recording, 2 ms for a 500 Hz recording, and 1 ms for a 1000 Hz recording).
Each field of the FEVENT
structure is further described in the following table:
| Field | Contents |
| --- | --- |
| time | Timestamp of the sample causing the event (when the camera imaged the eye, in milliseconds since the EyeLink tracker was activated) |
| type | The event code |
| eye | Which eye produced the event: 0 (LEFT_EYE) or 1 (RIGHT_EYE) |
| read | Bits indicating which data fields contain valid data. Empty fields will also contain the value MISSING_DATA |
| gstx, gsty, genx, geny, gavx, gavy | Display gaze position, in pixel coordinates set by the screen_pixel_coords command. Positions at the start and end of, and average during, the saccade, fixation, or FIXUPDATE period are reported. |
| hstx, hsty, henx, heny, havx, havy | HEADREF gaze position at the start and end of, and average during, the saccade, fixation, or FIXUPDATE period. |
| sta, ena, ava | Pupil size (arbitrary units, area or diameter as selected) at the start and end of, and average during, the fixation or FIXUPDATE interval. |
| svel, evel, avel, pvel | Gaze velocity in visual degrees per second. The velocity at the start and end of the saccade or fixation, and the average and peak values of velocity magnitude (absolute value), are reported. |
| supd_x, supd_y, eupd_x, eupd_y | Angular resolution at the start and end of the saccade or fixation, in screen pixels per visual degree. The average of the start and end values may be used to compute the magnitude of saccades. |
| status | Collected error and status flags from all samples in the event. See eye_data.h for useful bits. |
Peak velocity for fixations is usually corrupted by terminal segments of the preceding and following saccades. Average velocity for saccades may be larger than the saccade magnitude divided by its duration, because of overshoots and returns.
The supd_x, supd_y, eupd_x, and eupd_y fields are the angular resolution (in pixel units per visual degree) at the start and end of the saccade or fixation. The average of the start and end angular resolution can be used to compute the size of saccades in degrees. This C code computes the true magnitude of a saccade from an ENDSACC event stored in the buffer evt:
```c
dx = (evt.fe.genx - evt.fe.gstx) / ((evt.fe.eupd_x + evt.fe.supd_x) / 2.0);
dy = (evt.fe.geny - evt.fe.gsty) / ((evt.fe.eupd_y + evt.fe.supd_y) / 2.0);
dist = sqrt(dx*dx + dy*dy);
```
When reading real-time data through the link, event data will be delayed from the corresponding samples. This is caused by the velocity detector and event validation processing in the EyeLink tracker. The timestamps in the event reflect the true (sample) times.
BUTTONEVENT and INPUTEVENT types are the simplest events, reporting changes in button status or in the input port data. The time field records the timestamp of the eye-data sample where the change occurred, although the event itself is usually sent before that sample. The data field contains the data after the change, in the same format as in the FSAMPLE structure.
Button events from the link are rarely used; monitoring buttons with one of eyelink_read_keybutton(), eyelink_last_button_press(), or eyelink_button_states() is preferable, since these can report button states at any time, not just during recording.
```c
typedef struct {
    UINT32 time;  // time logged
    INT16  type;  // event type: BUTTONEVENT or INPUTEVENT
    UINT16 data;  // coded event data
} IOEVENT;
```
A message event is created by your experiment program and placed in the EDF file. It is possible to enable the sending of these messages back through the link, although there is rarely a reason to do this. Although this method might be used to determine the tracker time (the time field of a message event will indicate when the message was received by the tracker), eyelink_tracker_time() is more efficient for retrieving the current time from the eye tracker's timestamp clock. The eye tracker time is rarely needed in any case, and would only be useful to compute link transport delays.
```c
typedef struct {
    UINT32 time;       // time message logged
    INT16  type;       // event type: usually MESSAGEEVENT
    UINT16 length;     // length of message
    byte   text[260];  // message contents (max length 255)
} IMESSAGE;
```