This template introduces the use of the link to transfer gaze-position data. Data can be transferred in real time during recording, or played back after recording has ended, which helps to separate recording from analysis.
The eyedata template uses real-time eye-position data sent through the link to display a real-time gaze cursor, using a trial implemented in data_trial.c. The data is then played back using the module playback_trial.c. The bitmap for the trial is a grid of letters, generated by grid_bitmap.c. This module is not discussed further, as it is similar to other bitmap-drawing functions discussed previously. You may wish to read through it as an example of drawing more complex displays using SDL.
These are the files used to build eyedata. Those that were covered previously are marked with an asterisk.
main.c * | WinMain() function for Windows non-console builds and main() for other builds. Sets up and shuts down the link and graphics, and opens the EDF file. This file is unchanged for all templates, and can be used with small changes in your experiments.
data_trials.c | Called to run a block of trials for the eyedata template. Performs system setup at the start of each block, then runs the trials. Handles the standard return codes from trials to allow trial skip, repeat, and experiment abort. This file can be modified for your experiments, by replacing the trial instance code.
data_trial.c | Implements a trial with a real-time gaze cursor, and displays a bitmap.
playback_trial.c | Plays back data from the previous trial. The path of gaze is plotted as a line, and fixations are marked with "F".
grid_bitmap.c | Creates a bitmap containing a 5 by 5 grid of letters.
The block loop module (data_trials.c) creates a grid bitmap, and runs the real-time gaze trial (data_trial.c). This is followed immediately by playback of the trial's data (playback_trial.c).
Only a single trial is run, by calling do_data_trial(). This marks the trial, creates the grid bitmap, and calls realtime_data_trial() to record data while displaying a gaze cursor. After this, playback_trial() plays back the data just recorded, plotting the path of gaze and fixations.
Note the command "mark_playback_start" that is sent to the eye tracker just before the "TRIALID" message. This command sets the point at which playback of data from the EDF file will start later. If this command is not used, playback begins from the start of recording, and therefore will not include the TRIALID message. Similar to the TEXT template, the display PC bitmap is saved as a .png file in the "images" directory. The bitmap image is also transferred over the link to the tracker PC as a backdrop for gaze cursors.
// There is only one data trial, followed by playing back the data.
// Returns trial result code
int do_data_trial(int num)
{
    int i;
    SDL_Surface *bitmap = NULL;

    // This supplies the title at the bottom of the eyetracker display
    eyecmd_printf("record_status_message 'GAZE CURSOR TEST' ");

    // Marks where next playback will begin in file
    // If not used, playback begins from start of recording
    eyecmd_printf("mark_playback_start");

    set_offline_mode();    // Must be offline to draw to EyeLink screen

    // Always send a TRIALID message before starting to record.
    // It marks the start of the trial and should precede any other messages
    eyemsg_printf("TRIALID GRID");

    //!V TRIAL_VAR message is recorded for EyeLink Data Viewer analysis
    // It specifies a trial variable and value for the given trial
    // This must be specified within the scope of an individual trial (i.e., after
    // "TRIALID" and before "TRIAL_RESULT")
    eyemsg_printf("!V TRIAL_VAR condition Playback");

    // IMGLOAD command is recorded for EyeLink Data Viewer analysis
    // It displays a default image on the overlay mode of the trial viewer screen.
    // Writes the image filename + path info
    eyemsg_printf("!V IMGLOAD FILL grid.png");

    // IAREA command is recorded for EyeLink Data Viewer analysis
    // It creates a set of interest areas by reading the segment files
    // Writes segmentation filename + path info
    eyemsg_printf("!V IAREA FILE grid.ias");

    bitmap = draw_grid_to_bitmap_segment("grid.ias", get_output_folder(), 1);
    if(!bitmap)
    {
        eyemsg_printf("ERROR: could not create bitmap");
        return SKIP_TRIAL;
    }

    // Save bitmap and transfer to the tracker PC.
    // Since it takes a long time to save the bitmap to the file, the
    // value of sv_options should be set as SV_NOREPLACE to save time

    i = realtime_data_trial(bitmap, 60000L);   // display gaze cursor during recording

    playback_trial();                          // Play back trial's data

    SDL_FreeSurface(bitmap);
    return i;
}
The data_trial.c module uses real-time gaze data to plot a cursor. This is a simple example of using transferred data, which will be expanded into a gaze-contingent window in a later template. Link data is only required during recording for real-time applications; for on-line analysis, playing back the data after the trial is preferred.
As eye-movement data arrives from the tracker, samples and events are stored in a large queue, to prevent loss of data if the application cannot read them immediately. The eyelink_core DLL provides two ways to access link data: immediately (the newest sample only) or in sequence (from the queue). As will be shown in the playback module, samples and events can be read from the queue in the same order they arrived. However, data read in this way will be substantially delayed if the queue contains a significant amount of data.
When eye position data is required with the lowest possible delay, the newest sample received from the link must be obtained. The EyeLink library keeps a copy of the latest data, which is read by eyelink_newest_float_sample(). This function returns 1 if a new sample is available, and 0 if no new samples have been received since the previous call. It can be called with NULL as the buffer address in order to test if a new sample has arrived. The sample buffer can be a structure of type FSAMPLE or ALLF_DATA, defined in eyetypes.h.
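A minimal polling sketch of this usage (not taken from the template; it assumes eye_used has already been set as described below) looks like this:

    ALLF_DATA evt;     // buffer for the newest sample (see eyetypes.h)
    float x, y;

    if(eyelink_newest_float_sample(NULL) > 0)   // has a new sample arrived?
    {
        eyelink_newest_float_sample(&evt);      // copy the newest sample
        x = evt.fs.gx[eye_used];                // gaze position of the tracked eye
        y = evt.fs.gy[eye_used];
        // ... update the gaze-contingent display here ...
    }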
When starting recording, we tell the EyeLink tracker to send samples to us through the link, by setting the third argument of start_recording() to 1. Data will be available immediately, as start_recording() does not return until the first data has arrived. Realtime mode is also set, to minimize delays between reading data and updating the gaze cursor.
// Start data recording to EDF file, BEFORE DISPLAYING STIMULUS
// You should always start recording 50-100 msec before required
// otherwise you may lose a few msec of data
// NEW CODE FOR GAZE CURSOR: tell start_recording() to send link data
error = start_recording(1, 1, 1, 1);   // record with link data enabled
if(error != 0) return error;           // ERROR: couldn't start recording

// record for 100 msec before displaying stimulus
// Windows: no interruptions from now on
begin_realtime_mode(100);
When using data from the link queue (which will be introduced in the playback module), some samples and/or events will build up in the data queue during the 100 millisecond delay before the image display begins. These will be read later, causing an initial burst of data processing. This does not occur with eyelink_newest_float_sample(), as only the latest sample is read. The queue is not read in this example, so data simply builds up in the queue and the oldest data is eventually overwritten.
After recording starts, a block of sample and/or event data is opened by sending a special event over the link. This event contains information on what data will be sent in samples and events during recording, including which eyes are being tracked, the sample rate, filtering levels, and whether corneal reflections are being used. The data in this event is only available once it has been read from the link data queue, and the function eyelink_wait_for_block_start() is used to scan the queue for the block-start event. The arguments to this function specify how long to wait for the block start, and whether samples, events, or both types of data are expected. If no data is found, the trial should end with an error.
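In the recording trial this check might look like the sketch below; the 100 msec timeout and the cleanup calls are illustrative assumptions, not the template's literal code:

    // wait up to 100 msec for the block-start event, expecting link samples only
    if(!eyelink_wait_for_block_start(100, 1, 0))
    {
        stop_recording();                              // end recording cleanly
        alert_printf("ERROR: No link samples received!");
        return TRIAL_ERROR;                            // let the block loop handle the error
    }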
Once the block start has been processed, data on the block is available. Because the EyeLink system is a binocular eye tracker, we don't know in advance which eye's data will be present, as this was selected during camera setup by the experimenter. After the block start, eyelink_eye_available() returns one of the constants LEFT_EYE, RIGHT_EYE, or BINOCULAR to indicate which eye(s) data is available from the link. LEFT_EYE (0) and RIGHT_EYE (1) can be used to index eye data in the sample; if the value is BINOCULAR (2), we use LEFT_EYE. A message should be placed in the EDF file to record this, so that we know which eye's data to use during analysis.
// determine which eye(s) are available
eye_used = eyelink_eye_available();

switch(eye_used)           // select eye, add annotation to EDF file
{
    case RIGHT_EYE:
        eyemsg_printf("EYE_USED 1 RIGHT");
        break;
    case BINOCULAR:        // both eyes' data present: use left eye only
        eye_used = LEFT_EYE;
    case LEFT_EYE:
        eyemsg_printf("EYE_USED 0 LEFT");
        break;
}
Additional information on the block data can be accessed through a number of functions: see the reference sections of this manual and the eyelink.h file for more information. The sample rate and other data specific to the EyeLink II tracker are available through the newer eyelink2_mode_data() function.
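For example, the sample rate can be queried right after the block start; the sketch below mirrors the playback code shown later in this section (the message text is purely illustrative):

    INT16 sample_rate;

    // eyelink2_mode_data() returns -1 if extended mode information is unavailable
    if(eyelink2_mode_data(&sample_rate, NULL, NULL, NULL) != -1)
        eyemsg_printf("SAMPLE_RATE %d", sample_rate);  // note the rate in the EDF file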
Code is added to the recording loop to read and process samples. This calls eyelink_newest_float_sample() to read the latest sample. If new data is available, the gaze position for the monitored eye is extracted from the sample, along with the pupil size.
During a blink, pupil size will be zero, and both the x and y gaze-position components will have the value MISSING_DATA. The eye position is undefined during a blink, and this case must be treated carefully for gaze-contingent displays. In this example, the cursor is simply hidden; otherwise, it is redrawn at the new location.
// NEW CODE FOR GAZE CURSOR
if(eyelink_newest_float_sample(NULL) > 0)    // check for a new sample
{
    eyelink_newest_float_sample(&evt);       // get a copy of the sample
    x = evt.fs.gx[eye_used];                 // get gaze position from sample
    y = evt.fs.gy[eye_used];
    // make sure pupil is present
    if(x != MISSING_DATA && y != MISSING_DATA && evt.fs.pa[eye_used] > 0)
        draw_gaze_cursor((int)x, (int)y);    // show and move cursor
    else
        erase_gaze_cursor();                 // hide cursor if no pupil
}
}   // END OF RECORDING LOOP
The EyeLink system can supply data on eye movements in real time during recording, via the Ethernet link. The high data rate (binocular, 250, 500, 1000, or 2000 Hz depending on the tracker version and configuration) makes data processing or display generation difficult while data is being transferred, and writing data to disk will cause significant delays that will impact stimulus presentation. Instead, data can be written to the EyeLink EDF file as a permanent record, then played back after recording of the trial is finished. This is the best method to implement on-line analysis when information is needed before the next trial, for example to implement convergent threshold paradigms or to detect if a participant fixated outside of a region on the display.
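As an illustration of this kind of post-trial analysis, the sketch below (not part of the template; the helper name and region bounds are made up) plays back the last trial and counts fixations whose average gaze position fell outside a rectangle of interest:

    // Hypothetical helper: count fixations outside the rectangle (x1,y1)-(x2,y2)
    // in the data of the last recorded trial. eye_used selects the eye to check.
    int count_fixations_outside(int x1, int y1, int x2, int y2, int eye_used)
    {
        ALLF_DATA evt;
        int i, outside = 0;

        set_offline_mode();                        // tracker must be offline
        eyelink_playback_start();                  // start playback of the last block
        if(!eyelink_wait_for_block_start(2000, 1, 1))
            return -1;                             // no playback data available

        while(1)
        {
            i = eyelink_get_next_data(NULL);       // read next item type from queue
            if(i == 0)                             // queue empty:
            {
                if(!(eyelink_current_mode() & IN_PLAYBACK_MODE))
                    break;                         // playback finished
                continue;                          // otherwise wait for more data
            }
            if(i == ENDFIX)                        // end-of-fixation event
            {
                eyelink_get_float_data(&evt);      // copy the event
                if(evt.fe.eye == eye_used &&
                   (evt.fe.gavx < x1 || evt.fe.gavx > x2 ||
                    evt.fe.gavy < y1 || evt.fe.gavy > y2))
                    outside++;                     // fixation outside the region
            }
        }
        eyelink_playback_stop();                   // end playback
        return outside;
    }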
The playback_trial.c module also demonstrates processing of samples and events from the link queue. The data-processing code is similar to what would be used inside a recording loop for real-time data, except that we will not lose data if analysis takes a long time or the program is stopped by a debugger. In fact, playback data is sent "on demand" at a rate determined by how fast the data queue is read, so it is feasible to process thousands of samples per second during playback.
Data from the last trial (or the last recording block, if several were recorded) can be played back by calling the function eyelink_playback_start(), then waiting for data to arrive with eyelink_wait_for_data(). If no data arrives, this function returns an error code. The first data received during playback will include messages and button presses written to the file just after the end of the next-to-last recording block (or from the start of the file, if the first data block is being played back). All data up to the start of the block data can be skipped by calling eyelink_wait_for_block_start(), after which information on the sample rate and eyes recorded will also be available. Playback will fail if no EDF file is open, or if no data has yet been recorded.
// Set up the display
clear_full_screen_window(grey);     // erase display
get_new_font(NULL, 24, 0);          // select a font
graphic_printf(window, black, NONE, SCRWIDTH/2, 24, "Playing back last trial...");
SDL_Flip(window);
graphic_printf(window, black, NONE, SCRWIDTH/2, 24, "Playing back last trial...");  // drawing to the background

set_offline_mode();                 // set up eye tracker for playback
eyelink_playback_start();           // start data playback

// Wait for first data to arrive
// Failure may mean no data or file not open
// This function discards other data in the file (buttons and messages)
// until the start of recording.
// If you need these events, then don't use this function.
// Instead, wait for a sample or event before setting eye_data,
// and have a timeout if no data is available in 2000 msec.
if(!eyelink_wait_for_block_start(2000, 1, 1))
{
    alert_printf("ERROR: playback data did not start!");
    return -1;
}
The eyelink_wait_for_block_start() function reads and discards events until the start of the recording is encountered in the data stream. This means that any data before the recording (such as the "TRIALID" message) will be lost. If you need to read this message or other data from before the recording began, do not use this function. Instead, read and process events until the first sample is found, or until a message such as "DISPLAY ON" is found. If this data is not found within 100 milliseconds after playback is started, it is likely that the playback failed for some reason. Once a sample is found, eyelink_eye_available() and eyelink2_mode_data() may be called as described below.

In the same way as in data_trial.c, we call eyelink_eye_available() to determine which eye's data to use. (We could also have used the messages we placed in the EDF file, which will be available during playback.) We also determine the sample rate by calling eyelink2_mode_data(), which returns -1 if extended information is not available; in that case, the EyeLink I sample interval of 4 msec is used.
eye_used = eyelink_eye_available();

// determine sample rate
i = eyelink2_mode_data(&sample_rate, NULL, NULL, NULL);
if(i == -1 || sample_rate < 250)
    sample_rate = 250;      // EyeLink I: sample interval = 4 msec
The data received will match that previously recorded in the EDF file. For example, if both samples and events were recorded, both will be sent through the link. Also, the types of events and data selected by EyeLink configuration commands for the file will apply to the playback data, not those selected for real-time link data.
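These selections for the file and the link are made with tracker configuration commands that the templates send during setup. A typical set is sketched below; the exact data types listed are illustrative and should be chosen to match your analysis needs:

    // what is recorded to the EDF file (and therefore available during playback)
    eyecmd_printf("file_event_filter = LEFT,RIGHT,FIXATION,SACCADE,BLINK,MESSAGE,BUTTON");
    eyecmd_printf("file_sample_data = LEFT,RIGHT,GAZE,AREA,GAZERES,STATUS");

    // what is sent over the link in real time during recording
    eyecmd_printf("link_event_filter = LEFT,RIGHT,FIXATION,SACCADE,BLINK,BUTTON");
    eyecmd_printf("link_sample_data = LEFT,RIGHT,GAZE,GAZERES,AREA,STATUS");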
Playback is halted when all data has been read, or when the eyelink_playback_stop() function is called. The usual tests for the ESC key and for program termination are performed in the loop.
// exit playback on the usual ESC-key / program-termination tests,
// or on a tracker button press
if(escape_pressed() || break_pressed()
   || eyelink_last_button_press(NULL))   // tracker button also exits
{
    clear_full_screen_window(target_background_color);   // hide display
    eyelink_playback_stop();             // stop playback
    return 0;
}
When the EyeLink library receives data from the tracker through the link, it places it in a large data buffer called the queue. This can hold 4 to 10 seconds of data, and delivers samples and events in the order they were received.
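If you want to check how far behind your processing is falling, the queue can be queried for its unread contents. The sketch below assumes the eyelink_data_count() call declared in eyelink.h, and the threshold is arbitrary:

    // count unread samples and events currently waiting in the link queue
    int samples_waiting = eyelink_data_count(1, 0);
    int events_waiting  = eyelink_data_count(0, 1);

    if(samples_waiting > 500)      // a large backlog: processing is too slow
        eyemsg_printf("WARNING: %d samples waiting in queue", samples_waiting);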
A data item can be read from the queue by calling eyelink_get_next_data(), which returns a code for the item type. The value SAMPLE_TYPE is returned if a sample was read from the queue. Otherwise, an event was read, and a value identifying the event type is returned. The header file eye_data.h contains a list of constants identifying the event codes.
If 0 was returned, the data queue was empty. This could mean that all data has been played back, or simply that the link is busy transferring more data. We can use eyelink_current_mode() to test whether playback is done.
// PROCESS PLAYBACK DATA FROM LINK
i = eyelink_get_next_data(NULL);    // check for new data item
if(i == 0)                          // 0: no new data
{
    // Check if playback has completed
    if(!(eyelink_current_mode() & IN_PLAYBACK_MODE))
        break;                      // all data played back: exit loop
    continue;                       // otherwise, keep waiting for data
}
If the item read by eyelink_get_next_data() is one we want to process, it can be copied into a buffer by eyelink_get_float_data(). This buffer should be a structure of type FSAMPLE for samples, or ALLF_DATA for either samples or events. These types are defined in eye_data.h.
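Putting these two calls together, the core of a queue-reading loop typically looks like the sketch below, with event handling reduced to the two cases used in this template:

    ALLF_DATA data;                                  // holds either a sample or an event
    int item;

    while((item = eyelink_get_next_data(NULL)) != 0) // 0 means the queue is empty
    {
        switch(item)
        {
            case SAMPLE_TYPE:                        // a sample was read from the queue
                eyelink_get_float_data(&data);       // copy it into the buffer
                // ... use data.fs.gx[], data.fs.gy[], data.fs.pa[] ...
                break;

            case ENDFIX:                             // end-of-fixation event
                eyelink_get_float_data(&data);
                // ... use data.fe.gavx, data.fe.gavy ...
                break;

            default:                                 // other events are ignored here
                break;
        }
    }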
It is important to remember that data sent over the link does not arrive in strict time sequence. Typically, eye events (such as STARTSACC and ENDFIX) arrive up to 32 milliseconds after the corresponding samples, and messages and buttons may arrive before a sample with the same time code.
In playback_trial.c, fixations are plotted by drawing an 'F' at the average gaze position. The ENDFIX event is produced at the end of each fixation, and contains the summary data for gaze during that fixation. The event data is read from the fe (floating-point eye data) field of the ALLF_DATA type, and the average x and y gaze positions are in the gavx and gavy subfields respectively.
{   // PLOT FIXATIONS
    eyelink_get_float_data(&evt);      // get copy of fixation event
    if(evt.fe.eye == eye_used)         // is it the eye we are plotting?
    {   // Print a black "F" at average position
        graphic_printf(window, black, NONE, (int)evt.fe.gavx, (int)evt.fe.gavy, "F");
        SDL_Flip(window);
        graphic_printf(window, black, NONE, (int)evt.fe.gavx, (int)evt.fe.gavy, "F");
    }
}
It is important to check which eye produced the ENDFIX event. When recording binocularly, both eyes produce separate ENDFIX events, so we must select those from the eye we are plotting gaze position for. The eye to monitor is determined during processing of samples.
It is possible that data may be lost, either during recording with real-time data enabled, or during playback. This might happen because of a lost link packet, or because data was not read fast enough (data is stored in a large queue that can hold 2 to 10 seconds of data, and once it is full the oldest data is discarded to make room for new data). The EyeLink library marks data loss by causing a LOST_DATA_EVENT to be returned by eyelink_get_next_data() at the point in the data stream where data is missing. This event is defined in the latest version of eye_data.h, and the #ifdef test in the code below ensures compatibility with older versions of this file.
#ifdef LOST_DATA_EVENT    // AVAILABLE IN V2.1 OR LATER DLL ONLY
if(i == LOST_DATA_EVENT)  // data is missing at this point in the stream
{
    alert_printf("Lost data in sequence");
}
#endif
Samples are plotted by connecting them with lines to show the path of the participant's gaze on the display. The gaze position for the monitored eye is extracted from the sample, along with the pupil size. During a blink, pupil size will be zero, and both the x and y gaze-position components will have the value MISSING_DATA. Otherwise, we connect the current and previous gaze positions with a line.
{
    eyelink_get_float_data(&evt);           // get copy of sample
    if(eye_used != -1)                      // do we know which eye yet?
    {
        sample_count++;
        if(sample_count % SAMPLEPLAYBACKOFFSET == 0)
        {
            // msec_delay(SAMPLEPLAYBACKOFFSET*1000/sample_rate); // delay for real-time playback
            x = evt.fs.gx[eye_used];        // get gaze position from sample
            y = evt.fs.gy[eye_used];
            if(x != MISSING_DATA && y != MISSING_DATA &&
               evt.fs.pa[eye_used] > 0)     // check if pupil is present
            {
                if(prevFlag && endIdx > 1)
                {
                    sarray[countIdx++] = sarray[endIdx-1];
                    sarray[countIdx++] = sarray[endIdx];
                    sarray[endIdx-1] = MISSING_DATA;
                    sarray[endIdx] = MISSING_DATA;
                    endIdx = 0;
                    prevFlag = 0;
                }
                if(countIdx < 200)
                {
                    sarray[countIdx++] = x;
                    sarray[countIdx++] = y;
                }
                else if(countIdx == 200)
                {
                    drawMultipleSamples(sarray, 200);
                    memset(sarray, MISSING_DATA, 200);
                    endIdx = 199;
                    prevFlag = 1;
                    countIdx = 0;
                }
            }
            else                            // no pupil present: must be in blink
            {
                endIdx = countIdx - 1;
                drawMultipleSamples(sarray, countIdx);
                memset(sarray, MISSING_DATA, 200);
                prevFlag = 1;
                countIdx = 0;
            }
        }
    }
    else
    {   // if we don't know which eye yet, check which eye is present
        eye_used = eyelink_eye_available();
    }
}
}  // END OF PLAYBACK LOOP

// check if we still have data to draw
if(countIdx > 0)
    drawMultipleSamples(sarray, countIdx);

eyelink_playback_stop();                    // Stop data playback
Playback data arrives much more quickly than it was recorded. You can take as long as you want to process each data item during playback, because the EyeLink library controls data flow for you. To approximate the original timing of the data, we set a delay of 1, 2 or 4 milliseconds after each sample - this could be improved by using the timestamp available for each sample.
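One way to do this, sketched below rather than taken from the template, is to delay by the difference between consecutive sample timestamps (the FSAMPLE time field), using the msec_delay() timing call:

    UINT32 last_sample_time = 0;      // timestamp of the previously plotted sample

    // ... inside the playback loop, after copying a sample into evt:
    if(last_sample_time != 0 && evt.fs.time > last_sample_time)
        msec_delay(evt.fs.time - last_sample_time);   // wait out the original interval
    last_sample_time = evt.fs.time;                   // remember this sample's time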