Beauchamp:Tobii

Notes by Santiago on using the Tobii Glasses


Data Analysis

The Tobii Pro Lab analysis software lets the user actively work with the data received from the Tobii Pro glasses, which helps ensure that the resulting data is concise and clear. The eye tracking data from the glasses is displayed in the software in real time while it is being recorded, along with a visual representation of the wearer's field of view. From these recordings the software outputs useful metrics describing, among other things, the position of the eyes and their velocity (calculated from several gaze data points).

The manual for the Tobii Pro Lab software, as well as other useful documentation and the software itself, can be found at https://www.tobiipro.com/product-listing/tobii-pro-lab/.


Velocity Classification

One of the useful aspects of the Tobii Pro software is its classification of gaze data according to eye velocity. A threshold velocity is set, and any data point whose velocity is above the threshold is classified as part of a saccade, while any point below it is classified as part of a fixation. This is useful because a fixation refers to relatively stable eye position, meaning the subject is holding their gaze in roughly the same place, which may indicate concentration and awareness. A saccade, in contrast, is linked to rapid eye movement; brief saccades, termed micro-saccades, may simply be the result of noise from an eyelash or the like, and the surrounding data may still be useful.

With the Tobii Pro's velocity classification, the exported data and metrics can easily be filtered so that only fixation points are used as valid data points for further analysis (i.e. a fixation point represents a moment when the subject was looking directly at the stimulus).
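
The idea can be illustrated with a minimal velocity-threshold classifier like the Python sketch below. The gaze-angle inputs, the sampling assumptions, and the 30 degrees/second default threshold are illustrative assumptions, not the Tobii Pro Lab implementation.

import numpy as np

def classify_samples(t_s, angle_x_deg, angle_y_deg, threshold_deg_per_s=30.0):
    # Approximate angular velocity: angular distance between consecutive
    # samples divided by the time step between them.
    t = np.asarray(t_s, dtype=float)
    x = np.asarray(angle_x_deg, dtype=float)
    y = np.asarray(angle_y_deg, dtype=float)
    velocity = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    velocity = np.append(velocity, velocity[-1])  # pad so output length matches input
    # Samples above the threshold are saccades, samples below are fixations.
    return np.where(velocity > threshold_deg_per_s, "saccade", "fixation")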


Times of Interest

The Tobii Pro software also allows for the creation of times of interest, which let the user segment the overall data from the initial analysis and make the data easier to handle. There are two ways of generating times of interest:

  1. Automatic Times of Interest: Created through the implementation of snapshot mapping (explained below), holding all of the data that corresponds to the snapshot that was mapped.
  2. Custom Times of Interest: Created manually by the user.

The times of interest allow for quick data manipulation once the data has been exported. For example, if data had been collected in an EEG experiment where the subject was shown a bright color on a screen at random time intervals, times of interest corresponding to the onset and offset of the stimulus would give the exported data a label for that specific time of interest, making it easy to set aside any extra, unnecessary data.
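
For example, once the data has been exported (e.g. as a tab-separated file), restricting an analysis to one time of interest could look like the Python sketch below. The file name and column name are assumptions for illustration, not the actual Tobii export headers.

import pandas as pd

def samples_in_toi(df, onset_ms, offset_ms):
    # Keep only the rows recorded between stimulus onset and offset.
    mask = (df["timestamp_ms"] >= onset_ms) & (df["timestamp_ms"] <= offset_ms)
    return df.loc[mask]

data = pd.read_csv("data_export.tsv", sep="\t")  # hypothetical export file
stimulus_epoch = samples_in_toi(data, onset_ms=12000, offset_ms=13500)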


Snapshot Mapping

Snapshot mapping is another useful feature of the Tobii Pro analysis software, allowing the user to quickly classify gaze data points on an image of interest. The feature uses an imported "snapshot" (an image of less than 25 MP) and automatically maps gaze data from a recording onto the snapshot wherever the program finds matching points. This makes detailed analysis of a stimulus easy, as the program does the bulk of the work; if any points are mapped incorrectly, the user can simply map the corresponding gaze data onto the snapshot manually.

This feature works even better together with times of interest, since custom times of interest can be generated from the imported snapshot. Applying snapshot mapping to the example in the previous section: if the specific stimulus image that appeared in the recording were imported as a snapshot, the program would automatically generate times of interest for the exact periods when that snapshot was present in the original video, making data acquisition and classification even faster and easier.
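
Conceptually, snapshot mapping amounts to matching image features between a scene-camera frame and the snapshot, then projecting the gaze point through the resulting transform. The Python/OpenCV sketch below illustrates that general idea only; it is not Tobii's algorithm, and the grayscale frame, snapshot, and gaze coordinates are assumed to be supplied by the caller.

import cv2
import numpy as np

def map_gaze_to_snapshot(frame_gray, snapshot_gray, gaze_xy):
    # Match ORB features between the scene frame and the snapshot.
    orb = cv2.ORB_create()
    kp_f, des_f = orb.detectAndCompute(frame_gray, None)
    kp_s, des_s = orb.detectAndCompute(snapshot_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_f, des_s)
    # Estimate a homography from frame coordinates to snapshot coordinates.
    src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_s[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC)
    # Project the gaze point (pixel coordinates in the frame) onto the snapshot.
    point = np.float32([[gaze_xy]])
    return cv2.perspectiveTransform(point, H)[0, 0]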


Areas of Interest

Areas of interest (AOIs) are another useful feature of the Tobii Pro analysis software and work similarly to snapshots. They let the user select specific objects of interest in the video and have the program track their location in the video, further classifying gaze data according to that specific area of interest.

Of even more interest are the more complex dynamic stimuli that can be tracked with the AOI tool (see https://www.tobiipro.com/learn-and-support/learn/steps-in-an-eye-tracking-study/data/analyzing-dynamic-stimuli-with-the-aoi-tool/). These are useful for dynamic environments such as video clips or moving objects, since the AOI is mapped and located even as it moves. This could be helpful in EEG tasks that use moving parts (e.g. if a stimulus has a face rotating around a square, the face could be the AOI that is followed, with gaze data classified accordingly).
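
A simple way to picture how gaze data is classified against a moving AOI is the Python sketch below, which checks each gaze sample against the AOI's bounding rectangle for that video frame. The data structures are hypothetical, not Tobii's.

def gaze_in_aoi(gaze_x, gaze_y, rect):
    # rect is (x, y, width, height) of the AOI in video pixel coordinates.
    x, y, w, h = rect
    return x <= gaze_x <= x + w and y <= gaze_y <= y + h

def label_samples(samples, aoi_per_frame):
    # samples: iterable of (frame_index, gaze_x, gaze_y) tuples.
    # aoi_per_frame: dict mapping frame index -> AOI rectangle (e.g. the moving face).
    return [frame in aoi_per_frame and gaze_in_aoi(gx, gy, aoi_per_frame[frame])
            for frame, gx, gy in samples]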


Changes in Set Parameters

The Tobii Pro software also allows the user to change the parameters the program follows and how it manages the data. Mainly, the user can change the following (a sketch of how the merging and discarding steps work is shown after the list):

  1. Eye Selection: The user can choose to discard data from a specific eye if needed. This is currently not available for Glasses projects, but it is worth knowing that the software does this automatically when only one eye is detected by the glasses, so a manual version of the same approach is possible if needed.
  2. Noise Reduction: The user can adjust the noise reduction median used by the software, so that a smaller or larger sample of data points is used for noise reduction.
  3. Velocity: The user can change the threshold velocity at which fixations and saccades are separated, as well as how the velocity itself is calculated (the number of data points used in the velocity calculation).
  4. Gaze Data Error Filling: The software can fill short gaps of missing data caused by noise. Its parameter is the maximum gap length for which the software will use the surrounding data to fill in the gap, avoiding segmentation of the data at that point; the user can change this length as needed.
  5. Merging Adjacent Fixations: The software automatically merges fixations that it deems to have been split accidentally, using two parameters: max time between fixations, the maximum time between fixations that should be merged, and max angle between fixations, the maximum visual angle between fixations that should be merged.
  6. Discarding Short Fixations: The program automatically discards fixations that are deemed too short, in order to keep the data as clean and accurate as possible. For this it uses the minimum fixation duration parameter, which sets the shortest duration a time segment can have and still be classified as a fixation.
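
As a rough illustration of how the merging and discarding steps work, the Python sketch below applies them to a list of fixations represented as dictionaries. The field names and default values (75 ms, 0.5 degrees, 60 ms) are assumptions chosen for the example, not necessarily the software's defaults.

import math

def merge_adjacent_fixations(fixations, max_gap_ms=75, max_angle_deg=0.5):
    # Merge consecutive fixations separated by both a small time gap and a small angle.
    merged = []
    for fix in fixations:
        if merged:
            prev = merged[-1]
            gap = fix["start_ms"] - prev["end_ms"]
            angle = math.hypot(fix["x_deg"] - prev["x_deg"], fix["y_deg"] - prev["y_deg"])
            if gap <= max_gap_ms and angle <= max_angle_deg:
                prev["end_ms"] = fix["end_ms"]  # extend the earlier fixation
                continue
        merged.append(dict(fix))
    return merged

def discard_short_fixations(fixations, min_duration_ms=60):
    # Drop any fixation shorter than the minimum fixation duration.
    return [f for f in fixations if f["end_ms"] - f["start_ms"] >= min_duration_ms]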


Exporting Data

The Tobii Pro software allows for two types of export:

  1. Metrics Export: Exports only the eye tracking metrics, based on AOIs or Events, to third-party software such as Excel or MATLAB.
  2. Data Export: Exports the raw gaze data, which is not tied to AOIs (gaze point in different coordinate systems, pupil diameter, eye position, and recording information). Metrics regarding the AOIs are not provided in this export type.

The types of values available through the metrics export are as follows (a sketch computing a few of these metrics is shown after the list):

  1. Interval Duration: Duration of all time intervals for each TOI, including averages, medians, sums, and counts. (Format: HH:MM:SS:mmm)
  2. Interval Start: Start time of all intervals for each TOI, including averages, medians, and counts. (Format: HH:MM:SS:mmm)
  3. Event Count: Number of events, including custom event types, for each TOI, including averages, medians, sums, counts, variance, and standard deviation (N-1). Here, descriptive statistics only include recordings where events occur. (Format: Number)
  4. Event Count (Including Zeroes): Same as the other Event Count metric except that here descriptive statistics also include recordings where no events occur. (Format: Number)
  5. AOI Time to First Fixation: The time to first fixation for each AOI on all media, including averages, medians, count, and recording durations. (Format: HH:MM:SS:mmm)
  6. AOI Total Visit Duration: Total time each participant has visited each AOI on all media, including averages, medians, sums, the share of total time spent in each AOI out of all AOIs, and the percentage of participants that visited each AOI at least once. Here, descriptive statistics are only based on recordings with fixations within the AOIs. (Format: HH:MM:SS:mmm)
  7. AOI Total Visit Duration (Including Zeroes): Same as other AOI Total Visit Duration metric except here descriptive statistics also include recordings with no fixations within the AOIs. (Format: HH:MM:SS:mmm)
  8. AOI Average Visit Duration: Average duration each participant has visited each AOI on all media, including averages, medians, sums, and the percentage of participants that visited each AOI. (Format: HH:MM:SS:mmm)
  9. AOI Visit Count: Number of visits within each AOI on all media, including averages, medians, and the percentage of participants that fixated within each AOI at least once. Here, descriptive statistics are only based on recordings with fixations within the AOIs. (Format: Number)
  10. AOI Visit Count (Including Zeroes): Same as other AOI Visit Count metric except here descriptive statistics also include recordings with no fixations within the AOIs. (Format: Number)
  11. AOI Total Fixation Duration: Total time each participant has fixated on each AOI on all media, including averages, medians, sums, variance, standard deviations (N-1), the share of total time spent on each AOI out of all AOIs, and the percentage of participants that fixated within each AOI at least once. Here, descriptive statistics are only based on recordings with fixations within the AOIs. (Format: HH:MM:SS:mmm)
  12. AOI Total Fixation Duration (Including Zeroes): Same as other AOI Total Fixation Duration metric except here descriptive statistics also include recordings with no fixations within the AOIs. (Format: HH:MM:SS:mmm)
  13. AOI Average Fixation Duration: Average duration of the fixations within each AOI on all media, including averages, medians, variances, standard deviations (N-1), the total Time of Interest, and recording durations. (Format: HH:MM:SS:mmm)
  14. AOI Fixation Count: Number of fixations within each AOI on all media, including averages, medians, sums, variances, standard deviations (N-1), the percentage of participants that visited each AOI at least once, the total number of fixations within the TOI, and the total TOI and recording durations. Here, descriptive statistics are only based on recordings with fixations within the AOIs. (Format: Number)
  15. AOI Fixation Count (Including Zeroes): Same as other AOI Fixation Count metric except here descriptive statistics also include recordings with no fixations within the AOIs. (Format: Number)
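
To make the list above concrete, the Python sketch below computes three of these metrics from a fixation-level table. The input file and column names are assumptions for illustration and do not match the actual export headers.

import pandas as pd

fixations = pd.read_csv("fixation_export.tsv", sep="\t")  # hypothetical input file

# One row per participant and AOI, with a few of the metrics listed above.
per_participant = fixations.groupby(["participant", "aoi"]).agg(
    total_fixation_duration_ms=("duration_ms", "sum"),  # AOI Total Fixation Duration
    fixation_count=("duration_ms", "count"),            # AOI Fixation Count
    time_to_first_fixation_ms=("start_ms", "min"),      # AOI Time to First Fixation (start_ms relative to interval onset)
)

# Descriptive statistics across participants for each AOI (averages, medians, sums).
summary = per_participant.groupby("aoi").agg(["mean", "median", "sum"])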

The types of values available for exportation through the data export are as follows:

  1. Project Name:

Sending Signals To and From the Glasses

Possible Method for Typical ECoG Setting