Pupil_lab_tracker_tutorial

This tutorial shows how to create an experiment in OpenSesame with an integrated Pupil Labs eye-tracking device. We use Python inline scripting. An example OpenSesame experiment with an integrated tracker is provided.

Table of Contents

  1. Connecting to the eye-tracking device
  2. Starting the recording
  3. Saving events
  4. Stopping the recording
  5. Processing the recordings
  6. Opensesame template
  7. Resources

Connecting to the eye-tracking device

To connect the laptop/PC to the tracker, connect both devices to the same local network. For device discovery, the local network must allow mDNS and UDP traffic. On large public networks this may be blocked for security reasons. Alternatively, a hotspot can be created with a third device, i.e. neither the Companion phone nor the laptop running OpenSesame.

Pupil Labs provides a real-time API that allows you to control the tracking device.

To work with the package, install it through the OpenSesame console:

pip install -r requirements.txt

OpenSesame provides the inline_script item, which we will use to add Python code to the experiment. Put the following code snippet in the Prepare phase of an inline_script to initialize the tracking device.

To find the IP of the Neon device, open the phone's settings and navigate to About device -> Status; the IP address is listed there.

from pupil_labs.realtime_api.simple import Device

ip = "IP of the Neon device"  # e.g. "192.168.1.42"
device = Device(address=ip, port=8080)

You can check if the connection was set by printing out a status update from the device:

print(f"Phone IP address: {device.phone_ip}")
print(f"Phone name: {device.phone_name}")
print(f"Phone unique ID: {device.phone_id}")
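When the IP is typed by hand, a typo usually shows up only as a long connection timeout. The small helper below catches malformed addresses up front; it is a sketch of our own (`valid_ip` is not part of the Pupil Labs API), using only the Python standard library:

```python
import ipaddress

def valid_ip(text: str) -> bool:
    """Return True if `text` is a syntactically valid IPv4/IPv6 address."""
    try:
        ipaddress.ip_address(text.strip())
        return True
    except ValueError:
        return False

# Usage (requires the Neon device on the same network):
# from pupil_labs.realtime_api.simple import Device
# ip = "192.168.1.42"  # replace with the IP shown on your Companion phone
# if valid_ip(ip):
#     device = Device(address=ip, port=8080)
```

Note that this only checks the syntax of the address, not whether a device is actually reachable there.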

Starting the recording

To start the recording you need to have this code in the Run phase of the inline_script item:

recording_id = device.recording_start()
print(f"Started recording with id {recording_id}")

Saving events

While the recording is running, you can create events using the send_event() method.

import time

device.send_event("test event 2", event_timestamp_unix_ns=time.time_ns())

Optionally, you can set a custom timestamp for your event instead of using its arrival time at the device, as in the example above.
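A custom timestamp must be an integer Unix timestamp in nanoseconds. If you have a moment captured as a float of seconds (for instance the return value of time.time() taken right when a stimulus was drawn), it needs converting; the helper below is our own sketch, not part of the Pupil Labs API:

```python
import time

NS_PER_S = 1_000_000_000

def to_unix_ns(seconds: float) -> int:
    """Convert a POSIX timestamp in seconds to integer nanoseconds."""
    return int(seconds * NS_PER_S)

onset = time.time()  # e.g. captured right when the sketchpad is shown
# device.send_event("image_onset", event_timestamp_unix_ns=to_unix_ns(onset))
```

time.time_ns(), as used in the snippet above, already returns nanoseconds directly; the conversion is only needed when the timestamp was recorded in seconds.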

Stopping the recording

Use the recording_stop_and_save() method to stop the recording:

device.recording_stop_and_save()
device.close()
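If the experiment raises an exception mid-run, the recording is never stopped and saved. A try/finally wrapper guards against this; the sketch below is our own pattern (run_trials is a hypothetical stand-in for your experiment logic), not something prescribed by the API:

```python
def run_with_recording(device, run_trials):
    """Start a recording, run the experiment, and always stop and save.

    `device` is a connected Pupil Labs Device; `run_trials` is any
    zero-argument callable containing the experiment logic.
    """
    recording_id = device.recording_start()
    try:
        run_trials()
    finally:
        # Runs even if run_trials() raised, so the recording is not lost.
        device.recording_stop_and_save()
        device.close()
    return recording_id
```

In an OpenSesame experiment the same effect can be had by putting the stop-and-save calls in an inline_script at the very end of the sequence, but the wrapper also covers the crash case.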

Processing the recordings

After stopping and saving, the recording is automatically uploaded to Pupil Cloud. To analyse participants' gaze data, the data need to be mapped onto a defined surface; the Pupil Cloud Marker Mapper enrichment enables this. The remapped gaze data can then be downloaded in CSV format.

To define the surface of the PC screen, we use AprilTag markers from the 36h11 family. Four tags are placed at the corners of the OpenSesame canvas throughout the experiment.


Steps to process the video in Pupil Cloud:

  1. Create a project in your workspace
  2. Add the recording to the project
  3. On the project page, navigate to Enrichments, press Create enrichment, choose Marker Mapper, and press Create.

You might need to move a few frames forward or backward to get the AprilTags detected.


  4. After naming and defining the surface, press the Run button in the top left corner to start the video processing.
  5. When processing finishes, the mapped gaze data can be downloaded from the Downloads tab under Enrichment data.
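Once downloaded, the CSV can be inspected with standard tooling. The sketch below filters the samples that actually landed on the defined surface; the column names are assumptions based on the Marker Mapper gaze export and should be verified against your own download:

```python
import csv

# Assumed column names from the Marker Mapper "gaze.csv" export --
# check them against the header row of your actual file.
ON_SURFACE = "gaze detected on surface"
X = "gaze position on surface x [normalized]"
Y = "gaze position on surface y [normalized]"

def on_surface_points(rows):
    """Yield (x, y) tuples for samples where gaze was on the surface.

    `rows` is an iterable of dicts, e.g. from csv.DictReader; CSV values
    arrive as strings, so the boolean column is compared as "True".
    """
    for row in rows:
        if row[ON_SURFACE] == "True":
            yield float(row[X]), float(row[Y])

# Usage:
# with open("gaze.csv", newline="") as f:
#     points = list(on_surface_points(csv.DictReader(f)))
```

The normalized coordinates run from 0 to 1 across the surface, so they can be scaled to screen pixels by multiplying with the canvas resolution.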


Opensesame template

The provided OpenSesame example experiment shows the participant an image, and the participant must decide whether the image is AI-generated or real. The recording starts right before the instruction sketchpad is shown and ends after the last image. Each time a new image is shown, we save an event with its timestamp.

Run the example file

To run the example file:

  • Download the image folders (fake, real, april_tag) and save them to the same folder as the .osexp experiment file.
  • Pip-install the real-time API in the OpenSesame console.
  • Connect the laptop/PC to the same local network as the Neon Companion device.

After that, you are ready to run the example experiment!

Resources

Documentation

  1. OpenSesame Python Manual: official documentation for Python scripting in OpenSesame, providing detailed information about its usage and functionality.
  2. Pupil Labs Real-Time API Documentation: detailed documentation for the Pupil Labs Real-Time API, including examples and usage instructions for interfacing with Pupil Labs eye-tracking devices.

Support

  • Support from Pupil Labs - The Pupil Labs team provided valuable assistance and guidance during the development of this tutorial.
