This tutorial shows how to create an experiment in OpenSesame with an integrated Pupil Labs eye-tracking device. We use Python inline coding. An example OpenSesame experiment with an integrated tracker is provided.
- Connecting to the eye-tracking device
- Starting the recording
- Saving events
- Stopping the recording
- Processing the recordings
- OpenSesame template
- Resources
To connect the laptop/PC to the tracker, both devices must be on the same local network. For discovery, the local network must allow mDNS and UDP traffic. On large public networks, this may be prohibited for security reasons. Alternatively, a hotspot can be created using a third device, one that is neither the Companion phone nor the laptop running OpenSesame.
Pupil Labs provides a real-time API that allows you to control the tracking device.
To work with it, install the package through the OpenSesame console:
pip install pupil-labs-realtime-api
OpenSesame provides the inline_script item, which we will use to add Python code to the experiment body. Use the following code snippet in the Prepare phase of an inline_script item to initialize the tracking device.
To find the IP address of the Neon device, open the settings of the Companion phone and navigate to About device -> Status; the IP address is listed there.
from pupil_labs.realtime_api.simple import Device
ip = "ip of the Neon device"  # e.g. "192.168.1.100"
device = Device(address=ip, port=8080)
You can check if the connection was set by printing out a status update from the device:
print(f"Phone IP address: {device.phone_ip}")
print(f"Phone name: {device.phone_name}")
print(f"Phone unique ID: {device.phone_id}")
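On a busy network the first connection attempt can fail. A small retry wrapper can make the Prepare phase more robust; this is only a sketch, where connect is any zero-argument callable you supply (for example, lambda: Device(address=ip, port=8080)):

```python
import time

def connect_with_retry(connect, attempts=3, delay_s=2.0):
    """Call `connect` up to `attempts` times, sleeping between failed
    tries. Assumes `connect` raises an exception when the device is
    unreachable and returns a connected device object on success."""
    last_error = None
    for _ in range(attempts):
        try:
            return connect()
        except Exception as exc:
            last_error = exc
            time.sleep(delay_s)
    raise ConnectionError(
        f"could not reach the device after {attempts} attempts"
    ) from last_error
```

Because the connection logic is injected as a callable, the same helper works unchanged if you later switch to automatic device discovery.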
To start the recording, place this code in the Run phase of an inline_script item:
recording_id = device.recording_start()
print(f"Started recording with id {recording_id}")
While the recording is running, you can create events using the send_event() method:
import time
device.send_event("test event 2", event_timestamp_unix_ns=time.time_ns())
As in the example, you can optionally pass a custom timestamp for the event; if you omit it, the event is timestamped with its time of arrival at the device.
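In a trial loop it can be convenient to wrap this call in a small helper that also returns the timestamp it sent, so the same value can be logged in the OpenSesame data file. A minimal sketch, assuming only the send_event() signature used above:

```python
import time

def send_trial_event(device, label, timestamp_ns=None):
    """Send a named event to the tracker and return the timestamp used.
    `device` is assumed to expose
    send_event(name, event_timestamp_unix_ns=...) like
    pupil_labs.realtime_api.simple.Device."""
    if timestamp_ns is None:
        timestamp_ns = time.time_ns()
    device.send_event(label, event_timestamp_unix_ns=timestamp_ns)
    return timestamp_ns
```

In the Run phase of a trial you could call, for example, send_trial_event(device, f"image_{image_name}") and store the returned value with var.set() for later alignment.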
Use the recording_stop_and_save() method to stop the recording:
device.recording_stop_and_save()
device.close()
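If the experiment script raises an error partway through, the stop and close calls can be skipped and the recording left dangling. One defensive pattern is to wrap the experiment body in try/finally; this is a sketch that assumes only the three Device methods used above:

```python
def run_with_recording(device, run_experiment):
    """Start a recording, run the experiment body, and guarantee that
    the recording is stopped, saved and the connection closed even if
    `run_experiment` raises. Assumes the Device API from
    pupil_labs.realtime_api.simple (recording_start,
    recording_stop_and_save, close)."""
    recording_id = device.recording_start()
    try:
        run_experiment()
    finally:
        device.recording_stop_and_save()
        device.close()
    return recording_id
```

The original exception, if any, still propagates to OpenSesame after cleanup, so errors remain visible in the console.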
After stopping and saving, the recording is automatically uploaded to Pupil Cloud. To analyse the participants' gaze data, it needs to be mapped onto a defined surface; the Pupil Cloud Marker Mapper enrichment does exactly that. The remapped gaze data can then be downloaded in CSV format.
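Once downloaded, the CSV can be inspected with standard tooling. As an illustration, here is a small sketch that computes the fraction of gaze samples detected on the surface; the column name "gaze detected on surface" is an assumption based on the Pupil Cloud export format and should be checked against your downloaded file:

```python
import csv

def on_surface_ratio(rows):
    """Fraction of gaze samples that fell on the defined surface.
    `rows` are dicts, e.g. from csv.DictReader over the Marker Mapper
    gaze export. NOTE: the column name below is an assumption about
    the export format; verify it against your own CSV header."""
    total = 0
    on_surface = 0
    for row in rows:
        total += 1
        if row["gaze detected on surface"].strip().lower() == "true":
            on_surface += 1
    return on_surface / total if total else 0.0
```

Usage would look like: with open("gaze.csv") as f: ratio = on_surface_ratio(csv.DictReader(f)).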
To define the PC screen as a surface, we use AprilTags from the 36h11 family, placing four tags at the corners of the OpenSesame canvas throughout the experiment.
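The corner coordinates for the four tags can be computed once and reused on every canvas. A sketch, assuming OpenSesame's default centred coordinate system (origin at the screen centre, y increasing downward):

```python
def corner_positions(width, height, tag_size, margin=0):
    """Centre coordinates for four AprilTags in the screen corners.
    width/height: canvas size in pixels; tag_size: side length of the
    tag image; margin: gap between the tag and the screen edge.
    Uses OpenSesame's centred coordinates (origin in the middle,
    y pointing down)."""
    dx = width / 2 - margin - tag_size / 2
    dy = height / 2 - margin - tag_size / 2
    return {
        "top_left": (-dx, -dy),
        "top_right": (dx, -dy),
        "bottom_left": (-dx, dy),
        "bottom_right": (dx, dy),
    }
```

In an inline_script you could then loop over the result and draw a tag image at each position with the canvas image() method; the tag filenames you pass are whatever you saved in the file pool.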
- Create a project in your workspace.
- Add the recording to the project.
- On the project page, navigate to Enrichments, press Create enrichment, choose Marker Mapper and press Create. You might need to move a few frames forward or backward before the AprilTags are detected.
- After giving the surface a name and defining it, press the Run button in the top left corner to start the video processing.
- When processing finishes, the mapped gaze data is available for download in the Downloads tab under Enrichment data.
The provided OpenSesame example experiment shows the participant an image, and the participant must decide whether the image is AI-generated or real. The recording starts right before the instruction sketchpad is shown and ends after the last image. Each time a new image is shown, we save a timestamped event.
To run the example file:
- Download the image folders (fake, real, april_tag) and save them in the same folder as the .osexp experiment file.
- Pip-install the real-time API in the OpenSesame console.
- Connect the laptop/PC to the same local network as the Neon Companion device.
After that, you are ready to run the example experiment!
- OpenSesame Python Manual: the official documentation for Python scripting in OpenSesame, providing detailed information about its usage and functionality.
- Pupil Labs Real-Time API Documentation: detailed documentation for the Pupil Labs Real-Time API, including examples and usage instructions for interfacing with Pupil Labs eye-tracking devices.
- Support from Pupil Labs: the Pupil Labs team provided valuable assistance and guidance during the development of this tutorial.