
Data Collection

The experiment aimed to capture EEG signals associated with motor commands using OpenBCI’s Cyton+Daisy board and a gel-free electrode cap equipped with 16 electrodes (FP1, FP2, C3, C4, T3, T4, O1, O2, F7, F8, F3, F4, T5, T6, P3, P4). The collected data were used to build a classification model for predicting driver intentions from EEG signals, synchronized with external sensors and robotic control systems.

1. Pre-Session Preparations

A detailed pre-session checklist ensured that all hardware and software components were functional and that data integrity was maintained throughout the session.

1.1 Gel-Free EEG Cap (OpenBCI Headset)

Cleanliness: Ensured that the headset and electrodes were clean and free from debris.
Battery Check: Verified the battery level and replaced it if necessary.
Bluetooth Connectivity: Confirmed a stable wireless connection between the EEG headset and the PC.
Electrode Positioning: Checked all electrode connections for proper placement to optimize signal capture.
A y-splitter cable connects the bottom SRB pins (SRB2) on the Cyton board and the Daisy module together, and then connects to the REF location of the cap.
Connect the bottom pins N1P through N8P on the Cyton; these pins correspond to channels 1-8 as shown in the table below:
1 - Cyton Board

| Number | Electrode | Cyton Board Pin | Wire Color |
| --- | --- | --- | --- |
|  | y-splitter | Bottom SRB pin (SRB2) | White |
| 1 | FP1 | Bottom N1P pin | Grey |
| 2 | FP2 | Bottom N2P pin | Purple |
| 3 | C3 | Bottom N3P pin | Blue |
| 4 | C4 | Bottom N4P pin | Green |
| 5 | T3 | Bottom N5P pin | Yellow |
| 6 | T4 | Bottom N6P pin | Orange |
| 7 | O1 | Bottom N7P pin | Red |
| 8 | O2 | Bottom N8P pin | Brown |
Connect the top pins N1P through N8P on the Daisy module to another set of HPTA cables; these pins correspond to channels 9-16 as shown in the table below.
Connect the bottom BIAS pin of the Cyton to the GND location on the cap.
2 - Daisy

| Number | Electrode | Daisy Pin | Wire Color |
| --- | --- | --- | --- |
| 9 | F7 | Bottom N1P pin | Grey |
| 10 | F8 | Bottom N2P pin | Purple |
| 11 | F3 | Bottom N3P pin | Blue |
| 12 | F4 | Bottom N4P pin | Green |
| 13 | T5 | Bottom N5P pin | Yellow |
| 14 | T6 | Bottom N6P pin | Orange |
| 15 | P3 | Bottom N7P pin | Red |
| 16 | P4 | Bottom N8P pin | Brown |
Signal Quality Test: A brief test using the OpenBCI GUI was conducted to verify EEG signal strength and check for packet loss, ensuring proper electrode placement and data quality.
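The packet-loss part of this check can be sketched in a few lines of Python: each packet from the Cyton firmware carries a sample counter that wraps at 256, so gaps in the counter sequence reveal dropped packets. This is an illustrative sketch of the arithmetic, not the OpenBCI GUI's actual implementation.

```python
def count_dropped_packets(sample_counters):
    """Estimate dropped packets from Cyton sample counters (0-255, wrapping).

    Each received packet carries a counter that increments by 1 modulo 256;
    any jump larger than 1 indicates packets missed in between.
    """
    dropped = 0
    for prev, curr in zip(sample_counters, sample_counters[1:]):
        gap = (curr - prev) % 256  # expected gap is 1
        if gap == 0:
            continue  # duplicate counter; ignore
        dropped += gap - 1
    return dropped

# Counters 0,1,2,5 imply packets 3 and 4 were lost:
print(count_dropped_packets([0, 1, 2, 5]))        # -> 2
# Wrap-around from 255 back to 0 is not a loss:
print(count_dropped_packets([254, 255, 0, 1]))    # -> 0
```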

1.2 Zed Camera

Lens Inspection: Cleaned the camera lens to ensure clear video capture.
Connectivity: Checked that the camera was securely connected to the robot computer.
Positioning: Ensured proper camera positioning on the robot for optimal visual input.
Video Feed Test: Conducted a video capture test to verify proper video feed transmission.

1.3 Yoctopuce Sensors (GPS, IMU, Light)

Connection Verification: Confirmed secure connections of the Yoctopuce sensors to the robot computer.
Functionality Test: Tested the sensors (GPS, IMU, light) for accurate readings and functionality.

1.4 Robot Computer

Peripheral Recognition: Verified that all peripherals (camera, sensors) were recognized by the robot computer.
Communication Link: Tested the communication link between the robot computer and the PC to ensure proper synchronization, using the same NTP (Network Time Protocol) to ensure consistent time alignment across devices.
Storage Space: Checked for sufficient storage capacity on the robot computer for data collection.
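The NTP alignment mentioned above rests on a simple offset estimate from four timestamps exchanged between client and server. The sketch below shows the standard NTP offset formula in Python; the timestamp values are made up for illustration.

```python
def ntp_offset(t0, t1, t2, t3):
    """Standard NTP clock-offset estimate between two machines.

    t0: client send time, t1: server receive time,
    t2: server send time,  t3: client receive time.
    A positive result means the server clock is ahead of the client clock.
    """
    return ((t1 - t0) + (t2 - t3)) / 2.0

# Server clock 0.5 s ahead, with 0.1 s one-way network delay:
print(ntp_offset(10.0, 10.6, 10.6, 10.2))  # -> 0.5
```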

1.5 PC Computer

Display Frequency: Configured the display screens to operate at a refresh rate of 60 Hz.
Device Recognition: Ensured that all connected devices (EEG headset, Xbox controller) were recognized by the PC.
Data Storage & Backup: Verified sufficient storage space and functionality of the data backup system on the PC.

1.6 Xbox Controller

Battery Check: Verified the battery level of the Xbox controller.
Bluetooth Connectivity: Ensured a stable Bluetooth connection between the controller and the PC.
Functionality Test: Tested all buttons, triggers, and joysticks for functionality.
Speed Adjustment: Set the rover’s control speed to 0.1, appropriate for the experiment.
Control Test: Performed a test run to confirm the Xbox controller could accurately control the rover.
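The speed cap in the list above amounts to scaling the joystick axis into a bounded velocity command. The sketch below illustrates that mapping; the deadzone threshold is an assumption for the example, and this is not the actual rr_control_input_manager code.

```python
MAX_SPEED = 0.1   # rover control speed used in the experiment
DEADZONE = 0.05   # ignore small stick drift (assumed threshold)

def axis_to_velocity(axis, max_speed=MAX_SPEED, deadzone=DEADZONE):
    """Map a joystick axis value in [-1, 1] to a velocity command."""
    axis = max(-1.0, min(1.0, axis))  # clamp out-of-range values
    if abs(axis) < deadzone:
        return 0.0                    # inside deadzone: no movement
    return axis * max_speed

print(axis_to_velocity(1.0))   # full forward -> 0.1
print(axis_to_velocity(0.02))  # inside deadzone -> 0.0
```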

1.7 General Preparations

Consent Forms: Prepared participant consent forms and information sheets.
Environment Setup: Organized the experimental environment for participant safety and comfort, ensuring all equipment was properly secured and arranged for ease of use. Additionally, steps were taken to minimize environmental noise and disturbances that could affect EEG signal quality.
Walkthrough: Conducted a final walkthrough of the experimental process to identify potential issues.
Troubleshooting Guide: Ensured the availability of a troubleshooting guide and spare parts (e.g., cables, batteries).

1.8 Final System Check

Full System Test: Conducted a mock session to ensure all devices (EEG headset, camera, sensors, Xbox controller) and software components were working in harmony.
Data Verification: Verified that the rosbag recordings were capturing data accurately from all devices.
Data Backup: Confirmed that all collected data was saved and backed up in multiple locations.

2. Pre-Session Steps

Before each data collection session, several critical configurations were applied to ensure smooth operation and data integrity. These steps ensured that both hardware and software components were properly set up:

2.1 Display Settings Configuration

To ensure both the subject and researcher screens were configured correctly:
The resolution for both screens was set to 1920x1080 with a refresh rate of 60 Hz using the following commands:
xrandr --output HDMI-0 --mode 1920x1080 --rate 60
xrandr --output DP-3 --mode 1920x1080 --rate 60
This configuration ensured that visual stimuli, control interfaces, and monitoring systems displayed accurately and at the correct refresh rate.
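The applied configuration can be verified by parsing `xrandr --query` output, where the active refresh rate is marked with a `*` suffix. The parser below is a minimal sketch built on that assumption about xrandr's text format.

```python
import re

def active_modes(xrandr_output):
    """Extract {output_name: (mode, rate)} for active modes from xrandr text.

    xrandr marks the currently active rate with '*',
    e.g. '   1920x1080     60.00*+  50.00'.
    """
    modes = {}
    current = None
    for line in xrandr_output.splitlines():
        head = re.match(r"^(\S+) connected", line)
        if head:
            current = head.group(1)
            continue
        m = re.match(r"^\s+(\d+x\d+)\s+(.*)", line)
        if m and current:
            for rate in m.group(2).split():
                if "*" in rate:
                    modes[current] = (m.group(1), float(rate.rstrip("*+")))
    return modes

sample = """HDMI-0 connected primary 1920x1080+0+0
   1920x1080     60.00*+  50.00
   1280x720      60.00
"""
print(active_modes(sample))  # -> {'HDMI-0': ('1920x1080', 60.0)}
```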

2.2 Initialization of Rover Components

To begin the data collection session, the rover’s components were initialized:
Initialization Scripts: Pre-prepared shell scripts were executed to start the following components:
./rover.sh
./zed.sh
./yocto.sh
These scripts started the rover, the Zed camera, and the Yoctopuce sensors.
ROS Launch: As an alternative, the initialization could be done using ROS launch commands:
roslaunch rr_openrover_basic pro.launch
roslaunch zed_wrapper zed.launch
roslaunch yoctopuce Yocto.launch
This method integrates all components into the ROS framework, streamlining synchronization and data flow management.

3. Data Collection & Recording

Once all hardware components were configured and initialized, data collection and recording were initiated. This phase involved streaming and recording the data from multiple sources simultaneously.

3.1 EEG and Xbox Controller Streaming

To start streaming EEG data and control commands from the Xbox controller:
Manual Start: Shell scripts were used to initialize the streaming processes:
./xbox.sh
./eeg.sh
./ssvep.sh
These scripts controlled the Xbox input manager, EEG streaming, and SSVEP stimuli presentation.
ROS Integration: Alternatively, ROS commands were executed to manage the same processes:
roslaunch rr_control_input_manager xbox_controller.launch
rosrun bcv_ip4d streamingEEGData.py
rosrun bcv_ip4d ssvep_stimuli.py

3.2 GitHub Repositories for Code

The data collection process leveraged code from the following repositories:
Rover control repository: This repository contains modified code for controlling the rover using an Xbox controller. The original code, developed by Rover Robotics, was adapted to fix performance issues and ensure smooth operation during the experiment. ROS-based command handling allows integration with the robotic system, ensuring precise control over the rover's movements.
EEG streaming repository: This repository contains the code for streaming EEG data from the OpenBCI Cyton board and presenting SSVEP visual stimuli via PsychoPy. The ROS integration ensures that the EEG signals are synchronized with the rover's sensor data and that visual stimuli flicker at the frequencies designed to measure brain responses.

3.3 Disk Space Check
Before recording the data, disk space was verified to avoid potential issues during recording:
Robot System: Checked available storage on the robot using the command:
df -h /mnt/bcv/
PC System: Verified sufficient storage on the PC’s capture drive:
df -h /media/capture/bcv
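The same check can be scripted with Python's standard library, which is convenient when the verification runs as part of an automated pre-session script. The sketch below reports free space like the "Avail" column of `df -h`; the root path is used here for illustration, while in practice the capture paths above would be passed in.

```python
import shutil

def free_gib(path):
    """Return free space at `path` in GiB."""
    usage = shutil.disk_usage(path)
    return usage.free / 2**30

# On the robot/PC, /mnt/bcv/ and /media/capture/bcv would be checked instead.
print("free: %.1f GiB" % free_gib("/"))
```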

3.4 Start Rosbag Recording

To record the synchronized data streams from all devices, rosbag was used for efficient data capture and compression:
On Robot System: The following command was used to record multiple topics:
rosbag record -b 4096 --lz4 -o SXXSXX \
/eeg_data \
/yoctopuce/fix \
/yoctopuce/imu \
/yoctopuce/light \
/cmd_vel/joystick \
/zed/zed_node/left/camera_info \
/zed/zed_node/left/image_rect_color \
/zed/zed_node/odom \
/zed/zed_node/path_map \
/zed/zed_node/path_odom \
/zed/zed_node/pose \
/zed/zed_node/pose/status \
/zed/zed_node/right/camera_info \
/zed/zed_node/right/image_rect_color \
/rr_openrover_basic/odom_encoder \
/rr_openrover_basic/raw_fast_rate_data \
/rr_openrover_basic/raw_slow_rate_data \
/tf
Buffer size (-b 4096) ensures efficient memory usage during the recording.
Compression (--lz4) reduces the file size while maintaining data integrity.
Output Filename (-o SXXSXX) indicates a session-specific naming convention for the recorded rosbags.
On PC: To record processed image data separately on the PC:
rosbag record -b 4096 --lz4 -o SXXSXX /processed_image
This setup ensured synchronized recording of EEG, joystick, camera, and sensor data, which were critical for downstream analysis of driver intentions and sensor data fusion.
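A long `rosbag record` invocation like the one above is easy to mistype between sessions, so it can help to assemble the argv list programmatically from a single topic list. This is an illustrative sketch; the topic list is abbreviated and the SXXSXX prefix remains the session-specific placeholder.

```python
TOPICS = [
    "/eeg_data",
    "/yoctopuce/fix",
    "/yoctopuce/imu",
    "/yoctopuce/light",
    "/cmd_vel/joystick",
    "/tf",
    # ... plus the /zed and /rr_openrover_basic topics from the list above
]

def rosbag_record_cmd(session_id, topics, buffer_mb=4096):
    """Build the argv list for a compressed rosbag recording."""
    return (["rosbag", "record", "-b", str(buffer_mb), "--lz4",
             "-o", session_id] + list(topics))

cmd = rosbag_record_cmd("SXXSXX", TOPICS)
print(" ".join(cmd))
```

The list could then be handed to `subprocess.Popen` on the robot, keeping the topic set identical across all sessions.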

4. Experimental Design

The experiment was designed to measure participants' brain activity during motor tasks while controlling a rover. Each participant sat in front of a screen displaying five continuously flickering circles, each corresponding to a specific driving command. These visual stimuli were active throughout the session, but participants responded according to a predefined route they had familiarized themselves with prior to the experiment.

4.1 Subject Information

A total of 12 healthy subjects participated in this experiment: 6 males and 6 females aged between 20 and 40 years. All participants provided informed consent before participating in the study, ensuring that ethical guidelines were followed.

4.2 Visual Stimuli and Corresponding Commands

The visual stimuli consisted of five flickering circles, each representing a different driving command:
Stop (Top Circle): Flicker frequency of 5 Hz
Turn Left (Left Circle): Flicker frequency of 10 Hz
Reverse (Center Circle): Flicker frequency of 15 Hz
Turn Right (Right Circle): Flicker frequency of 20 Hz
Forward (Bottom Circle): Flicker frequency of 30 Hz
These stimuli were presented on a screen with a refresh rate of 60 Hz to maintain consistent flicker visibility and accuracy.
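Each of these frequencies divides the 60 Hz refresh rate evenly, so every on/off cycle spans a whole number of frames. The quick check below illustrates that frame arithmetic; it is a sketch, not the ssvep_stimuli.py code.

```python
REFRESH_HZ = 60

def frames_per_cycle(flicker_hz, refresh_hz=REFRESH_HZ):
    """Frames spanned by one full on/off cycle of a flickering stimulus.

    The display can only change state on frame boundaries, so a clean
    periodic flicker requires refresh_hz to be divisible by flicker_hz.
    """
    if refresh_hz % flicker_hz != 0:
        raise ValueError("%d Hz does not divide the %d Hz refresh rate"
                         % (flicker_hz, refresh_hz))
    return refresh_hz // flicker_hz

for f in (5, 10, 15, 20, 30):
    print("%2d Hz -> %2d frames/cycle" % (f, frames_per_cycle(f)))
```

Note that 20 Hz yields an odd 3-frame cycle, so its on and off halves cannot both be whole frames; stimulus code commonly handles such cases by sampling a sinusoid at each frame rather than toggling a square wave.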

4.3 Task Execution

Although all five circles flickered simultaneously, participants were instructed to perform specific motor tasks (e.g., stopping, turning, reversing) only when they reached certain points on a predefined route. This route, shown in the figure below, was designed to ensure that each participant encountered all possible driving commands (stop, turn left, reverse, turn right, forward) at specific intervals.
[Figure: predefined route covering the five driving commands]

Participants were fully familiarized with this route before the session to ensure that they understood when to perform each motor task. The clear route design helped standardize the experiment and ensured that participants encountered the same driving scenarios in a controlled environment.

5. Repeated Sessions

To ensure robust data collection, each participant completed 10 sessions spread over multiple days. On each day, participants performed two sessions, with a 10-minute break between sessions to allow for cognitive rest.

Key Design Considerations:

Session Structure: Each session replicated the same experimental setup, ensuring consistency across trials.
Diversity of EEG Data: By repeating the experiment over 10 sessions, a broad range of EEG signals associated with different motor commands was captured, providing rich data for training the classification model.
Data Quality: The breaks between sessions helped reduce fatigue, ensuring participants remained focused and data quality remained high.
This approach ensured that enough data was collected to train a reliable classification model, capable of predicting driver intentions based on EEG signals.

Data Processing and Analysis
