# Game Development & VR Integration
*DSI-24 with VR Adapter clips*
Use Wearable Sensing EEG devices with game engines and VR platforms to create brain-computer interface (BCI) experiences. Stream real-time neural data into your engine of choice, trigger in-game events from brain signals, and build immersive neurofeedback applications.
The concepts and architecture described here apply to any game engine. Unity, Unreal Engine, Godot, and others can all consume LSL streams and communicate with a Python backend. The documentation focuses on Unity as the primary example, but the same approaches work across platforms.
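As a concrete illustration of the engine-agnostic architecture above, a Python backend can forward detected events to any engine over a plain UDP socket. This is a minimal sketch, not part of any Wearable Sensing SDK: the port, address, and JSON message schema are assumptions chosen for the example, and a real pipeline would typically read EEG from an LSL inlet (e.g. via pylsl) before forwarding.

```python
import json
import socket

# Hypothetical engine-side listener address; in practice this matches
# whatever UDP port your Unity/Unreal/Godot script binds to.
ENGINE_ADDR = ("127.0.0.1", 9000)

def send_event(sock: socket.socket, name: str, value: float) -> bytes:
    """Encode a BCI event as JSON and send it to the engine via UDP."""
    payload = json.dumps({"event": name, "value": value}).encode("utf-8")
    sock.sendto(payload, ENGINE_ADDR)
    return payload

# The engine-side listener is simulated here so the demo is self-contained.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(ENGINE_ADDR)

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_event(sender, "blink_detected", 0.92)  # illustrative event name

data, _ = listener.recvfrom(4096)
msg = json.loads(data)
print(msg["event"])  # blink_detected

listener.close()
sender.close()
```

UDP is a common choice for this bridge because datagrams are cheap and an occasional dropped event is usually acceptable for game triggers; the same pattern works unchanged across engines, since every major engine can read a UDP socket.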
## What Are You Looking For?
### Tutorials
- **Best Practices & Tooling** — Recommended architecture, triggering methods, tooling overview, and system requirements. Start here before anything else.
- **Unity Integration** — How to receive LSL markers and EEG data in Unity, with Python backend examples and communication code.
- **Game Development** — BCI paradigms, game design considerations, and an overview of the bci-essentials framework for building BCI games.
- **VR Development** — Device selection, VR adapter setup, platform SDK configuration, and EEG-VR integration patterns.