Game Development & VR Integration#

DSI-24 in a VR development setup

DSI-24 with VR Adapter clips

Use Wearable Sensing EEG devices with game engines and VR platforms to create brain-computer interface experiences. Stream real-time neural data into your engine of choice, trigger in-game events from brain signals, and build immersive neurofeedback applications.

The concepts and architecture described here apply to any game engine. Unity, Unreal Engine, Godot, and others can all consume LSL streams and communicate with a Python backend. The documentation focuses on Unity as the primary example, but the same approaches work across platforms.
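As a concrete illustration of this architecture, the sketch below shows a minimal Python backend that pulls EEG chunks from an LSL stream and forwards trigger events to a game engine over a local UDP socket. It assumes `pylsl` (and the liblsl binary) is installed; the threshold rule, the function names, and the UDP port 9000 are illustrative placeholders, not part of any official API — a real BCI pipeline would use proper filtering and classification rather than a raw amplitude threshold.

```python
import json
import socket
from statistics import fmean


def band_trigger(samples, channel=0, threshold=50.0):
    """Decide whether to fire an in-game event from a chunk of EEG samples.

    Illustrative only: averages one channel and compares it to a fixed
    threshold. Real pipelines apply filtering and classification first.
    """
    level = fmean(s[channel] for s in samples)
    return level > threshold


def stream_to_engine(host="127.0.0.1", port=9000):
    """Pull EEG chunks from LSL and forward trigger events to the engine via UDP."""
    from pylsl import StreamInlet, resolve_byprop  # requires pylsl + liblsl

    streams = resolve_byprop("type", "EEG", timeout=5)
    if not streams:
        raise RuntimeError("No EEG stream found on the network")
    inlet = StreamInlet(streams[0])
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        chunk, timestamps = inlet.pull_chunk(timeout=1.0)
        if chunk and band_trigger(chunk):
            payload = {"event": "threshold_crossed", "t": timestamps[-1]}
            sock.sendto(json.dumps(payload).encode(), (host, port))
```

On the engine side, any platform that can read a UDP socket (Unity, Unreal, Godot) can parse these JSON events and trigger in-game behavior.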

What Are You Looking For?#

  • How do I align triggers and events? → Triggers and Event Alignment

  • How do I get EEG data into my engine? → Best Practices & Tooling

  • How do I build a real-time BCI game? → Game Development with DSI

  • How do I develop for VR with EEG? → VR Development with DSI devices

Tutorials#

  • Best Practices & Tooling — Recommended architecture, triggering methods, tooling overview, and system requirements. Start here before anything else.

  • Unity Integration — How to receive LSL markers and EEG data in Unity, with Python backend examples and communication code.

  • Game Development — BCI paradigms, game design considerations, and an overview of the bci-essentials framework for building BCI games.

  • VR Development — Device selection, VR adapter setup, platform SDK configuration, and EEG-VR integration patterns.
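The trigger-alignment and Unity tutorials above revolve around LSL markers, which share the LSL clock with the EEG samples and therefore stay aligned with the recording. As a hedged sketch of the publishing side (assuming `pylsl` is installed; the stream name "GameEvents" and both helper names are hypothetical), a game or backend can emit timestamped event markers like this:

```python
import json


def encode_marker(event, **meta):
    """Pack an event name plus optional metadata into one string marker payload."""
    return json.dumps({"event": event, **meta})


def make_marker_outlet(name="GameEvents"):
    """Create an LSL string-marker outlet; markers are stamped on the LSL clock."""
    from pylsl import IRREGULAR_RATE, StreamInfo, StreamOutlet  # requires pylsl + liblsl

    info = StreamInfo(name, "Markers", 1, IRREGULAR_RATE, "string", "game-markers-01")
    return StreamOutlet(info)


# Usage (on the game side, whenever a stimulus appears):
# outlet = make_marker_outlet()
# outlet.push_sample([encode_marker("stimulus_onset", trial=1)])
```

Because the markers and the EEG travel through the same LSL clock domain, recording software such as LabRecorder can align them without extra hardware triggers.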

Additional Resources#