
SlicerRoboTMS


Introduction

SlicerRoboTMS is an open-source 3D Slicer extension that provides a unified interaction infrastructure for Robot-assisted Transcranial Magnetic Stimulation (Robo-TMS) research. By leveraging 3D Slicer's medical imaging and visualisation capabilities, the extension supports MRI-based neuronavigation and interfaces with robotic systems through standardised communication protocols and configurable system descriptions.

The extension is organised as three independent functional modules — Calibration, Registration, and Navigation — which together support a complete Robo-TMS workflow. SlicerRoboTMS is designed to support diverse hardware configurations and rapid prototyping, lowering the barrier to entry and facilitating reproducible and extensible research in Robo-TMS.


Citation

This extension is described in the following paper, accepted at EMBC 2026. Until the paper appears in IEEE Xplore, an open-access arXiv preprint is available. If you use SlicerRoboTMS in your research, please cite:

@inproceedings{bai2026slicerrobotms,
  title     = {{SlicerRoboTMS}: An Open-Source {3D} {Slicer} Extension for Robot-Assisted Transcranial Magnetic Stimulation}, 
  author    = {Bai, Wenzhi and Guo, Yituo and Basu, Bhaskar and Weightman, Andrew and Li, Zhenhong},
  booktitle = {2026 48th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)}, 
  year      = {2026},
  pages     = {1-6},
  publisher = {IEEE},
  eprint    = {2604.25661},
  archivePrefix = {arXiv},
  doi       = {10.48550/arXiv.2604.25661},
  url       = {https://github.com/OpenRoboTMS/SlicerRoboTMS}
}

Features

  • Openness and Extensibility: Fully open-source with example implementations of three functional modules — hand-eye calibration, anatomical landmark-based MRI-to-patient registration, and real-time MRI-guided neuronavigation — supporting rapid prototyping, reproducible research, and community-driven development.
  • Modularity and Compatibility: Follows the modular design principles of 3D Slicer, leveraging its established data structures and communication protocols for seamless integration with existing tools and broader research workflows.
  • Hardware Agnosticism: Standardised communication interfaces and configuration-based hardware descriptions decouple image-guided workflows from hardware-specific implementations, enabling interfacing with a wide range of external devices.
  • Real-time 3D Visualisation: Event-driven data exchange and configurable update rates (30 Hz by default) support real-time visualisation of robotic poses, target locations, and anatomical context, with robot geometry initialised from URDF.

System Architecture

SlicerRoboTMS operates within a three-layer architecture:

Application layer (SlicerRoboTMS): handles MRI-based neuronavigation, system monitoring, and user interaction within 3D Slicer.

Integration layer (ROS): encapsulates system-specific algorithms — tracking, calibration, registration, and robotic control — and communicates with the application layer via OpenIGTLink through the ROS-IGTL-Bridge.

Physical layer: tracking devices and robotic manipulators, accessed indirectly through the integration layer.

Communication Protocol

All communication between layers uses OpenIGTLink over TCP/IP (default port 18944). Three message types are used:

| Type      | Description |
|-----------|-------------|
| TRANSFORM | Real-time spatial data: robot joint poses, tracking transforms, camera calibration |
| POLYDATA  | Geometrical objects: registration fiducials, landmark point clouds |
| STATUS    | System states and execution commands: readiness flags, calibration/registration completion |

Scene updates are driven by a Qt timer at 30 Hz, matching the refresh frequency of optical tracking systems.
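Handling of the three message types can be sketched as a simple routing table. The `Message` container and handler names below are illustrative only; the actual extension reacts to SlicerOpenIGTLink node events rather than a hand-rolled dispatcher.

```python
# Illustrative sketch: dispatch incoming OpenIGTLink messages by type.
# Message, route, and the handler names are hypothetical, not the
# extension's real API.
from dataclasses import dataclass

@dataclass
class Message:
    msg_type: str   # "TRANSFORM", "POLYDATA", or "STATUS"
    name: str       # device/message name, e.g. "calib_complete"
    payload: object

def route(msg, handlers):
    """Look up the handler registered for msg.msg_type and invoke it."""
    try:
        handler = handlers[msg.msg_type]
    except KeyError:
        raise ValueError(f"unsupported message type: {msg.msg_type}")
    return handler(msg)

handlers = {
    "TRANSFORM": lambda m: ("pose-update", m.name),
    "POLYDATA":  lambda m: ("geometry", m.name),
    "STATUS":    lambda m: ("state-flag", m.name, m.payload),
}

print(route(Message("STATUS", "calib_complete", True), handlers))
# -> ('state-flag', 'calib_complete', True)
```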


Modules

Calibration Module

Performs hand-eye calibration between the Intel RealSense D455 tracking camera and the Franka Research 3 robot base. The calibration computes the rigid transformation that aligns the camera coordinate frame with the robot base frame, using AprilTag fiducial markers attached to the robot end-effector.

Workflow:

  1. Prerequisites verified: FR3 ready, VINS ready, end-effector marker detected
  2. User clicks Start Calibration — command sent to ROS via start_calibration STATUS message
  3. ROS calibrator node computes hand-eye transform from multiple viewpoints
  4. On completion (calib_complete = true), camera model becomes visible in the 3D scene
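The constraint underlying hand-eye calibration can be checked on synthetic data: for a relative robot motion A and the corresponding relative camera observation B, the unknown hand-eye transform X satisfies A·X = X·B. The NumPy sketch below verifies this identity for a known X; it is an illustration of the constraint, not the calibrator node's multi-viewpoint solver.

```python
import numpy as np

def se3(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Ground-truth hand-eye transform X and one relative end-effector motion A
# (arbitrary values chosen for the demonstration).
X = se3(rot_z(0.3) @ rot_x(0.1), [0.10, 0.02, 0.05])
A = se3(rot_z(0.7), [0.20, -0.05, 0.10])

# The marker motion seen by the camera satisfies B = X^-1 A X, so any
# candidate solution for X can be validated against A @ X == X @ B.
B = np.linalg.inv(X) @ A @ X
print(np.allclose(A @ X, X @ B))  # True
```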

Registration Module

Aligns the MRI image coordinate space with the physical patient/phantom space using five anatomical landmarks. The predefined landmark positions in the head model coordinate frame are sent to ROS as a POLYDATA point cloud, and the physical landmark positions are sampled interactively using a registration stylus.

Workflow:

  1. Prerequisites verified: head marker ready, registration stylus ready
  2. User samples each of the five landmarks in turn
  3. User clicks Start Registration — landmark reference point cloud sent via OpenIGTLink
  4. ROS register node computes MRI-to-physical transform
  5. On completion (regist_complete = true), head marker and registration stylus models become visible
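The MRI-to-physical alignment computed by the register node is a landmark-based rigid fit; the standard SVD (Kabsch) solution is sketched below on five synthetic landmarks. This is a generic illustration, not the node's source code.

```python
import numpy as np

def rigid_landmark_fit(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst (Kabsch)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Five synthetic "anatomical landmarks" in the MRI frame (mm)
mri = np.array([[0, 0, 0], [80, 0, 0], [-80, 0, 0], [0, 90, 30], [0, -60, 20]], float)

# Simulated physical-space positions: a known rotation plus translation
theta = np.deg2rad(25)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
t_true = np.array([10.0, -5.0, 2.0])
phys = mri @ R_true.T + t_true

R, t = rigid_landmark_fit(mri, phys)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```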

Navigation Module

Provides real-time MRI-guided navigation for TMS procedures. The user selects a target plane on the MRI volume in 3D Slicer; the plane origin and normal are used to construct a target pose matrix that is sent to the robot. The module also maintains real-time MRI slice views that track head movement.

Workflow:

  1. Prerequisites verified: all prior steps complete (FR3, VINS, marker, calibration, registration)
  2. User creates a markup plane on the MRI volume to define the coil target
  3. Send Target — target pose transform sent to ROS tracker node
  4. Start Navigation — robot begins moving; MRI slices track head movement in real time
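Constructing a target pose from the markup plane's origin and normal can be sketched as below. The convention of aligning the pose's z-axis with the plane normal, and the reference-vector trick for completing the frame, are assumptions for illustration rather than the module's documented convention.

```python
import numpy as np

def target_pose(origin, normal):
    """Build a 4x4 target pose whose z-axis points along the plane normal
    (assumed convention). The x/y axes are any orthonormal completion."""
    origin = np.asarray(origin, float)
    z = np.asarray(normal, float)
    z = z / np.linalg.norm(z)
    ref = np.array([1.0, 0.0, 0.0])
    if abs(z @ ref) > 0.9:          # avoid a near-parallel reference vector
        ref = np.array([0.0, 1.0, 0.0])
    x = np.cross(ref, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)              # completes a right-handed frame
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, origin
    return T

T = target_pose([1.0, 2.0, 3.0], [0.0, 0.0, 2.0])
print(np.allclose(T[:3, 2], [0, 0, 1]))  # True: z-axis follows the normal
```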

Getting Started

Clone the Repository

git clone https://github.com/OpenRoboTMS/SlicerRoboTMS.git
cd SlicerRoboTMS
git submodule update --init --recursive

Installation

See INSTALLATION.md for complete step-by-step instructions.

Quick summary:

  1. Install Ubuntu 20.04 with real-time kernel (for robot control)
  2. Install 3D Slicer 5.8.1 or later with the SlicerOpenIGTLink extension
  3. Set up ROS Noetic and build the ROS-IGTL-Bridge
  4. Clone SlicerRoboTMS and register it with 3D Slicer's module paths

First Steps

  1. Launch the ROS-IGTL-Bridge on the integration machine:
    roslaunch ros_igtl_bridge bridge.launch
  2. Navigate to the SlicerRoboTMS category in the module selector
  3. Work through Calibration → Registration → Navigation in sequence

Requirements

Software

| Dependency | Version   | Notes |
|------------|-----------|-------|
| Ubuntu     | 20.04 LTS | Real-time kernel patches recommended for robot control |
| 3D Slicer  | 5.8.1+    | With SlicerOpenIGTLink extension |
| ROS        | Noetic    | Required for integration layer |
| Python     | 3.8+      | Bundled with Slicer |

Hardware (example integration)

| Component         | Model used in paper |
|-------------------|---------------------|
| Tracking camera   | Intel RealSense D455 (1280×800 px, 30 Hz) |
| Fiducial markers  | AprilTags (customisable) |
| Robot manipulator | Franka Research 3 |
| Host computer     | Intel Core i9, 64 GB RAM |

The extension is designed for hardware agnosticism — alternative cameras, robots, and tracking systems can be integrated by adapting the ROS integration layer without modifying the SlicerRoboTMS extension itself.


Configuration

All configurable parameters are in SlicerRoboTMS/Config.py:

IGTL_CONFIG = {
    "host": "127.0.0.1",  # IP of the ROS-IGTL-Bridge machine
    "port": 18944,        # OpenIGTLink port
    "timeout": 5000,      # Connection timeout (ms)
}

UPDATE_RATES = {
    "update_from_ros": 30,  # State polling frequency (Hz)
}

Model files, MRI data paths, and visualisation colours are also defined there.
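As a hypothetical convenience, connection settings like these could be overridden at run time while rejecting typos in key names; SlicerRoboTMS itself simply imports the dicts from Config.py, so the helper below is a sketch, not part of the extension.

```python
# Defaults as in SlicerRoboTMS/Config.py
IGTL_CONFIG = {
    "host": "127.0.0.1",  # IP of the ROS-IGTL-Bridge machine
    "port": 18944,        # OpenIGTLink port
    "timeout": 5000,      # Connection timeout (ms)
}

def with_overrides(defaults, **overrides):
    """Return a copy of defaults with overrides applied; unknown keys raise."""
    unknown = set(overrides) - set(defaults)
    if unknown:
        raise KeyError(f"unknown config keys: {sorted(unknown)}")
    return {**defaults, **overrides}

cfg = with_overrides(IGTL_CONFIG, host="192.168.1.20")
print(cfg["host"], cfg["port"])  # 192.168.1.20 18944
```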


Project Structure

SlicerRoboTMS/
├── LICENSE
├── README.md
├── INSTALLATION.md
├── CONTRIBUTING.md
│
├── SlicerRoboTMS/             # Main extension modules
│   ├── Calibration/           # Hand-eye calibration
│   │   ├── Calibration.py     # Module + CalibrationTest
│   │   ├── Resources/         # UI file and icon
│   │   └── Testing/Python/    # CMake test registration
│   ├── Navigation/            # MRI-guided navigation
│   │   ├── Navigation.py      # Module + NavigationTest
│   │   ├── Resources/
│   │   └── Testing/Python/
│   ├── Registration/          # Landmark-based registration
│   │   ├── Registration.py    # Module + RegistrationTest
│   │   ├── Resources/
│   │   └── Testing/Python/
│   ├── utils/                 # Shared utilities
│   │   ├── BaseLogic.py       # Shared IGTL/scene logic base class
│   │   ├── BaseWidget.py      # Shared UI/timer base class
│   │   ├── LoggingUtils.py
│   │   ├── PathSetup.py
│   │   └── URDFParser.py      # URDF → Slicer scene loader
│   ├── Config.py              # Central configuration
│   └── CMakeLists.txt
│
├── models/                    # STL mesh files (robot links, coil, head, etc.)
├── urdfs/                     # Robot URDF description
├── mri_data/                  # MRI volume (NIfTI format)
├── logs/                      # Runtime log files (gitignored)
│
├── .github/workflows/         # CI: linting and tests
├── .pre-commit-config.yaml
├── .flake8
├── .pylintrc
├── pyproject.toml
│
└── third_party/               # External dependencies (submodules)
    ├── OpenIGTLink/
    ├── ROS-IGTL-Bridge/
    ├── Slicer/
    └── franka_description/

Troubleshooting

Cannot connect ROS-IGTL-Bridge to Slicer

  • Ensure the bridge and Slicer are configured with complementary roles (one as server, the other as client)
  • Check that port 18944 is not blocked by a firewall
  • Confirm the bridge is running: roslaunch ros_igtl_bridge bridge.launch

Modules do not appear in Slicer

  • Add the extension path in Edit → Application Settings → Modules
  • Check Slicer's Python console for import errors: View → Python Console

Robot does not visualise correctly

  • Verify urdfs/fr3.urdf and all models/fr3/ STL files are present
  • Confirm ROS is publishing joint transforms at the expected node names

See INSTALLATION.md for a full troubleshooting guide.


Contributing

We welcome contributions. Please see CONTRIBUTING.md for the code style guide (3D Slicer Python conventions), testing requirements, and pull request workflow.


License

This project is licensed under the MIT License — see LICENSE for details.



About

This is the open-source repository for SlicerRoboTMS, one of the two core components of OpenRoboTMS — an open-source robotic platform for navigated Transcranial Magnetic Stimulation (TMS).
