
Commit 611a749

Merge pull request ARISE-Initiative#577 from ARISE-Initiative/update-devices-docs

2 parents a651720 + 1872481

File tree

3 files changed (+46, -41 lines)

docs/algorithms/demonstrations.md

Lines changed: 4 additions & 30 deletions
```diff
@@ -2,39 +2,13 @@
 ## Collecting Human Demonstrations
 
-We provide teleoperation utilities that allow users to control the robots with input devices, such as the keyboard and the [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/). Such functionality allows us to collect a dataset of human demonstrations for learning. We provide an example script to illustrate how to collect demonstrations. Our [collect_human_demonstrations](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/scripts/collect_human_demonstrations.py) script takes the following arguments:
+We provide teleoperation utilities that allow users to control the robots with input devices, such as the keyboard, [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/) and mujoco-gui. Such functionality allows us to collect a dataset of human demonstrations for learning. We provide an example script to illustrate how to collect demonstrations. Our [collect_human_demonstrations](https://github.com/ARISE-Initiative/robosuite/blob/master/robosuite/scripts/collect_human_demonstrations.py) script takes the following arguments:
 
 - `directory:` path to a folder for where to store the pickle file of collected demonstrations
 - `environment:` name of the environment you would like to collect the demonstrations for
-- `device:` either "keyboard" or "spacemouse"
-
-### Keyboard controls
-
-Note that the rendering window must be active for these commands to work.
-
-| Keys     | Command                            |
-| :------: | :--------------------------------: |
-| q        | reset simulation                   |
-| spacebar | toggle gripper (open/close)        |
-| w-a-s-d  | move arm horizontally in x-y plane |
-| r-f      | move arm vertically                |
-| z-x      | rotate arm about x-axis            |
-| t-g      | rotate arm about y-axis            |
-| c-v      | rotate arm about z-axis            |
-| ESC      | quit                               |
-
-### 3Dconnexion SpaceMouse controls
-
-| Control                   | Command                               |
-| :-----------------------: | :-----------------------------------: |
-| Right button              | reset simulation                      |
-| Left button (hold)        | close gripper                         |
-| Move mouse laterally      | move arm horizontally in x-y plane    |
-| Move mouse vertically     | move arm vertically                   |
-| Twist mouse about an axis | rotate arm about a corresponding axis |
-| ESC (keyboard)            | quit                                  |
+- `device:` either "keyboard" or "spacemouse" or "mjgui"
+
+See the [devices page](https://robosuite.ai/docs/modules/devices.html) for details on how to use the devices.
 
 ## Replaying Human Demonstrations
 
@@ -79,7 +53,7 @@ The reason for storing mujoco states instead of raw observations is to make it e
 
 ## Using Demonstrations for Learning
 
-We have recently released the [robomimic](https://arise-initiative.github.io/robomimic-web/) framework, which makes it easy to train policies using your own [datasets collected with robosuite](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#robosuite-hdf5-datasets), and other publicly released datasets (such as those collected with RoboTurk). The framework also contains many useful examples for how to integrate hdf5 datasets into your own learning pipeline.
+The [robomimic](https://arise-initiative.github.io/robomimic-web/) framework makes it easy to train policies using your own [datasets collected with robosuite](https://arise-initiative.github.io/robomimic-web/docs/introduction/datasets.html#robosuite-hdf5-datasets). The framework also contains many useful examples for how to integrate hdf5 datasets into your own learning pipeline.
 
 The robosuite repository also has some utilities for using the demonstrations to alter the start state distribution of training episodes for learning RL policies; these have proved effective in [several](https://arxiv.org/abs/1802.09564) [prior](https://arxiv.org/abs/1807.06919) [works](https://arxiv.org/abs/1804.02717). For example, we provide a generic utility for setting various types of learning curriculums which dictate how to sample from demonstration episodes when doing an environment reset. For more information see the `DemoSamplerWrapper` class.
```
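The three documented arguments can be pictured as a minimal command-line interface. This is a sketch only: the flag names mirror the arguments listed above, but the real script's parser may define them differently or accept additional options.

```python
import argparse

# Interface sketch for collect_human_demonstrations.py.
# Flag names mirror the documented arguments; the actual script's
# parser may differ (this is an illustration, not the library code).
parser = argparse.ArgumentParser(description="Collect human demonstrations")
parser.add_argument("--directory", type=str, required=True,
                    help="folder in which to store collected demonstrations")
parser.add_argument("--environment", type=str, default="Lift",
                    help="environment to collect demonstrations for")
parser.add_argument("--device", type=str, default="keyboard",
                    choices=["keyboard", "spacemouse", "mjgui"],
                    help="teleoperation input device")

# Simulated invocation; omitted flags fall back to their defaults.
args = parser.parse_args(["--directory", "/tmp/demos", "--device", "mjgui"])
print(args.environment, args.device)  # -> Lift mjgui
```

Restricting `--device` with `choices` makes an unsupported device name fail fast at parse time instead of deep inside the teleoperation loop.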

docs/modules/devices.md

Lines changed: 41 additions & 10 deletions
````diff
@@ -10,16 +10,20 @@ We support keyboard input through the OpenCV2 window created by the mujoco renderer
 
 Note that the rendering window must be active for these commands to work.
 
-| Keys     | Command                            |
-| :------- | :--------------------------------- |
-| q        | reset simulation                   |
-| spacebar | toggle gripper (open/close)        |
-| w-a-s-d  | move arm horizontally in x-y plane |
-| r-f      | move arm vertically                |
-| z-x      | rotate arm about x-axis            |
-| t-g      | rotate arm about y-axis            |
-| c-v      | rotate arm about z-axis            |
-| ESC      | quit                               |
+| Keys               | Command                                  |
+| :----------------- | :--------------------------------------- |
+| Ctrl+q             | reset simulation                         |
+| spacebar           | toggle gripper (open/close)              |
+| up-right-down-left | move horizontally in x-y plane           |
+| .-;                | move vertically                          |
+| o-p                | rotate (yaw)                             |
+| y-h                | rotate (pitch)                           |
+| e-r                | rotate (roll)                            |
+| b                  | toggle arm/base mode (if applicable)     |
+| s                  | switch active arm (if multi-armed robot) |
+| =                  | switch active robot (if multi-robot env) |
+| ESC                | quit                                     |
 
 ## 3Dconnexion SpaceMouse
 
@@ -35,3 +39,30 @@ We support the use of a [SpaceMouse](https://www.3dconnexion.com/spacemouse_compact/en/)
 | Move mouse vertically     | move arm vertically                   |
 | Twist mouse about an axis | rotate arm about a corresponding axis |
 | ESC (keyboard)            | quit                                  |
+
+## Mujoco GUI Device
+
+To use the Mujoco GUI device for teleoperation, follow these steps:
+
+1. Set the renderer to `"mjviewer"`. For example:
+
+```python
+env = suite.make(
+    **options,
+    renderer="mjviewer",
+    has_renderer=True,
+    has_offscreen_renderer=False,
+    ignore_done=True,
+    use_camera_obs=False,
+)
+```
+
+Note: if using Mac, please use `mjpython` instead of `python`. For example:
+
+```
+mjpython robosuite/scripts/collect_human_demonstrations.py --environment Lift --robots Panda --device mjgui --camera frontview --controller WHOLE_BODY_IK
+```
+
+2. Double-click on a mocap body to select a body to drag, then:
+
+   - On Linux: `Ctrl` + right click to drag the body's position; `Ctrl` + left click to control the body's orientation.
+   - On Mac: `fn` + `Ctrl` + right click.
````
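The new keyboard bindings map naturally onto per-step 6-DoF deltas. The following is a standalone sketch of that idea: the key names follow the table in the diff above, but the step sizes and the `action_for` helper are illustrative assumptions, not robosuite's actual implementation.

```python
# Sketch: map keyboard keys to (dx, dy, dz, droll, dpitch, dyaw) deltas.
# Key names follow the updated table; step sizes are illustrative only.
POS_STEP, ROT_STEP = 0.05, 0.1

KEY_TO_DELTA = {
    "up":    ( POS_STEP, 0, 0, 0, 0, 0),
    "down":  (-POS_STEP, 0, 0, 0, 0, 0),
    "right": (0,  POS_STEP, 0, 0, 0, 0),
    "left":  (0, -POS_STEP, 0, 0, 0, 0),
    ".":     (0, 0,  POS_STEP, 0, 0, 0),
    ";":     (0, 0, -POS_STEP, 0, 0, 0),
    "e":     (0, 0, 0,  ROT_STEP, 0, 0),   # roll
    "r":     (0, 0, 0, -ROT_STEP, 0, 0),
    "y":     (0, 0, 0, 0,  ROT_STEP, 0),   # pitch
    "h":     (0, 0, 0, 0, -ROT_STEP, 0),
    "o":     (0, 0, 0, 0, 0,  ROT_STEP),   # yaw
    "p":     (0, 0, 0, 0, 0, -ROT_STEP),
}

def action_for(pressed_keys):
    """Sum the deltas of all currently pressed movement keys."""
    action = [0.0] * 6
    for key in pressed_keys:
        for i, d in enumerate(KEY_TO_DELTA.get(key, (0,) * 6)):
            action[i] += d
    return action

print(action_for({"up", "."}))  # x and z components both increase
```

Summing deltas lets simultaneous key presses compose (e.g. moving diagonally), which matches how incremental teleoperation commands are typically accumulated per control step.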

docs/simulation/device.rst

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 Device
 ======
 
-Devices allow for direct real-time interfacing with the MuJoCo simulation. The current support devices are ``Keyboard`` and ``SpaceMouse``.
+Devices allow for direct real-time interfacing with the MuJoCo simulation. The currently supported devices are ``Keyboard``, ``SpaceMouse`` and ``MjGUI``.
 
 Base Device
 -----------
```
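The base device abstraction mentioned here can be pictured as a small abstract class. The method names below (`start_control`, `get_controller_state`) follow robosuite's device API, but this is a self-contained sketch with assumed signatures, not the library code.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Standalone sketch of a teleoperation-device interface.

    Method names mirror robosuite's device API; the signatures and the
    returned dictionary layout here are illustrative assumptions.
    """

    @abstractmethod
    def start_control(self):
        """Prepare the device to stream commands to the simulation."""

    @abstractmethod
    def get_controller_state(self):
        """Return the latest pose delta and gripper command."""

class IdleDevice(Device):
    """Trivial concrete device that always commands zero motion."""

    def start_control(self):
        self.active = True

    def get_controller_state(self):
        return {"dpos": [0.0, 0.0, 0.0], "rotation": None, "grasp": 0}

dev = IdleDevice()
dev.start_control()
state = dev.get_controller_state()
print(state["dpos"], state["grasp"])
```

Polling `get_controller_state` once per control step, regardless of the concrete device, is what lets the keyboard, SpaceMouse, and GUI backends stay interchangeable behind one interface.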
