---
title: 4. Making your scene interactive
description: Part 4 of 6 in a tutorial series to build a simple chess app using Unreal Engine 4 and the Mixed Reality Toolkit UX Tools plugin
author: hferrone
ms.author: v-hferrone
ms.date: 08/18/2020
ms.topic: article
ms.localizationpriority: high
keywords: Unreal, Unreal Engine 4, UE4, HoloLens, HoloLens 2, mixed reality, tutorial, getting started, mrtk, uxt, UX Tools, documentation, mixed reality headset, windows mixed reality headset, virtual reality headset
---
In the previous tutorial, you added an ARSession, Pawn, and Game Mode to complete the mixed reality setup for the chess app. This section focuses on using the open-source Mixed Reality Toolkit UX Tools plugin, which provides the tools you need to make the scene interactive. By the end of this section, you'll be able to move the chess pieces with user input. Along the way, you'll cover:
- Downloading the Mixed Reality Toolkit UX Tools plugin
- Adding Hand Interaction Actors to your fingertips
- Creating and adding Manipulators to objects in the scene
- Using input simulation to validate the project
Before you start working with user input, you'll need to add the plugin to the project.
- On the Mixed Reality UX Tools releases page on GitHub, navigate to the UX Tools for Unreal v0.9.0 release, download UXTools.0.9.0.zip, and unzip the file.
- Create a new folder called Plugins in the root folder of the project. Copy the unzipped UXTools plugin into this folder and restart the Unreal editor.
- The UXTools plugin has a Content folder with subfolders for components including Buttons, Input Simulation, and Pointers, as well as a C++ Classes folder with additional code.
> [!NOTE]
> If you don’t see the UXTools Content section in the Content Browser, click View Options > Show Plugin Content.
With the plugin installed, you're ready to start using the tools it has to offer, beginning with hand interaction actors.
Hand interaction with UX elements is performed with Hand Interaction Actors, which create and drive the pointers and visuals for near and far interactions.
- Near interactions are performed by pinching elements between the index finger and thumb or by poking them with a fingertip.
- Far interactions are performed by pointing a ray from the virtual hand at an element and pressing the index finger and thumb together.
In our case, adding a Hand Interaction Actor to MRPawn will:
- Add a cursor to the tips of the Pawn’s index fingers.
- Provide articulated hand input events that can be handled through the Pawn.
- Allow far interaction input events through hand rays extending from the palms of the virtual hands.
To drive these concepts home, you're encouraged to read through the documentation on hand interactions before continuing.
Once you're ready, open the MRPawn Blueprint and go to the Event Graph.
- Drag and release the execution pin from Event BeginPlay to place a new node.
- Select Spawn Actor from Class, click the dropdown next to the Class pin and search for Uxt Hand Interaction Actor.
- Spawn a second Uxt Hand Interaction Actor, this time setting the Hand to Right. When the event begins, a Uxt Hand Interaction Actor will be spawned on each hand.
Your Event Graph should match the following screenshot:
Both Uxt Hand Interaction Actors need owners and initial transform locations. The initial transform doesn’t matter, since the Hand Interaction Actors will jump to the virtual hands as soon as they're visible (this behavior is included in the UX Tools plugin). However, the SpawnActor function requires a Transform input to avoid a compiler error, so you'll use the default values.
- Drag and release the pin off one of the Spawn Transform pins to place a new node.
- Search for the Make Transform node, then drag the Return Value to the other hand’s Spawn Transform so that both SpawnActor nodes are connected.
- Click the down arrow at the bottom of both SpawnActor nodes to reveal the Owner pin.
- Drag the pin off one of the Owner pins and release to place a new node.
- Search for self and select the Get a reference to self variable, then create a link between the Self object reference node and the other Hand Interaction Actor’s Owner pin.
- Last but not least, check the Show Near Cursor on Grab Targets box for both Hand Interaction Actors. This will cause a cursor to appear on the grab target as your index finger approaches it, which will make it easier to see where your finger is relative to the target.
- Compile, save, and return to the Main window.
Make sure the connections match the following screenshot, but feel free to drag nodes around to make your Blueprint more readable.
You can find more information about the Hand Interaction Actor provided in the MRTK UX Tools plugin in the documentation.
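If you'd rather wire this up in code than in Blueprint nodes, the sketch below is a rough, unofficial C++ equivalent of the graph above. AMRPawn is a hypothetical C++ version of the MRPawn Blueprint; the sketch assumes your project's Build.cs lists the UXTools module as a dependency, and the include path and property names (taken from the Blueprint labels in v0.9.0) may differ slightly in your plugin version.

```cpp
// MRPawn.cpp - a rough C++ equivalent of the Blueprint graph above.
// AMRPawn is a hypothetical C++ version of the MRPawn Blueprint. This sketch
// assumes the UXTools module is listed in the project's Build.cs and that
// AUxtHandInteractionActor exposes the properties shown in the Details panel.
#include "HandInteraction/UxtHandInteractionActor.h" // include path may differ by plugin version

void AMRPawn::BeginPlay()
{
    Super::BeginPlay();

    // One hand interaction actor per hand, mirroring the two SpawnActor nodes.
    for (EControllerHand Hand : { EControllerHand::Left, EControllerHand::Right })
    {
        FActorSpawnParameters Params;
        Params.Owner = this; // the Self reference wired into the Owner pin

        // The identity transform is a placeholder, like the Make Transform
        // node's defaults: the actor jumps to the tracked hand once visible.
        AUxtHandInteractionActor* HandActor = GetWorld()->SpawnActor<AUxtHandInteractionActor>(
            AUxtHandInteractionActor::StaticClass(), FTransform::Identity, Params);

        if (HandActor)
        {
            HandActor->Hand = Hand;
            HandActor->bShowNearCursorOnGrabTargets = true;
        }
    }
}
```

As in the Blueprint graph, the transform passed to SpawnActor is only there to satisfy the function signature; the plugin repositions each actor onto its tracked hand.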
Now the virtual hands in the project have a way of selecting objects, but they still can't manipulate them. Your last task before testing the app is to add Manipulator components to the actors in the scene.
A Manipulator is a component that responds to articulated hand input and can be grabbed, rotated, and translated. Applying the Manipulator’s transform to an Actor’s transform allows direct Actor manipulation.
- Open the Board Blueprint, click Add Component, and search for Uxt Generic Manipulator in the Components panel.
- Expand the Generic Manipulator section in the Details panel. You can set one-handed or two-handed manipulation, rotation mode, and smoothing from here. Feel free to select whichever modes you wish, then Compile and Save Board.
- Repeat the steps above for the WhiteKing Actor.
You can find more information about the Manipulator Components provided in the MRTK UX Tools plugin in the documentation.
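If you were working with C++ actors instead of Blueprints, attaching the component could be sketched as follows. This is a minimal sketch with a hypothetical ABoard class standing in for the Board Blueprint; the component class name is assumed from its Blueprint label, and the include path may vary by plugin version.

```cpp
// Board.h (excerpt) - ABoard is a hypothetical C++ stand-in for the Board
// Blueprint; the component class name is assumed from the Blueprint label.
UPROPERTY(VisibleAnywhere, Category = "Manipulation")
class UUxtGenericManipulatorComponent* Manipulator;

// Board.cpp - a rough equivalent of clicking Add Component in the editor.
#include "Interactions/UxtGenericManipulatorComponent.h" // include path may differ by plugin version

ABoard::ABoard()
{
    // Attaching the manipulator lets articulated hands grab this actor and
    // apply the grab's translation and rotation to the actor's transform.
    Manipulator = CreateDefaultSubobject<UUxtGenericManipulatorComponent>(TEXT("Manipulator"));
}
```

The one-handed or two-handed mode, rotation mode, and smoothing options you set in the Details panel correspond to properties on this component, so they could be configured here in the same way; check the plugin source for the exact property names.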
Good news everyone! You're ready to test out the app with its new virtual hands and user input. Press Play in the Main Window and you'll see two mesh hands provided by the MRTK UX Tools plugin with hand rays extending from each hand’s palm. You can control the hands and their interactions as follows:
- Hold down the left Alt key to control the left hand and the left Shift key to control the right hand.
- Move your mouse to move the hand and scroll with your mouse wheel to move the hand forwards or backwards.
- Click the left mouse button to pinch, click the middle mouse button to poke.
> [!NOTE]
> Input simulation may not work if you have multiple headsets plugged into your PC. If you're having issues, try unplugging your other headsets.
Try using the simulated hands to pick up, move, and set down the white chess king and manipulate the board! Experiment with both near and far interaction. Notice that when your hands get close enough to grab the board and king directly, the hand ray disappears and is replaced with a finger cursor at the tip of the index finger.
You can find more information about the simulated hands feature provided by the MRTK UX Tools plugin in the documentation.
Now that your virtual hands can interact with objects, you're ready to move on to the next tutorial and add user interfaces and events.
Next Section: 5. Adding a button & resetting piece locations