
RobotKit


Video

Watch the video

Overview

An iPhone is attached to a Robot; UI tests command the Robot to move during a test.

Here's an example UITest using the Page Object Pattern:

import RobotKit
import XCTest

final class ARTests: XCTestCase {

  // Note: use YOUR Robot's IP address
  let robot = Robot(configuration: .init(ipAddress: "192.168.1.183", timeout: 10))

  func testAddingCubeWhilstMoving() async throws {
    let app = XCUIApplication()
    app.launch()

    try await robot.defaultPose()

    try await HomePageObject(app: app)
      .tapAddEntityButton()
      .moveRobot {
        try await robot.rotateToPortrait()
      }
      .tapPlaceCube()
      .moveRobot {
        async let _ = robot.panRight()
      }
      .tapCloseButton()
      .tapCenterOfScreen()
      .assertSelectedEntity(named: "cube")
  }
}
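
The `HomePageObject` used above lives in the sample project. As a rough illustration of how a page object can interleave robot movement with UI interactions, here is a minimal sketch (the element identifiers and method bodies are assumptions, not the sample project's real implementation):

```swift
import XCTest

// Hypothetical page object sketch. The element identifiers and assertions
// are assumptions; the real implementation is in the sample project.
struct HomePageObject {
  let app: XCUIApplication

  func tapAddEntityButton() -> Self {
    app.buttons["AddEntity"].tap()
    return self
  }

  // Runs a robot movement between UI interactions while keeping the chain fluent.
  func moveRobot(_ movement: () async throws -> Void) async rethrows -> Self {
    try await movement()
    return self
  }

  func assertSelectedEntity(named name: String) {
    XCTAssertTrue(app.staticTexts[name].exists)
  }
}
```

Each method returns `Self`, which is what makes the fluent chaining in the test above possible.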

Sample Project

The sample project used here is available at https://github.com/nthState/RobotKitSample.

Why did I build this?

Xcode allows you to use a video with embedded AR sensor data during development instead of a live video feed. However, it doesn't let you set one during a UITest; I raised a Radar about this: rdar://FB21103904

So instead, I decided to attach an iPhone to a Robot while a UITest is running, for a similar experience.

AR Session Replay

In Reality Composer on the iPhone, you can record AR session data as a video:

  • Triple Dot Menu -> Developer -> Record AR Session

This records an *.mp4 video with AR tracking sensor data embedded.

Once you have this video file, you can set it inside of Xcode like this:

AR Replay Data

Then, when you run your Augmented Reality project, instead of a live video feed of your surroundings you see playback of the recorded video. Crucially, it also fires all the delegate methods for plane detection and so on.

It's incredibly useful; I wish I could run it in a UITest.

Advantages & Disadvantages

Advantages

When it comes to ARKit/RealityKit testing, we should keep in mind:

  • The Real World - Lighting, Lidar, Plane Detection, Occlusion (The real world is noisy)
  • Performance
  • Energy Usage

How do I know, from one change to the next, that I haven't introduced an issue that affects:

  • CPU usage
  • GPU Frames per second
  • Memory Consumption
  • Device Overheating

One could run the same code in different ways and get different answers.

Having a rig that can repeatedly re-run the same test again and again helps to avoid these issues.

Soak Tests

With this setup, I can have a long-running test overnight, save a report of performance and then compare the report to other runs.

Disadvantages

  • You need a Robot (I got mine second hand on eBay, some bits were missing, 3D printing helps)
  • You need a physical device to attach to the Robot
  • You need physical space, watch out for hitting cups of coffee
  • You may need to keep your physical layout the same

Isn't this overkill?

...did I mention I have a Robot that runs my ARKit Tests?

Installation

If you have the Braccio robot, you can follow these steps to get up and running:

Robot Firmware

Supporting other Robots

The current project supports the Braccio Robot, but others are on the market.

If you want to support a different type of Robot, follow these conventions:

  • The Robot should be accessible over WiFi, as we will send requests to it over HTTP
  • Optional: If you want to share your Robot code with others, create a PR with the Robots source code
  • The Robot should receive and process JSON like:
{"base": 0, "shoulder": 90, "elbow": 90, "wristVertical": 90, "wristRotation": 90, "duration": 20}
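
Assuming a `Codable` struct mirrors that payload, producing the JSON is straightforward. A sketch (the `RobotCommand` type is an assumption for illustration, not part of RobotKit):

```swift
import Foundation

// Hypothetical sketch: a Codable command mirroring the JSON the robot expects.
// Every field is optional, matching the schema's empty "required" list, so
// partial commands like {"base": 80} encode naturally.
struct RobotCommand: Codable {
  var base: Int?
  var shoulder: Int?
  var elbow: Int?
  var wristVertical: Int?
  var wristRotation: Int?
  var duration: Int?
}

let encoder = JSONEncoder()
encoder.outputFormatting = .sortedKeys
let data = try encoder.encode(RobotCommand(base: 80, duration: 100))
print(String(data: data, encoding: .utf8)!)
// nil fields are omitted, so this prints: {"base":80,"duration":100}
```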

JSON Schema

Here's the schema for what the Robot will receive and process:

{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "properties": {
    "base": {
      "type": "integer"
    },
    "shoulder": {
      "type": "integer"
    },
    "elbow": {
      "type": "integer"
    },
    "wristVertical": {
      "type": "integer"
    },
    "wristRotation": {
      "type": "integer"
    },
    "duration": {
      "type": "integer"
    }
  },
  "required": []
}

Swift Package

Add the URL https://github.com/nthState/RobotKit either to your Xcode project's dependencies or to your Swift package.

Note: If you want to run UITests, add it to your UITest target, not your App target.

.package(url: "https://github.com/nthState/RobotKit", branch: "main")

or preferably, pin to an exact version:

.package(url: "https://github.com/nthState/RobotKit", exact: "<version>")

Testing

For ideas on creating UITests, take a look at the sample project: https://github.com/nthState/RobotKitSample.

Writing Tests

Once everything is set up, you can start writing UITests.

Import RobotKit

import RobotKit

Initialize the robot before tests start, as it can take a short while to initialize:

robot = Robot(configuration: .init(ipAddress: "192.168.1.183", timeout: 10))

You can then issue commands and even make up your own complex routines:

func rotateToPortrait() async throws {
  try await self.send(.init(wristVertical: 90))
}
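
Routines compose from simpler moves. For example, a slow sweep could be built from two sequential commands; a sketch (the angle values, pacing, and `slowPanRight` name are assumptions, and `send(_:)` follows the `rotateToPortrait` example above):

```swift
import RobotKit

// Hypothetical composed routine: a slow left-to-right base sweep.
// The angle values and durations are assumptions, not part of RobotKit.
extension Robot {
  func slowPanRight() async throws {
    try await self.send(.init(base: 20, duration: 100))   // start at the left
    try await self.send(.init(base: 160, duration: 100))  // sweep to the right
  }
}
```

Because each `send` is awaited, the second movement only begins once the first completes.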

Testing without UITest

If your robot is up and running, you can send curl commands to move it.

Note: Ensure the IP address is your Robot's IP address.

The default pose for the Braccio

curl -v http://192.168.1.183/robot \
  -H "Content-Type: application/json" \
  -d '{"base": 0, "shoulder": 90, "elbow": 0, "wristVertical": 90, "wristRotation": 90}'

Rotating the base

curl -v http://192.168.1.183/robot \
  -H "Content-Type: application/json" \
  -d '{"base": 80}'

Rotating the base and elbow

curl -v http://192.168.1.183/robot \
  -H "Content-Type: application/json" \
  -d '{"base": 80, "elbow": 45}'

Slow base movement

curl -v http://192.168.1.183/robot \
  -H "Content-Type: application/json" \
  -d '{"base": 80, "duration": 100}'

About me

I'm Chris; you can read more about me and my projects on my website: www.chrisdavis.com