Samples: Automatic updates to public repository
Remember to do the following:
    1. Ensure that modified/deleted/new files are correct
    2. Make this commit message relevant to the changes
    3. Force push
    4. Delete branch after PR is merged

If this commit is an update from one SDK version to another,
make sure to create a release tag for the previous version.
csu-bot-zivid committed Sep 13, 2024
1 parent bd1e49c commit af37c0f
Showing 57 changed files with 184 additions and 2,284 deletions.
26 changes: 4 additions & 22 deletions README.md
@@ -1,6 +1,6 @@
# C# samples

This repository contains csharp code samples for Zivid SDK v2.13.1. For
This repository contains csharp code samples for Zivid SDK v2.11.1. For
tested compatibility with earlier SDK versions, please check out
[accompanying
releases](https://github.com/zivid/zivid-csharp-samples/tree/master/../../releases).
@@ -62,9 +62,6 @@ from the camera can be used.
- [CaptureHDRPrintNormals](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/Advanced/CaptureHDRPrintNormals/CaptureHDRPrintNormals.cs) - Capture Zivid point clouds, compute normals and print a
subset.
- **InfoUtilOther**
- [AutomaticNetworkConfigurationForCameras](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/AutomaticNetworkConfigurationForCameras/AutomaticNetworkConfigurationForCameras.cs) - Automatically set the IP addresses of any number of
cameras to be in the same subnet as the provided IP address
of the network interface.
- [CameraInfo](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/CameraInfo/CameraInfo.cs) - List connected cameras and print camera version and state
information for each connected camera.
- [CameraUserData](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/CameraUserData/CameraUserData.cs) - Store user data on the Zivid camera.
@@ -73,16 +70,8 @@ from the camera can be used.
- [FirmwareUpdater](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/FirmwareUpdater/FirmwareUpdater.cs) - Update firmware on the Zivid camera.
- [GetCameraIntrinsics](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/GetCameraIntrinsics/GetCameraIntrinsics.cs) - Read intrinsic parameters from the Zivid camera (OpenCV
model) or estimate them from the point cloud.
- [NetworkConfiguration](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/NetworkConfiguration/NetworkConfiguration.cs) - Uses Zivid API to change the IP address of the Zivid
camera.
- [Warmup](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/InfoUtilOther/Warmup/Warmup.cs) - A basic warm-up method for a Zivid camera with specified
time and capture cycle.
- **Maintenance**
- [CorrectCameraInField](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/Maintenance/CorrectCameraInField/CorrectCameraInField.cs) - Correct the dimension trueness of a Zivid camera.
- [ResetCameraInField](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/Maintenance/ResetCameraInField/ResetCameraInField.cs) - Reset infield correction on a camera.
- [VerifyCameraInField](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/Maintenance/VerifyCameraInField/VerifyCameraInField.cs) - Check the dimension trueness of a Zivid camera.
- [VerifyCameraInFieldFromZDF](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Camera/Maintenance/VerifyCameraInFieldFromZDF/VerifyCameraInFieldFromZDF.cs) - Check the dimension trueness of a Zivid camera from a ZDF
file.
- **Applications**
- **Basic**
- **Visualization**
@@ -102,22 +91,15 @@ from the camera can be used.
- [HandEyeCalibration](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/HandEyeCalibration/HandEyeCalibration/HandEyeCalibration.cs) - Perform Hand-Eye calibration.
- [MultiCameraCalibration](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/MultiCameraCalibration/MultiCameraCalibration.cs) - Use captures of a calibration object to generate
transformation matrices to a single coordinate frame.
- [ROIBoxViaArucoMarker](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/ROIBoxViaArucoMarker/ROIBoxViaArucoMarker.cs) - Filter the point cloud based on a ROI box given relative
to the ArUco marker on a Zivid Calibration Board.
- [ROIBoxViaCheckerboard](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/ROIBoxViaCheckerboard/ROIBoxViaCheckerboard.cs) - Filter the point cloud based on a ROI box given relative
to the Zivid Calibration Board.
- [TransformPointCloudFromMillimetersToMeters](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/TransformPointCloudFromMillimetersToMeters/TransformPointCloudFromMillimetersToMeters.cs) - Transform point cloud data from millimeters to meters.
- [TransformPointCloudViaArucoMarker](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/TransformPointCloudViaArucoMarker/TransformPointCloudViaArucoMarker.cs) - Transform a point cloud from camera to ArUco marker
coordinate frame by estimating the marker's pose from the
point cloud.
- [TransformPointCloudViaCheckerboard](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/TransformPointCloudViaCheckerboard/TransformPointCloudViaCheckerboard.cs) - Transform a point cloud from camera to checkerboard (Zivid
Calibration Board) coordinate frame by getting checkerboard
pose from the API.
- **HandEyeCalibration**
- [PoseConversions](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/HandEyeCalibration/PoseConversions/PoseConversions.cs) - Convert to/from Transformation Matrix (Rotation Matrix
+ Translation Vector)
- [UtilizeHandEyeCalibration](https://github.com/zivid/zivid-csharp-samples/tree/master/source/Applications/Advanced/HandEyeCalibration/UtilizeHandEyeCalibration/UtilizeHandEyeCalibration.cs) - Transform single data point or entire point cloud from
camera to robot base reference frame using Hand-Eye
camera frame to robot base frame using Hand-Eye
calibration
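
The PoseConversions and UtilizeHandEyeCalibration samples listed above both revolve around 4x4 homogeneous transformation matrices. As a rough illustration only (not taken from the samples), the sketch below assembles such a matrix from a rotation matrix and a translation vector using `MathNet.Numerics` (installed via NuGet, see Installation below) and applies it to a single point; the class name, helper name, and numeric values are made up for the example.

```csharp
using System;
using MathNet.Numerics.LinearAlgebra;

class TransformSketch
{
    // Assemble a 4x4 homogeneous transform [R t; 0 0 0 1] from a 3x3 rotation and a 3x1 translation.
    static Matrix<double> ToTransformationMatrix(Matrix<double> rotation, Vector<double> translation)
    {
        var transform = Matrix<double>.Build.DenseIdentity(4);
        transform.SetSubMatrix(0, 0, rotation);                     // top-left 3x3 block
        transform.SetSubMatrix(0, 3, translation.ToColumnMatrix()); // top-right 3x1 block
        return transform;
    }

    static void Main()
    {
        // Illustrative values only: identity rotation, 100 mm translation along x.
        var rotation = Matrix<double>.Build.DenseIdentity(3);
        var translation = Vector<double>.Build.Dense(new[] { 100.0, 0.0, 0.0 });
        var baseFromCamera = ToTransformationMatrix(rotation, translation);

        // One point in the camera frame (millimeters), written in homogeneous coordinates.
        var pointInCamera = Vector<double>.Build.Dense(new[] { 10.0, 20.0, 500.0, 1.0 });
        var pointInBase = baseFromCamera * pointInCamera;

        Console.WriteLine("Point in the target frame: {0}", pointInBase);
    }
}
```

Whether the transformed point ends up in the robot base frame or the flange (end-effector) frame depends on whether the calibration was eye-to-hand or eye-in-hand, as the HandEyeCalibration sample itself notes.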

## Installation
@@ -143,8 +125,8 @@ Visual
Studio](https://support.zivid.com/latest/api-reference/samples/csharp/build-c-sharp-samples-using-visual-studio.html).

Some of the samples depend on external libraries, in particular
`MathNet.Numerics` and `System.ValueTuple`. These libraries will be
installed automatically through NuGet when building the sample.
`MathNet.Numerics`. These libraries will be installed automatically
through NuGet when building the sample.

## Support

source/Applications/Advanced/HandEyeCalibration/HandEyeCalibration/HandEyeCalibration.cs
@@ -6,6 +6,7 @@ Perform Hand-Eye calibration.
using System.Collections.Generic;
using System.IO;
using System.Linq;

using Zivid.NET.Calibration;
using Duration = Zivid.NET.Duration;

@@ -24,13 +25,9 @@ static int Main()

var calibrationResult = performCalibration(handEyeInput);

Console.WriteLine("Zivid primarily operates with a (4x4) transformation matrix. To convert");
Console.WriteLine("to axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out");
Console.WriteLine("our PoseConversions sample.");

if (calibrationResult.Valid())
{
Console.WriteLine("{0}\n{1}\n{2}", "Hand-Eye calibration OK", "Result: ", calibrationResult);
Console.WriteLine("{0}\n{1}\n{2}", "Hand-Eye calibration OK", "Result:", calibrationResult);
}
else
{
@@ -52,34 +49,32 @@ static List<HandEyeInput> readHandEyeInputs(Zivid.NET.Camera camera)
var currentPoseId = 0U;
var beingInput = true;

var calibrationObject = "";
while (true)
{
Console.WriteLine("Enter calibration object you are using, m (for ArUco marker(s)) or c (for Zivid checkerboard): ");
calibrationObject = Console.ReadLine();

if (calibrationObject.Equals("m", StringComparison.CurrentCultureIgnoreCase) ||
calibrationObject.Equals("c", StringComparison.CurrentCultureIgnoreCase))
{
break;
}
}


Interaction.ExtendInputBuffer(2048);

Console.WriteLine("Zivid primarily operates with a (4x4) transformation matrix. To convert");
Console.WriteLine("from axis-angle, rotation vector, roll-pitch-yaw, or quaternion, check out");
Console.WriteLine("our PoseConversions sample.");

do
{
switch (Interaction.EnterCommand())
{
case CommandType.AddPose:
try
{
HandleAddPose(ref currentPoseId, ref handEyeInput, camera, calibrationObject);
var robotPose = Interaction.EnterRobotPose(currentPoseId);
using (var frame = Interaction.AssistedCapture(camera))
{
Console.Write("Detecting checkerboard in point cloud: ");
var detectionResult = Detector.DetectFeaturePoints(frame.PointCloud);

if (detectionResult.Valid())
{
Console.WriteLine("Calibration board detected");
handEyeInput.Add(new HandEyeInput(robotPose, detectionResult));
++currentPoseId;
}
else
{
Console.WriteLine("Failed to detect calibration board, ensure that the entire board is in the view of the camera");
}
}
}
catch (Exception ex)
{
@@ -96,82 +91,25 @@ static List<HandEyeInput> readHandEyeInputs(Zivid.NET.Camera camera)
return handEyeInput;
}

public static void HandleAddPose(ref uint currentPoseId, ref List<HandEyeInput> handEyeInput, Zivid.NET.Camera camera, string calibrationObject)
{
var robotPose = Interaction.EnterRobotPose(currentPoseId);

Console.Write("Detecting calibration object in point cloud");
if (calibrationObject.Equals("c", StringComparison.CurrentCultureIgnoreCase))
{
var frame = Zivid.NET.Calibration.Detector.CaptureCalibrationBoard(camera);
var detectionResult = Detector.DetectCalibrationBoard(frame);

if (detectionResult.Valid())
{
Console.WriteLine("Calibration board detected");
handEyeInput.Add(new HandEyeInput(robotPose, detectionResult));
++currentPoseId;
}
else
{
Console.WriteLine("Failed to detect calibration board, ensure that the entire board is in the view of the camera");
}
}
else if (calibrationObject.Equals("m", StringComparison.CurrentCultureIgnoreCase))
{
var frame = AssistedCapture(camera);

var markerDictionary = Zivid.NET.MarkerDictionary.Aruco4x4_50;
var markerIds = new List<int> { 1, 2, 3 };

Console.WriteLine("Detecting arUco marker IDs " + string.Join(", ", markerIds));
var detectionResult = Detector.DetectMarkers(frame, markerIds, markerDictionary);

if (detectionResult.Valid())
{
Console.WriteLine("ArUco marker(s) detected: " + detectionResult.DetectedMarkers().Length);
handEyeInput.Add(new HandEyeInput(robotPose, detectionResult));
++currentPoseId;
}
else
{
Console.WriteLine("Failed to detect any ArUco markers, ensure that at least one ArUco marker is in the view of the camera");
}
}
}

static Zivid.NET.Calibration.HandEyeOutput performCalibration(List<HandEyeInput> handEyeInput)
{
while (true)
{
Console.WriteLine("Enter type of calibration, eth (for eye-to-hand) or eih (for eye-in-hand): ");
Console.WriteLine("Enter type of calibration, eth (for eye-to-hand) or eih (for eye-in-hand):");
var calibrationType = Console.ReadLine();
if (calibrationType.Equals("eth", StringComparison.CurrentCultureIgnoreCase))
{
Console.WriteLine("Performing eye-to-hand calibration with " + handEyeInput.Count + " dataset pairs");
Console.WriteLine("The resulting transform is the camera pose in robot base frame");
Console.WriteLine("Performing eye-to-hand calibration");
return Calibrator.CalibrateEyeToHand(handEyeInput);
}
if (calibrationType.Equals("eih", StringComparison.CurrentCultureIgnoreCase))
{
Console.WriteLine("Performing eye-in-hand calibration with " + handEyeInput.Count + " dataset pairs");
Console.WriteLine("The resulting transform is the camera pose in flange (end-effector) frame");
Console.WriteLine("Performing eye-in-hand calibration");
return Calibrator.CalibrateEyeInHand(handEyeInput);
}
Console.WriteLine("Entered unknown method");
}
}
public static Zivid.NET.Frame AssistedCapture(Zivid.NET.Camera camera)
{
var suggestSettingsParameters = new Zivid.NET.CaptureAssistant.SuggestSettingsParameters
{
AmbientLightFrequency =
Zivid.NET.CaptureAssistant.SuggestSettingsParameters.AmbientLightFrequencyOption.none,
MaxCaptureTime = Duration.FromMilliseconds(800)
};
var settings = Zivid.NET.CaptureAssistant.Assistant.SuggestSettings(camera, suggestSettingsParameters);
return camera.Capture(settings);
}
}

enum CommandType
@@ -192,7 +130,7 @@ public static void ExtendInputBuffer(int size)

public static CommandType EnterCommand()
{
Console.Write("Enter command, p (to add robot pose) or c (to perform calibration): ");
Console.Write("Enter command, p (to add robot pose) or c (to perform calibration):");
var command = Console.ReadLine().ToLower();

switch (command)
@@ -217,4 +155,16 @@ public static Pose EnterRobotPose(ulong index)
var robotPose = new Pose(elements); Console.WriteLine("The following pose was entered: \n{0}", robotPose);
return robotPose;
}

public static Zivid.NET.Frame AssistedCapture(Zivid.NET.Camera camera)
{
var suggestSettingsParameters = new Zivid.NET.CaptureAssistant.SuggestSettingsParameters
{
AmbientLightFrequency =
Zivid.NET.CaptureAssistant.SuggestSettingsParameters.AmbientLightFrequencyOption.none,
MaxCaptureTime = Duration.FromMilliseconds(800)
};
var settings = Zivid.NET.CaptureAssistant.Assistant.SuggestSettings(camera, suggestSettingsParameters);
return camera.Capture(settings);
}
}