init commit
ccyinlu committed Mar 13, 2019
0 parents commit 9ee78b2
Showing 162 changed files with 5,698 additions and 0 deletions.
103 changes: 103 additions & 0 deletions README.md
@@ -0,0 +1,103 @@
# chessboard_camera_lidar_calibration_toolbox
<div align=center><img src="docs/camera_lidar_calibration_toolbox_02.png"/></div>

The package is used to calibrate a LiDAR with a camera. Specifically, a CSI camera has been successfully calibrated against a Velodyne VLP-16 and a Hesai P40P using `chessboard_camera_lidar_calibration_toolbox`. We assess the reprojection error by projecting the planar LiDAR points onto the image plane, which shows high accuracy. You can also load other image-LiDAR pairs captured by the same setup to inspect the calibration accuracy.

The package finds a rotation and translation that transform all the points in the LiDAR frame to the (monocular) camera frame. Please see Usage for a [video tutorial](http://47.100.39.180/download/MultiModal_Data_Studio/camera_lidar_calibration_toolbox.mp4).
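
As a rough illustration of how such an extrinsic result is used, the sketch below projects LiDAR points into the image with a simple pinhole model; `T_lidar_to_cam`, `K`, and the point cloud are placeholders for illustration, not the toolbox's actual output variables.

```matlab
% Project LiDAR points (N x 3, in the LiDAR frame) onto the image plane.
T_lidar_to_cam = eye(4);                            % placeholder 4x4 extrinsic matrix
K        = [700 0 640; 0 700 360; 0 0 1];           % placeholder camera intrinsics
ptsLidar = rand(100, 3) * 10;                       % placeholder N x 3 point cloud

ptsHom  = [ptsLidar, ones(size(ptsLidar, 1), 1)]';  % 4 x N homogeneous points
ptsCam  = T_lidar_to_cam * ptsHom;                  % points in the camera frame
inFront = ptsCam(3, :) > 0;                         % keep points in front of the camera
uvw     = K * ptsCam(1:3, inFront);                 % pinhole projection
uv      = uvw(1:2, :) ./ uvw(3, :);                 % pixel coordinates (2 x M)
```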

The toolbox automatically extracts the chessboard plane from both the `image coordinates` and the `Lidar points coordinates`, and aligns the plane normals by minimizing the reprojection error to estimate the optimal rotation and translation. The methodology of the toolbox is derived from the fusion package of Autoware, called [autoware_camera_lidar_calibrator](https://github.com/CPFL/Autoware/tree/master/ros/src/sensing/fusion/packages/autoware_camera_lidar_calibrator).

Compared to the original [autoware_camera_lidar_calibrator](https://github.com/CPFL/Autoware/tree/master/ros/src/sensing/fusion/packages/autoware_camera_lidar_calibrator), the toolbox is developed in MATLAB, which makes it easy to develop and maintain. Additionally, the toolbox provided in this git repository does not rely on [ROS](http://www.ros.org/). Separate image-point pairs with different image and LiDAR point formats are supported, such as the `pcd` and `binary` types common in the [KITTI datasets](http://www.cvlibs.net/datasets/kitti/).
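
For reference, both point formats can be read directly in MATLAB. A small sketch with placeholder file names (the PCD reader assumes the Computer Vision Toolbox is installed):

```matlab
% Read a KITTI-style binary scan: float32 records of [x y z intensity].
fid = fopen('000000.bin', 'r');          % placeholder file name
raw = fread(fid, [4, inf], 'single')';   % N x 4 matrix
fclose(fid);
xyz = raw(:, 1:3);

% Read a PCD point cloud (Computer Vision Toolbox).
pc     = pcread('000000.pcd');           % placeholder file name
xyzPcd = pc.Location;                    % N x 3 matrix
```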

Additionally, the calibration process with [autoware_camera_lidar_calibrator](https://github.com/CPFL/Autoware/tree/master/ros/src/sensing/fusion/packages/autoware_camera_lidar_calibrator) requires a lot of manual interaction and often takes hours to obtain a qualified calibration result. The automatic steps in this toolbox greatly reduce that effort.

## Table of Contents
* [Setup](#1)
- [matlab](#11)
- [NLOPT-ubuntu](#12)
    - [NLOPT-windows](#13)
* [Getting Started](#2)
- [data preparation](#21)
* [Usage](#3)
* [Fusion using `chessboard_camera_lidar_calibration`](#4)
* [Future Improvements](#5)

## <span id=1>Setup</span>
The toolbox is easy to set up because it depends only on [MATLAB](https://www.mathworks.com/products/matlab.html).
1. <span id=11>matlab</span>
For better ROS interface support, we strongly suggest upgrading your MATLAB version to at least `R2017a`.
2. <span id=12>NLOPT-ubuntu</span>
[NLopt](https://nlopt.readthedocs.io/en/latest/) is a free/open-source library for nonlinear optimization, providing a common interface for a number of different free optimization routines available online as well as original implementations of various other algorithms.
The optimization in the toolbox depends on NLOPT, so we need to build the [NLOPT plugin](https://nlopt.readthedocs.io/en/latest/NLopt_Installation/) for MATLAB:
```bash
cd ./thirdParty/nlopt-2.5.0/
mkdir build
cd build
cmake -DMatlab_ROOT_DIR=/usr/local/MATLAB/R2017b -DINSTALL_MEX_DIR=/usr/local/MATLAB/R2017b/mex ..
make
```
***Attention***:
You can change `Matlab_ROOT_DIR` and `INSTALL_MEX_DIR` according to your own configuration. Do not forget to add `INSTALL_MEX_DIR` to your **matlab searching path** in order to use **NLOPT** normally (a quick check is sketched after this list).

3. <span id=13>NLOPT-windows</span>
To enable `NLOPT` on Windows, you can follow the [Official Tutorial](https://nlopt.readthedocs.io/en/latest/NLopt_on_Windows/) to install the `NLOPT plugin` for the `matlab interface`.
In the released package, we have included the prebuilt `mex` files in the `./thirdParty/nlopt/windows/mex` directory; you only need to copy `libnlopt-0.dll` to the MATLAB directory `WHERE_YOUR_MATLAB_INSTALLATION\bin\win64`.
If you want to build your own `mex` file, follow the tips below:
- download the [precompiled 64-bit windows DLLs](http://ab-initio.mit.edu/nlopt/nlopt-2.4.2-dll64.zip)
- unzip the [nlopt-2.4.2-dll64.zip](http://ab-initio.mit.edu/nlopt/nlopt-2.4.2-dll64.zip) and cd to the directory `nlopt-2.4.2-dll64`
- copy the DLL file `libnlopt-0.dll` to `WHERE_YOUR_MATLAB_INSTALLATION\bin\win64`, for example `C:\Program Files\MATLAB\R2017b\bin\win64`
- convert the DLL file to a LIB file; we strongly suggest using the [GNU compilers (MinGW)](https://en.wikipedia.org/wiki/MinGW). If you are using MinGW, run the following command (`dlltool` comes with MinGW) in `cmd` or `PowerShell`:
`dlltool --input-def libnlopt-0.def --dllname libnlopt-0.dll --output-lib libnlopt-0.lib`
For example, on my Windows computer, the command looks like:
`C:\TDM-GCC-64\x86_64-w64-mingw32\bin\dlltool --input-def libnlopt-0.def --dllname libnlopt-0.dll --output-lib libnlopt-0.lib`
- Copy `nlopt_optimize.c` from `nlopt-2.4.2-dll64\matlab` to `nlopt-2.4.2-dll64`; you should then find the source file `nlopt_optimize.c`, the header file `nlopt.h`, and the lib file `libnlopt-0.lib` under the directory `nlopt-2.4.2-dll64`.
- Run the `mex` command in the MATLAB command window:
`mex -L. -lnlopt-0 nlopt_optimize.c`
After the command returns successfully, you will find `nlopt_optimize.mexw64` in the current directory.
- Copy `nlopt_optimize.mexw64` to the directory `nlopt-2.4.2-dll64\matlab` and add that directory to the MATLAB search path using the `addpath` command.
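
Either way, a quick sanity check in MATLAB, using the mex directory paths from the steps above as an assumption (adjust them to your own installation):

```matlab
% Make the NLOPT mex interface visible to MATLAB and verify it is found.
addpath('/usr/local/MATLAB/R2017b/mex');   % Ubuntu: must match INSTALL_MEX_DIR
% addpath('nlopt-2.4.2-dll64\matlab');     % Windows: directory holding nlopt_optimize.mexw64
assert(exist('nlopt_optimize', 'file') == 3, 'nlopt_optimize mex file is not on the MATLAB path');
```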


## <span id=2>Getting Started</span>
The toolbox depends on image-point pairs that contain a chessboard. The quality of the data has a great influence on the calibration results.
1. <span id=21>data preparation</span>
You should prepare time-synchronized image-point pairs with the chessboard inside the overlapping field of view of both the camera and the LiDAR.
<div align=center><img src="docs/image_raw.png"/></div>

You should capture as many chessboard orientations and positions as possible, as illustrated below, with at least **5 poses** at each position and varying distances between the chessboard and the camera.

For more requirements on the data, you can refer to the tutorial [Autoware - How to Calibrate Camera](https://github.com/CPFL/Autoware/wiki/Calibration).
<div align=center><img src="docs/showExtrinsics.png"/></div>

Additionally, for more robust plane extraction from the LiDAR points, you should also provide a static image-point pair that contains no chessboard, and name this **static pair** as the **first pair**.

## <span id=3>Usage</span>
In the `MultiModal_data_studio` toolboxes, you may need to be authorized before you can use the `chessboard_camera_lidar_calibration_toolbox`. Feel free to contact the author at `ccyinlu@whu.edu.cn` to get a valid `account`.
<div align=center><img src="docs/multimodal_data_studio.JPG"/></div>

You should first run `multimodal_data_studio.m` to launch the GUI shown above. When MATLAB asks whether you want to **change your path** or **add to the path**, choose **change your path** so that MATLAB focuses on the current path.
After you click the `chessboard_camera_lidar_calibration_toolbox` button, an `authorization` dialog will appear to validate your account.
<div align=center><img src="docs/authorization.JPG"/></div>

Feel free to contact `ccyinlu@whu.edu.cn` to get an `account` and have a try.

## <span id=4>Fusion</span>
Project the LiDAR points onto the image using the `calibration file`; the points are rendered according to `range`.

<div align=center><img src="http://47.100.39.180/download/MultiModal_Data_Studio/DemoLidar2Camera_range.gif"/></div>

Project the LiDAR points onto the image using the `calibration file`; the points are rendered according to `height`.

<div align=center><img src="http://47.100.39.180/download/MultiModal_Data_Studio/DemoLidar2Camera_height.gif"/></div>

Set a height threshold to simply filter out the ground points.

<div align=center><img src="http://47.100.39.180/download/MultiModal_Data_Studio/DemoLidar2Camera_height_no_ground.gif"/></div>

## Questions

You are welcome to submit questions and bug reports as [GitHub Issues](https://github.com/ccyinlu/chessboard_camera_lidar_calibration_toolbox/issues).

## Copyright and License

## Disclaimer

## Connect with us
* Ethan: ccyinlu@whu.edu.cn
Binary file added calibration/affine_fit.p
Binary file not shown.
Binary file added calibration/detectImageChessboardPoints.p
Binary file not shown.
Binary file added calibration/detectLidarChessboardPoints.p
Binary file not shown.
Binary file added calibration/detectLidarChessboardPointsDiffPlane.p
Binary file not shown.
Binary file added calibration/estimateExtrinsicParameters.p
Binary file not shown.
Binary file added calibration/extractChessboardPoints.p
Binary file not shown.
Binary file added calibration/findPointNormals.p
Binary file not shown.
Binary file added calibration/matchValidPair.p
Binary file not shown.
Binary file added calibration/pointCompare.p
Binary file not shown.
Binary file added chessboard_camera_lidar_calibration.p
Binary file not shown.
Binary file not shown.
Binary file added docs/authorization.JPG
Binary file added docs/camera_lidar_calibration_toolbox_02.png
Binary file added docs/image_raw.png
Binary file added docs/multimodal_data_studio.JPG
Binary file added docs/showExtrinsics.png
55 changes: 55 additions & 0 deletions example/calibrateCameraDemo.m
@@ -0,0 +1,55 @@
% Auto-generated by cameraCalibrator app on 17-Feb-2019
%-------------------------------------------------------


% Define images to process
imageFileNames = {...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000002.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000003.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000004.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000005.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000006.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000007.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000008.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000009.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000010.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000011.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000012.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000013.png',...
'/home/ethan/matlab/MultiModal_Data_Studio/TestData/calib-20180929/usb_cam/000014.png',...
};

% Detect checkerboards in images
[imagePoints, boardSize, imagesUsed] = detectCheckerboardPoints(imageFileNames);
imageFileNames = imageFileNames(imagesUsed);

% Read the first image to obtain image size
originalImage = imread(imageFileNames{1});
[mrows, ncols, ~] = size(originalImage);

% Generate world coordinates of the corners of the squares
squareSize = 77; % in units of 'millimeters'
worldPoints = generateCheckerboardPoints(boardSize, squareSize);

% Calibrate the camera
[cameraParams, imagesUsed, estimationErrors] = estimateCameraParameters(imagePoints, worldPoints, ...
'EstimateSkew', false, 'EstimateTangentialDistortion', false, ...
'NumRadialDistortionCoefficients', 2, 'WorldUnits', 'millimeters', ...
'InitialIntrinsicMatrix', [], 'InitialRadialDistortion', [], ...
'ImageSize', [mrows, ncols]);

% View reprojection errors
h1=figure; showReprojectionErrors(cameraParams);

% Visualize pattern locations
h2=figure; showExtrinsics(cameraParams, 'CameraCentric');

% Display parameter estimation errors
displayErrors(estimationErrors, cameraParams);

% For example, you can use the calibration data to remove effects of lens distortion.
undistortedImage = undistortImage(originalImage, cameraParams);

% See additional examples of how to use the calibration data. At the prompt type:
% showdemo('MeasuringPlanarObjectsExample')
% showdemo('StructureFromMotionExample')
27 changes: 27 additions & 0 deletions example/nlopt_example1.m
@@ -0,0 +1,27 @@
% NLopt MATLAB example: minimize sqrt(x(2)) subject to two cubic inequality
% constraints using the MMA algorithm (the standard NLopt tutorial problem).
clear;

% This path must match the INSTALL_MEX_DIR used when building the NLOPT mex plugin.
addpath('/usr/local/MATLAB/R2017b/mex');

opt.algorithm = NLOPT_LD_MMA;
opt.verbose = 1;
opt.lower_bounds = [-inf, 0];
opt.min_objective = @myfunc;
opt.fc = { (@(x) myconstraint(x,2,0)), (@(x) myconstraint(x,-1,1)) };
opt.fc_tol = [1e-8, 1e-8];
opt.xtol_rel = 1e-4;

[xopt, fmin, retcode] = nlopt_optimize(opt, [1.234 5.678])

function [val, gradient] = myconstraint(x,a,b)
val = (a*x(1) + b)^3 - x(2);
if (nargout > 1)
gradient = [3*a*(a*x(1) + b)^2, -1];
end
end

function [val, gradient] = myfunc(x)
val = sqrt(x(2));
if (nargout > 1)
gradient = [0, 0.5 / val];
end
end
95 changes: 95 additions & 0 deletions example/simExtrasincCalibration.m
@@ -0,0 +1,95 @@

% Simulate the camera-LiDAR extrinsic calibration on synthetic data: generate
% noisy chessboard plane normals/centers and LiDAR plane points from a known
% ground-truth transform, then check how well estimateExtrinsicParameters
% recovers that transform.
addpath('../calibration');

N = 100;                                             % number of simulated poses
xoptGroundTruth = [-3.05 0.03 -1.68 -0.8 -0.5 -0.3]; % [Euler angles, translation]
lidarNormalsNoise = 0.1;
chessboardPointsNoise = [-3 3 -1 1 7 10];
global lidarPointsNumMean;
lidarPointsNumMean = 500;
global lidarPointsNumStd;
lidarPointsNumStd = 10;
global lidarPointsXSpread;
lidarPointsXSpread = 0.5;
global lidarPointsZSpread;
lidarPointsZSpread = 0.5;

lidarPointsCenterNoise = 0.1;
global lidarPointsPlaneNoise;
lidarPointsPlaneNoise = 0.1;


rmatGroundTruth = eul2rotm([xoptGroundTruth(1) xoptGroundTruth(2) xoptGroundTruth(3)]);
vmatGroundTruth = [xoptGroundTruth(4) xoptGroundTruth(5) xoptGroundTruth(6)];
TmatGroundTruth = eye(4);
TmatGroundTruth(1:3, 1:3) = rmatGroundTruth;
TmatGroundTruth(1, 4) = vmatGroundTruth(1);
TmatGroundTruth(2, 4) = vmatGroundTruth(2);
TmatGroundTruth(3, 4) = vmatGroundTruth(3);

chessboardNormalsBase = [0 0 1];
eulerBase = [0 0 0];
euler = ones(N, 1) * eulerBase + random('unif', -pi/6, pi/6, [N, 3]);

chessboardNormals = zeros(N, 3);
lidarNormals = cell(N, 1);
for i = 1 : N
rotm = eul2rotm(euler(i, :));
chessboardNormals(i, :) = chessboardNormalsBase * rotm';
lidarNormals{i} = chessboardNormals(i, :) * rmatGroundTruth';
lidarNormals{i} = lidarNormals{i} + random('unif', -lidarNormalsNoise, lidarNormalsNoise, [1, 3]);
end

lidarPoints = cell(N, 1);
chessboardPoints = zeros(N, 3);

for i = 1 : N
chessboardPoints(i, :) = [ random('unif', chessboardPointsNoise(1), chessboardPointsNoise(2), 1) ...
random('unif', chessboardPointsNoise(3), chessboardPointsNoise(4), 1) ...
random('unif', chessboardPointsNoise(5), chessboardPointsNoise(6), 1)];
LidarPointCenter = chessboardPoints(i, :)*rmatGroundTruth' + vmatGroundTruth;
LidarPointCenter = LidarPointCenter + [ random('unif', -lidarPointsCenterNoise, lidarPointsCenterNoise, 1) ...
random('unif', -lidarPointsCenterNoise, lidarPointsCenterNoise, 1) ...
random('unif', -lidarPointsCenterNoise, lidarPointsCenterNoise, 1)];
lidarPoints{i} = generateLidarPoints(LidarPointCenter, lidarNormals{i});
end

cameraLidarextrinsicCalibrationData = struct();
cameraLidarextrinsicCalibrationData.chessboardPoints = chessboardPoints;
cameraLidarextrinsicCalibrationData.chessboardNormals = chessboardNormals;
cameraLidarextrinsicCalibrationData.lidarPointsNormals = lidarNormals;
cameraLidarextrinsicCalibrationData.lidarPoints = lidarPoints;

% ESTIMATE THE EXTRINSIC MATRIX
[Params, ~] = estimateExtrinsicParameters(cameraLidarextrinsicCalibrationData, []);

disp('errorOpt:');
errorOpt = Params.opt - xoptGroundTruth;
disp(errorOpt);

errorMatrix = Params.extrinsicMat - TmatGroundTruth;
disp('error matrix');
disp(errorMatrix);

function lidarPoint = generateLidarPoints(pointCenter, pointNormal)
global lidarPointsNumMean;
global lidarPointsNumStd;
global lidarPointsXSpread;
global lidarPointsZSpread;
global lidarPointsPlaneNoise;

pointsNum = floor(random('norm', lidarPointsNumMean, lidarPointsNumStd));
lidarPoint = zeros(pointsNum, 3);
for i = 1 : pointsNum
x = random('unif', pointCenter(1) - lidarPointsXSpread, pointCenter(1) + lidarPointsXSpread, 1);
z = random('unif', pointCenter(3) - lidarPointsZSpread, pointCenter(3) + lidarPointsZSpread, 1);
y = (pointCenter*pointNormal' - pointNormal(1)*x - pointNormal(3)*z)/pointNormal(2);
lidarPoint(i, :) = [x y z];

lidarPoint(i, :) = lidarPoint(i, :) + [ random('unif', -lidarPointsPlaneNoise, lidarPointsPlaneNoise, 1) ...
random('unif', -lidarPointsPlaneNoise, lidarPointsPlaneNoise, 1) ...
random('unif', -lidarPointsPlaneNoise, lidarPointsPlaneNoise, 1)];
end
lidarPointDiff = lidarPoint(2:end, :) - lidarPoint(1:end-1, :);
normalError = norm(lidarPointDiff*pointNormal');
disp(normalError);
end
30 changes: 30 additions & 0 deletions example/testSVD.m
@@ -0,0 +1,30 @@

% Test the SVD-based initial rotation estimate: generate random chessboard
% normals, rotate them into the LiDAR frame with a known ground-truth rotation,
% then recover that rotation by aligning the two sets of normals via SVD.
N = 100;
eulerGroundTruth = [pi, 0, pi/2];
rotmGroundTruth = eul2rotm(eulerGroundTruth);

chessboardNormalsBase = [0 0 1];
eulerBase = [0 0 0];
euler = ones(N, 1) * eulerBase + random('unif', -pi/6, pi/6, [N, 3]);

chessboardNormals = zeros(N, 3);
lidarNormals = zeros(N, 3);
for i = 1 : N
rotm = eul2rotm(euler(i, :));
chessboardNormals(i, :) = chessboardNormalsBase * rotm';
lidarNormals(i, :) = chessboardNormals(i, :) * rotmGroundTruth';
end


% get the initial rotation via SVD
NN = chessboardNormals'*chessboardNormals;
if isequal(det(NN), 0)
warningNotEnoughPatternsDialog();
return;
end

NM = chessboardNormals'*lidarNormals;
UNR = NM;
[U, S, V] = svd(UNR);
InitRotation = V*U';
svdEuler = rotm2eul(InitRotation);
Binary file added figure/createCameraLidarCalibrationBrowserFigure.p
Binary file not shown.
Binary file added figure/createCameraLidarCalibrationImageDataFigure.p
Binary file not shown.
Binary file added figure/createCameraLidarCalibrationImageFigure.p
Binary file not shown.
Binary file added figure/createCameraLidarCalibrationPointDataFigure.p
Binary file not shown.
Binary file added figure/createCameraLidarCalibrationPointFigure.p
Binary file not shown.
Binary file added figure/createCameraLidarCalibrationResultsFigure.p
Binary file not shown.
Binary file added figure/createEmptyFigure.p
Binary file not shown.
Binary file added figure/createMultiRosbagImageViewFigure.p
Binary file not shown.
Binary file added figure/createMultiRosbagPlayFigure.p
Binary file not shown.
Binary file added figure/createMultiRosbagPointViewFigure.p
Binary file not shown.
Binary file added figure/createMultiRosbagStatusTableFigure.p
Binary file not shown.
Binary file added figure/createMultiRosbagTableFigure.p
Binary file not shown.