Radar visualization (nutonomy#16)
* Added filtering for invalid RADAR points

* Created separate classes for Radar and Lidar PointCloud

* Fixed issue nutonomy#15

* Fixed rare bug when point cloud is empty

* Made readme clearer

* Added radar visualization

* Uncommented opencv rendering in tutorial

* Added missing requirement for scipy

* Implemented multisweep visualization

* Reformatting

* Added radar examples to tutorial

* Advanced filtering options for radar points

* Made nsweeps optional, refactoring

* Updated notebook

* Removed outputs from Ipython notebook

* Changed order of data class methods for readability

* Changed to use the same sample for the full tutorial
holger-motional authored Dec 5, 2018
1 parent 56487d3 commit f9bb467
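The "filtering for invalid RADAR points" mentioned in the commit message can be illustrated with a small, self-contained sketch. Everything below is hypothetical — the function name, the point-array layout, and the meaning of the validity flags are illustrative assumptions, not the devkit's actual API:

```python
import numpy as np

def filter_invalid_radar_points(points: np.ndarray, invalid_states: np.ndarray,
                                valid_states: tuple = (0,)) -> np.ndarray:
    """Keep only radar returns whose validity flag is in valid_states.

    points: (d, n) array, one column per radar return (assumed layout).
    invalid_states: (n,) array of per-return sensor validity flags,
                    where state 0 is assumed to mean "valid".
    """
    mask = np.isin(invalid_states, valid_states)
    return points[:, mask]

# Three returns with two dimensions each; the middle one is flagged invalid.
points = np.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
invalid_states = np.array([0, 7, 0])
filtered = filter_invalid_radar_points(points, invalid_states)
print(filtered.shape)  # (2, 2)
```

The column-per-point layout mirrors how the devkit stores point clouds, which is why the mask is applied along the second axis.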
Showing 7 changed files with 569 additions and 710 deletions.
20 changes: 10 additions & 10 deletions README.md
@@ -12,8 +12,9 @@ Welcome to the devkit of the [nuScenes](https://www.nuscenes.org) dataset.
- [Setting up a new virtual environment](#setting-up-a-new-virtual-environment)

## Changelog
-- Oct. 4, 2018: Code to parse RADAR data released
-- Sep. 12, 2018: Devkit for teaser dataset released
+- Nov. 21, 2018: RADAR filtering and multi sweep aggregation.
+- Oct. 4, 2018: Code to parse RADAR data released.
+- Sep. 12, 2018: Devkit for teaser dataset released.

## Dataset download
To download nuScenes you need to go to the [Download page](https://www.nuscenes.org/download),
@@ -24,10 +25,10 @@ Please unpack the archives to the `/data/nuscenes` folder \*without\* overwritin
Eventually you should have the following folder structure:
```
/data/nuscenes
-maps - Large image files (~500 Gigapixel) that depict the drivable surface and sidewalks in the scene
-samples - Sensor data for keyframes
-sweeps - Sensor data for intermediate frames
-v0.1 - JSON tables that include all the meta data and annotations
+maps - Large image files (~500 Gigapixel) that depict the drivable surface and sidewalks in the scene.
+samples - Sensor data for keyframes.
+sweeps - Sensor data for intermediate frames.
+v0.1 - JSON tables that include all the meta data and annotations.
```
If you want to use another folder, specify the `dataroot` parameter of the NuScenes class below.

@@ -36,10 +37,9 @@ Download the devkit to your home directory using:
```
cd && git clone https://github.com/nutonomy/nuscenes-devkit.git
```
-The devkit is tested for Python 3.7.
-We may add backward compatibility in future releases.
-To install the required packages, run the following command in your favourite virtual environment. If you need help in
-installing Python 3.7 or in setting up a new virtual environment, you can look at [these instructions](#setting-up-a-new-virtual-environment):
+The devkit is tested for Python 3.7.
+To install Python 3.7 and set up a new virtual environment, you can look at [these instructions](#setting-up-a-new-virtual-environment).
+To install the required packages, run the following command in your favourite virtual environment:
```
pip install -r requirements.txt
```
8 changes: 5 additions & 3 deletions python-sdk/export/export_pointclouds_as_obj.py
@@ -13,14 +13,16 @@
from pyquaternion import Quaternion
from tqdm import tqdm

-from nuscenes_utils.data_classes import PointCloud
+from nuscenes_utils.data_classes import LidarPointCloud
from nuscenes_utils.geometry_utils import view_points
from nuscenes_utils.nuscenes import NuScenes


def export_scene_pointcloud(nusc: NuScenes, out_path: str, scene_token: str, channel: str='LIDAR_TOP',
min_dist: float=3.0, max_dist: float=30.0, verbose: bool=True) -> None:
"""
Export fused point clouds of a scene to a Wavefront OBJ file.
This point-cloud can be viewed in your favorite 3D rendering tool, e.g. Meshlab or Maya.
:param nusc: NuScenes instance.
:param out_path: Output path to write the point-cloud to.
:param scene_token: Unique identifier of scene to render.
@@ -59,7 +61,7 @@ def export_scene_pointcloud(nusc: NuScenes, out_path: str, scene_token: str, cha
sample_rec = nusc.get('sample', sc_rec['sample_token'])
lidar_token = sd_rec['token']
lidar_rec = nusc.get('sample_data', lidar_token)
-pc = PointCloud.from_file(osp.join(nusc.dataroot, lidar_rec['filename']))
+pc = LidarPointCloud.from_file(osp.join(nusc.dataroot, lidar_rec['filename']))

# Get point cloud colors.
coloring = np.ones((3, pc.points.shape[1])) * -1
@@ -113,7 +115,7 @@ def pointcloud_color_from_image(nusc: NuScenes, pointsensor_token: str, camera_t
cam = nusc.get('sample_data', camera_token)
pointsensor = nusc.get('sample_data', pointsensor_token)

-pc = PointCloud.from_file(osp.join(nusc.dataroot, pointsensor['filename']))
+pc = LidarPointCloud.from_file(osp.join(nusc.dataroot, pointsensor['filename']))
im = Image.open(osp.join(nusc.dataroot, cam['filename']))

# Points live in the point sensor frame. So they need to be transformed via global to the image plane.
Expand Down
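The renames in this diff reflect the split of the old `PointCloud` class into separate `LidarPointCloud` and `RadarPointCloud` classes. A minimal sketch of what such a split might look like — the class names match the diff, but the dimension counts and field semantics here are assumptions, not the devkit's actual definitions:

```python
import numpy as np

class PointCloud:
    """Shared base: stores points as a (dims, n) array, one column per point."""
    nbr_dims = 4  # overridden by subclasses

    def __init__(self, points: np.ndarray):
        assert points.shape[0] == self.nbr_dims, "unexpected point dimensionality"
        self.points = points

    def nbr_points(self) -> int:
        return self.points.shape[1]

class LidarPointCloud(PointCloud):
    nbr_dims = 4   # e.g. x, y, z, intensity (assumed)

class RadarPointCloud(PointCloud):
    nbr_dims = 18  # radar returns carry extra fields such as velocity (assumed)

lidar = LidarPointCloud(np.zeros((4, 100)))
print(lidar.nbr_points())  # 100
```

Splitting the classes lets each sensor type validate its own dimensionality and add sensor-specific loading and filtering without `if`-branches in shared code.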
