Add awards to challenges (nutonomy#527)
Co-authored-by: Holger Caesar <holger.caesar@motional.com>
holger-motional and Holger Caesar authored Dec 21, 2020
1 parent 568762b commit cee2952
Showing 4 changed files with 38 additions and 38 deletions.
python-sdk/nuscenes/eval/detection/README.md (21 additions, 21 deletions)
@@ -49,12 +49,12 @@ Note that this challenge uses the same [evaluation server](https://eval.ai/web/c
A summary of the results can be seen below.
For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).

-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | CenterPoint | 71.4% |
-| 2 | PointAugmenting | 71.1% |
-| 3 | MoCa | 70.9% |
-| 4 | PVC ensemble | 70.4% |
+| Rank | Team name | NDS | Award |
+|--- |--- |--- |--- |
+| 1 | CenterPoint | 71.4% | Best submission |
+| 2 | PointAugmenting | 71.1% | Second best |
+| 3 | MoCa | 70.9% | Best PKL |
+| 4 | PVC ensemble | 70.4% | Best lidar-only submission |

### Workshop on Benchmarking Progress in Autonomous Driving, ICRA 2020
The second nuScenes detection challenge will be held at [ICRA 2020](https://www.icra2020.org/).
@@ -65,14 +65,14 @@ Note that the previous [evaluation server](https://eval.ai/web/challenges/challe
A summary of the results can be seen below.
For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).

-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | Noah CV Lab fusion | 69.7% |
-| 2 | CenterPoint | 67.5% |
-| 3 | CVCNet ensemble | 66.6% |
-| 4 | PanoNet3D | 63.1% |
-| 5 | CRIPAC | 63.2% |
-| 6 | SSN | 61.6% |
+| Rank | Team name | NDS | Award |
+|--- |--- |--- |--- |
+| 1 | Noah CV Lab fusion | 69.7% | Best submission |
+| 2 | CenterPoint | 67.5% | Best student submission |
+| 3 | CVCNet ensemble | 66.6% | Honorable mention |
+| 4 | PanoNet3D | 63.1% | - |
+| 5 | CRIPAC | 63.2% | - |
+| 6 | SSN | 61.6% | - |

### Workshop on Autonomous Driving, CVPR 2019
The first nuScenes detection challenge was held at CVPR 2019.
@@ -84,13 +84,13 @@ Note that the [evaluation server](https://eval.ai/web/challenges/challenge-page/
A summary of the results can be seen below.
For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).

-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | MEGVII G3D3 | 63.3% |
-| 2 | Tolist | 54.5% |
-| 3 | SARPNET AT3D | 48.4% |
-| 4 | MAIR | 38.4% |
-| 5 | VIPL | 35.3% |
+| Rank | Team name | NDS | Award |
+|--- |--- |--- |--- |
+| 1 | MEGVII G3D3 | 63.3% | Best submission |
+| 2 | Tolist | 54.5% | Best student submission |
+| 3 | SARPNET AT3D | 48.4% | - |
+| 4 | MAIR | 38.4% | Best vision-only submission |
+| 5 | VIPL | 35.3% | - |

## Submission rules
### Detection-specific rules
python-sdk/nuscenes/eval/lidarseg/README.md (6 additions, 6 deletions)
@@ -49,12 +49,12 @@ Note that the [evaluation server](https://eval.ai/web/challenges/challenge-page/
A summary of the results can be seen below.
For details, please refer to the [lidar segmentation leaderboard](https://www.nuscenes.org/lidar-segmentation).

-| Rank | Team name | mIOU |
-|--- |--- |--- |
-| 1 | Noah_Kyber | 0.783 |
-| 2 | Cylinder3D++ | 0.779 |
-| 3 | CPFusion | 0.777 |
-| 4 | MIT-HAN-LAB | 0.774 |
+| Rank | Team name | mIOU | Awards |
+|--- |--- |--- |--- |
+| 1 | Noah_Kyber | 0.783 | Best submission |
+| 2 | Cylinder3D++ | 0.779 | Second best |
+| 3 | CPFusion | 0.777 | - |
+| 4 | MIT-HAN-LAB | 0.774 | - |

## Submission rules
### Lidar segmentation-specific rules
python-sdk/nuscenes/eval/prediction/README.md (5 additions, 5 deletions)
@@ -48,11 +48,11 @@ to be eligible for the prizes.
A summary of the results can be seen below.
For details, please refer to the [prediction leaderboard](https://www.nuscenes.org/prediction).

-| Rank | Team name | minADE_5 |
-|--- |--- |--- |
-| 1 | cxx | 1.630 |
-| 2 | MHA-JAM | 1.813 |
-| 3 | Trajectron++ | 1.877 |
+| Rank | Team name | minADE_5 | Awards |
+|--- |--- |--- |--- |
+| 1 | cxx | 1.630 | Best submission |
+| 2 | MHA-JAM | 1.813 | Second best |
+| 3 | Trajectron++ | 1.877 | Third best |

## Submission rules
### Prediction-specific rules
python-sdk/nuscenes/eval/tracking/README.md (6 additions, 6 deletions)
@@ -69,12 +69,12 @@ Results and winners will be announced at the [AI Driving Olympics](http://www.dr
A summary of the results can be seen below.
For details, please refer to the [tracking leaderboard](https://www.nuscenes.org/tracking).

-| Rank | Team name | AMOTA |
-|--- |--- |--- |
-| 1 | StanfordIPRL-TRI | 55.0% |
-| 2 | VV_team | 37.1% |
-| 3 | CenterTrack-Open | 10.8% |
-| 4 | CenterTrack-Vision | 4.6% |
+| Rank | Team name | AMOTA | Awards |
+|--- |--- |--- |--- |
+| 1 | StanfordIPRL-TRI | 55.0% | Best lidar-only submission, best student submission |
+| 2 | VV_team | 37.1% | - |
+| 3 | CenterTrack-Open | 10.8% | Best fusion submission |
+| 4 | CenterTrack-Vision | 4.6% | Best vision-only submission |

## Submission rules
### Tracking-specific rules