diff --git a/python-sdk/nuscenes/eval/detection/README.md b/python-sdk/nuscenes/eval/detection/README.md
index a07c088e..0f8e8577 100644
--- a/python-sdk/nuscenes/eval/detection/README.md
+++ b/python-sdk/nuscenes/eval/detection/README.md
@@ -49,12 +49,12 @@ Note that this challenge uses the same [evaluation server](https://eval.ai/web/c
 A summary of the results can be seen below.
 For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).
 
-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | CenterPoint | 71.4% |
-| 2 | PointAugmenting | 71.1% |
-| 3 | MoCa | 70.9% |
-| 4 | PVC ensemble | 70.4% |
+| Rank | Team name | NDS | Awards |
+|--- |--- |--- |--- |
+| 1 | CenterPoint | 71.4% | Best submission |
+| 2 | PointAugmenting | 71.1% | Second best |
+| 3 | MoCa | 70.9% | Best PKL |
+| 4 | PVC ensemble | 70.4% | Best lidar-only submission |
 
 ### Workshop on Benchmarking Progress in Autonomous Driving, ICRA 2020
 The second nuScenes detection challenge will be held at [ICRA 2020](https://www.icra2020.org/).
@@ -65,14 +65,14 @@ Note that the previous [evaluation server](https://eval.ai/web/challenges/challe
 A summary of the results can be seen below.
 For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).
 
-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | Noah CV Lab fusion | 69.7% |
-| 2 | CenterPoint | 67.5% |
-| 3 | CVCNet ensemble | 66.6% |
-| 4 | PanoNet3D | 63.1% |
-| 5 | CRIPAC | 63.2% |
-| 6 | SSN | 61.6% |
+| Rank | Team name | NDS | Awards |
+|--- |--- |--- |--- |
+| 1 | Noah CV Lab fusion | 69.7% | Best submission |
+| 2 | CenterPoint | 67.5% | Best student submission |
+| 3 | CVCNet ensemble | 66.6% | Honorable mention |
+| 4 | PanoNet3D | 63.1% | - |
+| 5 | CRIPAC | 63.2% | - |
+| 6 | SSN | 61.6% | - |
 
 ### Workshop on Autonomous Driving, CVPR 2019
 The first nuScenes detection challenge was held at CVPR 2019.
@@ -84,13 +84,13 @@ Note that the [evaluation server](https://eval.ai/web/challenges/challenge-page/
 A summary of the results can be seen below.
 For details, please refer to the [detection leaderboard](https://www.nuscenes.org/object-detection).
 
-| Rank | Team name | NDS |
-|--- |--- |--- |
-| 1 | MEGVII G3D3 | 63.3% |
-| 2 | Tolist | 54.5% |
-| 3 | SARPNET AT3D | 48.4% |
-| 4 | MAIR | 38.4% |
-| 5 | VIPL | 35.3% |
+| Rank | Team name | NDS | Awards |
+|--- |--- |--- |--- |
+| 1 | MEGVII G3D3 | 63.3% | Best submission |
+| 2 | Tolist | 54.5% | Best student submission |
+| 3 | SARPNET AT3D | 48.4% | - |
+| 4 | MAIR | 38.4% | Best vision-only submission |
+| 5 | VIPL | 35.3% | - |
 
 ## Submission rules
 ### Detection-specific rules
diff --git a/python-sdk/nuscenes/eval/lidarseg/README.md b/python-sdk/nuscenes/eval/lidarseg/README.md
index 438d66ee..1f3027ed 100644
--- a/python-sdk/nuscenes/eval/lidarseg/README.md
+++ b/python-sdk/nuscenes/eval/lidarseg/README.md
@@ -49,12 +49,12 @@ Note that the [evaluation server](https://eval.ai/web/challenges/challenge-page/
 A summary of the results can be seen below.
 For details, please refer to the [lidar segmentation leaderboard](https://www.nuscenes.org/lidar-segmentation).
 
-| Rank | Team name | mIOU |
-|--- |--- |--- |
-| 1 | Noah_Kyber | 0.783 |
-| 2 | Cylinder3D++ | 0.779 |
-| 3 | CPFusion | 0.777 |
-| 4 | MIT-HAN-LAB | 0.774 |
+| Rank | Team name | mIOU | Awards |
+|--- |--- |--- |--- |
+| 1 | Noah_Kyber | 0.783 | Best submission |
+| 2 | Cylinder3D++ | 0.779 | Second best |
+| 3 | CPFusion | 0.777 | - |
+| 4 | MIT-HAN-LAB | 0.774 | - |
 
 ## Submission rules
 ### Lidar segmentation-specific rules
diff --git a/python-sdk/nuscenes/eval/prediction/README.md b/python-sdk/nuscenes/eval/prediction/README.md
index b2af1091..b1d14cc8 100644
--- a/python-sdk/nuscenes/eval/prediction/README.md
+++ b/python-sdk/nuscenes/eval/prediction/README.md
@@ -48,11 +48,11 @@ to be eligible for the prizes.
 A summary of the results can be seen below.
 For details, please refer to the [prediction leaderboard](https://www.nuscenes.org/prediction).
 
-| Rank | Team name | minADE_5 |
-|--- |--- |--- |
-| 1 | cxx | 1.630 |
-| 2 | MHA-JAM | 1.813 |
-| 3 | Trajectron++ | 1.877 |
+| Rank | Team name | minADE_5 | Awards |
+|--- |--- |--- |--- |
+| 1 | cxx | 1.630 | Best submission |
+| 2 | MHA-JAM | 1.813 | Second best |
+| 3 | Trajectron++ | 1.877 | Third best |
 
 ## Submission rules
 ### Prediction-specific rules
diff --git a/python-sdk/nuscenes/eval/tracking/README.md b/python-sdk/nuscenes/eval/tracking/README.md
index 9f46b007..3002da63 100644
--- a/python-sdk/nuscenes/eval/tracking/README.md
+++ b/python-sdk/nuscenes/eval/tracking/README.md
@@ -69,12 +69,12 @@ Results and winners will be announced at the [AI Driving Olympics](http://www.dr
 A summary of the results can be seen below.
 For details, please refer to the [tracking leaderboard](https://www.nuscenes.org/tracking).
 
-| Rank | Team name | AMOTA |
-|--- |--- |--- |
-| 1 | StanfordIPRL-TRI | 55.0% |
-| 2 | VV_team | 37.1% |
-| 3 | CenterTrack-Open | 10.8% |
-| 4 | CenterTrack-Vision | 4.6% |
+| Rank | Team name | AMOTA | Awards |
+|--- |--- |--- |--- |
+| 1 | StanfordIPRL-TRI | 55.0% | Best lidar-only submission, best student submission |
+| 2 | VV_team | 37.1% | - |
+| 3 | CenterTrack-Open | 10.8% | Best fusion submission |
+| 4 | CenterTrack-Vision | 4.6% | Best vision-only submission |
 
 ## Submission rules
 ### Tracking-specific rules