While loading the ParisLille3D dataset, it appears that there are no labels provided for the test point clouds. Do you know where we can find these semantic segmentation labels for the test point clouds? Thanks!
Steps to reproduce
We can observe the distribution of labels across all 10 classes of points in the val split of the dataset:
```python
import open3d.ml.torch as ml3d
import pandas as pd

# Construct a dataset by specifying dataset_path
dataset = ml3d.datasets.ParisLille3D(dataset_path=DATASET_DIR)

# Get the validation split
val_split = dataset.get_split('val')

# Print the value counts of the labels
for idx in range(len(val_split)):
    print(dict(pd.Series(val_split.get_data(idx)['label']).value_counts()))
```
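As an aside, the per-scan dictionaries this loop prints can be merged into one overall class distribution with `collections.Counter`. A minimal, self-contained sketch using hypothetical stand-in label arrays (in the real case each array would come from `val_split.get_data(idx)['label']`):

```python
from collections import Counter

import numpy as np
import pandas as pd

# Hypothetical stand-in for the per-scan label arrays returned by
# val_split.get_data(idx)['label'] -- purely illustrative values.
scans = [np.array([1, 1, 2, 3]), np.array([2, 2, 3, 9])]

# Sum the per-scan value counts into a single distribution.
total = Counter()
for labels in scans:
    total.update(dict(pd.Series(labels).value_counts()))

print(dict(total))  # class -> point count, summed over all scans
```

This gives a quick dataset-wide view of class balance instead of one dictionary per scan.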
However, for every point cloud in the test split, all points are reported under a single label (class 0):
```python
import open3d.ml.torch as ml3d
import pandas as pd

# Construct a dataset by specifying dataset_path
dataset = ml3d.datasets.ParisLille3D(dataset_path=DATASET_DIR)

# Get the test split
test_split = dataset.get_split('test')

# Print the value counts of the labels
for idx in range(len(test_split)):
    print(dict(pd.Series(test_split.get_data(idx)['label']).value_counts()))
```
Produces the following output:
```
{0: 10000000}
{0: 10000000}
{0: 10000000}
```
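For what it's worth, `{0: 10000000}` means every point in the scan carries the single label value 0, which (if I read the Open3D-ML class mapping for this dataset correctly) is the "unclassified" class — consistent with placeholder labels rather than real annotations. A small stand-alone sketch of the same check on a synthetic all-zero label array (size reduced; nothing here touches Open3D-ML):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for test_split.get_data(idx)['label']:
# an all-zero array like the ones the loop above reports.
labels = np.zeros(10_000, dtype=np.int32)

counts = dict(pd.Series(labels).value_counts())
print(counts)  # a single key: the placeholder class 0
```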