
Flexible and unified directions of measurement within sensor view configuration #421

Open
@jnthi

Description


Describe the feature

Currently, the definition of measurement directions is limited to an image plane (e.g. for a camera using the pinhole approach) or to equidistant-angle sampling (e.g. for a lidar using a fixed number of samples/rays in the horizontal and vertical direction to sample the field of view).
This works well for simulating pinhole-like cameras with little distortion and multi-layer lidars with a fixed vertical resolution.
However, for fisheye cameras with up to 180 degrees field of view, or for scanning lidars with dynamic scan patterns, the current sensor view configuration either does not fit or is inefficient. Reason: the pinhole approach cannot represent a fisheye projection, and equidistant-angle sampling is very inefficient for arbitrary scan patterns (most of the defined samples would go unused once the scan pattern is taken into account).
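To make the inefficiency concrete, a minimal sketch with purely illustrative numbers (not from this proposal): a hypothetical Lissajous-style scan pattern of 2000 rays inside a 120° x 30° field of view, compared against equidistant-angle sampling of the same field of view at 0.1° resolution:

```python
import numpy as np

# Hypothetical Lissajous-style scan pattern (e.g. a MEMS-mirror lidar):
# 2000 measurement directions along a curve inside a 120° x 30° field of view.
n = 2000
t = np.linspace(0.0, 2.0 * np.pi, n)
azimuth = np.radians(60.0) * np.sin(3.0 * t)     # +/- 60 degrees horizontal
elevation = np.radians(15.0) * np.sin(4.0 * t)   # +/- 15 degrees vertical

# Equidistant-angle sampling of the same field of view at 0.1° resolution
# would require this many rays per simulation step:
grid_rays = int(120.0 / 0.1) * int(30.0 / 0.1)

# Fraction of those rays that actually lie on the scan pattern:
used = n / grid_rays
print(f"grid rays: {grid_rays}, pattern rays: {n}, used: {used:.4%}")
```

Under these (assumed) numbers, well below one percent of the equidistant grid would actually be needed by the scan pattern.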

Describe the solution you would like

Our solution: define measurement directions via polar angle maps, i.e. pairs of azimuth and elevation angles as floats organized in a 2D array, which explicitly specify the desired measurement directions to be performed within the next simulation step. These polar angle maps can easily be generated by the user, who knows the scanning mechanics or the lens distortion model.
Moreover, polar angle maps could unify the sensor view configuration across further environment-perceiving sensors, since a flexible definition of measurement directions is useful for all sensor types (consider Use Case 3, "Sampled Geometry Sensor Model", of the proposal, where the sensor model runs on the user side rather than inside the environment simulation).
Finally, this definition of measurement directions can serve as an abstraction layer that lets the environment simulation decide whether rasterization, ray tracing, or something else is used to generate the requested simulation results (at a requested simulation depth) for the given measurement directions (overall goal: a "solver"-independent, physics-based interface definition).
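As an illustration of how such a polar angle map could be generated on the user side, a minimal sketch assuming an ideal equidistance-projection fisheye lens (r = f·θ); a real camera would use its calibrated lens distortion model instead, and the function name and parameters are hypothetical:

```python
import numpy as np

def fisheye_polar_angle_map(width, height, focal_px):
    """Polar angle map (azimuth, elevation in radians) for an ideal
    equidistance-projection fisheye lens (r = f * theta).
    Illustrative sketch, not the actual proposal interface."""
    u = np.arange(width) - (width - 1) / 2.0
    v = np.arange(height) - (height - 1) / 2.0
    x, y = np.meshgrid(u, v)                  # pixel offsets from image center
    r = np.hypot(x, y)                        # radial distance in pixels
    theta = r / focal_px                      # angle from the optical axis
    phi = np.arctan2(y, x)                    # angle around the optical axis
    # Unit viewing direction in camera coordinates (z = optical axis):
    dx = np.sin(theta) * np.cos(phi)
    dy = np.sin(theta) * np.sin(phi)
    dz = np.cos(theta)
    az = np.arctan2(dx, dz)                   # horizontal measurement angle
    el = np.arcsin(np.clip(dy, -1.0, 1.0))    # vertical measurement angle
    return np.stack([az, el], axis=-1)        # shape (height, width, 2)

pam = fisheye_polar_angle_map(640, 480, focal_px=200.0)
print(pam.shape)  # (480, 640, 2)
```

The resulting 2D array of (azimuth, elevation) pairs is exactly the polar angle map described above; the environment simulation would then decide how to produce results for these directions.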

Describe alternatives you have considered

Defining measurement directions by linking to a callback function that returns the current measurement directions for a given time stamp.
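A rough sketch of what such a callback could look like; the rotating-lidar parameters are purely illustrative and not part of the proposal:

```python
from typing import Callable
import numpy as np

# Hypothetical callback-based alternative: instead of a static polar angle
# map, the sensor model registers a function that returns the measurement
# directions valid at a given simulation timestamp.
DirectionCallback = Callable[[float], np.ndarray]  # timestamp -> (N, 2) array

def rotating_lidar_directions(timestamp: float) -> np.ndarray:
    """Example callback: a single-layer lidar spinning at 10 Hz, firing
    1024 rays per revolution (all values are illustrative)."""
    rev_fraction = (timestamp * 10.0) % 1.0               # current revolution phase
    az = 2.0 * np.pi * (rev_fraction + np.arange(1024) / 1024.0)
    az = (az + np.pi) % (2.0 * np.pi) - np.pi             # wrap to [-pi, pi)
    el = np.zeros(1024)                                   # single scan layer
    return np.stack([az, el], axis=-1)

dirs = rotating_lidar_directions(0.05)                    # directions at t = 50 ms
print(dirs.shape)  # (1024, 2)
```

Compared with polar angle maps, a callback keeps the interface small but requires the environment simulation to call back into user code each step, which is harder to standardize.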

Describe the backwards compatibility

Backwards compatibility depends on the implementation. If polar angle maps are offered as one alternative way to define the sensor view, it can remain compatible; if they replace the current definition, it will not. One could also provide a wrapper that generates the appropriate polar angle map from the currently used parameters (i.e. field of view and number of rays/pixels in each direction).
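Such a wrapper could be sketched as follows, reproducing today's equidistant-angle sampling from the currently used parameters (function name, signature, and the example lidar values are hypothetical):

```python
import numpy as np

def polar_angle_map_from_fov(h_fov_deg, v_fov_deg, n_horizontal, n_vertical):
    """Backwards-compatibility wrapper (sketch): reproduce the current
    equidistant-angle sampling as a polar angle map, from the parameters
    used today (field of view and ray/pixel counts per direction)."""
    az = np.radians(np.linspace(-h_fov_deg / 2.0, h_fov_deg / 2.0, n_horizontal))
    el = np.radians(np.linspace(-v_fov_deg / 2.0, v_fov_deg / 2.0, n_vertical))
    az_grid, el_grid = np.meshgrid(az, el)        # (n_vertical, n_horizontal)
    return np.stack([az_grid, el_grid], axis=-1)  # (n_vertical, n_horizontal, 2)

# Illustrative legacy configuration: a 360° x 26.8° lidar sampled with
# 1800 horizontal rays and 16 vertical layers.
legacy_pam = polar_angle_map_from_fov(360.0, 26.8, 1800, 16)
print(legacy_pam.shape)  # (16, 1800, 2)
```

Existing configurations could thus be translated into polar angle maps without changing their sampling behavior.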

Additional context

Following the generalization approach presented in:
J. Thieling and J. Roßmann, "Scalable Sensor Models and Simulation Methods for Seamless Transitions Within System Development: From First Digital Prototype to Final Real System," in IEEE Systems Journal, doi: 10.1109/JSYST.2020.3006739.

which is in use in:
Simulated Rear View Fisheye Camera: https://youtu.be/Z-3ms6cNVSc
Simulated Camera Monitoring System: https://youtu.be/khEDTagAKLw
And further lidar simulations (see the referenced journal paper).

Labels: FeatureRequest (proposals which enhance the interface or add additional features), SensorModeling (the group in the ASAM development project working on sensor modeling topics)