Video Metadata/WebVMT: Search Clips By Sensor Data With Location #92

@rjksmith

Description

Outline
An experiment to test how a video archive containing footage in mixed web formats (MPEG, WebM and Ogg), with metadata including geolocation drawn from disparate sources, can best be searched by sensor data to return a sequence of video clips with location.

  1. Establish demonstrable benefits of using a common metadata format.
  2. Determine the merits of non-embedded metadata versus an embedded approach.
  3. Refine existing use cases and identify any further applications.

Use Cases

  • Accident Investigation/Motor Insurance
Vehicle collisions can be identified automatically in dashcam footage by searching the metadata for high-acceleration events characteristic of an impact, which establish the incident time and location. A video clip centred on that time, e.g. 60 seconds before and after, contains forensic information about the collision that helps determine the precise details of the event, whether for officers in a police investigation or for a motor insurer where a financial claim has been made.
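The search described above can be sketched as a simple threshold scan over timestamped acceleration samples. This is a minimal illustration only: the field names, threshold value and sample data are all hypothetical, not part of any WebVMT specification.

```python
import json

# Hypothetical sensor track: timestamped acceleration samples (m/s^2) with
# location, as might be extracted from dashcam metadata. Values illustrative.
samples = [
    {"t": 10.0, "accel": 1.2,  "lat": 51.5074, "lng": -0.1278},
    {"t": 10.5, "accel": 1.4,  "lat": 51.5075, "lng": -0.1279},
    {"t": 11.0, "accel": 24.8, "lat": 51.5076, "lng": -0.1280},  # impact spike
    {"t": 11.5, "accel": 3.1,  "lat": 51.5076, "lng": -0.1280},
]

IMPACT_THRESHOLD = 20.0  # m/s^2; an assumed value, tuned per application
CLIP_MARGIN = 60.0       # seconds of footage kept before and after the event


def find_impact_clips(samples, threshold=IMPACT_THRESHOLD, margin=CLIP_MARGIN):
    """Return clip descriptors (start/end time, location) for each spike."""
    clips = []
    for s in samples:
        if s["accel"] >= threshold:
            clips.append({
                "start": max(0.0, s["t"] - margin),  # clamp to start of video
                "end": s["t"] + margin,
                "lat": s["lat"],
                "lng": s["lng"],
            })
    return clips


for clip in find_impact_clips(samples):
    print(json.dumps(clip))
```

A common metadata format would let this same scan run unchanged across MPEG, WebM and Ogg footage, which is the point of the experiment.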

Further Details

Metadata
