We establish one of the largest databases of virtual reality (VR) videos (typically in the form of 360° videos), containing 502 user-generated videos with rich content and commingled authentic distortions (often localized in space and time). We capture the viewing behaviors (i.e., scanpaths) of 139 users and collect their opinion scores of perceived quality under four different viewing conditions (two starting points × two exploration times). We provide a thorough statistical analysis of the recorded data, resulting in several interesting observations that reveal how viewing conditions affect human viewing behaviors and perceived quality.
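As a minimal illustration of the 2 × 2 experimental design mentioned above (two starting points × two exploration times), the sketch below simply enumerates the four viewing conditions. The factor labels and durations are placeholders for illustration only, not the actual values used in the database.

```python
from itertools import product

# Placeholder labels for the two factors of the viewing-condition design
# (two starting points x two exploration times); the actual names and
# durations used in the database may differ.
starting_points = ["start_A", "start_B"]   # placeholder starting points
exploration_times_s = [5, 15]              # placeholder exploration times (seconds)

# Enumerate the four viewing conditions of the 2 x 2 factorial design.
for idx, (start, seconds) in enumerate(product(starting_points, exploration_times_s), 1):
    print(f"Condition {idx}: starting point = {start}, exploration time = {seconds} s")
```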
The database and the annotations (MOS, head data, and eye data) can be downloaded from Baidu Yun (code: Fang).
Below are some video samples with scanpaths and perceived quality under different viewing conditions; click on a picture to watch the corresponding video.