IEEE Access: Characterization of the Quality of Experience and Immersion of Point Cloud Videos in Augmented Reality through a Subjective Study

IEEE Access, A Multidisciplinary, Open-access Journal of the IEEE


Minh Nguyen (Alpen-Adria-Universität Klagenfurt, Austria), Shivi Vats (Alpen-Adria-Universität Klagenfurt, Austria), Sam Van Damme (Ghent University – imec and KU Leuven, Belgium), Jeroen van der Hooft (Ghent University – imec, Belgium), Maria Torres Vega (Ghent University – imec and KU Leuven, Belgium), Tim Wauters (Ghent University – imec, Belgium), Filip De Turck (Ghent University – imec, Belgium), Christian Timmerer (Alpen-Adria-Universität Klagenfurt, Austria), Hermann Hellwagner (Alpen-Adria-Universität Klagenfurt, Austria)

Abstract: Point cloud streaming has recently attracted research attention as it has the potential to provide six degrees of freedom movement, which is essential for truly immersive media. The transmission of point clouds requires high-bandwidth connections, and adaptive streaming is a promising solution to cope with fluctuating bandwidth conditions. Understanding the impact of different adaptive-streaming factors on the Quality of Experience (QoE) is therefore fundamental. Point clouds have been evaluated in Virtual Reality (VR), where viewers are completely immersed in a virtual environment. Augmented Reality (AR), by contrast, has only recently become popular, and quality evaluations of point clouds in AR environments are still limited to static images.

In this paper, we perform a subjective study of four factors impacting the QoE of point cloud video sequences in AR conditions: encoding parameters (quantization parameters, QPs), quality switches, viewing distance, and content characteristics. The experimental results show that all of these factors significantly impact the QoE. The QoE decreases when the sequence is encoded at higher QPs, switches to a lower quality level, or is viewed from a shorter distance, and vice versa. Additionally, the results indicate that end users cannot distinguish between two quality levels beyond a certain (high) viewing distance. An intermediate-quality point cloud encoded at geometry QP (G-QP) 24 and texture QP (T-QP) 32 and viewed at 2.5 m can achieve a QoE score (6.5 out of 10) comparable to that of a high-quality point cloud encoded at G-QP 16 and T-QP 22 and viewed at a distance of 5 m. Regarding content characteristics, objects with lower contrast can yield better quality scores. Participants’ responses reveal that the visual quality of point clouds has not yet reached the desired level of immersion: the average QoE at the highest visual quality is less than 8 out of 10. There is also a good correlation between objective metrics (e.g., color Peak Signal-to-Noise Ratio (PSNR) and geometry PSNR) and the QoE score; in particular, the Pearson correlation coefficient for color PSNR is 0.84. Finally, we found that machine learning models can accurately predict the QoE of point clouds in AR environments.
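To make the reported correlation concrete, the snippet below shows how a Pearson correlation coefficient between an objective metric (color PSNR) and subjective QoE scores can be computed with SciPy. This is a minimal sketch: the PSNR and QoE values are hypothetical placeholders for illustration, not data from the study.

```python
# Correlating an objective quality metric with subjective QoE scores.
# The arrays below are hypothetical examples, not the paper's results.
from scipy.stats import pearsonr

color_psnr = [28.1, 31.4, 34.0, 36.2, 38.9, 41.5]  # color PSNR in dB (hypothetical)
qoe_scores = [4.2, 5.1, 6.0, 6.8, 7.3, 7.9]        # mean QoE on a 0-10 scale (hypothetical)

# pearsonr returns the correlation coefficient and the two-sided p-value.
r, p_value = pearsonr(color_psnr, qoe_scores)
print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
```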

The subjective test results and questionnaire responses are available on GitHub: https://github.com/minhkstn/QoE-and-Immersion-of-Dynamic-Point-Cloud.
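Because the subjective scores are published, a QoE-prediction experiment in the spirit of the paper's machine-learning evaluation can be approximated with a standard regression pipeline. The sketch below assumes a hypothetical CSV file (subjective_scores.csv) and hypothetical column names; the repository's actual data layout may differ.

```python
# Minimal QoE-prediction sketch: fit a regressor on objective features
# and cross-validate. File and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

df = pd.read_csv("subjective_scores.csv")      # hypothetical file name
X = df[["color_psnr", "geometry_psnr"]]        # hypothetical feature columns
y = df["qoe_score"]                            # hypothetical target column

model = RandomForestRegressor(n_estimators=100, random_state=0)
r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
print(f"Mean cross-validated R^2: {r2.mean():.2f}")
```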

Index Terms: Point Clouds, Quality of Experience, Subjective Tests, Augmented Reality
