
Modern trends on quality of experience assessment and future work

  • Woojae Kim (a1), Sewoong Ahn (a1), Anh-Duc Nguyen (a1), Jinwoo Kim (a1), Jaekyung Kim (a1), Heeseok Oh (a2) and Sanghoon Lee (a1)

Abstract

Over the past 20 years, research on quality of experience (QoE) has expanded to cover aesthetic, emotional, and psychological experiences. With the emergence of new display technologies, QoE has become an important research topic for identifying the perceptual factors that matter most to users. In this paper, we provide an in-depth review of recent assessment studies in this field. Compared with previous reviews, we examine the human factors observed on a range of recent displays together with their associated assessment methods. First, we give a comprehensive QoE analysis for 2D displays, covering image/video quality assessment (I/VQA), visual preference, and studies related to the human visual system. Second, we analyze stereoscopic 3D (S3D) QoE research on I/VQA and visual discomfort from the perspective of human perception on S3D displays. Third, we investigate QoE in head-mounted-display-based virtual reality (VR) environments, addressing VR sickness and 360-degree I/VQA and the approaches specific to each. All of the reviewed methods are analyzed through comparison with benchmark models. Finally, we outline QoE work on future displays and modern deep-learning applications.
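As context for the full-reference I/VQA methods surveyed in the paper, the following is a minimal sketch of peak signal-to-noise ratio (PSNR), one of the simplest full-reference baselines against which perceptual metrics are usually compared. The synthetic test images and the 8-bit peak value are illustrative assumptions, not material from the paper.

```python
# Minimal PSNR sketch: a basic full-reference image quality baseline.
# The images and peak value below are illustrative assumptions only.
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between a reference and a distorted image."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)          # hypothetical reference
    noisy = np.clip(ref + rng.normal(0, 5, size=ref.shape), 0, 255)    # add mild Gaussian noise
    print(f"PSNR: {psnr(ref, noisy.astype(np.uint8)):.2f} dB")
```

Perceptual metrics such as SSIM refine this error-based view by comparing local structure rather than raw pixel differences, which is the direction the surveyed I/VQA literature takes.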


Copyright

This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.

Corresponding author

Corresponding author: Sanghoon Lee, E-mail: slee@yonsei.ac.kr


