GAMES Webinar 2021 – Session 198 (VR Special) | Zhiming Hu (Peking University), Difeng Yu (University of Melbourne), Xiaoxu Meng (Tencent Games AI Research Center)
[GAMES Webinar 2021 – Session 198] (VR Special)
Speaker 1: Zhiming Hu (Peking University)
Time: Thursday, September 2, 2021, 8:00-8:30 pm (Beijing time)
Title: Analysis and Prediction of Users' Visual Attention in Virtual Reality Environments
Abstract:
Users' visual attention in virtual reality environments is of great importance and can be applied in many areas, including VR content design, gaze guidance, foveated rendering, and gaze-based interaction. This talk focuses on the analysis and prediction of users' visual attention in virtual reality, covering in-depth studies of static, dynamic, and task-oriented virtual scenes. It reveals the correlations between users' gaze positions and their head movements, the scene content, dynamic objects, and task-related objects, and presents deep-learning-based algorithms for predicting users' gaze positions.
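The talk's models are learned from real eye-tracking data; purely as a rough illustration of the eye-head correlation mentioned above, the minimal sketch below fits gaze offset as a linear function of head angular velocity on synthetic data. All names, coefficients, and data are hypothetical and do not reproduce the speaker's methods.

```python
import numpy as np

# Hypothetical synthetic data: per-frame head angular velocity (deg/s) and
# gaze offset from the screen center (deg of visual angle). Purely illustrative.
rng = np.random.default_rng(0)
head_velocity = rng.uniform(-60, 60, size=(500, 2))                  # (yaw, pitch) rates
gaze_offset = 0.15 * head_velocity + rng.normal(0.0, 1.5, (500, 2))  # toy "ground truth"

# Least-squares fit of a linear eye-head coordination model:
#   gaze_offset ~ [head_velocity, 1] @ W
X = np.hstack([head_velocity, np.ones((500, 1))])
W, *_ = np.linalg.lstsq(X, gaze_offset, rcond=None)

def predict_gaze(head_vel):
    """Predict the gaze offset (deg) from a head angular velocity sample (deg/s)."""
    return np.append(head_vel, 1.0) @ W

print(predict_gaze(np.array([30.0, -10.0])))  # gaze tends to drift toward the rotation direction
```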
Speaker bio:
Zhiming Hu is a Ph.D. student at Peking University (entering class of 2017). He received his bachelor's degree from Beijing Institute of Technology in 2017 and then joined the School of Information Science and Technology at Peking University to pursue his Ph.D. His research interests include human-computer interaction, virtual reality, and eye tracking. He has received the National Scholarship and the Peking University Presidential Scholarship multiple times. As first author, he has published several papers in TVCG, a leading journal in computer graphics and visualization, and at IEEE VR, a leading conference in virtual reality, and his work has been nominated for the TVCG Best Journal Paper award.
Homepage: https://cranehzm.github.io/
Speaker 2: Difeng Yu (University of Melbourne)
Time: Thursday, September 2, 2021, 8:30-9:00 pm (Beijing time)
Title: Gaze-Supported 3D Object Manipulation in Virtual Reality
Abstract:
In this talk, I will present a paper that investigates integration, coordination, and transition strategies of gaze and hand input for 3D object manipulation in VR. Specifically, our work aims to understand whether incorporating gaze input can benefit VR object manipulation tasks, and how it should be combined with hand input for improved usability and efficiency. I will demonstrate four gaze-supported techniques that leverage different combination strategies for object manipulation. For example, I will show a technique called ImplicitGaze, which allows the transition between gaze and hand input to happen without any explicit trigger mechanism such as button pressing. Next, I will present the results of two user evaluation studies of these techniques. I will further offer insights into combination strategies of gaze and hand input, and discuss implications that can guide the design of future VR systems that incorporate gaze input for 3D object manipulation.
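The abstract does not give implementation details; purely as an assumption-laden sketch, one way an implicit gaze-hand handover could be structured is to let the hand retain control while it is moving and hand over to gaze after a sustained fixation. Every threshold, name, and rule below is hypothetical and is not the ImplicitGaze implementation presented in the talk.

```python
from dataclasses import dataclass
import numpy as np

HAND_IDLE_SPEED = 0.02   # m/s below which the hand counts as resting (assumed threshold)
GAZE_DWELL_TIME = 0.4    # seconds of stable fixation before gaze takes over (assumed threshold)

@dataclass
class HandoverState:
    dwell: float = 0.0   # accumulated fixation time while the hand is at rest

def update_object_position(obj_pos, hand_delta, hand_speed, gaze_point, dt, state):
    """Toy handover policy: the hand keeps fine-grained control while it is moving;
    after a sustained fixation with the hand at rest, the object jumps to the gaze
    point for coarse relocation. Not the technique evaluated in the talk."""
    if hand_speed > HAND_IDLE_SPEED:
        state.dwell = 0.0
        return np.asarray(obj_pos) + np.asarray(hand_delta)    # hand-driven translation
    state.dwell += dt
    if state.dwell >= GAZE_DWELL_TIME:
        state.dwell = 0.0
        return np.asarray(gaze_point, dtype=float)              # gaze-driven relocation
    return np.asarray(obj_pos)
```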
Speaker bio:
Difeng Yu is a third-year Ph.D. student in the Human-Computer Interaction Group at The University of Melbourne, advised by Dr. Jorge Goncalves (primary), Dr. Tilman Dingler, and Dr. Eduardo Velloso. He received his BSc degree in Computer Science from Xi’an Jiaotong-Liverpool University in 2018 and was a research assistant at the X-CHI Lab directed by Dr. Hai-Ning Liang. His recent research in Human-Computer Interaction (HCI) focuses on 1) designing novel interaction techniques for augmented and virtual reality systems and 2) investigating, analyzing, and modeling user behavior in 3D virtual environments.
Homepage: https://www.difeng.me
Speaker 3: Xiaoxu Meng (Tencent Games AI Research Center)
Time: Thursday, September 2, 2021, 9:00-9:30 pm (Beijing time)
Title: 3D-Kernel Foveated Rendering for Light Fields
Abstract:
Light fields capture both the spatial and angular rays, thus enabling free-viewpoint rendering and custom selection of the focal plane. Scientists can interactively explore pre-recorded microscopic light fields of organs, microbes, and neurons using virtual reality headsets. However, rendering high-resolution light fields at interactive frame rates requires a very high rate of texture sampling, which is challenging as the resolutions of light fields and displays continue to increase. In this article, we present an efficient algorithm to visualize 4D light fields with 3D-kernel foveated rendering (3D-KFR). The 3D-KFR scheme coupled with eye-tracking has the potential to accelerate the rendering of 4D depth-cued light fields dramatically. We have developed a perceptual model for foveated light fields by extending the KFR for the rendering of 3D meshes. On datasets of high-resolution microscopic light fields, we observe a 3.47x-7.28x speedup in light field rendering with minimal perceptual loss of detail. We envision that 3D-KFR will reconcile the mutually conflicting goals of visual fidelity and rendering speed for interactive visualization of light fields.
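The actual 3D-KFR algorithm is built on a kernel log-polar mapping; only as a loose illustration of the underlying idea (spending fewer texture samples as eccentricity from the tracked gaze point increases), a toy per-pixel sampling budget could be computed as below. The falloff function and all parameters are assumptions for illustration, not the paper's kernel.

```python
import numpy as np

def foveation_weights(height, width, gaze_px, alpha=4.0):
    """Toy per-pixel sampling-density map: highest at the tracked gaze point and
    falling off with normalized eccentricity. The exponent alpha and the falloff
    shape are illustrative choices, not the parameters used in the paper."""
    ys, xs = np.mgrid[0:height, 0:width]
    r = np.hypot(xs - gaze_px[0], ys - gaze_px[1])
    r_norm = r / r.max()
    return (1.0 - r_norm) ** alpha          # 1.0 at the fovea, near 0 in the far periphery

# Example: allocate up to 8 texture samples per pixel near the fovea, at least 1 elsewhere.
weights = foveation_weights(1080, 1200, gaze_px=(600, 540))
samples_per_pixel = np.maximum(1, np.round(8 * weights)).astype(int)
```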
Speaker bio:
Dr. Xiaoxu Meng is a research scientist at the Tencent Games AI Research Center. She received her bachelor's degree from Shanghai Jiao Tong University in 2015, and her master's and Ph.D. degrees from the University of Maryland in 2018 and 2020, respectively. Her main research interests are foveated rendering, denoising for Monte Carlo ray tracing, and 3D reconstruction.
Homepage: www.mengxiaoxu.com
Host bio:
Xubo Yang is a professor at Shanghai Jiao Tong University whose research covers virtual reality (VR/AR/MR) and computer graphics. He received his Ph.D. from the State Key Laboratory of CAD&CG at Zhejiang University, and has held research positions in the virtual reality department of Fraunhofer IMK in Germany, the Mixed Reality Lab at the National University of Singapore, and the Department of Computer Science at the University of North Carolina at Chapel Hill. He currently serves as vice chair of the Computational Graphics Technical Committee of the China Graphics Society, a board member of the China Graphics Society, a standing committee member of the CCF Technical Committee on Virtual Reality, and a standing committee member of the Virtual Reality Technical Committee of the China Society of Image and Graphics. He has served as papers co-chair for international conferences such as IEEE VR, and serves on the editorial boards of international journals including Frontiers in Virtual Reality and Presence: Virtual and Augmented Reality.
The "Tutorials" (使用教程) section of the GAMES homepage explains "How to watch the GAMES Webinar live stream?" and "How to join the GAMES WeChat group?";
The "Resources" (资源分享) section of the GAMES homepage provides videos and slides of previous webinar talks.
Live stream link: http://webinar.games-cn.org