GAMES Webinar 2024 – Session 316 (Intelligent Super-Resolution and Frame Interpolation) | Zhihua Zhong (HKUST (Guangzhou)), Songyin Wu (UC Santa Barbara)

[GAMES Webinar 2024, Session 316] (Rendering Topic: Intelligent Super-Resolution and Frame Interpolation)



Talk title: FuseSR: Super Resolution for Real-time Rendering through Efficient Multi-resolution Fusion


The workload of real-time rendering is increasing steeply as demand for high resolution, high refresh rates, and high realism rises, overwhelming most graphics cards. One of the most popular mitigations is to render images at a low resolution to reduce rendering overhead, and then accurately upsample the low-resolution rendered image to the target resolution, a.k.a. super-resolution. Most existing methods focus on exploiting information from low-resolution inputs, such as historical frames; the absence of high-frequency details in those LR inputs makes it hard to recover fine details in the high-resolution predictions. We instead take LR images together with HR G-buffers as input, which requires the network to align and fuse features at multiple resolution levels. We introduce an efficient and effective H-Net architecture to solve this problem and significantly reduce rendering overhead without noticeable quality deterioration. Experiments show that our method produces temporally consistent reconstructions in 4×4 and even challenging 8×8 upsampling cases at 4K resolution with real-time performance, with substantially improved quality and a significant performance boost compared to existing works.
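The key difficulty the abstract names is aligning HR G-buffer features with LR color features on a common grid. One standard way to do this (a minimal NumPy sketch of the idea, not the paper's actual H-Net; the shapes and channel counts below are illustrative assumptions) is a space-to-depth rearrangement that folds each r×r block of the HR G-buffer into channels at LR resolution, so the two inputs can simply be concatenated:

```python
import numpy as np

def pixel_unshuffle(x, r):
    """Space-to-depth: rearrange (H, W, C) into (H/r, W/r, C*r*r),
    moving each r x r spatial block into the channel dimension."""
    H, W, C = x.shape
    x = x.reshape(H // r, r, W // r, r, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(H // r, W // r, C * r * r)

def pixel_shuffle(x, r):
    """Depth-to-space: exact inverse of pixel_unshuffle, used to map
    LR feature maps back up to the HR grid."""
    h, w, crr = x.shape
    c = crr // (r * r)
    x = x.reshape(h, w, r, r, c)
    return x.transpose(0, 2, 1, 3, 4).reshape(h * r, w * r, c)

# Hypothetical inputs: a 4x4 upsampling case at illustrative sizes.
r = 4
lr_color = np.random.rand(270, 480, 3)       # low-resolution rendered frame
hr_gbuffer = np.random.rand(1080, 1920, 8)   # HR normals/depth/albedo, etc.

gbuf_lr = pixel_unshuffle(hr_gbuffer, r)     # (270, 480, 128)
fused = np.concatenate([lr_color, gbuf_lr], axis=-1)  # aligned network input
```

Because the rearrangement is lossless and invertible, no G-buffer detail is discarded by working at LR resolution; a network can process `fused` cheaply on the LR grid and use the depth-to-space inverse to produce HR output.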






Talk title: ExtraSS: A Framework for Joint Spatial Super Sampling and Frame Extrapolation


We introduce ExtraSS, a novel framework that combines spatial super sampling with frame extrapolation to enhance real-time rendering performance. By integrating these techniques, our approach balances performance and quality, generating temporally stable, high-quality, high-resolution results. Leveraging lightweight warping modules and the ExtraSSNet for refinement, we exploit spatial-temporal information, improve rendering sharpness, handle moving shadings accurately, and generate temporally stable results. Computational costs are significantly reduced compared to traditional rendering, enabling higher frame rates and alias-free, high-resolution results. Evaluation in Unreal Engine demonstrates the benefits of our framework over conventional spatial-only or temporal-only super sampling methods, delivering improved rendering speed and visual quality. With its ability to generate temporally stable, high-quality results, our framework opens new possibilities for real-time rendering applications, advancing the boundaries of performance and photorealistic rendering in various domains.
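The core of frame extrapolation is predicting a future frame from the last rendered one by pushing pixels along their motion vectors, after which a refinement network (ExtraSSNet in the talk) cleans up disocclusions and shading changes. The sketch below is a deliberately simple nearest-neighbor warp in NumPy with hypothetical shapes and a `warp_extrapolate` name of our own; the paper's lightweight warping modules are more sophisticated:

```python
import numpy as np

def warp_extrapolate(frame, motion, scale=1.0):
    """Warp `frame` along per-pixel motion vectors scaled by `scale`.
    scale > 1.0 extrapolates beyond the last rendered frame.
    frame: (H, W, C) color; motion: (H, W, 2) in pixels as (dy, dx)."""
    H, W, _ = frame.shape
    ys, xs = np.mgrid[0:H, 0:W]
    # For each output pixel, look back along the motion vector to find
    # its source pixel; clamp to the image border (a crude hole fill).
    src_y = np.clip(np.round(ys - scale * motion[..., 0]).astype(int), 0, H - 1)
    src_x = np.clip(np.round(xs - scale * motion[..., 1]).astype(int), 0, W - 1)
    return frame[src_y, src_x]
```

With zero motion this is the identity; with a uniform one-pixel vertical motion the content shifts down by one row, and border rows are duplicated where no source pixel exists, which is exactly the kind of artifact the refinement stage is there to repair.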





Jie Guo is a tenured associate professor (distinguished researcher) in the Department of Computer Science and Technology at Nanjing University, working mainly on computer graphics and virtual reality. To date, he has led more than 30 research projects in these areas, including a General Program grant from the National Natural Science Foundation of China, a 13th Five-Year equipment-development pre-research project, a General Program grant from the Natural Science Foundation of Jiangsu Province, and several industry collaborations. He has published more than 90 papers in leading journals and conferences, including SIGGRAPH, CVPR, ICCV, IEEE TVCG, and IEEE TIP. His techniques for material modeling, lighting estimation, and high-performance rendering have been adopted by several well-known companies, yielding notable economic and social benefits. He has received the Huawei Spark Award, the Young Science and Technology Award of the Jiangsu Computer Society, and the Outstanding Young Engineer Award of the Jiangsu Institution of Engineers, and was selected as a science-and-technology deputy general manager under Jiangsu Province's "Double Innovation Plan".

The "Tutorials" section of the GAMES homepage includes information on "How to watch the GAMES Webinar live stream?" and "How to join the GAMES WeChat group?".
