Exploring Mixed Reality Presentation with Holographic Near-eye and Head-up Displays

This project is showcased in the fifth exhibition – Technologies and Innovations.

Principal Investigator: Professor Yifan Evan PENG (Assistant Professor, Department of Electrical and Electronic Engineering)

About the scholar

Professor Yifan Evan PENG

Research interests:

Optics, Vision, Graphics, and AI, focused on: Computational Imaging (sensors, displays, microscopy), Holography & VR/AR/MR, Low-level Vision & Inverse Rendering, Human-centered Sensory Systems.

Email: evanpeng@hku.hk

Website: https://www.eee.hku.hk/~evanpeng/

Project information

This project advances holographic displays through AI-driven methods such as 3D-HoloNet, a neural network that synthesizes real-time, high-fidelity 3D holograms without bulky optical filters, overcoming the traditional speed-quality trade-off.
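3D-HoloNet itself is not described in code here, so as a minimal illustration of the underlying computer-generated holography (CGH) problem it addresses, below is a classical Gerchberg-Saxton phase-retrieval sketch in NumPy. All parameter values (wavelength, propagation distance, pixel pitch, iteration count) are illustrative assumptions, not project specifications. The slowness of such iterative solvers is precisely the computational bottleneck that a learned network like 3D-HoloNet replaces with a single fast forward pass.

```python
import numpy as np

def angular_spectrum(field, wavelength, distance, pitch):
    """Propagate a complex field by `distance` using the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    fxx, fyy = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - fxx**2 - fyy**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # clip evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * distance))

def gerchberg_saxton(target_intensity, wavelength=520e-9, distance=0.1,
                     pitch=8e-6, iters=50):
    """Iteratively compute an SLM phase pattern whose propagated intensity
    approximates target_intensity (a phase-only hologram)."""
    amp = np.sqrt(target_intensity)
    phase = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi,
                                             target_intensity.shape)
    for _ in range(iters):
        # Forward propagate the phase-only SLM field to the image plane.
        img = angular_spectrum(np.exp(1j * phase), wavelength, distance, pitch)
        # Enforce the target amplitude, keep the propagated phase.
        img = amp * np.exp(1j * np.angle(img))
        # Back-propagate and enforce the phase-only (unit amplitude) constraint.
        phase = np.angle(angular_spectrum(img, wavelength, -distance, pitch))
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
phase = gerchberg_saxton(target)
recon = np.abs(angular_spectrum(np.exp(1j * phase), 520e-9, 0.1, 8e-6)) ** 2
```

Even this toy example needs tens of propagation round trips per frame, which motivates replacing the loop with a trained network for real-time operation.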

Novelty

  •  Builds two holographic display prototypes: a VR prototype and an AR HUD prototype.

  •  Achieves high-quality, real-time hologram synthesis using deep learning.

  •  Enables dynamic 3D visualization with natural depth perception.

  •  Demonstrates AI's role in overcoming computational bottlenecks in classical optics.

Project images
Illustration of camera-calibrated learning (blue arrows) and 3D-HoloNet training (green arrows).
Photograph of holographic VR display prototype.
Experimental results on the unfiltered holography setup, with PSNR (dB) / SSIM metrics (higher is better) for various CGH algorithms.
An AR prototype for the HUD-type holographic display.
The captured result of our AR holographic display prototype.
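The PSNR metric quoted in the experimental results follows a standard definition that can be reproduced in a few lines; the sketch below is that textbook formula, not code from the project, and the sample arrays are invented for illustration. SSIM is more involved and is usually taken from a library such as scikit-image rather than written by hand.

```python
import numpy as np

def psnr(reference, test, data_range=1.0):
    """Peak signal-to-noise ratio in dB; higher means the test image is
    closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * np.log10(data_range ** 2 / mse)

# Illustrative example: a gradient image vs. a lightly noised copy.
ref = np.linspace(0.0, 1.0, 256).reshape(16, 16)
noise = 0.01 * np.random.default_rng(0).standard_normal(ref.shape)
noisy = np.clip(ref + noise, 0.0, 1.0)
score = psnr(ref, noisy)
```

With noise of standard deviation 0.01 on a unit range, the score lands around 40 dB, which is why values in that range are commonly read as "high fidelity" in hologram-quality tables.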