This repository contains the Dynamic Periocular Data Generation (DPDG) environment and our depth estimation model code, as introduced in our research paper on eye health in Virtual Reality (VR).
The captivating realism of VR brings challenges, including potential eye strain and long-term visual impairment for users. To mitigate these side effects and advance the understanding of eye health in VR, this repository presents two primary contributions:
- **Dynamic Periocular Data Generation (DPDG):** An environment based on Unreal Engine (UE) MetaHuman, designed to synthesize a large number of training images from a limited set of human facial scan data. This addresses the data collection challenges inherent in VR research.
- **Periocular Depth Estimation:** Building on the U-Net 3+ deep learning architecture, we developed and fine-tuned a method for estimating periocular depth maps. This aids in reconstructing the three-dimensional periocular region, providing a foundation for in-depth analyses related to light stimuli and medical guidelines in VR. A minimal usage sketch follows this list.
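
For orientation, the sketch below shows the general shape of an image-to-depth-map model of this kind in PyTorch. It is a minimal stand-in, not our actual implementation: the real model uses the U-Net 3+ architecture with full-scale skip connections, and `TinyDepthNet`, its layer sizes, and the input resolution here are illustrative assumptions. See the subfolder READMEs for the real entry points.

```python
import torch
import torch.nn as nn

class TinyDepthNet(nn.Module):
    """Minimal encoder-decoder stand-in for a periocular depth model.

    Illustrative only: the actual model in this repository is based on
    U-Net 3+ (full-scale skip connections); see the subfolder READMEs.
    """
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),   # grayscale input
            nn.MaxPool2d(2),                             # downsample 2x
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),              # one-channel depth map
        )

    def forward(self, x):
        return self.dec(self.enc(x))

model = TinyDepthNet().eval()
image = torch.rand(1, 1, 256, 256)  # stand-in for a preprocessed periocular crop
with torch.no_grad():
    depth = model(image)            # depth map, same spatial size as the input
print(depth.shape)                  # torch.Size([1, 1, 256, 256])
```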
By open-sourcing these tools, we aim to drive forward research on eye health in VR and invite collaboration from the broader scientific community.
Please see the README files in the subfolders for details on each component.
The raw data from our experiments contains sensitive facial information. To ensure the confidentiality of our participants, those interested in accessing the raw data must contact the authors directly and sign a Non-Disclosure Agreement (NDA).
Email: yitong.sun@network.rca.ac.uk
We welcome contributions and feedback on our code. If you find any issues or have suggestions, please open an issue or submit a pull request.
This project is licensed under the GPL-3.0 License. See the LICENSE file for details.
If you find our work useful for your research, please consider citing our paper: "To be released soon"