Improving Gaze Reconstruction Accuracy in Generated Faces
This project investigated the effects of adding multiple loss terms to the optimization function of a face swapping model. We found that an image reconstruction loss restricted to the eye region and a loss penalizing the difference in gaze angles estimated by a pretrained expert model both increased the accuracy of gaze representation in generated faces.
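A minimal sketch of how such gaze-centric loss terms might be combined, assuming PyTorch, a swapped/target image pair, a binary eye-region mask, and a pretrained gaze estimator that outputs (pitch, yaw) angles. The function and weight names here are illustrative assumptions, not the papers' exact implementation.

import torch
import torch.nn.functional as F

def gaze_centric_losses(swapped, target, eye_mask, gaze_net,
                        w_eye=1.0, w_gaze=1.0):
    """Two illustrative gaze-centric loss terms for a face-swap objective.

    swapped, target: (B, 3, H, W) generated and ground-truth face images
    eye_mask:        (B, 1, H, W) binary mask covering the eye regions
    gaze_net:        pretrained gaze estimator returning (pitch, yaw) in radians
    """
    # 1) Eye-region image reconstruction: L1 difference restricted to the eyes.
    eye_recon = F.l1_loss(swapped * eye_mask, target * eye_mask)

    # 2) Gaze-angle consistency: penalize the angular difference between the
    #    gaze directions predicted for the swapped and target faces.
    with torch.no_grad():
        gaze_target = gaze_net(target)       # (B, 2) pitch/yaw, no gradients
    gaze_swapped = gaze_net(swapped)         # gradients flow through this branch

    def angles_to_vec(g):
        # Convert pitch/yaw angles to 3D unit gaze vectors.
        pitch, yaw = g[:, 0], g[:, 1]
        return torch.stack([torch.cos(pitch) * torch.sin(yaw),
                            torch.sin(pitch),
                            torch.cos(pitch) * torch.cos(yaw)], dim=1)

    v_s = F.normalize(angles_to_vec(gaze_swapped), dim=1)
    v_t = F.normalize(angles_to_vec(gaze_target), dim=1)
    cos_sim = (v_s * v_t).sum(dim=1).clamp(-1.0, 1.0)
    gaze_angle = torch.acos(cos_sim).mean()  # mean angular error in radians

    return w_eye * eye_recon + w_gaze * gaze_angle

In training, a combined term like this would be added to the swap model's existing reconstruction and adversarial losses, with the weights tuned per model.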
Funding source(s):
- NIH R21 “Protecting the privacy of the child through facial identity removal in recorded behavioral observation sessions” (2020-2022)
Publications
Towards mitigating uncann(eye)ness in face swaps via gaze-centric loss terms
Wilson, Ethan and Shic, Frederick and Jörg, Sophie and Jain, Eakta. Towards mitigating uncann(eye)ness in face swaps via gaze-centric loss terms. Computers & Graphics, Special Issue on Eye Gaze Visualization, Interaction, Synthesis, and Analysis (2024).
Resources:
- Bibtex:
@article{wilson_uncanneyeness_2024,
  author = {Ethan Wilson and Frederick Shic and Sophie Jörg and Eakta Jain},
  title = {Towards mitigating uncann(eye)ness in face swaps via gaze-centric loss terms},
  year = {2024},
  journal = {Computers \& Graphics},
  doi = {10.1016/j.cag.2024.103888},
}
Introducing Explicit Gaze Constraints to Face Swapping
Wilson, Ethan and Shic, Frederick and Jain, Eakta. Introducing Explicit Gaze Constraints to Face Swapping. ACM Symposium on Eye Tracking Research & Applications (ETRA), 2023 (in press).
Resources:
- Paper (coming soon)
- Bibtex:
@inproceedings{wilson_gazeconstraints_2023,
  title = {Introducing Explicit Gaze Constraints to Face Swapping},
  author = {Wilson, Ethan and Shic, Frederick and Jain, Eakta},
  booktitle = {2023 Symposium on Eye Tracking Research and Applications},
  year = {2023}
}