Ean Grady-Looking Outwards-05

http://niessnerlab.org/projects/thies2016face.html

Title: Face2Face
Creators: Justus Thies, Michael Zollhöfer, Marc Stamminger, Christian Theobalt, Matthias Nießner
2016

TUM Visual Computing has created technology that allows “real-time facial reenactment of a monocular target video sequence (e.g., Youtube video).” Essentially, this means that if a person sits in front of a commodity webcam while a video of another person talking plays, the system can replace the facial expressions of the person in the video with the webcam user’s expressions, in real time. The project creators track the facial expressions of both the source and target individuals using a ‘dense photometric consistency measure’. Reenactment of the source’s facial expressions on the target is then achieved through fast and efficient ‘deformation transfer’ between the source and target.
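To get a feel for what a ‘dense photometric consistency measure’ means, here is a minimal sketch in Python. This is not the authors’ implementation; it is just a toy illustration of the underlying idea: comparing a synthesized face rendering against the observed video frame pixel by pixel, so that the tracker can adjust face parameters to minimize the color mismatch. The function name and array layout are my own assumptions.

```python
import numpy as np

def photometric_error(rendered, observed, mask):
    """Toy dense photometric consistency term.

    rendered: H x W x 3 float array, the synthesized face image
    observed: H x W x 3 float array, the actual video frame
    mask:     H x W boolean array marking pixels covered by the face

    Returns the sum of squared per-pixel color differences over the
    face region; a tracker would tune face parameters to minimize this.
    """
    diff = (rendered - observed)[mask]
    return float(np.sum(diff ** 2))

# Toy check: a perfect rendering yields zero error,
# and any mismatch yields a positive error.
frame = np.random.rand(8, 8, 3)
face_mask = np.zeros((8, 8), dtype=bool)
face_mask[2:6, 2:6] = True
print(photometric_error(frame, frame, face_mask))         # 0.0
print(photometric_error(frame, 1.0 - frame, face_mask) > 0)
```

In the real system this comparison is ‘dense’ because every face pixel contributes, rather than a sparse set of tracked landmarks, which is part of why the results look so fluid.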

I find this work more interesting than inspiring, which is not to say it isn’t inspiring. It is especially interesting how fluid and realistic the facial reenactments look on the target (video demonstration below). Granted, this was made in 2016, so the technology is most likely better now than what is shown here, but it is remarkable that such technology exists at all. A more advanced version could bring a plethora of possibilities to many different fields, from drastically changing entertainment to potentially serving as a means of creating holograms.

The video below is a demonstration of the real-time reenactment.
