How do filters work with AI?
Introduction to Face Filters
Many of you must have used face filters like the age filter on Snapchat or the Blossom filter on Instagram. But have you ever wondered how these face filters recognize our faces, and how they apply those filters exactly on our faces even as our faces move?
In this session, we will not only learn the concept behind how these face filters work but also get hands-on experience building a face filter that applies a filter sprite to our face.
This filter appears exactly at the position of our face and is sized to fit it. The filter is also coded to move along with our face. We do all of this using PictoBlox's Face Detection and Pen extensions.
How do these filters work?
First, the computer needs to figure out where your face is. Then it maps out the features of your specific face so that it can distort them or apply effects to your appearance.
- Detection: To detect a human face from the camera feed on the stage:
- The computer converts the image to a grayscale image (an image with only shades of grey instead of colors) to make it simpler to analyze.
- It then analyzes the color values of the pixels in this grayscale image to recognize patterns of contrast.
- The Viola-Jones object detection framework is commonly used for such rapid face detection.
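The two pre-processing steps above can be sketched in plain Python. This is an illustrative toy, not PictoBlox's actual implementation: it converts color pixels to grayscale, then builds the "integral image" that lets Viola-Jones sum any rectangle of pixels in constant time (the building block of its contrast features).

```python
# Illustrative sketch: grayscale conversion plus the integral image used by
# Viola-Jones to evaluate rectangular contrast features quickly.

def to_grayscale(rgb_image):
    """Convert a 2-D grid of (r, g, b) tuples to luminance values 0-255,
    using the standard Rec. 601 luma weights."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_image]

def integral_image(gray):
    """ii[y][x] = sum of all gray pixels above and left of (x, y), inclusive."""
    h, w = len(gray), len(gray[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        run = 0                                  # running sum of the current row
        for x in range(w):
            run += gray[y][x]
            ii[y][x] = run + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum of gray values in the rectangle (x0, y0)..(x1, y1), via 4 lookups."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total
```

A Viola-Jones "Haar feature" is simply the difference between two such rectangle sums, e.g. a dark eye strip above a brighter cheek strip, which is the kind of contrast pattern mentioned above.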
- Mapping: Active Shape Models (ASMs) are widely used to tag the facial features of any face based on its structure.
- For this, machine learning models are trained on face data where those features have already been mapped.
- ASM marks a set of points on the face to define its features, forming a Point Distribution Model (PDM).
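The idea of a Point Distribution Model can be sketched with a toy example: average several hand-annotated faces to get a "mean shape". The landmark names and coordinates below are invented for illustration; real ASMs use dozens of points per face learned from large annotated datasets.

```python
# Toy Point Distribution Model: average hand-labeled landmark positions
# across several annotated faces to get a "mean shape". All coordinates
# are normalized to a 1x1 square and invented for illustration.

def mean_shape(annotated_faces):
    """annotated_faces: list of dicts mapping landmark name -> (x, y).
    Returns the average position of each landmark across all examples."""
    n = len(annotated_faces)
    labels = annotated_faces[0].keys()
    return {label: (sum(f[label][0] for f in annotated_faces) / n,
                    sum(f[label][1] for f in annotated_faces) / n)
            for label in labels}

# Two made-up annotated faces:
faces = [
    {"left_eye": (0.28, 0.40), "right_eye": (0.72, 0.40), "nose_tip": (0.50, 0.62)},
    {"left_eye": (0.32, 0.38), "right_eye": (0.68, 0.42), "nose_tip": (0.50, 0.58)},
]
# mean_shape(faces)["left_eye"] is approximately (0.30, 0.39)
```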
- Training a face model: Basically, you feed the computer hundreds of images and tell it where the eyes are in every picture. The computer then becomes capable of pinpointing eyes in a new image it has never seen.
- When you capture your own face with your camera, the computer places the points of an average face around where it detected your face to be. These points are then adjusted according to the model's knowledge of how a face is "supposed" to appear, and a 'mesh' of your face is created from them.
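The placement step just described can be sketched as scaling and translating a normalized "average face" shape into the bounding box returned by the detector. The landmark names and coordinates below are invented; a real ASM would then iteratively nudge each point toward nearby image edges, which this sketch omits.

```python
# Illustrative sketch: drop an "average face" shape into a detected face
# bounding box. Landmark names and positions are invented for illustration.

MEAN_SHAPE = {                       # positions inside a 1x1 square
    "left_eye":  (0.30, 0.40),
    "right_eye": (0.70, 0.40),
    "nose_tip":  (0.50, 0.60),
    "mouth":     (0.50, 0.78),
}

def place_mean_shape(face_box, mean_shape=MEAN_SHAPE):
    """face_box is (x, y, width, height) in pixels from the face detector.
    Returns each landmark's pixel position inside that box."""
    bx, by, bw, bh = face_box
    return {label: (bx + nx * bw, by + ny * bh)
            for label, (nx, ny) in mean_shape.items()}

# e.g. place_mean_shape((100, 50, 200, 200)) puts the nose tip near (200, 170)
```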
- Modification: The filter applied to your face must move with it. That's why the lipstick filter you apply doesn't get shaken off when you turn your face to one side.
- A mesh-like structure is applied to your face for this, as shown below:
Reference: www.warwickwarp.com
- Above is a 3D mesh constructed from three views of a person's face. This mesh can also be distorted to create desired alterations on your face, i.e., it can make your face look puffy, your nose look pointy, or your eyes look bigger.
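The "distort the mesh" idea can be sketched as pushing mesh points near a chosen center (say, an eye) radially outward, so the eye region looks bigger when the face texture is re-drawn over the warped mesh. The falloff radius and strength below are invented parameters, not values from any real filter.

```python
import math

# Illustrative mesh distortion: move points near `center` outward to enlarge
# that region (e.g. an eye). Radius and strength are made-up parameters.

def enlarge_around(points, center, radius=0.15, strength=0.4):
    """Push each (x, y) point away from `center`. Points farther than
    `radius` are untouched; the effect fades smoothly to zero at the rim."""
    cx, cy = center
    warped = []
    for (x, y) in points:
        dx, dy = x - cx, y - cy
        d = math.hypot(dx, dy)
        if d == 0 or d >= radius:
            warped.append((x, y))
        else:
            falloff = 1 - d / radius        # 1 at the center, 0 at the rim
            scale = 1 + strength * falloff  # >1 pushes the point outward
            warped.append((cx + dx * scale, cy + dy * scale))
    return warped
```

A point just right of the eye center moves further right, while points outside the radius (like the other eye) stay exactly where they were, so the rest of the face is unaffected.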