How the Snapchat Filter Works - Behind the Scenes!


How Snapchat's Filters Work

Snapchat:

Source : apkpure.com


 

These filters are one of Snapchat's special, unique features. They can be pretty crazy, but the engineering behind them is serious. Snapchat calls these filters "lenses" (link).

Technology:

The technology came from a Ukrainian startup called Looksery, which used facial recognition to "photoshop" a face during video chats.

Snapchat acquired the Odesa-based, real-time face-changing startup Looksery in September 2015 for USD 150 million, reportedly the largest tech acquisition in Ukrainian history.

 Source: ukrainemap.facts.co


Source : www.looksery.com


Their augmented reality filters (link) tap into the large and rapidly growing field of computer vision: applications that use pixel data from a camera in order to identify objects.

Computer vision is how you can deposit checks with your phone.

It's how Facebook knows who's in your photos, how self-driving cars can avoid running over people, and how you can give yourself a dog nose.

So how do the filters work in Snapchat?

Looksery keeps its engineering confidential, but anyone can access its patents online.

The first step is detection: how does the computer know which part of an image is a human face?

This is something that human brains are fantastic at. Too good even.


But this is what a photo looks like to a computer.

If all you have is the data for the colour value of each individual pixel, how do you find a face?

Well, the key is looking for areas of contrast between light and dark parts of the image.
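To make that concrete, here is a minimal Python/OpenCV sketch (the file name face.jpg is just a placeholder) of what "the data for the colour value of each individual pixel" looks like to a program: nothing but a grid of numbers.

import cv2

# Load a photo (the file name "face.jpg" is only a placeholder).
image = cv2.imread("face.jpg")

# Convert to grayscale: each pixel becomes a single brightness value, 0-255.
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

print(gray.shape)              # e.g. (480, 640): rows x columns of pixels
print(gray[100:103, 200:203])  # a 3x3 patch of the photo: nothing but numbers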

Source : openframeworks.cc


The pioneering face detection tool is called the Viola-Jones algorithm (link).

It works by repeatedly scanning through the image data, calculating the difference between the grayscale pixel values underneath the white boxes and the black boxes.

For instance, the bridge of the nose is usually lighter than the surrounding area on both sides.

The eye sockets are darker than the forehead, and the middle of the forehead is lighter than the sides of it.

These are crude tests for facial features, but if enough of them match in one area of the image, the algorithm concludes that there is a face there.
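This is not Snapchat's or Looksery's actual code, just a rough Python sketch of that kind of "white box minus black box" test. It uses an integral image, the trick Viola-Jones relies on so that any rectangle's pixel sum costs only four lookups:

import numpy as np

def integral_image(gray):
    # Cumulative sums in both directions; ii[r, c] is the sum of all
    # pixels above and to the left of (r, c), inclusive.
    return gray.cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, top, left, height, width):
    # Sum of the pixels inside one rectangle, read off the integral image.
    bottom, right = top + height - 1, left + width - 1
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

def forehead_vs_eyes_feature(gray, top, left, height, width):
    # A crude Haar-like test: is the upper half of this window (forehead)
    # brighter than the lower half (eye sockets)?
    ii = integral_image(gray.astype(np.int64))
    light = box_sum(ii, top, left, height // 2, width)               # "white" box
    dark = box_sum(ii, top + height // 2, left, height // 2, width)  # "black" box
    return light - dark  # a large positive value means the feature fires

# Tiny demo: a synthetic 6x6 patch with a bright top half and a dark bottom half.
patch = np.vstack([np.full((3, 6), 200), np.full((3, 6), 50)])
print(forehead_vs_eyes_feature(patch, 0, 0, 6, 6))  # prints 2700

A real Viola-Jones detector evaluates thousands of such box differences at many positions and scales, feeding them into a cascade of learned thresholds.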

This kind of algorithm won't find your face if it's strongly tilted or facing sideways, but it's really accurate for frontal faces, and it's how digital cameras have been putting boxes around faces for years.
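OpenCV ships pre-trained cascades of exactly this kind, so that "box around the face" behaviour can be sketched in a few lines of Python (the image file names are placeholders, and as noted above it only works well on roughly frontal faces):

import cv2

# Load one of OpenCV's bundled, pre-trained Viola-Jones-style cascades.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

image = cv2.imread("face.jpg")                    # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Scan the image at multiple scales; each detection is an (x, y, w, h) box.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces_boxed.jpg", image)             # placeholder output name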

But in order to apply virtual lipstick, the app needs to do more than just detect your face.

It has to locate your facial features.

According to the patents, it does this with an "active shape model": a statistical model of a face shape that has been trained by people manually marking the borders of facial features on hundreds, sometimes thousands, of sample images.

The algorithm takes an average face from that training data and aligns it with the image from your phone's camera, scaling it and rotating it according to where it already knows your face is located.

But it's not a perfect fit, so the model analyses the pixel data around each of the points, looking for the edges defined by brightness and darkness. From the training images, the model has a template for what, for example, the bottom of your lip should look like, so it looks for that pattern in your image and adjusts the point to match it.

Because some of these individual guesses might be wrong, the model can correct and smooth them by taking into account the locations of all the other points. Once it locates your facial features, those points are used as coordinates to create a mesh.
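As a concrete stand-in for that step, here is a Python sketch using dlib's off-the-shelf 68-point landmark predictor. It is not Looksery's active shape model (dlib's predictor is an ensemble of regression trees), but it plays the same role: a model trained on manually annotated faces that snaps labelled points onto a detected face. The image file name and the separately downloaded .dat model file are placeholders.

import cv2
import dlib

detector = dlib.get_frontal_face_detector()
# The 68-point model file must be downloaded separately; this path is a placeholder.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

image = cv2.imread("face.jpg")                    # placeholder file name
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

for face in detector(gray):
    landmarks = predictor(gray, face)
    # 68 (x, y) points covering the jawline, eyebrows, nose, eyes and lips.
    points = [(landmarks.part(i).x, landmarks.part(i).y) for i in range(68)]
    for (x, y) in points:
        cv2.circle(image, (x, y), 2, (0, 0, 255), -1)

cv2.imwrite("landmarks.jpg", image)               # placeholder output name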

That mesh becomes the 3D mask that can move, rotate, and scale along with your face as the video data comes in, frame by frame. And once they've got that, they can do a lot with it.

They can deform the mask to change your face shape, change your eye colour, add accessories, and set animations to trigger when you open your mouth or move your eyebrows.
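The patents don't spell out how the mesh is built, but one common way to turn located landmark points into a triangle mesh is Delaunay triangulation. Here is a self-contained Python sketch with stand-in coordinates where the real landmark points would go:

import numpy as np
from scipy.spatial import Delaunay

# Stand-in coordinates; in practice these would be the landmark points
# located for the current video frame.
points = np.array([
    [120, 200], [150, 260], [200, 300], [260, 300], [310, 260],
    [340, 200], [160, 160], [300, 160], [230, 220], [230, 270],
], dtype=float)

# Delaunay triangulation turns the scattered points into a triangle mesh.
mesh = Delaunay(points)

# Each row of mesh.simplices is one triangle: three indices into "points".
# Re-fitting the points every frame and warping these triangles is what lets
# a lens deform your face, recolour your eyes or pin accessories in place.
print(mesh.simplices)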

And like the iOS app Face Swap Live, Snapchat can swap your face with a friend's, although that involves a bunch more data.

The main components of this technology are not new. What's new is the ability to run them in real time on a mobile device; that level of processing speed is a pretty recent development.

So why go through all this trouble just to give people a virtual flower crown?

Well, Snapchat sees a revenue opportunity here. In a world that's flooded with advertisements, maybe the best hope brands have of getting us to look at their ads is to put them on our faces.

Facial detection has a creepy side too, particularly when it's used to identify you by name. Both the FBI and private companies like Facebook and Google are amassing huge databases of faces, and there's currently no federal law regulating it.

So some privacy advocates have come up with ways to camouflage your face from facial detection algorithms.

It's actually illegal in a lot of places to wear a face mask in public.

So this project by artist Adam Harvey suggests some things you can do with your hair and your makeup that can, for now, make your face invisible to computers.

 

 
