Facebook created AI to read all the memes


Facebook's 'Rosetta' AI can extract text from a billion images daily

The system scans images that have text overlaid on them, giving Facebook's engineers the data they need to equip Rosetta with predictive capabilities.

From improving search functionality to better policing hate speech, there are many reasons why Facebook would want a more efficient way to decipher text inside of images and videos.

Rosetta has already been widely integrated into products across Facebook and Instagram.

In the future, Facebook says it could apply the same technology toward understanding text that appears in video as well, though that requires a more complex system.

Optical Character Recognition (OCR) is a commonly used technology for pulling text from images, such as scanned pages, without the time-consuming burden of manual transcription.
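As a rough illustration of what basic OCR looks like (this is not Facebook's Rosetta pipeline), the snippet below uses the open-source Tesseract engine through the pytesseract wrapper; the file name "scanned_page.png" is a hypothetical placeholder.

```python
# Illustrative only: off-the-shelf OCR with Tesseract via pytesseract,
# not Facebook's Rosetta system.
# Requires: pip install pytesseract pillow, plus the Tesseract binary on PATH.
from PIL import Image
import pytesseract

def extract_text(image_path: str) -> str:
    """Run OCR on a single image and return the recognized text."""
    image = Image.open(image_path)
    return pytesseract.image_to_string(image)

if __name__ == "__main__":
    # "scanned_page.png" is a hypothetical example file.
    print(extract_text("scanned_page.png"))
```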

The company announced that it has built a machine learning system called Rosetta, which extracts text from a billion public image posts and videos on Facebook and Instagram to understand the context of the text and the images together.

Suffice to say, with approximately 350 million photos being uploaded to the social network each day, Rosetta and Facebook are fighting an uphill battle. Among other issues, extracting text efficiently from videos remains a work in progress, and Facebook continues to experiment with different solutions. With so many posts flooding the platform, Facebook's moderators cannot review every meme as it is uploaded, which has fueled controversy over offensive content.
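To see why video is harder than still images, consider a naive, hypothetical approach (not Facebook's method): sample frames from the clip and run the same image OCR on each one. The sketch below assumes OpenCV and pytesseract are installed, and "example_clip.mp4" is a placeholder file name.

```python
# Naive illustration only: sample video frames with OpenCV and run
# Tesseract OCR on each sampled frame. Not Facebook's approach.
import cv2
import pytesseract

def extract_text_from_video(path: str, every_n_frames: int = 30):
    """Return (frame_index, text) pairs for sampled frames where OCR finds text."""
    results = []
    cap = cv2.VideoCapture(path)
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % every_n_frames == 0:
            # OpenCV decodes frames as BGR; convert to RGB before OCR.
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            text = pytesseract.image_to_string(rgb).strip()
            if text:
                results.append((frame_idx, text))
        frame_idx += 1
    cap.release()
    return results

if __name__ == "__main__":
    # "example_clip.mp4" is a hypothetical example file.
    for idx, text in extract_text_from_video("example_clip.mp4"):
        print(idx, text)
```

Even this toy version exposes the scaling problem: every sampled frame costs a full OCR pass, and consecutive frames tend to repeat the same text, which is part of why a production system needs something more sophisticated.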

By its own admission, Facebook has been struggling to suppress the spread of inappropriate content - from hate speech and threats of violence to disinformation and "fake news" - across its vast platform.

In one recent instance, the tech titan admitted it had not acted quickly enough to stop the spread of ethnic hate and violence across Myanmar.

Facebook has struggled in the past to adequately identify hate speech and misleading information. In documents seen by Motherboard, Facebook's own training material incorrectly described an image from the 2010 natural disaster in Jiegu, China, as a picture of the Myanmar genocide.
