Facebook launches AI tool to curb cyber bullying

Cyber bullying is an issue that can no longer be ignored. Some argue that in the digital age the problem receives too little attention, and many women have fallen victim to the crime.

Victims of cyber bullying are left traumatised, and some end up committing suicide.

Facebook is introducing a new AI tool which will detect and remove intimate pictures and videos posted without the subject’s consent.

It claims that the machine learning tool will ensure that such posts, commonly referred to as ‘revenge porn’, are taken down, sparing victims from having to report them themselves.

Facebook users or victims of unauthorised uploads currently have to flag the inappropriate pictures before content moderators will review them. 

The company has also suggested that users send their own intimate images to Facebook so that the service can identify any unauthorised uploads. 

Many users are reluctant to share revealing photos or videos with the social-media giant, particularly given its history of privacy failures.

This is the latest attempt to rid the platform of abusive content, after the company came under fire when moderators claimed they were developing post-traumatic stress disorder.


Facebook is using these moderators as ‘human filters’ for the most horrific content on the internet, according to one leading cyber expert.

The company’s new machine learning tool is designed to find and flag the pictures automatically, then send them to humans to review.

Social media sites across the board have struggled to monitor and contain the abusive content users upload, from violent threats to inappropriate photos.

The company has faced harsh criticism for allowing offensive posts to stay up too long and sometimes for removing images with artistic or historical value. 

Facebook has said it’s been working on expanding its moderation efforts, and the company hopes its new technology will help catch some inappropriate posts.

The technology, which will be used across Facebook and Instagram, is trained using pictures that Facebook has previously confirmed were revenge porn.

It recognises a ‘nearly nude’ photo (a lingerie shot, for example) coupled with derogatory text that suggests someone uploaded the photo to embarrass or seek revenge on another person.
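The flag-then-review pipeline described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the scoring functions, thresholds, and post identifiers are invented stand-ins, since Facebook has not published its model details.

```python
# Hypothetical sketch of an automated flag-then-review pipeline.
# The classifiers and thresholds below are invented for illustration only.

def image_nudity_score(image_id):
    # Stand-in for an image classifier trained on previously confirmed
    # revenge-porn pictures; returns a probability in [0, 1].
    fake_scores = {"post_1": 0.92, "post_2": 0.10}
    return fake_scores.get(image_id, 0.0)

def text_derogatory_score(caption):
    # Stand-in for a text classifier; here a crude keyword check.
    derogatory_terms = {"embarrass", "revenge", "expose"}
    words = set(caption.lower().split())
    return 1.0 if words & derogatory_terms else 0.0

def should_flag_for_review(image_id, caption,
                           image_threshold=0.8, text_threshold=0.5):
    # Flag only when BOTH signals fire: a 'nearly nude' image score
    # combined with derogatory text, as the article describes.
    # Flagged posts would then be queued for human moderators.
    return (image_nudity_score(image_id) >= image_threshold
            and text_derogatory_score(caption) >= text_threshold)

print(should_flag_for_review("post_1", "time for some revenge"))  # True
print(should_flag_for_review("post_2", "lovely beach day"))       # False
```

The key design point the article implies is that neither signal alone triggers removal: the combination of an intimate image and hostile text is what routes a post to a human reviewer.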
