You will now be able to see why posts appear on your Facebook News Feed thanks to this clever new feature

A new "Why am I seeing this post?" button will indicate what activity influenced Facebook's algorithms.

Facebook is launching a new feature that explains how its algorithms decide what to display in your News Feed.

It is the first time the company has given people access to this insight directly in its app and on the website.

Facebook, Twitter, YouTube and others have been criticised for using algorithms to recommend content without explaining to users how they work.

A new "Why am I seeing this post?" button will indicate what activity influenced Facebook's algorithms.

Facebook told the BBC the new feature was available for some users in the UK today. It will roll out fully by 2 May.

The “Why am I seeing this post?” button will be found in the drop-down menu that appears at the top right of every post in the News Feed.

The tool will offer insights such as: “You’ve commented on posts with photos more than other media types.”
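As a rough illustration only (not Facebook's actual ranking or explanation logic), an insight like the one quoted above could come from a simple tally of a user's comments by media type. The function and data below are entirely hypothetical.

```python
# Hypothetical sketch: tally a user's comments by media type and produce
# an explanation string similar to the example quoted in the article.
from collections import Counter

def explain_top_media_type(comment_events):
    """comment_events: iterable of media-type strings, e.g. ['photo', 'video']."""
    counts = Counter(comment_events)
    if not counts:
        return "Not enough activity to explain this post."
    top_type, top_count = counts.most_common(1)[0]
    other_count = sum(counts.values()) - top_count
    if top_count > other_count:
        return f"You've commented on posts with {top_type}s more than other media types."
    return "Your recent activity doesn't favour one media type."

print(explain_top_media_type(["photo", "photo", "video", "link"]))
```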

Facebook said it was also adding more information to the “Why am I seeing this ad?” button that has appeared on advertisements since 2014.

It will now let people know if details on their Facebook profile matched those on an advertiser’s database.

The "Why am I seeing this post?" button will be found in the drop-down menu that appears at the top right of every post in the News Feed.

The ad button already reveals whether some of your online activity, such as the location from which you connected to the internet, is being used to target ads at you.

“Both of these updates are part of our ongoing investment in giving people more context and control across Facebook,” the company said in a blog post.

Facebook has faced intense scrutiny after a series of data breaches, privacy scandals and allegations that the platform was used to interfere in elections.

Last week, chief executive Mark Zuckerberg called for government regulation, saying the responsibility for monitoring harmful content was too great for companies to tackle alone.
