Twitter is studying its own algorithms to see what “unintentional harms” they cause
The company says it will focus on three areas of its algorithms.
Twitter announced this week that it would be taking an in-depth look at its machine learning algorithms. While machine learning can be a great tool, it can also cause harm: ultimately, these systems are trained on data produced by humans, who can be biased, even unintentionally.
On Twitter, these algorithms play a part in what you see as recommended topics, and even images are subject to bias. One of the better-known examples is the image-cropping controversy, where preview images featuring both Mitch McConnell and Barack Obama repeatedly defaulted to showing McConnell.
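To see how a cropping algorithm can end up with this kind of bias, here is a minimal sketch of saliency-based cropping. Twitter's real cropper uses a neural network to score per-pixel "importance"; the toy saliency map and the sliding-window search below are illustrative assumptions only, not Twitter's implementation.

```python
import numpy as np

def saliency_crop(saliency: np.ndarray, crop_h: int, crop_w: int) -> tuple:
    """Return the (top, left) corner of the crop window with the
    highest total saliency score."""
    h, w = saliency.shape
    best, best_pos = -np.inf, (0, 0)
    # Slide the crop window over the image and keep the highest-scoring one.
    for top in range(h - crop_h + 1):
        for left in range(w - crop_w + 1):
            score = saliency[top:top + crop_h, left:left + crop_w].sum()
            if score > best:
                best, best_pos = score, (top, left)
    return best_pos

# Toy example: two "faces" with unequal saliency scores. The crop centers
# on whichever region the model scores higher -- so if the model
# systematically scores one group of faces higher, that group always
# survives the crop.
sal = np.zeros((10, 20))
sal[4, 3] = 1.0   # hypothetical face A
sal[4, 16] = 1.2  # hypothetical face B, scored slightly higher
print(saliency_crop(sal, 10, 10))
```

The point of the sketch: the crop itself is a neutral maximization step, so any bias in the outcome comes from the saliency scores the model assigns, which is exactly what Twitter's analysis is examining.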
In the post, Twitter notes that it will look at three main things:
- A gender and racial bias analysis of our image cropping (saliency) algorithm
- A fairness assessment of our Home timeline recommendations across racial subgroups
- An analysis of content recommendations for different political ideologies across seven countries
Twitter says it will use the findings not only to study how its systems work, but also to improve the experience for everyone on the platform.
The post notes that findings may “result in changing our product, such as removing an algorithm and giving people more control over the images they Tweet.”
Recently, Twitter CEO Jack Dorsey said he eventually wants users not only to see more of the company’s algorithms, but also to choose which algorithms are used to present their recommendations and ads.