Twitter's image-cropping algorithm has a problematic bias toward excluding Black people and men, the company said in new research on Wednesday, adding that "how to crop an image is a decision best made by people."
The study by three of its machine learning researchers was conducted after user criticism last year about image previews in posts excluding Black people's faces.
It found an 8 percent difference from demographic parity in favour of women, and a 4 percent difference in favour of white individuals.
The paper cited several possible reasons, including issues with image backgrounds and eye colour, but said none were an excuse.
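For context on the headline numbers, "difference from demographic parity" is commonly read as the gap between how often a group is favoured and the 50 percent rate that parity would imply. The sketch below shows one plausible way such a gap could be computed from pairwise crop comparisons; the function, the tallies, and the exact definition are illustrative assumptions, not Twitter's published methodology or data.

```python
# Illustrative sketch only: one plausible reading of a "difference from
# demographic parity" in pairwise crop comparisons. The counts and the
# definition are assumptions for illustration, not Twitter's actual method.

def parity_gap(times_group_favoured: int, total_pairs: int) -> float:
    """Deviation of a group's favoured-crop rate from the 50% parity baseline."""
    return times_group_favoured / total_pairs - 0.5

# Hypothetical tallies: in 100 images containing one woman and one man,
# the automatic crop kept the woman's face in the preview 58 times.
print(f"{parity_gap(58, 100):+.0%}")  # prints +8%
```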
"Machine learning based cropping is fundamentally flawed because it removes user agency and restricts user's expression of their own identity and values, instead imposing a normative gaze about which part of the image is considered the most interesting," the researchers wrote.
To counter the problem, Twitter recently started showing standard aspect ratio photos in full - without any crop - on its mobile apps and is trying to expand that effort.
The researchers also assessed whether crops favoured women's bodies over heads, reflecting what is known as the "male gaze," but found that does not appear to be the case.
The findings are another example of the disparate impact of artificial intelligence systems, adding to the demographic biases already identified in facial recognition and text analysis, the paper said.
Work by researchers at Microsoft and the Massachusetts Institute of Technology in 2018 and a later US government study found that facial analysis systems misidentify people of colour more often than white people.
Amazon in 2018 scrapped an AI recruiting tool that showed bias against women.