Twitter explains why it removed image crop on iOS and Android
"Not everything is a good candidate for an algorithm," Twitter says.
On Wednesday, Twitter shared the results of its analysis of its image-cropping algorithm, which it stopped using after users noticed that the crops sometimes favored white individuals over Black ones, and male-presenting faces over female-presenting ones, among other bias-related problems.
Twitter tested the algorithm for potential gender- and race-based bias and found that, in comparisons of Black and white individuals, there was a 7% difference from demographic parity in favor of white individuals. Between Black and white women, there was also a 7% difference in favor of white women, and between Black and white men, a 2% difference in favor of white men. Finally, in comparisons of men and women generally, there was an 8% difference in favor of women.
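To make the figures above concrete, a demographic parity gap of this kind can be read as the difference between two groups' selection rates in paired comparisons. The sketch below is a hypothetical illustration of that arithmetic; the counts are made up and are not Twitter's data.

```python
def parity_difference(favored_a: int, favored_b: int) -> float:
    """Percentage-point gap between two groups' selection rates.

    favored_a / favored_b: counts of paired comparisons in which the
    crop centered on group A vs. group B. (Hypothetical helper, not
    Twitter's actual methodology.)
    """
    total = favored_a + favored_b
    rate_a = favored_a / total
    rate_b = favored_b / total
    return (rate_a - rate_b) * 100


# Illustration: if crops favored one group in 535 of 1,000 pairs,
# the gap is about 7 percentage points.
gap = parity_difference(535, 465)
print(gap)
```

A 7% gap, in other words, means the algorithm's choices deviated from a 50/50 split by 3.5 points in each direction.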
In May, Twitter got rid of the automatic image cropping, allowing users to post photos in their entirety (or to decide how to crop the photos themselves).
"One of our conclusions is that not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people," Rumman Chowdhury, Twitter's director of software engineering, wrote in a blog post about the team's findings.
Bias in technology, and particularly in artificial-intelligence algorithms, is a major issue the tech community is grappling with, as these algorithms increasingly determine things like access to health care. There has been some legislative activity on the topic, but experts say it is not easy to address.