Facebook says it's working to make its social media platforms a safer and more welcoming community for women by partnering with domestic violence organizations. Facebook wants to reduce harassment that might be keeping women offline, it said in a blog post Tuesday, and is "developing cutting-edge technology to help prevent abuse from happening in the first place."
"In the background, we have tools like AI and machine learning to prevent harassment," Antigone Davis, Facebook's global head of safety, says in a video accompanying the blog post. "We also are giving user controls -- that means things like blocking, being able to hide and delete comments you don't want underneath your posts, it means being able to report to us."
The social media giant says that once a non-consensual intimate image is reported, it uses digital fingerprinting and photo-matching technology to prevent the image from being posted again.
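Facebook hasn't published the details of its matching system, but the general idea behind this kind of digital fingerprinting can be sketched with a perceptual hash. The toy "average hash" below is an illustration only, not Facebook's actual method: it reduces an image to a bitmask, so a re-upload with minor edits (compression, slight brightness changes) still lands within a small Hamming distance of the original fingerprint.

```python
# Illustrative sketch of perceptual fingerprinting via an "average hash".
# This is an assumption for demonstration, NOT Facebook's actual system.

def average_hash(pixels):
    """Hash a tiny grayscale image (2D list of 0-255 values).

    Real systems first downscale the image to a small fixed grid
    (e.g. 8x8); here the input is assumed already downscaled.
    Each bit is 1 where the pixel is brighter than the mean.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits; a small distance suggests the same image."""
    return bin(a ^ b).count("1")

# A reported image, a slightly altered re-upload, and an unrelated image.
original  = [[10, 200], [220, 30]]
reupload  = [[12, 198], [221, 28]]   # minor edits survive the hash
unrelated = [[200, 10], [30, 220]]

h1, h2, h3 = average_hash(original), average_hash(reupload), average_hash(unrelated)
print(hamming_distance(h1, h2))  # 0 -> flagged as a match
print(hamming_distance(h1, h3))  # 4 -> not a match
```

Because the hash depends on coarse brightness structure rather than exact bytes, a matching service only needs to compare fingerprints against a blocklist, never store or re-inspect the reported image itself.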
"But we wanted to do even more," Facebook added. "We developed machine-learning and artificial-intelligence techniques to proactively detect nude or near-nude images and videos shared without permission -- without anyone having to report them."
According to Facebook, when it expanded its hate speech policy in July, it spoke with women's rights organizations, safety organizations and even anthropological and cognitive linguists to ensure the policy covered harassment of women across different cultural standards.
For instance, a nude photo could be used to shame a woman in the US, while a photo of an ankle, or of a woman walking with a man who isn't a family member, could shame her in other nations. Facebook says it can take down photos in all of these situations.
Facebook also developed a profile picture guard for women in nations like Egypt, India and Pakistan to prevent access to and sharing of their photos.