Contributed by: kellyseal on Wednesday, November 02 2022 @ 10:35 am
Last modified on Wednesday, November 02 2022 @ 10:59 am
Bumble has open-sourced the AI it uses to detect nude photos, giving developers the technology to help combat the sending of unsolicited images over their own platforms.
Bumble’s photo-blocking feature, called Private Detector, aims to curb the so-called “cyberflashing” that users have long complained about on dating apps by flagging suspected nude images and giving the recipient the option to delete the image and report the sender without having to view it. A refined version of the same AI is now available on GitHub for commercial use, distribution, and modification, according to TechCrunch. The idea is to provide the technology to smaller companies that don’t have the time or resources to develop it themselves.
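The flow described above — classify an incoming image, flag it if it appears lewd, and let the recipient delete or report without viewing — can be sketched roughly as follows. This is an illustrative assumption, not Bumble’s actual implementation: the classifier here is a stand-in stub, and the threshold and function names are hypothetical. A real integration would load the model Bumble published on GitHub and run inference on the image bytes.

```python
# Hypothetical sketch of gating incoming images behind a nudity classifier.
# classify_nudity() is a stub; a real integration would run the
# open-sourced Private Detector model here instead.

from dataclasses import dataclass

BLUR_THRESHOLD = 0.5  # assumed confidence cutoff; tune per product needs


@dataclass
class ModerationResult:
    blurred: bool      # image hidden from the recipient by default
    can_report: bool   # recipient offered a delete/report action


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for the real model: returns P(image is lewd)."""
    # A real implementation would preprocess the bytes and run inference.
    return 0.0


def moderate_incoming_image(image_bytes: bytes) -> ModerationResult:
    """Blur and offer delete/report when the score crosses the threshold."""
    score = classify_nudity(image_bytes)
    if score >= BLUR_THRESHOLD:
        return ModerationResult(blurred=True, can_report=True)
    return ModerationResult(blurred=False, can_report=False)
```

The key design point is that the decision happens before the image is ever rendered, so the recipient never has to see the content to act on it.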
In theory, as TechCrunch[*1] points out, this AI could go beyond dating apps: it could be incorporated into any app or social platform to shield users from unwanted content and cut down on lewd photos across many types of services.
“Even though the number of users sending lewd images on our apps is luckily a negligible minority — just 0.1% — our scale allows us to collect a best-in-the-industry dataset of both lewd and non-lewd images, tailored to achieve the best possible performances on the task,” Bumble wrote in a press release[*2].
Still, users complain that cyberflashing is a pervasive problem on dating apps in general, so this move could give start-up apps both a measure of user security and a selling point for new users. Popular apps like Tinder already offer additional safety and security features, including the ability to run background checks on matches.
“There’s a need to address this issue beyond Bumble’s product ecosystem and engage in a larger conversation about how to address the issue of unsolicited lewd photos — also known as cyberflashing — to make the internet a safer and kinder place for everyone,” Bumble said in its press release.
In addition to releasing the AI, Bumble has been working state by state to push legislation that would ban the sending of unsolicited lewd photos. The company successfully backed such a law in Texas, has advocated for similar bills in several other U.S. states, and has also lobbied U.K. lawmakers to ban the practice.
Earlier this year, Bumble announced a partnership with Bloom to launch an online trauma support center for those who have experienced sexual harassment and abuse on its app. Bumble has also offered one-on-one chat support and up to six therapy sessions for certain cases.
For more on this dating service, please check out our Bumble review.