Bumble’s New AI-Driven ‘Private Detector’ Feature Automatically Blurs Explicit Photos

Image: Bumble

Starting in June, artificial intelligence will shield Bumble users from unsolicited lewd photographs sent through the app’s messaging tool. The feature, dubbed Private Detector (as in “private parts”), will automatically blur explicit photos shared within a chat and warn the user that they’ve received an obscene image. The user can then decide whether to view the image or block it, and whether to report it to Bumble’s moderators.

“With our revolutionary AI, we are able to detect potentially inappropriate content and warn you about the image before you open it,” says a screenshot of the new feature. “We are committed to keeping you protected from unsolicited photos or offensive behavior so you can have a safe experience meeting new people on Bumble.”

The feature uses an AI model trained to analyze photographs in real time and determine, with 98 percent accuracy, whether they contain nudity or other explicit sexual content. In addition to blurring lewd images sent via chat, it will also prevent such images from being uploaded to users’ profiles. The same technology already helps Bumble enforce its 2018 ban on images that contain firearms.
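The classify-then-blur flow described above can be sketched as a simple threshold check. This is a purely illustrative Python sketch; the threshold value, function names, and scoring API are assumptions for demonstration, not Bumble's actual implementation:

```python
# Illustrative sketch of a "Private Detector"-style filter.
# In practice, nsfw_score would come from a trained image classifier;
# here it is simply passed in as a number between 0 and 1.
from dataclasses import dataclass
from typing import Optional

NSFW_THRESHOLD = 0.98  # assumed confidence cutoff for blurring


@dataclass
class ScreeningResult:
    blurred: bool
    warning: Optional[str]


def screen_image(nsfw_score: float) -> ScreeningResult:
    """Decide whether an incoming chat image should be blurred.

    If the classifier's confidence that the image is explicit meets
    the threshold, blur it and attach a warning; otherwise deliver
    the image unchanged. The recipient still chooses whether to
    view, block, or report a blurred image.
    """
    if nsfw_score >= NSFW_THRESHOLD:
        return ScreeningResult(
            blurred=True,
            warning="This image may contain inappropriate content.",
        )
    return ScreeningResult(blurred=False, warning=None)
```

The design choice worth noting is that the filter never deletes anything on its own: it only obscures the image and surfaces a warning, leaving the final view/block/report decision with the user.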

Andrey Andreev, the Russian entrepreneur whose dating group includes Bumble and Badoo, is behind Private Detector.

"The safety of our users is without question the number one priority in everything we do and the development of Private Detector is another undeniable example of that commitment," Andreev said in a statement. "The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behaviour on our platforms.”

"Private Detector is not some '2019 idea' that's a response to some other tech company or a pop culture idea," added Bumble founder and CEO Wolfe Herd. "It's something that's been important to our company from the beginning--and is just one piece of how we keep our users safe and secure."

Wolfe Herd has also been working with Texas legislators to pass a bill that would make sharing unsolicited lewd images a Class C misdemeanor punishable by a fine of up to $500.

"The digital world can be a very unsafe place overrun with lewd, hateful and inappropriate behaviour. There’s limited accountability, making it difficult to deter people from engaging in poor behaviour," Wolfe Herd said. "The 'Private Detector,' and our support of this bill are just two of the many ways we’re demonstrating our commitment to making the internet safer."

Private Detector will also roll out to Badoo, Chappy and Lumen in June 2019. For more on this dating service, you can read our review of the Bumble app.