Bumble introduced a new reporting option that lets members flag profiles they suspect are using AI-generated photos or videos. Users choose "Fake profile" and then select "Using AI-generated photos or videos." The option is meant to keep the dating app free of AI-generated profiles and maintain a safe, trusted dating environment.
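The two-step menu amounts to a reason/sub-reason pair. The sketch below is purely illustrative: the enum and function names are hypothetical and do not reflect Bumble's actual reporting API.

```python
from enum import Enum


class ReportReason(Enum):
    """Hypothetical top-level report categories (illustrative only)."""
    FAKE_PROFILE = "Fake profile"
    # ... other categories omitted


class FakeProfileSubReason(Enum):
    """Hypothetical sub-options shown after choosing 'Fake profile'."""
    AI_GENERATED_MEDIA = "Using AI-generated photos or videos"


def build_report(reason: ReportReason, sub_reason: FakeProfileSubReason) -> dict:
    """Assemble a report payload from the two-step menu selection."""
    return {"reason": reason.value, "sub_reason": sub_reason.value}


if __name__ == "__main__":
    print(build_report(ReportReason.FAKE_PROFILE,
                       FakeProfileSubReason.AI_GENERATED_MEDIA))
```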
Bumble's "Deception Detector" is an AI-powered tool that helps identify fake, scam, and spam profiles on the app. It uses a machine learning-based model to assess the authenticity of profiles and connections. The feature has been successful in blocking 95% of fake profiles automatically and has reduced reports of spam and scams by 45%.
Bumble's "Private Detector" tool is an AI-powered feature that automatically detects and blurs lewd images sent by users4. It sends a warning to recipients about the photo before they open it, allowing them to choose whether to view, block, or report the image to moderators. The tool aims to protect users from unsolicited nude images and improve safety on the dating app.