Google has begun rolling out a feature in its Google Messages app that automatically reacts to potentially inappropriate images. First announced last year, the feature is now reaching select Android devices.
When the system detects an image containing nudity, it automatically blurs it and shows the recipient a message explaining why the content is not displayed immediately. The recipient can still choose to reveal the photo manually. The sender, in turn, is warned about the potential risks of sharing such material.

Sharing intimate images without consent can have serious consequences for the people depicted in them. Such material can be abused for blackmail or spread on social networks and among acquaintances, including at work or school. This often takes the form of so-called revenge sharing, especially after a relationship has ended.
For users under the age of eighteen, the feature is enabled automatically; adults must turn it on manually in the app's settings. This is expected to help protect minors and raise awareness of responsible behavior when sending private media.
Google emphasizes that the tool is not perfectly accurate: it may blur harmless photos or miss images that should be hidden. The goal, however, is to improve safety and reduce the chances that intimate images are misused.