WhatsApp: content can be reported to the police | The NO to violence – Player.it

Written by aquitodovale

Be very careful what you write on WhatsApp from now on, because it could cost you dearly, especially if someone reading your messages feels offended or even threatened. WhatsApp has added a feature designed to prevent exactly this kind of unpleasant situation.

For a long time, social media has been plagued by people who use these platforms to spread hate speech, threats, and violent content. Such behavior is not only inappropriate: it can cause emotional harm to its targets, incite hatred and violence, and create serious disruption in everyday life.

Social media platforms have introduced content-moderation policies to handle inappropriate behavior, such as the ability to report offensive and violent messages. However, these moderation systems are often criticized for not being effective enough.

The same misconduct can also occur on messaging apps such as WhatsApp. From now on, however, anyone with this bad habit will have to be very careful: thanks to a new feature, the days of hate-filled and violent messages are numbered.

WhatsApp says no to violence

On WhatsApp you may come across messages that glorify violence or directly endanger their recipients. When this happens in a chat, it is always a good idea to report everything to the competent authorities or, in less serious cases, to delete the chat if you feel offended.

However, people can also post violent and dangerous messages as a status update. Until now, it has never been possible to take countermeasures against inappropriate content that users share as a status on the platform. WhatsApp lets users share images, videos, and text as a status update for a period of 24 hours, and some people use this feature to share offensive or inappropriate content, such as nude images, videos, or violent material.

In response, WhatsApp has introduced strict content-moderation policies and removes content that violates its Terms of Service. For example, the application uses artificial intelligence and machine learning to detect and remove inappropriate content, and it also has a flagging system that lets users report unpleasant content.

Additionally, users who violate WhatsApp’s Terms of Service may face penalties, such as suspension or deletion of their account. WhatsApp encourages users to report any inappropriate content they see on the platform to help keep it safe and appropriate for everyone.

Despite these measures, there are those who manage to circumvent the problem and share violent messages on their status update.

The update that keeps us safe

The Google Play Beta Program has released the latest WhatsApp beta for Android, bringing the app to version 2.23.1.27. This update contains no immediately accessible new features; however, a small change to status updates is expected to arrive in a future release.

WhatsApp is working on a new feature that will allow users to report inappropriate content in status updates. The feature aims to ensure that content shared on the application complies with the Terms of Service and to give users more control over what is posted in status updates. The ability to report inappropriate content already exists for messages, and the new feature will be available in the Android version of WhatsApp as well as in the desktop version.

In practice, reporting an inappropriate status update on WhatsApp is simple. If users believe that a status update violates the Terms of Service or constitutes abuse, they can report it by pressing the options button and selecting the report option. Reported status updates are then forwarded to a moderation team, which checks them for violations of the Terms of Service.


About the author

aquitodovale
