
Facebook will be more aggressive in blocking the accounts of children under 13

The social network says that when a profile looks suspicious, it reviews its texts and photos to determine the user's age.

Facebook and Instagram will now work more proactively to block the accounts of users suspected of being under 13, the minimum age required to use these social networks. Although the rule was already in place, the company announced that it will strengthen the policies restricting use by minors.

Previously, the social network only investigated an account after receiving a report from a user specifically indicating that it belonged to a minor. Now, however, reviewers can suspend a profile if they have any indication that the user is underage, even if the account was reported for another reason, such as the type of content.

  • Facebook suspends US analytics company for violating policies

  • Tech giants move toward a data portability standard

  • Facebook will not remove Holocaust denial posts

When the company receives such a report or suspects a user, moderators review the profile's content, both texts and photos, to try to determine the user's age.

If users want to recover their account, they must submit proof, such as a copy of a photo ID, certifying that they meet the age requirement.

However, Facebook does not ask for any proof of age at registration, and it is unclear how reviewers can verify that a user is a minor from an image or a post.

The announcement of the strengthened policy for minors comes after the release of a documentary in the United Kingdom, produced by Channel 4 and Firecrest Films, which revealed that during training sessions in Dublin, Ireland, reviewers were asked to ignore users who looked younger than 13 in order to avoid blocking their profiles.

In a statement on its official blog, the company noted that Facebook has teams from different companies around the world that review reports 24 hours a day, seven days a week, across all time zones and in dozens of languages. "When necessary, they escalate decisions to Facebook specialists who have deep knowledge of the country. For specific and highly problematic types of content, such as child abuse, the final decisions are made by Facebook employees," the social network stressed.

This year, Facebook doubled its moderation teams to 20,000 people, including more than 7,500 content reviewers.

In December 2017, Facebook launched Messenger Kids, a service aimed at users under 13 that offers chat and short-lived disappearing photos, and that is controlled directly by parents.


Written by Geekybar

Linguist and translator by training, I have worked in advertising journalism for over 10 years.

Over 7 years in journalism, half of them as an editor. My weakness is doing mini-investigations into new topics.
