
Instagram, not responsible for minors

November 27, 2023

The history of Instagram in relation to minors is one of the most shameful and disgraceful chapters (I can think of far more forceful and cheeky qualifiers, but I’ll keep those to myself) in the history of social media. A few weeks ago, when it became known that a new whistleblower would brief the United States Congress on the company’s internal affairs, we already reviewed that history, so I won’t go over it again here. I fondly remember calling Adam Mosseri, the head of Instagram, a moron at the time, but it’s becoming increasingly clear to me that I fell short.

So the situation is not looking good for the social network which, despite staying out of the everyday conversations about hate and fake news that surround others, especially Twitter and Facebook, harbors another type of toxicity. It is not as obvious as insults and outright lies, but for many people it can have devastating effects.

Today we have news on two fronts: a lawsuit filed by no fewer than 41 of the states that make up the United States, and new and disturbing findings about the risks Instagram poses to minors. As might be expected, in both cases the information does not leave Meta in a good light, either for its actions in the past or for the measures with which it intends to protect minors today.

On the one hand, Ghacks offers more information about the Instagram lawsuit, thanks to leaked documentation. These documents tell us that the social network has for years been running promotions aimed specifically at attracting the under-13 demographic. Yes, you read that right, under 13, which is especially interesting considering that 13 is exactly the minimum age that Meta (theoretically) requires to sign up for the service.

And what are the accusations? They can be summarized in these three points:

  • Designing its platforms to be attractive to children.
  • Failing to take adequate measures to protect minor users from harmful content.
  • Exposing minor users to potentially harmful advertising.

On the other hand, Phone Arena reports on several tests carried out to evaluate Instagram’s brand-safety system, with which the social network tries to prevent ads from being shown next to content that advertisers do not want to be associated with. In theory, thanks to it, ad placements should always appear surrounded by posts suitable for all audiences. However, the tests in question can be summarized as follows:

«In a series of Instagram-recommended videos, an ad for the dating app Bumble appeared between a video of someone caressing the face of a life-size latex doll and a video of a young woman with a digitally obscured face lifting her shirt to reveal her belly. In another, a Pizza Hut ad followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.»

Several advertisers have already suspended their advertising plans on Instagram, but after reading the quote above, that seems almost the least important thing to me (although it will no doubt be Meta’s main concern). Did Meta really try so hard to draw kids and teenagers into a space where they could encounter this type of content? Surely by now you can imagine some of the qualifiers that came to mind when I started writing this piece, right?

Source: Muy Computer
