Social networks have many good aspects, but they also have a dark side. With so many users, not all of them have good intentions: on the same platform that lets us watch cat videos and memes, it is possible to find hate speech, abusers, content that promotes unhealthy lifestyles and habits, and more.
It is very important to put an end to this, especially when minors are involved, which is why Instagram has taken action on the issue. Its solution: teen accounts. It is a timely proposal, arriving just as some countries are considering legislating a minimum age for accessing social networks.
Let’s go piece by piece.
What is a teen account? The recently announced “Teen Accounts” are “a new experience for teens under the guidance of parents,” according to Meta. When a minor creates an account on Instagram, it will be a teen account by default, and that has certain consequences for the content that is served to them. The goal, Meta says, is to “better assist parents and give them peace of mind that their children are safe with the appropriate protections.”
Private accounts. The first and most important consequence is that teen accounts are private by default. Both existing Instagram users under the age of 16 and those who sign up in the future will have private accounts. Accounts will also be private by default for all users under 18 who sign up from now on, although this does not appear to apply to 16- and 17-year-olds who already use the app.
This way, the content posted by the minor will be private, and anyone who wants to see it will have to send a follow request, which the minor will have to approve.
Account privacy for teens and options to change settings with parental permission | Image: Meta
Limited messages and interactions. Teens will only be able to receive messages from users they follow or are already connected to, and they can only be tagged and mentioned by people they follow. The Hidden Words anti-bullying filter will also be enabled by default, and sensitive content, such as violence or promotions for cosmetic procedures, will be limited in Reels and Explore to the strictest level available.
Finally, if minors use the app for more than 60 minutes a day, they will receive a notification telling them to leave it (how effective this will be remains to be seen). Sleep mode will also be enabled between 10 PM and 7 AM, muting notifications and sending automatic replies to direct messages.
Time limits and list of contacts | Image: Meta
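To make those two time-based rules concrete, here is a minimal sketch of how such checks could work. It is purely illustrative: the function names, the 60-minute allowance and the way the overnight window is handled are assumptions drawn from the article, not Meta's actual implementation.

```python
from datetime import datetime, time, timedelta

# Values taken from the article: a 60-minute daily usage nudge and a
# sleep-mode window from 10 PM to 7 AM. Everything else is hypothetical.
DAILY_ALLOWANCE = timedelta(minutes=60)
SLEEP_START = time(22, 0)  # 10 PM
SLEEP_END = time(7, 0)     # 7 AM

def should_nudge(usage_today: timedelta) -> bool:
    """True once the teen has spent more than the daily allowance in the app."""
    return usage_today > DAILY_ALLOWANCE

def in_sleep_mode(now: datetime) -> bool:
    """True inside the overnight window. The window crosses midnight, so it is
    the union of [22:00, 24:00) and [00:00, 07:00)."""
    t = now.time()
    return t >= SLEEP_START or t < SLEEP_END

if __name__ == "__main__":
    print(should_nudge(timedelta(minutes=75)))           # True: over the 60-minute mark
    print(in_sleep_mode(datetime(2025, 1, 1, 23, 30)))   # True: notifications muted
    print(in_sleep_mode(datetime(2025, 1, 1, 12, 0)))    # False: normal daytime use
```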
Parents have power. Users under the age of 16 will need parental permission to switch to less restrictive settings; for that, parents must have an Instagram account with parental supervision set up. Teens aged 16 and over will not have this filter, although parents can enable it if they wish.
Parents, for their part, will be able to see who their child has been talking to via direct messages. They won’t be able to read the messages, but they will see the users their child has exchanged messages with in the last seven days. Likewise, they will be able to see which topics their child has chosen to see in the Explore tab, set time limits, and block Instagram use during certain hours.
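Putting the defaults and the parental-permission rule together, the following sketch models them as plain data. The field names and the `can_relax_settings` helper are invented for illustration; they are not an Instagram API.

```python
from dataclasses import dataclass

@dataclass
class TeenAccountDefaults:
    """Defaults described in the article; field names are hypothetical."""
    private_account: bool = True                    # follow requests must be approved
    messages_from_connections_only: bool = True
    tags_and_mentions_from_followed_only: bool = True
    hidden_words_filter: bool = True
    sensitive_content_level: str = "strictest"
    daily_limit_minutes: int = 60
    sleep_mode_enabled: bool = True                 # 10 PM to 7 AM

def can_relax_settings(age: int, parental_approval: bool) -> bool:
    """Under-16s need a supervising parent's approval to loosen any setting;
    16- and 17-year-olds can change settings on their own."""
    return age >= 16 or parental_approval

# A 15-year-old without parental approval keeps the strict defaults:
print(can_relax_settings(15, parental_approval=False))  # False
print(can_relax_settings(16, parental_approval=False))  # True
```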
Are you really 18? Since lying about your age can be as simple as changing a number or two, Instagram has put in place several mechanisms to detect accounts created by minors who misstate their age. Minors who create accounts will be required to verify their age with a video selfie or a photo of their ID, and Instagram will also use AI to predict whether someone is the age they claim to be or younger. As the company notes in a report on these measures (PDF):
“We will use AI technology to predict whether a person is over or under 18. We train this technology on signals such as profile information, when the person’s account was created, and interactions with other profiles and content. Based on these signals, we can start making calculations about the probability that someone is an adult or a teenager, even if the teenager has listed an adult birthday on their account.

This use of technology is a big change and a first in our industry. We work to make sure our AI models are accurate, but we can make mistakes along the way, so we want to take a proportionate approach to the adjustments we make. That’s why we’ll give people we predict are teens the option to change these settings. We’ll start testing this change in the U.S. in early 2025.”
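As a very rough sketch of how signals like the ones Meta mentions could be combined into a prediction, here is an illustrative scoring function. The features, weights and logistic squashing are invented for this example; Meta has not disclosed its actual model.

```python
import math

def predict_adult_probability(account_age_days: float,
                              teen_like_follow_share: float,   # 0..1: share of follows typical of teens
                              teen_like_content_share: float,  # 0..1: share of engagement with teen-typical content
                              declared_adult_birthday: bool) -> float:
    """Combine behavioural signals into a probability-like score that the user
    is really an adult. Weights are made up for illustration."""
    score = (
        0.002 * account_age_days              # older accounts look slightly more adult
        - 3.0 * teen_like_follow_share        # a teen-heavy follow graph pushes the score down
        - 2.5 * teen_like_content_share       # so does heavy engagement with teen-typical content
        + 0.5 * (1.0 if declared_adult_birthday else 0.0)
        - 0.5                                 # bias term
    )
    return 1.0 / (1.0 + math.exp(-score))     # squash to (0, 1)

# A three-month-old account that behaves like a teen scores low even though it
# declared an adult birthday, so it would be treated as a likely minor:
print(round(predict_adult_probability(90, 0.9, 0.8, True), 3))  # ~0.011
```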
Instagram logo | Image: Xataka
So Instagram will deploy this technology knowing there will be errors, which is why it is calling for a harmonized approach across the entire mobile ecosystem, involving industry, governments and experts. The company says it is advocating for “an approach that includes a technical solution for age verification at the operating system or app store level as the simplest, most effective and privacy-preserving way to verify age.”
When? According to Instagram, the rollout of these measures is already underway. “We plan to enroll teens in teen accounts in the United States, United Kingdom, Canada, and Australia within 60 days, and begin enrolling teens in the European Union later this year,” the company says. “We will bring teen accounts to other Meta platforms next year.” Those platforms include WhatsApp, Facebook, and Meta Quest.
Age debate. Instagram’s decision comes amid an intense debate about the minimum age for accessing social networks. In Spain, to give just one example, the government has approved a bill that would raise from 14 to 16 the minimum age for using social networks such as Instagram or TikTok. The law does not prohibit minors from using them; it restricts the processing of their data: a user under 16 will not be able to consent to a social network processing their data without the express consent of their parents.
The announcement also comes a week after the Australian government proposed banning social media for minors below a certain age, somewhere between 14 and 16, which has yet to be set. British Technology Minister Peter Kyle has likewise said he will be watching how the measure is implemented, with a view to his own country.
Image | Gaelle Marcel
The problem is that it is not harmonized… This is a major drawback. The minimum age for accessing social networks is not uniform worldwide, not even within the European Union: the GDPR lets each member state set the threshold anywhere between 13 and 16 years. In Spain it is 14 (for now), in France it is 15. This forces the networks to adapt to each country and, in short, creates confusion.
…and parents are not using the tools. All these parental control measures are great, but they depend on parents actually using them, and that is often not happening. Nick Clegg, Meta’s President of Global Affairs, acknowledged as much at a conference in London just a few days ago. As Clegg put it:
“One of the things we find is that even when we build these controls, parents don’t use them. So we have a behavioral problem: as an engineering company we build these things, and at events like this we say, ‘Oh, we’ve given parents the option to limit the amount of time their kids are online,’ and parents don’t use it.”
Image | Xataka
On Xataka | Privacy laws in Australia are laxer than in the EU, and Meta is taking full advantage of that to train its AI