More and more social media platforms are taking steps to improve protections for younger users. Instagram is the latest to move in that direction, recently announcing that the platform will soon require users to provide their date of birth if they have not already done so.

The requirement was introduced just two days before the UK begins enforcing its new “Children’s Code,” officially known as the Age Appropriate Design Code (AADC). The rules apply even to companies based outside the UK and come with hefty fines for services judged to have failed to protect underage users.

The DOB prompts are already appearing, and if you’re an Instagram user you may have seen them. For now, they are optional; if you don’t want to provide your date of birth, you can simply close the window. The day is coming, however, when you won’t have a choice. If you want to keep using Instagram, you’ll have to provide your DOB.

The change is part of a broader effort to make it harder for adults to contact teens and pre-teens on Instagram. The company is also monitoring user contacts and flagging certain adults as “potentially suspicious” if they have a habit of reaching out to minors on the platform.

In March 2021, Instagram announced that adult users worldwide would no longer be able to send a direct message to under-18s who did not already follow them. More recently, the company said that accounts belonging to children under 16 would now default to a higher privacy setting.

In addition to that update, the company is also changing how advertisers can reach young people. Starting in a few weeks, Instagram will only allow advertisers to target ads to people under 18 (or older in certain countries) based on their age, gender, and location.

“This means that previously available targeting options, like those based on their interests or on their activity on other apps and websites, will no longer be available to the advertisers,” the company said.


“These new updates represent important progress towards creating a safer, more private experience for young people on Instagram. In particular, using machine learning to understand when it might not be appropriate for an adult to interact with a teen puts teens in the driver’s seat as far as who they interact with, and defaulting teens under 16 into private accounts helps young people keep their content less visible to adults.”

—Larry Magid, CEO | ConnectSafely


These are good changes, and long overdue. Even privacy advocates, who are usually wary of providing more information to service providers of any kind, have generally applauded the announcement.

In any case, it’s encouraging to see more social media platforms taking concrete steps to protect minors. The internet is (or can be) a wild and dangerous place, and anything we can do to make it even marginally safer for our children has to be counted as a good thing.

Kudos to Instagram for joining the ever-growing list of companies embracing changes like this, a list that already includes social media and technology giants like TikTok, YouTube, and Google. While it will take time to measure the full impact and effectiveness of these changes, they are undoubtedly moves in the right direction that will make our kids safer.


Used with permission from Article Aggregator