Parental controls for young people place the burden of Instagram's failings onto parents

Media Release

Meta’s announcement today of new parental controls fails to address the root causes of the harm Instagram does to young people. The changes give parents greater oversight, allowing them to set time limits, review followers and following lists, and see any harmful content their children have reported to Instagram.

But making parents responsible for keeping their kids safe on Instagram won’t fix the problems the platform causes. Parental controls alone are an inadequate response to a problem of this scale.

“Any features that help some young people to be safer, such as parental controls, are welcome. But they’re not enough and will not address the root cause of the issues. They can unfairly make parents seem responsible for Instagram’s own problems”, said Rys Farthing, Director of Data Policy at Reset Australia.

“For example, while Instagram is now enabling parents to set time limits for young people, they are also facing a number of lawsuits globally over their intentionally addictive design. Perhaps Meta should address their addictive designs, so that parents have to rely less on setting time limits. These sorts of controls can just push responsibility onto busy parents”.

“Given the money and power Instagram has to create a safer platform, compared to the busy lives of parents, this just feels a bit like gaslighting. There are lots of things Instagram themselves should also be doing to take the burden off parents, and to protect children and young people whose parents might not be able to use these tools. They need to hold themselves accountable too”.

“These tools could be helpful for children and young people whose parents have the capacity, time and understanding to support them, but not all young people are that lucky. There will be more than a few families in Australia where the children themselves are the most Instagram-savvy. Pushing responsibility for safety onto parents in these situations isn’t enough”.

Instagram needs to improve its platform, to make it safer and more private for young people in the first instance. But unless forced by legislation, the company is unlikely to take even the simple steps that could protect young people, for example:

  1. Preventing algorithms recommending harmful content to children’s accounts
  2. Enforcing their own guidelines so there’s less harmful content on the platform
  3. Preventing inappropriate advertising reaching young people
  4. Defaulting accounts for 16 and 17 year olds to private settings

Instagram is now 11 years old and it hasn’t taken any of these common-sense steps. Ultimately, we need strong regulations to ensure platforms are accountable for young people’s safety and privacy.

For comment and interviews, please contact Rys on 0490 875 958