Social media ‘dark patterns’ nudging Australian kids into shady data harvesting
The report, Did we really consent to this? Terms & Conditions and young people’s data, ranks how easily a minor could understand the Terms and Conditions of popular video streaming, online gaming, messaging, and social media services. Analysing the use of language, design, and dark patterns, which nudge people towards specific decisions, Reset Australia scored each platform out of 5 stars. The highest score was 2.5 stars out of 5, for Epic Games, with two platforms, TikTok and Spotify, scoring 0. See score table below.
“It’s nearly impossible for kids to opt out of data collection. Complex and opaque Terms and Conditions mean young people have even less opportunity to meaningfully consent to how their data is collected and used,” said Dr Rys Farthing, Reset Australia’s Children’s Data Policy Director.
“These apps are designed to be easy for young people to use, but when it comes to disclosing how data will be collected and stored, suddenly they become very difficult to understand. ‘Dark patterns’ are nudging kids to agree to terms and conditions, without making any effort to explain them coherently.”
Reset Australia, which advocates against digital harms, found the Ts and Cs of nine of the 10 surveyed apps required a tertiary-level reading age, and on average would take one hour and 46 minutes to read.
“To put this into perspective, TikTok’s terms and conditions run the length of two novels, or about 6 hours of reading at a university level. If all two billion people who use TikTok read the full terms and conditions, it would take 1.24 million years of effort,” Dr Farthing said.
Working with YouGov, Reset Australia polled 400 16- and 17-year-olds and found only 7% of young people were confident they understood the terms and conditions they had ‘accepted’, and only 4% read them all the time.
“Surveyed young people told us they don’t understand what they’re signing up for, and want to see better recognition of their data rights,” Dr Farthing said.
On reading the report, Elizabeth Handsley, professor of law at Western Sydney University and president of the Australian Council on Children and the Media, said:
“None of us should be surprised by the evidence in this report of self-regulation failing children. Still, we need to compel the industry to act in the public interest.
“It’s not too hard to simplify terms and conditions, and we shouldn’t accept overly complex ones as the price we pay for free platforms and services.
“We can create an internet where the rights, needs, and interests of children are properly recognised and attended to,” Professor Handsley said.
Reset Australia is calling for a data code for children and young people under 18 years old, so their data is only collected and processed in ways that are in their best interests.
“Social media and digital services often don’t respect children’s privacy or rights. We shouldn’t leave it up to tech companies to decide what they can and can’t do; we need some ground rules so they’re compelled to prioritise children’s rights,” Dr Farthing said.
“Australia needs a regulatory code governing how children and young people’s data is collected and used. Other countries have already implemented or proposed similar codes, including the UK’s Age Appropriate Design Code, and Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing.”