UK's Ofcom says one-third of under-18s lie about their age on social media
Companies like Instagram are getting heavily fined (and publicly raked over the coals) over how they have mishandled children's privacy on their platforms. But if a recent report from Ofcom is accurate, maybe they are getting off lightly.
The U.K. media watchdog is publishing research today that found that one-third of all children between the ages of 8 and 17 are using social media with a falsified adult age, mainly by signing up with a fake date of birth.
It also noted that social media use among these younger consumers is extensive: of 8- to 17-year-olds using social media, some 77% are using services on one of the larger platforms under their own profile. Even in the youngest bracket of that group, aged 8 to 12, 60% have accounts under their own profiles (the rest, it seems, use their parents').
Up to half of those underage users signed up on their own, while up to two-thirds had the help of a parent or guardian.
The three pieces of research, commissioned by Ofcom from three separate organizations -- Yonder Consulting, Revealing Reality, and the Digital Regulation Cooperation Forum -- arrive as the U.K. pushes forward with the Online Safety Bill.
Years in the making (and seemingly still being altered with each changing political tide in the country), the bill is expected by Ofcom to be ratified finally in early 2023. But its mandate is a tricky (if not potentially self-contradictory) one, aiming to both "make the UK the safest place in the world to be online" while also "defending free expression."
In that regard, the research Ofcom is publishing could be read as a cautionary signal of what not to overlook, and of what could easily tip into mismanagement if handled poorly, regardless of which platform those younger users are on at the moment. But it also highlights the case for taking different approaches to different kinds of over-18 content.
Ofcom notes that even within the area of children and digital content, there seems to be a fundamental gray area in adults' perceptions: some content aimed at "adults," such as social media and gaming, is seen as relatively "less risky" than other adult content, like gambling and pornography, which is always inappropriate for underage users.
The former is more likely to rely on simple verifications, which are easy to skirt. Parents and children, the research found, were more inclined to favor "hard identifiers," such as ID verification, for sites focused on gambling, porn, and other material already illegal for minors.
The choices parents are making to help their children skirt age requirements also highlight just how entangled digital platforms have become in young people's lives, and how good intentions can land badly.
Ofcom said that parents noted that in cases where they viewed content as "less risky" -- such as on social media or gaming platforms -- they were balancing keeping children safe against both the peer pressure their children faced (not wanting them to feel left out) and the idea that, as their children grew older, they wanted them to learn how to manage risks themselves.
But that is not to say that social media is always less risky: a recent court case in the U.K. investigating the death of a teenage girl found that self-harm and suicide content she found and browsed on Instagram and Pinterest were factors in her death. That raises the question of how sites like these police such content (or don't, as the case may be), and what approaches they take to steer users away from it while continuing to allow it to be posted for others to see. And given that the proportion of children lying about their age could actually be higher than one-third, and that a child who lies about their age at 8 to get online is still only 13 five years later, aging out of the problem can, disconcertingly, take years.
All of this means that the aim of keeping freedom of expression intact may well increasingly be put to the test. Ofcom notes that it is coming up on its first full year of regulating video-sharing platforms. Its first report will focus "on the measures that platforms have in place to protect users, including children, from harmful material and set out our strategy for the year ahead."