Family First continues to support efforts to enable parents to better protect their children from online harms.
The research is clear, as is the experience of parents, that young people (notably those under 16 years of age) are being detrimentally impacted by social media. Research by the likes of Dr Jonathan Haidt clearly shows the correlation between social media use and the growing anxiety, depression, stress, and isolation of young people. Importantly, he and other researchers argue that this harm is not mere correlation but is being caused by the use of social media itself. Add to this the violent, pornographic, and abusive material that young people can too readily access, and the issues compound.
Family First wants to see how parents can be better empowered to protect their children through existing dynamics and technologies that require little governmental involvement.
Identification and age are already required for accessing the likes of Apple’s App Store and Google Play. As has happened in Utah, Texas, and other US states, it is already possible to ensure that young people do not have access to various social media apps at the app store level. This, in turn, removes the need for each social media platform, such as Facebook, Snapchat, or TikTok, to age-verify each user. Put simply, the young person cannot download the app in the first place.
Related to this is linking a child’s app store account(s) to their parents’ accounts, with the latter required to provide consent before apps can be downloaded. It is already widely recognised in law that children cannot consent to legal contracts. Yet, when it comes to accessing internet apps, children are signing off pages of contractual agreements that even adults don’t read or understand. Furthermore, under current New Zealand law, a person must be at least 18 years old to sign most contracts (with a few exceptions, such as employment agreements, which a sixteen-year-old may sign).
Family First is also aware that there exist many options to verify a person’s age without requiring them to share a driver’s licence, passport, or similar document. This can involve the use of AI to estimate age from your face or even your hand. It is essential to note that this technology does not match your face against a previously stored record of you, but rather compares the features of your face or hand to what AI models know about human development.
While this may concern many, most social media and big tech companies already hold an extensive array of data on every person (some suggest up to 90 data points), which readily identifies who a person is, their age, interests, and so forth. All to say, ‘Big Tech’ already has the data it needs to identify a person’s age to a high degree of certainty.
Family First believes that Big Tech does have a responsibility to act in ways that support parents and protect children. Suggestions that parents alone should fix the problem ignore the reality of the situation and the harm facing children. Just as we age-verify and protect children in relation to alcohol or online gambling, the same dynamics apply to wider social media access.
We are conscious that some argue that any move to regulate under-16s’ access to social media impinges on freedom of speech. Rights are rarely absolute, and in this case, the rights of parents to guide their children, and of children themselves to be safe from violent, abusive, and pornographic material, are important. Very few would argue that fences around pools should be removed because of a child’s right to free movement.
There are also suggestions that some children will get around any regulations or attempts to limit their access to social media. This is undoubtedly true, but it is not an argument to do nothing. As above, there are tools and approaches that can be used and will be effective for many. As Baroness Beeban Kidron from the United Kingdom has also noted:
“… those children who are using VPNs understand they are transgressing — an important change from allowing them unfettered access to adult content. The restriction sets a new cultural norm that says this content, which damages their emotional and social development, is not ok. It is better that a small number of children consciously take themselves where they should not be, than that all children are inundated with adult material whether they want it or not.”[1]
Much is often made of the phrase “putting children at the centre” of policy development or societal consideration, and yet in the space of protecting under-16-year-olds from the many known harms of social media, nothing is currently being done. Family First stands with parents who want some tools to assist them in protecting their children online.
This does not require government overreach or digital IDs, but instead regulatory pressure on Big Tech to take responsibility for their platforms, alongside the use of existing technologies to ascertain age. Together, this will enable the collective response of families and communities that Dr Jonathan Haidt notes is the best way forward in protecting young people.
[1] https://www.afterbabel.com/p/the-uk-is-doing-the-hard-work-of
*Written by Family First staff writers*




