Why California taking action to protect children online sends a strong message to the rest of the world

  • Written by Damian Cranney from Big Motive

    The need to improve online safety and privacy for children has been widely discussed in recent years. Elon Musk’s takeover of Twitter has brought the issue back into the spotlight, as he appears keen to loosen content moderation rules despite growing pressure to do the opposite. The State of California, however, has thankfully taken perhaps the most significant step yet towards making much-needed improvements a reality.

    The California Age Appropriate Design Code, or CAADC, was passed by lawmakers last month with the goal of protecting the State’s children from apps and digital services that have, until now, operated with impunity.

    The new bill requires organisations such as TikTok, Instagram and YouTube to improve their privacy and safety features for users under the age of 18. This includes measures such as introducing privacy-by-default settings and refraining from collecting users’ location data. The bill also requires companies to review their products and algorithms to determine how addictive they are and the potential negative impact they have on young users. The move represents a giant step towards making the digital world as safe as it can be for young people.

    Inspiring accountability for kids’ safety

    The bill draws on the UK’s Age Appropriate Design Code (AADC) and the Design Guidance service which Big Motive worked on with the Information Commissioner’s Office (ICO). 

    Through collaborating with designers, data experts and technology advisors, we created this service — also known as the Children’s Code Design Guidance — to help teams empathise with young users and better understand children’s rights. Its aim is to make apps and online services safer for children, and it has already led to changes in how teams create digital products and services for young kids as well as tweens and teenagers.

    It’s great to see that our work with the ICO is continuing to make an impact, and that its reverberations around the globe are inspiring policy changes and new guidelines such as California’s Code. We’ve also seen improvements in other parts of the world, with social media services in Singapore implementing measures this month to limit users’ exposure to harmful content as part of the country’s Online Safety Bill.

    We hope the ICO’s pioneering Children’s Code continues to inspire change in other places so that large corporations are held accountable for protecting children’s safety, rather than exploiting vulnerable young users in the relentless war for engagement. 

    The consequences of addictive design

    The digital world is more accessible and irresistible to younger generations than ever before, and kids are exposed to the internet at an ever younger age, opening them up to dangers many of us didn’t face ourselves.

    5Rights Foundation highlights that common features on apps nudge children toward risky behaviours, expose them to predators, recommend harmful material and encourage compulsive behaviour. Alarmingly, 75% of the top social media platforms use AI to recommend children’s profiles to strangers, and the average screen time for American teens has increased by 17% in the last two years to over 8.5 hours a day.

    Many tech companies design their products and services to be as addictive as possible, because economic incentives reward businesses that drive mass adoption of and engagement with content that is often published with little or no moderation. As the Center for Humane Technology highlights, many tech companies are driven by maximising growth and profits rather than the well-being of the people they are creating for.

    More and more organisations are facing the consequences of these choices in the form of fines and lawsuits. TikTok is currently facing multiple lawsuits from parents who say their children died of strangulation while attempting the ‘blackout challenge’ after the app showed them videos of other people trying it.

    In other devastating news, it’s been revealed that London teenager Molly Russell died from an act of self-harm in 2017 after the darker side of the online world overwhelmed her. According to The Guardian, of the 16,300 pieces of content Molly saved, liked or shared on Instagram in the six months before she died, 2,100 were related to suicide, self-harm and depression.

    It’s clear that the internet is not a safe place for children and young people and something needs to change. This should be a wake-up call to organisations worldwide to take children’s privacy and safety seriously when creating digital products and services.

    Thankfully, the introduction of the Children’s Code and the CAADC will help this change gather momentum.

    Progress towards a better future

    The UK’s Children’s Code has already led to a number of notable positive changes. 5Rights Foundation shared the following updates from tech companies that are making changes to their products as a result:

    • Google has made SafeSearch the default browsing mode for all users under 18
    • YouTube has turned off autoplay for under-18s
    • TikTok and Instagram have disabled direct messages between children and adults they do not follow
    • Google’s Play Store now prevents under-18s from viewing and downloading apps rated as adult-only
    • TikTok does not push notifications to children aged 13-15 after 9pm, or to 16-17-year-olds after 10pm

    The California Code will help similar changes to be implemented in the US. The new bill is a game-changer: some of the largest and most active companies in this space, many of which are based in Silicon Valley and the Bay Area, will have to rethink their practices.

    We believe and hope that this could be the start of a global movement towards better data protection for children. The UK and California Codes need to influence big tech to change, as well as inspire responsible tech startups to demonstrate best practice. There is much to do, but these are welcome signs of positive change towards a future where our kids are safer online.

