Written by Rebecca Walsh, Design Director at Big Motive.
Instagram has been handed a jaw-dropping €405 million fine by Irish regulators for mishandling children’s personal data and violating their privacy rights.
This is the third fine issued by Ireland’s Data Protection Commission (DPC) in four years, and certainly not the first time Instagram’s owner, Meta, has been at the centre of a safety violation. It comes just months after an NSPCC investigation found kids were able to explore virtual strip clubs and mix freely with adults, prompting the charity’s warning that the virtual reality metaverse is “dangerous by design”.
Although Instagram has stated that the inquiry focused on old settings it has since updated, it was still stung with a hefty fine. This should be a wake-up call for other tech organisations to take children’s safety seriously when creating apps and services. It may be Instagram today, but it could be your organisation tomorrow.
According to a new report by the Pew Research Center, the share of young people aged 13 to 17 who are online “almost constantly” has roughly doubled, from 24% in 2014-15 to 46% now, with another 48% using the internet “several times a day”. With this increase in online activity, it is more important than ever to ensure that apps and online services are safe for kids to use.
At Big Motive, we are passionate about keeping children safe online. We feel, as designers, that we can help to solve the problem and want to encourage others to do the same.
Last year, we worked with the Information Commissioner’s Office (ICO) to create design guidance for a new statutory code of practice called the Children’s Code. The code aims to make apps and online services safer for children and has already prompted changes from the likes of Google, Facebook and TikTok.
We hope our work with the ICO can help organisations learn from Instagram’s mistakes so they don’t find themselves in a similar position, and show them how to put children’s safety at the forefront, making the internet a better and safer place for all.
The registration journey
So, where to start? The “registration journey” is the process a user goes through when creating an account on an app or website. During its registration process, Instagram allowed children aged 13 to 17 to create business accounts, meaning personal data such as their phone numbers and email addresses was displayed publicly on their profiles.
To avoid this happening on your service, it is crucial to understand how you collect data across the user journey so you can identify when and how to communicate privacy information.
A service can collect a lot of personal information at sign-up, which makes it the perfect time to surface privacy information and tell users what their data is actually being used for. That way, they are aware from the start and can review the privacy decisions before creating an account.
Product teams can use “privacy moments maps” to visually plot important privacy moments onto the user journey and identify risks, questions and ideas. Use the outcomes to prioritise where to improve privacy in the user experience.
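To make this practical, here is a minimal sketch of how a privacy moments map might be captured as a simple data structure. Every name and field below is illustrative rather than a standard schema; the point is that each moment in the journey records what data is collected, who can see it, and what the team plans to do about it.

```typescript
// A minimal sketch of a privacy moments map as a data structure.
// All type and field names are illustrative, not a standard schema.
type Risk = "low" | "medium" | "high";

interface PrivacyMoment {
  step: string;             // where in the user journey this occurs
  dataCollected: string[];  // personal data gathered at this step
  visibleToOthers: boolean; // is any of it exposed to other users?
  risk: Risk;
  questions: string[];      // open questions for the team to resolve
  ideas: string[];          // design ideas that would reduce the risk
}

// Example: mapping a sign-up flow aimed at an under-18 audience.
const registrationMoments: PrivacyMoment[] = [
  {
    step: "Create account",
    dataCollected: ["email address", "date of birth"],
    visibleToOthers: false,
    risk: "medium",
    questions: ["How do we verify age without storing extra data?"],
    ideas: ["Explain in plain language why we ask for a birth date"],
  },
  {
    step: "Switch to business profile",
    dataCollected: ["phone number", "contact email"],
    visibleToOthers: true, // the kind of moment Instagram's flow got wrong
    risk: "high",
    questions: ["Should contact details ever be public for under-18s?"],
    ideas: ["Keep contact details private by default; require explicit opt-in"],
  },
];

// Surface the riskiest moments first so the team knows where to act.
const riskOrder: Risk[] = ["high", "medium", "low"];
const prioritised = [...registrationMoments].sort(
  (a, b) => riskOrder.indexOf(a.risk) - riskOrder.indexOf(b.risk)
);
console.log(prioritised.map((m) => `${m.risk.toUpperCase()}: ${m.step}`));
```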
Instagram might have avoided this fine if its design team had used a privacy moments map. They would have noticed the lack of information given to teens signing up for a business account, giving them the opportunity to make the necessary changes and ensure users knew what data they would be sharing publicly.
Protect children’s privacy by default
In this case, children and young people who set up business accounts on Instagram had their information visible to the public by default. To make sure that a child’s personal information is always protected, designers need to make privacy settings the strictest they can be by default.
Many children will simply accept whatever privacy settings you provide and never change them. This is why it is so important for the defaults you set to offer the highest level of protection. A high privacy setting means a child’s personal data is accessible to other users of the service only if the child actively changes their settings to allow it.
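As a sketch of what “strictest by default” can look like in code, here is one hypothetical settings object for a child’s account; the fields and values are our invention, not any platform’s real API:

```typescript
// Hypothetical privacy settings for a child's account.
// The safe state is the zero-effort state: every field defaults to the
// most protective value, and any loosening must be an explicit act.
interface PrivacySettings {
  profileVisibility: "private" | "friends" | "public";
  showContactDetails: boolean;
  allowMessagesFromStrangers: boolean;
  shareLocation: boolean;
}

const CHILD_DEFAULTS: Readonly<PrivacySettings> = {
  profileVisibility: "private",
  showContactDetails: false,
  allowMessagesFromStrangers: false,
  shareLocation: false,
};

// New child accounts start from the protective defaults; the user,
// not the platform, must take action to change any of them.
function createChildSettings(): PrivacySettings {
  return { ...CHILD_DEFAULTS };
}
```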
If a child does decide to change their privacy settings, think about ways to keep them protected. You could, for example, let a less private setting apply only for a limited amount of time before reverting to the default, to ensure the child’s safety online.
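One way a team might implement that time limit is to attach an expiry to any loosened setting so it snaps back to the protective default automatically. A rough sketch, with invented names and an assumed 24-hour window:

```typescript
// Sketch: a loosened setting carries an expiry, after which reads
// fall back to the protective default. Names are illustrative.
type Visibility = "private" | "friends" | "public";

interface TimedSetting {
  value: Visibility;
  fallback: Visibility; // the protective default to revert to
  expiresAt?: number;   // Unix ms; undefined means no override is active
}

const DAY_MS = 24 * 60 * 60 * 1000;

function loosenForOneDay(setting: TimedSetting, value: Visibility): TimedSetting {
  return { ...setting, value, expiresAt: Date.now() + DAY_MS };
}

// Called on every read (or by a scheduled job): expired overrides revert.
function effectiveValue(setting: TimedSetting): Visibility {
  if (setting.expiresAt !== undefined && Date.now() >= setting.expiresAt) {
    return setting.fallback;
  }
  return setting.value;
}

// Usage: a child shares their profile with friends for 24 hours only.
let visibility: TimedSetting = { value: "private", fallback: "private" };
visibility = loosenForOneDay(visibility, "friends");
console.log(effectiveValue(visibility)); // "friends" now, "private" after a day
```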
If a service can be accessed by multiple users from one device, where possible, allow them to set up their own profiles with individual privacy settings. This means children don’t have to share adult privacy settings when using a shared device.
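On a shared device, that might look something like the sketch below, where each profile carries its own settings chosen by age band rather than inheriting whatever the device’s adult user has set (all names are hypothetical):

```typescript
// Sketch: each profile on a shared device keeps its own privacy
// settings, with defaults chosen by age band. Names are hypothetical.
type AgeBand = "under13" | "13to17" | "adult";

interface Profile {
  name: string;
  ageBand: AgeBand;
  settings: { profilePublic: boolean; personalisedAds: boolean };
}

function defaultsFor(ageBand: AgeBand) {
  // Children's profiles always start from the most protective values.
  return ageBand === "adult"
    ? { profilePublic: false, personalisedAds: true }
    : { profilePublic: false, personalisedAds: false };
}

function addProfile(device: Profile[], name: string, ageBand: AgeBand): Profile[] {
  return [...device, { name, ageBand, settings: defaultsFor(ageBand) }];
}

// A parent and child on the same tablet keep entirely separate settings.
let device: Profile[] = [];
device = addProfile(device, "Dana (parent)", "adult");
device = addProfile(device, "Sam (11)", "under13");
```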
Think about parents/guardians
It’s also important to think about parents during the children’s user experience. Children using a product for the first time may not know enough about the service to make confident decisions about sharing data. They may need to seek help from parents and carers to understand privacy information or give consent.
Design privacy information that is easy for parents to find and understand, with support available if they are unsure. Create resources to help parents discuss privacy with their children, and support parents in giving consent on a child’s behalf or helping the child make decisions.
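One hedged sketch of what that consent support might look like under the hood: a simple state machine that keeps a child’s data locked down until a parent responds. Everything here is illustrative, not a description of any real service:

```typescript
// Sketch of a parental-consent step in a child's sign-up flow.
// States and field names are illustrative.
type ConsentState = "not_requested" | "pending" | "granted" | "declined";

interface ConsentRequest {
  childAccountId: string;
  parentEmail: string;
  state: ConsentState;
  requestedAt?: number; // Unix ms timestamp
}

function requestConsent(childAccountId: string, parentEmail: string): ConsentRequest {
  // A real service would send the parent a plain-language explanation
  // of exactly what data is collected and why, alongside this request.
  return { childAccountId, parentEmail, state: "pending", requestedAt: Date.now() };
}

// Data-sharing features stay off until, and unless, a parent agrees.
function canEnableFeature(consent: ConsentRequest): boolean {
  return consent.state === "granted";
}
```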
The future of child safety
Young users become adult users, so technology companies are unlikely to ignore this audience any time soon. The Children’s Code is just the start of a process that could establish a more responsible standard for design teams everywhere. To achieve the long-term change that leads to better practice for young, vulnerable users, we, as designers, must understand the power at our fingertips and our collective responsibility to leverage design for good, so that we can protect this generation and the next.