“Can We Build It?” vs “Should We Build It?” The New Code To Protect Children’s Privacy Online
Recently, the Information Commissioner’s Office released a new code of practice that all tech and digital businesses will be required to follow.
Tech-based businesses will be required to design their services so that the protection of children’s rights and privacy is integral to the overall user experience. In their words, “[the] best interests of the child must be a primary consideration when designing their services”. In general terms, this means that any organisation with a digital presence must comply with the following:
- Assess and mitigate risks to children’s rights and freedoms when designing and developing their services, taking into account different ages, capacities and development needs.
- All rules of engagement, including the steps taken to protect a user’s privacy, must be concise, easy to find, and written in language appropriate to the child’s age.
- All user access settings must default to “high privacy”.
- Only the minimum amount of PII (personally identifiable information) required to use the service should be collected and retained, and it may only be shared if a justifiable reason for sharing can be demonstrated.
- Personal data must not be used in ways that have been “shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice”.
- Geolocation tracking features should default to ‘off’, provide an “obvious sign for children when location tracking is active”, and default back to ‘off’ after each session has ended.
- Parental controls must provide an “obvious sign to the child when they are being monitored”, and all parental controls will be written in the user’s age-appropriate language.
- Profiling must default to ‘off’, and only be allowed if there are “appropriate measures” in place to protect the child from any harmful effects, such as content that is detrimental to their health or wellbeing.
- All of the above must be upheld at all times, in accordance with the organisation’s published policies and community standards.
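To make the “default off / high privacy” standards above concrete, here is a minimal sketch of what a privacy-by-default settings object might look like. This is purely an illustration; the class and field names are invented for this example, and the code itself does not prescribe any particular implementation.

```python
from dataclasses import dataclass

# Hypothetical illustration of the code's default-settings requirements.
@dataclass
class ChildAccountSettings:
    privacy_level: str = "high"                # settings default to "high privacy"
    geolocation_enabled: bool = False          # location tracking defaults to off
    profiling_enabled: bool = False            # profiling defaults to off
    parental_monitoring_visible: bool = True   # child must see when they are monitored

    def end_session(self) -> None:
        """Geolocation must revert to 'off' after each session ends."""
        self.geolocation_enabled = False

# Example: a user opts in to location tracking for a single session.
settings = ChildAccountSettings()
settings.geolocation_enabled = True
settings.end_session()
print(settings.geolocation_enabled)  # False: back to the default after the session
```

The key design choice the code demands is that the safest value is the one users get without taking any action, and temporary opt-ins revert automatically.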
In The News – from the BBC
These changes apply to any kind of technological program that involves data collection and an online presence, including:
- Social Media platforms
- Mobile apps
- Connected toys and smart home devices (e.g. Amazon Alexa, Google Home)
- Online games, including consoles (Xbox, PlayStation) and hand-held devices
- Educational Websites
- Streaming services
While no organisation wants to be known as an unsafe place for its users, many are struggling to come to terms with this new policy. For some, it will mean a very big, very real shift in priorities: for decades, multi-billion-pound organisations have operated under the philosophy of “Can we build it?” rather than asking “Should we build it?”
Andy Burrows, the NSPCC’s head of child safety online policy, said the code would force social networks to “finally take online harm seriously and they will suffer tough consequences if they fail to do so”.
He said: “For the first time, tech firms will be legally required to assess their sites for sexual abuse risks, and can no longer serve up harmful self-harm and pro-suicide content.”
By forcing businesses to think more about the end-user experience and consequences of their actions, there will be more than a few boardrooms struggling to change their overall company mission and philosophy.
“For me, and more broadly for [Instagram], and also for the industry, this has translated into a lot of concrete changes in policy, in what shows up where on IG, and more investment in finding people who might be at risk, and those effects are real and they’re important” – Adam Mosseri, Head of Instagram
At StrawberrySocial, much of what we offer is brand risk management. This includes the active removal of inappropriate, potentially harmful and illegal content from our clients’ sites. We are excited about these changes and look forward to seeing how they will impact our day-to-day work. Yet we are also very aware that the need for human intervention (e.g. our moderators) will never go away.
Rebecca Fitzgerald, Founder and MD of StrawberrySocial:
“We have a combined experience of over 30 years working with kid-friendly, global brands, assisting web brands, online games, and social platforms with everything from child safety training, to product development, moderation, risk management and support.”
“What this new policy means is that companies will need to take a hard, long look at their overall business structure, and ensure the safety of children is now integral to every conversation – even before development begins. And this is where we can be a tremendous help – to both burgeoning companies and established brands.”
For more information on DPIAs (Data Protection Impact Assessments) and the new code, you can visit the official website of the Information Commissioner’s Office.