The Children’s Code sets out 15 standards organisations must meet to ensure that children’s data is protected online. The code will apply to all the major online services likely to be accessed by children in the UK and includes measures such as providing default settings which ensure that children have the best possible access to online services whilst minimising data collection and use.
Michael Murray, Head of Regulatory Strategy at the Information Commissioner's Office, sets out what this new Code will mean for children and service providers.
This webinar is suitable for anyone who provides or markets their services in Northern Ireland, particularly those whose target audience includes children and young people. This event will also suit teachers and principals in schools and colleges of further and higher education. Parents may also find Michael’s presentation of interest.
The Recording
Transcript
Rolanda: Good morning, everyone, and welcome to our webinar with Michael Murray, Head of Regulatory Strategy with the Information Commissioner's Office. My name is Rolanda Markey and I'm part of the Learning & Development team at Legal-Island. I'm joined this morning by my colleague, Alison, who is looking after the technology in the background. So thank you all for joining us on Zoom this morning. Makes a wee change from GoToWebinar.
Just to tell you a wee bit about Michael, Michael, as I said, is Head of Regulatory Strategy, working within the Regulatory Futures Directorate at the Information Commissioner's Office. That is quite a mouthful.
He leads on policy development related to the Information Commissioner's Children's Code, which is formally known as the Age Appropriate Design Code, and on engagement with information society services, regulators, and civil society organisations in the UK and internationally.
He also leads the development of lesson plans and materials to support schools and parents to better understand data protection issues and the role of the code in protecting children and young people. And I'll drop a wee link to those lesson plans into the chat because I found those yesterday when I was having a wee look.
The Children's Code itself sets out 15 standards that organisations must meet to ensure that children's data is protected online. The code will apply to all major online services likely to be accessed by children in the UK, and includes measures such as providing default settings that ensure children have the best possible access to online services whilst minimising data collection and use.
Michael sets out what this new code will mean for children and service providers. And the webinar is suitable really for anyone who provides or markets their services in Northern Ireland, particularly those whose audiences include children and young people. The event will also suit teachers and principals in schools and colleges of further and higher education. And parents may also find Michael's presentation of interest.
Now, if you have any questions for Michael, please pop them into the chat, and we'll deal with those at the end of his presentation.
So thank you very much for your attention. Over to you, Michael.
The Children's Code
Michael: Thanks, Rolanda. A great introduction. Probably given half the information I was going to give in my presentation already, but that's no problem. Thank you. Thanks for inviting me to talk to colleagues in Northern Ireland.
Children represent about 30% of the UK's online users, but the internet was not designed for them, nor has it consistently been designed with solutions to reduce the risks that children face online.
These risks cover a whole variety of things: cyberbullying and trolling, inappropriate content and contact with adults, the wide sharing of children's personal data and information with other children and with adults, and targeted advertising.
Kids' data is collected from an early age. In some cases, before they're even born.
And apologies for the dog outside, she doesn't like it when I'm on presentations.
By the time a child is 18, they will have received tens of thousands of targeted adverts. Online services will know their name, their age, their sex, their date of birth, their favourite football team, their email address, where they live, who they bank with, where they go to school, what kind of trainers they like to buy.
For some kids, online services will know also about their health, their sexuality, their political beliefs, and their willingness to pay for certain products.
All this profiling can have an enormous impact on children's opportunities, their reputations, their finances, and their relationships.
So kids are often most at risk in their own rooms with the doors shut at home and parents just a door away, but a digital world removed from what their kids are doing.
The Children's Code is designed to shift the balance of responsibility for keeping children's data safe online away from parents, who are often overstretched or unaware of the role that data plays in their kids' lives, and away from children themselves, and onto online services and the companies that run them.
The Children's Code or the Age Appropriate Design Code, as it is formally known, as Rolanda said, is a set of 15 standards of age-appropriate design for online services which process children's data. It's designed to help those online services to better design their services, to make sure that they are child-friendly.
The code is grounded very much in the provisions of the United Nations Convention on the Rights of the Child, the UNCRC, and the best interests framework within that convention. And it creates an open, transparent, and safer place for children to play, explore, and learn online.
It's now been six months since the end of the transition period for the code, and it has now entered its supervision phase. So it's a great opportunity to share some of the learning we've had from online services to date, and to talk about what's working and what some companies are finding difficult, for example.
Who Does the Code Apply To?
This slide shows a little bit more about who's in scope of the code. And there are a couple of really important definitions here. So the code applies to relevant information society services that are likely to be accessed by children. And here, we're using the term "children" to mean anybody under the age of 18.
So that's different from similar regimes operating, for example, in the United States, where COPPA is relevant to children under 13. Thirteen is also clearly one of the important ages in the GDPR. But the parliament in the UK decided that the code would apply to anybody who is under 18 and a child as defined by the UNCRC.
Information society services are any service normally provided for remuneration at a distance by electronic means and at the individual request of the recipient of the services.
So by remuneration here, it's important to recognise that that does not just mean services where the child or their parent is paying the service directly, for example through an in-app purchase. It also applies to any service that is monetised and provided for children. So for example, a service that relies on advertising for its monetisation would fall within scope of the code as an information society service.
Likely to be accessed is defined as more probable than not. And here, we're not going to give any particular number about what percentage of children you would need to have in order to qualify. Basically, if you think children are using your service, you're more likely to be in scope than not.
And just to say that "likely to be accessed" was something that the UK parliament was keen to apply to the code, and it's a change from previous regimes, which looked at services targeted at or aimed at children.
We were keen to ensure that the parts of the internet children actually use, and not just the services supposedly aimed at them, are covered by the code. So services that are targeted at adults but where children use the service would also fall into the likely to be accessed category.
You can go to the next slide.
What Role Does The Code Play in Law?
The code is on a statutory footing, which means that it is rooted in existing data protection laws, largely introduced by the Data Protection Act in 2018. It mostly links over to the UK GDPR, but to other data regulations as well. So when we are assessing conformance with the code, we are doing so looking at conformance with the UK GDPR.
The code is a useful tool to help organisations meet the requirements of their underlying data rights responsibilities and their GDPR responsibilities. Again, it sets out how to design services for kids rather than telling you what not to do. It's much more design guidance about the best way forward. What does good look like?
Like I said, it's on a statutory footing, which means that the ICO and the courts must take the code into account where relevant. So if we receive, for example, a complaint or there's a data breach, then my supervision colleagues will look to see that these services that are likely to be accessed by children have taken the standards of the code into consideration.
Services that do not conform to the code will find it very difficult to show that they're processing children's data fairly, and will find it very difficult to show they comply with data protection requirements.
And just to confirm, failure to comply with the data protection requirements that underpin the code could potentially open nonconforming companies up to the full range of statutory powers that the ICO has from the information notices all the way up to fines.
Next slide, please.
The 15 Standards of the Code
This slide shows the 15 standards of the code. There are some basic standards that come from the GDPR, some good practice standards, and some wider policy standards. They're all based on the best interests of the child, which is defined in the UNCRC.
Interestingly, these are the 15 standards that the ICO has designed for the code. Most of these standards you can also now find in the fundamentals document that the Irish Data Protection Commission has released recently, and they are similarly found in data protection regulations and guidance released in Sweden and in the Netherlands recently.
So the ICO has been at the forefront of work on children's privacy. And the code was the first of its kind, but it's not the only one of its kind now. We also heard recently that California is looking at adopting the Age Appropriate Design Code into California law.
So it's a direction of travel for online services globally. We've seen already some good changes, some good practice changes happening in the UK that are being applied across Europe. And we'll hopefully continue to see that happening on a global level as more data protection agencies adopt these measures and more similar measures.
Again, the best interests of the child is the fundamental standard. All services need to consider the best interests of the child and make it a primary consideration in the design of their services. And everything else underneath relates back to that best interests standard.
Let me get the next slide, please.
So I talked a little bit about the code and where it's at. It's in supervision at the moment and it is also . . . Well, the code is not, but the principles of the code are also being applied more internationally.
The bigger picture is this code in the UK is complementary to but is distinct from the new online safety regulations that were laid in parliament last week.
The ICO is working closely with OFCOM and DCMS on the online safety regulation. The Children's Code for the ICO is very much a data processing code whereas the online safety regulations focus much more on content.
Now, many children and parents might not quite understand the differences between data and content, so we're working closely with colleagues in OFCOM to help industry and users of the internet understand how these two regulations will work together.
The code applies to services established in the UK and outside the UK for those who are targeting UK users. So for example, if a U.S. social media company has a privacy policy that mentions the UK, then that would be targeting UK users, so they would be in scope of the code.
Because so many organisations could potentially fall within scope, we estimate that upwards of 200,000 online services could meet the likely to be accessed test. We are focussing our attention where the risk is highest. So the ICO is taking a risk-based and proportionate approach to regulation, looking for where the risk to children is greatest and looking initially at the sectors where children are most often found.
So at the moment, the three sectors that we're looking at initially for supervision are the games industry, social media, and streaming services. However, that being said, we are continuing to look at other services and how the code is being applied across online services.
Can we get the next slide, please?
Why is this Code Important?
So my introduction talked a little bit about why the code is important, but here's some more information on why this is so important to kids.
So we know that 97% of 5- to 15-year-olds use online services for learning, entertainment, and social interaction. But the internet is not designed for them. So children are creating digital footprints at a very young age, in some cases starting with services before they are legally allowed to use them, before they meet the terms of service of some of those services.
So for example, some social media will have a minimum terms of service age of 13 or 16. Many children are lying about their age to get to services that they want to see.
Our research showed that just over half of young people have admitted to lying about their age. And that number is probably slightly higher for the 11-to-12 bracket, or for the 15-to-17 bracket, who are interested in services they might not otherwise be able to access because the terms of service set a minimum age of 13 or 18.
So once a child lies about their age, that age will often follow them through the internet. For example, a child might log into a social media site, or into Google or Apple or Meta, as a 13-year-old even when they're 11, and oftentimes those social media logins are then used for games and other services. So lying once for one service can often have on-going consequences for children and allows them to access services that they perhaps shouldn't be accessing at that age.
Interestingly, we also asked parents whether or not they allow their children to access sites with age-inappropriate content, and over a third said yes. If I go to the next slide, we can explore this in a bit more detail.
Our research also looked at what parents are doing about age restrictions on websites. And again, just under half of parents say they do not allow online access to sites that their children are too young for. However, that means that over 50% do.
But half of those believe that their child is mature enough, old enough to use the site. And that might be, for example, being able to access the adult YouTube or the main YouTube rather than YouTube Kids for educational purposes.
However, a good percentage, 13%, allowed access to services just to stop their kids pestering them. And most of us probably can understand a child wanting to get access to a popular game or to a social media site to allow them to communicate with their friends at school, share photos, and so on.
Six per cent had this sort of fear of their children missing out, and another 6% said they did not have time to review the sites that their kids are looking at.
So that's over 50% of parents admitting that their children get access to services that, according to the terms of service, they shouldn't be accessing.
So why is the code important? It supports children's rights to wellbeing and development as part of the UNCRC, and it protects children from intrusive profiling and data sharing.
It's basically pushing the responsibility for keeping kids safe away from the kids, so that they don't have to lie about their age; away from parents, so they don't have to worry about age assurance in the same way; and back onto the services that are designed for kids.
If a service is able to apply all 15 standards, then our view is that there's less incentive for children to lie about their age to gain access.
Also, the code is designed to prevent some of the more dangerous and higher risk activities that children might get involved with. And the key here is having children's privacy set at high by default and having both geolocation and profiling set off by default initially when a child is using those services.
That should prevent the behavioural advertising that I mentioned earlier, and it should prevent people knowing the exact location of a child, avoiding some of the dangers that are associated with that.
We are not saying that profiling is never possible with children, because in some cases some children might want to receive ads. They might want to get profiled content recommendations. But the choice of whether or not to turn on profiling and geolocation, for example, should be with the child and not with the online service itself.
And transparency notices, privacy notices, terms of service, and so on should be written in a way that makes sense for children, in language that children are able to understand, and delivered at places where children need to make a choice.
So if a child wants to change a profiling setting, they should be able to receive some information, in language they can understand, about the risks associated with profiling before they decide to change it.
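To make that concrete, here is a minimal sketch, assuming a TypeScript-based service, of what high-privacy defaults and a just-in-time notice might look like. All the names here (ChildSettings, requestSettingChange, and so on) are illustrative assumptions, not anything from the ICO's materials.

```typescript
// Hypothetical sketch only: privacy high, geolocation and profiling off
// by default, with an age-appropriate notice shown at the point of choice.
interface ChildSettings {
  privacyLevel: "high" | "standard";
  geolocationEnabled: boolean;
  profilingEnabled: boolean;
}

// A code-aligned starting point for a child's account.
const defaultChildSettings: ChildSettings = {
  privacyLevel: "high",
  geolocationEnabled: false,
  profilingEnabled: false,
};

// Before turning profiling or geolocation on, show a short explanation in
// child-friendly language; showNotice resolves true if the child confirms.
async function requestSettingChange(
  settings: ChildSettings,
  setting: "geolocationEnabled" | "profilingEnabled",
  showNotice: (text: string) => Promise<boolean>,
): Promise<ChildSettings> {
  const confirmed = await showNotice(
    "Turning this on means we collect more information about you. " +
      "You can turn it off again at any time.",
  );
  if (!confirmed) return settings;
  const updated = { ...settings };
  updated[setting] = true;
  return updated;
}
```

The point of the design is that the service, not the child, carries the burden: the safe state is the default, and the riskier state is an informed, reversible opt-in.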
The Role of Schools
Rolanda mentioned that schools might be interested in the code as well, and we agree. Our research has found that schools play a very important role in raising awareness of data protection and the code with kids and families.
Schools, as all teachers and parents will know, are more than just a school. They are a core part of the community and support learning for all ages. So we are working with schools and education authorities across the UK to try to help build digital literacy in children and in parents, and to help children and parents understand what their data protection rights are.
It's difficult for parents or kids to know what is right or wrong, what they can or can't complain about, if they don't know what their fundamental rights are.
So we are working with schools and designing materials and lesson plans to help with PSHE or other learning in the classroom about data rights. We're also providing guidance to schools about how they buy ed tech, the risks associated with buying ed tech, and where the responsibilities for the code might fall between the school and the ed tech provider.
We've got some FAQs on our website for those who are looking for more information on that.
But basically, the code is a gold standard for organisations to consider when using children's and pupils' data. That would include online services that are provided through schools to children.
So even though the Children's Code might not apply to all ed tech services provided within schools for that core teaching role, those services would still fall under the UK GDPR. And the code is a great way for schools and ed tech organisations to see that the services are designed with kids in mind.
So we mentioned some of the resources that are already on the ICO website, and I think Rolanda mentioned that she's going to put a link into the chat. Thanks for that, Rolanda.
We have on the website at the moment some resources that were created in the midpoint of last year that are focussed on data rights and the GDPR.
There are primary and secondary lesson plans and workshops associated with the materials, which have been designed so that they are relevant to the educational needs of all four countries of the UK. So there will be a Northern Ireland-specific area within the schools resources pages for your reference.
We will also be releasing some additional lesson plans tomorrow, so take a look at our website and you'll see some code-specific lesson plans that will add to the resources currently available on the site.
So that's a bit of an outline of the code. What I want to do now is to switch over to what's happening with the supervision.
Supervision
The code represents a bit of a break from how the ICO has supervised data protection in the past. Whereas in the past we'd focussed much more on complaints and data breaches, with the code we are going to take a much more proactive approach to supervision.
That being said, the data breaches and the complaints are still important, and we will continue to respond to issues that are raised with us through complaints or breaches, and we'll look to see how the organisations that feature in complaints or data breaches are implementing or conforming with the code.
However, we've not received any complaints from kids yet about the code, and it's probably unreasonable to expect children to take the initiative to complain about something that they don't understand.
So the ICO has taken a proactive approach to code supervision, initially focussing on the three sectors I mentioned already, social media, games, and streaming, and writing to those organisations, engaging with them to understand how they are implementing the code at the moment.
So we've written to about 49 organisations in those three sectors already, and we'll be writing to a few more as we are made aware of organisations that might not be conforming.
In addition, we have colleagues in our assurance teams who are offering voluntary audits to organisations we feel need a bit more of a deep dive, or to organisations that have approached the ICO directly seeking some testing of how they're conforming with the code.
We focussed on those three sectors immediately, but they're not the only sectors we're concerned about. And we will continue to be looking at other sectors in the future when we have capacity and the work on the current three is advanced or complete.
We are also developing and influencing policy and practice related to age assurance through the Commissioner's opinion on age assurance, and we are reviewing certifications conducted by the Age Check Certification Scheme. So companies who want a certification on the Children's Code or on their application of age assurance can go to the Age Check Certification Scheme, which has been approved by the ICO.
So there are lots of different avenues for organisations to engage with the ICO voluntarily, but also for us to take the lead and really press organisations on what they're doing to conform.
Go to the next slide. Thanks.
The Impact of the Code
So how has it impacted industry to date? Well, it depends on the size of the company, to be honest. Medium to larger companies are fairly well aware of the code, and quite a lot are already doing things and making progress towards conformance.
What we found is that micro-companies, sole enterprises, and small businesses are less aware of the code. And it's not a surprise, because these are companies that are probably less aware of regulation more generally, and throughout the COVID pandemic they have probably been focussed much more on growing the business, on surviving the pandemic, and so on.
So it is part of our ambition for the code this year to increase those numbers, to make smaller enterprises more aware of what the code means to them, and to raise awareness within those companies of some of the resources we have to help them conform with the code.
About 40% of companies have already started making changes to their under-18 business. Some of them as a result of the code, some of them just because they thought it was a good thing to do.
These tended to be focussed on things like reviewing their data protection impact assessments, reviewing their policies, redrafting some of the information to make it more appropriate for children.
We've found that fewer businesses than we anticipated are facing costs associated with the code, because most of the smaller to medium-sized businesses have been able to make the changes I just mentioned internally.
We do see some costs associated with the larger businesses that need to redesign or make some changes to their online services, for example. But the overall cost implication of the code has been lower than we had anticipated when we did an impact assessment in 2020.
Really encouragingly, two-fifths of businesses that are aware of the code already see opportunities from implementing it. These are opportunities related to increased customer trust and confidence in their service, profitability, and also their reputational benefits of providing a safe product and service.
So what have we learned to date? Well, from those first 49 organisations, the first thing is a general lack of understanding by industry of who uses their services and whether their services are in scope of the code.
We've had quite a few companies say, "Well, we're not aimed at children, so we're not in scope". And we've had to explain again that the likely to be accessed test applies. If kids are likely to access their services, then they will be in scope.
So one of the things they should be doing is getting a better understanding of who their users are, and recognising that even though they might target adults, if children are using their services, then they should be looking at the code and at how they design services to protect children's data.
Businesses also don't have a full understanding of what data they are collecting and sharing with their partners and other data controllers. We've talked to organisations that design or provide businesses with website services, with commerce services, to try to raise awareness of the code, for example.
But there are still on-going issues with data being shared with advertisers, with analytics providers. Where you have a social media sign-on, for example, as a login, data is being shared with social media services. Where there are SDKs or other plugins or games engines, for example, there are opportunities for data sharing, data leakage out from online services to those services as well.
And we're really keen for services to look at their data maps and determine where data is being processed, especially where children's data is being processed, and to identify whether that processing is essential. Are there opportunities to reduce the data processing, to minimise the data being collected, to reduce the data being shared, to look at how long data is stored, and to put purpose limitations in place as well?
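As an illustration of the kind of gating this implies, here is a minimal sketch, assuming a TypeScript service and a hypothetical third-party analytics SDK; the interface and names are assumptions made for the example, not a real library.

```typescript
// Hypothetical third-party analytics SDK; real SDKs will differ.
interface AnalyticsSdk {
  track(event: string, payload: Record<string, unknown>): void;
}

interface Session {
  isChild: boolean;
  profilingEnabled: boolean; // off by default for children
}

// Share data with the third party only when the child has actively turned
// profiling on, and minimise the payload to what the purpose needs.
function trackScreenView(sdk: AnalyticsSdk, session: Session, screen: string): void {
  if (session.isChild && !session.profilingEnabled) {
    return; // no non-essential sharing of children's data
  }
  // Data minimisation: send the screen name only, not location, device
  // fingerprints, or anything else the purpose does not require.
  sdk.track("screen_view", { screen });
}
```

The same gate applies wherever data can leak out: social sign-ons, plugins, games engines, and advertising SDKs.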
We've noted that about 40% of companies have started to work on their DPIAs, but some of them have been slow to get started.
There was a feeling in industry that 2 September 2021, when the transition period ended, was not a hard deadline for the code. So they kind of started working on things at the end of the transition period, even though the ICO has made clear that from 2 September 2021 the code is in full effect, and we will be expecting online services to be conforming with it.
There is still a widespread reliance on self-declaration for age assurance. The issue pertains largely to two particular ages. So the 13 age and the 18 age are the two probably most difficult for online services in relation to the code.
Children under 13 require a parent or guardian to consent to their access to websites and online services. So where services rely only on self-declaration and children under 13 are able to get access, those services are processing children's data unlawfully under the UK GDPR.
So we're trying to work with organisations to better understand how they can improve their self-declaration process and other measures that they might want to bring into use with age assurance.
The other age bracket is 13 to 17. A 13- to 17-year-old does not require parental consent to access a service, but they are often accessing services that they're too young for, gaining access to things like online dating sites or other adult services that are not appropriate for them.
So again, self-declaration is often used in games or social media designed for older audiences. And there have been tensions and issues around whether or not companies are meeting their own terms of service and community standards, which is one of the code standards that organisations need to conform with.
So one of the pleas for today is to take a look at what form of age assurance you are using, and to make sure that it is appropriate to the risks on your service.
There's also an on-going reliance on profiling for monetisation, for in-app purchases and other online sales, for advertising, and for non-core services. So again, profiling can be used if it's core to the service. If it's not core to the service, you should be switching it off by default and giving the child the opportunity to switch it back on if they'd like.
So we spent about a year in the transition period designing support materials for online services to access, to help you to conform. And these still remain online. So there is a Children's Code hub in the "For Organisations" section of the ICO website. And there's also an SME web hub, an innovation web hub as well, that provides further support to small business, and the ICO Sandbox for organisations looking for additional targeted help with product developments.
We have more information on the Children's Code hub on the harms that might be experienced by children and on how you can show that you are conforming with the best interests framework. We have some guidance there, and there will be some new guides released next week on the best interests framework to help organisations navigate what could be a bit of a complex set of information to apply to your services.
There are also DPIA templates and examples for mobile games, connected toys, and online retail to take a look at to help you to improve your own DPIAs. And just to say, any service that is in scope of the code needs to have a DPIA. That's Standard 2 of the code.
There is also UX design guidance, which has been produced in a format designed to meet designers' needs. So there are Miro boards on the ICO website with design guidance, and we will be releasing within the next month or two some design conformance tests as well.
So this should help those small businesses who want to know, "Well, how do I do it? What does good look like on our service?" So take a look.
The guidance was produced with input from about 150 designers. Four hundred or so designers have already used it. So it's a really good place to get started if you want to know how to apply the code.
And then there's also a Commissioner's opinion on age assurance to help you understand what we're looking for, for age assurance as well.
Go to the next slide.
Steps Businesses Can Take Now
So steps you could take now. Really look at who's using your service and determine whether your service is likely to be accessed by children. So consider user testing, surveys, market research, or grey literature from civil society or academics, which might indicate that your service, your service area, or your sector is likely to be accessed by children.
Get started on updating your DPIAs. Map out where children's data is being processed and used across their user journeys, identify any risks and impacts on their rights, and work out how you can mitigate those risks.
And then develop a transformation plan to ensure that all relevant disciplines within your organisation are engaged, use privacy-by-design, and think about how you'll engage parents and children in that process.
And then, as I mentioned, feel free to use the design guidance at hand to help with that process.
Please do stay in touch. We are continuing to release further information for schools, for industry, and for parents and children to help raise awareness of the code and to help you conform.
If you have any questions, or if your organisation would like to take part in a voluntary audit or engage with us, then do send us an email. We can connect you to the colleagues in the ICO who can best help with your query. I hope that's been useful, and thanks for your time.
Rolanda: Thank you very much, Michael. That was very interesting, I have to say. If anyone has any questions for Michael, then pop them into the chat box or unmute yourself and ask it directly if you'd like to.
In the meantime, I just had one or two questions when I was thinking about this.
In relation to this particular area, what sorts of breaches have been reported?
Michael: Well, again, when we talk about data breaches in the ICO, it means something in particular, right? So it's when a company's data has been hacked or accessed inappropriately. So any data breach that happens will need to be notified, and we'll be looking to see if in-scope organisations are applying the code standards if there's children's data involved in those data breaches.
As far as supervision learning goes, it's what I've highlighted: issues around data sharing and profiling, a potential lack of adequate age assurance measures in place to prevent kids who should not be accessing those services from accessing them, and organisations probably being a bit slow in updating their privacy notices, terms of service, and community standards so that they're written in a way that's accessible to kids.
We'd also like to see parental controls being updated so kids know when there are parental controls on their service, and online tools for children and parents to exercise their data rights made more prominent on services as well.
So lots of things are still happening around things like profiling, and it's not always being done as well as it can be. We would like to see some improvements there, and for organisations to have profiling off by default.
Rolanda: And you mentioned improving the self-declaration process so that it's not just a child saying, "Yes, I'm 16", or whatever it is.
What might that look like, that improvement? Is it that then a parent would need to confirm?
Michael: Well, if a child is under 13, the GDPR says a parent needs to give consent. So if your processes are allowing children under 13 in, then the data you're collecting is in breach of the UK GDPR. Now, that's a critical first thing. That's Article 8 of the GDPR for those who are looking for the reference.
It's a difficult area, because we know that age assurance technology is still developing and there are some things that are better than others. But this is an area where what we've noticed is industry is almost kind of waiting to see what happens both with the technology and with the ICO and how we supervise the code.
There are things that industry can do. If they're using a self-declaration process, they can use some technical measures. So for example, if a child puts in an age under 13, that means that they're not eligible, and you can prevent them from going back and changing that age immediately or on that same device. So it sort of prevents them from switching the age if they've been told no. That's one technical measure that can be taken.
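Here is a minimal sketch of that measure, assuming a browser-based TypeScript service; the storage key and function name are illustrative assumptions.

```typescript
// Hypothetical age gate: remember an under-13 declaration on this device
// so the child cannot immediately re-enter a different age.
const UNDERAGE_FLAG = "age-gate:underage-declared"; // illustrative key

function submitSelfDeclaredAge(age: number): "allowed" | "blocked" {
  // A previous under-13 declaration on this device blocks retries.
  if (localStorage.getItem(UNDERAGE_FLAG) !== null) {
    return "blocked";
  }
  if (age < 13) {
    localStorage.setItem(UNDERAGE_FLAG, new Date().toISOString());
    return "blocked"; // route to a parental-consent flow instead
  }
  return "allowed";
}
```

Device-local storage can of course be cleared, so this is friction rather than proof of age; it is the kind of low-cost measure that raises the bar while age assurance technology matures.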
Organisations with a bit more capacity or capability could monitor chat services, for example, to look for language that a child would use, or any references to schools, that might indicate users are younger than they say they are. They could also provide a reporting mechanism for people to report that a child is underage.
So there are different techniques that could potentially be used if self-declaration is the only thing that's technically feasible at the moment. Look to see what you can do to identify underage users, and how you can update your systems to prevent their access in the future.
But as age assurance technology develops, we'll be tracking what industry is able to do, and we'll be expecting more as the years go on as age assurance technology improves. So we will want to see industry adopting better measures and more effective measures of ensuring kids aren't accessing services they shouldn't be.
Rolanda: And one thing you mentioned earlier . . . I had written it down as a question, but you kind of answered it, but just to clarify that.
So obviously, the code applies to anything that's aimed, I suppose, at children and created within the UK, but also to services created outside the UK that could be accessed by children in the UK. Is that correct?
Michael: Yeah. So in the supervision process, several of the 49 letters that I mentioned are to U.S. social media, games, or streaming companies, for example. Most or all would have an office in the UK, or they are directly targeting children in the UK.
So for example, they'd have a UK site or a UK and EEA terms of service that would apply only to children here. So it would show that they are actively targeting UK children. And then yes, they would be within scope of the code.
Rolanda: Okay. Now, does anyone have any questions for Michael? I feel like I'm monopolising the time, but this is a very interesting area. There's a wee comment that it takes a combined effort of parents, schools, the ICO, and service providers to support young people, really. It is very much a joint effort.
I know, as a parent, your child just gives you their iPad and says, "Will you do something?" You're like, "What is it you want me to do?" They time it for when they know that you're busy and you might not really focus on what it is they're doing. But you do have to really be on top of it as a parent, I have to say.
One last question I had, unless anybody else really has anything for you, was . . .
This is probably a whole webinar in itself, but you mentioned data protection impact assessments. Can you briefly outline what that actually involves if a service provider was attempting to do one?
Michael: Well, luckily, we've got templates on our website. That helps to start with so you're not starting from scratch.
Basically, it is an outline of your service. That would include things like what the service is, an outline of what you're doing, who it's aimed at, the security in place, what risks you've identified and how you're mitigating those risks, and how you've engaged with your audience and service users. There's a whole range of issues there.
It also maps the data that you process, and identifies your legal basis for data processing, for example, across all the services that you offer.
So for example, you might rely on consent for some parts of your service but legitimate interests for others, where it might relate to shipping and handling or something else, for example.
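As a rough illustration of what one row of that data map might record, here is a minimal sketch; the TypeScript shape and field names are assumptions for the example, not the ICO's template.

```typescript
// Hypothetical shape for one entry in a DPIA data map: what is processed,
// why, on what lawful basis, and for how long. Field names are illustrative.
type LawfulBasis =
  | "consent"
  | "contract"
  | "legitimate interests"
  | "legal obligation"
  | "vital interests"
  | "public task";

interface DataMapEntry {
  dataItems: string[];      // e.g. ["name", "delivery address"]
  purpose: string;          // why this data is processed
  lawfulBasis: LawfulBasis; // UK GDPR Article 6 basis for this purpose
  retention: string;        // how long the data is kept
  sharedWith: string[];     // processors or third parties, if any
}

// Two entries echoing the example above: consent for one purpose,
// legitimate interests for shipping and handling.
const dataMap: DataMapEntry[] = [
  {
    dataItems: ["email address"],
    purpose: "marketing emails",
    lawfulBasis: "consent",
    retention: "until consent is withdrawn",
    sharedWith: ["email delivery provider"],
  },
  {
    dataItems: ["name", "delivery address"],
    purpose: "shipping and handling",
    lawfulBasis: "legitimate interests",
    retention: "six months after delivery",
    sharedWith: ["courier"],
  },
];
```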
So I would advise colleagues to take a look at the templates. And then, again, there are three worked examples on the Children's Code hub if you want to see how, for example, an online retail service might start to complete a DPIA.
They are just a start. They're not as detailed as we would probably expect to see, because they won't have, for example, a register of processing activities or some of the other additional documents that go with them. But it's a good place for colleagues to start to understand what we're looking for.
Rolanda: And there's just a wee question there.
You mentioned schools applying due diligence when buying ed tech. Do you have any helpful templates for such due diligence?
Michael: At the moment, we don't. It's one of the things that we're looking at producing. But I'd say for schools, if they are looking at developing a joint controllership relationship with an ed tech provider . . . And again, that's the kind of language that makes schools think, "Oh my god, what does that mean?" So start by reading the FAQs for schools on our Children's Code hub. That will give you links to areas of ICO policy that will help explain some of this language.
But generally, schools are covered by the public task requirements, so they wouldn't be recognised as an ISS, an information society service, falling within the scope of the code.
Schools, however, do fall within the scope of the UK GDPR, and some ed tech services, depending on how much control over the data they have, could be within scope of the code as well as being in scope of the GDPR.
So in some cases, it depends. It depends on what's happening with the data, who's accessing the data, what level of consent is being provided, and whether the ed tech provider is using the data for things beyond the primary use in the classroom, to determine whether or not those ed tech services fall within scope.
It's a complicated subject, so I would say if ed tech suppliers are looking for more guidance, then start by looking at the code, at the FAQs, and following the links from there, or get in touch and talk to us.
Rolanda: Okay. Thank you so much for your time, Michael, this morning. That's been very interesting. And thank you to everyone who's joined us today. We have been recording this, so we'll share this with the wider audience. I hope your dog forgives you because clearly she's very unhappy with you for locking her out of the room. But thank you all for your time. Thanks, Michael.
Michael: Thank you.
Rolanda: Goodbye now, everyone.