The Age Appropriate Design Code – Casting The Net Wide
The Age Appropriate Design Code – What is it?
As regulatory bodies take a growing interest in how children's data is handled, it is worth reminding readers of the Age Appropriate Design Code ("Code"). The Code is a statutory code of practice developed by the Information Commissioner's Office ("ICO"). It applies to businesses offering online products or services that involve the processing of personal data and that are targeted at, or 'likely to be accessed' by, children. The Code defines children as individuals under the age of 18.
The Code comprises a set of 15 standards (“Standards”) that businesses should adhere to in order to ensure that the personal data of children using their services is subject to appropriate safeguards.
The Standards are summarised below, with the full detail available here. The highlighted Standards map to existing GDPR principles. Each Standard is underpinned by a consideration of what is in the best interests of the child.
Why should businesses care?
The Code isn't confined to services targeted at children: the phrase 'likely to be accessed' casts the net beyond 'targeting', so that services children are using in practice aren't excluded by an overly narrow definition. The difficulty for many businesses is that, if they do not intentionally target children (who, let's not forget, can be anywhere from 0 to 17 years old), they may not know whether children are accessing, or are likely to access, their service. Standard 3 of the Code requires you to take a risk-based approach to recognising the age of your users; if you cannot establish users' ages with a level of certainty appropriate to the risk, you should apply the Standards to all users. When thinking about the Code and its potential impact, one cannot help but note the many apps (which have grown in number and use during the pandemic) that might fall into the 'likely to be accessed by' category:
fitness apps
health and wellbeing apps
food delivery apps
ride hailing apps
gaming apps
… to name just a few that immediately spring to mind.
Practical steps to determine whether the Code might apply to you
Step 1
Does the Code apply to your service? The ICO guidance includes a flow chart that you can work through if you aren't sure whether you are covered by the Code. In conducting your assessment, we would recommend that you consider:
whether you make it clear that your service is for over-18s only, and how this is achieved (e.g. can users simply self-declare, or do you have a more robust method in place?)
whether your service allows a primary (adult) user to add others, which would indicate that under-18s may be using it
whether your users' interactions with your service indicate that some of them might be under 18
whether you target under-18s
Step 2
If your analysis determines that children are using your service, or (to paraphrase the ICO guidance) that children using your service is more probable than not, then (subject to the below) you will need to take steps to comply with the Code.
It is important to point out that if you establish that your service is likely to be accessed by children, but the service is not designed or intended for them, you should focus on ensuring that it is not, and cannot be, accessed by children – the aim is not to make services that are unsuitable for children child-friendly.
Step 3
If you have established that the Code applies to you and you are comfortable with children accessing your service, you will need to assess your service and your current privacy policies and procedures in the context of the Code. Having clearly communicated and easily understood privacy policies and procedures that meet the key requirements of the GDPR is a necessary base from which to make any adjustments to comply with the Code.
In conducting your assessment, we would recommend that you:
complete a written review of your compliance against the Standards
make changes to your service, or to the policies and procedures underpinning it, to ensure it is age appropriate. The Code includes factors for consideration grouped by the age bracket a child falls into: for example, it describes ages 10-12 as 'transition years', noting that within this bracket children's use of online services is likely to ramp up and that they are likely to be accessing services independently from their own devices, whereas those aged 16-17 are described as approaching adulthood, with technical abilities and skills that are more developed than their emotional intelligence
review your data protection impact assessments to consider the use of your services by children. You should have a record of processing register that sets out the scope of the personal data you collect and the third parties you share it with – that information will provide a base from which to consider the Data Minimisation and Data Sharing Standards.
If your service incorporates AI, you may want to familiarise yourself with the EU AI Act. Both the Code and the EU AI Act recognise the fundamental rights of children enshrined in the United Nations Convention on the Rights of the Child, and the EU AI Act imposes obligations where high-risk AI systems are likely to be accessed by, or have an impact on, children.
Summary
Governments and businesses are scrambling to ensure that legal frameworks keep pace with the technology we use; however, it is fair to say that this remains a work in progress, especially when it comes to children. The Code is one tool that has been developed to help ensure that children are protected online.
If you would like further information on the above or wish to discuss other data privacy matters, you can contact us at inform@taceo.co.uk