UK children are now in safe hands as digital services are “forced” to adhere to a child-friendly design code.

The ICO, the UK’s data protection authority, drew up a design code in September last year to work towards a digital space better suited to children under the age of 18. The code was created for all digital services likely to be accessed by children, but of course it could not be enforced overnight, so these services were given a 12-month grace period to adapt.

That grace period has now expired, and these services are expected to comply in full. Since the main purpose of the code is to keep children from being tracked and profiled, it is hard to see how anyone could argue against it. The services covered include almost everything a child might engage with, from toys and games to educational services, and even retail stores and social media platforms that appeal to minors.

Let’s talk a little about the pointers this code contains. First, a high level of data protection should be the default, which means that accounts owned by children, or intended to be used by anyone under 18, should have features such as geolocation and profiling switched off by default. Removing these features entirely would arguably be the safer choice, but there are cases where a compelling reason justifies keeping them available.
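
To make the idea of high-privacy defaults more concrete, here is a minimal, hypothetical sketch in Python. Nothing in it comes from the code itself; the account fields and the default_privacy_settings helper are purely illustrative of how a service might switch geolocation and profiling off by default for under-18 accounts.

```python
from dataclasses import dataclass

ADULT_AGE = 18  # the code covers services likely to be accessed by under-18s


@dataclass
class PrivacySettings:
    geolocation_enabled: bool
    profiling_enabled: bool
    ad_personalisation_enabled: bool


def default_privacy_settings(age: int) -> PrivacySettings:
    """Return high-privacy defaults for a new account.

    For minors, geolocation, profiling and personalised ads start switched
    off; they would only be re-enabled if a compelling, documented reason
    exists (and never silently).
    """
    is_minor = age < ADULT_AGE
    return PrivacySettings(
        geolocation_enabled=not is_minor,
        profiling_enabled=not is_minor,
        ad_personalisation_enabled=not is_minor,
    )


if __name__ == "__main__":
    print(default_privacy_settings(13))  # everything off for a child account
    print(default_privacy_settings(30))  # adult defaults are left as-is
```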

The code also instructs app makers to build in parental controls, much like the family controls TikTok introduced last year. In TikTok’s case, children knew the controls were switched on, and the UK code points the same way: services that let parents monitor a child’s activity are expected to give the child clear, age-appropriate information about it, rather than watching quietly in the background.

Next, the code takes direct aim at the dark patterns that developers use to nudge young users down unnecessary routes or towards choices that weaken their privacy protections. Although the code, with its 15 standards, is not legislation in its own right, the ICO has certainly been pressing app makers to build it into their plans.
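
As a rough illustration of what such dark-pattern defaults look like in practice, here is a small hypothetical sketch in Python. The option names and the flag_dark_pattern_defaults check are invented for this example; the point is simply that privacy-weakening choices should not arrive pre-enabled on a young user’s sign-up form.

```python
# Options that weaken a young user's privacy if switched on (illustrative names).
PRIVACY_WEAKENING_OPTIONS = {
    "share_activity_with_advertisers",
    "enable_precise_location",
    "allow_profile_discovery_by_strangers",
}


def flag_dark_pattern_defaults(form_defaults: dict) -> list:
    """Return options that are pre-enabled even though they weaken privacy."""
    return [
        name
        for name, enabled in form_defaults.items()
        if enabled and name in PRIVACY_WEAKENING_OPTIONS
    ]


if __name__ == "__main__":
    # A nudging sign-up form: privacy-weakening boxes arrive pre-ticked.
    nudging_form = {
        "share_activity_with_advertisers": True,
        "enable_precise_location": True,
        "receive_service_updates": True,
    }
    print(flag_dark_pattern_defaults(nudging_form))
    # -> ['share_activity_with_advertisers', 'enable_precise_location']
```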

While app developers could in theory ignore the code, since it carries no direct legal force of its own, doing so would attract a fairly intense watchdog focus that could escalate into a full investigation of the service’s data protection practices, and we are sure not many companies would want to go down that route. Those who fail to abide by the code may also be found in breach of UK data protection law, inviting an unfavourable assessment of the service’s fairness and compliance with the General Data Protection Regulation and the Privacy and Electronic Communications Regulations.

The ICO also explained that social media platforms are being asked to demonstrate, through their videos, posts, images and the like, how they adhere to the code, and that it will step in and require changes where necessary. According to the regulator, some of the biggest risks sit with social media platforms, gaming sites, and music and video streaming services. These sectors have also been caught up in surveillance capitalism, where advertisers use minors’ personal information to bombard them with content that demands their attention. Harvesting children’s personal data for that kind of use is now firmly off-limits in the UK.

While the government genuinely wants what is best for children, and the code is grounded in their rights, it also wants companies and services to actually abide by it. For this reason, the code reads more like a manual, guiding services on children’s rights so that young users stay mentally, physically and emotionally healthy.

Since the ICO’s enforcement powers are no joke, we expect the code to be complied with widely. For services that refuse to fall in line, the watchdog can go as far as banning or shutting down their operations. And given the generous grace period that was granted, we are fairly confident that most services should be on a smooth regulatory path by now.

Read on: Analysis shows the permissions most commonly requested by popular iOS apps