When Meta, the parent company of Facebook, Instagram and WhatsApp, announced that the minimum age for using WhatsApp in the European region would be reduced from 16 to 13, in order to bring it into line with America, it sparked a furore amongst child safety campaigners.
Dr Briant, who was involved in exposing the Facebook-Cambridge Analytica data scandal concerning data misuse and disinformation, warned that WhatsApp's features, such as disappearing messages and end-to-end encryption, create a sense of privacy and can encourage the over-sharing of intimate photos.
“While Meta encourages parental controls and for parents to talk to and educate their kids, kids of that age experience extreme social pressures already, and even if your kid is emotionally mature enough not to do this, they may be added to groups with others who are not,” she said.
Not mentioned in much of the press coverage is the fact that until 2018 the WhatsApp age limit was 13; it was raised to 16 to comply with European data protection legislation (the GDPR). It is ironic that the limit was raised for privacy reasons rather than to make a child's online experience safer. According to WhatsApp, this led to differing age limits around the world, and the recent reduction to 13 was made purely to introduce consistency.
It has also been argued that the decision was simply made to keep up with competitors such as Snapchat, whose user base is predominantly younger; hence the pressure groups' complaint that profit is being put before child safety.
I was asked by Talk TV to give my views on the decision on their morning Jake Berry show on 12th April 2024.
In the programme, I made the following points:
- The Online Safety Act was brought into force to make the internet safer for children. In particular, age verification is to be strengthened, for example by cross-checking against other databases which already contain proof of date of birth.
- Whilst the Act has come into force, Ofcom is currently liaising with social media platforms to agree codes of practice, breach of which, once agreed, will constitute an offence under the Act. It is estimated that this will take two to three years; at the moment we are still in the negotiation phase.
- As the codes of practice have not even been agreed, it is inappropriate for WhatsApp to downgrade the safety of its platform by reducing the age requirement.
- Children are sometimes contacted by paedophiles on less secure social media platforms and diverted onto WhatsApp because it is encrypted. Once on WhatsApp, neither the police nor even WhatsApp itself can find out what is being said.
- During the run-up to the Act, WhatsApp complained that it should not be punished for failing to police its own app when it could not find out what was being said due to the encryption.
- A common method used by paedophiles is to entice children into a WhatsApp group and then encourage young members to introduce their friends, which can be done almost automatically. Inappropriate material can then be shown to the children.
- The NSPCC have suggested that types of "friction" should be built into the user side of the app, rather than the network, to prevent children from accessing harmful content; one example is the "Nudity Filter" currently being developed by Meta.
- Whilst it has been suggested that the only way forward is to ban children under a certain age from having smartphones with access to the internet, as opposed to "dumb phones", the NSPCC again say that the benefits of mobile phones outweigh the disadvantages, and that methods to keep children safe online are preferable.
See my article on the Online Safety Act here
by Peter Garsden, Solicitor. For help on any aspect of this article, or, indeed, any aspect of abuse cases, please contact us by filling in our form.