Friday, September 20, 2024

The potential implementation of face scanning for UK porn viewers



New draft guidance from Ofcom suggests that porn users may be required to have their faces scanned to verify their age, with young-looking adults facing extra checks. Ofcom, the UK's communications regulator, has outlined measures that explicit websites can implement to prevent children from accessing pornography. A recent survey indicates that the average age at which children first view explicit content is 13. Privacy campaigners, however, warn that the consequences of a leak of age-verification data could be "catastrophic."

According to Ofcom's report, nearly 14 million people in the UK watch online pornography, one in five of them during office hours. These figures have raised alarm about how easily children can access explicit websites: according to the Children's Commissioner, one in ten children has viewed such content by the age of nine.

The Online Safety Act, now law, requires social media platforms and search engines to protect children from harmful online content. Ofcom, which can impose substantial fines for non-compliance, has set out what it expects of companies under the new regulations, stressing that age checks must reliably determine whether a user is a child.

Ofcom suggests that acceptable age verification methods may include facial age-estimation technology, which scans a user's face and uses software to infer whether they are an adult. Where the technology is not sufficiently accurate on its own, websites could apply additional checks to anyone who appears to be below a specified "challenge" age, much as retailers ask for identification before selling alcohol to anyone who looks under 25. Ofcom acknowledges that no age assurance method is likely to be entirely foolproof, but websites must guard against simple tactics to circumvent age verification.
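The "challenge age" pattern described above can be sketched as a simple decision rule. This is a hypothetical illustration only: the threshold values, function name, and three-way outcome are assumptions for clarity, not part of Ofcom's guidance, and a real system would sit behind an actual age-estimation model.

```python
# Hypothetical sketch of a "challenge age" rule: a facial age-estimation
# model produces an estimated age, and anyone who appears younger than a
# challenge threshold (analogous to retail "Challenge 25" rules for
# alcohol sales) is routed to a stricter identity check.

CHALLENGE_AGE = 25  # assumed threshold, mirroring the "under 25" retail rule
LEGAL_AGE = 18      # minimum age for accessing the content

def access_decision(estimated_age: float) -> str:
    """Decide how to treat a visitor given a face-scan age estimate."""
    if estimated_age >= CHALLENGE_AGE:
        return "allow"             # clearly appears to be an adult
    if estimated_age >= LEGAL_AGE:
        return "additional_check"  # plausibly adult but young-looking: verify ID
    return "deny"                  # estimated to be a child

print(access_decision(30))  # allow
print(access_decision(21))  # additional_check
print(access_decision(14))  # deny
```

The buffer between the legal age and the challenge age exists because estimation is imprecise: routing borderline cases to a document check trades convenience for fewer children slipping through.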
For systems that rely on comparing a user's face with a photo ID such as a passport, a "liveness check" should be conducted to stop children using a borrowed or fake ID together with a photo of an older person to deceive the system.

Sex education advocates believe that safeguards of this kind would help protect children from exposure to pornography. But while the Online Safety Act represents progress, experts say that education about pornography's distorted portrayal of sex, its objectification of women, and the issue of consent remains essential. Concerns have also been raised that some young people may turn to riskier, unregulated websites to access pornography if mainstream sites are blocked.

Ofcom has identified the safeguarding of personal data as a major concern among adult users who must prove their age, and the draft guidance stipulates that sites must adhere to the data protection regulations established by the Information Commissioner's Office. Campaigners, however, argue that too little emphasis is placed on data security, and they call for more specific privacy rules given the sensitive nature of the information that will be processed.

Draft codes of practice covering pornography on social media platforms are expected to be published in 2024.