
Discord Begins Testing Controversial Age Verification System
Discord is testing a new age verification process aimed at safeguarding minors on the platform by requiring some users to verify their age via a facial scan or ID submission.
Discord is trialing a new system to verify users' ages in regions including the UK and Australia, responding to growing regulatory demands for online safety. Affected users may need to submit a facial scan or a government-issued ID to access content flagged as sensitive.
The trial comes amid mounting pressure from regulators, who argue that self-reported ages provided inadequate safeguards for children. Discord's new verification measures aim to prevent minors from being exposed to inappropriate material online.
Once a user submits their verification details, the system processes the information and informs the user of their assigned age group. Discord emphasizes that this process is intended to be a one-time requirement, with options for manual review in case of misclassification. Users who are deemed underage for their region may face account restrictions until their eligibility is confirmed via an appeals process.
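As described, the flow works like a simple state machine: a one-time check assigns an age group, accounts classified as underage for their region are restricted, and restrictions are lifted only after a successful manual review. The sketch below is purely illustrative; the types, function names, and regional thresholds are assumptions and do not reflect Discord's actual systems or API.

```typescript
// Illustrative sketch only: these types and functions are hypothetical
// and do not correspond to Discord's real implementation.

type AgeGroup = "under_13" | "13_to_17" | "18_plus";

interface VerificationResult {
  ageGroup: AgeGroup;
  method: "facial_scan" | "government_id";
  verifiedAt: Date;
}

interface AccountState {
  verification?: VerificationResult; // intended as a one-time check
  restricted: boolean;               // true while deemed underage for the region
}

// Assumed per-region minimum age groups, purely for illustration.
const REGIONAL_MINIMUM: Record<string, AgeGroup> = {
  UK: "13_to_17",
  AU: "13_to_17",
};

const AGE_ORDER: AgeGroup[] = ["under_13", "13_to_17", "18_plus"];

function meetsMinimum(assigned: AgeGroup, minimum: AgeGroup): boolean {
  return AGE_ORDER.indexOf(assigned) >= AGE_ORDER.indexOf(minimum);
}

// Record the one-time verification and restrict the account if the
// assigned age group falls below the regional minimum.
function applyVerification(result: VerificationResult, region: string): AccountState {
  const minimum = REGIONAL_MINIMUM[region];
  const restricted = minimum !== undefined && !meetsMinimum(result.ageGroup, minimum);
  return { verification: result, restricted };
}

// A successful appeal (manual review) replaces the assigned age group
// and lifts any restriction.
function resolveAppeal(account: AccountState, approvedGroup: AgeGroup): AccountState {
  if (!account.verification) return account;
  return {
    verification: { ...account.verification, ageGroup: approvedGroup },
    restricted: false,
  };
}
```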
While some users have voiced dissatisfaction at having to share facial and identity data, Discord's new measures respond to tightened regulations, including new laws in Australia and the UK.