Discord is tightening its rules around age verification and teen safety. Starting next month, users who do not confirm their age with a face scan or an ID may have their accounts restricted. According to the company's blog, the change will roll out in phases and will apply to both new and existing users. Until verification is complete, accounts will be placed under stricter default settings that limit access to certain features and content, part of Discord's broader push to make the platform safer for younger users.
Discord teen safety update
According to Discord, the update is part of a new “teen-by-default” approach. Under this system, all users will initially be placed in a safer, age-appropriate environment. To unlock certain features or access sensitive content, users may be asked to confirm that they are adults. This includes access to age-restricted channels, servers, app commands, and some message settings.
Age verification and privacy
The company said privacy protections are built directly into the age verification process. It also clarified that age verification requests will appear only in the Discord app, not via email or text message. The blog added:
- Video selfies used for facial age checks stay on the user's device
- ID documents submitted for verification are deleted quickly, often immediately after age confirmation
- A user's age verification status is not visible to other users
- Users can appeal or retry the verification process through account settings
Safety settings
Along with age checks, Discord is changing default safety settings for all users. Sensitive content will be blurred unless a user is verified as an adult. Only verified adults will be able to access age-gated spaces, speak on stage in servers, or change certain direct message settings. Messages from people a user may not know will be routed to a separate inbox by default, and users will see warning prompts when receiving friend requests from unfamiliar accounts.
The company said teen safety remains a key priority and that the update builds on existing protections. Discord earlier rolled out similar settings in the UK and Australia and is now expanding the approach to users globally.
Teen protections on other platforms
Age verification and teen safety measures are becoming more common across major platforms. YouTube has introduced supervised accounts, content filters, and restricted features for younger users, while Meta has also added age checks and safety tools across its apps to limit access to sensitive content for teens.