Discord has introduced Teen Safety Assist, a new safety initiative to promote a safer environment for younger users. The initiative includes teen safety features enabled by default in the app, such as proactive filters and alerts.
“As great as all of these amazing updates are to Discord, none of it matters if you don’t feel safe on Discord,” the company said in a blog post Thursday. Teen Safety Assist will officially launch next week.
One of the features Discord is implementing automatically blurs sensitive images shared with teenagers on the platform. The app may have taken a page from Apple’s book, which began blurring inappropriate photo messages sent to children in 2021.
Teens on Discord will also receive a notification when they receive messages from a new sender, asking them to double-check whether they want to reply or block the user.
According to the company, more than 150 million people use Discord every month, and Statista found that 22% of them are younger users aged 16 to 24. The minimum age to join the platform is 13, but Discord is most popular with older users in their 20s and 30s.
Teen Safety Assist also includes new warnings for Discord rules violations. Offenders will receive a message informing them of the rule they violated and whether Discord is issuing a warning or taking further action against their account. The alerts give users the opportunity to understand the rules in place and become better digital citizens.
However, these warnings only apply to users who violate certain rules. Discord continues to have a zero-tolerance policy toward violent extremism and content that sexualizes children.
Discord has a Safety News Hub where it publishes new safety, privacy, and policy initiatives. “Creating a safer internet is at the heart of our shared mission,” the company said in a blog post. About 15% of Discord’s employees focus on safety initiatives.