New Jersey sues Discord for allegedly failing to protect children
Discord faces a new lawsuit from New Jersey, which argues that the chat app engaged in “deceptive and ruthless business practices” that put younger users at risk.
The litigation, filed Thursday, comes after a multi-year investigation by the New Jersey Attorney General. The AG’s office claims that despite Discord’s policies to protect children and teens, the popular messaging app puts youths “in danger.”
“We are the first state in the country to file such a lawsuit,” Attorney General Matthew Platkin tells WIRED.
Platkin says there were two catalysts for the investigation. One is personal. A few years ago, a family friend came to Platkin, surprised that his 10-year-old son had been able to sign up for Discord despite the platform’s policy prohibiting children under 13 from registering.
The second was the mass shooting in Buffalo, in neighboring New York. The assailant used Discord as his personal diary in the lead-up to the attack and livestreamed the massacre directly to the chat and video app. (The video was deleted immediately.)
“These companies have consistently and deliberately put their profits ahead of our children’s interests and well-being,” says Platkin.
The AG’s office alleges in the lawsuit that Discord violated the state’s consumer fraud laws. The allegations, filed Thursday morning, turn on a set of policies Discord adopted to keep children under 13 off the platform and to protect teenagers from sexual exploitation and violent content. The lawsuit is the latest in a growing list of state lawsuits against major social media companies, which have so far proven largely ineffective.
Discord’s child and teen safety policies are clear. Children under the age of 13 are prohibited from the messaging app, and it more broadly prohibits sexual interactions with minors, including young people’s “self-consumption.” Additionally, an algorithmic filter works to stop unwanted sexual direct messages. The California-based company’s safety policy, released in 2023, states: “We built Discord to be different and work relentlessly to make it a fun and safe space for teens.”
But New Jersey says Discord’s promises have fallen flat.
The attorney general says Discord offers three safety levels for direct messages: “Keep me safe,” under which the platform scans all messages coming into a user’s inbox to block unwanted and exploitative messages from adults; “My friends are fine,” which does not scan messages from friends; and “Do not scan,” which scans no messages at all.
Even for teenagers, the platform defaulted to “My friends are fine.” The attorney general argues this is an intentional design choice that poses a threat to younger users. The lawsuit also alleges Discord fails to verify ages to prevent children under 13 from signing up for the service.
In 2023, Discord added a new filter to detect and block unwanted sexual content, but the AG’s office argues that the “Keep me safe” option should be enabled by default.