Charleston, South Carolina – A complaint filed on Thursday accuses two major online platforms of allowing a Charleston 11-year-old to be groomed by an adult predator, causing life-altering suffering.
The case was brought against Roblox and Discord. Roblox is an online gaming platform, used primarily by children, with 111.8 million daily active users. Discord is an instant messaging program focused on gaming with friends.
The girl met a man on Roblox in 2022 who pretended to be a peer but was actually an adult predator, according to court records. The predator allegedly used grooming tactics to manipulate the child and moved their conversations to Discord, where the contact continued.
On Discord, the predator sent the young girl explicit messages and coerced her into sending a sexually explicit image of herself.
According to the documents, the 11-year-old lost her self-confidence after being sexually exploited and continues to suffer significant harm as a result of the grooming, manipulation, and exploitation she experienced on both applications.
Dolman Law Group partner Stan Gipe filed the case on behalf of the girl and her family, saying that Roblox is especially dangerous because anyone can create a profile on the platform.
“Roblox has become a new playground; kids go out there, they interact with other children and other people on the platform,” Gipe said.
“If I’m a 50-year-old predator, I can play Roblox and pretend to be a 12-year-old girl, something I would never do in public. But on Roblox, I can now play as a 12-year-old female and approach other 12-year-old girls in that character,” Gipe explained. “It allows a predator to open a door, strike up a friendship, and begin the grooming process.”
The girl had been an avid Roblox user for years, and according to the documents, her grandmother believed the gaming platform and Discord were safe for children because they were designed and marketed for children.
According to a spokesman for the Dolman Law Group, the alleged predator has not been charged criminally.
“It’s a kids’ game, right? It’s poorly designed,” Gipe stated. “There have been sexual attacks. This has been going on for a while. Roblox fails to adequately alert parents about the risk.”
In July, Gipe’s firm filed a lawsuit in Florida over another incident involving an underage child on Roblox. According to the documents, a predator gradually gained the girl’s trust, escalated his tactics, and persuaded her to meet in person.
The predator allegedly drove to the girl’s grandfather’s house and lured her into his car. The two traveled to a nearby neighborhood, where he brutally assaulted her and forced her to perform sex acts on him, causing unimaginable pain, harm, and suffering.
“They (Roblox) go out and make positive statements about how safe the platform is to leave your children on. It’s these affirmative claims that provide parents, grandparents, and others a false sense of security and allow their children to use the platform,” Gipe added.
The concerns about Roblox are nationwide: Thursday’s case is the 70th filed in federal court against the company since mid-July, with Discord named in at least eight of those suits.
The lawsuit claims that Roblox and Discord spent a significant amount of time and money publicizing the applications’ safety and security, giving the public the impression that both had established a safe environment for children.
“They had the ability to protect children five years ago. They’re beginning to implement safety measures that should have been implemented years and years ago,” Gipe remarked. “It is a company that values company profits over user safety.”
When asked about the lawsuit, a spokeswoman for Discord issued a statement that reads: