A mum was left “sickened” after she discovered her daughter, 11, had been sexually groomed by a secret abuse network on Spotify for months.
The Manchester Evening News reports that the young girl from Stockport had been encouraged to upload explicit pictures of herself to the app. Children’s charities have warned that predators will use any app for grooming if they can.
The 11-year-old’s account has now been deleted. Spotify has removed the content and the users involved, with a spokesperson saying the company ‘takes the safety of minors on our platform extremely seriously’.
Mum-of-two Rachel (whose name has been changed for this article) said her daughter was directed to create playlists on the app. Other users would then edit the playlist titles as a form of messaging, and could also upload pictures to the playlists. Her 11-year-old daughter received an email from another Spotify user she’d met online, who claimed to be a 12-year-old boy.
He asked her to send a video of her masturbating. Another Spotify user, called ‘I have nudes’, tagged her in a playlist and sent a message telling her to ‘show a good view’ of her genitals.
The MEN reports that it has found playlists on the app where users change the titles to message and tag other profiles asking them to upload indecent pictures. One playlist had warned users of a YouTube channel that had uncovered ‘the secret porn community on Spotify’, writing in the title ‘important message to all porn posters’.
The MEN found playlists and profiles showing explicit pictures when they searched for porn and nudes on the platform.
‘I asked if she’d been putting up inappropriate pictures and she nodded’
Rachel has two daughters, aged eight and eleven, and has kept them off Snapchat, Facebook, Instagram and TikTok to keep them safe online. However, she allows her eldest daughter to use Spotify as she enjoys listening to podcasts before going to bed.
Rachel told the MEN: “I am a teacher so I’m probably a bit stricter than other parents but I just wanted to make sure they are safe. They both have iPads but I can control their screen time and what apps they use through my phone.
“The only app my eldest daughter was able to use after 8pm was Spotify because she’s always fallen asleep to spoken word. She likes to listen to podcasts before she goes to sleep which she plays out loud on the Alexa so I can hear exactly what she’s listening to.
“She has access to the adult version of Spotify because of the podcasts she listens to but I didn’t think that would be an issue because it’s just a streaming platform.”
Just after Christmas, Rachel’s daughter was locked out of her Spotify account, and her mum logged into the girl’s email account to work out the problem.
“When I started looking at why it wasn’t working I logged into her email account which she doesn’t have access to. It’s only set up so she can have Spotify but I have the login details,” she said.
“I opened the email and lots of them were from Spotify. It said her playlist had been removed for breaching terms and conditions. I thought it was because of the ones she’d been listening to, I didn’t realise she’d made them herself.
“I saw another email in her inbox this time from a man’s name I didn’t recognise. I asked her who he was and she said ‘he is one of my friends on Spotify.’ She said she had given him her email address so they could play Minecraft together. She said he was only twelve.
“I asked her how because you can’t message people on Spotify. She told me you make a new playlist and you put your message in the title and they check the playlist and reply.”
Rachel said she began to panic, and researched reasons why Spotify playlists would be removed. She found one that said playlists might be taken down if they contained copyright images.
“I asked her if she’d been putting up copyright pictures but my daughter said she didn’t understand,” she said. “I just got that sinking feeling. I asked if she’d been putting up inappropriate pictures and she nodded.
“We found another email from the man who had asked her to pleasure herself. She didn’t understand and was getting very upset. I felt physically sick. When I searched her name on Spotify I could see her playlist. We could see pictures she’d uploaded on there. They were very explicit.”
Rachel immediately phoned 111, and was shocked when they told her to phone 999. She also contacted Spotify and told them what had happened, requesting the pictures were removed from the platform.
“I got through to an actual person really quickly,” she recalled. “They said the platform was never intended to be used in this way. We asked if they could remove the images and they said they would pass it on to the team but that they couldn’t help any further.”
The explicit photos have now been deleted from Spotify, after Rachel reported the account and playlists multiple times, she said. The MEN reports that it has approached Greater Manchester Police for comment, but hasn’t yet had a response.
“The police officer who came to our house said she had not heard of Spotify being used like this so she was quite shocked,” Rachel said. “They said they would try and track down the email address of the man who asked her for a video but if he lives in another country there isn’t much they can do.
“That’s why I am so determined to spread awareness of this to make other parents aware as we had no idea this could happen. I think in my daughter’s eyes the people she was messaging were not real people.
“Because she is so young we hadn’t had a proper chat about explicit photographs or anything like that. But I have taught kids under the age of ten who were googling how sex works. I don’t think education is keeping up with the online world.”
‘Offenders will exploit any app children use’
Richard Collard, Online Safety Regulatory Manager at the NSPCC, said: “Child sexual abuse is an evolving threat and taking place at a record scale online, and this incident shows how offenders will exploit any app children use if it is possible to do so.
“We must ensure tech firms do all they can to disrupt online child abuse. The Government’s Online Safety Bill will introduce legislation that will hold platforms responsible for finding and disrupting this activity. A strengthened Bill that holds senior managers accountable for safety would make the UK the global authority for children’s safety online.”
The government’s Online Safety Bill is set to be passed this year, with the promise to protect children from harmful online content. It’s been rewritten since July after Conservative MPs rowed about freedom of speech online.
Campaigners and the Labour party criticised the changes, which removed a requirement for big tech firms to take down legal but harmful material. Platforms will still have to stop children from seeing content posing significant harm.
A Spotify spokesperson said: “Spotify takes the safety of minors on our platform extremely seriously, and we do not allow content that promotes, solicits, or facilitates child sexual abuse or exploitation.
“We have processes and technology in place that allow us to detect and remove any such exploitative material. In this case, we found the imagery in question, terminated the user, and removed the content.”
The platform’s rules page states that sexually explicit content is not allowed and will be removed, and that repeated or egregious violations can lead to accounts being terminated.
“We have tons of amazing content on Spotify, but there are certain things that we don’t allow on our platform,” the rule page states. “Don’t post excessively violent or graphic content, and don’t post sexually explicit content.
“What to avoid: Content that contains sexually explicit material includes, but may not be limited to: pornography or visual depictions of genitalia or nudity presented for the purpose of sexual gratification, advocating or glorifying sexual themes related to rape, incest, or bestiality.
“Please respect Spotify, the owners of the Content, and other users of the Spotify Service. Don’t engage in any activity, post any User Content, or register and/or use a username, which is or includes material that is offensive, abusive, defamatory, pornographic, threatening, or obscene.”
If you or your child has been affected by grooming contact the police on 101 or 999, Childline on 0800 1111, or the NSPCC on 0808 800 5000.