More than 5,000 online grooming offences were recorded by police in England and Wales in the 18 months to September 2018, a charity has found.
The NSPCC said police figures suggested Instagram, Facebook and Snapchat had been used in 70% of cases of sexual communication with a child since it became an offence in April 2017.
It wants new laws to force social media firms to do more to protect children.
Facebook and Instagram said they “aggressively fight” such content.
The NSPCC said 39 of the 43 police forces in England and Wales had responded to Freedom of Information requests, with only Surrey, Sussex, Northamptonshire and City of London police failing to provide data.
In the first 18 months after sexual communication with a child became an offence in April 2017, police recorded a total of 5,161 such crimes, 1,944 of them in the six months from April to September 2018.
The NSPCC said Instagram, Facebook and Snapchat had been used in 70% of the 1,317 cases in that six-month period where police had recorded the method used.
Instagram was used in 32% of cases, Facebook in 23% and Snapchat in 14%.
Where police forces recorded age and gender, seven in 10 victims were girls aged 12 to 15. One in five was 11 or under. The youngest child recorded was five years old.
The charity has urged the government to “tame the Wild West Web” by bringing in regulation to protect children on social networks.
‘We exchanged texts which quickly became sexual’
In one case of abuse given by the charity, a girl was groomed by a 24-year-old man when she was 13.
Emily – not her real name – met the man through a friend. He had introduced himself, saying he was 16, which quickly changed to 18. She told him she was 13. Later that evening he added her on Facebook and Snapchat.
Emily said: “It escalated very quickly from there. We exchanged texts which quickly became sexual, then photos and videos before arranging for him to come and pick me up after school.
“He drove me somewhere quiet… and took me into the woods and had sex with me. He drove me in the direction of home straight afterwards, refusing to even talk, and then kicked me out of the car at the traffic lights.
“I was bleeding and crying. This was my first sexual experience.”
Emily’s mother said: “We felt as though we had failed as parents – we knew about these social media sites, we thought we were doing everything we could to ensure our children’s safety when they were online, but we still couldn’t protect Emily.”
NSPCC chief executive Peter Wanless accused social media networks of “10 years of failed self-regulation”.
“These figures are overwhelming evidence that keeping children safe cannot be left to social networks. We cannot wait for the next tragedy before tech companies are made to act,” he said.
Ahead of the government's white paper on online harm, the charity is pushing for statutory regulation to enforce a legal duty of care to children on social networks, backed by substantial fines for firms that fail to comply.
The NSPCC said the figures did not “fully reflect the scale of the issue”, as many crimes go undetected or unreported.
A National Crime Agency spokesperson said: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse, including online grooming.
“Children and young people also need easy access to mechanisms allowing them to alert platforms to potential offending.”
A spokesperson for Facebook, which also owns Instagram, said: “Keeping young people safe on our platforms is our top priority and child exploitation of any kind is not allowed.
“We use advanced technology and work closely with the police and CEOP [Child Exploitation and Online Protection] to aggressively fight this type of content and protect young people.”