Internet companies are “fundamentally enabling” child sex abuse, the police chief in charge of child protection has said.

Simon Bailey, the National Police Chiefs' Council's lead on child protection, said online firms are not doing enough to make internet chatrooms safe places for children, or to take down indecent imagery.

He acknowledged that intelligence is being shared but said all images of abuse should be taken down.

Chatrooms “should be policed” to make sure young people can have a conversation online “without facing the threat of an adult coming into that room, trying to groom them with one thing and one thing alone on their mind”, he said.

As it was revealed that Scotland Yard had seen a 700% spike in the number of online child abuse cases referred to it by the National Crime Agency since 2014, Mr Bailey said internet companies have a “moral and social responsibility to make their platforms safe”.

He added on ITV News: “They are making some progress but it is nowhere near enough.

“These companies are making sums of money which are huge, but the fact is that children are being abused and not enough is being done to make chatrooms safe places for our children to go and not enough is being done to take down indecent imagery which is out there.”

Mr Bailey added: “Absolutely no question at all – they (internet companies) are fundamentally enabling it.”

Mr Bailey said young people should be able to enjoy online chatrooms without the threat of grooming (Norfolk Police/PA)

A Facebook and Instagram spokesman told the broadcaster the platforms have “zero tolerance” for child exploitation.

“We proactively search for and take down this kind of content and immediately alert the police to potential offenders and young people at risk.

“We’ve spent the past decade working with safety experts including the IWF, CEOP and the UK Safer Internet Centre to develop powerful tools to combat this kind of activity and we have a global team responding around the clock to reports from our communities.”

A Snapchat spokeswoman told the programme: “The safety of our community is our top priority and we go to great lengths to prevent and respond to any instance of child exploitation on our platform.

“Our dedicated trust & safety team and law enforcement operations team work round the clock to enforce our policies and work closely with law enforcement and national organisations to prevent and respond to this type of illegal activity.”

The Live.me live-streaming app said it is “extremely invested” in creating a safe community, and works to combat policy violations through human moderators and artificial intelligence.

The company went on: “Live.me does, however, take issue with the accusation of ‘fundamentally enabling’ child sex abuse, which we believe shows a lack of understanding of social media platforms on the agency’s part.

“We have invested and continue to invest significant resources to develop the tools, technology, systems and processes to combat some of the dangers that exist on social platforms, including but not limited to grooming.

“We work closely with law enforcement agencies across the globe to support investigations while navigating complex privacy issues.

“Instead, we would love to see these agencies provide guidance and leadership to make reporting and escalation more universal and accessible for all internet apps and services.”