YouTube's child protection mechanism is breaking down, according to some of the company's volunteer watchdogs, the BBC reports.

There's a constant anxiety that those seeking to abuse or groom young children will use social media to reach them - and YouTube is aware of the problem. The video-sharing site has a special network of volunteers, called Trusted Flaggers, who help identify worrisome posts and comments on the network.

But now members of YouTube's Trusted Flagger programme have told BBC Trending that the company has a huge backlog of reports, some months old, and that it responds to only a small fraction of complaints from the public about child endangerment and suspected child grooming. One volunteer says that he made more than 9,000 reports in December 2016 - and that none have been processed by the company.

A small group of Trusted Flaggers also grew suspicious about the effectiveness of YouTube's public abuse reporting page. Over a recent 60-day period, they used it to flag up hundreds of accounts which potentially violated the site's guidelines. However, out of a total of 526 reports, they received only 15 responses, and the volunteers say the exercise is emblematic of a larger problem: a lack of child protection on the site.

Sexually explicit comments

The reports were made against accounts which leave potentially objectionable comments, mostly on videos made by young teenagers and children. The videos themselves, according to the Trusted Flaggers - examples of which have been seen by Trending - are not pornographic in nature and do not contain nudity. Many are innocent videos of young people emulating their favourite YouTube stars by performing make-up tutorials and filming their "morning rituals", or exercising, or just goofing around with friends. The comments below the videos, however, are often sexually explicit.
Some encourage the young YouTubers to make videos with fewer or no clothes, talk about their bodies, or simply make graphic sexual references. Some ask children to move to private chat apps or other, less public means of communication.

Trending previously reported on inappropriate comments on similar videos which sparked rumours of a huge "paedophile ring" operating on the site.

Trusted Flaggers

Those allegations of a large organisation or "ring", spread by a few popular YouTube stars, were backed up by scant evidence. But the persistence of sexualised comments on videos made by young people has troubled several Trusted Flaggers who contacted BBC Trending and who believe the popular video-sharing site isn't doing enough about potential sex offenders using the site.

YouTube's Trusted Flagger programme began in 2012 and comprises groups - including law enforcement agencies and child protection charities - along with concerned individuals, some of whom work in the tech industry. The volunteers aren't paid by YouTube, but do receive some perks such as invitations to Trusted Flagger meet-ups. They are given a tool which allows them to report multiple videos, comments or accounts at one time for concerns ranging from child exploitation to violent extremism. YouTube employees then review the complaints, and the company says reports of violations by Trusted Flaggers are accurate more than 90% of the time.

Despite that headline hit rate, however, of the 15 responses received by Trusted Flaggers testing the public reporting mechanism, just seven (47%) resulted in action being taken. Given the volume of the reports they file on a regular basis, the Trusted Flaggers who spoke to Trending estimated that there are thousands of potential predators using YouTube to contact young people.
One of the Trusted Flaggers, who requested to remain anonymous so as not to jeopardise his volunteer role, told BBC Trending that the lack of response shows "there is no reliable way for a concerned parent, child in danger, or anyone else to reliably report and get action on a predatory channel."

The volunteer, who joined the Trusted Flagger programme in 2014, said that the time it takes for YouTube to act on his reports has steadily increased over the time he's been involved in the programme.

"It's been an ongoing issue since I joined, with the average report I send directly to staff taking three months to be reviewed. Over the last year, it has been significantly worse and as a result of this I still have reports outstanding from last year," he says. For example, the volunteer says, he is still awaiting responses on more than 9,000 complaints he made in December 2016.

"They [YouTube] have systematically failed to allocate the necessary resources, technology and labour to even do the minimum of reviewing reports of child predators in an adequate timeframe," he says. "There also seems to be an overall lack of understanding regarding predatory activity on the platform and that thousands of children are being targeted and manipulated.

"YouTube has inadvertently become a portal of access to children for paedophiles around the world," he says. "In the long term, YouTube needs to change their stance from being reactive to proactive."