Google has announced plans to increase its moderation staff to 10,000, following concerns over 'inappropriate content' on YouTube.
In recent weeks, the hugely popular video site - which is owned by Google - has faced criticism over content targeting or involving children.
Concerns over clips of 'scantily clad children' prompted several major advertisers to pull their ads from the site, while commentators have also highlighted examples of disturbing or inappropriate videos aimed at young audiences.
YouTube now says it will "take the steps necessary to protect our community" and stop "abuse of our platform".
Susan Wojcicki, CEO of YouTube, explained that human moderators are vital alongside software that automatically flags suspect content.
She said: "Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualised decisions on content."
She also pledged measures to protect both advertisers and video-makers from "bad actors" and inappropriate content.