YouTube is introducing tougher requirements for video publishers who want to make money from its platform.
In addition, it has said staff will manually review all clips before they are added to a premium service that pairs big brand advertisers with popular content.
The moves follow a series of advertiser boycotts and a controversial vlog that featured an apparent suicide victim.
One expert said that the Google-owned service had been slow to react.
"Google presents the impression of acting reactively rather than proactively," said Mark Mulligan, from the consultancy Midia Research.
"It needs to get better at acting faster."
Subscriber count
The first part of the new strategy involves stricter requirements that publishers must fulfil before they can make money from their uploads.
Clips will no longer have adverts attached unless the publisher meets two criteria – that they have:
- at least 1,000 subscribers
- more than 4,000 hours of their content viewed by others within the past 12 months
YouTube said that this represented a "higher standard" than the previous requirement of 10,000 lifetime views, which was introduced nine months ago.
It blogged that this should help it combat "spammers, impersonators, and other bad actors" and prevent "potentially inappropriate videos from monetising, which can hurt revenue for everyone".
YouTube faced a backlash from many of its creators, dubbed the "adpocalypse", last year when it prevented videos on certain topics from carrying adverts.
This was in response to more than 200 major brands pulling campaigns over concern their ads had been attached to clips featuring hate speech and other extremist content.
Mr Mulligan suggested the latest change should prove less controversial.
"In terms of impact on creators, this makes taking that first step to commercial status that bit harder to reach," he told the BBC.
"But given how much the platform is growing, higher benchmarks will be easier to meet now than they were a few years ago."
Human reviewers
The second part of the effort focuses on the Google Preferred programme.
This lets brands pay extra to attach their adverts to the top 5% of videos most popular with 18- to 34-year-olds.
Until now, the process was automated.
But YouTube said it would manually review all relevant content by the end of March.
In theory, this process would have alerted YouTube to a controversial clip by vlogger Logan Paul at an earlier stage.
At the end of last year, the 22-year-old American showed what appeared to be a dead man's body hanging from a tree in Japan's Aokigahara forest in one of his videos.
Mr Paul – who has more than 15 million subscribers – was excluded from Google Preferred last week as a consequence.
YouTube had previously announced that it planned to have more than 10,000 workers reviewing clips in general on the service by the end of 2018.
YouTube appears to acknowledge that further steps will be necessary to prevent a similar scandal in the future, and said it intended to "schedule conversations with our creators in the months ahead" to discuss ways to address the problem.
But Mr Mulligan suggested that YouTube still faced a fundamental problem.
"The Logan Paul experience highlights the risk that young creators like Logan have been shorn of the structure that their peers in traditional media have: the people to advise, guide and mentor them," he said.
Analysis:
By Dave Lee, North America technology reporter
Manually and proactively reviewing videos on its most popular channels opens up a whole range of potential issues, even if that stipulation will apply only to those on its Google Preferred programme.
Most notably, it removes YouTube's ability to duck behind its "as soon as we were made aware" defence when removing inappropriate content.
Much like a traditional media company, it will need to make decency judgements, and the most skilled team in the world won't get it right 100% of the time.
And when these hard decisions are made, don't expect YouTubers to like what the moderation team decides.
Whether justified or not, YouTube will find itself accused of political bias, discrimination, racism and homophobia at various stages of this process.
By reducing the number of channels able to monetise, YouTube shifts its moderation task from impossible to merely incredibly difficult.
YouTube will be watching its algorithms closely.
It will analyse how often the algorithms miss something, and if it becomes a sufficiently rare occurrence, expect this human moderation layer to be removed as soon as YouTube feels it's ready to take the risk.