A feature being rolled out in the US displays notices below videos uploaded by news broadcasters that receive government or public money, according to a blog post by YouTube News senior product manager Geoff Samek.
"Our goal is to equip users with additional information to help them better understand the sources of news content that they choose to watch on YouTube," Samek said.
"News is an important vertical for us and we want to be sure to get it right."
The move is likely to affect videos from services such as Russia-backed RT, which critics call a propaganda outlet for Moscow, though it will not be limited to such outlets.
The blog post included a screenshot with a disclaimer about the US government-funded Radio Free Asia. The flagging may also apply to state-chartered news organizations such as the BBC and AFP, and US-based public broadcasters.
Notices displayed with state-sponsored news broadcasts will include links to the Wikipedia online encyclopedia so viewers can learn more about the agencies behind the reports, according to Samek.
The feature is nascent and will be refined based on feedback from users.
YouTube made a series of changes last year intended to "better surface authoritative news," according to Samek.
YouTube priorities for this year include tightening rules and enforcing them more consistently across the service, according to chief executive Susan Wojcicki.
"The same creativity and unpredictability that makes YouTube so rewarding can also lead to unfortunate events where we need to take a clear, informed, and principled stance," Wojcicki said in an online post.
"We realize we have a serious social responsibility to get these emerging policy issues right."
Solutions being worked on include smarter software and more human review of videos uploaded to YouTube, according to Wojcicki.
The number of workers at YouTube and Google focused on content that might violate policies was to increase to more than 10,000.
"We're also currently developing policies that would lead to consequences if a creator does something egregious that causes significant harm to our community as a whole," Wojcicki said.
YouTube last month announced tougher rules on which videos are eligible to run ads, as it scrambled to quell concerns from brands about being paired with troublesome content.
YouTube late last year pulled 150,000 videos of children after lewd comments about them were posted by viewers.