Not activists and not entirely black, but I know that Youtube works with the Metropolitan Police in the UK to remove drill rap videos that the Met considers too violent (violent content, of course, being one of the things that is against Youtube's TOS).
(As always, content removal is a controversial topic, and that includes this topic.)
Youtube has been stung by various "scandals" in the past where advertisers temporarily "boycotted" (or, at least, raised a ruckus) when an ad appeared next to content the advertisers considered objectionable (such as, as I recall, child exploitation content and al-Qaeda / jihadist content several years ago). As an ad-supported platform, I do think Youtube should have the right to monitor their platform and remove content they believe is bad for business, including protecting advertiser brands from content they don't want to appear next to. Youtube also has legal requirements to follow (eg copyright law) and may do some things to manage their own brand, or even to manage legal liability. It would be nice if Youtube were more transparent about the whys of their content management, but as a private company they are not obligated to be.
Youtube is not the last word in streaming video. There are several platforms dedicated to hosting videos that would fall afoul of Youtube's content policy. Many of the ones I can think of are not funded by advertising, though some platforms are big enough that "niche advertising networks" are possible (eg big adult video networks like Pornhub).
https://www.vice.com/en/article/bvnp8v/met-police-youtube-dr...