>> Experts said that even photos that are partially obscured, such as the image shared by the influencer, typically qualify as illegal child sexual abuse material, or CSAM.
>> In fact, the image in question had drawn more than 3 million views and 8,000 retweets, according to Twitter statistics on a cached version of the tweet from Tuesday.
* Researchers at the Stanford Internet Observatory say the company failed to deal with 40 items of child sexual abuse material (CSAM) over a two-month period between March and May this year.
* Research such as this is about to become far harder, or at any rate far more expensive, following Elon Musk's decision to start charging $42,000 per month for Twitter's previously free API.