Don't standard tools like PageRank help here, since new articles would need links from established sites to rank? Search engines already deal with https://en.wikipedia.org/wiki/Article_spinning, duplicate/near-duplicate content, low-quality content, and black-hat SEO tricks, so what's the difference here?
I suppose the difference here is that the content will look unique enough to fool these tools. Bots copy and paste stuff all the time, but now it seems it would be easier to fake a surge of unique human responses.
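To illustrate the point: a minimal sketch of one common near-duplicate heuristic, Jaccard similarity over word shingles (this is just a toy example, not any search engine's actual pipeline). Copy-pasted or lightly "spun" text still shares most shingles with the original and scores high, while a genuinely unique rewrite of the same idea shares none and scores low, which is why dedup checks miss machine-generated "original" text:

```python
def shingles(text, k=3):
    # Break text into overlapping k-word tuples ("shingles").
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    # Jaccard similarity: shared shingles / total distinct shingles.
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "the quick brown fox jumps over the lazy dog near the river"
spun     = "the quick brown fox leaps over the lazy dog near the river"
unique   = "a speedy auburn fox vaults across a sleepy hound by the water"

print(jaccard(original, spun))    # high: one swapped word leaves most shingles intact
print(jaccard(original, unique))  # 0.0: same meaning, zero shared shingles
```

A spam detector tuned to flag high scores catches the spun copy but has nothing to go on for the unique rewrite, even though both convey the same content.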