Hacker News

The act of publishing a book or an article (edit: or a cartoon) has occasionally led to far-reaching, sometimes violent, unforeseen and unpredictable systemic consequences, sometimes bringing down governments.

Yet Taleb has published several, apparently with no concern whatsoever about his own precautionary principle.



Not at all. The point of his book is to educate people and reframe how they think about risk, so he achieved exactly what he intended. If one of those goals is to bring about a more risk-based understanding of GMOs, which in turn makes people more cautious from a scientific perspective, then mission accomplished.

The fundamental problem with releasing GMOs into the environment is that there's no plan B for screwing up the earth or your health, so we have to be extra, extra conservative. Even if only 1 in a million GMOs turns out to have disastrous consequences, given enough development and use we will eventually create something with unforeseen consequences that passes whatever safety standards we have. And given the self-replicating nature of biology, it will be everyone's problem instead of a localized disaster.
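The "eventually" in that argument can be made concrete with basic probability. A minimal sketch, assuming independent releases and taking the comment's 1-in-a-million rate at face value (the release count of one million is an illustrative assumption, not a figure from the comment):

```python
def ruin_probability(p: float, n: int) -> float:
    """Probability that at least one of n independent releases is
    disastrous, given a per-release disaster probability p."""
    return 1 - (1 - p) ** n

# With a 1-in-a-million per-release risk and a million releases,
# the chance of at least one disaster is already about 63%
# (it approaches 1 - 1/e as n grows toward 1/p):
print(round(ruin_probability(1e-6, 1_000_000), 2))
```

The point the sketch illustrates is that a tiny per-event probability does not stay tiny under unbounded repetition, which is why the comment argues a one-time, irreversible downside changes the calculus.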

The people claiming we have sufficient scientific evidence for GMO safety miss this key point: our usual standards of evidence must be orders of magnitude higher when what is being risked is the planet.


This is a point of view that seems to rest on the idea that "conventional" human agriculture isn't a disaster for the planet. But it manifestly is.


Well before there were GMOs, there was the fertilizer bloom at the mouth of the Mississippi, which is a clear disaster: http://science.nasa.gov/earth-science/oceanography/living-oc....


Who proposed the precautionary principle, and whether they've acted in accordance with it, is irrelevant when considering whether it should apply to GMO food.


Bringing down a government does not belong to the class of consequences the precautionary principle is applied to.

The PP is invoked when a system, in this case humanity, is at risk of total failure, i.e. ruin.



