1. You run your existing spambot software through PhantomJS.
2. Your unmodified bot fills in all the visible fields, and the WebKit backend transparently computes your hashes and passes the other automated JavaScript "human" tests; not a single line of bot code changes.
3. Again, your existing "stupid" spambot code submits the form, and the target site is now overrun by spam (a sketch of this follows below).
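To make that concrete, here is a minimal PhantomJS sketch of steps 1-3. The URL, field names, and selectors are invented for illustration; the point is only that page.evaluate() runs inside a real WebKit engine, so any client-side "human" checks execute on their own:

    // Minimal sketch; the URL, field names, and selectors are
    // invented for illustration.
    var page = require('webpage').create();

    page.open('http://example.com/signup', function (status) {
        if (status !== 'success') {
            phantom.exit(1);
        }
        page.evaluate(function () {
            // Fill the visible fields exactly as a "stupid" bot would.
            document.querySelector('input[name="email"]').value = 'bot@example.com';
            document.querySelector('textarea[name="comment"]').value = 'spam goes here';
            // Any onsubmit hashing or JS "human" test runs in the real
            // WebKit engine, so it passes without the bot knowing it exists.
            document.querySelector('form').submit();
        });
        // Give the submit and any async JS a moment to finish, then quit.
        setTimeout(function () { phantom.exit(0); }, 3000);
    });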
With CAPTCHA, you get an image and a unique ID that is validated at the server. Sure, you could run it through Mechanical Turk, but I'm guessing that a few CPU cycles to load a WebKit backend is still vastly cheaper than farming work out to MechTurk.
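For contrast, the server-side half of that flow looks roughly like this. A minimal sketch assuming an in-memory store; makeChallenge and checkAnswer are invented names for illustration, not any real library's API:

    // Minimal sketch of the image + unique-ID flow, assuming an
    // in-memory store; these are invented names, not a real API.
    var crypto = require('crypto');

    var store = {}; // challenge id -> expected answer

    function makeChallenge(answerText) {
        var id = crypto.randomBytes(16).toString('hex');
        store[id] = answerText;  // the answer only ever lives server-side
        return id;               // the client gets the id plus the rendered image
    }

    function checkAnswer(id, userInput) {
        var expected = store[id];
        delete store[id];        // one attempt per id, so replays fail
        return expected !== undefined && expected === userInput;
    }

A headless browser gains nothing here: the expected answer never reaches the client, so executing the page's JavaScript doesn't help.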
My point is that you wouldn't even have to change your spambot software to defeat these "new" validations; they can be trivially overcome, unlike Mechanical Turk + reCAPTCHA. Add to that the benefit of targeting sites that are relatively spam-free, and you have a real incentive for spammers to simply plug in PhantomJS instead of using WWW::Mechanize or what have you.
The point is that all these measures are 'trivial' to break, and so are captchas. Except with captchas you impose a burden on your user, while with the other techniques you offload that burden onto the developer. I'm not sure what the 'existing' part of 'existing spambot' has to do with it: the time it would take to add farmed captcha solving is marginal (you don't even have to Mechanical Turk it; most captchas are broken by OCR software readily available on the underground market anyway).
captcha = sign of a clueless or lazy (or both) developer. I don't put up with them anymore, and I have yet to encounter a single registration I actually need that uses a captcha. I'm not the only one, either.