Hacker News

What can I tell you here? You're right, of course: most apps don't care about performance at all, either in space efficiency or time efficiency. Those apps shouldn't use C, and should certainly use a GC'd language.

I'm only reacting to the idea that (pp) "because it's 2011, nobody should be using ARC". Well, a statistically tiny number of apps may require the bare-metal performance that hand allocation provides, but they're also disproportionately important apps.



Ah, I read it more as 'It's 2011, where's my flying car/why aren't just about all apps running in managed environments'. Sort of like saying, 'It's 2011, why did syslogd just brick my server'. And it's not a completely lisp-machiny, neckbeardy sentiment; ten-odd years ago everyone was telling us the flying car was just around the corner - Apple was busily trying to bridge Java into Rhapsody, Microsoft was working on .NET/CLR. And yet, and yet...


For the last few years I thought that Objective-C was perhaps not future-proof enough, as MS pushed .NET more and more toward being desktop-class.

And now MS shows WinRT that is not .Net. Hmmm... :)


I think eropple's point was that ARC is not a very high-level sort of approach to GC, i.e., not Lisp-style exact GC.


That's right (sorry, forgot about this thread). Not saying that ARC is bad, for where it's used--but it doesn't strike me as terribly "high-level".

(Neither, though, does C++, and smart pointers don't help there either. ;) )



