It strikes me that this developer chooses to write an abstraction on top of JS to solve his recurring problems, rather than simply using the language features readily available.
Promises are as much a pattern or mechanic as callbacks, but the latter feels far more natural in JS. The `promised` decorator is interesting, but it has problems:
- The application I work on, and have in mind with this, would need just about every function decorated.
- It doesn't account for methods, unless you're default decorating the method, i.e.: `Foo.prototype.myMethod = promised(function(/* ... */) { /* ... */ });`
- Every decorated function takes a noticeable performance hit, because things like `Function#apply` and concatenating `arguments` are relatively slow.
- It doesn't sit well with me that the example rewrites a method on a prototype declared elsewhere: `console.log = promised(console.log)`
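For context, here is a minimal stand-in for what a `promised`-style decorator does (this is an illustrative sketch using native Promises, not the library's actual code). Note the `arguments` handling and `Function#apply` call that the performance concern above refers to:

```javascript
// Minimal sketch of a promised-style decorator: waits for all arguments
// to resolve before invoking the wrapped function with the plain values.
function promised(f) {
  return function () {
    var args = Array.prototype.slice.call(arguments); // concatenating arguments
    var self = this;
    return Promise.all(args).then(function (values) {
      return f.apply(self, values); // Function#apply on every call
    });
  };
}

// A decorated function accepts plain values and promises interchangeably:
var sum = promised(function (a, b) { return a + b; });
sum(1, Promise.resolve(2)).then(function (total) {
  console.log(total); // 3
});
```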
Further, the article and library don't even scratch the surface of complicated async flows. Think async versions of common functional-style methods like `map`, `reduce`, etc.
For example, a basic scenario from our own build process is: Scan a directory for template files, read them, compile them, then concatenate the result and write it out.
We used to have a promise library to do all of this, from handling a build process to performing database queries. I discovered Async.js at some point and haven't looked back since: https://github.com/caolan/async
First of all, I implement the abstraction using language features, and therefore take advantage of them.
- I favor maintainability over performance. Also keep in mind that promised functions take promises as arguments and can't do much until they're fulfilled (the associated IO is done), so that small performance hit is insignificant in most cases.
- You can write your own decorators to wrap constructors and their methods if you need to. That being said, I'd recommend against it; mixing mutable state with logic does no good in the long run. You'll be better off going functional.
- As for map / reduce, promises represent eventual values, not sequences of them. For that there are streams, and I have explored that area as well: https://github.com/Gozala/streamer/wiki/stream
I have not written about it because I didn't think it was a good idea to dump everything in one post.
Promises are likely to be baked into JavaScript at some point in the future[1], so I would recommend becoming familiar with them now. Even if you don't use the pattern personally you will likely run into libraries that do use it once it is part of the spec.
Just to be clear, this library implements a subset of Q with the exact same API, the only addition being the `promised` wrapper. I'm convinced that this wrapper is a better way to deal with promises than dozens of utility functions that you have to learn about.
For example, `Q.all` is `promised(Array)`; I find the latter more intuitive.
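That equivalence can be sketched with a minimal `promised` written against native Promises (an illustration, not the library's implementation): wrapping the `Array` constructor yields a function that collects resolved values into an array, which is exactly what `Q.all` does.

```javascript
// Minimal promised(): resolve all arguments, then apply the wrapped function.
function promised(f) {
  return (...args) => Promise.all(args).then(values => f.apply(null, values));
}

// Array(a, b, c) returns [a, b, c], so promised(Array) behaves like Q.all:
const all = promised(Array);
all(Promise.resolve(1), 2, Promise.resolve(3))
  .then(values => console.log(values)); // [ 1, 2, 3 ]
```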
That's not to say don't use Q! Q is a brilliant piece of software, and I'd be more than happy to see more people using it.
On a site called 'Hacker News', it's not entirely unreasonable to expect that a user may have the skills and inclination to contribute back to a project like the Linux kernel. Especially when it's a problem that affects them directly.
I wonder why the HN account name field is optional for getting into the shared circle.
Currently, it looks like anyone can add themselves just by having an email and G+ account. Someone already mentioned OAuth, and it'd be great if proper auth against both G+ and HN was required to be added to the circle.
On the other hand, it's also not that hard to selectively toss someone out of your copy of the circle, even right from the stream page.
Well, Google wouldn't replace their login process because they are the BrowserID primary. If you have a gmail account, BrowserID expects you to be logged in with it, which essentially means you're logged in to Google. (They could add it for non-gmail Google accounts, I guess.)
But Facebook could benefit from this. Maybe not at this early stage, but the way you log into Facebook is using your email account. That's exactly the step BrowserID wants to make easier.
I think the intention is for the primary to be your email provider. So if they become compromised to that extent, then I wouldn't feel very safe about my email account in the first place. Pretty much all of my credentials everywhere depend on that.
Until secondaries go away, Mozilla seems like a very competent and trustworthy organization to have in charge of browserid.org, IMHO. Much better than even Google. It's great to see that even the branding on browserid.org is minimal.
My guess is that, concerning nonces and revocation, they didn't consider the current situation (OpenID, OAuth for login, etc.) any better. BrowserID doesn't seem to do away with the strong advice to run HTTPS for such sites.
LTS releases are actually pretty boring, for obvious reasons.
Unity is probably getting more usable. It's now got a pretty built-in backup tool. The App store thingy is better. There's something called juju, which is a "DevOps" branded deployment package, or something. It uses the words "cloud", "DevOps", "charms" (wtf?), and so on, and is probably just a wrapper on apt-get, but hey, we all need new buzzwords (and I'll look into it if it stops me shooting myself in the foot too much).
Also, the lightweight distros are probably getting to the "heavy enough to use, but not too Spartan" stage that the Gnome haters want, but will become bloated in 2 years' time (and the anti-bloat crowd will jump onto xmonad or something ... why can't they just maximize a terminal?).
Juju is for quickly assembling servers or clusters on AWS or any OpenStack-compatible platform. The charms are what you use to define what the machines provide and what they require to work in order to let juju assemble the whole thing for you. I was with some of the guys working on it until yesterday, at the PythonBrasil conference. They made a couple presentations on it.
As a Dutch citizen, I am ashamed our own government's CA was compromised. And I'm a bit angry, because this hardly concerns just us but the entire secure web.
Frankly, I'm hoping for a lot more than just damages.
No, the Dutch root certificates were not compromised, insofar as I am aware.
Root certs for services provided to the Dutch public were compromised insofar as they were distributed through DigiNotar. The Dutch government has entirely different root certificates, and that is where they are currently issuing certificates from to fix the various services that were using the DigiNotar certs.