"Look, here's a pretty way to configure git! This can also be done in these 17 other ways, wheeeee! It's 2012 and the top article is how to configure colours in 'git log'"
Seriously, boffins. Stop posting this stuff already. Better techniques are nice, but only good if they're revolutionary (like that Light Table editor). We need more new, ground-breaking stuff and fewer arrangements of Linus's Lego bricks.
There's the annual calendar of "news" stories pertaining to major holidays, sports events, and political cycles.
If you've lived a few years (or decades) in any particular area, you'll find a recycled set of "places to go, things to do, sights to see" articles.
If you've lived through a few economic cycles, there's a very predictable trend through boom, cusp, bust, downturn, recovery cycles (NB: still in the downturn).
News is cyclic. Not everyone knows everything. Reiterating the basics every so often is necessary.
The challenge isn't avoiding repetition altogether, but getting the balance right.
I generally hate Microsoft stuff (including Windows 8), but this is a refreshingly original and good design (personal opinion, of course). Since the Zune and Xbox, Microsoft seem to have sharpened their game to the point where they are capable of delivering small, compact, useful hardware that functions well. Some people mocked the Microsoft mouse in the initial stages of the presentation, but come to think of it, most people I know prefer Microsoft's ergonomic laptop mice to Apple's Mighty Mouse, which, quite frankly, is rubbish.
Priced at the right point, this could take aim at a number of different devices; it seems to slot into the sweet spot between:
- Stylus-capable tablets you can actually write on (IBM X series)
- Small media-capable tablets like the iPad/Kindle Fire (for consuming eBooks, media and the Web).
- eBook readers like the standard Kindle. I don't know exactly what "digital ink capable" means, but if it goes any way towards making eBooks more readable than they are on the Fire/iPad, consumers will buy this device just so they don't have to buy separate things for watching videos and for reading books with less eyestrain.
I bet that both this and the Lumia and other Windows phones are going to be massive in markets like India, which know Microsoft and Nokia well, and have never seen much of Apple tech.
AFAIK Chrome has never detected RSS correctly for me. I assumed feed handling was specific to Mozilla, with their "we should probably provide a basic version of everything - newsreader, FTP client, etc. - even if it makes the browser a bit more bloated" philosophy. Bit like emacs.
Chrome is a bit like vi: if you want more stuff, there's probably some sort of extension.
Sorry for the emacs/vi analogies, I'm not trying to flame :)
Well, no, the extension only allows you to subscribe to an RSS feed, whereas the aforementioned feature in Chrome was that it would open the RSS feed inline.
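For context on what "detecting RSS" means here: browsers that surface a page's feed typically do it by scanning the page's head for autodiscovery links. A minimal sketch of that scan, using Python's standard-library HTML parser (the sample page markup is hypothetical):

```python
# Sketch: how a browser typically discovers a page's RSS/Atom feed.
# It scans <head> for <link rel="alternate"> tags with a feed MIME type.
from html.parser import HTMLParser

FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.feeds = []  # hrefs of discovered feeds

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if a.get("rel") == "alternate" and a.get("type") in FEED_TYPES:
            self.feeds.append(a.get("href"))

page = """<html><head>
<link rel="alternate" type="application/rss+xml" href="/feed.xml">
</head><body></body></html>"""

finder = FeedLinkFinder()
finder.feed(page)
print(finder.feeds)  # ['/feed.xml']
```

Whether the browser then renders the feed inline or only offers a subscribe action is exactly the difference being argued about above.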
Shit happens. Don't use AWS as your only platform, you will get burned sometime. Guaranteed, you will also get burned if you try to host and run your own stuff. How competent you are determines which way you get burned less.
Actually, starting right now, AWS is probably your best bet.
Old story about Chuck Yeager from the 1950s: one time shortly after take-off, Yeager's aircraft suffered an engine failure, and he had to make an emergency semi-crash landing. When he realized that a mechanic had put the wrong type of fuel in the plane, he went looking for the guy. The mechanic apologized profusely and said he would resign and never work in aviation again. Yeager replied something along the lines of "Nonsense. In fact, I need someone to refuel my plane right now, and I want you to be the one to fuel it. Because of all the guys here, I know you'll be the one guy who'll be sure to do it right."
This is mentioned in 'How to Win Friends and Influence People', where the anecdote is about Bob Hoover and jet fuel put in a WW2 plane. It is used as an example that it is easy to criticize and complain, but that it takes character to be understanding.
Or, if you use AWS as your only platform, accept that shit will happen from time to time. Unless your application is a matter of life and death, or unless billions of dollars are at stake, a little downtime now and then probably isn't that big a deal. (All my sites went down when Heroku did (including railstutorial.org, which pays my bills), but the losses are acceptable given the convenience of not having to run my own servers.)
I think it's reasonable to escalate criticism of Heroku for remaining in a single AZ. They have had plenty of time and resources to fix this, and haven't, despite being quite competent. I don't know if it is that they don't think it's necessary (due to the profile of their current customers) or what, but I wouldn't use Heroku for anything as long as they remain in a single AZ, and would be really reluctant to advise other people to do so. I obviously really like the Heroku team and product and would love to use them otherwise.
It wouldn't even need to be true seamless failover across AZs right away -- just offering a us-west and us-east Heroku would be enough for me, with shared nothing (maybe billing, or not even that), and then figure out redundancy yourself inside your app. Multiple regions is WAY better than multiple AZs within a region, too -- both for reliability and for locality.
Obviously a real seamless multi AZ/multi region solution would be much more technically impressive, useful to users, and Heroku-like, but they shouldn't let the perfect be the enemy of the good here.
While I'd agree with the general premise that diversification is a good thing in platform use if high-availability is a requirement, given that this outage was single-AZ, this particular outage should really highlight the point that your application should be multi-AZ scaled if it needs to be up.
More accurately: “don't trust any single data center”. All of the people who complained were directly ignoring Amazon's own advice, not to mention decades of engineering experience.
Going multi-AZ, multi-region or multi-cloud will help, each step up that list being significantly more work for increasingly small returns.
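The simplest form of "don't trust any single data center" is client-side failover: keep a priority-ordered list of redundant endpoints and fall through to the next on failure. A minimal sketch, with hypothetical endpoint names and a simulated us-east outage standing in for a real network call:

```python
# Minimal sketch of client-side regional failover (hypothetical endpoints).
# Try each region in priority order; fall through to the next on failure.
def fetch_with_failover(endpoints, fetch):
    """endpoints: list of base URLs, most-preferred first.
    fetch: callable that returns a response or raises on failure."""
    last_err = None
    for url in endpoints:
        try:
            return fetch(url)
        except Exception as err:  # in practice, catch specific network errors
            last_err = err
    raise RuntimeError("all regions down") from last_err

# Simulated outage: us-east is down, us-west answers.
def fake_fetch(url):
    if "us-east" in url:
        raise ConnectionError("AZ outage")
    return f"200 OK from {url}"

result = fetch_with_failover(
    ["https://us-east.example.com", "https://us-west.example.com"],
    fake_fetch,
)
print(result)  # 200 OK from https://us-west.example.com
```

Real deployments push this into DNS or a load balancer with health checks, but the shared-nothing us-east/us-west setup described above is essentially this logic at a larger scale.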
Yes. Also, stop navel-gazing (usually that means stop reading Hacker News). Stop commenting on Hacker News as well. Funny thing about the Singularity/aliens/heaven--it'll come even if you don't spend a lot of time worrying about it.
Oh cracking - missed the point completely and it's on the front page of Hacker News? My trousers could write more insightful articles.
Point: If you can squeeze a couple more years of service out of an upgradable laptop by maxing out the RAM, you should probably do so, since the alternative is basically throwing it away as landfill (note that the iFixit article said that claims about the body being recyclable were basically bunkum).
So basically, every time you "never upgraded a laptop anyways, and that's OK", God kills an environmentally friendly kitten. Or something like that.
Point: If you can squeeze a couple more years of service out of an upgradable laptop by maxing out the RAM, you should probably do so, since the alternative is basically throwing it away as landfill (note that the iFixit article said that claims about the body being recyclable were basically bunkum).
Sorry, but even this strikes me as dangerously close to FUD. A generic recycler couldn't be bothered to separate the screen glass from the lid, but I doubt there's any reason why Apple couldn't figure out a recycling strategy that would work. (And they basically cover the FedEx fee for you to send it back to them.)
"The design may well be comprised of “highly recyclable aluminum and glass” — but my friends in the electronics recycling industry tell me they have no way of recycling aluminum that has glass glued to it like Apple did with both this machine and the recent iPad."
I took this to mean that recycling was essentially impossible (by Apple or anyone else). Perhaps Apple can indeed figure out a recycling strategy. I doubt they will spend any time on it unless prompted by adverse publicity (like this article is generating).
but my friends in the electronics recycling industry tell me they have no way of recycling aluminum that has glass glued to it like Apple did with both this machine and the recent iPad.
I took this to mean that recycling was essentially impossible (by Apple or anyone else).
Well, as you basically admit below, the economics and motivations of most recyclers are not going to be the same as Apple's. In any case, this doesn't mean that they won't take any aluminum that has ever had glass glued to it - it just means that they won't take something with the glass still glued to it.
I'll bet you $1000 that I could get the screen glass off of a Retina Macbook Pro with a hammer and an angle grinder or another common shop tool. (You supply the Macbook.) Most anything I can do inconveniently and dangerously at a hackerspace with an ordinary shop tool could be done with greater safety and lower unit cost with an appropriate custom tool.
Perhaps Apple can indeed figure out a recycling strategy. I doubt they will spend any time on it unless prompted by adverse publicity (like this article is generating).
So you don't know, and neither does your source. All you have is supposition, for which you leave yourself an out.
PROTIP: Tracey Emin is a Professor at the Royal Academy. It's art that a lot of the art world considers "good art".
This Emin person has gained tons of attention through displays so ridiculous that people thought they must be brilliant, since no one had thought of doing something so utterly devoid of talent before.
From the Wikipedia article:
"In 1999 she was a Turner Prize nominee and exhibited My Bed — an installation, consisting of her own unmade dirty bed with used condoms and blood-stained underwear."
And engineers are supposed to be the scruffy, smelly, disgusting masses of humanity.
The scene represented a point of serious depression (contemplating suicide) and realisation/rebirth for the artist. Some might see it and think of dirty engineers or needing to clean up. Others might see that a bed can be a site for pleasure, pain, love, resentment, physical activity, rest, birth and death.
Each to their own. Some may see weeds growing in cracks on the side of the road and think it looks messy and needs poison. I always see the evolution of life, incredible ways that dog-eat-dog life in a godless world can see some measure of success against the odds and think about how a weed may be different to different people. (Then I poison it.)
"The name 'Stuckism' was coined in January 1999 by Charles Thomson in response to a poem read to him several times by Billy Childish. In it, Childish recites that his former girlfriend, Tracey Emin had said he was 'stuck! stuck! stuck!' with his art, poetry and music."
Dammit, this is going to lead to a whole new set of questions from confused family members.
"How do I get to Start > Programs > [Some app which was made for usage on Windows XP]? I can't find the Start button, this new Windows is no good. grumble"
There's nothing particularly amazing about watching a man try to navigate a system he hasn't been taught how to navigate. Most of these kinds of experiments tend to end the same way, unless the system includes metaphors the users are already well accustomed to using.
I don't think there's any question Microsoft will include an extensive tutorial in Windows 8's final release. If they don't, we can safely laugh at them. As it is, though, this video is rubbish. I could make a similar video filming my grandmother trying to use Windows 7 (she only knows how to use a specific type of Web TV, sans mouse). I guarantee you we'd see similar results.
Windows has a massive install base. One of Microsoft's most important tasks will be to train their current users to make the transition to this new UI. It's not a small feat. But this video completely sidesteps the most important part in the process, and instead asks a man to use a novel, manufactured system with nothing but his natural inclinations. On those grounds, I think he does quite well. But the video is rubbish.
There's another video at http://www.youtube.com/watch?v=XeeOkHjV7nM where he is asked to use a mac for the first time in his life. The guy has apparently used XP for many years and is a die-hard windows fan. Watch how quickly he is able to pick it up (with zero instruction) in comparison to Win8.
The Win8 start screen isn't hard to get to - if you know how. I'll have no trouble with it, you'll have no trouble with it, and anyone who is given a tutorial or basic training will have no trouble with it.
But I'll bet there's going to be millions of users who find themselves in the same situation as the guy in the video, and it's going to be a nightmare for any company who tries to deploy this to all their users.
For me the big take-away is this:
Important user interface elements should be visible on screen.
I'm troubled by the invisible UI stuff, including the hot corners. And I'm worried because knowledge of these features is required to operate the system at a basic level. That's kind of frightening. Not insurmountable, but frightening.
I think OSX's interface elements are more approachable than these hidden elements, given his (and most people's) prior experience. If you've used Windows, you're not going to be a stranger to drop-down menus or desktop-like icons (a la the dock).
And I agree that the invisible user interface elements will be a nightmare for IT people around the world. But to be frank, I don't want to restrict change to things that make IT folks happy ;) (Of course, I'm not a company worth a few hundred billion dollars whose livelihood depends on enterprise acceptance...)
I'm interested in seeing how long it will take for these 'new' UI concepts, where screen edges and corners are elements to be touched and modified, to sink into the general consciousness. It seems to have sunk into the OSX world rather quickly. Now it's time to see how the other 90% cope with it.
I remember reading a Wikipedia article on the Bloop about half a year ago, and none of these other unexplained sounds were on it. Seems like NOAA fanboys dug up and wrote up the others.
That being said NOAA fanboys are far more preferable to Apple fanboys. :)
cost to invent drugs = identifying compounds (90%, say) + determining manufacturing process (10%)
^ Where are you getting these numbers from? They seem way too simplistic, and your argument essentially falls apart without them.
Real 'invention cost' would be determined by far more factors
- cost of raw materials
- labour cost (I bet it's cheaper for pharmaceutical companies to pay workers much less in India too),
- whether your competitors are targeting the same market (if these companies stopped researching said drugs, Cipla would come up with a cheaper way to do the R&D required, and then make a killing, even with their low rates, since they would have cornered the market.)
Plus, the goodwill of governments of developing countries that are offered these medicines at low rates may lead to fruitful collaboration with national research institutes in those countries, thereby lessening the need for enormous profit margins anyway.
As someone said below, the fact that there is more money to be made selling iPads rather than bread does not mean that everyone switches to making iPads.
Of 5,000 new compounds, you'll find 250 which are interesting enough to test in a lab (animals or in vitro), 5 which are interesting enough to test on humans, and 1 which gets approved.
It costs peanuts to find the 5,000 new compounds, and a little bit to figure out which ones are interesting (say, $50,000 each, about $250 million in total). Those 250 interesting drugs will cost a million each to test - subtotal $250 million. The 5 drugs which are tested on humans cost a fair bit (say $50 million each - $250 million for all 5). Getting the final compound approved takes a lot too, because you need a massive trial.
All up, it's about $1 billion per drug.
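Totting up the pipeline arithmetic above (the approval-trial figure is an assumption, since the comment gives no number for it, only the ~$1B total):

```python
# The drug-pipeline arithmetic above, using the comment's own rough numbers.
screened = 5_000    # compounds screened
lab_tested = 250    # interesting enough for lab/animal testing
human_tested = 5    # reach human trials

screening_cost = 50_000 * screened       # $50k per compound screened
lab_cost = 1_000_000 * lab_tested        # $1M per lab-tested drug
trial_cost = 50_000_000 * human_tested   # $50M per human trial
approval_cost = 250_000_000              # assumed: the final massive trial

total = screening_cost + lab_cost + trial_cost + approval_cost
print(f"${total / 1e9:.2f}B per approved drug")  # $1.00B per approved drug
```

Note that each stage contributes roughly equally (~$250M), so the screening of thousands of dead-end compounds is as big a cost driver as the trials themselves.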
If you want to reverse engineer it, it's about $10 million for a chemical engineer to read the publicly available formula, figure out how to synthesize it, and set up a small plant.
Whether it's a new drug or a drug you copied, it costs a few cents in labor and materials to make each dose once the factory is built. Yes, India could knock $0.01 off each tablet by employing cheaper factory techs. But no one cares about saving $0.01 on a $1 product.
India could do the R&D cheaper, but not a lot cheaper. It's like building an OS - you need experienced people who know what they are doing, not just cheap process workers.
Ripping off US companies isn't a bad idea, because it lets Indian workers gain more experience, which will help them create better R&D jobs. In the long run, this might even be good for the US, because Indian R&D could create a lot of good drugs for the US to buy.
Bloomberg "learning to code", Obama directing agencies to have an API.
All part of politicians (particularly Democrats) trying to look like they have a clue. Give up already, for heaven's sake, and get back to managing the deficit.