Hacker News

It's easy to call Excel the cause, but it's not, any more than the existence of memory management is the cause of memory leaks.

Putting a very powerful (if archaic) tool in the hands of people without the experience or understanding to use it effectively and then passing around the results for editing by other people with similarly (or differently) deficient ability with the program is the problem.

The tool is simply too flexible to be used for many of the things it's doing, by many of the people who are using it. This is one of the good reasons that we (application developers) do our best to gather requirements: Excel is usually good at what it does, but it does so many things that understanding what it does that you don't need it to do, or how and why it does the things you've accidentally done or need to watch out for, has become at least as important as understanding what it was you wanted to do in the first place.

So yes... Excel plays a role in some very bad things. But no, it's not Excel's fault.



Are people taking this article seriously? I'm fairly certain that it's a tongue-in-cheek jab at various other articles recently blaming Excel for the Reinhart-Rogoff thing.

I don't know how anyone could see the last bit about Native Americans and think that the article was actually serious.


It's serious, though playful. Some of the cases are very interesting, particularly the one about Barclays getting burned by the lack of access control on data in spreadsheets.


the "hide vs delete" problem is incredibly prevalent in corporate financial statements.

Accountants preparing these documents often "hide" for a reason: they want to try out how certain numbers work, and want to keep their work-in-progress easily accessible without deleting it. And often corporate secretaries just send these documents out "as ready" without thinking that there are tons of confidential, and sometimes very wrong (or very incriminating), numbers "hidden" as work-in-progress.

This happens so often on Quarterly financial statements that it's an embarrassment, frankly. Disclaimer: I worked as a lowly proofreader of financial documents, quarterly and annual reports, etc.

The problem is not Access Control. The problem is people failing to recognize when a document is "ready for public consumption" vs an internal working draft.

The real story of "lacking access control" is when these Excel worksheets actually link into MS Access databases hosted inside the firewalls of these companies, and holes were poked in the firewall to let the data out because executives wanted to work on these documents in their lodge in Aspen, etc.

I see this all the time. It's really freaky when an Excel worksheet I was proofreading started reaching out over the net and into supposedly private networks deep within financial houses. When I ran something like Little Snitch, it showed access attempts over nonstandard ports, etc.

It's scary.


Well, of course "it's always a people problem" as Jerry Weinberg says. But the Barclays example is interesting* precisely for its technical detail. The spreadsheet author used Excel's "hide" feature to hide some data. Had Excel given them the ability to restrict read access to some portion of the sheet, it seems obvious that they would have done that instead.

* Interesting to me at least, because I work on things like this. Not contradicting your comment at all.


That's also not an Excel issue, but a process issue: giving VPs whatever they want, even when it's a bad idea.


I guess the point might be that where there is no peer review or quality assurance, the culture is prone to produce this kind of error. And Excel culture is most of the time a one-man show for every spreadsheet. They are hard to audit.


I agree. It seems to me that the article is taking the mickey out of these high-profile people and companies who are quick to place the blame on Excel. I would see most of these as simply human errors which could have been avoided had the people involved been more thorough. If they are able to model complex financial and economic problems in a spreadsheet, surely they are also capable of building safeguards and double checks into their sheets, or of having someone proofread the results. Evidently, they didn't... Tough luck.


Excel is part of the cause. It simply mangles and hides programming. So it creates just the atmosphere where people think they can "avoid" programming and use Excel, not realising that they are programming, just in a very bad way.

So don't try to hide and dumb down the programming part. This of course doesn't mean that mistakes wouldn't happen with better tools.

http://www.pages.drexel.edu/~bdm25/excel2007.pdf


> It simply mangles and hides programming. So it creates just the atmosphere where people think they can "avoid" programming and use Excel.

This attitude bothers me somewhat. I'll certainly agree that the combination of Excel's power and ease of use lets people model really quite complex calculations and scenarios quickly and easily, and that this very ease leads to an inappropriate level of "respect" for the gravity of what they are calculating, insufficient testing, missing double checks, etc., leading to some massive blunders.

But many people seem to go beyond this, asserting that one cannot do such things properly in Excel. I've actually heard one person with an absolutely straight face say that Excel should literally be taken away from engineers and finance folks, and any calculations they need to do should be custom implemented by "professional" developers who will do it "properly".

Thoughts?


People who make that suggestion don't know that there are tools like ResolverONE. It is a spreadsheet that your finance people can use as usual. But when it comes time to validate assumptions and check the calculations, a professional developer can take the generated Python code and write unit tests and other kinds of tests to make sure that it does what is intended. Creating spreadsheets could become a collaboration between professionals in finance and professionals in software development. The tools exist to allow this, but where is the will?
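To make that workflow concrete, here's a minimal sketch of the idea (not actual ResolverONE output; the function name and the compound-interest formula are illustrative): once a spreadsheet calculation exists as plain code, a developer can put an ordinary unit test around it.

```python
# Hypothetical sketch: a spreadsheet formula extracted as plain Python,
# so a developer can wrap it in tests the way they would any other code.

def future_value(principal, annual_rate, years, compounds_per_year=12):
    """What a cell like =P*(1+r/n)^(n*t) becomes once it is ordinary code."""
    r = annual_rate / compounds_per_year
    n = compounds_per_year * years
    return principal * (1 + r) ** n

# Tests the finance folks can read and the developers can run:
assert round(future_value(1000, 0.06, 1), 2) == 1061.68
assert future_value(1000, 0.0, 10) == 1000.0
```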


When modeling something where mistakes are expensive in Excel, a reasonable person checks their work. I've made Excel spreadsheets that are responsible for millions of dollars in inventory purchases. You better believe I check the shit out of those. I'm paranoid about making mistakes because a small number multiplied by a million is usually a dollar amount of consequence.

In my opinion, people who don't check their work just don't check their work. Whether they're using Excel or a "real programming language" doesn't matter. You can't blame Excel because people are reckless.


It's actually another reason people should learn programming basics along with reading, writing and algebra.


One company I worked at was making billion dollar decisions based on tools made in spreadsheets. It was the only option really, the instances where we did actually have the bespoke tools we wanted were even worse! There is no way whoever made this felt they were avoiding programming though - this buggy, unmaintainable monstrosity was three-quarters VBA. I bet someone on work experience or an internship thought "I know how I can make this tedious manual process easier for everyone!" and (with the very best of intentions) created something that no-one down the line would ever be in a position to fix or update.


I guess finance folks have to get themselves educated about test-driven development.


This reminds me of Edward Tufte's criticisms of PowerPoint [1,2]. Both PowerPoint and Excel are widely abused, in part due to their flexibility and initial ease of use.

However, in addition to problems caused by its users, there have also been problems resulting from design deficiencies in Excel: automatically and irreversibly converting strings representing gene names into dates [3], displaying some numbers incorrectly [4], and incorrectly evaluating some statistical procedures [5].

[1] http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=0...
[2] http://www.wired.com/wired/archive/11.09/ppt2.html
[3] http://www.biomedcentral.com/1471-2105/5/80
[4] http://www.joelonsoftware.com/items/2007/09/26b.html
[5] http://dl.acm.org/citation.cfm?id=635312
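The gene-name problem [3] is a good illustration of the class of bug: silent, irreversible type coercion on import. A minimal sketch in plain Python (using SEPT2 and MARCH1, two gene symbols the paper reports Excel mangles into dates) of the alternative: keep imported values as text until the caller converts them explicitly.

```python
import csv
import io

# Gene symbols like SEPT2 and MARCH1 look like dates to an eager importer.
raw = "gene,score\nSEPT2,0.9\nMARCH1,0.4\n"

# Python's csv module never guesses types: every field stays a string
# until the caller converts it deliberately.
rows = list(csv.DictReader(io.StringIO(raw)))
genes = [row["gene"] for row in rows]
scores = [float(row["score"]) for row in rows]  # explicit, reversible conversion

assert genes == ["SEPT2", "MARCH1"]  # nothing silently became "2-Sep"
assert scores == [0.9, 0.4]
```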


"Too flexible" sounds like someone implemented all the requirements. A programming language is too flexible too. That's not really the problem, which is more about the opaqueness and lack of DRY.


What's DRY?


It stands for "Don't Repeat Yourself," and in its most general form refers to avoiding code duplication.


Yes indeed but the principle also applies to avoiding repetition of configuration and data.
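A toy illustration of how this plays out in spreadsheets: the same rate pasted into many formulas versus defined once (the Excel equivalent would be a named range or a single input cell that every formula references).

```python
# Non-DRY: the VAT rate is repeated in every "formula". Updating it means
# finding every copy, and missing one is the classic spreadsheet bug.
gross_a = 100 * 1.20
gross_b = 250 * 1.20
gross_c = 40 * 1.20

# DRY: define it once; every calculation references the single definition.
VAT_RATE = 1.20
gross = [net * VAT_RATE for net in (100, 250, 40)]

assert gross == [gross_a, gross_b, gross_c]
```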


In the end it's always "pilot error", right? But some blame rests on Excel for creating a tool that engendered a level of confidence that was unwarranted.

Excel is designed to be used for mission-critical applications only under certain circumstances, and it does a piss-poor job of delineating what those circumstances are. It deserves a lot of criticism for encouraging people to misuse it.


I guess you would say the same for C, right? So are you saying that Excel works correctly, but because it's easy to get started with, it creates unrealistic confidence? I disagree.


Well, what about a detailed study? Quoting from the abstract:

"Microsoft’s continuing inability to correctly fix errors is discussed. No statistical procedure in Excel should be used until Microsoft documents that the procedure is correct; it is not safe to assume that Microsoft Excel’s statistical procedures give the correct answer. Persons who wish to conduct statistical analyses should use some other package."

On the accuracy of statistical procedures in Microsoft Excel 2007, 2010 http://www.pages.drexel.edu/~bdm25/excel2007.pdf http://homepages.ulb.ac.be/~gmelard/rech/gmelard_csda23.pdf
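Part of what those papers document is the use of numerically unstable algorithms. A sketch of the underlying issue, assuming nothing about Excel's internals: the textbook one-pass variance formula (which early Excel versions reportedly used for STDEV) cancels catastrophically for data with a large mean, while a careful implementation does not.

```python
import math
import statistics

# Three values with mean 1e9 + 2 and true sample standard deviation 1.0.
data = [1e9 + 1, 1e9 + 2, 1e9 + 3]
n = len(data)

# Naive one-pass formula: var = (sum(x^2) - n*mean^2) / (n - 1).
# Subtracting two nearly equal ~1e18 quantities destroys the low-order
# digits that held the entire answer; clamp at 0 since rounding can even
# make the "variance" negative.
mean = sum(data) / n
naive_var = max(0.0, (sum(x * x for x in data) - n * mean * mean) / (n - 1))
naive_std = math.sqrt(naive_var)

# A numerically careful implementation recovers the true value.
stable_std = statistics.stdev(data)

print(naive_std, stable_std)
assert math.isclose(stable_std, 1.0)
assert naive_std != stable_std  # the naive formula is badly off here
```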



