This won’t be the first or last blog post on the internet about the dangers of large spreadsheets!
It is inspired by a recent Prophecy implementation where the customer, like many Prophecy customers, is migrating from a spreadsheet-based system that had grown over the years and now suffered many of the (not unusual) issues of complex spreadsheets:
- The original developer was long gone.
- It did not support multiple concurrent forecasters. (All customers were forecasted in a single spreadsheet, avoiding the need for multiple linked spreadsheets but preventing multi-user access.)
- There appeared to be design errors which led to incorrect financial forecasts.
- It contained no sales history, other than year to date.
- It was cumbersome to navigate and locate items quickly.
There are, of course, many additional shortcomings associated with using Excel spreadsheets for sales forecasting and they are documented on our main site at https://www.dataperceptions.co.uk/spreadsheets.html.
A recent discovery, for me at least, is the body of research and data on the incidence of serious errors (i.e. errors that affect key numbers) in corporate spreadsheets.
The bottom line from this research is that more than 90% of corporate spreadsheets contain serious errors. Yes, 90%. Ray Panko is a Professor of IT Management and has curated the results of numerous studies on corporate spreadsheet errors on his website at http://panko.com/ssr/Audits.html. If you doubt the claim about 90% errors, please do hop over to his site and see for yourself!
So, the point of this post is really to say that there is a high probability (90%!) that relying on spreadsheets for sales forecasting imposes hidden costs on your organisation through formula and other errors – quite apart from the general limitations of spreadsheets that remain even in the 10% or so that are actually error free.
That’s why a purpose-built, proven solution like Prophecy™ has to be the right way to go. If it does nothing else, it avoids the high probability of spreadsheet errors, as well as opening up a raft of serious, forecaster-specific features that will help your forecasters develop better, more defensible sales forecasts, in less time.
The Spectre and Meltdown ‘bugs’ have, rightfully, been hitting the headlines these last two to three weeks. There is still a lot of ‘noise’ and uncertainty, as well as a degree of hysteria, about these bugs – and understandably so. Intel, for example, have been inconsistent about the impact of the microcode ‘fixes’ they have released in response. Initially they recommended that everyone apply them, but when reports of spontaneous reboots increased ‘out there’ they reined back on that recommendation. PC vendors were likewise initially encouraged to push the fixes, until some of the deployment issues emerged – principally increased reboots and performance ‘hits’.
So what should we do about Spectre and Meltdown? Apply the fixes immediately, or wait until early adopters work out the bugs? It is surprising that there has been hardly any questioning of how dangerous these issues actually are in the real world. The enormity of Spectre/Meltdown relates to the scope of CPUs affected – essentially everything from about 1995 onwards – rather than to the size of the risk. The following mitigating factors appear to be correct:
- Exploiting these vulnerabilities requires a significantly greater level of technical ability than more traditional malware development.
- The data that can theoretically be obtained is much more limited – i.e. no hard disc data, no keystrokes, etc.
- Like traditional malware, something has to run in order to exploit the vulnerability – ‘drive-by’ is not possible.
- OS and browser vendors have already moved to reduce the likelihood of these vulnerabilities having real world impacts.
So, overall, there is plenty of lower-hanging fruit waiting to be picked. Why would common-or-garden malware use these exploits when the degree of technical difficulty is so high and the data obtainable is so limited?
The lowest risk is of course to just patch.
To test the performance impact of the current Windows 10 Spectre/Meltdown mitigations, we ran some Prophecy benchmarks on the same machine – a 3rd-generation Intel Core i5 with 8 GB of memory – with and without the Windows patches.
Bottom line – the patched Windows took 13% more time in our tests.
So, everyone has to make their own decision, and what we see in Prophecy is sure to differ from other applications. But a 13% slowdown is non-trivial.
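For anyone wanting to repeat this kind of before/after comparison on their own software, here is a minimal sketch in Python (not Prophecy’s own code). The `workload` function is a hypothetical stand-in – substitute whatever repeatable calculation you actually care about – and the idea is simply to time it on the unpatched machine, apply the mitigations, and time the identical workload again.

```python
import timeit

def workload():
    # Hypothetical stand-in for a repeatable calculation; substitute
    # the real operation whose performance you want to compare.
    return sum(i * i for i in range(1_000_000))

# Taking the best of five runs damps out background noise. Run this
# once before the patches and once after, then compare the two times.
best = min(timeit.repeat(workload, repeat=5, number=1))
print(f"best of 5 runs: {best:.4f} s")
```

The percentage difference between the two “best of 5” figures is your workload-specific cost of the mitigations – which, as our 13% result shows, can vary quite a bit from application to application.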
Incidentally, you can test your software in the same way. Steve Gibson / GRC have a small utility which evaluates your machine for the vulnerability and allows you to toggle the protections on and off. It’s here:
It is very hard to add a simple blog to an existing, non-WordPress website. I just want https://www.dataperceptions.co.uk/blog to point to my blog page, not to a whole new website home page.
Any time I try to tweak the theme, it reverts to giving me a home page etc. I DON’T WANT THAT!
This is really just a test page and I hope no one reads it, as I’m so ashamed of not being able to tame WordPress. FWIW, I am the developer of Prophecy, which is written in C++, and I implemented the main Data Perceptions website. So I feel entitled to think I ought to be able to bend WordPress!