Excel vs programming language
Ok. So my IT skills are very limited and I might not use the correct terminology here, but I'd like to ask the opinion of those who do know this stuff.
I use a financial analysis program which is written in Excel. This is how it was developed, and it is very good.
As I understand things, it should probably have been written in a different programming language in order to make it less bulky and give it other advantages.
It is now too late to re-write things and there is some pressure to scrap the existing program and start again.
My problem is that I really like what I have and it works. Does anyone out there in the know have an opinion on whether Excel is such a bad system that we should start again, or whether they feel Excel is perfectly suitable and fit for purpose?
I think excel is great and you can do loads with it.
I think a programming language would be a better option.
Aside from Excel's functions being traditionally riddled with mistakes (the subject of more than one statistics paper), Excel, when overused, in my experience becomes a web of complexity where unexpected consequences are common and silent.
In the long-distant past (20 years ago) I ran a purchasing & logistics department that ended up using spreadsheets & macros to coordinate purchases & materials movements over the corporate software that was in place.
(it managed to spend 90 million a year very well)
The spreadsheets & macro templates were written by me when I joined the company and nothing else was available. Previously there was a badly designed stock database, but ordering was done on paper!
Spreadsheets are quick to set up, flexible and don't take much programming knowledge - which gives both good and bad results.
On the downside
Spreadsheets don't scale well & don't do sharing well.
Nowadays I only ever use spreadsheets for rapid analysis of one off data queries.
For anything serious - a spreadsheet does not fit the bill.
Depends on the user. Professionally built, it does the job well enough in many cases; however, a properly designed and built product, designed to do one job well, will normally be superior.
The main issue in my experience with Excel is that these tools don't tend to be properly managed, so you have muppets making changes, failing to test them properly, and then watching them die down the line. Or they keep getting extended to the stage where a proper DB would do better.
That said I have seen people do the same with fully version controlled software.
It also depends on how much of Excel you are using; if you are just extending it slightly, then reinventing the entire thing from the bottom up may give more issues.
So the short answer is it really depends. I wouldn't bin something just because it was Excel, but I would be cautious.
Up to a point, it's great - you can go from concept to implementation very rapidly, and the guts of the data flow are there for all to see, understand and tweak.
As the application grows and more features are added, the complexity of implementing them increases exponentially until you hit some point where implementing the next feature in Excel is 10x more complicated than it would be to add it to a well-architected Java/Python/whatever version of your tool. At some point it becomes less effort to port the whole shebang to a suitable language than to shoehorn one more feature into Excel.
Obviously I have no way of knowing how near to that point your tool is. I would suggest that you don't exactly "scrap" the existing program so much as use it as a template for your Python version - not as hard as it may seem, as really a giant Excel spreadsheet is little more than variables and functions, just like Python etc., but laid out graphically.
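To make that "variables and functions laid out graphically" point concrete, here is a minimal sketch of how a formula cell maps onto a plain function. All the cell references, names and formulas below are hypothetical, purely for illustration:

```python
# A spreadsheet is essentially a graph of named cells and formulas.
# Ported to Python, each formula cell becomes a plain function, so
# the data flow stays visible but is now testable and diffable.
# Names and formulas here are hypothetical examples.

def gross_margin(revenue, cost_of_sales):
    # Was something like: =B2-B3
    return revenue - cost_of_sales

def margin_pct(revenue, cost_of_sales):
    # Was something like: =IF(B2=0, 0, (B2-B3)/B2)
    if revenue == 0:
        return 0.0
    return gross_margin(revenue, cost_of_sales) / revenue

print(margin_pct(1000.0, 650.0))  # 0.35
```

The translation is mostly mechanical; the win is that each former cell now has a name, can be unit-tested, and shows up in a diff when someone changes it.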
My take is that there are massive advantages to moving to a general purpose programming language - more automation, more plotting abilities, access to a wide range of analytical and statistical packages, the ability to use "source control" to track changes to the code etc.
If you think it is very good then it is probably perfectly suitable and fit for purpose. Ask them what specific problems they have with it, someone might be trying to do something more complex than you. Speed of processing is unlikely to be a real problem but not impossible.
More persuasive things might be trying to draw together different tabs for analysis, which is trying to hammer a database shaped peg into a spreadsheet shaped hole. Access would be a better option for that and some of the work that's gone into Excel would be transferable. Some of the complicated statistical functions in Excel just don't work, if you're forecasting or something. Different security levels for different users would be difficult and clumsy to set up in Excel. Would be very difficult to have multiple users working on it at the same time. Image is a stupid reason to change software but nevertheless if clients see it they might consider you a two-bit operation.
On the plus side, Excel is very well known, so accessible. It's virtually free. You don't need to hire a programmer to make minor tweaks; most problems can be solved with a quick Google and maybe a little VBA dabbling. Runs across different platforms, phones etc. You already have it, so you won't need to worry about transferring data and redoing 5 years of ironing out bugs.
Thanks everyone. You are pretty much confirming what I thought.
TBH, this spreadsheet is now very big and complex and probably does need to be re-written in a different language.
How difficult would it be for a professional programmer to pick up the existing spreadsheet and convert it into a better language?
Try Microsoft's free Excel add-on Power Pivot; it has loads of business analytics tools built in and is much, much faster. If you are using huge VLOOKUPs with pivots, slicers and calls attached, this makes life easier.
Ironically, Microsoft have two of the best database tools in the world - Access and SQL Server - and Access is still a superb departmental and desktop tool. Applications I wrote in 1993 to control the retail operations of a chain of opticians are still going strong, thank you very much, 10 years on.
Unfortunately, because Access takes, oh, at least 2 or 3 days and a bit of professional help to get to grips with, wholly inappropriate tools like Excel continue to lurch on.
Get a professional to look at it, and try to scope out what you want.
> Get a professional to look at it, and try to scope out what you want.
About 4-5 MB
> About 4-5 MB
You call that a spreadsheet? ;-)
> About 4-5 MB
> You call that a spreadsheet? ;-)
Seriously, it can be double that, but I don't know if that is too big or not.
The main problem is that a new file is created for each separate client, so I have hundreds of these.
Excel spreadsheets are extremely useful for informal work and data analysis, but are much too unstable and easily corrupted for formal work and data storage. They do not themselves constitute a programming language.
However, behind Excel is VBA, which is a proper programming language; maybe not the best, but one can write completely acceptable and manageable code with it. One can then use Excel spreadsheets simply for input and output, and do all the calculations in modules and subroutines in VBA and none at all in the spreadsheets themselves. VBA is thousands of times faster than doing calculations in the spreadsheets themselves and much more transparent. Of course, you need to become (very) proficient in writing proper code to do this effectively.
I waste hours of my life trying to unpick business-critical tools like this, where they were written years ago and eventually break because the creator either did not think to make the tool scalable, or expected that their tactical fix would be discarded when the strategic solution (which was inevitably not developed) was implemented.
It's small; however, it's what's in it which really counts (until you get to absolutely huge files).
That starts giving more of an argument for a central system (assuming it would be managed centrally). What happens if you find a bug at the moment? How are changes rolled out across clients?
I once used Access for the database for a large application in Shell. Before we embarked on this, we called in one of the top experts in the company on databases, including Access. When he walked into our conference room, the first thing he said was "My main advice regarding Access is: don't use it!" Anyway, we did not follow his advice and went ahead and used Access - and within months strongly regretted it for numerous technical reasons, and we had to completely rebuild our code around another database program. That was thirteen years ago, so Access may have improved, but it was then a terrible program.
Looks like you might want to check the date macros you are using......
If the problem is the number of files chucked out, maybe use a database to sort through these. If your computer can cope with the data you have and you can cope with how fast it runs, there isn't a problem - but you can always pay someone to make it run that little bit faster. It's quite hard to say without looking at what you have and what you want/need.
Arguably the issue (there and here) is tightly coupling your data storage layer with your business logic layer...
If the amount of data you're handling is megabytes rather than gigabytes, and you're essentially running the analysis manually (ie you currently enter a load of data into a spreadsheet and wait for the result to pop out) it probably makes sense to read in the data from some files (eg csv), do the processing in something like Python, and then punt the output back to csv files so you can load it up in excel and draw pretty graphs or whatever.
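A minimal sketch of that csv-in, process, csv-out loop, using only the standard library. The file layout and the "amount"/"running_total" columns are made up for illustration:

```python
# Read a CSV, do the calculation in Python, write the result back
# out as CSV ready to be opened in Excel for graphs. Column names
# and the calculation itself are hypothetical.
import csv

def process(in_path, out_path):
    with open(in_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # The "analysis" step: here, just a running total per row.
    total = 0.0
    for row in rows:
        total += float(row["amount"])
        row["running_total"] = f"{total:.2f}"
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["amount", "running_total"])
        writer.writeheader()
        writer.writerows(rows)
```

The point of the shape is the clean boundary: the spreadsheet stays purely an input/output format, and all the logic sits in testable code.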
Piling everything into a database is probably overkill, although sensibly written code should leave you the option of doing that later without having to rewrite everything else if you want to.
As to how hard it is - that depends on how complex the calculations are. Probably best to get someone in to have a look at it and make an estimate.
I'd say that the essential difference between a program and Excel is that Excel is generally both the data file and the program, whereas a dedicated program (just one instance) will operate on many distinct data files. And combining program and data means that any changes/corrections to the program have to be repeated across all files based on the spreadsheet (although that can be mitigated a bit by the use of external code modules; VBA functions).
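That program/data split is easy to see in code form: one program, many client data files, so a fix lives in one place and reaches every client on the next run. A sketch with a hypothetical directory layout and a placeholder calculation:

```python
# One program operating on many distinct client data files: fix the
# logic in analyse() once and every client gets the correction on
# the next run. Layout and calculation are hypothetical.
from pathlib import Path

def analyse(path):
    # Placeholder for the real per-client calculation: here, just
    # the sum of the numbers in the file.
    return sum(float(x) for x in path.read_text().split())

def run_all(client_dir):
    # Apply the same analysis to every client file in the directory.
    return {p.name: analyse(p) for p in sorted(Path(client_dir).glob("*.txt"))}
```

Compare that with hundreds of per-client workbooks, where the same fix has to be pasted into each copy by hand.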
Does this Excel spreadsheet include a lot of data in that 4-5MB?
> Arguably the issue (there and here) is tightly coupling your data storage layer with your business logic layer...
> If the amount of data you're handling is megabytes rather than gigabytes, and you're essentially running the analysis manually (ie you currently enter a load of data into a spreadsheet and wait for the result to pop out) it probably makes sense to read in the data from some files (eg csv), do the processing in something like Python, and then punt the output back to csv files so you can load it up in excel and draw pretty graphs or whatever.
Or, if you are making the same plots all the time (like weekly metric type things), you could use a Python plotting package. If you are going as far as to get someone to rebuild your spreadsheet anyway, there's no reason not to.
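For the routine-plots case, here is a sketch using matplotlib, assuming that is the plotting package chosen; the data and file names are invented:

```python
# Scripting a routine weekly chart with matplotlib, so the same
# plot regenerates from fresh data with no manual steps.
# Data values and the output file name are hypothetical.
import matplotlib
matplotlib.use("Agg")  # render straight to file, no display needed
import matplotlib.pyplot as plt

weeks = [1, 2, 3, 4, 5]
revenue = [12.1, 13.4, 12.9, 14.2, 15.0]

fig, ax = plt.subplots()
ax.plot(weeks, revenue, marker="o")
ax.set_xlabel("Week")
ax.set_ylabel("Revenue (k)")
ax.set_title("Weekly revenue")
fig.savefig("weekly_revenue.png")
```

Once it's a script, the weekly job is just "point it at the new data and run it".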
I've known and seen Excel used to calculate business-critical figures well into the tens of billions of quid. In fact I've written plenty myself. In its place it's a very, very good tool indeed.
However I've also reviewed many, many Excel models from probably hundreds of organisations which have whopping errors.
Size-wise, I think my record (review, not build) was a linked set of 388 workbooks, the largest of which were a set of six at 250MB each.
1. Doesn't scale well
2. Too easy for people who don't know what they are doing to make changes
3. Quite difficult to control the environment
4. Various extremely strange bits of behaviour which you occasionally run into and think "WTF????". "Too Many Formats" is an example of this.
5. Is subject to frequent crashing at large sizes
6. A high tendency to take the operating system with it when it does crash.
If you accept the above, it's fine.
Access is fine for medium-scale implementations or relatively simple stuff, but it's extremely picky about data formats; really it's easier to go straight from Excel to SQL Server if your data volumes are large.
> VBA is thousands of times faster than doing calculations in the spreadsheets themselves...
I think this bit is not true. Built-in functions are usually faster than the equivalent VBA functions - built-in functions are compiled code, while VBA functions are interpreted at runtime.
For serious number-crunching there is just no comparison: VBA runs about the same speed as any other compiled language. Obviously functions in spreadsheets are themselves similar or identical to those in VBA, but there is apparently an enormous overhead to spreadsheet calculations - probably input and output to the spreadsheet cells one cell at a time.
If your Excel software does what you want and you don't have any problems with it, why re-write it in another language? The language doesn't have any direct effect on user experience.
If it ain't broke, why fix it?
> For serious number-crunching there is just no comparison...
I guess we may have to agree to differ here, but if you are talking about calling a VBA routine once and performing 'serious number crunching' within a single call then I can believe it. But calling a VBA routine multiple times is, in my experience, a lot slower than doing the work within the worksheet. Googling Excel optimisation suggests I am not alone! Having said that - most of my workbooks rely heavily on VBA!
An advantage of VBA is that you can be a lot more targeted in what you update, which can give the impression of performance gains.
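Both observations fit the same rule of thumb: the cost is per boundary crossing, so cross it once per range rather than once per cell. A plain-Python analogy (no Excel involved, and the `scale_*` names are made up), just to show the pattern:

```python
# The expensive part is the per-call / per-cell boundary crossing,
# so batch work into one call over a whole range instead of one
# call per cell. Pure-Python analogy; names are hypothetical.

def scale_one(value, factor=1.2):
    # Imagine each call here crossing the sheet/VBA boundary.
    return value * factor

def scale_range(values, factor=1.2):
    # One crossing, then the whole range is processed in memory.
    return [v * factor for v in values]

cells = [10, 20, 30]
assert [scale_one(v) for v in cells] == scale_range(cells)
```

Same arithmetic either way; the difference in a real workbook is how many times you pay the call/marshalling overhead.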
For OP: all clear now then which is best?
Don't trust anyone who comes down emphatically on one side or the other without them spending a lot of time talking through your requirements.
It depends on the specific circumstances and also the business vs technical trade off.
> For OP: all clear now then which is best?
Apologies for the thread hijack. Methinks the OP actually answered the question already in the OP. Excel is a great tool in a single-user environment. In a professional, multi-user environment you need something more robust and scalable.
> Apologies for the thread hijack. Methinks the OP actually answered the question already in the OP. Excel is a great tool in a single-user environment. In a professional, multi-user environment you need something more robust and scalable.
Thanks. And no probs on the hijack. My question was answered ages ago......I think ;-)
It depends exactly how it is being used; if people are just consuming the data it can work nicely, likewise if the different users can be isolated.
Of course I like my shiny expensive tools, but from a business perspective the cost (everyone has Excel by default, but getting another SQL Server instance etc. costs) and potential delays in deployment might not make sense.
> If your Excel software does what you want and you don't have any problems with it, why re-write it in another language? The language doesn't have any direct effect on user experience.
> If it ain't broke, why fix it?
I have not heard anyone mention the cost of the new software and maintenance.
Someone will have to analyse what it needs to do, architect a solution and implement it. Then they will charge you an arm and a leg to maintain it, and fixes will cost roughly $3 million each :)