I know the title of this article seems a little tautological, but it is the truth – this is a data analysis book written with passion. It is a real keeper for Data Analysts who work all day long with MS Excel and have to perform tasks such as “Tell me how much my entry ticket should cost in order to obtain 50 000 EUR profit, considering that we do not know how many people will attend” or simply “These are our expenses, find a way to optimize them… And find some relation between them”. The book goes a few steps further than answering these questions: it provides a solid working methodology for dealing with such cases and introduces little-known features of MS Excel, used only by experienced Data Analysts.
Furthermore, Excel Data Analysis is not only for Data Analysts with a university education in statistics – it is written in a way that can be understood by anyone with a passion (or a task) for model creation. Basic statistical terms such as “Chi-Square”, “z-Test”, “ANOVA”, “Sensitivity Analysis” and “Type of Distribution” are well explained with examples. Thus, the book is not just another boring statistics book (there are plenty of those available) – it goes beyond that and shows you how to use all these statistical features in Excel modeling and put life into them. The examples in MS Excel are really fascinating, and I personally enjoyed going through them.
What I personally learned from this book (and I am not ashamed to state it):
1. The “What-If” feature in Excel:
Although I have some real work experience with Excel data modeling, I did not know about the “What-If Analysis” options in the Data ribbon. They would have been useful in 2-3 projects of mine; still, I managed to survive without them and achieved my goals in other ways. Here you may find a good presentation of the “Goal Seek” feature in Excel.
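To make the idea concrete, here is a minimal sketch in Python of what Goal Seek does: adjust one input until a formula hits a target value. The profit formula, attendance and cost figures are invented for illustration (they are not from the book), and the search is a simple bisection over a monotonic function.

```python
# A sketch of Excel's Goal Seek: vary one input cell until a target
# cell reaches a desired value, here via bisection.
# The attendance and fixed-cost numbers are made up for illustration.

def profit(ticket_price, attendees=2000, fixed_costs=30000):
    """The 'target cell': a formula depending on one input cell."""
    return ticket_price * attendees - fixed_costs

def goal_seek(f, target, lo, hi, tol=1e-6):
    """Find x in [lo, hi] with f(x) == target, assuming f is increasing."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

price = goal_seek(profit, target=50000, lo=0, hi=1000)
print(round(price, 2))  # ticket price yielding 50 000 EUR profit
```

With these assumed numbers, the search settles on the price at which the profit formula evaluates to exactly 50 000.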
2. The possibility for subtotals in filtering:
This is another unpopular but useful feature of Excel, which gives you some great possibilities to summarize data. It is really useful when you need a quick presentation and do not want to build a Pivot Table for it.
3. Various useful ideas for MS Excel model design:
First, I would like to point out the great idea of combining radio buttons, vertical scroll bars and the simple IF function for Excel modeling. The resulting tool takes up just one Excel tab and looks beautiful. The fact that it is easy to create and easy to understand only makes it better. It is an example of what Excel dashboards should be: simple, understandable and powerful.
What I was reading while saying to myself with a smile, “Yup, I think I know this – this is exactly the way it should be done”:
1. Standardization and user-friendliness of Excel Dashboards (models).
Ok, let’s be honest – the moment I saw that someone actually formalizes that Excel dashboards should be user-friendly and standardized, I liked it. Furthermore, the idea of having a “Brain” worksheet, which makes all the calculations, is something I have always followed. I have not called it “Brain” but a “Calculations” or “Parameters” sheet; the function, however, was the same. This is definitely a best practice to be followed.
2. Analysis ToolPack Add-in in Excel.
Sometimes even Excel data analysts forget about its presence and make their lives harder than they should be. Just one simple example – with the “Moving Average” and “Exponential Smoothing” options you can have a draft forecasting tool in less than 2 minutes. Otherwise, you would waste about 15 minutes creating it.
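For readers curious what those two ToolPak options actually compute, here is a rough sketch of both formulas in plain Python. The sales series is invented, and the smoothing convention (new forecast = alpha × actual + (1 − alpha) × previous forecast, with alpha = 1 − damping factor) mirrors the common ToolPak setup, though details may differ by Excel version.

```python
# Hand-rolled versions of the two Analysis ToolPak forecasting options,
# so the underlying formulas are visible. Sample data is invented.

def moving_average(series, window=3):
    """Average of the last `window` points, one output per full window."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def exponential_smoothing(series, damping=0.7):
    """Each value blends the new observation with the previous estimate."""
    alpha = 1 - damping
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

sales = [120, 130, 125, 140, 150, 145]
print(moving_average(sales))
print([round(s, 1) for s in exponential_smoothing(sales)])
```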
3. Presentation of statistics terms.
I really liked the presentation of mean, standard deviation, range, median, mode, standard error and sample variance. These important terms are presented in 1-3 sentences each, making them understandable for people who have no idea about statistics and a nice refresher for those who do.
4. Transpose formula.
This is something I knew existed, but only as part of Copy > Paste Special, with the Transpose checkbox.
However, I appreciated the ability to transpose a whole table with a single formula. The formula is better because it stays flexible – if you later change the original table, the transposed one follows.
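The same flip of rows and columns that Excel’s TRANSPOSE formula performs can be written as a one-liner in Python with `zip(*rows)`; the small table below is invented for illustration.

```python
# Transposing a table in one expression, analogous to Excel's
# TRANSPOSE function: rows become columns and vice versa.

table = [
    ["Product", "Q1", "Q2"],
    ["Apples",   10,   12],
    ["Pears",     7,    9],
]

transposed = [list(row) for row in zip(*table)]
for row in transposed:
    print(row)  # the first original row is now the first column
```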
5. Data Conversion.
The ability to convert data from CSV (or other files) into an Excel report is something trivial, yet not widely used. It is well explained in the book.
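As a minimal sketch of that conversion step, here is how raw CSV data can be read and summarized with only the Python standard library. The file contents, column names and categories are invented stand-ins.

```python
# Reading CSV data and aggregating it into a small report,
# using only the stdlib. The data below is an invented stand-in
# for an exported file such as "expenses.csv".
import csv
import io

raw = """category,amount
Rent,1200
Travel,340
Travel,180
Rent,1150
"""

totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["category"]] = totals.get(row["category"], 0) + float(row["amount"])

for category, amount in sorted(totals.items()):
    print(f"{category}: {amount:.2f}")
```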
6. Pivot Table Basics.
Almost every Excel model I have created has included a Pivot Table, and it is well explained here. However, what I had not noticed so far was the ability to create a Pivot Chart. It looks beautiful and works great! The point is that I had seen charts with drop-down menus before, but I thought those were created with VBA. I just tried the Pivot Chart and could not stop smiling. Really a useful Excel feature.
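Stripped of the interface, a pivot table is just grouping by two keys and aggregating a value field; the sketch below shows that essence in Python, with invented region/product data.

```python
# A pivot table reduced to its essence: group rows by two keys
# (region and product) and sum a value field (units).
from collections import defaultdict

sales = [
    {"region": "North", "product": "Apples", "units": 10},
    {"region": "North", "product": "Pears",  "units": 4},
    {"region": "South", "product": "Apples", "units": 7},
    {"region": "South", "product": "Apples", "units": 3},
]

pivot = defaultdict(lambda: defaultdict(int))
for row in sales:
    pivot[row["region"]][row["product"]] += row["units"]

for region, products in sorted(pivot.items()):
    print(region, dict(products))
```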
7. The steps for efficient modeling.
To be honest, I was already following them, but they were not formulated in my head like this. So, these are the four steps suggested by the author:
A. A pre-modeling or design phase – contributes to our preliminary understanding of the problem (the problem-definition phase). This may take a considerable proportion of the entire modeling effort; after all, if you define the problem poorly, no amount of clever analysis will help. At this stage the goal of the modeling effort should be made clear. What is expected from the model? What questions will it answer? How will it be used and by whom?
B. A modeling phase – building and implementing the model that emerges from the pre-modeling phase. Specifications are refined to explore the model’s behaviour, and at this point the model has to be populated with very specific detail.
C. An analysis phase – we test the behaviour of the model developed in phases (A) and (B): collecting the data produced by the model and analysing the results.
D. A final acceptance phase – the model specification is reconsidered if the analysis phase suggests we need to do so. At this point we may return to earlier phases until the decision maker achieves the desired results.
8. Excel Quick Access Toolbar
This is a must! Indeed, if you work as an Excel Data Analyst and do not have a customized Quick Access Toolbar, then I would assume that you know at least 6 keyboard shortcuts by heart.
9. Excel Solver.
Indeed, Solver is a magical Excel tool. If you have ever calculated MINs and MAXs with just pen and paper, as I did about 5 years ago, then you will really appreciate it. It is well explained in the book.
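To show the kind of problem Solver automates, here is a brute-force sketch: maximise an objective over two decision variables subject to resource constraints. The small production-mix numbers (wood, labour, profit per unit) are invented for illustration – Solver itself uses far smarter algorithms than this exhaustive search.

```python
# What Solver automates, sketched as a brute-force search:
# maximise profit over two decision variables under constraints.
# All coefficients below are invented example numbers.

best = None
for chairs in range(0, 101):
    for tables in range(0, 101):
        wood = 5 * chairs + 20 * tables       # resource usage
        labour = 10 * chairs + 15 * tables
        if wood > 400 or labour > 450:        # constraint cells
            continue
        profit = 45 * chairs + 80 * tables    # objective cell
        if best is None or profit > best[0]:
            best = (profit, chairs, tables)

print(best)  # (maximum profit, chairs to make, tables to make)
```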
10. Tips for verifying an optimal solution in Excel:
A. If you have a non-linear target cell (objective function) in a single variable, try plotting the function over successive input values to see whether the solution might be only a local optimum. You can do this by copying the function down a long column of cells and placing consecutive input values in an adjacent column, then plotting the results and noting the shape of the curve. (Only possible for a single variable.)
B. In the multi-variable case, you may want to see if you can find some combination of inputs that outperforms the so-called optimal solution.
C. If a solution is uncertain but appears correct, investigate by examining values near the proposed solution.
D. Be careful to note any odd solutions – negative values where none are possible, and values that are too large or too small to accept as plausible.
E. Verify that the constraints imposed on a formulation are satisfied.
F. Remember that, in spite of your best efforts, you may still on rare occasions have problems dealing with these issues.
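Tip A above can be done in code as well as in a column of cells: tabulate a non-linear, single-variable objective over successive inputs and inspect the shape. The function below is an invented example with two local maxima, and a crude neighbour comparison stands in for the plot.

```python
# Tip A in code: tabulate a single-variable objective over
# consecutive inputs and scan for local maxima. The function
# is an invented example with peaks at x = 2 and x = 6.

def objective(x):
    return -((x - 2) ** 2) * ((x - 6) ** 2) + 10

# The "long column of cells" with consecutive input values:
inputs = [i * 0.5 for i in range(0, 17)]   # 0.0, 0.5, ..., 8.0
values = [objective(x) for x in inputs]

# Instead of plotting, compare each point with its neighbours:
for i in range(1, len(values) - 1):
    if values[i] > values[i - 1] and values[i] > values[i + 1]:
        print(f"local maximum near x = {inputs[i]}: {values[i]:.2f}")
```

Seeing two separate peaks in output like this is exactly the warning sign that a reported “optimal” solution may be only locally optimal.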
In the end, I would like to say the following – the author’s efforts to make data analysis understandable and not boring are really remarkable! The many good examples, explained step by step, are what make these 338 pages a really great book. Last but not least, the exercises after each chapter and their online solutions can help any Excel Data Analyst become better, if they spend a few hours trying to solve them.