Financial models serve a purpose: making better decisions.

Insight, not numbers

The ultimate purpose of a financial model is not to generate numbers from a set of assumptions. Rather, it is to gain insight into a business situation so that managers can make better commercial decisions.

Think of the model as a laboratory where you can test your decision alternatives safely and cost-efficiently. In fact, for most business situations, building a financial model is the only way to test before taking action in the real world.

But building the actual calculation model is only part of the story. Although Excel is quite good at “crunching numbers,” it provides almost no support for conceptual modeling or gaining insights.

Analytica, however, provides just the right features for every step along the “Business Analytics Lifecycle.” You can build your model the way you should: from understanding the business situation and building a conceptual, quantitative model, to filling in the most complex, multidimensional calculations, to sensitivity and uncertainty analysis. Going back and forth through the cycle is a snap, allowing for truly flexible and agile model development. You can easily change your model at any time, even late in development, often without having to edit a single formula.

What makes Analytica special?

Intuitive Visual Modeling

Intuitive influence diagrams let you create a model the way you think, and communicate clearly with colleagues and clients.
An Analytica model is organized as a hierarchy of influence diagrams. They show you the big picture and let you build and navigate your model visually. Think of this as an interactive whiteboard—but with all the details, calculations, and numbers underneath.

Multidimensional Calculation

Intelligent Arrays™ let you create and manage tables with many dimensions with an ease and reliability not possible in Excel.
Although the influence diagram appears simple, it scales up to handle complex, multidimensional problems effortlessly. Don’t worry about the rows-and-columns layout of your model. You create, change, or extend dimensions, and Analytica takes care of the rest.

Monte Carlo Simulation

Efficient Monte Carlo simulation lets you quickly evaluate risk and uncertainty, and find out what variables really matter and why.
You can easily replace any variable in your model with a probability distribution. Analytica generates sample values from the distribution and lets you see and analyze the effects on every “downstream” variable in the model, with no special statistical expertise required.
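Analytica has its own modeling language, so the following is only a conceptual sketch of the idea in plain Python; the model, the distribution, and all numbers are hypothetical:

```python
import random

random.seed(1)
N = 10_000  # Monte Carlo sample size

# Replace a single input with a probability distribution:
# unit_cost is uncertain, modeled here as a normal distribution.
unit_cost = [random.gauss(8.0, 1.5) for _ in range(N)]

price, volume = 12.0, 1000
# Every "downstream" variable becomes a distribution as well.
profit = [(price - c) * volume for c in unit_cost]

mean_profit = sum(profit) / N
p_loss = sum(p < 0 for p in profit) / N  # probability of losing money
print(f"mean profit: {mean_profit:,.0f}")
print(f"chance of a loss: {p_loss:.1%}")
```

In Analytica the sampling and propagation happen automatically once a variable is defined as a distribution; here they are spelled out by hand to show what the simulation does.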

How does Analytica compare to other modeling tools?

No other software package on the market provides the full range of features that Analytica does: influence diagrams for visual modeling to make models transparent, Intelligent Arrays™ to handle multiple dimensions and scale models up flexibly, and fast Monte Carlo simulation to analyze risk and uncertainty probabilistically.
Each product category can be rated on three criteria: transparency & clarity, flexibility & scalability, and analyzing uncertainty.

Spreadsheets like Excel
Simulation add-ins like @RISK or Crystal Ball
System dynamics tools like STELLA/iThink, Vensim, GoldSim
Statistical analysis tools like MATLAB, Mathematica, R, SAS
OLAP (business intelligence) tools like Business Objects, Oracle BI, Hyperion, IBM Cognos

What’s wrong with spreadsheets?

The spreadsheet was the first “killer app,” the application that led millions of people to buy their first personal computers in the early 1980s. Spreadsheets were a brilliant innovation for replacing accountants’ paper spreadsheets, but they are poorly suited for serious financial modeling. Empirical studies show that more than half of spreadsheets in regular operational use contain serious errors. Thirty-five years after the release of VisiCalc, the first spreadsheet, it’s time for something better.

For financial modeling, the main problem with Excel and other spreadsheets is that they work at too low a level of representation. The grid deals with cells. You, the financial modeler, are (or should be) thinking about higher-level entities: variables, influences, modules, hierarchies, dimensions and arrays, uncertainties, and sensitivities.

A modeling tool like Analytica that displays these entities and lets you interact with them directly is much more intuitive to use. It reduces the need for you to mentally translate between your “inner representations” and the one used by the software. Analytica makes it much easier to write, review, verify, explain, and extend models. It reduces the number of errors by preventing many kinds of errors from being made in the first place, and by making remaining errors easier to detect and fix.

Here are the TOP 10 problems of Excel, and how you can overcome them with Analytica.

Meaningless cell references
Formulas in Excel refer to other variables using cell addresses—such as B2 for column B of row 2—rather than meaningful names like Revenues or Expenses. Anyone who has tried to read such formulas, whether written by someone else or oneself, knows how hard they can be to understand or verify.
Analytica uses meaningful names to identify each variable.
Formulas are much easier to write, read, and debug.
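The difference is easy to see in a small sketch. A cell-reference formula such as `=B2-C2` tells the reader nothing; a name-based formula documents itself. The variables and figures below are hypothetical:

```python
# Cell-reference style:  =B2-C2   (what are B2 and C2?)
# Name-based style: the formula reads like the business logic.
revenues = 120_000
expenses = 85_000
profit = revenues - expenses  # meaningful at a glance
print(profit)
```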
No structured documentation
As an Excel modeler, you can type text as documentation into any cell that doesn’t contain a number or formula. The text may include the name, title, units, or even an explanation. It is conventional, but far from universal, to place this text in the cell to the left of each cell containing a number or formula. There is no consistent relationship between documentation and the formal model (numbers and formulas). Without any inherent link between documentation and model, the spreadsheet cannot reliably assist the user by prompting for documentation, or maintaining consistency between documentation and model.
In Analytica, each variable is an object with fields for name, description, units, as well as definition (formula) and value.
You can easily include clear, complete model documentation as you go.
Variables don’t have defined types or roles
An Excel cell can contain a number, formula, text value, documentary text, or empty space. It can be an input, output, or intermediate calculation; a decision variable, a constant, the index of a table, or an objective to be optimized, among many other things. The problem is that a cell is just a cell as far as the spreadsheet is concerned. There is no explicit representation of the role it is intended to play in the model. As a reader of an Excel model, you don’t know what kind of thing to expect in each cell. There is no easy way to tell text values from documentation, inputs from outputs, or decisions from objectives. Without representing this information explicitly, Excel can’t assist you by providing appropriate options or checking that the contents of a cell are consistent with its role, and so can’t forestall common conceptual errors.
Each object has a class, such as decision, constant, chance variable, objective, or index, defining what role it has in the model.
Not only you as the modeler, but the software itself understands the role of each variable, which prevents common conceptual errors.
Invisible model structure
Excel offers no way to visualize the overall structure of a model. One effect of using cell references in formulas is that it is laborious to find out which variables—or rather cells—depend on each other by tracing formulas from one cell to another. Of course, Excel offers ways to trace the inputs to a selected cell, e.g. by color-coding cell references to identify its inputs or with the “audit tool” that displays arrows from the inputs (or to the outputs) of a selected cell. These tools, however, can show the dependencies of only one, or a few, cells at a time. If they could show more, they would usually appear as incomprehensible “spaghetti,” because spreadsheets aren’t laid out to provide a clear dependency structure.
Influence diagrams provide an intuitive graphical view, depicting variables as nodes and dependencies as arrows.
Modelers and decision makers communicate clearly with each other about key assumptions and model structure. Plus, it’s easy to navigate large models.
Little support for modularity
The best way to manage large models is to organize them into a hierarchy of modules, where each module is small enough to be comprehensible. Excel supports one level of modularity: organizing a workbook into a number of worksheets, accessible by tabs along the bottom. If you use multiple worksheets to represent a third dimension of a table, e.g. region or business unit, it will interfere with even this kind of modularity. Experienced Excel users may organize complex models within each worksheet to reflect an underlying modular structure, but Excel itself is unaware of any such structure and so can offer no assistance in laying out or organizing the model.
You organize a large model as a hierarchy of simple comprehensible modules.
Complex models become manageable. Each diagram shows key variables and relations, and hides irrelevant details in submodules.
Formulas refer to cells not tables
Most Excel models consist largely of tables with one or two dimensions. You are forced to work at the level of cells rather than tables because Excel formulas usually refer to individual cells—and not to tables. Each cell must contain a separate formula, even when all the cells in the table have identical relationships with the cells of other tables. Excel offers a relatively convenient way to create such tables by “stretching” one cell across the table—provided you use correct absolute and relative cell referencing in the initial formula. But, after you have created the table, the fact that all the formulas are essentially the same gets lost. Indeed, if someone changes a single formula in a table—by design or by accident—it is very difficult to detect, which is a common source of mistakes.
With Analytica’s Intelligent Arrays™, a single formula expresses an operation on named tables. It automatically iterates over all dimensions.
The number of formulas to write, verify, and debug is often 100 to 1,000 times less, hugely reducing chances for error.
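The idea resembles array broadcasting in languages like Python with NumPy. A rough sketch, with hypothetical dimensions of 3 regions by 4 quarters:

```python
import numpy as np

# Hypothetical dimensions: 3 regions x 4 quarters.
price = np.array([9.0, 10.0, 11.0])          # one price per region
volume = np.array([[100, 120, 130, 150],     # region x quarter
                   [ 80,  85,  90, 100],
                   [ 60,  70,  75,  90]])

# One formula for the whole table: broadcasting iterates over
# every region and quarter, much like a single named definition
# in Analytica covers the entire array.
revenue = price[:, np.newaxis] * volume
print(revenue.shape)
```

Instead of twelve separate cell formulas, one expression covers the entire table, so there is only one formula to verify.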
Editing or adding a dimension demands major surgery
When you build a model, the most important and challenging decisions are typically about how much detail to include in table dimensions. How far ahead should your time horizon be? Is it okay to do your analysis by year, or should it be by quarter or by month? What about dividing up by sales region, by product type, or by alternative economic scenarios? Each new dimension, and each increase in the level of detail, expands your model and your work substantially. Will the extra work be worth the improved accuracy and insight? Ideally, you should start out simply and experiment with adding detail to see how it affects the results. Unfortunately, this is usually too much work when building an Excel model. Changing the size of a dimension, or adding a dimension, to Excel tables requires much more effort than ought to be necessary.
Analytica understands the indexes that identify dimensions of a table or array. Editing or adding a dimension automatically updates all affected tables and keeps formulas correct.
Modifying and adding dimensions to tables is much faster and less error-prone. It is very easy to manage three or more dimensions.
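The same broadcasting idea shows why adding a dimension need not touch existing formulas. A sketch in Python with NumPy, using a hypothetical "scenario" dimension:

```python
import numpy as np

volume = np.array([100.0, 120.0, 130.0, 150.0])  # per quarter
price = 10.0
revenue = price * volume                         # shape (4,)

# Add a "scenario" dimension (low / base / high growth):
growth = np.array([0.95, 1.00, 1.10])[:, np.newaxis]  # scenario x 1
revenue_by_scenario = price * (volume * growth)       # shape (3, 4)
print(revenue_by_scenario.shape)
```

The revenue formula itself is unchanged; the new dimension simply propagates through it.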
No treatment of uncertainty
When we stop to think about it, we know that almost all numbers are uncertain. Some quantities, like fundamental physical constants, are known with extraordinary precision, but most quantities in practical financial modeling are far less certain. It is often helpful to represent those uncertainties explicitly as probability distributions, so that you can evaluate and manage the risks and discover which sources of uncertainty are the most important. Excel doesn’t let you express probabilistic uncertainty directly, although there are add-ins that make this possible.
The value of any variable can be a probability distribution. Efficient Monte Carlo simulation generates the corresponding distributions on results.
Modelers without special training can treat uncertainties explicitly and analyze risks.
Minimal support for sensitivity analysis
Often the most potent source of insights from modeling is sensitivity analysis. With it, you can figure out which inputs and assumptions have the most effect on the results, and why. Sensitivity analysis includes simple what-if analysis to see how a change to one input affects the results, scenario analysis to examine the effects of combinations of input values, parametric analysis to graph how changing one or more inputs across several values affects the results, and tornado analysis to compare the relative importance of variables. It is, of course, easy to do single-cell what-if analysis in Excel: you just change an input and look at the change in the result, provided you can remember or record its previous value. Excel also provides tools for scenario analysis, allowing you to define and compare the results of scenarios. Parametric analysis and tornado analysis, however, are much harder to achieve and require a lot of manual work.
Automated sensitivity analysis shows the relative effect of each input or uncertain variable on affected outcomes and decisions.
Modelers without special training can easily do importance analyses and generate valuable insights.
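A tornado analysis can be sketched in a few lines: swing each input between a low and a high value while holding the others at the base case, then rank inputs by the resulting swing in the output. The model and ranges below are hypothetical:

```python
# Simple profit model with three uncertain inputs.
def profit(price, volume, unit_cost):
    return (price - unit_cost) * volume

base = {"price": 12.0, "volume": 1000, "unit_cost": 8.0}
ranges = {"price": (10.0, 14.0),
          "volume": (800, 1200),
          "unit_cost": (6.0, 10.0)}

swings = {}
for name, (lo, hi) in ranges.items():
    low = profit(**{**base, name: lo})   # vary one input at a time
    high = profit(**{**base, name: hi})
    swings[name] = abs(high - low)

# Sort by impact: these are the bar lengths of a tornado chart.
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} swing = {swing:,.0f}")
```

Doing the equivalent in Excel means rebuilding this loop by hand for every input.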
No separation of end-user interface from the model logic
Frequently used Excel models in an organization are usually built by one person and used by many others. A good Excel modeler knows to separate the inputs and outputs of interest to the users in one worksheet, as the user interface or “dashboard,” from the internal components of the model. Ideally, the builder identifies the input cells clearly by color or shading, and locks all the other cells to prevent accidental or deliberate tampering with parts of the model not meant to be changed by end users. However, audits of operational spreadsheets find that builders often fail to maintain this clear separation and locking, mainly because Excel provides them little assistance in doing so.
It’s easy to create “dashboards” for users to access the key inputs and outputs.
The dashboard offers a simple user interface for model users, protecting them from seeing or messing with irrelevant details.
