
Many think of IT evolution in a firm as a strictly chronological, stepwise process where every step must be completed before moving to the next one. And up to a certain point, they are right!
For example, it seems natural to make sure the accounting system is duly functioning and populated before extracting data from it to build ratios and analyses.
But beyond a certain point, sticking to that rule delays management's access to a decision-support application, whatever form it takes. And you shouldn't wait for promises of integration to materialise.
To make the case more concrete and contextualised, let's take a particular situation and analyse it: an industrial firm with entities on different continents. Some entities share one of several common ERPs (let's call each group of entities sharing the same ERP an "ERP cluster"), while each of the remaining entities has implemented a local, specific solution. The end goal is to build a financial decision-making tool.
Let's break the case down into its elements, to show what we mean by the illusion of chronological completeness...
1 - System integration into a single environment
This is typically a situation where waiting for full integration of the group into a single ERP system is simply an illusion, all the more when the entities had a life of their own before joining the group: each firm or cluster has habits and usages that would require a substantial common denominator before any "integration" could even begin.
According to research and industry reports, 55% to 75% of ERP implementations fail to meet their objectives. Now imagine adding that kind of institutional complexity of systems on top.
Beyond the feasibility and effort required, there may also be very practical reasons: some countries, for example, have very specific systems because their tax regulation is very specific. There is a lot to lose with a one-size-fits-all solution.
In that case, an intermediary consolidation system can be a very good choice, providing a protocol or common language that can be based on the terms of the main ERP. It is all a question of the level at which you want to operate the integration: deep down, or slightly higher up to overcome overly local constraints.
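To make the idea of a "common language" tangible, here is a minimal sketch of such an intermediary translation layer. All names, account codes and entities are illustrative assumptions, not a real chart of accounts: each entity keeps its local codes, and a per-entity mapping re-expresses them in group-level terms.

```python
# Hypothetical sketch of an intermediary consolidation layer: local account
# codes are translated into a common group chart of accounts, without forcing
# every entity onto the same ERP. All codes and names here are invented.

# Per-entity mapping from local account codes to group-level accounts.
ACCOUNT_MAP = {
    "entity_fr": {"701000": "revenue", "607000": "cogs"},
    "entity_us": {"4000": "revenue", "5000": "cogs"},
}

def translate(entity: str, local_entries: dict[str, float]) -> dict[str, float]:
    """Re-express one entity's balances in the group's common language."""
    mapping = ACCOUNT_MAP[entity]
    translated: dict[str, float] = {}
    for code, amount in local_entries.items():
        group_account = mapping.get(code)
        if group_account is None:
            continue  # a real system would flag unmapped codes for review
        translated[group_account] = translated.get(group_account, 0.0) + amount
    return translated

fr = translate("entity_fr", {"701000": 1200.0, "607000": 700.0})
us = translate("entity_us", {"4000": 900.0, "5000": 400.0})
group = {k: fr.get(k, 0.0) + us.get(k, 0.0) for k in set(fr) | set(us)}
print(group)  # → {'revenue': 2100.0, 'cogs': 1100.0} (order may vary)
```

The point of the design: integration happens at the level of the mapping tables, which is far cheaper to maintain than migrating every entity onto one system.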
2 - Some data vs. all the data
There is a human tendency to love "exhaustiveness". But it can lead to "exhaustion" instead. First, in the 21st century, in the era of big data, we don't organise data; we index it, we "smart search" it.
Second, remember that your function is not to organise data... but to use it and see what it can tell you.
Third, we don't need all the data for everything! Consolidation tools need to fit information from different entities into the same "boxes", and they can consolidate it as long as it means exactly the same thing everywhere. But that is an accounting need and view, not a financial one! In finance, we take the information "as good as it gets", and what matters is awareness of the limitations of each data source.
The important thing here is to have an application capable of aggregating data of different levels, qualities and granularities. Some data will be very precise, stemming from a complete ERP depicting the full cash conversion cycle; some will be approximate, based on proxies... until we get better data for that section.
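One simple way to keep the limitations of each source visible is to carry a quality label alongside each figure and report it with the aggregate. This is a minimal sketch under assumed labels ("erp" for precise sources, "proxy" for approximations), not a prescribed taxonomy.

```python
from dataclasses import dataclass

@dataclass
class Figure:
    entity: str
    value: float
    quality: str  # "erp" = precise source, "proxy" = approximation (illustrative labels)

def aggregate(figures: list[Figure]) -> tuple[float, float]:
    """Return (total, share of the total coming from approximate sources),
    so the limitation of each data source stays visible in the result."""
    total = sum(f.value for f in figures)
    approx = sum(f.value for f in figures if f.quality == "proxy")
    return total, (approx / total if total else 0.0)

figs = [Figure("plant_de", 500.0, "erp"),
        Figure("country_x", 120.0, "proxy")]
total, approx_share = aggregate(figs)
print(total)         # → 620.0
```

The aggregate is usable immediately, and the "approximate share" tells the reader how much of it to trust, instead of waiting for every source to be perfect.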
Example 1: we know that the data sourced from country X is a little rough, but we still put it in the loop. Once the A-to-Z chain and what it offers is visible to everyone, it encourages the right amount of pressure to improve the quality and granularity of that item, and it encourages the local teams to put in more effort for better data, because they can see what can be done with it. It is also always easier to have moral authority in the group when you can share the results of an effort, rather than demanding a lot with no concrete outcome on the horizon.
Example 2: in a cash flow forecasting engine, some revenue projections will come from regular, recurrent and diligently paying clients, while others depend on projects whose outcome is binary: either we win them or we don't. Trying to be precise, or waiting for more precision, on the latter makes no sense. Building and testing scenarios does.
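Building scenarios for such binary outcomes can be as simple as enumerating the win/lose combinations on top of the reliable recurring base. A minimal sketch with invented figures:

```python
from itertools import product

recurring = 800.0  # revenue from regular, diligently paying clients
projects = {"project_a": 300.0, "project_b": 150.0}  # binary, win-or-lose deals

def scenarios(recurring: float, projects: dict[str, float]):
    """Enumerate every win/lose combination of the binary projects,
    instead of chasing a single 'precise' number for them."""
    names = list(projects)
    for outcome in product([0, 1], repeat=len(names)):
        won = {n for n, w in zip(names, outcome) if w}
        total = recurring + sum(projects[n] for n in won)
        yield won, total

for won, total in scenarios(recurring, projects):
    print(sorted(won), total)
# Four scenarios, ranging from 800.0 (win nothing) to 1250.0 (win both)
```

Management then reasons on the range and on which scenario each decision must survive, rather than on a falsely precise midpoint.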
3 - Motivation to pursue now, and dig deeper later

To complete the preceding remark, it is always less risky to project ourselves onto the next steps and results than to invest substantial amounts of money without being sure of reaching the final objective.
Allowing ourselves to dig deeper later into the quality, behaviour and detail of some of the inputs converts "chronological completeness" into "stepwise completeness".
4 - A good architecture for the managerial layer is key
A well-designed managerial dashboard is a system that separates the various concerns: data acquisition, data translation and aggregation, calculations, visualisations, exports and actions. Some of these steps can even be performed by other systems. For example, a consolidation tool can aggregate some or all of the accounting data flows into a single view, while the managerial layer on top feeds itself from it and from other sources not covered by the consolidation (market data, forecasts,...).
And the day some additional ERP integration takes place, the only thing that needs to change is a simple connector; the rest of the chain remains intact.
5 - Keep an eye on quality limitations of the data and how it flows into the results
One of the reasons why many of us call some systems in place "black boxes" is a form of opacity and rigidity: it is hard to know what the system actually does. The day it produces strange results, the investigation will probably take a while... And one reason why many managers revert to Excel for specialised reporting is that it gives them the impression of mastering the whole chain themselves. We know this comes at the cost of robustness and of error-prone manipulations of the data, but still, a result one neither understands nor masters is experienced as a bad one.
6 - Less stress
If you don't like incomplete or approximate models, try without...
An analogy for the question raised in this article is an automotive firm waiting to see which technology will win, electric batteries or hydrogen fuel cells, before moving forward with its development, not realising that this is not about perfection but about being the first to address new ideas.
