The question concerns the future of enterprise systems in organisations as they cope with the widespread use of today's hottest ICT topics, i.e., big data, data science, open data, and IoT. There is surely an impact, but what kind of impact should be expected?
My view is that there will be a race to find the bottlenecks. The successful organisations will be those that find and correct them before they become a problem.
Usually, the longer the transmission, the lower the bandwidth, so local caches can reduce the long-haul load by delivering repetitive data locally rather than always going to the source, i.e. distributed data. This leads to complex systems management, though, as one has to check that the source has not changed (otherwise the cache sends out-of-date data), and this leads to distributed indexing, which is intellectually challenging, especially where some internet connections are unreliable. For example, if two users change the same data in the same database (say, due to a connection outage), which one takes precedence? Either the latest write has to be accepted as definitive (with the attendant risk of data corruption), or user workload has to increase to reconcile the data manually.
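The precedence question above is often resolved with a "last write wins" rule. A minimal sketch of that strategy is below; the record structure and timestamps are illustrative assumptions, not a reference to any particular product, and real systems often use vector clocks because wall clocks on different machines can disagree.

```python
from dataclasses import dataclass

# Illustrative record: a value plus the wall-clock time it was written
# on the writing node.
@dataclass
class Versioned:
    value: str
    written_at: float  # seconds since epoch on the writing node

def last_write_wins(a: Versioned, b: Versioned) -> Versioned:
    """Resolve two conflicting writes by keeping the most recent one.

    Simple and automatic, but the 'losing' write is silently discarded,
    which is exactly the data-corruption risk described above.
    """
    return a if a.written_at >= b.written_at else b

# Two users changed the same record while a connection was down:
site_a = Versioned("qty=100", written_at=1000.0)
site_b = Versioned("qty=95", written_at=1002.5)

winner = last_write_wins(site_a, site_b)
print(winner.value)  # the later write survives; the earlier one is lost
```

The alternative mentioned above, having users reconcile conflicts by hand, keeps both writes but shifts the cost onto people instead of the system.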
The other problem area is routers, which can cause bottlenecks as traffic is combined. One recent example involves the main Internet routers: many older ones have a hard limit of roughly 512,000 entries in their routing tables, and when the global routing table recently exceeded that size, some ISPs in the UK suffered serious outages or long delays. So the problem of keeping infrastructure up to date is significant, especially in poorer countries, or in parts of an organisation that may not have a sufficient budget.
Also, fibre will become more significant as local bandwidths increase. Here in the UK the so-called "last mile" is the biggest problem: getting fibre to the home or workplace means digging up roads, with all the cost and disruption that entails. BT have quite a good solution of running fibre to the local junction box, with the last stretch on copper. In our village (I am a home worker, so I need fast internet to my house) the Internet speed used to be under 1 Mb/s, as we are a long way from the exchange. One fibre connection to the box meant I just needed to upgrade my router, and the speed is now 40 Mb/s, including on-demand TV. It is upgradable to 80 Mb/s if I pay a premium.
Thanks for the great insights, but I am actually trying to explore the question through an Enterprise Systems lens. That is, what would the ERP systems of the future look like?
In any case, your insights are indeed helpful.
Current systems only show historical information. A few can show the effect of orders on profit (most can only show the effect on profit once an invoice comes in).
A digital dashboard that could show the real-time effect of decisions, including what-if scenarios, would revolutionise management.
Even better would be Monte Carlo analysis of the effects of decisions, i.e., probabilities of outcomes based on the most likely scenarios.
This would give more control to the main board and predict the outcomes of decisions in real time, educating the board and leading to better choices.
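As a rough sketch of what such a Monte Carlo what-if view might compute, the example below compares two candidate prices and reports a profit distribution for each. The demand model, margins, and figures are all invented for illustration; a real dashboard would fit these parameters to the firm's own historical data.

```python
import random

def simulate_profit(price: float, unit_cost: float, n_trials: int = 10_000,
                    seed: int = 42) -> dict:
    """Monte Carlo what-if: profit distribution for a pricing decision.

    Demand is modelled (purely for illustration) as normally distributed
    and falling as price rises.
    """
    rng = random.Random(seed)
    profits = []
    for _ in range(n_trials):
        # Assumed demand model: base demand of 1000 units, falling 8 units
        # per currency unit of price, with random noise.
        demand = max(0.0, rng.gauss(1000 - 8 * price, 120))
        profits.append(demand * (price - unit_cost))
    profits.sort()
    return {
        "expected": sum(profits) / n_trials,
        "p5": profits[int(0.05 * n_trials)],   # pessimistic (5th percentile)
        "p95": profits[int(0.95 * n_trials)],  # optimistic (95th percentile)
    }

# Compare two candidate prices before committing to either:
for price in (50.0, 60.0):
    r = simulate_profit(price, unit_cost=30.0)
    print(f"price {price}: expected {r['expected']:.0f}, "
          f"5%-95% range [{r['p5']:.0f}, {r['p95']:.0f}]")
```

The point is the shape of the output: not a single predicted profit, but a range of probable outcomes a board could weigh in real time.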
Big-data and its peer systems are 'complementary' technologies that still need traditional ERP systems.
Big-data and its peer systems need 'specialized' resources (both human and machine) to get started and to keep functioning.
In this respect, from the ERP side, what we are presently seeing in the industry is:
Many organizations view big-data and its peer technologies as 'disruptive' and hence unwelcome until a clear-cut ROI is established. This, however, is set to change once the industry understands that big-data and its peers are 'complementary' technologies designed to work in tandem with existing infrastructure to get the best out of it.
Most big-data service providers are inclining towards IaaS/PaaS/SaaS models, which is a win-win situation for both providers and consumers (organizations).
Considering all this, what we can expect a couple of years down the line is: many organizations moving to big-data and its peer technologies without additional CAPEX, enjoying the benefits through IaaS/PaaS models that work in tandem with their existing infrastructure.
In other words, the existing data-warehouse systems are not going to change or go anywhere, but organizations are going to get more insight into their DW data from borrowed resources maintained elsewhere (probably on AWS, Azure, Rackspace, and so on).
The point to be noted here is: Big-data technologies, in their present state, cannot replace traditional OLTP systems.
Some organizations may slowly phase out their on-premise DW systems to rely completely on PaaS, but sooner or later the world is going to re-discover the freedom of the 'own it yourself' principle, and the systems will come back on-premises. It's a cycle, just like everything else.
Thanks, Palem, for the heads-up. As a matter of fact, I do subscribe to most of what you shared, but I do not think that ERP will remain static. Something has to happen to cope with the new changes. Indeed, big data will continue to rely on sources including ERP, and surely neither big data nor DW is meant to replace OLTP/TPS. Yet there exists a reciprocal effect. That is, the long-standing conception of ERP as a system with a single DB is changing. Also, the "openness" of the system is taking a new shape: project management, stakeholders, costing models, etc. The picture has still not fully unfolded.