Why should you consider analytics when implementing or upgrading an ERP system?

Summary

  • When we talk about ERPs in this context, we also mean other transaction systems, for example CRM, HCM, and supply chain systems.
  • We talk about “organizations” instead of “companies” in this white paper because the same considerations that apply to companies also apply to non-profits and government entities.
  • Organizations upgrade or reimplement ERPs periodically to take advantage of vendors’ new features.
  • Implementing prepackaged analytic applications simultaneously can save time and money in the implementation of both the analytics and the ERP itself.
    • It offers the opportunity to move reporting done against the ERP that doesn’t need up-to-the-minute data to a data warehouse, where construction and maintenance are likely to be easier.
    • It simplifies the tuning and maintenance of the ERP because the ERP is then asked primarily to capture transactions, not to report on them as well. Doing both from one system makes it difficult to do either well.
    • Advances in near real time reporting from data warehouses make it possible to move even more reporting to the data warehouse than was feasible in prior years.
  • Delaying analytics until after the ERP is implemented or upgraded leaves people without good telemetry to run their processes for months, if not quarters, after the ERP project goes live and delays major benefits the organization could otherwise reap.

This paper discusses why implementing analytics, such as Oracle BI Applications, simultaneously with an ERP upgrade or reimplementation is not only possible but also a best practice for both the ERP and the analytics implementations.

Upgrading or implementing ERPs

Organizations upgrade ERPs for both functional and technical reasons.  Functionally, they want to take advantage of the new functionality vendors put into ERPs.

This may reflect:

  • upgraded best practices for the underlying process,
  • changes in the technology used to implement the processes, like self-service kiosks for HR or expense applications that let people photograph receipts for OCR processing, and
  • changes in legal or standards bodies’ regulations that the organization must adhere to.

Organizations’ IT departments also want to stay current enough with what their vendors support that they do not risk becoming unsupported on a mission critical application.  One of us visited a client who was running a version of the PeopleSoft ERP that was out of support, which used a version of PeopleTools that was out of support and a version of the Oracle database that was out of support.  These, in turn, were running on an out of support version of AIX on a pSeries that IBM no longer made critical spare parts for.  The organization knew it was skating on thin ice.  However, its business had taken a huge downturn, and it could not find the money to reimplement PeopleSoft.  It could not upgrade because it was so many releases behind that reimplementing would have been less costly.   No IT department wants to find itself in such a position.

Organizations also reimplement ERPs because they want to move from highly customized versions of an ERP that are hard to maintain to a more “vanilla” implementation.  Organizations’ functional units are expected to adapt their processes to the best practices the ERP vendor prescribes.  We see organizations migrating to SaaS products to force this change because SaaS products usually can only be configured, not customized.

ERP projects can be massive, running into millions of dollars and taking over a year from start to finish. To start, the planning for the project involves both functional and technical resources whose time needs to be coordinated.  New requirements must be collected.  Upcoming business and legal requirements need to be predicted and collected.  Often, because of the size of the project, clients will have to provision new test, development, and UAT instances.  The organization’s program management office will often become involved because the stakes are so high.

Once implementation is started, things remain complicated.  While SaaS ERP projects tend to be shorter because less customization is possible, they are still not trivial undertakings.  Besides the configurations, data must be migrated to the new ERP.  The migrated data needs to be tested to ensure that it generates expected results.  When sample transactions are entered, the results need to be tested to ensure that the transactions were processed correctly.  When the inevitable errors are found, their root causes must be found and fixed.  This is less of a problem for SaaS products than for on-premise ERPs, but the problem does not go away completely.

However, not all data is usually migrated, particularly when the new ERP is a cloud ERP.  Often SaaS products require data to be migrated one transaction at a time, putting a premium on migrating only the bare minimum of data.  Even so, people still need to conduct analysis across that boundary.  Making this analysis seamless ensures the organization does not miss a beat when the new ERP goes live.

Given the stakes, it is incumbent on all involved to ensure the organization gets the maximum benefit from this effort.  We therefore see it as a best practice that, during an ERP upgrade or reimplementation, both IT and functional teams look for ways to improve the organization’s outcomes.

Some topics they should address include how to:

  • increase the organization’s ability to react to changing conditions quickly
  • improve collaboration within the organization
  • improve the productivity of the organization
  • decrease error rates
  • decrease the cost and time to incorporate relevant new technologies into business processes
  • capture knowledge the organization generates and make it easy to access
  • adopt best practices within and across organizational pillars
  • determine root causes of deviations quickly and accurately

Performance optimization

We often see that as time passes, ERP performance declines.  One of us had an ironclad rule.  Whenever our consultants were called in to analyze ERP performance, we would first look at the reporting done directly from the ERP.  Inevitably we would find that reporting and transaction processing competed for the same resources, causing both to suffer.  We would also find that many of these reports did not need up-to-the-minute data.  Also, in an attempt to improve performance, more indices, materialized views, and other overhead had been added, making the ERP harder to administer, tune, and upgrade.  The solution was simple.  Take the reports that did not need up-to-the-minute data and run them from a data warehouse populated nightly.  Then, remove the added indices, views, and other overhead from the ERP.  Often the performance problems would disappear.

In addition, the volume and complexity of these reports have often grown over time, and clients end up with over a thousand reports.  These reports are often large and complex.  At one client, one of us ran into a report with over 400 columns.  It ran daily during off-peak times and more frequently during peak season.  This client, like many others, had retained a battery of consultants to maintain these reports.

These reports are often not used by themselves.  They are used to download data into spreadsheets.  Once data is exported to a spreadsheet, it can leak out of the organization more easily.  Also, it is easy for people to make errors in spreadsheet formulas, leading to wrong answers.  For people to avoid arguments about whose data is right and focus on more productive discussions of the meaning and implications of the data, we need to build an architecture that minimizes and even eliminates the need for downloads to spreadsheets. Today, self-service BI tools are good enough for end users to do most, if not all, of this reporting from a data warehouse without downloading data into spreadsheets.

Besides degrading response times, reporting directly from the ERP has another drawback.  The reporting tools delivered with ERP systems typically provide static reports with limited or no ability to drill down to details or slice and dice.  If details or a different slice is needed, another report must be built. Particularly for on-premise ERPs, the reporting tools are not designed for end users.  IT can then devolve into a report factory that never keeps up with the volume of requests.  This inability to keep up causes people to ask for massive reports to dump to spreadsheets.  Also, these reports have to be upgraded or rebuilt when the ERP is upgraded or reimplemented.  Clearly, this is not optimal.

We have attempted to understand how applicable this fix would be across clients.  When we analyzed the reports being run, who was running them, and how often they were being run, we found that only about 2% of report executions actually required data less than one day old.  The other 98% of report executions could be run off day-old data with no decrease in the utility of the report.  If we can move these report executions to a data warehouse, we can simplify tuning as well as the maintenance of the reporting.  Finally, with change data capture and micro ETL, one can pare even that 2% down, further removing load from the ERP and simplifying the tuning of its database for good performance.
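
For readers who want a concrete picture of what a micro ETL cycle can look like, the sketch below shows one common pattern: extract only the rows changed since a stored high-water mark, upsert them into a warehouse staging table, and advance the mark.  This is a minimal, generic illustration, not code from any ERP or from Oracle BI Applications; the table and column names are hypothetical, and SQLite stands in for both systems only so the example is self-contained.

    import sqlite3

    def micro_etl(src: sqlite3.Connection, dwh: sqlite3.Connection) -> int:
        """Copy rows changed since the last run from the source into the warehouse staging table."""
        # 1. Read the high-water mark left by the previous cycle (fall back to the epoch if none).
        row = dwh.execute(
            "SELECT last_extracted FROM etl_watermark WHERE table_name = 'invoices'"
        ).fetchone()
        watermark = row[0] if row else "1970-01-01T00:00:00"

        # 2. Incremental extract: touch only rows changed since the watermark,
        #    so the query against the ERP stays short and cheap.
        changed = src.execute(
            "SELECT invoice_id, amount, status, last_updated FROM invoices WHERE last_updated > ?",
            (watermark,),
        ).fetchall()

        # 3. Load: insert new keys, update existing ones (an upsert), rather than reloading everything.
        dwh.executemany(
            "INSERT INTO stg_invoices (invoice_id, amount, status, last_updated) VALUES (?, ?, ?, ?) "
            "ON CONFLICT(invoice_id) DO UPDATE SET amount = excluded.amount, "
            "status = excluded.status, last_updated = excluded.last_updated",
            changed,
        )

        # 4. Advance the watermark so the next cycle starts where this one ended.
        new_mark = max((r[3] for r in changed), default=watermark)
        dwh.execute(
            "INSERT INTO etl_watermark (table_name, last_extracted) VALUES ('invoices', ?) "
            "ON CONFLICT(table_name) DO UPDATE SET last_extracted = excluded.last_extracted",
            (new_mark,),
        )
        dwh.commit()
        return len(changed)

    if __name__ == "__main__":
        # In-memory stand-ins for the ERP and the warehouse, just to make the sketch runnable.
        src = sqlite3.connect(":memory:")
        dwh = sqlite3.connect(":memory:")
        src.execute("CREATE TABLE invoices (invoice_id INTEGER PRIMARY KEY, amount REAL, status TEXT, last_updated TEXT)")
        src.execute("INSERT INTO invoices VALUES (1, 250.0, 'OPEN', '2024-06-01T10:15:00')")
        dwh.execute("CREATE TABLE stg_invoices (invoice_id INTEGER PRIMARY KEY, amount REAL, status TEXT, last_updated TEXT)")
        dwh.execute("CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_extracted TEXT)")
        print(micro_etl(src, dwh), "row(s) moved")

Run on a short schedule, a cycle like this keeps the warehouse within minutes of the ERP while the ERP itself only ever answers one small, indexed query per cycle.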

Enriching an ERP with enterprise-wide analytics

Organizations increasingly realize that success relies on good analytics.  Books like Thomas Davenport and Jeanne Harris’s Competing on Analytics have been read widely enough to make people cognizant of the value analytics brings to corporate performance.  Several times a year, Harvard Business Review and similar journals publish articles describing how analytics drive organizational performance.

Leading organizations also realize that the truly interesting challenges they face do not fit neatly into one pillar of the organization.  For example, we hear call center managers ask, “How do customer service rep training, seniority, and certification affect average handle time and cross-selling performance?” To get truly rich insight, one must incorporate HR data into the analysis.  If the analytics are architected up front to span the enterprise and allow this and many other cross-functional questions to be answered easily, analytics become much more valuable to the organization than if they are architected to handle one pillar and one data source.

We also see that in less centralized organizations, different parts of the organization may have their own ERP instances.  Again, truly interesting analytics questions will not obey the organizational boundaries set up when these ERPs were implemented.  For example, if an industrial company wants to understand safety performance, it will want, consistent with national privacy regulations, to pull data from across all divisions and regions to get the largest sample and the most complete view of safety.  This larger sample will allow for sharing of best practices and identification of opportunities local managers may have missed.

We see, though, that analytics is often put off to a later phase or that the prebuilt transactional analytics provided with the ERP are considered “good enough”.  This tendency to take shortcuts has consequences.  It can mean that the more strategic analytics, and the value they bring, lag behind the ERP by a year or more.  If the ERP upgrade or reimplementation runs over its budget, the overage comes out of the budget for analytics, further shortchanging the analytics, the part of the project that Nucleus Research says delivers 10-13 times the investment in business benefits.

The impact of good design on maintenance costs

When planning for implementing analytics, good design up front means less maintenance, greater scalability, and better performance on the back end.

These principles include:

  • In the ETL, separate the extract, which must depend on the source, from the load, which should not (see the sketch after this list). This will simplify upgrading the data warehouse when any ERP is upgraded.
  • Design the data warehouse around best practices for the process being modeled, not on any vendor’s implementation of that process or reports the organization is used to seeing. This plus the point above simplifies adding more instances of the ERP to the data warehouse, for example for different subsidiaries or geographies.
  • Divide the work between the ETL and BI tools depending on where the work can be done at the lowest total cost of ownership.
  • Ensure the architecture supports real-time updates, micro ETL, and conventional batch ETL.
  • Minimize the time spent extracting the data from the ERP to minimize the load on it. At the same time, minimize the time updating the data warehouse.  In both cases, get in and get out quickly.  If the extract and load are as efficient as possible, there will be less demand for tuning to keep the loading of data into a data warehouse from impacting other systems.
  • If near real time reporting is needed, build that into the initial architecture. It will be harder to add on later.
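
The sketch below illustrates the first of these principles under stated assumptions: each source-dependent extractor maps its ERP’s fields onto one canonical, source-independent shape, so the load code never has to change when an ERP is upgraded or another instance is added.  The class and field names are hypothetical and are not taken from Oracle BI Applications or from any specific ERP.

    from dataclasses import dataclass
    from typing import Iterable, Protocol

    @dataclass
    class SupplierInvoice:
        """Canonical, source-independent shape that the load step works with."""
        invoice_id: str
        supplier_id: str
        amount: float
        currency: str

    class Extractor(Protocol):
        """Anything that can produce canonical records, however its source stores them."""
        def extract(self) -> Iterable[SupplierInvoice]: ...

    class ErpAExtractor:
        """Source-dependent: knows ERP A's field names and maps them to the canonical shape."""
        def __init__(self, rows: Iterable[dict]):
            self.rows = rows
        def extract(self) -> Iterable[SupplierInvoice]:
            for r in self.rows:
                yield SupplierInvoice(r["INV_NO"], r["VENDOR_CD"], float(r["AMT"]), r["CURR"])

    class ErpBExtractor:
        """A second source with different field names; only this class changes when ERP B changes."""
        def __init__(self, rows: Iterable[dict]):
            self.rows = rows
        def extract(self) -> Iterable[SupplierInvoice]:
            for r in self.rows:
                yield SupplierInvoice(r["invoiceNumber"], r["supplierId"], float(r["total"]), r["currencyCode"])

    def load(invoices: Iterable[SupplierInvoice]) -> None:
        """Source-independent: upgrading or adding an ERP does not touch this code."""
        for inv in invoices:
            print(f"load {inv.invoice_id}: {inv.amount} {inv.currency} from supplier {inv.supplier_id}")

    if __name__ == "__main__":
        sources: list[Extractor] = [
            ErpAExtractor([{"INV_NO": "A-100", "VENDOR_CD": "V1", "AMT": "250.00", "CURR": "USD"}]),
            ErpBExtractor([{"invoiceNumber": "B-7", "supplierId": "V9", "total": 99.5, "currencyCode": "EUR"}]),
        ]
        for source in sources:
            load(source.extract())

When an ERP is upgraded, or another subsidiary’s instance is added, only the corresponding extractor changes; the canonical model, the load, and everything downstream of it stay put.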


Why the ERP upgrade or reimplementation is the best time to add enterprise-wide BI

We have seen that, given all these imperatives, an ERP upgrade or reimplementation is the ideal time to add enterprise BI to an ERP and that Oracle BI Applications are a good platform to do this with.  People we speak with find this notion counterintuitive.  After all, an ERP upgrade or reimplementation is often a huge project.  Adding to the scope of the project seems to increase both the likelihood of failure and the amount of money we are asking the organization to spend.

We want to look at this issue from three viewpoints.

  • There are synergies in doing the ERP upgrade and the BI implementation simultaneously that drive down the cost of the combined project versus doing them separately.
  • There are project management techniques one can bring to bear to mitigate the risk of a project with a larger scope.
  • There is value to the organization in delivering the analytics sooner. If there weren’t, why implement them at all?

Synergies:  As described above, many organizations do a lot of reporting against their ERPs.  Also, much of this reporting can be done with yesterday’s data with no loss in the value of the report.  BI tools typically make reports easier to build and provide more functionality than the tools provided by ERP vendors.  These features let developers combine several reports into one using features like filters, row and column selectors, and view selectors, cutting development time and cost and lowering ongoing maintenance costs.

In planning and executing an ERP upgrade or reimplementation as well as a BI implementation, one will end up doing fit-gap analyses and determining report requirements for both.  Doing both activities at once allows an organization to avoid duplicate effort.

When errors are found, it is typically easier to track down their source with BI tools than with the tools typically provided with ERP systems.  If one has prebuilt adapters to the ERP systems, as Oracle BI Applications do for Oracle on-premise and SaaS ERPs, one knows those links are solid and can do much of the root cause analysis from the data warehouse rather than from the ERP system.

Together, these mean that the ERP upgrade or reimplementation and the accompanying BI can be delivered in almost no extra time compared to an ERP upgrade or reimplementation on its own.

Risks:  While there are several negative risks associated with simultaneous implementation of ERP and BI, there are also positive risks.

One of the main negative risks in such an endeavor is the larger number of stakeholders involved in the project. This increases the communication burden for the project tremendously. The number of communication channels is n(n-1)/2, where n is the number of stakeholders. So, for example, where an ERP project has 10 stakeholders, adding even 2 stakeholders from the BI team increases the communication channels from 45 to 66 – roughly a 50% increase! If not managed effectively, this can lead to rework, resulting in lost time and money. A larger number of stakeholders also tends to make design and testing cycles longer, partly because of calendar coordination difficulties. When embarking on such a combined effort, it is prudent to secure commitment from all the stakeholders up front and to embed the organizational change management effort early in the project cycle.
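
As a quick check of that arithmetic, the channel count is simply the number of stakeholder pairs:

    C(n) = \binom{n}{2} = \frac{n(n-1)}{2}, \qquad C(10) = 45, \qquad C(12) = 66, \qquad \frac{66 - 45}{45} \approx 47\%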

On the side of positive risk, however, there are benefits to be had from the increased synergies in design. Table structures can be designed and indexed based on the needs of BI. This design also streamlines the ETL processes, making incremental load times shorter. Because the BI needs were considered alongside the ERP needs, adoption of the system is likely to be higher, making it the single source of truth rather than leaving people dependent on offline spreadsheets. This generally leads to faster and more effective decision making as well.

Value:  Various studies from Nucleus Research and other analysts have shown that analytics projects typically return between ten and thirteen times their investment.  Often, we find that investments in analytics, if accompanied by good business integration and change management, have payback periods of less than eighteen months.  By implementing analytics simultaneously, an organization can start recouping that value roughly a year earlier than it would with a more sequential development methodology.  If you have done a business case for these analytics, estimate how much you would forgo by delivering them several quarters later.

Why Oracle BI Applications are a good way to implement enterprise analytics

Many of the core functions in an ERP are best monitored by well understood processes.  As such, Oracle has built many of these processes into products, the Oracle BI Applications.  We use these as a base for implementing analytics against ERP systems for several reasons:

  • They incorporate best practices for BI, like
    • Separating the source dependent extract from the source independent load
    • Using incremental extracts and bulk load tools to minimize the impact on the ERP and the data warehouse
    • Intelligently deciding whether to update records or insert records
    • Dividing the work between the BI tool and the ETL tool
  • They have industry standard definitions for derived metrics, sparing IT departments from having to learn the specific subject matter while still delivering well understood metrics.
  • They can handle semi-additive and non-additive metrics without users having to configure them.
  • They have built-in support for slowly changing dimensions (see the sketch after this list).
  • The data model is set up to enable the kind of cross-functional analysis we discussed before.
  • They have the capability for near real time reporting using GoldenGate.
  • They have documentation on how to modify the applications to deal with configurations and customizations without compromising the ability to upgrade them.
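
To make the update-versus-insert and slowly changing dimension points concrete, here is a minimal, generic sketch of a Type 2 slowly changing dimension merge.  It is not code from Oracle BI Applications; the dimension, attribute, and function names are hypothetical, and a real implementation would do this work in set-based SQL rather than row-by-row Python.

    from dataclasses import dataclass, replace
    from datetime import date
    from typing import Dict, List, Optional

    @dataclass
    class SupplierDim:
        supplier_id: str              # business key from the ERP
        name: str
        payment_terms: str            # tracked attribute: a change creates a new version
        effective_from: date
        effective_to: Optional[date]  # None marks the current row

    def scd2_merge(dim: List[SupplierDim], incoming: Dict[str, dict], load_date: date) -> List[SupplierDim]:
        """Apply today's extract to the dimension: insert new keys, version changed rows, skip the rest."""
        current = {row.supplier_id: row for row in dim if row.effective_to is None}
        result = list(dim)
        for key, attrs in incoming.items():
            existing = current.get(key)
            if existing is None:
                # New business key: a plain insert of a current row.
                result.append(SupplierDim(key, attrs["name"], attrs["payment_terms"], load_date, None))
            elif (existing.name, existing.payment_terms) != (attrs["name"], attrs["payment_terms"]):
                # Tracked attributes changed: close out the old version and open a new one.
                result[result.index(existing)] = replace(existing, effective_to=load_date)
                result.append(SupplierDim(key, attrs["name"], attrs["payment_terms"], load_date, None))
            # Otherwise nothing changed, so neither an update nor an insert is needed.
        return result

    if __name__ == "__main__":
        dim = [SupplierDim("V1", "Acme Corp", "NET30", date(2023, 1, 1), None)]
        today = {
            "V1": {"name": "Acme Corp", "payment_terms": "NET45"},  # payment terms changed
            "V2": {"name": "Globex", "payment_terms": "NET30"},     # brand new supplier
        }
        for row in scd2_merge(dim, today, date(2024, 6, 1)):
            print(row)

Keeping history this way is what lets the warehouse answer questions such as “what were this supplier’s terms when the invoice was booked,” which a simple overwrite of the ERP record cannot.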

Oracle builds and maintains connectors to Oracle-branded ERPs and has committed to keeping these up to date as it evolves the underlying ERPs.  Partners have also built their own adapters to systems Oracle does not support, like Salesforce.

HEXstream does not exclusively implement Oracle BI Applications, but in the areas they cover, when implemented by a partner that knows them well, they represent a good way to get value quickly.  We understand how to extend these and to build completely custom data warehouses where that makes sense.

Preparing for the next upgrade, on premise or in the cloud

Building an analytics infrastructure like the one we have described saves money, not just once but on an ongoing basis.  Besides lowering the cost of this upgrade or reimplementation, a well-constructed infrastructure saves money on ongoing maintenance.  Because the organization has tested the end-to-end data flow, there are fewer places for an upgrade or modification to fail between upgrades.  Finally, while Oracle is not developing new content for the Oracle BI Applications, it does continue to develop new adapters for new releases of Oracle-branded ERPs, both on-premise and in the cloud.  This work simplifies the work that will be needed when the next upgrade comes around, whether the organization decides it wants to continue to run on-premise, run using a bring-your-own-license model on top of IaaS, or run on a SaaS product.

Organizations, we find, also want to consider running their analytics in the cloud.  Oracle BI Applications can be run in a variety of cloud architectures, depending on the organization’s and the cloud vendor’s cost structure and the organization’s appetite for running in the cloud.

  • The BI layer can be run in the cloud on Oracle Analytics Cloud, connecting to an on-premise data warehouse using VPNaaS. This migration is relatively straightforward.
  • The BI layer and the database can be run in the cloud, with the ETL and the ERP still on-premise, again using VPNaaS to connect the on-premise and cloud parts of the solution. The cloud-based database can be either Oracle Database Cloud Service or a BYOL database license running on Oracle IaaS.
  • All parts of the solution – ERP, data integration, data warehouse, and BI – can be in the cloud. Given Oracle’s current support matrix, an organization that elects this architecture has to use ODI running on IaaS rather than ODI Cloud Service, at least until ODI 12c is certified for Oracle BI Applications.

Finding a partner to help you on the way

We have seen that a critical factor for the success of any Oracle BI strategy and implementation is employing a partner experienced in, and preferably specialized in, analytics.  Our principals have implemented Oracle BI Applications and custom BI solutions sourcing data from both Oracle and non-Oracle-branded ERPs.  They have implemented both real-time GoldenGate-based solutions and conventional ETL-based solutions using Informatica and Oracle Data Integrator.  In utility analytics, HEXstream is a co-developer of Oracle Utility Analytics.

We can help make this journey easier and less risky.

Conclusion

When an organization decides to upgrade or reimplement its ERP, it should consider implementing analytics simultaneously.  Doing so helps accelerate the returns from the combined system while reducing the time, cost, and risk of the upgrade or reimplementation.

As the organization implements analytics, it should lay the foundation for enterprise analytics, crossing functional, organizational, and geographic boundaries.  Doing so will allow the organization to answer the more interesting questions that bedevil senior executives.  Being able to answer these questions will allow the organization to thrive in challenging times.

About the Authors

Will Hutchinson

Will Hutchinson is the Director of the Analytics Practice at HEXstream.  He is a former Master Principal Sales Consultant for Oracle and has over 35 years of experience with data warehousing and analytics. He is the author of a book on analytics and has extensive industry experience spanning pharmaceuticals, oil and gas, consumer goods, insurance, and manufacturing. Will is an expert in ROI and TCO analysis and is a polished speaker and trainer.

Jamal Syed

Jamal Syed is the Chief Executive Officer at HEXstream.  He has over 20 years of BI experience and keeps a keen eye on emerging technologies in this industry.  Jamal is considered a thought leader in the analytics space and enjoys speaking at conferences across the country each year.

 
