Does your refresh schedule look like this?

By Daniel Parker

With more than 15 years of SAP experience, Daniel Parker specialises in data copy automation and data security. He leads an experienced consulting team, and delivers a variety of landscape solutions to organisations in the APJ region.

Written on Oct 16, 2017 6:19:57 PM

2 minute read

The reliability of testing data

Reliable test data, in both quality and quantity, has become a core requirement of any modern IT organisation. The ability to effectively trial and test any change before applying it to the production environment is paramount. But what is considered quality data? And how often should you refresh your landscape? This blog post provides key indicators to check whether you’re working with poor-quality data, along with an example of a healthy refresh schedule.

Why do we all need good testing data?

  • General BAU production support, troubleshooting & issue replication
  • System upgrades and patching (EHP8, S/4 etc)
  • Activation of new business functions or new configuration to match a changing business process
  • Mass maintenance of master data:
    • Pricing conditions change project
    • ERP System consolidation
    • ERP System split following a business sale, split or restructure

“The average organization loses $8.2 million annually through poor Data Quality” - Gartner

Are you working with poor quality data?

To understand quality testing data, we first need to understand what actually constitutes ‘poor’ testing data. Key indicators to review the value of your current testing data are Integrity, Age, Industry Relevance and Security.

  • Integrity: Poor testing data often lacks production-like integrity: data relationships within a system are incorrect or inconsistent (e.g. ECC: Material -> PO -> Vendor), and data is not consistently connected across system boundaries (e.g. ECC -> CRM, ECC -> BW).
  • Age of data has two considerations:
    • Most of the time, only the most recent 10% of the transactional data in your systems is of interest for testing; the remaining 90% just consumes space in the testing environment and adds little value. Reduced time slices of 3 to 6 months cover the bulk of requirements.
    • The second consideration is how old the data is compared to production. SAP customers on a yearly QA refresh cycle spend 9 months of the year with all of their QA testing data over 3 months old.
  • Relevance: Is your refresh cycle aligned with your industry's needs? For example, SAP systems performing Retail or Utilities functions typically have larger data volumes and very short transaction cycles, creating a need for very recent data in your test systems.
  • Security: Testing data should be production-like, but it need not be identical to production (which creates a security and governance headache). Information of a personal or sensitive nature in testing data needs to be masked to ensure that neither the data nor your business reputation is put at risk.
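
The age consideration above boils down to a simple date filter. Purely to illustrate the idea (this is not how Data Sync Manager implements it; the function name, month window and dates are illustrative), a 3-month time slice over transactional document dates might look like this:

```python
from datetime import date, timedelta

def in_time_slice(doc_date, months=3, today=None):
    """Illustrative time-slice filter: keep only documents posted
    within roughly the last `months` months - the slice of
    transactional data a reduced-copy refresh would carry over."""
    today = today or date.today()
    cutoff = today - timedelta(days=months * 30)
    return doc_date >= cutoff

# Only recent documents survive the slice; the older bulk is left behind.
docs = [date(2017, 9, 30), date(2016, 1, 15), date(2017, 8, 1)]
recent = [d for d in docs if in_time_slice(d, today=date(2017, 10, 16))]
```

Of the three example documents, only the two posted within the last three months make it into the reduced test system.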

It follows that quality testing data will be reduced in volume to target your need, maintain integrity both in and cross-system, and be intelligently masked so as not to expose sensitive data outside of the production system.
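
The intelligent part of intelligent masking is consistency: the same sensitive value must mask to the same replacement everywhere it appears, or the data relationships described above break. As a minimal sketch of that principle (not Data Sync Manager's actual masking method; the function, salt and vendor numbers are illustrative), deterministic hash-based pseudonymisation achieves this:

```python
import hashlib

def mask_value(value, salt="demo-salt"):
    """Deterministically pseudonymise a sensitive field: the same
    input always yields the same masked output, so relationships
    between records stay consistent after masking."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:8].upper()

# The same vendor number masks identically wherever it appears,
# so a PO and the vendor master record still join up in the test system.
po_vendor = mask_value("VENDOR-1001")
master_vendor = mask_value("VENDOR-1001")
assert po_vendor == master_vendor
```

Because the mapping is one-way, the real vendor number cannot be recovered from the test system, yet every reference to it remains internally consistent.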

What are the negative impacts on your business?

The most obvious impact is increased strain and resource requirements for testing teams. According to NIST, the average test team spends 30-50% of its time setting up environments rather than actually testing. NIST also found that 74% of IT projects experience some form of delay related to testing data quality issues.

Here at EPI-USE Labs, other common negative impacts we find and resolve amongst our customer base are:

  • Slow project ramp-up times and resource burn rates while waiting for data refreshes
  • Long test system outage times while refreshes occur
  • High traditional SAN/Disk storage costs outside of production
  • High cost of HANA appliances and maintaining non-production appliances of equal memory size to production
  • Complex cross-team dependencies for the provision of test data (the Basis team needs the Infrastructure team to allocate disk, the Functional team needs the Basis team to copy the whole system; no one team is empowered)
  • Restricted functional and testing team access in non-production due to sensitive data not being scrambled
  • Ineffective testing due to reliance on manually built data rather than real production-like data. Testing becomes only as good as the built data, and is not truly reflective of production operation.

“Data Quality Best Practices boost revenue by 66%”

- SiriusDecisions

What a healthy refresh schedule looks like

Some of Asia Pacific's largest organisations, such as Lion Co and Fairfax Media, have improved their test system data quality with the help of an SAP data copying tool called Data Sync Manager. Below is an example of a refresh schedule that becomes possible with the features of Data Sync Manager: a proven approach to providing a simple, ongoing refresh of your SAP testing data.

Download The Refresh Schedule

Topics: data testing
