
Why Data Management is Essential for Efficient and Compliant SAP S/4HANA Operations

  • Writer: Adam Hislop
  • Sep 9
  • 10 min read

Updated: Sep 16



For organisations moving to S/4HANA, and particularly for those new to the world of in-memory computing, there is a far greater imperative than ever before to proactively manage resources. My experience of managing SAP systems across the globe over the past three decades is that data management has rarely been treated as a strategic priority. Too often, it was perceived as a difficult and low-return activity. Storage was relatively inexpensive and on a steady downward trajectory: since the late 1990s, the cost per gigabyte of disk storage has dropped from well over US$100 to just a few cents today. In such an environment, data housekeeping was rarely front of mind. In most cases, archiving or data reduction initiatives were triggered reactively by external events: deteriorating system performance, an impending upgrade, or the need to replatform.


Even when performance became an issue, the instinctive response was usually technical tuning. Development teams would look to optimise SQL statements or add indexes, which in reality only added further weight to the database without addressing the root cause: too much data being held unnecessarily. Furthermore, in traditional database environments, even if data was archived, the underlying storage footprint could only be reduced through reorganisation. For much of my career this was an offline task requiring downtime, making it an unattractive option for system owners wary of service interruptions.


The tangible benefits of good data management, however, are powerful and extend well beyond reducing the size of a production database. A smaller database footprint lowers primary storage costs, but it also cascades into savings across the system landscape. Quality and test systems are usually sized and refreshed in line with production, meaning inefficiencies are replicated several times over. When you factor in backups, the picture becomes even more compelling. A single gigabyte of data in production may be backed up daily for 30 days, compressed but still multiplied many times over, and then duplicated again across non-production environments. In some cases, one gigabyte of production data can equate to more than ten gigabytes of storage when measured across the entire landscape. Add in replication to disaster recovery sites or data warehouses, and the multiplier effect becomes staggering.
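
To put a number on that multiplier, the sketch below works through the arithmetic for a single gigabyte of production data. Every input (retention period, compression ratio, number of non-production copies and DR replicas) is an illustrative assumption rather than a measurement, but the shape of the result holds across most landscapes.

```python
# Back-of-envelope sketch of the landscape storage multiplier for one
# gigabyte of production data. All inputs are illustrative assumptions.

prod_gb = 1.0

backup_retention_days = 30   # assumption: daily backups kept for 30 days
backup_compression = 0.25    # assumption: backups compress to ~25%
non_prod_copies = 2          # assumption: QA and test refreshed from production
dr_replicas = 1              # assumption: one disaster-recovery replica

backup_gb = prod_gb * backup_retention_days * backup_compression  # 7.5 GB
non_prod_gb = prod_gb * non_prod_copies                           # 2.0 GB
dr_gb = prod_gb * dr_replicas                                     # 1.0 GB

total_gb = prod_gb + backup_gb + non_prod_gb + dr_gb
print(f"1 GB in production is roughly {total_gb:.1f} GB across the landscape")
# With these inputs: 1 + 7.5 + 2 + 1 = 11.5 GB, a better-than-tenfold multiplier
```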



There are also indirect operational benefits. Smaller databases result in shorter backup windows, reducing the performance impact on business operations. System copies, a regular requirement for project work, testing or compliance, become faster and less resource-intensive. In fact, with leaner data sets, full system copies become more viable, allowing organisations to refresh environments more frequently without resorting to costly selective copy tools. These operational efficiencies are often overlooked, yet they directly influence project agility and business continuity.


In my career I have never seen archiving or structured data management considered seriously at the design stage of an SAP project. It has always been an afterthought, a post go-live activity addressed years down the line, long after the project team has disbanded and institutional knowledge has faded. This was manageable in the past because storage was cheap and performance costs were deferred. That paradigm changes completely with S/4HANA.


With HANA, every gigabyte of data is not merely a block on disk but an object that must also be loaded and persisted in memory. The economics are transformed. To illustrate: I have worked on Oracle systems where a 3-terabyte database was comfortably supported by a server with 64 gigabytes of memory. In HANA, the same dataset, even after columnar compression reduces it to a third of its size, still requires 1 terabyte of memory, plus working memory overhead. The cost profile of each additional gigabyte becomes far more significant, impacting hardware sizing, licensing, and operational expenditure.
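
As a rough illustration of that sizing arithmetic, the sketch below applies the widely quoted rule of thumb that a HANA system needs around twice its compressed data footprint in RAM (data plus working memory). The compression ratio and overhead factor here are assumptions; any real sizing should come from SAP's official sizing reports.

```python
# Rough HANA memory-sizing sketch. The ratios are rule-of-thumb
# assumptions, not SAP-published figures for any specific system.

source_db_tb = 3.0           # classic-database footprint from the example
compression_ratio = 3.0      # assumption: roughly 3:1 columnar compression
working_memory_factor = 2.0  # rule of thumb: data plus similar working memory

data_footprint_tb = source_db_tb / compression_ratio           # ~1.0 TB
indicative_ram_tb = data_footprint_tb * working_memory_factor  # ~2.0 TB

print(f"Compressed data footprint: {data_footprint_tb:.1f} TB")
print(f"Indicative RAM requirement: {indicative_ram_tb:.1f} TB")
```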

This is why data management in S/4HANA can no longer be deferred to some distant future initiative. It must be designed and executed from day one, not as an afterthought. Once the true cost of each gigabyte is understood, as it is replicated, backed up, and multiplied many times across the landscape, it becomes clear that proactive strategies are not optional. Some approaches are straightforward and inexpensive to implement, while others require more investment and ongoing effort, but all should be considered as part of a comprehensive data management strategy from the outset.


When discussing data management in S/4HANA, the logical starting point is the simple yet highly effective practice of housekeeping. In my experience, the most cost-efficient improvements often come from addressing the basics, and SAP itself provides clear guidance in this area. OSS Note 2388483, “How-To: Data Management for Technical Tables”, is the definitive reference for the range of technical housekeeping activities that every organisation should review and adopt. At its core, the approach is straightforward: standard ABAP programs can be scheduled with variants to automatically delete unnecessary technical data. Although some scenarios require more detailed analysis before they can be applied, many can be implemented with minimal effort, yielding what is effectively free capacity reduction. Neglecting these measures is akin to leaving the bins uncollected; the waste accumulates silently until it becomes a real problem.
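
For a flavour of what the note covers, the sketch below lists a few commonly cited technical tables alongside the standard deletion reports used to clear them. These pairings are well-known examples only; OSS Note 2388483 remains the authoritative reference, and each report should be tested with an appropriate retention variant before being scheduled.

```python
# Illustrative pairings of technical tables and standard SAP deletion
# reports. Verify each scenario against OSS Note 2388483 before use.

HOUSEKEEPING_CANDIDATES = {
    "BALHDR/BALDAT": "SBAL_DELETE",  # application logs
    "SNAP":          "RSSNAPDL",     # ABAP short dumps
    "TBTCO/TBTCP":   "RSBTCDEL2",    # finished background jobs
    "TSP01":         "RSPO1041",     # old spool requests
    "DBTABLOG":      "RSTBPDEL",     # table change logs
}

for tables, report in HOUSEKEEPING_CANDIDATES.items():
    print(f"{tables:15} -> schedule {report} with a retention variant")
```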


One of the more remarkable patterns I continue to encounter is the tendency for organisations to allow binary objects such as images, attachments and scanned documents to remain stored directly within the database. SAP’s default configuration is to hold this content in tables like SOFFCONT1, and unless an alternative content repository is established early, this is precisely what will happen. I have seen production systems where the majority of the storage footprint, sometimes as much as two thirds of the database, was consumed by these tables alone. The cost of holding such data in HANA is compounded because it is not only stored but also requires memory allocation, and because backup processes must handle the full volume, with limited opportunity for compression. Daily backups of such content quickly multiply into a staggering waste of resources.
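
Before planning any remediation, it is worth quantifying the problem. A minimal sketch, assuming the hdbcli Python driver and an ABAP schema named SAPHANADB (the connection details and schema name are placeholders), queries HANA's M_CS_TABLES monitoring view for the attachment tables:

```python
# Sketch: measure how much memory the SOFF* attachment tables consume,
# via the hdbcli driver and HANA's M_CS_TABLES monitoring view.
# Host, port, credentials and schema name are illustrative placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="MONITORING_USER", password="...")
cur = conn.cursor()
cur.execute("""
    SELECT TABLE_NAME,
           ROUND(MEMORY_SIZE_IN_TOTAL / 1024 / 1024 / 1024, 2) AS SIZE_GB,
           RECORD_COUNT
    FROM   M_CS_TABLES
    WHERE  SCHEMA_NAME = 'SAPHANADB'
      AND  TABLE_NAME LIKE 'SOFF%'
    ORDER  BY MEMORY_SIZE_IN_TOTAL DESC
""")
for table_name, size_gb, record_count in cur.fetchall():
    print(f"{table_name}: {size_gb} GB in {record_count} records")
conn.close()
```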


The remedy is simple but requires discipline from the outset. Establishing a Document Management System, whether SAP’s own Content Server, SAP Document Management on BTP, or a third-party solution such as OpenText, ensures that binary objects are held in the right place, outside the transactional database. Even if the issue has already developed, SAP provides migration routines that allow legacy content to be moved into an external repository with relative ease. The impact of such a step can be transformative: by reducing the size of the in-memory database, organisations not only lower licensing and infrastructure costs but also cut down on backup windows and improve overall system efficiency.


Housekeeping should therefore be regarded as the first line of defence in managing S/4HANA data volumes. It is preventive maintenance in its purest form. When carried out properly, it reduces technical debt, protects system performance, and helps IT teams stay ahead of uncontrolled data growth before it escalates into more complex and expensive challenges.


Another important aspect of managing data in S/4HANA is the use of Data Aging. This capability, which has its origins in Suite on HANA, allows tables to be segmented based on the age of their records, typically applying thresholds such as twenty-four months. When data crosses the threshold, it is flagged as “aged” and placed in a separate partition of the table. The effect is that this data is no longer automatically loaded into hot in-memory storage, but instead resides in warm storage. Crucially, it remains accessible through standard application logic and can be read programmatically if required. In practice, this means business users continue to have access to the information, but queries and transactions that require it may execute more slowly.
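
At the SQL level, the behaviour looks roughly like the sketch below. It assumes an aging-enabled table (BKPF is used purely as an illustration) and HANA's RANGE_RESTRICTION clause, which determines whether the historical partitions are scanned at all; by default, queries from the application server are restricted to the current partition.

```python
# Sketch of Data Aging at the SQL level. Assumes BKPF is aging-enabled;
# schema, table and date are illustrative. The RANGE_RESTRICTION clause
# controls whether warm (historical) partitions are read.

CURRENT_ONLY = """
    SELECT COUNT(*) FROM SAPHANADB.BKPF
    WITH RANGE_RESTRICTION ('CURRENT')     -- hot partition only: fast path
"""

INCLUDE_AGED = """
    SELECT COUNT(*) FROM SAPHANADB.BKPF
    WITH RANGE_RESTRICTION ('2018-01-01')  -- also read partitions aged on or
                                           -- after this date: slower, but the
                                           -- data stays reachable in place
"""
```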


The distinction between Data Aging and archiving is an important one. Archiving removes data from the active database and stores it in a separate archive file, often outside the primary S/4HANA environment. Retrieval is possible, but it typically requires dedicated processes and is often more suited to compliance, audit, or dispute resolution scenarios than to day-to-day operations. By contrast, Data Aging keeps the data online within the same database, ensuring business continuity while still delivering a reduction in the memory footprint. It is, in effect, a technical optimisation that balances performance with efficiency rather than a full data lifecycle management measure.


In practice, I have found that Data Aging is particularly useful in scenarios where data is retained for legal or operational reasons but is only infrequently accessed. Such records must remain available for statutory reporting or audit but do not require the same performance levels as current-year transactions. By shifting these into the aged partition, the organisation gains the benefit of reduced in-memory costs without compromising compliance.


The challenge, however, lies in the relatively limited set of predefined scenarios that SAP currently provides for Data Aging. Customers are restricted to those areas where SAP has enabled the framework, and it is far from exhaustive. This constraint means that organisations cannot rely on Data Aging as a universal solution to their data growth challenges. It is also worth noting that implementing Data Aging, whilst simple, is not a purely technical decision. It requires alignment with business stakeholders, as the change in access speed can affect reporting and user experience. Without careful communication and agreement, there is a risk of resistance or confusion when aged data does not perform in the same way as current transactions.


Despite these limitations, Data Aging remains a valuable part of the S/4HANA data management toolkit. It provides an elegant means of reducing memory consumption and controlling growth in areas where access patterns allow for it. When combined with proactive housekeeping and a structured approach to archiving, it forms a key layer in ensuring that S/4HANA remains cost-effective, performant, and aligned with business requirements.


As we move beyond housekeeping and data ageing, the next consideration is data tiering, a more technical but highly effective means of managing storage in an S/4HANA environment. The principle is to classify data according to its “temperature” and then allocate it to the appropriate storage class. Hot data is the most critical, consisting of records that the business requires constant and immediate access to. This data remains in-memory within HANA to guarantee the fastest possible retrieval. Warm data, by contrast, is accessed far less frequently. It is still useful enough to warrant automatic access, but the performance expectations are lower. By placing this in warm storage, such as SAP HANA Native Storage Extension (NSE) or SAP HANA Dynamic Tiering, organisations reduce the in-memory footprint without disrupting day-to-day processes. Finally, cold data is that which has little or no foreseeable operational use, but which may need to be retrieved occasionally for compliance, dispute resolution, or investigative purposes. Cold storage is typically external and low-cost, often object storage in the cloud, with retrieval carried out through special processes rather than being immediate.
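
With NSE, the mechanics of moving a table between temperatures reduce to changing its load unit. A minimal sketch under assumed names follows (the schema and table are placeholders; the statements could be executed through hdbcli as in the earlier example):

```python
# Sketch: NSE load-unit changes. PAGE LOADABLE moves a table to warm
# storage (paged in on demand); COLUMN LOADABLE returns it to hot,
# fully in-memory storage. Names are illustrative placeholders.

MOVE_TO_WARM = 'ALTER TABLE SAPHANADB.ZSALES_HISTORY PAGE LOADABLE CASCADE'
MOVE_TO_HOT  = 'ALTER TABLE SAPHANADB.ZSALES_HISTORY COLUMN LOADABLE CASCADE'
```

Access patterns should be tested before and after such a change, since page-loadable data is served from disk-backed buffers rather than guaranteed RAM.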



The cost efficiency of data tiering cannot be overstated. Keeping only the most valuable, high-use data in-memory ensures that infrastructure and licensing costs are aligned with business needs rather than driven by uncontrolled data growth. At the same time, placing less critical data into warm or cold storage frees up resources and reduces backup and replication demands across the landscape. In practical terms, organisations benefit from a leaner and more responsive core system while retaining the assurance that less-used data remains available if required. Data tiering therefore represents not just a technical optimisation, but a deliberate strategy to balance cost, performance and compliance in the long-term management of an S/4HANA system.


While the technical practices of housekeeping, data ageing and data tiering form the foundation of a well-managed S/4HANA system, they cannot be viewed in isolation from governance and compliance. In the Australia and New Zealand markets, organisations face increasingly stringent obligations to manage personal and business data responsibly. Data management is not only about performance and cost efficiency; it is also a legal and regulatory imperative.


In Australia, the Privacy Act 1988 (Cth) and the associated Australian Privacy Principles (APPs) require organisations to ensure that personal information is collected, stored and disposed of securely. A key aspect of this legislation is data minimisation: information should not be kept longer than is reasonably necessary for business or legal purposes. In recent years, high-profile data breaches have highlighted the risks of retaining excessive volumes of sensitive information, with regulators placing greater emphasis on organisations’ ability to justify their retention policies. For industries regulated by the Australian Prudential Regulation Authority (APRA), such as banking, superannuation and insurance, the bar is set even higher. Institutions must demonstrate robust processes for the retention and destruction of records, ensuring that information is available for statutory periods but permanently removed once those obligations have been met.


New Zealand imposes similar requirements under the Privacy Act 2020, which enshrines the principles of data minimisation and secure handling of personal information. Agencies and organisations are required to hold data only for as long as it serves a lawful purpose, and to ensure its safe disposal thereafter. For public sector bodies, the Public Records Act 2005 adds an additional layer of obligation, mandating that official records be properly retained and disposed of in line with standards set by Archives New Zealand. This creates a dual requirement: operational efficiency on the one hand, and a defensible records management process on the other.


For organisations running S/4HANA, these obligations translate directly into how archiving and data management policies are designed and executed. Retention rules must be aligned with legislative requirements so that data is not deleted prematurely, exposing the business to audit or compliance risk, nor retained unnecessarily, which increases costs and creates potential liability. Robust audit trails and defensible deletion processes are essential, ensuring that every data action can be justified in the event of regulatory scrutiny. Importantly, governance frameworks need to bridge the gap between IT and business stakeholders. While IT teams can implement the technical solutions, it is the business that must define the retention schedules, determine which data has ongoing value, and take responsibility for compliance.
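
One way to make those retention rules executable is to hold them as an explicit, business-owned schedule and gate every deletion against it. The sketch below is purely illustrative: the document types and retention periods are placeholders, and actual periods must come from the relevant legislation and legal counsel.

```python
# Sketch of a defensible-deletion check against a business-owned
# retention schedule. Periods below are illustrative placeholders,
# not legal guidance.
from datetime import date

RETENTION_YEARS = {       # assumption: agreed with business data owners
    "FI_DOCUMENT": 7,     # e.g. financial records
    "HR_RECORD":   7,
    "SALES_ORDER": 5,
}

def deletion_permitted(doc_type: str, closed_on: date, today: date) -> bool:
    """True only once the full retention period has elapsed."""
    years_elapsed = today.year - closed_on.year - (
        (today.month, today.day) < (closed_on.month, closed_on.day))
    return years_elapsed >= RETENTION_YEARS[doc_type]

# A financial document closed in March 2017 clears its 7-year period in 2024.
print(deletion_permitted("FI_DOCUMENT", date(2017, 3, 31), date(2025, 9, 9)))
```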


The implications are clear: effective data management in S/4HANA is as much about governance as it is about technology. Organisations in Australia and New Zealand cannot afford to treat data housekeeping, ageing and tiering as purely operational measures. They are integral to meeting regulatory requirements, maintaining customer trust, and safeguarding the organisation against both financial penalties and reputational damage.

Conclusion: Balancing Efficiency, Cost, and Compliance


The shift to S/4HANA changes the economics of data management in a way that no organisation in Australia or New Zealand can afford to ignore. In-memory computing delivers exceptional speed and agility, but it comes at a premium that magnifies the cost of poor discipline around data volumes. Effective data management is therefore not an optional enhancement to system operations; it is fundamental to ensuring that S/4HANA performs as intended, remains cost-effective over time, and meets the stringent compliance requirements imposed by regulators across the region.


Housekeeping, Data Aging and Data Tiering each play distinct but complementary roles in this process. Housekeeping addresses the low-hanging fruit by eliminating redundant technical data and ensuring that inappropriate content such as binary objects is kept out of the transactional database. Data Aging provides an elegant means of reducing memory pressure by moving older, less frequently accessed records into warm storage while maintaining their availability for legitimate business needs. Data Tiering extends this approach further by ensuring that data is always aligned with the most appropriate and cost-efficient storage layer, whether hot, warm or cold. When these practices are embedded within a governance framework that clearly defines retention policies, responsibilities and audit requirements, the outcome is a system that is not only technically efficient but also defensible under law.


Organisations in Australia and New Zealand should view data management in S/4HANA as a pillar of digital resilience. It is not simply an IT housekeeping exercise but a business-critical capability that underpins compliance, cost control and operational efficiency. By treating it as such from the outset, companies position themselves to derive sustained value from their investment in S/4HANA, ensuring that the platform remains both a driver of innovation and a foundation of trust in an increasingly regulated digital economy.
