The benefits of migrating to SAP HANA have been well documented by countless blogs, SAP content, and a glut of industry think pieces. You hear about the success stories, but not about the technical challenges you will face. In this blog, I am going to talk about two major challenges I faced during the migration of a large client's SAP ERP to SAP Suite on HANA (SoH):
- Downtime Optimization
- Migration with performance tuning in mind
Even in the best of times, outages are costly and disruptive, and when they hit SAP workloads they can cost organizations millions of dollars, because SAP workloads are by nature business-critical. The same applies to any SAP system migration project. You will need to agree on a downtime window with the business to minimize the disruption caused by migrating your online production transaction system to a new platform. In my experience, after a quality system upgrade, it typically takes about 60 hours to migrate a client's production system to SAP Suite on HANA (SoH).
For this migration, Auritas understood that the first challenge would be to minimize that downtime. We hit the whiteboard and were able to plan and execute a migration that brought the turnaround time down from 60 hours to just under 20 hours total.
The second challenge we had to overcome was to make sure the migration to the SAP HANA database was done with performance in mind, to prevent any future system downtime. We had only one shot to migrate the ERP to SAP HANA, so the database design had to be bulletproof: there could be no need for a further downtime window to tune the SAP HANA database after the migration.
Regardless of which approach you choose to minimize the production downtime of your SAP ERP migration to SAP SoH or SAP S/4HANA, whether Near-Zero Downtime Maintenance or the downtime-optimized Database Migration Option (DMO), you need to ensure you have:
- A good archiving strategy to reduce the size of the application tables. Reducing the data volume is essential: more data means more downtime during the migration.
- A robust performance monitoring schedule in your upgrade project plan. It is important to monitor your application and database behavior under different loads; you will have to adjust many parameters to tune the application server processes and SAP HANA database performance.
- Agreement with the functional teams, reached in advance and with growth ratios in mind, on the number of partitions to create for the transactional tables during the migration to SAP HANA. Keep in mind that an SAP HANA column-store table (or each of its partitions) can hold at most 2 billion rows. You need to identify large tables in advance and plan a proper partitioning technique. This was very important in my case, since the business would not agree to a second downtime on the production system to maintain or tune the database after the migration was complete.
- After each successful dress rehearsal, run the SAP HANA mini-checks and EarlyWatch Alert (EWA) reports, and take advantage of pre- and post-go-live SAP reports to identify bugs, issues, and best practices.
- Tuned parallel processing during the export and import phases of DMO.
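To make the partitioning point above concrete, here is a minimal back-of-the-envelope sketch of the kind of sizing we agreed with the functional teams. The 2-billion-row limit is SAP HANA's documented ceiling per column-store table partition; the `headroom` fraction and the function itself are my own illustrative assumptions, not an SAP-mandated formula.

```python
import math

# Hard SAP HANA limit: 2 billion rows per column-store table (or partition)
HANA_ROW_LIMIT = 2_000_000_000

def partitions_needed(current_rows: int, growth_factor: float,
                      headroom: float = 0.5) -> int:
    """Estimate a HASH partition count for a large table before migration.

    growth_factor: expected multiple of today's row count over the planning
    horizon (agree on this with the functional teams).
    headroom: fraction of the row limit to leave free so the table never
    needs repartitioning post go-live (an assumption, tune to your case).
    """
    projected_rows = current_rows * growth_factor
    usable_capacity = HANA_ROW_LIMIT * (1 - headroom)
    return max(1, math.ceil(projected_rows / usable_capacity))

# Example: a 1.2-billion-row table expected to triple over the horizon
print(partitions_needed(1_200_000_000, 3.0))  # -> 4
```

The point of leaving headroom is exactly the constraint described above: if a partition fills to the limit in production, fixing it means another downtime window, which the business had ruled out.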
This is an area that needs more experimentation, but I can share my own experience with this parameter. The system copy and migration guides give little advice on choosing an appropriate degree of parallelism for exports and imports. There was no way we could have brought the downtime under 20 hours with the default parallel processing during the export and import phases of DMO.
Therefore, the server running DMO was sized with migration performance in mind. While monitoring your network traffic and CPU load, raise the number of R3load processes step by step, waiting 5 to 10 minutes each time for the new processes to start. When either the CPU load or the network traffic reaches 80% to 90%, you have found the optimal number of R3load processes. You can raise the R3load process count up to 1,000. In my case, the optimal degree of parallelism was 400.
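The ramp-up procedure above can be sketched as a simple control loop. This is an illustrative outline only: `get_cpu_pct`, `get_net_pct`, and `set_process_count` are hypothetical placeholders for your own monitoring and DMO tooling, and the start, step, and target values are assumptions you would tune to your landscape.

```python
import time

def ramp_r3load(get_cpu_pct, get_net_pct, set_process_count,
                start=50, step=50, ceiling=1000,
                target=85, settle_seconds=600):
    """Step up R3load parallel processes until CPU or network load
    reaches the 80-90% saturation band, then stop.

    get_cpu_pct / get_net_pct: callables returning current utilization
    (0-100) from your monitoring tooling.
    set_process_count: applies the new R3load process count in DMO.
    All three are placeholders for whatever tooling you actually use.
    """
    count = start
    set_process_count(count)
    while count < ceiling:
        time.sleep(settle_seconds)  # wait 5-10 min for processes to start
        if max(get_cpu_pct(), get_net_pct()) >= target:
            break                   # saturation band reached: stop here
        count = min(count + step, ceiling)
        set_process_count(count)
    return count
```

The design choice is simply to treat whichever resource saturates first, CPU or network, as the stopping signal, mirroring the 80-90% rule of thumb described above.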
The strategies here will set you up to overcome the two challenges posed at the beginning of this post: downtime optimization and migration with performance tuning in mind.
If you’d like to learn more about the services, knowledge, and experience Auritas offers, click here.