Navigating the Shift: Insights from Upgrading to HANA Cloud Platform

Keeping up with the newest technology advancements is a goal shared by most enterprises, but staying a step ahead can prove challenging for many. We recently worked with a manufacturing industry client as it set out to migrate its SAP HANA Enterprise Cloud (HEC) environment onto the SAP HANA Cloud Platform (HCP). The company was new to the SAP Business Technology Platform (SAP BTP) and was unfamiliar with the processes and procedures it would need to implement PaaS and SaaS solutions. The move from HEC to HCP would give the client more autonomy over its environment than the more traditional IaaS-hosted solution, and transitioning to HANA Cloud would position it on SAP BTP, where SAP will focus all future product innovation.

Architecture overview

To better understand the migration, first consider the client's legacy architecture and how it differs from the target environment.

Our primary objective with the upgrade to this new cloud-based infrastructure was to provide improved scalability and flexibility going forward. There are several key differences to note between the HEC and HCP architectures:

  • Runtime: HANA 2.0 XS Classic (XSC) hosted in HEC vs. HANA 2.0 XS Advanced (XSA) hosted in SAP BTP
  • Data model: schema-based vs. HDI container-based
  • Development tools: HANA Studio with built-in version control vs. SAP Business Application Studio with external version control (Git)

New tools, new solutions

Because HANA Cloud is a newer technology, migrating to it from a traditional HANA HEC instance can be challenging, especially when prior development was in XS Classic (XSC) rather than XS Advanced (XSA). SAP provides multiple tools to migrate content from XSA to HANA Cloud; however, no solutions are readily available for XSC to HANA Cloud. Overall, this migration was largely a re-platforming, and clients with a similar architecture should anticipate significant code remediation to make the transition successfully.

To solve this problem, Protiviti audited the XSC content to determine which repository-based files could be converted to HANA Cloud filetypes via the SAP HANA Cloud migration tool, and which would need to be manually recreated. Many object types are deprecated in HANA Cloud, including CDS-based (.hdbdd) files for view and table creation, XS OData services and XS application content, to name a few. We discovered that the loss of the repository, and of the associated _SYS_REPO user and schema content, meant the entire security model would need to be rebuilt; many of the challenges SAP customers face moving from XSC to XSA also had to be addressed during the migration.
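The triage behind such an audit can be sketched as a simple pass over a repository inventory. This is an illustrative sketch only: the suffix lists below are hypothetical examples, not an exhaustive SAP-published mapping, and object names are invented.

```python
# Triage XSC repository objects: which file types a migration tool might
# convert vs. which must be manually recreated in HANA Cloud.
# (Illustrative suffix lists and package names; not SAP's official mapping.)

DEPRECATED_SUFFIXES = {"hdbdd", "xsodata", "xsjs", "xsjslib", "xsapp", "xsaccess"}

def triage(inventory):
    """inventory: iterable of (package, object_name, suffix) tuples."""
    tool_convertible, manual = [], []
    for package, name, suffix in inventory:
        target = manual if suffix in DEPRECATED_SUFFIXES else tool_convertible
        target.append(f"{package}::{name}.{suffix}")
    return tool_convertible, manual

tool_convertible, manual = triage([
    ("acme.core", "Sales", "hdbdd"),              # CDS file -> manual recreation
    ("acme.core", "CV_SALES", "hdbcalculationview"),
    ("acme.api", "sales", "xsodata"),             # XS OData -> manual recreation
])
```

Feeding the full repository listing through a pass like this gives an early estimate of how much work falls outside the migration tool's reach.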

This initial audit told us the client would need a comprehensive data strategy and that we would likely need to improvise throughout the project where the appropriate tools did not exist. For this client, we considered how these factors would be incorporated into the strategy before beginning the migration:

  • Architecture: HEC and HCP have different architectures and capabilities, so it is crucial to understand these differences to deliver an effective migration.
  • Development environment: HEC and HCP utilize different development tools. In HEC, the HANA development is done using HANA Studio, which has built-in version control. In HCP, the HANA development and version control are done separately using Business Application Studio and Git. This separation presented our client with challenges fitting into their existing development methodology.
  • Application readiness: Some applications may need adjustments to work efficiently in the HCP environment.
  • Data migration: The appropriate tools and methodologies are needed to ensure data integrity and minimal downtime.
  • Customizations and extensions: Customizations and extensions must be compatible with HCP.
  • Security and compliance: Security and compliance requirements for each application and data being migrated must be reviewed.
  • Performance optimization: Optimizing applications’ performance for HCP may involve tuning configurations, optimizing queries and utilizing HCP-specific features.
  • Integration: Seamless integration with other systems, both hosted and in the cloud, may require reconfiguring integrations or developing new ones.
  • Connected tools: Many tools outside the HANA platform that assist with tasks like ETL data management, reporting and data replication will require extensive manual remediation to accommodate the new HCP view, table and schema changes resulting from the XSC-to-HCP development architecture migration. Connected tools may also require middleware or HANA Client updates to properly access the HCP environment.
  • Training and support: The IT team must be familiar with HCP and its management tools to ensure smooth operation and effortless troubleshooting post-migration.
  • Testing and validation: Post-migration, thorough testing and validation of applications and data, including functional and performance testing must be conducted.
  • Backup and disaster recovery: HCP recovery mechanisms must be in place to ensure data availability and business continuity.
  • Cost management: Consider the differences in pricing models between HEC and HCP.
  • Vendor support and SLAs: Including support for issues, updates and maintenance.
  • Documentation and knowledge transfer: Document the migration process and lessons learned; transfer knowledge to relevant teams to support and maintain the environment.

Side effects along the way

One side effect of moving to HCP is naming convention changes that occur regardless of planning. Some object types, like calculation views, change naming from namespace/VIEW_NAME to namespace::VIEW_NAME, which means all downstream systems referencing the object must be repointed. For this client, we were able to identify objects needing remediation by querying the database for SAP Data Services (BODS) ETL jobs and by checking data source connections in Information Design Tool for universes. The downstream BusinessObjects reporting platform picks up the changes via universes, which reduced the manual overhead that would have been present if Web Intelligence reports connected directly to cubes.
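The repointing itself is mechanical once the pattern is known. A minimal sketch, assuming XSC calculation views were referenced through the _SYS_REPO-generated "_SYS_BIC" schema and that the HDI container's default schema resolves the new names (the SQL text and view names are invented for illustration):

```python
import re

# XSC calculation views were addressed as "_SYS_BIC"."namespace/VIEW_NAME";
# in HANA Cloud the same views are addressed as "namespace::VIEW_NAME".
# This helper rewrites references found in ETL job SQL or universe queries.
# (Illustrative pattern; real remediation also handled schema changes.)

XSC_VIEW_REF = re.compile(r'"_SYS_BIC"\."([\w.]+)/(\w+)"')

def repoint(sql_text):
    # Replace repository-style references with HDI-style "namespace::VIEW".
    return XSC_VIEW_REF.sub(r'"\1::\2"', sql_text)

old = 'SELECT * FROM "_SYS_BIC"."acme.sales/CV_REVENUE"'
print(repoint(old))  # SELECT * FROM "acme.sales::CV_REVENUE"
```

In practice, a scan like this over exported job definitions produces the remediation worklist rather than rewriting production artifacts directly.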

These activities, along with configuring SLT, developing a synonym strategy, QA and more, took just under nine months to complete from start to go-live. The bulk of the project time was spent remediating BODS jobs, as the client had more than 400 existing jobs that read from and/or wrote to HEC. We also had to create jobs to load data from HEC to HCP, because the data existed only in the HEC environment. This was part of a larger data load strategy to help ensure no data loss when migrating to HCP: new HANA-to-HANA load jobs, use of SLT to replicate ECC data, and updates to BODS jobs to ensure ODP queues and ABAP extractors were configured correctly so no deltas were missed. Overall, creating the data load strategy, remediating the relevant content and loading the data took around four months.

Overcoming barriers

As noted, the biggest barrier for any client migrating to the HANA Cloud Platform is the XSC to XSA migration effort, and this client was no exception. We utilized the HANA XSA Migration tool to the fullest extent possible, but it was by no means a turnkey solution. The tool was unable to convert many of the tables or views, so we developed a custom Python program to convert them from CDS objects to native HANA tables and views. The program succeeded with 80% of the table conversions and 50% of the view conversions; the remainder had to be done manually.
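The flavor of that conversion can be illustrated with a greatly simplified sketch. This is not the actual program: it handles only scalar types and a "key" flag on a toy CDS entity, which hints at why a meaningful share of tables, and even more views, still needed manual conversion.

```python
import re

# Convert a simple CDS (.hdbdd) entity definition into a native HANA
# CREATE COLUMN TABLE statement. Highly simplified: no associations,
# annotations, defaults or nested contexts. Entity and column names
# below are invented for illustration.

CDS_TO_SQL_TYPES = {"Integer": "INTEGER", "String": "NVARCHAR",
                    "Decimal": "DECIMAL", "LocalDate": "DATE"}

def convert_entity(cds_text):
    name = re.search(r'entity\s+(\w+)', cds_text).group(1)
    cols, keys = [], []
    for key_flag, col, ctype, length in re.findall(
            r'(key\s+)?(\w+)\s*:\s*(\w+)(?:\((\d+)\))?\s*;', cds_text):
        sql_type = CDS_TO_SQL_TYPES[ctype]
        if length:
            sql_type += f"({length})"
        cols.append(f'"{col.upper()}" {sql_type}')
        if key_flag:
            keys.append(f'"{col.upper()}"')
    pk = f', PRIMARY KEY ({", ".join(keys)})' if keys else ""
    return f'CREATE COLUMN TABLE "{name.upper()}" ({", ".join(cols)}{pk});'

cds = """
entity Material {
    key id : Integer;
    name   : String(60);
};
"""
print(convert_entity(cds))
```

Anything the pattern could not parse, such as views with joins or calculated columns, fell into the manual bucket, mirroring the 80%/50% success rates above.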

The effort to migrate and re-platform the client’s HANA database landscape from HEC to the HANA cloud was critical to continue providing valuable insight into the company’s essential data from its SAP ECC (legacy ERP), planning, product, part and pricing information management and supply chain logistics systems.

In total, we assisted the client in standing up four HANA Cloud environments (Dev, QA, Staging, Prod) and converted, migrated, and/or remediated these HANA and related objects:

  • HANA objects:
    • Tables: 375
    • Views: 442
    • Calculation views: 610
    • Procedures: 133
    • Functions: 38
  • Data Services ETL:
    • Net new jobs: 191
    • Remediation of existing jobs: 171
  • SAP BusinessObjects:
    • Remediation of 35 critical universes supporting hundreds of Web Intelligence reports

The migration to HCP was not without its challenges. Many synonyms were used in the HEC environment, which created additional work mapping each synonym to its source object, and each source object to its new schema and name in HCP. This caused confusion not only in HCP but also while remediating BODS jobs and universes in IDT. The migration motivated the client to adopt best practices in this area moving forward in HCP.
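That two-hop mapping, synonym to source object, then source object to HCP target, is worth making explicit so every consumer can be repointed in one pass. A minimal sketch with hypothetical schema and object names (the real mapping was driven from catalog exports, not hand-typed dictionaries):

```python
# Resolve each HEC synonym through to its new HCP object so BODS jobs and
# IDT universes can be repointed consistently. All names are invented.

# synonym -> (source schema, source object) as used in HEC
hec_synonyms = {
    "SALES_V": ("ACME_RPT", "V_SALES"),
    "MATERIAL": ("ACME_STG", "T_MATERIAL"),
}

# (source schema, source object) -> (HDI container schema, new object name)
hcp_objects = {
    ("ACME_RPT", "V_SALES"): ("ACME_HDI_DB", "acme.rpt::V_SALES"),
    ("ACME_STG", "T_MATERIAL"): ("ACME_HDI_DB", "acme.stg::T_MATERIAL"),
}

def resolve(synonym):
    """Map a legacy synonym to its fully qualified HCP target."""
    schema, obj = hcp_objects[hec_synonyms[synonym]]
    return f'"{schema}"."{obj}"'

print(resolve("SALES_V"))  # "ACME_HDI_DB"."acme.rpt::V_SALES"
```

Keeping this crosswalk as a single source of truth avoids the ambiguity that arises when each remediation stream resolves synonyms independently.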

Overall, the HCP migration was a success. No objects or data were lost, and the only errors or failures in production were performance-related. Custom web applications were migrated from HANA to other platforms like MuleSoft and BODS, allowing easier future maintenance by existing developers. The key lesson learned is that migrating a HANA DB with XSC content, along with the impacted content outside the HANA DB, requires significant time to remediate; these efforts are further slowed when development standards and best practices were not previously followed. The successful migration allowed another initiative to kick off immediately after this one, in which business intelligence work will be sourced from the new HANA Cloud platform. In addition, the client is better prepared for future SAP innovations with its adoption of the XSA model, BTP and the HANA Cloud Platform.

To learn more about our SAP capabilities, contact us or visit Protiviti’s SAP consulting services.

Tony Angell

Associate Director
Business Platform Transformation

Haley Howard-Jones

Senior Manager
Business Platform Transformation
