Extracting SAP S/4 HANA ERP data for Analytics and ML
If you are struggling to extract your SAP S/4 HANA ERP data, you are not alone. SAP S/4 HANA is immensely useful to customers, but extracting S/4 HANA ERP data to non-SAP platforms can prove challenging. BryteFlow SAP Data Lake Builder extracts SAP data with business logic intact in a completely automated manner (more on that later).
Our customers often say that they are looking to integrate SAP data with non-SAP data in real-time for analytics. They need IoT and sensor data from devices in real-time to populate their data and analytics platforms in the Cloud. Fortunately, SAP provides several data extraction mechanisms for its applications, and here we will look at the methods of extracting SAP ERP data from SAP S/4 HANA.
- SAP S/4 HANA Overview: What is SAP S/4 HANA?
- How does SAP HANA differ from SAP S/4 HANA?
- Data Extraction from SAP S/4 HANA
- Extracting SAP S/4 HANA Data at Database Level
- Extracting SAP S/4 HANA Data at Application Level
- Extracting SAP S/4 HANA Data with Operational Data Provisioning
- Extracting SAP S/4 HANA Data with SAP Landscape Transformation Replication Server (SLT)
- Extracting SAP S/4 HANA ERP Data with BryteFlow SAP Data Lake Builder
SAP S/4 HANA Overview: What is SAP S/4 HANA?
SAP S/4 HANA is an ERP application that runs on the SAP HANA in-memory database. Its name is an abbreviation of SAP Business Suite 4 SAP HANA, denoting that it is the fourth version of SAP Business Suite. SAP S/4 HANA is a next-gen ERP application and ideal as a transactional system for large enterprises.
SAP S/4 HANA has Powerful Capabilities and Built-in Intelligent Technologies for SAP ERP
The SAP S/4HANA application has much to recommend it. It has been designed with a simplified architecture and works expressly on SAP HANA, taking over from SAP ECC/ERP. For users, this transition is very much like the previous transitions between ERP versions, such as SAP R/2 to SAP R/3. SAP S/4HANA has built-in intelligent technologies like Machine Learning, AI and Advanced Analytics, and enables efficient fact-based process handling and decision support within transactions. S/4HANA is meant to be far easier to use, handles huge volumes of data, and solves more complicated problems than its forerunners. It is available for on-premises, cloud, and hybrid deployment.
How does SAP HANA differ from SAP S/4 HANA?
SAP HANA is an in-memory database. This means it stores data in main memory (RAM) rather than on disk, which makes data much faster to access and allows simpler internal optimization algorithms. SAP HANA is central to SAP's product strategy. SAP S/4HANA, on the other hand, is an ERP application that runs on SAP HANA and is slated to be the foundation of SAP technologies in the near future.
Data Extraction from SAP S/4 HANA
SAP S/4 HANA ERP data extraction can be done in quite a few ways. These are some of the data replication patterns users follow. SAP BW, too, offers different methods of SAP extraction.
Extracting SAP S/4 HANA Data at Database Level
SAP S/4 HANA extraction at Database Level extracts raw data as it is being written to the SAP database. Data replication software like BryteFlow replicates the SAP S/4 HANA data using CDC (Change Data Capture), transforms the data to consumable formats with the required mappings, and compresses it (e.g. Parquet-snappy) on the destination so it can be integrated with other data and queried easily. The software can also extract data from SAP cluster and pool tables.
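To make the CDC replication idea concrete, here is a minimal sketch of applying captured change records to a replicated target table. The change-record format and field names (`op`, `key`, `MATNR`, `QTY`) are purely illustrative assumptions, not BryteFlow's or SAP's actual format:

```python
# Minimal sketch of applying Change Data Capture (CDC) records to a
# replicated target table. The record format here is illustrative only.

def apply_changes(target: dict, changes: list) -> dict:
    """Apply ordered insert/update/delete change records to a
    key -> row mapping representing the replicated target table."""
    for change in changes:
        op, key, row = change["op"], change["key"], change.get("row")
        if op in ("insert", "update"):
            target[key] = row          # upsert the latest row image
        elif op == "delete":
            target.pop(key, None)      # remove the row if present
    return target

# Example: replay three changes captured from the source database.
target_table = {}
change_log = [
    {"op": "insert", "key": 1, "row": {"MATNR": "M-01", "QTY": 10}},
    {"op": "update", "key": 1, "row": {"MATNR": "M-01", "QTY": 25}},
    {"op": "insert", "key": 2, "row": {"MATNR": "M-02", "QTY": 5}},
]
apply_changes(target_table, change_log)
print(target_table)  # {1: {'MATNR': 'M-01', 'QTY': 25}, 2: {'MATNR': 'M-02', 'QTY': 5}}
```

In a real pipeline the changes would come from the database log or triggers, and the merged result would be written out in a compressed columnar format such as Parquet.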
SAP S/4 HANA extraction at Database level: Points to consider
- Extraction of SAP data at database level means there is minimal impact on SAP applications, since the data is pulled out by third-party tools.
- Change Data Capture is available out of the box even if S/4 HANA tables do not capture date and time at application level.
- There may be database licensing restrictions that prevent data from being extracted directly from the S/4 HANA database when certain techniques are used.
- SAP application logic (found in the ABAP layer) is not retained on extraction, and data may need to be mapped again in a non-SAP environment. Extra maintenance effort may be required if there are changes to the SAP S/4 HANA data model.
Extracting SAP S/4 HANA Data at Application Level
The ABAP layer generally stores business logic in SAP ERP applications. Users can get API access to business logic through the ABAP Stack (SAP NetWeaver Application Server), which forms the ABAP-based technical foundation for many SAP products. Integration frameworks along with application-level extractors such as SAP Data Services can be used to extract data from SAP applications. They use Remote Function Call (RFC) libraries to natively connect and extract data from remote function modules, views, tables, and queries. As an option, SAP Data Services can also install ABAP code in the target SAP application and push data instead of pulling it.
SAP S/4 HANA extraction at Application level: Points to consider
- Extraction of data from SAP S/4 HANA is possible with business logic intact. For example, you could pull out a retailer's customer data with all related associations and data intact through the mapping of function modules. This cuts down on tedious business logic mapping in the non-SAP environment.
- Change Data Capture is not a given since it is not supported by all SAP function modules and frameworks.
- A third-party application is not required, which brings down SAP integration costs.
- SAP application servers may face increased load since data is pulled using function modules and other frameworks, which may affect overall performance.
Extracting SAP S/4 HANA Data with Operational Data Provisioning
The Operational Data Provisioning (ODP) framework allows data to be replicated between SAP applications in SAP environments and non-SAP target databases. Operational Data Provisioning uses a provider and subscriber model and enables a full data extract as well as incremental data extracts using Change Data Capture, with the use of operational delta queues.
Business logic is extracted using SAP DataSources (transaction code RSO2), SAP HANA Information Views, SAP Core Data Services (CDS) Views, or SAP Landscape Transformation Replication Server (SAP SLT). Operational Data Provisioning functions as a data source for SAP OData Services, allowing for REST-based integrations with external applications.
REST is an abbreviation for Representational State Transfer. An API, or Application Programming Interface, defines how applications or devices can communicate and connect to each other. REST APIs use HTTP requests to access and use data, with the GET, POST, PUT, and DELETE methods corresponding to reading, creating, updating, and deleting data. Applications such as SAP Data Services can integrate with Operational Data Provisioning, using native Remote Function Call libraries to extract data.
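As a concrete illustration of what a REST-based OData extraction request looks like, the snippet below builds a query URL with common OData system query options. The host and service path are hypothetical placeholders, not a real endpoint:

```python
# Sketch of building an OData query URL for an ODP-exposed service.
# The host and service path below are placeholders, not a real endpoint.
from urllib.parse import urlencode

base = "https://sap-host:443/sap/opu/odata/sap/ZDEMO_ODP_SRV/Orders"  # hypothetical service
params = {
    "$filter": "Bukrs eq '1000'",  # server-side selection on company code
    "$top": "1000",                # micro-batch size for a full load
    "$skip": "0",                  # offset, advanced batch by batch
    "$format": "json",             # JSON response instead of Atom XML
}
url = f"{base}?{urlencode(params)}"
print(url)
```

A client would issue an HTTP GET against this URL to read data, then advance `$skip` by the batch size until the service returns no more rows.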
SAP S/4 HANA extraction using Operational Data Provisioning (ODP): Points to consider
- Data is extracted from SAP S/4 HANA with business logic intact, including tables, customizations and package configurations – which means very little data transformation is required.
- ODP-based extraction from S/4 HANA supports Change Data Capture via the operational delta queue mechanism. It can also handle full data loads in micro-batches using SAP OData query parameters.
- Some coding may be involved in the creation of OData-based HTTP integrations with SAP applications, increasing development effort and costs. However, with BryteFlow SAP Data Lake Builder you can avoid all coding costs since it is a completely automated SAP extraction tool.
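The provider/subscriber pattern with operational delta queues described above can be sketched as a toy model: each subscriber does one full extract and afterwards fetches only the changes queued since its last request. The class and method names are illustrative, not SAP's actual ODP API:

```python
# Toy model of ODP's provider/subscriber pattern with operational
# delta queues. Names and structure are illustrative, not SAP's API.

class OdpProvider:
    def __init__(self, rows):
        self.rows = dict(rows)        # current full data set
        self.delta_queues = {}        # subscriber -> pending changes

    def subscribe(self, subscriber):
        self.delta_queues[subscriber] = []

    def full_extract(self, subscriber):
        return dict(self.rows)        # initial full load snapshot

    def record_change(self, key, row):
        self.rows[key] = row
        for queue in self.delta_queues.values():
            queue.append((key, row))  # fan out to every subscriber's queue

    def delta_extract(self, subscriber):
        # Hand over queued changes and reset the subscriber's queue.
        changes, self.delta_queues[subscriber] = self.delta_queues[subscriber], []
        return changes

provider = OdpProvider({1: "order-1"})
provider.subscribe("lake")
snapshot = provider.full_extract("lake")   # full extract first
provider.record_change(2, "order-2")       # a change arrives afterwards
delta = provider.delta_extract("lake")     # only the new change is returned
print(snapshot, delta)
```

This is why ODP supports both an initial full load and lightweight incremental loads: once subscribed, a consumer never has to re-read unchanged data.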
Extracting SAP S/4 HANA Data with SAP Landscape Transformation Replication Server (SLT)
SAP Landscape Transformation Replication Server, or SLT for short, is an SAP tool for real-time replication of data to the SAP HANA database. It can schedule data loads from SAP or non-SAP source systems to SAP HANA. Data extraction to the target is enabled by creating database triggers in the SAP source system. SAP SLT works with SAP HANA, SAP BW, SAP Data Hub, SAP Data Services and some non-SAP databases. For data replication to non-SAP targets, a good amount of customization using the SAP SLT Replication Server SDK (which needs SAP ONE Support access) will be required.
SAP S/4 HANA extraction using SAP Landscape Transformation Replication Server (SLT): Points to consider
- Full data extraction and incremental loads with Change Data Capture from S/4 HANA are possible. Extraction is trigger-based and enables CDC even on source tables that may not have updated timestamps.
- Allows for real-time data replication or replication scheduled at specific times.
- During real-time replication, data can be migrated in SAP HANA format.
- SLT can handle Cluster and Pool tables.
- A lot of custom development work in ABAP is needed for extraction to targets not supported by SAP.
- May prove expensive with extra licensing costs needed for SAP Data Hub, SAP Data Services etc.
- Enterprise licensing for SAP SLT may be required to replicate from S/4 HANA to targets not supported by SAP, and will add to costs considerably.
Extracting SAP S/4 HANA ERP Data with BryteFlow SAP Data Lake Builder
BryteFlow SAP Data Lake Builder is an extremely efficient SAP ETL tool. It offers one of the easiest and fastest ways to extract data from SAP S/4 HANA at the application level. It extracts SAP ERP data from SAP S/4 HANA with business logic intact to AWS through a completely automated setup. There is NO coding for any process, and you can get ready-to-use data in near real-time for Analytics or ML.
Highlights of BryteFlow SAP Data Lake Builder (SDLB) SAP Extraction
- The BryteFlow SAP Data Lake Builder connects and extracts data from SAP systems like SAP S/4 HANA, SAP ECC, SAP BW, SAP HANA etc. using the Operational Data Provisioning (ODP) framework and OData Services, and replicates data with business logic intact to your Data Lake or Data Warehouse, without any coding involved.
- The BryteFlow SAP Data Lake Builder gets data exposed via BW ODP Extractors or CDS Views as OData Services to build the Data Lake.
- Data in your SAP Data Lake or SAP Data Warehouse is ready to use. Get best practices and high performance for SAP integration on Snowflake, Databricks, Redshift, S3, Azure Synapse, ADLS Gen2, Azure SQL DB, Google BigQuery, Kafka, PostgreSQL and SQL Server.
- Since you connect to data at the SAP Application level, you don’t need to rebuild SAP application logic on the data lake again, saving you a lot of valuable time and effort.
- Customers often cannot connect to the underlying database due to SAP licensing restrictions; extracting SAP data at application level with the BryteFlow SAP Data Lake Builder enables you to avoid additional SAP licensing costs.
- You can use the data lake solution to augment or even replace SAP BW systems, thereby saving on costs.
- In case you need it, BryteFlow also allows you to connect to SAP applications at the database level with BryteFlow Ingest. Try our SAP extraction tool free: get a Free Trial of BryteFlow SAP Data Lake Builder.