Contents
- What is SAP SLT or SAP Landscape Transformation?
- SAP SLT Replication
- SAP SLT Advantages for Real-time SAP Replication
- SAP SLT ODP Replication Scenario
- SAP SLT Replication step by step using ODP and BryteFlow
- SAP Extraction and Replication Tool: BryteFlow SAP Data Lake Builder
What is SAP SLT or SAP Landscape Transformation?
SAP SLT stands for SAP Landscape Transformation Replication Server. As an ETL tool, SAP SLT lets you extract and replicate data in real time, or on a schedule, from SAP and non-SAP source systems into an SAP HANA database.
SAP SLT can be described as a real-time replication tool that moves data from SAP ABAP source systems to SAP HANA, to all SAP-supported databases, and to the SAP Business Suite and other SAP applications. It also supports non-SAP sources.
Please Note: SAP OSS Note 3255746 has notified SAP customers that the use of SAP RFCs for extracting ABAP data to destinations external to SAP (on-premise and cloud) is not permitted for customers and third-party tools.
SAP SLT replication tool is trigger-based and real-time
SAP SLT replication uses trigger-based change data capture with minimal impact on the source: it captures all inserts, updates, and deletes for the tables being replicated from source to target. Trigger-based replication means that database triggers are created on the source tables; whenever a record changes, the trigger writes the change into a logging table at the source. The SAP SLT tool then automates the extraction, transformation, and loading process in real time or as scheduled.
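To make the mechanism concrete, here is a minimal, hypothetical sketch of trigger-based change capture using SQLite in Python. This is not SLT's internal code (SLT creates its triggers and logging tables inside the ABAP source database), and the table names are invented for illustration, but the idea is the same: triggers fire on every insert, update, and delete and record the change in a logging table.

```python
# Conceptual sketch of trigger-based change capture, illustrated with SQLite.
# NOT SLT internals: table and trigger names are invented for this example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT);
-- Hypothetical logging table, analogous in role to SLT's logging tables
CREATE TABLE orders_log (
    seq    INTEGER PRIMARY KEY AUTOINCREMENT,
    op     TEXT,       -- 'I' = insert, 'U' = update, 'D' = delete
    id     INTEGER,
    status TEXT
);
CREATE TRIGGER orders_ins AFTER INSERT ON orders BEGIN
    INSERT INTO orders_log (op, id, status) VALUES ('I', NEW.id, NEW.status);
END;
CREATE TRIGGER orders_upd AFTER UPDATE ON orders BEGIN
    INSERT INTO orders_log (op, id, status) VALUES ('U', NEW.id, NEW.status);
END;
CREATE TRIGGER orders_del AFTER DELETE ON orders BEGIN
    INSERT INTO orders_log (op, id, status) VALUES ('D', OLD.id, OLD.status);
END;
""")

conn.execute("INSERT INTO orders VALUES (1, 'NEW')")
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE id = 1")
conn.execute("DELETE FROM orders WHERE id = 1")

# Every change is now queued in the logging table, in order.
for row in conn.execute("SELECT seq, op, id, status FROM orders_log"):
    print(row)  # (1, 'I', 1, 'NEW') (2, 'U', 1, 'SHIPPED') (3, 'D', 1, 'SHIPPED')
```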
SAP SLT Replication
SAP SLT Replication for initial full loads
How SLT in SAP works: SAP SLT replicates both the initial full load of data and the incremental changes (deltas). SAP SLT reads and extracts the existing data, transforms it if needed, and loads it from source to target. Logging tables are created in the source system for every table to be replicated. Even during the initial load, SLT identifies changes on the source using database triggers, saves them to the logging tables, and writes them to the target once the initial load is done.
SAP SLT Replication for Deltas with Change Data Capture
After the initial load completes, SLT keeps monitoring the source logging tables and replicates the deltas (inserts, updates, and deletes) from source to target in real time. Because the changes are persisted in the logging tables, replication resumes from where it left off if the connection between source and SLT is lost and later restored.
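Continuing the hypothetical SQLite sketch above, this is roughly why the logging table makes delta replication resumable: the replicator only needs to remember the last sequence number it applied, and after a lost connection it can safely pick up from that point.

```python
# Minimal resumable delta apply loop, assuming the orders/orders_log schema
# from the previous sketch. Again illustrative only, not SLT's own code.
import sqlite3

def drain_deltas(source: sqlite3.Connection,
                 target: sqlite3.Connection,
                 last_seq: int) -> int:
    """Apply all logged changes with seq > last_seq to the target table and
    return the new high-water mark, so a restart resumes at the same point."""
    rows = source.execute(
        "SELECT seq, op, id, status FROM orders_log WHERE seq > ? ORDER BY seq",
        (last_seq,)).fetchall()
    for seq, op, id_, status in rows:
        if op in ("I", "U"):
            # An upsert keeps replays idempotent if the same delta is seen twice.
            target.execute(
                "INSERT INTO orders (id, status) VALUES (?, ?) "
                "ON CONFLICT(id) DO UPDATE SET status = excluded.status",
                (id_, status))
        else:  # 'D'
            target.execute("DELETE FROM orders WHERE id = ?", (id_,))
        last_seq = seq
    target.commit()
    return last_seq
```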
SAP SLT Advantages for Real-time SAP Replication
- SAP SLT uses a trigger-based CDC approach, capturing all inserts, deletes and updates with minimal impact on source systems.
- During real-time data replication, data can be brought across in SAP HANA format.
- SAP SLT can be set up for real-time replication and for scheduled data replication.
- SAP SLT can be configured to support multiple system connections (1:N and N:1 replication scenarios).
- SAP SLT provides conversion, data transformation, and filtering functionality, letting you reshape data before loading it into an SAP HANA database.
- SAP SLT manages and supports all native SAP table formats, including cluster and pool tables.
- SAP SLT is fully integrated with SAP HANA Studio, and replication can be monitored with SAP Solution Manager.
- SAP SLT supports both Unicode and non-Unicode conversion during data replication.
- SAP SLT has mobile capability through the SLT Replication Manager mobile application, which can be used to monitor and manage replication from any mobile device with an Internet connection.
SAP SLT ODP Replication Scenario
SLT and the ODP Framework
The ODP Framework (Operational Data Provisioning Framework) is a technical infrastructure that supports extraction and replication from SAP (ABAP) systems to an SAP BW/4HANA data warehouse, including delta capture. For delta capture, data from a source is written automatically to an Operational Delta Queue (ODQ) through an update process, or delivered to the delta queue via an extractor interface. Target applications (ODQ subscribers or ODQ consumers) then extract the data from the delta queue for further processing.
SAP Landscape Transformation Replication Server can function as a provider for the Operational Data Provisioning Framework (ODP). SAP SLT stores data from connected SAP systems in the ODP framework as an Operational Delta Queue (ODQ). The ODP framework supports extraction and replication scenarios for different target SAP applications (subscribers).
SAP Extraction using ODP Replication Scenario
The ODP Replication Scenario is one of several scenarios available for replicating data from SAP or non-SAP sources to SAP HANA systems, and sometimes to non-SAP systems. Using the ODP Framework, it creates ODP Data Sources in the background for the tables selected for replication.
Figure: SAP SLT with ODP Scenario
SAP SLT Replication step by step using ODP and BryteFlow
These are the steps to create the SAP SLT ODP Replication Scenario, create SAP OData Services, and move the data from the SAP SLT Server to your data lake using BryteFlow.
- Create SAP SLT ODP Replication Scenario
- Log in to the SAP SLT Server and go to transaction LTRC.
- Provide the configuration’s technical name and description.
- In Source System Setup, select the ‘RFC Destination’ radio button and enter ‘NONE’ in RFC Destination if SLT is on the same server.
- In Destination System Setup, select the ‘RFC Destination’ radio button, enter ‘NONE’ in RFC Destination, and choose the ODP Replication Scenario.
- In Transfer Settings, select ‘Resource Optimized’ and specify the number of Data Transfer Jobs, Initial Load Jobs, and Calculation Jobs. Under Replication Options, select ‘Real Time’ for real-time data replication.
- Review and select ‘Create’.
- The SAP SLT ODP Scenario is now created; you can click on the configuration name and review the settings.
- Create OData Services:
- Go to transaction SEGW, create a project, and select the ‘ODP Replication’ option.
- Select the ODP context as the SLT ODP configuration name created in the process above.
- Create the OData Service for the table to be replicated.
- Go to transaction /n/iwfnd/maint_service and register the OData Service created in the previous step.
- Test the OData Service (a request sketch follows this list).
- Using SAP SLT with BryteFlow
- Use the SAP OData service created in the process outlined above in the BryteFlow SAP Data Lake Builder to build your own data lake or data warehouse on the platform you require, such as Snowflake, Amazon Redshift, Amazon S3, Azure Synapse, Azure Data Lake, BigQuery, Databricks, or Postgres.
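For orientation, here is a hedged Python sketch of what consuming such an ODP-backed OData V2 service looks like from any client (BryteFlow automates all of this). The host, service path, entity set name, and credentials are placeholders; the actual entity set names come from the service you generated in SEGW, while the __next and __delta links follow SAP Gateway's standard OData V2 paging and delta-query conventions.

```python
# Hedged sketch: reading an ODP-backed OData V2 service with the requests
# library. BASE, AUTH, and the entity set name are placeholders; check the
# service you registered in /n/iwfnd/maint_service for the real names.
import requests

BASE = "https://sap-gateway.example.com/sap/opu/odata/sap/ZSLT_ORDERS_SRV"  # placeholder
AUTH = ("ODATA_USER", "secret")  # placeholder credentials

# 1. Initial load: page through the entity set in JSON format.
url = f"{BASE}/FactsOfORDERS?$format=json"   # entity set name is a placeholder
rows, delta_link = [], None
while url:
    payload = requests.get(url, auth=AUTH).json()["d"]
    rows.extend(payload["results"])
    # SAP Gateway's delta-query protocol returns a __delta link once the
    # full result set has been read; __next handles server-side paging.
    delta_link = payload.get("__delta", delta_link)
    url = payload.get("__next")

print(f"initial load: {len(rows)} rows")

# 2. Later: request only the changes recorded since the delta token was issued.
if delta_link:
    changes = requests.get(delta_link, auth=AUTH).json()["d"]["results"]
    print(f"delta: {len(changes)} changed rows")
```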
Note: This blog describes the SAP replication process using SAP SLT and BryteFlow, which is a good option if you have an existing SAP SLT license. However, even if you are not using SAP SLT, you can still deliver your data directly from SAP applications to cloud and on-premise destinations without manual coding.
SAP Extraction and Replication Tool: BryteFlow SAP Data Lake Builder
Our SAP extraction and replication tool, BryteFlow SAP Data Lake Builder, enables you to extract SAP data automatically from SAP applications like SAP ECC, SAP S/4HANA, SAP BW, and SAP HANA, with business logic intact.
- Save time and effort with automated setup of data extraction and automated analysis of the SAP source application, with zero coding for any process. No external integration with third-party tools like Apache Hudi is needed.
- The SAP replication tool enables automated ODP extraction, using SAP OData Services to extract both the initial data and the incremental changes (deltas). It can connect to SAP Data Extractors or SAP CDS Views to get the data from SAP applications.
- BryteFlow Ingest enables log-based CDC (Change Data Capture) using database transaction logs if the underlying SAP database is accessible. It merges deltas automatically with the initial data to keep the SAP data lake or SAP data warehouse continually updated, for real-time SAP replication from databases.
- The BryteFlow SAP Data Lake Builder delivers ready-to-use data in your SAP data lake or SAP data warehouse for analytics and machine learning. It has built-in best practices for high-performance SAP integration on Snowflake, Redshift, S3, Azure Synapse, ADLS Gen2, Kafka, SingleStore, Databricks, PostgreSQL, Azure SQL DB, and SQL Server. Get a free trial of the SAP Data Lake Builder.