
SAP Datasphere – Replication Flow (Delta Functionality)

A replication flow is one of the main artifacts for ingesting data into SAP Datasphere from source systems (SAP or non-SAP).

There are two extraction types: Initial Only, and Initial and Delta. Whether the Delta option is enabled in a replication flow depends on the source connection type.

In my previous blog, I explained how to create a replication flow with Initial Only (refer: https://community.sap.com/t5/technology-blogs-by-members/sap-data-sphere-replication-flow/ba-p/13920…).

In this blog, I will explain how to create a replication flow with Initial and Delta using a HANA connection, with the use case of ingesting master data (InfoObject) from an SAP BW system into SAP Datasphere.

Use Case:

           Source: Master data (ZCOUNTRY) from an SAP BW system

           Target: Table in SAP Datasphere

           Extraction: Replication flow

           Connection type: HANA

  Below are the steps involved in creating a replication flow in SAP Datasphere:

  1. Selecting the replication flow
  2. Choosing the connection
  3. Configuration with settings and deployment
  4. Monitoring the job and reconciling with the source

------------------------------------------------------------

Step 1. Selecting the Replication flow

Log in to SAP Datasphere -> Main Menu -> Data Builder -> Replication Flow

    

Lokesh_Kumar_Pothapola_1-1730734743725.png

Step 2: Choose the connection

Select the source connection.

Note: As a prerequisite, an SAP HANA connection must be configured from SAP BW to SAP Datasphere.

  Select Source Container

Lokesh_Kumar_Pothapola_3-1730734743725.png

  Select the source objects and import them (the master data attribute table /BIC/PZCOUNTRY from the SAP BW system).

Select and import

Lokesh_Kumar_Pothapola_5-1730734743726.png

Step 3:  Configuration with settings and deployment

Once the selected tables are imported, a target in SAP Datasphere must be selected to store the data in a local table.

Select the target connection -> Select the local repository configured for SAP Datasphere

After selecting the target connection, the target tables will be available with a one-to-one mapping from the source.

Lokesh_Kumar_Pothapola_7-1730734743727.png

Projections

While loading data, filter or mapping conditions can be adjusted.

Filter: filtering based on required values

Lokesh_Kumar_Pothapola_8-1730734743727.png

Mapping: target column names, data types, etc.

Lokesh_Kumar_Pothapola_9-1730734743727.png
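Conceptually, a projection applies a filter and then renames columns on the way to the target. The sketch below illustrates this in plain Python; the column names, filter value, and target names are illustrative assumptions, not taken from the actual tables:

```python
# Sketch of what a replication-flow projection does to each source row.
# Column names, filter value, and target names are assumptions for illustration.

def project(rows, keep=lambda r: True, mapping=None):
    """Apply a filter and a source->target column rename, like a projection."""
    mapping = mapping or {}
    out = []
    for row in rows:
        if not keep(row):
            continue  # filter: drop rows that do not match the condition
        # mapping: rename columns to match the target structure
        out.append({mapping.get(col, col): val for col, val in row.items()})
    return out

source_rows = [
    {"/BIC/ZCOUNTRY": "IN", "TXTMD": "India"},
    {"/BIC/ZCOUNTRY": "DE", "TXTMD": "Germany"},
]

# Filter on a required value and rename columns for the target table
target_rows = project(
    source_rows,
    keep=lambda r: r["/BIC/ZCOUNTRY"] == "IN",
    mapping={"/BIC/ZCOUNTRY": "COUNTRY", "TXTMD": "COUNTRY_TEXT"},
)
print(target_rows)  # [{'COUNTRY': 'IN', 'COUNTRY_TEXT': 'India'}]
```

In the actual replication flow, both settings are configured in the Projections screen rather than in code.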

Load Type is the extraction method where the Delta job can be configured. Select the option ‘Initial and Delta’ (as the source is SAP BW master data and the connection type is HANA, the Delta option is available).

Note: While importing master data with an ABAP connection, only the Initial option is available. Please refer to my previous blog: https://community.sap.com/t5/technology-blogs-by-members/sap-data-sphere-replication-flow/ba-p/13920…

Lokesh_Kumar_Pothapola_10-1730734743728.png

After selecting Delta, the columns below are added automatically to the target structure to track changes.

Change_Type – captures the Insert, Modified, or Deletion status of each record

Change_Date – date of the delta record

Lokesh_Kumar_Pothapola_11-1730734743728.png
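The Change_Type column is what lets a consumer merge delta records into a current-state view. The sketch below shows the idea; the single-letter codes ('I', 'U', 'D') and column names are assumptions for illustration — check the actual values written by your replication flow:

```python
# Sketch of how Change_Type / Change_Date drive a delta merge.
# The codes 'I' (insert), 'U' (modified), 'D' (deletion) are assumed for
# illustration; verify the real values in your target table.

def apply_delta(target, delta_records, key="COUNTRY"):
    """Merge delta records into the target (a dict keyed by the key column)."""
    for rec in delta_records:
        k = rec[key]
        if rec["Change_Type"] == "D":
            target.pop(k, None)        # deletion captured in the source
        else:
            # 'I' or 'U': upsert the record, dropping the tracking columns
            target[k] = {c: v for c, v in rec.items()
                         if c not in ("Change_Type", "Change_Date")}
    return target

target = {"IN": {"COUNTRY": "IN", "TEXT": "India"}}
delta = [
    {"COUNTRY": "DE", "TEXT": "Germany", "Change_Type": "I", "Change_Date": "2024-11-04"},
    {"COUNTRY": "IN", "TEXT": "Rep. of India", "Change_Type": "U", "Change_Date": "2024-11-04"},
]
apply_delta(target, delta)
print(sorted(target))  # ['DE', 'IN']
```

The Change_Date column additionally lets you process deltas in order or restrict a query to records changed after a given date.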

Delta Load Interval: schedules the job at the frequency required to load data from source to target.

Lokesh_Kumar_Pothapola_12-1730734743729.png

Save the replication flow in the required folder and deploy the artifact. After the deployment notification, run the job and monitor it via Tools.

Deployment notification

Step 4: Monitoring the Replication flow

Since this is a replication flow with Delta, the job executes in two steps:

  • Initially, all available records are transferred; in the case below, 6 records are loaded.
  • In subsequent jobs, only the changed records are transferred (modified, newly added, and deleted).
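To reconcile the target with the source after a load, a simple count comparison per key is often enough. The sketch below is a minimal illustration; in practice the key lists would come from the BW table and the Datasphere local table (the country values are assumed sample data):

```python
# Minimal reconciliation sketch: compare records per key between source and
# target after a load. Key values here are assumed sample data.

from collections import Counter

def reconcile(source_keys, target_keys):
    """Return keys missing from the target and keys extra on the target."""
    src, tgt = Counter(source_keys), Counter(target_keys)
    missing = dict(src - tgt)  # present in source, absent (or fewer) in target
    extra = dict(tgt - src)    # present in target, absent (or fewer) in source
    return missing, extra

source = ["IN", "DE", "US", "FR", "JP", "BR"]  # 6 records, as in the initial load
target = ["IN", "DE", "US", "FR", "JP", "BR"]
missing, extra = reconcile(source, target)
print(missing, extra)  # {} {} -> counts match, the load is reconciled
```

If a delta job were missed, the differing keys would show up in `missing` or `extra` and could be investigated in the monitoring screen.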

 Below is the monitoring screen, where extraction details such as runtime, number of records, loading status, and partition details are available.

Initial Job

 At the time of extraction

Lokesh_Kumar_Pothapola_15-1730734743730.png

After the extraction

Lokesh_Kumar_Pothapola_16-1730734743731.png

Data at source table level (SAP BW master data)

Lokesh_Kumar_Pothapola_17-1730734743732.png

Delta job

As the delta load interval is set to hourly, the next job starts automatically after one hour.

Delta Log:

Lokesh_Kumar_Pothapola_21-1730736173642.png

The 4 newly added records highlighted below are transferred with the delta job.

Lokesh_Kumar_Pothapola_20-1730734743734.png

In conclusion, Delta is one of the main features used in replication flows to capture changed data. However, this option is enabled based on the source connection type and the source table.

In my next blog, I will explain how to load transactional data (ADSO) from SAP BW to SAP Datasphere.

Thanks

Lokesh Kumar Pothapola
