Integrate Synapse Database Templates with SAP Data – Part 2


This blog is a continuation of Part 1: Integrate Synapse Database Templates with SAP Data - Part 1 - Microsoft Community Hub

 

3. Data Mapping

The next step was the data mapping task, which was the most time-consuming, as we mapped our source data entities to entities in our lake database.

Some of the source SAP tables used:

| Source table | Table description |
| --- | --- |
| KNA1 | General Data in Customer Master |
| KNVV | Customer Master Sales Data |
| LIKP | SD Document: Delivery Header Data |
| LIPS | SD Document: Delivery Item Data |

 

The destination lake database tables used are:

| Destination table | Short table description |
| --- | --- |
| Customer | A customer is an individual or legal entity that has or has purchased a product or service. |
| CustomerEmail | The customer email address(es) of the associated email type for the indicated period. |
| CustomerLocation | The customer location(s) of the associated Location Type for the indicated period. |
| CustomerName | The names by which the customer is known in various contexts or usages. |
| CustomerTelephoneNumber | The customer telephone number of the associated Telephone Number Type for the indicated period. |
| CustomerType | A categorization of customers based upon common characteristics or similar criteria. |
| Invoice | A document presented to customers for products delivered or services performed. |
| Order | A document or commission by the customer to order products. |
| OrderLine | The components of an Order broken down by Product and Quantity, one per line item. |
| Party | A party is an individual, organization, legal entity, social organization or business unit of interest to the business. |
| Shipment | A quantity of items, goods or bulk cargo that are shipped together. |
| ShipmentItem | The lines on a shipment, each consisting of one item. |

 

Database templates overview:

Overview of Azure Synapse database templates - Azure Synapse Analytics | Microsoft Learn

 

Data mapping table used for the hack:

| Table name in Lake Database | Column name in Lake Database table | Data type | SAP table | SAP field name |
| --- | --- | --- | --- | --- |
| Customer | CustomerId | string | KNA1 | KUNNR |
| Customer | CustomerTypeId | string | KNA1 | KTOKD |
| Customer | PartyId | integer | KNA1 | LIFNR |
| CustomerEmail | CustomerId | string | KNA1 | KUNNR |
| CustomerLocation | CustomerId | integer | KNA1 | KUNNR |
| CustomerLocation | LocationId | string | KNA1 | ADRNR |
| CustomerName | CustomerId | string | KNA1 | KUNNR |
| CustomerName | CustomerNameId | string | KNA1 | NAME1 |
| CustomerName | CustomerNameNote | string | KNA1 | NAME2 |
| CustomerTelephoneNumber | CustomerId | string | KNA1 | KUNNR |
| CustomerTelephoneNumber | LocationId | integer | KNA1 | ADRNR |
| CustomerTelephoneNumber | TelephoneNumber | string | KNA1 | TELF1 |
| CustomerType | CustomerTypeId | string | KNA1 | KTOKD |
| Invoice | InvoiceToLocationId | integer | KNA1 | ADRNR |
| Party | PartyName | string | KNA1 | NAME1 |
| Shipment | ConsigneeLocationId | integer | KNA1 | ADRNR |
| Shipment | ConsigneeName | string | KNA1 | NAME1 |
| Shipment | AutomaticProofOfDeliveryRequiredIndicator | string | KNVV | PODKZ |
| Shipment | AutomaticProofOfDeliveryPodIssuedTimestamp | integer | LIKP | PODAT |
| Shipment | ConsigneeCustomerAccountId | string | LIKP | KUNNR |
| Shipment | ConsigneePartyId | string | LIKP | KUNNR |
| Shipment | ChannelId | integer | LIPS | WERKS |
| Shipment | ConsignorPartyId | integer | LIPS | WERKS |
| Shipment | OriginationWarehouseId | integer | LIKP | LGNUM |
| ShipmentItem | ItemSku | string | LIPS | MATNR |
| Order | OrderActualDeliveryTimestamp | integer | LIKP | HANDOVERDATE |
| Order | ShipmentToName | string | LIKP | KUNNR |
| Order | WarehouseId | integer | LIKP | LGNUM |
| OrderLine | PlannedDeliveryDate | date | LIKP | LFDAT |
| OrderLine | PlannedPickDate | integer | LIKP | KODAT |
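To make the mapping concrete, the sketch below shows how the Customer portion of this mapping could be expressed in a Synapse Spark notebook. This is only an illustrative example, not the output of the Map Data tool; the storage path, file format, and header option are assumptions about how the SAP extract was staged.

```python
# Minimal sketch of the Customer mapping above, run in a Synapse Spark notebook.
# Assumptions: the KNA1 extract was staged as CSV with a header row; the path is a placeholder.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

kna1 = (
    spark.read.option("header", True)
    .csv("abfss://<container>@<storageaccount>.dfs.core.windows.net/sap/KNA1/")
)

# SAP field name -> lake database (Customer) column name, per the mapping table above.
customer_mapping = {
    "KUNNR": "CustomerId",
    "KTOKD": "CustomerTypeId",
    "LIFNR": "PartyId",
}

customer = kna1.select([kna1[src].alias(dst) for src, dst in customer_mapping.items()])
customer.show(5)
```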

 

JOIN conditions for the Shipment table:

| Left table | Column name | Right table | Column name |
| --- | --- | --- | --- |
| KNA1 | KUNNR | KNVV | KUNNR |
| KNVV | KUNNR | LIKP | KUNNR |
| LIKP | VBELN | LIPS | VBELN |
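For illustration, the same join chain could be expressed in PySpark as the minimal sketch below. It assumes the KNA1, KNVV, LIKP and LIPS extracts have already been loaded as DataFrames (for example with the reader shown earlier) and projects only the Shipment columns from the mapping table above; it is not the data flow generated by the Map Data tool.

```python
# Minimal sketch of the Shipment join chain (KNA1 -> KNVV -> LIKP -> LIPS).
# Assumes kna1, knvv, likp and lips DataFrames are already loaded from the SAP extracts.
shipment_source = (
    kna1.join(knvv, on="KUNNR", how="inner")
        .join(likp, on="KUNNR", how="inner")
        .join(lips, on="VBELN", how="inner")
)

# Project only the Shipment columns from the mapping table above.
shipment = shipment_source.selectExpr(
    "ADRNR as ConsigneeLocationId",
    "NAME1 as ConsigneeName",
    "PODKZ as AutomaticProofOfDeliveryRequiredIndicator",
    "PODAT as AutomaticProofOfDeliveryPodIssuedTimestamp",
    "KUNNR as ConsigneeCustomerAccountId",
    "WERKS as ChannelId",
    "LGNUM as OriginationWarehouseId",
)
```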

 

The mapping was created using the Map Data tool by selecting the input folders containing the data files.

 

[Screenshot: ssonwane_0-1671625531433.png]

After providing a mapping name, the next window opens with all the source data files and the target table names for which we will create a mapping.

[Screenshot: ssonwane_1-1671625546986.png]

The target tables are listed on the left-hand side, and the source files appear in the primary source table drop-down.

Based on the data mapping table above, the mapping between the source files and target tables was created.

The snips below show the mapping for one table, along with a preview of the data after checking for any data type inconsistencies.

Shipment table:

[Screenshot: ssonwane_2-1671625561608.png]

[Screenshot: ssonwane_0-1671625806494.png]

 

Once the mapping was created, we clicked Create pipeline to generate all the data flows and pipelines that ingest the data into the lake database tables. We manually validated the data flows for correct mapping and then ran the pipeline to ingest the data into the lake database.
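For completeness, the generated pipeline can also be triggered outside Synapse Studio. The snippet below is a hedged sketch using the azure-identity and azure-synapse-artifacts Python packages; the workspace endpoint and the pipeline name are placeholders, since the actual names are assigned when the Map Data tool creates the pipeline.

```python
# Hedged sketch: trigger the generated pipeline from Python instead of Synapse Studio.
# Requires: pip install azure-identity azure-synapse-artifacts
from azure.identity import DefaultAzureCredential
from azure.synapse.artifacts import ArtifactsClient

# Placeholder workspace endpoint and pipeline name (assigned when the pipeline is created).
client = ArtifactsClient(
    endpoint="https://<workspace-name>.dev.azuresynapse.net",
    credential=DefaultAzureCredential(),
)

run = client.pipeline.create_pipeline_run("<map-data-pipeline-name>")
print(f"Pipeline run started: {run.run_id}")
```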

 

For more details on how to join multiple tables, you may want to review our earlier blog: How to use the Additional Sources in Map Data tool within Azure Synapse Analytics - Microsoft Community Hub

 

[Screenshot: ssonwane_4-1671625619609.png]

 

4. Ingesting data into the Lake Database

The pipelines ran successfully and ingested the data.

[Screenshot: ssonwane_5-1671625634975.png]

 

Notice that the ingested data now appears in the lake database as tables, while the actual data sits in the storage account in the respective folders.
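If you want to inspect those folders programmatically rather than through Synapse Studio, the sketch below lists the paths that back the lake database tables using the azure-storage-file-datalake package; the storage account, container, and database folder names are placeholders.

```python
# Hedged sketch: list the storage folders/files that back the lake database tables.
# Storage account, container and database folder names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

file_system = service.get_file_system_client("<container>")

# Each lake database table is backed by its own folder under the database root.
for path in file_system.get_paths(path="<lake-database-folder>", recursive=True):
    print(path.name)
```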

 

[Screenshot: ssonwane_6-1671625642446.png]

[Screenshot: ssonwane_7-1671625642448.png]

5. Querying the data from the serverless SQL pool

We could then query the data using the serverless SQL pool and, of course, use it within Power BI using the Synapse connectors.

 

[Screenshot: ssonwane_8-1671625689085.png]
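As an alternative to running the query in Synapse Studio, the sketch below shows how the ingested Shipment table could be queried from the serverless SQL pool with Python and pyodbc; the server name, database name, schema, and authentication method are placeholders and assumptions.

```python
# Hedged sketch: query the ingested Shipment table from the serverless SQL pool.
# Server, database, schema and authentication details are placeholders/assumptions.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<workspace-name>-ondemand.sql.azuresynapse.net;"
    "Database=<lake-database-name>;"
    "Authentication=ActiveDirectoryInteractive;"
)

cursor = conn.cursor()
cursor.execute("SELECT TOP 10 ConsigneeName, OriginationWarehouseId FROM dbo.Shipment")
for row in cursor.fetchall():
    print(row.ConsigneeName, row.OriginationWarehouseId)
```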

With this effort, we have shown how to integrate SAP data with database templates and the lake database. More importantly, this approach with industry-specific database templates can help standardize data in the lake, and these templates can serve as an accelerator for many types of large projects.

 
