
Mainframe EBCDIC Data File to ASCII conversion using Azure Logic App


Introduction

 

This article presents a solution for converting Mainframe and Midrange EBCDIC files to ASCII format. By leveraging the serverless Azure Logic Apps PaaS, the conversion runs without provisioning any virtual machines or other IaaS resources, and the entire process is achieved without writing a single line of code. The converted data can then be loaded to the Azure data tier from within the same GUI, helping businesses streamline their operational workflows.

 

The diagram below shows the Azure services used in this solution to perform the EBCDIC data file to ASCII file conversion.

 

 

EBCDIC to ASCII conversion workflow diagram:

  1. Get blob content (V2) Azure Blob Storage Logic App connector reads EBCDIC file data from Azure storage.
  2. Parse Host File Contents connector converts EBCDIC data input to ASCII data.
  3. Create blob (V2) connector is used to persist converted data to Azure blob storage.
  4. Execute stored procedure (V2) executes Stored Procedure on Azure SQL DB.
  5. The stored procedure performs a bulk insert of the data from the ASCII CSV file into Azure SQL DB.

EBCDIC to ASCII conversion using Logic Apps involves a few steps that are part of the development process and a few that are part of the deployment/execution process.

 

Development Steps:

Deployment/Execution Steps:

Overview

This solution enables easy EBCDIC to ASCII data conversion using the Azure Logic Apps PaaS service. Azure Logic Apps has a built-in connector, IBM Host File - Parse Host File Contents, which can be used to convert data from EBCDIC format to ASCII format. The Azure Logic App workflow below shows how to perform the conversion.

 

Logic App workflow to convert EBCDIC to ASCII using the Parse Host File Contents connector

 

 

Logic App connectors present in the above diagram:

  1. The Recurrence schedule trigger fires the Logic App at a scheduled time interval. In the current example it executes the workflow at an hourly frequency.
  2. The Get blob content (V2) Azure Blob Storage connector is used to read (get) the EBCDIC-format data present in Azure storage.
  3. The Parse Host File Contents connector converts the EBCDIC data read in step 2 to ASCII, emitting the converted records as comma-separated values in JSON format.
  4. The Create CSV table connector converts the JSON objects into comma-separated values.
  5. The Create blob (V2) connector persists the data generated in step 4 to Azure blob storage.
  6. The Execute stored procedure (V2) connector executes a stored procedure that bulk inserts the data from the ASCII CSV file stored in blob storage into Azure SQL DB.

A detailed description of each step is given in the sections below.
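For reference, a minimal code-view skeleton of this workflow is shown below. This is a sketch, not the exact designer-generated definition: the action types and inputs are omitted, and only the execution order expressed through runAfter is the point here.

{
  "definition": {
    "triggers": {
      "Recurrence": {
        "type": "Recurrence",
        "recurrence": { "frequency": "Hour", "interval": 1 }
      }
    },
    "actions": {
      "Get_blob_content_(V2)": { "runAfter": {} },
      "Parse_Host_File_Contents": { "runAfter": { "Get_blob_content_(V2)": [ "Succeeded" ] } },
      "Create_CSV_table": { "runAfter": { "Parse_Host_File_Contents": [ "Succeeded" ] } },
      "Create_blob_(V2)": { "runAfter": { "Create_CSV_table": [ "Succeeded" ] } },
      "Execute_stored_procedure_(V2)": { "runAfter": { "Create_blob_(V2)": [ "Succeeded" ] } }
    }
  }
}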

Step 1: Recurrence trigger

The Recurrence trigger is used to schedule execution of the Logic App workflow at a definite frequency (every second, minute, hour, day, etc.). Additional details about the Recurrence connector can be found at: link.

For test purposes we have set it to run every hour, as shown below:
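In code view, an hourly Recurrence trigger looks like this:

"triggers": {
  "Recurrence": {
    "type": "Recurrence",
    "recurrence": {
      "frequency": "Hour",
      "interval": 1
    }
  }
}

Changing "frequency" to Second, Minute, Day, etc. (with an appropriate "interval") adjusts the schedule.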

 

Depending on requirements, this Logic App workflow can instead be triggered by events, e.g., the arrival of a file in Azure blob storage, using the Logic Apps Blob trigger action.

 

Step 2: Get blob content (V2)

The Get blob content (V2) action of the Azure Blob Storage connector retrieves the blob content as a binary stream of bytes into the Azure Logic App.

To get data from Azure Blob Storage, we first need to create a connection to the storage account.

Click Get blob content (V2) --> Parameters --> Connection.

Click Create Connection:

 

You can also use other authentication types, such as Service Principal. We are using Azure AD Integrated for the current test.

Provide the name of the file that contains the EBCDIC data in the Blob parameter, as shown below.
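In code view, the configured action looks roughly like the following; the storage account placeholder and the container/file path (/ebcdic-input/employee.dat) are illustrative, not values from this article:

"Get_blob_content_(V2)": {
  "type": "ApiConnection",
  "runAfter": {},
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "get",
    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files/@{encodeURIComponent(encodeURIComponent('/ebcdic-input/employee.dat'))}/content",
    "queries": {
      "inferContentType": true
    }
  }
}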

 

Input EBCDIC file content - sample:

Copybook Layout:

 

As shown above, this copybook defines Name as an alphanumeric field (PIC X), Age as native binary (COMP), and Salary as packed decimal (COMP-3). The data shown below was created in EBCDIC format using this copybook.

Sample input data from blob storage in EBCDIC format, viewed in a hex editor:

 

The mainframe EBCDIC file can be transferred to Azure storage using the ADF FTP connector, mainframe-to-Azure SFTP, or other methods.

 

Step 3: Parse Host File Contents

This is the most important step in the overall Logic App data conversion workflow. It parses the contents of a file in IBM EBCDIC format, i.e., it converts the EBCDIC data to ASCII output in a JSON layout. The steps involved in configuring this connector are described below.

Development Steps:

The steps below are required to generate an HIDX file from a COBOL copybook. As part of the development activities, the copybook HIDX should be generated and saved to Azure storage so that it can be referenced inside the Azure Logic App to interpret EBCDIC data files.

Create an HIDX layout from the COBOL copybook:

The Parse Host File Contents connector expects an HIDX layout and an EBCDIC data file as inputs and generates ASCII data as output. HIDX stands for Host Integration Designer XML; it is an XML representation of the COBOL copybook. A COBOL copybook can be converted to an HIDX layout using the Microsoft Host Integration Server (HIS) software. The steps involved in this conversion are:

1. Download the HIS software from: link

2. Install Host Integration Server using the reference: link

3. Create a Host Integration project with the Host File project template: link

4. Create the Host Integration Designer XML (HIDX) for the Managed Data Provider for Host Files: link

 

Logic App workflow parameters for Parse Host File Contents:

Parameters:

  1. HIDX Name (key: hidx) - Name of an HIDX file containing metadata about the IBM file format. The HIDX created during the development phase should be copied to the Logic App --> Artifacts --> Maps section, as shown below. Value used here: Employee.hidx.
  2. Schema Name (key: schema) - Name of a schema contained in the HIDX file. Value used here: DefaultSchema.
  3. Binary Contents (key: contents) - Binary data representing the contents of an IBM file. Value used here: File Content (the output of Step 2).

Employee.hidx file contents:

 

All parameters used during test execution are shown below:
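Putting the parameter table above together, the action's inputs in code view are essentially the following sketch (the surrounding action wrapper generated by the designer is omitted, and the contents expression assumes the file body comes from the Step 2 action):

"parameters": {
  "hidx": "Employee.hidx",
  "schema": "DefaultSchema",
  "contents": "@body('Get_blob_content_(V2)')"
}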

Returns:

  1. Rows (path: rows, type: string) - The result of parsing the binary file contents.

 

Output rows (only a few rows shown):

As shown below, each output row is in ASCII JSON format.

Step 4: Create CSV table

Step 3 produces the output rows as an array in ASCII JSON format, as shown above. This step converts them into records in CSV layout.
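In code view, the Create CSV table action is a Data Operations Table action. The from expression below assumes the parsed rows are exposed under the rows path of the Step 3 output, per the Returns table above:

"Create_CSV_table": {
  "type": "Table",
  "runAfter": { "Parse_Host_File_Contents": [ "Succeeded" ] },
  "inputs": {
    "from": "@body('Parse_Host_File_Contents')?['rows']",
    "format": "CSV"
  }
}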

 

 

Output of Create CSV Table connector:

As shown below, the rows are converted from the input JSON format to CSV records terminated with newline characters (\r\n).

 

Step 5: Create blob (V2)

This step writes the output CSV file to blob storage.

The screenshot below shows the parameters that should be set so that the file is written to blob storage.
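In code view, the action looks roughly like the following; the container (azsklawrite01) and file name (file01) match the BULK INSERT path used in Step 6, while the storage account placeholder is illustrative:

"Create_blob_(V2)": {
  "type": "ApiConnection",
  "runAfter": { "Create_CSV_table": [ "Succeeded" ] },
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['azureblob']['connectionId']"
      }
    },
    "method": "post",
    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('AccountNameFromSettings'))}/files",
    "queries": {
      "folderPath": "/azsklawrite01",
      "name": "file01",
      "queryParametersSingleEncoded": true
    },
    "body": "@body('Create_CSV_table')"
  }
}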

Sample of the converted output file in ASCII format stored in Azure storage:

Step 6: Execute stored procedure (V2)

This step executes a stored procedure that performs a bulk insert of the data from the ASCII CSV file stored in blob storage into Azure SQL DB.

Steps to create the stored procedure with the BULK INSERT statement:

a)  Create Table to store ASCII converted data

CREATE TABLE SANDIPK.RECORD_LAYOUT (
  WS_NAME      CHAR(30),
  WS_AGE       SMALLINT,
  WS_SALARY    DECIMAL(12,2))

 

b) Create Master Key for Encryption

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Testpw&rd';

 

c) Create Database scoped credential

Detailed instructions for creating a Shared Access Signature for Blob Storage can be found at: link

CREATE DATABASE SCOPED CREDENTIAL MyAzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sp=racwdlmeop&st=2023-01-01T14:02:21Z&se=2023-12-30T22:02:21Z&spr=https&sv=2021-06-08&sr=c&sig=MxeGt5ofHtestdummyg02cV%maskedcontent%3D';

 

d) Create External Data Source as Azure Blob Storage

CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH ( TYPE = BLOB_STORAGE,
LOCATION = 'https://azsksawestus2.blob.core.windows.net',
CREDENTIAL= MyAzureBlobStorageCredential);

 

e) Create Stored Procedure to Bulk Insert ASCII converted data from Azure blob storage to Azure SQL DB

CREATE PROCEDURE LoadASCIIData
AS
       BEGIN
          BULK INSERT SANDIPK.RECORD_LAYOUT
                 FROM 'azsklawrite01/file01'
                 WITH (
                        DATA_SOURCE = 'MyAzureBlobStorage',
                        FORMAT      = 'CSV',
                        FIRSTROW    = 2,
                        TABLOCK
                        );
       END

 

The screenshot below shows the parameters that should be set to execute the stored procedure on the SQL DB, which performs the bulk insert. A connection to the Azure SQL DB needs to be created by following the steps at: link
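In code view, the action looks roughly like the following sketch. The two 'default' placeholders stand for the server and database resolved from the SQL connection, and the dbo schema for the LoadASCIIData procedure is an assumption (use the schema it was actually created under):

"Execute_stored_procedure_(V2)": {
  "type": "ApiConnection",
  "runAfter": { "Create_blob_(V2)": [ "Succeeded" ] },
  "inputs": {
    "host": {
      "connection": {
        "name": "@parameters('$connections')['sql']['connectionId']"
      }
    },
    "method": "post",
    "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('default'))},@{encodeURIComponent(encodeURIComponent('default'))}/procedures/@{encodeURIComponent(encodeURIComponent('[dbo].[LoadASCIIData]'))}"
  }
}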

 

 

Execute Logic App to test EBCDIC to ASCII conversion results

 

Run the Logic App to perform the end-to-end process of reading the EBCDIC file from Azure blob storage, converting it to ASCII format, and loading the ASCII data into Azure SQL DB. Below is a successful execution of the Logic App.

 

 

Check Results:

  1. Check the count of rows in the Azure SQL Database table:
SELECT COUNT(*) FROM SANDIPK.RECORD_LAYOUT

 


Record Count Output:

 

  2. Check sample rows from the Azure SQL Database table:
SELECT TOP 10 * FROM SANDIPK.RECORD_LAYOUT

 


Rows from SQL DB Table Output:

 

Important points for Consideration

The above example is intended to show the important steps in performing EBCDIC to ASCII conversion using Logic Apps; it is not a complete solution. An actual implementation may also include error handling, splitting large files into smaller files for processing, parameterized file names and stored procedures, etc.
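For example, the output blob name that is hard-coded as file01 above could be supplied through a workflow parameter instead; the parameter name outputFileName below is hypothetical:

"parameters": {
  "outputFileName": {
    "type": "String",
    "defaultValue": "file01"
  }
}

The Create blob (V2) action would then reference it as:

"queries": {
  "folderPath": "/azsklawrite01",
  "name": "@{parameters('outputFileName')}"
}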

Limitations

Feedback and suggestions

If you have feedback or suggestions for improving this data migration asset, please send an email to Azure Databases SQL Customer Success Engineering Team. Thank you for your support!
